Commands for working with Dagster assets.
dagster asset [OPTIONS] COMMAND [ARGS]...
Commands
wipe  Eliminate asset key indexes from event logs.
Commands for debugging Dagster pipeline/job runs.
dagster debug [OPTIONS] COMMAND [ARGS]...
Commands
export  Export the relevant artifacts for a…
import  Import the relevant artifacts for a…
Commands for working with the current Dagster instance.
dagster instance [OPTIONS] COMMAND [ARGS]...
Commands
info  List the information about the current…
migrate  Automatically migrate an out of date…
reindex  Rebuild index over historical runs for…
Commands for working with Dagster jobs.
dagster job [OPTIONS] COMMAND [ARGS]...
Commands
backfill  Backfill a partitioned job.
execute  Execute a job.
launch  Launch a job using the run launcher…
list  List the jobs in a repository.
list_versions  Display the freshness of memoized results…
print  Print a job.
scaffold_config  Scaffold the config for a job.
Commands for working with Dagster pipeline/job runs.
dagster run [OPTIONS] COMMAND [ARGS]...
Commands
delete  Delete a run by id and its associated…
list  List the runs in the current Dagster…
wipe  Eliminate all run history and event logs.
Commands for working with Dagster schedules.
dagster schedule [OPTIONS] COMMAND [ARGS]...
Commands
debug  Debug information about the scheduler.
list  List all schedules that correspond to a…
logs  Get logs for a schedule.
preview  Preview changes that will be performed by…
restart  Restart a running schedule.
start  Start an existing schedule.
stop  Stop an existing schedule.
wipe  Delete the schedule history and turn off…
Commands for working with Dagster sensors.
dagster sensor [OPTIONS] COMMAND [ARGS]...
Commands
cursor  Set the cursor value for an existing sensor.
list  List all sensors that correspond to a…
preview  Preview an existing sensor execution.
start  Start an existing sensor.
stop  Stop an existing sensor.
Run a GraphQL query against the dagster interface to a specified repository or pipeline/job.
Can only use ONE of --workspace/-w, --python-file/-f, --module-name/-m, --grpc-port, --grpc-socket.
Examples:
dagster-graphql
dagster-graphql -y path/to/workspace.yaml
dagster-graphql -f path/to/file.py -a define_repo
dagster-graphql -m some_module -a define_repo
dagster-graphql -f path/to/file.py -a define_pipeline
dagster-graphql -m some_module -a define_pipeline
dagster-graphql [OPTIONS]
Options
--version  Show the version and exit.
-t, --text <text>  GraphQL document to execute, passed as a string.
-f, --file <file>  GraphQL document to execute, passed as a file.
-p, --predefined <predefined>  GraphQL document to execute, from a predefined set provided by dagster-graphql. Default: launchPipelineExecution
-v, --variables <variables>  A JSON-encoded string containing the variables for GraphQL execution.
-r, --remote <remote>  A URL for a remote instance running dagit server to send the GraphQL request to.
-o, --output <output>  A file path to store the GraphQL response to. This flag is useful when making pipeline/job execution queries, since pipeline/job execution causes logs to print to stdout and stderr.
--ephemeral-instance  Use an ephemeral DagsterInstance instead of resolving via DAGSTER_HOME.
--empty-workspace  Allow an empty workspace.
-w, --workspace <workspace>  Path to workspace file. Argument can be provided multiple times.
-d, --working-directory <working_directory>  Specify the working directory to use when loading the repository or pipeline/job.
-f, --python-file <python_file>  Specify the Python file where the repository or pipeline/job function lives.
--package-name <package_name>  Specify the Python package where the repository or pipeline/job function lives.
-m, --module-name <module_name>  Specify the module where the repository or pipeline/job function lives.
-a, --attribute <attribute>  Attribute that is either 1) a repository or pipeline/job or 2) a function that returns a repository or pipeline/job.
--grpc-port <grpc_port>  Port to use to connect to the gRPC server.
--grpc-socket <grpc_socket>  Named socket to use to connect to the gRPC server.
--grpc-host <grpc_host>  Host to use to connect to the gRPC server; defaults to localhost.
--use-ssl  Use a secure channel when connecting to the gRPC server.
Environment variables
DAGSTER_WORKING_DIRECTORY  Provide a default for --working-directory
DAGSTER_PYTHON_FILE  Provide a default for --python-file
DAGSTER_PACKAGE_NAME  Provide a default for --package-name
DAGSTER_MODULE_NAME  Provide a default for --module-name
DAGSTER_ATTRIBUTE  Provide a default for --attribute
Run dagit. Loads a repository or pipeline/job.
Can only use ONE of --workspace/-w, --python-file/-f, --module-name/-m, --grpc-port, --grpc-socket.
Examples:
dagit (works if ./workspace.yaml exists)
dagit -w path/to/workspace.yaml
dagit -f path/to/file.py
dagit -f path/to/file.py -d path/to/working_directory
dagit -m some_module
dagit -f path/to/file.py -a define_repo
dagit -m some_module -a define_repo
dagit -p 3333
Options can also be supplied via environment variables prefixed with DAGIT_.
For example, DAGIT_PORT=3333 dagit
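The environment-variable name is derived mechanically from the long option: upper-case it, replace dashes with underscores, and add the DAGIT_ prefix. A small sketch of that mapping (the option name below is just an illustration):

```shell
# The two invocations below are equivalent:
#   dagit -p 3333
#   DAGIT_PORT=3333 dagit
# Derive the variable name from a long option: upper-case, dashes -> underscores.
opt="path-prefix"
var="DAGIT_$(echo "$opt" | tr 'a-z-' 'A-Z_')"
echo "$var"   # DAGIT_PATH_PREFIX
```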
dagit [OPTIONS]
Options
--use-ssl  Use a secure channel when connecting to the gRPC server.
--grpc-host <grpc_host>  Host to use to connect to the gRPC server; defaults to localhost.
--grpc-socket <grpc_socket>  Named socket to use to connect to the gRPC server.
--grpc-port <grpc_port>  Port to use to connect to the gRPC server.
-a, --attribute <attribute>  Attribute that is either 1) a repository or pipeline/job or 2) a function that returns a repository or pipeline/job.
-m, --module-name <module_name>  Specify the module where the repository or pipeline/job function lives.
--package-name <package_name>  Specify the Python package where the repository or pipeline/job function lives.
-f, --python-file <python_file>  Specify the Python file where the repository or pipeline/job function lives.
-d, --working-directory <working_directory>  Specify the working directory to use when loading the repository or pipeline/job.
-w, --workspace <workspace>  Path to workspace file. Argument can be provided multiple times.
--empty-workspace  Allow an empty workspace.
-h, --host <host>  Host to run the server on. Default: 127.0.0.1
-p, --port <port>  Port to run the server on. Default: 3000
-l, --path-prefix <path_prefix>  The path prefix where Dagit will be hosted (e.g. /dagit).
--db-statement-timeout <db_statement_timeout>  The timeout in milliseconds to set on database statements sent to the DagsterInstance. Not respected in all configurations. Default: 15000
--read-only  Start Dagit in read-only mode, where all mutations, such as launching runs and turning schedules on/off, are disabled.
--suppress-warnings  Filter all warnings when hosting Dagit.
--log-level <log_level>  Set the log level for the uvicorn web server. Default: warning. Options: critical|error|warning|info|debug|trace
--version  Show the version and exit.
Environment variables
DAGSTER_ATTRIBUTE  Provide a default for --attribute
DAGSTER_MODULE_NAME  Provide a default for --module-name
DAGSTER_PACKAGE_NAME  Provide a default for --package-name
DAGSTER_PYTHON_FILE  Provide a default for --python-file
DAGSTER_WORKING_DIRECTORY  Provide a default for --working-directory
Run any daemons configured on the DagsterInstance.
dagster-daemon run [OPTIONS]
Options
--use-ssl  Use a secure channel when connecting to the gRPC server.
--grpc-host <grpc_host>  Host to use to connect to the gRPC server; defaults to localhost.
--grpc-socket <grpc_socket>  Named socket to use to connect to the gRPC server.
--grpc-port <grpc_port>  Port to use to connect to the gRPC server.
-a, --attribute <attribute>  Attribute that is either 1) a repository or pipeline/job or 2) a function that returns a repository or pipeline/job.
-m, --module-name <module_name>  Specify the module where the repository or pipeline/job function lives.
--package-name <package_name>  Specify the Python package where the repository or pipeline/job function lives.
-f, --python-file <python_file>  Specify the Python file where the repository or pipeline/job function lives.
-d, --working-directory <working_directory>  Specify the working directory to use when loading the repository or pipeline/job.
-w, --workspace <workspace>  Path to workspace file. Argument can be provided multiple times.
--empty-workspace  Allow an empty workspace.
Environment variables
DAGSTER_ATTRIBUTE  Provide a default for --attribute
DAGSTER_MODULE_NAME  Provide a default for --module-name
DAGSTER_PACKAGE_NAME  Provide a default for --package-name
DAGSTER_PYTHON_FILE  Provide a default for --python-file
DAGSTER_WORKING_DIRECTORY  Provide a default for --working-directory
Wipe all heartbeats from storage.
dagster-daemon wipe [OPTIONS]
Log all heartbeat statuses.
dagster-daemon debug heartbeat-dump [OPTIONS]
Serve the Dagster inter-process API over gRPC.
dagster api grpc [OPTIONS]
Options
-p, --port <port>  Port over which to serve. You must pass one and only one of --port/-p or --socket/-s.
-s, --socket <socket>  Serve over a UDS socket. You must pass one and only one of --port/-p or --socket/-s.
-h, --host <host>  Hostname at which to serve. Default: localhost
-n, --max_workers <max_workers>  Maximum number of (threaded) workers to use in the gRPC server.
--heartbeat  If set, the gRPC server will shut itself down when it fails to receive a heartbeat after a timeout configurable with --heartbeat-timeout.
--heartbeat-timeout <heartbeat_timeout>  Timeout after which to shut down if --heartbeat is set and a heartbeat is not received.
--lazy-load-user-code  Wait until the first LoadRepositories call to actually load the repositories, instead of loading them when the server is launched. Useful for surfacing errors when the server is managed directly from Dagit.
-a, --attribute <attribute>  Attribute that is either 1) a repository or pipeline/job or 2) a function that returns a repository or pipeline/job.
-m, --module-name <module_name>  Specify the module where the repository or pipeline/job function lives.
--package-name <package_name>  Specify the Python package where the repository or pipeline/job function lives.
-f, --python-file <python_file>  Specify the Python file where the repository or pipeline/job function lives.
-d, --working-directory <working_directory>  Specify the working directory to use when loading the repository or pipeline/job.
--use-python-environment-entry-point  If this flag is set, the server will signal to clients that they should launch dagster commands using <this server's python executable> -m dagster, instead of the default dagster entry point. This is useful when multiple Python environments are running on the same machine, so a single dagster entry point is not enough to uniquely determine the environment.
--empty-working-directory  Indicates that the working directory should be empty and should not default to the current directory.
--ipc-output-file <ipc_output_file>  [INTERNAL] This option should generally not be used by users. Internal param used by dagster when it automatically spawns gRPC servers to communicate the success or failure of the server launching.
--fixed-server-id <fixed_server_id>  [INTERNAL] This option should generally not be used by users. Internal param used by dagster to spawn a gRPC server with the specified server id.
--override-system-timezone <override_system_timezone>  [INTERNAL] This option should generally not be used by users. Override the system timezone for tests.
--log-level <log_level>  Level at which to log output from the gRPC server process.
--container-image <container_image>  Container image to use to run code from this server.
--container-context <container_context>  Serialized JSON with configuration for any containers created to run the code from this server.
Environment variables
DAGSTER_GRPC_PORT  Provide a default for --port
DAGSTER_GRPC_SOCKET  Provide a default for --socket
DAGSTER_GRPC_HOST  Provide a default for --host
DAGSTER_LAZY_LOAD_USER_CODE  Provide a default for --lazy-load-user-code
DAGSTER_ATTRIBUTE  Provide a default for --attribute
DAGSTER_MODULE_NAME  Provide a default for --module-name
DAGSTER_PACKAGE_NAME  Provide a default for --package-name
DAGSTER_PYTHON_FILE  Provide a default for --python-file
DAGSTER_WORKING_DIRECTORY  Provide a default for --working-directory
DAGSTER_USE_PYTHON_ENVIRONMENT_ENTRY_POINT  Provide a default for --use-python-environment-entry-point
DAGSTER_EMPTY_WORKING_DIRECTORY  Provide a default for --empty-working-directory
DAGSTER_CONTAINER_IMAGE  Provide a default for --container-image
DAGSTER_CONTAINER_CONTEXT  Provide a default for --container-context
Commands for working with Dagster pipelines/jobs.
dagster pipeline [OPTIONS] COMMAND [ARGS]...
Commands
backfill  Backfill a partitioned pipeline/job.
execute  Execute a pipeline.
launch  Launch a pipeline using the run launcher…
list  List the pipelines/jobs in a repository.
list_versions  Display the freshness of memoized results…
print  Print a pipeline/job.
scaffold_config  Scaffold the config for a pipeline.