Compute resources
PySpark executor
If the pipeline type is pyspark, we use PySpark executors for pipeline and block executions.
You can customize the compute resources of the PySpark executor by updating the instance types in the emr_config section of the project's metadata.yaml file.
Example:
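Below is a minimal sketch of an emr_config section in metadata.yaml. The key names master_instance_type and slave_instance_type and the instance type values are assumptions for illustration; check the emr_config schema for your version before using them.

```yaml
# metadata.yaml (project root)
# Assumed keys for illustration only; verify against your version's emr_config schema.
emr_config:
  # EC2 instance type for the EMR master node
  master_instance_type: 'r5.4xlarge'
  # EC2 instance type for the EMR worker (core) nodes
  slave_instance_type: 'r5.4xlarge'
```

Larger instance types give each executor more memory and CPU; choose them to match the pipeline's workload.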
See also: Spark compute resource manager, which lets you manage your Spark compute resources and track Spark pipeline execution metrics.