Update Environment
| mwaa_update_environment | R Documentation |
Updates an Amazon Managed Workflows for Apache Airflow (MWAA) environment
Description
Updates an Amazon Managed Workflows for Apache Airflow (MWAA) environment.
Usage
mwaa_update_environment(Name, ExecutionRoleArn, AirflowVersion,
SourceBucketArn, DagS3Path, PluginsS3Path, PluginsS3ObjectVersion,
RequirementsS3Path, RequirementsS3ObjectVersion, StartupScriptS3Path,
StartupScriptS3ObjectVersion, AirflowConfigurationOptions,
EnvironmentClass, MaxWorkers, NetworkConfiguration,
LoggingConfiguration, WeeklyMaintenanceWindowStart, WebserverAccessMode,
MinWorkers, Schedulers, MinWebservers, MaxWebservers)
Arguments
Name
[required] The name of your Amazon MWAA environment. For example, MyMWAAEnvironment.

ExecutionRoleArn
The Amazon Resource Name (ARN) of the execution role in IAM that allows MWAA to access Amazon Web Services resources in your environment. For example, arn:aws:iam::123456789:role/my-execution-role. For more information, see Amazon MWAA Execution role.

AirflowVersion
The Apache Airflow version for your environment. To upgrade your environment, specify a newer version of Apache Airflow supported by Amazon MWAA.

Before you upgrade an environment, make sure your requirements, DAGs, plugins, and other resources used in your workflows are compatible with the new Apache Airflow version. For more information about updating your resources, see Upgrading an Amazon MWAA environment.

Valid values: 1.10.12, 2.0.2, 2.2.2, 2.4.3, 2.5.1, 2.6.3, 2.7.2, 2.8.1.

SourceBucketArn
The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored. For example, arn:aws:s3:::my-airflow-bucket-unique-name. For more information, see Create an Amazon S3 bucket for Amazon MWAA.

DagS3Path
The relative path to the DAGs folder on your Amazon S3 bucket. For example, dags. For more information, see Adding or updating DAGs.

PluginsS3Path
The relative path to the plugins.zip file on your Amazon S3 bucket. For example, plugins.zip. If specified, then the plugins.zip version is required. For more information, see Installing custom plugins.

PluginsS3ObjectVersion
The version of the plugins.zip file on your Amazon S3 bucket. You must specify a version each time a plugins.zip file is updated. For more information, see How S3 Versioning works.

RequirementsS3Path
The relative path to the requirements.txt file on your Amazon S3 bucket. For example, requirements.txt. If specified, then a file version is required. For more information, see Installing Python dependencies.

RequirementsS3ObjectVersion
The version of the requirements.txt file on your Amazon S3 bucket. You must specify a version each time a requirements.txt file is updated. For more information, see How S3 Versioning works.

StartupScriptS3Path
The relative path to the startup shell script in your Amazon S3 bucket. For example, s3://mwaa-environment/startup.sh.

Amazon MWAA runs the script as your environment starts, and before running the Apache Airflow process. You can use this script to install dependencies, modify Apache Airflow configuration options, and set environment variables. For more information, see Using a startup script.
StartupScriptS3ObjectVersion
The version of the startup shell script in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file every time you update the script.

Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example: 3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo

For more information, see Using a startup script.

AirflowConfigurationOptions
A list of key-value pairs containing the Apache Airflow configuration options you want to attach to your environment. For more information, see Apache Airflow configuration options.
EnvironmentClass
The environment class type. Valid values: mw1.small, mw1.medium, mw1.large, mw1.xlarge, and mw1.2xlarge. For more information, see Amazon MWAA environment class.

MaxWorkers
The maximum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. For example, 20. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the one worker that is included with your environment, or the number you specify in MinWorkers.

NetworkConfiguration
The VPC networking components used to secure and enable network traffic between the Amazon Web Services resources for your environment. For more information, see About networking on Amazon MWAA.
LoggingConfiguration
The Apache Airflow log types to send to CloudWatch Logs.

WeeklyMaintenanceWindowStart
The day and time of the week, in Coordinated Universal Time (UTC) 24-hour standard time, to start weekly maintenance updates of your environment, in the following format: DAY:HH:MM. For example: TUE:03:30. You can specify a start time in 30-minute increments only.

WebserverAccessMode
The Apache Airflow Web server access mode. For more information, see Apache Airflow access modes.
MinWorkers
The minimum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the worker count you specify in the MinWorkers field. For example, 2.

Schedulers
The number of Apache Airflow schedulers to run in your Amazon MWAA environment.
MinWebservers
The minimum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your Apache Airflow environment using the Apache Airflow REST API or the Apache Airflow CLI. As the transaction-per-second rate and the network load decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers.

Valid values: Accepts between 2 and 5. Defaults to 2.

MaxWebservers
The maximum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your Apache Airflow environment using the Apache Airflow REST API or the Apache Airflow CLI. For example, in scenarios where your workload requires network calls to the Apache Airflow REST API with a high transaction-per-second (TPS) rate, Amazon MWAA will increase the number of web servers up to the number set in MaxWebservers. As TPS rates decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers.

Valid values: Accepts between 2 and 5. Defaults to 2.
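As an illustrative sketch (not part of the official reference), the scaling-related arguments above might be combined like this, assuming a client created with paws::mwaa(), configured AWS credentials, and a hypothetical environment named MyMWAAEnvironment:

```r
# Hypothetical sketch: adjust autoscaling bounds and the maintenance window
# on an existing environment. Only the arguments being changed are passed;
# Name identifies the environment to update.
library(paws)
svc <- mwaa()

svc$update_environment(
  Name = "MyMWAAEnvironment",                  # hypothetical environment name
  MinWorkers = 2,                              # idle floor for Airflow workers
  MaxWorkers = 20,                             # ceiling under load
  MinWebservers = 2,                           # web servers accept 2-5
  MaxWebservers = 5,
  WeeklyMaintenanceWindowStart = "TUE:03:30"   # DAY:HH:MM, 30-minute increments
)
```

This is a partial update: arguments you omit (such as LoggingConfiguration) keep their current values on the environment.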
Value
A list with the following syntax:
Request syntax
svc$update_environment(
Name = "string",
ExecutionRoleArn = "string",
AirflowVersion = "string",
SourceBucketArn = "string",
DagS3Path = "string",
PluginsS3Path = "string",
PluginsS3ObjectVersion = "string",
RequirementsS3Path = "string",
RequirementsS3ObjectVersion = "string",
StartupScriptS3Path = "string",
StartupScriptS3ObjectVersion = "string",
AirflowConfigurationOptions = list(
"string"
),
EnvironmentClass = "string",
MaxWorkers = 123,
NetworkConfiguration = list(
SecurityGroupIds = list(
"string"
)
),
LoggingConfiguration = list(
DagProcessingLogs = list(
Enabled = TRUE|FALSE,
LogLevel = "CRITICAL"|"ERROR"|"WARNING"|"INFO"|"DEBUG"
),
SchedulerLogs = list(
Enabled = TRUE|FALSE,
LogLevel = "CRITICAL"|"ERROR"|"WARNING"|"INFO"|"DEBUG"
),
WebserverLogs = list(
Enabled = TRUE|FALSE,
LogLevel = "CRITICAL"|"ERROR"|"WARNING"|"INFO"|"DEBUG"
),
WorkerLogs = list(
Enabled = TRUE|FALSE,
LogLevel = "CRITICAL"|"ERROR"|"WARNING"|"INFO"|"DEBUG"
),
TaskLogs = list(
Enabled = TRUE|FALSE,
LogLevel = "CRITICAL"|"ERROR"|"WARNING"|"INFO"|"DEBUG"
)
),
WeeklyMaintenanceWindowStart = "string",
WebserverAccessMode = "PRIVATE_ONLY"|"PUBLIC_ONLY",
MinWorkers = 123,
Schedulers = 123,
MinWebservers = 123,
MaxWebservers = 123
)
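Building on the request syntax above, here is a hedged example of an in-place Airflow version upgrade, assuming svc was created with paws::mwaa() and that the environment name and the versioned requirements.txt object are placeholders you would replace with your own:

```r
# Hypothetical sketch: upgrade the environment to a newer supported Airflow
# version while pinning an updated requirements.txt. Because the bucket is
# versioned, RequirementsS3ObjectVersion must accompany RequirementsS3Path.
svc$update_environment(
  Name = "MyMWAAEnvironment",                  # hypothetical environment name
  AirflowVersion = "2.8.1",                    # must be a supported newer version
  RequirementsS3Path = "requirements.txt",     # relative path within the bucket
  RequirementsS3ObjectVersion =
    "3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo"
)
```

Before calling this, confirm that your DAGs, plugins, and requirements are compatible with the target Apache Airflow version, as noted under AirflowVersion above.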