Create Environment

mwaa_create_environment R Documentation

Creates an Amazon Managed Workflows for Apache Airflow (MWAA) environment

Description

Creates an Amazon Managed Workflows for Apache Airflow (MWAA) environment.

Usage

mwaa_create_environment(Name, ExecutionRoleArn, SourceBucketArn,
  DagS3Path, NetworkConfiguration, PluginsS3Path, PluginsS3ObjectVersion,
  RequirementsS3Path, RequirementsS3ObjectVersion, StartupScriptS3Path,
  StartupScriptS3ObjectVersion, AirflowConfigurationOptions,
  EnvironmentClass, MaxWorkers, KmsKey, AirflowVersion,
  LoggingConfiguration, WeeklyMaintenanceWindowStart, Tags,
  WebserverAccessMode, MinWorkers, Schedulers, EndpointManagement,
  MinWebservers, MaxWebservers)

Arguments

Name

[required] The name of the Amazon MWAA environment. For example, MyMWAAEnvironment.

ExecutionRoleArn

[required] The Amazon Resource Name (ARN) of the execution role for your environment. An execution role is an Amazon Web Services Identity and Access Management (IAM) role that grants MWAA permission to access Amazon Web Services services and resources used by your environment. For example, ⁠arn:aws:iam::123456789:role/my-execution-role⁠. For more information, see Amazon MWAA Execution role.

SourceBucketArn

[required] The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored. For example, arn:aws:s3:::my-airflow-bucket-unique-name. For more information, see Create an Amazon S3 bucket for Amazon MWAA.

DagS3Path

[required] The relative path to the DAGs folder on your Amazon S3 bucket. For example, dags. For more information, see Adding or updating DAGs.

NetworkConfiguration

[required] The VPC networking components used to secure and enable network traffic between the Amazon Web Services resources for your environment. For more information, see About networking on Amazon MWAA.
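
As a sketch, NetworkConfiguration is a named list of subnet IDs and security group IDs, matching the Request syntax section below; the IDs here are placeholders (Amazon MWAA expects two private subnets in different Availability Zones):

NetworkConfiguration = list(
  SubnetIds = list(
    "subnet-0123456789abcdef0",
    "subnet-0fedcba9876543210"
  ),
  SecurityGroupIds = list(
    "sg-0123456789abcdef0"
  )
)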

PluginsS3Path

The relative path to the plugins.zip file on your Amazon S3 bucket. For example, plugins.zip. If specified, then the plugins.zip version is required. For more information, see Installing custom plugins.

PluginsS3ObjectVersion

The version of the plugins.zip file on your Amazon S3 bucket. You must specify a version each time a plugins.zip file is updated. For more information, see How S3 Versioning works.

RequirementsS3Path

The relative path to the requirements.txt file on your Amazon S3 bucket. For example, requirements.txt. If specified, then a version is required. For more information, see Installing Python dependencies.

RequirementsS3ObjectVersion

The version of the requirements.txt file on your Amazon S3 bucket. You must specify a version each time a requirements.txt file is updated. For more information, see How S3 Versioning works.

StartupScriptS3Path

The relative path to the startup shell script in your Amazon S3 bucket. For example, ⁠s3://mwaa-environment/startup.sh⁠.

Amazon MWAA runs the script as your environment starts, and before running the Apache Airflow process. You can use this script to install dependencies, modify Apache Airflow configuration options, and set environment variables. For more information, see Using a startup script.

StartupScriptS3ObjectVersion

The version of the startup shell script in your Amazon S3 bucket. You must specify the version ID that Amazon S3 assigns to the file every time you update the script.

Version IDs are Unicode, UTF-8 encoded, URL-ready, opaque strings that are no more than 1,024 bytes long. The following is an example:

⁠3sL4kqtJlcpXroDTDmJ+rmSpXd3dIbrHY+MTRCxf3vjVBH40Nr8X8gdRQBpUMLUo⁠

For more information, see Using a startup script.
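
One way to look up the current version ID, shown here as a sketch that assumes the paws S3 client and uses placeholder bucket and object names, is to read it from the object's metadata:

s3 <- paws::s3()

# head_object on a versioned bucket returns the VersionId of the latest version
# (bucket and key below are placeholders)
version_id <- s3$head_object(
  Bucket = "mwaa-environment",
  Key = "startup.sh"
)$VersionId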

AirflowConfigurationOptions

A list of key-value pairs containing the Apache Airflow configuration options you want to attach to your environment. For more information, see Apache Airflow configuration options.
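
The options are passed as a named list of strings; the keys below are illustrative Apache Airflow settings, not recommendations:

AirflowConfigurationOptions = list(
  "core.default_task_retries" = "3",
  "webserver.default_ui_timezone" = "UTC"
)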

EnvironmentClass

The environment class type. Valid values: mw1.small, mw1.medium, mw1.large, mw1.xlarge, and mw1.2xlarge. For more information, see Amazon MWAA environment class.

MaxWorkers

The maximum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. For example, 20. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the one worker that is included with your environment, or the number you specify in MinWorkers.

KmsKey

The Amazon Web Services Key Management Service (KMS) key to encrypt the data in your environment. You can use an Amazon Web Services owned CMK, or a Customer managed CMK (advanced). For more information, see Create an Amazon MWAA environment.

AirflowVersion

The Apache Airflow version for your environment. If no value is specified, it defaults to the latest version. For more information, see Apache Airflow versions on Amazon Managed Workflows for Apache Airflow (MWAA).

Valid values: ⁠1.10.12⁠, ⁠2.0.2⁠, ⁠2.2.2⁠, ⁠2.4.3⁠, ⁠2.5.1⁠, ⁠2.6.3⁠, ⁠2.7.2⁠, ⁠2.8.1⁠

LoggingConfiguration

Defines the Apache Airflow logs to send to CloudWatch Logs.
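
The structure mirrors the Request syntax section below. As a sketch, the following enables task logs at the INFO level and worker logs at the WARNING level:

LoggingConfiguration = list(
  TaskLogs = list(
    Enabled = TRUE,
    LogLevel = "INFO"
  ),
  WorkerLogs = list(
    Enabled = TRUE,
    LogLevel = "WARNING"
  )
)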

WeeklyMaintenanceWindowStart

The day and time of the week in Coordinated Universal Time (UTC) 24-hour standard time to start weekly maintenance updates of your environment, in the following format: DAY:HH:MM. For example: TUE:03:30. You can specify a start time in 30-minute increments only.

Tags

The key-value tag pairs you want to associate with your environment. For example, "Environment": "Staging". For more information, see Tagging Amazon Web Services resources.
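
Tags are passed as a named list of strings; the second tag below is only a placeholder:

Tags = list(
  "Environment" = "Staging",
  "Team" = "data-platform"
)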

WebserverAccessMode

Defines the access mode for the Apache Airflow web server. For more information, see Apache Airflow access modes.

MinWorkers

The minimum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the worker count you specify in the MinWorkers field. For example, 2.

Schedulers

The number of Apache Airflow schedulers to run in your environment. Valid values:

  • v2 - Accepts between 2 and 5. Defaults to 2.

  • v1 - Accepts 1.

EndpointManagement

Defines whether the VPC endpoints configured for the environment are created and managed by the customer or by Amazon MWAA. If set to SERVICE, Amazon MWAA creates and manages the required VPC endpoints in your VPC. If set to CUSTOMER, you must create and manage the VPC endpoints for your VPC. If you choose to create an environment in a shared VPC, you must set this value to CUSTOMER. In a shared VPC deployment, the environment remains in PENDING status until you create the VPC endpoints. If you do not create the endpoints within 72 hours, the status changes to CREATE_FAILED. You can delete the failed environment and create a new one.

MinWebservers

The minimum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your Apache Airflow environment using the Apache Airflow REST API or the Apache Airflow CLI. As the transaction-per-second rate and the network load decrease, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers.

Valid values: Accepts between 2 and 5. Defaults to 2.

MaxWebservers

The maximum number of web servers that you want to run in your environment. Amazon MWAA scales the number of Apache Airflow web servers up to the number you specify for MaxWebservers when you interact with your Apache Airflow environment using the Apache Airflow REST API or the Apache Airflow CLI. For example, in scenarios where your workload requires network calls to the Apache Airflow REST API with a high transaction-per-second (TPS) rate, Amazon MWAA will increase the number of web servers up to the number set in MaxWebservers. As the TPS rate decreases, Amazon MWAA disposes of the additional web servers and scales down to the number set in MinWebservers.

Valid values: Accepts between 2 and 5. Defaults to 2.

Value

A list with the following syntax:

list(
  Arn = "string"
)

Request syntax

svc$create_environment(
  Name = "string",
  ExecutionRoleArn = "string",
  SourceBucketArn = "string",
  DagS3Path = "string",
  NetworkConfiguration = list(
    SubnetIds = list(
      "string"
    ),
    SecurityGroupIds = list(
      "string"
    )
  ),
  PluginsS3Path = "string",
  PluginsS3ObjectVersion = "string",
  RequirementsS3Path = "string",
  RequirementsS3ObjectVersion = "string",
  StartupScriptS3Path = "string",
  StartupScriptS3ObjectVersion = "string",
  AirflowConfigurationOptions = list(
    "string"
  ),
  EnvironmentClass = "string",
  MaxWorkers = 123,
  KmsKey = "string",
  AirflowVersion = "string",
  LoggingConfiguration = list(
    DagProcessingLogs = list(
      Enabled = TRUE|FALSE,
      LogLevel = "CRITICAL"|"ERROR"|"WARNING"|"INFO"|"DEBUG"
    ),
    SchedulerLogs = list(
      Enabled = TRUE|FALSE,
      LogLevel = "CRITICAL"|"ERROR"|"WARNING"|"INFO"|"DEBUG"
    ),
    WebserverLogs = list(
      Enabled = TRUE|FALSE,
      LogLevel = "CRITICAL"|"ERROR"|"WARNING"|"INFO"|"DEBUG"
    ),
    WorkerLogs = list(
      Enabled = TRUE|FALSE,
      LogLevel = "CRITICAL"|"ERROR"|"WARNING"|"INFO"|"DEBUG"
    ),
    TaskLogs = list(
      Enabled = TRUE|FALSE,
      LogLevel = "CRITICAL"|"ERROR"|"WARNING"|"INFO"|"DEBUG"
    )
  ),
  WeeklyMaintenanceWindowStart = "string",
  Tags = list(
    "string"
  ),
  WebserverAccessMode = "PRIVATE_ONLY"|"PUBLIC_ONLY",
  MinWorkers = 123,
  Schedulers = 123,
  EndpointManagement = "CUSTOMER"|"SERVICE",
  MinWebservers = 123,
  MaxWebservers = 123
)
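
The following is a minimal sketch that creates an environment using only the required arguments, assuming the client is built with paws::mwaa(); the account ID, role name, bucket name, subnet IDs, and security group ID are placeholders modeled on the examples above.

library(paws)

svc <- mwaa()

resp <- svc$create_environment(
  Name = "MyMWAAEnvironment",
  ExecutionRoleArn = "arn:aws:iam::123456789:role/my-execution-role",
  SourceBucketArn = "arn:aws:s3:::my-airflow-bucket-unique-name",
  DagS3Path = "dags",
  NetworkConfiguration = list(
    SubnetIds = list(
      "subnet-0123456789abcdef0",
      "subnet-0fedcba9876543210"
    ),
    SecurityGroupIds = list(
      "sg-0123456789abcdef0"
    )
  )
)

# The response is a list containing the new environment's ARN
resp$Arn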