astro.files.locations.amazon.s3

Module Contents

Classes

S3Location

Handles S3 object store operations

class astro.files.locations.amazon.s3.S3Location(path, conn_id=None, load_options=None)

Bases: astro.files.locations.base.BaseFileLocation

Handles S3 object store operations

Parameters:

path – URI of the S3 object or prefix this location points to

conn_id – Airflow connection ID used to authenticate with AWS (optional)

load_options – options controlling how the file is loaded (optional)
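
A minimal usage sketch; the bucket, key, and connection ID are illustrative, not defaults:

    from astro.files.locations.amazon.s3 import S3Location

    # Illustrative bucket, key, and Airflow connection ID (not defaults).
    location = S3Location(path="s3://my-bucket/data/sample.csv", conn_id="aws_default")

    print(location.paths)  # resolved object paths under the given path/prefix
    print(location.size)   # size reported for the S3 object
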
property hook: airflow.providers.amazon.aws.hooks.s3.S3Hook

The Airflow S3Hook built from the configured connection

Return type:

airflow.providers.amazon.aws.hooks.s3.S3Hook
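
Since hook exposes the underlying Airflow S3Hook, its standard methods can be called directly; for example, checking that a key exists (values illustrative):

    from astro.files.locations.amazon.s3 import S3Location

    location = S3Location(path="s3://my-bucket/data/sample.csv", conn_id="aws_default")

    # check_for_key is a standard S3Hook method; bucket and key are illustrative.
    exists = location.hook.check_for_key(key="data/sample.csv", bucket_name="my-bucket")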

property transport_params: dict

Structures s3fs credentials from the Airflow connection. s3fs enables pandas to write to S3

Return type:

dict
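
The property name mirrors smart_open's transport_params argument; assuming the returned dictionary is smart_open-compatible, one plausible use is:

    from smart_open import open as s3_open

    from astro.files.locations.amazon.s3 import S3Location

    location = S3Location(path="s3://my-bucket/data/sample.csv", conn_id="aws_default")

    # Stream the object using the credentials structured by transport_params.
    with s3_open("s3://my-bucket/data/sample.csv", "r",
                 transport_params=location.transport_params) as f:
        header = f.readline()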

property paths: list[str]

Resolves the S3 file paths that match the given prefix

Return type:

list[str]
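
A sketch of prefix resolution; the listed keys are illustrative:

    from astro.files.locations.amazon.s3 import S3Location

    # A path that is a prefix expands to every matching object key.
    prefix_location = S3Location(path="s3://my-bucket/data/", conn_id="aws_default")
    for uri in prefix_location.paths:
        print(uri)  # e.g. s3://my-bucket/data/part-0001.csv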

property size: int

Returns the file size for the S3 location

Return type:

int

property openlineage_dataset_namespace: str

Returns the OpenLineage dataset namespace as per https://github.com/OpenLineage/OpenLineage/blob/main/spec/Naming.md

Return type:

str

property openlineage_dataset_name: str

Returns the OpenLineage dataset name as per https://github.com/OpenLineage/OpenLineage/blob/main/spec/Naming.md

Return type:

str
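
Per that naming spec, an S3 dataset's namespace is the bucket URI and its name is the object path; a sketch with illustrative values:

    from astro.files.locations.amazon.s3 import S3Location

    location = S3Location(path="s3://my-bucket/data/sample.csv", conn_id="aws_default")

    # Expected forms per the OpenLineage naming spec (values illustrative):
    print(location.openlineage_dataset_namespace)  # e.g. "s3://my-bucket"
    print(location.openlineage_dataset_name)       # e.g. the object path, "/data/sample.csv"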

location_type

Identifies this location handler as the S3 file location

supported_conn_type

Airflow connection types supported by this location handler
databricks_auth_settings()

Required settings to upload this file into Databricks. Only needed for cloud storage systems such as S3. Returns a dictionary of setting keys to setting values

Return type:

dict
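
A sketch of consuming the returned dictionary; the keys it contains depend on the connection's credentials and are not specified here:

    from astro.files.locations.amazon.s3 import S3Location

    location = S3Location(path="s3://my-bucket/data/sample.csv", conn_id="aws_default")

    # Inspect the settings before handing them to a Databricks cluster config.
    for key, value in location.databricks_auth_settings().items():
        print(key)  # the exact keys depend on the connection's credentials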

get_snowflake_stage_auth_sub_statement()

Returns the authentication sub-statement used when creating a Snowflake stage for this S3 location

Return type:

str
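
The returned fragment can be spliced into Snowflake's CREATE STAGE syntax; a sketch with an illustrative stage name and URL:

    from astro.files.locations.amazon.s3 import S3Location

    location = S3Location(path="s3://my-bucket/data/", conn_id="aws_default")

    # Splice the fragment into a CREATE STAGE statement (stage name illustrative).
    auth = location.get_snowflake_stage_auth_sub_statement()
    sql = f"CREATE STAGE my_stage URL='s3://my-bucket/data/' {auth}"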