astro.sql.operators.export_file

Module Contents

Classes

ExportFileOperator

Write SQL table to csv/parquet on local/S3/GCS.

Functions

export_file(input_data, output_file[, if_exists, task_id])

Convert ExportFileOperator into a function. Returns XComArg.

class astro.sql.operators.export_file.ExportFileOperator(input_data, output_file, if_exists='exception', **kwargs)

Bases: airflow.models.BaseOperator

Write SQL table to csv/parquet on local/S3/GCS.

Parameters
  • input_data (Table | pd.DataFrame) – Table to convert to file

  • output_file (astro.files.File) – File object containing the path to the file and connection id.

  • if_exists (astro.constants.ExportExistsStrategy) – Strategy to use if the file already exists. Default "exception".

template_fields = ['input_data', 'output_file']
execute(context)

Write SQL table to csv/parquet on local/S3/GCS.

Infers SQL database type based on connection.

Parameters

context (dict) –

Return type

astro.files.File
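
The operator can also be instantiated directly as a traditional Airflow operator. A minimal sketch based on the signature above; my_dag, my_table and the output path are placeholders, e.g.:

from astro.files import File
from astro.sql.operators.export_file import ExportFileOperator

# my_dag and my_table are placeholders assumed to be defined elsewhere
with my_dag:
    ExportFileOperator(
        task_id="export_table",
        input_data=my_table,
        output_file=File(path="/tmp/exported_table.csv"),
        if_exists="replace",
    )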

astro.sql.operators.export_file.export_file(input_data, output_file, if_exists='exception', task_id=None, **kwargs)

Convert ExportFileOperator into a function. Returns XComArg.

Returns an XComArg object of type File that matches the output_file parameter.

This allows users to perform further actions with the exported file.

e.g.:

from astro import sql as aql
from astro.files import File

# sample_dag, data_path and test_table are assumed to be defined elsewhere
with sample_dag:
    table = aql.load_file(input_file=File(path=data_path), output_table=test_table)
    exported_file = aql.export_file(
        input_data=table,
        output_file=File(path="/tmp/saved_df.csv"),
        if_exists="replace",
    )
    res_df = aql.load_file(input_file=exported_file)
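
Continuing the example above, the same call can write to an object store by giving output_file a remote path and a connection id. A sketch, assuming an Airflow connection named "aws_default" and a hypothetical S3 bucket:

    exported_to_s3 = aql.export_file(
        input_data=table,
        output_file=File(path="s3://my-bucket/saved_df.csv", conn_id="aws_default"),
        if_exists="replace",
    )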
Parameters
  • output_file (astro.files.File) – File object containing the path to the file and connection id

  • input_data (Table | pd.DataFrame) – Input table / dataframe

  • if_exists (astro.constants.ExportExistsStrategy) – Strategy to use if the file already exists. Default "exception"

  • task_id (str | None) – task id, optional

  • kwargs (Any) –

Return type

airflow.models.xcom_arg.XComArg
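
Because input_data also accepts a pandas DataFrame, the result of a dataframe task can be exported the same way. A minimal sketch reusing the aql and File imports from the example above; make_df, sample_dag and the output path are hypothetical:

import pandas as pd

@aql.dataframe
def make_df():
    return pd.DataFrame({"id": [1, 2, 3]})

with sample_dag:
    df = make_df()
    aql.export_file(
        input_data=df,
        output_file=File(path="/tmp/saved_from_df.parquet"),
        if_exists="replace",
    )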