astro.sql.operators.export_table_to_file

Module Contents

Classes

ExportTableToFileOperator

Write SQL table to csv/parquet on local/S3/GCS.

Functions

export_table_to_file(input_data, output_file[, ...])

Convert ExportTableToFileOperator into a function. Returns XComArg.

class astro.sql.operators.export_table_to_file.ExportTableToFileOperator(input_data, output_file, if_exists='exception', **kwargs)

Bases: astro.sql.operators.export_to_file.ExportToFileOperator

Write SQL table to csv/parquet on local/S3/GCS.

Parameters:
  • input_data (astro.table.BaseTable | pandas.DataFrame) – Table or dataframe to convert to a file

  • output_file (astro.files.File) – File object containing the path to the file and the connection id

  • if_exists (astro.constants.ExportExistsStrategy) – What to do if the file already exists. Default "exception"

template_fields = ('input_data', 'output_file')
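
For reference, a minimal sketch of instantiating the operator class directly inside a DAG. The DAG id, table name "my_table", connection id "sqlite_default", and output path below are illustrative assumptions, not part of this API:

from datetime import datetime

from airflow import DAG

from astro.files import File
from astro.table import Table
from astro.sql.operators.export_table_to_file import ExportTableToFileOperator

with DAG(dag_id="export_table_example", start_date=datetime(2023, 1, 1)):
    # Export an existing database table (assumed to sit behind conn_id "sqlite_default")
    # to a local CSV file, replacing the file if it is already present.
    ExportTableToFileOperator(
        task_id="export_my_table",
        input_data=Table(name="my_table", conn_id="sqlite_default"),
        output_file=File(path="/tmp/my_table.csv"),
        if_exists="replace",
    )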
astro.sql.operators.export_table_to_file.export_table_to_file(input_data, output_file, if_exists='exception', task_id=None, **kwargs)

Convert ExportTableToFileOperator into a function. Returns XComArg.

Returns an XComArg object of type File which matches the output_file parameter. This allows users to perform further actions with the exported file.

e.g.:

# `sample_dag`, `data_path`, and `test_table` are assumed to be defined elsewhere.
from astro import sql as aql
from astro.files import File

with sample_dag:
    table = aql.load_file(input_file=File(path=data_path), output_table=test_table)
    exported_file = aql.export_table_to_file(
        input_data=table,
        output_file=File(path="/tmp/saved_df.csv"),
        if_exists="replace",
    )
    res_df = aql.load_file(input_file=exported_file)
Parameters:
  • output_file (astro.files.File) – File object containing the path to the file and the connection id

  • input_data (astro.table.BaseTable | pandas.DataFrame) – Input table / dataframe

  • if_exists (astro.constants.ExportExistsStrategy) – What to do if the file already exists. Default "exception"

  • task_id (str | None) – task id, optional

  • kwargs (Any) – Additional keyword arguments passed through to the underlying operator

Return type:

airflow.models.xcom_arg.XComArg
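
Because output_file carries both a path and a connection id, the same call can write to object storage. Continuing the example above, a sketch that exports to S3 (the bucket "my-bucket" and connection id "aws_default" are illustrative assumptions):

exported_file = aql.export_table_to_file(
    input_data=table,
    # conn_id points at an Airflow connection holding credentials for the bucket.
    output_file=File(path="s3://my-bucket/exports/saved_df.parquet", conn_id="aws_default"),
    if_exists="replace",
)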