astro.sql.operators.export_to_file
Module Contents
Classes

- ExportToFileOperator: Write SQL table to csv/parquet on local/S3/GCS.

Functions

- export_to_file: Convert ExportToFileOperator into a function. Returns XComArg.
- class astro.sql.operators.export_to_file.ExportToFileOperator(input_data, output_file, if_exists='exception', **kwargs)
Bases:
astro.sql.operators.base_operator.AstroSQLBaseOperator
Write SQL table to csv/parquet on local/S3/GCS.
- Parameters:
input_data (astro.table.BaseTable | pandas.DataFrame) – Table or dataframe to export to a file.
output_file (astro.files.File) – File object containing the path to the file and connection id.
if_exists (astro.constants.ExportExistsStrategy) – Strategy to use if the file already exists. Default "exception".
- template_fields = ('input_data', 'output_file')
- execute(context)
Write SQL table to csv/parquet on local/S3/GCS.
Infers SQL database type based on connection.
- Parameters:
context (astro.utils.compat.typing.Context) – Airflow task execution context.
- Return type:
astro.files.File
- get_openlineage_facets_on_complete(task_instance)
Collect the input, output, job, and run facets for the export file operator.
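The operator can also be instantiated directly in a DAG, like any Airflow operator, rather than through the export_to_file helper below. The following is a minimal sketch under stated assumptions: the dag_id, schedule argument (Airflow 2.4+), local path, and sample dataframe are illustrative, not part of the astro-sdk API.

    # A hedged sketch of direct operator usage; dag_id, path, and sample data
    # are illustrative assumptions, not taken from the library's docs.
    from datetime import datetime

    import pandas as pd
    from airflow import DAG

    from astro.files import File
    from astro.sql.operators.export_to_file import ExportToFileOperator

    with DAG(dag_id="export_df_example", start_date=datetime(2023, 1, 1), schedule=None):
        ExportToFileOperator(
            task_id="export_df",
            input_data=pd.DataFrame({"a": [1, 2, 3]}),  # any table/dataframe per the parameter docs
            output_file=File(path="/tmp/df.csv"),       # local path; S3/GCS paths take a conn_id
            if_exists="replace",                        # overwrite instead of raising
        )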
- astro.sql.operators.export_to_file.export_to_file(input_data, output_file, if_exists='exception', task_id=None, **kwargs)
Convert ExportToFileOperator into a function. Returns XComArg.
Returns an XComArg object of type File which matches the output_file parameter.
This will allow users to perform further actions with the exported file.
e.g.:

    with sample_dag:
        table = aql.load_file(input_file=File(path=data_path), output_table=test_table)
        exported_file = aql.export_to_file(
            input_data=table,
            output_file=File(path="/tmp/saved_df.csv"),
            if_exists="replace",
        )
        res_df = aql.load_file(input_file=exported_file)
- Parameters:
output_file (astro.files.File) – File object containing the destination path and conn_id.
input_data (astro.table.BaseTable | pandas.DataFrame) – Input table / dataframe
if_exists (astro.constants.ExportExistsStrategy) – Strategy to use if the file already exists. Default "exception".
task_id (str | None) – Task ID for the operator; optional.
kwargs (Any) – Additional keyword arguments forwarded to ExportToFileOperator.
- Return type:
airflow.models.xcom_arg.XComArg
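For remote destinations, the output_file takes a conn_id alongside the path, as the parameter docs above describe. The following is a hedged sketch, assuming Airflow connections named "postgres_default" and "aws_default" and a reachable S3 bucket; those names and paths are illustrative, not part of the API.

    # A minimal sketch of exporting a loaded table to S3; connection names,
    # bucket, and paths are illustrative assumptions.
    from datetime import datetime

    from airflow.decorators import dag

    from astro import sql as aql
    from astro.files import File
    from astro.table import Table

    @dag(start_date=datetime(2023, 1, 1), schedule=None, catchup=False)
    def export_to_s3_example():
        table = aql.load_file(
            input_file=File(path="/tmp/input.csv"),           # hypothetical local input
            output_table=Table(conn_id="postgres_default"),   # hypothetical database connection
        )
        aql.export_to_file(
            input_data=table,
            output_file=File(path="s3://my-bucket/exports/out.parquet", conn_id="aws_default"),
            if_exists="replace",
        )

    export_to_s3_example()

Because the function returns an XComArg of type File, the result can be passed straight into downstream tasks such as another load_file call, as shown in the example above.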