Use the :class:`SQLExecuteQueryOperator <airflow.providers.common.sql.operators.sql.SQLExecuteQueryOperator>` to execute SQL commands in an SQLite database.
.. note::

    Previously, ``SqliteOperator`` was used to perform this kind of operation.
    After deprecation it has been removed. Please use ``SQLExecuteQueryOperator`` instead.
Use the ``conn_id`` argument to connect to your SQLite instance where
the connection metadata is structured as follows:

.. list-table:: SQLite Airflow Connection Metadata
   :widths: 25 25
   :header-rows: 1

   * - Parameter
     - Input
   * - Host: string
     - SQLite database file
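For illustration, the ``Host`` field is simply a path to a database file on disk. A minimal sketch, using Python's standard ``sqlite3`` module and a hypothetical path (not taken from the Airflow examples), of preparing such a file:

```python
import sqlite3

# Hypothetical path; in Airflow this would go into the Host field
# of the SQLite connection.
db_path = "/tmp/my_sqlite.db"

# Connecting creates the database file if it does not exist yet.
conn = sqlite3.connect(db_path)
conn.execute(
    "CREATE TABLE IF NOT EXISTS pet (pet_id INTEGER PRIMARY KEY, name TEXT)"
)
conn.commit()
conn.close()
```

Any task using a connection whose ``Host`` points at this file will then operate on the same database.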
An example usage of the ``SQLExecuteQueryOperator`` to connect to SQLite is as follows:
.. exampleinclude:: /../../providers/tests/system/sqlite/example_sqlite.py
    :language: python
    :start-after: [START howto_operator_sqlite]
    :end-before: [END howto_operator_sqlite]
Furthermore, you can use an external file to execute the SQL commands. The script folder must be at the same level as the DAG file.
.. exampleinclude:: /../../providers/tests/system/sqlite/example_sqlite.py
    :language: python
    :start-after: [START howto_operator_sqlite_external_file]
    :end-before: [END howto_operator_sqlite_external_file]
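Conceptually, running an external script amounts to reading the file and executing its statements against the database. A rough sketch of that idea with the standard ``sqlite3`` module, using hypothetical file names rather than the ones from the Airflow example:

```python
import os
import sqlite3
import tempfile

# Hypothetical external SQL script, analogous to a file shipped
# alongside the DAG file.
script = """
CREATE TABLE IF NOT EXISTS pet (pet_id INTEGER PRIMARY KEY, name TEXT);
INSERT INTO pet (name) VALUES ('Max');
INSERT INTO pet (name) VALUES ('Susie');
"""

workdir = tempfile.mkdtemp()
sql_path = os.path.join(workdir, "create_table.sql")
with open(sql_path, "w") as f:
    f.write(script)

conn = sqlite3.connect(os.path.join(workdir, "example.db"))
with open(sql_path) as f:
    # executescript runs multiple semicolon-separated statements at once.
    conn.executescript(f.read())

names = [row[0] for row in conn.execute("SELECT name FROM pet ORDER BY pet_id")]
conn.close()
```

This is only an illustration of the mechanism; in a DAG you would pass the script's relative path to the operator's ``sql`` argument instead of executing it yourself.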
For further information, see the `SQLite documentation <https://www.sqlite.org/docs.html>`__.
.. note::

    Parameters given via ``SQLExecuteQueryOperator()`` take precedence over
    parameters set via Airflow connection metadata (such as ``schema``,
    ``login``, ``password`` etc).