[AIRFLOW-3062] Add Qubole in integration docs (#3946)
msumit authored and kaxil committed Jan 9, 2019
1 parent 1dc3ee7 commit e0fc9e5
Showing 2 changed files with 58 additions and 0 deletions.
32 changes: 32 additions & 0 deletions airflow/contrib/sensors/qubole_sensor.py
@@ -75,13 +75,45 @@ def poke(self, context):


class QuboleFileSensor(QuboleSensor):
"""
Wait for a file or folder to be present in cloud storage
and check for its presence via QDS APIs
:param qubole_conn_id: Connection id which consists of qds auth_token
:type qubole_conn_id: str
:param data: a JSON object containing payload, whose presence needs to be checked
Check this `example <https://github.com/apache/incubator-airflow/blob/master\
/airflow/contrib/example_dags/example_qubole_sensor.py>`_ for sample payload
structure.
:type data: a JSON object
.. note:: Both ``data`` and ``qubole_conn_id`` fields support templating. You can
also use ``.txt`` files for template-driven use cases.
"""

    @apply_defaults
    def __init__(self, *args, **kwargs):
        self.sensor_class = FileSensor
        super(QuboleFileSensor, self).__init__(*args, **kwargs)


class QubolePartitionSensor(QuboleSensor):
"""
Wait for a Hive partition to show up in QHS (Qubole Hive Service)
and check for its presence via QDS APIs
:param qubole_conn_id: Connection id which consists of qds auth_token
:type qubole_conn_id: str
:param data: a JSON object containing payload, whose presence needs to be checked.
Check this `example <https://github.com/apache/incubator-airflow/blob/master\
/airflow/contrib/example_dags/example_qubole_sensor.py>`_ for sample payload
structure.
:type data: a JSON object
.. note:: Both ``data`` and ``qubole_conn_id`` fields support templating. You can
also use ``.txt`` files for template-driven use cases.
"""

    @apply_defaults
    def __init__(self, *args, **kwargs):
        self.sensor_class = PartitionSensor

26 changes: 26 additions & 0 deletions docs/integration.rst
@@ -23,6 +23,7 @@ Integration
- :ref:`AWS`
- :ref:`Databricks`
- :ref:`GCP`
- :ref:`Qubole`

.. _ReverseProxy:

@@ -972,3 +973,28 @@ Google Kubernetes Engine Hook

.. autoclass:: airflow.contrib.hooks.gcp_container_hook.GKEClusterHook
:members:


.. _Qubole:

Qubole
------

Apache Airflow has a native operator and hooks to talk to `Qubole <https://qubole.com/>`__,
so you can submit your big data jobs to Qubole directly from Apache Airflow.

QuboleOperator
''''''''''''''

.. autoclass:: airflow.contrib.operators.qubole_operator.QuboleOperator
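
A minimal sketch of how the operator might be wired into a DAG. The parameters shown
here (``command_type``, ``query``, ``cluster_label``) follow the pattern of the example
DAGs shipped in ``airflow/contrib/example_dags``; the DAG id, query, and cluster label
are illustrative assumptions, so adjust them to your own QDS account:

.. code-block:: python

    from datetime import datetime

    from airflow import DAG
    from airflow.contrib.operators.qubole_operator import QuboleOperator

    dag = DAG(
        dag_id='example_qubole_operator',   # hypothetical DAG id
        start_date=datetime(2019, 1, 1),
        schedule_interval=None,
    )

    # Submit a Hive command to Qubole; logs and results stay in QDS.
    hive_show_tables = QuboleOperator(
        task_id='hive_show_tables',
        command_type='hivecmd',           # Qubole command type to run
        query='show tables',              # illustrative query
        cluster_label='default',          # QDS cluster label (assumed)
        qubole_conn_id='qubole_default',  # connection holding the QDS auth_token
        dag=dag,
    )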

QubolePartitionSensor
'''''''''''''''''''''

.. autoclass:: airflow.contrib.sensors.qubole_sensor.QubolePartitionSensor
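
A hedged usage sketch: the ``data`` payload below (``schema``, ``table``, ``columns``
keys) mirrors the sample payload in
``airflow/contrib/example_dags/example_qubole_sensor.py``; the table name and column
values are hypothetical, so consult that DAG for the exact structure QDS expects:

.. code-block:: python

    from airflow.contrib.sensors.qubole_sensor import QubolePartitionSensor

    # Wait until the given Hive partition is visible to Qubole Hive Service.
    check_hive_partition = QubolePartitionSensor(
        task_id='check_hive_partition',
        qubole_conn_id='qubole_default',
        poke_interval=60,
        timeout=600,
        data={
            'schema': 'default',
            'table': 'my_partitioned_table',   # hypothetical table
            'columns': [
                {'column': 'month', 'values': ["{{ ds.split('-')[1] }}"]},
            ],
        },
        dag=dag,  # the DAG defined in the operator example above
    )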


QuboleFileSensor
''''''''''''''''

.. autoclass:: airflow.contrib.sensors.qubole_sensor.QuboleFileSensor
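
A hedged usage sketch: the ``files`` key in the ``data`` payload follows the sample
payload in ``airflow/contrib/example_dags/example_qubole_sensor.py``; the bucket and
key are hypothetical placeholders:

.. code-block:: python

    from airflow.contrib.sensors.qubole_sensor import QuboleFileSensor

    # Poke QDS until the listed cloud-storage objects exist.
    check_s3_file = QuboleFileSensor(
        task_id='check_s3_file',
        qubole_conn_id='qubole_default',
        poke_interval=60,
        timeout=600,
        data={
            'files': ['s3://my-bucket/path/to/some/file'],  # hypothetical location
        },
        dag=dag,  # the DAG defined in the operator example above
    )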
