[AIRFLOW-4062] Improve docs on install extra package commands (#4966)
Some commands for installing extra packages, like
`pip install apache-airflow[devel]`, cause errors in
certain shells (the brackets are glob characters). Make
them unambiguous by quoting the argument:
`pip install 'apache-airflow[devel]'`
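The failure mode behind this commit can be demonstrated without pip at all: `[devel]` is a shell glob (match one of `d`, `e`, `v`, `l`), so bash will expand the unquoted argument if any file in the current directory happens to match, while zsh aborts outright with "no matches found". A minimal sketch, using a hypothetical matching filename:

```shell
# Work in a scratch directory so the demo is self-contained.
cd "$(mktemp -d)"
touch apache-airflowd          # hypothetical file that the glob matches

# Unquoted: the shell expands the bracket pattern before pip ever sees it.
echo apache-airflow[devel]     # bash/dash print: apache-airflowd

# Quoted: the extras spec reaches the command untouched.
echo 'apache-airflow[devel]'   # prints: apache-airflow[devel]
```

With no matching file, bash happens to pass the pattern through literally, which is why the unquoted form "usually works" — quoting makes the behavior shell-independent.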
zhongjiajie authored and ashb committed Mar 25, 2019
1 parent 1b3c54b commit 9a159ce
Showing 7 changed files with 85 additions and 83 deletions.
8 changes: 5 additions & 3 deletions CONTRIBUTING.md
@@ -85,7 +85,9 @@ The latest API documentation is usually available
you need to have set up an Airflow development environemnt (see below). Also
install the `doc` extra.

-pip install -e .[doc]
+```
+pip install -e '.[doc]'
+```

Generate the documentation by running:

@@ -112,7 +114,7 @@ There are three ways to setup an Apache Airflow development environment.
cd $AIRFLOW_HOME
virtualenv env
source env/bin/activate
-pip install -e .[devel]
+pip install -e '.[devel]'
```

2. Using a Docker container
@@ -126,7 +128,7 @@ There are three ways to setup an Apache Airflow development environment.
# Install Airflow with all the required dependencies,
# including the devel which will provide the development tools
-pip install -e ".[hdfs,hive,druid,devel]"
+pip install -e '.[hdfs,hive,druid,devel]'
# Init the database
airflow initdb
4 changes: 2 additions & 2 deletions airflow/contrib/example_dags/example_kubernetes_operator.py
@@ -25,7 +25,7 @@

try:
# Kubernetes is optional, so not available in vanilla Airflow
-# pip install apache-airflow[kubernetes]
+# pip install 'apache-airflow[kubernetes]'
from airflow.contrib.operators.kubernetes_pod_operator import KubernetesPodOperator

args = {
@@ -64,4 +64,4 @@
except ImportError as e:
log.warn("Could not import KubernetesPodOperator: " + str(e))
log.warn("Install kubernetes dependencies with: "
-" pip install apache-airflow[kubernetes]")
+" pip install 'apache-airflow[kubernetes]'")
2 changes: 1 addition & 1 deletion airflow/contrib/utils/sendgrid.py
@@ -42,7 +42,7 @@ def send_email(to, subject, html_content, files=None,
To use this plugin:
0. include sendgrid subpackage as part of your Airflow installation, e.g.,
-pip install apache-airflow[sendgrid]
+pip install 'apache-airflow[sendgrid]'
1. update [email] backend in airflow.cfg, i.e.,
[email]
email_backend = airflow.contrib.utils.sendgrid.send_email
4 changes: 2 additions & 2 deletions docs/howto/secure-connections.rst
@@ -27,8 +27,8 @@ If ``crypto`` package was not installed initially, it means that your Fernet key

You can still enable encryption for passwords within connections by following below steps:

-1. Install crypto package ``pip install apache-airflow[crypto]``
-2. Generate fernet_key, using this code snippet below. ``fernet_key`` must be a base64-encoded 32-byte key.
+#. Install crypto package ``pip install 'apache-airflow[crypto]'``
+#. Generate fernet_key, using this code snippet below. ``fernet_key`` must be a base64-encoded 32-byte key:

.. code:: python
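The snippet under the ``.. code:: python`` directive is cut off in this view. As a stdlib-only sketch (the ``cryptography`` package's ``Fernet.generate_key()`` produces a key of the same form), a valid ``fernet_key`` is just the urlsafe-base64 encoding of 32 random bytes:

```python
import base64
import os

# A Fernet key is the urlsafe-base64 encoding of 32 random bytes;
# this avoids the cryptography dependency purely for illustration.
fernet_key = base64.urlsafe_b64encode(os.urandom(32))
print(fernet_key.decode())  # paste this value into fernet_key in airflow.cfg
```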
2 changes: 1 addition & 1 deletion docs/howto/write-logs.rst
@@ -121,7 +121,7 @@ example:
remote_base_log_folder = gs://my-bucket/path/to/logs
remote_log_conn_id = MyGCSConn
-#. Install the ``gcp_api`` package first, like so: ``pip install apache-airflow[gcp_api]``.
+#. Install the ``gcp_api`` package first, like so: ``pip install 'apache-airflow[gcp_api]'``.
#. Make sure a Google Cloud Platform connection hook has been defined in Airflow. The hook should have read and write access to the Google Cloud Storage bucket defined above in ``remote_base_log_folder``.
#. Restart the Airflow webserver and scheduler, and trigger (or wait for) a new task execution.
#. Verify that logs are showing up for newly executed tasks in the bucket you've defined.
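For context, the settings shown in this hunk live in ``airflow.cfg``. A sketch of a complete remote-logging block, assuming Airflow 1.10-era configuration keys (the bucket path and connection id are the hypothetical values from the docs):

```
[core]
# Turn on remote logging and point it at a GCS bucket
remote_logging = True
remote_base_log_folder = gs://my-bucket/path/to/logs
remote_log_conn_id = MyGCSConn
```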
142 changes: 71 additions & 71 deletions docs/installation.rst

Large diffs are not rendered by default.

6 changes: 3 additions & 3 deletions docs/security.rst
@@ -262,7 +262,7 @@ To use kerberos authentication, you must install Airflow with the `kerberos` ext
.. code-block:: bash
-pip install airflow[kerberos]
+pip install 'apache-airflow[kerberos]'
OAuth Authentication
--------------------
@@ -295,7 +295,7 @@ To use GHE authentication, you must install Airflow with the `github_enterprise`
.. code-block:: bash
-pip install airflow[github_enterprise]
+pip install 'apache-airflow[github_enterprise]'
Setting up GHE Authentication
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
@@ -343,7 +343,7 @@ To use Google authentication, you must install Airflow with the `google_auth` ex
.. code-block:: bash
-pip install airflow[google_auth]
+pip install 'apache-airflow[google_auth]'
Setting up Google Authentication
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
