docs: Update cloud upload docs #168

Merged Oct 28, 2024 (2 commits)

Changes from 1 commit

File filter

Filter by extension

Filter by extension

Conversations
Failed to load comments.
Loading
Jump to
Jump to file
Failed to load files.
Loading
Diff view
Diff view
3 changes: 1 addition & 2 deletions docs/source/overview.rst
@@ -159,8 +159,7 @@ loop that input file indefinitely.

If the ``-c`` option is given with a Google Cloud Storage URL, then an
additional node called ``CloudNode`` is added after ``PackagerNode``. It runs a
-thread which watches the output of the packager and pushes updated files to the
-cloud.
+local webserver which takes the output of the packager and pushes it to cloud storage.

The pipeline and the nodes in it are constructed by ``ControllerNode`` based on
your config files. If you want to write your own front-end or interface
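
For illustration, a run that routes output through ``CloudNode`` might look
like this. This is a sketch only: the bucket path and config file names are
placeholders, and the ``-i``/``-p`` config flags are assumed from the
project's usual invocation.

.. code:: sh

   shaka-streamer -i input_config.yaml -p pipeline_config.yaml \
       -c gs://my-bucket/folder/
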
30 changes: 24 additions & 6 deletions docs/source/prerequisites.rst
@@ -119,21 +119,29 @@ Cloud Storage (optional)
------------------------

Shaka Streamer can push content directly to a Google Cloud Storage or Amazon S3
-bucket. To use this feature, the Google Cloud SDK is required.
+bucket. To use this feature, additional Python modules are required.

-See https://cloud.google.com/sdk/install for details on installing the Google
-Cloud SDK on your platform.


Google Cloud Storage
~~~~~~~~~~~~~~~~~~~~

-If you haven’t already, you will need to initialize your gcloud environment and
-log in through your browser.
+First install the Python module if you haven't yet:

+.. code:: sh

+   python3 -m pip install google-cloud-storage

+To use the default authentication, you will need default application
+credentials installed. On Linux, these live in
+``~/.config/gcloud/application_default_credentials.json``.

+The easiest way to install default credentials is through the Google Cloud SDK.
+See https://cloud.google.com/sdk/docs/install-sdk to install the SDK. Then run:

.. code:: sh

-   gcloud init
+   gcloud auth application-default login

Follow the instructions given to you by gcloud to initialize the environment
and login.
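
As a quick check that default credentials are in place (assuming the SDK is
installed), you can ask gcloud to mint an access token from them:

.. code:: sh

   gcloud auth application-default print-access-token

Alternatively, the standard ``GOOGLE_APPLICATION_CREDENTIALS`` environment
variable can point to a service account key file if you prefer not to use
the SDK.
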
@@ -142,9 +150,19 @@
Amazon S3
~~~~~~~~~

+First install the Python module if you haven't yet:

+.. code:: sh

+   python3 -m pip install boto3

To authenticate to Amazon S3, you can either add credentials to your `boto
config file`_ or login interactively using the `AWS CLI`_.

+.. code:: sh

+   aws configure
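
Once credentials are configured, an S3 upload run should mirror the Google
Cloud Storage flow. Assuming the same ``-c`` option also accepts ``s3://``
URLs (the bucket and config names below are placeholders):

.. code:: sh

   shaka-streamer -i input_config.yaml -p pipeline_config.yaml \
       -c s3://my-bucket/folder/
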


Test Dependencies (optional)
----------------------------
3 changes: 1 addition & 2 deletions shaka-streamer
@@ -68,8 +68,7 @@ def main():
  parser.add_argument('-o', '--output',
                      default='output_files',
                      help='The output folder to write files to, or an HTTP ' +
-                          'or HTTPS URL where files will be PUT.' +
-                          'Used even if uploading to cloud storage.')
+                          'or HTTPS URL where files will be PUT.')
  parser.add_argument('--skip-deps-check',
                      action='store_true',
                      help='Skip checks for dependencies and their versions. ' +
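
For context, the ``-o`` flag described by this help text might be pointed at
an HTTP server that accepts PUT requests; this is a sketch with a placeholder
URL and assumed ``-i``/``-p`` config flags:

.. code:: sh

   shaka-streamer -i input_config.yaml -p pipeline_config.yaml \
       -o https://example.com/live/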