Add docs, exporting logging to storage permissions. #1905

Merged
Delete a metric:

.. doctest::

   >>> metric.exists()  # API call
   False


Export log entries using sinks
------------------------------

Sinks allow exporting entries which match a given filter to Cloud Storage
buckets, BigQuery datasets, or Cloud Pub/Sub topics.
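A sink's destination is identified by a string naming the service and the
resource. A minimal sketch of the three destination formats (the project,
bucket, dataset, and topic names below are placeholders, not values from this
document):

```python
# Placeholder resource names; substitute your own project, bucket,
# dataset, and topic.
project = 'my-project'

# Cloud Storage bucket destination.
storage_dest = 'storage.googleapis.com/my-bucket-name'

# BigQuery dataset destination.
bigquery_dest = (
    'bigquery.googleapis.com/projects/%s/datasets/my-dataset-name' % project)

# Cloud Pub/Sub topic destination.
pubsub_dest = (
    'pubsub.googleapis.com/projects/%s/topics/my-topic-name' % project)
```

Whichever destination is used, the matching permission step described below
must be completed before the sink will start writing entries.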

Export to Cloud Storage
~~~~~~~~~~~~~~~~~~~~~~~

Make sure that the storage bucket to which you want to export logs has
`[email protected]` as an owner. See `Set permission for writing exported logs`_.


Add `[email protected]` as the owner of `my-bucket-name`:

.. doctest::

   >>> from gcloud import storage
   >>> client = storage.Client()
   >>> bucket = client.get_bucket('my-bucket-name')  # API call
   >>> bucket.acl.reload()  # API call
   >>> logs_group = bucket.acl.group('[email protected]')
   >>> logs_group.grant_owner()
   >>> bucket.acl.add_entity(logs_group)
   >>> bucket.acl.save()  # API call

.. _Set permission for writing exported logs: https://cloud.google.com/logging/docs/export/configure_export#setting_product_name_short_permissions_for_writing_exported_logs

Export to BigQuery
~~~~~~~~~~~~~~~~~~

To export logs to BigQuery you must log into the Cloud Platform Console
and add `[email protected]` to a dataset.

See: `Setting permissions for BigQuery`_

.. doctest::

   >>> from gcloud import bigquery
   >>> from gcloud.bigquery.dataset import AccessGrant
   >>> bigquery_client = bigquery.Client()
   >>> dataset = bigquery_client.dataset('my-dataset-name')
   >>> dataset.create()  # API call
   >>> dataset.reload()  # API call
   >>> grants = dataset.access_grants
   >>> grants.append(AccessGrant(
   ...     'WRITER', 'groupByEmail', '[email protected]'))
   >>> dataset.access_grants = grants
   >>> dataset.update()  # API call

.. _Setting permissions for BigQuery: https://cloud.google.com/logging/docs/export/configure_export#manual-access-bq

Export to Pub/Sub
~~~~~~~~~~~~~~~~~

To export logs to Cloud Pub/Sub you must log into the Cloud Platform Console
and add `[email protected]` to a topic.

See: `Setting permissions for Pub/Sub`_

.. doctest::

   >>> from gcloud import pubsub
   >>> client = pubsub.Client()
   >>> topic = client.topic('your-topic-name')
   >>> policy = topic.get_iam_policy()  # API call
   >>> policy.owners.add(policy.group('[email protected]'))
   >>> topic.set_iam_policy(policy)  # API call

.. _Setting permissions for Pub/Sub: https://cloud.google.com/logging/docs/export/configure_export#manual-access-pubsub

Create a Cloud Storage sink:

.. doctest::