
signed urls do not work in Cloud Run with django-storages #941

Open · sww314 opened this issue Oct 2, 2020 · 11 comments · May be fixed by #1427
Comments

@sww314 (Contributor) commented Oct 2, 2020

When using Cloud Run, the default credentials do not allow signing of URLs.
The error is swallowed in most use cases, and the file field just returns None in serializers or in the Django admin.

The error is confusing because everything works fine on the upload side.
Even worse, if you use the same service account and run your container locally, it works fine, since the credential is provided in a different manner.

To recreate:

  • Set up a Django project in Cloud Run with media objects stored using GCS.
  • Create a file in the Django admin (the file field comes back as None, but the file is in the GCS bucket).
  • On editing a file in the Django admin, the following error is displayed:

Exception Type: AttributeError
you need a private key to sign credentials.the credentials you are currently using <class 'google.auth.compute_engine.credentials.Credentials'> just contains a token. see https://googleapis.dev/python/google-api-core/latest/auth.html#setting-up-a-service-account for more details.
/usr/local/lib/python3.8/site-packages/google/cloud/storage/_signing.py, line 51, in ensure_signed_credentials

I am still trying to figure out the best workaround, but I wanted to add this in case anyone else runs into the error.
The fix may be a documentation update or a change to not swallow the error.
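
For reference, here is a minimal sketch (not from the original report; the bucket and object names are hypothetical) that reproduces the signing failure outside of Django, assuming it runs on Cloud Run with default credentials:

import datetime

import google.auth
from google.cloud import storage

# On Cloud Run, google.auth.default() yields compute_engine Credentials,
# which carry only an access token and no private key to sign with.
credentials, project = google.auth.default()
client = storage.Client(project=project, credentials=credentials)
blob = client.bucket("my-media-bucket").blob("uploads/example.txt")

# Raises AttributeError ("you need a private key to sign credentials. ...")
# from google.cloud.storage._signing.ensure_signed_credentials.
blob.generate_signed_url(expiration=datetime.timedelta(hours=1))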

@Prikers commented Oct 12, 2020

Just ran into the very same issue with Google App Engine (standard). It took me a while as well to understand the error. Uploading a file works just fine, so I thought it could not be a credential issue, but retrieving the file causes the error (at URL-signing invocation).

@sww314 did you find a nice workaround / hack for this issue? Could I store the credentials in a file in Cloud Storage and add an environment variable to app.yaml pointing to that file?

@sww314 (Contributor, Author) commented Dec 29, 2020

@Prikers
I ended up storing the credentials in my container.
I think this solution may work as well, but I have not had a chance to try it yet:
https://stackoverflow.com/questions/64234214/how-to-generate-a-blob-signed-url-in-google-cloud-run

@robcharlwood

➕ 1️⃣ - I think we are experiencing the same issue.

@danfairs

Our workaround is as follows:

import datetime

from django.core.cache import cache
from django.utils.deconstruct import deconstructible

import google.auth
import google.auth.compute_engine
import google.auth.transport.requests
# We'll take it on the chin if this moves
from google.cloud.storage.blob import _quote
from storages.backends.gcloud import GoogleCloudStorage
from storages.utils import clean_name


@deconstructible
class GoogleCloudStorageAccessToken(GoogleCloudStorage):
    CACHE_KEY = "GoogleCloudStorageAccessToken.signing_extras"

    def url(self, name):
        """
        Return a public URL or a signed URL for the Blob.
        This DOES NOT check for the existence of the Blob - that makes code too
        slow for many use cases.

        We override this to provide extra information to URL signing, so we don't need
        a private key available. This is a workaround for
        https://github.com/jschneier/django-storages/issues/941.
        """
        name = self._normalize_name(clean_name(name))
        blob = self.bucket.blob(name)
        no_signed_url = self.default_acl == "publicRead" or not self.querystring_auth

        if not self.custom_endpoint and no_signed_url:
            return blob.public_url
        elif no_signed_url:
            return "{storage_base_url}/{quoted_name}".format(
                storage_base_url=self.custom_endpoint,
                quoted_name=_quote(name, safe=b"/~"),
            )
        elif not self.custom_endpoint:
            return blob.generate_signed_url(self.expiration, **self.signed_url_extra())
        else:
            return blob.generate_signed_url(
                expiration=self.expiration,
                api_access_endpoint=self.custom_endpoint,
                **self.signed_url_extra()
            )

    def signed_url_extra(self):
        value = cache.get(self.CACHE_KEY)
        if value is not None:
            expiry, extra = value
            if expiry > datetime.datetime.utcnow():
                return extra

        credentials, project_id = google.auth.default()
        auth_req = google.auth.transport.requests.Request()
        credentials.refresh(auth_req)
        extra = {
            "service_account_email": credentials.service_account_email,
            "access_token": credentials.token,
            "credentials": credentials,
        }

        cache.set(self.CACHE_KEY, (credentials.expiry, extra))
        return extra

You should obviously then use this class as a replacement for the GoogleCloudStorage base class (we set up storages explicitly on a case-by-case basis, so this is fairly straightforward for us).
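
For example, something like this in settings.py (the module path myproject.storages is hypothetical):

# settings.py - point Django's default storage at the subclass above
DEFAULT_FILE_STORAGE = "myproject.storages.GoogleCloudStorageAccessToken"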

@RW21 commented Jul 8, 2021

Thanks @danfairs for sharing! 👍
I had to set a scope to obtain a token. Getting tokens doesn't work with an empty scope.

credentials, project_id = google.auth.default(
    scopes=['https://www.googleapis.com/auth/cloud-platform']
)

Also, caching (pickling) Google's credentials object was failing, so I had to remove that as well.
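
Putting both adjustments together, a sketch of a modified signed_url_extra (my reading of the changes above, not RW21's exact code) could look like this; only plain strings are cached, so nothing needs to pickle the credentials object:

    def signed_url_extra(self):
        value = cache.get(self.CACHE_KEY)
        if value is not None:
            expiry, extra = value
            if expiry > datetime.datetime.utcnow():
                return extra

        # Explicit scope: fetching a token fails with an empty scope.
        credentials, project_id = google.auth.default(
            scopes=['https://www.googleapis.com/auth/cloud-platform']
        )
        auth_req = google.auth.transport.requests.Request()
        credentials.refresh(auth_req)

        # Cache only plain strings; the credentials object itself is not picklable,
        # and generate_signed_url can sign via the IAM API with just these two values.
        extra = {
            "service_account_email": credentials.service_account_email,
            "access_token": credentials.token,
        }
        cache.set(self.CACHE_KEY, (credentials.expiry, extra))
        return extra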

@therefromhere

I know the OP wanted to use signed URLs; just a note that we hit this issue accidentally because we were missing GS_DEFAULT_ACL from our settings.py, so we were accidentally enabling signed URLs:

GS_DEFAULT_ACL = 'publicRead'

@bpicolo commented May 1, 2022

GS_QUERYSTRING_AUTH may be useful here if you're looking for public access
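
For example, a sketch of the public-access configuration (assuming the bucket's objects are meant to be world-readable):

# settings.py - serve media via plain public URLs instead of signed ones
GS_QUERYSTRING_AUTH = False
GS_DEFAULT_ACL = 'publicRead'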

@tanin-t commented Jun 4, 2022

Thanks @danfairs and @RW21.

I also had to update the service account's permissions in the GCP Console (https://console.cloud.google.com); a gcloud equivalent is sketched after the error below.

  • Go to IAM & Admin -> IAM
  • Edit the principal (pencil icon) of the Cloud Run service account
  • Add the role Service Account Token Creator

Otherwise you will get this error:

google.auth.exceptions.TransportError: Error calling the IAM signBytes API: b'{\n  "error": {\n    "code": 403,\n    "message": "The caller does not have permission",\n    "status": "PERMISSION_DENIED"\n  }\n}\n'
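
For reference, a gcloud equivalent of the console steps above (a sketch; PROJECT_ID and SERVICE_ACCOUNT_EMAIL are placeholders for your own values):

gcloud projects add-iam-policy-binding PROJECT_ID \
  --member="serviceAccount:SERVICE_ACCOUNT_EMAIL" \
  --role="roles/iam.serviceAccountTokenCreator"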

@raad-altaie

After a long day of troubleshooting, here is how I got around it.
I added the snippet below, as mentioned in the docs, to the settings file.

from google.oauth2 import service_account
GS_CREDENTIALS = service_account.Credentials.from_service_account_file(
    "/SECRETS/SERVICE_ACCOUNT"
)

Then I added the service account JSON file to Google Secret Manager, mounted it into Cloud Run as shown below, and now it's working like a charm.

[Screenshot: mounting the Secret Manager secret as a volume in the Cloud Run service]
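
A command-line equivalent of the mount shown above might look like this (a sketch; SERVICE, REGION, and the secret name my-service-account-key are placeholders, and the mount path must match the path used in GS_CREDENTIALS):

gcloud run services update SERVICE \
  --region REGION \
  --update-secrets=/SECRETS/SERVICE_ACCOUNT=my-service-account-key:latest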

@orkenstein

Needs to be updated to support Workload Identity on k8s

@Zwiqler94

Any news on this?
