Pipeline UI unable to access artifacts stored in external cloud storage #2627
Comments
@eterna2 Does that part work for you? |
This is the metadata I am returning:
Here is the code I am using for the lightweight component. I am able to access the confusion matrix data from S3 inside the component, but the UI cannot. |
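For context, the UI metadata emitted by such a component points the Pipelines UI at the artifact via a `source` field, which is why the UI pod needs its own S3 credentials even when the component can already read the data. A rough sketch of what that metadata might look like (shown as YAML for readability; the actual artifact is typically JSON, and the bucket path is a placeholder):

```yaml
# Sketch of mlpipeline-ui-metadata for an S3-backed confusion matrix.
# Field names follow the KFP output-viewer convention; bucket and key
# are placeholders.
outputs:
  - type: confusion_matrix
    format: csv
    schema:
      - {name: target, type: CATEGORY}
      - {name: predicted, type: CATEGORY}
      - {name: count, type: NUMBER}
    # The UI server (ml-pipeline-ui pod) fetches this object itself,
    # so it needs S3 credentials of its own.
    source: s3://my-bucket/path/to/confusion_matrix.csv
```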
I cannot test on AWS, but the UI pod already uses the standard env vars to get AWS credentials: https://github.com/kubeflow/pipelines/blob/master/frontend/server/server.ts#L45. What's your use case? Is it enough if everyone shares one set of credentials? You can edit the UI deployment and add the extra env vars needed for authentication. We don't yet have a plan to provide a first-party solution. |
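A minimal sketch of such a patch, assuming the credentials live in a Kubernetes Secret (here called `aws-secret`, an illustrative name) with `AWS_ACCESS_KEY_ID` and `AWS_SECRET_ACCESS_KEY` keys:

```yaml
# Strategic-merge patch for the ml-pipeline-ui Deployment (sketch only;
# secret name and namespace are assumptions, adjust to your install).
apiVersion: apps/v1
kind: Deployment
metadata:
  name: ml-pipeline-ui
  namespace: kubeflow
spec:
  template:
    spec:
      containers:
        - name: ml-pipeline-ui
          env:
            - name: AWS_ACCESS_KEY_ID
              valueFrom:
                secretKeyRef:
                  name: aws-secret
                  key: AWS_ACCESS_KEY_ID
            - name: AWS_SECRET_ACCESS_KEY
              valueFrom:
                secretKeyRef:
                  name: aws-secret
                  key: AWS_SECRET_ACCESS_KEY
```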
Yes, it works for me. But you need to either pass the AWS API keys to the pipeline UI pod (as Bobgy mentioned) or, in my case, assign an IAM role to the pod. For TensorBoard, you need to create a ConfigMap with a modified pod template (to set either the AWS API keys or the IAM role), mount it to the UI pod, and point to it with the env var. |
I will try to do a kustomize overlay for both the AWS API key variant and the IAM variant. |
@pavankumarboinapalli NOTE that this assumes you are using kube2iam or some equivalent. Alternatively, you can just add the AWS access key into the env vars directly (or from a secret). This is a standard k8s podTemplateSpec.
|
Sorry, I was using kustomize.
|
You can look at #2633 on how to configure the tensorboard podTemplateSpec, for both access key and IAM access. |
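Roughly, the viewer side works like this: a pod template is stored in a ConfigMap, mounted into the UI pod, and the UI is pointed at that file through an env var (in recent versions this appears to be `VIEWER_TENSORBOARD_POD_TEMPLATE_SPEC_PATH`, but check your deployment). A sketch of such a template covering both variants; the role annotation and secret names are illustrative:

```yaml
# Sketch of a TensorBoard viewer podTemplateSpec; the exact layout can
# differ between KFP versions, and all names here are illustrative.
metadata:
  annotations:
    iam.amazonaws.com/role: my-tensorboard-role   # IAM variant (kube2iam-style)
spec:
  containers:
    - env:                                        # access-key variant
        - name: AWS_ACCESS_KEY_ID
          valueFrom:
            secretKeyRef:
              name: aws-secret
              key: AWS_ACCESS_KEY_ID
        - name: AWS_SECRET_ACCESS_KEY
          valueFrom:
            secretKeyRef:
              name: aws-secret
              key: AWS_SECRET_ACCESS_KEY
```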
It seems the questions are already answered. |
@Bobgy: Closing this issue. In response to this:
Instructions for interacting with me using PR comments are available here. If you have questions or suggestions related to my behavior, please file an issue against the kubernetes/test-infra repository. |
@pavankumarboinapalli @eterna2 I am facing a similar issue. The Pipeline UI gets a 500 error (access denied) when it tries to read the confusion matrix data from S3. |
@eterna2 @pavankumarboinapalli @Ark-kun I still get a 500 error (access denied) when the Pipeline UI tries to fetch my source for confusion_matrix from the S3 location. What could be causing this? Containers: ml-pipeline-ui:
|
hi @Shasvat0601 You can look at https://github.com/e2fyi/kubeflow-aws/tree/master/pipelines for a manifest that supports using S3 as the backend instead of MinIO. Specifically, you should look at https://github.com/e2fyi/kubeflow-aws/blob/3220922250ccb9b6207c8fb4fe60db3669ff0508/pipelines/overlay/accesskey/aws-configurations-patch.yaml#L40 for how to mount an AWS credential secret. The link above demonstrates how to mount the credentials as env variables. Volume-mounting the secret will not work, as we are not using the official AWS client (we are using minio). In any case, even if you are volume-mounting, you should ensure the secret is located in the right place and has the same layout as the AWS credentials file. |
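If you wire this up with kustomize, the overlay essentially just layers a patch like that onto the base Pipelines manifests. A rough sketch with placeholder paths and file names:

```yaml
# kustomization.yaml for an overlay that injects AWS credentials into the
# Pipelines manifests (paths and file names are placeholders).
apiVersion: kustomize.config.k8s.io/v1beta1
kind: Kustomization
namespace: kubeflow
resources:
  - ../../base                       # base Kubeflow Pipelines manifests
patchesStrategicMerge:
  - aws-configurations-patch.yaml    # e.g. the env-var patch sketched above
```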
hi @eterna2 |
Hello,
I am using lightweight components as shown in this example: lightweight_component. But my source for confusion_matrix is coming from an S3 location. I can specify use_aws_secret from kfp.aws to access data at s3://xxx inside the components, but the Pipeline UI is not able to read data from S3. How do I provide the AWS secrets to the Pipeline UI?