No HTTP proxy for connector download and no authentication method #2787
Comments
Sounds like a short-term solution would be to include the most common AF and KF components in the image, or to pre-download/upload the packages to a location in the whitelist. Long term, the functionality required to work with the OpenShift cluster-level proxies (e.g. username and password) would need to be tooled in. I haven't much experience yet using the cluster-level proxy functionality in OpenShift. Looking at the link provided, would another short-term solution be to tell the proxy to add a rule allowing bypass to the required URLs (e.g. files.pythonhosted.org)? Perhaps some bigger companies have an internal mirror for packages and can use that?
Yes, that would be great in my opinion. I am not sure how the ODH project image updates and the people working on that project coordinate with you, as in opendatahub-io/odh-manifests#546, where a new Docker image was integrated into ODH.
Yes, something like environment variables for http_proxy and https_proxy that one can pass into the container.
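For illustration, a minimal sketch (not Elyra's actual implementation) of how a download made with the Python requests library would pick up such variables; the wheel URL and filename are placeholders:

```python
import requests

# requests reads http_proxy/https_proxy/no_proxy from the environment by
# default (trust_env=True on a Session), so a proxy injected into the
# container via HTTP_PROXY/HTTPS_PROXY is honored without code changes.
session = requests.Session()

# Placeholder wheel URL for illustration only.
response = session.get(
    "https://files.pythonhosted.org/packages/example/example-1.0-py3-none-any.whl",
    timeout=30,
)
response.raise_for_status()

with open("example-1.0-py3-none-any.whl", "wb") as whl:
    whl.write(response.content)
```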
Not an option in our case; we work only with Docker images in enterprise-internal registries or with enterprise-internal package repositories like Artifactory. Those internal domains are then included in the noProxy section of the OpenShift cluster config.
That is what I did now: I uploaded the wheel file in question to a repository in our internal Artifactory. However, I believe I am not the only one faced with this issue; Chief Information Security Officers insist on some sort of authentication and non-anonymous access, either via Bearer Token or Basic Auth (https://www.jfrog.com/confluence/display/JFROG/Artifactory+REST+API). @kiersten-stokes Is it possible to include Bearer Token and/or Basic Auth functionality in the Airflow package catalog connector, at elyra/elyra/pipeline/airflow/package_catalog_connector/airflow_package_catalog_connector.py, line 86 (commit 66e4009)?
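For reference, a hedged sketch of what the two authentication modes mentioned above could look like with the Python requests library; the repository URL, username, and token are placeholders, not Elyra's API:

```python
import requests
from requests.auth import HTTPBasicAuth

# Placeholder URL for a wheel hosted in an internal Artifactory repository.
WHEEL_URL = "https://artifactory.example.com/artifactory/pypi-local/example-1.0-py3-none-any.whl"

# Option 1: HTTP Basic Auth.
response = requests.get(WHEEL_URL, auth=HTTPBasicAuth("user", "secret"), timeout=30)
response.raise_for_status()

# Option 2: Bearer token, e.g. an Artifactory access token.
response = requests.get(
    WHEEL_URL,
    headers={"Authorization": "Bearer <access-token>"},
    timeout=30,
)
response.raise_for_status()
```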
In your article, in the section "Airflow Package Catalog Connector", you mention: "Lastly, you'll need to configure the Airflow package download URL. The URL must meet a few constraints: […]"
In some sensitive enterprise environments, that (anonymous, no-authentication access) is not feasible, even when using an internal package repository like Artifactory.
I don't believe we should do this for the following reasons:
Understood. Adding optional basic authentication should not be a problem.
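A minimal sketch of how "optional" could look on the download side, assuming a requests-based helper; the function name and signature are hypothetical, not the connector's actual code:

```python
from typing import Optional

import requests
from requests.auth import HTTPBasicAuth


def download_wheel(url: str,
                   username: Optional[str] = None,
                   password: Optional[str] = None) -> bytes:
    """Fetch a wheel file, authenticating only when credentials were supplied."""
    auth = HTTPBasicAuth(username, password) if username else None
    response = requests.get(url, auth=auth, timeout=30)
    response.raise_for_status()
    return response.content
```

With this pattern, anonymous downloads keep working unchanged, since auth stays None whenever no credentials are configured.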
@ptitzler @akchinSTC Source: https://access.redhat.com/solutions/5251461. So, in any case, best practice would be to have optional environment variables HTTP_PROXY and HTTPS_PROXY that can be passed to the deployment's env section, e.g. via ConfigMaps on OpenShift in our case. On your side, I believe the relevant section is here, and in ODH in the past.
Basic auth support is now there and working; closing the issue. Private PKI CA-bundle trust is handled in #2787. HTTP-proxy support will be handled later, in order not to overload this issue.
https://medium.com/ibm-data-ai/getting-started-with-apache-airflow-operators-in-elyra-aae882f80c4a
On the face of it, the component catalog feature is great, though I do not understand why common Airflow and Kubeflow Pipelines components are not included by default in, e.g., the Red Hat OperatorHub Elyra image.
opendatahub-io/odh-manifests#546
In most enterprise environments, such as on OpenShift, cluster-level HTTP and HTTPS proxies are often involved:
https://docs.openshift.com/container-platform/4.8/networking/enable-cluster-wide-proxy.html
I find no way to integrate Apache Airflow operator package wheel files into the catalog via a download URL when such proxies are in place.
For the GitLab plugin, I was able to do it via the command line.