Community Note
Please vote on this issue by adding a 👍 reaction to the original issue to help the community and maintainers prioritize this request.
Please do not leave "+1" or "me too" comments; they generate extra noise for issue followers and do not help prioritize the request.
If you are interested in working on this issue or have submitted a pull request, please leave a comment.
Tell us about your request
Tangentially related to #69: for testing and debugging, it would be nice if Conveyor could run containers locally with the credentials associated with the task's role.
Tell us about the problem you're trying to solve. What are you trying to do, and why is it hard?
Testing and debugging a pipeline is quite slow and frustrating. conveyor run is faster than conveyor build && conveyor deploy since it avoids UI interaction, but it still involves:
building the container image
pushing the container image
waiting for Airflow to schedule the task
Additionally, conveyor run requires that whatever you run is associated with a task inside a DAG; it cannot be an arbitrary command from the container.
For short/simple tasks that run in our test environment and write data to the test bucket, it would be perfectly OK to run and iterate on them locally. That only requires building and running the image, cutting out the push, the DAG, and Airflow, which is much faster. However, the container still needs the right credentials to access data.
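As a minimal sketch of the loop we have in mind (the image tag, script name, and arguments below are hypothetical placeholders, not Conveyor commands):

```sh
# Build the task image locally from the project's existing Dockerfile
docker build -t my-project:dev .

# Run an arbitrary command from the image, skipping push, DAG, and Airflow entirely
docker run --rm my-project:dev python my_task.py --env test
```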
Are you currently working around this issue?
We can request local credentials as detailed in #69 and inject them into the container with docker run -e. A conveyor run --local could make this much cleaner.
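For example, assuming the local credentials are temporary AWS credentials for the task's role (e.g. obtained via aws sts assume-role; the role ARN and image tag below are placeholders), the workaround looks roughly like this:

```sh
# Assume the task's role to obtain temporary credentials (role ARN is a placeholder)
CREDS=$(aws sts assume-role \
  --role-arn arn:aws:iam::123456789012:role/my-task-role \
  --role-session-name local-debug \
  --query Credentials --output json)

# Inject the credentials into the container by hand
docker run --rm \
  -e AWS_ACCESS_KEY_ID="$(echo "$CREDS" | jq -r .AccessKeyId)" \
  -e AWS_SECRET_ACCESS_KEY="$(echo "$CREDS" | jq -r .SecretAccessKey)" \
  -e AWS_SESSION_TOKEN="$(echo "$CREDS" | jq -r .SessionToken)" \
  my-project:dev python my_task.py --env test
```

A conveyor run --local could wrap exactly this: build the image, fetch credentials for the task's role, and run the container with them injected.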
Additional context
Of course, running locally will not work for all tasks (e.g. tasks that require access to on-prem systems behind a firewall), but it would be helpful for a large subset of tasks. It would also be handy if local runs could be started from within Conveyor IDEs.
Attachments