
devcontainer arm inclusion #1685

Closed · 1 task done

ryanmerolle opened this issue Apr 6, 2022 · 8 comments
Assignees
Labels
state: stale type: enhancement New feature or request

Comments

@ryanmerolle (Contributor)

Enhancement summary

Given the new ARM-based Apple silicon Macs, it would be ideal to support development on those systems in order to further lower barriers to contributing. Currently, Apple silicon machines can run amd64 Docker images through the Rosetta 2 translation layer, but it's quite slow and resource intensive.

I recently submitted a feature request against the docker-avd-base repo to introduce an arm64 image for Apple silicon users.

Which component of AVD is impacted

others

Use case example

Lower barriers to contributing.

Describe the solution you would like

We can either:

  • Push the above issue forward with a PR adding arm64 support, and publish to the Docker repo. (Preferred, since this would let users leverage pre-built images in lock step with the amd64 images.)
  • Point the current devcontainer at a Dockerfile rather than a pre-built Docker image, so that the host builds the container locally.
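For the second option, the change to the devcontainer configuration could look roughly like the following. This is a minimal sketch: the commented-out image name and the Dockerfile path are illustrative assumptions, not the project's actual configuration.

```json
// .devcontainer/devcontainer.json (sketch; names and paths are illustrative)
{
    // Instead of pulling a pre-built amd64 image:
    // "image": "avdteam/base:latest",
    // build locally, so the host produces a native (e.g. arm64) image:
    "build": {
        "dockerfile": "Dockerfile",
        "context": "."
    }
}
```

With a `build` block, VS Code builds the container on the contributor's machine, so the resulting image matches the host architecture automatically.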

Describe alternatives you have considered

Keep the current model, which is a huge resource hog for Apple silicon devices.

Additional context

No response

Contributing Guide

  • I agree to follow this project's Code of Conduct
@ryanmerolle ryanmerolle added the type: enhancement New feature or request label Apr 6, 2022
@ClausHolbechArista (Contributor)

I would prefer the second approach: if the devcontainer can build from a Dockerfile on the fly, that would be my preference. This way any requirement changes could be consumed directly.
We have some ideas about deprecating the base container in favor of an execution environment / ansible-builder, so this would help us move in that direction.

@ryanmerolle (Contributor, Author)

ryanmerolle commented Apr 7, 2022

I will draft up a PR for the team to review and discuss given said approach.

Please assign this issue to me.

@ankudinov (Contributor)

I'd suggest parking this ticket for a while. Using a local Dockerfile is acceptable for a devcontainer, but it still has some disadvantages:

  • The build may fail if there is a problem with the Docker setup.
  • The environment may change if not all requirement versions are pinned.
  • Most important: there will be one more Dockerfile to maintain, and we already have base and all-in-one containers that we want to "converge" to simplify maintenance.

The solution to this problem is simply building the base container as multi-arch. The parent Python image supports ARM, so it should not be a big problem.

But first I'd focus on simplifying the existing containers and reviewing the execution environment / ansible-builder concept.
We have a lot to do there.

M1s are not very common yet, and we still have a few months to work on this. Meanwhile I'd suggest publishing the updated Dockerfile, devcontainer.json, etc. somewhere on https://github.com/arista-netdevops-community with a description of the workaround and make files to build and run the container.
And maybe updating the comments in the existing devcontainer.json to explain that ARM is not yet supported and where to get the workaround files.

Feel free to comment and agree or disagree.
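The multi-arch build suggested above could be sketched with `docker buildx`. The builder name, image name, and tag below are purely illustrative assumptions, not the project's actual CI setup.

```shell
# Create (or switch to) a builder capable of emitting multi-platform images.
# The name "multiarch" is an illustrative assumption.
docker buildx create --name multiarch --use

# Build the base image for amd64 and arm64 in one invocation and push it,
# so both architectures share a single tag (image name is illustrative).
docker buildx build \
  --platform linux/amd64,linux/arm64 \
  -t example/avd-base:latest \
  --push .
```

With a manifest list published this way, Apple silicon hosts pull the arm64 variant automatically, while existing amd64 users are unaffected.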

@ryanmerolle (Contributor, Author)


I am pretty much in full agreement with the comment above, for all the reasons stated.

Unfortunately for me, I jumped on the Apple silicon bandwagon and would love something more mainstream now, but your point about it not yet being mainstream is right on. 😄

@ankudinov (Contributor)

OK. In that case we'll hold this, but we're not going to close it, as we definitely have to address it later.

@ryanmerolle (Contributor, Author)

Keep me in the loop, either here or in the avd base repo; I am happy to contribute as always, and also to test with my MacBook.

@jonxstill (Contributor)

FWIW, I'm also an M1 user, and for the moment I've been building my own arm64 containers using the same Dockerfiles as the Intel releases. The mkdocs containers in particular are unusable under x86 binary emulation (it takes more than 10 minutes from launch until port 8000 becomes available for browsing documentation).

@github-actions

This issue is stale because it has been open 90 days with no activity. Remove the stale label or comment, or this will be closed in 15 days.

4 participants