
RFC: Improvements for better performance, caching & dev experience #64

Open · Labels: enhancement (New feature or request)

beriberikix opened this issue Jul 12, 2021 · 10 comments

@beriberikix

I'd like to propose several improvements to the Dockerfile in order to make targeted images that are smaller & faster, leading to a better developer experience for maintainers and developers alike. I'm far from a Docker expert but have made my share of builder-type images, so hopefully we can get some extra input.

The biggest improvement would be breaking up into multistage builds. I'd suggest the following stages:

  • Base
  • CI
  • Development
  • Test
  • Docs

I'm roughly mapping to the different requirements.txt files and use cases. Not sure if a "compliance" stage should be created as well. I'm not fluent enough in all the dependencies, but we can be more efficient with what's downloaded, installed & cached. For example, Base would only install the minimal packages (e.g. install Python but not Renode), pip install requirements-base.txt, and so on.
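A rough sketch of what that split could look like (a sketch only; the base image, package set, and requirements file names are placeholders, not taken from the current Dockerfile):

```dockerfile
# Sketch only: stage names follow the list above; package and
# requirements file names are placeholders.
FROM ubuntu:20.04 AS base
RUN apt-get update && apt-get install -y --no-install-recommends \
        python3 python3-pip \
    && rm -rf /var/lib/apt/lists/*
COPY requirements-base.txt .
RUN pip3 install -r requirements-base.txt

FROM base AS ci
COPY requirements-ci.txt .
RUN pip3 install -r requirements-ci.txt

FROM base AS development
COPY requirements-dev.txt .
RUN pip3 install -r requirements-dev.txt

FROM base AS docs
COPY requirements-docs.txt .
RUN pip3 install -r requirements-docs.txt
```

Each variant is then built with `docker build --target <stage> .`, and stages that aren't requested never get built.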

Other ideas for further improvement:

  • COPY the pip cache directory from Base to save download time (pip install seems to take a while; see the cache-mount sketch after this list)
  • Do "proper" caching with Docker's BuildKit
  • Be smarter when it comes to caching layers (we'll see some benefits just from moving to multistage)
  • Move to a smaller base image like Alpine
  • Create our own "base image" using a Scratch container
  • Have the "child" stages like Development only include what we need (Scratch is useful for that too)
  • Release child stages as images on Docker Hub. These could be autogenerated based on Zephyr release or Release+SDK combo
  • Integrate zephyr#36324 (add project groups to upstream west manifest) as build ARGs to further reduce downloads
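For the pip caching and ARG ideas, a hedged sketch (requires BuildKit; the `PROJECT_GROUPS` argument and the group-filter wiring are hypothetical and depend on how zephyr#36324 lands):

```dockerfile
# syntax=docker/dockerfile:1
FROM base AS ci

# BuildKit cache mount: the pip download/wheel cache persists across
# builds, so unchanged packages are not re-downloaded every time.
RUN --mount=type=cache,target=/root/.cache/pip \
    pip3 install -r requirements.txt

# Hypothetical build arg: only fetch the west project groups a given
# image actually needs (depends on zephyr#36324).
ARG PROJECT_GROUPS=""
RUN west init /workdir \
    && cd /workdir \
    && west config manifest.group-filter -- "+${PROJECT_GROUPS}" \
    && west update
```

Built with `DOCKER_BUILDKIT=1 docker build --build-arg PROJECT_GROUPS=... --target ci .`; the cache mount alone should cut most of the repeated pip download time.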
@aunsbjerg

A thing to consider is to differentiate between Docker images needed for Zephyr development and Docker images needed for application development. For Zephyr development, it can make sense to have all toolchains bundled together in a single image, as I may want to test my changes on MCUs from different vendors. But for application development, where I usually only target a single MCU or MCU family, a bundled image is overkill and increases CI build times because a large image has to be downloaded.

@petejohanson

We've done a lot of this sort of work for ZMK in https://github.com/zmkfirmware/zmk-docker if you want to see some strategies we've taken.

We have separate build (for CI usage), and dev images with full tooling.

Also, we use a dedicated tag per target architecture, instead of a single tag with the full SDK.

@galak
Collaborator

galak commented Jul 13, 2021

> We've done a lot of this sort of work for ZMK in https://github.com/zmkfirmware/zmk-docker if you want to see some strategies we've taken.
>
> We have separate build (for CI usage), and dev images with full tooling.
>
> Also, we use a dedicated tag per target architecture, instead of a single tag with the full SDK.

Can you report any numbers on Docker image sizes for the different variants you have?

@petejohanson

> We've done a lot of this sort of work for ZMK in https://github.com/zmkfirmware/zmk-docker if you want to see some strategies we've taken.
>
> We have separate build (for CI usage), and dev images with full tooling.
>
> Also, we use a dedicated tag per target architecture, instead of a single tag with the full SDK.

> Can you report any numbers on Docker image sizes for the different variants you have?

You can see all the images (I misspoke; we actually use a different repository per target architecture, and tags for versions only) at https://hub.docker.com/u/zmkfirmware

For example, zmk-build-arm:2.5, which is enough for our CI usage, has a compressed size of 402.43 MB. The equivalent zmk-dev-arm:2.5 image, which includes extra tooling and is used for our VS Code container image, has a compressed size of 728.31 MB.

You can peek at the other build/dev images targeting other architectures, but they all end up roughly the same.
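For illustration (the image name comes from the Docker Hub listing above; the consuming Dockerfile is hypothetical), a downstream project that only targets ARM can pin the single-architecture image directly and avoid pulling a full multi-architecture SDK:

```dockerfile
# Hypothetical downstream CI image: based on the ARM-only build image
# (~402 MB compressed) rather than a full multi-architecture SDK image.
FROM zmkfirmware/zmk-build-arm:2.5
# ...project-specific build steps go here...
```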

@0Grit

0Grit commented Sep 2, 2021

I assume Ubuntu is used as the base out of convenience?

@petejohanson

> I assume Ubuntu is used as the base out of convenience?

Yeah, there were a few deps made easier by basing on Ubuntu, and it didn't grow our image size that drastically compared to Debian. I believe it was the CMake version, and something else that escapes me off the top of my head.

@0Grit

0Grit commented Sep 27, 2021

I'm ready to start some work against this issue. @galak @nashif any guidelines?

@0Grit

0Grit commented Oct 4, 2021

Bump @galak @nashif

@nashif
Member

nashif commented Oct 4, 2021

> The biggest improvement would be breaking up into multistage builds. I'd suggest the following stages:
>
>   • Base
>   • CI
>   • Development
>   • Test
>   • Docs

We already have 2 stages, CI and Developer. I can see how we can improve this with a Base image that could, for example, be used for building docs (right now we install doc packages in the doc actions directly).
Test is probably always going to be included in both CI and Development, so the layering could actually look like this:

Base -> Test -> CI
             -> Development
     -> Doc
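In Dockerfile terms, a minimal sketch of that layering (stage names illustrative):

```dockerfile
FROM ubuntu:20.04 AS base
# packages common to every image

FROM base AS test
# test tooling shared by CI and Development

FROM test AS ci
# CI-only additions

FROM test AS development
# interactive developer tooling

FROM base AS doc
# documentation build packages only
```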

makes sense?

@beriberikix
Copy link
Author

That makes sense to me! That would still allow someone to develop their own images using Base.

@stephanosio added the enhancement (New feature or request) label on Apr 8, 2022