Hit the ground running! From zero to production-ready Docker Conda image in just 20 minutes
You’re packaging your Conda-based Python application with Docker. It’s going to be running in production. And now you have a whole new set of worries:
- Insecure images will put your production data at risk.
- Slow CI builds will bottleneck your whole team, wasting expensive developer time.
- Silent crashes become that much harder to debug, wasting even more developer time.
- Large images slow deployments and increase your bandwidth costs.
You can’t just take any old Docker image and push it to production. You’re going to have to do some work.
Unfortunately, implementing Docker packaging best practices can take multiple days of development and debugging. There are no elegant abstractions when it comes to packaging—it’s just detail after detail after detail that you need to get right.
And getting all those details right takes time, time you don’t have.
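To give a sense of the kind of details involved, here is a hedged sketch of a multi-stage Conda Dockerfile. It is an illustration, not the template itself: the `myenv` environment name, the `main.py` entry point, and the unpinned `continuumio/miniconda3` base image are all placeholder assumptions you would replace, and in practice you would pin the base image to a specific tag or digest.

```dockerfile
# Build stage: create the Conda environment, assuming an
# environment.yml in the project root that names the env "myenv".
FROM continuumio/miniconda3 AS build
COPY environment.yml .
# Strip package caches after creating the env to keep the layer small.
RUN conda env create -f environment.yml && conda clean --all --yes

# Runtime stage: copy only the built environment, not the build debris.
FROM continuumio/miniconda3
# Run as a non-root user so a compromised process doesn't own the image.
RUN useradd --create-home appuser
COPY --from=build /opt/conda/envs /opt/conda/envs
USER appuser
COPY --chown=appuser . /home/appuser/app
WORKDIR /home/appuser/app
# `conda run --no-capture-output` activates the env and keeps stdout/stderr
# flowing to the container logs, so crashes aren't silent.
ENTRYPOINT ["conda", "run", "--no-capture-output", "-n", "myenv", "python", "main.py"]
```

And this sketch only touches a few of the details: image scanning, build caching in CI, signal handling, and health checks are all further work on top of it.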
Production-ready, the fast way
Your tools shouldn’t slow you down; your tools should make your team more productive.
With the right infrastructure, your team can spin up new applications faster. And when you deploy a new Docker image, you can be confident that you’re following operational best practices, confident that it’s built right, confident that it’s secure—and all running quickly so your team can be more productive.
But only if you have the right infrastructure.
To help you get there, and get there fast, I’m working on a best-practices template for packaging Python applications with Docker, specifically designed for Conda.
While it’s still a work in progress, the design is adapted from my existing pip-based template. Here’s one user’s experience with that product:
“The template worked great 🙂
The app using it is now live within the hospital’s ICU unit, and I’ll be using the same template to deploy an API (still in development) on the same infrastructure.”
– Nel Swanepol, University College London
Want to know when the Conda template for Docker is available? Sign up below.