This repository has been archived by the owner on Oct 16, 2024. It is now read-only.

Merge pull request #137 from iterative/cases/cicd-mlem
cases: copy edit CI/CD for ML
jorgeorpinel authored Jul 21, 2022
2 parents 8eb037f + 69a361e commit 37d8661
Showing 1 changed file (content/docs/use-cases/cicd.md) with 22 additions and 25 deletions.
Applying DevOps methodologies to machine learning (MLOps) and data management
(DataOps) is increasingly common. This means resource orchestration
(provisioning servers for model training), model testing (validating model
inference), and model deployment, as well as monitoring and feedback. MLEM
provides a simple way to publish or deploy your machine learning models with
CI/CD pipelines.

- **Packaging and publishing models**: A common need is to wrap your ML model
in a specific format and publish it to a registry. Examples include turning
your ML model into a Python package and publishing it on PyPI, building a
Docker image and pushing it to Docker Hub, or exporting your model to ONNX
and publishing it as an artifact to Artifactory.

- **Deploying models**: Another common scenario is deploying a model from
within a CI/CD pipeline. For this, MLEM includes a number of ready-to-use
integrations with popular deployment platforms.

## Package and publish an ML model

To trigger the publishing or deploying of a new version, you usually create a
Git tag that kicks off the CI process. To make this build process consistent
with future deployment, you can create and commit an MLEM declaration:

```cli
$ mlem declare builder pip -c package_name=mypackagename -c target=package build-to-pip
💾 Saving builder to build-to-pip.mlem
```
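The `build-to-pip.mlem` file created above is a small YAML declaration that
you commit alongside your code, keeping the build configuration versioned and
reviewable. Its contents presumably look something like the sketch below; the
field names are assumptions inferred from the `mlem declare` flags, not copied
from MLEM's docs:

```yaml
# Hypothetical shape of build-to-pip.mlem, inferred from the flags above.
# Check the generated file in your repository for the exact fields.
object_type: builder
type: pip
package_name: mypackagename
target: package
```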

And then use that declaration in the CI job (e.g. with GitHub Actions):

```yaml
# .github/workflows/publish.yml
jobs:
  # ...
sh upload_to_pypi.sh package
```
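The workflow above is shown truncated. The tag trigger described earlier is
configured in the workflow's `on:` section; a minimal illustrative snippet
(the `v*` pattern is an assumption, so use whatever tag convention your
project follows):

```yaml
# Run this workflow whenever a version tag like v1.2.3 is pushed.
on:
  push:
    tags:
      - 'v*'
```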
Learn more about building (packaging) ML models
[here](/doc/get-started/building).

## Deploy an ML model

The deployment scenario is similar. First, create environment and deployment
declarations and commit them to Git:

```cli
$ mlem declare env heroku staging
💾 Saving env to staging.mlem
$ mlem declare deployment heroku myservice -c app_name=mlem-deployed-in-ci -c mo
💾 Saving deployment to myservice.mlem
```
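Like the builder, these declarations are stored as small YAML files that you
commit to Git. A sketch of what they might contain (hypothetical shapes
inferred from the commands above, not taken from MLEM's docs):

```yaml
# staging.mlem (hypothetical shape)
object_type: env
type: heroku
---
# myservice.mlem (hypothetical shape)
object_type: deployment
type: heroku
app_name: mlem-deployed-in-ci
```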

Then create and commit the CI pipeline (e.g. with GH Actions):

```yaml
# .github/workflows/publish.yml
jobs:
  # ...
mlem deployment my-model --load myservice.mlem
```
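Deploying from CI also requires credentials for the target platform. Below is
a hypothetical excerpt of the deploy job, passing a Heroku API key from
repository secrets; the `HEROKU_API_KEY` variable name is an assumption, so
check MLEM's Heroku integration docs for the exact variable it reads:

```yaml
# Hypothetical step: expose Heroku credentials to the mlem deploy command.
      - name: Deploy model
        env:
          HEROKU_API_KEY: ${{ secrets.HEROKU_API_KEY }}
        run: mlem deployment my-model --load myservice.mlem
```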
Learn more about deploying ML models [here](/doc/get-started/deploying).
