Using Ollama, models are not stored in docker images #8594
In my Spring Boot unit test, I want to call Ollama with several models, so I have this code:

```java
try (OllamaContainer ollama = new OllamaContainer("ollama/ollama:0.1.32")) {
    ollama.withExposedPorts(11434);
    ollama.start();
    EnumSet.allOf(AIModel.class).forEach(aiModel -> {
    //AIModel.getAIModelsForTest().forEach(aiModel -> {
        try {
            LOGGER.info("> Pulling the model {}...", aiModel.getModelName());
            ollama.execInContainer("ollama", "pull", aiModel.getModelName());
            LOGGER.info("> ...model {} pulled.", aiModel.getModelName());
        } catch (IOException | InterruptedException e) {
            throw new RuntimeException("Error when creating container " + e.getMessage(), e);
        }
    });
    ollama.commitToImage(OLLAMA_ISIA);
    return ollama;
}
```

Before running it, I deleted everything Docker-related on my computer and double-checked it:

```shell
straumat@straumat-portable:~$ docker image list
REPOSITORY   TAG   IMAGE ID   CREATED   SIZE
straumat@straumat-portable:~$
```

I added this dependency management:

```xml
<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>org.testcontainers</groupId>
            <artifactId>testcontainers-bom</artifactId>
            <version>1.19.7</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>
```

And these dependencies:

```xml
<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>junit-jupiter</artifactId>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>ollama</artifactId>
    <scope>test</scope>
</dependency>
```

In the logs, we can see that all the models are downloaded:

```
10:32:05.426 [main] INFO tc.ollama/ollama:0.1.32 -- Pulling docker image: ollama/ollama:0.1.32. Please be patient; this may take some time but only needs to be done once.
10:32:06.750 [docker-java-stream--450417071] INFO tc.ollama/ollama:0.1.32 -- Starting to pull image
10:32:06.751 [docker-java-stream--450417071] INFO tc.ollama/ollama:0.1.32 -- Pulling image layers: 0 pending, 0 downloaded, 0 extracted, (0 bytes/0 bytes)
10:32:11.221 [docker-java-stream--450417071] INFO tc.ollama/ollama:0.1.32 -- Pulling image layers: 2 pending, 1 downloaded, 0 extracted, (101 MB/? MB)
10:32:12.314 [docker-java-stream--450417071] INFO tc.ollama/ollama:0.1.32 -- Pulling image layers: 2 pending, 1 downloaded, 1 extracted, (142 MB/? MB)
10:32:15.234 [docker-java-stream--450417071] INFO tc.ollama/ollama:0.1.32 -- Pulling image layers: 1 pending, 2 downloaded, 1 extracted, (248 MB/? MB)
10:32:15.634 [docker-java-stream--450417071] INFO tc.ollama/ollama:0.1.32 -- Pulling image layers: 1 pending, 2 downloaded, 2 extracted, (265 MB/? MB)
10:32:17.515 [docker-java-stream--450417071] INFO tc.ollama/ollama:0.1.32 -- Pulling image layers: 0 pending, 3 downloaded, 2 extracted, (337 MB/338 MB)
10:32:18.775 [docker-java-stream--450417071] INFO tc.ollama/ollama:0.1.32 -- Pulling image layers: 0 pending, 3 downloaded, 3 extracted, (338 MB/338 MB)
10:32:18.791 [main] INFO tc.ollama/ollama:0.1.32 -- Image ollama/ollama:0.1.32 pull took PT13.36524521S
10:32:18.791 [docker-java-stream--450417071] INFO tc.ollama/ollama:0.1.32 -- Pull complete. 3 layers, pulled in 12s (downloaded 338 MB at 28 MB/s)
10:32:18.793 [main] INFO tc.ollama/ollama:0.1.32 -- Creating container for image: ollama/ollama:0.1.32
10:32:19.178 [main] INFO tc.ollama/ollama:0.1.32 -- Container ollama/ollama:0.1.32 is starting: d105df6ebe5d6222ec411d9d1422e7d310f621c607987401cab6b96d8a4ce91c
10:32:19.616 [main] INFO tc.ollama/ollama:0.1.32 -- Container ollama/ollama:0.1.32 started in PT0.822826165S
10:32:19.617 [main] INFO util.BaseTest -- > Pulling the model dbrx:132b...
10:32:42.181 [main] INFO util.BaseTest -- > ...model dbrx:132b pulled.
10:32:42.181 [main] INFO util.BaseTest -- > Pulling the model llama3:70b...
10:33:02.299 [main] INFO util.BaseTest -- > ...model llama3:70b pulled.
10:33:02.299 [main] INFO util.BaseTest -- > Pulling the model llama3:8b...
10:33:22.453 [main] INFO util.BaseTest -- > ...model llama3:8b pulled.
10:33:22.453 [main] INFO util.BaseTest -- > Pulling the model phi3:mini...
10:33:42.594 [main] INFO util.BaseTest -- > ...model phi3:mini pulled.
```

Everything seems fine, but my tests fail saying the models are not found. So I check the images:

```shell
straumat@straumat-portable:~$ docker image list
REPOSITORY            TAG      IMAGE ID       CREATED          SIZE
ollama-isia           latest   8c8765726dc1   39 minutes ago   913MB
ollama/ollama         0.1.32   6ffa7903c2d4   2 weeks ago      436MB
testcontainers/ryuk   0.6.0    71009a3edde7   4 months ago     14.9MB
```

I run the image that was created:

```shell
straumat@straumat-portable:~$ docker run ollama-isia
time=2024-05-04T09:13:31.695Z level=INFO source=images.go:817 msg="total blobs: 0"
time=2024-05-04T09:13:31.695Z level=INFO source=images.go:824 msg="total unused blobs removed: 0"
time=2024-05-04T09:13:31.695Z level=INFO source=routes.go:1143 msg="Listening on [::]:11434 (version 0.1.32)"
time=2024-05-04T09:13:31.695Z level=INFO source=payload.go:28 msg="extracting embedded files" dir=/tmp/ollama934115486/runners
time=2024-05-04T09:13:34.376Z level=INFO source=payload.go:41 msg="Dynamic LLM libraries [rocm_v60002 cpu cpu_avx cpu_avx2 cuda_v11]"
time=2024-05-04T09:13:34.376Z level=INFO source=gpu.go:121 msg="Detecting GPU type"
time=2024-05-04T09:13:34.376Z level=INFO source=gpu.go:268 msg="Searching for GPU management library libcudart.so*"
time=2024-05-04T09:13:34.376Z level=INFO source=gpu.go:314 msg="Discovered GPU libraries: [/tmp/ollama934115486/runners/cuda_v11/libcudart.so.11.0]"
time=2024-05-04T09:13:34.377Z level=INFO source=gpu.go:343 msg="Unable to load cudart CUDA management library /tmp/ollama934115486/runners/cuda_v11/libcudart.so.11.0: your nvidia driver is too old or missing, please upgrade to run ollama"
time=2024-05-04T09:13:34.377Z level=INFO source=gpu.go:268 msg="Searching for GPU management library libnvidia-ml.so"
time=2024-05-04T09:13:34.377Z level=INFO source=gpu.go:314 msg="Discovered GPU libraries: []"
time=2024-05-04T09:13:34.377Z level=INFO source=cpu_common.go:11 msg="CPU has AVX2"
time=2024-05-04T09:13:34.377Z level=INFO source=routes.go:1164 msg="no GPU detected"
```

Now that it is launched:

```shell
straumat@straumat-portable:~$ docker ps
CONTAINER ID   IMAGE         COMMAND               CREATED          STATUS          PORTS       NAMES
560652a69260   ollama-isia   "/bin/ollama serve"   43 seconds ago   Up 42 seconds   11434/tcp   funny_heyrovsky
```

I checked which models are in my container, and there are none:

```shell
straumat@straumat-portable:~$ docker exec -it 560652a69260 ollama list
NAME   ID   SIZE   MODIFIED
straumat@straumat-portable:~$
```

To be sure, I also started ollama/ollama itself to see if it has models; `ollama list` is empty there as well.
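For reference, the `AIModel` enum is not shown above; the constants below are a simplified stand-in (only two of the model names from the logs are kept), illustrating how `EnumSet.allOf` drives the pull loop, visiting every constant exactly once in declaration order:

```java
import java.util.ArrayList;
import java.util.EnumSet;
import java.util.List;

public class ModelEnumDemo {
    // Hypothetical stand-in for the AIModel enum from the post,
    // using two of the model names that appear in the logs.
    enum AIModel {
        LLAMA3_8B("llama3:8b"),
        PHI3_MINI("phi3:mini");

        private final String modelName;
        AIModel(String modelName) { this.modelName = modelName; }
        String getModelName() { return modelName; }
    }

    public static void main(String[] args) {
        List<String> pulled = new ArrayList<>();
        // EnumSet.allOf returns a set containing every constant of the
        // enum, so each model name is visited exactly once.
        EnumSet.allOf(AIModel.class).forEach(m -> pulled.add(m.getModelName()));
        System.out.println(pulled); // prints: [llama3:8b, phi3:mini]
    }
}
```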
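One Java-level caveat about the snippet in the question, separate from whether `commitToImage` persists the models: returning the container from inside a try-with-resources block means its `close()` method runs as the method returns, so the caller receives an already-stopped container. A self-contained sketch of that behaviour with a toy `AutoCloseable`:

```java
// Demonstrates that try-with-resources closes the resource even when it
// is returned to the caller: the returned object is already closed.
public class TryWithResourcesDemo {
    static class Resource implements AutoCloseable {
        boolean closed = false;
        @Override public void close() { closed = true; }
    }

    static Resource open() {
        try (Resource r = new Resource()) {
            return r; // close() runs as the method returns
        }
    }

    public static void main(String[] args) {
        Resource r = open();
        System.out.println("closed after return? " + r.closed); // prints: closed after return? true
    }
}
```

If the container is meant to outlive the method, it would need to be created outside the try-with-resources block and stopped explicitly later, for example in an `@AfterAll` hook.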
Replies: 1 comment 1 reply
Sorry, but I was not able to reproduce it.