
feat(ollama): support calling the Ollama local process #2923

Merged: 42 commits from refactor/ollama-local into main on Jan 2, 2025

Conversation

@stevenh (Collaborator) commented Dec 17, 2024

Refactor local process handling for Ollama using a container implementation avoiding the wrapping methods.

This defaults to running the binary with an ephemeral port to avoid port conflicts. This behaviour can be overridden by setting OLLAMA_HOST either in the parent environment or in the values passed via WithUseLocal (a usage sketch follows the lists below).

Improve API compatibility with:

  • Multiplexed output streams
  • State reporting
  • Exec option processing
  • WaitingFor customisation

Fix Container implementation:

  • Port management
  • Running checks
  • Terminate processing
  • Endpoint argument definition
  • Add missing methods
  • Consistent environment handling
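
For readers new to the module, a minimal usage sketch of the behaviour described above (not code from this PR): the image tag, environment value, and the "ollama list" command are placeholders, and the helpers used (ollama.Run, ollama.WithUseLocal, ConnectionString, Exec with exec.Multiplexed) are assumed to behave as described in this PR.

```go
package main

import (
	"context"
	"io"
	"log"

	"github.com/testcontainers/testcontainers-go/exec"
	"github.com/testcontainers/testcontainers-go/modules/ollama"
)

func main() {
	ctx := context.Background()

	// Run the local ollama binary instead of a container. By default it listens
	// on an ephemeral port; OLLAMA_HOST (from the parent environment or from the
	// values passed to WithUseLocal) overrides that.
	ctr, err := ollama.Run(ctx, "ollama/ollama:0.5.7", ollama.WithUseLocal("OLLAMA_DEBUG=true"))
	if err != nil {
		log.Fatalf("run ollama: %v", err)
	}
	defer func() {
		if err := ctr.Terminate(ctx); err != nil {
			log.Printf("terminate: %v", err)
		}
	}()

	// The usual container API works against the local process too.
	endpoint, err := ctr.ConnectionString(ctx)
	if err != nil {
		log.Fatalf("connection string: %v", err)
	}
	log.Println("ollama available at", endpoint)

	// Exec options such as multiplexed output are honoured as well.
	code, reader, err := ctr.Exec(ctx, []string{"ollama", "list"}, exec.Multiplexed())
	if err != nil || code != 0 {
		log.Fatalf("exec failed (code %d): %v", code, err)
	}
	out, _ := io.ReadAll(reader)
	log.Printf("ollama list:\n%s", out)
}
```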


netlify bot commented Dec 17, 2024

Deploy Preview for testcontainers-go ready!

Latest commit: 39e7af4
Latest deploy log: https://app.netlify.com/sites/testcontainers-go/deploys/677693e6e72e3e0008d8ee42
Deploy Preview: https://deploy-preview-2923--testcontainers-go.netlify.app

@stevenh stevenh changed the title refactor(ollama): local process feat(ollama): local process Dec 17, 2024
@stevenh stevenh changed the title feat(ollama): local process feat(ollama): refactor local process Dec 17, 2024
@stevenh stevenh force-pushed the refactor/ollama-local branch from 0d0ff48 to 8619850 on December 17, 2024 23:58
@github-actions github-actions bot added the "feature" label (New functionality or new behaviors on the existing one) Dec 17, 2024
@stevenh stevenh force-pushed the refactor/ollama-local branch 2 times, most recently from 0146fc5 to 2f2865b on December 18, 2024 19:02
mdelapenya previously approved these changes Dec 20, 2024

@mdelapenya (Member) left a comment:

This LGTM, waiting for you to mark it as ready. Great job with the process execution handling 🏆

modules/ollama/local.go (review thread, outdated, resolved)
modules/ollama/local.go (review thread, resolved)

// Terminate implements testcontainers.Container interface for the local Ollama binary.
// It stops the local Ollama process, removing the log file.
func (c *localProcess) Terminate(ctx context.Context) error {
Member commented:

todo: conflict with #2926, so depending on which one is merged first, we need to update it.

stevenh (Collaborator, Author) replied:

Yeah, just replied to that PR; I don't think it works in its current form as it breaks the interface separation.

Member commented:

We merged #2926, so I think we need to update this signature to match the interface.

Member replied:

Added it here: 9be4b71

I did not transfer the options from Terminate to the Stop call (stop duration); do you think we need to extract it from the options and override the default?
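
To make the open question concrete, a rough, self-contained sketch of what threading the stop duration from the terminate options into the Stop call could look like. The types and helper names below (terminateOption, withStopTimeout, stop) are invented for illustration and are not the module's or testcontainers-go's actual API:

```go
package main

import (
	"context"
	"fmt"
	"time"
)

// terminateOptions is a simplified stand-in for the options carried by
// testcontainers.TerminateOption after #2926.
type terminateOptions struct {
	stopTimeout time.Duration
}

type terminateOption func(*terminateOptions)

// withStopTimeout is a hypothetical option overriding the default stop timeout.
func withStopTimeout(d time.Duration) terminateOption {
	return func(o *terminateOptions) { o.stopTimeout = d }
}

type localProcess struct{}

// stop asks the local ollama process to shut down within the given timeout.
func (p *localProcess) stop(ctx context.Context, timeout time.Duration) error {
	fmt.Println("stopping local process with timeout", timeout)
	return nil // signal the process, wait up to timeout, etc.
}

// terminate shows the shape under discussion: extract the stop duration from
// the options and pass it to the stop call, falling back to a default.
func (p *localProcess) terminate(ctx context.Context, opts ...terminateOption) error {
	options := terminateOptions{stopTimeout: 10 * time.Second} // default
	for _, opt := range opts {
		opt(&options)
	}
	if err := p.stop(ctx, options.stopTimeout); err != nil {
		return fmt.Errorf("stop: %w", err)
	}
	// remove the log file, release other resources, etc.
	return nil
}

func main() {
	p := &localProcess{}
	_ = p.terminate(context.Background(), withStopTimeout(2*time.Second))
}
```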

modules/ollama/local.go (review thread, resolved)
wait/log.go (review thread, outdated, resolved)
@stevenh stevenh force-pushed the refactor/ollama-local branch from 9927ad0 to 5e586c8 on December 20, 2024 17:51
@stevenh stevenh marked this pull request as ready for review December 20, 2024 18:05
@stevenh stevenh requested a review from a team as a code owner December 20, 2024 18:05
@mdelapenya mdelapenya changed the title feat(ollama): refactor local process feat(ollama): support calling the Ollama local process Dec 20, 2024
mdelapenya and others added 19 commits January 2, 2025 12:01
- Refactor local process handling for Ollama using a container implementation avoiding the wrapping methods.
- Refactor local processing to use the new log sub-match functionality.
- Validate the container request to ensure the user configuration can be processed and no fields that would be ignored are present.
- Remove temporary simple test.
- Allow the local ollama binary name to be configured using the image name.
- Detail the container request supported fields.
- Update local process site docs to match recent changes.
- Refactor Terminate to support testcontainers.TerminateOption.
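
Relating these commits back to the "WaitingFor customisation" item in the PR description, a rough sketch of overriding the wait strategy when running in local mode; the image tag, environment value, log line, and timeout are placeholders, while testcontainers.WithWaitStrategy and wait.ForLog are assumed to be the library's existing wait helpers:

```go
package main

import (
	"context"
	"log"
	"time"

	"github.com/testcontainers/testcontainers-go"
	"github.com/testcontainers/testcontainers-go/modules/ollama"
	"github.com/testcontainers/testcontainers-go/wait"
)

func main() {
	ctx := context.Background()

	ctr, err := ollama.Run(ctx, "ollama/ollama:0.5.7",
		ollama.WithUseLocal("OLLAMA_DEBUG=true"),
		// Replace the default readiness check with a custom log-based wait;
		// the log line below is only a placeholder.
		testcontainers.WithWaitStrategy(
			wait.ForLog("Listening on").WithStartupTimeout(30*time.Second),
		),
	)
	if err != nil {
		log.Fatalf("run ollama: %v", err)
	}
	defer func() { _ = ctr.Terminate(ctx) }()
}
```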
@stevenh stevenh force-pushed the refactor/ollama-local branch from 9be4b71 to 4c3a06c on January 2, 2025 12:17
modules/ollama/local.go (review thread, outdated, resolved)
@mdelapenya mdelapenya self-assigned this Jan 2, 2025
@mdelapenya (Member) left a comment:

LGTM, great work with the refactor of the process Start/Wait, thanks for this!

@mdelapenya mdelapenya merged commit 6ec91f1 into main Jan 2, 2025
124 checks passed
@mdelapenya mdelapenya deleted the refactor/ollama-local branch January 2, 2025 15:05
mdelapenya added a commit to mdelapenya/testcontainers-go that referenced this pull request Jan 8, 2025
* main: (103 commits)
  feat(postgres): ssl for postgres (testcontainers#2473)
  feat(ollama): support calling the Ollama local process (testcontainers#2923)
  chore(deps): bump jinja2 from 3.1.4 to 3.1.5 (testcontainers#2935)
  chore(deps): bump sonarsource/sonarcloud-github-action (testcontainers#2933)
  feat(termination)!: make container termination timeout configurable (testcontainers#2926)
  chore(deps): bump slackapi/slack-github-action from 1.26.0 to 2.0.0 (testcontainers#2934)
  chore(deps): bump github/codeql-action from 3.25.15 to 3.28.0 (testcontainers#2932)
  feat(wait): log sub match callback (testcontainers#2929)
  fix: Handle nil value in CleanupNetwork (testcontainers#2928)
  fix: avoid double lock in DockerProvider.DaemonHost() (testcontainers#2900)
  feat!: build log writer for container request (testcontainers#2925)
  feat(gcloud)!: add support to seed data when using RunBigQueryContainer (testcontainers#2523)
  security(deps): bump golang.org/x/crypto from 0.28.0 to 0.31.0 (testcontainers#2916)
  chore(ci): add Github labels based on PR title (testcontainers#2914)
  chore(gha): Use official setup-docker-action (testcontainers#2913)
  chore(ci): enforce conventional commits syntax in PR titles (testcontainers#2911)
  feat(nats): WithConfigFile - pass a configuration file to nats server (testcontainers#2905)
  chore: enable implicit default logger only in testing with -v (testcontainers#2877)
  fix: container binds syntax (testcontainers#2899)
  refactor(cockroachdb): to use request driven options (testcontainers#2883)
  ...
Labels
feature: New functionality or new behaviors on the existing one