
[feat] - Add SizedLRU Cache #3344

Merged: ahrav merged 5 commits into main from feat-lru-sized-cache on Sep 30, 2024

Conversation

ahrav (Collaborator) commented Sep 27, 2024

Description:

This PR introduces a size-limited, generic LRU cache that wraps the golang-lru caching library and supports optional metrics tracking. Key features include customizable metrics collection (e.g., hits, misses, evictions) and configurable behavior through functional options such as setting the maximum cache capacity.
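For readers skimming the thread, here is a minimal sketch of the kind of API described above, pieced together from the snippets quoted in the review comments below. The names (`Cache`, `Option`, `WithCapacity`, `NewCache`), the string key type, and the default capacity are assumptions based on those snippets; metrics collection is omitted for brevity, and the merged implementation may differ.

```go
package sizedlru

import lru "github.com/hashicorp/golang-lru/v2"

// defaultCapacity is an illustrative default; the PR's actual default is
// discussed in the review comments below.
const defaultCapacity = 1 << 16

// Option configures a Cache at construction time.
type Option[T any] func(*Cache[T])

// WithCapacity sets the maximum capacity of the cache (number of entries).
func WithCapacity[T any](capacity int) Option[T] {
	return func(c *Cache[T]) { c.capacity = capacity }
}

// Cache is a thread-safe, fixed-capacity LRU cache for values of type T,
// backed by hashicorp/golang-lru.
type Cache[T any] struct {
	capacity int
	cache    *lru.Cache[string, T]
}

// NewCache applies any functional options, then builds the underlying LRU.
func NewCache[T any](opts ...Option[T]) (*Cache[T], error) {
	c := &Cache[T]{capacity: defaultCapacity}
	for _, opt := range opts {
		opt(c)
	}
	inner, err := lru.New[string, T](c.capacity)
	if err != nil {
		return nil, err
	}
	c.cache = inner
	return c, nil
}

// Set inserts or updates a value under key.
func (c *Cache[T]) Set(key string, value T) { c.cache.Add(key, value) }

// Get returns the value for key and whether it was present.
func (c *Cache[T]) Get(key string) (T, bool) { return c.cache.Get(key) }
```

A caller would then construct it with something like `c, err := NewCache[string](WithCapacity[string](10_000))`, swapping in whatever value type the detection cache actually stores.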

Checklist:

  • Tests passing (make test-community)?
  • Lint passing (make lint; this requires golangci-lint)?

ahrav force-pushed the feat-lru-sized-cache branch from 16c1db1 to ecc3967 on September 27, 2024 20:16
ahrav requested a review from a team on September 27, 2024 20:16
rgmz (Contributor) commented Sep 27, 2024

Is this meant to replace the existing memory cache? If not, what are the use cases of this vs that?

ahrav (Collaborator, Author) commented Sep 27, 2024

> Is this meant to replace the existing memory cache? If not, what are the use cases of this vs that?

I don't believe so. The plan is to use this for detection caching.

ahrav force-pushed the feat-lru-sized-cache branch from ecc3967 to 59a7884 on September 27, 2024 21:07
rgmz (Contributor) commented Sep 28, 2024

> I don't believe so. The plan is to use this for detection caching.

It might be worth tweaking the naming + comments in that case. "Memory" and "LRU" aren't exactly mutually exclusive and don't have clear purposes.

ahrav marked this pull request as ready for review on September 30, 2024 11:09
ahrav requested a review from a team as a code owner on September 30, 2024 11:09
ahrav force-pushed the feat-lru-sized-cache branch from e7fed02 to f7e5b96 on September 30, 2024 15:00
ahrav force-pushed the feat-lru-sized-cache branch from f7e5b96 to 51e6206 on September 30, 2024 15:00
Comment on lines 55 to 56
const defaultSize = 1 << 29 // 512MB

dustin-decker (Contributor) commented Sep 30, 2024

I think a smaller default (like 8MB) would be sensible here.
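For concreteness, the suggested change would amount to something like the line below, still reading the constant as a byte count the way the original comment does (the discussion further down notes the capacity is actually an item count, so this is purely illustrative):

```go
// 1 << 23 = 8,388,608 bytes ≈ 8MB, versus 1 << 29 = 536,870,912 bytes ≈ 512MB.
const defaultSize = 1 << 23 // 8MB
```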

Comment on lines 45 to 49
// WithCapacity is a functional option to set the maximum capacity of the cache.
// If the capacity is not set, the default value (512MB) is used.
func WithCapacity[T any](capacity int) Option[T] {
	return func(lc *Cache[T]) { lc.capacity = capacity }
}
Contributor commented:

Can you document the units expected here?

ahrav (Collaborator, Author) replied:

Yep, great idea.

ahrav (Collaborator, Author) replied:

Whoops, I got the cache implementations mixed up. This capacity refers to the number of items, not the total size.
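A sketch of how the doc comment might be reworded to spell out the units, based on the snippet quoted above (the exact wording and default in the merged code may differ):

```go
// WithCapacity is a functional option to set the maximum capacity of the cache,
// expressed as a number of entries, not bytes. If the capacity is not set, the
// default value is used.
func WithCapacity[T any](capacity int) Option[T] {
	return func(lc *Cache[T]) { lc.capacity = capacity }
}
```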

Comment on lines +41 to +42
collectorOnce.Do(func() {
	collector = &MetricsCollector{
Contributor commented:

Consider having this called automatically when the package is imported using init()?

ahrav (Collaborator, Author) replied:

Yea, that makes sense. Good call.

ahrav (Collaborator, Author) replied:

Also cleans up some of the test code.

Contributor commented:

Another option is to just declare them as package vars, like we do with most others.
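For context, here is a minimal sketch of the three initialization styles being weighed here, using a placeholder MetricsCollector with Prometheus counters; the struct fields and metric names are assumptions, not the PR's actual code:

```go
package metrics

import (
	"sync"

	"github.com/prometheus/client_golang/prometheus"
	"github.com/prometheus/client_golang/prometheus/promauto"
)

// MetricsCollector is a placeholder; the real struct in the PR likely differs.
type MetricsCollector struct {
	Hits   prometheus.Counter
	Misses prometheus.Counter
}

var (
	collectorOnce sync.Once
	collector     *MetricsCollector
)

func newCollector() *MetricsCollector {
	return &MetricsCollector{
		Hits:   promauto.NewCounter(prometheus.CounterOpts{Name: "cache_hits_total"}),
		Misses: promauto.NewCounter(prometheus.CounterOpts{Name: "cache_misses_total"}),
	}
}

// Option 1: lazy initialization with sync.Once, as in the diff excerpt above.
func getCollector() *MetricsCollector {
	collectorOnce.Do(func() { collector = newCollector() })
	return collector
}

// Option 2: eager initialization when the package is imported, via init().
// func init() { collector = newCollector() }

// Option 3: a package-level var initialized at declaration, as suggested here.
// var collector = newCollector()
```

All three register the counters exactly once; the difference is whether that happens lazily on first use or eagerly at import time.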

dustin-decker (Contributor) left a comment

LGTM. Love the metric instrumentation!

ahrav (Collaborator, Author) commented Sep 30, 2024

> > I don't believe so. The plan is to use this for detection caching.
>
> It might be worth tweaking the naming + comments in that case. "Memory" and "LRU" aren't exactly mutually exclusive and don't have clear purposes.

I'll update the name for the memory cache in a separate PR.

ahrav merged commit a5b0995 into main on Sep 30, 2024
12 checks passed
ahrav deleted the feat-lru-sized-cache branch on September 30, 2024 20:18
kashifkhan0771 pushed a commit to kashifkhan0771/trufflehog that referenced this pull request on Oct 1, 2024

* add impl for lru sized cache
* update error message
* address comments
* rename
* update comments