
MongoDB Scalers open many open connections #5612

Closed · MichaelMayer-askui opened this issue Mar 19, 2024 · 7 comments
Labels
bug (Something isn't working) · stale (All issues that are marked as stale due to inactivity)

Comments

@MichaelMayer-askui

Report

After connection problems between KEDA components inside the cluster, the MongoDB scaler made many new requests to the MongoDB database, which caused the connection limit to be reached. The number of connections per minute was higher than configured.

Expected Behavior

If connection issues appear, KEDA should not spawn new connections from a scaler more frequently than configured.
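
For context, the cadence meant by "configured" here is presumably the ScaledJob's pollingInterval. A minimal illustrative sketch; all names and values below are placeholders, not the actual config from this report:

```yaml
apiVersion: keda.sh/v1alpha1
kind: ScaledJob
metadata:
  name: mongodb-scaledjob            # placeholder name
spec:
  pollingInterval: 30                # seconds between scaler queries; the cadence referred to above
  jobTargetRef:
    template:
      spec:
        containers:
          - name: worker                  # placeholder
            image: example.com/worker     # placeholder
        restartPolicy: Never
  triggers:
    - type: mongodb
      metadata:
        # connection details omitted; see the MongoDB scaler docs
        dbName: app
        collection: jobs
        query: '{"status":"pending"}'
        queryValue: "1"
```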

Actual Behavior

KEDA spawns more scaler connections than expected.
[screenshot: graph of open MongoDB connections]

Steps to Reproduce the Problem

Logs from KEDA operator


KEDA Version

2.13.1

Kubernetes Version

1.28

Platform

Amazon Web Services

Scaler Details

MongoDB

Anything else?

No response

MichaelMayer-askui added the bug (Something isn't working) label Mar 19, 2024
@JorTurFer
Member

Hello @MichaelMayer-askui ,
Thanks for reporting! Do you see any error in keda-operator logs? Could you confirm the KEDA version you are using?

@MichaelMayer-askui
Author

Hello @JorTurFer ,
I can confirm the versions: all KEDA components are on version 2.13.1, see screenshot.
[screenshot: KEDA component versions]

The only error message I receive from keda-operator is the following:

mongodb_scaler	failed to query DATABASE in COLLECTION_NAME, because of server selection error: context deadline exceeded, current topology: { Type: ReplicaSetNoPrimary, Servers: [{ Addr: ATLAS_URL.mongodb.net:27017, Type: RSSecondary, Tag sets: diskState=READY,availabilityZone=AZ,nodeType=ELECTABLE,provider=AWS,region=AZ,workloadType=OPERATIONAL, Average RTT: 1230992348 }, { Addr: ATLAS_URL.mongodb.net:27017, Type: RSSecondary, Tag sets: provider=AWS,diskState=READY,region=AZ,workloadType=OPERATIONAL,availabilityZone=AZ,nodeType=ELECTABLE, Average RTT: 3196250 }, { Addr: ATLAS_URL.mongodb.net:27017, Type: Unknown, Last error: dial tcp IP:27017: connect: connection refused }, ] }	{"type": "ScaledJob", "namespace": "NAMESPACE_NAME", "name": "SCALED_JOB_NAME", "error": "server selection error: context deadline exceeded, current topology: { Type: ReplicaSetNoPrimary, Servers: [{ Addr: ATLAS_URL.mongodb.net:27017, Type: RSSecondary, Tag sets: diskState=READY,availabilityZone=AZ,nodeType=ELECTABLE,provider=AWS,region=AZ,workloadType=OPERATIONAL, Average RTT: 1230992348 }, { Addr: ATLAS_URL.mongodb.net:27017, Type: RSSecondary, Tag sets: provider=AWS,diskState=READY,region=AZ,workloadType=OPERATIONAL,availabilityZone=AZ,nodeType=ELECTABLE, Average RTT: 3196250 }, { Addr: ATLAS_URL.mongodb.net:27017, Type: Unknown, Last error: dial tcp IP:27017: connect: connection refused }, ] }"}
github.com/kedacore/keda/v2/pkg/scalers.(*mongoDBScaler).getQueryResult
	/workspace/pkg/scalers/mongo_scaler.go:231
github.com/kedacore/keda/v2/pkg/scalers.(*mongoDBScaler).GetMetricsAndActivity
	/workspace/pkg/scalers/mongo_scaler.go:240
github.com/kedacore/keda/v2/pkg/scaling/cache.(*ScalersCache).GetMetricsAndActivityForScaler
	/workspace/pkg/scaling/cache/scalers_cache.go:130
github.com/kedacore/keda/v2/pkg/scaling.(*scaleHandler).getScaledJobMetrics
	/workspace/pkg/scaling/scale_handler.go:830
github.com/kedacore/keda/v2/pkg/scaling.(*scaleHandler).isScaledJobActive
	/workspace/pkg/scaling/scale_handler.go:879
github.com/kedacore/keda/v2/pkg/scaling.(*scaleHandler).checkScalers
	/workspace/pkg/scaling/scale_handler.go:262
github.com/kedacore/keda/v2/pkg/scaling.(*scaleHandler).startScaleLoop
	/workspace/pkg/scaling/scale_handler.go:182

This message occurs after the high number of connections is already open and the database refuses more.

@JorTurFer
Member

Is your connection using the mongodb+srv scheme? v2.13.1 doesn't support it. I mean, if you are using MongoDB Atlas with a mongodb+srv connection string, that could be the problem itself. Support for the scheme was added 3 weeks ago: #5566

Maybe you could try the main tag, setting the scheme as explained in the next version's docs: https://keda.sh/docs/2.14/scalers/mongodb/
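
For reference, a sketch of what the trigger metadata could look like with the scheme set, based on the 2.14 docs linked above; the host, credentials, and query details are placeholders, and parameter specifics should be checked against those docs:

```yaml
triggers:
  - type: mongodb
    metadata:
      scheme: mongodb+srv                  # per the 2.14 docs; the default is mongodb
      host: cluster0.example.mongodb.net   # placeholder Atlas host; omit the port with SRV lookup
      username: keda-user                  # placeholder
      passwordFromEnv: MONGODB_PASSWORD    # placeholder env var name
      dbName: app
      collection: jobs
      query: '{"status":"pending"}'
      queryValue: "1"
```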

@MichaelMayer-askui
Author

Yes, the connection string uses the mongodb+srv scheme. I will test whether adjusting the connection string resolves the problem.

What surprises me now is that it worked at all: I have used this config for about a year without any issues, and the problems occurred right after the update from v2.12 to v2.13.
Should this not produce an error regarding wrong configuration?

@JorTurFer
Member

> Should this not produce an error regarding wrong configuration?

It could be. I don't recall related changes, but they could definitely be there; for example, the mongo pkg has been updated too.


stale bot commented May 25, 2024

This issue has been automatically marked as stale because it has not had recent activity. It will be closed in 7 days if no further activity occurs. Thank you for your contributions.

stale bot added the stale (All issues that are marked as stale due to inactivity) label May 25, 2024

stale bot commented Jun 2, 2024

This issue has been automatically closed due to inactivity.

stale bot closed this as completed Jun 2, 2024
github-project-automation bot moved this from To Triage to Ready To Ship in Roadmap - KEDA Core Jun 2, 2024