
Concurrency #3853

Closed
zhengqianhe0 opened this issue Sep 19, 2024 · 2 comments

@zhengqianhe0

Has Langflow considered the issue of multiple concurrent users? How many people can it support online simultaneously if I deploy it myself?

@dosubot added the question (Further information is requested) label on Sep 19, 2024

dosubot bot commented Sep 19, 2024

I found some relevant information regarding handling multiple concurrent users in Langflow.

Similar Issues and Discussions

  1. How to turn LangFlow into an API service? - Unsolved discussion [1].
  2. Does langflow support high-availability deployments? - Open issue [2].
  3. How use async APIs? - Closed issue with a solution for parallel API calls [3].

Handling Multiple Concurrent Users

Langflow provides several configuration settings to optimize performance for multiple concurrent users:

  1. Worker Processes: Adjust the number of worker processes using the --workers option or the LANGFLOW_WORKERS environment variable. The default is set to 1.

    langflow run --workers <number_of_workers>

    Or set the environment variable:

    export LANGFLOW_WORKERS=<number_of_workers>
  2. Worker Timeout: Adjust the worker timeout using the --timeout option or the LANGFLOW_TIMEOUT environment variable.

    langflow run --timeout <timeout_in_seconds>

    Or set the environment variable:

    export LANGFLOW_TIMEOUT=<timeout_in_seconds>
  3. Load Balancing: Langflow supports load balancing by scaling the number of replicas for the backend and frontend services. This can be configured in the Helm chart's values.yaml file.

    langflow:
      backend:
        replicaCount: 3
      frontend:
        replicaCount: 3
  4. Hardware Requirements: For high-concurrency environments, it is recommended to have at least 2Gi of memory and 1 CPU per backend replica, and at least 1Gi of memory and 1 CPU per frontend replica. Scale the number of replicas as needed [4]; a resource-request sketch follows after this list.

  5. Monitoring Tools: Langflow integrates with Langfuse and LangSmith for real-time monitoring and performance tracking, which helps in identifying and resolving concurrency issues [5][6][7]; a hedged environment-variable example is also shown below.
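
The replica counts from the values.yaml example in item 3 can be paired with resource requests that match the hardware recommendations in item 4. This is a minimal sketch; the resources key layout is an assumption based on common Helm chart conventions, so verify it against the Langflow chart's own values.yaml:

    langflow:
      backend:
        replicaCount: 3
        resources:             # assumed key names; check the chart's values.yaml
          requests:
            memory: "2Gi"      # at least 2Gi per backend replica
            cpu: "1"
      frontend:
        replicaCount: 3
        resources:
          requests:
            memory: "1Gi"      # at least 1Gi per frontend replica
            cpu: "1"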
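
For the monitoring setup in item 5, the sketch below enables tracing via environment variables before starting Langflow. It assumes Langflow reads the standard Langfuse and LangChain/LangSmith variables; the exact names can differ between Langflow versions, so confirm them against the documentation for your release:

    # Langfuse (standard Langfuse variables; assumed to be picked up by Langflow)
    export LANGFUSE_PUBLIC_KEY=<your_public_key>
    export LANGFUSE_SECRET_KEY=<your_secret_key>
    export LANGFUSE_HOST=https://cloud.langfuse.com

    # LangSmith (standard LangChain tracing variables)
    export LANGCHAIN_TRACING_V2=true
    export LANGCHAIN_API_KEY=<your_langsmith_api_key>
    export LANGCHAIN_PROJECT=<your_project_name>

    # Start Langflow with multiple workers and a longer timeout (options from items 1 and 2)
    langflow run --workers 4 --timeout 300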

By configuring these settings, you can optimize Langflow to support multiple concurrent users effectively.


@lucaseduoli (Collaborator)

Hello!
The multi-user functionality is coming in a few weeks. We're going to support multiple users in the same flow and, in the future, even cursor tracking.
Thanks for the feedback!
