Feature/background proc with celery #788

Merged: 51 commits merged into main from feature/background-proc-with-celery on Dec 5, 2023

Conversation

jasquat (Contributor) commented on Dec 4, 2023

Implements #584.

This adds the ability to run background processing with celery using a redis queue.

Main features:

  • adds support for celery tasks using a redis queue
  • adds keyboard shortcut support in the frontend so custom keyboard shortcuts can be created on a per-page basis
  • adds a new db table, future_task, which contains tasks that have timers on them (a hypothetical sketch follows this list)
    • the apscheduler adds these to the celery queue when it finds tasks that will run in the next five minutes
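As a rough illustration of the future_task table, here is a hypothetical SQLAlchemy model sketch; the column names are assumptions, not the backend's actual schema:

```python
# Hypothetical sketch of the future_task table; column names are assumptions,
# not necessarily the actual schema in spiffworkflow-backend.
from sqlalchemy import Boolean, Column, Integer, String
from sqlalchemy.orm import declarative_base

Base = declarative_base()


class FutureTask(Base):
    __tablename__ = "future_task"

    guid = Column(String(36), primary_key=True)  # guid of the timer task in the process instance
    run_at_in_seconds = Column(Integer, nullable=False, index=True)  # epoch time when the timer fires
    completed = Column(Boolean, default=False, nullable=False)  # set once the task has been queued/run
```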

To enable this feature, these variables must be set in all backend-related container types:

  • SPIFFWORKFLOW_BACKEND_CELERY_ENABLED=true
  • SPIFFWORKFLOW_BACKEND_CELERY_BROKER_URL=[redis_host_with_port]
    • for example, redis://localhost
  • SPIFFWORKFLOW_BACKEND_CELERY_RESULT_BACKEND=[redis_host_with_port]
    • for example, redis://localhost
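
Concretely, an env file for those containers could look like the following (redis://localhost is just the example value from above; point it at your actual Redis host and port):

```
SPIFFWORKFLOW_BACKEND_CELERY_ENABLED=true
SPIFFWORKFLOW_BACKEND_CELERY_BROKER_URL=redis://localhost
SPIFFWORKFLOW_BACKEND_CELERY_RESULT_BACKEND=redis://localhost
```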

For deployments, a new deployable type (docker container or similar) is generally required for celery workers. This should be configured with environment variables similar to the apscheduler/proc container, but with the following command:

["./bin/start_celery_worker"] # any number of these containers are fine

The two other container types (that probably already exist in your list of deployables) might have commands like the following:

["poetry", "run", "python", "./bin/start_blocking_appscheduler.py"] # appscheduler / proc. you can only run one of these

and

["./bin/boot_server_in_docker"] # api, the default command in the backend Dockerfile. any number of these are fine

Additional notes:
It still uses apscheduler to run periodic tasks, which will mostly check the future_task table and add due tasks to the redis queue. This should take significantly less time than the current approach of directly processing the waiting and user_input_required tasks.
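
A minimal sketch of that flow, assuming generic Celery and APScheduler usage (the task name, the query helper, and the polling interval are all hypothetical, not the backend's actual code):

```python
# Hypothetical sketch: a periodic apscheduler job that finds future_task rows
# due within five minutes and enqueues them for celery workers via redis.
import time

from apscheduler.schedulers.background import BackgroundScheduler
from celery import Celery

celery_app = Celery("sketch", broker="redis://localhost", backend="redis://localhost")


@celery_app.task()
def run_future_task(task_guid: str) -> None:
    # A real worker would resume the process instance associated with this task guid.
    print(f"running future task {task_guid}")


def find_due_future_tasks(cutoff_in_seconds: float) -> list:
    """Hypothetical stand-in for a query selecting future_task rows with
    run_at_in_seconds <= cutoff_in_seconds and completed = false."""
    return []


def queue_future_tasks() -> None:
    """Push tasks whose timers fire in the next five minutes onto the celery queue."""
    cutoff = time.time() + 5 * 60
    for row in find_due_future_tasks(cutoff):
        run_future_task.delay(row.guid)


scheduler = BackgroundScheduler()
scheduler.add_job(queue_future_tasks, "interval", seconds=30)  # polling interval is a guess
scheduler.start()
```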

jasquat merged commit cd0bd3b into main on Dec 5, 2023 (32 checks passed)
jasquat deleted the feature/background-proc-with-celery branch on December 5, 2023 at 16:42