
build: local event producer #243

Merged (1 commit) on May 15, 2024
11 changes: 11 additions & 0 deletions Makefile
@@ -181,9 +181,20 @@ dev.up.build-no-cache: dev.up.redis
	docker-compose build --no-cache
	docker-compose up -d

dev.up.with-events: dev.up.kafka-control-center dev.up

dev.up.redis: # This has the nice side effect of starting the devstack_default network
	docker-compose -f $(DEVSTACK_WORKSPACE)/devstack/docker-compose.yml up -d redis

# Start kafka via the devstack docker-compose.yml
# https://github.com/openedx-unsupported/devstack/blob/323b475b885a2704489566b262e2895a4dca62b6/docker-compose.yml#L140
dev.up.kafka-control-center:
	docker-compose -f $(DEVSTACK_WORKSPACE)/devstack/docker-compose.yml up -d kafka-control-center

# Useful for just restarting everything related to the event broker
dev.down.kafka-control-center:
	docker-compose -f $(DEVSTACK_WORKSPACE)/devstack/docker-compose.yml down kafka zookeeper schema-registry kafka-control-center

dev.down: # Kills containers and all of their data that isn't in volumes
	docker-compose down

35 changes: 35 additions & 0 deletions README.rst
@@ -69,6 +69,41 @@ Every time you develop something in this repo

# Open a PR and ask for review.

Setting up openedx-events
-------------------------
Ensure you've installed the ``edx_event_bus_kafka`` and ``openedx_events`` requirements. Entering
a shell with ``make app-shell`` and then running ``make requirements`` should install these for you.

From your host, run ``make dev.up.with-events``, which will start the local Kafka containers for you.
Visit http://localhost:9021/clusters to access the local "Confluent Control Center".
Confluent Platform packages vanilla Kafka together with additional tooling; the Control Center is its
web UI for inspecting brokers, topics, and consumers.

Your ``devstack.py`` settings should already be configured to point at this event broker,
and to configure enterprise-subsidy as an openedx event consumer and/or producer.

Start by switching over to your **enterprise-access** repo and making sure its requirements are installed.
An enterprise-specific "ping" event and management command are defined to test
that your local event bus is configured correctly. Open a shell with ``make app-shell`` and run::

    ./manage.py consume_enterprise_ping_events

This will consume ping events from the ``dev-enterprise-core`` topic.
You may see a ``Broker: Unknown topic`` error the first time you run it; it will resolve once you
produce a test event below, since producing an event creates the topic if it does not already exist.
**Leave the consumer running.** You should see ``enterprise-access-service`` listed
as a registered consumer in your local Confluent Control Center.
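
If you want code to react to these ping events (beyond watching the consumer's logs), note that
``OpenEdxPublicSignal`` is a subclass of Django's ``Signal``, so an ordinary Django receiver works.
The following is a minimal sketch rather than code from this repo: it assumes the matching
``ENTERPRISE_PING_SIGNAL`` definition is importable on the consuming side, and the import path and
handler name are hypothetical::

    import logging

    from django.dispatch import receiver

    # Hypothetical import path -- use wherever the matching signal is actually defined.
    from enterprise_ping.signals import ENTERPRISE_PING_SIGNAL

    logger = logging.getLogger(__name__)


    @receiver(ENTERPRISE_PING_SIGNAL)
    def handle_enterprise_ping(sender, signal, **kwargs):
        """
        Log the ping payload delivered by the event bus consumer.

        The event data arrives as keyword arguments keyed by the signal's
        data schema (here, a single ``ping`` object).
        """
        ping = kwargs.get('ping')
        logger.info("Received ping %s: %s", ping.ping_uuid, ping.ping_message)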

Now, go back to your **enterprise-subsidy** directory. Make sure requirements are installed,
specifically the ``edx_event_bus_kafka`` and ``openedx_events`` packages. Open a shell with
``make app-shell`` in this repo and *produce* a ping event::

    ./manage.py produce_enterprise_ping_event

If the event was successfully produced, you'll see a log message along the lines of
``Message delivered to Kafka event bus: topic=dev-events-testing``.
You should also now see the ``dev-events-testing`` topic in your local Confluent Control Center,
along with the test events being published to it.
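
Under the hood, ``produce_enterprise_ping_event`` is a thin wrapper around the event bus producer API.
Here is a condensed sketch of its core, adapted from the management command added in this repo
(run it inside a Django shell with the devstack settings loaded)::

    import uuid

    import attr
    from edx_event_bus_kafka.internal.producer import create_producer
    from openedx_events.data import EventsMetadata
    from openedx_events.tooling import OpenEdxPublicSignal


    @attr.s(frozen=True)
    class PingData:
        ping_uuid = attr.ib(type=str)
        ping_message = attr.ib(type=str)


    ENTERPRISE_PING_SIGNAL = OpenEdxPublicSignal(
        event_type="org.openedx.enterprise.core.ping.v1",
        data={"ping": PingData},
    )

    producer = create_producer()
    producer.send(
        signal=ENTERPRISE_PING_SIGNAL,
        # The configured EVENT_BUS_TOPIC_PREFIX ('dev' in devstack) is prepended by the producer.
        topic='enterprise-core',
        event_key_field='ping.ping_uuid',
        event_data={'ping': {'ping_uuid': str(uuid.uuid4()), 'ping_message': 'hello, world'}},
        event_metadata=EventsMetadata(event_type=ENTERPRISE_PING_SIGNAL.event_type),
    )
    # Flush outstanding deliveries before the process exits.
    producer.prepare_for_shutdown()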

Deploying
=========
Merging a pull request will cause a GoCD `build` pipeline to start automatically.
Empty file.
Empty file.
@@ -0,0 +1,96 @@
"""
Produce a single event for enterprise-specific testing or health checks.

Implements required ``APP.management.commands.*.Command`` structure.
"""
import logging
import uuid

import attr
from django.conf import settings
from django.core.management.base import BaseCommand
from edx_event_bus_kafka.internal.producer import create_producer
from openedx_events.data import EventsMetadata
from openedx_events.tooling import OpenEdxPublicSignal

logger = logging.getLogger(__name__)


# First define the topic that our consumer will subscribe to.
ENTERPRISE_CORE_TOPIC = getattr(settings, 'EVENT_BUS_ENTERPRISE_CORE_TOPIC', 'enterprise-core')


# Define the shape/schema of the data that our consumer will process.
# It should be identical to the schema used to *produce* the event.
@attr.s(frozen=True)
class PingData:
    """
    Attributes of a ping record.
    """
    ping_uuid = attr.ib(type=str)
    ping_message = attr.ib(type=str)


ENTERPRISE_PING_DATA_SCHEMA = {
    "ping": PingData,
}

# Define the key field used to serialize and de-serialize the event data.
ENTERPRISE_PING_KEY_FIELD = 'ping.ping_uuid'

# Define a Signal with the type (unique name) of the event to process,
# and tell it about the expected schema of event data. The producer of our ping events
# should emit an identical signal (same event_type and data schema).
ENTERPRISE_PING_SIGNAL = OpenEdxPublicSignal(
    event_type="org.openedx.enterprise.core.ping.v1",
    data=ENTERPRISE_PING_DATA_SCHEMA
)
Comment on lines +41 to +47

Contributor: Are we going to define more than a small base set of signals in every repo, or is the idea they can be defined DRY in a shared repo?

Contributor (author): These signals are really for demo/testing purposes - our production-use signals will be defined in openedx-events: openedx/openedx-events#347


def ping_event_data():
    """
    Helper to produce a dictionary of ping event data
    that fits the schema defined above by ``PingData`` and the
    ``data`` expected by our Ping Signal.
    """
    return {
        'ping': {
            'ping_uuid': str(uuid.uuid4()),
            'ping_message': 'hello, world',
        }
    }


class Command(BaseCommand):
    """
    Management command to produce a test event to the event bus.
    """
    help = """
    Produce a single ping event to the configured test topic.

    example:
        ./manage.py produce_enterprise_ping_event
    """

    def add_arguments(self, parser):
        parser.add_argument(
            '--topic', nargs=1, required=False,
            help="Optional topic to produce to (without environment prefix)",
        )

    def handle(self, *args, **options):
        try:
            producer = create_producer()
            producer.send(
                signal=ENTERPRISE_PING_SIGNAL,
                topic=ENTERPRISE_CORE_TOPIC,
                event_key_field=ENTERPRISE_PING_KEY_FIELD,
                event_data=ping_event_data(),
                event_metadata=EventsMetadata(
                    event_type=ENTERPRISE_PING_SIGNAL.event_type,
                ),
            )
            producer.prepare_for_shutdown()  # otherwise command may exit before delivery is complete
        except Exception:  # pylint: disable=broad-exception-caught
            logger.exception("Error producing Kafka event")
15 changes: 14 additions & 1 deletion enterprise_subsidy/settings/devstack.py
@@ -33,6 +33,11 @@

}

INSTALLED_APPS += (
    'edx_event_bus_kafka',
    'openedx_events',
)

# Generic OAuth2 variables irrespective of SSO/backend service key types.
OAUTH2_PROVIDER_URL = 'http://edx.devstack.lms:18000/oauth2'

@@ -77,4 +82,12 @@
ENTERPRISE_SUBSIDY_URL = 'http://localhost:18280'
FRONTEND_APP_LEARNING_URL = 'http://localhost:2000'

# Application settings
# Kafka Settings
# "Standard" Kafka settings as defined in https://github.com/openedx/event-bus-kafka/tree/main
EVENT_BUS_KAFKA_SCHEMA_REGISTRY_URL = 'http://edx.devstack.schema-registry:8081'
EVENT_BUS_KAFKA_BOOTSTRAP_SERVERS = 'edx.devstack.kafka:29092'
EVENT_BUS_PRODUCER = 'edx_event_bus_kafka.create_producer'
EVENT_BUS_CONSUMER = 'edx_event_bus_kafka.KafkaEventConsumer'
EVENT_BUS_TOPIC_PREFIX = 'dev'

# Application settings go here...
2 changes: 2 additions & 0 deletions requirements/base.in
@@ -1,6 +1,7 @@
# Core requirements for using this application
-c constraints.txt

confluent-kafka[avro,schema-registry]
Django # Web application framework
djangoql
django-cors-headers
@@ -15,6 +16,7 @@ drf_yasg
edx-auth-backends
edx-django-utils
edx-django-release-util
edx-event-bus-kafka
edx-drf-extensions
edx-rbac
edx-rest-api-client
38 changes: 35 additions & 3 deletions requirements/base.txt
@@ -15,6 +15,8 @@ attrs==23.2.0
# jsonschema
# openedx-events
# referencing
avro==1.11.3
# via confluent-kafka
backports-zoneinfo==0.2.1 ; python_version < "3.9"
# via
# -c requirements/constraints.txt
@@ -28,7 +30,13 @@ cffi==1.16.0
charset-normalizer==3.3.2
# via requests
click==8.1.7
# via edx-django-utils
# via
# code-annotations
# edx-django-utils
code-annotations==1.8.0
# via edx-toggles
confluent-kafka[avro,schema-registry]==2.4.0
# via -r requirements/base.in
cryptography==42.0.7
# via
# pyjwt
@@ -58,7 +66,9 @@ django==4.2.13
# edx-django-release-util
# edx-django-utils
# edx-drf-extensions
# edx-event-bus-kafka
# edx-rbac
# edx-toggles
# jsonfield2
# openedx-events
# openedx-ledger
@@ -71,6 +81,7 @@ django-crum==0.7.9
# via
# edx-django-utils
# edx-rbac
# edx-toggles
django-extensions==3.2.3
# via
# -r requirements/base.in
@@ -97,6 +108,7 @@ django-waffle==4.1.0
# -r requirements/base.in
# edx-django-utils
# edx-drf-extensions
# edx-toggles
djangoql==0.18.1
# via
# -r requirements/base.in
@@ -129,14 +141,18 @@ edx-django-utils==5.13.0
# via
# -r requirements/base.in
# edx-drf-extensions
# edx-event-bus-kafka
# edx-rest-api-client
# edx-toggles
# getsmarter-api-clients
# openedx-events
# openedx-ledger
edx-drf-extensions==10.3.0
# via
# -r requirements/base.in
# edx-rbac
edx-event-bus-kafka==5.7.0
# via -r requirements/base.in
edx-opaque-keys[django]==2.9.0
# via
# edx-ccx-keys
@@ -148,8 +164,12 @@ edx-rbac==1.9.0
# openedx-ledger
edx-rest-api-client==5.7.0
# via -r requirements/base.in
edx-toggles==5.2.0
# via edx-event-bus-kafka
fastavro==1.9.4
# via openedx-events
# via
# confluent-kafka
# openedx-events
getsmarter-api-clients==0.6.1
# via -r requirements/base.in
idna==3.7
@@ -162,6 +182,8 @@ inflection==0.5.1
# via
# drf-spectacular
# drf-yasg
jinja2==3.1.4
# via code-annotations
jsonfield2==4.0.0.post0
# via
# -r requirements/base.in
@@ -170,11 +192,13 @@ jsonschema==4.22.0
# via drf-spectacular
jsonschema-specifications==2023.12.1
# via jsonschema
markupsafe==2.1.5
# via jinja2
mysqlclient==2.2.4
# via
# -r requirements/base.in
# openedx-ledger
newrelic==9.9.0
newrelic==9.9.1
# via edx-django-utils
oauthlib==3.2.2
# via
@@ -184,6 +208,7 @@ oauthlib==3.2.2
openedx-events==9.10.0
# via
# -r requirements/base.in
# edx-event-bus-kafka
# openedx-ledger
openedx-ledger==1.4.2
# via -r requirements/base.in
@@ -212,6 +237,8 @@ pymongo==4.4.0
# via edx-opaque-keys
pynacl==1.5.0
# via edx-django-utils
python-slugify==8.0.4
# via code-annotations
python3-openid==3.2.0
# via social-auth-core
pytz==2024.1
@@ -223,6 +250,7 @@ pytz==2024.1
# openedx-ledger
pyyaml==6.0.1
# via
# code-annotations
# drf-spectacular
# drf-yasg
# edx-django-release-util
@@ -234,6 +262,7 @@ referencing==0.35.1
# jsonschema-specifications
requests==2.31.0
# via
# confluent-kafka
# edx-drf-extensions
# edx-rest-api-client
# requests-oauthlib
@@ -271,8 +300,11 @@ sqlparse==0.5.0
# via django
stevedore==5.2.0
# via
# code-annotations
# edx-django-utils
# edx-opaque-keys
text-unidecode==1.3
# via python-slugify
typing-extensions==4.11.0
# via
# asgiref
2 changes: 1 addition & 1 deletion requirements/ci.txt
@@ -34,5 +34,5 @@ tomli==2.0.1
# tox
tox==4.15.0
# via -r requirements/ci.in
virtualenv==20.26.1
virtualenv==20.26.2
# via tox
10 changes: 10 additions & 0 deletions requirements/common_constraints.txt
@@ -3,6 +3,16 @@
# See BOM-2721 for more details.
# Below is the copied and edited version of common_constraints

# This is a temporary solution to override the real common_constraints.txt
# In edx-lint, until the pyjwt constraint in edx-lint has been removed.
# See BOM-2721 for more details.
# Below is the copied and edited version of common_constraints

# This is a temporary solution to override the real common_constraints.txt
# In edx-lint, until the pyjwt constraint in edx-lint has been removed.
# See BOM-2721 for more details.
# Below is the copied and edited version of common_constraints

# A central location for most common version constraints
# (across edx repos) for pip-installation.
#