Add perf tests for eventgrid (#16949)
* initialize

* implementation

* Update sdk/eventgrid/azure-eventgrid/tests/perfstress_tests/README.md

* comments
Rakshith Bhyravabhotla authored Feb 26, 2021
1 parent ffba8cc commit 5e6e0b4
Showing 3 changed files with 121 additions and 0 deletions.
48 changes: 48 additions & 0 deletions sdk/eventgrid/azure-eventgrid/tests/perfstress_tests/README.md
# EventGrid Performance Tests

In order to run the performance tests, the `azure-devtools` package must be installed. This is done as part of the `dev_requirements`.
Start by creating a new virtual environment for your perf tests. This will need to be a Python 3 environment, preferably >=3.7.

### Setup for test resources

These tests will run against a pre-configured EventGrid topic. The following environment variables will need to be set for the tests to access the live resources:
```
EG_ACCESS_KEY=<access key of your eventgrid account>
EG_TOPIC_HOSTNAME=<hostname of the eventgrid topic>
```
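
To confirm that the variables are set and that the topic accepts events before running the full perf suite, a minimal sketch along the following lines can publish a single-event batch. This is illustrative only and not part of the perf framework, but it uses the same client, credential, and event types as the tests:

```python
# Sanity-check sketch: publish one event using the configured environment variables.
import os

from azure.core.credentials import AzureKeyCredential
from azure.eventgrid import EventGridEvent, EventGridPublisherClient

client = EventGridPublisherClient(
    endpoint=os.environ["EG_TOPIC_HOSTNAME"],
    credential=AzureKeyCredential(os.environ["EG_ACCESS_KEY"]),
)

# The perf test sends a list of EventGridEvents; a single-item list works the same way.
client.send([
    EventGridEvent(
        event_type="Contoso.Items.ItemReceived",
        data={"services": ["EventGrid"]},
        subject="Door1",
        data_version="2.0",
    )
])
```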

### Setup for perf test runs

```cmd
(env) ~/azure-eventgrid> pip install -r dev_requirements.txt
(env) ~/azure-eventgrid> pip install -e .
```

## Test commands

```cmd
(env) ~/azure-eventgrid> cd tests
(env) ~/azure-eventgrid/tests> perfstress
```

### Common perf command line options
These options are available for all perf tests:
- `--duration=10` Number of seconds to run as many operations (the "run" function) as possible. Default is 10.
- `--iterations=1` Number of test iterations to run. Default is 1.
- `--parallel=1` Number of tests to run in parallel. Default is 1.
- `--warm-up=5` Number of seconds to spend warming up the connection before measuring begins. Default is 5.
- `--sync` Whether to run the tests in sync or async. Default is False (async).
- `--no-cleanup` Whether to keep newly created resources after test run. Default is False (resources will be deleted).

### EventGrid Test options
These options are available for all EventGrid perf tests:
- `--num-events` Number of events to be published using the `send` method. Defaults to 100 (see the sketch below and `send.py` for how it is declared).
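
For reference, here is a standalone sketch of how an option like this behaves; the real declaration lives in `EventGridPerfTest.add_arguments` in `send.py` later in this commit:

```python
# Standalone sketch mirroring the --num-events declaration used by the test.
import argparse

parser = argparse.ArgumentParser()
parser.add_argument('-n', '--num-events', nargs='?', type=int,
                    help='Number of events to be sent. Defaults to 100', default=100)

args = parser.parse_args(['--num-events', '100'])
print(args.num_events)  # 100; the test builds a batch of this many EventGridEvents
```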

### T2 Tests
The tests currently written for the T2 SDK:
- `EventGridPerfTest` Publishes a list of EventGrid events.

## Example command
```cmd
(env) ~/azure-eventgrid/tests> perfstress EventGridPerfTest --num-events=100
```
Empty file.
73 changes: 73 additions & 0 deletions sdk/eventgrid/azure-eventgrid/tests/perfstress_tests/send.py
#-------------------------------------------------------------------------
# Copyright (c) Microsoft Corporation. All rights reserved.
# Licensed under the MIT License. See License.txt in the project root for
# license information.
#--------------------------------------------------------------------------

import asyncio
from azure_devtools.perfstress_tests import PerfStressTest

from azure.eventgrid import EventGridPublisherClient as SyncPublisherClient, EventGridEvent
from azure.eventgrid.aio import EventGridPublisherClient as AsyncPublisherClient

from azure.core.credentials import AzureKeyCredential

class EventGridPerfTest(PerfStressTest):
    def __init__(self, arguments):
        super().__init__(arguments)

        # auth configuration
        topic_key = self.get_from_env("EG_ACCESS_KEY")
        endpoint = self.get_from_env("EG_TOPIC_HOSTNAME")

        # Create clients
        self.publisher_client = SyncPublisherClient(
            endpoint=endpoint,
            credential=AzureKeyCredential(topic_key)
        )
        self.async_publisher_client = AsyncPublisherClient(
            endpoint=endpoint,
            credential=AzureKeyCredential(topic_key)
        )

        # Build the event batch once up front so the run methods only measure the send call.
        self.event_list = []
        for _ in range(self.args.num_events):
            self.event_list.append(EventGridEvent(
                event_type="Contoso.Items.ItemReceived",
                data={
                    "services": ["EventGrid", "ServiceBus", "EventHubs", "Storage"]
                },
                subject="Door1",
                data_version="2.0"
            ))

    async def close(self):
        """This is run after cleanup.
        Use this to close any open handles or clients.
        """
        await self.async_publisher_client.close()
        await super().close()

    def run_sync(self):
        """The synchronous perf test.
        Try to keep this minimal and focused, using only a single client API.
        Avoid any ancillary logic (e.g. generating UUIDs); put that in the setup/init instead
        so that we're only measuring the client API call.
        """
        self.publisher_client.send(self.event_list)

    async def run_async(self):
        """The asynchronous perf test.
        Try to keep this minimal and focused, using only a single client API.
        Avoid any ancillary logic (e.g. generating UUIDs); put that in the setup/init instead
        so that we're only measuring the client API call.
        """
        await self.async_publisher_client.send(self.event_list)

    @staticmethod
    def add_arguments(parser):
        super(EventGridPerfTest, EventGridPerfTest).add_arguments(parser)
        parser.add_argument('-n', '--num-events', nargs='?', type=int, help='Number of events to be sent. Defaults to 100', default=100)
