refactor: Remove networking imports outside azure core (#3683)
# Description

This pull request removes all networking imports flagged by rule
`C4749 (networking-import-outside-azure-core-transport)` from
[azure-pylint-guidelines-checker](https://github.com/Azure/azure-sdk-tools/tree/2eaf22b95a3a6bbe51380e6efad14c822f1d7d5e/tools/pylint-extensions/azure-pylint-guidelines-checker),
and refactors the code that used them to make HTTP requests through
azure-core.


Concretely, this pull request:

* Adds `src/promptflow-evals/promptflow/evals/_http_utils.py`, which includes sync and async
versions of `azure.core.pipeline.Pipeline` that provide a requests-like API for general HTTP
requests (a sketch of the idea follows this list).
* Refactors the SDK and tests to use those pipelines.
* Removes the dependency on `aiohttp_retry`.
* Removes both duplicate implementations of `AsyncHTTPClientWithRetry`.
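To make the first bullet concrete, here is a rough sketch of how an async, requests-like wrapper over `azure.core.pipeline.AsyncPipeline` could be put together. Only the factory name `get_async_http_client` comes from this PR (it appears in the diff below); the class name, policy list, and timeout handling are illustrative and are not the actual contents of `_http_utils.py`.

```python
# Illustrative sketch only -- not the real promptflow.evals._http_utils module.
from typing import Optional

from azure.core.pipeline import AsyncPipeline
from azure.core.pipeline.policies import AsyncRetryPolicy, UserAgentPolicy
from azure.core.pipeline.transport import AioHttpTransport
from azure.core.rest import AsyncHttpResponse, HttpRequest


class AsyncHttpPipeline:
    """A requests-like facade over an azure-core AsyncPipeline (hypothetical)."""

    def __init__(self) -> None:
        # A real implementation would also manage opening/closing the transport.
        self._pipeline = AsyncPipeline(
            transport=AioHttpTransport(),
            policies=[UserAgentPolicy(), AsyncRetryPolicy()],
        )

    async def request(
        self, method: str, url: str, *, timeout: Optional[float] = None, **kwargs
    ) -> AsyncHttpResponse:
        # HttpRequest accepts headers=, params=, json=, data=, etc.; a timeout is a
        # transport option, so it is forwarded to run() rather than to the request.
        request = HttpRequest(method, url, **kwargs)
        run_kwargs = {} if timeout is None else {"connection_timeout": timeout}
        result = await self._pipeline.run(request, **run_kwargs)
        return result.http_response

    async def get(self, url: str, **kwargs) -> AsyncHttpResponse:
        return await self.request("GET", url, **kwargs)

    async def post(self, url: str, **kwargs) -> AsyncHttpResponse:
        return await self.request("POST", url, **kwargs)


def get_async_http_client() -> AsyncHttpPipeline:
    # Factory with the name used by the refactored call sites in the diff below.
    return AsyncHttpPipeline()
```

Presumably the retry behavior moves into the pipeline's retry policy, which is what allows the `aiohttp_retry` dependency and the duplicated `AsyncHTTPClientWithRetry` helpers to be dropped.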



## Background

This PR is part of an effort to prepare `promptflow-evals` for migration
to Azure/azure-sdk-for-python.

Azure SDKs are not allowed to use networking libraries such as `requests`
or `aiohttp` directly. Instead, they are expected to go through `azure-core`,
which delegates to those libraries under the hood.
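As a rough illustration of that layering (the URL and policy choices below are placeholders, not promptflow code): a request made "through azure-core" travels down a pipeline of policies and ends at a pluggable transport, and it is the transport that wraps the underlying networking library.

```python
# Illustration of the azure-core layering; the URL is a placeholder.
from azure.core.pipeline import Pipeline
from azure.core.pipeline.policies import RetryPolicy, UserAgentPolicy
from azure.core.pipeline.transport import RequestsTransport
from azure.core.rest import HttpRequest

pipeline = Pipeline(
    transport=RequestsTransport(),  # this layer delegates to the `requests` library
    policies=[UserAgentPolicy(base_user_agent="example"), RetryPolicy()],
)

request = HttpRequest("GET", "https://example.invalid/health")
response = pipeline.run(request).http_response  # policies run, then the transport sends
print(response.status_code)
```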


# All Promptflow Contribution checklist:

- [x] **The pull request does not introduce [breaking changes].**
- [ ] **CHANGELOG is updated for new features, bug fixes or other
significant changes.**
- [x] **I have read the [contribution
guidelines](https://github.com/microsoft/promptflow/blob/main/CONTRIBUTING.md).**
- [x] **I confirm that all new dependencies are compatible with the MIT
license.**
- [ ] **Create an issue and link to the pull request to get dedicated
review from promptflow team. Learn more: [suggested
workflow](../CONTRIBUTING.md#suggested-workflow).**


## General Guidelines and Best Practices
- [x] Title of the pull request is clear and informative.
- [x] There are a small number of commits, each of which has an
informative message. This means that previously merged commits do not
appear in the history of the PR. For more information on cleaning up the
commits in your PR, [see this
page](https://github.com/Azure/azure-powershell/blob/master/documentation/development-docs/cleaning-up-commits.md).

### Testing Guidelines
- [x] Pull request includes test coverage for the included changes.
kdestin authored Aug 27, 2024
1 parent f9efeaa commit 8546ad9
Showing 17 changed files with 982 additions and 531 deletions.
45 changes: 29 additions & 16 deletions src/promptflow-evals/promptflow/evals/_common/rai_service.py
@@ -9,7 +9,6 @@
from typing import Dict, List
from urllib.parse import urlparse

import httpx
import jwt
import numpy as np
from azure.core.credentials import TokenCredential
@@ -22,6 +21,8 @@
from constants import CommonConstants, EvaluationMetrics, RAIService, Tasks
from utils import get_harm_severity_level

from promptflow.evals._http_utils import get_async_http_client

try:
version = importlib.metadata.version("promptflow-evals")
except importlib.metadata.PackageNotFoundError:
@@ -61,8 +62,11 @@ async def ensure_service_availability(rai_svc_url: str, token: str, capability:
headers = get_common_headers(token)
svc_liveness_url = rai_svc_url + "/checkannotation"

async with httpx.AsyncClient() as client:
response = await client.get(svc_liveness_url, headers=headers, timeout=CommonConstants.DEFAULT_HTTP_TIMEOUT)
client = get_async_http_client()

response = await client.get( # pylint: disable=too-many-function-args,unexpected-keyword-arg
svc_liveness_url, headers=headers, timeout=CommonConstants.DEFAULT_HTTP_TIMEOUT
)

if response.status_code != 200:
raise Exception( # pylint: disable=broad-exception-raised
@@ -100,8 +104,11 @@ async def submit_request(question: str, answer: str, metric: str, rai_svc_url: s
url = rai_svc_url + "/submitannotation"
headers = get_common_headers(token)

async with httpx.AsyncClient() as client:
response = await client.post(url, json=payload, headers=headers, timeout=CommonConstants.DEFAULT_HTTP_TIMEOUT)
client = get_async_http_client()

response = await client.post( # pylint: disable=too-many-function-args,unexpected-keyword-arg
url, json=payload, headers=headers, timeout=CommonConstants.DEFAULT_HTTP_TIMEOUT
)

if response.status_code != 202:
print("Fail evaluating '%s' with error message: %s" % (payload["UserTextList"], response.text))
@@ -134,8 +141,11 @@ async def fetch_result(operation_id: str, rai_svc_url: str, credential: TokenCre
token = await fetch_or_reuse_token(credential, token)
headers = get_common_headers(token)

async with httpx.AsyncClient() as client:
response = await client.get(url, headers=headers, timeout=CommonConstants.DEFAULT_HTTP_TIMEOUT)
client = get_async_http_client()

response = await client.get( # pylint: disable=too-many-function-args,unexpected-keyword-arg
url, headers=headers, timeout=CommonConstants.DEFAULT_HTTP_TIMEOUT
)

if response.status_code == 200:
return response.json()
@@ -238,15 +248,18 @@ async def _get_service_discovery_url(azure_ai_project: dict, token: str) -> str:
:rtype: str
"""
headers = get_common_headers(token)
async with httpx.AsyncClient() as client:
response = await client.get(
f"https://management.azure.com/subscriptions/{azure_ai_project['subscription_id']}/"
f"resourceGroups/{azure_ai_project['resource_group_name']}/"
f"providers/Microsoft.MachineLearningServices/workspaces/{azure_ai_project['project_name']}?"
f"api-version=2023-08-01-preview",
headers=headers,
timeout=CommonConstants.DEFAULT_HTTP_TIMEOUT,
)

client = get_async_http_client()

response = await client.get( # pylint: disable=too-many-function-args,unexpected-keyword-arg
f"https://management.azure.com/subscriptions/{azure_ai_project['subscription_id']}/"
f"resourceGroups/{azure_ai_project['resource_group_name']}/"
f"providers/Microsoft.MachineLearningServices/workspaces/{azure_ai_project['project_name']}?"
f"api-version=2023-08-01-preview",
headers=headers,
timeout=CommonConstants.DEFAULT_HTTP_TIMEOUT,
)

if response.status_code != 200:
raise Exception("Failed to retrieve the discovery service URL") # pylint: disable=broad-exception-raised
base_url = urlparse(response.json()["properties"]["discoveryUrl"])
