Prediction Guard Guardrails components (opea-project#677)
* added files for PG guardrails components

Signed-off-by: sharanshirodkar7 <[email protected]>

* Fix pre-commit issues: end-of-file, requirements.txt, trailing whitespace, imports, and formatting

Signed-off-by: sharanshirodkar7 <[email protected]>

* added package

Signed-off-by: sharanshirodkar7 <[email protected]>

* added package

Signed-off-by: sharanshirodkar7 <[email protected]>

* added package

Signed-off-by: sharanshirodkar7 <[email protected]>

* file structure updated to latest

Signed-off-by: sharanshirodkar7 <[email protected]>

* Fix pre-commit issues: end-of-file, requirements.txt, trailing whitespace, imports, and formatting

Signed-off-by: sharanshirodkar7 <[email protected]>

* Add .DS_Store to .gitignore

Signed-off-by: sharanshirodkar7 <[email protected]>

* updated readme,requirements+changes based on feedback

Signed-off-by: sharanshirodkar7 <[email protected]>

* reference classes in __init__.py

Signed-off-by: sharanshirodkar7 <[email protected]>

* fix readme link error

Signed-off-by: sharanshirodkar7 <[email protected]>

* fix readme link error

Signed-off-by: sharanshirodkar7 <[email protected]>

* fix readme link error

Signed-off-by: sharanshirodkar7 <[email protected]>

* fix readme link error

Signed-off-by: sharanshirodkar7 <[email protected]>

* fix readme link error

Signed-off-by: sharanshirodkar7 <[email protected]>

* fix readme link error

Signed-off-by: sharanshirodkar7 <[email protected]>

* Fix pre-commit issues: end-of-file, requirements.txt, trailing whitespace, imports, and formatting

Signed-off-by: sharanshirodkar7 <[email protected]>

* removed duplicates

Signed-off-by: sharanshirodkar7 <[email protected]>

* removed added readme content

Signed-off-by: sharanshirodkar7 <[email protected]>

---------

Signed-off-by: sharanshirodkar7 <[email protected]>
sharanshirodkar7 authored Sep 23, 2024
1 parent a03e7a5 commit 4bbc7a2
Showing 32 changed files with 868 additions and 0 deletions.
16 changes: 16 additions & 0 deletions .github/workflows/docker/compose/guardrails-compose-cd.yaml
@@ -14,3 +14,19 @@ services:
build:
dockerfile: comps/guardrails/toxicity_detection/Dockerfile
image: ${REGISTRY:-opea}/guardrails-toxicity-detection:${TAG:-latest}
guardrails-pii-detection-predictionguard:
build:
dockerfile: comps/guardrails/pii_detection/predictionguard/Dockerfile
image: ${REGISTRY:-opea}/guardrails-pii-predictionguard:${TAG:-latest}
guardrails-toxicity-detection-predictionguard:
build:
dockerfile: comps/guardrails/toxicity_detection/predictionguard/Dockerfile
image: ${REGISTRY:-opea}/guardrails-toxicity-predictionguard:${TAG:-latest}
guardrails-factuality-predictionguard:
build:
dockerfile: comps/guardrails/factuality/predictionguard/Dockerfile
image: ${REGISTRY:-opea}/guardrails-factuality-predictionguard:${TAG:-latest}
guardrails-injection-predictionguard:
build:
dockerfile: comps/guardrails/prompt_injection/predictionguard/Dockerfile
image: ${REGISTRY:-opea}/guardrails-injection-predictionguard:${TAG:-latest}
1 change: 1 addition & 0 deletions .gitignore
@@ -1,2 +1,3 @@
__pycache__
*.egg-info/
.DS_Store
4 changes: 4 additions & 0 deletions comps/__init__.py
@@ -29,6 +29,10 @@
TextImageDoc,
MultimodalDoc,
EmbedMultimodalDoc,
FactualityDoc,
ScoreDoc,
PIIRequestDoc,
PIIResponseDoc,
)

# Constants
20 changes: 20 additions & 0 deletions comps/cores/proto/docarray.py
@@ -20,6 +20,26 @@ class TextDoc(BaseDoc, TopologyInfo):
text: str = None


class FactualityDoc(BaseDoc):
reference: str
text: str


class ScoreDoc(BaseDoc):
score: float


class PIIRequestDoc(BaseDoc):
prompt: str
replace: Optional[bool] = False
replace_method: Optional[str] = "random"


class PIIResponseDoc(BaseDoc):
detected_pii: Optional[List[dict]] = None
new_prompt: Optional[str] = None


class MetadataTextDoc(TextDoc):
metadata: Optional[Dict[str, Any]] = Field(
description="This encloses all metadata associated with the textdoc.",
15 changes: 15 additions & 0 deletions comps/guardrails/factuality/predictionguard/Dockerfile
@@ -0,0 +1,15 @@
# Copyright (C) 2024 Prediction Guard, Inc.
# SPDX-License-Identifier: Apache-2.0

FROM python:3.11-slim

COPY comps /home/comps

RUN pip install --no-cache-dir --upgrade pip && \
pip install --no-cache-dir -r /home/comps/guardrails/factuality/predictionguard/requirements.txt

ENV PYTHONPATH=$PYTHONPATH:/home

WORKDIR /home/comps/guardrails/factuality/predictionguard

ENTRYPOINT ["python", "factuality_predictionguard.py"]
39 changes: 39 additions & 0 deletions comps/guardrails/factuality/predictionguard/README.md
@@ -0,0 +1,39 @@
# Factuality Check Prediction Guard Microservice

[Prediction Guard](https://docs.predictionguard.com) allows you to utilize hosted open access LLMs, LVMs, and embedding functionality with seamlessly integrated safeguards. In addition to providing scalable access to open models, Prediction Guard allows you to configure factual consistency checks, toxicity filters, PII filters, and prompt injection blocking. Join the [Prediction Guard Discord channel](https://discord.gg/TFHgnhAFKd) and request an API key to get started.

Checking for factual consistency helps ensure that LLM hallucinations are caught before being returned to a user. This microservice lets the user compare two text passages (`reference` and `text`). The output is a float between 0.0 and 1.0, where values closer to 1.0 indicate greater factual consistency between `reference` and `text`.

# 🚀 Start Microservice with Docker

## Setup Environment Variables

Set up the following environment variable first:

```bash
export PREDICTIONGUARD_API_KEY=${your_predictionguard_api_key}
```

## Build Docker Images

```bash
cd ../../../../
docker build -t opea/guardrails-factuality-predictionguard:latest -f comps/guardrails/factuality/predictionguard/Dockerfile .
```

## Start Service

```bash
docker run -d --name="guardrails-factuality-predictionguard" -p 9075:9075 -e PREDICTIONGUARD_API_KEY=$PREDICTIONGUARD_API_KEY opea/guardrails-factuality-predictionguard:latest
```

# 🚀 Consume Factuality Check Service

```bash
curl -X POST http://localhost:9075/v1/factuality \
-H 'Content-Type: application/json' \
-d '{
"reference": "The sky is blue.",
"text": "The sky is green."
}'
```
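Beyond `curl`, the service can be exercised from Python. The sketch below is illustrative only: the helper names and the 0.5 decision threshold are assumptions, not part of the service contract, and the actual HTTP call is shown commented out since it needs a running instance.

```python
# Minimal client-side sketch for the factuality endpoint.
# build_factuality_payload and is_consistent are hypothetical helpers;
# the 0.5 threshold is an arbitrary illustrative choice.
import json


def build_factuality_payload(reference: str, text: str) -> str:
    """Serialize a FactualityDoc-style request body."""
    return json.dumps({"reference": reference, "text": text})


def is_consistent(score: float, threshold: float = 0.5) -> bool:
    """Interpret the ScoreDoc: closer to 1.0 means more factually consistent."""
    return score >= threshold


# To call the running service (assumed at localhost:9075):
#   import requests
#   resp = requests.post(
#       "http://localhost:9075/v1/factuality",
#       headers={"Content-Type": "application/json"},
#       data=build_factuality_payload("The sky is blue.", "The sky is green."),
#   )
#   print(is_consistent(resp.json()["score"]))
```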
2 changes: 2 additions & 0 deletions comps/guardrails/factuality/predictionguard/__init__.py
@@ -0,0 +1,2 @@
# Copyright (C) 2024 Prediction Guard, Inc.
# SPDX-License-Identifier: Apache-2.0
@@ -0,0 +1,20 @@
# Copyright (C) 2024 Prediction Guard, Inc
# SPDX-License-Identifier: Apache-2.0

services:
factuality:
image: opea/guardrails-factuality-predictionguard:latest
container_name: guardrails-factuality-predictionguard
ports:
- "9075:9075"
ipc: host
environment:
no_proxy: ${no_proxy}
http_proxy: ${http_proxy}
https_proxy: ${https_proxy}
PREDICTIONGUARD_API_KEY: ${PREDICTIONGUARD_API_KEY}
restart: unless-stopped

networks:
default:
driver: bridge
@@ -0,0 +1,47 @@
# Copyright (C) 2024 Prediction Guard, Inc.
# SPDX-License-Identifier: Apache-2.0


import time

from docarray import BaseDoc
from predictionguard import PredictionGuard

from comps import (
FactualityDoc,
ScoreDoc,
ServiceType,
opea_microservices,
register_microservice,
register_statistics,
statistics_dict,
)


@register_microservice(
name="opea_service@factuality_predictionguard",
service_type=ServiceType.GUARDRAIL,
endpoint="/v1/factuality",
host="0.0.0.0",
port=9075,
input_datatype=FactualityDoc,
output_datatype=ScoreDoc,
)
@register_statistics(names=["opea_service@factuality_predictionguard"])
def factuality_guard(input: FactualityDoc) -> ScoreDoc:
start = time.time()

client = PredictionGuard()

reference = input.reference
text = input.text

result = client.factuality.check(reference=reference, text=text)

statistics_dict["opea_service@factuality_predictionguard"].append_latency(time.time() - start, None)
return ScoreDoc(score=result["checks"][0]["score"])


if __name__ == "__main__":
print("Prediction Guard Factuality initialized.")
opea_microservices["opea_service@factuality_predictionguard"].start()
13 changes: 13 additions & 0 deletions comps/guardrails/factuality/predictionguard/requirements.txt
@@ -0,0 +1,13 @@
aiohttp
docarray
fastapi
huggingface_hub
opentelemetry-api
opentelemetry-exporter-otlp
opentelemetry-sdk
Pillow
predictionguard
prometheus-fastapi-instrumentator
shortuuid
transformers
uvicorn
16 changes: 16 additions & 0 deletions comps/guardrails/pii_detection/predictionguard/Dockerfile
@@ -0,0 +1,16 @@

# Copyright (C) 2024 Prediction Guard, Inc.
# SPDX-License-Identifier: Apache-2.0

FROM python:3.11-slim

COPY comps /home/comps

RUN pip install --no-cache-dir --upgrade pip && \
pip install --no-cache-dir -r /home/comps/guardrails/pii_detection/predictionguard/requirements.txt

ENV PYTHONPATH=$PYTHONPATH:/home

WORKDIR /home/comps/guardrails/pii_detection/predictionguard

ENTRYPOINT ["python", "pii_predictionguard.py"]
50 changes: 50 additions & 0 deletions comps/guardrails/pii_detection/predictionguard/README.md
@@ -0,0 +1,50 @@
# PII Detection Prediction Guard Microservice

[Prediction Guard](https://docs.predictionguard.com) allows you to utilize hosted open access LLMs, LVMs, and embedding functionality with seamlessly integrated safeguards. In addition to providing scalable access to open models, Prediction Guard allows you to configure factual consistency checks, toxicity filters, PII filters, and prompt injection blocking. Join the [Prediction Guard Discord channel](https://discord.gg/TFHgnhAFKd) and request an API key to get started.

Detecting Personally Identifiable Information (PII) is important to ensure that users aren't sending private data to LLMs. This service allows you to configurably:

1. Detect PII
2. Replace PII (with "faked" information)
3. Mask PII (with placeholders)
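To make the three modes concrete, here is a toy sketch of the idea. This is NOT Prediction Guard's implementation; it only illustrates detect-vs-mask using a regex for US-style phone numbers.

```python
# Toy illustration of PII detection and masking (hypothetical helpers,
# not the Prediction Guard service logic).
import re

PHONE_RE = re.compile(r"\b\d{3}-\d{3}-\d{4}\b")


def detect_pii(prompt: str) -> list:
    """Mode 1: report each match with its type and character positions."""
    return [
        {"type": "phone", "start": m.start(), "end": m.end()}
        for m in PHONE_RE.finditer(prompt)
    ]


def mask_pii(prompt: str) -> str:
    """Mode 3: substitute a placeholder for each match."""
    return PHONE_RE.sub("<PHONE>", prompt)
```

Mode 2 (replacement with "faked" information) would follow the same shape, substituting plausible synthetic values instead of a placeholder.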

# 🚀 Start Microservice with Docker

## Setup Environment Variables

Set up the following environment variable first:

```bash
export PREDICTIONGUARD_API_KEY=${your_predictionguard_api_key}
```

## Build Docker Images

```bash
cd ../../../../
docker build -t opea/guardrails-pii-predictionguard:latest -f comps/guardrails/pii_detection/predictionguard/Dockerfile .
```

## Start Service

```bash
docker run -d --name="guardrails-pii-predictionguard" -p 9080:9080 -e PREDICTIONGUARD_API_KEY=$PREDICTIONGUARD_API_KEY opea/guardrails-pii-predictionguard:latest
```

# 🚀 Consume PII Detection Service

```bash
curl -X POST http://localhost:9080/v1/pii \
-H 'Content-Type: application/json' \
-d '{
"prompt": "My name is John Doe and my phone number is 555-555-5555.",
"replace": true,
"replace_method": "random"
}'
```

API parameters:

- `prompt` (string, required): The text in which you want to detect PII (typically the prompt that you anticipate sending to an LLM)
- `replace` (boolean, optional, default is `false`): `true` if you want to replace the detected PII in the `prompt`
- `replace_method` (string, optional, default is `random`): The method used to replace PII (one of `random`, `fake`, `category`, or `mask`)
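The parameters above can be validated client-side before sending. The request-builder below is an illustrative assumption (not part of the service), enforcing the documented defaults and the allowed `replace_method` values:

```python
# Hypothetical request builder for the /v1/pii endpoint, enforcing the
# documented parameter defaults and allowed replace_method values.
VALID_REPLACE_METHODS = {"random", "fake", "category", "mask"}


def build_pii_request(prompt: str, replace: bool = False,
                      replace_method: str = "random") -> dict:
    """Build a PIIRequestDoc-style JSON body, rejecting unknown methods."""
    if replace_method not in VALID_REPLACE_METHODS:
        raise ValueError(
            f"replace_method must be one of {sorted(VALID_REPLACE_METHODS)}"
        )
    return {"prompt": prompt, "replace": replace,
            "replace_method": replace_method}
```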
2 changes: 2 additions & 0 deletions comps/guardrails/pii_detection/predictionguard/__init__.py
@@ -0,0 +1,2 @@
# Copyright (C) 2024 Prediction Guard, Inc.
# SPDX-License-Identifier: Apache-2.0
@@ -0,0 +1,20 @@
# Copyright (C) 2024 Prediction Guard, Inc
# SPDX-License-Identifier: Apache-2.0

services:
pii:
image: opea/guardrails-pii-predictionguard:latest
container_name: pii-predictionguard
ports:
- "9080:9080"
ipc: host
environment:
no_proxy: ${no_proxy}
http_proxy: ${http_proxy}
https_proxy: ${https_proxy}
PREDICTIONGUARD_API_KEY: ${PREDICTIONGUARD_API_KEY}
restart: unless-stopped

networks:
default:
driver: bridge
@@ -0,0 +1,54 @@
# Copyright (C) 2024 Prediction Guard, Inc.
# SPDX-License-Identifier: Apache-2.0


import json
import time
from typing import List, Optional

from docarray import BaseDoc
from predictionguard import PredictionGuard

from comps import (
PIIRequestDoc,
PIIResponseDoc,
ServiceType,
opea_microservices,
register_microservice,
register_statistics,
statistics_dict,
)


@register_microservice(
name="opea_service@pii_predictionguard",
service_type=ServiceType.GUARDRAIL,
endpoint="/v1/pii",
host="0.0.0.0",
port=9080,
input_datatype=PIIRequestDoc,
output_datatype=PIIResponseDoc,
)
@register_statistics(names=["opea_service@pii_predictionguard"])
def pii_guard(input: PIIRequestDoc) -> PIIResponseDoc:
start = time.time()

client = PredictionGuard()

prompt = input.prompt
replace = input.replace
replace_method = input.replace_method

result = client.pii.check(prompt=prompt, replace=replace, replace_method=replace_method)

statistics_dict["opea_service@pii_predictionguard"].append_latency(time.time() - start, None)
if "new_prompt" in result["checks"][0].keys():
return PIIResponseDoc(new_prompt=result["checks"][0]["new_prompt"])
elif "pii_types_and_positions" in result["checks"][0].keys():
detected_pii = json.loads(result["checks"][0]["pii_types_and_positions"])
return PIIResponseDoc(detected_pii=detected_pii)


if __name__ == "__main__":
print("Prediction Guard PII Detection initialized.")
opea_microservices["opea_service@pii_predictionguard"].start()
13 changes: 13 additions & 0 deletions comps/guardrails/pii_detection/predictionguard/requirements.txt
@@ -0,0 +1,13 @@
aiohttp
docarray
fastapi
huggingface_hub
opentelemetry-api
opentelemetry-exporter-otlp
opentelemetry-sdk
Pillow
predictionguard
prometheus-fastapi-instrumentator
shortuuid
transformers
uvicorn
16 changes: 16 additions & 0 deletions comps/guardrails/prompt_injection/predictionguard/Dockerfile
@@ -0,0 +1,16 @@

# Copyright (C) 2024 Prediction Guard, Inc.
# SPDX-License-Identifier: Apache-2.0

FROM python:3.11-slim

COPY comps /home/comps

RUN pip install --no-cache-dir --upgrade pip && \
pip install --no-cache-dir -r /home/comps/guardrails/prompt_injection/predictionguard/requirements.txt

ENV PYTHONPATH=$PYTHONPATH:/home

WORKDIR /home/comps/guardrails/prompt_injection/predictionguard

ENTRYPOINT ["python", "injection_predictionguard.py"]