
Commit

Make openai server resilient to unusual input and pin response message topic

haixuanTao committed Oct 7, 2024
1 parent 823e00b commit 44b693c
Showing 2 changed files with 12 additions and 8 deletions.
2 changes: 1 addition & 1 deletion examples/openai-server/dataflow.yml
@@ -5,7 +5,7 @@ nodes:
outputs:
- v1/chat/completions
inputs:
-      echo: dora-echo/echo
+      v1/chat/completions: dora-echo/echo

- id: dora-echo
build: pip install -e ../../node-hub/dora-echo
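For context, the dataflow.yml change routes the echo node's reply back to the server on a topic named after the request, so the server can match replies by id. A hedged sketch of the surrounding node entries (the server node's `id`, `build`, and `path` fields are assumptions — they are not visible in this diff):

```yaml
nodes:
  - id: dora-openai-server          # assumed id; not shown in the diff
    build: pip install -e ../../node-hub/dora-openai-server
    path: dora-openai-server
    outputs:
      - v1/chat/completions         # request topic published by the server
    inputs:
      # Reply topic pinned to the same name as the request topic, so the
      # server's event loop can filter on event["id"] == "v1/chat/completions"
      v1/chat/completions: dora-echo/echo

  - id: dora-echo
    build: pip install -e ../../node-hub/dora-echo
    path: dora-echo
```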
18 changes: 11 additions & 7 deletions node-hub/dora-openai-server/dora_openai_server/main.py
@@ -68,13 +68,17 @@ async def create_chat_completion(request: ChatCompletionRequest):
node.send_output("v1/chat/completions", data)

# Wait for response from dora-echo
-    event = node.next(timeout=DORA_RESPONSE_TIMEOUT)
-    if event["type"] == "ERROR":
-        print("Timedout")
-        response_str = "No response received"
-    else:
-        response = event["value"]
-        response_str = response[0].as_py() if response else "No response received"
+    while True:
+        event = node.next(timeout=DORA_RESPONSE_TIMEOUT)
+        if event["type"] == "ERROR":
+            response_str = "No response received. Err: " + event["value"][0].as_py()
+            break
+        elif event["type"] == "INPUT" and event["id"] == "v1/chat/completions":
+            response = event["value"]
+            response_str = response[0].as_py() if response else "No response received"
+            break
+        else:
+            pass

return ChatCompletionResponse(
id="chatcmpl-1234",
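The new wait loop can be sketched in isolation. This is a minimal, hedged example: `FakeNode` and `wait_for_reply` are hypothetical stand-ins (a real dora `Node` yields pyarrow values, not plain lists), but the control flow — skip unrelated events until an ERROR or the pinned reply topic arrives — mirrors the commit:

```python
class FakeNode:
    """Stand-in for a dora Node: yields a fixed stream of event dicts."""

    def __init__(self, events):
        self._events = iter(events)

    def next(self, timeout=None):
        # A real Node.next() would block up to `timeout`; here we just
        # return the next scripted event.
        return next(self._events)


def wait_for_reply(node, topic, timeout=5.0):
    """Loop until an ERROR or the matching INPUT topic arrives,
    skipping unrelated events instead of failing on them."""
    while True:
        event = node.next(timeout=timeout)
        if event["type"] == "ERROR":
            return "No response received. Err: " + str(event["value"][0])
        if event["type"] == "INPUT" and event["id"] == topic:
            value = event["value"]
            return value[0] if value else "No response received"
        # Any other event (unexpected topic, STOP, ...) is ignored.


node = FakeNode([
    {"type": "INPUT", "id": "other/topic", "value": ["ignored"]},
    {"type": "INPUT", "id": "v1/chat/completions", "value": ["hello"]},
])
print(wait_for_reply(node, "v1/chat/completions"))  # hello
```

Without the loop, the first unrelated event (here `other/topic`) would have been mistaken for the reply; with it, the server only accepts the pinned `v1/chat/completions` topic.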