
Commit

The default max_tokens of 215 is too small, so answers are often cut off. This commit raises the default to 512 to address the issue. (infiniflow#845)

### What problem does this PR solve?

The default max_tokens of 215 is too small, so answers are often cut off; raising the default to 512 addresses this.
### Type of change

- [x] Refactoring
dashi6174 authored May 20, 2024
1 parent 8a0ab16 commit a8eb984
Showing 2 changed files with 3 additions and 3 deletions.
2 changes: 1 addition & 1 deletion api/db/db_models.py
@@ -759,7 +759,7 @@ class Dialog(DataBaseModel):
help_text="English|Chinese")
llm_id = CharField(max_length=128, null=False, help_text="default llm ID")
llm_setting = JSONField(null=False, default={"temperature": 0.1, "top_p": 0.3, "frequency_penalty": 0.7,
-                                            "presence_penalty": 0.4, "max_tokens": 215})
+                                            "presence_penalty": 0.4, "max_tokens": 512})
prompt_type = CharField(
max_length=16,
null=False,
4 changes: 2 additions & 2 deletions web/src/constants/knowledge.ts
@@ -31,14 +31,14 @@ export const settledModelVariableMap = {
top_p: 0.3,
frequency_penalty: 0.7,
presence_penalty: 0.4,
-      max_tokens: 215,
+      max_tokens: 512,
},
[ModelVariableType.Balance]: {
temperature: 0.5,
top_p: 0.5,
frequency_penalty: 0.7,
presence_penalty: 0.4,
-      max_tokens: 215,
+      max_tokens: 512,
},
};
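The effect of the `max_tokens` setting can be illustrated with a minimal sketch. Note this is a toy stand-in, not RAGFlow code: real LLM backends count model-specific tokens, not whitespace-separated words, and `apply_max_tokens` is a hypothetical helper. The point is simply that any completion longer than the cap is cut off mid-answer, which is why 215 was too small a default.

```python
def apply_max_tokens(completion: str, max_tokens: int) -> str:
    """Toy cap: truncate a completion to at most max_tokens 'tokens'
    (whitespace-split words here; real tokenizers differ)."""
    tokens = completion.split()
    return " ".join(tokens[:max_tokens])

# A 300-"token" answer: longer than the old cap, within the new one.
long_answer = " ".join(f"tok{i}" for i in range(300))

truncated = apply_max_tokens(long_answer, 215)  # old default: answer cut off
full = apply_max_tokens(long_answer, 512)       # new default: answer intact

print(len(truncated.split()))  # 215
print(len(full.split()))       # 300
```

With the old default, any answer longer than 215 tokens is silently truncated; 512 leaves more headroom while still bounding generation cost.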

Expand Down
