🔧 chore(claude.py): increase max_tokens_to_sample value to 512 for better completion results

The `max_tokens_to_sample` value in the `anthropic.completions.create` call has been increased from 300 to 512, so the model can generate longer responses before the completion is cut off.
2023-10-31 12:49:31 +01:00
parent be4e54a6b7
commit 7c2a263b23


@@ -20,8 +20,7 @@ async def claude(messages):
elif message["role"] == "function": elif message["role"] == "function":
... ...
prompt += AI_PROMPT prompt += AI_PROMPT
completion = await anthropic.completions.create(stop_sequences=["\n\nHuman (", "\n\nSYSTEM: "], model="claude-2", max_tokens_to_sample=300, prompt=prompt) completion = await anthropic.completions.create(stop_sequences=["\n\nHuman (", "\n\nSYSTEM: "], model="claude-2", max_tokens_to_sample=512, prompt=prompt)
print(prompt)
return { return {
"name": "send_message", "name": "send_message",
"arguments": {"message": completion.completion}, "arguments": {"message": completion.completion},