Botator/src/chatUtils/requesters/request.py
Paillat 73ec66deaa 🔧 chore(requirements.txt): update dependencies
The `requirements.txt` file has been updated to include the following changes:
- Removed the comment for the Google API dependency.
- Added the `anthropic` dependency.
- Added a newline at the end of the file.

🐛 fix(ChatProcess.py): fix removing reaction in Chat class
The `remove_reaction` method in the `Chat` class was not awaited, causing it to not be executed properly. The fix ensures that the method is awaited before continuing execution.
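The bug pattern described above can be illustrated with plain asyncio (this is a minimal illustration, not the actual Botator `Chat` class): calling a coroutine method without `await` merely creates a coroutine object, so the body never runs.

```python
import asyncio

class Chat:
    def __init__(self):
        self.removed = False

    async def remove_reaction(self):
        # Coroutine: its body does not run until it is awaited
        self.removed = True

async def main():
    chat = Chat()
    # Bug: `chat.remove_reaction()` without `await` only creates a
    # coroutine object; Python warns "coroutine was never awaited"
    # and self.removed stays False.

    # Fix: await the coroutine so it executes before continuing
    await chat.remove_reaction()
    return chat.removed

print(asyncio.run(main()))  # True
```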

🐛 fix(prompts.py): fix placeholder name in createPrompt function
The placeholder name `[datetime]` in the `createPrompt` function has been changed to `[date-and-time]` to improve clarity and consistency.
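A substitution like the one described might look as follows (a hypothetical sketch; the real `createPrompt` signature and template handling in `prompts.py` are not shown here):

```python
from datetime import datetime

def create_prompt(template: str) -> str:
    # Replace the [date-and-time] placeholder (formerly [datetime])
    # with the current timestamp.
    now = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
    return template.replace("[date-and-time]", now)

prompt = create_prompt("The current date and time is [date-and-time].")
```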

🔧 chore(chat.txt): update Zenith prompt
The Zenith prompt in the `chat.txt` file has been updated to include additional instructions for the AI character. The update provides more context and guidance for the AI's behavior.

✨ feat(claude.py): add support for Claude model
A new file `claude.py` has been added to the `chatUtils/requesters` directory. It implements the `claude` function, which interacts with the Claude model via the Anthropic API: the function takes a list of messages as input and generates a response from the Claude model.
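At the time, Claude's completion endpoint expected a single flattened prompt string using `"\n\nHuman:"` / `"\n\nAssistant:"` turn markers. The helper below is a hypothetical sketch of the message shaping a `claude` requester might do; the actual API call in `claude.py` is not reproduced here.

```python
# Turn markers matching the constants the anthropic SDK exposed
# as HUMAN_PROMPT and AI_PROMPT (assumed values).
HUMAN_PROMPT = "\n\nHuman:"
AI_PROMPT = "\n\nAssistant:"

def to_claude_prompt(messages: list[dict]) -> str:
    # Flatten OpenAI-style {"role", "content"} messages into one string.
    parts = []
    for msg in messages:
        tag = AI_PROMPT if msg["role"] == "assistant" else HUMAN_PROMPT
        parts.append(f"{tag} {msg['content']}")
    # The prompt must end with the assistant tag so Claude completes it.
    parts.append(AI_PROMPT)
    return "".join(parts)
```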

🔧 chore(request.py): add support for Claude model in request function
The `request` function in the `request.py` file has been updated to include support for the Claude model. When the `model` parameter is set to "claude", the function calls the `claude` function from the `claude.py` file to generate a response.

🔧 chore(variousclasses.py): add Claude model to models class
The `models` class in the `variousclasses.py` file has been updated to include the Claude model as an option. The model name "claude" has been added to the `chatModels` list.
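The change amounts to registering the new model name, roughly like this (a sketch; the other entries in `chatModels` and the surrounding class layout are assumptions, not the verbatim `variousclasses.py`):

```python
class models:
    # Chat-capable model names accepted by request(); adding "claude"
    # makes the dispatcher route to the new Claude requester.
    chatModels = ["gpt-3.5-turbo", "text-llama", "claude"]
```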
2023-10-31 12:08:32 +01:00


import discord
from src.chatUtils.requesters.openaiChat import openaiChat
from src.chatUtils.requesters.openaiText import openaiText
from src.chatUtils.requesters.llama import llama
from src.chatUtils.requesters.llama2 import llama2
from src.chatUtils.requesters.claude import claude
class ModelNotFound(Exception):
    pass


async def request(
    model: str,
    prompt: list[dict] | str,
    openai_api_key: str,
    functions: list[dict] = None,
    custom_temp: float = 1.2,
):
    # Dispatch the request to the requester matching the model name.
    if model == "gpt-3.5-turbo":
        return await openaiChat(
            messages=prompt,
            openai_api_key=openai_api_key,
            functions=functions,
            model=model,
            temperature=custom_temp,
        )
    elif model == "text-davinci-003":
        # return await openaiText(prompt=prompt, openai_api_key=openai_api_key)
        raise NotImplementedError("This model is not supported yet")
    elif model == "text-llama":
        return await llama(prompt=prompt)
    elif model == "text-llama2":
        # return await llama2(prompt=prompt)
        raise NotImplementedError("This model is not supported yet")
    elif model == "claude":
        return await claude(messages=prompt)
    else:
        raise ModelNotFound(f"Model {model} not found")