The `requirements.txt` file has been updated to include the following changes:
- Removed the comment for the Google API dependency.
- Added the `anthropic` dependency.
- Added a newline at the end of the file.
🐛 fix(ChatProcess.py): fix removing reaction in Chat class
The `remove_reaction` call in the `Chat` class was not awaited, so the coroutine never ran and the reaction was never actually removed. The fix awaits the call before execution continues.
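A minimal sketch of the fix, assuming a discord.py bot; the helper name and arguments are illustrative, and only the awaited `remove_reaction` call is taken from the commit:

```python
import discord

async def clear_bot_reaction(message: discord.Message, emoji: str, bot_user: discord.ClientUser) -> None:
    """Remove the bot's reaction once processing is finished.

    Before the fix, `message.remove_reaction(...)` returned an un-awaited
    coroutine, so the reaction stayed on the message.
    """
    await message.remove_reaction(emoji, bot_user)  # awaiting actually runs the API call
```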
🐛 fix(prompts.py): fix placeholder name in createPrompt function
The placeholder name `[datetime]` in the `createPrompt` function has been changed to `[date-and-time]` to improve clarity and consistency.
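A small sketch of the substitution after the rename; the helper name and date format are assumptions:

```python
from datetime import datetime

def fill_date_placeholder(prompt: str) -> str:
    """Substitute the renamed [date-and-time] placeholder (formerly [datetime])."""
    return prompt.replace("[date-and-time]", datetime.now().strftime("%Y-%m-%d %H:%M"))
```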
🔧 chore(chat.txt): update Zenith prompt
The Zenith prompt in the `chat.txt` file has been updated with additional instructions that give the AI character more context and guidance for its behavior.
✨ feat(claude.py): add support for Claude model
A new file `claude.py` has been added to the `chatUtils/requesters` directory. It implements the `claude` function, which takes a list of messages as input and generates a response from the Claude model via the Anthropic API.
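A minimal sketch of what such a requester could look like. It uses the current Anthropic Python SDK's Messages API; the model name, token limit, and message handling are assumptions, and the original file may well use the older completion-style client:

```python
import anthropic

client = anthropic.AsyncAnthropic()  # reads ANTHROPIC_API_KEY from the environment

async def claude(messages: list[dict]) -> str:
    """Generate a reply from a Claude model for an OpenAI-style message list."""
    system = "\n".join(m["content"] for m in messages if m["role"] == "system")
    chat = [m for m in messages if m["role"] != "system"]
    response = await client.messages.create(
        model="claude-3-haiku-20240307",   # assumed model name
        max_tokens=1024,
        system=system or anthropic.NOT_GIVEN,
        messages=chat,
    )
    return response.content[0].text
```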
🔧 chore(request.py): add support for Claude model in request function
The `request` function in the `request.py` file has been updated to include support for the Claude model. When the `model` parameter is set to "claude", the function calls the `claude` function from the `claude.py` file to generate a response.
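A hedged sketch of the updated dispatcher, assuming the import paths follow the `chatUtils/requesters` layout mentioned above and that the other branches key off OpenAI model names; only the "claude" routing is stated in the commit:

```python
from chatUtils.requesters.claude import claude
from chatUtils.requesters.openaiChat import openaiChat
from chatUtils.requesters.openaiText import openaiText

async def request(model: str, messages: list[dict], **kwargs) -> str:
    """Route a chat request to the backend that matches `model`."""
    if model == "claude":
        return await claude(messages)          # new branch for the Claude model
    if model == "gpt-3.5-turbo":               # assumed name of the OpenAI chat branch
        return await openaiChat(messages, **kwargs)
    if model == "text-davinci-003":            # assumed name of the OpenAI text branch
        return await openaiText(messages, **kwargs)
    raise ValueError(f"Unsupported model: {model}")
```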
🔧 chore(variousclasses.py): add Claude model to models class
The `models` class in the `variousclasses.py` file has been updated to include the Claude model as an option. The model name "claude" has been added to the `chatModels` list.
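A sketch of the registry after the change; only the presence of "claude" in `chatModels` comes from the commit, the other entries are assumptions:

```python
class models:
    """Registry of the model names the bot accepts (sketch)."""
    chatModels = ["gpt-3.5-turbo", "gpt-4", "claude"]  # "claude" is the new entry
    textModels = ["text-davinci-003"]                  # assumed, not stated in the commit
```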
✨ feat(ChatProcess.py): add support for custom temperature for each character in the 'characters' module to improve chat responses
🐛 fix(openaiChat.py): add 'temperature' parameter to the 'openaiChat' function to allow custom temperature for generating responses
🐛 fix(request.py): add 'custom_temp' parameter to the 'request' function to pass custom temperature to 'openaiChat' function
🐛 fix(openaicaller.py): reduce sleep time from 10 seconds to 5 seconds for retrying API calls to improve responsiveness
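A sketch of a fixed-delay retry helper with the new 5-second sleep; the attempt count, exception handling, and signature of `openai_caller` are assumptions:

```python
import asyncio

RETRY_DELAY = 5  # seconds; was 10 before this change

async def openai_caller(func, *args, max_attempts: int = 3, **kwargs):
    """Call an OpenAI API coroutine, sleeping RETRY_DELAY seconds between retries."""
    for attempt in range(max_attempts):
        try:
            return await func(*args, **kwargs)
        except Exception:  # the real code most likely catches specific OpenAI errors
            if attempt == max_attempts - 1:
                raise
            await asyncio.sleep(RETRY_DELAY)
```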
✨ feat(variousclasses.py): add 'custom_temp' dictionary to store custom temperature values for each character to improve chat responses
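A sketch of how the per-character temperature could flow to the OpenAI requester, assuming the pre-1.0 `openai` SDK; the dictionary contents and default value are illustrative:

```python
import openai

# Per-character temperatures; only the existence of the `custom_temp` mapping
# is stated in the commits, the keys and values here are illustrative.
custom_temp = {
    "Zenith": 0.9,
    "default": 0.7,
}

async def openaiChat(messages: list[dict], temperature: float = 0.7) -> str:
    """Chat completion with a caller-supplied temperature.

    `request(...)` is assumed to look up the character in `custom_temp`
    and forward the value here via its new `custom_temp` parameter.
    """
    response = await openai.ChatCompletion.acreate(
        model="gpt-3.5-turbo",
        messages=messages,
        temperature=temperature,
    )
    return response["choices"][0]["message"]["content"]
```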
🔧 chore(ChatProcess.py): import fetch_messages_history function from Chat module to use it in Chat class
🔧 chore(ChatProcess.py): import moderate and ModerationError from utils.misc module to use them in Chat class
🔧 chore(Chat.py): add fetch_messages_history function to fetch message history from a channel
🔧 chore(Chat.py): add formatContext function to format the context for the bot to use
🔧 chore(Chat.py): raise an exception if no OpenAI API key is set
🔧 chore(Chat.py): add logic to filter and format messages for the context
🔧 chore(Chat.py): fix typo in the import statement for ModerationError
🔧 chore(Chat.py): fix typo in the import statement for moderate
🔧 chore(Chat.py): fix typo in the import statement for fetch_messages_history
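A combined sketch of the new `Chat.py` helpers, assuming a discord.py channel; parameter names, the history limit, and the exact context format are assumptions:

```python
import os
import discord

if not os.getenv("OPENAI_API_KEY"):
    raise RuntimeError("No OpenAI API key is set")

async def fetch_messages_history(channel: discord.abc.Messageable, limit: int = 25) -> list[discord.Message]:
    """Fetch the most recent messages from a channel, oldest first."""
    history = [message async for message in channel.history(limit=limit)]
    return list(reversed(history))

def formatContext(messages: list[discord.Message], bot_user: discord.ClientUser) -> list[dict]:
    """Filter and convert Discord messages into role/content dicts for the requesters."""
    context = []
    for message in messages:
        if not message.content:  # drop attachment- or embed-only messages
            continue
        role = "assistant" if message.author == bot_user else "user"
        context.append({"role": role, "content": f"{message.author.display_name}: {message.content}"})
    return context
```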
🔧 chore(prompts.py): create prompts dictionary and read chat and text prompts from files for each character
🔧 chore(prompts.py): create createPrompt function to create a prompt from the messages list
🔧 chore(prompts.py): create createTextPrompt function to create a text prompt from the messages list
🔧 chore(prompts.py): create createChatPrompt function to create a chat prompt from the messages list
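A hedged sketch of how `prompts.py` could be laid out; the character list, file paths, and prompt formats are assumptions, only the function names come from the commits:

```python
from pathlib import Path

characters = ["Zenith"]  # assumed character list
prompts: dict[str, dict[str, str]] = {}

# Read each character's chat and text prompt files (paths are assumptions).
for character in characters:
    prompts[character] = {
        "chat": Path(f"characters/{character}/chat.txt").read_text(encoding="utf-8"),
        "text": Path(f"characters/{character}/text.txt").read_text(encoding="utf-8"),
    }

def createChatPrompt(messages: list[dict], character: str) -> list[dict]:
    """Prepend the character's system prompt to an OpenAI-style message list."""
    return [{"role": "system", "content": prompts[character]["chat"]}] + messages

def createTextPrompt(messages: list[dict], character: str) -> str:
    """Flatten the conversation into a single completion-style prompt."""
    history = "\n".join(f"{m['role']}: {m['content']}" for m in messages)
    return prompts[character]["text"] + "\n" + history

def createPrompt(messages: list[dict], character: str, kind: str = "chat"):
    """Build either a chat or a text prompt for the given character."""
    if kind == "chat":
        return createChatPrompt(messages, character)
    return createTextPrompt(messages, character)
```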
🔧 chore(requesters/llama.py): create llama function as a placeholder
🔧 chore(requesters/llama2.py): create llama2 function as a placeholder
🔧 chore(requesters/openaiChat.py): import openai_caller from utils.openaicaller module
🔧 chore(requesters/openaiChat.py): create openaiChat function as a placeholder
🔧 chore(requesters/openaiText.py): create openaiText function as a placeholder
🔧 chore(requesters/request.py): import openaiChat, openaiText, llama, and llama2 functions from respective modules
🔧 chore(requesters/request.py): create request function to handle different models and make requests
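A combined sketch of the initial requester modules (in the repo these live in separate files under `chatUtils/requesters`); every signature and model name here is an assumption, and the `request` dispatcher that selects between them is sketched earlier under the Claude change:

```python
import openai
from utils.openaicaller import openai_caller  # shared retry helper

async def openaiChat(messages: list[dict]) -> str:
    """Chat-completion requester, routed through the shared retry helper."""
    response = await openai_caller(
        openai.ChatCompletion.acreate,
        model="gpt-3.5-turbo",
        messages=messages,
    )
    return response["choices"][0]["message"]["content"]

async def openaiText(prompt: str) -> str:
    """Text-completion requester."""
    response = await openai_caller(
        openai.Completion.acreate,
        model="text-davinci-003",
        prompt=prompt,
    )
    return response["choices"][0]["text"]

async def llama(messages: list[dict]) -> str:
    raise NotImplementedError("llama backend is a placeholder")

async def llama2(messages: list[dict]) -> str:
    raise NotImplementedError("llama2 backend is a placeholder")
```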