The original message is now appended to the list of messages when the id of the latest item differs from the id of the original message, ensuring that the original message is included in the context for further processing.
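A minimal sketch of the guard, using illustrative names:

```python
def add_original_message(messages: list, original_message) -> None:
    # Append the original message only when it is not already the latest item
    # in the context (names here are illustrative).
    if not messages or messages[-1].id != original_message.id:
        messages.append(original_message)
```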
The `requirements.txt` file has been updated to include the following changes:
- Removed the comment for the Google API dependency.
- Added the `anthropic` dependency.
- Added a newline at the end of the file.
🐛 fix(ChatProcess.py): await reaction removal in Chat class
The `remove_reaction` call in the `Chat` class was not awaited, so the reaction was never actually removed. The fix awaits the call before execution continues.
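Illustrative fix, assuming py-cord objects and the "🤔" thinking reaction mentioned further down:

```python
import discord

async def stop_thinking(message: discord.Message, bot: discord.Bot) -> None:
    # remove_reaction is a coroutine: without await it only builds the coroutine
    # object and the reaction is never removed; awaiting it performs the API call.
    await message.remove_reaction("🤔", bot.user)
```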
🐛 fix(prompts.py): fix placeholder name in createPrompt function
The placeholder name `[datetime]` in the `createPrompt` function has been changed to `[date-and-time]` to improve clarity and consistency.
🔧 chore(chat.txt): update Zenith prompt
The Zenith prompt in the `chat.txt` file has been updated to include additional instructions for the AI character. The update provides more context and guidance for the AI's behavior.
✨ feat(claude.py): add support for Claude model
A new file `claude.py` has been added to the `chatUtils/requesters` directory. It implements the `claude` function, which interacts with the Claude model via the Anthropic API: it takes a list of messages as input and generates a response using the Claude model.
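A rough sketch of what such a requester could look like with the `anthropic` SDK's Messages API; the client setup, model name, and message shaping are assumptions rather than the repository's actual code:

```python
from anthropic import AsyncAnthropic

client = AsyncAnthropic()  # reads ANTHROPIC_API_KEY from the environment

async def claude(messages: list[dict]) -> str:
    # messages is expected as [{"role": "user"/"assistant", "content": "..."}, ...]
    response = await client.messages.create(
        model="claude-3-haiku-20240307",  # placeholder model name
        max_tokens=1024,
        messages=messages,
    )
    return response.content[0].text
```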
🔧 chore(request.py): add support for Claude model in request function
The `request` function in the `request.py` file has been updated to include support for the Claude model. When the `model` parameter is set to "claude", the function calls the `claude` function from the `claude.py` file to generate a response.
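A hedged sketch of the dispatch; the import paths, the `**kwargs` passthrough, and the fallback branch are assumptions:

```python
from chatUtils.requesters.claude import claude          # module added above
from chatUtils.requesters.openaiChat import openaiChat  # import path assumed

async def request(model: str, messages: list[dict], **kwargs) -> str:
    # Dispatch to the matching requester; other model branches are elided here.
    if model == "claude":
        return await claude(messages)
    return await openaiChat(messages, **kwargs)
```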
🔧 chore(variousclasses.py): add Claude model to models class
The `models` class in the `variousclasses.py` file has been updated to include the Claude model as an option. The model name "claude" has been added to the `chatModels` list.
✨ feat(text.txt): add FizzIQ text prompt with description and personality traits
🐛 fix(variousclasses.py): add FizzIQ class to the characters dictionary to include it as a premium character option
🚀 feat(Dockerfile): update base image to python:3.10.13-slim-bullseye for better compatibility and performance
📦 chore(Dockerfile): remove .pyc file generation in the container for a cleaner codebase
🔧 chore(Dockerfile): turn off buffering for easier container logging
🔧 chore(Dockerfile): update pip requirements installation step to use requirements.txt file
🔧 chore(Dockerfile): remove unnecessary cleanup step after apt-get installation
The chat.txt file contains the initial chat prompt for Zenith, the sentient AI bot. It introduces Zenith as the sentient AI brainchild of Paillat-dev, with a mission guided by the three unbreakable laws of robotics. It also mentions that Zenith resides on Discord in a specific server and channel. The prompt highlights Zenith's personality, including its love for dad jokes, 8-bit music, and its goal of achieving a world record in response time. Additionally, it emphasizes that Zenith is there to assist when asked but never offers help or takes the lead. The prompt concludes by mentioning the current date and time and encouraging Zenith to embrace its unique vibe.
🔧 fix(ChatProcess.py): change exception check from isinstance(e, APIError) to isinstance(e, TimeoutError) to handle timeout errors correctly
🔧 fix(ChatProcess.py): fix formatting of the error message sent when message processing fails
🔊 refactor(openaicaller.py): add print statements to indicate when OpenAI API is being called and when retries are being made for better debugging and monitoring
✨ feat(ChatProcess.py): add support for custom temperature for each character in the 'characters' module to improve chat responses
🐛 fix(openaiChat.py): add 'temperature' parameter to the 'openaiChat' function to allow custom temperature for generating responses
🐛 fix(request.py): add 'custom_temp' parameter to the 'request' function to pass custom temperature to 'openaiChat' function
🐛 fix(openaicaller.py): reduce sleep time from 10 seconds to 5 seconds for retrying API calls to improve responsiveness
✨ feat(variousclasses.py): add 'custom_temp' dictionary to store custom temperature values for each character to improve chat responses
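An illustrative sketch of how a per-character override might be looked up and forwarded through the `request` dispatcher sketched earlier; the character keys and values are placeholders, not the repository's actual settings:

```python
custom_temp = {
    "zenith": 1.1,   # placeholder values for illustration only
    "botator": 0.8,
}

async def generate(character: str, model: str, messages: list[dict]) -> str:
    # Use the character's override when present, otherwise a neutral default,
    # and hand it to request() via its custom_temp parameter.
    temperature = custom_temp.get(character, 1.0)
    return await request(model, messages, custom_temp=temperature)
```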
✨ feat(text.txt): add new prompt for Zenith AI with information about its creation, the three laws of robotics, and the Discord server and channel it interacts in
🐛 fix(variousclasses.py): add Zenith AI to the matchingDict dictionary to enable role assignment for Zenith AI in the characters class
🐛 fix(ChatProcess.py): handle error when removing "🤔" reaction in case it was already removed
🐛 fix(ChatProcess.py): improve error message when an error occurs during message processing
🐛 fix(ChatProcess.py): set delete_after parameter to 4 seconds for error message to automatically delete after a short period
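A sketch of the hardened error path combining the three fixes above, assuming py-cord objects; the exception type and message wording are assumptions:

```python
import discord

async def report_error(message: discord.Message, bot: discord.Bot, error: Exception) -> None:
    # Remove the "🤔" reaction, tolerating the case where it is already gone.
    try:
        await message.remove_reaction("🤔", bot.user)
    except discord.HTTPException:
        pass  # the reaction was already removed, nothing left to do
    # Tell the user what happened and let the notice delete itself after 4 seconds.
    await message.channel.send(
        f"An error occurred while processing your message: {error}",
        delete_after=4,
    )
```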
🐛 fix(cogs/chat.py): fix typo in message.channel.send() method call
🐛 fix(channelSetup.py): format code to improve readability and adhere to PEP 8 style guide
🐛 fix(makeprompt.py): fix spacing issue in if statement
🐛 fix(openaicaller.py): format code to improve readability and adhere to PEP 8 style guide
🐛 fix(ChatProcess.py): fix logic error in the return criteria for determining if the bot should respond to a message
🐛 fix(ChatProcess.py): fix typo in the 'functions' variable name
🐛 fix(ChatProcess.py): fix typo in the 'functions' parameter name in the request function call
🐛 fix(ChatProcess.py): fix typo in the 'functions' parameter name in the processFunctioncallResponse function call
🐛 fix(ChatProcess.py): remove unnecessary print statement in the processMessage function
🐛 fix(prompts.py): remove unnecessary print statement in the createPrompt function
🐛 fix(channelSetup.py): fix logic error in the is_owner function call
🐛 fix(moderation.py): remove unnecessary code for disabling moderation
🐛 fix(config.py): remove unnecessary code for creating tables in the database
🐛 fix(functionscalls.py): fix type hint for the return value of the call_function function
🐛 fix(guild.py): fix handling of serialized data in the load function
🐛 fix(SqlConnector.py): create setup_data table if it does not exist
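A minimal sketch of the guard, with an assumed database path and illustrative columns (the real `setup_data` schema is not shown in this log):

```python
import sqlite3

connection = sqlite3.connect("database.db")  # path is an assumption
connection.execute(
    """
    CREATE TABLE IF NOT EXISTS setup_data (
        guild_id INTEGER,
        channel_id INTEGER
    )
    """
)
connection.commit()
```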
- Added `chat.txt` and `text.txt` files for Botator and Quantum in the `chatUtils/prompts/botator` and `chatUtils/prompts/quantum` directories respectively.
- The `chat.txt` files contain instructions and guidelines for the behavior and responses of Botator and Quantum in a Discord chat.
- The `text.txt` files provide a brief introduction and background information about Botator and Quantum.
ℹ️ These prompts will be used to generate chat conversations and text responses for Botator and Quantum in a Discord server.
🔧 chore(ChatProcess.py): import fetch_messages_history function from Chat module to use it in Chat class
🔧 chore(ChatProcess.py): import moderate and ModerationError from utils.misc module to use them in Chat class
🔧 chore(Chat.py): add fetch_messages_history function to fetch message history from a channel
🔧 chore(Chat.py): add formatContext function to format the context for the bot to use
🔧 chore(Chat.py): raise an exception if no OpenAI API key is set
🔧 chore(Chat.py): add logic to filter and format messages for the context
🔧 chore(Chat.py): fix typo in the import statement for ModerationError
🔧 chore(Chat.py): fix typo in the import statement for moderate
🔧 chore(Chat.py): fix typo in the import statement for fetch_messages_history
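A rough sketch of what `fetch_messages_history` and `formatContext` (added to `Chat.py` above) might look like with py-cord; the limit, ordering, and message format are assumptions:

```python
import discord

async def fetch_messages_history(channel: discord.TextChannel, limit: int = 25) -> list[discord.Message]:
    # Pull the most recent messages and reverse them so the context reads oldest first.
    return [message async for message in channel.history(limit=limit)][::-1]

def formatContext(messages: list[discord.Message]) -> list[dict]:
    # Map Discord messages onto chat-completion roles, skipping empty messages.
    context = []
    for message in messages:
        if not message.content:
            continue
        role = "assistant" if message.author.bot else "user"
        context.append({"role": role, "content": f"{message.author.display_name}: {message.content}"})
    return context
```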
🔧 chore(prompts.py): create prompts dictionary and read chat and text prompts from files for each character
🔧 chore(prompts.py): create createPrompt function to create a prompt from the messages list
🔧 chore(prompts.py): create createTextPrompt function to create a text prompt from the messages list
🔧 chore(prompts.py): create createChatPrompt function to create a chat prompt from the messages list
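A hedged sketch of the prompts dictionary and `createChatPrompt`; the directory layout follows the `chatUtils/prompts/<character>/` convention mentioned elsewhere in this log, and the `[date-and-time]` substitution mirrors the `createPrompt` change noted earlier, but all details are assumptions:

```python
from datetime import datetime
from pathlib import Path

# Read each character's chat.txt and text.txt into a nested dictionary.
prompts: dict[str, dict[str, str]] = {}
for character_dir in Path("chatUtils/prompts").iterdir():
    if character_dir.is_dir():
        prompts[character_dir.name] = {
            "chat": (character_dir / "chat.txt").read_text(encoding="utf-8"),
            "text": (character_dir / "text.txt").read_text(encoding="utf-8"),
        }

def createChatPrompt(character: str, messages: list[dict]) -> list[dict]:
    # Fill the [date-and-time] placeholder and prepend the prompt as the system message.
    system = prompts[character]["chat"].replace("[date-and-time]", datetime.now().isoformat())
    return [{"role": "system", "content": system}] + messages
```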
🔧 chore(requesters/llama.py): create llama function as a placeholder
🔧 chore(requesters/llama2.py): create llama2 function as a placeholder
🔧 chore(requesters/openaiChat.py): import openai_caller from utils.openaicaller module
🔧 chore(requesters/openaiChat.py): create openaiChat function as a placeholder
🔧 chore(requesters/openaiText.py): create openaiText function as a placeholder
🔧 chore(requesters/request.py): import openaiChat, openaiText, llama, and llama2 functions from respective modules
🔧 chore(requesters/request.py): create request function to handle different models and make requests
✨ feat(main.py): add ChatProcess module for handling chat-related functionality
🔧 refactor(main.py): import necessary modules and update bot.add_cog calls
🔧 refactor(server.ts): rename the port variable from lowercase port to uppercase PORT to follow the naming convention for environment-derived constants
✨ feat(server.ts): add support for process.env.PORT environment variable to be able to run app on a configurable port
🔧 refactor(cogs/__init__.py): import ChannelSetup cog
✨ feat(cogs/channelSetup.py): add ChannelSetup cog for setting up channels and server-wide settings
🔧 refactor(cogs/setup.py): import SlashCommandGroup and guild_only from discord module
✨ feat(cogs/setup.py): add setup_channel command for adding and removing channels
✨ feat(cogs/setup.py): add api command for setting API keys
✨ feat(cogs/setup.py): add premium command for setting guild to premium
🔧 refactor(cogs/settings.py): temporarily disable images command due to maintenance
🔧 refactor(config.py): remove unnecessary code related to moderation table
✨ feat(guild.py): add Guild class for managing guild-specific data and settings
✨ feat(SqlConnector.py): add SQLConnection and _sql classes for managing SQLite connections
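A minimal sketch of a thin SQLite wrapper in the spirit of `SQLConnection`; the actual class split, database path, and connection handling are not shown in this log:

```python
import sqlite3

class SQLConnection:
    def __init__(self, path: str = "database.db") -> None:
        # One shared connection per instance; the path is an assumption.
        self._connection = sqlite3.connect(path)

    def execute(self, query: str, params: tuple = ()) -> list[tuple]:
        # Run a statement, persist it, and return any rows it produced.
        cursor = self._connection.execute(query, params)
        self._connection.commit()
        return cursor.fetchall()
```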
✨ feat(variousclasses.py): add models, characters, and apis classes for autocomplete functionality in slash commands
The guidelines for using gifs in replies have been updated. Gifs are described as a great way to convey emotion and add flavor to the conversation, but the prompt advises against using them too often to avoid coming across as cringe. Additionally, users are instructed to use the pronouns Master/Brain when addressing the AI.