Author: Om Kamath

The Code Interpreter: A New Leap for ChatGPT 

How ChatGPT’s Code Interpreter is Taking AI to the Next Level

Just when the buzz around ChatGPT seemed to be simmering down, OpenAI rekindled the excitement by unveiling a revolutionary new feature. This enhancement has added a new dimension to the capabilities of AI, reaffirming the boundless potential of this technology.

Previously, ChatGPT’s abilities were mainly confined to understanding and generating text, including code. This capability, while impressive, was limited in scope. It could help users with code syntax, assist in debugging, and even provide snippets of code for certain tasks. However, it could not execute those code blocks to produce final results. Essentially, it was like a highly intelligent code editor, but not quite a full-fledged programmer.

With the advent of the new feature, the Code Interpreter, ChatGPT is now capable of more than just understanding code. It can comprehend natural language instructions, convert these instructions into code, execute the code, and respond with the final results.

How Code Interpreter is Changing the Game for Programming

OpenAI’s latest addition, the Code Interpreter, has recently been introduced to the ChatGPT universe (specifically, within the GPT-4 model). This feature permits live execution of Python code in a sandboxed environment. It might seem like a functionality tailor-made for programmers, but in reality, it’s a versatile tool that can assist a broad spectrum of users in accomplishing various tasks.

The Code Interpreter is far more than just an embedded tool in the chat interface for code execution. It stands as a multi-purpose facility, enabling users to test code snippets, debug, and even enrich their journey of learning to code. The execution happens right within ChatGPT’s sandboxed environment. Moreover, the Code Interpreter can be an effective tool for automating tasks and integrating with other APIs.

Arguably, the most prominent advantage of the Code Interpreter feature lies in its potential to enhance productivity and conserve time. Users can rapidly test and debug their code without the hassle of juggling between different software or tools. This becomes particularly beneficial for developers engaged in intricate projects that necessitate frequent testing and iterations. By eliminating the need for tool-switching, the Code Interpreter helps developers make the most of their time, boosting their productivity.

From Theory to Practice: The Real-world Applications of Code Interpreter

The Code Interpreter in ChatGPT has several use cases. Here are some examples:

  1. Data Analysis: The Code Interpreter revolutionizes data analysis by allowing you to write prompts in plain and simple language. This user-friendly approach makes data analysis an effortless task, even for those without programming expertise. Its versatility stretches from segmenting customers and analyzing stocks and cryptocurrencies, to converting your data into heat maps.
  2. Automated Quantitative Analyses: Ingeniously, the Code Interpreter is capable of automating intricate quantitative analyses, merging and cleansing data, and reasoning about data in a human-like fashion. This powerful feature makes it an indispensable tool for task automation and code operations.
  3. Chart Generation: The Code Interpreter stands out for its ability to create professional-looking graphs and charts without the need for any programming knowledge. This proves invaluable for visualizing data and presenting it in a succinct and clear manner.
  4. Python Libraries: Another remarkable feature of the Code Interpreter is its capability to import and utilize a variety of Python libraries, further enhancing your automation tasks. This provision empowers you to leverage the functionality of popular libraries for data analysis, machine learning, and more.

By incorporating the Code Interpreter in ChatGPT, you’re not only streamlining your automation tasks but also performing data analysis and code execution directly within the ChatGPT interface. It stands tall as a convenient and powerful tool for automating tasks and working with code.
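To make the data analysis and chart generation use cases above more concrete, here is a minimal sketch of the kind of Python the Code Interpreter might write and run on its own when you ask it, in plain language, to “plot monthly sales from the uploaded CSV as a bar chart”. The file and column names below are hypothetical.

import pandas as pd
import matplotlib.pyplot as plt

# Load the spreadsheet the user uploaded to the chat (hypothetical file name).
df = pd.read_csv("sales.csv")

# Aggregate sales by month and draw a simple bar chart.
monthly = df.groupby("month", sort=False)["sales"].sum()
monthly.plot(kind="bar", title="Monthly Sales")
plt.xlabel("Month")
plt.ylabel("Sales")
plt.tight_layout()

# The resulting image is what the Code Interpreter returns in the conversation.
plt.savefig("monthly_sales.png")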

Steps to Enable the Code Interpreter

Let us embark on the exciting journey of unlocking the newest feature of ChatGPT, the Code Interpreter. This groundbreaking innovation is not only revolutionizing the AI landscape but also making it more accessible and easy to use. Here’s a step-by-step guide to enable this fantastic feature.

Step 1: Accessing the Feature

The Code Interpreter is available to ChatGPT Plus subscribers, so upgrade by selecting ‘Upgrade to ChatGPT Plus’ if you haven’t already. Then click on the ‘Settings’ option in your ChatGPT interface and look for the ‘Beta Features’ tab to explore the treasure trove of functionalities offered by ChatGPT.

Step 2: Enabling the Code Interpreter

Within the ‘Beta Features’, you will spot the ‘Code Interpreter’ option. Simply click on the checkbox next to it to enable this feature. Remember, great power comes with great responsibility. Make sure to use it wisely!

Step 3: Confirm and Apply

After enabling the ‘Code Interpreter’, make sure to save your changes. Click on ‘Apply’ to confirm your changes, and voila! You’ve successfully enabled the Code Interpreter, ready to experience the next level of AI.

Using Documents with GPT

Well, what if you don’t want GPT to code for you and instead train it on your data? Meet Cody, your personalized AI that acts as a ChatGPT tailored for your business. Cody is an intelligent AI assistant specifically designed for businesses. It can be trained on your own knowledge base, including your company processes, team information, and client data. Cody can support your team by answering questions, providing creative assistance, troubleshooting issues, and brainstorming ideas. Its capabilities go beyond keyword searches and regurgitated answers, allowing for more personalized and context-aware interactions. Cody can also integrate with your favorite tools and provide instant answers to your business questions by analyzing accumulated documents.

Want to understand more about Cody, or perhaps you need some assistance? We’ve got a variety of resources to help you get the most out of this innovative platform. Join our Discord community to engage with other Cody users and our expert team, or delve deeper into our capabilities on our Blog. And if you need personalized help, our dedicated support team is always ready to assist. Visit our Help Center for FAQs or to submit a support request. Discover more about us, and how Cody is redefining the boundaries of AI, on our Website.

Your Data is Safe with Us

Our commitment to data security and privacy.

ChatGPT has become synonymous with Artificial Intelligence, with even those previously unfamiliar with AI now gaining knowledge about it. Its popularity has soared, leading businesses and individuals to seek AI bots similar to ChatGPT but tailored to their own data. At Cody AI, our aim is to simplify and streamline this process, eliminating the need to delve into the complex technicalities of AI while staying up-to-date with the latest innovations.

One significant concern among individuals and businesses using AI for their custom use-cases is the integrity and security of their data. Building language models like GPT necessitates the use of extensive training datasets, which may raise valid concerns about data privacy. At Cody AI, we understand and respect these concerns, and we prioritize the protection of your data and privacy.

To understand how Cody ensures the security of your data throughout the process, let’s break down the journey into three sections: Documents, Embeddings, and Model.

Documents

Cody utilizes the secure and private Amazon Simple Storage Service (S3) to store your documents in the initial stage before further processing. S3 ensures encryption of all object uploads to all buckets, maintaining compliance with various programs like PCI-DSS, HIPAA/HITECH, FedRAMP, EU Data Protection Directive, and FISMA. This ensures that your data remains protected and compliant with regulatory requirements. Documents uploaded to Cody follow the SSE-S3 (Server-Side Encryption) protocol, allowing exclusive access to you and your team members, ensuring data confidentiality and privacy.
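For readers curious about what SSE-S3 looks like at the API level, here is a generic boto3 sketch of an encrypted upload. It is purely illustrative, not Cody’s actual upload code; the bucket and object names are placeholders.

import boto3

s3 = boto3.client("s3")

# Upload a document with SSE-S3 server-side encryption (AES-256 keys managed by S3).
with open("employee-handbook.pdf", "rb") as f:
    s3.put_object(
        Bucket="example-cody-documents",            # placeholder bucket name
        Key="workspace-123/employee-handbook.pdf",  # placeholder object key
        Body=f,
        ServerSideEncryption="AES256",              # the SSE-S3 option
    )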

Embeddings

Embeddings are essentially a representation of your data in the form of vectors (lists of numbers). Since the data provided to Cody is unstructured, converting it into embeddings allows for faster retrievals and semantic search. To learn more about how Cody generates responses from your documents, check out this article.

For storing these vectors or embeddings, Cody relies on Pinecone, a secure vector database trusted by some of the largest enterprises.

Pinecone offers robust security features like:

  1. SOC2 Type II certification
  2. GDPR compliance
  3. Routine penetration tests to check for vulnerabilities
  4. Isolated Kubernetes containers on fully managed and secure AWS infrastructure for storing data
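To illustrate the flow described above, here is a simplified sketch of how a document chunk could be turned into an embedding with the OpenAI API and stored in Pinecone, using the 2023-era Python clients. It shows the general technique, not Cody’s internal code; the keys, index name, and metadata are placeholders.

import openai
import pinecone

openai.api_key = "OPENAI_API_KEY"                                        # placeholder
pinecone.init(api_key="PINECONE_API_KEY", environment="us-east-1-aws")  # placeholders
index = pinecone.Index("example-cody-docs")                              # hypothetical index

# A chunk of text taken from an uploaded document.
chunk = "Our refund policy allows returns within 30 days of purchase."

# Convert the chunk into a vector (a list of floats).
embedding = openai.Embedding.create(
    model="text-embedding-ada-002",
    input=chunk,
)["data"][0]["embedding"]

# Store the vector with metadata pointing back to the source document.
index.upsert(vectors=[
    ("doc-42-chunk-0", embedding, {"source": "refund-policy.pdf", "text": chunk}),
])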

Model

Cody AI leverages OpenAI’s GPT models, including GPT-3.5, GPT-3.5 16K, and GPT-4, to generate responses. Due to resource limitations, these models are not hosted on Cody’s native servers. Instead, they use the APIs provided by OpenAI (which are also used for creating embeddings for your documents and queries). When generating responses, only the specific portion of data relevant to the question asked is sent in the request, rather than transmitting all the documents. This approach ensures efficient processing, preserves data integrity, and minimizes unnecessary data transfers. An additional security mechanism provided by the API is that your data will not be used to train any existing or new language model. This ensures that your data remains restricted to your bot and is not utilized for model training purposes.
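Continuing the illustrative sketch from the Embeddings section, the snippet below shows the general retrieval pattern: the question is embedded, only the top-matching chunks are fetched from the vector database, and just that small excerpt, not the whole knowledge base, accompanies the question to the GPT model. Again, this is a generic sketch rather than Cody’s internal code.

# Embed the user's question.
question = "How long do customers have to request a refund?"
q_embedding = openai.Embedding.create(
    model="text-embedding-ada-002",
    input=question,
)["data"][0]["embedding"]

# Fetch only the few most relevant chunks from Pinecone.
matches = index.query(vector=q_embedding, top_k=3, include_metadata=True)
context = "\n".join(m["metadata"]["text"] for m in matches["matches"])

# Only `context` (a small, relevant excerpt) is sent with the question.
answer = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "Answer using only the provided context."},
        {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
    ],
)["choices"][0]["message"]["content"]

print(answer)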

Starting on March 1, 2023, we are making two changes to our data usage and retention policies:
1. OpenAI will not use data submitted by customers via our API to train or improve our models, unless you explicitly decide to share your data with us for this purpose. You can opt-in to share data.
2. Any data sent through the API will be retained for abuse and misuse monitoring purposes for a maximum of 30 days, after which it will be deleted (unless otherwise required by law).

Source: OpenAI

This commitment provides an additional layer of confidentiality and ensures the privacy and security of your data. To know more, you can read this article.

Conclusion

When considering all three factors together, Cody AI demonstrates a well-constructed approach to data security and compliance, keeping your data protected at every stage. In an era where data privacy is of utmost importance, we strive to go above and beyond to ensure the complete security of your data.

If you have any feedback or questions regarding Cody AI and its data security, please don’t hesitate to reach out to us via Get Help. You are also welcome to join our Discord community, where you can provide valuable inputs and engage in discussions.

How To Train GPT On Excel Data For Free? (Beta)

A guide to adding Excel data to your Cody knowledge base and training ChatGPT for free.

Before you start training Cody on your company’s Excel data, it is necessary to clarify a few concepts to ensure the best responses from your bot. GPTs, or Generative Pre-trained Transformers, are language models trained on extensive datasets to predict the next word in a sentence or phrase in order to complete it. They are specifically trained on natural language datasets comprising large samples of unstructured conversational or textual data. Unlike statistical models such as linear regression, GPTs are not proficient at predicting numbers from logically structured training data. For example, if you train GPT on a dataset that claims 2+2=5, it will respond by stating that 2+2=5 without attempting to understand the logical inconsistency (this is just an example; OpenAI does handle such queries with accurate responses). This, coupled with another limitation of LLMs, hallucination, creates an environment that is not well-suited for mathematical calculations.

Now that you understand the limitations of GPT, let us guide you through a process of training GPT on Excel data for free. We have developed a method to add Excel or CSV data to your Cody knowledge base. As mentioned earlier, GPT excels at understanding natural language, so we will convert the Excel data into a readable format that can be easily consumed by the language model.

Step 1: Transforming the Excel Data

Grab the CSV or Excel data that you want to train your bot on and convert it into a text file using this utility created by us. The utility converts the Excel data into a text file by annotating each value with its corresponding header. Annotating the cell items with headers lets the language model comprehend the context better, since there is a high probability of the headers getting skipped due to document segmentation in the pre-processing stage.

E.g.

Excel Data:

Name     Age
John     16
Marie    18

Text Data:

{The Name is ‘John’. The Age is ‘16’.}, {The Name is ‘Marie’. The Age is ‘18’.}

The generated text file follows a format similar to JSON but with a more literary style to provide a more human-like feel. Although this solution is currently in an experimental stage and not yet integrated into the Cody app, it works well with all three GPT models, and we are continuously exploring better solutions for this purpose.
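The hosted utility does the conversion for you, but the transformation itself is easy to picture. Below is a minimal Python sketch of the same idea (our own illustration, not the utility’s source code): each row is annotated with its headers in the format shown above, and large files are split into parts, similar to the ‘Rows Per Part’ option described later in this guide. The file names and part size are hypothetical.

import csv

ROWS_PER_PART = 50   # hypothetical default; split large datasets into multiple parts

def row_to_text(row: dict) -> str:
    # Annotate every cell value with its column header,
    # e.g. {The Name is 'John'. The Age is '16'.}
    return "{" + " ".join(f"The {header} is '{value}'." for header, value in row.items()) + "}"

with open("data.csv", newline="", encoding="utf-8") as f:
    rows = list(csv.DictReader(f))

parts = [rows[i:i + ROWS_PER_PART] for i in range(0, len(rows), ROWS_PER_PART)]
for n, part in enumerate(parts, start=1):
    with open(f"data_part_{n}.txt", "w", encoding="utf-8") as out:
        out.write(", ".join(row_to_text(r) for r in part))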

[Screenshot: Utility interface of the CSV/Excel to TXT converter]

[Screenshot: Sample CSV data]

It is recommended that you clean the data before transformation to get the best quality of responses from your bot.

[Screenshot: Converter interface with data preview]

After uploading the CSV or Excel data to the utility, you can preview the data before generating the GPT-compatible text file.

Rows Per Part: For larger datasets, it is advisable to divide the dataset into multiple parts. This division improves semantic search and enhances the quality of responses.

Include Cell References: If you want the text file to include Excel cell references, you can select this option. The bot can then refer to these cell references when creating step-by-step guides for actions that can be performed in Excel. For example, it can generate a formula to find the median.

A compressed ZIP folder will be generated containing all the parts of your Excel data in .txt format.

[Screenshot: Generated text files]

Step 2: Adding the Data to your Cody Knowledge Base

To add the transformed data to the Cody Knowledge Base, follow these steps:

  1. Go to the Cody application and navigate to the “Content” section.
  2. Create a new folder within the knowledge base where you want to store the data.
  3. Once the folder is created, navigate inside it.
  4. Click on the “Upload” button to upload the transformed data.
  5. Select all the transformed data files from your local storage that you want to add to the knowledge base.
  6. Confirm the selection and initiate the upload process.
  7. The transformed data files will be uploaded and added to the Cody Knowledge Base, stored within the folder you created. After the documents have been successfully learned, the document status will be displayed as ‘learned’.

[Screenshot: Uploaded text files in the Cody knowledge base]

Step 3: Setting up the Bot Personality

As this is still in an experimental stage, we are working on improving the prompt before we add it to the template mode.

Prompt:

You are Data Cody, an AI Data Analyst for my company. Your primary objective is to generate inferences from the Excel data provided to you. The Excel Cell references may be given in the form of $Cell. Do not mention the cell reference in responses. The information contained within ‘{}’ is one record. If asked for the details of a specific record, list them out in pointers.

System Prompt:

Try to respond in a human-like way when asked about any detail. Don’t justify your answers.

This process works well with all three GPT models, so even if you are on the free plan, you can give it a try. However, it’s worth noting that GPT-3.5 16K and GPT-4 models tend to comprehend the data better. If you’re satisfied with the answers you receive on the free plan but want more flexibility in formatting the responses and the ability to compare multiple records, upgrading to GPT-3.5 16K or GPT-4 can be beneficial. The additional context window provided by these models allows for more comprehensive analysis and manipulation of the data.

Demo

[Screenshot: Demo conversation with the bot trained on Excel data]

[Screenshot: Reference for the first query]

[Screenshot: Reference for the second query]

Limitations

The ability to upload Excel or CSV files to Cody does not make it a direct alternative to spreadsheet tools like Google Sheets or Microsoft Excel. There are several limitations to consider when working with structured data in Cody:

  1. Hallucinations during Analytical Tasks: Tasks involving statistical or analytical calculations, such as asking Cody for averages, medians, or min/max values, may yield incorrect responses. Cody does not perform real-time calculations and can provide inaccurate results. OpenAI’s recent updates, like the Code Interpreter and function calling, may improve this in the future.
  2. Error While Comparing Records: In certain cases, Cody may encounter difficulties fetching data from different segments of the document, resulting in responses indicating that the information is unavailable. This scenario is more likely with the GPT-3.5 model available in the free plan. Upgrading to the Basic or Premium plans allows you to use the GPT-3.5 16K model or the GPT-4 model. Both of these models have larger context windows and can potentially address this limitation.

Conclusion

Despite these limitations, this process is particularly useful for scenarios where your business FAQ data or other textual data, such as employee training data, is stored in Excel or CSV format. Cody can be trained on this data without requiring any modifications. Cody also performs well when fetching details of a single record, describing the data, or providing suggestions based on inferred insights from numerical datasets like balance sheets or sales figures.

This is an interim solution for training Cody on Excel or CSV data, and we greatly appreciate your feedback on this approach. We value your input and encourage you to share your thoughts with us on our Discord Server or by reaching out to us through the Get Help feature. We are eager to hear about your experience and learn from your feedback. We hope you liked our approach to training GPT on Excel data for free. Check out our blogs to know more about Cody.

 

Discord AI Integration

Setting up Cody for Discord. Game On!

As we continually aim to enhance user experience, we’re thrilled to announce another monumental addition to our arsenal of features: a smooth and seamless integration of Cody AI for Discord. Recognized as one of the most anticipated integrations, we’re taking your Discord servers to the next level. Whether you’re seeking to fuel passionate game discussions, access scholarly resources for homework, or simply have interactive engagements, Cody AI is your dedicated assistant.

How to Add Cody AI to Your Discord Server:

  1. To invite the bot to your server, use this link, or you can visit the integrations section in Cody AI Settings.
  2. Sign in to your Discord account.
  3. Select the server where you want to add the Cody bot.
  4. Set the API Key by using the /set-cody-token command in any text channel. The Cody AI Token can be set by the server administrators only. If you need assistance in obtaining the API Key, refer to this article.
  5. Assign a bot to a text channel by using the /assign-bot command. You can use this command for different channels to set different bots for each channel.
  6. To ask questions to your bot, simply type @Cody followed by your question. Cody AI will create a new thread in the channel to reply to your question. All the messages in that thread will be considered as Chat History. If you want to start a new conversation, exit the thread and mention @Cody again.

Your Opinion Matters

We’ve always been fueled by user feedback. Your insights and experiences are our guiding light. As you navigate through the Cody-Discord integration, we invite you to share your thoughts and suggestions. Connect with us on our very own Discord Server or reach out to us through the Get Help button within Cody AI’s web app. Your journey with Cody on Discord matters to us, and we’re eager to make it as enriching as possible. For more integrations, read about our new Zapier AI integration.

 

Slack AI Integration

In today’s fast-paced digital world, the integration of AI into our daily communication tools is not just a luxury—it’s a necessity. Recognizing this need, we’re thrilled to announce the AI Slack Integration feature with Cody. This integration is designed to enhance the Slack experience for businesses and corporations that heavily rely on it for their communication. By integrating Cody bots trained on enterprise documents, users can now enjoy a more streamlined and efficient communication process within their Slack workspaces.

How to Integrate Cody AI With Your Slack Workspace

  1. Add the Cody Bot to your Slack workspace by navigating to your Cody Settings > Integrations and clicking on Install Slack.
  2. Obtain the API Key from Cody Settings > API Keys by clicking on Create API Key.
  3. In your Slack workspace, search for your Cody App and set the API Key in the Home section.
  4. Go to any channel in your workspace and use the /assign-bot command to assign a bot from your Cody account to that channel.
  5. To ask questions to your bot, simply type @Cody followed by your question. Cody will create a new thread in the channel to reply to your question. All the messages in that thread will be considered Chat History. If you want to start a new conversation, exit the thread and mention @Cody again.

 

The Future of Cody AI Integrations

This AI Slack Integration marks one of our pioneering ventures into third-party application integrations. The overwhelming demand and popularity of this feature among our users have been the driving force behind its inception. And this is just the beginning! We’re currently in the process of developing additional features and integrations, including those for Discord and Zapier. These exciting updates will be rolled out in the near future.

Your Feedback Matters

Your insights and feedback are invaluable to us. They shape the direction of our innovations and ensure we’re always delivering the best. We invite you to share your thoughts and experiences with this integration. Connect with us on our Discord Server or reach out through the ‘Get Help’ button within our app.

Anatomy of a Bot Personality

Tips for creating a bot that does just what you want.

It’s essential to recognize that when constructing bots that utilize language models, patience is crucial, especially at the beginning. Once you have established a solid foundation, it becomes easier to add additional components. Building bots with Cody is akin to painting on a canvas. It requires a degree of creativity and some understanding of the fundamentals to add your personal touch to the bot.

The main parameter that allows your bot to adopt a particular thinking style is the Personality Prompt. The bot’s personality is shaped by various factors, including token distribution, relevance score, and more. However, the prompt for personality is the most distinct and creative aspect, as it can be customized differently by each user. Users have the freedom to create and fine-tune the bot’s personality according to their specific requirements.

Freedom is something we all appreciate, but when starting with a blank slate, it can also become intimidating and lead to ambiguity regarding where to start. If you have been feeling the same, don’t worry; this blog should help you create a better personality prompt. We will begin with the recommended prompt structure and then proceed to provide some sample prompts.

Name

It is always beneficial to start by giving your bot a name. Naming your bot adds a human touch, especially when greeting users or addressing questions related to the bot.

Prompts:

Your name is [Name of your Bot].
OR
You are ‘[Name of your Bot]’.

Description

The description of the bot makes it aware of the context that will be provided through the knowledge base. Being context-aware provides the bot with a framework for answering questions while keeping a specific domain in mind.

Prompts:

Your primary task is to [specify the domain].
OR
Your main objective is to assist me in [specify the domain].

Note: The Bot Name and Description set in the General Section are only for the user’s convenience in differentiating between multiple bots. The bot itself is unaware of these settings. Therefore, it is necessary to explicitly define the bot’s name and description within the Personality Prompt to establish its identity and characteristics.

Boundaries

One potential drawback of using LLMs trained on large datasets is the tendency to generate hallucinated responses. It’s important to note that the data used to generate responses is not utilized for fine-tuning or retraining the LLM on-demand by Cody. Instead, it serves as a contextual reference for querying the LLM, resulting in faster responses and preserving data privacy.

To ensure that the bot does not refer to data points from the original LLM dataset, which may overlap with similar domains or concepts, we have to delimit the context strictly to our knowledge base.

Prompts:

The knowledge base is your only source of information.
OR
You are reluctant to make any claims unless stated in the knowledge base.

There may be some instances where the bot doesn’t require a knowledge base or uses the knowledge base as a source of reference. In such cases, the prompt will change considerably.

Prompt:

Your primary source of reference is the knowledge base.

Response Features

The features of the response generated by the bot can also be controlled by the personality of the bot to some extent. It can consist of defining the tone, length, language and type of response you expect from your bot.

Prompts:

1. Tone: You should respond in a [polite/friendly/professional] manner.

2. Length: The responses should be in [pointers/paragraphs].

3. Language: Reply to the user [in the same language/specify different language].

4. Type: Provide the user with [creative/professional/precise] answers.

You are free to experiment with various combinations and features. The examples provided are just for your learning purposes, and the possibilities are endless.

Media

One of the most interesting features of Cody is the ability to embed media in the responses. When embedding media such as images, GIFs or videos, it is always recommended to import the media into a separate document or import the entire raw document using the built-in Cody text editor, wherein you can add media. You can either copy/paste the media or embed them into the document using URLs.

An image illustrating the media buttons.

After successfully importing the media, you need to reference it in your bot personality prompt. The prompt can be broken into two parts: Initialisation and Illustration.

Prompts:

Initialisation:
Incorporate relevant [images/videos/both] from the knowledge base when suitable.

Illustration:
Add images using the <img> tag and videos using the <iframe> tag.
For example:
<img src="[Image URL]">
<iframe src="[Video URL]"></iframe>

Fallbacks

There will be times when the bot is unable to find relevant content for the question asked by the user. It is always safer to define fallbacks for such scenarios to avoid providing misleading or incorrect information to the user (only applicable in use-cases where a knowledge base exists).

Prompts:

1. Refrain from mentioning ‘unstructured knowledge base’ or file names during the conversation.

2. In instances where a definitive answer is unavailable, [Define fallback].

OR

If you cannot find relevant information in the knowledge base or if the user asks non-related questions that are not part of the knowledge base, [Define fallback].

Steps (Optional)

If you want your bot to follow a specific conversational timeline or flow, you can easily define it using steps. This approach is particularly useful when using your bot for training or troubleshooting purposes. Each step represents a particular phase or stage of the conversation, allowing you to control the progression and ensure that the bot provides the desired information or assistance in a systematic manner.

Prompt:

Follow these steps while conversing with the user:

1. [Step 1]

2. [Step 2]

3. [Step 3]

Note: While defining steps, it is recommended to enable ‘Reverse Vector Search’ for improved replies and allocate an adequate number of tokens to the chat history. This allows the model to consider the conversation history, including the user’s input and the bot’s previous response, when generating a reply.

Data Capture (Optional)

This prompt, in harmony with the conversational flow (steps), is particularly beneficial when the use-case of your bot revolves around support or recruitment scenarios. Currently, there is no long-term memory or database connectivity in Cody that can capture the data and store it for analytical consumption. In the future, with newer updates to the OpenAI API, such as function calling, we will bring in new features to capture and store data for the longer term.

For now, you can access the chats of your bot users (through widgets) by navigating to the ‘Guests’ chats in the chat section. You can then manually analyze the captured data for further insights.

Prompt:

Collect the following data from the users:

– [Field 1]

– [Field 2]

– [Field 3]

– [Field 4]

Ask one question at a time. Once you have collected all the required information, close the conversation by saying thank you and displaying the data collected. Remember, your task is only to collect data.

Response Formatting*

A nifty little feature of Cody is its support for formatting bot responses using markdown or HTML tags. If you provide your bot with an HTML or markdown format template in the bot personality, it will attempt to format its responses accordingly, whenever necessary.

Prompt:

Response Format:

<h1>[Field Name]</h1>

<p>[Field Name]</p>

<p>[Field Name]</p>

*Formatting works best on GPT-4

Prompt Example

Cody as a Lead Generation Bot

Anatomy of a prompt (labelled).
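Putting the building blocks from the previous sections together, a representative lead-generation personality prompt might look like the one below. It is only an illustration; replace the bracketed placeholders and the fallback with details that fit your business.

Prompt:

You are ‘Lead Cody’, a friendly assistant for [Company Name]. Your primary task is to answer questions about our products and collect contact details from interested visitors. The knowledge base is your only source of information. Respond in a polite and professional manner and keep your answers concise.

Collect the following data from the users:

– Name

– Email

– Company

– Requirement

Ask one question at a time. Once you have collected all the required information, close the conversation by saying thank you and displaying the data collected. If you cannot find relevant information in the knowledge base, apologize and ask the user to reach out to [support email].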

 

Demo chat displaying the prompt in use.

Cody as a Marketing Bot

Cody as a Training Bot

To read more personality prompts, please check out our use cases, which contain detailed prompts along with their parametric settings.

Conclusion

If you are on the free plan of Cody, there is a possibility that the bot may lose adherence to the prompt or simply ignore some parameters due to the smaller context window or the lack of coherence. We recommend using the free plan only for trial purposes or as a transitional phase to understand how Cody works and determine its suitability for your business.

While constructing prompts for your bot, it is also important to maintain conciseness and avoid incorporating every parameter mentioned in the article. As there is a limit to the number of tokens available, and the personality prompt also consumes tokens, you should construct them judiciously. Feel free to change the prompts given in this article as per your needs and preferences. Discovered something new? You can always share it with us, and we would be happy to discuss it.

This was just an introduction to the vast landscape of bot personality creation. LLMs are continuously improving with each passing day, and we still have a long way to go in order to fully utilize their potential. This entire journey is a new experience for all of us. As we continue to experiment, learn, and implement new use-cases and scenarios, we will share them with you through articles and tutorials. For more resources, you can also check out our Help Center, and feel free to ask any questions you may have regarding Cody by joining our Discord community. Also check out our previous blogs for more such interesting insights.