Why Does ChatGPT Forget the Conversation?

The world of chatbots and artificial intelligence continues to evolve, and with that comes a series of intricate processes to understand. One such process is how language models like ChatGPT retain and process chat memories. This article will delve into the way large language models (LLMs) manage memory, particularly focusing on token limitations.

Contextual Memory in Chat Conversations

When you interact with ChatGPT, the chatbot does not retain memories from previous chat sessions. Instead, it keeps track of the ongoing conversation to maintain context.

How Token Limits Work

Each conversation is made up of a series of tokens. In language processing, tokens are the chunks of text a model actually reads: a token is often a whole word, but common words may be a single token while longer or rarer words get split into several pieces, depending on the language and the specific content.
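You don't need a real tokenizer to get an intuition for token counts. A widely cited rule of thumb for English text is roughly four characters per token; the sketch below uses that approximation (the actual count depends on the model's tokenizer):

```python
def estimate_tokens(text: str) -> int:
    """Rough token estimate using the common ~4-characters-per-token
    rule of thumb for English. An approximation, not an exact count."""
    return max(1, len(text) // 4)

# "How does photosynthesis work?" is 29 characters, so roughly 7 tokens.
print(estimate_tokens("How does photosynthesis work?"))
```

A real system would use the model's own tokenizer instead of this heuristic, but the heuristic is close enough to reason about when a long chat is approaching its limit.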

To illustrate, let’s consider an example:

Suppose you initiate a chat with the question, “How does photosynthesis work?”

ChatGPT responds with an explanation.

Later, you follow up with, “And what’s its significance?”

You might assume that you’re only sending your subsequent question to ChatGPT. However, the backend system views it as:

[meta data]

User: How does photosynthesis work?

Assistant: [Explanation of photosynthesis]

User: And what’s its significance?
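The backend view above can be sketched as a list of role-tagged messages. The role names and structure here follow the common chat-API convention, but the exact wire format is an internal detail:

```python
# The full conversation is resent on every turn, so the token count
# grows with each exchange even though you only typed the last line.
conversation = [
    {"role": "system", "content": "[meta data]"},
    {"role": "user", "content": "How does photosynthesis work?"},
    {"role": "assistant", "content": "[Explanation of photosynthesis]"},
    {"role": "user", "content": "And what's its significance?"},
]
```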

As the conversation proceeds, more tokens accumulate. Once the conversation reaches a certain threshold, known as the token limit, earlier parts of the conversation may be discarded to make room for new messages.
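A simple way to picture this discarding is a sliding window over the message history: when the total token count exceeds the budget, the oldest messages are evicted first. This is an illustrative sketch, not ChatGPT's actual implementation (production systems often pin the system prompt and use smarter strategies):

```python
def count_tokens(text):
    """Crude stand-in for a real tokenizer: ~4 characters per token."""
    return max(1, len(text) // 4)

def truncate_to_limit(messages, token_limit):
    """Drop the oldest messages until the conversation fits the budget."""
    total = sum(count_tokens(m["content"]) for m in messages)
    while messages and total > token_limit:
        dropped = messages.pop(0)  # oldest turn is evicted first
        total -= count_tokens(dropped["content"])
    return messages

history = [
    {"role": "user", "content": "How does photosynthesis work?"},
    {"role": "assistant", "content": "Plants convert light into chemical energy..."},
    {"role": "user", "content": "And what's its significance?"},
]
# With a small enough budget, the opening question is the first to go.
trimmed = truncate_to_limit(history, token_limit=20)
```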

How Token Limitation Can Cause the Chatbot to “Forget” the Context

Every LLM has a predefined token limit. Once this limit is reached, the model can’t process more tokens unless some are removed. Consequently, the earliest parts of the chat will be the first to be excluded, creating a sense that the chatbot has “forgotten” those parts.

For example, after a lengthy conversation about various topics, if you were to reference “photosynthesis” again, ChatGPT might not recall the specifics of the initial discussion about it if that part has been pruned due to token limits.

The token limit varies depending on the specific version or type of LLM you're interacting with. For instance, the paid tiers of ChatGPT typically provide access to models with larger token limits than the free version.

How to Get the Chatbot to Remember More of the Conversation Despite Token Limitations

If you’re involved in a long and context-heavy chat with ChatGPT and are concerned about losing key details, there are strategies you can employ.  Periodically asking ChatGPT to summarize the ongoing conversation, or highlighting specific details you want to retain, can help. When the bot responds with the summary or acknowledgment, that information is effectively “refreshed” in the conversation, ensuring it remains in the chat’s active memory for longer.

Moving Forward

As development of AI chatbots continues, there is no doubt that token limits will keep increasing. At the same time, developers are feverishly working on ways to get their chatbots to store and retrieve more information while using fewer tokens per interaction. This will allow chatbots not only to remember more information for deeper interactions, but also to output responses faster using less compute power.

Related Tools

Industry Disruptor

Explore AI’s disruptive potential with this tool. Enter your industry and discover how AI may revolutionize it in the near future.

Poetry Machine

Generate a custom poem based on your preferred structure, rhyme, and tone to match your mood or message.

Slangify

Transform your text into a hilariously slang-stuffed remix! Choose from a lineup of pre-tuned slang styles or describe your own.

Negotiation Navigator

A tool guiding you through complex negotiations with personalized advice and strategies for your unique situation.

Doge Decoder

Transform normal boring text into something "much wow". Have fun, and remember to do only good every day.