Understanding the LLM Context Window and Why It Matters

Version: 1.0.0


What are context windows?

A context window is the portion of a conversation an LLM (like ChatGPT) can "see" at any one time. Think of it as short-term memory: it keeps track of your prompts and the model's outputs as you chat. If you start a new chat, that context is lost.
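Conceptually, this works like an append-only list of messages that is re-sent with every request. The sketch below is a minimal illustration of that idea, not any vendor's actual API:

```python
# Minimal sketch: chat history accumulates and forms the context window.
history = []  # everything here is "visible" to the model on the next turn

def send(user_message: str) -> str:
    history.append({"role": "user", "content": user_message})
    # Placeholder for a real model call; a real LLM would read all of `history`.
    reply = f"(model reply to: {user_message})"
    history.append({"role": "assistant", "content": reply})
    return reply

send("What is a context window?")
send("Give me an example.")  # the model also "sees" the first exchange
print(len(history))          # 4 messages so far

history.clear()              # starting a new chat discards the context
```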

A context window has a limit. For example, the limit in Copilot in the Edge browser is 8,000 tokens. A token can be a whole word, part of a word, a character, or a space.
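For a rough sense of scale, a common rule of thumb is about four characters per token for English text. This is a heuristic only; real tokenizers split text into subwords, so actual counts will differ:

```python
def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token for English text.
    # Real tokenizers split on subwords, so actual counts will differ.
    return max(1, len(text) // 4)

prompt = "Summarize the quarterly sales report in three bullet points."
print(estimate_tokens(prompt))  # roughly 15 tokens by this estimate
```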

Why does this matter?

Depending on your use case, it makes a big difference.

On your phone and on the go, for most everyday use cases, such as information retrieval, personal assistance, entertainment, recommendations, learning, and health, the 8,000-token limit will suffice most of the time.

However, in business, or any activity that requires lengthy inputs, this limit won't work. Such use cases need governance, but once that is in place, they can benefit greatly from increased efficiency.

So what are the pros and cons of a larger context window?

Aspect          | Pros                                                  | Cons
Memory          | Remembers more of the conversation                    | More compute = more cost
Flow            | Keeps the conversation smoother; fewer hallucinations | Might focus too much on old, less important info
Complex Tasks   | Can handle longer, more detailed tasks                | Takes longer to respond
User Experience | Fewer times you need to repeat yourself               | Might bring up outdated information
Efficiency      | Less need to remind it of past info                   | Uses more resources with each conversation
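One consequence of a fixed window is that something has to give once the limit is reached. A common approach is to drop the oldest turns first. The sketch below illustrates that idea; the 8,000-token limit mirrors the Copilot example above, and the token counts are made up for illustration:

```python
WINDOW_LIMIT = 8000  # e.g. the Copilot-in-Edge limit mentioned earlier

def trim_history(messages, limit=WINDOW_LIMIT):
    """messages: list of (text, token_count) tuples, oldest first."""
    total = sum(tokens for _, tokens in messages)
    while messages and total > limit:
        _, dropped = messages.pop(0)  # forget the oldest turn first
        total -= dropped
    return messages

turns = [("intro", 5000), ("details", 4000), ("follow-up", 2000)]
print(trim_history(turns))  # the 5,000-token intro no longer fits
```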

Fewer Hallucinations!?

Yes and no. It really depends on the context you are giving the model. The quality of the data will largely determine the output here. Given good data, though, you will get better results and fewer hallucinations. There is much more to this, and context windows are increasing in size as we speak.

Increased Cost

Extended context windows require more computational power, which can result in increased costs.
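As a back-of-the-envelope illustration of why this adds up, per-request cost typically scales with the number of tokens processed. The per-token price below is hypothetical, not any provider's real rate:

```python
PRICE_PER_1K_TOKENS = 0.01  # hypothetical rate in dollars, for illustration only

def request_cost(input_tokens: int, output_tokens: int) -> float:
    # Providers usually bill per token processed, input and output alike.
    return (input_tokens + output_tokens) / 1000 * PRICE_PER_1K_TOKENS

print(request_cost(500, 200))    # a short exchange
print(request_cost(7500, 500))   # filling an 8,000-token window costs far more
```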

Resource Efficiency Trade-offs

While larger context windows can improve understanding, they can also slow down processing and increase costs.


Final Thoughts

  • Larger context windows can be useful in particular business scenarios, but they can come at a financial and performance cost.
  • It's important to understand the output you are expecting so you can choose the best LLM for your requirements.
  • Finally, it's important to understand that the data you input into a context window is probably the most important aspect in achieving the right result.