What to do when a document doesn't fit in the AI prompt window

No matter how big the prompt window is, it’s never long enough. We deal with this problem all the time: we ask AI to find information in a document, and the prompt turns out to be too long. We could split the input manually into several parts, but that gets tedious very quickly.

In this article, I show you how to automate splitting the document into smaller parts, passing them to AI, and then combining the results into a single response.

The first thing we need is a RecursiveCharacterTextSplitter. The splitter breaks the document into chunks while trying to keep entire paragraphs intact. If that’s not possible, it falls back to keeping whole sentences or at least whole words. It won’t cut a word in half unless that’s the only way to split the text.

from langchain.text_splitter import RecursiveCharacterTextSplitter


splitter = RecursiveCharacterTextSplitter(chunk_size=4000)
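
To get a feel for what the splitter produces, we can split the document ourselves and inspect the chunks. This is just an illustrative sketch; a_very_long_article is the same placeholder variable for your document text that appears later in the article:

chunks = splitter.split_text(a_very_long_article)
print(len(chunks))                   # how many chunks the splitter produced
print(max(len(c) for c in chunks))   # each chunk stays at or below roughly 4000 characters
print(chunks[0][:200])               # preview the beginning of the first chunk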

Next, we need the LLM implementation and a prompt template that tells the AI what to do with each chunk of text.

from langchain.llms import OpenAI
from langchain import PromptTemplate


llm = OpenAI(temperature=0, max_tokens=500, openai_api_key='sk-...')
prompt_template = PromptTemplate.from_template(
    """
    Use the following article to answer the user's question.
    Answer by returning bullet points with relevant quotes from the article.
    Start with bullet points. Don't include any header. Don't include a footer either.

    Question: {question}

    Article:
    ---
    {input_text}
    ---
    """
)
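
Before wiring everything together, it helps to see what happens to a single chunk. The snippet below is only an illustration of the map step that the chain will later perform for us; the_question is a placeholder for the user's question:

chunk = splitter.split_text(a_very_long_article)[0]
prompt = prompt_template.format(question=the_question, input_text=chunk)
partial_answer = llm(prompt)   # bullet points with quotes from this chunk only
print(partial_answer)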

Finally, we need to create a MapReduceChain and pass the template variables to the chain:

from langchain.chains.mapreduce import MapReduceChain


chain = MapReduceChain.from_params(
    llm=llm,
    prompt=prompt_template,
    text_splitter=splitter,
    reduce_chain_kwargs={"document_variable_name": "input_text"},
    combine_chain_kwargs={"document_variable_name": "input_text"},
)
final_answer = chain.run(input_text=a_very_long_article, question=the_question)

That’s it. The chain automatically splits the text into smaller parts, passes each part to the AI, and combines the partial answers into a single response.
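
For completeness, here is how everything fits together end to end. The file name and the question below are example values, so replace them with your own:

with open("article.txt", encoding="utf-8") as f:
    a_very_long_article = f.read()

the_question = "What are the main conclusions of the article?"
final_answer = chain.run(input_text=a_very_long_article, question=the_question)
print(final_answer)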


Do you need help building an AI-powered information retrieval system for your business?
You can hire me!
