Use the new GPT-4 API to build a ChatGPT-style chatbot for large PDF docs (a 56-page PDF is used in this example).

The tech stack includes LangChain, Pinecone, TypeScript, OpenAI, and Next.js. LangChain is a framework that makes it easier to build scalable AI/LLM apps and chatbots. Pinecone is a vector store that holds the embeddings of your PDF's text so that similar docs can be retrieved later.
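To make the "retrieve similar docs" idea concrete, here is a minimal, self-contained sketch of what a vector store does conceptually: each chunk of the PDF is stored alongside its embedding, and a query embedding is compared against them by cosine similarity to find the closest chunks. The `StoredChunk` type and `topK` function are illustrative, not Pinecone's actual API.

```typescript
// Hypothetical sketch of similarity search over stored embeddings.
// In the real app, Pinecone does this server-side at scale.

type StoredChunk = { text: string; embedding: number[] };

// Cosine similarity: 1 means identical direction, 0 means orthogonal.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Return the k chunks whose embeddings are most similar to the query.
function topK(query: number[], store: StoredChunk[], k: number): StoredChunk[] {
  return [...store]
    .sort((x, y) =>
      cosineSimilarity(query, y.embedding) - cosineSimilarity(query, x.embedding))
    .slice(0, k);
}
```

The retrieved chunks are then passed to the LLM as context for answering the user's question.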
Get in touch via Twitter if you have questions.
The visual guide of this repo and tutorial is in the visual guide folder.
- Clone the repo:

```
git clone [github https url]
```

- Install packages:

```
pnpm install
```
- Set up your `.env` file:
  - Copy `.env.example` into `.env`. Your `.env` file should look like this:

```
OPENAI_API_KEY=
PINECONE_API_KEY=
PINECONE_ENVIRONMENT=
```

- Visit OpenAI to retrieve your API keys and insert them into your `.env` file.
- Visit Pinecone to create and retrieve your API keys.
- In the `config` folder, replace `PINECONE_INDEX_NAME` and `PINECONE_NAME_SPACE` with your own details from your Pinecone dashboard.
- In `utils/makechain.ts`, change the `QA_PROMPT` for your own use case. Change `modelName` in `new OpenAIChat` to a different API model if you don't have access to `gpt-4`. See the OpenAI docs for a list of supported `modelName`s. For example, you could use `gpt-3.5-turbo` if you do not have access to `gpt-4` yet.
- In the `docs` folder, replace the PDF with your own PDF doc.
- In `scripts/ingest-data.ts`, replace `filePath` with `docs/{yourdocname}.pdf`.
- Run the script `npm run ingest` to 'ingest' and embed your docs.
- Check the Pinecone dashboard to verify your namespace and vectors have been added.
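The core of the ingest step is splitting the PDF's extracted text into overlapping chunks before embedding and upserting them to Pinecone. The repo uses a LangChain text splitter for this; the standalone `chunkText` function below is an illustrative sketch of the idea, not the repo's actual implementation.

```typescript
// Hypothetical sketch of the chunking step inside "npm run ingest":
// split extracted PDF text into overlapping windows so each piece fits
// the embedding model's context and retrieval stays precise. The overlap
// keeps a chunk from cutting off mid-sentence without losing context.

function chunkText(text: string, chunkSize = 1000, overlap = 200): string[] {
  const chunks: string[] = [];
  let start = 0;
  while (start < text.length) {
    chunks.push(text.slice(start, start + chunkSize));
    if (start + chunkSize >= text.length) break;
    start += chunkSize - overlap; // step forward, retaining `overlap` chars of context
  }
  return chunks;
}
```

Each resulting chunk would then be embedded (e.g. with OpenAI embeddings) and upserted into your Pinecone index under your namespace.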
Once you've verified that the embeddings and content have been successfully added to Pinecone, run the app with `npm run dev` to launch the local dev environment, then type a question in the chat interface.
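When you chat with the app, each follow-up question is first condensed into a standalone question (using the chat history) before retrieval, which is what the `CONDENSE_PROMPT` in `utils/makechain.ts` is for. Here is a minimal sketch of that history-formatting step; the function name and prompt wording are illustrative, not the repo's exact prompt.

```typescript
// Hypothetical sketch: fold prior chat turns into a prompt asking the model
// to rephrase the follow-up as a standalone question, so that retrieval
// works even when the user says "summarize it" instead of naming the topic.

type Turn = { role: "human" | "ai"; text: string };

function buildCondensePrompt(history: Turn[], followUp: string): string {
  const transcript = history
    .map((t) => `${t.role === "human" ? "Human" : "Assistant"}: ${t.text}`)
    .join("\n");
  return [
    "Given the conversation below, rephrase the follow-up as a standalone question.",
    "",
    transcript,
    "",
    `Follow-up: ${followUp}`,
    "Standalone question:",
  ].join("\n");
}
```

The standalone question is then embedded, similar chunks are retrieved from Pinecone, and the `QA_PROMPT` combines them into the final answer request.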
The frontend of this repo is inspired by langchain-chat-nextjs.