Generative Question-Answering with Long-Term Memory

Generative AI sparked several “wow” moments in 2022, from generative art tools like OpenAI’s DALL-E 2, Midjourney, and Stable Diffusion, to the next generation of Large Language Models like OpenAI’s GPT-3.5 generation models and BLOOM, and chatbots like LaMDA and ChatGPT.

It’s hardly surprising that Generative AI is experiencing a boom in interest and innovation [1]. Yet this marks just the first year of generative AI’s widespread adoption. These are the early days of a new field poised to disrupt how we interact with machines.


This is a companion discussion topic for the original entry at https://www.pinecone.io/learn/openai-gen-qa/

How does one train an LLM on large text documents? How do you get around the token limit error from the embedding model?

Maybe this can help? OpenAI API
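
In general you don’t train the LLM on the documents at all: you split each document into chunks that fit within the embedding model’s token limit, embed each chunk, and retrieve the relevant chunks at query time. Here’s a rough sketch of the chunking step, assuming the `openai` and `tiktoken` Python packages, the `text-embedding-ada-002` model, and an `OPENAI_API_KEY` in your environment (adjust names and chunk size for your own setup):

```python
# Sketch: split a long document into token-limited chunks, then embed each chunk.
import tiktoken
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment
encoding = tiktoken.get_encoding("cl100k_base")  # tokenizer used by text-embedding-ada-002

def chunk_text(text: str, max_tokens: int = 500) -> list[str]:
    """Break `text` into pieces of at most `max_tokens` tokens each."""
    tokens = encoding.encode(text)
    return [
        encoding.decode(tokens[i:i + max_tokens])
        for i in range(0, len(tokens), max_tokens)
    ]

def embed_document(text: str) -> list[list[float]]:
    """Embed each chunk; returns one vector per chunk, ready to upsert into a vector DB."""
    chunks = chunk_text(text)
    response = client.embeddings.create(
        model="text-embedding-ada-002",
        input=chunks,
    )
    return [record.embedding for record in response.data]
```

Each chunk’s vector can then be stored in a vector database like Pinecone and retrieved as context for the generative model at question-answering time, which is the approach the original article describes.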

In 2022, Generative AI made us say “wow” more times than a magician pulling rabbits out of hats. From DALL-E 2, the Picass-O-Matic, to ChatGPT GPT-3.5 aka “the Word Wizard,” it’s no shocker that Generative AI’s the new cool kid on the tech block. It’s only year one, folks! Prepare for the robo-revolution!