Canopy - chat history using the chat REST API endpoint

Hello,

I have a question about the Canopy framework that Pinecone is building. First of all, thank you so much for making such a great open source tool and supporting it actively in public!

Second, I have a question that I couldn't find answered in the docs: how does Canopy handle chat history when the chat REST API endpoint is used? I'm assuming the chat endpoint is stateless. Does that mean we need to pass the history to the chat endpoint ourselves, for example as a blob of text plus the user's last message?

Canopy's /chat API is fully compatible with OpenAI's chat API, so you can just use the openai python client, pointing it at the URL of your Canopy server. See here in the README, or a more elaborate example with history in this notebook (cell 51).
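
For example, here's a minimal sketch of what that looks like, assuming the Canopy server is running locally and exposing its OpenAI-compatible API under http://localhost:8000/v1 (the base URL, the placeholder API key, and the model name below are assumptions; adjust them to your setup). Because the endpoint is stateless, the full conversation so far goes into the messages list on every call, with the newest user message last:

```python
from openai import OpenAI

# Point the standard OpenAI client at the Canopy server instead of api.openai.com.
# Base URL and api_key value are assumptions -- use whatever your server is configured with.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="canopy")

# Previous turns of the conversation, kept by the client.
history = [
    {"role": "user", "content": "What is Pinecone?"},
    {"role": "assistant", "content": "Pinecone is a managed vector database."},
]
new_message = {"role": "user", "content": "And what does Canopy add on top of it?"}

# The whole history plus the new message is sent on each request.
response = client.chat.completions.create(
    model="gpt-4",  # the model name is illustrative; it depends on your Canopy config
    messages=history + [new_message],
)
print(response.choices[0].message.content)
```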

BTW, any chatbot UI built for an OpenAI-like API already implements history saving on the client side. So you should be able to take any OSS chatbot like this one, point it at your Canopy server, and get a fully fledged chatbot with history, along the lines of the sketch below.
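
In other words, the client keeps the conversation state and resends it on every turn; the server stays stateless. A rough sketch of that pattern (again, the server URL, API key, and model name are assumptions):

```python
from openai import OpenAI

client = OpenAI(base_url="http://localhost:8000/v1", api_key="canopy")

history = []  # the client, not the server, holds the conversation state

while True:
    user_input = input("You: ")
    history.append({"role": "user", "content": user_input})

    # Resend the accumulated history on every turn.
    response = client.chat.completions.create(model="gpt-4", messages=history)
    answer = response.choices[0].message.content

    history.append({"role": "assistant", "content": answer})
    print("Bot:", answer)
```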