Hi everyone,
I’m trying to limit how much context the Assistant sends to the LLM because the defaults (top_k = 16, snippet_size ~2000) are just too expensive for my use case. The documentation under “Control the context snippets sent to the LLM” says you can pass something like:
context_options: {
snippet_size: 1000,
top_k: 10
}
However, when I try it in the Node SDK (v6.0.0), my calls fail because context_options isn’t recognized:
const stream = await assistant.chat({
messages,
stream: true,
context_options: { snippet_size: 1000, top_k: 7 }
});
Pinecone chatStream error PineconeArgumentError: Object contained invalid properties: stream, context_options. Valid properties include messages, model, filter, jsonResponse, includeHighlights.
I’m guessing the SDK’s types and request validator simply haven’t caught up with the documented API yet. Has anyone found a clean way to override those defaults with the current SDK? Thanks!