The documentation only has an integration guide in Python.
Is it possible to have a similar setup in JavaScript (Node.js)?
In general, to use the remote MCP you only need to feed your assistant's URL into an MCP client; the Python code in the docs only demonstrates using the MCP via LangChain.
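For illustration, here is a minimal sketch of connecting a bare MCP client to the assistant's endpoint using the official MCP TypeScript SDK (@modelcontextprotocol/sdk) rather than LangChain. The endpoint and API key are placeholders, and exactly how the Authorization header is attached to the SSE handshake can vary between SDK versions, so treat this as a sketch rather than copy-paste code:

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

// "pinecone-mcp-endpoint" and "pinecone-api-key" are placeholders.
const transport = new SSEClientTransport(new URL("pinecone-mcp-endpoint"), {
  // Attach the API key to the HTTP requests made by the transport.
  requestInit: { headers: { Authorization: "Bearer pinecone-api-key" } },
});

const client = new Client({ name: "my-app", version: "1.0.0" });
await client.connect(transport);

// Listing the assistant's tools is enough to confirm the connection works.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

await client.close();

Any MCP-capable client follows the same pattern: point it at the assistant's MCP URL and pass your Pinecone API key as a bearer token.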
Can you elaborate more on your use case?
Hi @avi,
My app's backend is written in Node.js and I want to develop a chatbot in my application powered by Pinecone Assistant's MCP server.
I've found the JavaScript code for uploading data to the Pinecone Assistant (Upload files - Pinecone Docs).
I want similar code for this: (Use an Assistant MCP server - Pinecone Docs).
Below is my agent’s code using Python:
from langchain_mcp_adapters.client import MultiServerMCPClient
from langgraph.prebuilt import create_react_agent
from langchain_openai import ChatOpenAI
import asyncio
from langgraph.checkpoint.mongodb.aio import AsyncMongoDBSaver

mcp_endpoint = f"pinecone-mcp-endpoint"
MONGODB_URI = "mongo-db-uri"

async def main():
    model = ChatOpenAI(model="gpt-4o-mini", api_key='openai-api-key')

    # Create the MongoDB client and checkpointer inside the async context
    async with AsyncMongoDBSaver.from_conn_string(MONGODB_URI) as checkpointer:
        async with MultiServerMCPClient(
            {
                "pinecone_assistant": {
                    "url": mcp_endpoint,
                    "transport": "sse",
                    "headers": {
                        "Authorization": f"Bearer pinecone-api-key"
                    }
                }
            }
        ) as client:
            agent = create_react_agent(model, client.get_tools(), checkpointer=checkpointer)
            config = {"configurable": {"thread_id": "1"}}
            response = await agent.ainvoke({"messages": "hello"}, config)
            print(response["messages"][-1].content)

# Run the async function
if __name__ == "__main__":
    asyncio.run(main())
I want its equivalent in JavaScript.
I came across LangChain's GitHub repo (GitHub - langchain-ai/langchainjs-mcp-adapters: Adapters for integrating Model Context Protocol (MCP) tools with LangChain.js applications, supporting both stdio and SSE transports) and tried setting up the code.
import { MultiServerMCPClient } from "@langchain/mcp-adapters";
import { ChatOpenAI } from "@langchain/openai";
import { createReactAgent } from "@langchain/langgraph/prebuilt";

// Create client and connect to server
const client = new MultiServerMCPClient({
  // Global tool configuration options
  // Whether to throw on errors if a tool fails to load (optional, default: true)
  throwOnLoadError: true,
  // Whether to prefix tool names with the server name (optional, default: true)
  prefixToolNameWithServerName: true,
  // Optional additional prefix for tool names (optional, default: "mcp")
  additionalToolNamePrefix: "mcp",

  // Server configuration
  mcpServers: {
    pinecone_assistant: {
      url: "pinecone-mcp-server-url",
      transport: "sse",
      headers: {
        Authorization: "Bearer pinecone-api-key",
      },
      useNodeEventSource: true,
      reconnect: {
        enabled: true,
        maxAttempts: 5,
        delayMs: 2000,
      },
    },
  },
});

const tools = await client.getTools();

// Create an OpenAI model
const model = new ChatOpenAI({
  modelName: "gpt-4o-mini",
  temperature: 0,
  apiKey: "openai-api-key",
});

// Create the React agent
const agent = createReactAgent({
  llm: model,
  tools,
});

// Run the agent
try {
  const aiResponse = await agent.invoke({
    messages: [{ role: "user", content: "hello" }],
  });
  console.log(aiResponse);
} catch (error) {
  console.error("Error during agent execution:", error);
  // Tools throw ToolException for tool-specific errors
  if (error.name === "ToolException") {
    console.error("Tool execution failed:", error.message);
  }
}

await client.close();
PS C:\Users\gaura\Desktop\Chatbot> node agent.js
(node:22864) [DEP0040] DeprecationWarning: The punycode module is deprecated. Please use a userland alternative instead.
(Use `node --trace-deprecation ...` to show where the warning was created)
file:///C:/Users/gaura/Desktop/Chatbot/node_modules/@langchain/mcp-adapters/dist/client.js:434
            throw new MCPClientError(`Failed to create SSE transport for server "${serverName}": ${error}`, serverName);
                  ^

MCPClientError: Failed to create SSE transport for server "pinecone_assistant": MCPClientError: Failed to connect to SSE server "pinecone_assistant": Error: SSE error: Non-200 status code (401)
    at MultiServerMCPClient._initializeSSEConnection (file:///C:/Users/gaura/Desktop/Chatbot/node_modules/@langchain/mcp-adapters/dist/client.js:434:19)
    at process.processTicksAndRejections (node:internal/process/task_queues:105:5)
    at async MultiServerMCPClient.initializeConnections (file:///C:/Users/gaura/Desktop/Chatbot/node_modules/@langchain/mcp-adapters/dist/client.js:297:17)
    at async MultiServerMCPClient.getTools (file:///C:/Users/gaura/Desktop/Chatbot/node_modules/@langchain/mcp-adapters/dist/client.js:314:9)
    at async file:///C:/Users/gaura/Desktop/Chatbot/agent.js:33:15 {
  serverName: 'pinecone_assistant'
}

Node.js v22.11.0
My JS code is throwing an error; it looks like it's not able to connect with the MCP server. What may be wrong in my code?
401 is an unauthorized error. Double-check your assistant's MCP URL and API key.
But it seems that you are on the right track: checking LangChain's docs for the proper settings and putting your assistant's URL and API key in the right place.
I would like to help you make it work; please let me know if you have any more problems.
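One quick way to narrow it down (not from your code; this assumes Node 18+ with a global fetch and simply probes the endpoint with a GET) is to request the MCP URL directly with the same bearer token and look at the status code. If this also prints 401, the URL/key pair is being rejected and the LangChain adapter setup is not the issue:

// Standalone auth check, independent of @langchain/mcp-adapters.
// The endpoint and key below are placeholders for your real values.
const endpoint = "pinecone-mcp-server-url";
const apiKey = "pinecone-api-key";

const res = await fetch(endpoint, {
  headers: {
    Authorization: `Bearer ${apiKey}`,
    Accept: "text/event-stream",
  },
});

console.log(res.status, res.statusText); // 401 here means the credentials are rejected
await res.body?.cancel(); // close the SSE stream; we only needed the status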
Yeah, I get it.
The MCP URL and API key are 100% correct, and my code also looks correct. I don't know why it is giving a 401 error.