How to use the vectorstore with langchain create_retrieval_chain or RetrievalQAWithSourcesChain

How do I use the vectorstore as a retriever with the langchain retrieval chains? It keeps giving me an error: ValueError: The argument order for query() has changed; please use keyword arguments instead of positional arguments. Example: index.query(vector=[0.1, 0.2, 0.3], top_k=10, namespace='my_namespace')

The same error also shows up with similarity_search, even when I pass keyword arguments (the call is shown after the snippet below).
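For reference, this is roughly the wiring I am trying to get working — a minimal sketch, assuming the classic langchain Pinecone wrapper and an existing index; the index name, text_key and model names here are placeholders, not my real config:

import os
from langchain.chains import RetrievalQAWithSourcesChain
from langchain.chat_models import ChatOpenAI
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import Pinecone

embeddings = OpenAIEmbeddings(model="text-embedding-ada-002")

# wrap the existing Pinecone index as a langchain vectorstore
vectorstore = Pinecone.from_existing_index(
    index_name="my-index",   # placeholder index name
    embedding=embeddings,
    text_key="text",         # metadata field holding the raw text
)

# expose it as a retriever and plug it into the chain
retriever = vectorstore.as_retriever(search_kwargs={"k": 2})
chain = RetrievalQAWithSourcesChain.from_chain_type(
    llm=ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0),
    chain_type="stuff",
    retriever=retriever,
)

print(chain({"question": "Which training method should I use for sentence transformers?"}))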

import openai

# assumes openai.api_key and embed_model (e.g. "text-embedding-ada-002")
# are already set, and that `index` is an initialised Pinecone index

query = (
    "Which training method should I use for sentence transformers when " +
    "I only have pairs of related sentences?"
)

res = openai.Embedding.create(
    input=[query],
    engine=embed_model
)

# retrieve from Pinecone
xq = res['data'][0]['embedding']

# get relevant contexts (including the questions)
res = index.query(vector=xq, top_k=2, include_metadata=True)
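And this is roughly the similarity_search call I mentioned above that still raises the same ValueError — a sketch, where vectorstore is the langchain Pinecone wrapper from the earlier snippet:

# only keyword arguments are passed on my side, yet the same error appears
docs = vectorstore.similarity_search(query=query, k=2)
for d in docs:
    print(d.page_content, d.metadata)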

Please check the Pinecone documentation for better clarity, particularly the RAG implementation examples.
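If it helps: in most reports of this ValueError, the cause is a version mismatch — an older langchain release calls index.query() with positional arguments internally, which newer pinecone clients reject. Upgrading langchain, or switching to the langchain-pinecone integration package, usually resolves it. A rough sketch of the newer pattern, assuming the langchain-pinecone, langchain-openai and pinecone packages (API key and index name are placeholders):

from pinecone import Pinecone
from langchain_openai import OpenAIEmbeddings
from langchain_pinecone import PineconeVectorStore

pc = Pinecone(api_key="YOUR_API_KEY")  # placeholder key
index = pc.Index("my-index")           # placeholder index name

# the newer wrapper always queries with keyword arguments
vectorstore = PineconeVectorStore(index=index, embedding=OpenAIEmbeddings())
retriever = vectorstore.as_retriever(search_kwargs={"k": 2})
docs = retriever.get_relevant_documents("Which training method should I use?")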