How to use the vectorstore with langchain create_retrieval_chain or RetrievalQAWithSourcesChain

How do I use the vectorstore as a retriever with the langchain retrieval chains? It gives me an error: ValueError: The argument order for query() has changed; please use keyword arguments instead of positional arguments. Example: index.query(vector=[0.1, 0.2, 0.3], top_k=10, namespace='my_namespace')

The same error also occurs with similarity_search. Even after passing keyword arguments, it still shows up.
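For reference, the wiring I am aiming for looks roughly like the sketch below (the LLM setup and names shown here are placeholders for illustration, not my exact code):

from langchain.vectorstores import Pinecone
from langchain.chains import RetrievalQAWithSourcesChain
from langchain.chat_models import ChatOpenAI

# assumed setup: `index` is an existing Pinecone index, `embed` is a LangChain embeddings object
vectorstore = Pinecone(index, embed.embed_query, "text")

# expose the vectorstore as a retriever and hand it to the chain
retriever = vectorstore.as_retriever(search_kwargs={"k": 3})

qa_chain = RetrievalQAWithSourcesChain.from_chain_type(
    llm=ChatOpenAI(model_name="gpt-3.5-turbo", temperature=0),
    chain_type="stuff",
    retriever=retriever
)

qa_chain({"question": "Which training method should I use for sentence transformers?"})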

import openai  # openai<1.0 style client, matching the Embedding.create call below

embed_model = "text-embedding-ada-002"  # assumed: the model used to embed the indexed documents

query = (
    "Which training method should I use for sentence transformers when "
    "I only have pairs of related sentences?"
)

res = openai.Embedding.create(
    input=[query],
    engine=embed_model
)

# retrieve from Pinecone
xq = res['data'][0]['embedding']

# get relevant contexts (including the questions)
res = index.query(vector=xq, top_k=2, include_metadata=True)

Please check the Pinecone documentation for better clarity, particularly the RAG implementation examples.
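For instance, the RAG examples in those docs typically continue from the keyword-argument query above by stuffing the retrieved text into the prompt, roughly like this (a sketch; the chat model and prompt wording are illustrative assumptions):

# collect the text stored in metadata for each retrieved match
contexts = [match["metadata"]["text"] for match in res["matches"]]

# build an augmented prompt: retrieved context followed by the original question
augmented_query = "\n\n---\n\n".join(contexts) + "\n\n---\n\n" + query

chat = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system", "content": "Answer the question using only the provided context."},
        {"role": "user", "content": augmented_query},
    ]
)
print(chat["choices"][0]["message"]["content"])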

Hello, I am doing a similarity search but still getting the same error. Here is the code:
from langchain.vectorstores import Pinecone

text_field = 'text'  # field in metadata that contains text content

vectorstore = Pinecone(
    index, embed_model.embed_query, text_field
)

vectorstore.similarity_search(
    query,  # the search query
    k=3     # returns the top 3 most relevant chunks of text
)
How can I go about this? I can't find any update in the documentation.
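One thing that may be worth checking (an assumption, not something confirmed in this thread): newer pinecone client versions raise this ValueError whenever query() receives positional arguments, and older releases of the langchain Pinecone wrapper call query() positionally internally, so the error can appear even when your own code uses keyword arguments. Upgrading langchain, or moving to the separate langchain-pinecone integration, is the usual direction; a rough sketch of the latter:

# sketch using the langchain-pinecone package (class and argument names assumed from that integration)
from langchain_pinecone import PineconeVectorStore
from langchain_openai import OpenAIEmbeddings

embeddings = OpenAIEmbeddings(model="text-embedding-ada-002")

vectorstore = PineconeVectorStore(index=index, embedding=embeddings, text_key="text")

vectorstore.similarity_search(query, k=3)  # top 3 most relevant chunks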