I am getting an error that my metadata size exceeds the limit

My question: does the size of the text I am inserting along with my vectors matter, or only the size of the metadata? And is the metadata limit applied per chunk, or to the whole list of docs I am inserting? I only have two key-value pairs in the metadata for each chunk and I am still getting this error. This is my code:

    texts = [doc.page_content for doc in original_chunks]
    metadatas = [doc.metadata for doc in original_chunks]
    ids = [f"{doc.metadata['vector_id']}_{i}" for i, doc in enumerate(original_chunks)]

    original_vectorstore.add_texts(texts, metadatas, ids)

There are limits on both the total size of an upsert batch and on each individual vector, including its metadata, so you'd need to make sure both are met: Pinecone Database limits - Pinecone Docs
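
To see whether any single chunk is the problem, you can measure the serialized size of each chunk's metadata before upserting. A minimal sketch, assuming original_chunks is your list of Documents and treating 40 KB as the per-vector metadata limit (confirm the current figure on the limits page above):

    import json

    # Assumed per-vector metadata limit; confirm the exact value in the
    # Pinecone limits docs linked above.
    METADATA_LIMIT_BYTES = 40 * 1024

    for i, doc in enumerate(original_chunks):
        # Metadata is stored as JSON, so the serialized size is a reasonable
        # proxy for what counts against the per-vector limit.
        size = len(json.dumps(doc.metadata).encode("utf-8"))
        if size > METADATA_LIMIT_BYTES:
            print(f"chunk {i}: metadata is {size} bytes, over the limit")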

The LangChain add_texts() method does not automatically include the text in the metadata, so it would not count toward the limit on your vectors or the batch upsert. I am pretty sure of this, but you should be able to inspect your index after running this to confirm it.
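
One way to confirm: fetch one of the IDs you just upserted with the Pinecone client and look at the metadata that was actually stored. A rough sketch, assuming a recent Pinecone Python client; the API key and index name are placeholders for your own:

    from pinecone import Pinecone

    # Placeholders: use your own API key and index name.
    pc = Pinecone(api_key="YOUR_API_KEY")
    index = pc.Index("your-index-name")

    # Fetch one of the IDs passed to add_texts() and inspect its stored metadata.
    result = index.fetch(ids=[ids[0]])
    for vec_id, vec in result.vectors.items():
        print(vec_id, vec.metadata)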

The LangChain docs list additional parameters you can pass to add_texts() to manage the batch size (Pinecone — 🦜🔗 LangChain documentation).
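
For example, something along these lines should shrink each upsert request; batch_size and embedding_chunk_size are the parameter names documented for PineconeVectorStore.add_texts(), but double-check them against the langchain-pinecone version you have installed:

    # Send fewer vectors per request; parameter names assumed from the
    # langchain-pinecone docs linked above.
    original_vectorstore.add_texts(
        texts,
        metadatas=metadatas,
        ids=ids,
        batch_size=32,
        embedding_chunk_size=200,
    )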