Upserting data from dataframe

I am running into an issue when upserting data from a DataFrame into my Pinecone serverless index. Initially, I successfully upserted a batch of 200 vectors, each with dimension 1536, without any problems. However, when I tried to upsert a much larger dataset of roughly 26,000 vectors, the process never completed: even after several hours there was no result, no error message, and no failure.

Here’s the code snippet I’m using for the upsert operation:

```python
index.upsert_from_dataframe(
    vectorized_documents,
    batch_size=500,
    show_progress=True,
    namespace=self.index_namespace,
)
```

What throughput or limits should I expect from `upsert_from_dataframe` when dealing with large volumes of data? Are there any recommended practices or configuration adjustments that can improve the performance of large upsert operations like this one?
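In case it is useful context, here is a minimal sketch of the fallback I am considering: dropping down from `upsert_from_dataframe` to manual batching over `index.upsert`, so each request stays small and any failure surfaces on a specific batch instead of hanging silently. `upsert_in_batches` and the injected `upsert_fn` are my own names, not part of the Pinecone client; in real use you would pass `index.upsert` (and likely a smaller batch size, since 500 vectors at dimension 1536 is a fairly large payload per request).

```python
import itertools


def chunked(iterable, batch_size):
    """Yield successive lists of up to batch_size items."""
    it = iter(iterable)
    while True:
        batch = list(itertools.islice(it, batch_size))
        if not batch:
            return
        yield batch


def upsert_in_batches(records, upsert_fn, batch_size=100):
    """Send records in small batches, reporting progress per batch.

    upsert_fn stands in for index.upsert; injecting it keeps this
    sketch testable without a live Pinecone connection.
    """
    total = 0
    for i, batch in enumerate(chunked(records, batch_size)):
        upsert_fn(vectors=batch)  # one request per batch
        total += len(batch)
        print(f"batch {i}: upserted {total}/{len(records)} vectors so far")
    return total
```

With a real index this would be called as `upsert_in_batches(records, index.upsert)`, where each record is an `(id, values)` tuple; wrapping the `upsert_fn` call in a try/except would also let a single bad batch be logged and retried rather than stalling the whole job.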