Error uploading embeddings for a 1.5mb file

Hi, I’m trying to upload the embeddings generated from the attached file with an OpenAI text-small embeddings model, but the upsert fails with the following message:

{"code"=>11,
 "message"=>"Error, message length too large: found 7355723 bytes, the limit is: 4194304 bytes",
 "details"=>}

This is the file being vectorized:

Any ideas?

Just to check: is your index set to expect 1536 dimensions? You can look in the console for your project.
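If it helps, here is a quick programmatic check as well. This is just a sketch; the API key and index name below are placeholders, not values from this thread:

```python
# Sketch: verify that an embedding's length matches the index dimension.
def matches_index_dimension(embedding, index_dimension=1536):
    """True when the embedding has the dimension the index expects."""
    return len(embedding) == index_dimension

# With the Pinecone Python client you could fetch the configured dimension:
# from pinecone import Pinecone
# pc = Pinecone(api_key="YOUR_API_KEY")              # placeholder key
# dim = pc.describe_index("your-index").dimension    # placeholder index name
# print(matches_index_dimension(my_embedding, dim))
```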

Yes, it is set to that number of dimensions.

Hello, did you figure this out? I am getting the same issue.

Welcome to the Pinecone community, @admin15[1]

I’m not sure what “the same issue” means here, but here are some things to check:

  • The batch size - if you are parsing a file and upserting all chunks in one request, the payload may exceed the limit. Upsert in smaller batches instead.
  • The metadata - there is a limit on the maximum size of metadata per record.
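A minimal batching sketch in Python (the batch size, index name, and API key are illustrative placeholders; the 4194304-byte cap is the request-size limit from the error above):

```python
# Sketch: upsert embeddings in small batches instead of one giant request,
# so each request stays well under the per-request size limit.
def batched(vectors, batch_size=100):
    """Yield successive slices of a list of (id, values, metadata) records."""
    for i in range(0, len(vectors), batch_size):
        yield vectors[i:i + batch_size]

# Example usage against a Pinecone index (requires real credentials):
# from pinecone import Pinecone
# pc = Pinecone(api_key="YOUR_API_KEY")      # placeholder key
# index = pc.Index("your-index")             # placeholder index name
# for batch in batched(vectors, batch_size=100):
#     index.upsert(vectors=batch)
```

If a batch of 100 is still too large (e.g. very long chunks or heavy metadata), lower the batch size until each request fits.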

To learn about all limits that apply, please see Pinecone Database limits - Pinecone Docs


  1. Friendly suggestion - change your username to something that won’t discourage members from helping you ↩︎