Hi, I’m trying to upload the embeddings generated from the attached file with a text-small OpenAI embeddings model, but the upload fails with the following message:
{"code"=>11,
"message"=>"Error, message length too large: found 7355723 bytes, the limit is: 4194304 bytes",
"details"=>}
I’m not sure what "the same issue" refers to here, but here are some things to check:
The chunk size - if you are parsing the file and upserting all the chunks in one request, the payload can exceed the 4 MB message limit shown in the error (found 7355723 bytes against a limit of 4194304 bytes). Upsert in smaller batches instead.
The metadata - there is also a limit on the maximum metadata size per record, so check that individual records are not oversized.
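The batching suggestion can be sketched as follows. This is a minimal, client-agnostic example: the `batches` helper just slices the vector list, and the `index.upsert` call in the usage comment is an assumed Pinecone-style client method, not tested code.

```python
def batches(vectors, batch_size=100):
    """Yield successive slices of `vectors` with at most `batch_size` items,
    so each upsert request stays well under the message-size limit."""
    for i in range(0, len(vectors), batch_size):
        yield vectors[i:i + batch_size]

# Usage with a hypothetical index client (names are assumptions):
# for batch in batches(all_vectors, batch_size=100):
#     index.upsert(vectors=batch)
```

If a batch still trips the limit (e.g. very large metadata per record), lower `batch_size` further or trim the metadata before upserting.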