Token limit error with a long post in WordPress

> This model’s maximum context length is 8192 tokens, however you requested 11943 tokens (11943 in your prompt; 0 for the completion). Please reduce your prompt; or completion length

@zohery this looks like an issue with the embedding model you are using rather than with Pinecone itself: the post exceeds the model's 8192-token context window. One common technique for handling documents larger than the context window is chunking: split the document into smaller pieces, embed each piece separately, and upsert each chunk as its own vector.
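For illustration, here is a minimal word-based chunker (the helper name `chunk_text` and all parameters are hypothetical). Note that word counts only approximate token counts, so in real code you would count tokens with the model's own tokenizer and keep a safety margin below the limit:

```python
def chunk_text(text, max_words=200, overlap=20):
    """Split text into chunks of at most max_words words, repeating
    `overlap` words between consecutive chunks so context isn't lost
    at the boundaries. Requires overlap < max_words."""
    words = text.split()
    chunks = []
    step = max_words - overlap
    for start in range(0, len(words), step):
        chunks.append(" ".join(words[start:start + max_words]))
        if start + max_words >= len(words):
            break
    return chunks

# Stand-in for a long WordPress post; each chunk would then be
# embedded and upserted separately.
long_post = "lorem ipsum dolor " * 400
for i, chunk in enumerate(chunk_text(long_post)):
    print(i, len(chunk.split()))
```

The overlap keeps a sentence that straddles a chunk boundary from being cut off in both halves; the right chunk size depends on your model and on how fine-grained you want retrieval to be.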