Pinecone Pod Storage + Vector max "token length"?

I’ve seen in the documentation, under the limits for storage options: “Each p1 pod has enough capacity for 1M vectors with 768 dimensions.”

Am I correct in saying that if I use a dimension of 1536 (OpenAI embeddings), this would mean I can store 500k vectors? 768 * 2 = 1536 dims, so 1M / 2 = 500k vectors.
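To make the arithmetic explicit, here is a quick back-of-envelope sketch, assuming capacity scales inversely with dimension from the documented “1M vectors at 768 dims” figure (the function name and constants are my own, not Pinecone’s):

```python
# Assumption: p1 pod capacity scales inversely with embedding dimension,
# anchored at the documented 1M vectors @ 768 dims.
BASE_VECTORS = 1_000_000
BASE_DIMS = 768

def estimated_capacity(dims: int) -> int:
    """Rough estimate of p1 pod vector capacity for a given dimension."""
    return BASE_VECTORS * BASE_DIMS // dims

print(estimated_capacity(1536))  # 500_000 for OpenAI's 1536-dim embeddings
```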

Lastly, I wanted to find out whether a vector has a “max length”. Say I had a corpus of text in the range of 32k tokens (the GPT-4 32k model, or a model with an even longer context window like MPT-7B [65k]). Can a single vector in Pinecone store the embedding of the full 32k tokens, if I wanted to do so, or do I have to do some text-splitting magic on the text to make it fit into the vector’s “max length”? I just wanted to find out if vectors have a “max length”.
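In case it helps frame the question, this is the kind of “text-splitting magic” I mean — a minimal chunking sketch, assuming whitespace-split words stand in for tokens (a real pipeline would use a proper tokenizer such as tiktoken; the function name and parameters here are hypothetical):

```python
# Minimal chunking sketch. Assumption: words approximate tokens; swap in a
# real tokenizer (e.g. tiktoken) for accurate token counts in practice.
def split_into_chunks(text: str, max_tokens: int = 512, overlap: int = 50) -> list[str]:
    """Split text into overlapping chunks of at most max_tokens words."""
    words = text.split()
    step = max_tokens - overlap  # advance by this many words per chunk
    return [
        " ".join(words[start:start + max_tokens])
        for start in range(0, len(words), step)
    ]

# Each chunk would then be embedded separately and upserted as its own vector.
```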