Transformers Are All You Need

A quick tour through the most popular Neural Net architecture

Have you ever thought about what happens when you read a book? Unless you have a truly unusual memory, you don't process and memorize every single word and punctuation mark, right? Instead, you build a mental representation of the events and characters to form your understanding of the story. You do this through selective memorization, keeping the most relevant pieces of information without accumulating every minor detail.
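In Transformers, this "selective memorization" is realized by the attention mechanism: each token computes a relevance weight for every other token and takes a weighted mix of their representations. Below is a minimal sketch of scaled dot-product attention using NumPy; the shapes and variable names are illustrative, not from any particular library.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    # Similarity of each query to every key, scaled by sqrt(d_k)
    # to keep the dot products in a stable range
    d_k = K.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax turns scores into weights summing to 1 per query:
    # high weight = "this token is relevant, remember it"
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output is a relevance-weighted mix of the value vectors
    return weights @ V

# Toy example: 3 tokens, 4-dimensional embeddings (self-attention)
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out = scaled_dot_product_attention(x, x, x)
print(out.shape)  # (3, 4)
```

Real Transformer layers run many such attention "heads" in parallel and learn separate projections for Q, K, and V, but the weighted-mix idea above is the core of it.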
