Deep Dips #2: Embeddings and latent spaces
Last time in this series I talked about multilayer perceptrons, an old idea that remains relevant to modern deep learning systems. This time I’m going to talk about another idea that has been around for a while, but which has recently become a key component of transformer models: embeddings, and the latent spaces they give rise to.