I think embedding vectors and encoder-decoders deserve a mention, as they were central to the transformer architectures underlying the meteoric success of LLMs. I'm not sure your ice fisher would recognise them. As one who was such an angler back in the 80s, I was astonished at how successful they were when I first encountered them a decade ago, and I still struggle to grasp how they do what they do 😉
I would tend to include those within the latent-space models that descend from the 1980s/90s autoencoder architectures. But yes, they have been instrumental in the development of transformers, and I guess an 80s ice fisher would be quite surprised at how these techniques have been applied to word embeddings and the like, even if they were familiar with those early models.
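To make the embedding idea concrete, here's a minimal sketch in Python with NumPy. The vectors are made-up toy values rather than trained weights; real models learn them from data, but the geometry works the same way: words map to dense vectors, and closeness in the vector space stands in for semantic similarity.

```python
import numpy as np

# Toy embedding table: each word maps to a dense vector.
# These values are invented purely for illustration; trained
# models learn such vectors from large corpora.
embeddings = {
    "ice":    np.array([0.9, 0.1, 0.0]),
    "snow":   np.array([0.8, 0.2, 0.1]),
    "fisher": np.array([0.1, 0.9, 0.3]),
}

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine of the angle between two vectors: closer to 1.0 means more similar."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

print(cosine_similarity(embeddings["ice"], embeddings["snow"]))    # high: related words
print(cosine_similarity(embeddings["ice"], embeddings["fisher"]))  # lower: less related
```

The surprise, to my mind, is that nothing more than this geometry is needed to capture the similarity structure; the hard part is learning the vectors in the first place, which is where that autoencoder lineage comes in.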