A polynomial autoencoder beats PCA on transformer embeddings


Polynomial autoencoders outperform PCA for dimensionality reduction on transformer embeddings, offering a simple non-linear alternative for high-dimensional token representations.
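The linked article does not include code, but the idea can be sketched: assume "polynomial autoencoder" means a linear encoder/decoder applied to a degree-2 polynomial feature expansion of the input, compared against a rank-k PCA reconstruction. Everything below (the synthetic data, the feature map, the training setup) is an illustrative assumption, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for transformer embeddings: vectors with a
# low-dimensional latent structure plus a quadratic interaction term.
n, d, k = 200, 8, 2
t = rng.normal(size=(n, k))
A = rng.normal(scale=0.3, size=(k, d))
B = rng.normal(scale=0.3, size=(k, d))
X = t @ A + (t ** 2) @ B + rng.normal(scale=0.05, size=(n, d))
Xc = X - X.mean(axis=0)  # center, as PCA does

# --- PCA baseline: reconstruct from the top-k principal components ---
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
X_pca = (Xc @ Vt[:k].T) @ Vt[:k]
pca_err = np.mean((Xc - X_pca) ** 2)

# --- Polynomial autoencoder: phi(x) = [x, x^2] -> linear code -> linear decode ---
P = np.hstack([Xc, Xc ** 2])                # degree-2 feature map
E = rng.normal(scale=0.1, size=(2 * d, k))  # encoder weights
D = rng.normal(scale=0.1, size=(k, d))      # decoder weights

lr, losses = 0.02, []
for _ in range(500):
    Z = P @ E                  # codes, shape (n, k)
    Xhat = Z @ D               # reconstructions
    G = 2.0 * (Xhat - Xc) / n  # gradient of MSE w.r.t. Xhat
    losses.append(np.mean((Xc - Xhat) ** 2))
    gD = Z.T @ G               # gradient w.r.t. decoder
    gE = P.T @ (G @ D.T)       # gradient w.r.t. encoder
    D -= lr * gD
    E -= lr * gE

print(f"PCA reconstruction MSE:     {pca_err:.4f}")
print(f"Poly-AE reconstruction MSE: {losses[-1]:.4f}")
```

On data with nonlinear latent structure like the above, the polynomial features give the autoencoder access to interactions a linear PCA projection cannot represent, which is the intuition behind the article's claim.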

Categories: Research

Discussions

HN · 101 points · 30 comments