Exact Sequence Interpolation with Transformers

Alcalde Zafra A, Fantuzzi G, Zuazua Iriondo E (2026)


Publication Language: English

Publication Status: Submitted

Publication Type: Unpublished / Preprint

Future Publication Type: Journal article

Publication Year: 2026

Open Access Link: https://arxiv.org/abs/2502.02270

Abstract

We prove that transformers can exactly interpolate datasets of finite input sequences in ℝ^d, d ≥ 2, with corresponding output sequences of smaller or equal length. Specifically, given N sequences of arbitrary but finite lengths in ℝ^d and output sequences of lengths m_1, …, m_N ∈ ℕ, we construct a transformer with 𝒪(∑_{j=1}^N m_j) blocks and 𝒪(d ∑_{j=1}^N m_j) parameters that exactly interpolates the dataset. Our construction provides complexity estimates that are independent of the input sequence length, by alternating feed-forward and self-attention layers and by capitalizing on the clustering effect inherent to the latter. Our novel constructive method also uses low-rank parameter matrices in the self-attention mechanism, a common feature of practical transformer implementations. These results are first established in the hardmax self-attention setting, where the geometric structure permits an explicit and quantitative analysis, and are then extended to the softmax setting. Finally, we demonstrate the applicability of our exact interpolation construction to learning problems, in particular by providing convergence guarantees to a global minimizer under regularized training strategies. Our analysis contributes to the theoretical understanding of transformer models, offering an explanation for their excellent performance in exact sequence-to-sequence interpolation tasks.
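As an informal illustration of the mechanism described in the abstract, below is a minimal NumPy sketch of a single hardmax self-attention layer with rank-1 (hence low-rank) parameter matrices. All names, weight choices, and the residual form are assumptions made for demonstration; this is not the construction from the paper.

import numpy as np

def hardmax_self_attention(X, W_q, W_k, W_v):
    # One self-attention layer in which the softmax is replaced by a
    # hardmax: each token attends, with uniform weight, only to the
    # token(s) maximizing its attention score.
    # X: (n, d) input sequence; W_q, W_k, W_v: (d, d) weight matrices.
    scores = (X @ W_q) @ (X @ W_k).T                 # (n, n) score matrix
    A = (scores == scores.max(axis=1, keepdims=True)).astype(float)
    A /= A.sum(axis=1, keepdims=True)                # uniform over argmax set
    return X + A @ (X @ W_v)                         # residual connection

def rank_one(u, v):
    # Rank-1 matrix u v^T, standing in for the low-rank parameter
    # matrices mentioned in the abstract (the exact factorization used
    # in the paper is not reproduced here).
    return np.outer(u, v)

# Toy usage with d = 2 and a sequence of three tokens.
rng = np.random.default_rng(0)
X = rng.standard_normal((3, 2))
u, v = rng.standard_normal(2), rng.standard_normal(2)
W_q, W_k, W_v = rank_one(u, v), rank_one(v, u), 0.1 * np.eye(2)
print(hardmax_self_attention(X, W_q, W_k, W_v))

Replacing the hardmax rows of A with a standard softmax recovers the usual attention weights, mirroring the abstract's extension from the hardmax setting to the softmax setting.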


How to cite

APA:

Alcalde Zafra, A., Fantuzzi, G., & Zuazua Iriondo, E. (2026). Exact Sequence Interpolation with Transformers. (Unpublished, Submitted).

MLA:

Alcalde Zafra, Albert, Giovanni Fantuzzi, and Enrique Zuazua Iriondo. Exact Sequence Interpolation with Transformers. Unpublished, Submitted. 2026.
