Research Paper Abstract
Below is the abstract from this arXiv research paper. Mathematical notation has been simplified for readability.
We derive a deterministic, non-asymptotic upper bound on the Kullback-Leibler (KL) divergence of the flow-matching distribution approximation. In particular, if the L_2 flow-matching loss is bounded by epsilon^2 > 0, then the KL divergence between the true data distribution and the estimated distribution is bounded by A_1 epsilon + A_2 epsilon^2, where the constants A_1 and A_2 depend only on the regularity of the data and velocity fields. Consequently, this bound implies statistical convergence rates for Flow Matching Transformers under the Total Variation (TV) distance. We show that flow matching achieves nearly minimax-optimal efficiency in estimating smooth distributions. Our results place the statistical efficiency of flow matching on par with that of diffusion models under the TV distance. Numerical studies on synthetic and learned velocity fields corroborate our theory.
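Since the note above says the notation was simplified, here is a minimal LaTeX sketch of the stated bound in display math. The symbols p_data, \hat{p}, \varepsilon, A_1, and A_2 are labels we have assumed, not necessarily the paper's own, and the KL-to-TV step shown via Pinsker's inequality is a standard conversion that the abstract only implies; the paper's exact argument may differ.

```latex
% Sketch of the abstract's bound; symbol names are assumed labels.
\documentclass{article}
\usepackage{amsmath}
\begin{document}
If the $L_2$ flow-matching loss is at most $\varepsilon^2 > 0$, then
\[
  \operatorname{KL}\!\left(p_{\mathrm{data}} \,\middle\|\, \hat{p}\right)
  \;\le\; A_1 \varepsilon + A_2 \varepsilon^2 ,
\]
where $A_1, A_2$ depend only on the regularity of the data and velocity
fields. A TV-distance rate then follows, e.g.\ from Pinsker's inequality
(a standard step; the paper may argue differently):
\[
  \operatorname{TV}\!\left(p_{\mathrm{data}}, \hat{p}\right)
  \;\le\; \sqrt{\tfrac{1}{2}\operatorname{KL}\!\left(p_{\mathrm{data}} \,\middle\|\, \hat{p}\right)}
  \;\le\; \sqrt{\tfrac{1}{2}\!\left(A_1 \varepsilon + A_2 \varepsilon^2\right)} .
\]
\end{document}
```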