Best verified results on Llama 2 70B LoRA using 1x and 8x NVIDIA GB200 NVL72 systems, production-deployed to power APAC AI training at YTL.
TAIPEI, Nov. 18, 2025 /PRNewswire/ — Wiwynn announced the best results in the MLPerf® Training v5.1 Llama 2 70B LoRA benchmark (Closed division), achieving top performance on both 1x and 8x NVIDIA GB200 NVL72 configurations. The submissions were executed on production systems already deployed by YTL AI Cloud, spanning a 1-rack NVIDIA GB200 NVL72 (with 72 NVIDIA Blackwell GPUs) and an 8-rack NVIDIA GB200 NVL72 integrating 576 GPUs.