Seminar: Beyond Transformers: Evo2 and StripedHyena2 for Long-Context Foundation Models

Time:

Venue/Location: Room C102, VIASM, 157 Chùa Láng Street, Hà Nội

Speaker: Hy Vuong

Abstract: As foundation models face growing demands for longer context and domain-specific understanding, new architectures are emerging beyond the Transformer. This talk introduces Evo2, a 40B-parameter open-source model for biology, and its underlying architecture, StripedHyena2, which is designed to handle long-range context efficiently.

We’ll explore how StripedHyena2 blends multi-scale convolutions with attention to outperform Transformers in both speed and accuracy on long-range sequence modeling. Evo2 builds on this architecture to process entire genomes, achieving state-of-the-art results in variant-effect prediction and synthetic sequence generation. The talk highlights the key architectural innovations, comparative performance, and how these models pave the way for next-generation AI applications in biology and beyond.
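
To give a flavor of the hybrid design discussed in the talk, the sketch below interleaves depthwise convolutions at several kernel sizes with a standard self-attention layer. This is a minimal illustration of the general convolution-plus-attention idea, not the Evo2/StripedHyena2 implementation; the layer names, kernel sizes, and mixing scheme are assumptions chosen for clarity.

```python
# Hypothetical sketch of a multi-scale convolution + attention hybrid block.
# Not the StripedHyena2 architecture; all sizes and layer choices are illustrative.
import torch
import torch.nn as nn


class MultiScaleConvBlock(nn.Module):
    """Depthwise 1D convolutions at short, medium, and longer kernel sizes."""

    def __init__(self, dim: int, kernel_sizes=(3, 7, 31)):
        super().__init__()
        self.convs = nn.ModuleList(
            nn.Conv1d(dim, dim, k, padding=k // 2, groups=dim)
            for k in kernel_sizes
        )
        self.proj = nn.Linear(dim, dim)
        self.norm = nn.LayerNorm(dim)

    def forward(self, x):  # x: (batch, seq_len, dim)
        h = self.norm(x).transpose(1, 2)            # (batch, dim, seq_len)
        h = sum(conv(h) for conv in self.convs)     # mix context at several scales
        h = h.transpose(1, 2)                       # back to (batch, seq_len, dim)
        return x + self.proj(h)                     # residual connection


class HybridBlock(nn.Module):
    """A multi-scale convolution block followed by a self-attention block."""

    def __init__(self, dim: int, num_heads: int = 4):
        super().__init__()
        self.conv_block = MultiScaleConvBlock(dim)
        self.norm = nn.LayerNorm(dim)
        self.attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)

    def forward(self, x):
        x = self.conv_block(x)
        h = self.norm(x)
        attn_out, _ = self.attn(h, h, h, need_weights=False)
        return x + attn_out                         # residual connection


if __name__ == "__main__":
    tokens = torch.randn(2, 1024, 64)               # (batch, seq_len, dim)
    block = HybridBlock(dim=64)
    print(block(tokens).shape)                      # torch.Size([2, 1024, 64])
```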