#diffusion-transformers

One article exploring diffusion transformers, with analysis from our editorial team.


Articles

Infrastructure & Runtime

CoCoDiff Exposes the All-to-All Bottleneck That Caps Distributed Diffusion Transformer Inference Well Below Theoretical GPU Count

Ulysses parallelism caps distributed DiT inference scaling on heterogeneous interconnects. CoCoDiff delivers 3.6x average speedups on Aurora via topology-aware scheduling.