Distributed AI lab Gradient releases the Echo-2 distributed reinforcement learning framework
February 12, 2026 22:26:03
Distributed AI Lab Gradient has released Echo-2, a distributed reinforcement learning framework that decouples the Learner and Actor at the architectural level, aiming to break through training-efficiency barriers in AI research and reduce the post-training cost of large models.
Official data shows the framework can cut the post-training cost of a 30B model from $4,500 to $425. Echo-2 uses a storage-compute separation design for asynchronous reinforcement learning (Async RL), allowing sampling computation to be offloaded to unstable GPU instances and to heterogeneous GPUs via Parallax. Gradient also plans to launch Logits, an RLaaS (Reinforcement Learning as a Service) platform that is now open for reservations by students and researchers.
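The announcement does not detail Echo-2's internals, so the following is only a minimal, hypothetical Python sketch of the general idea behind Learner/Actor decoupling and asynchronous RL: actors push rollouts into a buffer at their own pace (so sampling can run on separate, possibly unreliable hardware), while the learner consumes batches and updates the policy without blocking them. Names such as `actor`, `learner`, and `ROLLOUT_QUEUE` are invented for illustration and are not part of Echo-2's API.

```python
# Minimal sketch of Learner/Actor decoupling with asynchronous sampling.
# All names here are illustrative only and are NOT the Echo-2 API; threads
# and an in-process queue stand in for remote sampling workers and a
# network transport.
import queue
import random
import threading
import time

ROLLOUT_QUEUE = queue.Queue(maxsize=64)   # buffer that decouples sampling from learning
CURRENT_VERSION = [0]                     # policy version the learner bumps after each update


def actor(actor_id: int, stop: threading.Event) -> None:
    """Generate rollouts on its own schedule, independent of learner updates.
    In a real deployment this could run on separate, unreliable, or heterogeneous GPUs."""
    while not stop.is_set():
        rollout = {
            "actor_id": actor_id,
            "reward": random.random(),             # placeholder trajectory payload
            "policy_version": CURRENT_VERSION[0],  # rollouts may lag the learner (async RL)
        }
        try:
            ROLLOUT_QUEUE.put(rollout, timeout=0.1)
        except queue.Full:
            continue                               # retry rather than block forever
        time.sleep(0.01)                           # simulated sampling latency


def learner(num_updates: int, batch_size: int = 8) -> None:
    """Consume rollouts as they arrive; actors never wait for the update step."""
    for step in range(num_updates):
        batch = [ROLLOUT_QUEUE.get() for _ in range(batch_size)]
        avg_reward = sum(r["reward"] for r in batch) / batch_size
        CURRENT_VERSION[0] += 1                    # stand-in for gradient step + weight broadcast
        print(f"update {step}: avg reward {avg_reward:.3f}, policy v{CURRENT_VERSION[0]}")


if __name__ == "__main__":
    stop = threading.Event()
    actors = [threading.Thread(target=actor, args=(i, stop)) for i in range(4)]
    for t in actors:
        t.start()
    learner(num_updates=5)
    stop.set()
    for t in actors:
        t.join()
```

In a production system the queue would be a network-backed rollout store and the actors separate processes on remote machines, which is what makes it possible to offload sampling to cheaper or less reliable GPU instances while the learner runs on stable hardware.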