SGLang Creators Launch RadixArk to Commercialize Open-Source AI Infrastructure

The team behind the popular SGLang inference engine has launched RadixArk to build accessible, large-scale AI infrastructure.


Decoded

Published Jan 22, 2026


3 min read

Image by RadixArk

The creators of the widely used open-source inference engine SGLang have officially launched RadixArk, a new deep-tech company focused on democratizing frontier-level AI infrastructure. The startup, which emerged from the UC Berkeley Sky Computing Lab, recently reached a reported $400 million valuation in a funding round led by Accel. According to CEO Ying Sheng, the team's mission centers on breaking the monopoly that major AI labs hold over private infrastructure.

RadixArk addresses a critical gap in the industry, where engineers often rebuild complex pipelines from scratch. By treating systems and infrastructure as first-class citizens, the company provides the tools needed to make building and running models significantly more efficient. Its core products, the SGLang inference engine and the Miles reinforcement learning framework, are designed to handle massive scale.

SGLang gained rapid popularity thanks to its RadixAttention mechanism, which reuses the KV cache efficiently across generation calls. That design makes it the fastest open engine for serving modern models, particularly for complex agentic workflows and structured outputs, and developers favor it for producing reliable JSON at high throughput.
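To make that concrete, here is a minimal sketch of regex-constrained JSON generation using SGLang's Python frontend. It assumes a locally launched SGLang server on port 30000; the function name, prompt, and regex schema are illustrative only, and exact API details may differ across SGLang versions.

```python
import sglang as sgl

# Assumes a local SGLang server, launched with something like:
#   python -m sglang.launch_server --model-path <model> --port 30000
sgl.set_default_backend(sgl.RuntimeEndpoint("http://localhost:30000"))

# Illustrative regex that constrains the output to a small JSON object.
CITY_JSON = (
    r'\{\n'
    r'  "name": "[\w ]+",\n'
    r'  "population": [0-9]+\n'
    r'\}'
)

@sgl.function
def city_info(s, city):
    # Shared prompt prefixes like this one can be cached and reused across calls.
    s += "Return basic facts about " + city + " as JSON.\n"
    s += sgl.gen("answer", max_tokens=128, regex=CITY_JSON)

state = city_info.run(city="Paris")
print(state["answer"])
```

Constrained decoding is what makes the JSON dependable: at each step the engine only samples tokens that keep the output within the specified pattern, while the shared prompt prefix is served from cache rather than recomputed.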

In addition to inference, the company is shipping Miles, an open-source framework for large-scale post-training and alignment. Miles brings the same engineering rigor to reinforcement learning, letting labs fine-tune models on the efficient pipelines that the world's largest companies rely on. Together, the two projects cover the entire lifecycle of frontier AI development.

Looking forward, RadixArk is moving beyond open-source repositories to offer a managed infrastructure platform. A waitlist is now open for teams seeking a hosted environment that removes the operational burden of cluster management. The team continues to hire specialized talent to keep high-performance AI infrastructure a shared foundation for the entire community.


Decoded Take


The emergence of RadixArk signals a major shift where AI systems design is finally recognized as a primary competitive advantage rather than just a support function. By commoditizing high-efficiency backends, this move lowers the barrier to entry for smaller labs and prevents a total monopoly on compute efficiency. As applications move toward complex reasoning agents, specialized engines like SGLang will become the essential foundation for production-ready AI.

