EPFL Researchers Launch Anyway Systems to Run Powerful AI Locally, Reducing Energy Use, Data Risk and Data Center Dependency

EPFL researchers have launched Anyway Systems to run powerful AI on local networks, bypassing the need for massive data centers.

Decoded

Published Dec 27, 2025

3 min read

Image by EPFL

EPFL researchers have launched Anyway Systems, a startup providing software that eliminates the need for massive cloud data centers when running artificial intelligence. This technology allows organizations to deploy high-powered AI models locally by networking standard office computers and consumer GPUs. By pooling the power of four standard machines, users can achieve the performance of a 100,000 CHF specialized server for a fraction of the cost.

Current AI infrastructure relies on centralized data centers that consume immense amounts of energy and water to handle constant requests. Experts estimate that inference, the process of generating an answer from a prompt, accounts for up to 90 percent of AI-related power demand. This dependency forces sensitive data like patient records or confidential documents to travel through third-party cloud services owned by Big Tech.

The Anyway Systems software uses self-stabilizing techniques to coordinate distributed machines into a local cluster that functions as a single powerful unit. Professor Rachid Guerraoui, head of EPFL's Distributed Computing Laboratory, notes that smarter and more frugal approaches are possible: for years it was assumed that large language models and AI tools could not run without huge resources, but this research proves otherwise.
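To make the clustering idea concrete, here is a minimal, purely illustrative Python sketch of one common technique for running a model too large for any single machine: splitting its layers into contiguous slices hosted on different nodes, with activations passed along the chain. This is not Anyway Systems' actual software; the `Node`, `partition`, and `run_inference` names, the toy layers, and the four-machine setup are all assumptions made for illustration.

```python
# Illustrative sketch only: a generic pipeline-parallel pattern, not
# Anyway Systems' implementation. A model's layers are split across
# several ordinary machines that forward activations node to node.

from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Node:
    """One machine in the local cluster, hosting a slice of the model."""
    name: str
    layers: List[Callable[[float], float]]

    def forward(self, activations: float) -> float:
        for layer in self.layers:
            activations = layer(activations)
        return activations


def partition(layers, num_nodes):
    """Split the layer list into roughly equal contiguous slices."""
    size, rem = divmod(len(layers), num_nodes)
    slices, start = [], 0
    for i in range(num_nodes):
        end = start + size + (1 if i < rem else 0)
        slices.append(layers[start:end])
        start = end
    return slices


# A toy 8-layer "model": each layer just adds its index.
model_layers = [(lambda x, i=i: x + i) for i in range(8)]

# Pool four standard machines, echoing the article's example.
cluster = [Node(f"office-pc-{i}", s)
           for i, s in enumerate(partition(model_layers, 4))]


def run_inference(cluster, prompt_value):
    # In a real cluster the activations would travel over the local
    # network; here the hand-off is simulated with a plain loop.
    x = prompt_value
    for node in cluster:
        x = node.forward(x)
    return x


print(run_inference(cluster, 0))  # layers add 0+1+...+7, printing 28
```

In practice, systems like this must also handle node failures and stragglers, which is where self-stabilizing coordination of the kind the lab studies comes in; this sketch shows only the happy-path data flow.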

Moving AI tasks to local networks preserves data privacy and national sovereignty by keeping sensitive information within domestic borders. Organizations no longer have to worry if their proprietary data is being used to train the next generation of commercial models. This shift also improves sustainability by utilizing existing hardware rather than requiring the construction of new, energy-heavy infrastructure.

While massive centers will likely still be needed for the initial training of foundational models, the future of AI usage is moving toward the edge. As optimization continues, researchers believe even complex tasks will eventually be handled on personal devices. This democratization of hardware means that local entities, rather than just transnational corporations, can control their own digital assets.


Decoded Take

This innovation arrives at a critical turning point as powerful open-weight models like OpenAI's gpt-oss-120b and the Swiss national model Apertus become available to the public. While these models are free to download, until now most organizations lacked the hardware to actually run them.

By pairing Swiss-made software with local hardware clusters, Anyway Systems creates a fully sovereign AI stack that bypasses the Silicon Valley cloud monopoly. This transition signals a shift where the industry moves from centralized cloud dominance toward a more resilient, distributed network of local intelligence.
