
College of Computing Student Presents Innovative Distributed AI System at National Hackathon

Published January 28, 2026 by Esther Djan


Grand Valley State University student Yordanos Kassa recently represented the GVSU College of Computing at Cal Hacks, one of the nation's largest collegiate hackathons, hosted at the University of California, Berkeley. The event brought together thousands of student innovators for a weekend of rapid prototyping, collaborative experimentation, and cutting-edge technological exploration.

Kassa presented Latentra, an ambitious platform designed to turn campus laptops into a collaborative AI supercomputer by pooling idle computational resources. Using LocalAI for distributed inference, Composio for tool integrations, Chroma for distributed memory, and libp2p/WebRTC for peer collaboration, Latentra enables powerful AI agents that can analyze code, interact with APIs, query GitHub, and perform complex reasoning, all without relying on cloud-based GPUs.
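To make the architecture concrete: LocalAI exposes an OpenAI-compatible REST API, so a node on the mesh can accept inference requests over plain HTTP. The sketch below shows what a request to a peer might look like; the host, port, and model name are illustrative placeholders, not details from Latentra itself.

```typescript
// Minimal sketch: asking a LocalAI peer on the mesh to run a chat completion.
// LocalAI serves an OpenAI-compatible API; the URL and model name below are
// placeholder assumptions for illustration.
interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

async function askPeer(peerUrl: string, messages: ChatMessage[]): Promise<string> {
  const res = await fetch(`${peerUrl}/v1/chat/completions`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ model: "local-model", messages }),
  });
  if (!res.ok) throw new Error(`Peer returned ${res.status}`);
  const data = await res.json();
  return data.choices[0].message.content;
}

// Usage: route a prompt to whichever peer the mesh has selected.
askPeer("http://192.168.1.42:8080", [
  { role: "user", content: "Summarize the open issues in this repository." },
]).then(console.log);
```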

During a live demonstration, multiple laptops at the hackathon formed a real-time mesh network. As additional devices joined, latency decreased noticeably, showcasing the system's ability to scale performance through local collaboration. The demo highlighted Latentra's potential to dramatically reduce the cost and energy demands of AI computation, especially within campus research and academic environments.
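One way adding devices can lower latency is load distribution: with more peers, each request waits behind fewer queued jobs. The following is a hypothetical sketch of a least-loaded dispatcher, not Latentra's actual scheduler.

```typescript
// Illustrative sketch (an assumption, not Latentra's implementation):
// dispatch each job to the mesh peer with the shortest queue, so adding
// peers spreads work and shortens the wait per request.
interface Peer {
  url: string;
  pending: number; // jobs currently queued on this peer
}

class MeshDispatcher {
  private peers: Peer[] = [];

  addPeer(url: string): void {
    this.peers.push({ url, pending: 0 });
  }

  async dispatch<T>(job: (peerUrl: string) => Promise<T>): Promise<T> {
    // Pick the peer with the fewest pending jobs.
    const peer = this.peers.reduce((a, b) => (a.pending <= b.pending ? a : b));
    peer.pending++;
    try {
      return await job(peer.url);
    } finally {
      peer.pending--;
    }
  }
}
```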

Kassa built Latentra independently during the event, designing the Electron desktop application, the React and TypeScript agent builder interface, the distributed networking layer, and an inference pipeline optimized for responsiveness. One of the most significant technical challenges was synchronizing high-context embeddings across a decentralized network; Kassa solved this by implementing selective caching and incremental vector updates, achieving real-time performance without sacrificing contextual accuracy.
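A hedged sketch of what "selective caching and incremental vector updates" could look like: each node keeps embeddings keyed by a content hash and sends a peer only the entries it is missing, rather than re-transmitting the full vector store. The hashing scheme and data structures here are assumptions for illustration, not the project's actual code.

```typescript
// Sketch: content-addressed embedding cache with delta-only sync.
// Hash choice and in-memory storage are illustrative assumptions.
import { createHash } from "node:crypto";

type Embedding = number[];

class VectorCache {
  private store = new Map<string, Embedding>(); // content hash -> vector

  put(text: string, vector: Embedding): string {
    const key = createHash("sha256").update(text).digest("hex");
    this.store.set(key, vector);
    return key;
  }

  // Incremental update: only the entries the peer does not already have.
  diffAgainst(peerKeys: Set<string>): Map<string, Embedding> {
    const delta = new Map<string, Embedding>();
    for (const [key, vec] of this.store) {
      if (!peerKeys.has(key)) delta.set(key, vec);
    }
    return delta;
  }

  merge(delta: Map<string, Embedding>): void {
    for (const [key, vec] of delta) this.store.set(key, vec);
  }
}
```

Because vectors are keyed by content hash, unchanged embeddings are never re-sent, which is one plausible way to keep sync traffic proportional to what actually changed.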

The project demonstrated an estimated 80-90% reduction in energy usage compared to traditional cloud inference methods. This efficiency reflects a promising direction for sustainable computing, an area increasingly important to universities, research centers, and industry leaders.
