POKT Network has released its AI Litepaper, exploring the deployment of Large Language Models (LLMs) on its protocol to provide robust and scalable AI inference services. Since its Mainnet launch in 2020, POKT Network has served over 750 billion requests through a network of approximately 15,000 nodes in 22 countries. This extensive infrastructure positions POKT Network to enhance the accessibility and financialization of AI models within its ecosystem.
The AI Litepaper highlights how the Relay Mining algorithm aligns incentives among model researchers (Sources), hardware operators (Suppliers), API providers (Gateways), and users (Applications). The algorithm creates a transparent marketplace in which costs and earnings are based on cryptographically verified usage. With quality of service that competes with centralized providers, the protocol stands as a mature permissionless network for application-grade inference.
The integration of LLMs on POKT Network allows for scalable AI inference services without downtime, leveraging the existing decentralized framework. AI researchers and academics can monetize their models by deploying them on the network, earning revenue based on usage without managing access infrastructure or generating demand. The Relay Mining algorithm ensures a transparent marketplace, incentivizing Suppliers to maintain high Quality of Service.
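To make the usage-based marketplace concrete, the sketch below shows how rewards proportional to verified usage might be settled. It is a minimal illustration, not the actual Relay Mining algorithm: the function name, the flat per-relay price, and the supplier identifiers are all assumptions for demonstration, and the real protocol's verification and pricing are more involved.

```python
# Hypothetical sketch: pay each Supplier in proportion to the relays it
# verifiably served, loosely inspired by the Relay Mining idea of basing
# earnings on cryptographically verified usage. All names and the flat
# per-relay price are illustrative, not the actual protocol.

def settle_rewards(verified_relays: dict[str, int],
                   price_per_relay: float) -> dict[str, float]:
    """Compute each supplier's payout from its verified relay count."""
    return {supplier: count * price_per_relay
            for supplier, count in verified_relays.items()}

# Example: three suppliers with verified relay counts for one session.
relays = {"supplier-a": 120_000, "supplier-b": 80_000, "supplier-c": 50_000}
rewards = settle_rewards(relays, price_per_relay=0.0001)
```

Because payouts derive only from verified counts, neither Suppliers nor Gateways need to trust each other's accounting, which is the property the Litepaper's marketplace design relies on.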
Permissionless LLM Inference
The AI Litepaper, titled “Decentralized AI: Permissionless LLM Inference on POKT Network,” was authored by Daniel Olshansky, Ramiro Rodríguez Colmeiro, and Bowen Li. Their expertise spans augmented reality, autonomous vehicle interaction analysis, medical image analysis, and AI/ML infrastructure development, contributing to the paper’s comprehensive insights.
Daniel Olshansky brings experience from Magic Leap’s Augmented Reality cloud and Waymo’s autonomous vehicle planning. Ramiro Rodríguez Colmeiro, who holds a PhD in signal analysis and system optimization, focuses on machine learning and medical image analysis. Bowen Li, formerly an engineering manager at Apple AI/ML, led the development of Apple’s first LLM inferencing platform.
POKT Network’s AI Litepaper underscores its potential to drive innovation, adoption, and financialization of open-source models, positioning the network as a key player in permissionless LLM inference. For more detailed insights, the full AI Litepaper is available online.