Abstract
The AI SPE requests 35,000 LPT from the Livepeer treasury to advance decentralized media-related AI compute jobs. By developing real-time AI pipelines, optimizing batch workflows, and enhancing orchestrator operations, this phase seeks to unlock new revenue streams for orchestrators, boost token utility for delegators, and drive developer adoption. These efforts aim to position Livepeer as the leading decentralized platform for AI-powered video compute, delivering both immediate ecosystem benefits and long-term growth opportunities.
Introduction
Over the past 11 months, we’ve had the privilege of spearheading Livepeer’s AI journey, backed by public goods contributions through the Livepeer treasury. Together with the community, we’ve built a fully operational system that has successfully processed hundreds of thousands of AI jobs on the Livepeer network, generating over $24,000 in fees distributed across the ecosystem. These achievements highlight the growing potential of Livepeer’s network to power AI-based video compute jobs. For a detailed summary of our progress, we invite you to explore the Phase 1 and Phase 2 retrospectives.
The AI SPE was launched to address a critical challenge: the growing centralization of AI infrastructure and innovation. As dominant tech companies control AI research, data access, and compute resources, independent developers, startups, and creators face increasing barriers to participation. By harnessing Livepeer’s decentralized infrastructure, the AI SPE seeks to democratize access to AI compute, empowering a diverse range of contributors to innovate freely and sustainably. At the same time, this mission directly strengthens Livepeer’s ecosystem by creating new revenue streams for orchestrators, increasing token utility for delegators, and enabling developers to build innovative, media-focused applications.
While significant progress has been made, achieving the dual mission of democratizing AI access and establishing Livepeer as the premier decentralized platform for media-related AI compute jobs requires continued growth and innovation. Real-time AI presents a transformative opportunity, offering a key differentiator in the decentralized compute space. By enabling use cases like live video enhancement, real-time transcription, and interactive media, it drives adoption, empowers creators, and positions Livepeer as the leading platform for real-time AI video infrastructure, paving the way for broader ecosystem growth.
To advance this vision, we propose a 4-month treasury-funded initiative centered on three interconnected objectives: (1) developing core real-time AI pipelines to enable a new generation of AI-powered media applications; (2) fostering a vibrant developer community to drive innovation and expand the ecosystem; and (3) optimizing batch and generative pipelines to sustain and grow existing use cases. Together, these efforts mark the next phase of our collaborative journey to decentralize AI compute, foster innovation, and deliver sustainable growth for the Livepeer ecosystem and its stakeholders.
Mission
The immediate mission of the AI SPE remains unchanged: to validate the impact, potential, and competitiveness of bringing AI-based video compute jobs onto the Livepeer network. In this next phase of Livepeer’s AI journey, the focus is on enabling real-time AI workflows—a key differentiator—and optimizing batch and generative AI capabilities to drive demand and foster adoption. By addressing real-world needs through collaboration with startups, builders, orchestrators, and the broader Livepeer community, these efforts aim to expand the network’s utility, attract diverse contributors, and deliver lasting value across the ecosystem.
In the long term, the AI SPE aims to establish Livepeer as the premier decentralized platform for media-related AI compute jobs within the next three years. By empowering creators, developers, and startups worldwide to innovate freely without reliance on centralized providers, we unlock new possibilities for AI-driven applications. This shared mission not only drives adoption and strengthens the Livepeer ecosystem but also fosters a more equitable global AI landscape, ensuring AI’s transformative potential benefits everyone.
Contributors
Similar to our last two proposals, the AI SPE is a collaboration between an open-source R&D group—comprising long-term Livepeer contributors and orchestrators—and the ecosystem team. As the AI SPE transitions to becoming a fully independent, community-driven, open-source R&D core contributor group, the ecosystem team will continue providing operational support in areas such as product management, financial administration, and HR. This partnership enables the R&D group to focus more directly on engineering and delivering impactful contributions to the Livepeer network.
Core Contributors (paid by SPE)
- Rick - Lead Engineer & SPE Proposer (Full-time)
- John (Elite Encoder) - AI Engineer (Full-time)
- Peter (Interptr) - AI Engineer (Full-time)
- Prakarsh - AI Researcher (Full-time, recently joined)
- Brad (Ad Astra Video) - AI/Operational Engineer (Part-time)
Ecosystem Support (paid by Livepeer Inc.)
- Xilin - Product Management Support
  Provides expertise to support roadmap development, identify fee-generating opportunities, and guide impactful deliverables. Facilitates collaboration with the Livepeer Inc. engineering team to optimize resource use and avoid duplication of effort.
- Sarah - Finance Support
  Offers financial administration expertise to ensure the efficient use of resources and secure management of treasury operations.
- Mariyana - HR Support
  Provides assistance with hiring, team coordination, and administrative processes to support smooth day-to-day operations.
- Doug - Vision and Growth Support
  Provides strategic advice and guidance on key questions to help align our efforts with the broader ecosystem’s best interests.
Deliverables
In this third phase, we aim to accelerate Livepeer’s real-time AI journey and further optimize generative and batch AI pipelines, positioning the network for scalable growth and broader adoption. As highlighted in the Livepeer Cascade post, real-time AI represents a pivotal opportunity to establish Livepeer as a market leader, while advancements in generative and batch AI empower the community to extend the network with innovative use cases, driving increased demand.
This phase lays the groundwork for scalability improvements planned later in the year, preparing the network to support more complex workloads and diverse compute tasks. Our efforts will focus on four key areas:
- Kickstart Real-Time AI Journey: Accelerate the development and deployment of initial real-time AI pipelines on the Livepeer network. This includes delivering five optimized pipelines for high-value use cases like live video enhancement, transcription, and interactive video, along with essential network software improvements to support these pipelines. Additionally, we will provide developer resources—such as documentation and environments—designed to empower the community to create and deploy their own pipelines. Dedicated support for the startup program and hacker cohort will ensure scalable adoption and alignment with ecosystem goals.
- Batch and Generative Pipeline Generalization and Optimization: Simplify core network software, optimize existing batch and generative pipelines, and deliver clear developer resources—such as documentation, examples, and environments—to enable seamless integration of new models and pipelines by the community. Provide ongoing support to existing startups to ensure continued fee generation and ecosystem growth.
- Orchestrator Operation Improvements: Streamline orchestrator operations by addressing critical operational pain points with features like automated container updates, efficient model downloading and management, and simplified setup to facilitate seamless scaling. Monthly community feedback sessions will guide these enhancements, ensuring they effectively meet orchestrators’ needs and support the growing demand for new compute jobs.
- On-Chain Fee Differentiation and Enhanced Hardware Metrics Reporting: Implement precise on-chain fee tracking by job type to improve transparency and enable orchestrators, developers, and delegators to make informed decisions. If resources permit, enhance hardware metrics reporting to provide real-time insights—expanding beyond existing GPU metrics to include CPU, RAM, and other critical system specifications—to support a system-wide data aggregator and enable resource-based job distribution.
For more details on deliverables, milestones, and resource allocation, please refer to our Mission and Roadmap page. This page also outlines our long-term vision, the complete 2025 roadmap, and opportunities for community contributions to decentralized AI compute on Livepeer. While these deliverables represent our primary objectives, we remain flexible in adapting to emerging opportunities and evolving needs.
Programs
Over the next quarter, we plan to support the following initiatives, working closely with the Ecosystem team:
Bounty Program (led by Ecosystem team, paid by SPE)
The initial bounty program trial, conducted with the ecosystem team, delivered strong results: 17 bounties completed with payouts exceeding $13,000, and another six bounties worth over $7,000 under review. As highlighted in the Phase 2 retrospective, it demonstrated the enthusiasm, capability, and strong interest of developers contributing to the Livepeer open-source ecosystem. To build on this success, we are requesting 3,700 LPT for new bounties, targeting both AI-focused tasks and critical non-AI areas like bug fixes, explorer enhancements, and transcoding features (e.g., NetInt, AV1, Intel support). These bounties aim to drive innovation, engage the community, and strengthen key areas of the Livepeer network.
AI Builder Cohort Program (led by Ecosystem team, funded by Grants)
The Ecosystem team will run a Developer Cohort Program (hacker program) from December 16th to January 15th, bringing together 15-20 ComfyUI workflow creators. Participants will design, deploy, and showcase high-quality live video AI workflows using ComfyUI, ComfyStream, and Livepeer. The program offers financial incentives, including grants, innovation and usage bonuses, and experimental revenue-sharing opportunities. Additionally, participants will benefit from mentorship, peer support, and community recognition, driving both engagement and innovation. The AI SPE will play a key supportive role for builders throughout this program and any future cohorts.
AI Video Startup Program II (TBD)
After the success of the first AI Video Startup Program run by the Livepeer Ecosystem team, there have been discussions about launching a second program focused on startups integrating real-time video AI pipelines. If this moves forward, the AI SPE will play an active role, though the details are TBD.
AI Network Reliability Initiative
In the last phase, we operated an AI orchestrator with base GPU capacity and a dedicated AI gateway to handle peak traffic and maintain a fast feedback loop with startups. This setup was crucial for achieving the Phase 2 milestones, ensuring network stability and meeting demand while independent orchestrators scaled their operations. As more orchestrators joined the AI network, we scaled down the base capacity and redirected accumulated fees from the hosted orchestrator to the gateway, ensuring they were distributed to independent orchestrators.
In this phase, we plan to continue these services as needed. To enhance transparency and accountability, we will launch a public orchestrator dashboard and share the scaling scripts our group used to spin up base capacity with cloud compute. These efforts aim to empower orchestrators to expand their operations independently and strengthen the network’s resilience.
Budget
The total amount requested from the on-chain treasury is 35,000 LPT to cover operational expenses for Q1 2025. This request factors in a roll-over of the remaining Phase 2 funds, which will serve as a buffer against potential price volatility. Our funding assumption uses a price of $10 per LPT. If the price of LPT improves, this funding could extend beyond March and potentially cover operations through June 2025; however, this is highly dependent on LPT market performance.
Projected Q1 2025 Spending
| Category | LPT | USD | Description |
|---|---|---|---|
| SPE Contributors | 21,900 | $219,000 | Compensation for 5 existing core contributors and potential new hires to accelerate real-time AI workflows and support batch and generative AI jobs. |
| Infrastructure & Software | 6,000 | $60,000 | Infrastructure scaling, including cloud services, server costs, and software tools needed to support AI pipeline development and deployment. |
| Bounty Program | 3,700 | $37,000 | Incentives for community-driven contributions to bug fixes, feature development, and network enhancements. |
| Marketing & Communications | 2,500 | $25,000 | Supports developer outreach, documentation, and community engagement to promote AI workflows and drive adoption. |
| ETH Gateway & Gas Fees | 900 | $9,000 | Funding for AI Gateways to support AI compute jobs on the network, plus on-chain transaction fees. |
| Total | 35,000 | $350,000 | |
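For reference, the line items above can be sanity-checked against the $10/LPT funding assumption with a short script (the figures are copied from the table; the price is the proposal's stated assumption, not a market quote):

```python
# Illustrative check of the Q1 2025 budget at the assumed $10/LPT price.
ASSUMED_LPT_PRICE_USD = 10  # funding assumption stated in the proposal

# LPT allocations per category, as listed in the table above.
budget_lpt = {
    "SPE Contributors": 21_900,
    "Infrastructure & Software": 6_000,
    "Bounty Program": 3_700,
    "Marketing & Communications": 2_500,
    "ETH Gateway & Gas Fees": 900,
}

total_lpt = sum(budget_lpt.values())
total_usd = total_lpt * ASSUMED_LPT_PRICE_USD

print(total_lpt)  # 35000
print(total_usd)  # 350000
```

If LPT trades above the assumed price, the same LPT amounts cover a longer runway, which is why the proposal notes the funding could stretch toward June 2025.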
This funding will ensure the AI SPE can continue delivering impactful contributions to the Livepeer network, supporting growth in AI-powered video workflows and infrastructure.
Footnote 1: Livepeer, Inc. will continue funding the salaries of its team members: Doug, Sarah, Xilin, and Mariyana.
Footnote 2: If the third-round funding is not fully utilized by June 30th, 2025, the SPE will either: (1) roll over the remaining funds into a new proposal; or (2) return the unused funds to the treasury.