Transcoder Campaign: Captain Stronk

Captain Stronk: the Orchestrator with community values and a mission to make the Livepeer network more robust


:wrench:   In need of open source contributions?
:globe_with_meridians:   Looking for a global high performing orchestrator?
:chart_with_upwards_trend:   On the hunt for low commission rates?
:rocket:   Captain Stronk comes to the rescue!


Stake with Captain Stronk


About Stronk

Hi! I’m a full stack video engineer at OptiMist Video. My main responsibility is working on MistServer, an open source, full-featured, next-generation streaming media toolkit
Besides running this Orchestrator I have a passion for making music, maxing out my level on Steam and cuddling with my cats on the couch

Why stake with Captain Stronk?

  • We have been active since December 2021 and have never missed a reward call
  • To support our contributions to the Livepeer ecosystem
  • We have a professional setup with monitoring, alerts and high uptime. We have consistently been in the top 10 performers and will scale operations as needed to keep it that way
  • Our goal is to run at an 8% reward cut and a 30% fee cut. This way our delegators will see high returns, while we earn enough revenue to fund our operations and community contributions. We will lower our reward commission over time as our stake increases

:rocket: In short: we are setting a gold standard for reliability and performance and setting an example of how orchestrators can contribute back to the Livepeer ecosystem


Stake with Captain Stronk


Our setup

We are located in Leiden, with a dedicated Livepeer machine and gigabit up/down connection. We are also cooperating with other high performing orchestrators to ensure high availability across the globe while keeping our costs as low as possible.

We have the following GPUs connected and ready to transcode 24/7:
1x GTX 1070    @ Boston
1x RTX 3080    @ Las Vegas
3x GTX 1070 Ti @ Leiden
1x Tesla T4    @ Singapore

:bulb: Tip: You can visit our public Grafana page for advanced insights


Stake with Captain Stronk


Published Projects

As part of our mission to make the Livepeer network more robust, we are maintaining the following projects:

:wrench: Tip: Feel free to request custom development work if you have need of something that can benefit the Livepeer network as a whole

Orchestrator API and supplementary explorer - link

A useful data indexing tool which provides information on events happening on Livepeer’s smart contracts. The API aggregates data from the Livepeer subgraph, blockchain data, smart contract events and more. This data gets cached so that, even if one of these data sources has issues, the API can always return the latest known info without interruption of service.

:construction: Status: Prototype has been running for over a year now. A major refactor is currently in progress, as well as a redesign of the frontend.
The new version will be implemented generically (extensible to projects other than just Livepeer) and will buffer the state over time (allowing queries for the state at a specific point in time)
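
For the curious, a minimal TypeScript sketch of the ‘serve the last known value’ caching idea described above (not the actual Stronk API code; names and data sources are illustrative):

```typescript
// Minimal sketch of the "last known good" caching idea described above.
// Not the actual Stronk API code; key names and the data source are illustrative.
type CacheEntry<T> = { value: T; updatedAt: number };

const cache = new Map<string, CacheEntry<unknown>>();

async function getWithFallback<T>(
  key: string,
  fetcher: () => Promise<T>,
  maxAgeMs = 60_000
): Promise<T> {
  const hit = cache.get(key) as CacheEntry<T> | undefined;
  // Serve fresh cache hits directly.
  if (hit && Date.now() - hit.updatedAt < maxAgeMs) return hit.value;
  try {
    // Refresh from the upstream source (subgraph, RPC, ...).
    const value = await fetcher();
    cache.set(key, { value, updatedAt: Date.now() });
    return value;
  } catch (err) {
    // Upstream is having issues: fall back to the last known value if we have one.
    if (hit) return hit.value;
    throw err;
  }
}

// Usage: wrap any upstream call, e.g. a (hypothetical) subgraph query.
// const orchestrators = await getWithFallback("orchestrators", fetchFromSubgraph);
```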

Stronk Broadcaster - preview

We are working on turning our Livepeer machines into a CDN and providing a simple website where users can trial the Livepeer network. The website allows visitors to stream into the network from the browser and to view and share these streams with anyone. It includes basic statistics, like the delay of the transcoded video tracks versus the source track

:construction: Status: Global ingest and delivery of live media is enabled. Side-by-side comparison of WebRTC being streamed from the browser versus what the viewers receive is also functional. Next up: streaming through a canvas, so that video tracks can be composited and overlays (or other custom artwork) can be added

Dune Livepeer Dashboard - link

Provides statistics on events happening on Livepeer’s smart contracts

:ballot_box_with_check: Status: Finished, exploring new stats to add

Orchestrator Linux setup guide - link

Bridges the knowledge gap for Windows users who want to improve their setup by running their Orchestrator operations on a Linux machine

:construction: Status: First public release. We have received plenty of feedback from orchestrators, which will help us cover more topics

Orchestrator Discovery Tracker - preview

Pretend Broadcaster which performs discovery checks against all active Orchestrators and makes the response times publicly available. These are the same requests a go-livepeer Broadcaster makes when populating its list of viable Orchestrators

:raised_hand: Status: Prerelease. On hold while we work on the above projects
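
For reference, a hedged TypeScript sketch of what such a discovery probe could look like. The real go-livepeer Broadcaster sends gRPC GetOrchestrator requests over HTTPS; this simplified version only times the TCP handshake to an Orchestrator’s service endpoint:

```typescript
// Simplified latency probe. The real go-livepeer Broadcaster sends a gRPC
// GetOrchestrator request over HTTPS; this sketch only times the TCP handshake
// to an Orchestrator's service URI as a rough stand-in for that check.
import { Socket } from "node:net";

function probe(host: string, port: number, timeoutMs = 3000): Promise<number> {
  return new Promise((resolve, reject) => {
    const start = Date.now();
    const sock = new Socket();
    sock.setTimeout(timeoutMs);
    sock.once("connect", () => { sock.destroy(); resolve(Date.now() - start); });
    sock.once("timeout", () => { sock.destroy(); reject(new Error("timeout")); });
    sock.once("error", (err) => { sock.destroy(); reject(err); });
    sock.connect(port, host);
  });
}

// Example against a hypothetical endpoint (8935 is go-livepeer's usual service port):
// probe("orchestrator.example.com", 8935).then((ms) => console.log(`round trip: ${ms} ms`));
```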


Stake with Captain Stronk


Shoutouts

The Livepeer network has an awesome community of Orchestrators. If you are not yet convinced to stake with Captain Stronk, the following Orchestrators are also good places to stake your LPT:

titan-node.eth: the face of Livepeer and also a public transcoding pool. Hosts a weekly water cooler chat in the Livepeer Discord to talk Livepeer, Web3 or tech in general
video-miner.eth: transcoding pool run by a group of Orchestrators (including myself)
xeenon.eth: a web3 media platform and promising new broadcaster on the network

8 Likes

Stonk is stronk. Stake with stronk :handshake:

2 Likes
Update: progress report on the Stronk Broadcaster project

Besides global transcoding using our hosted Orchestrators and Transcoders, captain-stronk.eth is now capable of global ingest and delivery of live media.


Up next

Create a webpage to manage streams, stream into the network using WebRTC and view statistics

6 Likes

Update to the Stronk Broadcaster project


A few weeks ago the Stronk Broadcaster experiment was established: a CDN powered by MistServer, combined with Livepeer nodes, to enable low latency livestreaming from the browser

You can now visit a prototype to ‘craft’ livestreams from the browser, powered by the broadcast-in-browser component of StreamCrafter, a new project from the OptiMist Video team (the maintainers of MistServer)

Disclaimer: This is a very early release, so expect some unfinished UI elements or instability. The supporting infrastructure is rolled out on a small scale and can only support a small load of streams and viewers
Note that switching to another tab during a broadcast can cause the browser to pause the broadcast to save resources

Features Included

  • Input
    • Camera + microphone
    • ScreenShare + audio
  • Edit
    • Processing input tracks into a new stream (size, position, opacity); see the compositing sketch after this list
  • Output
    • Broadcasting using WebRTC to the Stronk CDN
  • View
    • Share a link to your stream
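
As referenced above, a rough browser-side sketch (plain browser APIs, not StreamCrafter source) of how input tracks can be composited onto a canvas and turned into a single broadcastable stream:

```typescript
// Plain browser APIs, not StreamCrafter source: composite a screenshare and a
// camera onto one canvas, then capture the canvas as a broadcastable stream.
const canvas = document.createElement("canvas");
canvas.width = 1280;
canvas.height = 720;
const ctx = canvas.getContext("2d")!;

const camera = document.createElement("video");
camera.srcObject = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
await camera.play();

const screen = document.createElement("video");
screen.srcObject = await navigator.mediaDevices.getDisplayMedia({ video: true });
await screen.play();

function draw() {
  ctx.drawImage(screen, 0, 0, canvas.width, canvas.height); // screenshare as background
  ctx.globalAlpha = 0.9;                                     // opacity of the overlay
  ctx.drawImage(camera, 940, 510, 320, 180);                 // size + position of the camera
  ctx.globalAlpha = 1;
  requestAnimationFrame(draw);
}
draw();

// The composited canvas is now a regular MediaStream that can be sent over WebRTC.
const output = canvas.captureStream(30);
```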

Short Term Roadmap

  • UI upgrades
  • Input
    • Camera modes: facing, environment
    • images/gifs
  • Edit:
    • add overlays or logos
    • simple free hand drawing
  • Output
    • Ingesting as MKV via HTTP PUT requests
    • Streaming to custom targets (rather than just my CDN)
    • Record to local file

Long Term Roadmap

  • More UI upgrades
  • Mobile compatible UI
  • Input:
    • Other live streams
    • Tracks emitted from WebRTC peers
  • Edit:
    • Generate audio effects

Other areas being researched

  • web3 integration: login with metamask, dstorage, dcdn
  • supporting a VoD workflow rather than live only
  • add (local) audio clippings as a (loopable) input source
  • transcode audio between e.g. Opus/AAC before broadcasting
  • Using low-level WebCodecs API to support more inputs and outputs

If there are any feature requests, feedback, ideas or questions on how this all works, ask away on Discord or this Forum post

6 Likes

Introducing the latest expansion of the renowned Stronk operation! We are excited to share our latest venture: operating a hive of Swarm Bees

Our integrated stack combines the strengths of MistServer (CDN, Ingest), Livepeer (Transcoding), Swarm (Storage, Messaging), and an innovative browser-based broadcaster.

Stake to captain-stronk.eth and support us on an exciting journey as we merge cutting-edge technologies and explore the potential of this groundbreaking combination. Together, they create an unparalleled ‘Media Gateway’ that will redefine the future of media experiences.

3 Likes

Integration with Swarm for content delivery

Standards on building a media pipeline are quite high nowadays. People are spoiled by platforms like YouTube, Twitch and Netflix providing them with content which loads instantly, on any device and at a low latency.

To meet modern standards, MistServer is used for ingest and content delivery. This way we can:

  • Do realtime audio transcoding to support low latency streaming.

WebRTC uses Opus audio, while the audio ingested is most commonly AAC. This requires realtime audio transcoding. Similarly, if WebRTC is used for ingest, the audio needs to be transcoded to AAC to support delivery over HLS (a minimal sketch of such a transcode follows this list)

  • Support tons of transport protocols

Depending on the network stability and device type, different transport protocols are more suited for content delivery. By leveraging the full potential of MistServer we can provide a pleasant viewing experience under any condition

  • Transmux to different containers

A transport protocol only supports a limited set of containers. For example, HLS only supports TS or MKV segments. Having a proper media server in the stack means we can transmux our content before delivery to enable any kind of transport protocol
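
As referenced above, a minimal sketch of the realtime audio transcode, driving FFmpeg from Node. MistServer handles this internally; the command below only illustrates the idea and assumes an FFmpeg build with libopus on the PATH (the ingest URL is a placeholder):

```typescript
// Not MistServer's internals: a stand-in showing the AAC -> Opus leg of the
// realtime audio transcode, with the video track passed through untouched.
import { spawn } from "node:child_process";

const ffmpeg = spawn("ffmpeg", [
  "-i", "rtmp://localhost/live/source",   // hypothetical AAC ingest URL
  "-c:v", "copy",                         // video is left alone (Livepeer handles it)
  "-c:a", "libopus", "-b:a", "128k",      // realtime AAC -> Opus for WebRTC delivery
  "-f", "matroska", "pipe:1",             // MKV on stdout (supports H.264 + Opus)
]);

ffmpeg.stderr.on("data", (chunk) => process.stderr.write(chunk));
// ffmpeg.stdout can now be piped onwards to the delivery layer.
```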

A big part of this project is exploring the viability of a fully decentralised workflow, with a CDN powered by Swarm alone. Since Swarm is not built for delivering media content, there are lots of unknowns: Can we get the latency low enough? How does the cost compare to simply self-hosting S3 compatible storage using MinIO? How easy is it to integrate? What is the device support like without being able to transmux content?

There are two approaches we can take here:

File based approach

The most obvious option is a file-based approach: segment the media into MKV or TS segments and write out an HLS playlist, which can then be shared with viewers. This method inherently has a high latency and might come with significant storage costs, but would be a fairly robust solution with good device support
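
To make the file-based idea concrete, a small sketch that builds an HLS media playlist pointing at already-uploaded segments (segment URIs and durations are placeholders):

```typescript
// Sketch of the file-based approach: media is cut into segments, uploaded to
// Swarm (or any object store), and an HLS media playlist points players at them.
function buildHlsPlaylist(segmentUris: string[], segmentDuration: number): string {
  return [
    "#EXTM3U",
    "#EXT-X-VERSION:7",
    `#EXT-X-TARGETDURATION:${Math.ceil(segmentDuration)}`,
    "#EXT-X-MEDIA-SEQUENCE:0",
    ...segmentUris.flatMap((uri) => [`#EXTINF:${segmentDuration.toFixed(3)},`, uri]),
    "#EXT-X-ENDLIST", // omit this tag for a live (sliding-window) playlist
  ].join("\n");
}

// e.g. buildHlsPlaylist(["https://gateway.example/bzz/<hash>/seg0.ts", "…/seg1.ts"], 6);
```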

Streaming workflow

Since Swarm also supports feeds we want to explore streaming protocols, like WebM or MP4 over WebSocket, and deliver those using Swarm feeds. In theory this would enable low latency streaming, but since we would be the first to try this approach there are no guarantees this method is stable enough. One thing we want to measure when using this approach is the jitter. A high jitter would mean that players still need to keep a fairly significant distance to the live point so no interruptions in playback occur

Both of these approaches are being implemented in MistServer initially by directly interfacing with a local Swarm Bee, as this would allow us to get started quickly and get some hard data on the viability of Swarm as a CDN. If the results are good, the browser based media studio StreamCrafter will also be integrated with Swarm directly. This would enable publishing and viewing directly from one webapp, which can be hosted on dStorage as well for a fully decentralised media platform

TL;DR

In conclusion, with MistServer we can achieve low latency streaming, support multiple transport protocols, and seamlessly transmux content. We are working towards enhancing the media pipeline by integrating Swarm for content delivery. However, we can’t achieve this vision alone. We invite you to support our research and development efforts by staking to our node. Join us in shaping the future of media delivery today.

4 Likes

Web2 vs Web3: Exploring Storage and Messaging Solutions

In earlier posts we talked about the potential of Swarm as a storage and messaging solution for web3 applications. However, as this is still cutting edge technology there are unknowns in terms of availability, data persistence, price, features and performance.

In order to make meaningful conclusions about the viability of Swarm as a platform for media pipelines, comparisons have to be made with traditional solutions. To that end we have rolled out two additional pieces to our media-cluster:

Storage

Storage is required to be able to record livestreams and upload videos. MistServer supports replication between nodes and creating clips from streams effortlessly, but in order to store content for an indefinite duration, a solution for storage is required

MinIO is an S3-compatible storage layer which can easily be self-hosted and can provide redundancy across multiple regions. There’s a cost to this, as you have to host a server with plenty of storage in each region. However, once it is running, it is trivial to write to and read from this storage layer at no additional cost and to manage access control.

This will provide us with a baseline to which we can compare Swarm’s storage layer and allows us to offer multiple options depending on the users’ preferences
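
For illustration, a short sketch of writing and reading a recording against a self-hosted MinIO endpoint through the standard S3 API (endpoint, bucket and credentials are placeholders):

```typescript
// Store and fetch recordings against a self-hosted MinIO deployment via the S3 API.
import { S3Client, PutObjectCommand, GetObjectCommand } from "@aws-sdk/client-s3";
import { readFile } from "node:fs/promises";

const s3 = new S3Client({
  endpoint: "https://minio.example.com:9000", // placeholder MinIO endpoint
  region: "us-east-1",                        // MinIO accepts any region string
  forcePathStyle: true,                       // required for MinIO-style URLs
  credentials: { accessKeyId: "ACCESS_KEY", secretAccessKey: "SECRET_KEY" },
});

// Store a finished recording.
await s3.send(new PutObjectCommand({
  Bucket: "recordings",
  Key: "streams/example/2024-01-01.mkv",
  Body: await readFile("recording.mkv"),
}));

// Fetch it back later for VOD playback or clipping.
const vod = await s3.send(new GetObjectCommand({
  Bucket: "recordings",
  Key: "streams/example/2024-01-01.mkv",
}));
```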

Messaging

Communication between media-workers and the media-oracle in a cluster can be sensitive. In order to make deploying and managing nodes easier we want a solution where media-workers can broadcast their availability to the oracle, send triggers to the API and receive trigger responses from the API. media-workers can also sync their config between each other using this architecture

Messaging over Swarm requires premining trojan chunks and has a cost, as a stamp has to be attached to each message. We expect this to be more robust than relying on a small set of self-hosted MQTT brokers, and it can be very useful in the discovery process. At the same time it’s important to have a fallback solution to manage costs for high volume channels like the triggers coming from media-workers

For our stack we opted to roll out Eclipse Mosquitto, an open source message broker that implements the MQTT protocol. This will give our stack an additional layer of communication besides Swarm and direct HTTP requests between nodes
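
As a rough sketch of the media-worker to media-oracle messaging described above, using the mqtt client library against a Mosquitto broker (topic names and payloads are illustrative, not our actual schema):

```typescript
// Illustrative media-worker side of the messaging layer: announce availability to
// the oracle and listen for trigger responses over a Mosquitto broker.
import mqtt from "mqtt";

const client = mqtt.connect("mqtt://mosquitto.example.com:1883"); // placeholder broker

client.on("connect", () => {
  // A media-worker announces its availability so the oracle can discover it.
  client.publish("media/workers/worker-01/status", JSON.stringify({
    host: "worker-01.example.com",
    capacity: { streams: 20 },
    ts: Date.now(),
  }), { retain: true }); // retained, so a late-joining oracle still sees it

  // It also listens for trigger responses addressed to it.
  client.subscribe("media/workers/worker-01/triggers/#");
});

client.on("message", (topic, payload) => {
  console.log(`trigger response on ${topic}: ${payload.toString()}`);
});
```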

TL;DR

In conclusion, we rolled out MinIO for storage and Mosquitto for messaging in our media-cluster. These additions provide a baseline for comparing Swarm with traditional solutions.
We invite you to stake to our node and support our research and development as we explore the options for a fully-featured, web3 powered media pipeline

1 Like

:movie_camera: Status update on the Stronk Broadcaster project :film_projector:


Hey everyone! Here to share a brief update on the Stronk Broadcaster project, which has been progressing nicely over the past weeks

Setting up a media pipeline consists of many different parts which all need to come together. You’d want a stack which is easily deployable, maintainable and scalable. It needs to support features like recording, low latency livestreaming, access control and modern transport protocols.

But, even if you manage to create the best media pipeline in the world, that is not enough in and of itself to generate more demand for the network. We believe that the most important piece is the interface where creators and their followers can find each other and interact

A couple of months ago we introduced the StreamCrafter, an in-browser broadcasting studio to compete with the likes of OBS studio.
Powered by the Stronk Media pipeline we’ve been building up over the past few months, this is the place where content creators can let their imagination run wild and livestream from any device straight from the browser
When doing user testing with the prototype, we got tons of valuable feedback on what worked well and what didn’t. People loved how easy it was to do low latency streaming from the browser and how you can composite multiple screenshares, cameras and other assets onto a single canvas to broadcast.
What people didn’t like was the interface. This has been our main focus the past few weeks and our internal release now has a completely rewritten interface and an additional TCP based ingest method, which will provide an even more robust experience for broadcasters and viewers using the StreamCrafter

But wait, there’s more!
Since we are quickly approaching a releasable state of the StreamCrafter, we want to maintain this momentum and immediately move on to the next milestone of the Stronk Broadcaster project.
We’ve been in contact with the Swarm foundation and are excited to announce we’ve been accepted into their grants program to write a research paper on various methods to store and deliver media content on Swarm. This research will be an invaluable resource for any builder who wants to integrate Swarm into their stack, giving insights into how they can achieve the lowest cost, widest device/player support and quickest time to load and seek

In the next post we will provide a full overview of what features and subprojects are part of the Stronk Broadcaster project, how far along they are and an estimate of when we feel confident to move from the prototyping stage into the first alpha release
Stay tuned for more info on the StreamCrafter and the infrastructure that powers it, and don’t forget to stake to our node to support our development efforts

4 Likes

:movie_camera: Status update on the Stronk Broadcaster project :film_projector:


Hey everyone!

Development of the Stronk pipeline originally started as an experiment to put myself in the shoes of a new broadcaster, with the goal of setting up a global, load balanced media pipeline capable of low latency livestreaming
After that, plenty of detours have been made to dive deeper into specific integrations that could be done to make the life of a video builder or system integrator easier.

Now, some of you might be wondering what the actual progress is on the project, so today we are here for a quick overview of how our work is coming together:

:wrench: What are we making?

We’re bringing together all individual parts of our work over the last months to develop a reference implementation of a fully-featured web3-powered media pipeline for both live and video-on-demand content. This stack is designed to be transparent, easily deployable and modifiable to power your own solutions. We want to cover all bases, from deploying the software to your infrastructure to delivering the media to a sample webapp.

The Stronk Media Pipeline:

  • Ingest: SRT/RTSP/RTMP, WebRTC, MKV [1] and more
  • Transcoding: local audio transcoding and Livepeer video transcoding
  • Inter-node communication and system health: Mosquitto
  • Global load balancing
  • Recording/VOD: Swarm & S3 (MinIO)
  • Delivery: HLS, WebRTC, WS/MP4, MP4, WebM and more
  • Interface: StreamCrafter

[1] NEW browser-based protocol as a supplement to WebRTC ingest. By streaming MKV data through a TCP socket or WebSocket you still get a realtime streaming experience, but with much more resilience against poor network conditions or packet loss. Since MKV has wide codec support, you can immediately deliver tracks without requiring transcoding to enable wider device support. This method of ingesting has been added to MistServer specifically to provide a better browser-based broadcasting experience
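
A hedged browser-side sketch of this ingest method. MediaRecorder emits WebM, which is a Matroska subset; the ingest URL is a placeholder and the exact endpoint MistServer expects may differ:

```typescript
// Illustrative browser-side sketch: instead of WebRTC, push Matroska/WebM data
// over a WebSocket. Endpoint and stream key are placeholders.
const media = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
const ws = new WebSocket("wss://ingest.example.com/mkv/stream-key");
ws.binaryType = "arraybuffer";

const recorder = new MediaRecorder(media, { mimeType: "video/webm;codecs=vp8,opus" });
recorder.ondataavailable = (ev) => {
  // Each timeslice yields the next chunk of one continuous WebM stream.
  if (ev.data.size > 0 && ws.readyState === WebSocket.OPEN) ws.send(ev.data);
};
ws.onopen = () => recorder.start(250); // emit a chunk every 250 ms

// TCP-based transport: chunks arrive in order even under packet loss, trading
// dropped frames (as with WebRTC) for a bit of added latency.
```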

:dart: Why?

We want to be prepared for the next wave of video builders and prospective Broadcasters looking to get started with live media or VOD. The idea is to provide a barebones package which exposes the full potential of MistServer and provides all the base functionality a new Broadcaster would need, ready to be tailored to their specific use cases.
By having a reference implementation in place, we can easily onboard new prospects and provide them with the necessary tools to get going as quickly as possible

:package: Deliverables:

  • A browser-based broadcasting studio for seamless ingest and viewing using the media pipeline
  • An overview of the various components that make up the media pipeline, explaining how they interact with each other
  • Easily deployable packages, simplifying setup and scaling the operation

:muscle: What’s working:

We’ve made significant progress on the following core components:

  • Ingest: Smooth ingest of any of MistServer’s supported protocols
  • Transcoding: Local surround-to-stereo audio downmixing using GStreamer, as well as conversion between AAC and Opus formats. Any video transcoding tasks are handed off to the Livepeer Broadcaster node, with dynamic transcode profiles based on the source stream (a sketch of such a profile ladder follows this list)
  • Content delivery: Seamless, global load-balanced delivery to viewers using any of MistServer’s supported protocols
    With realtime delay on the source track and ~3-5 seconds delay on the transcoded tracks
  • Recording: Recording or clipping live content
  • StreamCrafter: Browser-based broadcasting studio
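
As referenced in the transcoding item above, an illustrative sketch of deriving a transcode ladder from the source stream. The profile shape mirrors what a Livepeer Broadcaster accepts, but the rungs and bitrates are made up for the example, not our production values:

```typescript
// Derive a set of transcode profiles (renditions) from the source resolution.
interface TranscodeProfile { name: string; width: number; height: number; bitrate: number; fps: number; }

function buildLadder(srcWidth: number, srcHeight: number, srcFps: number): TranscodeProfile[] {
  const rungs = [
    { height: 1080, bitrate: 6_000_000 },
    { height: 720, bitrate: 3_000_000 },
    { height: 480, bitrate: 1_500_000 },
    { height: 360, bitrate: 800_000 },
  ];
  return rungs
    .filter((r) => r.height < srcHeight) // never upscale above the source
    .map((r) => ({
      name: `${r.height}p`,
      width: Math.round((srcWidth / srcHeight) * r.height / 2) * 2, // keep dimensions even
      height: r.height,
      bitrate: r.bitrate,
      fps: Math.min(srcFps, 30),
    }));
}

// buildLadder(1920, 1080, 60) -> 720p / 480p / 360p renditions
```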

:hourglass_flowing_sand: What’s in progress:

We’re actively working on the following features:

  • Storage on Swarm: Integrating Swarm for efficient and decentralized content storage
    We have a running Grant from the Swarm foundation for integrating and researching various methods for storage and delivery of media content on Swarm
  • API integration: Implementing access control, database connectivity, and the ability to read on-chain data
  • MQTT integration: Enabling config synchronization, stream health, and inter-node communication
  • VOD uploading

Once the above is finished we will have the full media pipeline proof-of-concept up and running. The next steps would be:

  • A long roadmap of new features and enhancements to the StreamCrafter
  • Packaging the stack into media-oracle and media-worker packages
  • Deployment guides: Comprehensive documentation to assist users in deploying their own media pipelines.
  • Wallet management: Establishing a single escrow wallet to efficiently manage B’s and Bees.

:new: There are other experiments in progress, like deploying the entire stack to a single-board computer, boxing it up in an easily scalable and portable package with local video transcoding. More on that later

:date: Timeline

The StreamCrafter will be unveiled at IBC2023 on Saturday 16 September at 17:00. This will be the initial public Beta release. A code repository and StreamCrafter-specific roadmap will follow soon after that

After that the focus will be on Swarm integration: being able to record to and stream through Swarm. We estimate 4-8 weeks to complete the Grant, which will unlock full streaming and recording capacity in MistServer.

As a conclusion to the year we want to finish up API integration. Here the goal is to make a limited number of streams, recordings and transcodes available to Delegators, with overcapacity being free to trial by anyone


We’re thrilled about the progress we’ve made so far and can’t wait to bring the Stronk Pipeline to the video builder community! Stay tuned for more updates and don’t forget to stake to support our research and development as we explore the options for a fully-featured, web3 powered media pipeline! :muscle::rocket::new_moon:

5 Likes

To all video builders,

Building a platform using live media or VOD is tough. Most of you are already familiar with Livepeer Studio, which offers the full feature set of a media pipeline, including transcoding on the Livepeer network, for an affordable price

In this post however I want to highlight MistServer: an open-source media toolkit for developers who want to build and self-host their media pipeline

MistServer provides you with all the tools you need: support for any protocol, transport method, DVR, an embeddable player with telemetry and, most importantly, integration options which give you exact control over the flow of media and over who is allowed to access your content
MistServer is tightly integrated with Livepeer and is able to provide transcoding using Livepeer Studio or your own set of Broadcaster nodes
Note that building your own pipeline is not for the faint-hearted - do check out Livepeer Studio to see if that fits your use case

Our software is used for a wide variety of use cases: from ingest and CDN for livestreaming platforms like picarto.tv to self-driving cars which need realtime transport of a video feed

Today I want to highlight our latest milestone: a complete revamp of the documentation [1] [2]
This is just one step in our journey to reposition MistServer, with a completely rewritten website, API and interface following over the coming months

Our next release for MistServer is just around the corner, which will feature:
Clipping streams and VODs into any output format, rewritten HLS input support, tons of fixes, and the first public release of the StreamCrafter (which required some of these fixes)

If you need help to get started or have feedback, don’t hesitate to ask me or contact us @ info@mistserver.org

[1] https://docs.mistserver.org/
[2] Docker
(The docker image will get some more love soon-ish with pre-configured deployments, but this image can get you started quickly already)

2 Likes

:link: Status update on the Stronk API & Explorer project :unlock:


While we anxiously await the next MistServer release for the official alpha release of the StreamCrafter, progress has been made on the Stronk API & Explorer project (formerly known as the ‘Orchestrator API and supplementary explorer’)

This project was once started at the request of Ryan (NightNode), who wanted insight into what was happening on the Livepeer smart contracts. Back then Livepeer (and Web3 in general) was new to me, but something functional has been up and running since March 2022

The Stronk API & Explorer have been a useful tool to power Grafana dashboards, websites and even other APIs, providing them with reliable and quick access to coin prices, smart contract events, ENS data and more

There are three phases to reach the end vision we have in mind for this project:

Phase 1: Feature parity

The current API contains a lot of ‘prototype grade’ code. This makes it tough to maintain, extend or deploy. We are currently in the process of rewriting the entire API to open the way for the next two phases. Once the new API offers all of the features of the old API, the Stronk explorer will switch over and the old repository will be archived.

Phase 2: Subgraph substitute

The Stronk API processes smart contract events just like the Livepeer subgraph. However, we still require the subgraph for specifics like the Delegators or the total stake of an Orchestrator.

This phase will focus on adding more complicated reducer functions to our event parser until we can remove the subgraph as a dependency
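
For a sense of what these reducer functions do, a minimal sketch (event and field names are simplified placeholders, not the actual contract ABI):

```typescript
// Replay contract events in order and fold them into the current state: the same
// job the Livepeer subgraph performs with its reducer functions.
interface BondEvent {
  kind: "Bond" | "Unbond";
  delegator: string;
  orchestrator: string;
  amount: bigint; // LPT amount in base units
}

interface OrchestratorState {
  totalStake: bigint;
  delegators: Map<string, bigint>;
}

function reduce(state: Map<string, OrchestratorState>, ev: BondEvent): void {
  const orch = state.get(ev.orchestrator) ?? { totalStake: 0n, delegators: new Map() };
  const prev = orch.delegators.get(ev.delegator) ?? 0n;
  const delta = ev.kind === "Bond" ? ev.amount : -ev.amount;
  orch.delegators.set(ev.delegator, prev + delta);
  orch.totalStake += delta;
  state.set(ev.orchestrator, orch);
}

// Replaying the full event history through reducers like this yields the aggregates
// (per-delegator stake, total stake) that currently come from the subgraph.
```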

Why?

There are two main reasons why we’re not skipping phase 2:

  • The subgraph has been unreliable at times. We want our API to have recent data available without delay or interruption
  • Querying the subgraph is not free: someone is paying for those queries. The Stronk API runs on any free tier offered by RPC providers

This does not mean we don’t like the Graph protocol. Their Subgraph Studio and hosted service make it way easier for any project to get started. But since there is a clear path to giving our API similar utility by porting over the Livepeer subgraph reducer functions, why not take the time to remove this fairly significant dependency?

Phase 3: Buffered state

Lastly, being able to query the state at a specific point in time is an interesting possibility. The idea is to encode the state similar to video data to allow us to rewind to a specific point in time. Since the state is way easier to encode than a video frame, emitted events are not that frequent and data is immutable, we expect to have no issues with query speed or the amount of storage required to store the entire state over time.
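
A minimal sketch of the keyframe-plus-delta idea (types and the snapshot layout are illustrative):

```typescript
// Keyframe-style snapshots of the full state plus the deltas (events) in between.
// To answer "what was the state at time T", load the last snapshot before T and
// replay only the deltas up to T.
interface Snapshot<S> { time: number; state: S; }
interface Delta<S> { time: number; apply: (state: S) => S; }

function stateAt<S>(snapshots: Snapshot<S>[], deltas: Delta<S>[], t: number): S {
  // Find the last snapshot at or before t (snapshots are assumed sorted by time).
  const base = [...snapshots].reverse().find((s) => s.time <= t);
  if (!base) throw new Error("no snapshot before requested time");
  // Replay the (infrequent, immutable) events between the snapshot and t.
  return deltas
    .filter((d) => d.time > base.time && d.time <= t)
    .reduce((state, d) => d.apply(state), structuredClone(base.state));
}
```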

3 Likes

Update on the Stronk Orchestrator


Due to recent pricing changes for transcode work, ETH fees earned from transcoding have decreased significantly. As a result we have been taking a good look at our income and expenses to see where we could offset some of these losses

First of all, Delegators might have noticed that the reward commission was changed to 15% and the fee commission was set to 0%. This way the Stronk operation receives a steady, predictable income stream while any excess revenue goes straight to our Delegators. The idea is that this reward commission is adjusted once every few months, just like we’ve done in the past
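
As a back-of-the-envelope illustration (simplified; actual protocol accounting has more moving parts):

```typescript
// What a 15% reward cut / 0% fee cut means, in simplified terms.
const rewardCut = 0.15; // share of newly minted LPT rewards kept by the orchestrator
const feeCut = 0.0;     // share of ETH fees kept by the orchestrator

const mintedThisRound = 100; // say the node is allocated 100 LPT of rewards in a round
const keptByOrchestrator = mintedThisRound * rewardCut;    // 15 LPT
const sharedWithStake = mintedThisRound * (1 - rewardCut); // 85 LPT, split pro-rata over
// all stake delegated to the node (including the node's self-stake). With a 0% fee cut,
// ETH fees are passed through to stakers in the same pro-rata way.
console.log({ keptByOrchestrator, sharedWithStake, feeCut });
```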

Another change is to our transcoding setup: the Stronk Orchestrator is a global operation, trying to cover any area where transcode demand comes in from Broadcaster nodes. When we first started 23 months ago, we had three machines to cover the US (Las Vegas, Chicago, New York). Eventually this turned into one cloud GPU in Las Vegas and one colo machine in Michigan

This new setup was still quite costly, had some odd GPUs in the mix (P4s) and still provided excess capacity, especially since traffic on the west coast would only really rise if there were issues with B nodes on the east coast. Pon-node found a great deal for running colo machines in Kansas, and through testing with Stronk Broadcasters we found that the location was suitable for replacing the Las Vegas and Michigan machines

So effective immediately, transcodes in the USA will go through our new machine in Kansas, giving:

  • realtime transcoding to broadcasters located anywhere in the USA
  • a better GPU, so no reduction in capacity
  • a low cost, especially due to subletting spare colo machines to other Orchestrators

This should give us the required cost reductions which, together with the recent stake increase, mean the reward commission can stay at 15%. The current commission rates will stay in effect until the next major stake change or LPT price change

3 Likes

:moneybag:LPT treasury recipient


In the past, Livepeer Inc was the only entity funding grants: developers could apply for funding for their application, tooling or other ideas benefitting the Livepeer ecosystem

With the recent addition of the Delta proposal, inflationary rewards have been accumulating in the Livepeer treasury to allow the network as a whole to distribute funds for ideas or projects, like developing an application built on Livepeer or expanding the capabilities of the network. Any participant in the network can use their stake to vote in favour of or against funding a proposal

The first two requests to the treasury have been put in by @Pon, who is requesting funds to reward active and high performing Orchestrators as well as reward Orchestrators with significant contributions to the Livepeer network

It’s a pleasure to announce that the Stronk Orchestrator has been nominated to receive 3750 LPT for our contributions
Even though the proposal has not passed yet, with 95% support it is highly likely to pass

If the proposal passes, we intend to use the received tokens for the following:

  • We want to participate in the upcoming alpha for AI compute jobs, which will add job types like stable video diffusion or upscaling. A part of the tokens would be sold in order to build an enterprise-grade server, equipped with a Threadripper CPU and 4080/4090 GPUs
  • We want to convert a smaller amount of tokens to ETH, to have some capital on hand for other investments and fund any future Stronk Broadcaster related activities
  • The biggest chunk of the reward will be staked to promising new Orchestrators who are at risk of getting bumped out of the top 100

We would like to thank @pon for making the effort to recognize Orchestrators who have been actively involved in the supply side of the protocol. It certainly motivates me to contribute more in the upcoming year and hopefully it also motivates more network participants to contribute in their own way

6 Likes

Hi @stronk,
Glad it made you feel appreciated and hopefully it will help you to carry on with your contributions to the network!

4 Likes

:globe_with_meridians: New year, new website!


In order to provide a more professional overview of our activities to prospective or current Delegators, a new website has been launched!

Since the Forum doesn’t allow for editing posts after a certain amount of time, we’ll make sure to keep the website up to date with the latest status of our projects. Any further updates will be crossposted to the official news page as well as here on the Livepeer forums.

4 Likes

Update on the Stronk Orchestrator :chart_with_upwards_trend:


It has been 765 days since our Orchestrator first came online. Initially this was an experiment to learn more about the protocol after Livepeer acquired MistServer and we became core contributors.
However, after earning the first winning ticket and seeing the positive and helpful community, we were hooked and started running the Orchestrator as a hobby, looking for ways to contribute our time and expertise back to the protocol.

Ever since then it has been a slow and steady course to gather more stake and optimise our setup. Over this time we’ve grown to be the 5th largest Orchestrator and the 2nd largest earner on the network. In this small update we want to share some stats on how the Stronk Orchestrator grew over time

In total we’ve earned over 7.5 ETH in transcoding fees.

Most of this has been distributed to our delegators, given that our fee commission was at or below 30% for most of that time.

With all of the recent stake coming in over the past three months, we’ve reached a total stake of 359,821.47 LPT, almost 3% of all active stake on the network!

Given current utilisation levels and with the mission to retain the top spot in fee earnings, we will be adding a few extra GPUs to our US setup. The Stronk Orchestrator will be ready to take on any increase in transcode demand and AI compute

6 Likes

Stronk Upgrade :muscle:


Out with the old, in with the new.

The media industry is moving forward. Livepeer is evolving. The most common video codec (H264) has been around for 20 years and platforms are slowly transitioning to modern formats. AV1 is now the promised child to save our souls from visual artefacts caused by rigid macroblocks, while also addressing some of the patent and licensing issues associated with other video compression standards.

To adapt to this shifting landscape, we are future-proofing our EU setup by switching out our 1080Ti’s with RTX 4060Ti cards. These cards give us access to the latest generation of Nvidia encoders, providing a higher quality per bit as well as unlocking additional codecs and pixel formats.

Is Nvidia still king?

Upcoming upgrades to go-livepeer and its dependencies unlock the potential to transcode on alternative hardware. Recent versions of FFmpeg now support VPUs like NetInt cards and other GPUs like Intel Arc. With new VPUs like AMD’s MA35D on the horizon, the market is becoming more competitive.

Each hardware encoder is different. Especially for a relatively new codec like AV1, the quality per bit and speed of a transcode are going to differ greatly per card. Intel Arc has already proven itself to be at least on par with Nvidia at a lower price point.

We’re closely monitoring the progress of go-livepeer and the efforts of other contributors as they work towards enabling AV1 encoding. As soon as these foundations have been made, we’ve got an Intel Arc GPU ready to integrate and start making some apples-to-apples comparisons. Can we actually see a noticeable difference in output quality? How many streams can we transcode at the same time?

Stay tuned as we continue to push the boundaries of what’s possible in the world of live video streaming.

6 Likes

:building_construction: Stronk Livepeer update


2023 saw a lot of R&D efforts with projects being started, but not quite pulled over the finish line. In this post we will summarise our current priorities and commitments.

Current Projects

  • Stronk API / DIY subgraph
    The rewrite of the API is nearly complete. The only major components left to port over are the event parser and monthly summaries. After this we will add some extra reducers to track info like earnings per Delegator. This will enable the API to spit out detailed reports for tax purposes! The new API also allows for tracking all LPT transfers between addresses and within the smart contracts. This way we can accurately analyse where all staked or unstaked LPT are distributed.

  • Stronk Explorer redesign
    The stronk.rocks website is due for a visual overhaul. Since we have to make some changes to use the new API, the website will get a redesign. Especially for the smart contract explorer we want to make better use of screen real estate and have better filtering and searching options.

  • Stronk media pipeline
    We already have a few helpful dashboards online, like the mainnet pricing overview. The pipeline performs great and exposes the full potential of MistServer. It allows for transcoding using Livepeer, as well as ultra low latency hardware accelerated transcoding directly from the media server in case the operator has their own GPUs to hook up.
    However, deployment is manual and we’re still missing the API to control access to the pipeline by Delegator status and available capacity.
    Our first priority here will be to finish said API and get a demo website online, integrated with the StreamCrafter, so that anyone can have free access to Streaming->Transcoding->Viewing.

  • StreamCrafter
    The StreamCrafter is in a decent state at the moment. It just needs a public repository, documentation and a website.

  • Intel QSV integration and testing
    More info on this further in the post.

Commission rate

With all the stake increases the past few months and the increase in LPT price, Stronk Tech is receiving record amounts of LPT. In the past we’ve adjusted the LPT commission rate to try and get as close to an 8% reward cut as possible. For now we’re not lowering the reward cut just yet. Rather, we want to use the current revenue stream to scale up our operations a bit. A lower reward cut will still happen if we can’t deliver on our goals or revenue keeps increasing like it has.

Goals for the year

We have some catching up to do to pull our existing projects over the finish line. We’re also keen to look into other ecosystem needs, like taking on running the Livepeer testnet. Running the testnet fits in nicely with our other projects, which are mostly aimed at supporting other node operators and helping onboarding new network participants.

Because of this we’ll be increasing our time commitment and looking into hiring a contractor to help out with any frontend work like improving the UI of the StreamCrafter.

We’re also still missing a dedicated AI rig. We do have two GPUs (RTX 4080 & RTX 4000 ADA) hooked up and ready to take on any jobs, but our goal is to have two 4090s in a dedicated rig just for the AI pipeline.

Up next

Based on the hard work of Brad enabling AV1 and upgrading the FFmpeg base which Livepeer uses, a new project we’re taking on is adding hardware-accelerated QSV support to go-livepeer.

This will allow for hardware-accelerated transcoding using Intel CPUs & GPUs. Up to this point CPU transcoding hasn’t been recommended, as it uses a software-based transcode. These look great (a software transcode always looks better than a hardware transcode) but are quite slow. Since Intel CPUs actually have quite a good encoder built in, why not use it?
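
For a rough idea of what QSV transcoding looks like at the FFmpeg level (the actual go-livepeer integration happens through its FFmpeg bindings rather than by shelling out like this; it assumes an FFmpeg build with QSV support and an Intel GPU, and av1_qsv needs an Arc-class encoder):

```typescript
// Illustrative QSV transcode: decode H.264 and encode AV1 on the Intel media engine.
import { spawn } from "node:child_process";

spawn("ffmpeg", [
  "-hwaccel", "qsv",        // decode on the Intel media engine
  "-c:v", "h264_qsv",
  "-i", "source.mp4",       // placeholder input
  "-c:v", "av1_qsv",        // encode to AV1 on the same hardware
  "-b:v", "3M",
  "out.mkv",
], { stdio: "inherit" });
```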

In our new small form factor build we will benchmark the performance of all of our GPU’s for transcoding AV1, H265 and H264. This will include cards like the 1080Ti, 4000ADA, 4060Ti, 4080, Intel Arc, and Intel integrated graphics on the CPU.

This allows us to answer questions like: can Intel Arc GPUs compete with Nvidia on capacity? Can pools open up their doors to CPU transcoders, or will we have to wait till 14th gen processors with the Arc GPU built in become more widespread?


Stay tuned for the results of the benchmarks and don’t forget: now is the perfect moment to stake to Stronk Tech and support our development efforts. With the crypto market reaching all time highs, you don’t want to wait too long!

6 Likes

Update on the StreamCrafter :screwdriver:


From humble origins, the first public release of the StreamCrafter!
No installation. No configuration. No hassle. Just ultra low latency livestreaming straight from the browser.

The alpha release of the StreamCrafter is now available as an NPM package.

Given that the package is currently being maintained by a single backend engineer, we are actively looking for contributors to join in on the fun. Most notably, we’re in the process of hiring a contractor to:

  • Rewrite UI components. From optimisations for small screen devices to guided first-time usage. The interface needs the most love & attention at the moment.
  • Refactor the code to be in line with industry standards. We want the project to be maintainable and extendable by anyone who wants to contribute.

If you’re interested in contributing or have feature requests, send me a message or check out the repository.

If you’re interested in following along with the development, see the official website.
Extensive documentation on adding the StreamCrafter to your platform and integrating it with a CDN (Livepeer Studio, or host your own free CDN using MistServer!) will follow soon-ish.


Stay tuned for updates or announcements regarding the onboarding of new contributors. Development of the StreamCrafter is funded by MistServer and Stronk-Tech. Thanks to the support of our Delegators we’re able to maintain & publish the project for free!

2 Likes

:robot: Announcing Stronk AI Alpha enrollment


As promised when receiving the treasury rewards for developing orchestrators, we’ve staked a significant amount of LPT with two small Orchestrators: 500 LPT at Bemanode and 2000 LPT at Epitonium

Another goal we had in mind was to get ready for AI inference jobs. So as of last week, we have the Stronk-Gateway loaded up with some ETH reserve, which has already sent out a couple of winning tickets to AI workers!

Of course we’re also participating in the AI pipeline with our own prosumer-grade hardware:

  MOTHERBOARD     1 x Asrock TRX50 WS
  RAM             4 x Kingston FURY Renegade Pro / 32GB / DDR5 / 6000 MHz
  HDD             2 x Seagate Exos X18 16TB
  PSU             1 x Corsair AX1600i 1600W
  CPU             1 x AMD Ryzen Threadripper 7970X CPU
  CPU cooler      1 x Noctua NH-D9 TR5-SP6 CPU COOLER
  SSD             1 x Samsung 990 PRO 4TB M.2 SSD 
  Case            1 x Corsair 7000D Airflow
  GPU             2 x MSI GeForce RTX 4090 SUPRIM LIQUID X 24G

Community Contributions

But wait, there’s more! The new AI pipeline is exciting stuff. While the core AI team has been working diligently on enabling Orchestrator discovery and selection, we went ahead and did some experimentation with the fun part: trying out new models and finetuning parameters. All of the changes we made in our fork are being upstreamed to enhance the AI pipeline for everyone.


Announcing your AI playground

In order to test all these changes to the core AI pipeline, we’ve created a simple dashboard which allows you to test our forked version of the AI pipeline even before our changes get merged upstream. Now with a mobile-friendly layout, visit https://inference.stronk.rocks/ to start crafting your own images and videos!

5 Likes