Make the PPP great again

Hello Livepeer community,

The distribution of encoding tasks among the 100 orchestrators is defined by a broadcaster-side algorithm.
Until recently, this algorithm only took the weight of the stake into account.
However, due to the significant disparity in stake sizes (3 orchestrators currently hold 40% of the total LPT staked on their nodes), 30% of jobs were distributed randomly to reduce the dominance of these mega nodes.
Unfortunately, this introduced a perverse effect: some orchestrators spun up many small nodes to optimize their revenue through that random 30%.

Then the 30% was removed in favor of adding price-per-pixel competition to the algorithm. Unfortunately, this has mainly pulled prices down dramatically, with most nodes trying to compensate for the linear difference in stake through price. Nor has it eliminated the domination of mega nodes: one mega node, for example, currently sets a price of 450, or even 250 PPP, in order, legitimately, to optimize its revenue, and will lower it further if necessary, leaving very little room for maneuver for the other nodes, which are stuck somewhere between 300 and 0 wei.

Proposed solutions:

I submit several proposals for your consideration, and welcome yours.

1 - Reduce the weight of the stake in the algorithm when the gap is very high, using a logarithmic function. This would reduce the weight of very large stakes relative to the others, but when I discussed it with Doug, he feared it would encourage large orchestrators to split their stakes to optimize their gains.
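As a rough sketch of what such a log dampening could look like (the function and numbers below are purely illustrative; this is not the actual broadcaster code):

```go
package main

import (
	"fmt"
	"math"
)

// stakeWeight compresses large stakes with a logarithm so the gap
// between a mega node and a small node shrinks in the selection
// weights. Illustrative only; not the real broadcaster code.
func stakeWeight(stake float64) float64 {
	if stake <= 0 {
		return 0
	}
	return math.Log1p(stake)
}

func main() {
	// A 100x difference in stake shrinks to roughly a 1.5x
	// difference in weight.
	fmt.Printf("small: %.1f  mega: %.1f\n",
		stakeWeight(10_000), stakeWeight(1_000_000))
}
```

This also illustrates Doug's concern: under a concave curve, two nodes of 500,000 together weigh more than one node of 1,000,000, so splitting pays.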

2 - Set a minimum price in the algorithm. We could decide on a minimum price below which we would consider transcoding to be no longer profitable (we are only talking here about transcoding revenues, not LPT inflation). This price could be defined by vote, for example by the orchestrators, and would automatically be added to their PPP, without being able to exceed the maximum price set by the broadcaster. For example, with a minimum price of 300 and an orchestrator's price of:
50: 50 + 300 = 350
300: 300 + 300 = 600
1000: 1000 + 300 -> capped at the 1200 maximum.
Of course, a mega node could always set a low ceiling, but unless it decides to set the lowest price outright, which would defeat its own purpose, this guarantees a fair minimum price for everyone while still allowing the real competition we want.
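A minimal sketch of that price floor, assuming hypothetical names (nothing like `effectivePPP` exists in the Livepeer code today):

```go
package main

import "fmt"

// effectivePPP adds a community-voted minimum price to an
// orchestrator's own PPP, capped at the broadcaster's maximum.
// Hypothetical helper, for illustration only.
func effectivePPP(orchPPP, minPPP, broadcasterMax int64) int64 {
	p := orchPPP + minPPP
	if p > broadcasterMax {
		return broadcasterMax
	}
	return p
}

func main() {
	const minPPP, broadcasterMax = 300, 1200
	for _, ppp := range []int64{50, 300, 1000} {
		fmt.Printf("%d -> %d\n", ppp, effectivePPP(ppp, minPPP, broadcasterMax))
	}
	// Prints:
	// 50 -> 350
	// 300 -> 600
	// 1000 -> 1200
}
```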

3 - Set a price for the entire network. We could vote for a fixed price for the entire Livepeer network. However, this is not in the spirit of Livepeer, which is meant to be a large open transcoding market. Moreover, it would not solve the problem of mega nodes' domination.

It is important to find a solution quickly so that orchestrators can keep maintaining their infrastructure (it is the ETH earnings that motivate orchestrators to deploy and constantly improve it), even if we can hope that, in the future, a massive influx of demand will naturally reduce the problem.


Not sure what the best solution would be, but I definitely agree that some measures have to be thought of to keep transcoding fees from dropping too low. It would be nice if performance were made a weight in selection rather than a cutoff, so that Orchestrators can find other ways to compete for work, or if Livepeer Inc decreased the weight on price.

In November (which still had partly inflated rewards, with the new maxTicketEV changes going in halfway through), only 8.5 ETH in fees were paid out, and just 11 Orchestrators earned more than 0.2 ETH (after commission). If even the bigger nodes with stake and wide geo coverage cannot reach such a low threshold, I worry that this is going to cause people to stop chasing ETH fees, or to follow in Karolak's footsteps and simply run more nodes.

We’re kinda stuck with a prisoner’s dilemma here, but if we want to see a short-term change, Orchestrators need to take a more principled stance and up their PPP. I know they are just trying to maximise their tickets received (short term), but most of the Orchestrators who I’ve heard complain about their ETH earnings evaporating are still running at sub-200 PPPs. Would be good to get some of them into this discussion to hear their opinions (maybe share a link in the Discord for comments?)


As mentioned in one of the previous watercoolers, that “one mega node” is currently testing the total received ticket value for various ppp :slight_smile:
So far, 450 gave me the highest ticket values and 250 the lowest - showing that the additional streams don’t make up for the decrease in ppp (obviously, this depends on multiple factors that I can’t control like e.g. the ppp of other Os and the workload of the network etc.). So expect my ppp to increase again in a few days. I’ll also test higher values than 450.

I think if other Os would also optimize their ticket value and not their stream count, we would see a much higher ppp median.

Apart from that, it’s a bit difficult to propose a solution if we don’t know the actual broadcaster algo to distribute streams. One idea would be to use an “inverted s-shape function”:


The parameters should be tweaked a bit, but the main idea is that you get diminishing returns for additional decreases in ppp towards the extremes. E.g. the benefits of going below 300 ppp are hardly noticeable, whereas going from 700 to 400 gives you almost linear benefits.
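One way to write such an inverted s-shape down, with made-up center and steepness parameters (the real values would need tuning, as noted above):

```go
package main

import (
	"fmt"
	"math"
)

// priceWeight maps a ppp to a selection weight via a logistic curve:
// cheap prices saturate near 1, expensive ones near 0, with a
// near-linear region around the center. The parameters 500 and 100
// used below are made up for illustration.
func priceWeight(ppp, center, steepness float64) float64 {
	return 1 / (1 + math.Exp((ppp-center)/steepness))
}

func main() {
	for _, ppp := range []float64{100, 300, 400, 700, 1000} {
		fmt.Printf("ppp=%4.0f weight=%.2f\n", ppp, priceWeight(ppp, 500, 100))
	}
}
```

With these numbers, dropping from 300 to 100 ppp barely moves the weight (0.88 to 0.98), while going from 700 to 400 moves it from 0.12 to 0.73 — the diminishing-returns behavior described above.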


cc @vires-in-numeris the exact selection algo can be found in the server/selection_algorithm.go file


What interests me is the definition of a mega node, no matter who is behind it.
I even think that we are lucky today that the only really active mega node is vires-in-numeris, since he is more interested in the Livepeer project than anything else.
But tomorrow, anyone or any entity could arrive on the network with a mega node.
Today their interest is clearly just LPT inflation, but if inflation were to approach 0 while demand on the network became significant, this could change.
It is even possible that, in the future, an entity competing with the Livepeer network decides to create a mega node just to push other orchestrators off the network and regain its monopoly.
That’s why I think it’s important to find a balance today that ensures that everyone benefits.
Setting a minimum price through the orchestrators' community could effectively protect against possible excesses by orchestrators who might decide to lower their prices below the minimum that guarantees a certain profitability.
Regarding the algorithm, I have conducted some tests using an inverted logarithmic function to reduce the incentive to lower the price too far. I still need to run adjustment tests to see the real benefit of this solution.
I think that for now, an "inverted s-shape function" algorithm, possibly combined with a minimum price set by the community, would be the best solution.


I have mixed feelings about this: in a decentralized open market like Livepeer, hand-holding in the form of price oversight should not exist. It's $$$ at the end of the day; the market will adjust over time or lose money.

The issue of O’s with big stakes fragmenting to take advantage of this is water flowing downhill, and a larger issue to solve on its own. Incentivizing smaller community-active O’s with more stake is one short term solution for this. And we are indeed lucky this is vires, and I agree with Franck on the seriousness of this issue.

I too have been playing with adjusting PPP by region and just realized that I'd set pricePerUnit 49 in the config, but I had also set pricePerBroadcaster:

```json
{"broadcasters":[{"ethaddress":"0xc3c7cxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx6e61e","priceperunit":899,"pixelsperunit":1},{"ethaddress":"0x68d6FF3938Ff63d2df16567Cb8CA9772e14496F7","priceperunit":0,"pixelsperunit":1}]}
```
Just wanted to point that out to O’s who may have been reviewing PPP distribution on Stronk’s site.


My 2c is that @vires-in-numeris is exactly right that Os should be approaching pricing as a business would: as a function of costs and revenue (including LPT rewards). The best price/performance/stake combination should win.

I see the appeal of cartel behavior but I want to highlight that it’s a losing game - all it takes is a big actor to undercut the cartel and it falls apart. (Note that this logic doesn’t hold if price restrictions are enforced at the protocol level, but I would advocate against introducing any protocol-enforced price controls.)

I strongly agree with @Strykar that the splitting-Os problem is the critical supply-side issue for the network to solve and would love to see LIPs to address it.


Hopefully the phenomenon of splitting Os will also be resolved by the arrival of demand on the network. Indeed, there is no point in splitting one's stake across multiple Os if demand is sufficient to use all the resources a single O has. The splitting of Os was mainly due to the 30% random distribution of jobs, and to how little of an O's system resources were being used for lack of demand.


So far what I’ve been seeing with the new selection algorithm is that:

  • Orchestrators with stake don't have to compete on price because they're getting enough streams by competing on stake
  • Orchestrators without stake have to aggressively lower their price down to the cheapest Orchestrator due to the exponential curve in price-weighted selection. It's difficult for Orchestrators to find the sweet spot where they are eligible because it drops off so quickly

I think we can mitigate these issues a bit with a few minor changes to the selection algorithm:

  • Rather than sa.PriceWeight*priceProb + sa.StakeWeight*stakeProb, we could multiply the probabilities. This way having stake does not guarantee work, but only helps you out. If we go this route we probably want to increase the StakeWeight a bit or else stake might have too little effect on selection

  • Use an inverse sigmoid curve for selection like Vires brought up. Here is an example function to play around with. c is the center of the curve and should probably be set around 100 - 200

    Even when centering around 50 PPP, there is a bit more wiggle room in price

    Edit: also added the current price selection curve. Increasing the factor will have a similar effect as using the sigmoid curve, but I like the idea of having a curve which is centered around the cheapest node / bottom 5th percentile. You'll get a bit more of a dynamic response when the minimum price is a bit higher
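Putting the two suggested tweaks together in one sketch (all names and numbers here are illustrative; the real weights live in server/selection_algorithm.go):

```go
package main

import (
	"fmt"
	"math"
)

// priceProb is the inverse-sigmoid price curve, centered on c
// (e.g. the cheapest node or the bottom 5th percentile of prices).
func priceProb(ppp, c, steepness float64) float64 {
	return 1 / (1 + math.Exp((ppp-c)/steepness))
}

// selectionWeight multiplies the price and stake probabilities instead
// of summing weighted terms, so stake helps but no longer guarantees
// work. In a real selection loop these weights would be normalized
// over all orchestrators to form a probability distribution.
func selectionWeight(ppp, stake, totalStake, c float64) float64 {
	return priceProb(ppp, c, 50) * (stake / totalStake)
}

func main() {
	total := 1000.0
	// A mega node with 40% of the stake at 450 PPP vs. a small node
	// with 5% of the stake at 200 PPP, curve centered at 150.
	mega := selectionWeight(450, 400, total, 150)
	small := selectionWeight(200, 50, total, 150)
	fmt.Printf("mega=%.4f small=%.4f\n", mega, small)
}
```

Under these made-up numbers the small, cheaper node ends up with the larger weight, which a weighted sum with a big StakeWeight would not produce.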


I think it would be really interesting to test this approach.
Any solution that could reduce the gap that exists today between orchestrators and that would allow for a better distribution of streams at an acceptable price for everyone should be attempted.
