The dirty secret is that no-one picks that. There's no-one turning the knobs to adjust your news feeds, just like there's no-one managing your Gmail inbox who you can turn to for help. It's all machines. Algorithms automatically amplify what's most engaging, and flame wars are more engaging than calm discussions.
Not the algorithms OP is referring to. Those are black-box learning models which discover what to recommend based on what generates the most advertising revenue, in the case of Facebook, YouTube and Twitter.
The "particular values, agendas and biases" you're referring to are just advertising revenue, not some political bias installed by the developers. If you WANT to believe there are such developers working at those companies, I can't stop you. But the evidence isn't really there. Make some friends who work at those companies and see what you can find out from the people who actually build the tools you're accusing of political bias.
Now, if we are talking about political bias in these platforms' policies, you might have something.
Calling it an amplifying algorithm is an insult to the term: even a primitive sort-by-most-recent, or appending new posts to the bottom of a text file, would favor flame wars as "engagement". The rhetoric of sinister algorithms has been passed around a lot by propaganda "documentarians", but anyone with even a bit of knowledge knows that trying to cast a correlation algorithm as a master manipulator in a box, a singularity, is absurd. What's next? Putting the wind on trial for billions of sexual assault charges over all the skirts it blew up without consent?
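To make the point concrete: here's a minimal sketch of an "engagement" feed. The posts and reply counts are made up for illustration, with reply count standing in for whatever engagement signal a platform tracks. There's no manipulator in the box, just a one-line sort:

```python
# Hypothetical posts; reply count stands in for "engagement".
posts = [
    {"title": "Calm gardening tips", "replies": 4},
    {"title": "You are all WRONG about gardening", "replies": 312},
    {"title": "Weekly photo thread", "replies": 27},
]

# "The algorithm": sort by engagement, descending. That's it.
feed = sorted(posts, key=lambda p: p["replies"], reverse=True)

for post in feed:
    print(post["title"])  # the flame war floats to the top
```

Any ranking keyed to raw engagement will surface the flame war first, whether it's a learned model or a single `sorted()` call.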
The fact that machines are helping these people accomplish their goals does not mean that people don’t have goals and that they shouldn’t be liable for the consequences of accomplishing their goals.