>The primary signals I use to judge the worth of a video are 1) its prominence in search results,
Search prominence is not as good a signal, because for many long-tail queries the topmost video in the search results has a bad dislike-to-like ratio.
>and 2) the video itself.
This is not an efficient use of the viewer's time, because the whole point of a quick statistic is to avoid watching the video in the first place.
>This is because dislike counts are in my experience extremely unreliable—they may be elevated for stupid and irrelevant reasons on a really great video,
Totally opposite experience. Across thousands of non-politics, non-music videos I've consumed (e.g. tutorials), a bad dislike ratio was the most reliable indicator YouTube had for trash videos.
To be clear, I'm talking about dislike ratios and not absolute counts. Maybe that's the source of the disagreement? E.g. if I see a video with hundreds or thousands of dislikes (absolute counts), it's not a problem unless the ratio is out of whack.
The problem with this discussion is that (presumably) neither of us will ever have the data to back up our position. We can only disagree based on how we ourselves use YouTube, and maybe how we've observed friends and family using it.
So what is the point? Unlike a lot of people I've discussed this with, I can at least respect your position as being based in some kind of believable and relatively well-developed use case. But nothing we've said here has even touched on the (alleged) negative externalities of visible downvote counts that (allegedly) motivated this change in the first place. And, again, we just don't have the data to understand the tradeoffs there.
I entered this discussion because I really, honestly don't understand why so many people are so angry about this. And, despite your efforts, I still don't. I have no trouble believing that these legitimately bad, long-tail, high downvote count videos exist, but I still see them as exactly what my original comment characterized them as: an edge case.
I guess I'm just looking for a bombshell that will make me understand, because that's how obvious the downvote count stans make this issue out to be. You aren't wrong just because you don't have such a bombshell, but you aren't going to convince me you're right just by telling me that the way I use YouTube is wrong.
>The problem with this discussion is that (presumably) neither of us will ever have the data to back up our position. [...] I have no trouble believing that these legitimately bad, long-tail, high downvote count videos exist, but I still see them as exactly what my original comment characterized them as: an edge case.
Because the public dislike counts are now gone, examples are hard to find. In any case, here's a screenshot of a popular video from The Verge that's not long-tail. The deep link, showing 1,200 dislikes vs. 839 likes: https://youtu.be/WuunLhXcUo4?t=15
That's an example of a high dislike ratio alerting the viewer to a trash video. Bad videos with correspondingly high dislikes are not an edge case on YouTube; I encounter them every day (e.g. product reviews, DIY how-tos, etc.).
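The distinction between absolute dislike counts and the dislike ratio can be made concrete with a bit of arithmetic. Below is a minimal sketch using the numbers from the linked Verge video; the function name and the 0.05 comparison are illustrative assumptions, not anything YouTube itself uses.

```python
def dislike_ratio(likes: int, dislikes: int) -> float:
    """Fraction of total reactions that are dislikes."""
    total = likes + dislikes
    return dislikes / total if total else 0.0

# The Verge example above: 1,200 dislikes vs. 839 likes.
ratio = dislike_ratio(likes=839, dislikes=1200)
print(f"{ratio:.2f}")  # ~0.59: a majority of reactions are dislikes

# Absolute counts alone can mislead: 2,000 dislikes on a video
# with 100,000 likes is a healthy ratio (~0.02), not a red flag.
print(f"{dislike_ratio(100_000, 2_000):.2f}")
```

This is why "hundreds or thousands of dislikes" on its own says little: the same absolute count can signal either a trash video or a wildly popular one, depending on the denominator.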
>, but you aren't going to convince me you're right just by telling me that the way I use YouTube is wrong.
If you're happy using YouTube without considering dislikes, we're not saying you're wrong. The point is that your dismissal of public dislikes is not relevant to how _others_ depend on them to avoid wasting time. The fact that famous YouTube creators also look at dislike counts to save time is very telling: even though they suffer from dislikes on their own channels, they still depend on dislike counts when consuming others' videos. Creators are also some of the most sophisticated viewers of YouTube content, and their use of dislike ratios matches how many people in this thread use the same metric to filter out bad videos.
This is all well and good, but—again—what you aren't addressing is the (claimed) negative effects of visible downvote counts, mainly (as I understand it) spurious downvote brigades against relatively small/vulnerable creators. These use cases you and others (claim to) care about so much must be weighed against the (claimed) damage done by continuing to support them, so it's not just a question of live and let live.
When downvote brigades are brought up, the first response is always something along the lines of "all engagement is good" and "downvotes aren't a negative signal in the algorithm", the thrust being that downvotes don't actually hurt creators and they're just being a bunch of crybabies. But it is, frankly, totally incoherent to claim this and then turn around and also claim that downvotes are an important signal for deciding which videos to not watch. I hope you can see the contradiction there.
I point this out, and the next claim they trot out is "downvote brigades are rare to the point of being insignificant". To which I say, well, prove it. Prove it, show me the data, and make the case to me that saving X man-hours of wasted time watching unworthy how-to videos is worth crushing Y creators' nascent careers.
They care because none of this seemed to matter until the White House, and then big media outlets that are tagged as authoritative despite getting a lot wrong, started being consistently ratioed.
People asked why disliking POTUS matters now, when it didn't before.
People asked why big media gets to fact-check everyone despite obvious, daily bias and error, and how hiding dislikes for those outlets helps anyone exactly.
People noticed some creators are widely disliked consistently, yet enjoy consistent promotion. They noticed money seems to matter.
Creators who generate many ratioed videos are different from those who don't, or from those with some mix of the two.
All of these were brought up in discussion, brought to me by normies who were basically unable to buy the "you do not need that info because..." line they were given.
And now that they are talking about it as a topic of interest, they are wondering about other things, like why Facebook keeps removing what they thought were private exchanges...
Your bombshell is watching people lose trust, and see less value in all of this than they did a year ago.
I don't think videos on politicized topics getting "ratioed" is a very good signal of anything in today's political climate. Many, many users are willing to downvote content featuring people and/or organizations they've been told are their enemies, without engaging with the content at all.
In fact, this sort of thing is IMO one of the best arguments for removing downvote counts.
Beyond that, I frankly don't think people should have trust in "platforms" telling them what to think, so if this change wakes up some of the "sheeple" I can only see that as a good thing. I certainly think that taking away a tool that allows people to trivially express their shallow hatred of a thing can only improve the state of discourse and critical thought in society.