> Google has not yet discovered how to automate "is this a quality link?"
Google has so much more data than just the keywords and searches people make; it seems like this should be a problem they could solve.
Through tracking cookies (e.g. Google Analytics), they should be able to follow a single user's session from start to finish, and they should also be able to 'rank' users in some rough way, learning which users very rarely fall for ads or spend time on sites that Google knows are BS. Sites that show up on page 5 or 6 of the search results but still get far more attention from those users than the sites on the first few pages could then be ranked higher.
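To make that concrete, here's a toy sketch of the kind of heuristic I mean (plain Python; every field name, data shape, and threshold is invented for illustration, and it obviously has nothing to do with how Google actually scores anything): weight each user's dwell time by how rarely their sessions end on ads or known-BS sites, then surface pages buried past page 3 that still draw outsized attention from those users.

```python
from collections import defaultdict

# Toy illustration only: all data structures and numbers below are made up.

def user_trust(sessions):
    """Fraction of a user's sessions that did NOT end on an ad click or a known-BS site."""
    if not sessions:
        return 0.0
    bad = sum(1 for s in sessions if s["ended_on_ad"] or s["ended_on_bs_site"])
    return 1.0 - bad / len(sessions)

def promotion_candidates(results, sessions_by_user, position_of, attention_floor=5.0):
    """Pages currently ranked beyond page 3 (position > 30) whose trust-weighted
    dwell time (in minutes) exceeds attention_floor, sorted best-first."""
    score = defaultdict(float)
    for sessions in sessions_by_user.values():
        w = user_trust(sessions)
        for s in sessions:
            for page, dwell_seconds in s["visits"]:
                score[page] += w * dwell_seconds / 60.0  # trust-weighted minutes
    deep_but_popular = [
        (page, round(score[page], 1))
        for page in results
        if position_of[page] > 30 and score[page] >= attention_floor
    ]
    return sorted(deep_but_popular, key=lambda kv: kv[1], reverse=True)
```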
But I don't think many of Google's problems these days are technical in nature. They're caused by the MBAs now having more power at Google than the techies, which means increasing revenue matters more than accuracy.
Theoretically, they could do a lot of things, but plenty of those would get them in hot water from a regulatory standpoint.
Also, don't underestimate the adversaries. Ranking well on Google means earning a lot of money. So much so that I'd argue the SEO people are making significantly more money than Google loses by having spammy SERPs. They will happily throw money at the problem and work around the filters. I don't think you can really select for quality by statistical measures. Google tried, and ended up throwing massive "trust" at traditional media companies and "brands". The SEO people responded by simply paying the media companies to host their content, and now they rank in the top 3, pay less than they did when buying links, and never get penalized.
Nope, all they could do there would be to group people together based on their "search behavior graph". The problem of finding BS sites is in itself a "shirt without stripes"-level hard problem. That's why being able to rely on user curation is (was?) so important for Google. People didn't curate the internet for money at first; they did it because they were interested in the subject their web ring covered.