
I think Section 230 needs updating, badly.

I personally find value in entities like Hacker News, Twitter, StackOverflow, etc., that curate their content. I do, of course, want transparency there.

I also value utterly unfiltered sources. But I do not need or want every source to be unfiltered. Then everything becomes a firehose of utter garbage.

I think there is room for both. But it just makes zero sense to me to essentially force primarily user-driven platforms like Twitter to amplify/aggregate content from literally everybody. If they don't want to amplify the NYPost's stories, that should be their choice. And then I can choose not to read Twitter if I don't like that. I completely fail to see the problem.

    I agree with you that it's just another media outlet
    with a slant, but the government needs to start 
    treating them like it. 
What, specifically, do you want done here? Should it apply to Hacker News as well -- should HN no longer be able to curate its content?


Just to clear the air, I am not a fan of the Post or the particular content that earned it its recent block. I agree that it's Twitter's platform, and they should be able to curate it as they see fit.

Here's how I see it: if I wrote a letter to the editor of a popular US newspaper calling for an ethnic pogrom, they published it, and it incited riots or violence, that newspaper would get the tar sued out of it. They are responsible for the user-generated content they choose to publish. If instead I used my phone to start a phone tree to rouse people to ethnic violence, only the people making the calls would be responsible. The phone company just lets me place calls to whoever will receive them and transmits my messages.

Where does Facebook fall? Currently, if they allow people to start groups and publish materials calling for ethnic genocide and it leads to violence, they bear no responsibility. Sure, there's a continuum between phone and newspaper, but currently they are having their cake and eating it too. What should the law look like?

Free speech doesn't mean you are not responsible for the things you say (or re-transmit). If they want to be non-neutral media outlets, Facebook, Twitter, et al. should be liable for the damages caused when their platforms are involved in incitement of violence. If you're attacked by white supremacist counter-protestors, and it turns out they organized or promoted their activities on the platform, the platform should be partially responsible and should be a co-defendant in court.

Personally, I would prefer them to remain neutral platforms and focus their efforts on stronger abuse prevention that makes it harder for extremists to "amplify" their message into the feeds of people who don't want it, instead of trying to play whack-a-mole with offensive content. Give users tools to filter and block more easily, or, even better, to selectively subscribe to the content they do want. I guess that would probably mean reduced engagement, though.


> if I wrote a letter to the editor of a popular US newspaper calling for an ethnic pogrom, they published it, and it incited riots or violence, that newspaper would get the tar sued out of it.

This is simply not true.


I’d be very interested in learning more about this, then. Is the only thing keeping full-page ads calling for people to kill BLM protesters out of the papers the papers’ own sense of decency?



