OP’s take is not reasonable imo - if you think threats are harmful enough to prosecute, they should also be harmful enough to censor.
Maybe a softer form of censorship, such as hiding them behind a CW and a “user was vanned for this post” label rather than outright removal, but you can’t just do nothing.
Prosecution implies a trial before punishment. Censorship is immediate punishment based solely on the judgment of the authorities. That’s not a minor difference.
Exactly. If a judge states that an individual is no longer allowed on SM, then I absolutely understand banning the account and removing their posts. However, until justice has been served, it’s 100% the platform’s call, and I think platforms should err on the side of allowing speech.
I realize I’m jumping back and forth between sides here, but that’s because it’s a complex problem and I haven’t made my mind up. But that said, to return to the previous point…if you need a court order to ban every spammer and troll, you’ll drown in spam and propaganda. The legal system can’t keep up.
I’m not saying companies should need a court order to remove content, only that they should be obligated to remove it only by court order, and ideally they’d lean toward keeping content rather than removing it. I think it’s generally better for platforms to enable users to hide content they don’t want to see instead of outright removing it for everyone. One person’s independent journalism is another person’s propaganda, and I generally don’t trust big tech companies with agendas to decide between the two.