• 0 Posts
  • 168 Comments
Joined 3 months ago
Cake day: January 20th, 2025




  • I don’t necessarily agree that this means blue sky is toxic (and I’m speaking as someone who doesn’t use the app), I see it as a toxic company finding out what people think of them.

    As you noted adobe is a dick (and that’s one hell of an understatement), and they regularly make anti-consumer choices with their software and pricing. This is just them seeing what you’ve been ignoring for a decade or more.

    Maybe the multi-billion-dollar company should subscribe to my monthly subscription to thicker skin.




  • spooky2092@lemmy.blahaj.zone to 196@lemmy.blahaj.zone · robot rule · edited 3 days ago

    Lol, I can’t even downvote you cuz my instance doesn’t support them.

    I’m genuinely curious, because it sounds like you’re suggesting that the models are moving past just being generative transformers into something more intelligent, and I just have not seen evidence of that. Only empty claims that it exists, and very weak examples of ‘novel responses’ that are still just generative transformer output.

    But sure, if you can’t support your point with solid evidence, passive-aggressive dismissal of skepticism works just as well. People are constantly fed a narrative that AI is amazing and can do all this novel shit, but I have yet to see anything to back it up.





  • spooky2092@lemmy.blahaj.zone to 196@lemmy.blahaj.zone · robot rule · 3 days ago

    The only time you’re wasting is time you could be improving yourself. By having the AI write for you, you’re choosing to not improve your writing/research/analytical skills, and hoping the dipshit bot that’s writing your essay doesn’t just make bullshit up out of whole cloth.

    I’m not saying not to use the AI to assist with the process, but IMO that should be more on the gathering sources side than the composition side.



  • Copying their post over (with minimal formatting, unfortunately) for anyone that doesn’t care to go to that site (and to make sure it doesn’t randomly disappear)

    r/self 5 mo. ago walkandtalkk You’re being targeted by disinformation networks that are vastly more effective than you realize. And they’re making you more hateful and depressed.

    (I wrote this post in March and posted it on r/GenZ. However, a few people messaged me to say that the r/GenZ moderators took it down last week, though I’m not sure why. Given the flood of divisive, gender-war posts we’ve seen in the past five days, and several countries’ demonstrated use of gender-war propaganda to fuel political division, I felt it was important to repost this. This post was written for a U.S. audience, but the implications are increasingly global.)

    TL;DR: You know that Russia and other governments try to manipulate people online. But you almost certainly don’t know just how effectively orchestrated influence networks are using social media platforms to make you – individually – angry, depressed, and hateful toward each other. Those networks’ goal is simple: to cause Americans and other Westerners – especially young ones – to give up on social cohesion and to give up on learning the truth, so that Western countries lack the will to stand up to authoritarians and extremists.

    And you probably don’t realize how well it’s working on you.

    This is a long post, but I wrote it because this problem is real, and it’s much scarier than you think.

    How Russian networks fuel racial and gender wars to make Americans fight one another

    In September 2018, a video went viral after being posted by In the Now, a social media news channel. It featured a feminist activist pouring bleach on a male subway passenger for manspreading. It got instant attention, with millions of views and wide social media outrage. Reddit users wrote that it had turned them against feminism.

    There was one problem: The video was staged. And In the Now, which publicized it, is a subsidiary of RT, formerly Russia Today, the Kremlin TV channel aimed at foreign, English-speaking audiences.

    As an MIT study found in 2019, Russia’s online influence networks reached 140 million Americans every month – the majority of U.S. social media users.

    Russia began using troll farms a decade ago to incite gender and racial divisions in the United States

    In 2013, Yevgeny Prigozhin, a confidant of Vladimir Putin, founded the Internet Research Agency (the IRA) in St. Petersburg. It was the Russian government’s first coordinated facility to disrupt U.S. society and politics through social media.

    Here’s what Prigozhin had to say about the IRA’s efforts to disrupt the 2022 election:

    >Gentlemen, we interfered, we interfere and we will interfere. Carefully, precisely, surgically and in our own way, as we know how. During our pinpoint operations, we will remove both kidneys and the liver at once.
    

    In 2014, the IRA and other Russian networks began establishing fake U.S. activist groups on social media. By 2015, hundreds of English-speaking young Russians worked at the IRA. Their assignment was to use those false social-media accounts, especially on Facebook and Twitter – but also on Reddit, Tumblr, 9gag, and other platforms – to aggressively spread conspiracy theories and mocking, ad hominem arguments that incite American users.

    In 2017, U.S. intelligence found that Blacktivist, a Facebook and Twitter group with more followers than the official Black Lives Matter movement, was operated by Russia. Blacktivist regularly attacked America as racist and urged Black users to reject major candidates. On November 2, 2016, just before the 2016 election, Blacktivist’s Twitter urged Black Americans: “Choose peace and vote for Jill Stein. Trust me, it’s not a wasted vote.”

    Russia plays both sides – on gender, race, and religion

    The brilliance of the Russian influence campaign is that it convinces Americans to attack each other, worsening both misandry and misogyny, mutual racial hatred, and extreme antisemitism and Islamophobia. In short, it’s not just an effort to boost the right wing; it’s an effort to radicalize everybody.

    Russia uses its trolling networks to aggressively attack men. According to MIT, in 2019, the most popular Black-oriented Facebook page was the charmingly named “My Baby Daddy Aint Shit.” It regularly posts memes attacking Black men and government welfare workers. It serves two purposes: Make poor black women hate men, and goad black men into flame wars.

    MIT found that My Baby Daddy is run by a large troll network in Eastern Europe likely financed by Russia.

    But Russian influence networks are also aggressively misogynistic and aggressively anti-LGBT.

    On January 23, 2017, just after the first Women’s March, the New York Times found that the Internet Research Agency began a coordinated attack on the movement. Per the Times:

    >More than 4,000 miles away, organizations linked to the Russian government had assigned teams to the Women’s March. At desks in bland offices in St. Petersburg, using models derived from advertising and public relations, copywriters were testing out social media messages critical of the Women’s March movement, adopting the personas of fictional Americans.
    
    >They posted as Black women critical of white feminism, conservative women who felt excluded, and men who mocked participants as hairy-legged whiners.
    

    But the Russian PR teams realized that one attack worked better than the rest: They accused the movement’s co-founder, Arab American Linda Sarsour, of being an antisemite. Over the next 18 months, at least 152 Russian accounts regularly attacked Sarsour. That may not seem like many accounts, but it worked: They drove the Women’s March movement into disarray and eventually crippled the organization.

    Russia doesn’t need a million accounts, or even that many likes or upvotes. It just needs to get enough attention that actual Western users begin amplifying its content.

    A former federal prosecutor who investigated the Russian disinformation effort summarized it like this:

    >It wasn’t exclusively about Trump and Clinton anymore.  It was deeper and more sinister and more diffuse in its focus on exploiting divisions within society on any number of different levels.
    

    As the New York Times reported in 2022,

    >There was a routine: Arriving for a shift, [Russian disinformation] workers would scan news outlets on the ideological fringes, far left and far right, mining for extreme content that they could publish and amplify on the platforms, feeding extreme views into mainstream conversations.
    

  • One of those groups is a marginalized group with no power, and the other is literally the embodiment of state violence.

    Police *should* be held to a different standard than immigrants or literally any other civilian.

    >The double standard is so funny and sad to me

    Agreed, but for entirely opposite reasons.





  • You start with mass murderers, proceed with torturers, sadists, animal abusers, child abusers, rapists, and from abjection to abjection you end up justifying to yourself not forgiving your neighbour for letting his dog poop in your yard.

    “But muh slippery slope!”

    >“Never forgive, never forget” sounds cool, but from a spiritual standpoint it’s not much less dehumanizing than horrors such as mass murders

    Sorry, I can’t respond to the reductio ad absurdum of claiming that withholding forgiveness from mass murderers is in any way close to the mass murder of people.

    I get the point you’re trying to make, but if you can find any moral similarity between those two situations, then clearly we fundamentally disagree on even the most basic concepts of morality.