• ClamDrinker@lemmy.world
    6 months ago

    If you’re here because of the AI headline, this is important to read.

    We’re looking at how we can use local, on-device AI models – i.e., more private – to enhance your browsing experience further. One feature we’re starting with next quarter is AI-generated alt-text for images inserted into PDFs, which makes it more accessible to visually impaired users and people with learning disabilities.

    They are implementing AI how it should be. Don’t let all the shitty companies blind you to the fact that what we call AI has positive sides.

    • UnderpantsWeevil@lemmy.world
      6 months ago

      They are implementing AI how it should be.

      The term is so overused and abused that I’m not clear what they’re even promising. Are they localizing an LLM? Are they providing some kind of very fancy macroing? Are they linking up with ChatGPT somehow or integrating with Co-pilot? There’s no way to tell from the verbiage.

      And that’s not even really Mozilla’s fault. It’s just that the term AI can mean anything from “overhyped javascript” to “multi-billion dollar datacenter full of fake Scarlett Johansson voice patterns”.

      • chrash0@lemmy.world
        6 months ago

        there are language models that are quite feasible to run locally for easier tasks like this. “local” rules out both ChatGPT and Co-pilot, since those models are enormous. AI generally means machine-learned neural networks these days, even if a pile of if-else statements used to pass for it in the past.

        not sure how they’re going to handle low-resource machines, but as far as AI integrations go this one is rather tame
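The flow being described here — a small local model gated behind an explicit opt-in — can be sketched roughly like this. All names are hypothetical and the model call is stubbed out; this is not Mozilla's actual implementation, just the shape of an on-device, opt-in alt-text pipeline:

```python
from typing import Optional

def caption_model(image_bytes: bytes) -> str:
    # Placeholder for a small local image-captioning model. A real version
    # would run inference entirely on-device, so the image never leaves
    # the user's machine.
    return "a grey cat sleeping on a windowsill"

def generate_alt_text(image_bytes: bytes, opt_in: bool,
                      max_len: int = 125) -> Optional[str]:
    """Return alt-text for an image, or None if the user hasn't opted in."""
    if not opt_in:
        return None  # feature stays inert unless explicitly enabled
    caption = caption_model(image_bytes)
    return caption[:max_len]  # screen readers favor short descriptions
```

The opt-in check coming before any model work is the point: on a low-resource machine that never enables the feature, no model is loaded and no inference runs.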

        • UnderpantsWeevil@lemmy.world
          6 months ago

          AI generally means machine learned neural networks these days

          Right, but a neural network traditionally rules out using a single local machine. Hell, we have entire chip architectures that revolve around neural net optimization. I can’t imagine needing that kind of configuration for my internet browser.

          not sure how they’re going to handle low-resource machines

          One of the perks of Firefox is its relative thinness. Chrome was a shameless resource hog even in its best days, and IE wasn’t any better. Do I really want Firefox chewing hundreds of MB of memory so it can… what? Simulate a 600 processor cluster doing weird finger art?

          • chrash0@lemmy.world
            6 months ago

            i mean, i’ve worked in neural networks for embedded systems, and it’s definitely possible. i share your skepticism about overhead, but i’ll eat my shoes if it isn’t opt-in