• yesman@lemmy.world · 26 days ago

    I just want to remind everyone that capital won’t wait until AI is “as good” as humans, just when it’s minimally viable.

    They didn’t wait for self-checkout to be as good as a cashier; they didn’t wait for chatbots to be as good as human support; and they won’t wait for AI to be as good as programmers.

    • SlopppyEngineer@lemmy.world · 26 days ago

      And because of all the theft and malfunctions, the nearby supermarkets replaced the self-checkouts with normal cashiers again.

      If it’s AI doing all the work, the responsibility falls on the remaining humans. There’ll be interesting lawsuits when the inevitable bug appears that the AI itself can’t figure out.

      • atrielienz@lemmy.world · 26 days ago

        We saw this happen with Amazon’s cashier-less stores. They were actively trying to use a computer-based AI system, but it didn’t work without thousands of man-hours from real humans, which is why those stores are going away. Companies will try this repeatedly until they get something that works or they run out of money. The problem is, some companies have cash to burn.

        I doubt the vast majority of tech workers will be replaced by AI any time soon. But they’ll probably keep trying because they really really don’t want to pay human beings a liveable wage.

    • AmbiguousProps@lemmy.today · edited · 26 days ago

      They won’t, and they’ll suffer because of it and want to immediately hire back programmers (who can actually do problem solving for difficult issues). We’ve already seen this happen with customer service reps - some companies have resumed hiring customer service reps because they realized AI isn’t able to do their jobs.

      • SlopppyEngineer@lemmy.world · 26 days ago

        They’ll try the opposite. It’s what the movie producers tried with the writers: they gave them AI-generated junk and told them to fix it. Fixing it was basically rewriting the whole thing, but because it was now “just touching up an existing script”, it was half price.

        • xtr0n@sh.itjust.works · 25 days ago

          They can try. But cleaning up a mess takes a while, and there’s no magic wand to make it go faster.

        • Eager Eagle@lemmy.world · 26 days ago

          Yeah, they’ll try. Surely that can’t cascade into a snowball of issues. Good luck to them 😎

          • SlopppyEngineer@lemmy.world · 26 days ago

            A strike by tech workers would be something else. Curious what would happen if the ones maintaining the servers for entertainment, the stock market, or factories just walked out. On the other hand, tech doesn’t have unions.

      • peopleproblems@lemmy.world · 26 days ago

        You better fucking believe it.

        AIs are going to be the new outsourcing, only cheaper than outsourcing and probably less confusing for us to fix.

  • Feyd@programming.dev · 25 days ago

    Meanwhile, LLMs are less useful at helping me write code than IntelliJ was a decade ago.

    • tzrlk@lemmy.world · 25 days ago

      I’m actually really impressed with the autocomplete IntelliJ is packaged with now. It’s really good with Go (probably because Go code has a ton of duplication).

  • Vilian@lemmy.ca · 25 days ago

    The first thing AI is going to replace is the CEO; it’s a dumbass job. A McDonald’s employee requires more expertise.

  • irotsoma@lemmy.world · 25 days ago

    And anyone who believes that should be fired, because they don’t understand the technology at all or what is involved in programming for that matter. At the very least it should make everyone question the company if its leadership doesn’t understand their own product.

  • Aceticon@lemmy.world · edited · 25 days ago

    Well, that would be the 3rd or 4th thing during my career that was supposed to make my job a thing of the past or at least severely reduce the need for it.

    (If I remember correctly, OO design was supposed to reduce the need for programmers, as were various languages; then there was outsourcing, then visual programming, and on the server side I vaguely remember various frameworks being hailed as reducing the need for programmers because people would just be able to wire modules together with config or some such. Additionally, many libraries and frameworks out there aim to reduce the need for coding.)

    All of them, even outsourcing, have made my skills even more in demand. Even when they did reduce the amount of programming needed without just moving it elsewhere (a requirement most of them already failed), the market for software responded by expecting software to do more things, in fancier ways, with data from more places, effectively wiping out the coding-time savings and then some.

    Granted, junior developers sometimes did suffer because of those things, but anything more complicated than monkey-coder tasks has never been successfully replaced, fully outsourced or the need for it removed, at least not without either the needs popping up somewhere else or the expected feature set of software increasing to take up the slack.

    In fact I expect that AI, like outsourcing before it, will in a decade or so have really screwed the market for senior software engineers from the employers’ point of view (though it will be a golden age for employees with those skills), by removing the first part of the career pathway that produces that level of experience. And this time around they won’t even be able to import the guys and gals from India who got to learn the job because the junior positions were outsourced there.

    • jdeath@lemm.ee · 25 days ago

      I didn’t start a tech career after high school because all the career advice I got was “all the jobs are going to India.” I could’ve had 10 more years’ experience, but instead I joined the military. Ugh!

  • letsgo@lemm.ee · 26 days ago

    Don’t worry guys. As long as project managers think “do the thing … like the thing … (waves hands around) … you know … (waves hands around some more) … like the other thing … but, um, …, different” constitutes a detailed spec, we’re safe.

  • masterspace@lemmy.ca · edited · 26 days ago

    I worked at a different MAANG company and saw internal slides showing that they planned on being able to replace junior devs with AI by 2025. I don’t think it’s going according to plan.

    At the end of the day, one thing people forget about with these things is that even once you hit a point where an AI is capable of writing a full piece of software, a lot of businesses will still pay money to have a human read through, validate it, and take ownership of it if something goes wrong. A lot of engineering is not just building something for a customer, but taking ownership of it and providing something they can trust.

    I don’t doubt that eventually AI will do almost all software writing, but the field of software companies isn’t about to be replaced by non software people just blindly trusting an AI to do it right (and in legally compliant ways), anytime soon.

  • Voroxpete@sh.itjust.works · 26 days ago

    I taught myself Python in part by using ChatGPT. Which is to say, I coaxed it through the process of building my first app, while studying from various resources, and using the process of correcting its many mistakes as a way of guiding my studies. And I was only able to do this because I already had a decent grasp of many of the basics of coding. It was honestly an interesting learning approach; looking at bad code and figuring out why it’s bad really helps you to get those little “Aha” moments that make programming fun. But at the end of the day it only serves as a learning tool because it’s an engine for generating incompetent results.

    ChatGPT, as a tool for creating software, absolutely sucks. It produces garbage code, and when it fails to produce something usable you need a strong understanding of what it’s doing to figure out where it went wrong. An experienced Python dev could have built in a day what took me and ChatGPT a couple of weeks. My excuse is that I was learning Python from scratch, and had never used an object oriented language before. It has no excuse.
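
    As an illustration of the kind of instructive mistake described above (a hypothetical example, not the commenter’s actual code), here is a classic bug LLMs frequently reproduce: a mutable default argument, which looks correct until state leaks between calls.

```python
# Buggy pattern: the default list is created once at function definition,
# so every call that omits `tasks` shares and mutates the same list.
def add_task_buggy(task, tasks=[]):
    tasks.append(task)
    return tasks

# Fixed version: use None as a sentinel and build a fresh list per call.
def add_task(task, tasks=None):
    if tasks is None:
        tasks = []
    tasks.append(task)
    return tasks

print(add_task_buggy("a"))  # ['a']
print(add_task_buggy("b"))  # ['a', 'b'] -- state leaked from the first call
print(add_task("a"))        # ['a']
print(add_task("b"))        # ['b']
```

    Spotting why the first version misbehaves is exactly the kind of “aha” moment the comment describes.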

    • AWittyUsername@lemmy.world · 25 days ago

      ChatGPT only gives good answers if you ask the right questions, and to do that you have to be better than a novice. It’s great as a rubber ducky that answers back, but its usefulness is a reflection of the user.

  • VantaBrandon@lemmy.world · 26 days ago

    It’s not like jobs will disappear in a single day. Incremental improvements will render lower-level tasks obsolete; they already have to a degree.

    Someone will still need to translate the business objectives into logical structure, via code, language, or whatever medium. Whether you call that a “coder” or not, is kind of irrelevant. The nerdy introverts will need to translate sales-douche into computer one way or another. Sales-douches are not going to be building enterprise apps from their techbro-hypespeak.

  • jubilationtcornpone@sh.itjust.works · 26 days ago

    Just the other day, the Mixtral chatbot insisted that PostgreSQL v16 doesn’t exist.

    A few weeks ago, ChatGPT gave me a DAX measure for an Excel pivot table that used several DAX functions in ways they cannot be used.

    The funny thing was, it knew and could explain why those functions couldn’t be used when I corrected it. But it wasn’t able to correlate and use that information to generate a proper function. In fact, I had to correct it for the same mistakes multiple times and it never did get it quite right.

    Generative AI is very good at confidently spitting out inaccurate information in ways that make it sound like it knows what it’s talking about to the average person.

    Basically, AI is currently functioning at the same level as the average tech CEO.

  • iAvicenna@lemmy.world · edited · 26 days ago

    That is what happens when you mix a fucking CEO with tech. The question becomes “How many workers can I fire to make more money and boast about my achievements at the annual conference of mega-yacht owners?”, whereas the correct question should obviously always have been (unless you are a psychopath) “How can I use this tech to boost my workers’ productivity, so they can produce the same amount of work in less time and have more personal time for themselves?”

    Also, these idiots always forget the “problem solving” part of most programming tasks, which is still beyond the capability of LLMs. Sure, have LLMs do the mundane stuff so that programmers can spend time on work that is more rewarding? No, instead let’s try to fire everyone.

  • SparrowRanjitScaur@lemmy.world · edited · 26 days ago

    Extremely misleading title. He didn’t say programmers would be a thing of the past, he said they’ll be doing higher level design and not writing code.

    • cheddar@programming.dev · 26 days ago

      So they would be doing engineering and not programming? To me that sounds like programmers would be a thing of the past.

    • realharo@lemm.ee · edited · 26 days ago

      Sounds like he’s just repeating a common meme. I don’t see anything about higher level design that would make it more difficult for an AI (hypothetical future AI, not the stuff that’s available now) compared to lower level tasks.

    • Tyfud@lemmy.world · 26 days ago

      Even so, he’s wrong. This is the kind of stupid thing someone without any first hand experience programming would say.

      • rottingleaf@lemmy.world · 26 days ago

        Yeah, there are people who can “in general” imagine how this will happen, but programming is exactly 99% not about “in general”; it’s about specific “dumb” conflicts with objective reality.

        People think that what they generally imagine as the task is the most important part, and since they don’t actually do programming, or anything else that requires dealing with those small details, they just plainly ignore them, because those conversations and opinions exist in a subjectively bendable reality.

        But objective reality doesn’t bend. Their general ideas, without every little bloody detail, simply won’t work.

        • Tyfud@lemmy.world · 26 days ago

          They’re falling for a hype train then.

          I work in the industry. With several thousand of my peers every day that also code. I lead a team of extremely talented, tenured engineers across the company to take on some of the most difficult challenges it can offer us. I’ve been coding and working in tech for over 25 years.

          The people who say this are people who either do not understand how AI (LLMs in this case) work, or do not understand programming, or are easily plied by the hype train.

          We’re so far off from this existing with the current tech, that it’s not worth seriously discussing.

          There are scripts, snippets of code, that VS Code’s LLM or VS2022’s LLM plugin can help with or bring up. But 9 times out of 10 there are multiple bugs in them.

          If you’re doing anything semi-complex it’s a crapshoot if it gets close at all.

          It’s not bad for generating pseudo-code or templates, but it’s designed to generate code that looks right, not code that is right; and there’s a huge difference.

          AI-generated code is exceedingly buggy, and if you don’t understand what it’s trying to do, it’s impossible to debug, because what it generates is trash-tier code quality.

          The tech may get there eventually, but there’s no way I trust it, or anyone I work with trusts it, or considers it a serious threat or even resource beyond the novelty.

          It’s useful for non-engineers to get an idea of what they’re trying to do, but it can just as easily send them down a bad path.

          • rottingleaf@lemmy.world · 26 days ago

            People use visual environments to draw systems and then generate code for specific controllers, that’s in control systems design and such.

            In that sense there are already situations where they don’t write code directly.

            But this has nothing to do with LLMs.

            For just designing systems, in some places visual environments with blocks might be more optimal.

            • Miaou@jlai.lu · 26 days ago

              And often you still have actual developers reimplementing this shit, because EE majors don’t understand that dereferencing null pointers is bad.

      • SparrowRanjitScaur@lemmy.world · 26 days ago

        Not really; it’s doable with ChatGPT right now for programs that have a relatively small scope. If you set very clear requirements and decompose the problem well, it can generate fairly high-quality solutions.
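
        A sketch of what “small scope, clear requirements” can look like in practice (the task and function name here are hypothetical, purely for illustration): the spec is small and testable, so generated code is easy to verify or reject.

```python
# Hypothetical well-decomposed request: "Given a list of (name, amount)
# pairs, return a dict of totals per name, ignoring non-positive amounts."
from collections import defaultdict

def totals_per_name(pairs):
    totals = defaultdict(float)
    for name, amount in pairs:
        if amount > 0:  # spec: skip non-positive amounts
            totals[name] += amount
    return dict(totals)

print(totals_per_name([("a", 2.0), ("b", 1.5), ("a", -1.0), ("a", 3.0)]))
# {'a': 5.0, 'b': 1.5}
```

        With requirements this explicit, checking the output against the spec takes minutes; that is where the approach works, and where larger, vaguer tasks fall apart.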

        • Tyfud@lemmy.world · 26 days ago

          This is incorrect, and I’m in the industry, in this specific field. Nobody in my industry, in my field, at my level seriously considers this effective enough to replace their day-to-day coding, beyond generating some boilerplate ELT/ETL-type scripts that it is semi-effective at. Even those contain multiple errors 9 times out of 10.

          I cannot be more clear: the people who are claiming this is possible are not tenured or effective coders, much less 10x devs in any capacity.

          People who think it generates code of high enough quality to be effective are hobbyists, people who dabble with coding, who understand some rudimentary coding patterns and practices, but are not career devs, or not serious career devs.

          If you don’t know what you’re doing, LLMs can get you close, some of the time. But there’s no way it generates anything close to quality enough code for me to use without the effort of rewriting, simplifying, and verifying.

          Why would I want to voluntarily spend my day trying to decipher someone else’s code? I don’t need ChatGPT to solve a coding problem; I can do it, and I will. My code will always be more readable to me than someone else’s, and this is true by orders of magnitude for AI code gen today.

          So I don’t consider anyone who thinks LLM code gen is a viable path forward to be a serious person in the engineering field.

          • SparrowRanjitScaur@lemmy.world · edited · 26 days ago

            It’s just a tool like any other. An experienced developer knows that you can’t apply every tool to every situation. Just like you should know the difference between threads and coroutines and know when to apply them. Or know which design pattern is relevant to a given situation. It’s a tool, and a useful one if you know how to use it.

            • rottingleaf@lemmy.world · 26 days ago

              This is like applying a tambourine made of optical discs as a storage solution. A bit better cause punctured discs are no good.

              A full description of what a program does is the program itself, have you heard that? (except for UB, libraries, … , but an LLM is no better than a human in that too)

          • OmnislashIsACloudApp@lemmy.world · 26 days ago

          Right now, not a chance. It’s okay-ish at simple scripts, and alright as an assistant for getting a buggy draft of anything even vaguely complex.

          AI doing any actual programming is a long way off.