• MystikIncarnate@lemmy.ca
    2 months ago

    I try to be understanding with my software brethren. We’re different sides of the same whole. Yin and yang, so to speak.

    That said, I’ve gotten some brain-dead requests from you developer types.

    I’m not saying all of you are the problem, but there’s definitely some of you that need to learn how things work.

    • bitchkat@lemmy.world
      2 months ago

      It goes both ways. At my old job, they took away local admin, but for some reason they configured Visual Studio to run as admin. So I just wrote a little program that opens a shell. Whenever I needed admin, I just ran that program from Visual Studio.
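A minimal sketch of that trick, assuming the parent process (the IDE configured to run as admin) is already elevated, since child processes inherit the parent’s token on Windows; the `open_shell` helper name and the `/bin/sh` fallback are illustrative:

```python
import os
import subprocess

def open_shell(command=None):
    """Spawn a shell; if the calling process is elevated, the child inherits it."""
    shell = os.environ.get("COMSPEC", "/bin/sh")  # cmd.exe on Windows
    flag = "/c" if shell.lower().endswith("cmd.exe") else "-c"
    if command is None:
        return subprocess.call([shell])  # drop into an interactive session
    return subprocess.run([shell, flag, command], capture_output=True, text=True)

print(open_shell("echo hello").stdout.strip())  # → hello
```

Launched from within the elevated IDE (e.g. as an external tool), the spawned shell runs with the same rights, which is exactly the loophole described.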

  • Randelung@lemmy.world
    2 months ago

    When I tried to request a firewall change, IT told me “ports between 1 and 1024 are reserved and can’t be used for anything else,” so I couldn’t be using one for a pure TCP connection; and besides, there would have to be a protocol on top of TCP, because just “TCP” as the protocol was obviously wrong. I was using port 20 because it was already open…
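For what it’s worth, “just TCP” is a perfectly valid thing to put on a firewall form. A hypothetical sketch of the kind of traffic a serial-to-TCP bridge pushes, on an unprivileged port: raw bytes over a socket, no application protocol on top.

```python
import socket
import threading

def echo_once(server):
    # Accept one connection and echo the raw bytes back: no application
    # protocol framing, just a TCP byte stream, like a serial bridge payload.
    conn, _ = server.accept()
    with conn:
        conn.sendall(conn.recv(1024))

server = socket.create_server(("127.0.0.1", 0))  # port 0 = pick any free port
port = server.getsockname()[1]
threading.Thread(target=echo_once, args=(server,)).start()

with socket.create_connection(("127.0.0.1", port)) as client:
    client.sendall(b"\x01\x02raw serial frame")
    print(client.recv(1024))  # → b'\x01\x02raw serial frame'
```

Nothing here is "obviously wrong"; the firewall only ever sees TCP segments either way.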

    • GreenKnight23@lemmy.world
      2 months ago

      as a full stack dev, everything you said has offended me.

      port 20 is used for FTP (the data channel). if you were actually using FTP, then go right ahead; but guessing from the fact that you didn’t know the protocol, you were not using FTP.

      port usage reservations are incredibly important to ensure that the system runs within spec and stays secure. imagine each interface as a party telephone line where the ports are time slots.

      your neighborhood has reserved specific times (ports) for everyone to call their relatives. if you use the phone outside your slot (port), your neighbors might get pissed off enough to interrupt your slot. and then it’s just chaos from there.
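The ranges behind that analogy can be made concrete. A small sketch using a handful of well-known IANA assignments; the `classify` helper is purely illustrative:

```python
# A few IANA well-known port assignments (0-1023); reusing one of these
# for an unrelated service is what gets flagged by scanners and IT alike.
WELL_KNOWN = {
    20: "ftp-data",
    21: "ftp",
    22: "ssh",
    25: "smtp",
    80: "http",
    443: "https",
}

def classify(port):
    if port in WELL_KNOWN:
        return f"well-known: {WELL_KNOWN[port]}"
    if port < 1024:
        return "reserved (0-1023), assigned by IANA"
    if port < 49152:
        return "registered range (1024-49151)"
    return "dynamic/ephemeral range (49152-65535)"

print(classify(20))    # → well-known: ftp-data
print(classify(4001))  # → registered range (1024-49151)
```

Note that a port like 4001 sits in the registered range, so nothing technically stops you from using it; the reservation is a convention, not an enforcement mechanism.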

      • MystikIncarnate@lemmy.ca
        2 months ago

        As IT/network/security, using a well-known port for something other than what’s supposed to run on that port is inviting all kinds of problems.

        Especially the very well-known ones, like FTP, SSH, SMTP, HTTP, HTTPS, etc. (to name a few). People make it their mission to find and exploit open FTP systems. I once opened up FTP to the internet on a system as kind of a honeypot, and within a week or so, someone was uploading data to it.

        No bueno. Don’t use a well-known port unless the thing that port is known for is what you want to do.

        • Randelung@lemmy.world
          2 months ago

          All of that is fine, and they mentioned the management perspective, which I get. It was a field test, and our original choice of 4001 (the port that other serial-to-TCP servers like ours use, including elsewhere in their network) was unavailable.

          What irks me is the “technical impossibility” of raw TCP and “I must be wrong” when filling out their firewall change form.

          They’ve since given us a different port “close to others that we use”, for whatever reason that matters, and based their choice on some list of common protocols outside the reserved range. But not 4001.

          That by itself is just one thing and I wouldn’t give it a second thought, but it’s all part of a larger picture of ineptitude. They opened a ticket because an arrow at the border of our UI vanished when they screen shared on Teams. Because of the red border. And they blamed our application for it.

          They didn’t set up their PKI correctly and opening our webpage on specific hosts gave the typical “go back” warning. But it was our fault somehow, even though the certificate was the one they supplied us and it was valid.

          • Trainguyrom@reddthat.com
            2 months ago

            What irks me is the “technical impossibility” of raw TCP and “I must be wrong” when filling out their firewall change form.

            Most commonly, a port is opened to accept traffic of a specific protocol that runs on top of TCP or UDP. I’m guessing the individual who responded might not be very good at technical communication and was just trying to ask “are you sure it’s raw TCP and not just HTTP traffic?” in order to keep the holes poked in the firewall as narrow and specific as possible.

            They’ve since given us a different port “close to others that we use”, for whatever reason that matters, and based their choice on some list of common protocols outside the reserved range. But not 4001.

            Usually if infrastructure assigns a port other than the default, it’s because that port is already in use. The actual port number you use doesn’t matter as long as it’s not a common default (which basically all ports below 1024 are).

            Using ports that are close together for similar purposes can aid memorability if that’s a need, but ultimately it doesn’t matter much as long as they don’t conflict with common defaults.

            They opened a ticket because an arrow at the border of our UI vanished when they screen shared on Teams. Because of the red border. And they blamed our application for it.

            Probably a user was complaining and needed action immediately, and they didn’t have time to test a cosmetic issue in an edge case. For minor issues I’ll open a ticket with the party I think might be responsible just to get it out of the way so I can get to higher-priority stuff, and I’ll rely on that party to let me know if it’s not actually their problem. Heck, it might even simply be that the IT person assumed it was a misrouted ticket, since users open tickets in random queues all the time.

            They didn’t set up their PKI correctly and opening our webpage on specific hosts gave the typical “go back” warning. But it was our fault somehow, even though the certificate was the one they supplied us and it was valid.

            If the certificate is correctly generated and valid, an SSL error would indicate it was incorrectly applied to the application. I’m guessing from its inclusion in this rant that the conclusion was that it was in fact a problem with the certificate, but we don’t have enough details to speculate whether it was truly a mistake by the individual who generated it or just a miscommunication.

            Honestly, it sounds like you’re too quick to bash IT and should instead be more open to dialogue. I don’t know the specifics of your workplace and communications, but if you approach challenges with other teams from an “us vs. them” standpoint, it’s just going to create conflict. Sometimes the easiest fix is to hop on a quick call with the person once you get past a couple of emails back and forth; then you have more social cues to avoid getting angry with each other and can exchange more relevant details.

    • 0x4E4F@sh.itjust.worksOP
      2 months ago

      Exactly what a dev would say… you guys don’t have to deal with that 3rd-gen i3 that Jenny from accounting is running.

      • Destide@feddit.uk
        2 months ago

        Ticket opened I need soda intalled high importance!!! get up there companies paying for Adobe suite it’s there on the desktop…

    • mycodesucks@lemmy.world
      2 months ago

      Yes! Containerize, containerize, containerize until every perfectly good machine built before 2020 is rotting away in a landfill!

  • _____@lemm.ee
    2 months ago

    I think it’s on a case-by-case basis, but having help desk people “help” by opening PowerShell and noodling around without any concept of problem solving made me make this face once.

    It probably goes both ways. I’m a dev, and I was assembling computers at 12 years old, so I believe I have a lot of experience and knowledge when it comes to hardware. I’ve also written code for embedded platforms.

    IT people, from my point of view, can really come across as enthusiast consumers when it comes to their hardware knowledge.

    “did you guys hear Nvidia has the new [marketing term], wow!” Have you ever thought about what [marketing term] actually does, beyond just reading the marketing announcement?

    At the same time I swear to God devs who use macs have no idea how computers work at all and I mean EXCLUDING their skill as a dev. I’ve had them screen share to see what I imagine is a baby’s first day on a computer.

    To close this rant: probably goes both ways

    • spacecadet@lemm.ee
      2 months ago

      Interesting comment on the Mac. At my workplace we can choose between Mac and Windows (no Linux option unfortunately; my personal computer runs Debian). Pretty much all the principal and senior devs go for Mac, install vim, and live in the command line, and I do the same. All the Windows people seem over-reliant on VSCode, AI apps, and a bunch of other apps that Unix people just have CLI aliases and vim shortcuts for. I had to get a loaner laptop from work for a week, and it was Windows. I tried using PowerShell and installing some other CLI tools, and after the first day I just shut the laptop and didn’t work until I got back from travel and started using my Mac again.

        • Trainguyrom@reddthat.com
          2 months ago

          WSL is interesting because it manages to simultaneously offer everything a Linux user would want while also actually capable of none of what a Linux user would need it to do. Weird compatibility issues, annoying filesystem mappings that make file manipulation a pain, etc

          In a Windows environment I’ve found it honestly works better to either ssh into a Linux machine or learn the PowerShell way of doing it than to work through WSL’s quirks

      • ichbinjasokreativ@lemmy.world
        2 months ago

        I have to use Windows for work. I installed vim through winget and set a PowerShell alias, allowing me to use it similarly to Linux. Windows is still just ass though.

      • _____@lemm.ee
        2 months ago

        If you don’t have access to Linux, MacOS is the closest commercially available option so it makes sense.

        Also please take what I said lightly, I by no means want to bash Mac users and generalize them. It just has been my experience. I’m sure there are thousands of highly competent technical users who prefer Mac.

      • masterspace@lemmy.ca
        2 months ago

        Lmao, devs who insist on using the VIM and the terminal over better graphical alternatives just to seem hardcore are the worst devs who write the worst code.

        “Let me name all my variables with a single letter and abbreviations cause I can’t be bothered to learn how to setup a professional dev environment with intellisense and autocomplete.”

        • zarkanian@sh.itjust.works
          2 months ago

          Or maybe…hear me out…different people like different things. Some people don’t like GUIs and enjoy working in the command line. For some other people, it’s the opposite.

          It’s just different preferences.

          • masterspace@lemmy.ca
            2 months ago

            I know it has a steep learning curve with no benefit over GUI alternatives (unless you have to operate in a GUI-less environment).

            Which makes it flat-out dumb for a professional developer to use. “Let’s make our dev environment needlessly difficult; slowing down new hires for no reason will surely pay off in the long run.”

        • Klicnik@sh.itjust.works
          2 months ago

          I tried using VSCode to play around with Golang. I had to quit coding to take care of something else, so I hit save, and suddenly I had way fewer lines of code. WTF? Why would saving delete code? After much digging, it turns out the all-knowing VSCode decided that because I had not yet referenced my variables, I never would, and since the code I wanted to save and continue later wouldn’t compile, it must be quelled. Off with its head!

          Anyway, I decided to use vim instead. When I did :wq, the file was saved exactly as I had typed it.

          • masterspace@lemmy.ca
            2 months ago

            This is either false, or you didn’t understand the environment you were working in.

            You have to explicitly turn on the setting to have VSCode reformat on save; it’s not on by default. And when it is on, it’s there for a reason: software developers who don’t all follow the same code formatting standard create needless, unpredictable chaos.
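For reference, a sketch of the settings in question; VSCode’s settings.json accepts comments, and whether a given language extension also registers its own on-save code actions (as the Go extension does for import organizing) varies by extension:

```json
{
  // Off by default: when enabled, the active formatter rewrites the file on save.
  "editor.formatOnSave": false,
  // Language extensions can additionally run code actions on save,
  // e.g. organizing imports for Go; this disables that behavior.
  "[go]": {
    "editor.codeActionsOnSave": {
      "source.organizeImports": false
    }
  }
}
```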

        • PlexSheep@infosec.pub
          2 months ago

          You are making prejudiced, generalized, assumptions and presenting them as facts.

          You are at best naive if you think people use vim and a terminal instead of “better graphical alternatives” (of which there are none once you’ve really gotten into vim/emacs/whatever). And we don’t do it to seem hardcore (maybe we are, but that’s a side effect). Software in the terminal is often simpler to use, because it allows chaining outputs together and tends to have simpler user interfaces.

          The second paragraph is word salad. Developers should name their shit properly regardless of editor, and it’s quite simple to have a professional dev setup with ‘intellisense’ and autocomplete in neovim. In fact, vim/neovim (and I assume emacs too) have many more features and far more flexibility than users of IDEs or VSCode would so much as think of.

          I assume your prejudice comes from the fact that vim is not a “one size fits all, no configuration needed” integrated development environment (IDE), but rather lets the user personalize it completely to their own wishes: a Personalized Development Environment. In that regard, using one of the “better graphical tools” is like wearing a mass-produced suit, while vim is like a tailor-made one.

          Just let people use what they like. Diversity is a strength.

    • uis@lemm.ee
      2 months ago

      devs who use macs

      Do they exist? Are you sure they are devs?

      • Zink@programming.dev
        2 months ago

        MacOS is literally certified UNIX though.

        I’m not a Mac user at all, and I’m lucky enough to be able to run Linux full time at work, but it seems like macs should be alright in many cases.

      • JordanZ@lemmy.world
        2 months ago

        Our entire .NET shop swapped from Dell Precisions to MacBook Pros for like 2-3 years because our head of development liked them more, then went back to having a choice after that. So now we have a mix. In all honesty it’s not much different for me, but I use everything: Windows, Mac, Linux, whatever works best for the task at hand. .NET runs on all three, so we kind of mix and match. Deploying to Azure allows a mix of Windows/Linux, and GitHub Actions allows a mix of Windows/Linux in the same workflows as well. So it’s best to just learn them all. None of them are perfect; all have pros and cons.
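A sketch of the kind of mixed-runner workflow being described; the runner labels and action names are the commonly used ones, and the .NET version is illustrative:

```yaml
jobs:
  build:
    strategy:
      matrix:
        # One job per OS, all from the same workflow file.
        os: [ubuntu-latest, windows-latest, macos-latest]
    runs-on: ${{ matrix.os }}
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-dotnet@v4
        with:
          dotnet-version: '8.0.x'
      - run: dotnet build
      - run: dotnet test
```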

        I dabble in hardware and networking too. I built my first computer by myself when I was 11; my parents are kind of tech illiterate. I have fiber switches, dual-Xeon servers, and such in my house. My NAS is a 4U server with 36 hot-swap bays. That knowledge definitely helps when deploying to the cloud, where you’re responsible for basically everything.

        Also, yes, I can do more than .NET languages… that’s just where my job currently falls.

      • GoodEye8@lemm.ee
        2 months ago

        They do exist, and some of them swear Mac has better workflows (than Windows, because most of the time your options are Windows or Mac). I would call them loonies, but I’ve seen some smart people use Macs.

    • 0x4E4F@sh.itjust.worksOP
      2 months ago

      Agreed. I have colleagues that I used to write scripts for (I don’t do that any more; I stopped, shit stopped working, and so they solve things manually now). They didn’t know shit about scripting… and still don’t.

      On the other hand, I’ve had the pleasure of working with a dev that was just this very positive, very friendly person and was also very knowledgeable when it came to hardware, so we were on the same page most of the time. He also accepted most of my code changes and the ones that he didn’t, gave him an idea of how to solve it more efficiently. We were a great team to be honest. We’re still friends. Don’t see him as frequently, but we keep in touch.

      • Klicnik@sh.itjust.works
        2 months ago

        I definitely have moments like this too. I have been reflecting more lately and trying to decide if the feeling is temporary or permanent. I have been pondering what else I would do. Are you considering a career change, and if so, what would you do instead? I don’t know if I could transition to something else without going back to school, and it would kill me a bit inside to take out more student loans.

        • Zink@programming.dev
          2 months ago

          What has been working for me is not trying to make software my life or my identity. I don’t get home from work just to work on my side project, or my app, or my Arch install, or even watch videos about coding and shit. I hang out at my pond, play with my pets, play with my son, chill with my wife, work on the yard, or just watch/play something that catches my interest.

          It’s like we all have a unique user’s manual for our unique bodies and minds, but we don’t get a copy of it and have to do some reverse engineering to figure out what works. Then you have to have the compassion and empathy for yourself to do the things that increase your happiness instead of doing the things that you’re “supposed” to do.

          • Klicnik@sh.itjust.works
            2 months ago

            That’s solid advice. I think I have my identity wrapped up too much in my career, so when I dislike my job, I feel unsatisfied in life. I will try to see it as means to an end more than who I am.

            • Zink@programming.dev
              2 months ago

              Awesome to hear! It’s easier said than done (like always) because I think sometimes we don’t even realize when we’re doing it.

              In the first year of COVID my position got eliminated at the company I’d worked at for 16 years. I’d had different positions within the company, but that place was basically my entire career until then.

              That shock to the system, coupled with the fact that several months later I realized I was the same person with the same loved ones, finally flipped some switch in my brain that I didn’t even realize was there. Then the next job I got was fucking horrible and served to weld that switch in its new position, lol.

              So now I have a good job with good coworkers, and I appreciate that fact every day, but that’s not going to erode the healthy boundaries and mental compartmentalization.

        • trainsaresexy@lemmy.world
          2 months ago

          That’s the conversation I was having with my therapist this week. I don’t know. I’ve always massively struggled with this. Thinking about it sends me into a spiral.

          As of now the plan is to look for other opportunities in industry. Some training is fine, but I would like to avoid loans. I don’t have anything specific yet, but the public sector is likely part of it. I’m not so much motivated to help people as to make certain people miserable. Countries have started to track job quality (“job quality”); it’s data worth looking at.

          Depending on how that goes I have other thoughts but nothing that is sucking me in. Maybe I’ll give up entirely and become a vagrant. I also have a viable non-expiring business idea that would de-employ a certain group of people I don’t like. I’m not ready for either of those yet.

          In the meantime I have a bucket list of things that I’m working through. It helps me feel like my life has forward momentum despite what’s happening with my career (it’s also opening up new doors I didn’t see before, e.g. acting). Between that and therapy, my job often feels like something I’ll deal with later.

  • Grandwolf319@sh.itjust.works
    2 months ago

    In my experience it’s been IT people telling me I can’t use a certain tool or have more control over my computer because of their rules.

    The expression is appropriate, but the meme assumes that I’m doubting the IT person’s expertise. I’m not; I just don’t like the rules that get in the way of my work. Some rules do make sense, though.

    • BilliamBoberts@lemmy.world
      2 months ago

      As an IT guy, I’d love to give software devs full admin rights to their computers to troubleshoot and install anything as they see fit; it would save me a lot of time in my day. But I can’t trust everyone in the organization not to click suspicious links or open obvious phishing emails, inviting ransomware that can sink a company overnight.

      • Grandwolf319@sh.itjust.works
        2 months ago

        Fair points, but as someone who works in cybersecurity: phishing emails can happen without admin access, and I haven’t heard of any ransomware that is triggered by just clicking on a link.

        I think there should be some restrictions but highly technical people should slowly be given more and more control as they gain more trust/experience.

        • lud@lemm.ee
          2 months ago

          Of course but the impact could be much worse if the victim is admin on their computer.

          • BilliamBoberts@lemmy.world
            2 months ago

            Exactly this. We try to prevent cyberattacks as much as we can, but at a certain point they’re impossible to defend against perfectly without totally locking down our users and making it impossible for them to do their jobs. So the game becomes one of containing the amount of damage an attack can do.

            Security is restriction. Our job is to balance our users’ ability to perform their jobs against acceptable levels of risk.

    • BeigeAgenda@lemmy.ca
      2 months ago

      And the more corporate the organisation, the more rules. At least the places I have worked trust developers enough to give local admin, which takes the edge off many tasks.

    • biscuitswalrus@aussie.zone
      2 months ago

      I think you probably don’t realise that what you hate is standards and certifications. No IT person wants yet another system generating more calls and complexity, but there it is: ISO, or a cyber insurance policy, or NIST, or the ACSC demanding minimums with checklists, and a cyber review answering them with controls.

      Crazy that there’s so little understanding of why it’s there that you just think it’s the “IT guy” wanting those.

      • Grandwolf319@sh.itjust.works
        2 months ago

        I thought my comment was pretty clear that some rules are justified and that the IT person can just be the bearer of bad news.

        Maybe not, hopefully this comment clarifies.

      • tastysnacks@programming.dev
        2 months ago

        So you don’t trust me, but you trust McAfee enough to give it full control over the system. Yet my software doesn’t work because something is blocked and nothing shows up in the logs. But when we take McAfee off, it works. So clearly McAfee is not logging everything. And you trust McAfee but not me? /s kinda.

        • mosiacmango@lemm.ee
          2 months ago

          No one on earth trusts McAfee, be it the abysmal man or abysmal AV suite.

          If the EDR or AV software is causing issues with your code running, it’s possibly an issue with the suite, but it’s more likely an issue with your code not following common sense security requirements like code signing.

            • mosiacmango@lemm.ee
              2 months ago

              It’s not common, but it should be.

              Still, that was just one example. EDR reacting to your code is likely a sign of some other shortcut being taken during the development process. It might even be a reasonable one, but if so it needs to be discussed and accounted for with the IT security team.

      • Laser@feddit.org
        2 months ago

        I worked in software certification under Common Criteria, and while I do know that it creates a lot of work, there were cases where security improved measurably. In the hardware department, it even happened that a developer/manufacturer had a breach that affected almost the whole company really badly (design files etc. stolen by a probably state-sponsored attacker), but not the CC-certified part, because the attackers used a vector of attack that had been caught and rectified there.

        It seemingly was not fixed everywhere, for whatever reason… but it’s not as if CC certification is just some academic exercise that gives you nothing but a lot of work.

        Is it the right approach for every product? Probably not, because of the huge overhead per certified version. But for important pillars of a security model, it makes sense in my opinion.

        Though it needs to be said that the scheme under which I certified is very thorough and strict, so YMMV.

    • 4am@lemm.ee
      2 months ago

      I think the meme is more about perspectives: the way someone thinks about operating IT is very different from the way someone thinks about architecting IT.

      • Ethan@programming.dev
        2 months ago

        Their rules have stopped me from being able to do my job. Like the time the AV software quarantined executables as I was creating them, so I literally could not run my code. When security enforcement prevents me from working, something needs to change.

  • RupeThereItIs@lemmy.world
    2 months ago

    “IT people” here, operations guy who keeps the lights on for that software.

    It’s been my experience that developers have no idea how the hardware works, but STRONGLY believe they know more than me.

    Devops is also usually more dev than ops, and it shows in the availability numbers.

    • Mr. Satan@monyet.cc
      2 months ago

      As a developer I like to mess with everything. Currently we are doing an infrastructure migration, and I had to do a lot of non-development work to make it happen.

      Honestly, I find it really useful (but not necessary) to have some understanding of the processes underlying the code I’m working with.

    • stupidcasey@lemmy.world
      2 months ago

      It very much depends on how close to hardware they are.

      If someone is working with C# or JavaScript, they are about as knowledgeable about hardware as someone working in Excel (I know this statement is tantamount to treason, but as far as hardware is concerned, it’s true).

      But if you are working with C or Rust, or god forbid drivers, you probably know more than the average IT professional; you might even have helped correct hardware issues.

      Long story short it depends.

    • PriorityMotif@lemmy.world
      2 months ago

      In my experience a lot of IT people are unaware of anything outside of their bubble. It’s a big problem in a lot of technical industries with people who went to school to learn a trade.

      • Trainguyrom@reddthat.com
        2 months ago

        The biggest problem with the bubble IT insulates itself in is that if you don’t keep it up, users will never submit tickets and will keep coming to you personally, then get mad when their high-priority concern sits for a few days while you were out of office and the rest of the team got no tickets, because the user decided they were better than a ticket.

        If people only know how to summon you through the ancient ritual of opening a ticket with sufficient information, they’ll follow that ritual religiously to summon you when needed. If they know “oh, just hit up Rob on Teams, he’ll get you sorted,” the mysticism is ruined and order is lost.

        Honestly, I say all this partially jokingly. We do try to insulate ourselves because we know some users will try to bypass tickets if given any opportunity, but there is very much value in balancing that need with accessibility and visibility. The safe option is to hide in your basement office and avoid mingling, but that’s also the option that limits your ability to improve yourself and your organization.

    • aeiou_ckr@lemmy.world
      2 months ago

      20-year IT guy here, and I second this. Developers tend to be more troublesome than the manager wanting a shared folder for their team.

      • ByteOnBikes@slrpnk.net
        link
        fedilink
        arrow-up
        0
        ·
        2 months ago

        Rough and that sucks for your organization.

        Our IT team would rather sit in a room with developers and solve those problems, than deal with hundreds of non-techs who struggle to add a chrome extension or make their printer icon show up.

        • aeiou_ckr@lemmy.world
          link
          fedilink
          arrow-up
          0
          ·
          2 months ago

          I would love to work through issues, but the stock of developers we currently have seem to either be rejects or have, as someone else stated, “a god complex”. They remind me of pilots in the military. All in all, it is a loss for my organization.

    • ugo@feddit.it
      link
      fedilink
      arrow-up
      0
      ·
      2 months ago

      Sorry, this comment is causing me mental whiplash so I am either ignorant, am subject to non-standard circumstances, or both.

      My personal experience is that developers (the decent ones at least) know hardware better than IT people. But maybe we mean different things by “hardware”?

      You see, I work as a game dev so a good chunk of the technical part of my job is thinking about things like memory layout, cache locality, memory access patterns, branch predictor behavior, cache lines, false sharing, and so on and so forth. I know very little about hardware, and yet all of the above are things I need to keep in mind and consider and know to at least some usable extent to do my job.
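      To make the cache-locality point concrete, here’s a tiny sketch. Python is used only to illustrate the access pattern; the actual cache effect shows up in languages like C, C++, or Rust, where a 2D array is contiguous in memory:

      ```python
      # Row-major vs column-major traversal of a 2D grid. Both compute the same
      # sum, but in a language with real memory layout the row-major loop walks
      # memory sequentially (cache- and prefetcher-friendly), while the
      # column-major loop jumps a whole row ahead on every access.

      ROWS, COLS = 256, 256
      grid = [[r * COLS + c for c in range(COLS)] for r in range(ROWS)]

      def sum_row_major(g):
          total = 0
          for row in g:             # consecutive elements of a row are adjacent
              for x in row:
                  total += x
          return total

      def sum_col_major(g):
          total = 0
          for c in range(COLS):     # each step strides COLS elements ahead:
              for r in range(ROWS): # a likely cache miss per access in C
                  total += g[r][c]
          return total

      assert sum_row_major(grid) == sum_col_major(grid)
      ```

      The results are identical; only the memory traffic differs, which is exactly the kind of thing that’s invisible in the source but dominant in a profiler.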

      While IT are mostly concerned with how to keep the idiots from shooting the company in the foot, by rolling out software that lets them diagnose, reset, install, or uninstall things on entire fleets of computers at once. It also just so happens that this software is often buggy and burns 99% of your CPU in spin loops (they had to roll that back, of course), or the antivirus rules don’t apply on your system for whatever reason, causing the antivirus to scan all the object files generated by the compiler even though they are generated in a whitelisted directory, making a rebuild take an hour rather than 10 minutes.

      They are also the ones that force me to change my (already unique and internal) password every few months for “security”.

      So yeah, when you say that developers often have no idea how the hardware works, the chief questions that come to mind are

      1. What kinda dev doesn’t know how hardware works to at least a usable extent?
      2. What kinda hardware are we talking about?
      3. What kinda hardware would an IT person need to know about? Network gear?
      • Eccitaze@yiffit.net
        link
        fedilink
        arrow-up
        0
        ·
        2 months ago

        When IT folks say devs don’t know about hardware, they’re usually talking about the forest-level overview, in my experience. Stuff like how the software being developed integrates into an existing environment, and how to optimize code to fit within the bounds of reality: it may be practical to dump a database directly into memory when it’s a 500 MB testing dataset on your local workstation, but it’s insane to do that with a 500+ GB database in a production environment. Similarly, a program may run fine on an NVMe SSD, but lots of environments even today still depend on arrays of traditional electromechanical hard drives, because they offer the most capacity per dollar and don’t suddenly tombstone when they die the way flash media does. Once the program is in production, it turns out it’s making a bunch of random I/O calls that could have been optimized into more sequential requests or batched together into a single transaction, and now it runs like dogshit and drags down every other VM, container, or service sharing that array. That’s not accounting for the real dumb shit I’ve read about, like “dev hard-coded their local IP address and it breaks in production because of NAT” or “program crashes because it doesn’t account for network latency.”
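        As an illustration of the random-vs-batched I/O point, here’s a rough sketch using an in-memory SQLite database; the table and column names are invented for the example:

        ```python
        import sqlite3

        # Fetching N rows one query at a time issues N round trips; a single
        # IN (...) query lets the database satisfy the request in one pass.

        conn = sqlite3.connect(":memory:")
        conn.execute("CREATE TABLE items (id INTEGER PRIMARY KEY, name TEXT)")
        conn.executemany("INSERT INTO items VALUES (?, ?)",
                         [(i, f"item-{i}") for i in range(1000)])

        wanted = [3, 500, 997]

        # Anti-pattern: one query (one round trip) per id.
        one_by_one = [
            conn.execute("SELECT name FROM items WHERE id = ?", (i,)).fetchone()[0]
            for i in wanted
        ]

        # Batched: a single query for all ids.
        placeholders = ",".join("?" for _ in wanted)
        batched = [row[0] for row in conn.execute(
            f"SELECT name FROM items WHERE id IN ({placeholders}) ORDER BY id",
            wanted)]

        assert one_by_one == batched
        ```

        With a local in-memory database the difference is invisible; over a network to a loaded production server, each extra round trip is latency you pay for nothing.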

        Game dev is unique because you’re explicitly targeting a single known platform (for consoles) or an extremely wide range of performance specs (for PC), and hitting an acceptable level of performance pre-release is (somewhat) mandatory, so this kind of mindfulness is drilled into game devs much more heavily than into business software devs, especially in-house devs. Business development is almost entirely focused on “does it run without failing catastrophically,” and almost everything else–performance, security, cleanliness, resource optimization–is given bare lip service at best.

    • ChillPenguin@lemmy.world
      link
      fedilink
      arrow-up
      0
      ·
      2 months ago

      I work on a team with mainly infrastructure and operations. As one of the only people writing code on the team. I have to appreciate what IT support does to keep everything moving. I don’t know why so many programmers have to get a chip on their shoulder.

    • kersplomp@programming.dev
      link
      fedilink
      arrow-up
      0
      ·
      edit-2
      2 months ago

      Apologies for the tangent:

      I know we’re just having fun, but in the future consider adding the word “some” to statements about groups. It’s just one word, but it adds a lot of nuance and doesn’t make the joke less funny.

      That 90’s brand of humor of “X group does Y” has led many in my generation to think in absolutes and to get polarized as a result. I’d really appreciate your help to work against that for future generations.

      Totally optional. Thank you

    • ByteOnBikes@slrpnk.net
      link
      fedilink
      arrow-up
      0
      ·
      2 months ago

      Absolutely agree, as a developer.

      The devops team set up a pretty effective devops pipeline that allows us to scale infinitely. Which would be great if we had infinite resources.

      We’re hitting situations where the solution is to throw more hardware at it.

      And IT cannot provision tech fast enough within budget for any of this. So devs are absolutely suffering right now.

    • qaz@lemmy.world
      link
      fedilink
      English
      arrow-up
      0
      ·
      2 months ago

      Could you give an example of something related to hardware that most developers don’t know about?

      • 0x4E4F@sh.itjust.worksOP
        link
        fedilink
        English
        arrow-up
        0
        ·
        edit-2
        2 months ago

        Simple example, our NASes are EMC2. The devs over at the company that does the software say they’re garbage, we should change them.

        Mind you, these things have been running for 10 years straight 24/7, under load most of the time, and we’ve only swapped like 2 drives, total… but no, they’re garbage 🤦…

        • ByteOnBikes@slrpnk.net
          link
          fedilink
          arrow-up
          0
          ·
          2 months ago

          Accurate!

          Developers are frequently excited by the next hot thing or how some billionaire tech companies operate.

          I’m guilty of seeing something that was last updated in 2019 and having a look of disgust.

          • 0x4E4F@sh.itjust.worksOP
            link
            fedilink
            English
            arrow-up
            0
            ·
            2 months ago

            Well, at least you admit it, not everyone does.

      I do agree that they’re out of date, but that wasn’t their point. Their software somehow doesn’t like the NASes, so they had to look into where the problem was. But their first thought was “let’s tell them the NASes are no good and tell them which ones to buy, so we won’t have to look at the code”.

              • 0x4E4F@sh.itjust.worksOP
                link
                fedilink
                English
                arrow-up
                0
                ·
                edit-2
                2 months ago

                Me too, but apparently, that wasn’t the case.

                My reasoning was, they’d have to send someone over to run tests and build the project on site, install and test, since we couldn’t hand any of those NASes over for them to work on the problem, and they’d rather not do that, since it’s a lot more work and it’s time consuming.

    • r00ty@kbin.life
      link
      fedilink
      arrow-up
      0
      ·
      2 months ago

      I’ve always found this weird. I think to be a good software developer it helps to know what’s happening under the hood when you take an action. It certainly helps when you want to optimize memory access for speed etc.

      I genuinely do know both sides of the coin. But I do know that the majority of my fellow developers at work most certainly have no clue about how computers work under the hood, or networking for example.

      I find it weird because being good at software development (and I don’t mean following what the computer science methodology tells you; I mean having an idea of the best way to translate an idea into a logical solution that can be applied in any programming language, and most importantly how to optimize your solution, for example in terms of memory access) requires an understanding of the underlying systems. If you write software that sends or receives network packets, it certainly helps to understand how that works, at least to consider the best protocols to use.
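      For example, one “under the hood” networking fact that regularly bites developers: TCP is a byte stream, not a sequence of messages. Two send() calls can arrive as one recv(), or one send() can be split. A minimal sketch of length-prefix framing, the usual fix (these helper names are my own, not any particular library’s API):

      ```python
      import socket
      import struct

      def send_msg(sock, payload: bytes) -> None:
          # Prefix each message with its 4-byte big-endian length.
          sock.sendall(struct.pack(">I", len(payload)) + payload)

      def recv_exact(sock, n: int) -> bytes:
          # recv() may return fewer bytes than asked for; loop until we have n.
          buf = b""
          while len(buf) < n:
              chunk = sock.recv(n - len(buf))
              if not chunk:
                  raise ConnectionError("peer closed the connection")
              buf += chunk
          return buf

      def recv_msg(sock) -> bytes:
          (length,) = struct.unpack(">I", recv_exact(sock, 4))
          return recv_exact(sock, length)

      if __name__ == "__main__":
          a, b = socket.socketpair()
          send_msg(a, b"hello")
          send_msg(a, b"world")
          assert recv_msg(b) == b"hello"
          assert recv_msg(b) == b"world"
      ```

      Devs who assume one send() equals one recv() write code that happens to work on localhost and falls apart the moment real network conditions fragment or coalesce the stream.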

      But, it is definitely true.

      • Blackmist@feddit.uk
        link
        fedilink
        English
        arrow-up
        0
        ·
        2 months ago

        I think software was a lot easier to visualise in the past when we had fewer resources.

        Stuff like memory becomes almost meaningless when you never really have to worry about it. 64,000 bytes was an amount that made sense to people. You could imagine chunks of it. 64 billion bytes is a nonsense number that people can’t even imagine.

        • r00ty@kbin.life
          link
          fedilink
          arrow-up
          0
          ·
          2 months ago

          When I was talking about memory, I was more thinking about how it is accessed. For example, exactly which actions are atomic and which are not on a given architecture; these can cause unexpected interactions during multi-core work depending on byte alignment, for example. Also considering how to make the most of your CPU cache. These kinds of things.
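          As a toy illustration of the atomicity point: `counter += 1` is a read-modify-write, not one atomic step. Python here only sketches the idea (its GIL hides most of the danger); in C, C++, or Rust, the unsynchronized version genuinely loses updates across cores:

          ```python
          import threading

          counter = 0
          lock = threading.Lock()

          def worker(n):
              global counter
              for _ in range(n):
                  with lock:        # makes the read-modify-write atomic
                      counter += 1

          threads = [threading.Thread(target=worker, args=(10_000,))
                     for _ in range(4)]
          for t in threads:
              t.start()
          for t in threads:
              t.join()

          assert counter == 40_000  # without the lock, this can come up short
          ```

          On real hardware the lock (or an atomic increment instruction) is what guarantees the four threads don’t clobber each other’s updates.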

      • ByteOnBikes@slrpnk.net
        link
        fedilink
        arrow-up
        0
        ·
        edit-2
        2 months ago

        I think to be a good software developer it helps to know what’s happening under the hood when you take an action.

        There are so many layers of abstraction that it becomes impossible to know everything.

        Years ago, I dedicated a lot of time understanding how bytes travel from a server into your router into your computer. Very low-level mastery.

        That education is now trivia, because cloud servers, Cloudflare, region points, edge servers, company firewalls… all add more and more layers of complexity that I don’t have direct access to but that can affect the applications I build. And it continues to grow.

        Add this to the pile of updates to computer languages, new design patterns to learn, operating system and environment updates…

        This is why engineers live alone on a farm after they burn out.

        It’s not feasible to understand everything under the hood anymore. What’s under the hood grows faster than you can pick it up.

        • r00ty@kbin.life
          link
          fedilink
          arrow-up
          0
          ·
          2 months ago

          I’d agree that there’s a lot more abstraction involved today. But, my main point isn’t that people should know everything. But knowing the base understanding of how perhaps even a basic microcontroller works would be helpful.

          Where I work, people often come to me with weird problems, and the way I solve them is usually based in low level understanding of what’s really happening when the code runs.

      • uis@lemm.ee
        link
        fedilink
        arrow-up
        0
        ·
        2 months ago

        and I don’t mean, following what the computer science methodology tells you, I mean having an idea of the best way to translate an idea into a logical solution that can be applied in any programming language,

        But that IS computer science.

      • jdeath@lemm.ee
        link
        fedilink
        arrow-up
        0
        ·
        2 months ago

        yeah i wish it was a requirement that you’re nerdy enough to build your own computer or at least be able to install an OS before joining SWE industry. the non-nerds are too political and can’t figure out basic shit.

        • ByteOnBikes@slrpnk.net
          link
          fedilink
          arrow-up
          0
          ·
          2 months ago

          This is like saying before you can be a writer, you need to understand latin and the history of language.

            • Ziglin@lemmy.world
              link
              fedilink
              arrow-up
              0
              ·
              2 months ago

              I do like informing yourself a little more about, say, the note-taking app you’re writing with. Much of what you learn turns out to be kind of obvious, but it can have many advantages.

              Personally, though, I don’t really see the upside of building a computer, since you could also just research things without building one, or vice versa. (Maybe it’s good for reading bug reports?)

              A 30-minute explanation of how CPUs work that I recently got to listen in on likely had more impact on my C/assembly programming than building my own computer did.

              • jdeath@lemm.ee
                link
                fedilink
                arrow-up
                0
                ·
                2 months ago

                you wouldn’t want somebody that hates animals to become a veterinarian just because of money-lust. the animals would suffer, the field as a whole, too. maybe they start buying up veterinary offices and squeeze the business for everything they can, resulting in worse outcomes- more animals dying and suffering, workers get shorted on benefits and pay.

                people chasing money ruin things. we want an industry full of people that want to actually build things.

          • Narauko@lemmy.world
            link
            fedilink
            arrow-up
            0
            ·
            2 months ago

            You should if you want to be a science writer or academic, which, let’s be honest, is a better comparison here. If your job involves Latin for names and descriptions, then you probably should take at least a year or two of Latin if you don’t want to make mistakes here and there out of ignorance.

    • ValiantDust@feddit.org
      link
      fedilink
      arrow-up
      0
      ·
      2 months ago

      As a developer I can freely admit that without the operations people the software I develop would not run anywhere but on my laptop.

      I know as much about hardware as a cook knows about his stove and the plates the food is served on – more than the average person but waaaay less than the people producing and maintaining them.

    • ArbiterXero@lemmy.world
      link
      fedilink
      arrow-up
      0
      ·
      2 months ago

      As a devops manager that’s been both, it depends on the group. Ideally a devops group has a few former devs and a few former systems guys.

      Honestly, the best devops teams have at least one guy that’s a liaison with IT who is primarily a systems guy but reports to both systems and devops. Why?

      It gets you priority IT tickets and access while systems trusts him to do it right. He’s like the crux of every good devops team. He’s an IT hire paid for by the devops team budget as an offering in exchange for priority tickets.

      But in general, you’re absolutely right.

      • magikmw@lemm.ee
        link
        fedilink
        arrow-up
        0
        ·
        2 months ago

        Am I the only guy who likes doing devops that has both dev and ops experience and insight? What’s with siloing oneself?

        • ArbiterXero@lemmy.world
          link
          fedilink
          arrow-up
          0
          ·
          2 months ago

          I’ve done both, it’s just a rarity to have someone experienced enough in both to be able to cross the lines.

          Those are your gems and they’ll stick around as long as you pay them decently.

          Hard to find.

          Because the problem is that you need

          1. A developer
          2. A systems guy
          3. A social and great personality

          The job is hard to hire for because those 3 in combo is rare. Many developers and systems guys have prickly personalities or specialise in their favourite part of it.

          Devops don’t have the option of prickly personalities, because you have to deal with so many people outside your team who are prickly, and you sometimes have to give them bad news….

          Eventually they’ll all be mad at you for SOMETHING…… and you have to let it slide. You have to take their anger and not take it personally…. That’s hard for most people, let alone tech workers who grew up idolising Linus Torvalds or Sheldon Cooper and their “I’m so smart that I don’t need to be nice” attitudes.

          • MajorHavoc@programming.dev
            link
            fedilink
            arrow-up
            0
            ·
            2 months ago

            Fantastic summary. For anyone wondering how to get really really valuable in IT, this is a great write-up of why my top paid people are my top paid people.

    • lordnikon@lemmy.world
      link
      fedilink
      English
      arrow-up
      0
      ·
      2 months ago

      I know this is not everyone and there are some unicorns out there, but after working with hiring managers for decades I can’t help but see cheap programmers when I see DevOps. It’s either ops people who think they are programmers, or programmers who are not good enough to get hired outright as software engineers at higher pay. When one person is both, they can do both, but they’re not great at either one. DevOps works best when it’s a team of both dev and ops working together, IMO.

    • Jestzer@lemmy.world
      link
      fedilink
      arrow-up
      0
      ·
      2 months ago

      Yup. Programmers who have only ever been programmers tend to act like god’s gift to this world.

      • madjo@feddit.nl
        link
        fedilink
        English
        arrow-up
        0
        ·
        2 months ago

        As a QA person I despise those programmers.

        “Sure buddy it works on your machine, but your machine isn’t going to be put into the production environment. It’s not working on mine, fix it!”

        • bitchkat@lemmy.world
          link
          fedilink
          English
          arrow-up
          0
          ·
          2 months ago

          My answer is usually “I don’t care how well it runs on your windows machine. Our deployments are on Linux”.

          I’m an old developer who has done a lot of admin over the years out of necessity.

  • Scoopta@programming.dev
    link
    fedilink
    arrow-up
    0
    ·
    2 months ago

    I’m both IT and development…and I’ve caught both sides being utterly wrong because they’re only familiar with one and not the other

    • bitchkat@lemmy.world
      link
      fedilink
      English
      arrow-up
      0
      ·
      2 months ago

      IT is an administrative function and is really part of operations.

      Software development is generally a creative position and is a profit center. If you work somewhere where you develop internal apps, you may have a different perspective.

        • Landless2029@lemmy.world
          link
          fedilink
          arrow-up
          0
          ·
          2 months ago

          Drivers would be the end users, and sometimes clients and project managers.

          Think about it. Many drivers don’t know about checking the oil, maintaining proper tire pressure, tire wear, brake wear, air filters or topping off fluids.

          I can do all of the above, but I’m nowhere near a mechanic. Just car savvy. So I could make suggestions to mechanics or engineers that look cool but are insane for functionality.

    • Randelung@lemmy.world
      link
      fedilink
      arrow-up
      0
      ·
      2 months ago

      Infrastructure maintenance is management, security and day to day business, while software engineering is mostly concerned with itself. They use distinct tools and generally have nothing to do with each other (except maybe integration).

      We need new terms, IT means “works with computers, but more than Word and Excel” for too many people. In Switzerland they split the apprenticeship names to ‘platform engineer’ and ‘application engineer’, which I think is fitting.

  • leftzero@lemmynsfw.com
    link
    fedilink
    arrow-up
    0
    ·
    2 months ago

    I’m both, and while I do hate myself, I don’t think it’s related, so I’m not sure I get it.

    (I hate computers more, though, except when they’re turned off — no bugs when they’re off —, but they’re the only thing I’m good enough at to make a living off of.)

  • cmhe@lemmy.world
    link
    fedilink
    arrow-up
    0
    ·
    edit-2
    2 months ago

    This is the same between many different software development disciplines: FPGA devs (or hardware devs for that matter) vs. driver devs, driver devs vs. backend devs, backend devs vs. frontend devs, integrators vs. everyone.

  • I Cast Fist@programming.dev
    link
    fedilink
    arrow-up
    0
    ·
    2 months ago

    That’s how I look at 90% of the shit “systems” I’m forced to interact with (xiaomi’s MIUI, banking apps, govt apps, apps that should’ve been fucking websites, websites that “gently nudge” you to use the app, electron apps that are windows only)

    • 0x4E4F@sh.itjust.worksOP
      link
      fedilink
      English
      arrow-up
      0
      ·
      2 months ago

      MIUI is not that bad IMO. The ad services and the integrated apps are horrible (even without the ads), but apart from that, the UI is fairly usable. They really haven’t changed that much from what Android comes with by default.