• Something Burger 🍔@jlai.lu · 2 months ago

    RGB. Please. Finding hardware that doesn’t light up like a Christmas tree is harder than it should be. Even a simple power LED can light up an entire room.

    • Echo Dot@feddit.uk · 2 months ago

      I don’t really mind RGB; my complaint is that every single LED has to be vivid electric blue. I want the old red LEDs back. They were nice, and they didn’t scorch my retinas.

      • JohnEdwa@sopuli.xyz · 2 months ago

        Agreed. My PC case came with a blue power light; after one night of watching its blinking illuminate my entire room, I ripped it out and swapped in a dim red one myself.

        For a quick fix, you can make blue power LEDs slightly more tolerable by sticking a piece of yellow Post-it note on top of them; it turns the light white.
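
        If you’re wondering why that works: yellow is roughly the additive complement of blue, so the yellowish light the paper scatters fills in the red and green the LED lacks. A toy sketch of the mixing, assuming idealized sRGB values (a real Post-it is a very rough filter, so treat this as an illustration only):

        ```python
        # Toy additive-mixing sketch with idealized sRGB values.
        blue = (0, 0, 255)      # the bare LED
        yellow = (255, 255, 0)  # tint scattered back by the paper

        # Average the two light contributions channel by channel.
        mixed = tuple((b + y) // 2 for b, y in zip(blue, yellow))
        print(mixed)  # (127, 127, 127): neutral grey, i.e. a dimmer white
        ```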

    • flodabo@programming.dev · 2 months ago

      Not anytime soon. It’s way too cheap to include (a few cents for a mouse or RAM, a few dollars for a keyboard) and way too popular not to include. Well, at least you can disable it.

    • dinckel@lemmy.world · 2 months ago

      It’s an XDA article; what did you expect?

      None of these are trends. They’re all hardware standards, and all but one of them are still very much here anyway.

  • cmnybo@discuss.tchncs.de · 2 months ago

    The thing I wish would go away is oversized graphics cards that take up three or more slots. There need to be more liquid-cooling options that don’t require modifying the card.

    • catloaf@lemm.ee · 2 months ago

      That would require a cooler-mount standard, and I don’t think AMD or Nvidia currently has one.

    • tal@lemmy.today · 2 months ago

      I’m thinking more liquid cooling may happen with the whole AI thing on the datacenter side. That means a lot of parallel compute cards generating a lot of heat, and it’s easier to move that heat with liquid than with air.

      Some other liquid-cooling annoyances:

      • Cases don’t really have a standard-size mounting spot for the radiators.

      • I want to use one radiator for all of the things that require cooling. Like, I’d rather have an AIO device that provides multiple cold plates.

      • conciselyverbose@sh.itjust.works · 2 months ago

        I really doubt liquid is easier for a data center. They have airflow solved pretty well, and noise doesn’t really matter. Liquid failing could potentially do way more damage, and a single leak might require shutting down whole areas for repair and damage prevention.

        If they did do liquid at scale, it wouldn’t be done in a way that would trickle down to consumers. It would be something like custom boards with full-coverage blocks for the whole system, tied into whole-room water chillers or something.

    • borari@lemmy.dbzer0.com · 2 months ago

      I think I’m misunderstanding your comment. Once you liquid-cool the card, it’s no longer an oversized behemoth. My reference 4080S only takes up a single slot.

        • borari@lemmy.dbzer0.com · 2 months ago

          Sure, but the PCB with a water block only takes up a single PCIe slot and is short enough to fit in pretty much any case. Is my water-cooled 4080S longer than my water-cooled RX 480? Yes. Substantially longer? No. Thicker? Also no; basically the same thickness.

      • cmnybo@discuss.tchncs.de · 2 months ago

        Most graphics cards have massive air coolers that block other PCIe slots. I want more water-cooled options, since they’re low profile. I just don’t want to have to void the warranty on a brand-new card to install a water block.

        • borari@lemmy.dbzer0.com · 2 months ago

          I know for sure that installing a water block does not void the warranty on reference Nvidia cards. I’ve read that Asus (and EVGA, RIP) are the same. Not sure about MSI, and I’ve read that Gigabyte will try to void the warranty.

  • outrageousmatter@lemmy.world · 2 months ago

    The capacitor plague era. Ever wonder why we don’t see a lot of PCs from the early 2000s? This is why: anything with a bad cap would eventually fail and kill the board, and you essentially had to call the OEM to fix it.

    • AmbiguousProps@lemmy.today · 2 months ago

      Uhh, plenty of people still use them (it’s still the default for many gaming monitors), and for 32:9 displays you absolutely need the panel curved or it’s basically unusable. I don’t think they’re going away anytime soon, because they’re not a “trend”.
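
      For a sense of the geometry, here’s a rough sketch assuming a 49-inch 32:9 panel viewed from about 70 cm (both numbers are assumptions for illustration, not from the thread):

      ```python
      import math

      # Assumed: 49" 32:9 panel, ~70 cm viewing distance.
      diag_in, aw, ah = 49, 32, 9
      width_cm = diag_in * 2.54 * aw / math.hypot(aw, ah)
      view_cm = 70

      # Off-axis angle from the viewer to the far edge of a FLAT panel.
      edge_deg = math.degrees(math.atan((width_cm / 2) / view_cm))
      print(f"{width_cm:.0f} cm wide; edges {edge_deg:.0f} degrees off-axis")
      # ~120 cm and ~41 degrees: enough off-axis shift to wash out color
      # and contrast on most LCDs, which is what the curve corrects.
      ```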

    • Wispy2891@lemmy.world · 2 months ago

      Unfortunately, those cards came and went so fast that the LLM that wrote this “article” didn’t have enough data on them.

  • corroded@lemmy.world · 2 months ago

    This is kind of a shit article. Most of these are just old hardware that eventually had modern improvements, not “trends.”

    A “trend” is cold-cathode black lights inside the case, not a silly naming scheme for CPU revisions.

  • sorghum@sh.itjust.works · 2 months ago

    I remember my first serious build: a blue acrylic case with as many black-light-reactive components as I could get.

    • nocturne@sopuli.xyz · 2 months ago

      I remember the first full build I did. All of my fans had LEDs, and the case had LEDs. The first time I tried to play on it in the dark basement, the setup was blinding. I disconnected all of the case LEDs and swapped my fans for plain black ones.

    • __init__@programming.dev · 2 months ago

      Oh man I went through this phase too. I had the clear acrylic case and a bunch of those UV CCFL tubes.

    • RubberDuck@lemmy.world · 2 months ago

      My case is an old tower server case tucked away behind my monitors. Loads of space and no need for cable management.

  • Fluffy_Ruffs@lemmy.world · 2 months ago

    Intel’s slot-style CPU interface. Sure, it cleaned up motherboard layouts, but the need for more comprehensive cooling solutions that soon followed made it a bad direction to go in.

  • tal@lemmy.today · 2 months ago

    “Molex connectors were almost universally hated for being flimsy and requiring a lot of effort to connect properly. They were fortunately replaced by SATA connectors.”

    I can understand the “lot of effort”, but flimsy? Those things were built like a tank. SATA connectors certainly aren’t more durable (not that durability normally matters inside a case).

    • Dave.@aussie.zone · 2 months ago

      They also came from a time when hard drives could draw several amps while in use and much more on spin-up. There was a good reason why SCSI drive arrays used to spin each disk up one by one.

      Molex connectors are good for 10 amps or so; SATA connectors couldn’t have handled that much current.
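
      To put rough numbers on that (using commonly quoted per-pin ratings; exact figures vary by connector manufacturer and wire gauge, so treat this as a sketch):

      ```python
      # Approximate per-pin current ratings for the 12 V rail.
      MOLEX_AMPS_PER_PIN = 10.0  # the single 12 V pin on a 4-pin Molex
      SATA_AMPS_PER_PIN = 1.5    # each of SATA power's three 12 V pins

      molex_watts = 1 * MOLEX_AMPS_PER_PIN * 12
      sata_watts = 3 * SATA_AMPS_PER_PIN * 12
      print(f"Molex 12 V budget: ~{molex_watts:.0f} W")  # ~120 W
      print(f"SATA 12 V budget:  ~{sata_watts:.0f} W")   # ~54 W

      # An older drive spinning up at ~2.5 A on the 12 V rail:
      print(f"Spin-ups one Molex pin can feed: {MOLEX_AMPS_PER_PIN / 2.5:.0f}")
      ```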

    • lazynooblet@lazysoci.al · 2 months ago

      Yes, they were flimsy. When pushing them together, the crimped pins would get pushed out the back of the plastic connector housing. Or they wouldn’t align properly and would require either major force or fiddly realignment.

    • extremeboredom@lemmy.world · 2 months ago

      I remember instances where the force required to disconnect the connector caused me to slip and rip a wire out.

  • MonkderVierte@lemmy.ml · 2 months ago

    The worst one is still around: GPUs requiring more and more power. I wish there were more focus on efficiency. It won’t be long until water cooling is mandatory just to get all the heat away.

    • JohnEdwa@sopuli.xyz · 2 months ago

      They are becoming more efficient. The GTX 590 from 2011 had a TDP of 365 W; the RTX 4080 has a 320 W TDP while offering over ten times the performance. The 4060 outperforms the 1060, 2060, and 3060 while having a lower TDP than any of them.

      If you want low TDP, the RX 6400 is twice as powerful as the GTX 590 while having a TDP of 53 W.

      It’s the very top-of-the-line stuff like the 4090 that pushes the limit, chasing that last 10% performance bump at the cost of double the power, and that’s kinda like complaining that a Bugatti Veyron gets terrible highway MPG.
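
      Back-of-the-envelope, taking the ~10x performance claim above at face value, the efficiency gain works out like this:

      ```python
      # Perf-per-watt gain implied by the figures above; the 10x
      # performance ratio is the comment's claim, not a benchmark.
      tdp_gtx590 = 365    # W
      tdp_rtx4080 = 320   # W
      perf_ratio = 10.0   # RTX 4080 vs GTX 590

      gain = perf_ratio * (tdp_gtx590 / tdp_rtx4080)
      print(f"~{gain:.1f}x more performance per watt")  # ~11.4x
      ```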