I wonder if my system is good or bad. My server needs 0.1 kWh.

  • d_k_bo@feddit.org

    It’s the other way around. 0.1 kWh means 0.1 kW times 1 h. So if your device draws 0.1 kW (100 W) of power for an hour, it consumes 0.1 kWh of energy. If a factory draws 360 000 W for one second, it consumes the same 0.1 kWh of energy.
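
    A quick sanity check of that arithmetic (a minimal sketch in Python; the constant and variable names are just for illustration):

    ```python
    # Energy = power x time. 1 kWh = 1000 W * 3600 s = 3 600 000 J.
    JOULES_PER_KWH = 3_600_000

    energy_low_power = 100 * 3600    # 100 W drawn for one hour       -> 360 000 J
    energy_high_power = 360_000 * 1  # 360 000 W drawn for one second -> 360 000 J

    print(energy_low_power / JOULES_PER_KWH)   # 0.1 (kWh)
    print(energy_high_power / JOULES_PER_KWH)  # 0.1 (kWh)
    ```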

    • GravitySpoiled@lemmy.ml (OP)

      Thank you for explaining it.

      My computer uses 1 kWh per hour.

      It does not yet make sense to me. It just feels wrong. I understand that you may normalize 4 W in 15 minutes to 16 Wh because it would use 16 W per hour if it ran that long.

      Why can’t you simply assume that I mean 1 kWh per hour when I say 1 kWh? And not 1 kWh per 15 minutes.

      • 486@lemmy.world

        kWh is a unit of energy consumed. It doesn’t say anything about the time period over which that energy was used, and you can’t just assume one. If you want to say how much power a device draws, state how many watts (W) that is.
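
        For illustration, a minimal sketch (the example numbers are assumed, not from the thread) of how a power draw in watts only becomes an energy figure in kWh once you multiply by a duration:

        ```python
        # Energy (kWh) = power (kW) * time (h).
        # Watts alone are a rate; a duration is needed to get kWh.
        def energy_kwh(power_watts: float, hours: float) -> float:
            return (power_watts / 1000) * hours

        print(energy_kwh(100, 1))     # 0.1 kWh   (100 W for an hour)
        print(energy_kwh(100, 0.25))  # 0.025 kWh (the same draw for 15 minutes)
        ```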

      • __nobodynowhere@startrek.website

        A watt is 1 joule per second (1 J/s): every second, your device draws 1 joule of energy. Energy transferred per unit of time is called “power” and is a rate of energy transfer.

        A watt-hour is (1 J/s) * (1 hr)

        This can be rewritten as (3600 J/hr) * (1 hr). The “per hour” and the “hour” cancel out, which makes 1 watt-hour equal to 3600 joules.

        1 kWh is 3,600 kJ or 3.6 MJ
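
        The same conversion as a short sketch (just restating the arithmetic above in Python):

        ```python
        # 1 W = 1 J/s, so 1 Wh = 1 J/s * 3600 s = 3600 J, and 1 kWh = 3.6 MJ.
        SECONDS_PER_HOUR = 3600

        watt_hour_joules = 1 * SECONDS_PER_HOUR          # 3600 J
        kilowatt_hour_joules = 1000 * SECONDS_PER_HOUR   # 3 600 000 J = 3.6 MJ

        print(watt_hour_joules)       # 3600
        print(kilowatt_hour_joules)   # 3600000
        ```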