Hi guys!

A few months ago I purchased a new AMD PC with a Ryzen 7 7700 CPU, 32GB of RAM, and a 7800 XT GPU. Since then I’ve noticed that my electric bill has increased (compared to when I used an Intel i7-6700 with a GTX 1070). I was wondering: is it possible to use a hybrid GPU setup, kind of like laptops have, where the iGPU in the CPU handles normal tasks and the discrete GPU is only activated on demand? Would the dGPU be unpowered/sleeping in the meantime?

…all this from a Linux perspective; I’m running Nobara 40.

Thanks!

  • Fisch@discuss.tchncs.de · 9 days ago

    I don’t think that’s your PC. I actually measured how much power my PC and monitors consumed in a week, used that to extrapolate a yearly figure, and compared it to my total energy usage for that year. My PC setup was only a small fraction of the yearly usage. The vast majority of your energy is gonna be consumed by things like fridges, ovens, heating, water pumps, etc.

    • iturnedintoanewt@lemm.ee (OP) · 9 days ago

      Well, I used to turn off the monitors and leave the PC running… and this last month I started actively suspending it, and the bill went down. I’m waiting to set up some sockets with smart power-measuring features, which should give me more reliable data on power consumption… but I’m afraid I might need to wait a few days for those.

      • A_Random_Idiot@lemmy.world · 8 days ago

        Don’t turn off your monitors.

        The power on/off cycles significantly reduce their lifespan compared to just leaving them on.

        The pennies you save on the power bill aren’t gonna add up to enough to cover replacing the monitor more often.

    • hendrik@palaver.p3x.de · 9 days ago · edited

      Good call. Though if you use natural gas for heating and water heating and don’t own an AC, your total electricity usage will be a lot lower, and a new gaming PC will be noticeable on the bill. Especially if it coincides with a new game you’ve been playing nonstop for a few weeks. But I agree, there are a lot of electrical devices in a regular home. And my usage changes with the seasons: for example, I watch a lot more TV when it’s rainy and cold outside, and the TV draws something like 100W, and in winter I turn on the lights hours earlier than I would in summer. It’s difficult to tell the devices in a home apart just by looking at the electricity bill.

      You should have a look at your computer, though. Have you tried powertop? And I suppose there is a tool for AMD graphics cards that tells you whether it’s running at full speed all the time or clocking down as it’s supposed to. Or you could get a power meter to plug your PC into, and take one measurement with the GPU installed and one with it removed entirely.
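
      If it helps, here’s a minimal Python sketch of that idea, assuming the stock in-kernel amdgpu driver and that card0 is the dGPU (the card index and exact file names can vary between systems and GPU generations). It just reads the sysfs files amdgpu exposes for the current clock level, load, and board power:

      ```python
      # Check whether an amdgpu card clocks down at idle.
      # Assumes the in-kernel amdgpu driver; the card index (card0/card1) may differ.
      from pathlib import Path

      dev = Path("/sys/class/drm/card0/device")

      # pp_dpm_sclk lists the core clock levels; the active one is marked with '*'.
      for line in (dev / "pp_dpm_sclk").read_text().splitlines():
          if "*" in line:
              print("current core clock level:", line.strip())

      # gpu_busy_percent should sit near 0 on an idle desktop.
      print("GPU busy %:", (dev / "gpu_busy_percent").read_text().strip())

      # hwmon power1_average reports average board power in microwatts.
      for hwmon in (dev / "hwmon").glob("hwmon*"):
          p = hwmon / "power1_average"
          if p.exists():
              print("board power:", int(p.read_text()) / 1_000_000, "W")
      ```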

      • Fisch@discuss.tchncs.de · 9 days ago

        I did the power measurement using a power meter. We also use natural gas for heating and don’t have any AC.

        • hendrik@palaver.p3x.de · 9 days ago

          I mainly meant to address OP with those recommendations (and make the general point that it depends on circumstances). But sure, it’s the same for me: my PC makes up a small share of my total electricity. Each shower I take adds more to the electricity bill than having the computer running a full day. And all the household appliances add up: doing laundry, cooking, baking a cake in the oven. The fridge and the like run 24/7; I measured that too, and it’s about 260kWh a year. I forgot the numbers for the computer, but I don’t really play games, so my numbers don’t translate to this situation anyway.

  • LavenderDay3544@lemmy.world · 9 days ago

    Any halfway decent GPU driver and device firmware will put the card into a low power state when it’s idle.
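
    One way to sanity-check that on Linux, as a sketch assuming the standard sysfs layout (card numbering varies per machine), is to read the kernel’s runtime power-management state for each card:

    ```python
    # Show the runtime power-management state of each GPU via standard sysfs files.
    from pathlib import Path

    for card in sorted(Path("/sys/class/drm").glob("card[0-9]")):
        dev = card / "device"
        control = (dev / "power/control").read_text().strip()        # "auto" = runtime PM allowed
        status = (dev / "power/runtime_status").read_text().strip()  # "active" or "suspended"
        print(f"{card.name}: control={control}, status={status}")
    ```

    Note that a card that’s driving a display will normally report "active"; only a GPU with nothing connected to it can actually runtime-suspend.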

      • LavenderDay3544@lemmy.world · 2 days ago

        I have an RTX 4090 and use the proprietary driver. It works just fine on Fedora and Windows 11 alike. So IDK what to tell you. AMD and Intel are even easier since the drivers are baked into the Linux kernel. I have an AMD iGPU in my desktop and an Intel one in my laptop. Both work just fine and handle power management correctly.

        • Atemu@lemmy.ml · 1 day ago

          It’s nice that it’s well integrated but that doesn’t mean it works well.

          Power management of AMDGPUs has always been an absolute shitshow from my perspective.

          With dGPUs they’ve now resorted to always running them in the highest power mode because they couldn’t get power management to properly function.

          I can’t speak for modern intel GPUs but my old ones were fine.

  • slazer2au@lemmy.world · 9 days ago

    Has the actual kWh usage increased on your bill? Power bills are something like 50% fees, so double-check that the fees aren’t the thing catching you out.

  • mox@lemmy.sdf.org · 8 days ago · edited

    I built a new machine pretty recently, also with an RX 7800XT GPU (factory overclocked). When sitting idle at the desktop, the system draws about the same amount of power as my old machine did with an RX 480. So I think trying to put the big GPU to sleep during desktop use might be barking up the wrong tree.

    I suggest getting a power monitor, like a Kill-A-Watt, and taking measurements while you experiment. Here are some ideas to consider:

    • Are you using multiple monitors? I have read that newer AMD GPUs sometimes draw more power than they should in this case. It might depend on the resolution and/or windowing system in use. (I don’t remember if the reports I read were on Wayland or Xorg.) If so, it’s almost certainly a driver issue.
    • Are you using nonstandard timings? Have you tried different refresh rates? (See the sketch after this list for a quick way to check whether your VRAM clock is stuck at its maximum.) https://community.amd.com/t5/graphics-cards/which-monitor-timing-parameter-allows-gpu-vram-frequency-to/td-p/318483
    • Have you been playing games for hours every day, with no frame rate limit? The graphics card can draw considerably more power pushing polygons at 1440p@180Hz than at 90Hz, for example, and I don’t think the wattage progression from idle to full load is linear.
    • Are you using recent kernel and firmware versions?
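
    On the timings/VRAM point above, here is a small sketch (again assuming amdgpu and card0; adjust per system) that shows whether the memory clock is pinned at its top level, which is the usual symptom of the multi-monitor/timing issue:

    ```python
    # Print amdgpu memory clock levels; '*' marks the active one.
    # If the highest level stays active on an idle desktop, the VRAM clock
    # is likely being pinned by the monitor configuration/timings.
    from pathlib import Path

    for line in Path("/sys/class/drm/card0/device/pp_dpm_mclk").read_text().splitlines():
        print(line.strip() + ("  <-- active" if "*" in line else ""))
    ```
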
  • Max-P@lemmy.max-p.me · 9 days ago

    That should mostly be the default. My secondary Vega 64 reports using only 3W, which would be worth chasing on a laptop, but I doubt 3W affects your electricity bill. It’s nothing compared to the overall power usage of the rest of the desktop and the monitors. Pretty sure even my fans use more.

    The best way to address this is to first take proper measurements. Maybe get a Kill-A-Watt and measure usage with and without the card installed to get the true draw at the wall. Also maybe get a baseline with as little hardware as possible. With that data you can calculate roughly how much it costs to run the PC and how much each component contributes, and from there it’s easier to decide whether it’s worth it; a rough sketch of the arithmetic is below.
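
    Something like this, where every number is a made-up placeholder to swap for your own readings and local rate:

    ```python
    # Back-of-the-envelope yearly cost from wall measurements.
    # All values are placeholders; substitute your own numbers.
    idle_watts = 70          # measured at the wall, desktop idle
    load_watts = 350         # measured at the wall while gaming
    idle_hours_per_day = 6
    load_hours_per_day = 2
    price_per_kwh = 0.30     # your local electricity rate

    kwh_per_year = (idle_watts * idle_hours_per_day
                    + load_watts * load_hours_per_day) * 365 / 1000
    print(f"{kwh_per_year:.0f} kWh/year, costing about {kwh_per_year * price_per_kwh:.2f}/year")
    ```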

    Just the electric bill being higher isn’t a lot to go on. It could just be that it’s getting cold, or hot. Little details can really throw expectations off. For example, mining crypto during the winter is technically cheaper than not mining for me, because I have electric heat: between 500W going into a heating strip and 500W mining crypto, both produce the same amount of heat in the room, but one of them also makes me a few cents as a byproduct. You have to consider that sort of thing when you’re optimizing for cost rather than maximizing battery life on a laptop.

    • iturnedintoanewt@lemm.ee (OP) · 9 days ago

      Lol… I didn’t consider that. What crypto can you mine that gives you at least a few cents back? Sorry for the off-topic. Yeah, I’m in the process of setting up some sockets with Zigbee switches and power metering. I’ll install the one for the desktop soon and start measuring more accurately. Is there a way to know which GPU you’re using at any given time?
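
      (On that last question, one rough way to check on Linux is to watch each card’s load while you run something. A sketch, assuming the driver exposes gpu_busy_percent, which amdgpu does but other drivers may not:)

      ```python
      # Poll per-GPU load to see which card is doing the rendering (Ctrl+C to stop).
      import time
      from pathlib import Path

      while True:
          for card in sorted(Path("/sys/class/drm").glob("card[0-9]")):
              busy = card / "device/gpu_busy_percent"
              if busy.exists():  # exposed by amdgpu; other drivers may lack it
                  print(f"{card.name}: {busy.read_text().strip():>3}% busy", end="   ")
          print()
          time.sleep(1)
      ```

      On a hybrid setup, launching a game with the DRI_PRIME=1 environment variable set should make the second card’s number jump.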

  • tekato@lemmy.world · 8 days ago

    For this to work on a desktop PC, you would need to connect your display cable to the iGPU instead of the dGPU, and the driver should take care of the rest. This might yield slightly lower performance when the dGPU is used for processing (probably unnoticeable, depending on circumstances).
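
    To double-check which card the monitor is actually attached to, here’s a sketch using the standard DRM connector entries in sysfs (connector names vary by machine):

    ```python
    # List connected DRM connectors; the card prefix shows which GPU drives them.
    from pathlib import Path

    for conn in sorted(Path("/sys/class/drm").glob("card*-*")):
        status_file = conn / "status"
        if status_file.exists() and status_file.read_text().strip() == "connected":
            # e.g. "card1-HDMI-A-1" means the display hangs off card1
            print(conn.name, "-> connected")
    ```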