• vivadanang@lemm.ee · 6 months ago

    Makes me wonder when people are going to start caring about the power draw from PCs and consoles.

    I love gaming. Grew up gaming. Make my living working on the fringes of the game industry. And it makes me wonder when we're going to discuss the clock cycles in the room…

    I take comfort that the coal rollers and the assholes who fly 40 miles for nachos far outweigh whatever emissions my aging PC contributes, but I do worry about the aggregate: we're powering machines to generate heat to provide entertainment, and at some point that's going to come under examination.

    • SkyeStarfall@lemmy.blahaj.zone · 6 months ago

      Electricity isn't that expensive, especially if you have renewables. Stuff and transportation are. At the top of the list, the biggest cause of CO2 is consumerism.

      Remember that, once you have your electronics, the entertainment you get from them is competing against entertainment from physical things, or travel, or something else.

      In addition, in winter that electricity is turned into heat, which you need anyway to heat up your home. And heating has always taken much more electricity than video games in my experience.

      All things considered, video games are a fairly efficient form of entertainment. You can do everything digitally (cheap), and hardware can be made to be very power efficient (it just isn’t because electricity is cheap).

      • Sylvartas@lemmy.world · 6 months ago

        IIRC the thermal efficiency of a PC/console is basically the same as most electric heating appliances, i.e. an electric radiator or a computer converts something like 80% of the energy it draws into heat. So theoretically, if you're heating a room with electricity anyway, you're not polluting more by using a computer or console in it (apart from the servers/Internet consumption for online stuff).

        • SkyeStarfall@lemmy.blahaj.zone · 6 months ago

          Computers and electric heaters turn nearly 100% of the power they draw into heat, as do most other things. Heat is just a waste product; electric heaters work by effectively "wasting" electricity as heat.
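
          A quick back-of-the-envelope sketch of that point, with illustrative numbers (the wattages and durations below are assumptions, not measurements): since essentially every watt a device draws indoors ends up as heat in the room, the heat delivered depends only on draw × time, not on whether the device is a radiator or a PC.

          ```python
          # Sketch: heat delivered to a room = power draw x time, regardless of the device.
          # The wattages and durations below are illustrative assumptions.

          def heat_kwh(watts: float, hours: float) -> float:
              """Essentially all electrical power drawn indoors ends up as room heat."""
              return watts * hours / 1000.0

          pc_session = heat_kwh(watts=300, hours=4)        # assumed mid-range gaming PC, 4 h session
          radiator = heat_kwh(watts=1500, hours=0.8)       # assumed small electric radiator, ~48 min

          print(f"PC session heat:    {pc_session:.1f} kWh")
          print(f"Radiator (~48 min): {radiator:.1f} kWh")
          # Both put ~1.2 kWh of heat into the room; the PC just runs a game while doing it.
          ```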

    • callyral [he/they]@pawb.social · 6 months ago

      I imagine most computer-related pollution comes from big tech companies like Google (Alphabet), Microsoft, Amazon, etc., since servers (Google Drive, YouTube, OneDrive, AWS, AI stuff) may require regular replacement of parts and a lot of electricity.

      • flambonkscious@sh.itjust.works · 6 months ago

        There's a reason they like to build their data centres next to power stations: it's significantly cheaper. Hopefully that gives you an idea of just how much power they go through…

    • 31337@sh.itjust.works · 6 months ago

      I don't have a console, but I've hooked a Kill-A-Watt up to my crazy gaming PC with a TDP > 600 W. When working, browsing, listening to music, watching videos, etc., it only uses around 60 W, about the same as a single incandescent light bulb. When playing a modern AAA game, it uses around 250 W. Not great compared to the power consumption of a Switch or Steam Deck, but well over an order of magnitude less than typical U.S. household heating and cooling. I'd guess AI and crypto BS use more energy than all PCs combined. Though I guess we all indirectly use AI (or rather, get used by AI).
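
      Rough daily math from those meter readings (the HVAC figure and the hours are assumed ballparks, not measurements):

      ```python
      # Daily-energy comparison using the Kill-A-Watt readings quoted above.
      # HVAC_KWH_PER_DAY is an assumed ballpark for a U.S. home with electric heating/cooling.

      IDLE_W, GAMING_W = 60, 250        # measured draws from the comment
      HVAC_KWH_PER_DAY = 30.0           # assumption; varies hugely by home and season

      pc_kwh = (IDLE_W * 8 + GAMING_W * 3) / 1000.0   # assumed 8 h of desktop use + 3 h of gaming
      print(f"PC per day:   {pc_kwh:.2f} kWh")
      print(f"HVAC per day: {HVAC_KWH_PER_DAY:.1f} kWh (assumed)")
      print(f"Ratio:        roughly {HVAC_KWH_PER_DAY / pc_kwh:.0f}x")
      ```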

    • averyminya@beehaw.org · 6 months ago

      Even top-of-the-line gaming PCs hardly draw 750 W under full load. Mine is pretty much a maxed-out Gen4, and running Stable Diffusion puts it at 575 W at the absolute most, and that's including my monitor and peripherals (speakers with a subwoofer, USB devices, etc.). Normal gaming varies; Cyberpunk 2077 pushes it into the 450 W range sometimes, but not much beyond that. And even then, I'm gaming for maybe 3 hours at most?

      If I were running Stable Diffusion overnight, that's one thing, and it would definitely get my room to 90°F. But a few hours of gaming, even 8+ hours, isn't too much to account for, especially if it's used to offset other costs, for example an electric heater/radiator that draws 1500 W and has no use other than providing temporary heat.
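
      To put numbers on that offset (the 450 W and 1500 W figures are the ones mentioned here; the session length is an assumption):

      ```python
      # How much heater run-time an evening of gaming replaces.
      # 450 W (gaming draw) and 1500 W (heater) come from the comment; 3 h is assumed.

      GAMING_W, HEATER_W = 450, 1500
      SESSION_HOURS = 3.0

      session_heat_kwh = GAMING_W * SESSION_HOURS / 1000.0
      equivalent_heater_hours = session_heat_kwh * 1000.0 / HEATER_W

      print(f"Heat from the gaming session: {session_heat_kwh:.2f} kWh")
      print(f"Equivalent heater run-time:   {equivalent_heater_hours:.1f} h at {HEATER_W} W")
      ```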

      I also think we have plenty of ways to game with low power, if the mobile PC market is anything to go by. We don't necessarily need 3080s and 4090s drawing 500 W at full load. And if we adopted other means of powering our grid, the only issue left would be the heat generation, which is sort of a necessity anyway, so if we're going this way we may as well build homes with PC-generated heating in mind! (/s, but maybe let's do it?)