It is said that streaming, especially video, takes a lot of electricity and can be bad for the environment. So I wonder: how does watching Kim Possible on Mickey Channel compare to watching it on Mickey Plus? Similarly, and maybe an even better comparison, how does listening to something like 623.7 FWGR Radio over FM compare to listening to the station’s online stream?
Electrical engineer here. There is almost no difference.
The cost of streaming video from a server to your computer is pretty small, basically just transferring bytes from a hard drive to a network card. This happens in a datacenter on a big server designed to be efficient at it and to serve a ton of people at once. Your own electricity consumption on your viewing device is likely much higher than that. You can measure your consumption with a Kill-A-Watt or similar device, but here are some averages from measurements I’ve made on my devices (a rough per-hour sketch follows the list):
- PC with 27" LCD monitor: 150W
- 50" TV: 300W
- Laptop with internal 14" screen: 40W
- Phone with 5" screen: 10W roughly, but it’s complicated
- Phone with screen off, speaker only: 2W (guessing here)
- Handheld FM radio: less than 1W
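As a rough sketch of what those numbers mean per hour of viewing (the electricity price here is an assumption I’m adding, not a measurement; substitute your own rate):

```python
# Convert the measured power draws above into energy and cost per hour of
# viewing. PRICE_PER_KWH is an assumed figure, not from the measurements.
DEVICES_W = {
    "PC with 27in monitor": 150,
    "50in TV": 300,
    "Laptop, 14in screen": 40,
    "Phone, screen on": 10,
    "Phone, screen off (speaker only)": 2,
    "Handheld FM radio": 1,
}
PRICE_PER_KWH = 0.14  # USD per kWh, assumed

for name, watts in DEVICES_W.items():
    kwh_per_hour = watts / 1000          # 1 W for 1 h = 1 Wh = 0.001 kWh
    cost_per_hour = kwh_per_hour * PRICE_PER_KWH
    print(f"{name}: {kwh_per_hour:.3f} kWh/h, about ${cost_per_hour:.4f}/h")
```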
If you look at your computer’s CPU usage while watching video, it’s mostly idle. So most of the power consumption is the screen’s backlight.
Assuming worst-case coal power at 0.4 kg of carbon per kWh, the large 300 W TV from the list, and let’s say 10% overhead for the server’s energy cost, that’s about 0.13 kg of carbon per hour of viewing. So don’t worry about it.
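Spelled out, using only the assumptions stated above:

```python
# Carbon estimate per hour of viewing on the 50" TV, using the stated
# assumptions: 300 W TV, 10% server-side overhead, 0.4 kg carbon per kWh.
tv_power_kw = 0.300
server_overhead = 0.10
carbon_per_kwh = 0.4          # worst-case coal, kg carbon per kWh

kwh_per_hour = tv_power_kw * (1 + server_overhead)    # 0.33 kWh
carbon_per_hour = kwh_per_hour * carbon_per_kwh       # ~0.13 kg
print(f"{carbon_per_hour:.2f} kg carbon per hour of viewing")
```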
I would counter that it takes significantly more power to provide someone with internet access than with a broadcast antenna.
Google tells me a low-power TV transmitter broadcasts at around 2.3 kW. I’ve deployed datacenters full of racks where each rack pulls more than that. Once you take into account all the networking gear between the server and the consumer, the internet easily requires more resources. Routers, switches and servers can be pretty power hungry.
Yeah, true, but those are mostly fixed costs, with a pretty low incremental cost for each video delivered. The fixed costs we have to pay regardless.
But that’s even more true for broadcast. One 2.3 kW transmitter works just as well for one receiver as it does for 10 million. Hell, even 100 million, depending on geography, population density, and frequency band. That’s not true of network infrastructure.
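Here’s a rough sketch of that scaling argument. The per-stream streaming figure is purely my assumption for illustration; real numbers vary a lot by CDN and network:

```python
# A broadcast transmitter's power is fixed no matter how many receivers
# tune in, while streaming adds some incremental power per active stream.
TRANSMITTER_W = 2300     # ~2.3 kW low-power TV transmitter, as above
PER_STREAM_W = 5         # assumed incremental server + network W per stream

for viewers in (1, 1_000, 1_000_000, 10_000_000):
    broadcast_per_viewer = TRANSMITTER_W / viewers
    print(f"{viewers:>10,} viewers: broadcast {broadcast_per_viewer:10.4f} W each, "
          f"streaming ~{PER_STREAM_W} W each")
```

With one viewer the transmitter is wildly inefficient; past a few hundred viewers it wins easily, under these assumptions.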
Yeah, phones use hardly any power. For example, my phone would use about 7.3 kWh per year if it needed a full charge every night, which it doesn’t, because it’s usually still at around 60% battery at the end of the day.
Electricity is about 13.88 cents per kWh where I live, so that’s roughly $1 per year to charge my phone, even if it did need a full charge every night.
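Quick check of that arithmetic (the 20 Wh full-charge figure is my inference from the 7.3 kWh/year number, roughly a 5,000 mAh battery):

```python
# ~20 Wh per full charge, every night, at 13.88 cents/kWh.
battery_wh = 20                          # inferred full-charge energy
yearly_kwh = battery_wh * 365 / 1000     # ~7.3 kWh per year
price_per_kwh = 0.1388                   # USD per kWh
print(f"{yearly_kwh:.1f} kWh/year -> about ${yearly_kwh * price_per_kwh:.2f}/year")
```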
I’ve had AccuBattery for my phone’s whole lifetime, and Sony has a feature that lets you force charging to stop at 80%. In the past 3.5 years with the phone, I have charged 1 814 046 mAh. Assuming an average of 3.8 V (charging to 80% means not spending much time in the high-voltage CV stage) and my local price of 0.32 €/kWh, that is only about 2.14 € over 3.5 years.
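Running those numbers (the result is sensitive to the average voltage you assume; a nominal 3.7 V lands almost exactly on that figure, 3.8 V comes out a bit higher):

```python
# Lifetime charging cost from the AccuBattery total, at two plausible
# average cell voltages.
charged_mah = 1_814_046      # total charged over 3.5 years
price_per_kwh = 0.32         # EUR per kWh

for avg_voltage in (3.7, 3.8):
    kwh = charged_mah / 1000 * avg_voltage / 1000   # mAh -> Ah -> Wh -> kWh
    print(f"{avg_voltage} V avg: {kwh:.2f} kWh, {kwh * price_per_kwh:.2f} EUR over 3.5 years")
```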
Phones are absolutely crazy efficient.
I don’t think the problem comes from the client devices.
I really don’t see how broadcasting could consume the same amount of energy as downloading. When you broadcast something, it doesn’t matter how many clients are watching the content, but for streaming or downloading, the more clients there are, the more energy you need to support the load.
I’m a bit short on time, but I think “streaming” needs to be broken down into categories of scale. Streaming video from your home Plex server (shout-out to !homelab@lemmy.ml) is a lot different than Netflix’s video delivery system.
The latter intentionally stores the same content in multiple geographies, with caches at local data centers and sometimes even caches inside your ISP’s network. All of this distributes the load of millions of users, who can just as easily be in Florida as in Oregon. The duplication and redundancy mean a lot of power draw, well more than just a few disks spinning up.
Whereas a home server has just one copy of the content, and since it isn’t always streaming a video to you, it can save power by spinning down drives or through other optimizations. It is simply not possible to put a single number on “streaming” when such radically different delivery mechanisms can all plausibly be considered streaming.
I’d say that just as you can run your own server cooler (turning it off when not needed), Netflix servers are going to wind down during low demand and run at lower power. But while you’re picturing your last laptop as a server versus a data center, try to picture every household out there running their own “server” the same way. Some are watching, some aren’t. I think OP’s question is the more appropriate one: comparing streaming to broadcast rather than streaming versus local storage. Besides, how did you get that data in the first place? You either transported physical media or downloaded it from a server.
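A very rough amortization sketch of that point, with every number here being my own assumption rather than anything measured:

```python
# A home server draws its idle power 24/7 for a handful of viewing hours,
# while a shared edge/cache server is amortized over thousands of streams.
# This ignores the network gear and redundancy discussed above, so it only
# captures the server side.
home_idle_w = 30                   # assumed NAS/Plex box, drives mostly down
home_hours_watched_per_week = 10   # assumed

cdn_box_w = 500                    # assumed cache/edge server
cdn_concurrent_streams = 5000      # assumed

home_kwh_per_hour = home_idle_w * (7 * 24) / home_hours_watched_per_week / 1000
cdn_kwh_per_stream_hour = cdn_box_w / cdn_concurrent_streams / 1000

print(f"Home server: ~{home_kwh_per_hour:.2f} kWh per hour actually watched")
print(f"Shared server: ~{cdn_kwh_per_stream_hour:.4f} kWh per stream-hour")
```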
I can’t answer the question, but I’d like to add that there’s also the power used to broadcast the networks from the source (on top of what the user spends receiving them).