I've heard that streaming, especially video, uses a lot of electricity and can be bad for the environment. So I wonder: how does watching Kim Possible on Mickey Channel compare to watching it on Mickey Plus? Similarly, and maybe an even better comparison, how does listening to something like 623.7 FWGR Radio over FM compare to listening to the station's online stream?
Electrical engineer here. There is almost no difference.
The cost of streaming video from a server to your computer is pretty small, basically just transferring the bytes from a hard drive to a network card. This happens in a datacenter on a big server designed to be efficient at it and to serve a ton of people at once. Your own electricity consumption on your viewing device is likely much higher than that. You can measure your own consumption with a Kill-A-Watt or similar device, but here are some averages from measurements I've made on my devices:
If you look at your computer’s CPU usage while watching video, it’s mostly idle. So most of the power consumption is the screen’s backlight.
Assuming worst-case coal power releasing 0.4 kg of carbon per kWh, a large TV drawing roughly 300 W, and say 10% overhead for the server's energy cost, that's about 0.13 kg of carbon per hour. So don't worry about it.
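If you want to redo that back-of-the-envelope math yourself, here's a quick sketch in Python (the 300 W figure for a large TV and the 10% overhead are the assumptions above, not measured values):

```python
# Back-of-the-envelope carbon estimate for one hour of streaming.
tv_watts = 300              # assumed draw for a large TV, mostly backlight
server_overhead = 0.10      # assumed datacenter/network share on top of the TV
coal_kg_carbon_per_kwh = 0.4  # worst-case coal power

kwh_per_hour = tv_watts * (1 + server_overhead) / 1000   # ~0.33 kWh
carbon_kg_per_hour = kwh_per_hour * coal_kg_carbon_per_kwh

print(f"{carbon_kg_per_hour:.2f} kg of carbon per hour")  # ~0.13 kg
```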
I would counter that it takes significantly more power to provide someone with an internet connection than with a broadcast antenna.
Google tells me a low-power TV transmitter broadcasts at around 2.3 kW. I've deployed datacenters full of racks where each rack pulls more than that. Once you take into account all the networking gear between the server and the consumer, the internet easily requires more resources. Routers, switches and servers can be pretty power hungry.
Yeah, true, but those are mostly fixed costs, with a pretty low incremental cost for each video delivered. We have to pay the fixed costs regardless.
But that's even more true for broadcast. One 2.3 kW transmitter works just as well for one receiver as it does for 10M. Hell, even 100M, depending on geography, population density, and frequency band. That's not true of network infrastructure.
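To put numbers on that scaling argument, here's a small sketch: the transmitter's 2.3 kW is a fixed cost, so the per-receiver share collapses as the audience grows (the audience sizes are just illustrative):

```python
# Fixed-cost scaling of a broadcast transmitter: the same 2.3 kW serves
# any number of receivers, so the per-receiver share is transmitter / N.
transmitter_watts = 2300

for receivers in (1, 1_000, 10_000_000, 100_000_000):
    per_receiver_w = transmitter_watts / receivers
    print(f"{receivers:>11,} receivers -> {per_receiver_w:.6f} W each")
```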
Yeah, phones use hardly any power. For example, my phone would use about 7.3 kWh per year if I charged it fully every night, and I don't actually need to, because it's usually still at 60% battery at the end of the day.
Electricity is about 13.88 cents per kWh where I live, so that works out to roughly $1 per year to charge my phone, even if it did need a full charge every night.
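If you want to check that arithmetic, here it is as a tiny sketch using the same numbers:

```python
# Yearly cost of charging a phone, using the figures above.
kwh_per_year = 7.3            # full charge every night for a year
price_usd_per_kwh = 0.1388    # local electricity price

cost_per_year = kwh_per_year * price_usd_per_kwh
print(f"${cost_per_year:.2f} per year")  # ~$1.01
```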
I've had AccuBattery installed for as long as I've had my phone, and Sony has a feature that force-stops charging at 80%. In the 3.5 years I've had the phone, I've charged 1,814,046 mAh in total. Assuming an average of 3.8 V (you don't spend much time in the constant-voltage phase when you stop at 80%) and my local price of 0.32 €/kWh, that is only 2.14 € over 3.5 years.
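Same back-of-the-envelope math with those AccuBattery numbers, as a sketch (the 3.8 V average cell voltage is the assumption stated above):

```python
# Lifetime charging cost from AccuBattery's cumulative charge figure.
charged_mah = 1_814_046
avg_volts = 3.8               # assumed average cell voltage (stopping at 80%)
price_eur_per_kwh = 0.32      # local electricity price

kwh = charged_mah / 1000 * avg_volts / 1000   # mAh -> Ah -> Wh -> kWh
cost_eur = kwh * price_eur_per_kwh
print(f"{kwh:.2f} kWh, {cost_eur:.2f} EUR over 3.5 years")  # ~6.9 kWh, ~2.2 EUR
```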
Phones are absolutely crazy efficient.
I don't think the problem comes from the client devices.
I really don't see how broadcasting could consume the same amount of energy as downloading. When you broadcast something, it doesn't matter how many clients are watching the content, but for streaming or downloading, the more clients there are, the more energy you need to support the load.