What exactly is the added benefit _users_ get from the modern way of web development that wasn't there in, say, 2011? I remember using an Android 2.x phone with an 800 MHz SoC back in the day, and web browsing worked fine, really. What can modern web sites offer that justifies the increased CPU usage?
> <i>It isn’t far fetched that a device would reduce power consumption when on battery, it makes the device last longer, makes users happier.</i><p>Does it? If I reduce the amount of power available to complete a task, but proceed to complete the task <i>anyway</i>, doesn't the phone now potentially need to keep the peripherals (particularly the screen) on for longer while the user waits for the slowed-down task to finish, consuming <i>more</i> energy than if it had completed the task using 100% of the CPU?<p>(Unless running at 80% of the CPU can accomplish the same task using less energy (not power) and recoup enough to make up for the losses incurred by the extra time.)<p>It isn't intuitive to me at all that what we're doing isn't making phones useless just so their battery lasts all day.
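For a back-of-the-envelope sense of that trade-off, here's a tiny sketch with entirely made-up numbers (the power draws and durations are hypothetical, purely to show the energy = power × time arithmetic):

```typescript
// Back-of-the-envelope comparison: does throttling the CPU save energy once
// you account for the screen staying on while the user waits? All numbers
// below are hypothetical, for illustration only.

interface Scenario {
  cpuPowerW: number;    // average CPU power draw while working (watts)
  screenPowerW: number; // screen (and other peripherals) while waiting (watts)
  taskSeconds: number;  // how long the task takes to finish
}

function totalEnergyJoules({ cpuPowerW, screenPowerW, taskSeconds }: Scenario): number {
  // Energy = power × time; both the CPU and the screen draw power for the whole wait.
  return (cpuPowerW + screenPowerW) * taskSeconds;
}

const fullSpeed: Scenario = { cpuPowerW: 2.0, screenPowerW: 1.0, taskSeconds: 4 };
// Throttled CPU draws less power, but the same work takes longer.
const throttled: Scenario = { cpuPowerW: 1.2, screenPowerW: 1.0, taskSeconds: 7 };

console.log(totalEnergyJoules(fullSpeed)); // (2.0 + 1.0) * 4 = 12 J
console.log(totalEnergyJoules(throttled)); // (1.2 + 1.0) * 7 = 15.4 J — worse overall
```

In this toy case throttling loses, because the fixed screen cost keeps running for the extra seconds; it only wins if the CPU savings outweigh that.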
The author presents the lack of device-model and battery-level information as a problem. A website/web application may reasonably disable functionality based on browser feature availability, but if you disable functionality based on the device model, you're doing something wrong.<p>If you have a scenario where you can disable <i>some</i> things to make the experience better while maintaining functionality, then why in the <i>world</i> would you ever enable those things? This is simply beyond me.<p>Any suggestion that improves the situation is simply a generic performance optimization that benefits everyone.
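For what it's worth, a minimal sketch of that distinction — feature detection plus an explicit user signal, rather than guessing from the device model (enableFancyParallax is a made-up stand-in for whatever the optional heavy thing is):

```typescript
// Sketch: gate expensive behaviour on feature availability and user
// preference, not on a guessed device model or battery level.

function enableFancyParallax(): void {
  /* ...expensive scroll-linked animation setup (hypothetical)... */
}

// Feature detection: does the browser support what we need?
const hasIntersectionObserver = 'IntersectionObserver' in window;

// Respect an explicit user/system signal instead of inferring hardware.
const prefersReducedMotion =
  window.matchMedia('(prefers-reduced-motion: reduce)').matches;

if (hasIntersectionObserver && !prefersReducedMotion) {
  enableFancyParallax();
}
// Anything that can safely be skipped here was optional to begin with.
```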
Given that components are going to need to contain JavaScript anyway, what's the value in having them also contain separate HTML? "Browsers are good at handling documents" isn't really true any more; at least, browsers don't seem appreciably better at handling documents than JavaScript is.
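To make the question concrete, a rough sketch of the two styles being compared — a made-up custom element whose markup is written as HTML (here inlined as a string; it could equally live in a separate template) versus one that builds the same DOM imperatively in JavaScript:

```typescript
// Two hypothetical custom elements producing the same output.

// Style 1: markup declared as HTML inside the component.
class GreetingCard extends HTMLElement {
  connectedCallback(): void {
    this.attachShadow({ mode: 'open' }).innerHTML = `
      <style>p { font-weight: bold; }</style>
      <p>Hello, <slot></slot>!</p>
    `;
  }
}

// Style 2: the same structure built imperatively in JavaScript.
class GreetingCardJs extends HTMLElement {
  connectedCallback(): void {
    const root = this.attachShadow({ mode: 'open' });
    const p = document.createElement('p');
    p.style.fontWeight = 'bold';
    p.append('Hello, ', document.createElement('slot'), '!');
    root.append(p);
  }
}

customElements.define('greeting-card', GreetingCard);
customElements.define('greeting-card-js', GreetingCardJs);
```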