I'm not sure where this is going. Doesn't the computational cost to mine bitcoin or ether change over time?<p>Also, we already have units for quantifying computation: we can talk about millions or trillions of floating-point operations, or millions or trillions of instructions executed. The problem is that it's very hard to reach a system's peak theoretical capacity, and when you're talking about trillions of instructions, the time taken depends on your instruction mix.<p>You'd run into a similar problem treating ether as a computational unit: a machine can be designed to mine ether really fast while being poorly suited or unoptimized for other types of computation, so the compute power it takes your system to mine an ether might not be a good way to quantify the time that same system would need to compute something else.
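<p>To make the instruction-mix point concrete, here's a rough sketch (in Python, so the absolute numbers are meaningless — only the relative gap matters): the same count of "operations" can take very different wall-clock time depending on what each operation is.

```python
import timeit

N = 1_000_000

# Same number of high-level operations, very different per-operation cost:
# cheap integer addition vs. comparatively expensive float exponentiation.
add_time = timeit.timeit("x + 1", setup="x = 1", number=N)
pow_time = timeit.timeit("x ** 0.5", setup="x = 1e9", number=N)

print(f"{N} adds: {add_time:.3f}s")
print(f"{N} pows: {pow_time:.3f}s")
```

So "N operations" alone doesn't pin down runtime, which is exactly why MIPS/FLOPS figures (or "ethers mined") only transfer between workloads with a similar operation mix.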