> Is it worth it?

> Um. Not really. At current Ethereum prices (2021-02-26), it generates $0.14 of profit per day. It's still a profit, but a very minuscule one.

The real question is whether it makes more money than it costs in electricity. Apple claims it uses 39 W max. That adds up to 0.936 kWh of electricity per day, which we can approximate as 1 kWh. Electricity prices range from ~4¢ to ~20¢ per kWh in the USA, so depending on where you live you'll come out somewhere between a 6¢ loss and a 10¢ profit per day.

So yeah, this thing can be profitable. But the capital cost of buying the M1 in the first place is going to be really hard to pay off.
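To make that arithmetic concrete, here's a tiny Python sketch using the same assumed numbers ($0.14/day of revenue, Apple's 39 W maximum, 4-20¢ per kWh); it keeps the exact 0.936 kWh rather than rounding up to 1 kWh.

    # Per-day profit of an M1 mining Ethereum, using the figures quoted above.
    REVENUE_PER_DAY = 0.14            # USD/day at 2021-02-26 prices (assumed)
    POWER_DRAW_W = 39                 # watts, Apple's stated maximum

    energy_kwh_per_day = POWER_DRAW_W * 24 / 1000   # ~0.936 kWh/day

    for price_per_kwh in (0.04, 0.10, 0.20):        # cheap, typical, expensive US rates
        cost = energy_kwh_per_day * price_per_kwh
        profit = REVENUE_PER_DAY - cost
        print(f"${price_per_kwh:.2f}/kWh -> net ${profit:+.3f}/day")

That works out to roughly +$0.10/day at 4¢/kWh down to about -$0.05/day at 20¢/kWh, close to the range above (the small difference comes from not rounding to 1 kWh).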
No one has mentioned that ETH mining is memory-bandwidth bound. M1 Macs, unfortunately, have slow GPU memory compared to a discrete GPU's GDDR6X.

Here's a great post showing that: https://www.vijaypradeep.com/blog/2017-04-28-ethereums-memory-hardness-explained - GPU mining is only about 10% slower than the theoretical memory-bound maximum (i.e. if compute were infinitely fast).
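To put a number on "memory-bandwidth bound": per the linked post, each Ethash hash reads 64 random 128-byte pages from the DAG, about 8 KiB of DRAM traffic per hash, so peak memory bandwidth alone caps the hashrate. A minimal sketch, using spec-sheet bandwidth figures (assumptions, not measurements):

    # Upper bound on Ethash hashrate from memory bandwidth alone.
    BYTES_PER_HASH = 64 * 128   # 64 DAG accesses of 128 bytes each = 8 KiB/hash

    # Peak memory bandwidth in bytes/s (vendor spec-sheet numbers, assumed).
    devices = {
        "RTX 3090 (GDDR6X)": 936e9,
        "RTX 3080 (GDDR6X)": 760e9,
        "M1 (LPDDR4X)":       68e9,
    }

    for name, bandwidth in devices.items():
        ceiling_mh = bandwidth / BYTES_PER_HASH / 1e6
        print(f"{name}: <= {ceiling_mh:.0f} MH/s")

That puts the M1's ceiling around 8 MH/s versus roughly 90-115 MH/s for the GDDR6X cards, before any compute or software inefficiency, which is why the unified LPDDR4X memory is the limiting factor here.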
It's not crazy, but it's telling the software that it's an Intel GPU, so none of the optimizations target this GPU's strengths. I could see this becoming more useful if the miner were coded to Apple's GPU characteristics, but I'm not even sure anyone outside of Apple knows what those are right now, or how to optimize mining software for their GPU.
Cool experiment. But I don't recommend getting into Ethereum mining now, as the network is on the verge of switching to Proof of Stake as part of the migration to Ethereum 2.0.
Quick, rough calculations suggest to me that the RTX 3080 is about what you need to turn a profit: for roughly $200k of annual revenue you need around $600k in hardware. That's three years to pay itself off, which is about the lifetime of the hardware, my guess. And that's assuming you could even find the hardware; I didn't include stuff like motherboards/racks/etc.

If it's that hard to make a profit on the RTX 3080, then I doubt the M1 Mac Mini was ever in the running.
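For what it's worth, a back-of-the-envelope sketch of that payback math, using only the rough figures above (so ignoring electricity, motherboards, racks, etc., as noted):

    # Simple payback-period estimate from the numbers in the comment above.
    HARDWARE_COST = 600_000       # USD of RTX 3080s (rough figure, assumed)
    ANNUAL_REVENUE = 200_000      # USD/year of mining revenue (rough figure, assumed)

    payback_years = HARDWARE_COST / ANNUAL_REVENUE
    print(f"Payback period: {payback_years:.1f} years")   # ~3.0 years

Any electricity or overhead costs only push that payback further out.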
I thought the main strengths distinguishing the M1's performance were its low idle power draw, which saves a lot of energy, and the shared CPU/GPU memory. Am I remembering that right?

Does either of those things translate into better efficiency for intensive mining, compared with a dedicated traditional rig where you're running flat out all the time and leaning on the GPU?
If mining speed is any indication of GPU performance (I am thinking TensorFlow training), the M1 is about 50 times slower than an RTX 3090, and about 18 times slower than a 1080 Ti.