"This is not a Motorola MC14500 computer, but it was the MC14500 that introduced me to the idea of one-bit computing. Exploring ways to reduce the chip count, a series of pencil & paper designs revealed the MC14500 itself could be omitted!"<p>That's really gold. I love these optimization rounds.<p>My worst experience along these lines: I once spent a lot of time optimizing a function that looked like it was eating a whole pile of time, only to realize afterwards that a hand-optimized assembly version had already been graciously provided in the same subdirectory. And it ran a lot faster than mine :(
The same author also has a very nice and detailed description of another project, the "KimKlone" [0], which is basically a coprocessor for the 65C02 CPU, injecting itself onto the address and data buses in such a way as to implement new "opcodes" for the 6502, effectively giving it extra registers and instructions designed to hardware accelerate a FORTH interpreter.<p>[0] <a href="http://laughtonelectronics.com/Arcana/KimKlone/Kimklone_intro.html" rel="nofollow">http://laughtonelectronics.com/Arcana/KimKlone/Kimklone_intr...</a>
That's awesome. It's an actual use case for an OISC.<p><a href="https://en.wikipedia.org/wiki/One_instruction_set_computer" rel="nofollow">https://en.wikipedia.org/wiki/One_instruction_set_computer</a>
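For anyone unfamiliar with the idea: the classic OISC instruction is "subleq" (subtract and branch if less than or equal to zero), and everything else is built out of it. Here's a minimal sketch of a subleq machine in Python, with a tiny hand-assembled program that adds two numbers; the memory layout (data tucked in after the instructions) is just my own choice for the example:

```python
def subleq(mem, pc=0):
    """Run a subleq program in-place.
    Each instruction is three cells (a, b, c):
        mem[b] -= mem[a]; branch to c if the result <= 0, else fall through.
    A negative branch target halts the machine."""
    while pc >= 0:
        a, b, c = mem[pc], mem[pc + 1], mem[pc + 2]
        mem[b] -= mem[a]
        pc = c if mem[b] <= 0 else pc + 3
    return mem

# Program: B += A, using a scratch cell Z (the standard subleq ADD idiom).
# Cells 0-8 are three instructions; cells 9, 10, 11 are A=7, B=5, Z=0.
prog = [
    9, 11, 3,    # Z -= A  (Z becomes -A, always <= 0, so branch to 3)
    11, 10, 6,   # B -= Z  (i.e. B += A)
    11, 11, -1,  # Z -= Z  (clears Z to 0, branches to -1: halt)
    7, 5, 0,     # A, B, Z
]
subleq(prog)
print(prog[10])  # B now holds 7 + 5
```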
This seems extremely humble and yet extremely useful. I haven't started looking into electrical engineering, though I am interested in it. How would this sort of chip scale? For something like an add, would you send each bit to a chip, then collect all the 1+1s and add them into the correct bit positions? I'm sincerely curious. Maybe I think this is more amazing than it is.
Neat project, though as the author points out it's easier to switch to a microcontroller - and cheaper nowadays; you can get an ATtiny84 that would replace that whole circuit for less than what the 2716 EPROM alone costs.