Is it possible to take RISC design to an extreme and build a core
with no practical vectors for harvesting entropy from timekeeping?
Something like lambda calculus on a chip, except driving multiple
distinct output devices rather than just a theoretical "tape". Of course
physical damage will always create some entropy, but not the kind
that can be harnessed in any way anyone can currently imagine exploiting.
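
To make "vector for entropy based on timekeeping" concrete, here is a
rough sketch (in C, assuming an x86-style core that exposes a
fine-grained cycle counter via RDTSC; the intrinsic and the constants
are just illustrative) of the usual trick: folding timing jitter into
an entropy pool. A core with no architecturally visible fine-grained
timer would leave a loop like this nothing to measure.

```c
/* Sketch of a timekeeping-based entropy harvester. Assumes x86 with
 * RDTSC exposed through <x86intrin.h>; not a hardened RNG. */
#include <stdint.h>
#include <stdio.h>
#include <x86intrin.h>

int main(void) {
    uint64_t pool = 0;
    volatile uint64_t sink = 0;

    for (int i = 0; i < 64; i++) {
        uint64_t t0 = __rdtsc();
        /* Work whose latency varies with microarchitectural state
         * (caches, branch predictors, DRAM refresh, ...). */
        for (int j = 0; j < 1000; j++) {
            sink += j * 2654435761u;
        }
        uint64_t t1 = __rdtsc();
        /* Fold the low-order, jitter-carrying bits of the elapsed
         * time into the pool. */
        pool = (pool << 1) ^ ((t1 - t0) & 0xFF);
    }

    printf("jitter-derived bits: %016llx\n", (unsigned long long)pool);
    return 0;
}
```

The question is whether an instruction set can be pared down far
enough that nothing equivalent to that `__rdtsc()` read, direct or
indirect, exists for software to exploit.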