The first 60 years of computing have seen spectacular technological progress, driven for the last 40 years by Moore's Law, which, though initially an observation, has become a self-fulfilling prophecy and a board-room planning tool. Ever-shrinking transistor dimensions have yielded increasingly complex and cost-effective microchips, a win-win scenario that has driven the explosion in the use of digital electronics and enabled computers to be embedded in a vast range of high-volume products.