Sunday, March 13, 2016

The Future of Computing

Title of the cover story in The Economist, which quotes some Silicon Valley pundits on the end of Moore's Law.  Gordon Moore, one of the founders of Intel, observed that the number of transistors in integrated circuits doubled every year; later revisions said every two years.  The observation was based on steady improvements in silicon lithography, which yielded smaller transistors, and hence more salable chips per silicon wafer.  Back when I started in the business, chips were made with 100 micron design rules.  Now we are down to around 20 nanometers.  Sooner or later we will get to a size that cannot be shrunk any more.  Silicon Valley pundits have been talking about this for twenty years that I can remember, and probably longer.
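Moore's observation is just exponential doubling, which a few lines of Python can make concrete.  This is a sketch of the simple doubling model, not real industry data; the 1971 Intel 4004 (about 2,300 transistors) is used as an illustrative starting point, and the two-year period is the later revision mentioned above.

```python
def transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Transistor count predicted by a naive Moore's-law doubling model.

    base_count is roughly the Intel 4004's transistor count (1971).
    Purely illustrative; real chips deviate from the trend line.
    """
    return base_count * 2 ** ((year - base_year) / doubling_years)

# Each doubling period multiplies the count by two:
print(f"{transistors(1971):,.0f}")  # 2,300
print(f"{transistors(1991):,.0f}")  # ten doublings later: ~2.4 million
print(f"{transistors(2011):,.0f}")  # twenty doublings: ~2.4 billion
```

The point of the model is the compounding: twenty doublings is a factor of about a million, which is why a shrink that looks incremental year to year transformed the industry over decades.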
  The Economist has been listening to the doomsayers, and ran a cover story and a special technology section worrying about the end of Moore's law.  They make it sound like computers will stop getting smarter. 
  Not to worry: the microprocessors are plenty smart enough, and if one chip won't do the job, buy five or ten of 'em (they only cost $10 or so) and get on with it. 
   The real effect of the end of Moore's law is that chips will stop getting cheaper every year.  Back in the day, Analog Devices introduced its nice new ADSP2181 chip.  The first year, they lost money on every chip they sold.  But after the first die shrink reduced the size of the part, and hence its cost, it became profitable, and after three or four more die shrinks it became really cheap and profitable. 
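The economics of a die shrink come down to geometry: shrinking the linear dimension by some factor shrinks die area by the square of that factor, so a wafer of roughly fixed cost yields proportionally more dies.  Here is a crude sketch of that arithmetic; all the numbers (wafer cost, die size, the 0.7x shrink) are hypothetical and not Analog Devices' actual figures.

```python
def cost_per_die(wafer_cost, die_area_mm2, wafer_area_mm2=70_000):
    """Crude cost model: wafer cost divided by gross dies per wafer.

    70,000 mm^2 approximates a 300 mm wafer; edge loss and yield
    are ignored, so this only shows the scaling trend.
    """
    dies_per_wafer = wafer_area_mm2 // die_area_mm2
    return wafer_cost / dies_per_wafer

# A classic die shrink scales the linear dimension by ~0.7x,
# so die area (and roughly cost per die) halves: 0.7**2 = 0.49.
before = cost_per_die(5000, 100)            # hypothetical 100 mm^2 die
after = cost_per_die(5000, 100 * 0.7 ** 2)  # same design after the shrink
print(f"${before:.2f} -> ${after:.2f}")
```

Chain three or four of those halvings together and a money-losing part becomes a cheap, profitable one, which is the story the ADSP2181 anecdote is telling.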
   And since chips are now so cheap, I think the world will keep on rotating if they stop getting even cheaper. 
