The “law” that has defined the digital age faces a fundamental limit; what happens when we reach it?
Consumer technology generally progresses according to three criteria: size, speed, and convenience. If a product fails to satisfy these demands, it flops. Because of this, we’ve come to expect a consistent flow of product innovation.
But where is the limit? Is there a size or speed that science simply cannot attain? Some say Moore's Law is that limit, or at least a very formidable barrier.
The lunar missions… now in the palm of your hand
Moore's Law was conceived by Intel co-founder Gordon Moore in the mid-1960s. While not quite a "law" in the scientific sense, the observation holds that the number of transistors that can fit onto a single microchip, or integrated circuit, doubles roughly every two years (a pace often quoted as 18 months). Transistors are, essentially, the on/off switches that make up the ones and zeros of binary code, with each transistor representing one "bit" of information. The effect is a rapid increase in computing power within ever smaller devices: greater "memory density," as it's known.
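The doubling rule above is just compound growth, and can be sketched in a few lines of Python. The 18-month doubling period and the 2,300-transistor starting point (the figure commonly cited for Intel's first microprocessor, the 4004) are assumptions for illustration, not claims from the article:

```python
def projected_transistors(initial: int, years: float,
                          doubling_period_years: float = 1.5) -> float:
    """Transistor count projected `years` out, doubling every
    `doubling_period_years` (18 months assumed here)."""
    return initial * 2 ** (years / doubling_period_years)

# One doubling period doubles the count, two periods quadruple it.
print(projected_transistors(2300, 1.5))  # 4600.0
print(projected_transistors(2300, 3.0))  # 9200.0

# Over a decade at this pace, the count grows by a factor of
# 2^(10/1.5), i.e. roughly 100x.
print(projected_transistors(2300, 10) / 2300)
```

The striking part is how quickly the exponent dominates: a decade yields roughly a hundredfold increase, two decades roughly ten-thousandfold.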
To put that in perspective: the total computing power of NASA's Apollo 11 mission was roughly 64 KB of RAM running at 0.043 MHz. The iPhone 5, by comparison, packs 1 GB of RAM at 1.02 GHz. In memory alone, that's a growth factor of more than 16,000. Now think about the progress of consumer technology through the 20th century: transistor radios replaced vacuum-tube sets; digital cameras replaced film; CD players and, eventually, MP3s replaced record players; even dishwashers and refrigerators have been "digitized." These advances were made possible by improving memory density.
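The "more than 16,000" figure follows directly from the two RAM numbers the article cites; a quick back-of-envelope check:

```python
# Memory figures as cited in the article.
apollo_ram_bytes = 64 * 1024        # 64 KB aboard Apollo 11
iphone5_ram_bytes = 1 * 1024 ** 3   # 1 GB in the iPhone 5

growth_factor = iphone5_ram_bytes / apollo_ram_bytes
print(growth_factor)  # 16384.0 -- "more than 16,000", as stated
```

Note this compares memory capacity only; the clock-speed gap (0.043 MHz to 1.02 GHz) is a further factor of roughly 24,000 on top of that.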
Read more at Reviewed.com…