A Photonic Life—Our Rapidly Increasing Computer Processing Speed

I have been thinking this past week about current advances in technology and what they will mean for future computing systems and the way we compute. HP announced last week that it is working on a new machine dubbed simply "The Machine." While the moniker is not very inspiring, the technology is groundbreaking. HP is building two different technologies into The Machine: memristors and silicon photonics. These technologies will require an entirely new computer and operating system wrapped around them. I think that will present opportunities for forward-thinking information technology professionals willing to blaze new trails.

Memristors

Memristors, or "memory resistors," were first proposed by Leon Chua in 1971. A memristor is essentially a resistor that remembers its state even when the power is turned off. The first silicon-based memristor was announced in 2012, though much work remains to make them commercially viable. Their value is that they can serve as storage sitting on the same board, or even the same chip, as the processing unit, replacing offline disk storage. The ability to access information in such close physical proximity to the processor will boost access speed dramatically. Instead of a dual-core, quad-core, or eight-core chip, you could have a massively multicore processor with that memory close at hand.
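HP has not published the details of its devices, but the "remembers its state" behavior can be illustrated with the linear ion-drift memristor model from the research literature (Strukov et al., 2008). The Python sketch below uses illustrative parameter values of my own choosing, not figures from any real device; the point is simply that when the applied voltage goes to zero, the device state, and therefore its resistance, stays put.

```python
# A minimal simulation of the linear ion-drift memristor model
# (Strukov et al., 2008). All parameter values are illustrative
# assumptions, not taken from any real device datasheet.
import math

R_ON, R_OFF = 100.0, 16_000.0   # ohms: fully doped / undoped resistance
D = 10e-9                        # m: device thickness
MU_V = 1e-14                     # m^2/(V*s): assumed dopant mobility
DT = 1e-4                        # s: integration time step

def step(v, w):
    """Advance the device state one time step under applied voltage v."""
    m = R_ON * (w / D) + R_OFF * (1 - w / D)   # current memristance
    i = v / m                                   # Ohm's law
    w += MU_V * (R_ON / D) * i * DT             # linear ion drift
    return min(max(w, 0.0), D), m               # clamp state to [0, D]

w = 0.1 * D                      # initial doped-region width (the state)

# Drive the device with one sine-wave cycle, changing its state.
for k in range(10_000):
    v = math.sin(2 * math.pi * k * DT)
    w, m = step(v, w)
print(f"memristance after drive:     {m:.0f} ohms")

# With the voltage at zero, dw/dt = 0: the state (and hence the
# resistance) is retained -- this is the "memory" in memristor.
for _ in range(10_000):
    w, m = step(0.0, w)
print(f"memristance after power-off: {m:.0f} ohms")
```

The two printed values match: unlike DRAM, nothing is lost when the drive voltage disappears, which is what makes the memristor a candidate for storage that lives right next to the processor.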

Photonics

Photonics, the transmission of information as pulses of light, is not new, but it is shrinking. Fiber optic cable lets us carry data and voice across oceans and is increasingly used within buildings as well. It is faster than copper and requires less energy. What is new is the scale of application: photonics is being miniaturized to the point of carrying information across a blade server, and even between blade servers in the same rack across the backplane. At that scale, it takes some very creative nanotechnology to create the paths for those light pulses. This, combined with the new memristors, yields data access rates much faster than ever before. An added benefit is energy efficiency: signals on copper paths attenuate and need repeaters to refresh the data, and those repeaters add to the overall heat given off and energy consumed.
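To get a feel for why distance and medium matter, here is a rough back-of-envelope sketch. The path lengths and signal-speed fractions below are my own illustrative assumptions, not measurements of The Machine or any real product; the physics point is simply that one-way latency grows with the distance data has to travel, which is why moving storage onto the board, or the chip, pays off.

```python
# Back-of-envelope propagation delays. Distances and speed factors
# are illustrative assumptions, not measured values for any product.
C = 3.0e8  # m/s: speed of light in vacuum

paths = {
    # name: (one-way distance in meters, fraction of c in the medium)
    "on-chip memristor store":   (0.01, 0.50),
    "across a blade (photonic)": (0.5,  0.67),  # waveguide, index ~1.5
    "across a rack (photonic)":  (2.0,  0.67),
    "to a disk array (copper)":  (10.0, 0.60),  # typical cable factor
}

for name, (dist, frac) in paths.items():
    delay_ns = dist / (frac * C) * 1e9
    print(f"{name:27s} ~{delay_ns:6.2f} ns one way")
```

Even before counting repeaters, protocol overhead, or the mechanical delays of spinning disks, the longest path in this sketch costs hundreds of times the latency of the shortest one.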

Thoughts

Part of the push to create faster computers and faster networks comes from the fact that we are so data rich right now that we cannot process it all fast enough. We became data rich in the first place by building fast, low-cost computers and storage that let us collect statistics on anything and everything. I wonder now whether the dog is wagging the tail or the tail is wagging the dog. In all of this, change brings opportunity. Today's programs and operating systems are constrained by current hardware. If that hardware changes to the point that there is virtually no delay in data access or processing, we will need new software, new applications, and a new operating system. I believe that infrastructure will need to be built from the ground up to maximize the capabilities of the new hardware. Is anyone out there up for the challenge? Let me know your thoughts. We only have a few years before the future is here.

About Kelly Brown

Kelly Brown is an IT professional, adjunct faculty for the University of Oregon, and academic director of the UO Applied Information Management Master’s Degree Program. He writes about IT and business topics that keep him up at night.
