Light-Based Computing: Is the Future Already Here?
Everywhere you look, electrons are hard at work to power the world with connection, communication, and creation.
Moore’s Law, a prediction made by Intel co-founder Gordon Moore in 1965, holds that the number of transistors (the binary switches that are the language of computing) in a microprocessor doubles roughly every two years. In turn, chips become more powerful while the cost of computing drops. To fit more transistors into the same amount of space, however, transistors must be made smaller, which also yields faster processing speeds.
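To get a feel for that doubling, here is a toy calculation (the 1971 starting figure of 2,300 transistors, the Intel 4004, is a commonly cited historical data point used purely for illustration, not a claim about any real roadmap):

```python
# Toy illustration of Moore's Law: transistor count doubling every two years.
# Base figure (Intel 4004, 1971, ~2,300 transistors) is illustrative only.

def transistors(year, base_year=1971, base_count=2_300):
    """Estimate transistor count assuming one doubling every two years."""
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

for y in (1971, 1991, 2011):
    print(y, f"{transistors(y):,.0f}")
```

Twenty years of doubling turns thousands of transistors into millions; forty years turns them into billions, which is why the prediction has shaped the industry for so long.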
So, what’s the catch?
Packing more transistors into a microchip naturally produces more heat, because heat is a byproduct of the energy consumed in doing work. Light-based computing could offer a game-changing way around these physical (or, as some argue, economic) constraints.
While we are already on the way to making photonic computing a reality, hurdles do still exist.
What is Light-Based Computing?
Also known as optical computing or photonic computing, light-based computing is the usage of photons (or light) from lasers or diodes for computational purposes. For the most part, electronics and computers utilize electrons for computation.
Optical computing leverages an optical transistor, as opposed to the electronic transistors that Moore’s Law refers to. While data can be channeled through photons, today’s technology still relies on electronic logic gates to process information.
To make light-based computing fully functional, there need to be microchips that use photons instead of electrons (i.e. a laser transistor).
How Optical Computing Works
Optical computing follows a similar pathway to standard electronic computing, except the connections carry photons instead of electrons. Information travels through the processor to logic gates, transferred along fiber optic cables.
With pure optical computing, there’s no need to convert information back and forth between electronic and optical signals, which dramatically increases the speed of computing.
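As a thought experiment, an all-optical logic gate can be sketched as two beams combining in a nonlinear medium that only “switches” when their combined intensity crosses a threshold. This is a toy model, not a real device specification; the intensity units and threshold value are illustrative assumptions:

```python
# Toy model of an all-optical AND gate: two input beams superpose, and a
# nonlinear medium switches only when the combined intensity exceeds a
# threshold. Units and threshold are illustrative assumptions.

THRESHOLD = 1.5  # arbitrary switching threshold (illustrative)

def optical_and(beam_a: float, beam_b: float) -> int:
    """Logical AND: output is 1 only when both beams are 'on' (intensity 1.0)."""
    combined = beam_a + beam_b  # beams pass through each other and superpose
    return 1 if combined > THRESHOLD else 0

# Truth table: only (1, 1) drives the medium past the threshold.
for a, b in [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0)]:
    print(int(a), int(b), "->", optical_and(a, b))
```

The appeal of the scheme is visible even in the sketch: the two inputs never interact with each other directly, only with the switching medium.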
The Benefits Behind Optical Computing
Optical computing is desirable for many reasons, and researchers continue to pave the path toward its various benefits, including the following:
For electronic computing to work, connections must exist in the form of wires (e.g. copper) so that electrons can follow a path to transmit information. Unlike electrons, photons can travel through free space, so optical computing opens the door to three-dimensional, parallel transmission of information.
Electrons are charged particles, and thus, they run the risk of interfering with each other. Photons, on the other hand, are not charged. Light beams can pass through one another and remain unaffected.
Optical materials have greater storage density than magnetic materials. Optical storage provides the benefit of immediate access, independent of when data was stored.
Electrons create heat as they pass through semiconductors and conductors. To avoid overheating, this heat must be removed from computers and electronics, which limits the density of microchips and processors. Photons traveling in free space dissipate negligible energy, so photonic devices would not overheat in the same way.
By some estimates, computing consumes roughly 10% of the world’s electricity. Photonic computing could reduce that energy consumption dramatically, as millions of data points could be transferred within seconds using light waves.
The phrase “travels at the speed of light” exists for a reason. Light signals travel faster than electrons moving through a conductor, meaning your computers and electronics would operate much more quickly with photonic computing.
Artificial intelligence is booming, yet it may be constrained by electronic computing. For artificial intelligence to work optimally, it requires access to massive amounts of data and the ability to perform immediate computations. Hardware is progressing more slowly than AI demands; bounded by Moore’s Law, transistors may constrain AI’s potential. With photonic computing, calculations would occur much faster and more efficiently, allowing AI to progress.
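On the speed point above, a quick back-of-envelope calculation shows how little distance even light covers in one clock cycle. This uses the vacuum speed of light; real signals, optical or electrical, travel somewhat slower through their media:

```python
# Back-of-envelope: distance light covers in one clock cycle at common
# processor frequencies, using the vacuum speed of light. Signals in real
# fibers and wires propagate more slowly than this upper bound.

C = 299_792_458  # speed of light in vacuum, m/s

for ghz in (1, 3, 5):
    period_s = 1 / (ghz * 1e9)            # one clock period, in seconds
    distance_cm = C * period_s * 100      # distance covered in that period
    print(f"{ghz} GHz cycle: light travels {distance_cm:.1f} cm")
```

At a few gigahertz, one cycle corresponds to only centimeters of travel even at the universe’s speed limit, which is why signal propagation matters so much to chip designers.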
What are the Challenges of Optical Computing?
Communication through light is already commonplace (think of fiber-optic networks, or go back further in time to CDs). Light is generally easier to use than electricity for communication purposes.
However, there’s a trifecta of challenges that arise when attempting to achieve pure optical computing, namely: heat, power, and size.
Switching from electricity to light requires confining light to a small enough scale. Relative to the atomic-scale features on a chip, light’s wavelength is large, which constrains attempts to bend and route light inside microchips and processors.
Researchers have made some headway with a promising middle ground known as surface plasmons: collective oscillations of electrons that behave like light.
The problem is power. Plasmons lose energy quickly, and sustaining them requires adding wires (and energy), which in turn produces more heat as a byproduct.
On the bright side, researchers are investigating materials like graphene and carbon nanotubes that may allow surface plasmons to be transported at the nanoscale, potentially making photonic computing possible.
The Proof is in the Photons: What We Can Expect Moving Forward
Despite the inherent challenges of photonic computing, the horizon remains promising.
For example, researchers are working to show how squeezed states of light may allow for programmable photonic circuits capable of quantum computing. Quantum computing, in turn, could provide the foundation for artificial intelligence and machine learning to evolve.
Yet some of the most prominent hurdles in bringing photonic computing to life have to do with the hardware, which will require immense power capabilities, reliability, and the ability to scale.
That being said, there is surely hope for a future where a computer chip will be able to execute light-based computing.