Optical Computing: Solving Problems at the Speed of Light

07 February 2020

According to Moore’s law (in reality more of a forecast, formulated in 1965 by Intel co-founder Gordon Moore), the number of transistors in a microprocessor doubles roughly every two years, boosting the power of chips without increasing their energy consumption. For half a century, Moore’s prescient vision has presided over the spectacular progress of computing. By 2015, however, the engineer himself predicted that current technology was reaching a saturation point. Today, quantum computing holds out hope for a new technological leap, but there is another option on which many are pinning their hopes: optical computing, which replaces electronics (electrons) with light (photons).
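
As a rough illustration of what that doubling rule implies, the short Python sketch below projects a transistor count forward under Moore’s law. The starting figure of 2,300 transistors (roughly the scale of early-1970s microprocessors) and the 50-year horizon are illustrative assumptions, not figures from the article.

```python
# A minimal sketch of Moore's law as a doubling rule: N(t) = N0 * 2**(t / p),
# where p is the doubling period in years. The starting count and time span
# used below are illustrative assumptions, not figures from the article.

def transistor_count(n0: float, years: float, doubling_period: float = 2.0) -> float:
    """Project a transistor count forward assuming a fixed doubling period."""
    return n0 * 2 ** (years / doubling_period)

# Hypothetical example: start from 2,300 transistors and project 50 years ahead.
print(f"{transistor_count(2_300, 50):,.0f}")  # ~77 billion
```

Compounded over five decades, a few thousand transistors become tens of billions, which gives a sense of why even a modest slowdown in the doubling period is so consequential.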

The end of Moore’s law is a natural consequence of physics: to pack more transistors into the same space, they have to be shrunk down, which increases their speed while reducing their energy consumption. The miniaturisation of silicon transistors has broken the 7-nanometre barrier, once considered the limit, but this shrinking cannot continue indefinitely. And although more powerful systems can always be obtained by adding more transistors, doing so reduces processing speed and increases the heat the chips give off.

The hybridization of electronics and optics

Hence the promise of optical computing: photons move at the speed of light, faster than electrons travel in a wire. Nor is optical technology a newcomer to our lives: the vast global traffic on today’s information highways travels over fibre optic channels, and for years we have used optical drives to burn and read our CDs, DVDs and Blu-ray discs. However, deep inside our systems, the photons arriving through the fibre optic cable must be converted into electrons in the microchips, and those electrons must in turn be converted back into photons in the optical drives, slowing down the process.

An overhead view of a new beamsplitter for silicon photonics chips, about one-fiftieth the width of a human hair. Credit: Dan Hixson/University of Utah College of Engineering

Thus, it can be said that our current technology is already a hybridization of electronics and optics. “In the near-term, it is pretty clear that hybrid optical-electronic systems will dominate,” Rajesh Menon, a computer engineer at the University of Utah, tells OpenMind. “For instance, the vast majority of communications data is channelled via photons, while almost all computation and logic is performed by electrons.” And according to Menon, “there are fundamental reasons for this division of labour”: while less energy is needed to transmit information in the form of photons, the wavelengths associated with electrons are much smaller; in other words, the higher speed of photonic devices comes at the price of a larger footprint.

This is why some experts see limitations in the penetration of optics in computing. For Caroline Ross, a materials science engineer at the Massachusetts Institute of Technology (MIT), “the most important near-term application [for optics] is communications — managing the flow of optical data from fibres to electronics.” The engineer, whose research produced an optical diode that facilitates this task, tells OpenMind that “the use of light for actual data processing itself is a bit further out.”

The transistor laser

But although we are still far from a 100% optical microchip, a practical system capable of computing with photons alone, advances continue to expand the role of photonics in computers. In 2004, University of Illinois researchers Milton Feng and Nick Holonyak Jr. developed the concept of the transistor laser, which replaces one of the two electrical outputs of a conventional transistor with a light signal in the form of a laser, allowing higher data rates.

For example, today it is not possible to use light for internal communication between the different components of a computer, because of the equipment that would be needed to convert the electrical signal into an optical one and vice versa; the transistor laser would make this possible. “Similar to transistor integrated circuits, we hope the transistor laser will be [used for] electro-optical integrated circuits for optical computing,” Feng told OpenMind. The co-author of this breakthrough is betting on optical over quantum computing, since it does not require the frigid temperatures at which the superconductors in quantum computers must operate.

Graduate students Junyi Wu and Curtis Wang and professor Milton Feng found that light stimulates switching speed in the transistor laser. Credit: L. Brian Stauffer

Proof of the interest in this type of system is the intense research in the field, which includes new materials capable of supporting photon-based computing. Among the challenges still to be met in order to obtain optical chips, Menon highlights increasing the integration density of components so as to reduce their size, an area in which his laboratory is a pioneer, as well as a “better understanding of light-matter interactions at the nanoscale.”

Despite all this, we shouldn’t be overly confident that a photonic laptop will one day reach the hands of consumers. “We don’t expect optical computing to supplant electronic general-purpose computing in the near term,” Mo Steinman, vice president of engineering at Lightelligence, a startup from the photonics lab run by Marin Soljačić at MIT, told OpenMind.

Present and future of photonics

However, the truth is that nowadays this type of computing already has its own niches. “Application-specific photonics is already here, particularly in data centres and more recently in machine learning,” says Menon. In fact, Artificial Intelligence (AI) neural networks are being touted as one of its great applications, with the potential to achieve 10 million times greater efficiency than electronic systems. “Statistical workloads such as those employed in AI algorithms are perfectly suited for optical computing,” says Steinman.
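
One reason often given for that suitability is that the bulk of neural-network inference boils down to large matrix-vector products, the kind of arithmetic photonic accelerators are designed to perform in the optical domain. The short Python sketch below only illustrates that arithmetic on a conventional processor; the layer size and random weights are assumptions chosen for demonstration, and it does not represent Lightelligence’s actual hardware or methods.

```python
# Illustrative sketch: the core arithmetic of one dense neural-network layer,
# y = relu(W @ x). Photonic accelerators target exactly this kind of
# matrix-vector product. Layer sizes and random weights are assumptions.
import numpy as np

rng = np.random.default_rng(0)
W = rng.standard_normal((256, 784))   # weights of a hypothetical dense layer
x = rng.standard_normal(784)          # input activations (e.g. a flattened image)

y = np.maximum(W @ x, 0.0)            # matrix-vector product followed by ReLU
print(y.shape)                        # (256,)
```

In a photonic chip, the idea is to encode the input values in light and let optical interference carry out the multiply-accumulate steps, rather than shuttling electrons through billions of transistor switches.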

Thus, optical computing could solve very complex network optimization problems that would take classical computers centuries. In Japan, the company NTT is building a huge optical computer that packs five kilometres of fibre into a box the size of a room, to be applied to complex optimization tasks in power grids and communications networks.
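
To get a feel for why such problems defeat conventional machines, the toy Python sketch below brute-forces the best way to split a tiny, made-up network into two groups so that as much connection weight as possible crosses the divide (a max-cut-style problem, used here purely as an illustration and not as NTT’s actual method). Every extra node doubles the number of partitions to check, which is why exact solutions quickly become intractable.

```python
# Toy illustration of combinatorial blow-up in network optimization:
# brute-force the best two-way partition (max cut) of a tiny invented graph.
# With n nodes there are 2**n partitions, so each added node doubles the work.
import itertools

# Hypothetical network: (node_a, node_b, connection_weight)
edges = [(0, 1, 3.0), (1, 2, 1.5), (2, 3, 2.0), (0, 3, 4.0), (1, 3, 0.5)]
n_nodes = 4

best_partition, best_value = None, float("-inf")
for bits in itertools.product((0, 1), repeat=n_nodes):   # all 2**n partitions
    cut = sum(w for a, b, w in edges if bits[a] != bits[b])
    if cut > best_value:
        best_partition, best_value = bits, cut

print(best_partition, best_value)   # best split of this four-node toy network
```

With four nodes there are only 16 partitions to check, but at a few hundred nodes the count already exceeds the number of atoms in the observable universe, which is the scale of problem that special-purpose optical machines are being built to tackle.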

A photonic integrated circuit. Credit: JonathanMarks

“Looking ahead, we believe we can leverage the ecosystem created by optical telecommunications in the areas of integrated circuit design, fabrication, and packaging, and optimize for the specific operating points required by optical computing,” Steinman predicts. However, he admits that moving from a prototype to full-scale manufacturing will be a difficult challenge.

In short, there are reasons for optimism about the development of optical computing, but without overestimating its possibilities: when computer scientist Dror Feitelson published his book Optical Computing (MIT Press) in 1988, there was already talk of a new field beginning to reach maturity. More than 30 years later, “optical computing is still more of a promise than a mainstream technology,” the author tells OpenMind. And the challenges still to be overcome are compounded by another stumbling block: technological inertia. Feitelson recalls the warning issued in those days by IBM researcher Robert Keyes: given the enormous experience and investment already accumulated in electronics, “practically any other technology would be unable to catch up.”

Javier Yanes

@yanes68
