Is photonics the solution for power-hungry data centers?
- A photonics startup has developed a workable alternative to transistor-based computing using light
- The result could be greater processing speeds with a fraction of the power consumption of current technology
- Optical computing could slash power consumption at large data centers
An MIT spinout startup called Lightmatter is developing computer processors that use light, rather than electricity. The results could mean greater speeds with lower power consumption.
The technology could prove a boon for data centers owned by cloud giants like Amazon and Microsoft, which invest heavily in computer chips to power energy-intensive machine-learning algorithms. In recent years, these companies have relied on Graphics Processing Units (GPUs) from the likes of Nvidia, which are well suited to powering artificial neural networks. Increasingly, however, they are looking to develop their own specialized machine-learning chips.
Lightmatter’s chips promise to be faster than current transistor-based chips, including GPUs.
Drawing on photonics, the science of generating and manipulating light, the company has developed a processor that works with a special fabric to connect computer components. Serving the role of wires, tiny structures called waveguides redirect light, while silicon photonic chips convert the light into electrical signals that existing computer systems can interpret.
As reported by Bloomberg, Lightmatter co-founder Nick Harris claims the systems can send data between components 100 times faster than the fastest PC while using just 10% of the energy.
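Taken at face value, those two claims compound: energy per bit moved is power divided by throughput, so 100 times the data rate at a tenth of the power would mean roughly a thousandth of the energy per bit. A back-of-envelope sketch (the baseline values below are arbitrary placeholders; only the ratios matter, and both ratios are the article's claims, not independent measurements):

```python
# Energy per bit = power / throughput. Normalize the conventional
# interconnect to 1.0 for both quantities; only ratios matter here.
baseline_throughput = 1.0
baseline_power = 1.0

# Claimed figures: 100x the data rate at 10% of the power.
photonic_throughput = 100 * baseline_throughput
photonic_power = 0.10 * baseline_power

baseline_energy_per_bit = baseline_power / baseline_throughput
photonic_energy_per_bit = photonic_power / photonic_throughput

# How many times less energy per bit the photonic link would use.
improvement = baseline_energy_per_bit / photonic_energy_per_bit
print(improvement)  # roughly a 1000x reduction, if both claims hold
```

This is only arithmetic on the reported headline numbers, not a statement about any real benchmark.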
Photonic computers aren’t a new idea. According to Wired, the technology featured in some 1960s military radar systems but was sidelined as the semiconductor industry took off, giving rise to the exponential increase in chip density now known as Moore’s Law. Interest in photonics is reviving as that exponential advance becomes increasingly difficult to sustain, and as rising power dissipation on transistor-based chips tests the limits of current cooling technologies.
“Photonic (or optical) computers have long been considered a holy grail for information processing due to the potential for high bandwidth and low power computation,” reads an earlier blog post from Lightmatter.
“Developing these machines required three decades of technological advancement that Lightmatter is now harnessing to deliver on the promise of highly power-efficient, parallel computation with light.”
Lightmatter is one of the first companies to present a working optical computing chip tailored for AI workloads in the data center and plans to make its first commercial product available towards the end of 2021.
Data centers are forecast to account for more than 15% of global power use within the next five years, and larger facilities can demand almost a fifth of the output of a conventional coal power plant. Rising energy and cooling costs are not only a vast and growing expense for data center operators but also run firmly against environmental commitments, such as carbon neutrality by 2030 or Microsoft’s Net Zero initiative.
Lightmatter is pitching its solution to hyper-scale data center operators including Amazon, Facebook, and Google. Among these tech giants, some have gone “way beyond” expressing interest, according to Harris.
Founded in 2017, Lightmatter has just 47 employees and has so far received US$33 million in venture funding.