Breakthrough: New CMOS chip can process both light and electricity


Breakthrough light-based microprocessor chip could lead to more powerful computers and network infrastructure. Researchers at the University of Colorado Boulder, in collaboration with the University of California, Berkeley and the Massachusetts Institute of Technology (MIT), have shown off a working light-based processor, which sends and reads bits from memory using nothing but light.

“Light based integrated circuits could lead to radical changes in computing and network chip architecture in applications ranging from smartphones to supercomputers to large data centers, something computer architects have already begun work on in anticipation of the arrival of this technology,” said Miloš Popović, an assistant professor in CU-Boulder’s Department of Electrical, Computer, and Energy Engineering and a co-corresponding author of the study.

The year has been chock-full of scientific breakthroughs, but the University of Colorado is determined to finish 2015 with a bang. Its researchers have created what they say is the first full-fledged processor to transmit data using light instead of electricity. The design isn’t entirely photonic, but its 850 optical input/output elements give it the kind of bandwidth that makes electric-only chips look downright modest: we’re talking 300Gbps per square millimeter, or 10 to 50 times what you normally see. The key was finding a way to reuse existing conventional processes to put optics in places where regular circuitry would go.
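For a sense of what that multiplier implies, here’s a quick back-of-the-envelope check using only the figures quoted above, so treat it as illustrative rather than measured data:

```python
# Back-of-the-envelope check of the bandwidth-density claim above.
# All inputs are the article's figures; nothing here is measured data.

OPTICAL_DENSITY_GBPS_MM2 = 300.0   # claimed optical I/O bandwidth density
SPEEDUP_RANGE = (10, 50)           # "10 to 50 times what you normally see"

# Back out the conventional (electrical) density range that claim implies.
electrical_low = OPTICAL_DENSITY_GBPS_MM2 / SPEEDUP_RANGE[1]   # 6 Gbps/mm^2
electrical_high = OPTICAL_DENSITY_GBPS_MM2 / SPEEDUP_RANGE[0]  # 30 Gbps/mm^2

print(f"implied electrical density: {electrical_low:.0f}-{electrical_high:.0f} Gbps/mm^2")
```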

With its tiny size (3mm by 6mm, or 0.1in by 0.2in) and just two cores, the design is no powerhouse. However, it shows the potential for dramatic improvements in computing power without having to completely reinvent the wheel. You could have networking gear that copes with massive amounts of data, for example. And there’s plenty of room for optimization, too, so the possibilities for this technology remain wide open.

Moving data around inside a computer means shoving it through wires, which have inherent bandwidth limitations and produce a lot of heat. Once that data hits a network, however, it often runs across optical hardware, which can send information long distances at high bandwidth without needing a dedicated nuclear reactor for power.

The contrast between the two methods has most companies thinking about ways of getting optical connections inside computers and, eventually, inside chips themselves. This poses a significant challenge. While it’s possible to use silicon to create light-handling features, the processes used to do so are incompatible with the CMOS techniques used to make circuitry. As a result, most efforts in this area have used separate chips: one for the processor, one for the optical interconnect.

Now, a research team has put together a single chip that handles both optical and electrical processing and uses an optical connection to its main memory. While the bandwidth remains low, the entire system was manufactured using standard CMOS processes. And it incorporates a small RISC processor that’s able to run standard text and graphical programs.

I need a laser!

As with many previous efforts, the one thing that isn’t on the chip is a laser; that’s separate hardware, with its output channeled into the chip. Once the light reaches the chip, however, everything that handles that light has to be made of silicon. This includes waveguides to direct the light to specific locations, modulators to chop it up into bits, and detectors that can register those bits.
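Conceptually, that chain amounts to on-off keying: the modulator passes light for a 1 and blocks it for a 0, and the detector thresholds the received power back into bits. The toy model below is purely illustrative, with made-up loss and noise numbers, and isn’t the chip’s actual signalling scheme:

```python
# Toy model of the waveguide/modulator/detector chain described above.
# Purely illustrative: losses, power levels, and noise are made-up values.

import random

def modulate(bits, on_power_mw=1.0):
    """Encode bits as optical power levels (on-off keying)."""
    return [on_power_mw if b else 0.0 for b in bits]

def waveguide(signal, loss=0.3):
    """Attenuate the signal (a stand-in for waveguide and coupling losses)."""
    return [p * (1.0 - loss) for p in signal]

def detect(signal, threshold_mw=0.35, noise_mw=0.05):
    """Convert received optical power back into bits with a simple threshold."""
    return [1 if p + random.gauss(0, noise_mw) > threshold_mw else 0 for p in signal]

bits = [1, 0, 1, 1, 0, 0, 1, 0]
received = detect(waveguide(modulate(bits)))
print("sent:    ", bits)
print("received:", received)
```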

In this case, the laser was a light source at a wavelength of 1,180 nanometers. That’s a wavelength at which silicon is transparent, but modifications to the material can be used to change its optical properties. For instance, a silicon-germanium blend can act as a photodetector, absorbing photons and converting them into electrical pulses.
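A rough sanity check, not taken from the paper, shows why: a photon’s energy is roughly 1240 eV·nm divided by its wavelength, and at 1,180 nm that works out to about 1.05 eV, just under silicon’s 1.12 eV band gap, so pure silicon can’t absorb the light, while a narrower-gap silicon-germanium alloy can. The band-gap value used below for the SiGe blend is an illustrative assumption:

```python
# Rough check of why silicon is transparent at 1,180 nm (illustrative only).
# Photon energy E = h*c / wavelength; the handy shortcut is E[eV] ~ 1240 / lambda[nm].

PLANCK_EV_NM = 1240.0       # h*c in eV*nm (approximate)
WAVELENGTH_NM = 1180.0      # laser wavelength used with the chip
SI_BANDGAP_EV = 1.12        # band gap of crystalline silicon at room temperature
SIGE_BANDGAP_EV = 0.95      # assumed band gap for the SiGe blend (illustrative value)

photon_ev = PLANCK_EV_NM / WAVELENGTH_NM   # ~1.05 eV

print(f"photon energy: {photon_ev:.2f} eV")
print(f"absorbed by Si?   {photon_ev > SI_BANDGAP_EV}")    # False -> silicon is transparent
print(f"absorbed by SiGe? {photon_ev > SIGE_BANDGAP_EV}")  # True  -> SiGe can act as a detector
```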

Putting together a mixed electronic and optical chip required figuring out a whole series of similar approaches to work around limitations. In locations where the silicon waveguides leaked light into surrounding materials, for instance, the surrounding materials had to be etched away. The efficiency of the modulators varied with temperature, which varied with the chip’s workload. So researchers created a feedback system that detected falling light levels and triggered a resistance heater to crank up the local temperature as needed.
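The paper doesn’t spell out the control firmware, but the loop described above boils down to a simple bang-bang controller: watch a monitor photocurrent and switch the heater on whenever the light level sags. The sketch below is hypothetical; read_monitor_photocurrent(), set_heater(), and the thresholds are placeholders rather than the chip’s real interface:

```python
import time

# Hypothetical sketch of the thermal-tuning feedback loop described above.
# read_monitor_photocurrent() and set_heater() stand in for whatever registers
# the real chip exposes; the thresholds are made-up illustrative values.

TARGET_UA = 50.0      # desired monitor photocurrent (microamps), assumed
HYSTERESIS_UA = 2.0   # dead band so the heater doesn't chatter

def read_monitor_photocurrent() -> float:
    """Placeholder: return the photocurrent that tracks modulator alignment."""
    raise NotImplementedError

def set_heater(on: bool) -> None:
    """Placeholder: drive the resistive heater next to the modulator."""
    raise NotImplementedError

def tuning_loop() -> None:
    heater_on = False
    while True:
        current = read_monitor_photocurrent()
        if current < TARGET_UA - HYSTERESIS_UA:
            heater_on = True      # light level fell: warm the device back into tune
        elif current > TARGET_UA + HYSTERESIS_UA:
            heater_on = False     # back above target: let it cool
        set_heater(heater_on)
        time.sleep(0.001)         # poll every millisecond (arbitrary)
```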

With all of those problems handled, the authors were able to build a mixed electronic/optical chip featuring 70 million transistors and 850 photonic components. The processor itself is a dual-core RISC-V, an architecture used in academic circles; it’s capable of operating in the gigahertz range. Memory was on a separate chip, which could be located an arbitrary distance away, limited only by optical fiber length.

Both the processor and the memory chips have three external optical interfaces. The first simply accepts light from a laser source for use by the chips. A second carries data from the processor to the memory. The third carries it back. Each of the data links has a bandwidth of 2.5 gigabits/second, for an aggregate bandwidth of 5 Gb/s. That’s in the neighborhood of a current phone’s memory bandwidth and well below what’s possible with optical systems, so there’s a lot of work to be done. And since the processor’s clock is locked to a multiple of the memory access rate, it runs at a pokey 31 MHz when using the optical memory.
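The arithmetic behind those numbers is simple enough to check. The figures below are the article’s; the exact clock divider isn’t given, so 31 MHz is treated as a given rather than derived:

```python
# Quick arithmetic check of the link numbers quoted above (values from the article).

LINK_GBPS = 2.5    # each direction of the processor<->memory optical link
DATA_LINKS = 2     # one link each way (the third interface only delivers laser light)

aggregate_gbps = LINK_GBPS * DATA_LINKS
print(f"aggregate memory bandwidth: {aggregate_gbps:.1f} Gb/s")   # 5.0 Gb/s

# The processor clock is locked to a multiple of the memory access rate, so the
# demo runs at the 31 MHz quoted above; the divider itself isn't stated here.
CPU_MHZ = 31
print(f"demo clock: {CPU_MHZ} MHz (vs. the GHz-class capability of the cores)")
```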

Still, it all actually works. In the researchers’ demonstrations, the optical buses are able to support everything from simple “hello world” programs to rendering a teapot in 3D.

While there’s still a lot of work left to get this technology up to its potential, optical hardware may start showing up close to the CPU well before then, in the form of separate optical chips. The next logical step is to start linking hardware within clusters via optical interconnects (rather than copper), and a number of companies are already testing this sort of hardware. After that, storage is likely to see optical connections before anything else.

So in many ways, this work is extremely forward-looking. It’s an early try at a problem that we don’t even need to solve just yet. Still, by demonstrating an all-CMOS approach and identifying some of its limits and bottlenecks, the research team may have laid out a path that will get us where we eventually want to go.

Source: University of Colorado, Nature, Ars Technica
