ENSPIRING.ai: The Race to Build a Perfect Computer Chip
The video explores the significant environmental impact of the digital economy, highlighting that 1% of the world's carbon emissions come from streaming video, and digital activities utilize around 7% of global electricity. With reliance on digital devices growing, pressure is mounting on data centers to increase energy efficiency. Researchers are developing low-energy computer chips as a solution, with the potential to improve energy use across various fields by enhancing chip technology.
Focus is placed on innovative computing technologies such as carbon nanotubes and silicon photonics, which promise energy-efficient computing. Carbon nanotubes could outperform silicon in terms of energy consumption and computational capabilities, but challenges remain in their processing and manufacturing. Silicon photonics offer fast and efficient data transmission, leveraging light rather than electricity, potentially reducing energy consumption in large data centers. However, transitioning from current silicon technology poses significant economic and technological hurdles.
Key Vocabulary and Common Phrases:
1. carbon emissions [ˈkɑːrbən ɪˈmɪʃənz] - (n.) - The release of carbon, particularly in the form of carbon dioxide, into the atmosphere, often from burning fossil fuels. - Synonyms: (greenhouse gases, CO2 emissions, pollutants)
One data point is that 1% of the world's carbon emissions are created by people just streaming video over their phones or on their PCs.
2. semiconductors [ˌsɛmɪkənˈdʌktər] - (n.) - Materials that have a conductivity between conductors and insulators, used in electronic circuits. - Synonyms: (microchips, chips, processors)
Almost everything in modern life depends on silicon based semiconductors called chips.
3. transistors [trænˈzɪstərz] - (n.) - Tiny electronic components used to amplify and switch electronic signals and electrical power. - Synonyms: (amplifiers, valves, switches)
At a very fundamental level, semiconductors are made up of what's called transistors, and these are the absolutely microscopic digital switches.
4. exponential [ˌɛkspəˈnɛnʃəl] - (adj.) - Increasing rapidly, at a rate proportional to the current value, as in mathematical growth. - Synonyms: (extensive, rapid, steep)
For half a century, the semiconductor industry has made chips exponentially faster and better by shrinking the size of transistors.
5. nanotubes [ˈnænəˌtjuːbz] - (n.) - Cylindrical structures with a diameter on the nanometer scale, composed primarily of carbon atoms. - Synonyms: (carbon nanotubes, buckytubes, nanorods)
The really cool thing about nanotubes is they conduct electricity better than just about any other material that's ever been discovered
6. stochastic [stoʊˈkæstɪk] - (adj.) - Randomly determined; involving an element of chance or probability. - Synonyms: (random, probabilistic, arbitrary)
It's also partly stochastic. A neuron may spike or may not spike, depending upon inherent internal randomness.
7. analog [ˈænəlˌɒɡ] - (adj.) - A method of representing data using continuous signals, as opposed to digital. - Synonyms: (continuous, analogue, non-digital)
Neurons and synapses. They are fundamentally different from the way computers compute....The brain sort of does a similar thing, except that it works in analog.
8. photonic [foʊˈtɒnɪk] - (adj.) - Relating to the use of light (photons) to process and transmit data. - Synonyms: (light-based, optical, luminescent)
Photonics is using photons, which is the basic constituency of light.
9. tunneling [ˈtʌnəlɪŋ] - (n.) - The quantum mechanical phenomenon where particles pass through a barrier they classically shouldn’t be able to. - Synonyms: (quantum tunneling, barrier penetration, quantum leap)
Each neuron is also highly efficient by using a quantum mechanics phenomenon called band to band tunneling.
10. neuromorphic [ˌnjuːroʊˈmɔrfɪk] - (adj.) - Relating to computing systems that mimic the neural structure and operation of the human brain. - Synonyms: (brain-inspired, neural, biomimetic)
Probably the majority of the groups that are working with Loihi see robotics as kind of the long term best application domain for neuromorphic chips.
The Race to Build a Perfect Computer Chip
There are various estimates of how big a drain on the world's resources the digital economy is. One data point is that 1% of the world's carbon emissions are created by people just streaming video over their phones or on their PCs. Smartphones, computers, data centers, and all the digital activities we take for granted are estimated to use around 7% of the world's electricity. And that's only going to get worse as we rely more and more on our devices.
Probably the best way to look at that is what's going on in data centers. These are these giant factories full of computers that are processing all of the information that you generate. These giant data centers are on a course to be using much more electricity by perhaps the end of the decade. Clearly, that kind of a drain is just not sustainable. We need new solutions. We need novel solutions.
Scientists and startups around the world are developing low energy computer chips, which could be better for the environment and upgrade the intelligence of our digital devices. Everything's tied to computers getting better. It could be finance, it could be chemical engineering. Name your field. So my goal since founding the company has been to develop a new computing paradigm, a new technology platform that will allow computers to continue to make progress.
It's not only about design, but also sending them for fabrication, testing them, and making system prototypes for various applications. The focus of the application would be reducing the energy consumption to the minimum level. It's a very high risk, high reward type problem. So if we're successful, then we'll transform this industry. It'll be this enormous leap in this field.
From space hardware to toasters, almost everything in modern life depends on silicon based semiconductors called chips. They're the things that make electronic items smart. At a very fundamental level, semiconductors are made up of what's called transistors, and these are the absolutely microscopic digital switches. The on or offs. On or off means one or zero in the digital realm.
Say a modern graphics chip will have tens of billions of transistors on it. And this is something that's the size of your thumbnail. So how do we get all of these tiny switches onto something relatively small? Well, we do that by building up layers of material on a disk of silicon, etching patterns into those materials, layering on other materials, until we have these tiny circuits that give the chip its function.
For half a century, the semiconductor industry has made chips exponentially faster and better by shrinking the size of transistors, turning room sized computers into pocket sized smartphones, propelling the digital revolution. Clearly, new technology is presenting both management and staff with an ever growing range of choices about how and where work will be done in the future. But traditional computing is reaching its limit.
State of the art, mass produced chips are made with what's called five nanometer technology, a dimension that's smaller than a virus. And materials inside some devices are already one atom thick, meaning they can't be made any smaller. We're reaching the limits of physics. We're reaching the limits of energy density. The increasing consensus in the chip industry is that these advances that we've got from silicon are beginning to come to an end.
So if we really want to take advantage of the full potential of artificial intelligence and the absolute ocean of data that we're all creating today, we're going to need new materials, we're going to need new programming models, we're going to need new versions of computers. My first time learning and discovering what a carbon nanotube was, was when I was just about finishing up my undergraduate degree, and I saw a presentation by the vice president of research of IBM. And immediately I thought it was very special.
A carbon nanotube is a material made entirely of carbon atoms. The carbon atoms are arranged in a tube. Imagine taking a sheet of carbon atoms and rolling them into a tube, and then making that tube smaller and smaller and smaller and smaller, until the diameter of that tube was just a few atoms. It's extremely small, which is about 100,000 times smaller than the diameter of an eyelash and 100 times smaller than a virus.
The really cool thing about nanotubes is they conduct electricity better than just about any other material that's ever been discovered. Electrons will move along the length of a nanotube faster than they do in silicon. And that means you can get, ultimately, faster switching between on and off states. You can make faster computer chips. You could turn them on and off with less voltage. And that means they use less electricity, less power than silicon. In theory, nanotubes will be able to do a thousand times better than silicon. Same computational capabilities, 1000 times less power.
Another advantage is that carbon nanotubes can be processed at a low temperature, so layers of nanotubes can be built on top of one another. Silicon has to be processed at an extremely high temperature, which makes 3D layers much harder. If you really think about a city, what happens when you run out of real estate in a city is you build up, you build skyscrapers, you build into the third dimension.
And so if you can't make the transistors smaller, then you could improve your computer chip by adding more transistors, building multiple layers of transistors on top of other transistors. Since the 1990s, when nanotubes were first discovered in Japan, different methods have been developed to mass produce them. But every process creates two types of nanotubes, metallic ones and semiconducting ones.
A metallic nanotube is like a copper wire: it's stuck in the metallic state. You can't switch it, and that kills a circuit. So we need only semiconducting nanotubes to really make nanotube electronics work. There are molecules and polymers that can be mixed in with the carbon nanotube powder, which is this tangled mess of nanotubes, and those molecules and polymers will stick to just the semiconducting ones and not the metallic ones, or they'll stick differently. And then you can sort the nanotubes and separate them based on these differences.
Initially, in a powder, about 67% of the nanotubes are semiconducting. But using these chemical approaches, we can extract semiconducting nanotubes at a purity of over 99.99%. After the semiconducting nanotubes have been extracted, they float around in a solution. And so the next challenge is to line them up neatly on a silicon wafer, which can then be turned into a computer chip.
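A quick back-of-the-envelope calculation (my own illustrative numbers, not figures from the video) shows why purity matters so much once a single metallic nanotube can kill a circuit:

```latex
% If each circuit contains n nanotubes and each tube is semiconducting
% with probability p, and one metallic tube ruins the circuit, then
\text{yield} \;=\; p^{\,n}
% With the purity quoted above, p = 0.9999, and an assumed n = 10\,000
% tubes per circuit:
0.9999^{10\,000} \;\approx\; e^{-1} \;\approx\; 0.37
% So even 99.99% purity would leave roughly two thirds of such circuits
% broken, which is why still higher purities are ultimately needed at
% chip scale.
```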
Carbon nanotubes really got me hooked and excited right from the beginning, and they really have the opportunity to potentially revolutionize electronics. The challenge of aligning them in these highly packed, highly aligned arrays has really been frustrating the field of carbon nanotube electronics since their discovery.
During my PhD, I was tasked with determining how we can better align carbon nanotubes. We found, kind of by accident, that when we layer carbon nanotube inks, where carbon nanotubes are dissolved in solutions like organic solvents, on water, then instantaneously those carbon nanotubes will collect and confine at that ink-water interface.
And that induced ordering by itself really helps to align the carbon nanotubes. What you're seeing here is an aligned array of hundreds or thousands of carbon nanotubes. You can see individual nanotubes, these light colored regions, and then the dark colored region is the substrate. They're all really well aligned with respect to each other. Occasionally, there's a misaligned nanotube in the array, and then we need to improve our manufacturing process to eliminate those instances.
The biggest current challenge is being able to achieve that high alignment in extremely uniform arrays across 300 millimeter wafers, which are the industry standard for silicon, before manufacturers will consider switching from silicon to integrate carbon nanotubes instead. If we can overcome these challenges, if we can make these aligned nanotube wafers, the major players in industry will really jump into this field, and progress at that point could be very rapid.
The technique isn't perfect yet, but it's already an advance for carbon nanotube research. Michael and Catherine have founded a company with the aim of solving the remaining challenges. But many more breakthroughs are needed before nanotube transistors have even a chance of replacing silicon.
It absolutely shows promise, but it's been showing promise for 20 years. There are many issues. You know, how robust are these devices? But more importantly, can you manufacture them? The semiconductor industry is based on silicon transistors, and enormous sums have already been invested in infrastructure to manufacture that technology.
Plants have a price tag of $20 billion, take years to build, and need to run 24 hours a day to turn a profit. Change will be difficult without a guarantee that carbon nanotubes would be cheaper. Silicon has been around a long time. People know how to use it. They know how to program it, they know how to mass manufacture it. It's still, in terms of the economics, the winner. And until those economics change, nothing is going to replace it.
I've always been obsessed with computers. You know, I had an opportunity to work in industry at a large semiconductor company for a number of years, and I got to see there some of the fundamental challenges associated with continuing to shrink transistors, and I found that to be not a very exciting future.
So my goal with Lightmatter since founding the company has been to develop a new computing paradigm, a new technology platform that allows you to do calculations using photonic devices. Electronics uses electrons. That's the medium of transfer. That's what represents the data. Photonics is using photons, which is the basic constituency of light.
So, for example, a fiber optic cable that spans the Pacific Ocean is using light to transmit information. Why is it using light? Well, it's because the energy required to shove electrons along a copper cable would be absolutely enormous, and you would just have too much signal decay. So if you can convert that information into photons, you can send it faster and with less energy overhead.
So that's obviously a desirable thing. There's nothing faster than the speed of light. So when you think about the latency between when you make a query and when a photonic computer gives you an answer, it's really at the fundamental limit for how fast you could do these things. With electrical wires, there's a certain amount of time that it takes for them to turn on and off.
And every time you do, it takes a lot of energy to turn this wire off and on. Now, with a photonic computer, and photonics in general, you have a type of wire that takes almost no time at all, femtoseconds, maybe even attoseconds, to turn on. It takes almost no energy other than any light that's lost while the optical signal is propagating through that wire.
You have another great property with photonics, which is that you can send multiple data streams through this photonic wire, encoded in different colors or wavelengths, all at the same time. So you have massively parallel computation and communication through these nearly perfect wires.
The idea of silicon photonics has, again, been around for a long time. The problem comes with changing those electron-based signals into photons. The traditional way of doing that is to have all of these other pieces of equipment that are expensive and use lots of energy. Silicon photonics says, well, maybe we can just design a chip that directly deals in both electrons and photons, so that we can have the benefits of both worlds, and that's what we're supposedly on the cusp of doing.
Again, it's been promised for a long time that it's going to change the world, or has been about to change the world, and it hasn't yet. But Lightmatter still believes silicon photonics is going to happen, and it's harnessing existing laser technologies used in the telecoms and semiconductor processing industries.
Their chips are specifically being made for AI applications, like chatbots and self driving cars. The startup plans to launch their first photonic computer later this year. One of their current products is an interconnect, which enables chips to talk to each other. It could help solve energy consumption issues in data centers and potentially bring down costs in these huge factories.
Processors are closely arranged together so they can communicate at high speeds, but this also generates a massive amount of heat. So not only do you have to put the power in to power all of these components, but then you have to use an enormous amount of power to actually cool them down to stop them literally melting.
What this company is trying to do is to come up with interconnects and ways of connecting components that use light rather than electrons over wires. So that means, in theory, that you could have a processor in one room, memory in another, storage in another, and that would help with the density and the power and the heat, and that's definitely a solution.
That's being tried by numerous companies, but we're still waiting for that to show practical, everyday results and be widely adopted. The human brain is very efficient. Our brain operates all the time. It doesn't dissipate more than ten to 20 watts, which is a very small value, to solve the problem of a Rubik's cube.
So the amount of energy required to learn and then operate is very low compared to today's systems. Just solving the problem, putting learning aside, would itself need thousands of processors working in parallel, dissipating a lot of power, which can go beyond a megawatt.
These scientists in India have been trying to build low energy chips, mimicking the way the human brain works for years. The human brain is packed with neuron cells that communicate with each other through electrical pulses, known as spikes. Each neuron releases molecules that act as messengers and control if the electrical pulse is passed along the chain. This transmission happens in the gaps between neurons, which are called synapses.
Neurons and synapses. They are fundamentally different from the way computers compute. Computers use transistors, which are binary. They do zero or one, on or off. The brain sort of does a similar thing, except that it works in analog. It's also partly stochastic. A neuron may spike or may not spike, depending upon inherent internal randomness.
The brain itself doesn't like to do repetitive things. It likes to do everything with slight variance. This sort of enables creativity. It sort of enables problem solving. And the youngster's creative imagination is not neglected. Once there was a bear that was twice the size of these.
The connections are different. So if you look at a computer chip, every transistor roughly talks to about ten transistors. Every neuron, on the other hand, talks to about 10,000 other neurons. And that's literally what makes it super parallel. Neurons use time as a token of information.
Computers don't understand time per se. They just sort of work with digital numbers and do mathematical operations. Neurons do. To mimic the architecture of our brain more closely, the team designed an artificial neural net based on the biological neural networks in our brain. Each artificial neuron is built with a few silicon transistors and connected to other neurons.
We try to mimic the auditory cortex. The auditory cortex has these neurons which are connected at random, and that sounds weird, but it turns out it serves a very important function in speech processing. What we are doing is that there are recurrent connections, which means that a neuron could connect to any neuron, including itself, creating some sort of loopy pathways, which are fairly random.
And these loops are part of the architecture. This is called a liquid state machine. They are naturally able to process time series, like speech, which occurs in time. We are doing some analog compute, and there is some noise in the neurons, so there could be some stochasticity. In the traditional or conventional computation platform, everything is digital; everything is defined based on logic, one and zero. Analog computers, which ultimately work with signals that can have any value, would be able to get more complexity and, at the same time, more functionality in the operation.
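As a rough illustration of the architecture described above, here is a minimal sketch, with generic parameters of my own choosing, of spiking (leaky integrate-and-fire) neurons wired up with random recurrent connections and a little internal noise. It is not the team's chip design or code, just a toy software model of the same ideas.

```python
# Toy liquid-state-machine-style network: leaky integrate-and-fire neurons,
# sparse random recurrent connections (including possible self-loops), and
# stochastic noise so that spiking is not perfectly repeatable.
import numpy as np

rng = np.random.default_rng(0)

N = 50            # number of neurons
T = 200           # simulation steps
leak = 0.9        # membrane potential decay per step (the "leaky" part)
threshold = 1.0   # potential at which a neuron spikes
noise_std = 0.05  # internal randomness, so spiking is partly stochastic

# Sparse random recurrent weights: each neuron connects to a random subset.
W = rng.normal(0, 0.3, size=(N, N)) * (rng.random((N, N)) < 0.1)

v = np.zeros(N)              # membrane potentials
spikes = np.zeros((T, N))    # record of spikes over time

for t in range(T):
    # External input: a toy time-varying signal driving a few input neurons,
    # standing in for speech converted into spikes.
    external = np.zeros(N)
    external[:5] = 0.6 * (1 + np.sin(0.1 * t))

    # Integrate: leak the old potential, add recurrent input from the
    # previous step's spikes, the external drive, and internal noise.
    v = leak * v + W @ spikes[t - 1] + external + rng.normal(0, noise_std, N)

    # Fire: neurons above threshold spike and reset.
    fired = v >= threshold
    spikes[t] = fired
    v[fired] = 0.0

print("total spikes for the first ten neurons:", spikes.sum(axis=0)[:10])
```

The random matrix W plays the role of the "loopy pathways" mentioned above; in an actual liquid state machine, a separate trained readout layer would interpret the resulting spike patterns, which is omitted here for brevity.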
Here we have our neural network chip, which is on the test board. So what we'll do is we'll speak to the computer, and that will convert the speech into spikes, which get processed in the chip. We run the code. "One." So it detects a one.
This neuromorphic chip has been able to recognize speech using a fraction of the energy used by a traditional chip. Just like biological neurons, the artificial neurons wait for inputs and only spike when data comes in, and that means lower energy consumption. Each neuron is also highly efficient by using a quantum mechanics phenomenon called band to band tunneling, so electrons are still able to pass through transistors with very low electric current.
Quantum tunneling is nothing that humans essentially have experience of. If you have a human being that has to cross over a hill, the human being has to walk up the hill, burning energy, and then walk down, which again burns energy. And so there is no way for you to cross this barrier without using up energy.
But if you make this hill thinner and thinner, narrower and narrower, until it reaches a size scale on the order of, say, a few electron wavelengths, then an electron doesn't need to go above the barrier to surmount it. It can, in principle, tunnel through it, which means it will sort of disappear on one side and appear on the other side, magically.
If you take a transistor, there's this same idea that when it turns on, we have a barrier that we sort of reduce, and when we want to turn it off, we increase the barrier. That's when you expect no current to go through. But that is not true by quantum mechanics. So even through a barrier, when it's narrow enough, electrons can sort of, quote unquote, tunnel through it.
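The barrier-width dependence described here can be summarized with the standard textbook approximation for an electron tunneling through a rectangular barrier; this is a general physics formula, not one given in the video.

```latex
% Transmission probability for an electron of energy E through a
% rectangular barrier of height V_0 > E and width d (WKB-style estimate):
T \;\approx\; e^{-2\kappa d},
\qquad
\kappa \;=\; \frac{\sqrt{2m\,(V_0 - E)}}{\hbar}
% Because the width d sits in the exponent, even a modest narrowing of the
% barrier raises the tunneling probability enormously, which is why the
% effect only becomes significant at the scale of a few electron wavelengths.
```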
So far, there are 36 artificial neurons on the chip, and their energy consumption is remarkably close to that of biological neurons. If we want to approach the number of neurons in the human brain, we need billions of them, and therefore we need to reduce the energy per spike generated by a single neuron as much as possible.
Intel, one of the world's biggest chip makers, has been working on brain-inspired chips for seven years. They've released two versions of their neuromorphic research chip, Loihi. The latest one is only 32 mm² in size, but contains up to a million neurons.
The company said Loihi has demonstrated multiple applications, including recognizing hazardous odours and enabling robotic arms to feel. Yeah, in the near term, based on the most recent results we've had with Loihi, I would say that solving optimization problems is the most exciting application. Things like planning, navigating. You can think of it as solving the shortest path in a map, for example, if you want to get from point A to point B.
These kinds of optimization problems are conventionally really compute heavy. But we found that on a neuromorphic architecture, we can reduce the computational demands by orders of magnitude: over a ten times speed up, as well as over a 1000 times energy reduction. Probably the majority of the groups that are working with Loihi see robotics as kind of the long term best application domain for neuromorphic chips.
After all, brains evolved in nature to control bodies, to deal with real-world stimuli in real time, and to adapt to the unpredictable aspects of the real world. And if you think about robotics today, robots perform very rigidly prescribed tasks. But if you want to deploy a robot into the real world, we really don't have the technology yet to make them as agile and as adaptive as we really would like. And so that's where we see neuromorphic chips really thriving.
Intel and its collaborators have used the Loihi chip to power iCub, a humanoid robot designed to learn in a way that children might. This iCub robot is intended to interact with the environment, as opposed to being pre-trained with static datasets beforehand. So using saccadic motions, moving the eyeballs, it can trace through and detect changes or edges in the objects in front of the robot.
And using a fully neuromorphic paradigm, we're able to design a network that understands and learns new object classes, even at different angles and different poses. It adds to its understanding, its kind of dictionary of understood objects, over time. What is this called? That's a mouse. Oh, I see, this is a mouse.
Let me have a look at that. Sure. Can you show me the mouse? Here is the mouse. So far, all of this technology is at an early stage, but to fight climate change and power the progress of civilization, we need at least one of them to be a breakthrough.
Greener chips absolutely have to happen. People are concerned about the environmental impact, people are concerned about efficiency.
People are concerned about the results and the way that chips can impact modern life. So there's been, you know, an absolute blossoming of investment in chip startups and trying these things. This is definitely the time of great ideas.
Will they all be successful? Absolutely not. But now, I would say more than ever, there is a chance that out there in some corner of the world, there is a chip startup that will be everywhere within maybe the next ten years. You can't get your enjoyment from only the successes in science. You need to get your enjoyment from just enjoying solving hard problems. And so if you fail, that is just part of the process of eventually finding a solution.
Technology, Innovation, Science, Carbon Nanotubes, Energy Efficiency, Neuromorphic Computing, Bloomberg Originals