The human brain is a massively parallel computational machine that consumes about twenty percent of the energy your body produces, which translates to roughly twelve watts. By comparison, a standard incandescent light bulb draws sixty watts of power! The brain uses on the order of one hundred billion neurons, each with about ten thousand connections to its neighbouring neurons. A relatively powerful, recent computer contains anywhere between one and eight cores, which together comprise the Central Processing Unit (CPU). The CPU is composed of around two to four billion transistors: electrically controlled switches that convert voltage signals into the binary (zeroes and ones) instructions the computer processes.
The first few computers ran on CPUs with fewer than two thousand transistors each! Gordon Moore, a co-founder of Intel, observed in 1965 that the number of transistors that could be packed onto an integrated circuit was doubling at a steady rate, a period popularly quoted as every eighteen months to two years. This observation, known as Moore's Law, has held remarkably well ever since. However, it is expected to break down within the next decade or so. Why?
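To see what that doubling actually implies, here is a back-of-the-envelope sketch in Python. The starting point (the Intel 4004's roughly 2,300 transistors in 1971) and the two-year doubling period are assumptions chosen for illustration:

```python
# Back-of-the-envelope Moore's Law projection.
# Assumed starting point: Intel 4004 (1971), ~2,300 transistors.
START_YEAR = 1971
START_TRANSISTORS = 2_300
DOUBLING_PERIOD_YEARS = 2  # the commonly quoted two-year doubling

def projected_transistors(year: int) -> float:
    """Project the transistor count for a given year by repeated doubling."""
    doublings = (year - START_YEAR) / DOUBLING_PERIOD_YEARS
    return START_TRANSISTORS * 2 ** doublings

for year in (1971, 1991, 2011):
    print(year, f"{projected_transistors(year):,.0f}")
```

With these assumptions the projection for 2011 lands at about 2.4 billion transistors, right in the two-to-four-billion ballpark mentioned above. That is the unnerving power of steady doubling.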
As transistors become smaller, the distance between their source and drain shrinks. Beyond a certain point, quantum effects take over, and you can no longer say precisely where the electrons are. They begin to demonstrate quantum tunnelling, an effect seen over very short distances, on the order of an atom's width: electrons tunnel straight through the barriers that are supposed to confine them, leaking current into the substrate and into neighbouring transistors. At that stage the transistors can no longer switch reliably, and dependable computation becomes impossible.
What is the way out? Several ideas have been proposed by scientists worldwide, including neuromorphic chips, quantum computing, and massively parallel architectures. Quantum computing is a unique form of computing that involves manipulating and measuring the states of quantum mechanical systems called qubits. A qubit can sit in a superposition of its two basis states, for example one that measures as "up" 40% of the time and "down" 60% of the time. The amplitudes describing such a state are continuous rather than binary, so a qubit carries far richer state than the strict zero-or-one bits the computers we use today can process. This may not speed up the operations that have become synonymous with computers these days, such as browsing the web or using a photo editing application, but it will enable us to attack extremely difficult optimization problems, deep learning problems, and identification problems, tasks which plague supercomputers even today. But quantum computers don't always give the same answers. Yet.
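A toy simulation makes that 40/60 superposition concrete. The amplitudes below are assumed purely for illustration; measuring collapses the state to a plain 0 or 1 with probabilities given by the squared amplitudes, which is also why repeated runs of a quantum computation can give different answers:

```python
import math
import random

# A single qubit state a|0> + b|1>, with |a|^2 + |b|^2 = 1.
# Assumed amplitudes matching the 40% "up" / 60% "down" example above.
a = math.sqrt(0.4)  # amplitude of |0> ("up")
b = math.sqrt(0.6)  # amplitude of |1> ("down")

assert abs(a**2 + b**2 - 1.0) < 1e-12  # the state must be normalised

def measure() -> int:
    """Collapse the superposition: return 0 or 1 with the squared-amplitude probabilities."""
    return 0 if random.random() < a**2 else 1

counts = [0, 0]
for _ in range(10_000):
    counts[measure()] += 1
print(counts)  # roughly [4000, 6000], varying run to run
```

The continuous amplitudes are where the extra expressive power lives; the catch is that a measurement only ever hands back one classical bit.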
Massively parallel architectures are a way of mimicking the human brain, enabling the real-time processing tasks humans excel at, such as image and pattern recognition, speech recognition, and multitasking. IBM's supercomputer platform Watson, which beat human champions at Jeopardy! in February 2011, ran on 90 IBM Power 750 servers, each of which required about a thousand watts of power to run! That makes 90 kilowatts of power to run something that could do just one of the things the human brain can.
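For a sense of scale, the power gap is a one-liner, using the figures quoted above (90 servers at roughly 1 kW each, against the brain's ~12 W):

```python
# Rough power comparison between Watson and a human brain,
# using the back-of-the-envelope figures from this post.
watson_watts = 90 * 1_000  # 90 servers at ~1 kW each
brain_watts = 12           # the ~12 W figure from the opening paragraph

ratio = watson_watts / brain_watts
print(ratio)  # 7500.0
```

Seventy-five hundred brains' worth of power, for a single brain-like skill.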
This post is stretching way too far, so I will leave it hanging, hopefully to be continued soon. I hope I have piqued your curiosity. Here are a few links and references. I strongly recommend the YouTube videos; they were fantastic.
How amazing it is that we, with our monkey heritage and monkey brains and monkey fingers, have somehow lucked into a brain that allows us to ask legitimate questions about the nature of physical reality. That’s so cool.
- Michio Kaku’s book: The Physics of the Impossible
- Google and NASA’s Quantum AI Lab
- Transistors and the end of Moore’s Law
- How does a Quantum Computer work?
- Your Brain is Plastic.
- Did you watch the four above? 😛