“Complexity scaling is also in decline owing to the non-scalability of traditional von Neumann computing architecture and the impending ‘Dark Silicon’ era that presents a significant threat to multi-core processor technology,” the researchers note in the Sept. 13 online issue of Nature Communications.
The Dark Silicon era is already upon us to some degree, and refers to the inability to power up most or all of the devices on a computer chip at once. This happens because of too much heat generated from a single device. Von Neumann architecture is the standard structure of most modern computers and relies on a digital approach, “yes” or “no” answers, in which program instructions and data are stored in the same memory and share the same communications channel.
“Because of this, data operations and instruction acquisition cannot be performed at the same time,” said Saptarshi Das, assistant professor of engineering science and mechanics. “For complex decision-making using neural networks, you might need a cluster of supercomputers trying to use parallel processors at the same time, a million laptops in parallel, and that would fill a basketball court. Portable healthcare devices, for instance, can’t work that way.”
The solution, according to Das, is to create brain-inspired, analog, statistical neural networks that do not rely on devices that are simply on or off, but instead provide a range of probabilistic responses that are then compared with the learned database in the machine. To do this, the researchers developed a Gaussian field-effect transistor made of 2D materials: molybdenum disulfide and black phosphorus. These devices are more energy efficient and produce less heat, which makes them ideal for scaling up systems.
“The human brain operates seamlessly on 20 watts of power,” said Das. “It is more energy efficient, containing 100 billion neurons, and it doesn’t use von Neumann architecture.”
The researchers note that it isn’t just energy and heat that have become problems; it is also becoming difficult to fit more devices into smaller spaces.
“Size scaling has stopped,” said Das. “We can only fit approximately 1 billion transistors on a chip. We need more complexity, like the brain.”
The notion of probabilistic neural networks has been around since the 1980s, but it required specific devices for implementation.
“Similar to the working of a human brain, key features are extracted from a set of training samples to help the neural network learn,” said Amritanand Sebastian, graduate student in engineering science and mechanics.
The researchers tested their neural network on human electroencephalographs (EEGs), graphical representations of brain waves. After feeding the network many examples of EEGs, the network could then take a new EEG signal, analyze it, and determine whether the subject was sleeping.
“We don’t need as extensive a training period or base of information for a probabilistic neural network as we need for an artificial neural network,” said Das.
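The idea behind a probabilistic neural network can be illustrated with a minimal sketch. The version below is the classic Parzen-window formulation (each training sample contributes a Gaussian kernel, and the class with the highest summed activation wins); it is not the researchers’ implementation, and the "awake"/"asleep" toy data merely stands in for features extracted from EEG signals.

```python
import numpy as np

def pnn_classify(x, train_samples, train_labels, sigma=1.0):
    """Probabilistic neural network (Parzen-window) classifier:
    each training sample is a 'pattern neuron' with a Gaussian
    activation; the class with the highest mean activation wins."""
    scores = {}
    labels = np.array(train_labels)
    for label in set(train_labels):
        cls = train_samples[labels == label]
        # Gaussian activation of every pattern neuron in this class
        k = np.exp(-np.sum((cls - x) ** 2, axis=1) / (2 * sigma ** 2))
        scores[label] = k.mean()
    return max(scores, key=scores.get)

# Toy clusters standing in for "awake" vs. "asleep" EEG features
rng = np.random.default_rng(0)
awake = rng.normal(loc=0.0, scale=0.3, size=(20, 2))
asleep = rng.normal(loc=2.0, scale=0.3, size=(20, 2))
X = np.vstack([awake, asleep])
y = ["awake"] * 20 + ["asleep"] * 20

print(pnn_classify(np.array([1.9, 2.1]), X, y))
```

Note that, unlike a conventional artificial neural network, there is no iterative weight training here: the "learning" is simply storing the samples, which is why such networks need far less training than backpropagation-based ones.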
The researchers see statistical neural network computing having applications in medicine, because diagnostic decisions aren’t always 100% yes or no. They also recognize that for the greatest impact, medical diagnostic devices need to be small, portable, and use minimal energy.
Das and colleagues call their device a Gaussian synapse. It is based on a two-transistor setup in which the molybdenum disulfide acts as an electron conductor, while the black phosphorus conducts through missing electrons, or holes. The device is essentially two variable resistors in series, and the combination produces a curve with two tails, which matches a Gaussian function.
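The series-resistor picture can be sketched numerically. In the toy model below, the n-type (MoS2-like) branch conductance rises with gate voltage while the p-type (black-phosphorus-like) branch falls; their series combination peaks in between and decays on both sides, tracing a Gaussian-like bell. The sigmoid conductance model and the threshold-voltage values are illustrative assumptions, not device parameters from the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gaussian_synapse_conductance(vg, vth_n=-1.0, vth_p=1.0, g_max=1.0):
    """Toy model of two channels in series: the n-branch turns on
    as gate voltage rises, the p-branch turns off. Resistors in
    series combine as G = G_n * G_p / (G_n + G_p), so the total
    conductance peaks between the two thresholds and falls off
    on both sides, like a Gaussian."""
    g_n = g_max * sigmoid(vg - vth_n)      # electron (MoS2-like) branch
    g_p = g_max * sigmoid(-(vg - vth_p))   # hole (black-phosphorus-like) branch
    return g_n * g_p / (g_n + g_p)

vg = np.linspace(-5, 5, 201)
g = gaussian_synapse_conductance(vg)
peak_vg = vg[np.argmax(g)]  # peak sits between the two threshold voltages
```

The two tails of the resulting curve come from whichever branch is nearly off dominating the series resistance, which is the qualitative behavior the article attributes to the Gaussian synapse.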
Others working on this project were Andrew Pannone, undergraduate in engineering science and mechanics, and Shiva Subbulakshmi, a student in electrical engineering at Amrita Vishwa Vidyapeetham, India, and a summer intern in the Das laboratory.
The Air Force Office of Scientific Research supported this work.