"Inexact" chips save power by fudging the maths
By Stewart Mitchell
Posted on 18 May 2012 at 09:55
Computer scientists have unveiled a computer chip that turns traditional thinking about mathematical accuracy on its head by fudging calculations.
The Rice University researchers say their “inexact” chip could be useful because it uses dramatically less power than conventional accurate processors.
The scientists claim the prototypes are up to 15 times more efficient because they allow occasional errors, and could be used in some applications without any negative effect.
The concept works by allowing processing components — such as hardware for adding and multiplying numbers — to make a few mistakes, which means they do less work, so they use less power and get through tasks more quickly.
The researchers say the technology is able to "manage the probability of errors and limit which calculations produce errors", in a move that "can simultaneously cut energy demands and dramatically boost performance".
One example of the idea in action can be seen with "pruning, or trimming away some of the rarely used portions of digital circuits on a microchip".
Obviously the chips wouldn't be suitable for applications requiring high accuracy, such as financial trading or computer modelling, but the researchers say other tasks can be more flexible.
“Particular types of applications can tolerate quite a bit of error. For example, the human eye has a built-in mechanism for error correction,” said researcher Christian Enz.
“We used inexact adders to process images and found that relative errors up to 0.54% were almost indiscernible, and relative errors as high as 7.5% still produced discernible images.”
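As a rough illustration of how an inexact adder trades accuracy for work, here is a software sketch of one well-known approximate-adder scheme, the "lower-part OR" adder, in which the low bits are OR-ed together instead of being added with carry propagation. This is an assumption for illustration only — the article does not specify the Rice team's actual circuit design.

```python
def inexact_add(a, b, k=4):
    """Approximate adder sketch: the low k bits are OR-ed (no carry
    propagation, so cheap in hardware), while the upper bits are added
    exactly. Illustrative only -- not the Rice chip's actual scheme."""
    mask = (1 << k) - 1
    low = (a & mask) | (b & mask)       # carry-free low bits (may be wrong)
    high = (a & ~mask) + (b & ~mask)    # exact high bits
    return high | low

# Example on 8-bit pixel-like values: a small relative error
exact = 100 + 30                        # 130
approx = inexact_add(100, 30)           # 126
rel_err = abs(approx - exact) / exact   # about 3%, often invisible in images
```

Because only the high bits carry, the worst-case error is bounded by the width of the pruned low part — which mirrors the researchers' point that small, bounded relative errors are often indiscernible in images.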
According to the team, initial uses for the pruning technology would be in application-specific processors, such as embedded chips used in hearing aids and cameras.
“In the latest tests, we showed that pruning could cut energy demands by 3.5 times in chips that deviated from the correct value by an average of 0.25%,” the researchers said.
“When we factored in size and speed gains, these chips were 7.5 times more efficient than regular chips. Chips that got wrong answers with a larger deviation of about 8% were up to 15 times more efficient.”
Clive Sinclair invented this ...
way back in the late 1970s with his first scientific calculator. Basically it just made up the answers to trig functions!
By JohnAHind on 18 May 2012
[quote]Obviously the chips wouldn't be suitable for applications requiring high accuracy, such as financial trading or computer modelling, but the researchers say other tasks can be more flexible.[/quote]
Maybe someone should have explained this to the banks a few years ago ...
By Aasta on 18 May 2012
In fairness to the banks' systems, the answers they were producing were correct; it was just that the bankers were usually asking the wrong questions...
By Mr_John_T on 18 May 2012
Yeah, the bank computers merely let people ask the wrong questions very, very quickly and accurately.
By steviesteveo on 19 May 2012
Sounds a bit strange...
But given some tasks, such as image processing and voice recognition, probably have some natural error in them anyway, such chips could improve the performance/power ratio for many things. Especially in embedded items, such as the hearing aids mentioned.
In fact, even some computer modelling can probably tolerate some error. For example, given the "butterfly effect" seen in weather modelling, some "noise" in the calculations may paradoxically lead to more accurate end results, if it allows multiple scenarios to be run more efficiently.
It's also interesting that biological systems have similar "noisy" processing systems, which may exist precisely because overall they are more efficient than a 100% solution.
By Penfolduk01 on 19 May 2012
Yes, and when our brains come up with a solution based on incomplete info, or a short-cut, and then get it wrong, we call it "human error". As our technology gets faster and bigger, one person's error can be devastating. Compound that with inherent "computer error"...
What some have called "computer error" in the past has actually been human error, be it software, input error, or whatever. Seems to me that this "innovation" would give the unscrupulous opportunities to point the finger of blame at the computer.
It's no use anyone saying that these chips would not be used for safety- or mission-critical applications... if they're cheaper to make and/or run, they'll get used.
As to weather modelling, it was the minute rounding errors in the various computers used in the peer-review process to validate the original researchers' work that gave rise to the whole concept of Chaos Theory, because those equations are so sensitive to initial input conditions. The fact that some of the divergences were due to different processors, and some to differences in the handling of rounding errors in FP operations, makes the following point:
If you always want to get the wrong answer, fudge the hardware. If you only sometimes want the wrong answer, fudge the software where you need to.
By mikejdcastle on 24 May 2012
mikejdcastle is correct, if they're cheaper they'll get used.
By 345jon on 25 May 2012