Their device is highly power-conscious, massively parallel, and can manipulate data in arbitrary ways – even though it doesn't need to be explicitly designed to perform any particular task. The advance could pave the way for computers that think more like we do.
(When the) chips are down
Electronic chips as they are currently designed come with plenty of drawbacks. Even the simplest operations, like adding or subtracting, require large numbers of transistors arranged in a very specific, well-thought-out pattern. These transistors quickly add up and drain power even when idle, unless specific countermeasures are taken. Moreover, most circuits can't effectively process information in parallel, wasting further time and energy.
All of these factors make it especially hard for today's computers to perform many crucial tasks quickly and with little power – particularly the kinds of tasks that the human brain can tackle with ease, like recognizing a visual pattern or understanding human language. In fact, when it comes to simulating brain-like functionality, many researchers have opted to abandon traditional computer architectures altogether.