With a cool 170 teraflops of performance, the machine is designed to tackle the complex worlds of deep learning and artificial intelligence, areas of research requiring massive amounts of computing power.
The DGX-1 uses the company's newly developed Pascal architecture, which recently showed up in its beastly in-car supercomputer. The DGX-1 packs eight Tesla P100 GPUs, each with 16 gigabytes of memory. Alongside those, the knowledge-hungry supercomputer contains 512 GB of system RAM and four 1.92-terabyte solid-state drives. At 60 kg (132 lb), the DGX-1 might be a bit much for an under-the-desk job.
"As neural nets become larger and larger, we not only need faster GPUs with larger and faster memory, but also much faster GPU-to-GPU communication," said Yann LeCun, Director of AI Research at Facebook, when the DGX-1 was presented at yesterday's GPU Technology Conference in Silicon Valley.