As a senior in college 25 years ago, Jeff Dean built an artificial brain. Using what was considered a supercomputer at the time, he created a system that could analyze information and even learn. Trouble was, the assemblage didn't have enough mental muscle. "We just trained it on toy problems," he says of his neural network. "The computational power wasn't all that great."

Today things are different. As one of Google's earliest hires, Dean, now a senior fellow and one of the company's most important engineers, helped create software that could store and process data across thousands of machines. With names like Bigtable and MapReduce, these epic tools were the secret weapons that enabled Google's search engine to instantly serve hundreds of millions of people across the globe. Based on the research that Google later published, other companies like Facebook, Twitter, and Yahoo began using similar tools. And now, drawing on many of the same ideas that allow programs to juggle data across thousands of machines, people like Dean can finally construct high-power neural networks that work.

Dean and his colleagues have built massive neural nets that can reliably identify the voice commands you bark into Android phones or recognize the faces in images you post to the Google+ social network. Humans, Dean says, "have lots and lots of these neurons, and they're all trained to pick up on different types of patterns." The computer "neurons" pick up on patterns too. Result: crazy-complicated systems that do very interesting things.

Google is not alone. At Microsoft, similar tech underpins a new Skype tool that instantly translates from one language to another. Such miracles are powered by so-called deep learning, a particular breed of neural network. In the years to come, Dean says, it will remake far more than just Skype. If computers in the cloud can learn, so can the computers inside other machines. Think self-driving cars. And sentient robots.