Rice’s Yingyan Lin leads development of next-gen devices that think for themselves
If the internet of things (IoT) is going to live up to its potential, those “things” had better be smart. Yingyan Lin and her collaborators are on the case.
Lin is the principal investigator on a three-year, $1.38 million set of National Science Foundation (NSF) grants to facilitate real-time machine learning in devices at the “edges” of the internet.
Lin, an assistant professor of electrical and computer engineering at Rice’s Brown School of Engineering, will lead the team that includes Richard Baraniuk, Rice’s Victor E. Cameron Professor of Electrical and Computer Engineering, as well as Boris Murmann at Stanford University, Atlas Wang at Texas A&M University and Yiran Chen at Duke University.
They plan to design innovative hardware and software that will allow devices to learn on their own as they encounter new data.
“Many IoT devices, such as autonomous vehicles, robots and health care wearables, require such real-time and on-site learning capabilities to perform well in real-world applications,” Lin said.
“Machine-learning algorithms that are trained in powerful servers and later deployed into IoT devices can perform poorly,” she said. “This is because the data the algorithms see on the device itself can be very different from the pre-collected data they are trained with in the servers. It’s thus critical that algorithms in the devices learn from a continuous stream of new data in real time.”
Current “edge” devices are limited by their computing, energy and storage resources, while the energy, financial and environmental costs of training machine-learning algorithms are increasingly prohibitive, she said.
That has become a growing concern even for training in the cloud, Lin said. “A recent research paper reported on by the New York Times and MIT Technology Review estimated that training a single machine-learning model can generate a carbon footprint equal to the lifetime emissions of five American cars, including the gas to run them,” she said.
One idea the team plans to develop would facilitate training convolutional neural networks, the class of machine-learning models most often used to analyze visual data, with 80% less energy. A paper on the topic by Lin and her students at Rice and collaborators at Texas A&M will be presented at the prestigious Conference on Neural Information Processing Systems (NeurIPS) in December.
The NSF grants were issued in collaboration with the Defense Advanced Research Projects Agency (DARPA) for its Real-Time Machine Learning (RTML) program, part of the second phase of DARPA’s Electronics Resurgence Initiative (ERI), a five-year, $1.5 billion investment in the future of domestic, U.S. government and defense electronics systems. Lin’s team and other awardees will meet at DARPA’s Arlington, Virginia, headquarters in late October to strategize on the project’s three-year arc.