{"id":757162,"date":"2021-06-24T12:38:31","date_gmt":"2021-06-24T19:38:31","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-research-item&p=757162"},"modified":"2021-06-28T12:45:01","modified_gmt":"2021-06-28T19:45:01","slug":"introducing-retiarii-a-deep-learning-exploratory-training-framework-on-nni","status":"publish","type":"msr-video","link":"https:\/\/www.microsoft.com\/en-us\/research\/video\/introducing-retiarii-a-deep-learning-exploratory-training-framework-on-nni\/","title":{"rendered":"Introducing Retiarii: A deep learning exploratory-training framework on NNI"},"content":{"rendered":"

Traditional deep learning frameworks such as TensorFlow and PyTorch support training a single deep neural network (DNN) model, iteratively computing its weights. Designing a DNN model for a given task, however, remains an experimental science: in practice, it is a process of exploring many candidate models. Retrofitting such exploratory training onto the single-model training process that current frameworks support is unintuitive, cumbersome, and inefficient.<\/p>\n

In this webinar, Microsoft Research Asia Senior Researcher Quanlu Zhang and Principal Program Manager Scarlett Li will analyze these challenges in the context of Neural Architecture Search (NAS). The first part of the webinar focuses on Retiarii, a deep learning exploratory-training framework for DNN models. Retiarii introduces a key abstraction, the Mutator, which connects the specification of DNN model spaces with exploration strategies while exposing the correlations between models for optimization. A just-in-time (JIT) engine instantiates models, manages their training, gathers information for the exploration strategy to consume, and executes the strategy\u2019s decisions. By identifying correlations between the instantiated models, Retiarii applies cross-model optimizations that improve the overall exploratory-training process. The benefits include ease of programming, reuse of components, and vastly improved overall exploratory-training efficiency (up to 8.58x).<\/p>\n
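To make the Mutator idea concrete, here is a minimal conceptual sketch in plain Python. All names here (`Mutator`, `random_strategy`, the toy model fields) are hypothetical illustrations, not the actual Retiarii or NNI API: each mutator declares the choices for one mutable part of a base model, and an exploration strategy samples those choices to instantiate concrete models from the space.

```python
import random

class Mutator:
    """Declares a set of candidate choices for one mutable part of a model.
    (Hypothetical sketch of the concept, not the Retiarii API.)"""
    def __init__(self, name, candidates):
        self.name = name
        self.candidates = candidates

    def apply(self, model, choice):
        # Return a new model with this part rewritten; the base stays untouched.
        model = dict(model)
        model[self.name] = choice
        return model

def random_strategy(base_model, mutators, budget, seed=0):
    """A trivial exploration strategy: randomly sample models from the space."""
    rng = random.Random(seed)
    for _ in range(budget):
        model = base_model
        for m in mutators:
            model = m.apply(model, rng.choice(m.candidates))
        yield model

# A toy "model" and two mutators defining a 3 x 3 = 9-model space.
base = {"conv": "3x3", "width": 16}
mutators = [
    Mutator("conv", ["3x3", "5x5", "7x7"]),
    Mutator("width", [16, 32, 64]),
]

# Instantiate a few concrete models from the space.
sampled = list(random_strategy(base, mutators, budget=4))
```

Because every sampled model is expressed as choices over the same mutators, an engine can see which models share structure, which is the hook Retiarii uses for cross-model optimization.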

The second part of the talk introduces Retiarii\u2019s implementation in the open-source Neural Network Intelligence (NNI) project and shows how the toolkit enables users to run state-of-the-art NAS more efficiently.<\/p>\n

Together, you\u2019ll explore:<\/p>\n