{"id":788765,"date":"2021-10-26T19:53:15","date_gmt":"2021-10-27T02:53:15","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-project&p=788765"},"modified":"2021-11-21T08:26:44","modified_gmt":"2021-11-21T16:26:44","slug":"automl-nas","status":"publish","type":"msr-project","link":"https:\/\/www.microsoft.com\/en-us\/research\/project\/automl-nas\/","title":{"rendered":"AutoML-NAS"},"content":{"rendered":"
\n\t
\n\t\t
\n\t\t\t\t\t<\/div>\n\t\t\n\t\t
\n\t\t\t\n\t\t\t
\n\t\t\t\t\n\t\t\t\t
\n\t\t\t\t\t\n\t\t\t\t\t
\n\t\t\t\t\t\t
\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\n\n

AutoML-NAS<\/h1>\n\n\n\n

<\/p>\n\n\t\t\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t<\/div>\n\t\t<\/div>\n\t<\/div>\n<\/section>\n\n\n\n\n\n

AutoML, which aims to automate the machine learning pipeline, has attracted considerable attention in the research community and significant interest from industry and the media. An ML system typically learns from a given dataset D by optimizing a certain loss L within a particular hypothesis (function) space F. AutoML thus covers a wide spectrum of important problems:
1. Data, which aims to find the best training data D and data-processing pipeline for the task at hand. Data plays a role in machine learning similar to that of textbooks in human learning.
2. Loss function, which aims to design the most appropriate loss function L to be optimized.
3. Hypothesis space, which aims to identify the hypothesis space F to which the model belongs.<\/p>\n\n\n\n
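The (D, L, F) view above can be made concrete with a minimal sketch: learning selects the hypothesis in F that minimizes the loss L over the dataset D. The toy dataset, linear hypothesis space, and squared-error loss below are illustrative assumptions, not part of the project itself.

```python
# D: a toy dataset of (x, y) pairs
D = [(0.0, 0.1), (1.0, 0.9), (2.0, 2.1), (3.0, 2.9)]

# F: a tiny hypothesis space of linear functions f(x) = w * x
F = [lambda x, w=w: w * x for w in (0.0, 0.5, 1.0, 1.5)]

# L: squared-error loss averaged over the dataset
def L(f, data):
    return sum((f(x) - y) ** 2 for x, y in data) / len(data)

# Learning = choosing the hypothesis in F with the lowest loss on D.
best_f = min(F, key=lambda f: L(f, D))
print(L(best_f, D))  # the w = 1.0 hypothesis fits this data best
```

AutoML asks how to choose D, L, and F themselves automatically, rather than fixing them by hand as this sketch does.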

NAS, short for neural architecture search, is an important topic in AutoML. It aims to automatically design well-performing neural network architectures for a specific target task, exploring architectures that outperform those designed by human experts while greatly reducing human effort. We mainly focus on general NAS algorithms\/methods and their applications to various tasks.<\/p>\n\n\n\n\n\n
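To make the NAS setting concrete, here is a hedged sketch of the simplest baseline, random search: sample architectures from a search space, evaluate each, and keep the best. The operation names, fake accuracy table, and four-layer search space are toy assumptions for illustration, not the project's actual method.

```python
import random

random.seed(0)

# A toy layer-wise search space: each of 4 layers picks one operation.
OPS = ["conv3x3", "conv5x5", "maxpool", "identity"]

# Stand-in per-operation scores; a real NAS system would instead train the
# candidate network (or a weight-sharing supernet) and measure validation
# accuracy.
FAKE_ACC = {"conv3x3": 0.92, "conv5x5": 0.93, "maxpool": 0.88, "identity": 0.85}

def sample_architecture(num_layers=4):
    # Sample one operation per layer, uniformly at random.
    return tuple(random.choice(OPS) for _ in range(num_layers))

def evaluate(arch):
    # Average the fake per-operation scores as a deterministic toy proxy
    # for validation accuracy.
    return sum(FAKE_ACC[op] for op in arch) / len(arch)

def random_search(num_trials=20):
    best_arch, best_score = None, float("-inf")
    for _ in range(num_trials):
        arch = sample_architecture()
        score = evaluate(arch)
        if score > best_score:
            best_arch, best_score = arch, score
    return best_arch, best_score

arch, score = random_search()
print(arch, score)
```

More sophisticated NAS methods, including several of the works listed below, replace the random sampler with learned predictors, gradient-based relaxation, or semi-supervised architecture evaluators to explore the space far more efficiently.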

  • Jin Xu, Xu Tan, Renqian Luo, Kaitao Song, Jian Li, Tao Qin, Tie-Yan Liu. NAS-BERT: Task-Agnostic and Adaptive-Size BERT Compression with Neural Architecture Search. KDD 2021.<\/li>
  • Renqian Luo, Xu Tan, Rui Wang, Tao Qin, Jinzhu Li, Sheng Zhao, Enhong Chen, Tie-Yan Liu. LightSpeech: Lightweight and Fast Text to Speech with Neural Architecture Search. ICASSP 2021.<\/li>
  • Yang Fan, Yingce Xia, Lijun Wu, Shufang Xie, Weiqing Liu, Jiang Bian, Xiangyang Li, Tao Qin. Learning to Reweight with Deep Interactions. AAAI 2021.<\/li>
  • Renqian Luo, Xu Tan, Rui Wang, Tao Qin, Enhong Chen, Tie-Yan Liu. Accuracy Prediction with Non-neural Model for Neural Architecture Search.<\/li>
  • Renqian Luo, Xu Tan, Rui Wang, Tao Qin, Enhong Chen, Tie-Yan Liu. Semi-Supervised Neural Architecture Search. NeurIPS 2020.<\/li>
  • Yang Fan, Fei Tian, Yingce Xia, Tao Qin, Xiangyang Li, Tie-Yan Liu. Searching Better Architectures for Neural Machine Translation. IEEE\/ACM Transactions on Audio, Speech and Language Processing 2020.<\/li>
  • Renqian Luo, Tao Qin, Enhong Chen. Balanced One-shot Neural Architecture Optimization.<\/li>
  • Lijun Wu, Fei Tian, Yingce Xia, Tao Qin, Jianhuang Lai, Tie-Yan Liu. Learning to Teach with Dynamic Loss Functions. NeurIPS 2018.<\/li>
  • Renqian Luo, Fei Tian, Tao Qin, Enhong Chen, Tie-Yan Liu. Neural Architecture Optimization. NeurIPS 2018.<\/li>
  • Yang Fan, Fei Tian, Tao Qin, Xiangyang Li, Tie-Yan Liu. Learning to Teach. ICLR 2018.<\/li><\/ul>\n\n\n","protected":false},"excerpt":{"rendered":"

    AutoML, which aims to automate a machine learning system, has attracted a lot of attention in the research community and made a lot of noise in industry and media. An ML system typically learns from a given dataset D, via optimizing a certain loss L, within a particular hypothesis (function) space F. AutoML covers a […]<\/p>\n","protected":false},"featured_media":0,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"footnotes":""},"research-area":[13556],"msr-locale":[268875],"msr-impact-theme":[],"msr-pillar":[],"class_list":["post-788765","msr-project","type-msr-project","status-publish","hentry","msr-research-area-artificial-intelligence","msr-locale-en_us","msr-archive-status-active"],"msr_project_start":"","related-publications":[],"related-downloads":[],"related-videos":[],"related-groups":[],"related-events":[],"related-opportunities":[],"related-posts":[],"related-articles":[],"tab-content":[],"slides":[],"related-researchers":[{"type":"user_nicename","display_name":"Renqian Luo","user_id":40441,"people_section":"Section name 0","alias":"renqianluo"},{"type":"user_nicename","display_name":"Yingce Xia","user_id":37784,"people_section":"Section name 
0","alias":"yinxia"}],"msr_research_lab":[199560],"msr_impact_theme":[],"_links":{"self":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/788765"}],"collection":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project"}],"about":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/types\/msr-project"}],"version-history":[{"count":3,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/788765\/revisions"}],"predecessor-version":[{"id":798604,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-project\/788765\/revisions\/798604"}],"wp:attachment":[{"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/media?parent=788765"}],"wp:term":[{"taxonomy":"msr-research-area","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/research-area?post=788765"},{"taxonomy":"msr-locale","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-locale?post=788765"},{"taxonomy":"msr-impact-theme","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-impact-theme?post=788765"},{"taxonomy":"msr-pillar","embeddable":true,"href":"https:\/\/www.microsoft.com\/en-us\/research\/wp-json\/wp\/v2\/msr-pillar?post=788765"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}