{"id":694401,"date":"2020-10-01T09:39:47","date_gmt":"2020-10-01T16:39:47","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?p=694401"},"modified":"2020-10-06T16:28:44","modified_gmt":"2020-10-06T23:28:44","slug":"archai-can-design-your-neural-network-with-state-of-the-art-neural-architecture-search-nas","status":"publish","type":"post","link":"https:\/\/www.microsoft.com\/en-us\/research\/blog\/archai-can-design-your-neural-network-with-state-of-the-art-neural-architecture-search-nas\/","title":{"rendered":"Archai can design your neural network with state-of-the-art neural architecture search (NAS)"},"content":{"rendered":"\n
\"\"\/<\/figure>\n\n\n\n

The goal of neural architecture search (NAS) is to have computers automatically search for the best-performing neural networks. Recent advances in NAS methods have made it possible to build problem-specific networks that are faster, more compact, and less power hungry than their handcrafted counterparts. Unfortunately, many NAS methods rely on an array of tricks that aren't always documented in a way that's easy to discover. While these tricks result in neural networks with greater accuracy, they often cloud the performance of the search algorithms themselves. Since different NAS methods use different enhancements, and some none at all, NAS techniques have become difficult for researchers to compare. The use of a variety of enhancements has also made NAS methods difficult to reproduce. Once-promising methods may disappoint when an attempt is made to transfer them to other datasets. Additionally, engineers trying to use NAS often find it challenging to understand the implications of advertised advances because of a deluge of research claims, an inability to fairly compare methods side by side, fragmented code bases in research repos, hyperparameters that aren't carefully managed, and a lack of plug-and-play support for individual techniques.
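To make the core idea concrete, here is a minimal sketch of NAS framed as random search over a toy search space. This is not Archai's API, and real NAS methods use far more sophisticated search strategies and evaluation; the search space (MLP depth and width), the synthetic data, and the helper names below are all hypothetical, chosen only to illustrate the sample-then-evaluate loop that NAS automates.

```python
# Illustrative sketch of NAS as random search over a tiny search space.
# This is NOT Archai's API; the search space and helpers are hypothetical.
import random
import torch
import torch.nn as nn

def build_model(depth, width):
    """Assemble an MLP from a sampled (depth, width) configuration."""
    layers, in_dim = [], 16
    for _ in range(depth):
        layers += [nn.Linear(in_dim, width), nn.ReLU()]
        in_dim = width
    layers.append(nn.Linear(in_dim, 2))
    return nn.Sequential(*layers)

def evaluate(model, x, y, steps=100):
    """Briefly train the candidate and return its training accuracy as a
    crude fitness signal (real NAS evaluates far more carefully)."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-2)
    loss_fn = nn.CrossEntropyLoss()
    for _ in range(steps):
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()
    with torch.no_grad():
        return (model(x).argmax(dim=1) == y).float().mean().item()

# Synthetic data stands in for a real dataset.
x = torch.randn(256, 16)
y = (x.sum(dim=1) > 0).long()

# The "search": sample architectures, evaluate each, keep the best.
best_acc, best_cfg = 0.0, None
for _ in range(10):
    cfg = {"depth": random.randint(1, 4), "width": random.choice([8, 32, 128])}
    acc = evaluate(build_model(**cfg), x, y)
    if acc > best_acc:
        best_acc, best_cfg = acc, cfg

print(f"best architecture: {best_cfg}, accuracy: {best_acc:.2f}")
```

Even in this toy form, the reproducibility problem the paragraph describes is visible: the result depends on the random seed, the evaluation budget (`steps`), and the optimizer settings, none of which are part of the search algorithm itself.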
