{"id":1059900,"date":"2024-07-25T12:04:25","date_gmt":"2024-07-25T19:04:25","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/blog\/tracing-the-path-to-self-adapting-ai-agents\/"},"modified":"2024-08-01T14:02:38","modified_gmt":"2024-08-01T21:02:38","slug":"tracing-the-path-to-self-adapting-ai-agents","status":"publish","type":"post","link":"https:\/\/www.microsoft.com\/en-us\/research\/blog\/tracing-the-path-to-self-adapting-ai-agents\/","title":{"rendered":"Tracing the path to self-adapting AI agents"},"content":{"rendered":"\n

The games industry has long been a frontier of innovation for AI. In the early 2000s, programmers hand-coded neural networks to breathe life into virtual worlds, creating engaging AI characters that interact with players. Fast forward two decades, and neural networks have grown from their humble beginnings into colossal architectures with billions of parameters, powering real-world applications like ChatGPT and Microsoft Copilots. The catalyst for this seismic shift in AI scale and capability is the advent of automatic optimization. AutoDiff frameworks such as PyTorch and TensorFlow have democratized scalable, gradient-based, end-to-end optimization. This breakthrough has been instrumental in the development of the Large Foundation Models (LFMs) that now sit at the core of AI.<\/p>\n\n\n\n
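To make the AutoDiff point concrete, here is a minimal PyTorch sketch (an illustrative example, not from the original post): once a computation is expressed in differentiable operations, the framework supplies exact gradients for every parameter, which is what makes end-to-end gradient-based optimization routine.<\/p>\n\n\n\n

```python
import torch

# Minimal illustration of automatic differentiation:
# gradients of a scalar loss flow end-to-end through composed operations.
w = torch.tensor([1.0, -2.0], requires_grad=True)  # parameters to optimize
x = torch.tensor([3.0, 4.0])                       # fixed input

loss = ((w * x).sum() - 1.0) ** 2  # scalar loss built from differentiable ops
loss.backward()                    # reverse-mode autodiff computes d(loss)/dw

# w.grad now holds the exact gradient; an optimizer step would update w.
print(w.grad)  # tensor([-36., -48.])
```

An optimizer such as `torch.optim.SGD` would then apply `w = w - lr * w.grad` repeatedly, which is the end-to-end training loop the paragraph alludes to.<\/p>\n\n\n\n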

Today, the AI systems we interact with are more than just neural network models. They contain intricate workflows that seamlessly integrate customized machine learning models, orchestration code, retrieval modules, and various tools and functions. These components work in concert to create the sophisticated AI experiences that have become an integral part of our digital lives. Until now, however, we have lacked tools to automatically train these additional components; they are handcrafted through extensive engineering, much as neural networks were in the early 2000s.<\/p>\n\n\n\n

End-to-end automatic optimization of AI systems<\/h2>\n\n\n\n

The latest research from Microsoft and Stanford University introduces Trace, a groundbreaking framework poised to revolutionize the automatic optimization of AI systems. Here are three highlights of its transformative potential:<\/p>\n\n\n\n