News & features

Research Focus: Week of March 24, 2025
In this issue, we examine a new conversation segmentation method that delivers more coherent and personalized agent conversations, and we review efforts to improve MLLMs’ understanding of geologic maps. Check out the latest research and other updates.

Magma: A foundation model for multimodal AI agents across digital and physical worlds
| Swadheen Shukla, Jianwei Yang, Reuben Tan, Qianhui Wu, and Jianfeng Gao
Explore Magma, a foundation model that can empower AI assistants to interpret environments, plan actions, and execute tasks across digital and physical spaces. Now available, Magma advances the field of agentic AI.

ExACT: Improving AI agents’ decision-making via test-time compute scaling
| Baolin Peng, Xiao Yu, Hao Cheng, Michel Galley, Zhou Yu, and Jianfeng Gao
ExACT combines Reflective-MCTS and Exploratory Learning to improve AI agents' decision-making, enabling test-time compute scaling. Learn how these methods help agents refine strategies for state-of-the-art performance and improved computational efficiency.
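To make the idea of test-time compute scaling concrete, here is a minimal sketch of generic Monte Carlo tree search (UCT) on a toy counting game: spending more simulations at inference time lets the agent refine its action estimates before committing. This is a generic illustration only, not ExACT’s Reflective-MCTS or Exploratory Learning; the toy game (reach exactly 5 with moves of +1 or +2) is an invented example.

```python
# Generic MCTS (UCT) sketch on a toy game: start at 0, add 1 or 2 per move,
# episode ends once the state reaches 5 or more; reward 1.0 only for exactly 5.
# More simulations (test-time compute) yield better-informed action choices.
import math, random

class Node:
    def __init__(self, state, parent=None):
        self.state, self.parent = state, parent
        self.children = {}            # action -> child Node
        self.visits, self.value = 0, 0.0

def actions(state):   return [1, 2]
def step(state, a):   return state + a
def terminal(state):  return state >= 5
def reward(state):    return 1.0 if state == 5 else 0.0

def select(node):
    # UCB1: trade off a child's mean value against an exploration bonus.
    return max(node.children.values(),
               key=lambda c: c.value / (c.visits + 1e-9)
               + math.sqrt(2 * math.log(node.visits + 1) / (c.visits + 1e-9)))

def rollout(state):
    # Random playout to a terminal state.
    while not terminal(state):
        state = step(state, random.choice(actions(state)))
    return reward(state)

def mcts(root_state, n_sims):
    root = Node(root_state)
    for _ in range(n_sims):
        node = root
        # Selection: descend while the node is fully expanded.
        while not terminal(node.state) and len(node.children) == len(actions(node.state)):
            node = select(node)
        # Expansion: add one untried action.
        if not terminal(node.state):
            a = random.choice([a for a in actions(node.state) if a not in node.children])
            node.children[a] = Node(step(node.state, a), node)
            node = node.children[a]
        # Simulation + backpropagation.
        r = rollout(node.state)
        while node:
            node.visits += 1
            node.value += r
            node = node.parent
    # Recommend the most-visited first action.
    return max(root.children.items(), key=lambda kv: kv[1].visits)[0]

random.seed(0)
best = mcts(0, 500)
print(best)  # a legal first move (1 or 2) chosen after 500 simulations
```

Increasing `n_sims` is the knob that "scales test-time compute": each extra simulation sharpens the tree's value estimates without any retraining.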

Data Formulator: Exploring how AI can help analysts create rich data visualizations
| Chenglong Wang, Steven Drucker, Dan Marshall, Jeevana Priya Inala, Kori Inkpen, and Jianfeng Gao
Data Formulator investigates combining UI interactions with natural language input. Powered by AI, it can help users create or adapt visualizations and supports continuous refinement through an iterative process. Now available on GitHub.

Microsoft Research Forum Episode 4: The future of multimodal models, a new “small” language model, and other AI updates
Explore multimodal and small language models, plus advanced benchmarks for AI evaluation. Microsoft researchers are working on breakthroughs in weather prediction, materials design, and even a new kind of computer for AI inference and hard optimization problems.

LLM profiling guides KV cache optimization
| Liyuan Liu and Jianfeng Gao
LLMs rely on memory-intensive mechanisms like the key-value (KV) cache to store and quickly retrieve data. FastGen optimizes KV cache usage, reducing LLM memory demands by up to 50% while maintaining performance.
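The blurb above notes that the KV cache stores past keys and values so decoding need not re-encode earlier tokens. Below is a minimal sketch of that mechanism, with a crude oldest-first eviction cap standing in for memory reduction. This is an illustration of the general idea only; it is not FastGen's profiling-guided policy, and the class and sizes are invented for the example.

```python
# Minimal KV-cache sketch: cache per-token key/value vectors during
# autoregressive decoding; cap memory with a simple eviction policy.
from collections import deque

class KVCache:
    """Keeps recent key/value pairs so past tokens are not re-encoded
    each step. `max_entries` bounds memory by evicting the oldest pair
    (a crude stand-in for smarter, profiling-guided compression)."""

    def __init__(self, max_entries=4):
        self.keys = deque(maxlen=max_entries)
        self.values = deque(maxlen=max_entries)

    def append(self, k, v):
        self.keys.append(k)
        self.values.append(v)

    def attend(self, query):
        # Toy attention: weight cached values by dot-product with the query.
        scores = [sum(q * k for q, k in zip(query, key)) for key in self.keys]
        total = sum(scores) or 1.0
        out = [0.0] * len(query)
        for s, v in zip(scores, self.values):
            for i in range(len(out)):
                out[i] += (s / total) * v[i]
        return out

cache = KVCache(max_entries=4)
for t in range(6):                    # six decoding steps...
    cache.append([1.0, float(t)], [float(t), 1.0])
print(len(cache.keys))                # ...but only 4 entries retained
```

Halving `max_entries` halves the cache's footprint, at the cost of discarding older context; real systems like FastGen choose what to keep far more selectively than this oldest-first rule.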

Research Focus: Week of February 5, 2024
In this issue: a new Research Forum series explores bold ideas in the era of AI; LASER improves reasoning in language models; cache-efficient top-k aggregation over high-cardinality large datasets; and six Microsoft researchers are named 2023 ACM Fellows.
Jianfeng Gao, Sumit Gulwani, Nicole Immorlica, Stefan Saroiu, Manik Varma, and Xing Xie are among the new class of 68 Association for Computing Machinery (ACM) Fellows, recognized for their transformative contributions to computing science and technology.

Data Formulator: A concept-driven, AI-powered approach to data visualization
| Chenglong Wang, Bongshin Lee, John Thompson, Steven Drucker, and Jianfeng Gao
Visualization is vital for understanding complex data, but existing tools require “tidy data,” adding extra steps. Learn how Data Formulator transforms concepts into visuals, promoting collaboration between analysts and AI agents.