News & features
Innovations in AI: Brain-inspired design for more capable and sustainable technology
| Dongsheng Li, Dongqi Han, and Yansen Wang
Researchers and their collaborators are drawing inspiration from the brain to develop more sustainable AI models. Projects like CircuitNet and CPG-PE improve performance and energy efficiency by mimicking the brain's neural patterns.
Research Focus: Week of August 26, 2024
Learn what’s next for AI at Research Forum on Sept. 3; WizardArena simulates human-annotated chatbot games; MInference speeds pre-filling for long-context LLMs via dynamic sparse attention; Reef: Fast succinct non-interactive zero-knowledge regex proofs.
Structured knowledge from LLMs improves prompt learning for visual language models
| Xinyang Jiang, Yubin Wang, Dongsheng Li, and Cairong Zhao
Using LLMs to create structured graphs of image descriptors can enhance prompt learning in visual language models. Learn how structured knowledge can improve prompt tuning for both visual and language comprehension.
ICLR 2022 highlights from Microsoft Research Asia: Expanding the horizon of machine learning techniques and applications
| Shun Zheng, Jiang Bian, Tie-Yan Liu, Li Zhao, Tao Qin, Yue Wang, Dongsheng Li, Yuqing Yang, and Xufang Luo
ICLR (International Conference on Learning Representations) is recognized as one of the top conferences in the field of deep learning. Many influential papers on artificial intelligence, statistics, and data science—as well as important application fields such…