Research talks: Few-shot and zero-shot visual learning and reasoning
Humans learn, infer, and reason by leveraging prior knowledge without necessarily observing a large number of examples. Visual learning and reasoning technologies, such as few-shot and zero-shot learning, aim to enable human-like learning and reasoning in artificial intelligence (AI) systems. Join Professor Kyoung Mu Lee from Seoul National University, Microsoft Principal Researcher Han Hu, and Microsoft Senior Researcher Zhe Gan as they discuss few-shot learning, zero-shot learning, and large-scale vision-and-language pre-training for reasoning. Gain a better understanding of the challenges and opportunities of these technologies.
Learn more about the 2021 Microsoft Research Summit: https://Aka.ms/researchsummit
- Track: Towards Human-Like Visual Learning & Reasoning
- Date:
- Speakers: Kyoung Mu Lee, Han Hu, Zhe Gan
- Affiliation: Seoul National University, Microsoft Research
Han Hu
Principal Researcher
Microsoft Research

Zhe Gan
Principal Researcher
Microsoft Research

Kyoung Mu Lee
Professor
Seoul National University
Towards Human-Like Visual Learning & Reasoning

Opening remarks: Towards Human-Like Visual Learning and Reasoning
Speakers: Wenjun Zeng

Research talks: Learning for interpretability
Speakers: Yuwang Wang, Hanwang Zhang, Shujian Yu

Research talks: Few-shot and zero-shot visual learning and reasoning
Speakers: Han Hu, Zhe Gan, Kyoung Mu Lee

Research talks: Generalization and adaptation
Speakers: Suha Kwak, Chong Luo, Lu Yuan