{"id":548040,"date":"2018-11-14T12:46:29","date_gmt":"2018-11-14T20:46:29","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-event&p=548040"},"modified":"2020-05-06T13:18:19","modified_gmt":"2020-05-06T20:18:19","slug":"physics-ml-workshop","status":"publish","type":"msr-event","link":"https:\/\/www.microsoft.com\/en-us\/research\/event\/physics-ml-workshop\/","title":{"rendered":"Physics \u2229 ML"},"content":{"rendered":"
Venue: <\/strong> Watch on demand:<\/strong> The goal of Physics \u2229 ML is to bring together researchers from machine learning and physics to learn from each other and push research forward together. In this inaugural edition, we will especially highlight some amazing progress made in string theory with machine learning and in the understanding of deep learning from a physical angle. Nevertheless, we invite a cast with wide-ranging expertise in order to spark new ideas. <\/p>\n","protected":false},"featured_media":551232,"template":"","meta":{"msr-url-field":"","msr-podcast-episode":"","msrModifiedDate":"","msrModifiedDateEnabled":false,"ep_exclude_from_search":false,"_classifai_error":"","msr_startdate":"2019-04-25","msr_enddate":"2019-04-26","msr_location":"Microsoft Research Redmond","msr_expirationdate":"","msr_event_recording_link":"","msr_event_link":"","msr_event_link_redirect":false,"msr_event_time":"","msr_hide_region":false,"msr_private_event":false,"footnotes":""},"research-area":[13546],"msr-region":[197900],"msr-event-type":[197944,210063],"msr-video-type":[],"msr-locale":[268875],"msr-program-audience":[],"msr-post-option":[],"msr-impact-theme":[],"class_list":["post-548040","msr-event","type-msr-event","status-publish","has-post-thumbnail","hentry","msr-research-area-computational-sciences-mathematics","msr-region-north-america","msr-event-type-hosted-by-microsoft","msr-event-type-workshop","msr-locale-en_us"],"msr_about":"Venue: <\/strong>\r\nMicrosoft, Building 99,\r\nRedmond, Washington, USA<\/a>\r\n\r\nWatch on demand:<\/strong>\r\nDay 1: 9:00 AM\u201310:30 AM<\/a>\r\nDay 1: 11:00 AM\u201312:30 PM<\/a>\r\nDay 1: 2:00 PM\u20134:05 PM<\/a>\r\nDay 2: 9:00 AM\u20139:45 AM<\/a>\r\nDay 2: 10:15 AM\u201312:30 PM<\/a>","tab-content":[{"id":0,"name":"About","content":"
\nMicrosoft, Building 99,
\nRedmond, Washington, USA<\/a><\/p>\n
\nDay 1: 9:00 AM\u201310:30 AM<\/a>
\nDay 1: 11:00 AM\u201312:30 PM<\/a>
\nDay 1: 2:00 PM\u20134:05 PM<\/a>
\nDay 2: 9:00 AM\u20139:45 AM<\/a>
\nDay 2: 10:15 AM\u201312:30 PM<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"Update May 2020:<\/strong> Following up on our workshop here, we have begun a virtual biweekly seminar series continuing our conversations on the interface between physics and ML. Please visit http:\/\/physicsmeetsml.org\/<\/a> for more information.<\/blockquote>\r\n<\/div>\r\n
Organizers<\/h3>\r\nGreg Yang<\/a>, Microsoft Research\r\nJim Halverson<\/a>, Northeastern University\r\nSven Krippendorf, LMU Munich\r\nFabian Ruehle, CERN, Oxford University\r\nRak-Kyeong Seong<\/a>, Samsung SDS\r\nGary Shiu, University of Wisconsin\r\n
Microsoft Advisers<\/h3>\r\nChris Bishop<\/a>, Microsoft Research\r\nJennifer Chayes<\/a>, Microsoft Research\r\nMichael Freedman<\/a>, Microsoft Research\r\nPaul Smolensky<\/a>, Microsoft Research"},{"id":1,"name":"Agenda","content":"
Day 1 | Thursday, April 25<\/h2>\r\n
\r\n\r\n
\r\n Time (PDT)<\/strong><\/td>\r\n Session<\/strong><\/td>\r\n Speaker<\/strong><\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Session 1<\/strong><\/td>\r\n Plenary talks<\/strong><\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n 8:00 AM\u20139:00 AM<\/td>\r\n Breakfast<\/td>\r\n <\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n 9:00 AM\u20139:45 AM<\/td>\r\n Gauge equivariant convolutional networks<\/td>\r\n Taco Cohen<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n 9:45 AM\u201310:30 AM<\/td>\r\n Understanding overparameterized neural networks<\/td>\r\n Jascha Sohl-Dickstein<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n 10:30 AM\u201311:00 AM<\/td>\r\n Break<\/td>\r\n <\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n 11:00 AM\u201311:45 AM<\/td>\r\n Mathematical landscapes and string theory<\/td>\r\n Mike Douglas<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n 11:45 AM\u201312:30 PM<\/td>\r\n Holography, matter and deep learning<\/td>\r\n Koji Hashimoto<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n 12:30 PM\u20132:00 PM<\/td>\r\n Lunch<\/td>\r\n <\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n 2:00 PM\u20134:05 PM<\/td>\r\n Session 2<\/strong><\/td>\r\n Applying physical insights to ML<\/strong><\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n 2:00 PM\u20132:45 PM<\/td>\r\n Plenary: A picture of the energy landscape of deep neural networks<\/td>\r\n Pratik Chaudhari<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n 2:45 PM\u20134:05 PM<\/td>\r\n Short talks<\/td>\r\n <\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Neural tangent kernel and the dynamics of large neural nets<\/td>\r\n Clement Hongler<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n On the global convergence of gradient descent for over-parameterized models using optimal transport<\/td>\r\n L\u00e9na\u00efc Chizat<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Pathological spectrum of the Fisher information matrix in deep neural networks<\/td>\r\n Ryo Karakida<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Q&A<\/td>\r\n <\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Fluctuation-dissipation relation for stochastic 
gradient descent<\/td>\r\n Sho Yaida<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n From optimization algorithms to continuous dynamical systems and back<\/td>\r\n Rene Vidal<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n The effect of network width on stochastic gradient descent and generalization<\/td>\r\n Daniel Park<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Q&A<\/td>\r\n <\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Short certificates for symmetric graph density inequalities<\/td>\r\n Rekha Thomas<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Geometric representation learning in hyperbolic space<\/td>\r\n Maximilian Nickel<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n The fundamental equations of MNIST<\/td>\r\n Cedric Beny<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Q&A<\/td>\r\n <\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Quantum states and Lyapunov functions reshape universal grammar<\/td>\r\n Paul Smolensky<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Multi-scale deep generative networks for Bayesian inverse problems<\/td>\r\n Pengchuan Zhang<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Variational quantum classifiers in the context of quantum machine learning<\/td>\r\n Alex Bocharov<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Q&A<\/td>\r\n <\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n 4:05 PM\u20134:30 PM<\/td>\r\n Break<\/td>\r\n <\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n 4:30 PM\u20135:30 PM<\/td>\r\n The intersect \u2229<\/td>\r\n <\/td>\r\n <\/td>\r\n<\/tr>\r\n<\/tbody>\r\n<\/table>\r\n Day 2 | Friday, April 26<\/h2>\r\n
\r\n\r\n
\r\n Time (PDT)<\/strong><\/td>\r\n Session<\/strong><\/td>\r\n Speaker<\/strong><\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Session 3<\/strong><\/td>\r\n Applying ML to physics<\/strong><\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n 8:00 AM\u20139:00 AM<\/td>\r\n Breakfast<\/td>\r\n <\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n 9:00 AM\u20139:45 AM<\/td>\r\n Plenary: Combinatorial Cosmology<\/td>\r\n Liam McAllister<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n 9:45 AM\u201310:15 AM<\/td>\r\n Break<\/td>\r\n <\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n 10:15 AM\u201311:35 AM<\/td>\r\n Short talks<\/td>\r\n <\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Bypassing expensive steps in computational geometry<\/td>\r\n Yang-Hui He<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Learning string theory at Large N<\/td>\r\n Cody Long<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Training machines to extrapolate reliably over astronomical scales<\/td>\r\n Brent Nelson<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Q&A<\/td>\r\n <\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Breaking the tunnel vision with ML<\/td>\r\n Sergei Gukov<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Can machine learning give us new theoretical insights in physics and math?<\/td>\r\n Washington Taylor<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Brief overview of machine learning holography<\/td>\r\n Yi-Zhuang You<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Q&A<\/td>\r\n <\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Applications of persistent homology to physics<\/td>\r\n Alex Cole<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Seeking a connection between the string landscape and particle physics<\/td>\r\n Patrick Vaudrevange<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n PBs^-1 to science: novel approaches on real-time processing from LHCb at CERN<\/td>\r\n Themis Bowcock<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Q&A<\/td>\r\n <\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n From non-parametric to parametric: 
manifold coordinates with physical meaning<\/td>\r\n Marina Meila<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Machine learning in quantum many-body physics: A blitz<\/td>\r\n Yichen Huang<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Knot Machine Learning<\/td>\r\n Vishnu Jejjala<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n 11:35 AM\u201312:30 PM<\/td>\r\n Panel discussion with panelists Michael Freedman, Clement Hongler, Gary Shiu, Paul Smolensky, Washington Taylor<\/td>\r\n <\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n 12:30 PM\u20131:30 PM<\/td>\r\n Lunch<\/td>\r\n <\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Session 4<\/strong><\/td>\r\n Breakout groups<\/strong><\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n 1:30 PM\u20133:00 PM<\/td>\r\n Physics breakout groups<\/td>\r\n <\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Symmetries and their realisations in string theory<\/td>\r\n Sergei Gukov, Yang-Hui He<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n String landscape<\/td>\r\n Michael Douglas, Liam McAllister<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Connections of holography and ML<\/td>\r\n Koji Hashimoto, Yi-Zhuang You<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n 3:00 PM\u20134:30 PM<\/td>\r\n ML breakout groups<\/td>\r\n <\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Geometric representations in deep learning<\/td>\r\n Maximilian Nickel<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Understanding deep learning<\/td>\r\n Yasaman Bahri, Boris Hanin, Jaehoon Lee<\/td>\r\n <\/td>\r\n<\/tr>\r\n \r\n <\/td>\r\n Physics and optimization<\/td>\r\n Rene Vidal<\/td>\r\n <\/td>\r\n<\/tr>\r\n<\/tbody>\r\n<\/table>\r\n