{"id":747094,"date":"2024-12-10T06:12:41","date_gmt":"2024-12-10T14:12:41","guid":{"rendered":"https:\/\/www.microsoft.com\/en-us\/research\/?post_type=msr-project&p=747094"},"modified":"2024-12-13T12:21:57","modified_gmt":"2024-12-13T20:21:57","slug":"robrun","status":"publish","type":"msr-project","link":"https:\/\/www.microsoft.com\/en-us\/research\/project\/robrun\/","title":{"rendered":"RobRun: Upgrade Devices with Promptable Intelligence"},"content":{"rendered":"
\n\t
\n\t\t
\n\t\t\t\t\t<\/div>\n\t\t\n\t\t
\n\t\t\t\n\t\t\t
\n\t\t\t\t\n\t\t\t\t
\n\t\t\t\t\t\n\t\t\t\t\t
\n\t\t\t\t\t\t
\n\t\t\t\t\t\t\t\n\t\t\t\t\t\t\t\n\n

RobRun: Upgrade Devices with Promptable Intelligence

This project develops an LLM-based platform, named RobRun, that aims to upgrade a device with promptable intelligence: the device can respond to prompts or instructions given by users and adapt to diverse tasks. RobRun includes modules for a multi-modality perception encoder, an LLM-based agent, an LLM inference system, a database, and the underlying hardware.
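
As a rough illustration of how these modules might fit together, the following minimal Python sketch wires a perception encoder, an LLM-based agent, an inference backend, and a database into a single prompt-handling loop. The class and method names are hypothetical placeholders for exposition, not RobRun's actual API.

# Illustrative sketch only: class and method names below are hypothetical,
# not RobRun's actual API.

class PerceptionEncoder:
    """Encodes multi-modality sensor input (e.g. camera, microphone, IMU) into context tokens."""
    def encode(self, frames):
        # A real encoder would emit aligned embeddings; here each frame is just tagged by modality.
        return [f"<{modality}-embedding>" for modality, _ in frames]

class LLMInference:
    """Stand-in for the on-device LLM inference system (e.g. a low-bit quantized model)."""
    def generate(self, prompt):
        return f"[model response to: {prompt}]"

class Database:
    """Stores interaction history that the agent can later reuse as context."""
    def __init__(self):
        self.records = []
    def store(self, record):
        self.records.append(record)

class Agent:
    """LLM-based agent: fuses the user's instruction with perceived context before inference."""
    def __init__(self, encoder, llm, db):
        self.encoder, self.llm, self.db = encoder, llm, db
    def handle(self, instruction, frames):
        context = " ".join(self.encoder.encode(frames))
        reply = self.llm.generate(f"{instruction} | context: {context}")
        self.db.store((instruction, reply))
        return reply

if __name__ == "__main__":
    agent = Agent(PerceptionEncoder(), LLMInference(), Database())
    print(agent.handle("describe what is in front of you",
                       frames=[("camera", b"raw-pixels"), ("microphone", b"raw-audio")]))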

Related publications and tools:

1. ACL’24 “BitDistiller: Unleashing the Potential of Sub-4-Bit LLMs via Self-Distillation”
   https://github.com/DD-DuDa/BitDistiller
2. EuroSys’25 “T-MAC: CPU Renaissance via Table Lookup for Low-Bit LLM Deployment on Edge”
   https://github.com/microsoft/T-MAC
3. arXiv “LUT TENSOR CORE: Lookup Table Enables Efficient Low-Bit LLM Inference Acceleration”
4. arXiv “Advancing Multi-Modal Sensing Through Expandable Modality Alignment”
5. arXiv “Making Every Frame Matter: Continuous Video Understanding for Large Models via Adaptive State Modeling”

People: Ting Cao, Shijie Cao, Donglin Bai, Hao Chen, Zewen Wu, Muzi Chen, Yang Ou, Shiqi Jiang, Legend Zhu, Xin Ma, Lili Sun, Mao Yang