News and in-depth articles
| Dongsheng Li, Dongqi Han, and Yansen Wang
Researchers and their collaborators are drawing inspiration from the brain to develop more sustainable AI models. Projects like CircuitNet and CPG-PE improve performance and energy efficiency by mimicking the brain’s neural patterns.
Learn what’s next for AI at Research Forum on Sept. 3; WizardArena simulates human-annotated chatbot games; MInference speeds pre-filling for long-context LLMs via dynamic sparse attention; Reef: Fast succinct non-interactive zero-knowledge regex proofs.
| Dongqi Han
The Bayesian behavior framework synergizes habits and goals through variational Bayesian methods, offering new insights into sensorimotor behavior and the understanding of actions.
“If life is a marathon, then health is the key to its duration.” Health is not only the foundation of happiness and societal progress but also a pivotal aspect of the intelligent era. AI’s integration into healthcare represents a transformative…
Deciding between fundamental and applied research is a dilemma that confronts many in the scientific community. Dongqi Han, on the cusp of graduation, ambitiously aspired to bridge this divide by pursuing both avenues of research in his future endeavors. After…
In this issue: New research on appropriate reliance on generative AI; Power management opportunities for LLMs in the cloud; LLMLingua-2 improves task-agnostic prompt compression; Enhancing COMET to embrace under-resourced African languages.
| Xinyang Jiang, Yubin Wang, Dongsheng Li, and Cairong Zhao
Using LLMs to create structured graphs of image descriptors can enhance the images generated by visual language models. Learn how structured knowledge can improve prompt tuning for both visual and language comprehension.
| Huiqiang Jiang, Qianhui Wu, Chin-Yew Lin, Yuqing Yang, and Lili Qiu
Advanced prompting techniques for LLMs can produce excessively long prompts, increasing cost and latency. Learn how LLMLingua compresses prompts by up to 20x while maintaining quality, reducing latency, and improving the user experience.
In this issue: HyWay enables hybrid mingling; Auto-Tables transforms non-relational tables into standard relational forms; training dense retrievers to identify high-quality in-context examples for LLMs; improving pronunciation assessment in CAPT.