Description
Members of the research community at Microsoft work continuously to advance their respective fields. Abstracts brings its audience to the cutting edge with them through short, compelling conversations about new and noteworthy achievements. In this episode, Principal Researcher Alessandro Sordoni (https://www.microsoft.com/en-us/research/people/alsordon/) joins host Gretchen Huizinga to discuss “Joint Prompt Optimization of Stacked LLMs using Variational Inference (https://www.microsoft.com/en-us/research/publication/deep-language-networks-joint-prompt-training-of-stacked-llms-using-variational-inference/).” In the paper, which was accepted at the 2023 Conference on Neural Information Processing Systems (NeurIPS), Sordoni and his coauthors introduce Deep Language Networks, or DLNs, an architecture that treats large language models as layers within a network and natural language prompts as each layer’s learnable parameters. Read the paper (https://www.microsoft.com/en-us/research/publication/deep-language-networks-joint-prompt-training-of-stacked-llms-using-variational-inference/)
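To make the DLN idea concrete, here is a minimal sketch of the architecture as described in the episode: each "layer" is an LLM call whose natural language prompt acts as that layer's learnable parameter, and layers are stacked so one layer's output feeds the next. This is not the authors' implementation; the call_llm function is a hypothetical placeholder standing in for a real LLM API call, and the prompt-optimization step (done in the paper via variational inference) is not shown.

```python
from typing import List


def call_llm(prompt: str, text: str) -> str:
    """Hypothetical placeholder for a real LLM API call."""
    return f"[LLM output given prompt '{prompt}' and input '{text}']"


class DLNLayer:
    """One layer of a Deep Language Network: an LLM conditioned on a prompt."""

    def __init__(self, prompt: str):
        # The prompt is the layer's learnable parameter (natural language).
        self.prompt = prompt

    def forward(self, text: str) -> str:
        return call_llm(self.prompt, text)


class DeepLanguageNetwork:
    """A stack of LLM layers; each layer's output is the next layer's input."""

    def __init__(self, prompts: List[str]):
        self.layers = [DLNLayer(p) for p in prompts]

    def forward(self, text: str) -> str:
        for layer in self.layers:
            text = layer.forward(text)
        return text


# Usage: a two-layer DLN. Optimizing the network means updating these
# prompt strings rather than any model weights.
dln = DeepLanguageNetwork([
    "Summarize the key facts in the input.",
    "Answer the question using the summary.",
])
print(dln.forward("Example input question"))
```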
Research manager Karin Strauss and members of the DNA Data Storage Project reflect on the path to developing a synthetic DNA–based system for archival data storage, including the recent open-source release of its most powerful algorithm for DNA error correction. Get the Trellis BMA code: GitHub -...
Published 11/19/24
The efficient simulation of molecules has the potential to change how the world understands biological systems and designs new drugs and biomaterials. Tong Wang discusses AI2BMD, an AI-based system designed to simulate large biomolecules with speed and accuracy. Read the paper. Get the code.
Published 11/14/24