Large Language Models for Program Synthesis

Dive into the future of programming with our upcoming talk on large language models (LLMs) and their impact on code generation. Join Xinyun Chen, a senior research scientist at Google DeepMind, as we navigate the promising yet challenging world of LLMs in tackling complex programming tasks and competitive programming problems.
📌 What We'll Cover:
The evolution of large-scale language models in code generation.
The achievements and limitations of current LLMs in understanding and solving complex programming tasks.
A detailed discussion of AlphaCode and how it achieved an average ranking in the top 54.3% of participants in Codeforces competitions, showcasing LLMs' potential in competitive programming.
Introducing Self-Debugging: a novel approach that enables LLMs to debug their own code through rubber duck debugging, significantly improving performance on code generation tasks.
How Self-Debugging optimizes sample efficiency and outperforms traditional models in text-to-SQL, code translation, and synthesizing Python functions from descriptions.
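The core idea behind Self-Debugging — generate code, execute it, and feed the resulting error messages back to the model for revision — can be sketched as a simple loop. This is a minimal illustration, not the talk's actual implementation; the `generate` function stands in for a real LLM call and is mocked here with canned responses, and `square` is a toy task invented for the demo.

```python
# Minimal sketch of a Self-Debugging-style feedback loop.
# Assumptions: `generate` is a placeholder for an LLM call,
# mocked below with two canned drafts (one buggy, one fixed).

def run_tests(code, tests):
    """Execute candidate code against unit tests; return feedback or None."""
    env = {}
    try:
        exec(code, env)
        for x, expected in tests:
            got = env["square"](x)
            if got != expected:
                return f"square({x}) returned {got}, expected {expected}"
    except Exception as e:
        return f"execution error: {e}"
    return None  # all tests pass

# Mocked model: the first draft is buggy, the revision is correct.
DRAFTS = iter([
    "def square(x):\n    return x * 2",   # buggy first attempt
    "def square(x):\n    return x * x",   # revision after feedback
])

def generate(prompt):
    return next(DRAFTS)

def self_debug(task, tests, max_rounds=3):
    code = generate(task)
    for _ in range(max_rounds):
        feedback = run_tests(code, tests)
        if feedback is None:
            return code  # all tests pass; stop refining
        # In the real method, the LLM also explains its code line by line
        # (rubber duck debugging) before proposing a fix.
        code = generate(f"{task}\nFeedback: {feedback}\nFix the code.")
    return code

final = self_debug("Write square(x).", [(3, 9), (5, 25)])
```

The execution feedback is what makes the loop sample-efficient: instead of drawing many independent samples and filtering, the model revises a single candidate using concrete failure information.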
🎤 About Our Speaker:
Xinyun Chen brings a wealth of knowledge from the intersection of deep learning, programming languages, and security. With a Ph.D. from UC Berkeley, her pioneering work includes SpreadsheetCoder, which has been integrated into Google Sheets, and AlphaCode, which was featured on the cover of Science.
🔗 Useful Links:
👉 Follow us on socials: