Jakob Uszkoreit
Jakob Uszkoreit is a machine learning and natural language processing (NLP) researcher and computer scientist with a particular focus on language translation. He is best known for his work at Google on machine translation, for co-authoring the seminal Transformer paper "Attention Is All You Need", and as a co-founder of the biotech company Inceptive.
Early Life and Education
Uszkoreit earned his Master's degree in computer science and mathematics from the Technical University of Berlin in 2007. During his time as a student, he worked as a freelance software developer and interned at Google Research in 2006 and 2007, where he worked on distributed clustering algorithms for applications in machine translation and language modelling.
Career
After graduating, Uszkoreit worked as a software engineer at Acrolinx, a provider of enterprise content governance for human and AI-generated content. In March 2008, he joined Google in Berlin, where he worked on early versions of Google Translate and later managed the team responsible for the natural language query understanding system underpinning Google Assistant and several other products. Over his 13 years at Google, he also led a team in Google Machine Intelligence, focusing on large-scale deep learning for natural language understanding.
Uszkoreit has co-authored several influential papers during his career, including the seminal paper "Attention Is All You Need". This paper introduced the Transformer architecture, which has since underpinned ChatGPT and most other large language models (LLMs).
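The central operation of the Transformer architecture introduced in that paper is scaled dot-product attention: each query is compared against all keys, the similarity scores are normalized with a softmax, and the result weights a sum of the values. A minimal NumPy sketch of that formula (the function name and toy shapes here are illustrative, not from the paper's codebase):

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """softmax(Q K^T / sqrt(d_k)) V, the core formula of
    "Attention Is All You Need"."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)          # (queries, keys) similarity
    # Numerically stabilized softmax over the key dimension
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V                       # weighted sum of values

# Toy example: 3 queries attending over 4 key/value pairs, d_k = 8
rng = np.random.default_rng(0)
Q = rng.normal(size=(3, 8))
K = rng.normal(size=(4, 8))
V = rng.normal(size=(4, 8))
out = scaled_dot_product_attention(Q, K, V)
print(out.shape)  # (3, 8)
```

Because every query attends to every key in a single matrix product, the operation parallelizes well on accelerators, one reason the architecture scaled to the large models mentioned above.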
In July 2021, Uszkoreit left Google Brain and co-founded Inceptive, a biotech company that incorporates AI into RNA biology to develop novel therapeutics and biotechnologies. He co-founded the company with computational biochemist and Stanford associate professor Rhiju Das. Inceptive has since raised $100 million from investors and developed an AI software platform that designs unique mRNA molecules for use in vaccines and drug treatments.
Publications
Uszkoreit has published numerous papers throughout his career, including:
- "Scene Representation Transformer: Geometry-Free Novel View Synthesis Through Set-Latent Scene Representations"
- "How to Train Your ViT? Data, Augmentation, and Regularization in Vision Transformers"
- "Differentiable Patch Selection for Image Recognition"
- "An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale"
- "Towards End-to-End In-Image Neural Machine Translation"
- "Transforming Machine Translation: A Deep Learning System Reaches News Translation Quality Comparable to Human Professionals"
- "Object-Centric Learning with Slot Attention"
- "An Empirical Study of Generation Order for Machine Translation"
- "KERMIT: Generative Insertion-Based Modeling for Sequences"
- "Natural Questions: A Benchmark for Question Answering Research"
- "Insertion Transformer: Flexible Sequence Generation via Insertion Operations"
- "Blockwise Parallel Decoding for Deep Autoregressive Models"
- "An Improved Relative Self-Attention Mechanism for Transformer with Application to Music Generation"
- "Universal Transformers"
- "Tensor2Tensor for Neural Machine Translation"
- "Fast Decoding in Sequence Models using Discrete Latent Variables"
- "Self-Attention with Relative Position Representations"
- "Attention Is All You Need"
- "A Decomposable Attention Model for Natural Language Inference"
- "Language-Independent Discriminative Parsing of Temporal Expressions"
- "A Feature-Rich Constituent Context Model for Grammar Induction"
- "Cross-lingual Word Clusters for Direct Transfer of Linguistic Structure"
- "Inducing Sentence Structure from Parallel Corpora for Reordering"