doi.bio/kuba_perlin
Kuba Perlin
Kuba Perlin is a Science Research Engineer at DeepMind. He previously worked full-time at Cohere, a start-up training and serving large language models.
Early Life and Education
Perlin studied Computer Science at the University of Cambridge, graduating with a Triple First BA, and at the University of Oxford, where he earned an MSc with Distinction. His studies focused on machine learning and theoretical computer science, including probabilistic algorithms, complexity theory, game theory, and computational learning theory. He wrote his MSc thesis at the Oxford Robotics Institute, on 3D rotation invariance in neural networks for point cloud processing.
Career
Perlin has interned at EPFL, Google, and NASA, among other organizations. At NASA, he worked on applications of deep learning to computational fluid dynamics. He has also worked as a Teaching Assistant at the University of Cambridge, teaching courses in Probability, Complexity Theory, and Discrete Mathematics. In addition, Perlin has experience as a maths teacher, preparing a cohort of gifted Polish students for the national Junior Math Olympiad; the school performed exceptionally well in the competition during his second year of teaching.
Publications
Perlin has published research on machine learning and language models. His publications include:
- "Interlocking Backpropagation: Improving depthwise model-parallelism" (2021, JMLR) with Aidan N. Gomez, Oscar Key, Stephen Gou, Nick Frosst, Jeff Dean, and Yarin Gal.
- "Scalable Training of Language Models using JAX pjit and TPUv4" (2022) with Joanna Yoo, Siddhartha Rao Kamalakara, and João G.M. Araújo.
Awards and Achievements
- G-Research Prize for the Best Student – University of Cambridge, Part IA.
- Poland's Mathematics Olympiad – laureate (places 28–41 in Poland).
- Náboj – international team maths competition – winners (2nd place in Europe, 1st place in Poland).
- Poland's English Language Olympiad – laureate (20th place in Poland).
- Poland's Math and Logic Games – laureate (14th place in Poland).
- Poland's Mathematical Linguistics Olympiad – finalist (38th place in Poland).
- Poland's Junior Math Olympiad – laureate (tied for 33–86 place in Poland).