

Kuba Perlin

Early Life and Education

Kuba Perlin is a Science Research Engineer at DeepMind. He previously worked full-time at Cohere, a start-up training and serving Large Language Models.

Perlin studied Computer Science at the University of Cambridge, graduating with a Triple First BA, and at the University of Oxford, graduating with an MSc with Distinction. His studies focused on machine learning and theoretical computer science, including probabilistic algorithms, complexity theory, game theory, and computational learning theory. He wrote his MSc thesis at the Oxford Robotics Institute, on 3D rotation invariance in neural networks for point cloud processing.

Career

Before joining Cohere, Perlin interned at EPFL, Google, and NASA, among other places. At NASA, he worked on applications of deep learning to Computational Fluid Dynamics. He has also worked as a Teaching Assistant at the University of Cambridge, teaching courses in Probability, Complexity Theory, and Discrete Mathematics. In addition, he has experience as a maths teacher, preparing a cohort of gifted Polish students for the national Junior Math Olympiad; his school performed exceptionally well during his second year of teaching.

Publications

Perlin has published research in machine learning and language models. His publications include:

- Scalable Training of Language Models using JAX pjit and TPUv4 (2022), with Joanna Yoo, Siddhartha Rao Kamalakara, and João G.M. Araújo.










