Jack Norrie

Senior Machine Learning Engineer specialising in developing sophisticated machine learning systems and deploying them at scale. At Knauf Energy Solutions, I design and develop novel machine learning approaches while architecting production ML infrastructure on AWS, including microservices-based deployment pipelines and cloud-native MLOps systems.

My technical toolkit spans the full machine learning engineering spectrum: Python (Pandas, NumPy, scikit-learn), deep learning frameworks (PyTorch, JAX, TensorFlow), R (Tidyverse), big data technologies (Hadoop & Spark), SQL, Git for version control, Docker for containerisation, Terraform for Infrastructure as Code (IaC), Linux operating systems, and AWS cloud technologies (SageMaker, Glue, Athena, Step Functions, Lambda, Batch) for robust ML operations and deployment.

My academic foundation includes an MSc in Statistics (Distinction) and BSc in Physics (First-Class Honours) from Imperial College London, where I specialised in advanced ML methodologies and completed deep learning research projects. This blend of theoretical understanding and practical implementation enables me to tackle complex data challenges with innovative approaches.

Passionate about building ML systems that scale efficiently and create meaningful impact.

Backpropagation: A Modern Explanation

Manual differentiation is prone to errors and does not scale to functions that involve thousands of parameters, e.g. neural networks. This necessitates procedures which are able to take functions and …

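As a minimal illustration of the idea (my own sketch, not code from the post), a framework like PyTorch can differentiate an arbitrary composition of operations automatically, and the result can be checked against a hand-derived derivative:

    import math
    import torch

    # f(x) = x^2 * sin(x); reverse-mode autodiff computes df/dx for us.
    x = torch.tensor(1.5, requires_grad=True)
    f = x ** 2 * torch.sin(x)
    f.backward()  # backpropagation populates x.grad

    # Hand-derived derivative for comparison: 2x*sin(x) + x^2*cos(x).
    manual = 2 * 1.5 * math.sin(1.5) + 1.5 ** 2 * math.cos(1.5)
    print(x.grad.item(), manual)  # the two values agree (up to float32 precision)

The same mechanism scales unchanged from this two-operation example to networks with millions of parameters, which is why hand-derived gradients are rarely written out in practice.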

Understanding Bootstrap Confidence Intervals

Recently I set out to understand Bias-Corrected Accelerated (BCa) bootstrap confidence intervals. I was aware that it was the "gold standard" for bootstrapping and that many statistical packages used …

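As a rough sketch of what using such an interval looks like (an assumed example, not taken from the post), SciPy exposes the BCa method directly through scipy.stats.bootstrap:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    sample = rng.exponential(scale=2.0, size=100)  # skewed data, where BCa matters most

    res = stats.bootstrap(
        (sample,),               # data is passed as a sequence of samples
        np.mean,                 # statistic of interest
        confidence_level=0.95,
        method="BCa",            # bias-corrected and accelerated interval
    )
    print(res.confidence_interval)

Compared with the plain percentile interval, BCa adjusts the interval endpoints for the bias and skewness of the bootstrap distribution of the statistic.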

Understanding Optimisers Through Hessians

During my recent autodiff library project I got the opportunity to implement common optimisers from scratch. Furthermore, while investigating the vanishing gradient problem I benefited greatly from …

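One concrete piece of that picture, sketched here under my own assumptions rather than quoted from the post, is that on a quadratic objective the Hessian is constant and its condition number directly controls how quickly plain gradient descent converges:

    import numpy as np

    H = np.diag([1.0, 100.0])                 # ill-conditioned Hessian (condition number 100)
    x = np.array([1.0, 1.0])
    lr = 1.0 / np.max(np.linalg.eigvalsh(H))  # step size set by the largest eigenvalue

    for _ in range(200):
        grad = H @ x                          # gradient of f(x) = 0.5 * x^T H x
        x = x - lr * grad

    print(x)  # the high-curvature direction has converged; the low-curvature one lags far behind

Methods such as momentum or Adam can be read as attempts to compensate for exactly this mismatch between directions of high and low curvature.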

A Deep Dive On Vanishing and Exploding Gradients

I was first introduced to the vanishing/exploding gradients problem while conducting my Bachelor's thesis. At the time I was reading the textbook "Hands-On Machine Learning with Scikit-Learn and TensorFlow" …

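To make the effect concrete (an illustrative sketch of my own, not the thesis code), one can push a gradient backwards through a deep stack of sigmoid layers and watch its norm collapse, since each layer scales it by a sigmoid derivative that is at most 0.25:

    import numpy as np

    rng = np.random.default_rng(0)
    depth, width = 30, 64
    grad = np.ones(width)

    for _ in range(depth):
        W = rng.normal(0.0, 1.0 / np.sqrt(width), size=(width, width))
        z = rng.normal(size=width)             # stand-in pre-activations
        s = 1.0 / (1.0 + np.exp(-z))
        grad = (W.T @ grad) * (s * (1.0 - s))  # backprop through one sigmoid layer

    print(np.linalg.norm(grad))                # vanishingly small after 30 layers

Increasing the weight scale flips the same experiment into the exploding-gradient regime.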

Hardware Accelerated Bootstrapping

Following my investigation into bootstrap confidence intervals, I set out to run some simulation experiments to observe how the coverage of these confidence intervals approached their nominal levels.

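A rough sketch of the general approach (the details here are my assumptions, not the experiment's actual code) is to vectorise the resampling loop so that all replicates run as one batched computation on the accelerator, for example with jax.vmap and jax.jit:

    import jax
    import jax.numpy as jnp

    N_RESAMPLES = 10_000

    def one_replicate(key, sample):
        n = sample.shape[0]
        idx = jax.random.randint(key, shape=(n,), minval=0, maxval=n)
        return jnp.mean(sample[idx])               # statistic on one resample

    @jax.jit
    def bootstrap_means(key, sample):
        keys = jax.random.split(key, N_RESAMPLES)
        return jax.vmap(one_replicate, in_axes=(0, None))(keys, sample)

    sample = jax.random.normal(jax.random.PRNGKey(0), (500,))
    reps = bootstrap_means(jax.random.PRNGKey(1), sample)
    print(jnp.percentile(reps, jnp.array([2.5, 97.5])))  # simple percentile interval

Batching the replicates like this is what makes coverage experiments feasible, since each nominal level has to be checked across thousands of independently generated datasets.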

AoC 2025 Day 10

Every year I do Advent of Code (AoC), and there are always one or two problems that stick with me. These are problems that I usually struggle with initially, and as such spend a long time …


Building an Autodiff Library

Until recently, despite my extensive experience with auto-differentiation frameworks, I had never implemented one myself. I believe that implementing a tool that you commonly use yourself can yield …

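To give a flavour of the core idea (a toy of my own, not the library described in the post), a scalar reverse-mode engine only needs each value to remember its parents and a small local rule for pushing gradients back to them:

    class Value:
        def __init__(self, data, parents=()):
            self.data = data
            self.grad = 0.0
            self._parents = parents
            self._backward = lambda: None

        def __add__(self, other):
            out = Value(self.data + other.data, (self, other))
            def backward_fn():
                self.grad += out.grad               # d(a+b)/da = 1
                other.grad += out.grad              # d(a+b)/db = 1
            out._backward = backward_fn
            return out

        def __mul__(self, other):
            out = Value(self.data * other.data, (self, other))
            def backward_fn():
                self.grad += other.data * out.grad  # d(a*b)/da = b
                other.grad += self.data * out.grad  # d(a*b)/db = a
            out._backward = backward_fn
            return out

        def backward(self):
            order, seen = [], set()
            def visit(v):                           # topological sort of the graph
                if v not in seen:
                    seen.add(v)
                    for p in v._parents:
                        visit(p)
                    order.append(v)
            visit(self)
            self.grad = 1.0
            for v in reversed(order):
                v._backward()

    a, b = Value(2.0), Value(3.0)
    y = a * b + a
    y.backward()
    print(a.grad, b.grad)  # 4.0 and 2.0, matching d(ab + a)/da and d(ab + a)/db

Everything beyond this, operator coverage, tensors, broadcasting, and performance, is engineering layered on top of the same two-pass structure.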