Check out my branch: experience

A software engineer with a passion for building products and solving problems. I have experience across backend software development, infrastructure, data engineering and MLOps. I focus on building scalable and reliable systems.

Education

MEng Chemical Engineering

with Energy and Environment

First Class Honours

The University of Manchester

September 2015 - June 2019

A-Levels

Maths, Physics, Chemistry, Biology

A*A*A*A*

Winstanley College

September 2013 - June 2015

GCSEs

Including the English Baccalaureate

5A*, 3A

Bedford High School

September 2008 - June 2013

Tracify.ai

Software & Infrastructure Engineer

München / Berlin, Jun 2024 - Present

Led infrastructure and observability initiatives, introducing a comprehensive monitoring stack using Prometheus, Grafana and custom JSON logging, significantly improving debugging, incident response and system transparency across all services.
Standardised Kubernetes application deployments by introducing and maintaining a suite of custom Helm charts, enabling consistent, versioned and environment-specific rollouts across multiple clusters and eliminating manual configuration and deployment errors.
Raised code quality by introducing stronger testing, CI/CD and linting practices, and cut costs by moving to self-hosted CI runners.
Played a pivotal role in raising engineering standards, mentoring teammates in DevOps best practices and fostering a culture of deployment observability and operational excellence.
Built internal developer tooling using the Kubernetes Python API, enabling staff to rapidly onboard new clients by provisioning custom deployments to meet client-specific requirements.
Developed custom Prometheus metrics and Grafana dashboards to track critical performance indicators (API response time, queue latency, resource usage), enabling bottleneck detection and performance tuning.
Designed and maintained Django microservices powering core product features and platform integrations. Worked closely with the Shopify GraphQL API, custom ClickHouse and PostgreSQL database models, django-ninja, polars and many other packages.
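The custom JSON logging mentioned above can be sketched with nothing more than the standard library; the field names here are illustrative, not the production schema:

```python
import json
import logging


class JsonFormatter(logging.Formatter):
    """Render each log record as a single JSON line (illustrative schema)."""

    def format(self, record: logging.LogRecord) -> str:
        payload = {
            "timestamp": self.formatTime(record),
            "level": record.levelname,
            "logger": record.name,
            "message": record.getMessage(),
        }
        return json.dumps(payload)


# Attach the formatter to a stream handler so every log line is machine-parseable.
handler = logging.StreamHandler()
handler.setFormatter(JsonFormatter())
logger = logging.getLogger("example-service")  # hypothetical service name
logger.addHandler(handler)
logger.setLevel(logging.INFO)
logger.info("request handled")
```

One-line JSON records like these are what make log aggregation and Grafana-side querying straightforward, since every field is addressable without regex parsing.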

Spoke.ai

Software Engineer

Berlin, Nov 2023 - May 2024 (Acquired by Slack)

Developed and deployed LLM-focused Python microservices to AWS, using tools such as langchain and llama-index to build chatbots, document retrieval engines and more.
Created a culture of continuous improvement and raised code quality by introducing stronger testing, CI/CD and linting practices.
Used RabbitMQ to exchange data between microservices, alongside MongoDB and PostgreSQL for data storage.
Developed command-line tools using click to improve the developer experience, particularly for rapidly producing varied datasets for annotation.
Optimised CI/CD pipelines by caching Docker images, significantly reducing billed GitHub Actions minutes and saving both time and money. Made further improvements to Dockerfile structure to reduce image size and build times.
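Docker layer caching of the kind described above is supported natively by GitHub Actions via Buildx; a minimal workflow sketch, with hypothetical image names:

```yaml
name: build
on: push
jobs:
  image:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: docker/setup-buildx-action@v3
      - uses: docker/build-push-action@v5
        with:
          context: .
          push: false
          tags: example/app:latest      # hypothetical image name
          cache-from: type=gha          # reuse layers cached by earlier runs
          cache-to: type=gha,mode=max   # write all intermediate layers back to the cache
```

With `mode=max`, unchanged layers are pulled from the cache instead of being rebuilt, which is where the reduction in billed minutes comes from.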

RegGenome

Software & Infrastructure Engineer

Cambridge, Aug 2022 - Nov 2023

Rapidly developed a horizontally scalable solution using the Kubernetes Python API to improve the speed of data exports by >95%, ensuring deadlines for data drops were met.
Created an AWS SQS messaging application to run web crawlers on demand or at regular intervals using a scalable Kubernetes deployment. Integrated with a Slack bot via SNS to report metrics and warn of broken crawlers.
Integrated custom-made and off-the-shelf machine learning models into the regulatory document pipeline, using AWS SageMaker to run predictions and translations on text input.
Developed web crawlers using the Scrapy Python framework to scrape PDF and HTML content from regulatory bodies across the globe.
Utilised Terraform and AWS CloudFormation to deploy infrastructure as code across sandbox, staging and production stacks.
Implemented a solution to augment processed documents with human annotations via an EKS CronJob, S3 and the Label Studio Python SDK.
Surfaced cost and usage reports via Athena to provide senior leadership with a view of AWS costs broken down by service, function, team and more, made possible by resource tagging.
Created CLI utilities to standardise CI/CD, testing, packaging, linting and more across projects. Utilised tools such as SonarQube to help identify bugs, code smells and security risks.
Incorporated DataDog and Sentry into cloud applications to ensure alerting, monitoring and logging were provided across all stacks and services.
Practised test-driven development, primarily with PyTest and Jest.

Co-op

Data Scientist / Engineer

Manchester, Jul 2021 - Aug 2022

Developed and maintained ETL pipelines, productionising machine learning models to deliver personalised offers to millions of customers using PySpark, Databricks and the Azure platform.
Created Python packages to improve team productivity. One package streamlined data transfer between Databricks and SQL Server, saving an estimated 300 team-hours per year while eliminating key-person dependencies.
Implemented CI/CD pipelines via Azure DevOps to automatically test and deploy code, enabling rapid and efficient software development with reliable and easily maintainable code.
Designed and implemented A/B tests to optimise personalised offers, increasing customer engagement and maximising profitability. Built a custom A/B testing framework to enable rapid rollout of new tests to meet stakeholder requirements.
Translated a cost-to-serve model developed by a third party in Alteryx into an internal ETL pipeline using Spark and Databricks, delivering increased flexibility and resilience to the project.
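A custom A/B testing framework like the one described typically needs stable variant assignment; a minimal sketch of deterministic bucketing (function, experiment and variant names are hypothetical):

```python
import hashlib


def assign_variant(customer_id: str, experiment: str,
                   variants: tuple = ("control", "offer_a")) -> str:
    """Deterministically bucket a customer into a variant.

    Hashing the customer id together with the experiment name gives a
    stable, roughly uniform assignment without storing any state, so the
    same customer always sees the same variant of a given experiment.
    """
    digest = hashlib.sha256(f"{experiment}:{customer_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]


# Repeated calls for the same customer/experiment pair are consistent.
variant = assign_variant("cust-123", "personalised-offers")
```

Salting the hash with the experiment name keeps assignments independent across experiments, so a customer in "control" for one test is not systematically in "control" for the next.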

United Utilities

Process Engineer

Warrington, Sep 2019 - Jul 2021

Created automated reports using Python and SQL to extract, transform and analyse data, making insights into site and asset performance easily available to stakeholders.
  • Significantly reduced the amount of time required to complete the analysis.
  • Helped to drive capital investment decision making, optimise the operation of existing assets and ensure sites always remain compliant with environmental permits.
Developed budgets for the renewable energy portfolio, including the development of models to forecast power generation and grid export, liaising with operation and finance teams.
Identified and implemented energy savings opportunities in my role as an area deployed energy engineer, working with site teams to maintain compliance at least cost. Developed a data driven approach to my role, helping to prioritise which sites and opportunities to focus on.
Led a team studying the carbon emissions associated with the wastewater treatment process – applying a variety of emissions models to help inform decision making.
Created Tableau dashboards to share insights about the renewable energy portfolio at board level, facilitating investment decision making and helping identify maintenance requirements.