My PhD was at the Royal Observatory, Edinburgh. My project mostly used the Hubble Space Telescope to look at very distant galaxies. They are far enough away that we look back into the early Universe when we see them (within the first billion years of its roughly 13.8-billion-year history).
We looked for the earliest, faintest galaxies we could find (a few hundred of them at redshift 5 to 9). By carefully measuring their colours and comparing them to computer models of stars and galaxies, we worked out the galaxies’ stellar populations — the types of stars they contained. The tricky parts were working with very faint galaxies in very noisy image data, measuring colours without bias, and disentangling the galaxies’ distances from their properties (the two can leave similar imprints on a galaxy’s colours).
The most important results of my thesis are in this paper:
Rogers, A. B., et al. “The colour distribution of galaxies at redshift five.” Monthly Notices of the Royal Astronomical Society 440.4 (2014): 3714-3725.
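The template-comparison idea can be shown with a toy sketch (this is not the thesis code, and the template colours below are made up): measure how far the observed colours sit from each model’s colours, weighted by the measurement errors, and pick the closest model by chi-square.

```python
# Toy template fitting: compare observed galaxy colours against model
# templates and pick the best match by chi-square. All numbers here
# are illustrative, not real photometry.

def chi_square(observed, errors, template):
    """Chi-square distance between observed colours and a template."""
    return sum(((o - t) / e) ** 2 for o, t, e in zip(observed, template, errors))

def best_template(observed, errors, templates):
    """Name of the template that best fits the observed colours."""
    return min(templates, key=lambda name: chi_square(observed, errors, templates[name]))

# Hypothetical colours (differences between filter-band magnitudes).
templates = {
    "young, dust-free": [0.1, -0.2, 0.0],
    "old, red":         [0.8, 0.5, 0.3],
}
observed = [0.15, -0.18, 0.05]
errors = [0.1, 0.1, 0.1]

print(best_template(observed, errors, templates))  # → young, dust-free
```

The real problem is harder because distance (redshift) and stellar population both shift the template colours, which is exactly the degeneracy mentioned above.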
I worked at a tech startup, Saberr, for eight years. It focused on workplace teamwork. I did research and data analysis as well as full-stack web development (and the various other things you’d expect in a startup).
My favourite feature to work on was Smart Tips: a natural language processing feature that annotates users’ meeting agendas with relevant expert guides to help them. It was interesting because:
A hardware & software project to automatically track an object using a consumer point-and-shoot camera. The initial motivation was to film a horse and rider performing Dressage. Good footage (e.g. for practice review or for virtual competitions) requires a tight crop even when the subject is at the far end of the arena.
The project uses a Panasonic TZ70, with reverse-engineered WiFi control and streaming thanks to the camera’s companion-app capabilities. A Raspberry Pi connects wirelessly to the camera, streams frames over UDP, and controls the pan/tilt via a MonkMakes ServoSix controller and TowerPro MG996R servos. Zoom level and recording status are controlled by issuing WiFi requests to the camera’s cgi command server. A 4" touchscreen shows a simple GUI and live preview. The pan/tilt frame and camera mount are aluminium with steel bearings.
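The cgi control amounts to plain HTTP GETs against the camera. A minimal sketch of building such a request (the host address and the `mode`/`value` parameter names here are assumptions for illustration, not a documented API):

```python
# Sketch of driving the camera over WiFi: the companion-app protocol is
# HTTP GET requests to a cgi endpoint on the camera. Parameter names
# below are illustrative.
from urllib.parse import urlencode

def cam_command(host, **params):
    """Build the URL for a camera cgi command."""
    return f"http://{host}/cam.cgi?{urlencode(params)}"

# e.g. urllib.request.urlopen(cam_command("192.168.54.1",
#                                         mode="camcmd", value="video_recstart"))
print(cam_command("192.168.54.1", mode="camcmd", value="video_recstart"))
# → http://192.168.54.1/cam.cgi?mode=camcmd&value=video_recstart
```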
The code is all Python, and uses OpenCV with a prebuilt Caffe deep neural network model to recognise horses and track them in the frame; GPIO Zero, backed by non-Python timing functions, for jitter-free servo control; and guizero for a simple interface. Various tricks and heuristics are needed to compensate for lag and to smooth the servo and zoom movements.
An app that encourages runners to keep a training journal. It integrates with Strava and lets athletes keep a private diary of their training. The aim is to encourage more honest training notes, and to guide runners to periodise their training to avoid burnout or injury. It’s built using Firebase and React.js.
The Building Bridges in Medical Science conference tries to help its attendees make useful connections each year, and has a history of trying out innovative networking tech to do so. For the 2017 and 2018 conferences, I made an emoji-based meet-and-share-contacts app, called EmojiBadg.es. The app starts with a list of Eventbrite attendees, and makes printable name badges for them.
Each badge has an emoji code on it, like 🐨️🌲️📎️🍎️. People can share their own contact info, and get somebody else's, by keying an emoji code into the web app on their phone.
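One way such codes could work (a sketch of the general idea, not necessarily the app’s actual scheme) is to treat a small emoji alphabet as digits and encode each attendee’s numeric badge ID as a fixed-length base-N number:

```python
# Sketch: encode a numeric badge ID as a fixed-length emoji code by
# treating a small emoji alphabet as digits. Alphabet is illustrative.

EMOJI = ["🐨", "🌲", "📎", "🍎", "🚲", "🎩", "🐙", "🌙"]  # base 8

def encode(badge_id, length=4):
    """Turn a badge ID into a fixed-length emoji code."""
    digits = []
    for _ in range(length):
        badge_id, d = divmod(badge_id, len(EMOJI))
        digits.append(EMOJI[d])
    return "".join(reversed(digits))

def decode(code):
    """Turn an emoji code back into the badge ID."""
    n = 0
    for e in code:
        n = n * len(EMOJI) + EMOJI.index(e)
    return n

print(encode(42))           # a 4-emoji code
print(decode(encode(42)))   # → 42
```

Four symbols from an eight-emoji alphabet give 4,096 distinct codes — plenty for a conference, while staying short enough to key in by hand.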
Sometimes I write things on Medium, like this post about making F1 more competitive. In that one, I used a historical F1 database to look at how competitive races need to be for teams to win championship points.
Sometimes I post on Mastodon.