Research

I am a Professor of Machine Vision and Machine Learning at the Centre for Machine Vision within the Bristol Robotics Laboratory. I have worked on a large number of projects across a wide range of applications. Our work is typically highly applied, generating commercial prototypes as project deliverables. My main research interests are computer vision and machine learning, along with designing and building 3D (photometric stereo) acquisition hardware.

FARM interventions to Control Antimicrobial ResistancE

A JPIAMR funded project with SRUC, the University of Copenhagen, PorkColumbia and University College Dublin that uses microbiome analysis to investigate how effective different interventions are at reducing the use of antibiotics to treat post-weaning diarrhoea (PWD) in pig farming. Our work seeks to:

  • Validate that our CNN model can estimate the stress of a pig from its face.
  • Stressed mothers tend to give birth to piglets that have a higher probability of developing PWD.
  • Read more here.


HOMEs under the microscope: Citizen-led characterisation of airborne micropLAstic sources

A BBSRC funded Citizen Science project with the University of Leeds that is a world first in estimating the amounts of microplastics present in our homes, with the aim of helping to guide legislation and manufacturers.

  • Microscope image-processing tool to count, size and estimate microplastic content
  • Raman-based analysis to provide the chemical composition of imaged fibres
  • Read more here.
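
The internals of the counting tool are not described here, but as a minimal sketch of the general approach, bright fibres or particles in a greyscale microscope frame can be segmented by thresholding and then counted and sized with connected-component labelling. The threshold and pixel calibration below are illustrative values, not the project's actual settings.

```python
import numpy as np
from scipy import ndimage

def count_and_size_particles(image, threshold=0.5, pixel_size_um=2.0):
    """Count bright connected regions in a greyscale microscope image
    and estimate each one's area in square micrometres.

    `threshold` and `pixel_size_um` are illustrative, not the tool's
    real calibration.
    """
    binary = image > threshold                 # segment candidate particles
    labels, n = ndimage.label(binary)          # connected-component labelling
    areas_px = ndimage.sum(binary, labels, index=range(1, n + 1))
    areas_um2 = np.asarray(areas_px) * pixel_size_um ** 2
    return n, areas_um2

# Synthetic frame with two bright blobs on a dark background
frame = np.zeros((64, 64))
frame[10:14, 10:14] = 1.0   # 16-pixel particle
frame[40:42, 40:45] = 1.0   # 10-pixel particle
n, areas = count_and_size_particles(frame)
```

Real frames would first need denoising and illumination correction; the sketch only shows the count-and-size step.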


Pig ID: developing a deep learning machine vision system to track pigs using individual biometrics

A BBSRC funded project with SRUC that aims to track pigs housed at high density over extremely long durations by recognising individuals, rather than relying on the more common detection-only approach.

  • Use deep learning to reliably identify pigs
  • Social networks and types of interaction can be mapped with high accuracy over previously unattainable durations


The Emotional Pig

A BBSRC funded project with SRUC that builds on our successful work in pig face recognition to incorporate pig facial expression analysis to improve animal welfare.

  • Identify if pigs are stressed or comfortable.
  • Identify pain faces (similar to the pig grimace scale).
  • Combine with face recognition to allow automated monitoring of pigs.

In-field estimation of potato size and yield

On-harvester mapping of potato size and count using computer vision.

  • GPS enabled.
  • Commercialised system in partnership with B-Hive and Grimme: Harvesteye.
  • Highly commended prize for innovation at the National Potato Industry Awards.
  • Real-time analysis.




Scene recognition

An InnovateUK funded project with Q-Bot Ltd. to improve the automation of their novel underfloor insulation robot. It has the following aims:

  • Map out underfloor architecture.
  • Isolate services such as pipework or cables.
  • Optimise robot path through the void.
  • Spray insulation to reduce heat loss.


Plant Phenotyping using Photometric Stereo

A BBSRC funded project in partnership with Edinburgh University.

  • Allows non-intrusive objective phenotype measurement
  • Variables such as leaf area, leaf angle and petiole angle can be measured semi-automatically over the plant life cycle
  • Images show the eight-light-source rig and software developed as part of this project, and an example of the surface normal information captured from a growing Arabidopsis plant
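
Because photometric stereo recovers a surface normal at every pixel, angular traits fall out almost directly. As an illustrative sketch (not the project's exact pipeline), the angle of a leaf surface can be taken as the angle between its normal and the vertical:

```python
import numpy as np

def leaf_angle_deg(normal):
    """Angle (degrees) between a leaf-surface normal and the vertical.

    A horizontal leaf has normal (0, 0, 1) -> angle 0; a vertical leaf
    has a horizontal normal -> angle 90. Illustrative helper only.
    """
    n = np.asarray(normal, dtype=float)
    n = n / np.linalg.norm(n)
    return float(np.degrees(np.arccos(np.clip(n[2], -1.0, 1.0))))

flat = leaf_angle_deg([0, 0, 1])     # horizontal leaf
tilted = leaf_angle_deg([0, 1, 1])   # leaf tilted 45 degrees
```

Averaging this over the pixels segmented as one leaf gives a per-leaf angle estimate.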

Weed detection in pasture

An InnovateUK funded project with Soilessentials Ltd. and Aralia Systems Ltd. to localise broad leaf weeds (dock) in pasture in order to reduce herbicide spraying.

  • Real-time weed detection in pasture is far more difficult than in-crop weed detection
  • Operates in real time with over a 90% detection rate and a low false-positive rate
  • Vision system integrated with solenoid control circuitry
  • Future work will identify the presence of clover and other weed species
  • This has now been developed into a commercial product.
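
The commercial system's internals are not described here, but the "vision system integrated with solenoid control circuitry" step can be sketched as mapping detections to nozzles on a spray bar, where each nozzle covers an equal slice of the image width. All names and thresholds below are illustrative assumptions.

```python
def nozzles_to_fire(detections, image_width, n_nozzles, conf_threshold=0.8):
    """Map weed detections to spray-bar nozzles.

    `detections` is a list of (x_centre_px, confidence) pairs from the
    vision system; each nozzle covers image_width / n_nozzles pixels.
    Hypothetical sketch, not the commercial system's logic.
    """
    fire = [False] * n_nozzles
    slice_px = image_width / n_nozzles
    for x, conf in detections:
        if conf >= conf_threshold:            # ignore weak detections
            idx = min(int(x // slice_px), n_nozzles - 1)
            fire[idx] = True                  # open the matching solenoid
    return fire

# Two confident dock detections and one weak one, on a 640 px wide frame
state = nozzles_to_fire([(100, 0.95), (620, 0.91), (300, 0.4)],
                        image_width=640, n_nozzles=8)
```

In practice the trigger would also be delayed to account for the travel time between camera and spray bar.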

Pig Face Recognition

A NERC/SARIC funded project in partnership with AB-Agri, SRUC and Manchester University to biometrically recognise individuals to aid precision livestock farming.

  • RFID tags are time consuming, stress inducing and have limited range
  • A Convolutional Neural Network is trained on unconstrained images of the pigs at a drinker in on-farm conditions
  • 97% accuracy
  • The snout region, forehead and eyes appear to be the most discriminating areas
  • As featured on Netflix's "Connected" S1Ep1
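
The project trains a CNN classifier end-to-end, but the identification step it performs can be illustrated, under assumption, as matching a query feature vector against a gallery of known pigs. The embeddings below stand in for CNN features; the gallery IDs are made up.

```python
import numpy as np

def identify(embedding, gallery):
    """Return the pig ID whose gallery embedding has the highest
    cosine similarity to the query embedding.

    `gallery` maps pig ID -> reference embedding. Illustrative
    matcher only; the actual system is a trained CNN classifier.
    """
    q = np.asarray(embedding, dtype=float)
    q = q / np.linalg.norm(q)
    best_id, best_sim = None, -np.inf
    for pig_id, ref in gallery.items():
        r = np.asarray(ref, dtype=float)
        sim = float(q @ (r / np.linalg.norm(r)))   # cosine similarity
        if sim > best_sim:
            best_id, best_sim = pig_id, sim
    return best_id

gallery = {"pig_07": [1.0, 0.1, 0.0], "pig_12": [0.0, 1.0, 0.2]}
match = identify([0.9, 0.2, 0.05], gallery)
```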


Non-contact 3D Handprint Recognition

A UWE Bristol funded project that combined high-resolution visible wavelengths with more penetrating near-infrared wavelengths to capture vascular structures for additional security.

  • Real time operation
  • Previously unseen levels of surface morphology for fingerprints/hand prints
  • Extremely difficult to spoof due to multi-modality of capture (surface details + vein structure)
  • As seen on BBC Click!


How's My Cow?

An InnovateUK funded project with Kingshay Farming and Conservation Ltd. currently undergoing commercial trials.

  • Non-intrusive, unobtrusive overhead 3D image capture of cows
  • Measures body condition score, mobility and weight automatically
  • Highly repeatable and objective, twice daily measurements allow interventions to be made earlier

Photoskin

An EPSRC funded project whose aim is to capture skin reflectance data as part of the Photometric Stereo (PS) process for improved 3D face reconstruction. In addition to being important fundamental research, this will bring benefits to face recognition applications and the CGI/gaming industries.

It continues from the EPSRC funded Photoface device, which was developed with colleagues at Imperial College London and is shown below:

The Photoface 3D capture device

Photoface generates a 3D model via PS, which allows us to estimate the surface orientation at each pixel. While similar devices use lasers or multiple cameras, PS works by using multiple images of the object illuminated from different angles. In the image, four light sources (flash guns) can be seen mounted on the rear wall (around the monitor), with the camera positioned just above the monitor. When a person walks through the device, they trigger the ultrasound sensor (on the left); the flash guns fire in sequence, a synchronised image being captured with each, all within around 15 milliseconds, which effectively freezes the motion. The image below shows examples of the four differently illuminated images and the subsequent 3D rendering.

Examples of the differently illuminated captures and the subsequent 3D rendering.
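
At each pixel, recovering the surface orientation from the four differently lit images follows the classic photometric stereo formulation: stacking the unit light directions into a matrix L and the observed intensities into a vector I, and solving I = L(ρn) for the albedo-scaled normal. A minimal sketch with synthetic light directions (not Photoface's calibrated positions):

```python
import numpy as np

def recover_normal(intensities, light_dirs):
    """Classic per-pixel photometric stereo.

    Solves I = L @ (albedo * n) in the least-squares sense, where L
    stacks the unit light directions (one row per image) and I stacks
    the observed pixel intensities.
    """
    L = np.asarray(light_dirs, dtype=float)     # shape (k, 3), k >= 3
    I = np.asarray(intensities, dtype=float)    # shape (k,)
    g, *_ = np.linalg.lstsq(L, I, rcond=None)   # g = albedo * n
    albedo = np.linalg.norm(g)
    return g / albedo, albedo                   # unit normal, albedo

# Four synthetic light directions and a known Lambertian surface point
lights = np.array([[1, 0, 1], [-1, 0, 1], [0, 1, 1], [0, -1, 1]]) / np.sqrt(2)
true_n = np.array([0.0, 0.0, 1.0])
obs = lights @ (0.8 * true_n)                   # rendered intensities
n_hat, rho = recover_normal(obs, lights)        # recovers true_n and 0.8
```

With four lights the system is overdetermined, which is what lets the least-squares solve tolerate noise in any single capture.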

About me

My PhD, entitled "3D Face Recognition using Photometric Stereo", examined how the inherent products of PS, the surface normals, can be used for effective face recognition. Taking inspiration from how humans process faces (in particular our use of low spatial frequencies and caricature representations), the dimensionality of the data was significantly reduced, through resolution reduction and variance-based methods, without proportionately affecting recognition rates (around 96% at a False Acceptance Rate of 0.001).
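
The resolution-reduction step can be sketched, illustratively rather than as the thesis's exact pipeline, as block-averaging the surface-normal map and re-normalising, which keeps the low spatial frequencies while shrinking the dimensionality:

```python
import numpy as np

def block_average(normal_map, factor):
    """Reduce an (H, W, 3) surface-normal map by averaging over
    factor x factor blocks and re-normalising each averaged normal.
    Illustrative of low-spatial-frequency reduction, not the exact
    thesis pipeline.
    """
    h, w, _ = normal_map.shape
    m = normal_map[:h - h % factor, :w - w % factor]
    m = m.reshape(h // factor, factor, w // factor, factor, 3).mean(axis=(1, 3))
    return m / np.linalg.norm(m, axis=2, keepdims=True)

normals = np.zeros((128, 128, 3))
normals[..., 2] = 1.0                 # flat surface facing the camera
low = block_average(normals, 8)       # 128x128 -> 16x16: 64x fewer pixels
```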

This was extended to expression analysis using surface normals and recognition performance was boosted slightly by removing those pixels which were found to encode expression.

As part of this, the Photoface database was created which contains over 3000 sessions of about 450 people captured in a natural environment under far less constrained conditions than similar databases. It is fully labelled and contains a wealth of metadata for each capture (e.g. gender, facial hair, pose, expression, bespectacled). It is available to download for research purposes - please feel free to contact me for instructions on how to do this.

Previously I worked for about ten years as a software developer in both the public and private sectors, on a number of diverse projects, programming for mainframes, desktops and web applications. Mainly I was a JEE developer. Prior to that I studied for a BSc(Hons) in Psychology and an MSc in Computer Science at The University of Bristol.