My research aims to develop mobile robots that can interact
autonomously with their environment. To function independently in the real
world, a robot must be able to cope with an environment that is constantly
changing, sometimes rapidly. The main focus of my work is therefore the
development of artificial robot “minds”, capable of perceiving and
modelling the world in real time and then responding appropriately.
A robot decides on its actions through knowledge of the world and of its
internal state. Historically, this decision making has been rather algorithmic,
an approach that is brittle in complex environments. As the role of a robot
becomes more complicated, it becomes increasingly difficult for the robot's
designer to foresee every scenario the robot will encounter, and devising an
optimal program for the robot becomes prohibitively difficult. I am
interested in alternative decision making paradigms that enable the robot to
display adaptive or creative capabilities. A system with such capabilities
requires less effort during design and results in a robot that interacts more
naturally with its surroundings.
The focus of my current research is machine perception, particularly
machine vision. The work spans a wide area, from fundamental problems in feature
recognition, through techniques for efficiently utilising the limited sensory
resources of a robot system, to high level applications of robot vision.
Future research students
I am on sabbatical from mid-2019 to mid-2020 and will not be taking on any new graduate students as primary supervisor until my return. Nevertheless, if you are interested in working with me on active vision problems, we may still be able to arrange a different primary supervisor. In that case, please send me an email
including a CV and a complete academic transcript.
Possible future projects include the following.
- Active vision with time constraints; scheduling of feature calculation,
- Statistical underpinnings of active perception,
- Vision based estimation of robot pose and ego-motion,
- Incorporation of robot end-effector management into active vision,
- Incorporation of robot emotion into active perception,
- Multi-modal active perception,
- Fast computational structures for active vision systems.
Current research students
- Arindam Bhakta - Statistics-based robot vision strategies (with Will Browne and Marcus Frean),
- Ibrahim Rahman - Active vision for object recognition (with Mengjie Zhang),
- Aisha Ajmal - Active vision techniques for tracking applications (with Marcus Frean).
Final Year Projects:
- Amer Alomayri - PZT Control for Optical Systems
- Dylan Guja - Safety-Aware Robotics
- Andrew Tyson - An IMU for Robotic Applications
Previous research students
- Anna Friedlander - (2014) Dirichlet Methods for Bayesian Source Detection in Discretised Radio Astronomy Images (with Marcus Frean and Melanie Johnston-Hollitt).
- Abigail Arulandu - (2013) Use of Magneto-rheological Fluids in Stroke Rehabilitation (with Will Browne).
Final Year Projects:
- 2017 Steven Pan - A Gimbal Based Robot Eye
- 2016 Andrew Ang - (with ikeGPS).
- 2016 Davis Cooper - (with ikeGPS).
- 2016 Jeffrey Wu - Control of an Electric Skateboard
- 2014 Michael Edwards - A Timer/Sequencer for Machine Vision
- 2014 Damla Guven - Orientation of a Robot Head
- 2013 Rowan Barrie - A Robot Vestibulo-Ocular Reflex
- 2013 Thomas Hughes - Discrete Analogue Electronics
- 2013 Ian Leow - Compliant Robot Actuators
- 2012 Tim Sherry - Visual Tracking and Laser Targeting
- 2012 Michael Lewis - Development of a Robot Head and Neck
- 2011 Scott Mullan - Design of a Biomimetic Robot Arm.
- 2011 James Turner - A Robotic Nervous System.
- 2010 Abigail Arulandu - Hand Rehabilitation using Compliant Actuation (with Will Browne).
- 2010 Matthew Bourne - Pulse Stream Arithmetic.
- 2010 Thomas Lambrecht - Active Robot Vision.
- 2010 Brendan Vercoelen - Design of a True 3D Digital Display (with Robin Dykstra).
For a list of my recent publications, please see the Publications Database.