Funding

Self-funded

Project code

ENGN4940219

Department

School of Mechanical and Design Engineering

Start dates

February and October

Application deadline

Applications accepted all year round

Applications are invited for a self-funded, three-year full-time or six-year part-time PhD project. The PhD will be based in the School of Mechanical and Design Engineering and will be supervised by Dr David Sanders and Dr Giles Tewkesbury.

Research will be based on years of work by the Systems Engineering Research Group into robots and automated guided vehicles and will support an EPSRC project and funding coming on stream in January 2019.

Various analogue input devices and sensors, including analogue veer-correction systems, have been created to detect the environment safely and assist children with a range of disabilities. These analogue systems, and footage of the older systems, have been featured on TV and radio.

The analogue systems have been used in schools and institutions, where they have made a significant and positive impact: more than 2,000 people have directly benefited. However, the work is being hampered by the analogue nature of the systems, so this bursary will digitise the analogue input devices and object-proximity sensors.

Main aim of the project

To create new systems that improve mobility and quality of life for people with disabilities.

Subsidiary aims

Digitise systems, investigate novel AI, and create new digital systems to assist wheelchair users to steer their powered wheelchairs in cluttered environments.

Methodology

Analogue input devices will be digitised and digital veer-correction will be introduced. New digital switches will be interfaced to microcontrollers to improve mobility and manoeuvring and to make wheelchairs easier for children to use. Further developments will attempt to tolerate involuntary movements and provide proportional-response controls. Collision-avoidance devices will be redesigned as digital systems and connected to expert systems to interpret hand movements and tremors, and AI systems will be created to improve control. Infra-red optical detectors with background suppression will be investigated to see whether they can help drivers lacking spatial awareness.

Logical IF-THEN programs will be written to interpret input, and then Fuzzy Systems will be investigated to see whether they can be used successfully to interpret useful hand movements among tremors to control a powered wheelchair. Finally, a Rule-Based System will generate revised instructions for veer-correction. A novel overall Decision-Making System (DMS) will compare the outputs from the three AI systems described above and suggest the best possible course of action for the wheelchair. Case-Based Reasoning (CBR) will provide confidence weightings for the different AI outputs so that the DMS can select a particular output.
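The DMS selection step described above can be sketched minimally: each AI subsystem proposes a command, CBR-style confidence weightings score the subsystems, and the DMS selects the proposal with the highest weighting. The subsystem names, commands, and weightings below are invented for illustration.

```python
# Hedged sketch: a decision-making system (DMS) choosing among three AI
# outputs using CBR-supplied confidence weightings. All values are invented.

def select_command(proposals: dict[str, float],
                   confidence: dict[str, float]) -> float:
    """Return the command from the subsystem with the highest confidence weighting."""
    best = max(proposals, key=lambda name: confidence.get(name, 0.0))
    return proposals[best]


proposals = {"if_then": 0.20, "fuzzy": 0.35, "rule_based": 0.10}   # steering commands
confidence = {"if_then": 0.6, "fuzzy": 0.8, "rule_based": 0.4}     # CBR weightings
command = select_command(proposals, confidence)
```

Here the fuzzy subsystem carries the highest weighting, so its command is selected.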

The CBR will be revised to compare errors against search criteria and to then normalise those errors with weighting factors to investigate if that improves the result.
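One plausible reading of the revised CBR matching step is sketched below: per-criterion errors against the search criteria are scaled by weighting factors and normalised, and the candidate with the smallest weighted error is retrieved. The function and the example values are assumptions for illustration only.

```python
# Hypothetical sketch of weighted, normalised error comparison for CBR retrieval.

def weighted_error(errors: list[float], weights: list[float]) -> float:
    """Scale each per-criterion error by its weighting factor and normalise the sum."""
    total_weight = sum(weights)
    return sum(w * abs(e) for e, w in zip(errors, weights)) / total_weight


# Two candidate cases, each with errors on two search criteria (invented values):
case_a = weighted_error([0.2, 0.5], [1.0, 1.0])
case_b = weighted_error([0.1, 0.9], [2.0, 1.0])
# The case with the smaller weighted error would be retrieved.
```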

Fees and funding

Visit the research subject area page for fees and funding information for this project.

Funding Availability: Self-funded PhD students only

PhD full-time and part-time courses are eligible for the  (UK and EU students only).

Entry requirements

You'll need a good first degree from an internationally recognised university (minimum upper second class or equivalent, depending on your chosen course) or a Master's degree in Civil Engineering or a related area. In exceptional cases, we may consider equivalent professional experience and/or qualifications. You'll also need English language proficiency at a minimum of IELTS band 6.5, with no component score below 6.0.

A good first degree and a master’s degree in engineering and computing (or similar).

How to apply

We’d encourage you to contact David Sanders (david.sanders@port.ac.uk) to discuss your interest before you apply, quoting the project code ENGN4940219.

Apply

When you are ready to apply, please follow the 'Apply now' link on the Mechanical and Design Engineering PhD subject area page and select the link for the relevant intake.