Ali Shafti

Technical Lead, Human-Machine Understanding, Cambridge Consultants


  • 07/2022: Started a new role as Tech Lead in Human-Machine Understanding at Cambridge Consultants.

  • 05-06/2022: Wrapping up projects at Imperial College London as I prepare for a new role. Stay tuned!

  • 04/2022: Talk & Poster accepted at Neural Control of Movement (NCM'22), on physical human-AI collaboration.

  • 03/2022: Preprint from my collaboration with TUM: the role of haptic communication in physical collaboration.

  • 02/2022: Preprint on a human-in-the-loop approach to quantifying human trust in AI recommendations.

  • 01/2022: Preprint on deep learning for prediction of human intention based solely on natural gaze cues.

  • 01/2022: Happy new year! New year, another updated CV ;)

  • 12/2021: Happy holidays! Tough year, but wrapped a few great projects (news here soon!). Have a great break :)

  • 11/2021: Our recent work on robotic human augmentation was featured in Imperial College news.

  • 11/2021: Promoted to Research Fellow in Robotics & AI @ Imperial College London.

  • 11/2021: Now in Scientific Reports: we report on the inherent constraints of robotic augmentation of humans.

  • 10/2021: Paper accepted at Scientific Reports on robotic human augmentation.

  • 09/2021: New academic year, new updated CV :)

  • News archive...

Latest research demo videos (more here):

Real-World Human-Robot Collaborative RL

A setup for real-world human-robot reinforcement learning of a fully collaborative motor task, in the form of a marble-maze game.

Gaze Prediction for Autonomous Driving

Predicting human visual attention helps with the training of autonomous driving agents: attention masking helps the agent "see what matters".
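As a minimal sketch of the attention-masking idea (the function, mask source, and weighting scheme here are illustrative assumptions, not the actual pipeline): a predicted gaze saliency map is used to weight the camera frame before it reaches the driving agent, dimming regions a human driver would ignore while keeping a small floor of signal everywhere.

```python
import numpy as np

def apply_attention_mask(frame, saliency, floor=0.1):
    """Suppress image regions with low predicted human attention.

    frame:    (H, W, C) camera image, values in [0, 1]
    saliency: (H, W) predicted gaze saliency map in [0, 1]
    floor:    minimum weight, so unattended regions are dimmed, not erased
    """
    weights = floor + (1.0 - floor) * saliency  # map saliency to [floor, 1]
    return frame * weights[..., None]           # broadcast over colour channels

# Toy example: a 2x2 white image where only the top-left pixel is attended.
frame = np.ones((2, 2, 3))
saliency = np.array([[1.0, 0.0],
                     [0.0, 0.0]])
masked = apply_attention_mask(frame, saliency)
```

The floor term is one plausible design choice: fully zeroing unattended pixels could hide rare but safety-critical events, whereas dimming merely biases the agent's attention toward the predicted gaze.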

Learning Explainable Robotic Manipulations

Hierarchical Reinforcement Learning is used to create more explainable representations of the manipulating agent's understanding of world dynamics.

Gaze-based HRI + Arm Inverse Kinematics

The system is aware of the human user's arm kinematics; this allows full control of the human user's hand orientation whilst keeping the interaction comfortable.

About me

I study the collaboration and interaction of humans and machines. I work on making these interactions intuitive and natural, to increase synergy and augment capabilities on both sides. I am curious about achieving machine intelligence while preserving human intelligence as an essential part of the action/perception loop and the overall interaction. To this end, my research combines machine learning with human behaviour analytics to study human-machine interaction.

My original training is in electronics and electrical engineering. During my BSc and MSc I focused on microelectronics and analogue/digital circuit design. For my PhD I expanded into robotics, focusing on the electronics and computer science aspects, with human-robot collaboration as an area of application. In my postdoc at Imperial College London I moved into machine intelligence and motor neuroscience research and their application to physical human-robot collaboration.


I am interested in collaborations on the above research topics, as well as other topics that fit my expertise. Do get in touch!