Ali Shafti | Research


Head of Human-Machine Understanding, Cambridge Consultants

I am currently leading research at Cambridge Consultants on Human-Machine Understanding.

Previous research at Imperial College London

Research Fellow / Senior Research Associate in Robotics and Artificial Intelligence | PI: Dr. A. Aldo Faisal

As of mid-2021, I was a Research Fellow, leading all robotics activities and students, and co-supervising PhD students across several projects. Before that (2020-21), I was a Senior Research Associate.

I research physical human-robot collaboration, particularly the implications and opportunities that arise when human motor control and coordination meet intelligent robot control and motion planning. I investigate human- and robot-in-the-loop methods within machine learning, particularly reinforcement learning, to achieve more intuitive, natural and efficient human-robot collaboration. As part of this, I continue the work of the eNHANCE project (see below), along with other projects in human-robot interaction.
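As a purely illustrative sketch of the human-in-the-loop idea (not the published method): an agent's chosen action can be blended with, or overridden by, a human partner's input before execution, and the agent learns from the jointly executed behaviour. The toy task, override rule, and all names below are assumptions for demonstration only.

```python
import random

random.seed(0)

# Toy 1-D corridor: states 0..4, goal at state 4. The "human" occasionally
# overrides the agent's action -- a stand-in for shared-control input.
N_STATES, GOAL = 5, 4
ACTIONS = [-1, +1]  # step left / step right

def human_override(action):
    """Simulated human partner: nudges toward the goal 30% of the time."""
    if random.random() < 0.3:
        return +1  # the human knows the goal lies to the right
    return action

def train(episodes=500, alpha=0.5, gamma=0.9, eps=0.1):
    """Tabular Q-learning where the *executed* (possibly human-corrected)
    action is the one credited in the update."""
    q = {(s, a): 0.0 for s in range(N_STATES) for a in ACTIONS}
    for _ in range(episodes):
        s = 0
        while s != GOAL:
            a = (random.choice(ACTIONS) if random.random() < eps
                 else max(ACTIONS, key=lambda x: q[(s, x)]))
            a = human_override(a)                     # human in the loop
            s2 = min(max(s + a, 0), N_STATES - 1)
            r = 1.0 if s2 == GOAL else -0.01          # sparse goal reward
            q[(s, a)] += alpha * (r + gamma * max(q[(s2, x)] for x in ACTIONS)
                                  - q[(s, a)])
            s = s2
    return q

q = train()
policy = {s: max(ACTIONS, key=lambda a: q[(s, a)]) for s in range(N_STATES)}
```

After training, the greedy policy moves right in every non-goal state; the point of the sketch is simply that learning operates on the human-corrected action stream, so the human shapes what the agent experiences.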

Previously (2017-2019): Lead Research Associate and Project Manager, EU Horizon2020 eNHANCE | PI: Dr. A. Aldo Faisal

The keywords below list some of the tools I use for my current research.

Keywords: [Robot Operating System (ROS)], [Python], [C++], [MATLAB], [PyTorch], [TensorFlow], [(deep) Reinforcement Learning], [OpenAI Gym], [Simultaneous Localisation and Mapping (SLAM)], [Solidworks], [Formlabs Form2 3D Printer], [Universal Robots UR10], [BioServo SEMGlove], [BioServo CarbonHand], [Myo Armband], [Arduino], [SMI Eye-trackers], [Agile Project Management].

Gallery of research at Imperial College London

Learning to play the piano with the SR3T

Examining the motor coordination constraints of robotic human augmentation.

Real-World Human-Robot Collaborative RL

A setup for real-world human-robot reinforcement learning of a fully collaborative motor task, in the form of a marble-maze game.

Human Visual Attention Prediction for Better Autonomous Driving Agents, 2019.

Explainable Robot Manipulation with Hierarchical Reinforcement Learning, 2019.

Gaze-based, context-aware HRI based on human action grammars, 2019.

Gaze-based, context-aware HRI based on human action grammars, 2018.

Gaze-based, context-aware HRI through multi-modal sensing, 2017.

Supernumerary Robotic 3rd Thumb, a setup for embodiment studies, 2017.

Previous research at King's College London

Research Assistant and Project Manager, EU Horizon2020 FourByThree | PI: Prof. Kaspar Althoefer

Aside from this specific application project, my general research involved human behaviour analysis, particularly in the area of human physiological comfort. As part of this, I developed methods and devices for the objective, real-time assessment of human comfort, which were then used to place the human, and their physical comfort, within the robotic system's action/perception loop. This enabled active robot-assisted ergonomic interactions on the factory floor, as well as objective studies of surgeons' comfort in the clinical environment. The keywords below list some of the tools I used for my previous research.
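A minimal sketch of what "comfort in the action/perception loop" can mean (illustrative only; the real systems used learned comfort models, sensing hardware, and physical robots): the robot reads a posture-derived comfort estimate and proportionally adjusts its behaviour, here a handover height, until the human's posture is near neutral. The comfort model, the linear human-posture model, and all numbers are assumed.

```python
def comfort_score(elbow_angle_deg):
    """Hypothetical comfort model: peaks at a neutral 90-degree elbow angle."""
    return max(0.0, 1.0 - abs(elbow_angle_deg - 90.0) / 90.0)

def adjust_height(height_cm, elbow_angle_deg, gain=0.2):
    """Proportional correction: raise the handover point when the elbow is
    over-flexed (angle > 90), lower it when over-extended (angle < 90)."""
    return height_cm + gain * (elbow_angle_deg - 90.0)

# Assumed toy human model: elbow angle depends linearly on handover height,
# with the neutral 90-degree posture occurring at 100 cm.
elbow = lambda h: 190.0 - h

h = 60.0  # initial handover height (cm), forcing an uncomfortable posture
for _ in range(50):
    h = adjust_height(h, elbow(h))
# The loop converges toward the height that yields a neutral, comfortable posture.
```

In the real setting the comfort estimate came from sensing rather than a closed-form model, but the control structure, continuously re-optimising the robot's pose against a human comfort signal, is the same.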

Keywords: [Robot Operating System (ROS)], [MATLAB], [Solidworks], [Dimension/Formlabs/Ultimaker 3D printers], [Baxter Research Robot], [Arduino], [Microsoft Kinect], [Supervised Learning], [Agile Project Management].

Gallery of research at King's College London

Ergonomic HRI: collaborative robot continuously optimising human posture, 2017.

Textile-based EMG (muscle activity sensors) for studies in-the-wild, 2016.

The above are the main research projects under which I have been hired as both researcher and project manager. For a full list of the projects to which I have contributed, as well as a detailed timeline of activities, please see my CV.