ESR9 - Hassan Zaal

Hassan Zaal
Research project
Improving Home-Based Care Robots’ Capabilities Using Natural Language Interface
About the project

The global population of adults aged 65 and older (including those with multimorbidity) is growing rapidly, increasing the need for solutions that support independent living. As such, there is a growing body of research exploring older adults' needs, existing robotic solutions, and the challenges of adopting them at home. Personal Assistant Robots (PARs) are part of a suite of camera-based Active and Assisted Living (AAL) technologies with the potential to assist older adults, empowering them to live independently at home in their communities. For PARs to act in this capacity, interaction between human and machine must be safe and effective. In particular, the robot must be able to associate verbal commands with physical objects and locations in the home, an instance of the "symbol grounding problem" in AI. One approach to understanding verbal instructions is Vision-and-Language Navigation (VLN), which uses visual representations of objects and a map of the environment to parse the user's instructions and execute them with the robot.
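To make the symbol grounding problem concrete, the toy sketch below links words in a verbal command to objects the robot has actually perceived. All object names, synonyms, and coordinates are hypothetical; a real VLN system would use learned vision-language representations rather than a hand-written lookup table.

```python
# Toy illustration of symbol grounding: mapping words in a spoken command
# to objects the robot's vision system has detected in its map.
# All names and coordinates here are invented for illustration only.

# Objects the (hypothetical) detector has found, with map coordinates.
detected_objects = {
    "mug": (1.2, 0.4),
    "spectacles": (3.0, 1.1),
    "keys": (0.5, 2.7),
}

# Everyday words mapped onto the detector's label vocabulary.
synonyms = {
    "cup": "mug",
    "glasses": "spectacles",
    "spectacles": "spectacles",
    "key": "keys",
    "keys": "keys",
}

def ground_command(command: str):
    """Return (label, location) for the first groundable word, else None."""
    for word in command.lower().split():
        label = synonyms.get(word.strip(".,!?"))
        if label in detected_objects:
            return label, detected_objects[label]
    return None  # unresolved: a real system would ask the user to clarify

print(ground_command("Please find my glasses"))
```

When no word grounds to a detected object the sketch returns None, which is exactly the ambiguity case where a dialogue with the user becomes necessary.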


The research of ESR9 therefore aims to enhance communication and interaction between robots and humans in a home setting to better support older adults aging in place. To achieve this, the research will analyse user-initiated tasks, expressed through natural language instructions, at a high level of abstraction and over extended time periods, considering the robot's capabilities in a partially observable environment. Four contributions are envisaged from this work: 1) a framework that enhances a robot's ability to understand and execute verbal commands (via VLN) by leveraging information from the environment and the robot's physical and sensory capabilities; 2) an investigation of the role of large language models (LLMs) in translating high-level verbal instructions into low-level robot navigation and manipulation commands; 3) a dialogue system allowing the robot to initiate interactions with the user when it encounters ambiguity in symbol interpretation; and 4) a review of privacy and security issues in vision- and voice-based PARs for assisted living. The effectiveness of the framework (point 1) in relation to points 2 and 3 will be demonstrated in a simulated environment, validating its ability to support real-world applications.
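The translation step in contribution 2 can be sketched as a small pipeline from a high-level instruction to a sequence of low-level actions. In the project an LLM would perform the decomposition; here a hand-written pattern stands in so the shape of the pipeline is clear, and the room names, action tuples, and instruction template are all hypothetical.

```python
# Minimal sketch of instruction-to-command translation: a high-level
# request is decomposed into low-level navigation and manipulation actions.
# A real implementation would query an LLM; this rule-based stand-in only
# illustrates the interface. Rooms, objects, and actions are hypothetical.

KNOWN_ROOMS = {"kitchen", "bedroom", "living room"}

def plan(instruction: str):
    """Decompose 'bring me the <object> from the <room>' into action tuples."""
    text = instruction.lower().rstrip(".!?")
    prefix, sep, room = text.partition(" from the ")
    if not sep or room not in KNOWN_ROOMS:
        # Unparseable or unknown room: hand off to the dialogue system
        # (contribution 3) instead of guessing.
        return [("clarify", instruction)]
    obj = prefix.removeprefix("bring me the ").strip()
    return [
        ("navigate", room),    # low-level navigation goal
        ("pick", obj),         # manipulation command
        ("navigate", "user"),  # return to the user
        ("handover", obj),
    ]

print(plan("Bring me the keys from the kitchen"))
```

Note how the failure branch feeds directly into the dialogue system envisaged in contribution 3: rather than executing an uncertain plan, the robot asks the user to resolve the ambiguity.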


Overall, this project will develop improved VLN methods tailored for AAL environments, helping PARs understand user needs and adapt to them. It will address real-world problems for older adults using PARs, e.g. finding a target object based on voice commands, such as locating misplaced glasses or keys.

Start date: January 2022

Expected end date: Summer 2025

Progress of the project

This project started with a review of the needs and requirements of older adults seeking to live independently (including from a healthcare perspective), alongside the PARs available to support this goal. Based on the intersection between the aims of visuAAL and a review of the use of robotics in healthcare, a direction was selected, "Privacy-Aware Embodied Intelligence for Robots in AAL Environments", from which the research focus was determined. Research progress has included finalisation of the novel framework to enhance a robot's ability to understand and execute verbal commands (via large language models, or LLMs). This involved significant technical exploration and development on the use of PARs and their capabilities (e.g. understanding different robots and their visual sensors, voice interfaces such as a microphone array, and a touch screen used to interact with users), as well as the development of a simulated testing environment in NVIDIA Isaac Sim, a robotics simulator that enables the creation of photo-realistic and physically accurate virtual environments. Further, evaluation criteria were created to examine the PAR's ability to adapt to changes in the environment and perform the user's instructions.

Scientific publications
About the ESR

Hassan holds a Postgraduate diploma in Vision and Robotics (VIBOT) from Heriot-Watt University in the UK. He spent a semester at Bourgogne University in France and a semester at Girona University in Spain, as an Erasmus student. He received a BSc in Computer and Automation Engineering specializing in Control and Automation Engineering from Damascus University in Syria.

Contact information

Hassan Zaal

ADAPT Centre
O'Reilly Institute
Trinity College,
Dublin 2, Ireland 

Email address: zaalh@tcd.ie