Design and development of methodologies, technologies, and tools to support people with disabilities / AIRO' FARULLA, Giuseppe. - (2017).

Design and development of methodologies, technologies, and tools to support people with disabilities

AIRO' FARULLA, GIUSEPPE
2017

Abstract

Assistive Technologies (ATs) is an umbrella term that covers, on the one hand, assistive, adaptive, and rehabilitative devices for people with disabilities and, on the other hand, the processes needed to select, locate, and use them. ATs promote greater independence by enabling people to perform tasks that they were formerly unable to accomplish, or had great difficulty accomplishing, by providing enhancements to, or changing methods of interacting with, the technology needed to accomplish such tasks. Research on ATs focuses both on the individuals, i.e. the users, and on the design and subsequent development of any kind of technology that could ease, or even improve, the everyday life of people with disabilities, elderly people, and people following rehabilitation programs. This dissertation spans ATs that, starting from a common root in the realm of Information Technology, have been applied and deployed for several groups of individuals with disabilities.

Starting from the issue of detecting hand poses, gestures, and signs to enable novel paradigms of human-machine interaction, three approaches to hand tracking and gesture recognition from a single markerless observation have been developed. The first approach combines machine learning techniques with optimized features to boost performance. The second relies on a 3D model of the human hand and optimization techniques. The third applies machine learning and statistical techniques on top of technology specifically designed for tracking human hands. Building on these results, hand gesture recognition has then been proposed to enable new interaction paradigms, suitable for individuals with disabilities, in the field of Human-Robot collaboration. A reliable real-time protocol to remotely control anthropomorphic robotic actuators has been implemented. This protocol allows the user to send commands to one (or many) robotic actuators simply by moving his/her hand; it has been designed, modeled, and formally validated through a knowledge-driven agile approach.

This dissertation proposes two use cases enabled by the outcomes of the research activities. The first is a remote communication system for deafblind individuals based on Sign Languages (SLs) with tactile feedback. With the support of SL experts, I have identified a list of fundamental hand movements and gestures to be recognized accurately. The developed algorithms were successfully tested with more than 80 volunteers (both proficient and not proficient in SLs). This communication system is ready to be used concurrently by many people, allowing 1-to-many communication. In addition, it supports different input systems (cameras and sensors for non-invasive markerless hand tracking) and output systems (upper-limb anthropomorphic robotic interfaces). The second use case is a telerehabilitation setup for upper-limb post-stroke rehabilitation, comprising vision-based input and a hand exoskeleton.

Knowledge derived from the research activities has also been applied to two projects, whose outcomes are discussed in this dissertation as well. The first lies in the realm of character recognition and aims at improving the accessibility of mathematical and scientific documents for blind and deafblind individuals. The second aims at developing inclusive interfaces for a web platform under development for preserving and disseminating the cultural heritage of the deaf and deafblind communities.

All the research activities presented in this dissertation have involved close and direct contact with end-user associations and with the persons who benefit from the results of the research itself, and have been widely discussed and tested with them.
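The first, feature-based approach mentioned in the abstract can be pictured with a minimal sketch. The snippet below is not taken from the dissertation: the 21-landmark hand representation, the particular feature set, and the random-forest classifier are assumptions chosen only to illustrate how hand-pose features can feed a machine-learning gesture classifier.

```python
# Illustrative sketch only (not the dissertation's code): a feature-based
# gesture classifier in the spirit of the first approach described above.
# Hand poses are assumed to arrive as 21 (x, y, z) landmarks per frame;
# the synthetic data and labels below stand in for a real recorded dataset.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score


def extract_features(landmarks: np.ndarray) -> np.ndarray:
    """Turn raw 21x3 hand landmarks into a translation/scale-invariant feature vector."""
    wrist = landmarks[0]                      # wrist as reference point
    centered = landmarks - wrist              # remove global hand position
    scale = np.linalg.norm(centered, axis=1).max() + 1e-8
    normalized = centered / scale             # remove global hand size
    # Pairwise fingertip distances compactly capture the overall hand shape.
    fingertips = normalized[[4, 8, 12, 16, 20]]
    dists = [np.linalg.norm(fingertips[i] - fingertips[j])
             for i in range(5) for j in range(i + 1, 5)]
    return np.concatenate([normalized.ravel(), dists])


# Placeholder dataset: 500 random "frames" with 5 gesture classes.
rng = np.random.default_rng(0)
raw = rng.normal(size=(500, 21, 3))
labels = rng.integers(0, 5, size=500)

X = np.stack([extract_features(frame) for frame in raw])
X_train, X_test, y_train, y_test = train_test_split(X, labels, test_size=0.2, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

On real data, the classifier would be trained on labeled recordings of the fundamental hand movements identified with the SL experts, rather than on random placeholders.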
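The 1-to-many remote-control idea behind the real-time protocol can likewise be sketched. The message fields, JSON encoding, and UDP transport below are assumptions made for illustration; they are not the dissertation's formally validated protocol.

```python
# Illustrative sketch only: fan out one hand-derived command to one or many
# robotic actuators. Fields, encoding, ports, and transport are assumptions.
import json
import socket
import time
from dataclasses import dataclass, asdict
from typing import Sequence, Tuple


@dataclass
class HandCommand:
    """A single hand-pose command addressed to the robotic actuators."""
    timestamp: float
    joint_angles: Tuple[float, ...]   # target joint angles derived from the tracked hand
    sequence_id: int                  # lets receivers drop stale or duplicated packets


def broadcast(command: HandCommand, actuators: Sequence[Tuple[str, int]]) -> None:
    """Send the same command to every actuator address (1-to-many fan-out)."""
    payload = json.dumps(asdict(command)).encode("utf-8")
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        for address in actuators:
            sock.sendto(payload, address)


if __name__ == "__main__":
    cmd = HandCommand(timestamp=time.time(),
                      joint_angles=(0.1, 0.4, 0.9, 0.2, 0.0),
                      sequence_id=1)
    # Hypothetical actuator endpoints; replace with real hosts and ports.
    broadcast(cmd, [("127.0.0.1", 9000), ("127.0.0.1", 9001)])
```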
Files in this product:
There are no files associated with this product.


Use this identifier to cite or link to this document: https://hdl.handle.net/11583/2678711