Projects & Patents

GERIATRIC: Gaze and Hand intERactions for enhancing usabIlity and Accessibility of virTual Reality applicatIons in neuropsyChology (HORIZON EUROPE)

One in five Europeans is aged 65 or older, and 10% of Europeans struggle with performing Instrumental Activities of Daily Living such as housekeeping, shopping, and managing finances. This everyday struggle is predominantly attributed to cognitive ageing. Early detection of cognitive decline is imperative for slowing its exacerbation, yet existing tools fall short in the timely identification of such decline. The proposed project attempts to bridge this gap by utilizing Immersive Virtual Reality (IVR) to create an innovative cognitive assessment system that simulates real-life conditions and provides an accurate representation of individuals' everyday functionality. The GERIATRIC project recognizes the barriers posed by the elderly population's often limited IT proficiency and compromised motor skills, which have historically obstructed the effective use of computerized cognitive assessment tools. GERIATRIC seeks to address these challenges by incorporating Eye-Tracking (ET) and Finger-Tracking (FT) technologies, thereby augmenting the usability, accessibility, and clinical relevance of IVR cognitive assessments. Hand-based (FT) and gaze-based (ET) interactions will be developed, evaluated, and compared in terms of usability, performance, and throughput. The validity and clinical utility of the IVR cognitive test will be examined, especially for the detection of cognitive decline in older adults. In collaboration with industry partners, the final stage will transform the IVR cognitive assessment into a potent cognitive training tool. To foster innovation in the field, GERIATRIC will promote the democratization of IVR for broader research and clinical applications by offering open-source access to IVR software, code, and assets to the broader scientific community.

META-TOO: A transfer of knowledge and technology for investigating gender-based inappropriate social interactions in the Metaverse (HORIZON EUROPE)

The META-TOO proposal centres on the transfer of knowledge and skills from two distinguished European institutions, INRIA and IDIBAPS, each renowned for its research contributions at the digital forefront, to the National and Kapodistrian University of Athens (NKUA), Greece, which serves as the coordinating institution and represents a Widening country (Greece). Specifically, both INRIA (the French National Institute for Research in Computer Science and Automation) and IDIBAPS (Fundació de Recerca Clínic Barcelona, Spain) have earned global recognition for their outstanding contributions to Extended Reality (XR) research. It is precisely within this domain that we have elected to focus the support of INRIA and IDIBAPS (henceforth called “mentors”) for the National and Kapodistrian University of Athens (the “coordinator”), with a concerted effort aimed at reinforcing research management and administrative competencies, alongside enhancing research and innovation capabilities.

The "META-TOO" project aims to address cyber-harassment in the rapidly evolving Metaverse, where significant investments by tech giants are creating immersive virtual worlds for social interaction. Recognizing the intensified emotional impact of interactions within these virtual realities, especially among women and gender minorities, the project focuses on gender-based inappropriate behaviors that can escalate to harassment. META-TOO seeks to (1) identify and quantify patterns of such behaviors in Social VR applications, (2) design inclusive avatars representing diverse sex and gender identities, (3) detect user perceptions of inappropriate interactions through physiological biomarkers, and (4) develop tools enabling users to prevent or avoid harassment. This initiative combines cyberpsychology with advanced technology to enhance user safety and well-being in the next-generation internet.

SPAUT: Table Tennis for People with Autism

The aim of the SPAUT project is, on the one hand, to join forces with table tennis coaches and physical education teachers (with knowledge of table tennis), training and certifying them as “coaches with knowledge and perception of Autism”, and, on the other hand, to involve people with Autism in the sport of table tennis, encouraging participation and physical activity.

The project also aims to promote social inclusion, equal opportunities in sport, and access to table tennis for people with Autism in mainstream activities.

It will do so by empowering local communities and sports stakeholders through awareness-raising and training activities and the establishment of a community of practice at the local level. People with Autism will be trained by the certified coaches in each country in order to develop a good knowledge of the sport, become competitive, and participate in the table tennis events that will be organized. As extra help, they will have access to a Virtual Reality app through which they can learn the table tennis rules and practice at home (before reaching the courts).


According to Autism-Europe (www.autismeurope.org), over the past 30 years in Europe, the number of reported cases of autism has increased rapidly in all countries where prevalence studies have been conducted. The main reasons are the increased awareness of autism and changes to diagnostic criteria.

It is expected that the project will raise awareness among experts in the sport and Autism domains all over Europe. Offering the training material for the “coaches with knowledge and perception of Autism”, the certification scheme, and the VR scenarios as Open Educational Resources (OER) in the public domain will also contribute to a greater impact. The project will be a pioneering result of strategic cooperation between sport and Autism experts, adult learning providers, certification experts, and a technical partner with expertise in VR applications.

Project Objectives:

To break down barriers and bring people together from 5 European countries with common goals.

To improve motor and social skills of people with Autism.

To develop and hone talent among autistic persons so that they can eventually compete in the Paralympics and other tournaments.

To prepare physical education teachers and table tennis coaches to train individuals with autism in table tennis.

To increase the spirit of volunteerism among young athletes and young people in general regarding their contribution to training autistic persons in table tennis.

Project's Website: http://www.spaut.eu/ 

VRESS: Enhancing Social Skills through Virtual Reality Applications

People with Autism Spectrum Disorder (ASD) face significant challenges in their social interactions due to a lack of social skills, with an emphasis on empathy. This difficulty is one of the main characteristics of people with ASD and can lead to significant difficulties in daily living, academic progress, employment opportunities, and other social activities.

The purpose of the proposed project is to develop a platform for creating personalized virtual reality scenarios through which people with Autism will be able to participate in first-person simulations of social situations based on the Social Stories construction technique. 

The platform will include subsystems for monitoring the person's behavior (biobehavioral monitoring) during the simulations through sensing instruments (heart-rate measurement and eye monitoring), offering new multimodal interfaces that are also expected to yield important findings in clinical research.

The project faces significant challenges, both technical and pedagogical. On a technical level, it will deliver an interactive social story system with multimodal interfaces and apply it to people with autism.

The project also innovates pedagogically by offering a dynamic virtual reality social story system through which personalized scenarios are created that adapt in real time through interoperability with biometric sensors (heart rate wearables, eye tracking software) and according to user performance.
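Such real-time adaptation can be sketched with a minimal rule that steps scenario difficulty up or down based on heart rate and task performance. The thresholds, function name, and state logic below are purely illustrative assumptions, not the project's actual algorithm:

```python
def adapt_difficulty(level, heart_rate_bpm, baseline_bpm, success_rate):
    """Toy adaptation rule (illustrative thresholds):
    ease off when physiological arousal is high, step up when the
    user is calm and succeeding, otherwise hold the level steady."""
    if heart_rate_bpm > baseline_bpm * 1.2:   # signs of stress -> step down
        return max(1, level - 1)
    if success_rate >= 0.8:                   # calm and performing well -> step up
        return level + 1
    return level

print(adapt_difficulty(3, 110, 70, 0.9))  # elevated heart rate -> 2
```

In a real deployment, the rule would be evaluated continuously as the wearable and eye-tracking streams arrive, rather than once per scenario.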

The use of computers to improve the social skills of individuals with autism has been recognized in the past (Ben-Sasson, Lamash, & Gal, 2012). In most cases, the user moves autonomously in virtual environments and interacts with other characters or objects (Mitchell et al., 2007) as in the real world.

Virtual reality has proven to be a promising technology in the field of autism through which multisensory interventions are offered in controlled and structured environments that minimize the need for actual social interactions in the initial acquisition of social information (Chen & Bernard-Opitz, 1993; Golan & Baron-Cohen, 2006).

Several studies have documented the positive effects of similar interventions on social skills in children and adolescents with high-functioning Autism (Stichter et al., 2014; Ke & Im, 2013; LeCava et al., 2007; Beaumont & Sofronoff, 2008). The main result of the project is the development of a platform that will make it possible for the caregivers of people with autism themselves to develop personalized social story scenarios in a virtual reality environment at low cost.

People with autism will be able to test and develop their social skills in a true-to-life environment, at a targeted and gradually increasing level of difficulty, without stress.

Project's Website: http://www.vress.eu/ 

GENESIS - HORIZON EUROPE

Brain-Computer Interfaces (BCIs) enable users' cerebral activity to be leveraged in order to interact with computer systems. Originally designed for assisting motor-impaired people, a new trend is emerging towards the use of BCIs for a larger audience through passive BCI systems, which are able to transparently provide information regarding users' mental states. Virtual Reality (VR) technology could benefit greatly from the inputs provided by passive BCIs. VR immerses users in 3D computer-generated environments in a way that makes them feel present in the virtual space and, through complete control of the environment, offers applications ranging from training and education to social networking and entertainment. Given the growing interest of society and the investments of major industrial groups, VR is considered a major revolution in Human-Computer Interaction.

However, to this day, VR has not reached its predicted level of democratization and largely remains at the stage of an entertaining experiment. This can be explained by the difficulty of characterizing users' mental states during interaction and by the inherent lack of adaptation in the presentation of the virtual content. In fact, studies have shown that users experience VR in different ways. Approximately 60% of users experience “cybersickness”, the set of deleterious symptoms that may occur after prolonged use of virtual reality systems, while users can also suffer breaks in presence and immersion due to rendering and interaction anomalies, which can lead to a poor feeling of embodiment and incarnation towards their virtual avatars. In both cases, the user's experience is severely impacted, as the VR experience strongly relies on the concepts of telepresence and immersion.
The aim of this project is to pave the way for a new generation of VR systems that leverage the electrophysiological activity of the brain through a passive BCI to level up immersion in virtual environments. The objective is to provide VR systems with the means to evaluate users' mental states through the real-time classification of EEG data. This will improve users' immersion in VR by reducing or preventing cybersickness and by increasing levels of embodiment through the real-time adaptation of the virtual content to the users' mental states as provided by the BCI. In order to reach this objective, the proposed methodology is to (i) investigate neurophysiological markers associated with early signs of cybersickness, as well as neuromarkers associated with the occurrence of VR anomalies; (ii) build on existing signal processing methods for the real-time classification of these markers, associating them with the corresponding mental states; and (iii) provide mechanisms for the adaptation of the virtual content to the estimated mental states.
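The real-time classification step of such a pipeline can be illustrated with a minimal sketch of windowed EEG band-power classification. The sampling rate, band limits, threshold, and state labels below are illustrative assumptions, not the project's actual neuromarkers:

```python
import numpy as np

FS = 250  # sampling rate in Hz (assumed)

def band_power(window, fs, lo, hi):
    """Mean spectral power of each channel in the [lo, hi] Hz band."""
    freqs = np.fft.rfftfreq(window.shape[-1], d=1.0 / fs)
    psd = np.abs(np.fft.rfft(window, axis=-1)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[..., mask].mean(axis=-1)

def classify_state(window, fs=FS, threshold=1.5):
    """Toy rule: elevated alpha (8-12 Hz) relative to beta (13-30 Hz)
    is flagged as 'disengaged'; otherwise 'engaged'."""
    alpha = band_power(window, fs, 8, 12).mean()
    beta = band_power(window, fs, 13, 30).mean()
    return "disengaged" if alpha / (beta + 1e-12) > threshold else "engaged"

# Simulated 1-second, 8-channel EEG window dominated by 10 Hz alpha activity.
t = np.arange(FS) / FS
rng = np.random.default_rng(0)
window = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal((8, FS))
print(classify_state(window))  # strong alpha activity -> "disengaged"
```

A deployed system would replace the threshold rule with a classifier trained on the validated neuromarkers, and feed the estimated state back into the rendering loop.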

More Info: https://www.chistera.eu/projects/genesis

TACTILITY - EU HORIZON 2020

TACTILITY is a European Horizon 2020 project that incorporates rich and meaningful tactile information into novel interaction systems with virtual environments, increasing the quality of immersive virtual reality and of tele-manipulation.

Scope

TACTILITY, a multidisciplinary innovation and research action entitled “Tactile feedback enriched interaction through virtual reality and beyond”, has the overall aim of including rich and meaningful tactile information into novel interaction systems through technology for closed-loop tactile interaction with virtual environments. By mimicking the characteristics of the natural tactile feedback, it will substantially increase the quality of immersive Virtual Reality (VR) experience used locally or remotely (tele-manipulation).


Approach

The approach is based on transcutaneous electro-tactile stimulation delivered through electrical pulses with a high-resolution spatio-temporal distribution. To achieve this, significant development of technologies for transcutaneous stimulation, textile-based multi-pad electrodes, and tactile-sensation electronic skin is needed, coupled with ground-breaking research on the perception of elicited tactile sensations in VR.

Novelty

The key novelty is in the combination of:





Aim

This research and innovation action will result in a next generation of interactive systems with a higher-quality experience for both local and remote (e.g., tele-manipulation) applications. Ultimately, TACTILITY will enable a high-fidelity experience through low-cost, user-friendly, wearable and mobile technology.


You can find more info on the project's Website: Tactility-H2020

VIRTUAL REALITY - EVERYDAY ASSESSMENT LAB (VR-EAL)

PhD Thesis 

During my PhD, I developed the Virtual Reality Everyday Assessment Lab (VR-EAL) to create an immersive virtual environment that simulates everyday tasks proximal to real life in order to assess prospective memory, episodic memory (immediate and delayed recognition), executive functions (i.e., multitasking and planning), and selective visual, visuospatial, and auditory attention. VR-EAL is the first immersive VR neuropsychological battery of everyday cognitive functions. In VR-EAL, individuals are exposed to alternating tutorials (practice trials) and storyline tasks (assessments) to allow them to become familiar with both the immersive VR technology and the specific controls and procedures of each VR-EAL task.

Moreover, VR-EAL also offers a shorter version (i.e., scenario) in which only episodic memory, executive function, selective visual attention, and selective visuospatial attention are assessed. The examiner can also opt to assess a single cognitive function, in which case the examinee goes through the generic tutorial, the specific tutorial for that task, and the storyline task that assesses the chosen cognitive function (e.g., selective visual attention). VR-EAL appears to be an effective neuropsychological tool for the assessment of everyday cognitive functions: it has enhanced ecological validity, provides a highly pleasant testing experience, and does not induce cybersickness.

OTHER XR PROJECTS

A Virtual Reality Version of the Corsi Block Task (Eye-Tracking Integrated)

Measuring Visuospatial Working Memory. Eye-Tracking assists with estimating whether there is an attentional issue that prevents encoding or a pure working memory issue.
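One way to make that distinction can be sketched as follows, assuming eye-tracking tells us which blocks were actually fixated during presentation. The rule and names are illustrative, not the task's actual scoring:

```python
def diagnose_errors(presented, recalled, fixated):
    """presented/recalled: block-index sequences; fixated: set of blocks the
    participant looked at during presentation (from eye-tracking).
    Toy rule: a missed block that was never fixated was likely lost at
    encoding (attention); a fixated but unrecalled block points to a
    working-memory failure."""
    missed = [b for b in presented if b not in recalled]
    return {"attentional": [b for b in missed if b not in fixated],
            "memory": [b for b in missed if b in fixated]}

print(diagnose_errors(presented=[2, 5, 7, 1], recalled=[2, 5], fixated={2, 5, 7}))
# -> {'attentional': [1], 'memory': [7]}
```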

A Virtual Reality Version of the Digit Recall Task (Eye-Tracking Integrated)

Measuring Verbal Working Memory in VR. Eye-Tracking measures focus (gaze-tracking) and emotional state (pupillometry).

A Virtual Reality Version of the Deary-Liewald Task (Eye-Tracking Integrated)

Measuring:

Reaction Time (Single Target)

Reaction Time (Multiple Targets)

Attentional Processing Speed (Multiple Targets) – Measured via Eye-Tracking

Motor Speed (Multiple Targets)

A User-Interface for Responding to Questionnaires in Virtual Reality (Eye-Tracking Integrated; CSQ-VR )

This UI allows data to be collected from users while they are immersed in VR. The current UI has been used to create the VR version of the “Cybersickness in VR Questionnaire” (CSQ-VR).

Eye-Tracking is used for measuring Reading Time (Gaze Tracking), the ability to focus on the written stimuli (Gaze Movement), and the emotional state (Pupillometry).

A User-Interface for Calibrating Electrotactile & Vibrotactile Feedback in Virtual Reality

A Fitts’s Law Task with Electrotactile and Vibrotactile Feedback in Virtual Reality
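For context, throughput in a Fitts's-law task is commonly computed from the Shannon formulation of the index of difficulty; a minimal sketch with illustrative target dimensions:

```python
import math

def fitts_throughput(distance, width, movement_time):
    """Shannon formulation of Fitts's law: ID = log2(D/W + 1) bits;
    throughput = ID / MT, in bits per second."""
    index_of_difficulty = math.log2(distance / width + 1)
    return index_of_difficulty / movement_time

# Target 0.30 m away and 0.02 m wide, reached in 0.8 s:
print(round(fitts_throughput(0.30, 0.02, 0.8), 2))  # log2(16) / 0.8 = 5.0
```

Comparing throughput across feedback conditions (electrotactile vs. vibrotactile) is one standard way to quantify which modality supports faster, more precise pointing.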

A Virtual Reality Driving Simulation

A Virtual Reality driving simulation designed for an experiment pertaining to attentional and executive functioning processes while driving. In this presentation, you may find a description of, and suggestions on, how to design and develop in Unity an immersive virtual reality experiment to assess the attentional processes and executive functioning of drivers. The suggestions also cover the acquisition of eye-tracking and electroencephalography (EEG) data.

Link to the presentation: Presentation of the project on ResearchGate


VR Exchange of Data: An Immersive Virtual Environment for Exchanging Scientific & Clinical Data.

Double Encryption (Symmetric & Asymmetric) and Steganography (in a Video File) of Data, Video Display of Identification Documents, and Interactive Two-Directional Confirmation of the Prospective Exchange. Collaboration with Dr. Nick Pitropakis, Cybersecurity Department, Edinburgh Napier University, UK.

Singularity Postponed

An Immersive Creative Experience Highlighting the Complexity of Uploading Human Cognition and Emotion to the Cloud. Collaboration with MA Lisa Brown, Edinburgh College of Art, University of Edinburgh, UK.

WebVR Exhibition – Lin Huiyin

This is an immersive exhibition of the life and work of Lin Huiyin, a well-known Chinese architect, diplomat, poet, and writer. The VR software runs both online in the Mozilla Firefox browser (WebVR version) and on a high-end PC (standalone version).

Website: https://funnydoudou.wordpress.com

A collaboration with MA Yiwen Zhi, Digital Humanities, University of Bologna, Italy.

Standalone Version 

WebVR Version

uCreate Studio VR Exhibition

A virtual tour of an (imaginary, sci-fi) uCreate Studio, where you may see and get information about the several cutting-edge technologies used in the uCreate Studio, University of Edinburgh. There is also a bittersweet ecological message at the end of the tour. This VR software aims to promote the technologies, services, and environmental goals of the uCreate Studio.

A collaboration with Mike Boyd, Head of uCreate Studio, University of Edinburgh, UK.

Data Visualisation of Fluid Dynamics

An immersive VR data-visualisation app of fluid dynamics (velocity magnitude, velocity on the three directional axes, and static pressure) using Computational Fluid Dynamics (CFD) in Unity. The app has two versions (Static Data Visualisation & Transient Data Visualisation).

A collaboration with MEng Scott Towt and Professor Prash Valluri, Chemical Engineering Department, School of Engineering, University of Edinburgh, UK.

Static Data Visualisation

Transient Data Visualisation
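The core mapping such a visualisation performs can be sketched as computing a scalar (e.g., velocity magnitude from the three directional components) and colouring it on a gradient; the blue-to-red gradient and value range below are illustrative choices, not the app's actual palette:

```python
import math

def velocity_magnitude(vx, vy, vz):
    """Magnitude of the velocity vector from its three directional components."""
    return math.sqrt(vx * vx + vy * vy + vz * vz)

def to_colour(value, vmin, vmax):
    """Map a scalar into a simple blue->red gradient (RGB in [0, 1]),
    the kind of lookup used to colour CFD cells; values outside
    [vmin, vmax] are clamped."""
    t = min(max((value - vmin) / (vmax - vmin), 0.0), 1.0)
    return (t, 0.0, 1.0 - t)

v = velocity_magnitude(3.0, 4.0, 0.0)     # 5.0
print(to_colour(v, vmin=0.0, vmax=10.0))  # mid-range -> (0.5, 0.0, 0.5)
```

In the transient version, the same lookup would be re-evaluated per timestep as the field data changes.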

Virtual Reality Digital Watch

A digital watch to be used in VR projects to improve the quality of the temporal illusion.

Project’s Repository: https://github.com/PanosCortese/Virtual_Reality_Digital_Watch