Tactility - EU Horizon 2020
TACTILITY is a European Horizon 2020 project that incorporates rich and meaningful tactile information into novel interaction systems with virtual environments, increasing the quality of immersive virtual reality and of tele-manipulation.
Ground-breaking research on the perception of electrotactile stimuli, identifying the stimulation parameters and methods that evoke natural-like tactile sensations.
Advanced hardware that integrates the novel high-resolution electrotactile stimulation system, state-of-the-art artificial electronic skin patches, smart textile technologies, and the VR control devices into a wearable mobile system.
Novel firmware that handles real-time encoding and transmission of tactile information from virtual objects in VR, as well as from distant tactile sensors (artificial skins) placed on robotic or human hands.
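As an illustration of the kind of real-time encoding such firmware performs, the sketch below maps normalized pressure readings from a tactile sensor array to per-electrode stimulation parameters. This is a hypothetical simplification, not the TACTILITY firmware; the parameter names and ranges are assumptions.

```python
# Hypothetical sketch of mapping tactile sensor readings to
# electrotactile stimulation parameters; NOT the TACTILITY firmware.

def encode_tactile_frame(pressures, f_min=10.0, f_max=200.0, a_max=5.0):
    """Map normalized pressure readings (0..1) from a sensor array to
    per-electrode stimulation parameters.

    Returns a list of (frequency_hz, amplitude_ma) tuples, one per
    electrode.  Frequency scales linearly with pressure; amplitude is
    capped at a_max to stay within an (assumed) safe range.
    """
    params = []
    for p in pressures:
        p = min(max(p, 0.0), 1.0)           # clamp to the valid range
        freq = f_min + p * (f_max - f_min)  # pressure -> pulse rate
        amp = p * a_max                     # pressure -> amplitude
        params.append((round(freq, 1), round(amp, 2)))
    return params

frame = encode_tactile_frame([0.0, 0.5, 1.0])
```

In a real pipeline a frame like this would be serialized and streamed to the stimulator at the sensor sampling rate.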
You can find more information on the project's website: Tactility-H2020
Virtual Reality - Everyday Assessment Lab (VR-EAL)
During my PhD I developed the Virtual Reality Everyday Assessment Lab (VR-EAL), an immersive virtual environment that simulates everyday tasks close to real life in order to assess prospective memory, episodic memory (immediate and delayed recognition), executive functions (i.e., multitasking and planning), and selective visual, visuospatial, and auditory attention.
VR-EAL is the first immersive VR neuropsychological battery of everyday cognitive functions.
In the VR-EAL, individuals are exposed to alternating tutorials (practice trials) and storyline tasks (assessments) to allow them to become familiarized with both the immersive VR technology and the specific controls and procedures of each VR-EAL task.
Moreover, VR-EAL also offers a shorter version (i.e., scenario) in which only episodic memory, executive function, selective visual attention, and selective visuospatial attention are assessed. Alternatively, the examiner can opt to assess a single cognitive function, in which case the examinee completes the generic tutorial, the task-specific tutorial, and the storyline task that assesses the chosen function (e.g., selective visual attention). VR-EAL appears to be an effective neuropsychological tool for assessing everyday cognitive functions: it has enhanced ecological validity, offers a highly pleasant testing experience, and does not induce cybersickness.
Other XR Projects
A Virtual Reality Driving Simulation designed for an experiment on attentional and executive functioning processes while driving. The presentation linked below describes how to design and develop, in Unity, an immersive virtual reality experiment to assess the attentional processes and executive functioning of drivers. The suggestions also cover the acquisition of eye-tracking and electroencephalography (EEG) data.
Link to the presentation: Presentation of the project on ResearchGate
In this demo video (running on the HTC Vive at a refresh rate above 100 fps) you can see how the vehicle physics behave (the data, e.g., current speed, are shown at the top left; note that the maximum speed is configurable), how an AI car (the red car) behaves, and how one can be created.
Link to the Project's Repository: Project's Repository on GitHub
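The AI car's driving behaviour can be sketched as simple waypoint following: steer towards the next waypoint, and advance to the following one once the car gets close. The sketch below illustrates this pattern in Python; it is an assumption about how such an AI car is typically built, not the project's actual Unity (C#) code, and all names and thresholds are hypothetical.

```python
import math

# Hypothetical waypoint-following logic for an AI car in a driving
# simulation; NOT the project's Unity code.

def steer_towards(position, heading, waypoint, max_steer=math.radians(30)):
    """Return a clamped steering angle (radians) that turns the car
    from its current heading towards the waypoint."""
    dx, dy = waypoint[0] - position[0], waypoint[1] - position[1]
    target = math.atan2(dy, dx)
    # wrap the heading error into (-pi, pi] so the car turns the short way
    error = (target - heading + math.pi) % (2 * math.pi) - math.pi
    return max(-max_steer, min(max_steer, error))

def next_waypoint_index(position, waypoints, index, reach_radius=3.0):
    """Advance to the next waypoint once the car is within reach_radius."""
    wx, wy = waypoints[index]
    if math.hypot(wx - position[0], wy - position[1]) < reach_radius:
        return (index + 1) % len(waypoints)  # loop around the track
    return index
```

In Unity the same logic would run in a `FixedUpdate` loop, feeding the steering angle into the vehicle physics each physics step.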
Double Encryption (Symmetric & Asymmetric) and Steganography (in a Video File) of Data, Video Display of Identification Documents, and Interactive Two-Way Confirmation of the Prospective Exchange. A collaboration with Dr. Nick Pitropakis, Cybersecurity Department, Edinburgh Napier University, UK.
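The "double encryption" described here presumably follows the standard hybrid pattern: a fast symmetric cipher protects the data, and an asymmetric (public-key) step protects the symmetric session key. The sketch below illustrates only that pattern; the XOR "cipher" and the XOR key wrap are deliberately trivial stand-ins (in practice one would use, e.g., AES and RSA), and none of this is the project's actual code.

```python
import os

# Toy illustration of the hybrid "double encryption" pattern:
# symmetric cipher for the data + asymmetric wrap for the session key.
# The XOR primitives below are placeholders, NOT secure algorithms.

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """Symmetric step: XOR each byte with a repeating key (toy only)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def hybrid_encrypt(plaintext: bytes, wrap_key_fn):
    """Encrypt the data symmetrically, then wrap the fresh session key
    with the caller-supplied 'asymmetric' primitive (a callable here)."""
    session_key = os.urandom(16)               # fresh symmetric key
    ciphertext = xor_cipher(plaintext, session_key)
    wrapped_key = wrap_key_fn(session_key)     # asymmetric wrap
    return ciphertext, wrapped_key

def hybrid_decrypt(ciphertext, wrapped_key, unwrap_key_fn):
    session_key = unwrap_key_fn(wrapped_key)
    return xor_cipher(ciphertext, session_key)  # XOR is its own inverse

# Stand-in for a real public/private key pair (e.g. RSA in practice):
def wrap(k: bytes) -> bytes:
    return bytes(b ^ 0x5A for b in k)

unwrap = wrap  # the toy wrap happens to be self-inverse

ct, wk = hybrid_encrypt(b"identity document", wrap)
assert hybrid_decrypt(ct, wk, unwrap) == b"identity document"
```

The steganographic step would then hide the ciphertext and wrapped key inside the frames of a video file, e.g. in the least significant bits of pixel values.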
An Immersive Creative Experience Highlighting the Complexity of Uploading Human Cognition and Emotion on the Cloud. Collaboration with MA Lisa Brown, Edinburgh College of Art, University of Edinburgh, UK.
This is an immersive exhibition of the life and work of Lin Huiyin, a well-known Chinese architect, diplomat, poet, and writer. The VR software runs both online in the Mozilla Firefox browser (WebVR version) and on a high-end PC (standalone version).
Video – WebVR Version:
Video – Standalone Version:
A collaboration with MA Yiwen Zhi, Digital Humanities, University of Bologna, Italy.
A virtual tour of an imaginary, sci-fi version of the uCreate Studio, where you can see and learn about the cutting-edge technologies we use at the uCreate Studio, University of Edinburgh. There is also a bittersweet ecological message at the end of the tour. This VR software aims to promote the technologies, services, and environmental goals of the uCreate Studio.
A collaboration with Mike Boyd, Head of uCreate Studio, University of Edinburgh, UK.
An immersive VR data-visualisation app for fluid dynamics (velocity magnitude, velocity along the three axes, and static pressure), built in Unity on top of computational fluid dynamics (CFD) results. The app has two versions: static data visualisation and transient data visualisation.
Video – Static:
A collaboration with MEng Scott Towt and Professor Prash Valluri, Chemical Engineering Department, School of Engineering, University of Edinburgh, UK.
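One core step in such an app is deriving the velocity magnitude from the three axis components of each CFD sample and mapping it to a colour for rendering. The sketch below illustrates that step in Python; it is a hypothetical simplification (the project itself is built in Unity), and the blue-to-red colour map is an assumption.

```python
import math

# Hypothetical sketch of one visualisation step: CFD velocity
# components -> magnitude -> colour.  NOT the project's Unity code.

def velocity_magnitude(vx, vy, vz):
    """Euclidean magnitude of the velocity vector."""
    return math.sqrt(vx * vx + vy * vy + vz * vz)

def magnitude_to_rgb(mag, mag_max):
    """Linear blue (slow) -> red (fast) colour map, 0-255 per channel."""
    t = min(mag / mag_max, 1.0) if mag_max > 0 else 0.0
    return (int(255 * t), 0, int(255 * (1 - t)))

m = velocity_magnitude(3.0, 4.0, 0.0)   # 5.0
colour = magnitude_to_rgb(m, 10.0)      # mid-range: (127, 0, 127)
```

In the transient version this mapping would be re-evaluated per time step as the field data changes.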
A digital watch to be used in VR projects, improving the quality of the temporal illusion.
Project’s Repository: https://github.com/PanosCortese/Virtual_Reality_Digital_Watch