Aalen Mobility Perception & Exploration Lab (vision-research.de)
More than 90% of the environmental input is processed by the visual and the auditory/vestibular sensory systems. Impairment of the related sensory organs has a critical impact on orientation, the ability to work, and activities of daily living. The aim of the Aalen Mobility Perception & Exploration Lab (AMPEL) is to evaluate dynamic, cross-modal explorative capacity by tracking eye, head, and body movements together with additional auditory/vestibular parameters for positional and directional input. The results are intended to show how sensory impairment is compensated for, and how attention is guided towards relevant objects, so that tasks can be performed adequately and efficiently.

AMPEL provides two separate large research environments (more than 170 m² in total) with complete shading, lighting control, large-area windows, various high-resolution visual and acoustic stimulation systems, and separate rooms for control, evaluation, and server units. The examination rooms can be flexibly equipped with (curved) screens, projection units, ultra-high-resolution (curved) displays, vehicles, shopping shelves, a cupola for examining eye-hand coordination, experimental setups for quantifying stereoscopic resolution, and obstacle courses. Mobile devices allow time-coded tracking/recording and automated evaluation of eye, head, and body position in space.
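Because the mobile eye, head, and body trackers record time-stamped samples on independent clocks, a typical first step in automated evaluation is to align the streams on a common timeline and express gaze in world coordinates. The following minimal Python sketch illustrates that idea; all sample values, rates, and coordinate conventions are placeholder assumptions for illustration, not the lab's actual pipeline.

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

# Illustrative placeholder data -- not the lab's actual formats or rates.
# Eye tracker: timestamps (s) and gaze unit vectors in head coordinates.
eye_t = np.array([0.000, 0.004, 0.008, 0.012])            # e.g. 250 Hz
eye_dir_head = np.array([[0.0, 0.1, 1.0],
                         [0.0, 0.1, 1.0],
                         [0.1, 0.1, 1.0],
                         [0.1, 0.0, 1.0]])
eye_dir_head /= np.linalg.norm(eye_dir_head, axis=1, keepdims=True)

# Head tracker: slower, independent clock; orientation in world coordinates.
head_t = np.array([0.000, 0.010])                         # e.g. 100 Hz
head_rot = Rotation.from_euler("y", [0.0, 5.0], degrees=True)

# Resample head orientation onto the eye-tracker timeline (spherical
# interpolation), clamping eye timestamps to the span of head samples.
slerp = Slerp(head_t, head_rot)
head_at_eye_t = slerp(np.clip(eye_t, head_t[0], head_t[-1]))

# Rotate each eye-in-head gaze vector into world coordinates.
gaze_world = head_at_eye_t.apply(eye_dir_head)
print(gaze_world)   # one world-frame gaze direction per eye sample
```

Spherical interpolation (slerp) is used here rather than linear interpolation of quaternion components, since it keeps the interpolated head orientations valid rotations between samples.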
Projects
The following projects are being conducted:
- Standardized Tracking of Remote Head and Gaze Movements for Automated Assessment and Dispensing of Individualised Varifocals (STRADIVari)
- Virtual Reality Simulation of Spectacles (VR-SPECTACLES)
- Eye-hand coordination in spatiotemporally modified vision (ComplexAdapt)
- Automated Measurement of Distortion (DISTORTION)
- Objective Comparison of Progressive Addition Lens Field Widths (PALCompar)
- Objective Measurement of Adaptation to Progressive Addition Lenses (PALAdapt)
- Analysis of the visual acuity of blind patients after implantation of an electronic subretinal implant
- Smart Ocular Motility Analysis (SOMA)
- Gaze guidance for cross-modal compensation of sensory deficits (SensComp)
- Attention guidance via acoustic and visual signaling (AcoustGuide)
- Automated extraction and comparison of exploratory eye movements in highly dynamic, responsive environments (AutoScanpath)
- Validation and optimization of binaural signal processing in real life situations (BinauralSignal)
- Development and validation of representative, standardised night-driving and glare scenarios in a driving simulator (GlareSim)
- Examination of the usability of instruction manuals
Cooperating Research Teams
Our cooperation partners are:
- Würzburg Institute for Traffic Sciences (WIVW)
- Mechatronics degree programme, Aalen University of Applied Sciences
- Institute for Computer Science, research group of Prof. Dr. W. Rosenstiel, Tuebingen University
- University Eye Hospital Tuebingen
- Institute for Neurobiology, Prof. Dr. H. A. Mallot, Tuebingen University
- Zeiss Vision Lab
- P. Artes, School of Health Professions, Plymouth University
- C. Johnson, Carver College of Medicine, Department of Ophthalmology and Visual Sciences, University of Iowa
- University Eye Hospital Freiburg, Prof. Dr. M. Bach, Dr. S. Heinrich, Prof. Dr. W. Lagreze