Aalen Mobility Perception & Exploration Lab (vision-research.de)

More than 90% of environmental input is processed by the visual and auditory/vestibular sensory systems. Impairment of the related sensory organs has a critical impact on orientation, the ability to work, and activities of daily living. The aim of the Aalen Mobility Perception & Exploration Lab (AMPEL) is to evaluate dynamic, cross-modal exploratory capacity by tracking eye, head, and body movements along with additional auditory/vestibular parameters for positional and directional input. The results shall indicate how sensory impairment is compensated, or how attention is guided towards relevant objects, in order to perform adequately and efficiently. AMPEL provides two separate research environments (more than 170 m² in total) with complete shading, lighting control, a large-area window, various high-resolution visual and acoustic stimulation systems, and separate rooms for control, evaluation, and server units. The examination rooms can be variably equipped with (curved) screens, projection units, ultra-high-resolution (curved) displays, vehicles, shopping shelves, a cupola for examining eye–hand coordination, experimental setups for quantifying stereoscopic resolution, and obstacle courses. Mobile devices allow for time-coded tracking/recording and automated evaluation of eye, head, and body position in space.

Projects

The following projects are being carried out:

  • Standardized Tracking of Remote Head and Gaze Movements for Automated Assessment and Dispense of Individualised Varifocals (STRADIVari)
  • Virtual Reality Simulation of Spectacles (VR-SPECTACLES)
  • Eye hand coordination in spatiotemporally modified vision (ComplexAdapt)
  • Automated Measurement of Distortion (DISTORTION)
  • Objective Comparison of Progressive Addition Lenses Field Widths (PALCompar)
  • Objective Measurement of Adaptation to Progressive Addition Lenses (PALAdapt)
  • Analysis of visual acuity in blind patients after implantation of an electronic subretinal implant
  • Smart Ocular Motility Analysis (SOMA)
  • Gaze guidance for cross-modal compensation of sensory deficits (SensComp)
  • Attention guidance via acoustic and visual signaling (AcoustGuide)
  • Automated extraction and comparison of exploratory eye movements in highly dynamic, responsive environments (AutoScanpath)
  • Validation and optimization of binaural signal processing in real life situations (BinauralSignal)
  • Development and validation of representative, standardised night-driving and glare scenarios in a driving simulator (GlareSim)
  • Examination of suitability for use of instruction manuals

Cooperating Research Teams

Our cooperation partners are:
