Human Factors

Mission

The mission of the UFTI’s human factors research group is to conduct and foster impactful, crosscutting, multimodal transportation research; educate the next generation of transportation leaders; and facilitate technology transfer.

Projects

Project Title | PI | Funding Source | Status
UF & UAB’s Phase I Demonstration Study: Older Driver Experiences with Autonomous Vehicle Technology | Sherrilene Classen, Ph.D. | Southeastern Transportation Research, Innovation, Development and Education Center (STRIDE) | Completed
UF & UAB’s Phase II Demonstration Study: Developing a Model to Support Transportation System Decisions considering the Experiences of Drivers of all Age Groups with Autonomous Vehicle Technology | Sherrilene Classen, Ph.D. | Southeastern Transportation Research, Innovation, Development and Education Center (STRIDE) | Completed
I-STREET Initiative – Evaluation of Intelligent School Zone Beacon and Vehicle-Cyclist Detection and Warning System | Eakta Jain, Ph.D. | Florida Department of Transportation | Completed
Optimizing Highly Automated Driving Systems for People with Cognitive Disabilities | Eakta Jain, Ph.D. | U.S. Department of Transportation | Completed
Driving Performance of People with Parkinson’s Disease via Autonomous Vehicles | Sherrilene Classen, Ph.D. | National Institute on Disability, Independent Living, and Rehabilitation Research | Completed

Participants

Juan Gilbert, Distinguished Professor, Department of Computer & Information Science & Engineering, (352) 392-1200
Eakta Jain, Associate Professor, Department of Computer & Information Science & Engineering
Siva Srinivasan, Professor & Associate Director, UFTI, Department of Civil and Coastal Engineering, (352) 294-7807

Research Resources

Driving Simulators

Smart Home Driving Simulator

The Driving Simulator Lab is located in the garage of the Gator Tech Smart House, about 3.5 miles from the main university campus, giving participants convenient access, dedicated parking, and lighter traffic. The lab houses a high-fidelity simulator from Realtime Technologies Inc. (RTI). The RTI simulator is integrated into a full car cab with a 180-degree field of view, rear- and side-view mirrors, steering feedback, and realistic engine, transmission, wind, and tire noises. The full car cab allows the driver to operate normal accelerator, brake, steering, transmission-selection, and signaling controls, with the simulator responding accordingly. Longitudinal and lateral movement allows the driver to speed up or slow down, come to a halt, steer laterally, make lane changes, and change direction at intersections. Recording software permits the recording of up to 40 vehicle, driver, and simulation parameters in Excel format.
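
For illustration, the sketch below shows one way such a recording might be post-processed after export. The file name and column names (Time, Speed, LanePos) are hypothetical placeholders, not the actual RTI export schema.

```python
import pandas as pd

def summarize_drive(path: str) -> dict:
    """Compute simple driving-performance metrics from one recorded drive."""
    df = pd.read_excel(path)  # the simulator exports parameters in Excel format
    return {
        "mean_speed": df["Speed"].mean(),   # average vehicle speed
        "speed_sd": df["Speed"].std(),      # speed variability
        "sdlp": df["LanePos"].std(),        # standard deviation of lane position
        "duration_s": df["Time"].iloc[-1] - df["Time"].iloc[0],
    }

print(summarize_drive("drive_01.xlsx"))
```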

UF Advanced Driving Simulator

The University of Florida Advanced Driving Simulator (UFADS) is a high-fidelity, full-motion driving simulator located in the Human Systems Engineering Lab in the Department of Industrial and Systems Engineering. The simulator features a 360-degree field of view, presented through eight large LED panels, providing an immersive and realistic visual driving environment. Its full-motion capability comes from a 6-degree-of-freedom motion base, which provides motion and acceleration cues that help reduce simulator sickness and allow for a more realistic driving experience. A quarter-cab vehicle setup features realistic force-feedback steering; a fully customizable LCD instrument panel; accelerator, brake, gearshift, and signaling controls; and customizable buttons on the steering wheel and dashboard, allowing drivers to operate the simulated vehicle. The simulation is run from a control station next to the motion base, where experimenters monitor the experiment and participants through digital displays of vehicle parameters, the simulated driving environment, video of the participant, and voice communication. The UFADS is equipped with a number of driver-state sensors, including eye trackers and other physiological sensors, to help monitor visual attention and workload. It also supports a number of Advanced Driver Assistance Systems (ADAS), including adaptive cruise control, lane keeping, and emergency braking, as well as partial, conditional, and fully automated driving.
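
As an illustration of the kind of ADAS logic that can be prototyped on such a simulator, the sketch below implements a simple constant-time-headway adaptive cruise controller. The gains and limits are illustrative assumptions, not the simulator's built-in controller.

```python
def acc_command(ego_speed, lead_speed, gap, set_speed,
                time_headway=1.5, k_gap=0.3, k_speed=0.8):
    """Return a longitudinal acceleration command in m/s^2 (gains assumed)."""
    if gap is None:  # no lead vehicle detected: fall back to plain cruise control
        return k_speed * (set_speed - ego_speed)
    desired_gap = time_headway * ego_speed        # constant-time-headway policy
    accel = k_gap * (gap - desired_gap) + k_speed * (lead_speed - ego_speed)
    return max(-3.0, min(accel, 2.0))             # clamp to comfortable limits

# Example: ego at 25 m/s, lead at 22 m/s and 30 m ahead, driver set-speed 30 m/s
print(acc_command(25.0, 22.0, 30.0, 30.0))  # decelerates (clamped to -3.0)
```

Scaling the desired gap with the ego vehicle's speed is a common starting point for ACC prototypes evaluated in driving simulators.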

Simulator Vehicle Cab

The Driving Simulator Lab houses a four-door sedan that allows the driver to operate normal accelerator, brake, steering, transmission-selection, and signaling controls, with the simulator responding accordingly. Longitudinal and lateral movement allows the driver to speed up or slow down, come to a halt, and steer laterally, including making lane changes and changing direction at intersections. All driver inputs are handled by software that interfaces with the electronics in the car cab.

Human Behavior Observation Systems

Eye Tracking

Remote-mounted and head-mounted eye tracking units are distributed across our labs, including Tobii Pro Glasses, three Pupil headsets, one set of SMI ETG2 glasses, and a Mirametrix S2 eye tracker, as well as a number of screen-based eye trackers, including the Tobii Nano and SMI RED-m.
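
Regardless of the specific tracker, exported gaze data is typically reduced to attention metrics such as dwell time per area of interest (AOI). A minimal sketch, assuming a hypothetical CSV export with timestamp and aoi columns (real export formats differ by vendor):

```python
import pandas as pd

def dwell_times(path: str) -> pd.Series:
    """Total gaze dwell time in seconds per area of interest (AOI)."""
    gaze = pd.read_csv(path)
    gaze["dt"] = gaze["timestamp"].diff().fillna(0)  # time between gaze samples
    return gaze.groupby("aoi")["dt"].sum()

print(dwell_times("gaze_export.csv"))
```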

Body and Gesture Tracking

Physiological and motion sensors, including the Xsens MTw Awinda wireless motion tracker, can be used to assess a user's state.

Interviews and Qualitative Observations

The Safety-Critical Systems User Experience (SCUSE) Facility, located within the Human Systems Engineering Lab in the Department of Industrial and Systems Engineering, comprises an experimental/observation room and a control room. The facility is used to conduct human-subjects experiments, focus groups, and interviews, and provides both experiment space and observation/analysis rooms. It is instrumented with video and audio capture devices, including two ceiling-mounted PTZ cameras, two ceiling-mounted cardioid condenser hanging microphones, three USB webcams, and a USB condenser microphone, to capture video and audio of user interactions. Video and audio data are synchronized with other data streams using the D-Lab data capture and analysis software suite. The experimental space consists of movable, customizable tables and chairs that can be configured as a conference table, office desk space, or command-and-control center, or removed and stored away to create a large, open 300 sq. ft. space for VR and other studies that require movement.
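
As a rough illustration of this kind of stream synchronization (a generic pandas approach, not D-Lab's own mechanism), the sketch below aligns two hypothetical timestamped streams by nearest timestamp:

```python
import pandas as pd

# Hypothetical streams on a shared clock: annotated video events and
# physiological samples (column names are assumptions for illustration).
events = pd.DataFrame({"t": [0.50, 2.10, 4.75], "event": ["A", "B", "C"]})
physio = pd.DataFrame({"t": [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5, 5.0],
                       "hr": [70, 71, 72, 74, 73, 75, 76, 74, 73, 72, 71]})

# Attach to each event the nearest physiological sample within 0.25 s.
synced = pd.merge_asof(events.sort_values("t"), physio.sort_values("t"),
                       on="t", direction="nearest", tolerance=0.25)
print(synced)
```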

Virtual and Augmented Reality

Mixed reality interventions are designed and evaluated on a number of Virtual Reality (VR) and Augmented Reality (AR) platforms. Google Cardboard is a head-mounted display that uses a mobile phone for stereoscopic 3D rendering. The Oculus DK2-SMI and Oculus CV1 are desktop head-mounted displays that require a VR-ready CPU and GPU for stereoscopic 3D rendering and connect over HDMI and USB. The Oculus DK2-SMI integrates eye tracking with virtual reality.