Human Factors

Group Leader: Dr. Wayne Giang

Dr. Wayne Giang, Assistant Professor in the Department of Industrial & Systems Engineering at the University of Florida

Dr. Wayne Giang is a human factors and cognitive engineering researcher with a focus on human decision-making, end-user training, and interface design. His work is centered on understanding, evaluating, and designing new ways to help individuals safely interact with and adopt emerging technologies in domains such as surface transportation and healthcare. Dr. Giang’s work has focused on populations that may have unique needs in learning or adapting to new technologies due to perceptual, cognitive, or motor decline (e.g., people with Parkinson’s disease or mild to moderate cognitive impairment, and older adults) or lack of expertise (e.g., health insurance decision-making). Dr. Giang’s work has been funded by the U.S. Department of Transportation, the National Institute of Standards and Technology, and the National Institute on Disability, Independent Living, and Rehabilitation Research. He received his bachelor’s and master’s degrees in Systems Design Engineering from the University of Waterloo in Waterloo, Ontario and his Ph.D. from the University of Toronto.

Mission

The mission of the UFTI’s human factors research group is to conduct and foster impactful, crosscutting, multimodal transportation research; educate the next generation of transportation leaders; and facilitate technology transfer.

Projects

| Project Title | PI | Funding Source | Status |
| --- | --- | --- | --- |
| UF & UAB’s Phase I Demonstration Study: Older Driver Experiences with Autonomous Vehicle Technology | Dr. Sherrilene Classen | Southeastern Transportation Research, Innovation, Development and Education Center (STRIDE) | Completed |
| UF & UAB’s Phase II Demonstration Study: Developing a Model to Support Transportation System Decisions considering the Experiences of Drivers of all Age Groups with Autonomous Vehicle Technology | Dr. Sherrilene Classen | Southeastern Transportation Research, Innovation, Development and Education Center (STRIDE) | Completed |
| Exploring Younger and Older Driver’s Trust, Attention and Workload During Vehicle Automation | Dr. Justin Mason | University of Florida Clinical and Translational Science Institute (CTSI) | Completed |
| I-STREET Initiative – Evaluation of Intelligent School Zone Beacon and Vehicle-Cyclist Detection and Warning System | Dr. Eakta Jain | Florida Department of Transportation | Completed |
| Optimizing Highly Automated Driving Systems for People with Cognitive Disabilities | Dr. Eakta Jain | U.S. Department of Transportation | Completed |
| Driving Performance of People with Parkinson’s Disease via Autonomous Vehicles | Dr. Sherrilene Classen | National Institute on Disability, Independent Living, and Rehabilitation Research | Completed |

Participants

Sherrilene Classen, Professor and Chair, Department of Occupational Therapy, (352) 273-6883
Wayne Giang, Assistant Professor, Department of Industrial & Systems Engineering, (352) 294-7729
Juan Gilbert, Banks Family Preeminence Endowed Professor & Department Chair, Department of Computer & Information Science & Engineering, (352) 392-1200
Eakta Jain, Associate Professor, Department of Computer & Information Science & Engineering
David Kaber, Department Chair & Dean’s Leadership Professor, Department of Industrial & Systems Engineering, (352) 294-7700
Pruthvi Manjunatha, I-STREET “Living lab” Manager and Research Assistant Professor, Department of Civil and Coastal Engineering
Justin Mason, Research Assistant Professor, Department of Occupational Therapy, (352) 273-6146
Siva Srinivasan, Associate Director, UFTI; Associate Professor, Department of Civil and Coastal Engineering, (352) 294-7807

Research Resources

Driving Simulators

Smart Home Driving Simulator

The Driving Simulator Lab is located in the garage of the Gator Tech Smart House, about 3.5 miles from the main university campus; the location offers participants dedicated parking and lighter traffic. The lab houses a high-fidelity simulator from Realtime Technologies Inc. (RTI). The RTI simulator is integrated into a full car cab with a 180-degree field of view, rear- and side-view mirrors, steering feedback, and realistic engine, transmission, wind, and tire noises. The full car cab allows the driver to operate normal accelerator, brake, steering, transmission-selection, and signaling controls, with the simulator responding accordingly. Longitudinal and lateral movement allows the driver to speed up or slow down, come to a halt, steer laterally, make lane changes, and change direction at intersections. Recording software captures up to 40 vehicle, driver, and simulation parameters in Excel format.
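To make that data output concrete, here is a minimal Python sketch of how one such exported drive file might be loaded and summarized. The file name and the `speed_mps` and `lane_offset_m` column names are assumptions for illustration only; the actual RTI export schema may differ.

```python
# Hypothetical sketch: summarizing one exported drive.
# The file name and column names are assumptions; the actual
# RTI export schema may differ.
import pandas as pd

def summarize_drive(path: str) -> dict:
    """Load an exported drive and compute simple performance summaries."""
    df = pd.read_excel(path)  # reading .xlsx files requires openpyxl
    return {
        "mean_speed_mps": df["speed_mps"].mean(),       # average speed
        "sd_lane_offset_m": df["lane_offset_m"].std(),  # lane-keeping variability
        "n_samples": len(df),
    }

print(summarize_drive("drive_001.xlsx"))
```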

UF Advanced Driving Simulator
[Image: the UF Advanced Driving Simulator raised on its motion platform, with doors open to show the simulator seat.]

The University of Florida Advanced Driving Simulator (UFADS) is a high-fidelity, full-motion driving simulator located in the Human Systems Engineering Lab in the Department of Industrial and Systems Engineering. The simulator presents a 360-degree field of view through 8 large LED panels, providing an immersive and realistic visual driving environment. Its full-motion capability comes from a 6-degree-of-freedom motion base that provides motion and acceleration cues, which can help reduce simulator sickness and allow for a more realistic driving experience. Drivers operate the simulated vehicle from a quarter-cab setup with realistic force-feedback steering, a fully customizable LCD instrument panel, accelerator, brake, gearshift, signaling controls, and customizable buttons on the steering wheel and dashboard. The simulation is run from a control station next to the motion base, where experimenters monitor the experiment and the participant through digital displays of vehicle parameters, the simulated driving environment, video of the participant, and voice communication. The UFADS is also equipped with a number of driver-state sensors, including eye trackers and other physiological sensors, to help monitor visual attention and workload while driving. The UFADS also supports a number of advanced driver assistance systems, including adaptive cruise control, lane keeping, and emergency braking, as well as partial, conditional, and fully automated driving.
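For readers unfamiliar with how a feature like adaptive cruise control behaves, the sketch below shows a constant-time-gap control law in Python. It is purely illustrative: the gains, comfort limits, and time gap are assumptions, not the controller actually used in the UFADS.

```python
# Illustrative adaptive cruise control (ACC) sketch.
# This is NOT the UFADS implementation; the gains and the
# constant-time-gap policy below are assumptions chosen for readability.

DESIRED_TIME_GAP_S = 1.5   # target headway to the lead vehicle
K_GAP = 0.4                # gain on spacing error (1/s^2)
K_SPEED = 0.8              # gain on relative speed (1/s)

def acc_acceleration(gap_m: float, ego_speed_mps: float,
                     lead_speed_mps: float) -> float:
    """Return a commanded acceleration (m/s^2) from a constant-time-gap policy."""
    desired_gap_m = DESIRED_TIME_GAP_S * ego_speed_mps
    spacing_error = gap_m - desired_gap_m          # positive: too far back
    speed_error = lead_speed_mps - ego_speed_mps   # positive: lead pulling away
    accel = K_GAP * spacing_error + K_SPEED * speed_error
    return max(-3.0, min(2.0, accel))              # clamp to comfort limits

# Example: 25 m behind a lead car, ego at 20 m/s, lead at 18 m/s
# -> the controller commands a deceleration (clamped to -3.0 m/s^2).
print(acc_acceleration(25.0, 20.0, 18.0))
```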

Simulator Vehicle Cab

The Driving Simulator Lab houses a 4-door sedan cab that allows the driver to operate normal accelerator, brake, steering, transmission-selection, and signaling controls, with the simulator responding accordingly. Longitudinal and lateral movement allows the driver to speed up or slow down, come to a halt, and steer laterally, including making lane changes and changing direction at intersections. All driver inputs are handled by software that interfaces with the electronics in the car cab.

Human Behavior Observation Systems

Eye Tracking

Head-mounted and remote eye-tracking units are distributed across our labs, including Tobii Pro Glasses, three Pupil headsets, SMI ETG 2 glasses, and a Mirametrix S2 eye tracker, as well as a number of screen-based eye trackers, including the Tobii Nano and SMI RED-m.
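As a hedged illustration of how raw gaze samples from such trackers are commonly turned into fixations, below is a simplified dispersion-threshold (I-DT style) detector in Python. The thresholds are typical textbook values, not vendor defaults; each tracker's own analysis software provides its own tuned filters.

```python
# Simplified dispersion-threshold (I-DT style) fixation detection.
# Thresholds are typical textbook values, not vendor defaults.

def detect_fixations(samples, max_dispersion=1.0, min_duration=0.1):
    """samples: list of (t_seconds, x_deg, y_deg) gaze points.
    Returns (start_t, end_t) tuples for detected fixations."""
    fixations = []
    i = 0
    while i < len(samples):
        j = i
        # Grow the window while total dispersion stays under threshold.
        while j + 1 < len(samples):
            xs = [s[1] for s in samples[i:j + 2]]
            ys = [s[2] for s in samples[i:j + 2]]
            if (max(xs) - min(xs)) + (max(ys) - min(ys)) > max_dispersion:
                break
            j += 1
        duration = samples[j][0] - samples[i][0]
        if duration >= min_duration:         # keep windows that last long enough
            fixations.append((samples[i][0], samples[j][0]))
            i = j + 1
        else:
            i += 1
    return fixations

# Example: 200 ms of stable gaze followed by a large saccade.
pts = [(t * 0.01, 0.1, 0.1) for t in range(20)] + [(0.20, 10.0, 10.0)]
print(detect_fixations(pts))  # -> one fixation spanning the stable samples
```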

Body and Gesture Tracking

Physiological and motion sensors, including the Xsens MTw Awinda wireless motion tracker, can be used to assess a user's state.
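Purely as a sketch of what assessing user state from motion data can look like, the snippet below computes a crude movement-intensity score from accelerometer samples. The (ax, ay, az) tuple format is an assumption for illustration, not the Xsens MTw Awinda's actual API or data schema.

```python
# Illustrative movement-intensity metric from IMU accelerometer samples.
# The (ax, ay, az) m/s^2 tuple format is an assumption, not the
# Xsens MTw Awinda's actual API or data schema.
import math

GRAVITY = 9.81  # m/s^2

def movement_intensity(accel_samples):
    """Mean absolute deviation of acceleration magnitude from gravity."""
    deviations = [
        abs(math.sqrt(ax**2 + ay**2 + az**2) - GRAVITY)
        for ax, ay, az in accel_samples
    ]
    return sum(deviations) / len(deviations)

# A still sensor reads ~gravity only; movement raises the score.
print(movement_intensity([(0.0, 0.0, 9.81), (1.2, 0.5, 10.3)]))
```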

Interviews and Qualitative Observations

The Safety-Critical Systems User Experience (SCUSE) Facility, part of the Human Systems Engineering Lab in the Department of Industrial and Systems Engineering, consists of an experimental/observational room and a control room. The facility is used to conduct human-subjects experiments, focus groups, and interviews. It is instrumented with video and audio capture devices, including two ceiling-mounted PTZ cameras, two ceiling-mounted cardioid condenser hanging microphones, three USB webcams, and a USB condenser microphone, to capture video and audio data of user interactions. Video and audio data are synchronized with other data streams using the D-Lab data capture and analysis software suite. The experimental space contains movable, customizable tables and chairs that can be configured as a conference table, office desk space, or command-and-control-center setup, or removed and stored away to create a large, open 300 sq. ft. space for VR and other studies that require movement.
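D-Lab performs the stream synchronization in this facility; the pandas sketch below only illustrates the general idea of aligning two independently timestamped streams by nearest preceding timestamp. The stream contents and column names are hypothetical.

```python
# Illustrative sketch of aligning two independently timestamped data
# streams, the general idea behind synchronized capture. (The facility
# itself uses the D-Lab suite; the column names here are hypothetical.)
import pandas as pd

gaze = pd.DataFrame({
    "t": [0.00, 0.02, 0.04, 0.06],  # seconds on the gaze stream's clock
    "x_deg": [0.1, 0.2, 5.0, 5.1],
})
events = pd.DataFrame({
    "t": [0.03],                    # seconds on the annotation stream's clock
    "label": ["participant_speaks"],
})

# Attach each event to the most recent gaze sample at or before it.
aligned = pd.merge_asof(events, gaze, on="t", direction="backward")
print(aligned)
```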

Virtual and Augmented Reality

Mixed-reality interventions are designed and evaluated on a number of Virtual Reality (VR) and Augmented Reality (AR) platforms. Google Cardboard is a head-mounted display that uses a mobile phone for stereoscopic 3D rendering. The Oculus DK2-SMI and Oculus CV1 are desktop head-mounted displays that require a VR-ready CPU and GPU for stereoscopic 3D rendering and connect over HDMI and USB. The Oculus DK2-SMI integrates eye tracking with virtual reality.