Research Keywords

  • Cognitive processes
  • Human information processing models
  • Multiple target tracking
  • Visual scanning patterns and their visualization
  • Usability testing
  • Situation awareness
  • Assistive learning methods
  • Multimodal analysis

Current Research Applications

National Science Foundation CAREER Award

  • Non-text-based Smart Learning in Multi-person VR using eye movements, brain activities, and haptic interactions: Non-text-based smart learning refers to technology-supported learning that uses non-text features (e.g., visualized information) and learning materials adapted to the individual's needs. Fully immersive multi-person virtual reality (MVR) refers to humans located at different places wearing VR devices to join a single virtual room, a classroom of unlimited size, to learn from an instructor. The learning environment is poised to undergo a major reformation, and MVR will augment, and possibly replace, the traditional classroom learning environment. The purpose of this research is to discover new smart learning methodologies within the MVR environment using nonintrusive multimodal analysis of physiological measures, including eye movement characteristics, haptic interactions, and brain activities.

  • The images above show how we can obtain raw eye movement data in a fully immersive VR environment, along with brain activity data and haptic interaction data. The technology already exists, and the issues that we need to address are: [1] developing effective non-text-based (or visualized) learning materials in VR and [2] effectively analyzing the physiological data (eye movements, brain activities, and haptic interactions) in near real time to provide timely scaffolding options. The NSF CAREER award enables us to develop predictive algorithms that determine whether the learner is effectively engaging and learning within the VR world (a minimal sketch of such a predictive model follows the researcher list below).

    Researchers:
    PI: Dr. Ziho Kang
    Mr. Saptarshi Mandal
    Mr. Ricardo Fraga
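
  • As a minimal illustration (not the project's actual algorithm), the sketch below combines hypothetical eye movement, brain activity, and haptic features into one feature vector per learning episode and trains a simple classifier to predict engagement. All feature names and data are placeholder assumptions.

        # Minimal sketch: predicting learner engagement from multimodal features.
        # The feature names and synthetic data are hypothetical placeholders.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        n = 200  # hypothetical number of labeled learning episodes

        # One row per episode: [mean fixation duration (s), saccade rate (1/s),
        # prefrontal oxygenation change (a.u.), haptic interaction rate (1/s)]
        X = rng.normal(size=(n, 4))
        y = rng.integers(0, 2, size=n)  # 1 = engaged, 0 = not engaged

        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
        model = LogisticRegression().fit(X_train, y_train)
        print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")

    In practice, the feature vectors would be extracted from the eye tracker, brain imaging, and haptic device streams rather than generated synthetically.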


Air Traffic Management Systems

  1. Airport Tower Control Risk Analysis using VR: This research involves discovering expert air traffic controllers' visual scanning patterns that can be effectively taught to novices to possibly prevent accidents and improve performance.


    The image above shows the expert's eye movements overlaid onto the recorded field of view. A realistic VR environment was created using eight 60-inch monitors. The centers of the circles indicate the eye fixation locations. The sizes of the circles are proportional to the eye fixation durations. The lines indicate the rapid saccadic movements among the eye fixations. The numbers in the circles indicate the eye fixation sequence. This representation of the eye fixation sequence is called a visual scan path.


    The image above shows how the experts' visual scanning patterns can be discovered by clustering many instances of visual scan paths (an illustrative clustering sketch follows the researcher list below).

    Researchers:
    PI: Dr. Ziho Kang
    Researchers in the Aerospace Human Factors Research Division at FAA CAMI
    Mr. Saptarshi Mandal
    Mr. Ricardo Fraga
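
    One common way to cluster scan paths, shown below purely as an illustration (not necessarily the exact method used in this project), is to encode each path as a string of AOI labels and compare the strings with an edit distance before hierarchical clustering. The AOI labels and paths are hypothetical.

      # Illustrative sketch: clustering scan paths encoded as AOI label strings.
      # The scan paths below are hypothetical (e.g., 'R' = runway, 'T' = taxiway).
      from itertools import combinations
      from scipy.cluster.hierarchy import linkage, fcluster

      def edit_distance(a: str, b: str) -> int:
          """Levenshtein distance between two AOI sequences."""
          prev = list(range(len(b) + 1))
          for i, ca in enumerate(a, 1):
              cur = [i]
              for j, cb in enumerate(b, 1):
                  cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
              prev = cur
          return prev[-1]

      paths = ["RTRA", "RTRRA", "ATAR", "ATTAR", "RTRA"]  # one string per trial
      condensed = [float(edit_distance(a, b)) for a, b in combinations(paths, 2)]
      labels = fcluster(linkage(condensed, method="average"), t=2, criterion="maxclust")
      print(labels)  # trials sharing a scanning pattern receive the same cluster id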

  2. Eye movement analysis and data visualization: This research involves a wide variety of methods to aid air traffic controllers with managing multiple dynamic aircraft. Eye tracking data analysis is performed to further understand the cognitive processes of controllers as they scan aircraft, then identify and resolve conflicts between them. This understanding leads to detailed characterization of scanning strategies and more effective training methods.


    The image above on the left shows a zoomed-in area of a realistic simulation of an air traffic radar display. Each green diamond represents an aircraft, the line projecting out represents its travel direction, and the text is a data tag listing the flight number, altitude, and velocity, respectively. The image on the right shows a 1.5-minute eye movement scan of an air traffic controller overlaid on a scenario including 20 aircraft. The circled numbers show the order in which the eye fixations were made, and the white lines connecting them are rapid movements between fixations. There are many ways to characterize these scanning strategies, including categories such as duration, number of comparisons between aircraft, and search patterns.


    The image to the left represents a simulated radar display of low-altitude en route airspace. The green dot shows the eye fixation location of an air traffic controller (ATC), and the green lines represent saccades of the eye movement. The image on the right represents the directed weighted network (DWN) representation of the ATC's AOI fixation sequence data. The size and the color of the nodes (AOIs) are proportional to the number of times each AOI is visited and to the total eye fixation duration on it, respectively (red means higher fixation duration, yellow means lower). A sketch of how such a network can be assembled follows the researcher list below.

    Researchers:
    PI: Dr. Ziho Kang
    Researchers in the Aerospace Human Factors Research Division at FAA CAMI
    Mr. Saptarshi Mandal
    Mrs. Sarah McClung
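
    The sketch below shows one way such a directed weighted network could be assembled from an AOI fixation sequence; the AOI labels, durations, and the networkx-based representation are illustrative assumptions rather than the project's actual pipeline.

      # Illustrative sketch: building a directed weighted network (DWN)
      # from an AOI fixation sequence. AOI labels and durations are hypothetical.
      import networkx as nx

      # (AOI label, fixation duration in seconds), in the order fixations occurred
      fixations = [("A1", 0.4), ("A3", 1.2), ("A1", 0.3), ("A2", 0.8), ("A3", 0.6)]

      G = nx.DiGraph()
      for aoi, dur in fixations:
          if aoi not in G:
              G.add_node(aoi, visits=0, dwell=0.0)
          G.nodes[aoi]["visits"] += 1   # node size ~ number of visits
          G.nodes[aoi]["dwell"] += dur  # node color ~ total fixation duration

      for (src, _), (dst, _) in zip(fixations, fixations[1:]):
          if G.has_edge(src, dst):
              G[src][dst]["weight"] += 1  # edge weight ~ transition count
          else:
              G.add_edge(src, dst, weight=1)

      print(G.nodes(data=True))
      print(G.edges(data=True))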

  3. Multimodal Analysis using Neuro-imaging and Eye Movements to Assess Cognitive Workload: The emergence of wearable sensors that measure physiological signals enables us to monitor mental workload in real time without interfering with everyday operational activity.



    The images above show how task difficulty increases with the number of aircraft versus the number of commands issued by the controller. We can investigate the correlations among eye movement characteristics, brain activities (e.g., oxygenation), and task complexity, as in the sketch following the researcher list below.

    Researchers:
    PIs: Dr. Ziho Kang and Dr. Kurtulus Izzetoglu (Biomedical Eng., Drexel U.)
    Mr. Ricardo Fraga (OU)
    Ms. Yerram Pratusha Reddy (Biomedical Eng., Drexel U.)
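
    As an illustration of the kind of correlation analysis involved, the sketch below relates a hypothetical eye movement metric and a hypothetical oxygenation metric to the number of aircraft. All values are synthetic placeholders, not experimental data.

      # Illustrative sketch: correlating eye movement and brain activity
      # metrics with task complexity. All values below are synthetic.
      import numpy as np
      from scipy.stats import pearsonr

      n_aircraft   = np.array([5, 8, 10, 12, 15, 18, 20])                  # task complexity
      mean_fix_dur = np.array([0.32, 0.35, 0.38, 0.37, 0.42, 0.45, 0.47])  # seconds
      oxygenation  = np.array([0.10, 0.14, 0.13, 0.18, 0.21, 0.24, 0.28])  # arbitrary units

      for name, series in [("mean fixation duration", mean_fix_dur),
                           ("oxygenation", oxygenation)]:
          r, p = pearsonr(n_aircraft, series)
          print(f"{name} vs. aircraft count: r = {r:.2f}, p = {p:.3f}")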

  4. Gamification of visual scanning strategies: The aim is to develop fun and engaging games that improve trainee performance.


    Researchers:
    PI: Dr. Ziho Kang
    Mr. Clay Wyrrick

  5. Development of an automatic process for mapping eye tracking data to multiple moving objects: A minimal sketch of such a mapping follows the researcher list below.

    Researchers:
    PI: Dr. Ziho Kang
    Researchers in the Aerospace Human Factors Research Division at FAA CAMI
    Mr. Saptarshi Mandal
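
    A minimal version of such a mapping assigns each gaze sample to the nearest aircraft at the same timestamp, provided the aircraft lies within a distance threshold; the positions, identifiers, and threshold below are hypothetical.

      # Illustrative sketch: mapping a gaze sample to dynamic aircraft positions.
      # Coordinates, aircraft identifiers, and the threshold are hypothetical.
      import numpy as np

      def map_gaze_to_aircraft(gaze_xy, aircraft_xy, ids, max_dist=50.0):
          """Return the id of the nearest aircraft within max_dist pixels, else None.

          gaze_xy: (2,) gaze position; aircraft_xy: (k, 2) aircraft positions
          sampled at the same timestamp; ids: the k aircraft identifiers.
          """
          d = np.linalg.norm(aircraft_xy - gaze_xy, axis=1)
          i = int(np.argmin(d))
          return ids[i] if d[i] <= max_dist else None

      gaze = np.array([412.0, 305.0])
      planes = np.array([[400.0, 310.0], [620.0, 150.0]])
      print(map_gaze_to_aircraft(gaze, planes, ["AAL12", "UAL7"]))  # -> AAL12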

  6. Universal Design for Learning (UDL) and multimodal training: The aim is to identify and implement the UDL principles to embrace different learning styles in order to support the training of FAA Academy candidates. In addition, this research investigates the possibility of applying state-of-the-art technologies such as augmented reality, virtual reality, eye tracking, and EEG (brain activity).

    Researchers:
    PI: Dr. Ziho Kang
    Co-PIs: Drs. Randa R. Shehab, Lei Ding, and Han Yuan.
    SME: Stephen G. West
    Ms. Lauren N. Yeagle
    Ms. Mattlyn R. Dragoo
    Mr. Josiah Rippetoe
    Mr. Ricardo F. Palma

  7. Combined analysis of visual scanning patterns and aircraft control strategies for training intervention: The aim is to develop improved training materials for FAA Academy candidates.

    Researchers:
    PI: Dr. Ziho Kang
    Co-PI: Dr. John Dyer
    Mr. Saptarshi Mandal
    Mr. Ricardo F. Palma
    Mrs. Sarah N. McClung
    Mr. Kelvin Egwu

Pilot Cockpit Systems

  • Predicting fatigue using eye movement characteristics: This research involves the investigation of pilot behaviors when interacting with cockpit interfaces under fatigue. Human errors and interface usability issues are being investigated. The yellow boxed highlights in the figure below are defined as areas of interest (AOIs) so that we can investigate whether the pilots were attending to specified visual events when fatigued (a minimal AOI dwell-time sketch follows the researcher list below).


  • Researchers:
    Lead: Dr. Ziho Kang
    Mr. Salem Naeeri
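
  • As a minimal illustration of how attention to the AOIs can be quantified, the sketch below sums fixation durations that fall inside rectangular AOI boxes; the box coordinates and fixation log are hypothetical.

      # Illustrative sketch: total dwell time inside rectangular AOIs,
      # e.g., to check whether a pilot attended to a highlighted instrument.
      # The AOI boxes and the fixation log are hypothetical.
      AOIS = {"altimeter": (100, 200, 60, 40),  # (x, y, width, height) in pixels
              "airspeed":  (300, 200, 60, 40)}

      # (x, y, duration in seconds) for each fixation
      fixations = [(120, 215, 0.6), (310, 230, 0.4), (500, 400, 1.0)]

      def dwell_per_aoi(fixations, aois):
          dwell = {name: 0.0 for name in aois}
          for fx, fy, dur in fixations:
              for name, (x, y, w, h) in aois.items():
                  if x <= fx <= x + w and y <= fy <= y + h:
                      dwell[name] += dur
          return dwell

      print(dwell_per_aoi(fixations, AOIS))  # {'altimeter': 0.6, 'airspeed': 0.4}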


Weather Forecasting and Alerting Systems

  1. Investigating forecasters' cognitive processes when detecting weather hazards: This research is based on a collaborative effort with researchers at NOAA.

    Researchers:
    Lead: Dr. Pamela Hiselman
    Dr. Ziho Kang
    Dr. Katie Bowden


  2. Understanding forecasters' situation awareness when predicting flash floods.

    Researchers:
    Lead: Dr. Randa Shehab
    Dr. Ziho Kang
    Dr. Jonathan Gourley
    Dr. Elizabeth Argyle

  3. Usability testing of weather alert applications: Supported by Weather Decision Technologies.

    Researchers:
    Lead: Dr. Ziho Kang
    Mr. Khamaj Abdulrahman

    This research focuses on evaluating the usability of mobile weather alert applications and identifying their usability issues. Specifically, we are investigating the Weather Radio application as a representative weather alert application.

    The image above on the left is a screenshot of the home screen of the Weather Radio application. The image on the right shows a sample eye movement scan from one of the application's users.

Online Healthcare Systems

  • This research involves the usability of softkeys (on-screen keyboards) when interacting with online healthcare systems using tablets. In addition, we are investigating the users' comprehension of the medical answers posted by doctors. The image depicts the evolution of typing error calculation methodologies in the domain of human-computer interaction. We have developed a novel approach to assess such errors; a sketch of the classic baseline metric follows the researcher list below.


  • Researchers:
    PI: Dr. Ziho Kang
    Co-PI: Dr. Kwangtaek Kim
    Ms. Jiwon Jeon
    Mr. Lucas Mosoni
    Mrs. Sarah McClung
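
  • For context, a classic baseline in this line of work is the minimum string distance (MSD) error rate, which divides the edit distance between the presented and transcribed text by the longer of the two lengths. The sketch below implements that baseline, not the novel method developed in this project; the sample strings are hypothetical.

      # Illustrative sketch: the classic minimum string distance (MSD)
      # error rate for text entry. Sample strings are hypothetical.
      def edit_distance(a: str, b: str) -> int:
          """Levenshtein distance between the two strings."""
          prev = list(range(len(b) + 1))
          for i, ca in enumerate(a, 1):
              cur = [i]
              for j, cb in enumerate(b, 1):
                  cur.append(min(prev[j] + 1, cur[j - 1] + 1, prev[j - 1] + (ca != cb)))
              prev = cur
          return prev[-1]

      def msd_error_rate(presented: str, transcribed: str) -> float:
          return edit_distance(presented, transcribed) / max(len(presented), len(transcribed))

      print(msd_error_rate("take two tablets daily", "take teo tablet daily"))  # ~0.09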

Enhancing Expert Containment, Decision Making, and Risk Communications: Virtual Reality Offshore Operations Training Infrastructure

  • This research is based on the Deepwater Horizon operations, specifically the BP oil spill in the Gulf of Mexico. We are investigating ways for operators to effectively perceive imminent risk and communicate with one another. The figures show the real-time display of indicators before the accident and the analysis of the operators' verbal responses as they interrogated the designated areas of interest (AOIs 1-7).


  • Researchers:
    PI: Dr. Saeed Salehi
    Co-PIs: Drs. Ziho Kang and Edward Cokely
    Ms. Jiwon Jun
    Mr. Vincent Ybarra
    Mr. Raj Kiran

Marketing Strategies: Investigation of human behavior when watching YouTube commercials

  • We are investigating humans' visual attention behavior when observing commercials on YouTube. As a very simple example, we are visually attentive to an interesting clip (lower left figure) but are eager to skip a clip if it does not grasp our attention (lower right figure).


  • Researchers:
    Lead: Dr. Doyle Yoon
    Dr. Ziho Kang
    Mr. Fuwei Sun
    Ms. Morgan Corn

Haptic Cues: Predicting mouse or touchpad patterns

  • We have developed methods that use mouse or touchpad movement patterns as a control input.

  • Researchers:
    Lead: Dr. Ziho Kang
    Mr. Lucas Cezar