Inadequate situation awareness and response are increasingly recognized as prevalent critical errors leading to young driver crashes. To identify and assess key indicators of young driver performance (including situation awareness), we previously developed and validated a Simulated Driving Assessment (SDA) in which drivers are safely and reproducibly exposed to a set of common and potentially serious crash scenarios. Many of the standardized safety measures can be calculated in near real-time from simulator variables; assessment of situation awareness, however, largely relies on time-consuming data reduction and video coding. The objective of this research was therefore to develop a near real-time automated method for analyzing the general direction and location of the driver's gaze in order to assess situation awareness. Head tracking was employed as a proxy for gaze, and computer-readable patterns displayed at the corners of the simulator monitors provided fixed reference locations within the simulator display. The analysis system algorithmically detected whether each unique pattern was in the driver's field of view and computed a homography transformation from the camera view to each of the three screens. This transformation standardized the gaze-tracking data streams for each simulator screen and generated corrected scene-view videos for manual validation. All software and immediate dependencies are open source. We verified that our automated procedure, called SimGaze, (1) produced results comparable to those produced by hand coding of well-calibrated videos and (2) operated in near real-time, reducing the researcher time required for analysis and improving the simulator report.
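The camera-to-screen mapping described above can be illustrated as a planar homography estimated from the detected corner-pattern locations. The following is a minimal NumPy sketch using the direct linear transform (DLT); all coordinates, names, and the 1920x1080 screen size are hypothetical, and it stands in for whatever library routine (e.g. an OpenCV equivalent) the actual SimGaze pipeline uses:

```python
import numpy as np

def estimate_homography(src, dst):
    """Estimate the 3x3 homography H mapping src points to dst points
    via the direct linear transform (DLT); needs >= 4 correspondences."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of A, i.e. the last right-singular vector.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # normalize so H[2, 2] == 1

def map_point(H, pt):
    """Project a 2D point through the homography (homogeneous divide)."""
    p = H @ np.array([pt[0], pt[1], 1.0])
    return p[:2] / p[2]

# Hypothetical pattern centers detected in the head-camera image (pixels)
camera_corners = [(212.0, 148.0), (1065.0, 160.0), (1049.0, 698.0), (230.0, 710.0)]
# Known pattern positions on one simulator screen (screen pixels)
screen_corners = [(0.0, 0.0), (1920.0, 0.0), (1920.0, 1080.0), (0.0, 1080.0)]

H = estimate_homography(camera_corners, screen_corners)
# Map a head-tracked gaze estimate from camera coordinates onto the screen
gaze_on_screen = map_point(H, (640.0, 420.0))
```

One such homography per screen lets gaze samples from the moving head camera be expressed in a fixed per-screen coordinate frame, which is what makes the data streams comparable across frames and participants.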