As vehicle safety performance has improved, the number of accidents has decreased. However, accidents caused by driver distraction continue to occur, so there remains a strong need to determine whether a driver is properly attending to the surroundings. Moreover, with the recent trend toward partially automated driving, understanding the driver's state has become increasingly important; whether or not a vehicle is driving autonomously, systems are expected to assess the driver's state and adapt vehicle control accordingly. Against this background, we have developed an algorithm that estimates a driver's gaze point as a basic means of judging whether the driver is in a state suitable for safe driving. The algorithm determines whether the driver is facing forward using information such as face and gaze direction angles computed from images captured by a grayscale camera mounted on the steering column. In this paper, we report an evaluation of this algorithm, installed in an actual vehicle, under basic conditions in which test subjects gazed at various targets.
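The forward-facing determination described above can be sketched as a simple threshold test on the estimated angles. This is a minimal illustrative sketch only: the function name, the angle conventions (yaw/pitch in degrees, zero meaning straight ahead relative to the steering-column camera), and the threshold values are assumptions for illustration, not values or logic taken from the published algorithm.

```python
def is_facing_front(face_yaw_deg: float, face_pitch_deg: float,
                    gaze_yaw_deg: float, gaze_pitch_deg: float,
                    yaw_limit: float = 15.0,
                    pitch_limit: float = 10.0) -> bool:
    """Return True when both the face direction and the gaze direction
    fall inside a forward-looking angular window.

    All angles are in degrees; 0 means straight ahead relative to the
    camera on the steering column. The limits are illustrative.
    """
    face_ok = abs(face_yaw_deg) <= yaw_limit and abs(face_pitch_deg) <= pitch_limit
    gaze_ok = abs(gaze_yaw_deg) <= yaw_limit and abs(gaze_pitch_deg) <= pitch_limit
    return face_ok and gaze_ok

# Driver looking roughly straight ahead -> treated as facing front
print(is_facing_front(3.0, -2.0, 5.0, 1.0))
# Driver looking well off to the side -> not facing front
print(is_facing_front(30.0, 0.0, 25.0, 0.0))
```

In practice the angle estimates would come from a head-pose and gaze-estimation stage running on the camera image, and a real system would also smooth the per-frame decisions over time to suppress noise.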