With improvements in vehicle safety performance, the number of accidents has been decreasing. However, accidents caused by driver distraction still occur, so there is a strong need to determine whether a driver is properly watching the surroundings. Moreover, with the recent trend toward partially automated driving, monitoring the driver's state has become an urgent requirement. Even in vehicles without automated driving, it is desirable to monitor the driver's state and apply controls accordingly. Under these circumstances, we have built an algorithm that determines the direction in which a driver is looking, as a basic judgment of whether the driver is in a state suitable for safe driving. The algorithm determines whether the driver is facing forward using information such as the face and gaze direction angles calculated from images captured by a grayscale camera installed on the steering column. Here we report on this algorithm, including tests under basic conditions in which test subjects gazed at various targets in an actual vehicle equipped with a device running the algorithm.
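To illustrate the kind of forward-facing judgment described above, the following is a minimal sketch in Python. The function name, the separation into yaw/pitch components, and the threshold values (a forward cone of roughly ±15° yaw and ±10° pitch) are illustrative assumptions for exposition only; the paper does not specify the actual angle limits or decision logic used in the implemented device.

```python
def is_facing_forward(face_yaw_deg, face_pitch_deg,
                      gaze_yaw_deg, gaze_pitch_deg,
                      yaw_limit_deg=15.0, pitch_limit_deg=10.0):
    """Judge whether the driver is facing forward.

    Takes the face direction and gaze direction angles (in degrees,
    0 = straight ahead toward the camera on the steering column) and
    returns True when both directions fall inside an assumed forward
    cone. The limits are hypothetical, not values from the paper.
    """
    face_forward = (abs(face_yaw_deg) <= yaw_limit_deg
                    and abs(face_pitch_deg) <= pitch_limit_deg)
    gaze_forward = (abs(gaze_yaw_deg) <= yaw_limit_deg
                    and abs(gaze_pitch_deg) <= pitch_limit_deg)
    return face_forward and gaze_forward


# Example: face nearly straight, gaze slightly left -> judged forward.
print(is_facing_forward(2.0, -1.0, 8.0, 3.0))   # True
# Example: head turned far to the side -> judged not forward.
print(is_facing_forward(40.0, 0.0, 5.0, 0.0))   # False
```

In practice such a per-frame decision would typically be smoothed over time (e.g., requiring the non-forward state to persist for some duration) before raising a distraction warning, but that temporal logic is beyond this sketch.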