Improvements in computer image processing and identification capability have led to programs that can rapidly perform calculations and model the three-dimensional spatial characteristics of objects directly from photographs or video frames. This process, known as structure-from-motion or image-based scanning, is a photogrammetric technique that analyzes features across photographs or video frames taken from multiple angles to create dense surface models or point clouds. Concurrently, unmanned aircraft systems have gained widespread popularity due to their reliability, low cost, and relative ease of use. These aircraft systems allow video or still photographs of subjects to be captured from unique perspectives.

This paper explores the efficacy of combining a point cloud created from unmanned aerial vehicle video footage with traditional single-image photogrammetry methods to recreate physical evidence at a crash scene. The unique perspectives of photographs or video taken with unmanned aircraft systems ease some of the challenges of creating point cloud data from ground-level footage. To evaluate the accuracy of this process, the authors constructed a mock scene containing physical evidence typical of vehicular crashes. The scene was scanned with a FARO laser scanner and photographed. The evidence was then removed, and video of the scene was recorded from an unmanned aerial vehicle. That video footage was processed with image-based scanning software to create a point cloud, and the point cloud was used to determine the positions and characteristics of the camera at the time the evidence was photographed. The evidence was then reconstructed with traditional single-image photogrammetry techniques, and the position and size of the reconstructed evidence were compared to the actual positions as documented by the FARO scanner.
Through this process, the authors determined that unmanned aerial vehicle footage, combined with image-based scanning software, can be used to accurately reconstruct the locations of physical evidence.
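The core geometric step described above, recovering a camera pose from the point cloud and then locating ground-level evidence from a single image, can be illustrated with a minimal back-projection sketch. This is not the paper's implementation; the pinhole model, the flat ground plane assumption (z = 0), and all intrinsic and pose values below are hypothetical placeholders chosen for illustration.

```python
# Hedged sketch: locating a ground-level evidence mark by back-projecting a
# pixel through a camera whose pose was recovered from the SfM point cloud.
# Assumes a simple pinhole camera and a flat scene on the world plane z = 0;
# all numbers are hypothetical, not values from the paper's actual scene.

def mat_vec(R, v):
    """Multiply a 3x3 matrix by a 3-vector."""
    return [sum(R[i][j] * v[j] for j in range(3)) for i in range(3)]

def back_project_to_ground(u, v, fx, fy, cx, cy, R, C):
    """Cast a ray from camera center C through pixel (u, v) and
    intersect it with the ground plane z = 0 in world coordinates."""
    # Ray direction in camera coordinates (pinhole model).
    d_cam = [(u - cx) / fx, (v - cy) / fy, 1.0]
    # Rotate into world coordinates; R is the camera-to-world rotation.
    d = mat_vec(R, d_cam)
    if abs(d[2]) < 1e-12:
        raise ValueError("ray is parallel to the ground plane")
    # Solve C_z + t * d_z = 0 for the ray parameter t.
    t = -C[2] / d[2]
    return [C[i] + t * d[i] for i in range(3)]

# Hypothetical example: camera 30 m above the origin, looking straight down.
# The camera-to-world rotation maps the camera's optical axis onto world -z.
R = [[1, 0, 0],
     [0, -1, 0],
     [0, 0, -1]]
C = [0.0, 0.0, 30.0]
# A pixel at the principal point maps to the ground directly below the camera.
p = back_project_to_ground(640, 480, 1000.0, 1000.0, 640.0, 480.0, R, C)
print(p)  # [0.0, 0.0, 0.0]
```

In the workflow the abstract describes, the camera pose (R, C) and intrinsics would come from the image-based scanning software's solution over the point cloud rather than being specified by hand, and the reconstructed evidence positions would then be compared against the FARO scan.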