Recent advances in Small Unmanned Aerial Systems (SUAS), or drone technologies, have resulted in their widespread use in a number of civilian applications, such as aerial imaging, infrastructure inspection, and precision agriculture. While this technology is accessible to everyone, operating these drones safely and efficiently still requires a highly skilled operator. At the same time, developments in Virtual/Augmented Reality (V/AR) technologies present opportunities for combining the two into novel applications and use cases while providing an intuitive interface for interacting with the drones; this opens up possibilities for effective use of drones by relatively untrained operators in civilian and military settings. This effort addresses the development and implementation of an interface that allows an operator wearing an Oculus Rift virtual reality headset, equipped with a Leap Motion controller, to control drones in a virtual reality environment and to translate that control to a real-world implementation in a motion capture setting. Supported actions include selecting drones, commanding take-off and landing, and moving them to a specified location. DroneKit-Python is used to communicate commands to the drones, while OptiTrack cameras and the NatNet SDK together provide the precise physical location of each drone in an indoor laboratory. Unreal Engine is the development platform used to create the virtual scene in which the operator resides. A QAV250 quadcopter from Lumenier will be used as the UAS platform, with either a Pixhawk or a Navio2 flight controller, interfaced with a Raspberry Pi 3 Single Board Computer (SBC) as the companion computer. Within the VR interface, a heads-up display presents the user with various tools and movement controls for selecting a single drone or multiple drones through gestures or from a list.
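As a minimal sketch of the gesture-to-command pipeline described above, the fragment below maps recognized hand gestures to drone commands that a dispatcher would then issue through DroneKit-Python. The gesture names, command labels, and default behavior are illustrative assumptions, not taken from the implementation; the comments note the DroneKit-Python calls (simple_takeoff, simple_goto, VehicleMode) that each command would ultimately invoke.

```python
# Hypothetical mapping from Leap Motion gestures to high-level drone
# commands. Gesture and command names are assumptions for illustration;
# the real interface defines its own gesture vocabulary.
GESTURE_COMMANDS = {
    "palm_up":   "TAKEOFF",  # would call vehicle.simple_takeoff(alt)
    "palm_down": "LAND",     # would set VehicleMode("LAND")
    "point":     "GOTO",     # would call vehicle.simple_goto(location)
    "grab":      "SELECT",   # marks a drone model as selected in the HUD
}

def dispatch(gesture: str) -> str:
    """Translate a recognized gesture into a command label.

    Unrecognized gestures fall back to HOVER (an assumed safe default),
    so a misread hand pose never issues a motion command.
    """
    return GESTURE_COMMANDS.get(gesture, "HOVER")
```

For example, `dispatch("palm_up")` yields `"TAKEOFF"`, while an unrecognized pose such as `"wave"` falls back to `"HOVER"`. Keeping this lookup pure and separate from the DroneKit calls makes the gesture logic easy to unit-test without a connected vehicle.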