Recreating traffic scenarios for testing autonomous driving in the real world requires significant time, resources, and expense, and can pose a safety risk when hazardous scenarios are tested. A 3D virtual environment that allows many of these traffic scenarios to be tested on the desktop or on a cluster significantly reduces the number of required road tests. To facilitate the development of perception and control algorithms for level 4 autonomy, with potential applications to level 2 active safety systems as well, we present a shared memory interface between MATLAB/Simulink and Unreal Engine 4, such that perception and/or control algorithms running within or interfacing with MATLAB/Simulink can receive virtual sensor data generated in an Unreal Engine 3D virtual environment and send information such as vehicle control signals back to the virtual environment. The shared memory interface has been demonstrated to convey arbitrary numerical data, RGB image data, and point cloud data for the simulation of LiDAR sensors. The interface consists of a plugin for Unreal Engine containing the necessary read/write functions, and a beta toolbox for MATLAB capable of reading from and writing to the same shared memory locations specified in Unreal Engine and MATLAB/Simulink. The LiDAR sensor model has been tested by generating point clouds with beam patterns that mimic the Velodyne HDL-32E (32-beam) sensor, and it has been demonstrated to run at frame rates sufficient for real-time computation by leveraging the Graphics Processing Unit (GPU).
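The exchange pattern described above can be illustrated with a minimal, self-contained sketch: one process writes an RGB frame into a named shared memory segment and another maps the same segment and reads it back. This is not the paper's actual plugin or toolbox API; the segment name `ue4_camera_0`, the 4-byte frame-counter header, and the fixed 640x480 frame size are assumptions made for this example only.

```python
# Illustrative sketch of named-shared-memory image exchange (assumed
# layout: 4-byte little-endian frame counter followed by raw RGB pixels).
import numpy as np
from multiprocessing import shared_memory

HEIGHT, WIDTH = 480, 640
FRAME_BYTES = HEIGHT * WIDTH * 3

# Writer side (in the paper, the Unreal Engine plugin fills this role):
shm = shared_memory.SharedMemory(name="ue4_camera_0", create=True,
                                 size=4 + FRAME_BYTES)
frame = np.zeros((HEIGHT, WIDTH, 3), dtype=np.uint8)
frame[:, :, 1] = 255                               # solid green test image
shm.buf[0:4] = (1).to_bytes(4, "little")           # frame counter
shm.buf[4:4 + FRAME_BYTES] = frame.tobytes()       # pixel payload

# Reader side (in the paper, the MATLAB/Simulink toolbox fills this role):
rd = shared_memory.SharedMemory(name="ue4_camera_0")
count = int.from_bytes(rd.buf[0:4], "little")
img = np.frombuffer(rd.buf, dtype=np.uint8,
                    count=FRAME_BYTES, offset=4).reshape(HEIGHT, WIDTH, 3).copy()

print(count, img.shape, img[0, 0].tolist())

rd.close()
shm.close()
shm.unlink()
```

In a real deployment the writer and reader run in separate processes, so a synchronization convention (such as polling the frame counter shown here) is needed to avoid reading a partially written frame.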