The Synthetic Vision System Technology Demonstration is a joint FAA/DOD/Industry project to document and demonstrate aircraft sensor and system performance achieved by pilots using fog-penetrating millimeter wave (MMW) sensors, a Forward Looking IR (FLIR) sensor, and a Head-Up Display (HUD). These on-board sensors provide a real-time image of the runway, which is combined with symbology and alphanumerics and displayed on the HUD. The display gives the pilot all the information needed to land, roll out, and taxi in low-visibility conditions. The project has two major parts: testing the sensors under measured conditions from a tower overlooking an unused airfield, and testing sensor and pilot performance in a test aircraft.

Sensor performance in clear air and in weather was measured at the Air Force Avionics Directorate Tower Test Facility at Wright-Patterson AFB, Ohio. Flight data are being collected during approaches to a variety of airports, some with calibrated scenes, using a specially configured Gulfstream II as the test aircraft. Digital frames from selected portions of each approach are analyzed according to metrics developed for image quality assessment. Pilot performance is measured using established workload and performance rating tools.

Results to date show that the MMW sensor penetrates the fogs encountered and, used in combination with HUD symbology, provides an image suitable for landing the aircraft. The FLIR sensor provides an excellent image of the runway and surroundings in clear weather and in haze, but its performance was significantly degraded in clouds, fog, and rain. Pilots had no difficulty using the MMW image on the HUD, combined with symbology, to make low-workload approaches, landings, and take-offs in simulated zero-ceiling, zero-visibility conditions and in actual instrument meteorological conditions down to and including 700 feet visibility.
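The abstract does not specify which image quality metrics were applied to the recorded frames. As a minimal illustrative sketch only (the function names and metric choices below are assumptions, not the project's actual metrics), two generic metrics commonly used for sensor imagery are RMS contrast of a frame and the target-to-background contrast of a region of interest such as the runway:

```python
from statistics import mean, pstdev

def rms_contrast(pixels):
    """RMS contrast: population standard deviation of pixel
    intensities normalized by their mean. Higher values suggest
    a more distinguishable scene."""
    m = mean(pixels)
    return pstdev(pixels) / m if m else 0.0

def target_to_background_ratio(pixels, mask):
    """Contrast of a region of interest (e.g. runway pixels,
    selected by a boolean mask) against its surroundings,
    computed as (L_target - L_background) / L_background."""
    target = [p for p, t in zip(pixels, mask) if t]
    background = [p for p, t in zip(pixels, mask) if not t]
    lb = mean(background)
    return (mean(target) - lb) / lb

# Synthetic example: a flattened strip of 64 pixels in which a
# bright "runway" band (intensity 120) sits on a dim background
# (intensity 40). Values are arbitrary illustration data.
frame = [40.0] * 56 + [120.0] * 8
mask = [False] * 56 + [True] * 8

print(round(rms_contrast(frame), 3))                     # ≈ 0.529
print(round(target_to_background_ratio(frame, mask), 3)) # 2.0
```

In practice such metrics would be computed on digitized frames at selected points of the approach and correlated with the measured weather conditions.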