Diminishing petroleum reserves and increasingly stringent emission targets worldwide have forced the automotive industry to move towards downsized, boosted, direct-injection engines. Boosted engines operate at high mean effective pressure (MEP), resulting in high in-cylinder pressures and thermal loads that can give rise to abnormal combustion events such as knock and pre-ignition. Because these events can damage engine components, the compression ratio and boost pressure are restricted, which in turn limits engine efficiency and power. To mitigate conditions in which the engine is prone to knock, the engine control system applies spark retard or mixture enrichment, both of which decrease indicated work and increase specific fuel consumption. Several researchers have advocated water injection as a replacement for these knock-mitigation techniques; the first studies of its potential for knock inhibition trace back to Ricardo's work in the early 1930s. Water, with its high latent heat of vaporization, acts as a heat sink and lowers temperatures in the end-gas zone, reducing the tendency for auto-ignition. Added water also changes the ratio of specific heats of the charge mixture and slightly dilutes the oxygen concentration; together these changes greatly reduce the tendency to knock or detonate, and NOx emissions can also be reduced. The optimum injection strategy for maximizing these benefits remains debatable, in part because the latent heat of vaporization decreases as pressure increases. By improving knock resistance, water injection can potentially allow engine designs with higher compression ratios and boost pressures to operate at maximum brake torque spark timing under all operating conditions. This paper examines the history of water injection research from its inception and evaluates the effects of water injection in GDI engines.
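As a rough illustration of the heat-sink effect described above, a simple energy balance can estimate the bulk charge-temperature drop produced when injected water fully evaporates. The sketch below uses illustrative values (stoichiometric gasoline AFR of ~14.7, latent heat of ~2.26 MJ/kg near atmospheric pressure, a lumped charge specific heat) that are assumptions for demonstration, not data from this paper; in a real boosted engine the latent heat falls with pressure and evaporation may be incomplete.

```python
# Rough energy-balance estimate of charge cooling from water evaporation.
# All numeric values are illustrative assumptions, not data from the paper.

def charge_temp_drop(m_fuel, water_fuel_ratio, afr=14.7,
                     h_fg_water=2.26e6, cp_charge=1050.0):
    """Estimate the bulk temperature drop (K) of the in-cylinder charge
    when the injected water fully evaporates.

    m_fuel           fuel mass per cycle [kg]
    water_fuel_ratio injected water mass / fuel mass [-]
    afr              air-fuel ratio by mass (stoichiometric gasoline ~14.7)
    h_fg_water       latent heat of vaporization of water [J/kg]
                     (~2.26 MJ/kg at 100 C; decreases as pressure rises)
    cp_charge        lumped specific heat of the air-fuel charge [J/(kg K)]
    """
    m_water = water_fuel_ratio * m_fuel
    m_charge = m_fuel * (1.0 + afr)   # trapped fuel + air mass
    # Heat absorbed by evaporating water, spread over the whole charge:
    return m_water * h_fg_water / (m_charge * cp_charge)

# Example: 30 mg of fuel per cycle with a water/fuel mass ratio of 0.3
dT = charge_temp_drop(30e-6, 0.3)
print(f"Estimated charge cooling: {dT:.0f} K")
```

Even this crude estimate suggests cooling of several tens of kelvin for moderate water-to-fuel ratios, which is consistent with the heat-sink argument for suppressing end-gas auto-ignition.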