Charge air cooling in turbocharged diesel engines is a widely used practice to reduce the temperature of the engine intake air, increasing air density and thereby improving cylinder filling and engine volumetric efficiency. Usually, the charge air is cooled by ambient air crossing a heat exchanger placed at the front of the vehicle: its cooling capacity depends on vehicle speed and on packaging constraints (e.g. the presence of the main radiator). This leads to an intake air temperature in the range of 30-70°C, depending on engine load, ambient conditions and, again, vehicle speed. If the intake air were cooled further, engine volumetric efficiency would increase further. This can only be achieved with a dedicated cooling fluid operating at a lower temperature (-10 to 0°C). In this paper, therefore, an evaporator was placed on the intake line of a turbocharged diesel engine tested on a high-speed dynamometer bench: the evaporator was part of an air refrigeration unit - the same used for cabin cooling - which also comprises a compressor, a condenser and a thermostatic expansion valve. The effects of undercooling the charge air were experimentally assessed in terms of fuel consumption and regulated emission reduction, evaluated on the most common homologation cycles (for passenger cars) or at fixed engine operating points (for heavy-duty vehicles). The evaporator size was optimized with respect to the geometrical constraints set by the intake pipe dimensions and to the pressure drop introduced into the intake line. Pressure losses, in fact, reduce engine volumetric efficiency, counteracting the improvement provided by the cooling of the air. The mechanical power absorbed by the compressor was, of course, taken into account in order to assess the overall benefits. A fuel consumption reduction on the order of 2% on the NEDC was demonstrated when the intake temperature decreases by about 20°C. A benefit on regulated emissions (NOx, CO, PM) was also observed.
HC behavior, by contrast, deserves further attention.
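As an illustrative sketch only (not part of the paper's analysis), the ideal gas law indicates roughly how much charge density, and hence cylinder filling, improves when the intake air is cooled at constant boost pressure; the temperatures below are hypothetical example values chosen to match the ~20°C undercooling discussed above:

```python
# Illustrative sketch with hypothetical values (not measured data):
# at constant manifold pressure the ideal gas law gives rho ~ 1/T,
# so cooling the charge air raises its density proportionally.

def density_gain(t_hot_c: float, t_cold_c: float) -> float:
    """Fractional density increase when charge air is cooled
    from t_hot_c to t_cold_c (in degrees C) at constant pressure."""
    t_hot_k = t_hot_c + 273.15   # convert to absolute temperature
    t_cold_k = t_cold_c + 273.15
    return t_hot_k / t_cold_k - 1.0

# Example: cooling from an assumed 50 C (typical air-to-air
# intercooler outlet) down to 30 C, i.e. ~20 C of undercooling.
gain = density_gain(50.0, 30.0)
print(f"density increase: {gain:.1%}")  # about 6.6%
```

This first-order density gain is an upper bound on the volumetric-efficiency benefit: as the abstract notes, the pressure drop of the evaporator and the mechanical power absorbed by the refrigeration compressor both offset part of it.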