Improving fuel economy has been a key focus across the automotive and commercial truck industries for several years, if not decades. In heavy-duty commercial vehicles, even small gains in fuel economy lead to significant savings for fleets as well as owner-operators. Additionally, regulations require vehicles to meet greenhouse gas (GHG) emission levels that closely translate to vehicle fuel economy. For current state-of-the-art fuel economy (FE) technologies, incremental gains are so small that they are difficult to measure on an actual vehicle, and engineers must contend with a high level of variability when making informed decisions. In such cases, highly controlled tests on engine and powertrain dynamometers are used; however, even these tests carry variability due to factors such as part-to-part differences, fuel blends and quality, and dynamometer control capabilities. This variability grows dramatically during controlled vehicle track testing, and further still in customer trucks, where driver habits and environmental factors can contribute significantly to the final realized fuel economy. Although this information is intuitive, little literature exists that attempts to quantify it. This paper describes the variability observed in engine test cells, powertrain dynamometers, test tracks, and real-world driving by statistically analyzing data from the past several years. The contributions of possible factors such as idle time, fuel quality, ambient conditions, and changes in road friction were then quantified using vehicle-level simulations and other modeling techniques. Statistical methods for analyzing and comparing data at various stages of product development are discussed. End customers should find this information useful for determining whether observed fuel economy discrepancies are controllable or uncontrollable, and for taking corrective action where possible. Policymakers will be able to use it to assess how regulatory levels eventually pan out under real-world operating conditions.