Operation of spark-ignition (SI) engines with high levels of charge dilution through exhaust gas recirculation (EGR) achieves significant efficiency gains while maintaining stoichiometric operation for compatibility with three-way catalysts. Dilution levels, however, are limited by cyclic variability, including significant numbers of misfires, which becomes more pronounced with increasing dilution. This variability has been shown to have both stochastic and deterministic components. Stochastic effects include turbulence and mixing variations, while the deterministic effect arises primarily from the nonlinear dependence of flame propagation rates and ignition characteristics on the charge composition, which is influenced by the composition of residual gases from prior cycles. The presence of determinism implies that an improved understanding of the dynamics of such systems could lead to effective control approaches that allow operation near the edge of stability, effectively extending the dilution limit. This nonlinear dependence has been characterized previously for homogeneous-charge, port-fuel-injected (PFI) SI engines operating fuel-lean as well as with inert diluents such as bottled N₂ gas. In this paper, cyclic dispersion in a modern boosted gasoline direct injection (GDI) engine using a cooled external EGR loop is examined, and the potential for improvement with effective control is evaluated using symbol-sequence statistics and other techniques from chaos theory. Observations on the potential implications of these results for control approaches that could enable engine operation at the edge of combustion stability are noted.
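To make the symbol-sequence approach concrete, the sketch below shows one common form of the technique applied to a cycle-resolved combustion metric such as IMEP or heat release. All names, the equiprobable two-symbol partition, and the synthetic data are illustrative assumptions, not the paper's actual implementation: the series is symbolized by percentile binning, overlapping symbol windows are encoded as integers, and a normalized (modified) Shannon entropy of the sequence histogram is computed, where values well below that of a random series suggest deterministic temporal structure.

```python
import numpy as np

def symbol_sequence_histogram(x, n_symbols=2, seq_len=3):
    """Relative frequencies of all overlapping symbol sequences in x."""
    # Equiprobable partition: bin edges at interior percentiles
    edges = np.percentile(x, np.linspace(0, 100, n_symbols + 1)[1:-1])
    symbols = np.digitize(x, edges)  # values in {0, ..., n_symbols-1}
    # Encode each window of seq_len symbols as a base-n_symbols integer
    n_windows = len(symbols) - seq_len + 1
    codes = np.zeros(n_windows, dtype=int)
    for i in range(seq_len):
        codes = codes * n_symbols + symbols[i:n_windows + i]
    counts = np.bincount(codes, minlength=n_symbols ** seq_len)
    return counts / counts.sum()

def modified_shannon_entropy(freq):
    """Shannon entropy normalized to [0, 1] by the number of possible sequences."""
    nz = freq[freq > 0]
    return -np.sum(nz * np.log(nz)) / np.log(len(freq))

rng = np.random.default_rng(0)
# Purely stochastic surrogate series (no cycle-to-cycle memory)
x_random = rng.normal(size=2000)
# Series with strong deterministic alternation plus small noise
x_determ = np.sin(np.arange(2000) * np.pi / 2) + 0.1 * rng.normal(size=2000)

h_random = modified_shannon_entropy(symbol_sequence_histogram(x_random))
h_determ = modified_shannon_entropy(symbol_sequence_histogram(x_determ))
```

In practice, the entropy of the measured engine series is compared against an ensemble of randomly shuffled surrogates; a significantly lower entropy, as `h_determ` shows relative to `h_random` here, indicates deterministic structure that a controller could in principle exploit.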