Concepts of forensic soundness, as currently understood in the field of digital forensics, are applied to the digital data stored on heavy vehicle electronic control modules (ECMs). An assessment of forensic soundness addresses: 1) the integrity of the data, 2) the meaning of the data, 3) the processes for detecting or predicting errors, 4) the transparency of the operation, and 5) the expertise of the practitioners. The integrity of the data can be verified using cryptographic hash functions. Interpreting and understanding the meaning of the data is based on standards or on manufacturer software. Comparisons of interpreted ECM data to external reference measurements are reviewed from the current literature. Meaning is also extracted by interpreting hexadecimal data according to the J1939 and J1587 standards. Error detection and mitigation strategies are discussed in the form of sensor simulators that eliminate artificial fault codes. A transparent process for gathering and handling data is discussed. The need for improved techniques is motivated through examples of manipulated data and an analysis of the opportunities that exist to alter the data. As an example, a step-by-step process for changing the records in a DDEC Reports .XTR file is provided. A detailed examination of resetting the ECM clock is also presented, which motivates the design of a hardware write-blocking device. Recommendations for producing more forensically sound records of ECM data are outlined. The recommended strategy records and hashes the network traffic to create a verification reference for later use. The recorded data is then used in a replay algorithm so that the diagnostic software can recreate its reports from the forensic copy of the network traffic. Finally, applications of digital forensics beyond accident reconstruction are noted.
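Two of the ideas summarized above lend themselves to a brief illustration: verifying data integrity with a cryptographic hash, and extracting meaning from hexadecimal network data per the J1939 standard. The sketch below is illustrative only, not the paper's implementation; the function names are invented, the capture is assumed to be an ordinary file on disk, and the J1939 decode uses the widely published layout of SPN 190 (engine speed) in the EEC1 message, PGN 61444: bytes 4–5, little-endian, 0.125 rpm per bit.

```python
import hashlib


def sha256_of_file(path, chunk_size=65536):
    """Compute the SHA-256 digest of a recorded network-traffic capture.

    Hashing the capture at acquisition time creates the verification
    reference; recomputing the digest later demonstrates integrity.
    """
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()


def verify_capture(path, recorded_digest):
    """Return True if the capture still matches the digest recorded at acquisition."""
    return sha256_of_file(path) == recorded_digest


def engine_speed_rpm(eec1_payload):
    """Decode SPN 190 (engine speed) from an 8-byte EEC1 (PGN 61444) payload.

    Per SAE J1939-71: bytes 4-5, least-significant byte first,
    with a resolution of 0.125 rpm per bit.
    """
    raw = eec1_payload[3] | (eec1_payload[4] << 8)
    return raw * 0.125
```

For example, an EEC1 payload carrying `0x68 0x13` in bytes 4–5 decodes to a raw count of 4968, i.e. 621.0 rpm; any post-acquisition change to the capture file, however small, produces a different SHA-256 digest and fails verification.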