This paper introduces a “Direct Discrete” (DD) model calibration and uncertainty propagation approach for handling aleatory variability and epistemic uncertainty in replicate tests of stochastic systems and in computational models calibrated to the experimental data. The DD approach appears to offer several advantages over Bayesian and other calibration and uncertainty propagation approaches for capturing and using the inference information obtained from the typically small number of experiments available in model calibration. In particular, the DD approach preserves the fundamental information in the sparse sample data in a way that enables model predictions to represent the sparse-data aleatory-epistemic uncertainty differently according to what best supports inference accuracy for different quantities, such as the central 95% of the response versus a small exceedance probability in a tail of the response distribution. The DD methodology is explained and demonstrated on several 1D monotonic calibration problems from the Sandia Cantilever Beam End-to-End UQ problem. For such problems, the DD methodology is conceptually simpler than Bayesian calibration approaches and simpler to implement; Bayesian approaches can also exhibit significant analyst-to-analyst variability in the prior distributions proposed for the calibration variables. However, for calibration problems with non-unique parameter sets that match the experimental data, the DD method can become considerably more complex (on par with Bayesian approaches) in order to achieve prediction robustness to parameter-set non-uniqueness. Nonetheless, the DD approach appears more feasible for calibration to sparse realizations of random function data (e.g., stress-strain curves) and random field data.
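The abstract does not spell out the DD algorithm, but one natural reading of a 1D monotonic calibration problem is: invert the monotonic model once per replicate test to obtain one discrete parameter sample per experiment, then propagate those discrete samples directly (no parametric distribution fitted to them). The sketch below illustrates only that generic idea, not the paper's actual method; the response model, data values, and function names are all hypothetical.

```python
def bisect_invert(f, y, lo, hi, tol=1e-10):
    """Invert a monotonically increasing f on [lo, hi] so that f(x) ~= y."""
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if f(mid) < y:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Hypothetical monotonic response model: beam-tip deflection vs. stiffness.
def response(stiffness, load=100.0):
    return load / stiffness

# Sparse replicate deflection measurements (hypothetical data).
replicates = [0.95, 1.02, 1.10, 0.98]

# One calibrated stiffness per replicate; response() decreases in stiffness,
# so negate it to present an increasing function to the bisection routine.
calibrated = [
    bisect_invert(lambda k: -response(k), -y, 10.0, 1000.0)
    for y in replicates
]

# Propagate each discrete parameter sample to a new prediction condition.
# The resulting ensemble carries the sparse-data uncertainty forward directly.
predictions = [response(k, load=150.0) for k in calibrated]
```

Because each experiment maps to exactly one parameter sample here, the ensemble of predictions is as sparse as the data; how such an ensemble is post-processed for, say, tail-probability inference is where the paper's methodology would come in.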