Purpose: To track cardiac-autonomic functioning, indexed by heart-rate variability, in American college football players throughout a competitive period.

Methods: Resting heart rate (RHR) and the natural logarithm of the root mean square of successive differences (LnRMSSD) were obtained throughout preseason and ∼3 times weekly leading up to the national championship among 8 linemen and 12 nonlinemen. Seated 1-minute recordings were performed via mobile device and standardized for time of day and proximity to training.

Results: Relative to preseason, linemen exhibited suppressed LnRMSSD during camp-style preparation for the playoffs (P = .041, effect size [ES] = −1.01), the week of the national semifinal (P < .001, ES = −1.27), and the week of the national championship (P = .005, ES = −1.16). As a combined group, increases in RHR (P < .001) were observed at the same time points (nonlinemen ES = 0.48–0.59, linemen ES = 1.03–1.10). For all linemen, RHR trended upward (positive slopes, R² = .02–.77) while LnRMSSD trended downward (negative slopes, R² = .02–.62) throughout the season. Preseason-to-postseason changes in RHR (r = .50, P = .025) and LnRMSSD (r = −.68, P < .001) were associated with body mass.

Conclusions: Heart-rate variability tracking revealed progressive autonomic imbalance in the lineman position group, with individual players showing suppressed values by midseason. Attenuated parasympathetic activation is a hallmark of impaired recovery and may contribute to cardiovascular maladaptations reported to occur in linemen following a competitive season. Thus, a descending pattern may serve as an easily identifiable red flag requiring attention from performance and medical staff.
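The LnRMSSD metric named in the Methods is the natural logarithm of the root mean square of successive differences between adjacent R-R intervals. A minimal sketch of that computation (the function name and the R-R values are illustrative, not data from the study):

```python
import math

def ln_rmssd(rr_intervals_ms):
    """Natural log of the root mean square of successive differences
    (RMSSD) of R-R intervals given in milliseconds."""
    # Successive differences between adjacent R-R intervals
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    # Root mean square of those differences
    rmssd = math.sqrt(sum(d * d for d in diffs) / len(diffs))
    return math.log(rmssd)

# Hypothetical 1-minute seated recording (R-R intervals, ms)
rr = [810, 850, 798, 905, 870, 842, 888, 860]
print(round(ln_rmssd(rr), 2))
```

The log transform is commonly applied because raw RMSSD values are right-skewed across athletes; LnRMSSD is closer to normally distributed, which suits the parametric comparisons reported in the Results.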
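The per-player trends in the Results (positive RHR slopes, negative LnRMSSD slopes, each with an R²) are ordinary least-squares fits of each measure against session order. A minimal sketch of that fit under that assumption (the function name and sample series are illustrative):

```python
def linear_trend(y):
    """Least-squares slope and R^2 of y regressed on session index 0..n-1."""
    n = len(y)
    x = list(range(n))
    mx, my = sum(x) / n, sum(y) / n
    # Sums of squares for the ordinary least-squares fit
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    # R^2 = 1 - residual sum of squares / total sum of squares
    ss_res = sum((yi - (intercept + slope * xi)) ** 2 for xi, yi in zip(x, y))
    ss_tot = sum((yi - my) ** 2 for yi in y)
    return slope, 1 - ss_res / ss_tot

# Hypothetical weekly LnRMSSD values for one lineman: a descending pattern
slope, r2 = linear_trend([4.25, 4.18, 4.20, 4.01, 3.92, 3.85])
print(slope < 0, 0 <= r2 <= 1)
```

A negative slope with a moderate-to-high R², as reported for the linemen, indicates a sustained week-over-week decline rather than isolated low readings.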