Purpose: To investigate the relationship between prescribed (preDI), perceived (perDI), and actual delivery intensity (actDI) in cricket pace bowling.

Methods: Fourteen male club-standard pace bowlers (mean [SD]: age 24.2 [3.2] y) completed 1 bowling session comprising 45 deliveries. The first 15 deliveries constituted the warm-up, in which participants bowled 3 deliveries each at a preDI of 60%, 70%, 80%, 90%, and 95%. Bowlers reported perDI after each delivery. The fastest delivery in the session served as the reference for calculating relative ball-release speed for the warm-up deliveries, and this measure represented actDI. Ball-release speed was measured with a radar gun.

Results: perDI had a very large relationship with preDI (rs = .90, P < .001), and actDI had a large relationship with preDI (rs = .52, P < .001). Concordance between perDI and preDI was higher from 60% to 80% preDI. actDI plateaued from 70% to 95% preDI.

Conclusions: perDI and actDI showed very large and large relationships, respectively, with preDI, indicating that both variables can be used to monitor delivery intensity against the planned intensity and thus support healthy training adaptation. The optimal preDI, at which pace bowlers operated at a submaximal perDI while still achieving close to maximal ball-release speeds, was 70%. Bowling at this preDI may substantially reduce the psychophysiological load per delivery in exchange for a trivial loss in ball-release speed.
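The actDI and correlation calculations described in the Methods can be sketched as follows. This is a minimal illustration with entirely hypothetical speed data (the values, group sizes per intensity level, and session maximum are assumptions, not the study's data); actDI is each delivery's ball-release speed expressed as a percentage of the fastest delivery in the session, and the preDI-actDI association is a Spearman rank correlation (implemented here in plain Python with tie-averaged ranks).

```python
# Hypothetical warm-up data: 3 deliveries at each prescribed intensity (%)
# and radar-gun ball-release speeds (km/h). Values are illustrative only.
pre_di = [60] * 3 + [70] * 3 + [80] * 3 + [90] * 3 + [95] * 3
speeds = [104, 106, 105, 112, 113, 111, 114, 113, 115,
          116, 115, 117, 117, 118, 116]
session_max = 120.0  # assumed fastest delivery in the full 45-ball session

# actDI: speed relative to the session's fastest delivery, as a percentage.
act_di = [100 * s / session_max for s in speeds]

def avg_ranks(x):
    """Ranks (1-based) with tied values assigned their average rank."""
    order = sorted(range(len(x)), key=lambda i: x[i])
    ranks = [0.0] * len(x)
    i = 0
    while i < len(x):
        j = i
        while j + 1 < len(x) and x[order[j + 1]] == x[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of positions i..j, 1-based
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = sum((x - ma) ** 2 for x in a) ** 0.5
    sb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (sa * sb)

# Spearman rs = Pearson correlation of the (tie-averaged) ranks.
rs = pearson(avg_ranks(pre_di), avg_ranks(act_di))
print(f"rs = {rs:.2f}")
```

With these made-up numbers the correlation comes out strongly positive, mirroring the direction (though not the exact values) reported in the Results.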