Context: Dynamic movement-based screens, such as the Landing Error Scoring System (LESS), are becoming more widely used in research and practical settings. To date, 3 studies have examined the reliability of the LESS, and all have reported good interrater and intrarater reliability. However, all 3 studies involved raters who were founders of the LESS. It is therefore unclear whether the reported reliability reflects what would be observed among practitioners who lack such specialized and intimate knowledge of the screen and who use only the standardized set of instructions. Objective: To investigate the interrater and intrarater reliability of the final score and the individual scoring criteria of the LESS. Design: Reliability protocol. Setting: Controlled laboratory. Participants: Two raters scored 30 male participants (age = 21.8 [3.9] y; height = 1.75 [0.46] m; mass = 75.5 [6.6] kg) involved in a variety of college sports. Main Outcome Measure: Two raters, using only the standardized scoring sheet, scored the videos independently of each other to assess the interrater reliability of the total score and the individual scoring criteria. The principal author scored the videos again 6 weeks later for the intrarater reliability component of the study. Intervention: Participants performed a drop-box landing from a 30-cm box, which was recorded with a video camera from the front and side views. Results: Interrater and intrarater reliability for the total scores were excellent (intraclass correlation coefficients = .95 and .96; SEM = 1.01 and 1.02, respectively). The individual scoring criteria of the LESS showed moderate to perfect agreement using kappa statistics (κ = .41–1.0). Conclusion: The final score and individual scoring criteria of the LESS have acceptable reliability with raters using the standardized scoring sheet. Practitioners using only the standardized scoring sheet can be confident that the LESS is a reliable tool.