The National Academy of Kinesiology 2020 Review and Evaluation of Doctoral Programs in Kinesiology

John H. Challis

The results of the 2020 review and ranking of U.S. doctoral programs in kinesiology conducted by the National Academy of Kinesiology (NAK) are presented. These results represent data collected for the 2015, 2016, 2017, 2018, and 2019 calendar years for 43 programs. The rankings reflect data collected on program faculty (productivity, funding, and visibility) and program students (admissions, support, publications, and employment). The data for each assessment index were first transformed into z scores, and the z scores were then converted into T-scores. Weights were applied to the T-scores of the indices, which were then summed to obtain a total T-score. Programs were ranked in two ways: one based on total T-scores from data not normalized for program size (unadjusted) and the other based on total T-scores from data normalized with respect to the number of faculty members in each program (adjusted). In addition to the program rankings, descriptive statistics are presented for the faculty and student data.

The National Academy of Kinesiology (NAK) first conducted a review of U.S. doctoral programs in kinesiology in 2005 and has repeated the review every lustrum (5 years) since. The aim of these reviews is to enhance the status of doctoral education in kinesiology (Thomas et al., 2007), and they also provide useful data for programs performing self-evaluations. This is the report of the NAK's fourth review, which reflects data from the period 2015 to 2019. Across all four reviews (Spirduso & Reeves, 2011; Thomas & Reeves, 2006; Ulrich & Feltz, 2016; and the current review), most elements of the review have remained unchanged.

The NAK started developing an evaluation tool in 1996; in 2000, it was piloted on 20 volunteer programs (Thomas et al., 2007). The NAK review ranking is based on nine indices related to program faculty and seven indices related to program students. While the NAK has now conducted four reviews of U.S. kinesiology doctoral programs, the National Research Council (NRC) has completed three reviews of U.S. doctoral programs (1982, 1995, and 2010), with kinesiology included as a field of study only in the most recent review (Ostriker, Kuh, & Voytuk, 2011). The NRC used 20 metrics, which are similar to the items in the NAK survey (Voytuk, Kuh, Ostriker, & National Research Council, 2003). The NRC provided two final ranks: the S-rank, which was based on survey data only, and the R-rank, which was based on a regression analysis of faculty rankings of programs. In contrast, the NAK ratings are based on a statistical analysis of survey data, with one ranking based on data unadjusted for program faculty size and the other on those data adjusted for program faculty size.

Alongside the 16 survey items used to determine program rank, the NAK also collects other data to obtain a description of U.S. doctoral programs. These data provide opportunities for program benchmarking. As in the previous review, programs reported the distribution of races of their doctoral students. The categories for identifying race were the same as in the last round of reviews and were based on categories formerly used by the National Institutes of Health. The National Institutes of Health has since updated these categories, and future NAK reviews will likely incorporate the revised categories. New items in this round of the review related to faculty and included aspects of workload (e.g., courses taught per semester and percentage of effort devoted to teaching, research, and service) and research impact (h-index and number of citations).

The NAK review is of doctoral programs, not departments. For many of the surveyed programs, the program was run from within a single department, with all faculty having appointments in that department, but this was not uniformly the case. Some programs are interdepartmental within a college, and others span departments in different colleges. While each program structure creates its own problems for collating data for the NAK survey, the program inclusion criteria are designed to be as inclusive as possible.

The purpose of this report is to describe the process and outcome of the NAK 2020 survey of U.S. kinesiology doctoral programs. The next section describes the methods, the following section presents the results, and the final section provides a discussion and summary of the survey. Two appendices are presented: Appendix A lists all participating institutions (and program titles) as well as the programs that were invited but elected not to participate, and Appendix B presents the Instructional Guide, which contains the information provided to programs for data submissions.

Method

In the following subsections, information is presented on data collection, data verification, indices and weighting factors, and data analysis. Two measurement experts were contracted to conduct the data analysis and thereby produce the ranking of programs.

Data Collection

Since the 2015 review, the Doctoral Program Evaluation Committee (DPEC) and the NAK Executive Committee have discussed the review process and the contents of the review. These discussions were augmented by open discussions among NAK fellows during the business meetings at the annual conference. The data collected that contribute to the ranking of programs remained the same as in the previous review (Ulrich & Feltz, 2016), but some additional data were collected to help characterize U.S. doctoral programs.

To identify programs that could participate in the review, previous program lists were augmented by the DPEC, providing a mailing list from which to solicit participants (see Appendix A). In addition, to publicize the review, notices were placed in professional newsletters and information was posted on the NAK website.

In August 2019, a letter highlighting the upcoming review was sent to chairs and deans of programs in the list of U.S. doctoral programs. In November 2019, letters were sent to program chairs and their deans initiating the review process. In January 2020, programs that completed the last review but had not agreed to participate in this round were contacted, as were any NAK fellows in those programs. Submissions were due in March 2020, with the requested data covering the calendar years 2015, 2016, 2017, 2018, and 2019.

An Instructional Guide was sent to all participating programs; it contained details on all required data and how they should be reported (Appendix B). New items not requested in previous reviews related to program faculty included the following: courses taught per semester; percentage effort for teaching, research, and service; h-index; and career number of citations. One new item related to program students was included: the number of admitted students. All data were submitted in an Excel file (Microsoft Corp., Redmond, WA) to ease data verification and analysis.

Data Verification

Several steps were taken to verify the submitted data. Immediately after submission, each submission was reviewed to confirm that all requested items had been received. For all submissions, the program chairperson signed to authenticate the faculty and student data. In addition, each program submitted a bibliography of all of the publications used to arrive at the total number of publications submitted by the program; a separate bibliography listed all of the book chapters from the program. On submission, the program chairperson and a budget officer signed to authenticate the reported funding data. One item in the submitted data is the number of members of the NAK; these data were confirmed against academy records of membership by institution.

All raw data for all indices were checked for outliers that may have occurred due to erroneous data entry; where required, specific verifications were requested from program chairs. A random selection of programs was reviewed for detailed verification of their publication lists. These bibliographies were checked for duplications, abstracts presented as full papers, papers published outside of the data collection window, and other anomalies. The sample size was predetermined by the DPEC and NAK Executive Committee to be 10% of the number of programs in the review, which resulted in n = 4. The analyzed bibliographies had a mean error rate of 2.7%, well below the actionable criterion established (10%).

Indices and Weighting Factors

In computing the overall rank of programs, the faculty indices contribute 66% and the student indices 34% of the total score. The weightings of these indices remained the same as in the previous round of the review (Ulrich & Feltz, 2016). The faculty-related indices can be divided into three categories (productivity, funding, and visibility); their weightings were as follows:

  • Faculty Indices—66%

    • Productivity—30%

      • Journal publications—20%

      • Book chapters—5%

      • Presentations—5%

    • Funding—26%

      • Federal research funds—15%

      • Nonfederal research funds—8%

      • Internal grant funds—3%

    • Visibility—10%

      • Editorial boards—6%

      • NAK fellow—2%

      • Other national fellowships—2%

The student-related indices can be divided into four categories (admissions, support, publications, and employment); their weightings were as follows (a sketch encoding the full weighting scheme appears after the list):

  • Student Indices—34%

    • Admissions—12%

      • Selectivity—2%

      • GRE: verbal—5%

      • GRE: quantitative—5%

    • Graduate assistant support—13%

    • Doctoral publications—2%

    • Employment—7%

      • Postdoctoral positions—4%

      • Employment in the field—3%
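To make the weighting arithmetic easy to reuse in benchmarking work, the sketch below encodes the index weights listed above as plain Python dictionaries and checks that they sum to the stated 66% and 34% shares. The variable names are illustrative only and are not taken from the NAK's analysis software.

```python
# Minimal sketch: the published NAK index weights as fractions of the total score.
# Names are illustrative, not the NAK's own; values are taken from the lists above.

FACULTY_WEIGHTS = {
    "journal_publications": 0.20,
    "book_chapters": 0.05,
    "presentations": 0.05,
    "federal_research_funds": 0.15,
    "nonfederal_research_funds": 0.08,
    "internal_grant_funds": 0.03,
    "editorial_boards": 0.06,
    "nak_fellow": 0.02,
    "other_national_fellowships": 0.02,
}

STUDENT_WEIGHTS = {
    "selectivity": 0.02,
    "gre_verbal": 0.05,
    "gre_quantitative": 0.05,
    "graduate_assistant_support": 0.13,
    "doctoral_publications": 0.02,
    "postdoctoral_positions": 0.04,
    "employment_in_field": 0.03,
}

# Faculty indices carry 66% of the total score, student indices 34%.
assert abs(sum(FACULTY_WEIGHTS.values()) - 0.66) < 1e-9
assert abs(sum(STUDENT_WEIGHTS.values()) - 0.34) < 1e-9
```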

Data Analysis

Data analysis adopted the procedures of Ulrich and Feltz (2016). Faculty indices were analyzed in two ways: unadjusted for faculty size and adjusted for faculty size.

The raw data, unadjusted and adjusted, were processed using the following sequence:

  1. Faculty and student indices were converted to z scores.
  2. Extreme scores were truncated to a z score of ±2.576.
  3. T-scores were calculated from the z scores (mean = 50, SD = 10).
  4. Weightings were applied to the individual indices.
  5. The total T-score was computed by summing across the weighted indices.

From the sum of the T-scores, the programs were ranked in two ways: using the indices unadjusted for faculty size and using the indices adjusted for faculty size.

The means and SDs for the faculty and student indices were calculated. In addition, the programs were placed into one of four groupings based on total T-score: (a) <40, (b) 40–49, (c) 50–59, and (d) ≥60; descriptive statistics were then computed for these groups. Correlations were computed between the individual indices and the summed T-scores.
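The following is a minimal sketch of the ranking arithmetic described above, assuming each assessed index is available as an array of per-program values (adjusted or unadjusted for faculty size). It is illustrative only; the function and variable names are not from the NAK analysis, which was carried out by the contracted measurement experts.

```python
# Sketch of the five-step procedure: z scores, truncation at ±2.576,
# T-scores (mean 50, SD 10), weighting, and summation into a total T-score.
import numpy as np

def total_t_scores(index_values: dict[str, np.ndarray],
                   weights: dict[str, float]) -> np.ndarray:
    """Weighted total T-score per program.

    index_values maps an index name to one raw value per program;
    weights maps the same names to their share of the total score.
    """
    n_programs = len(next(iter(index_values.values())))
    total = np.zeros(n_programs)
    for name, raw in index_values.items():
        z = (raw - raw.mean()) / raw.std(ddof=1)   # 1. convert to z scores
        z = np.clip(z, -2.576, 2.576)              # 2. truncate extreme scores
        t = 50 + 10 * z                            # 3. convert to T-scores
        total += weights[name] * t                 # 4./5. weight and sum
    return total

# Hypothetical example with three programs and two of the assessed indices
# (weights are the corresponding subset of the lists shown earlier):
demo = {
    "journal_publications": np.array([22.0, 15.0, 34.0]),
    "federal_research_funds": np.array([250_000.0, 100_000.0, 500_000.0]),
}
demo_weights = {"journal_publications": 0.20, "federal_research_funds": 0.15}
order = np.argsort(-total_t_scores(demo, demo_weights))  # programs from highest to lowest total T-score
```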

Results

In this section, reference is made to assessed indices and nonassessed indices, meaning those indices which contributed to the overall program rankings and those metrics not used to determine program rankings, respectively. Program rankings were determined from the weighted sum of the T-scores from the assessed indices.

The Programs

A total of 74 kinesiology doctoral programs were identified in the United States (Appendix A). All of these programs were invited to participate in the review, and 43 (58%) submitted the required data. By comparison, 41 programs participated in the NRC 2010 review. The 2020 review included four programs that had not participated in the 2015 review, while 13 programs that participated in 2015 elected not to participate in 2020. The number of faculty in the participating programs ranged from 5 to 31 (15.0 ± 5.8). These basic data show a moderate increase in faculty size from the previous rounds of the review (Table 1).

Table 1

The Number of Programs Participating (+Number of New Programs in Review/−Number of Programs Completing Previous Review But Not Current Review) and the Mean Faculty Size (Median) for the Four NAK Reviews of Doctoral Programs in Kinesiology

Review year | Years for data | Number of programs | Faculty size, M (median)
2005 | 2000–2004 | 32 |
2010 | 2005–2009 | 36 (+9/−5) | 13.7 (14.0)
2015 | 2010–2014 | 52 (+17/−1) | 14.7 (14.0)
2020 | 2015–2019 | 43 (+4/−13) | 15.0 (15.0)

Note. NAK = National Academy of Kinesiology.

Rankings

The overall ranking of programs was performed with and without adjustment for faculty size (Table 2).

Table 2

Overall Final Ranking and Total T-Scores When Both Adjusted and Unadjusted for Faculty Size

Adjusted for size of faculty | | | Unadjusted for size of faculty | |
Rank | University | T-score | Rank | University | T-score
1 | University of South Carolina | 72.68 | 1 | University of Michigan | 80.65
2 | University of Michigan | 68.34 | 2 | University of North Carolina at Chapel Hill | 71.79
3 | University of Connecticut | 65.04 | 3 | Pennsylvania State University | 70.09
4 | Teachers College, Columbia University | 63.22 | 4 | University of South Carolina | 69.82
5 | University of North Carolina at Chapel Hill | 61.91 | 5 | University of Delaware | 61.78
6 | University of Mississippi | 61.00 | 6 | University of Mississippi | 56.99
7 | Pennsylvania State University | 60.00 | 7 | University of Illinois, Urbana-Champaign | 56.97
8 | University of Virginia | 59.17 | 8 | University of Wisconsin—Madison | 56.28
9 | University of Central Florida | 58.07 | 9 | University of Texas at Austin | 55.57
10 | The Ohio State University | 57.58 | 10 | University of Utah | 55.02
11 | University of Minnesota | 57.47 | 11 | Indiana University | 54.63
12 | University of Southern California | 57.36 | 12 | University of Minnesota | 54.03
13 | University of Delaware | 56.44 | 13 | Auburn University | 53.60
14 | University of Illinois, Urbana-Champaign | 56.37 | 14 | University of Georgia | 53.48
15 | University of Texas at Austin | 56.18 | 15 | University of North Carolina at Greensboro | 53.12
16 | Rutgers University | 55.92 | 16 | University of Southern California | 52.95
17 | University of Utah | 53.27 | 17 | The Ohio State University | 52.80
18 | University of Wisconsin—Madison | 53.21 | 18 | Iowa State University | 52.72
19 | University of Florida | 52.94 | 19 | University of Florida | 52.71
20 | Indiana University | 52.67 | 20 | University of Tennessee, Knoxville | 51.68
21 | Iowa State University | 52.32 | 21 | Michigan State University | 51.06
22 | Auburn University | 49.99 | 22 | University of Virginia | 49.86
23 | Michigan State University | 48.85 | 23 | University of Texas at Arlington | 47.26
24 | University of North Carolina at Greensboro | 48.30 | 24 | University of Maryland | 47.25
25 | University of Georgia | 47.45 | 25 | University of Connecticut | 46.00
26 | University of Massachusetts, Amherst | 47.44 | 26 | University of Massachusetts, Amherst | 45.40
27 | University of Tennessee, Knoxville | 46.76 | 27 | University of Nebraska Omaha | 45.27
28 | University of Maryland | 45.81 | 28 | Teachers College, Columbia University | 44.30
29 | University of Arkansas | 43.79 | 29 | Florida State University | 43.61
30 | University of Nebraska Omaha | 43.39 | 30 | Louisiana State University | 43.39
31 | University of Illinois, Chicago | 42.98 | 31 | University of Illinois, Chicago | 42.91
32 | Syracuse University | 41.21 | 32 | University of Central Florida | 42.88
33 | University of Alabama | 41.04 | 33 | Colorado State University | 42.60
34 | East Carolina University | 40.81 | 34 | Oregon State University | 41.95
35 | The University of Texas at Arlington | 40.10 | 35 | Purdue University | 40.78
36 | Oregon State University | 39.79 | 36 | Rutgers University | 40.09
37 | Colorado State University | 38.85 | 37 | University of Oklahoma | 39.81
38 | Virginia Commonwealth University | 37.95 | 38 | University of Alabama | 39.70
39 | Florida State University | 37.01 | 39 | University of Arkansas | 39.49
40 | Purdue University | 36.55 | 40 | East Carolina University | 38.65
41 | Louisiana State University | 36.12 | 41 | Virginia Commonwealth University | 37.32
42 | University of Oklahoma | 34.95 | 42 | Mississippi State University | 37.12
43 | Mississippi State University | 29.69 | 43 | Syracuse University | 36.63

Note. T-scores are presented to two decimal places as this was required to determine the rank for some programs.

Faculty Data

The relative standing of a program’s faculty data for the individual indices can be determined from the program’s T-scores for each of the assessed indices (Table 3). The programs were grouped by ranking T-score, and then descriptive statistics were computed for the four subgroups for the assessed faculty indices (Table 4). Descriptive data for program funding were computed for all programs (Table 5). Finally, the associations between the assessed indices and program rank were examined by quantifying the correlations between the assessed faculty indices and the summed T-scores used to arrive at overall rank (Table 6).

Table 3

T-Score Results for Faculty Indices, Adjusted for Faculty Size

University | Publications | Book chapters | Presentations | Federal funding | Nonfederal funding | Internal funding | Editors | NAK fellows | National fellows
Auburn University | 50 | 45 | 54 | 44 | 51 | 47 | 47 | 55 | 50
Colorado State University | 46 | 42 | 35 | 45 | 55 | 55 | 40 | 71 | 47
East Carolina University | 40 | 41 | 52 | 56 | 43 | 49 | 34 | 50 | 39
Florida State University | 42 | 43 | 37 | 45 | 46 | 45 | 40 | 38 | 42
Indiana University | 49 | 49 | 49 | 43 | 53 | 60 | 76 | 47 | 60
Iowa State University | 46 | 46 | 46 | 50 | 48 | 44 | 44 | 48 | 52
Louisiana State University | 43 | 47 | 45 | 42 | 40 | 45 | 55 | 42 | 42
Michigan State University | 50 | 49 | 48 | 43 | 45 | 46 | 54 | 48 | 41
Mississippi State University | 36 | 45 | 35 | 42 | 40 | 44 | 36 | 50 | 30
The Ohio State University | 51 | 57 | 71 | 47 | 57 | 47 | 70 | 62 | 76
Oregon State University | 44 | 44 | 43 | 48 | 45 | 44 | 44 | 44 | 61
Pennsylvania State University | 50 | 76 | 54 | 48 | 49 | 46 | 62 | 54 | 48
Purdue University | 39 | 41 | 41 | 43 | 46 | 48 | 37 | 51 | 49
Rutgers University | 55 | 57 | 67 | 71 | 46 | 57 | 53 | 38 | 70
Syracuse University | 58 | 42 | 45 | 43 | 43 | 44 | 35 | 38 | 37
Teachers College, Columbia University | 54 | 73 | 60 | 60 | 46 | 62 | 65 | 71 | 43
University of Alabama | 49 | 47 | 60 | 41 | 43 | 47 | 44 | 38 | 41
University of Arkansas | 49 | 50 | 75 | 42 | 55 | 44 | 44 | 38 | 48
University of Central Florida | 71 | 49 | 51 | 41 | 47 | 65 | 43 | 38 | 55
University of Connecticut | 69 | 75 | 71 | 45 | 76 | 47 | 55 | 59 | 54
University of Delaware | 43 | 45 | 50 | 76 | 62 | 44 | 50 | 50 | 41
University of Florida | 49 | 44 | 49 | 61 | 45 | 76 | 46 | 49 | 39
University of Georgia | 48 | 51 | 44 | 44 | 44 | 45 | 50 | 62 | 51
University of Illinois, Urbana-Champaign | 54 | 52 | 55 | 45 | 59 | 48 | 55 | 56 | 41
University of Illinois, Chicago | 46 | 46 | 57 | 49 | 47 | 44 | 55 | 44 | 54
University of Maryland | 41 | 51 | 37 | 46 | 51 | 44 | 58 | 60 | 49
University of Massachusetts, Amherst | 45 | 43 | 44 | 51 | 46 | 49 | 42 | 53 | 56
University of Michigan | 44 | 48 | 51 | 63 | 68 | 46 | 61 | 54 | 52
University of Minnesota | 48 | 60 | 44 | 48 | 41 | 44 | 63 | 76 | 64
University of Mississippi | 76 | 46 | 45 | 41 | 40 | 45 | 60 | 38 | 44
University of Nebraska Omaha | 47 | 44 | 46 | 63 | 41 | 57 | 42 | 44 | 39
University of North Carolina at Chapel Hill | 61 | 46 | 63 | 51 | 74 | 46 | 50 | 48 | 54
University of North Carolina at Greensboro | 44 | 50 | 40 | 47 | 42 | 45 | 50 | 63 | 50
University of Oklahoma | 39 | 44 | 38 | 42 | 40 | 46 | 45 | 44 | 47
University of South Carolina | 62 | 52 | 61 | 63 | 72 | 45 | 43 | 47 | 61
University of Southern California | 47 | 43 | 61 | 57 | 46 | 45 | 42 | 44 | 59
University of Tennessee, Knoxville | 43 | 50 | 47 | 43 | 40 | 44 | 50 | 42 | 44
University of Texas at Arlington | 48 | 47 | 45 | 47 | 42 | 45 | 42 | 38 | 46
University of Texas at Austin | 50 | 59 | 40 | 48 | 68 | 54 | 59 | 57 | 63
University of Utah | 58 | 51 | 47 | 47 | 53 | 44 | 53 | 52 | 51
University of Virginia | 55 | 51 | 57 | 52 | 55 | 76 | 63 | 60 | 73
University of Wisconsin—Madison | 42 | 48 | 47 | 76 | 55 | 65 | 43 | 38 | 42
Virginia Commonwealth University | 50 | 43 | 42 | 41 | 41 | 47 | 46 | 48 | 46

Note. NAK = National Academy of Kinesiology.

Table 4

Descriptive Statistics, M and SD, for Assessed Faculty Indices by Total T-Score Category (Adjusted for Faculty Size)

Category of total T-scores | Publications | Book chapters | Presentations | Federal funding | Nonfederal funding | Internal funding | Editors | NAK fellows | National fellows
<40 (n = 8)
  M | 15.4 | 0.7 | 17.7 | $102,831 | $61,723 | $29,282 | 1.0 | 0.1 | 0.5
  SD | 4.2 | 0.4 | 3.3 | $91,388 | $70,096 | $34,658 | 0.5 | 0.1 | 0.3
40–49 (n = 14)
  M | 19.7 | 1.4 | 26.6 | $248,249 | $74,405 | $28,243 | 1.3 | 0.1 | 0.5
  SD | 4.3 | 0.8 | 8.5 | $246,998 | $59,047 | $34,396 | 0.6 | 0.1 | 0.2
50–59 (n = 14)
  M | 23.7 | 2.2 | 29.1 | $570,012 | $174,552 | $125,453 | 1.9 | 0.2 | 0.8
  SD | 7.0 | 1.2 | 7.9 | $539,720 | $100,906 | $146,385 | 0.9 | 0.1 | 0.4
≥60 (n = 7)
  M | 34.0 | 4.7 | 33.7 | $490,003 | $287,706 | $45,424 | 2.1 | 0.2 | 0.7
  SD | 15.3 | 4.2 | 7.5 | $361,700 | $210,506 | $60,735 | 0.6 | 0.1 | 0.2
Total (n = 43)
  M | 22.5 | 2.1 | 26.9 | $365,310 | $139,375 | $62,883 | 1.6 | 0.1 | 0.6
  SD | 9.6 | 2.2 | 8.8 | $404,201 | $134,220 | $98,472 | 0.8 | 0.1 | 0.3

Note. NAK = National Academy of Kinesiology.

Table 5

Descriptive Statistics for the Total Funding Reported for All Faculty in the National Academy of Kinesiology 2020 Doctoral Program Review

Index | n | Minimum | Maximum | Sum | M | SD
External federal funding | 43 | $0 | $35,434,581 | $253,425,035 | $5,893,605 | $7,869,494
External nonfederal funding | 43 | $65,140 | $11,936,686 | $100,521,471 | $2,337,709 | $2,885,873
Internal funding | 43 | $6,912 | $6,762,936 | $35,565,251 | $827,099 | $1,339,037

Table 6

The Correlation Between the Assessed Faculty Indices and the Programs’ Total T-Scores

Index | Adjusted | Unadjusted
Publications (20%) | .61** | .83**
Book chapters (5%) | .55** | .50**
Presentations (5%) | .54** | .84**
Federal funding (15%) | .43** | .60**
External funding (nonfederal) (8%) | .64** | .79**
Internal funding (3%) | .23 | .11
Editorial boards (6%) | .55** | .76**
NAK fellow (2%) | .26 | .51**
National fellows (2%) | .47** | .71**

Note. Values in parentheses refer to the contribution of the index to the total T-score (the weighting). NAK = National Academy of Kinesiology.

**p < .01.

Student Data

The relative standing of a program’s student data for the individual indices can be determined from the program’s T-scores for each of the assessed indices (Table 7). The programs were grouped by ranking T-scores, and then descriptive statistics were computed for the four subgroups for all assessed student indices (Table 8). Finally, the associations between the assessed indices and program rank were examined by quantifying the correlations between the assessed student indices and the summed T-scores used to arrive at overall rank (Table 9).

Table 7

T-Score Results for Student Indices

University | GRE verbal | GRE quantitative | Assistantships (FTE) | Selectivity | Doctoral publications | Postdoctoral positions | Employed
Auburn University | 35 | 33 | 61 | 49 | 68 | 55 | 66
Colorado State University | 48 | 45 | 43 | 24 | 41 | 41 | 39
East Carolina University | 55 | 55 | 45 | 47 | 37 | 50 | 39
Florida State University | 44 | 49 | 47 | 51 | 44 | 50 | 47
Indiana University | 35 | 49 | 52 | 41 | 56 | 58 | 64
Iowa State University | 52 | 70 | 61 | 56 | 52 | 46 | 50
Louisiana State University | 48 | 47 | 40 | 46 | 40 | 38 | 51
Michigan State University | 52 | 45 | 53 | 56 | 48 | 48 | 67
Mississippi State University | 37 | 48 | 52 | 30 | 41 | 36 | 42
The Ohio State University | 45 | 39 | 44 | 53 | 62 | 55 | 76
Oregon State University | 52 | 49 | 42 | 56 | 40 | 40 | 50
Pennsylvania State University | 56 | 70 | 51 | 62 | 63 | 63 | 44
Purdue University | 56 | 66 | 40 | 59 | 38 | 45 | 42
Rutgers University | 48 | 51 | 33 | 52 | 37 | 40 | 38
Syracuse University | 48 | 45 | 47 | 49 | 38 | 43 | 39
Teachers College, Columbia University | 76 | 68 | 39 | 55 | 38 | 50 | 39
University of Alabama | 44 | 37 | 47 | 45 | 56 | 40 | 64
University of Arkansas | 52 | 37 | 39 | 56 | 45 | 46 | 42
University of Central Florida | 64 | 58 | 50 | 63 | 46 | 40 | 46
University of Connecticut | 39 | 45 | 43 | 41 | 54 | 51 | 50
University of Delaware | 50 | 52 | 47 | 56 | 47 | 45 | 55
University of Florida | 53 | 49 | 42 | 56 | 57 | 76 | 43
University of Georgia | 46 | 46 | 50 | 50 | 61 | 61 | 61
University of Illinois, Urbana-Champaign | 45 | 52 | 50 | 44 | 65 | 70 | 65
University of Illinois, Chicago | 34 | 39 | 43 | 47 | 44 | 61 | 42
University of Maryland | 60 | 58 | 43 | 56 | 49 | 63 | 45
University of Massachusetts, Amherst | 50 | 42 | 54 | 62 | 49 | 60 | 46
University of Michigan | 54 | 62 | 76 | 61 | 55 | 58 | 49
University of Minnesota | 52 | 58 | 65 | 56 | 50 | 53 | 55
University of Mississippi | 39 | 33 | 76 | 52 | 64 | 36 | 55
University of Nebraska Omaha | 38 | 52 | 45 | 24 | 39 | 41 | 40
University of North Carolina at Chapel Hill | 60 | 60 | 42 | 45 | 71 | 46 | 50
University of North Carolina at Greensboro | 44 | 41 | 67 | 55 | 44 | 53 | 52
University of Oklahoma | 46 | 39 | 52 | 41 | 50 | 43 | 51
University of South Carolina | 54 | 53 | 63 | 52 | 71 | 70 | 48
University of Southern California | 67 | 75 | 52 | 62 | 50 | 60 | 50
University of Tennessee, Knoxville | 56 | 41 | 68 | 61 | 53 | 36 | 61
University of Texas at Arlington | 44 | 45 | 50 | 35 | 40 | 43 | 39
University of Texas at Austin | 52 | 54 | 48 | 52 | 57 | 55 | 54
University of Utah | 49 | 56 | 50 | 51 | 44 | 45 | 51
University of Virginia | 47 | 45 | 46 | 56 | 64 | 46 | 52
University of Wisconsin—Madison | 54 | 42 | 46 | 55 | 44 | 53 | 42
Virginia Commonwealth University | 56 | 45 | 42 | 33 | 39 | 40 | 40

Note. FTE = full-time equivalent; GRE = Graduate Record Examination.


Table 8

Descriptive Statistics, M and SD, for Assessed Student Indices by Program Total T-Score Category (Adjusted for Faculty Size)

Category of total T-scores | GRE verbal | GRE quantitative | Assistantships (FTE) | Selectivity | Doctoral publications | Postdoctoral positions | Employed in field
<40 (n = 8)
  M | 152.1 | 152.8 | 17.0 | 0.6 | 42.8 | 3.1 | 9.9
  SD | 1.5 | 1.9 | 5.8 | 0.3 | 17.8 | 2.5 | 6.4
40–49 (n = 14)
  M | 151.8 | 151.7 | 24.2 | 0.4 | 73.4 | 8.2 | 15.9
  SD | 1.9 | 1.7 | 10.7 | 0.2 | 44.4 | 5.4 | 13.4
50–59 (n = 14)
  M | 152.7 | 154.1 | 21.8 | 0.4 | 93.8 | 9.9 | 19.8
  SD | 1.9 | 2.4 | 9.1 | 0.1 | 40.5 | 6.4 | 14.2
≥60 (n = 7)
  M | 154.1 | 154.6 | 30.8 | 0.4 | 128.9 | 10.3 | 12.6
  SD | 4.2 | 3.2 | 19.8 | 0.1 | 56.2 | 6.7 | 6.0
Total (n = 43)
  M | 152.5 | 153.2 | 23.2 | 0.4 | 83.3 | 8.2 | 15.5
  SD | 2.4 | 2.4 | 11.8 | 0.2 | 48.5 | 6.0 | 12.0

Note. FTE = full-time equivalent; GRE = Graduate Record Examination.

Table 9

The Correlation Between the Assessed Student Indices and the Programs’ Total T-Scores

Index | Adjusted | Unadjusted
Average GRE—Verbal (5%) | .29 | .09
Average GRE—Quantitative (5%) | .32* | .31*
Assistantships (FTE) (13%) | .32* | .52**
Selectivity (2%) | .44** | .34*
Doctoral publications (2%) | .60** | .68**
Postdoctoral positions (4%) | .44** | .47**
Positions in field (3%) | .22 | .29

Note. FTE = full-time equivalent; GRE = Graduate Record Examination. Values in parentheses refer to the contribution of the index to the total T-score (the weighting).

*p < .05. **p < .01.

Indices Not Used in Ranking

For the faculty, data were collected for six indices that did not contribute to the rankings; these indices related to workload distribution (teaching, research, and service) and research profile (h-index and total citations). The programs were grouped by total T-scores, and descriptive data were then computed for the four subgroups for the nonassessed faculty indices (Table 10). The associations between the nonassessed indices and total T-scores were examined by quantifying the correlations between the nonassessed faculty indices and the summed T-scores for both the adjusted and unadjusted data. The indices associated with faculty workload had small and nonsignificant correlation coefficients (|r| from .00 to .17). The correlations between program summed T-scores and the h-index and number of citations were low for the unadjusted program rank (.23 and .20, respectively) but were greater and statistically significant for the adjusted program rank (.52 and .49, respectively, p < .01). In a similar fashion, the associations between the assessed and nonassessed faculty indices were examined by computing their correlations (Table 11).

Table 10

Descriptive Statistics M (Median) and SD for Nonassessed Faculty Indices by Total T-Score Category, Adjusted for Faculty Size

Category of total T-scores | Courses per semester | %Effort teaching | %Effort research | %Effort service | h-index | Number of citations
<40a (n = 8)
  M | 1.5 (1.3) | 34.5 (31.9) | 43.9 (40.5) | 21.1 (21.0) | 18.8 (20.2) | 1,934 (2,113)
  SD | 0.3 | 4.7 | 8.2 | 7.2 | 4.2 | 787
40–49b (n = 14)
  M | 1.6 (1.6) | 36.5 (35.6) | 41.8 (41.3) | 20.9 (22.0) | 25.0 (25.3) | 4,018 (3,259)
  SD | 0.3 | 5.8 | 6.5 | 5.9 | 5.2 | 2,572
50–59c (n = 14)
  M | 1.4 (1.5) | 33.9 (35.6) | 45.1 (43.6) | 20.9 (20.3) | 28.9 (28.2) | 4,847 (4,454)
  SD | 0.4 | 8.6 | 8.0 | 5.3 | 6.7 | 2,384
≥60 (n = 7)
  M | 1.6 (1.7) | 38.3 (40.0) | 41.0 (41.5) | 20.8 (19.2) | 29.8 (30.4) | 5,892 (5,703)
  SD | 0.3 | 7.5 | 5.6 | 4.7 | 4.8 | 2,226
Totalc (n = 43)
  M | 1.5 (1.5) | 35.6 (35.7) | 43.1 (41.6) | 20.9 (20.3) | 26.1 (25.4) | 4,265 (3,790)
  SD | 0.3 | 6.9 | 7.1 | 5.5 | 6.6 | 2,494

aSeven programs from this category provided responses for these variables. bSample size was 13 for h-index and number of citations. cSample size was 42 for courses per semester, %effort teaching, %effort research, and %effort service and 41 for h-index and number of citations.

Table 11

Correlation Coefficients Between Two Sets of Faculty Indices, Specifically Those Indices Not Used in Program Ranking (e.g., Courses per Semester) and Those Used in the Ranking (e.g., Number of Faculty Publications)

Index | Publications | Book chapters | Presentations | Federal funding | External funding (nonfederal) | Internal funding | Editorial boards | National academy members | National fellows
Courses per semester | −.06 | .13 | −.09 | −.18 | −.10 | −.11 | .05 | −.09 | −.15
%Effort teaching | −.01 | .37* | .04 | −.10 | −.05 | −.12 | .31* | .30 | .09
%Effort research | .02 | −.34* | .07 | .35* | .19 | .24 | −.17 | −.14 | −.08
%Effort service | .01 | −.03 | −.12 | −.29 | −.16 | −.15 | −.17 | −.19 | −.003
h-index | .12 | .30 | .21 | −.02 | .09 | .17 | .24 | .29 | .27
Number of citations | .09 | .22 | .18 | −.01 | .11 | .05 | .14 | .24 | .25

Note. Values in this table are based on the 42 programs that provided these data.

*p < .05.

Data were collected on the race of the students enrolled in kinesiology doctoral programs during the survey years (Table 12). For comparison purposes, the data from the National Science Foundation 2018 survey of doctoral degree recipients are also presented. The National Science Foundation survey collected data on both race and ethnicity, whereas the NAK survey did not collect ethnicity data.

Table 12

Demographics of Doctoral Students in Kinesiology for the Survey Years of 2015–2019 Compared With Data From the National Science Foundation for Recipients of a Doctoral Degree in 2018

Race | United Statesa (2018) | NAK 2015 | NAK 2016 | NAK 2017 | NAK 2018 | NAK 2019
Hispanic or Latinob | 3,603 (6.5) | | | | |
American Indian or Alaska Native | 116 (0.2) | 5 (0.7) | 4 (0.5) | 6 (0.8) | 6 (0.7) | 6 (0.7)
Asian | 14,815 (26.8) | 104 (13.9) | 107 (13.7) | 115 (14.7) | 123 (15.1) | 115 (13.7)
Black | 3,058 (5.5) | 38 (5.1) | 41 (5.3) | 41 (5.2) | 42 (5.2) | 58 (6.9)
Native Hawaiian or Other Pacific Islanderc | | 1 (0.1) | 1 (0.1) | 1 (0.1) | 1 (0.1) | 0 (0.0)
White | 28,585 (51.8) | 468 (62.7) | 512 (65.7) | 510 (65.2) | 524 (64.4) | 531 (63.4)
More than one race | 1,213 (2.2) | 28 (3.8) | 25 (3.2) | 21 (2.7) | 22 (2.7) | 29 (3.5)
Other race or race not reported | 862 (1.6) | 102 (13.7) | 94 (12.1) | 88 (11.3) | 107 (13.1) | 102 (12.2)
Ethnicity not reportedb | 2,943 (5.3) | | | | |
Total | 55,195 | 746 | 779 | 782 | 814 | 837

Note. NAK = National Academy of Kinesiology.

aThese data indicate number of doctorates awarded by race and ethnicity for U.S. doctoral students (National Center for Science and Engineering Statistics, 2019). bThese data were not collected by the NAK for this evaluation period. cThese data were not reported by the National Science Foundation for this reporting period.

Discussion

This report has presented the results from the fourth NAK review of U.S. doctoral programs. The quality of a doctoral program is determined by many factors, and to some extent these factors are student specific; the intention of this review is to try to capture at least some of them. The data set from the current review provides a temporal snapshot of doctoral kinesiology programs in the United States. A historical perspective can be obtained by comparing the results of the current review with those from previous reviews (Spirduso & Reeves, 2011; Thomas & Reeves, 2006; Ulrich & Feltz, 2016). It is hoped that these results will be useful to administrators as benchmark data and may create opportunities to request new or additional resources.

The overall number of faculty in programs has shown modest increases over the last three reviews (Table 1), likely driven in part by increasing undergraduate enrollments in the last decade (Bassett, Fairbrother, Panton, Martin, & Swartz, 2018). The number of students studying for a doctoral degree in kinesiology has shown steady growth over the last decade (Figure 1a). Figure 1a could be misleading because it combines data from the 2015 review, which included 52 programs, with the 2020 review, which included 43 programs; therefore, the student enrollment data are also presented as the mean number of students per program, and these data also show steady growth over the last decade (Figure 1b). The data from the 2015 and 2020 surveys on the demographics of the doctoral students can be combined to examine trends in student enrollment. The largest proportion of the students are White, but the proportion of these students has diminished over the last decade (Figure 2a). The data for the other groups show some modest increases over the decade (Figure 2b).

Figure 1—The number of students studying for a doctoral degree in kinesiology in the United States: (a) total number of students and (b) number of students per program. These graphs combine data from the 2015 and 2020 National Academy of Kinesiology surveys.

Figure 2—The percentage of students studying for a doctoral degree in kinesiology in the United States by race: (a) the contribution of White students to the total and (b) the contributions of the other race groups in the survey to the total. These graphs combine data from the 2015 and 2020 National Academy of Kinesiology surveys.

The review is of doctoral programs, not departments. Some programs were run out of a single department, while other programs were interdepartmental. This review was conducted by the NAK, but not all surveyed doctoral programs were titled as kinesiology programs; they had a large spectrum of titles (e.g., kinesiology, biokinesiology, kinesiology and applied physiology, biomechanics and movement science, movement science). As Newell (1990) has highlighted, the word kinesiology works well to capture disciplinary and interdisciplinary efforts to study physical activity. It may be elusive to describe all areas which fit under the kinesiology label, but practitioners echo the sentiments of U.S. Supreme Court Justice Potter Stewart in that they know it when they see it.

Institutions also differ in how faculty are associated with a program; thus, three criteria were used to determine if faculty should be included in the survey. The criteria were as follows:

  1. Currently teach doctoral-serving courses, and/or direct doctoral dissertations, and/or serve on doctoral advisory committees.
  2. Hold a doctoral degree and be in a tenured or tenure-earning position at the rank of assistant, associate, or full professor.
  3. At least 25% of their base salary support is provided by the academic unit sponsoring the doctoral program.

Item 3 creates problems because faculty salaries are administered differently at different institutions, but here the term "academic unit" has been used to provide flexibility. Item 2 creates problems because the professoriate has shown an expansion of appointment types in recent years, with various appointment types paralleling the traditional tenure-line titles. There has been an increase in the proportion of faculty with appointments outside of the traditional tenure line (Fuesting & Schmidt, 2020), and with this increase there has been an expansion of potential duties for nontenure-line faculty (Finkelstein, Conley, & Schuster, 2016). Should these faculty be included in the review? For some programs, including these faculty would boost the program profile, but at other institutions doctoral student teaching and dissertation supervision are the sole purview of the tenure-line faculty. The tenure-line appointments and associated titles are long established and have the same common elements across institutions. To include nontenure-line appointments in the NAK review would create a potential bias toward institutions with more liberal policies for the engagement of faculty in graduate student education. By their nature, tenure-line appointments reflect those faculty to whom the host institution has a long-term fiduciary commitment, and other appointments do not necessarily carry this commitment. The professoriate is evolving, and the criteria for eligible faculty will need frequent reconsideration by the NAK.

The indices for the students in the review of doctoral programs included GRE scores on the verbal and quantitative parts of the exam. These scores collectively contribute 10% to the overall ranking and just under a third of the score from the student indices used in the rankings. In theory, these measures capture the quality of the students admitted to the programs, but, of course, this assumes that quality on admission translates into quality on graduation. The use of GRE scores as part of an admission decision is fraught with problems given evidence that women and certain minority groups are disadvantaged by the exam relative to White males (Miller & Stassun, 2014). For students in the biomedical sciences, the assumed relationship between GRE scores and student first-authored papers has not been found (Hall, O'Connell, & Cook, 2017). Also in the biomedical sciences, GRE scores were identified by Moneta-Koehler, Brown, Petrie, Evans, and Chalkley (2017) as poor predictors of completing doctoral milestones (passing the qualifying exam, time to defense, and successful graduation) and of measures of productivity (conference presentations, first-authored papers, and grants). There is a trend for academic programs and institutions to abandon the GRE, leading to the so-called GRExit; this trend has been more prevalent, for example, for programs in neuroscience than for programs in psychology (Langin, 2019). These are all important factors to consider when deciding whether GRE scores should remain among the indices used to determine program rank in future rounds of the review.

When Hirsch (2005) introduced his eponymous h-index, he proposed that the index would capture the impact and relevance of an individual's research output. These data were collected for the first time in this round of the NAK survey of doctoral programs; the h-index was not used to determine program rank, but its association with program rank was assessed. The h-index had a statistically significant correlation with the size-adjusted program total T-scores (0.49, p < .01) but had relatively low, nonsignificant correlations with the individual indices used to determine program rank (Table 11). Similar patterns were seen for the number of citations. It could be argued that these measures capture something different from the existing indices used to arrive at the rank and are therefore worthy of further examination. Kinesiology comprises academics with different areas of interest, which creates a challenge when using a metric based on citation counts, as different areas have different citation rates (e.g., Radicchi & Castellano, 2012; Seglen, 1997), and, of course, different research areas also have different publication patterns (e.g., Kulczycki et al., 2018).
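For readers unfamiliar with the metric, the short sketch below shows how an h-index is computed from a list of per-paper citation counts, following Hirsch's definition; it is purely illustrative, as programs in the survey simply reported the value given by Google Scholar.

```python
# Sketch of Hirsch's h-index: the largest h such that the author has
# h papers with at least h citations each. Illustrative only.
def h_index(citations: list[int]) -> int:
    ranked = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for i, c in enumerate(ranked, start=1):
        if c >= i:
            h = i        # the i most-cited papers each have >= i citations
        else:
            break
    return h

# Example: four papers have at least 4 citations each, but not five with >= 5.
assert h_index([10, 8, 5, 4, 3]) == 4
```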

When making measurements, there is always a risk of changing the very thing being measured (Dent, 2013). This NAK initiative for the continuing review of kinesiology doctoral programs is intended to provide feedback on the state of the area, not to dictate which program features should be emphasized.

The results of the 2020 review of U.S. doctoral programs by the NAK have been presented. This is the fourth review completed by the NAK; the current review captures the state of the doctoral programs, and collectively the four reviews illustrate the evolution of these programs.

Acknowledgments

The author thanks the members of the NAK Doctoral Program Committee (2015–2020) for their contributions to the review process: David Bassett (NAK Fellow #495), Kim Graber (NAK Fellow #526), Diane Gill (NAK Fellow #331), Jane Kent (NAK Fellow #477), Duane Knudson (NAK Fellow #588), Jeff McCubbin (NAK Fellow #400), Karl Newell (NAK Fellow #319), Cesar Torres (NAK Fellow #531), Patricia Vertinsky (International Fellow), and Howard Zelaznik (NAK Fellow #337). Thanks also go to the presidents of the NAK over the period of the review for their support of the process: Karl Newell (NAK Fellow #319), Debra Rose (NAK Fellow #447), Bradley Hatfield (NAK Fellow #452), Bradley J. Cardinal (NAK Fellow #475), and Dave Perrin (NAK Fellow #401); and to the analysis team for their work on this project: Matthew Mahar (NAK Fellow #521) and Nicholas D. Myers from Michigan State University. The author would also like to thank Kim Scott for all of her work in the background.

References

  • Bassett, D.R., Fairbrother, J.T., Panton, L.B., Martin, P.E., & Swartz, A.M. (2018). Undergraduate enrollments and faculty resources in kinesiology at selected U.S. public universities: 2008–2017. Kinesiology Review, 7(4), 286–294. doi:10.1123/kr.2018-0043
  • Dent, E. (2013). The observation, inquiry, and measurement challenges surfaced by complexity theory. In Managing the complex: Philosophy, theory and practice (Vol. 1, pp. 253–283). Greenwich, CT: Information Age Publishing.
  • Finkelstein, M.J., Conley, V.M., & Schuster, J.H. (2016). The faculty factor: Reassessing the American academy in a turbulent era. Baltimore, MD: Johns Hopkins University Press.
  • Fuesting, M.A., & Schmidt, A. (2020). Faculty in the health professions: Growth, composition, and salaries. Knoxville, TN: College and University Professional Association for Human Resources.
  • Hall, J.D., O'Connell, A.B., & Cook, J.G. (2017). Predictors of student productivity in biomedical graduate school applications. PLoS One, 12(1), e0169121. PubMed ID: 28076439 doi:10.1371/journal.pone.0169121
  • Hirsch, J.E. (2005). An index to quantify an individual's scientific research output. Proceedings of the National Academy of Sciences, 102(46), 16569–16572. doi:10.1073/pnas.0507655102
  • Kulczycki, E., Engels, T.C.E., Palanan, J., Bruun, K., Dušková, M., Guns, R., … Zuccala, A. (2018). Publication patterns in the social sciences and humanities: Evidence from eight European countries. Scientometrics, 116(1), 463–486. doi:10.1007/s11192-018-2711-0
  • Langin, K. (2019). Ph.D. programs drop standardized exam. Science, 364(6443), 816. PubMed ID: 31147501 doi:10.1126/science.364.6443.816
  • Miller, C., & Stassun, K. (2014). A test that fails. Nature, 510(7504), 303–304. doi:10.1038/nj7504-303a
  • Moneta-Koehler, L., Brown, A.M., Petrie, K.A., Evans, B.J., & Chalkley, R. (2017). The limitations of the GRE in predicting success in biomedical graduate school. PLoS One, 12(1), e0166742. PubMed ID: 28076356 doi:10.1371/journal.pone.0166742
  • National Center for Science and Engineering Statistics. (2019). Doctorate recipients from U.S. universities: 2018. Alexandria, VA: National Science Foundation.
  • Newell, K.M. (1990). Kinesiology: The label for the study of physical activity in higher education. Quest, 42(3), 269–278.
  • Ostriker, J.P., Kuh, C.V., & Voytuk, J.A. (2011). A data-based assessment of research-doctorate programs in the United States. Washington, DC: The National Academies Press.
  • Radicchi, F., & Castellano, C. (2012). Testing the fairness of citation indicators for comparison across scientific domains: The case of fractional citation counts. Journal of Informetrics, 6(1), 121–130. doi:10.1016/j.joi.2011.09.002
  • Seglen, P.O. (1997). Why the impact factor of journals should not be used for evaluating research. BMJ, 314(7079), 497. doi:10.1136/bmj.314.7079.497
  • Spirduso, W.W., & Reeves, T.G. (2011). The National Academy of Kinesiology 2010 review and evaluation of doctoral programs in kinesiology. Quest, 63(4), 411–440. doi:10.1080/00336297.2011.10483689
  • Thomas, J.R., Clark, J.E., Feltz, D.L., Kretchmar, R.S., Morrow, J.R., Reeves, T.G., & Wade, M.G. (2007). The academy promotes, unifies, and evaluates doctoral education in kinesiology. Quest, 59(1), 174–194. doi:10.1080/00336297.2007.10483547
  • Thomas, J.R., & Reeves, T.G. (2006). A review and evaluation of doctoral programs 2000–2004 by the American Academy of Kinesiology and Physical Education. Quest, 58(1), 176–196. doi:10.1080/00336297.2006.10491878
  • Ulrich, B.D., & Feltz, D.L. (2016). The National Academy of Kinesiology 2015 review and evaluation of doctoral programs in kinesiology. Kinesiology Review, 5(2), 101–118. doi:10.1123/kr.2016-0004
  • Voytuk, J.A., Kuh, C.V., Ostriker, J.P., & National Research Council. (2003). Assessing research-doctorate programs: A methodology study. Washington, DC: National Academies Press.

Appendix A: Lists of Universities That Participated and Those Programs Invited That Elected Not to Participate

Participating university | Program title | Nonparticipating university | Program name
Auburn University | Kinesiology | Arizona State University | Exercise & Nutritional Sciences
Colorado State University | Human Bioenergetics | Ball State University | Human Bioenergetics
East Carolina University | Kinesiology | Baylor University | Exercise & Nutrition Sciences
Florida State University | Exercise Physiology | Brigham Young University | Exercise Sciences
Indiana University | Kinesiology | Georgia State University | Kinesiology
Iowa State University | Kinesiology | Kansas State University | Kinesiology
Louisiana State University | Kinesiology | Kent State University | Exercise Physiology
Michigan State University | Kinesiology | Middle Tennessee State University | Human Performance
Mississippi State University | Kinesiology | New Mexico State University | Kinesiology
The Ohio State University | Human Sciences | North Dakota State University | Exercise Science & Nutrition
Oregon State University | Kinesiology | Springfield College | Exercise Physiology & Psychology
Pennsylvania State University | Kinesiology | Temple University | Kinesiology
Purdue University | Health & Kinesiology | Texas A&M University | Kinesiology
Rutgers University | Kinesiology & Applied Physiology | Texas Christian University | Health Sciences
Syracuse University | Exercise Science | Texas Tech University | Exercise Physiology
Teachers College, Columbia University | Kinesiology | Texas Woman’s University | Kinesiology
The University of Alabama | Kinesiology | University of Hawaii | Kinesiology
University of Arkansas | Exercise Science | University of Houston | Kinesiology
University of Central Florida | Exercise Physiology | University of Idaho | Exercise Science
University of Connecticut | Exercise Science | University of Iowa | Health & Human Physiology
University of Delaware | Kinesiology & Applied Physiology | University of Kansas | Health, Sport Management, & Exercise Science
University of Florida | Applied Physiology & Kinesiology | University of Kentucky | Exercise Science
University of Georgia | Kinesiology | University of Miami | Exercise Physiology
University of Illinois, Chicago | Kinesiology & Nutrition | University of Nevada, Las Vegas | Kinesiology
University of Illinois, Urbana-Champaign | Kinesiology & Community Health | University of New Mexico | Exercise Science
University of Maryland, College Park | Kinesiology | University of Northern Colorado | Sport & Exercise Science
University of Massachusetts, Amherst | Kinesiology | University of Toledo | Exercise Science
University of Michigan | Movement Science | University of West Florida | Health & Physical Activity
University of Minnesota | Kinesiology | University of Wisconsin—Milwaukee | Kinesiology
The University of Mississippi | Health & Kinesiology | Wayne State University | Kinesiology
University of Nebraska Omaha | Exercise Science | West Virginia University | Sport, Exercise & Performance Psychology
University of North Carolina at Chapel Hill | Human Movement Science | |
University of North Carolina at Greensboro | Kinesiology | |
The University of Oklahoma | Health & Exercise Science | |
University of South Carolina | Exercise Science | |
University of Southern California | Biokinesiology | |
The University of Tennessee, Knoxville | Kinesiology & Sport Studies | |
The University of Texas at Arlington | Kinesiology | |
The University of Texas at Austin | Kinesiology & Health Education | |
The University of Utah | Health & Kinesiology | |
University of Virginia | Kinesiology | |
University of Wisconsin—Madison | Kinesiology | |
Virginia Commonwealth University | Rehabilitation & Movement Science | |

Appendix B: Instructional Guide

B.1. Introduction

This guide provides definitions and specific instructions for completing and returning data for the NAK doctoral program evaluation.

The required data falls into four categories:

  • Faculty Data—data specific to each faculty member in a program.

  • Program Funding Data—total amount of funding by year.

  • Student Data—data related to students in the program.

  • Doctoral Student Demographics—student demographic data.

These data are all entered into an Excel spreadsheet. As explained in this guide, some data are used in the evaluation process, and other data are requested to help provide a clearer picture of the range of doctoral programs.

B.1.1. Review Period

Data to be included are for five calendar years (2015, 2016, 2017, 2018, and 2019). You can report data for faculty members who are currently conducting doctoral activities in your program (see detailed description below). Data for faculty who left the program (e.g., retired or resigned) before the end of this period cannot be reported. Student data include all doctoral students enrolled in the program at any time in the 5-year period of 2015–2019 as described later in this Instructional Guide.

B.1.2. Questions

There is a list of frequently asked questions which can be viewed at https://nationalacademyofkinesiology.org/SubPages/Pages/Frequently%20Asked%20Questions

If you still have questions regarding the information requested, please contact Kim Scott in the NAK Business Office.

B.1.3. Program Eligibility

A program is represented by a grouping of faculty who graduate doctoral students under a common degree title; it might run within a department or across departments. The following are the criteria for a program to be eligible to participate in the review.

  (a) The program must graduate an average of at least one doctoral student per year.
  (b) The program must be offered by a college or university that has current regional higher education accreditation.
  (c) At least one third of the faculty members in the program must be kinesiologists, where kinesiology is broadly defined; it is up to individual programs to determine if their expertise falls under this umbrella term.

B.1.4. Returning the Data

There are four items to be returned:

  1. Excel File—four tabs to complete.
  2. Individual Faculty/Student Data Verification Page—this requires one signature, the Program Chairperson. Please print, sign, scan, and send electronically as a PDF.
  3. Program Funding Verification Page—this requires two signatures, the Program Chairperson and the program Budget Officer. Please print, sign, scan, and send electronically as a PDF.
  4. Bibliography—this is a list of all unique refereed publications and book publications (listing to include all publications used to arrive at values for “Journal Publications” and “Book Publications” in the Excel spreadsheet). Send electronically as a Word file (see specific instructions below).

Please name the files as follows:

  • Excel Spreadsheet: “University Name–NAK–2020”

  • PDF of Faculty/Student Data Verification: “University Name–NAK-Data–2020”

  • PDF of Funding Verification: “University Name–NAK-Funding–2020”

  • Word File of Bibliography: “University Name–NAK–Bibliography-2020”

B.2. Faculty Data

The faculty data relate to publications, conference presentations, editorial boards, fellowships, and items related to workload.

B.2.1. Criteria for Inclusion of Faculty

To include a faculty member, all three of the following criteria must be met.

  1. Currently teach doctoral-serving courses, and/or direct doctoral dissertations, and/or serve on doctoral advisory committees.
  2. Hold a doctoral degree and be in a tenured or tenure-earning position at the rank of assistant, associate, or full professor.
  3. At least 25% of their base salary support is provided by the academic unit sponsoring the doctoral program.

B.2.2. Data Entry

All of the following items are used for program evaluation except those marked with an asterisk (*). The asterisked items are not used in the evaluation of the programs but are being collected as potentially useful program demographics; they will be presented as group statistics with individual programs not identified.

Faculty Member Name—for each eligible faculty member, please enter their name (first and last).

Journal Publications—enter the number of full-length scholarly articles in peer-reviewed journals published during the review period (calendar years 2015–2019) by each faculty member.

Each publication is counted only one time, so if a publication has multiple authors who are also members of the program, then the publication should be assigned to one of the authors only. Therefore, the sum of the numbers in this column will be the total number of unique research publications produced by the program.

A bibliography must be submitted which reports all of the publications, so the total number of publications in the bibliography is the same as the total number of Journal Publications listed for all faculty members.

If the journal is published both online and in print, the publication date is that associated with the hard copy.

Do not include abstracts or proceedings.

Book Chapter Publications—enter the number of chapters in books published during the review period (calendar years 2015–2019) by each faculty member. If more than one edition is published in the 5-year period, count each edition.

Each publication is counted only one time, so if a publication has multiple authors who are also members of the program then the publication should be assigned to one of the authors only. Therefore, the sum of the numbers in this column will be the total number of unique book chapter publications produced by the program.

A bibliography must be submitted which reports all of the publications, so the total number of publications in the bibliography is the same as the total number of Book Publications listed for all faculty members.

Do not include project reports.

Conference Presentations—enter the number of conference presentations whether presenter or coauthor during the review period (calendar years 2015–2019) by each faculty member.

Each presentation is counted only one time, so if a presentation has multiple authors who are also members of the program then the presentation should be assigned to one author only. Therefore, the sum of the numbers in this column will be the total number of unique research presentations produced by the program.

Include only scholarly presentations at national and international meetings. Do not include sessions for which the faculty member simply acted as a presider.

Editorial Boards—enter the number of editorships and editorial boards for scholarly journals that each faculty member has held for any period over the review period (calendar years 2015–2019).

Do not include journals for which a faculty member simply serves as a reviewer.

These data are used in the evaluation of the programs.

NAK Fellow—simply enter a 1 next to each faculty member who is an active fellow of the NAK in 2019.

These data are used in the evaluation of the programs.

Other Fellowships—enter the number of other fellowships of societies, other than the NAK, of which the faculty is a member in 2019, for example, American College of Sports Medicine Fellow, Society of Gerontology Fellow, and SHAPE (Society of Health and Physical Educators) America Research Fellow.

*Courses per Semester—for each faculty member, give the number of courses taught per semester. The number of courses taught may vary by semester, in which case give the mean. If a course is team taught, assign it in proportion to the faculty member's relative contribution to the course. At most institutions there is a typical course size; report courses per semester in proportion to this typical course (e.g., if the typical class is three credits and the faculty member teaches one six-credit course, this would count as two).

*Percentage Effort—for each faculty member, give their percentage effort for Teaching (Column J), Research (Column K), and Service and Administration (Column L) in the Excel spreadsheet. This should reflect the workload for 9 months, and not account for any classes the faculty member may have bought out. The sum of these three columns for each faculty member should equal 100%.

*Sabbatical Frequency—give the time interval between eligibility for sabbaticals for each faculty member. For example, if a faculty member is eligible for a sabbatical every 6 years enter “6.” If a sabbatical is not an option, leave the cell blank.

*h-index—using Google Scholar, enter the h-index for each faculty member. Faculty will need to register on Google Scholar to have a profile so that the citation count can be determined. Google Scholar was selected for two primary reasons. First, faculty who have had a name change, for example due to marriage, can ensure that their publications under all used names are included. Second, Google Scholar includes books, which in some research domains are considered more important than journal publications.

*Number of Citations—using Google Scholar, enter for each faculty member their total number of citations.

B.2.3. Verification

The Program Chairperson must sign and submit the “Verification Page for Faculty and Student Data.” In addition, include a bibliography in a Word file that reports all of the publications, so the total number of publications in the bibliography is the same as the total number of Journal Publications listed for all faculty members. A separate section of the bibliography should report all of the Book Chapter Publications. Please report the publications using APA format, listed alphabetically by author, ensuring to include the names of all authors.

B.3. Program Funding Data

These data reflect the funding which the program has had for each year in the review period (2015–2019). They are reported separately for each year.

The funding reflects support provided to faculty members in the program, that is, those faculty listed under “Faculty Data” in the Excel spreadsheet.

These should reflect funds processed through the program’s budget for the review period (2015–2019).

Report the program total extramural dollars (direct costs) for all contracts, grants, training program grants, and so forth.

If a grant is for instruction rather than research, report the funds only if they support graduate education.

B.3.1. Data Entry

These data are all used in the evaluation of the programs, except for “Cost to Buyout of a Class.”

Federal Funding—list the program total extramural dollars for all federal contracts, for example, grants and program grants. These expenditures (direct costs) should be for the program faculty and should have been processed through the department’s budget for each of the past 5 years (2015, 2016, 2017, 2018, and 2019).

Nonfederal Funding—list the program total extramural dollars for all nonfederal contracts, for example, from foundations and corporations. These expenditures (direct costs) should be for the program faculty and should have been processed through the department’s budget for each of the past 5 years (2015, 2016, 2017, 2018, and 2019).

Internal Funding—list the program total university or college intramural dollars for research received for each of the past 5 years (2015, 2016, 2017, 2018, and 2019). Do not report awards from within the departmental unit; funding must come from the college or university level. Faculty start-up packages are not included in this category.

Cost to Buyout of a Class—describe in 50 words or fewer the cost for a faculty member to buy out of teaching one class. This may, for example, be a dollar amount (e.g., $10,000) or a percentage of salary (e.g., 15%).

B.3.2. Verification

The Program Chairperson and a Budget Officer must sign and submit the “NAK Doctoral Program Funding Verification Page.”

B.4. Student Group Data

For data associated with a student to be included, the student must have been enrolled in the doctoral program at some point during the review period (2015, 2016, 2017, 2018, and 2019), unless otherwise stated.

B.4.1. GRE Scores

Report GRE values using the score range from 130 to 170 (https://www.ets.org/gre/revised_general/scores/). If some students have scores based on the previous GRE reporting scale (200–800), these must be converted to the 130–170 scale (https://www.ets.org/s/gre/pdf/concordance_information.pdf).
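As an illustration of the conversion workflow, the sketch below maps prior-scale scores to current-scale scores through a lookup table; the table values are placeholders only, and the actual mappings must be taken from the ETS concordance document linked above.

```python
# Placeholder excerpt of a concordance mapping from prior-scale (200-800)
# verbal scores to current-scale (130-170) scores. These values are
# illustrative only; use the ETS concordance tables for real conversions.
VERBAL_CONCORDANCE = {
    800: 170,
    700: 165,
    600: 160,
    500: 153,
    400: 146,
}


def convert_verbal(old_score):
    """Convert a prior-scale verbal score using the (placeholder) table."""
    try:
        return VERBAL_CONCORDANCE[old_score]
    except KeyError:
        raise ValueError(f"No concordance entry for {old_score}; "
                         "consult the ETS concordance tables.")


print(convert_verbal(600))  # 160 with the placeholder table
```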

B.4.2. Data Entry

All of the following items are used for program evaluation except those marked with an asterisk (*). The asterisked items are not used for evaluation of the programs but are collected as potentially useful program demographics, which will be presented as group statistics with individual programs not identified.

*Minimum GRE Verbal Score—enter the current minimum GRE verbal score required for admission to the doctoral program. If there is no minimum, leave this cell blank.

*Minimum GRE Quantitative Score—enter the current minimum GRE quantitative score required for admission to the doctoral program. If there is no minimum, leave this cell blank.

Mean GRE Verbal Score—enter the mean GRE verbal score for all doctoral students currently in the program in calendar year 2019. If a student has completed the GRE more than once, use only the scores used to make the admission decision. In computing the mean, use all full- and part-time students enrolled in the doctoral program in this field during the 2019 calendar year (spring, summer, and/or fall 2019).

Mean GRE Quantitative Score—enter the mean GRE quantitative score for all doctoral students currently in the program in calendar year 2019. If a student has completed the GRE more than once, use only the scores used to make the admission decision. In computing the mean, use all full- and part-time students enrolled in the doctoral program in this field during the 2019 calendar year (spring, summer, and/or fall 2019).

Number of Publications—for doctoral students who were/are enrolled at any time during the 5-year review period (2015, 2016, 2017, 2018, and 2019), count the number of publications on which they were the first author. Include any publications from the 2 years following their graduation if (a) they are the first author and (b) the publication is based on work they conducted while a student in your program. Do not include abstracts, proceedings, or project reports.

Student Support—enter the total number of FTEs of graduate student (master’s and doctoral) support your program had for the calendar year 2019 (spring, summer, and fall). These could be Research Assistants, Graduate Assistants, Teaching Fellows, Teaching Assistants, and so forth.

Number of Applications—enter the number of completed doctoral applications received for the doctoral program for the 5-year review period (2015, 2016, 2017, 2018, and 2019). This is the number of applications that have reached your program’s decision point. (This might be the Graduate School, the Department Chair, or the Graduate Coordinator, etc.)

Number Accepted Students—enter the number of doctoral students who have been accepted into this doctoral program for the 5-year review period (2015, 2016, 2017, 2018, and 2019).

*Number Admitted Students—enter the number of doctoral students who were admitted into this doctoral program for the 5-year review period (2015, 2016, 2017, 2018, and 2019).

Number in Postdoctoral positions—enter the total number of doctoral graduates in the 5-year review period (2015, 2016, 2017, 2018, and 2019) who on graduating accepted postdoctoral positions. This does not include regular faculty positions.

Number with Employment in Field—enter the total number of doctoral graduates in the 5-year review period (2015, 2016, 2017, 2018, and 2019) who on graduating accepted full-time professional positions that required a doctoral degree. Examples of such positions would include university faculty positions and research positions. Positions in industry and institutes should also be included. This does not include postdoctoral positions.

B.4.3. Verification

The Program Chairperson must sign and submit the “Verification Page for Faculty and Student Data.”

B.5. Doctoral Student Demographics

This information is not used for evaluation of the programs, but is being collected as potentially useful program demographics, which will be presented as group statistics with individual programs not identified.

B.5.1. Data Entry

Data are required for each of the 5 years of the review period (2015, 2016, 2017, 2018, and 2019). The number of students is required by race/ethnicity using the same categories as the National Science Foundation. Therefore, the number of students each year, separated by sex, should be reported for each of the following categories:

  1. American Indians or Alaska Natives
  2. Asians
  3. Native Hawaiians or Other Pacific Islanders
  4. Blacks
  5. Hispanic/Latina
  6. Whites
  7. More than one Race
  8. Unknown or Not Reported

The totals for each column are automatically calculated.

B.5.2. Verification

The Program Chairperson must sign and submit the “Verification Page for Faculty and Student Data.”

B.6. Ranking Procedure

The raw data for the faculty and student variables will be converted to z scores. T-scores will then be calculated from the z scores, and weightings applied to the individual variables. The total T-score will be determined by summing across the weighted variables. Programs’ total T-scores will then be ranked in two ways:

  1. adjusted for faculty size and
  2. unadjusted for faculty size.

The faculty indices contribute 66% toward the overall score, comprising measures of productivity (30%), funding (26%), and visibility (10%). The student indices contribute 34% toward the overall score. The detailed weightings of the variables are as follows:

  • Faculty Indices (66%)

    • Productivity 30%

      • Journal publications 20%

      • Book chapters 5%

      • Presentations 5%

    • Funding 26%

      • Federal research funds 15%

      • Nonfederal research funds 8%

      • Internal grant funds 3%

    • Visibility 10%

      • Editorial boards 6%

      • NAK fellow 2%

      • Other fellowships 2%

  • Students Indices (34%)

    • Admissions 12%

      • Selectivity 2%

      • GRE verbal 5%

      • GRE quantitative 5%

    • Graduate assistant support 13%

    • Doctoral publications 2%

    • Employment 7%

      • Postdoctoral positions 4%

      • Employment in the field 3%

All other reported data are not used for ranking purposes; they will be presented using descriptive statistics without identifying individual programs.
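To make the ranking procedure concrete, the sketch below applies a z-score to T-score transformation and the weightings listed above to hypothetical program data. The T = 50 + 10z transformation is the conventional one and is assumed here; the variable names and any example data are illustrative and not taken from the survey materials.

```python
import statistics

# Weights from the list above, expressed as fractions of the total score.
WEIGHTS = {
    "journal_publications": 0.20, "book_chapters": 0.05, "presentations": 0.05,
    "federal_funds": 0.15, "nonfederal_funds": 0.08, "internal_funds": 0.03,
    "editorial_boards": 0.06, "nak_fellows": 0.02, "other_fellowships": 0.02,
    "selectivity": 0.02, "gre_verbal": 0.05, "gre_quantitative": 0.05,
    "assistant_support": 0.13, "doctoral_publications": 0.02,
    "postdoctoral_positions": 0.04, "employment_in_field": 0.03,
}


def total_t_scores(programs, adjust_for_faculty_size=False):
    """Return a {program: total T-score} dict for the supplied raw data.

    `programs` maps each program name to a dict holding a raw value for
    every key in WEIGHTS plus a "faculty_count" entry (all hypothetical).
    """
    totals = {name: 0.0 for name in programs}
    for index, weight in WEIGHTS.items():
        values = {}
        for name, data in programs.items():
            value = float(data[index])
            if adjust_for_faculty_size:
                value /= data["faculty_count"]  # normalize per faculty member
            values[name] = value
        mean = statistics.mean(values.values())
        sd = statistics.pstdev(values.values()) or 1.0  # guard zero spread
        for name, value in values.items():
            z = (value - mean) / sd      # z score for this index
            t = 50.0 + 10.0 * z          # assumed T-score transformation
            totals[name] += weight * t   # apply the index weighting
    return totals


def rank(totals):
    """Return program names ordered from highest to lowest total T-score."""
    return sorted(totals, key=totals.get, reverse=True)
```

Running `total_t_scores` twice, once with and once without `adjust_for_faculty_size`, yields the two rankings (adjusted and unadjusted) described above.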

Challis (jhc10@psu.edu) is with the Biomechanics Laboratory, Pennsylvania State University, University Park, PA, USA.

Figure 1—The number of students studying for a doctoral degree in kinesiology in the United States: (a) total number of students and (b) number of students per program. These graphs combine data from the 2015 and 2020 National Academy of Kinesiology surveys.

Figure 2—The percentage of students studying for a doctoral degree in kinesiology in the United States representing different groups based on race: (a) contribution to the total of White students and (b) contribution to the total of the other race groups in the survey. These graphs combine data from the 2015 and 2020 National Academy of Kinesiology surveys.

Bassett, D.R., Fairbrother, J.T., Panton, L.B., Martin, P.E., & Swartz, A.M. (2018). Undergraduate enrollments and faculty resources in kinesiology at selected U.S. public universities: 2008–2017. Kinesiology Review, 7(4), 286–294. doi:10.1123/kr.2018-0043

Dent, E. (2013). The observation, inquiry, and measurement challenges surfaced by complexity theory. In Managing the complex: Philosophy, theory and practice (Vol. 1, pp. 253–283). Greenwich, CT: Information Age Publishing.

Finkelstein, M.J., Conley, V.M., & Schuster, J.H. (2016). The faculty factor: Reassessing the American academy in a turbulent era. Baltimore, MD: Johns Hopkins University Press.

Fuesting, M.A., & Schmidt, A. (2020). Faculty in the health professions: Growth, composition, and salaries. Knoxville, TN: College and University Professional Association for Human Resources.

Hall, J.D., O’Connell, A.B., & Cook, J.G. (2017). Predictors of student productivity in biomedical graduate school applications. PLoS One, 12(1), e0169121. PubMed ID: 28076439 doi:10.1371/journal.pone.0169121

Hirsch, J.E. (2005). An index to quantify an individual’s scientific research output. Proceedings of the National Academy of Sciences, 102(46), 16569–16572. doi:10.1073/pnas.0507655102

Kulczycki, E., Engels, T.C.E., Palanan, J., Bruun, K., Dušková, M., Guns, R., … Zuccala, A. (2018). Publication patterns in the social sciences and humanities: Evidence from eight European countries. Scientometrics, 116(1), 463–486. doi:10.1007/s11192-018-2711-0

Langin, K. (2019). Ph.D. programs drop standardized exam. Science, 364(6443), 816. PubMed ID: 31147501 doi:10.1126/science.364.6443.816

Miller, C., & Stassun, K. (2014). A test that fails. Nature, 510(7504), 303–304. doi:10.1038/nj7504-303a

Moneta-Koehler, L., Brown, A.M., Petrie, K.A., Evans, B.J., & Chalkley, R. (2017). The limitations of the GRE in predicting success in biomedical graduate school. PLoS One, 12(1), e0166742. PubMed ID: 28076356 doi:10.1371/journal.pone.0166742

National Center for Science and Engineering Statistics. (2019). Doctorate recipients from U.S. universities: 2018. Alexandria, VA: National Science Foundation.

Newell, K.M. (1990). Kinesiology: The label for the study of physical activity in higher education. Quest, 42(3), 269–278.

Ostriker, J.P., Kuh, C.V., & Voytuk, J.A. (2011). A data-based assessment of research-doctorate programs in the United States. Washington, DC: The National Academies Press.

Radicchi, F., & Castellano, C. (2012). Testing the fairness of citation indicators for comparison across scientific domains: The case of fractional citation counts. Journal of Informetrics, 6(1), 121–130. doi:10.1016/j.joi.2011.09.002

Seglen, P.O. (1997). Why the impact factor of journals should not be used for evaluating research. BMJ, 314(7079), 497. doi:10.1136/bmj.314.7079.497

Spirduso, W.W., & Reeves, T.G. (2011). The National Academy of Kinesiology 2010 review and evaluation of doctoral programs in kinesiology. Quest, 63(4), 411–440. doi:10.1080/00336297.2011.10483689

Thomas, J.R., Clark, J.E., Feltz, D.L., Kretchmar, R.S., Morrow, J.R., Reeves, T.G., & Wade, M.G. (2007). The academy promotes, unifies, and evaluates doctoral education in kinesiology. Quest, 59(1), 174–194. doi:10.1080/00336297.2007.10483547

Thomas, J.R., & Reeves, T.G. (2006). A review and evaluation of doctoral programs 2000–2004 by the American Academy of Kinesiology and Physical Education. Quest, 58(1), 176–196. doi:10.1080/00336297.2006.10491878

Ulrich, B.D., & Feltz, D.L. (2016). The National Academy of Kinesiology 2015 review and evaluation of doctoral programs in kinesiology. Kinesiology Review, 5(2), 101–118. doi:10.1123/kr.2016-0004

Voytuk, J.A., Kuh, C.V., Ostriker, J.P., & National Research Council. (2003). Assessing research-doctorate programs: A methodology study. Washington, DC: National Academies Press.