From Research to Application of Wearable-Derived Digital Health Measures—A Perspective From ActiGraph

Jeremy Wyatt, ActiGraph LLC, Pensacola, FL, USA (https://orcid.org/0000-0002-5111-0125)
Christine C. Guo, ActiGraph LLC, Pensacola, FL, USA (https://orcid.org/0000-0003-1530-0172)*

ActiGraph counts were first conceptualized in 1996 to provide an accelerometer-derived metric that quantifies physical activity by intensity. ActiGraph incorporated this metric into its product suite, enabling its wide adoption in research studies. Over the last 20 years, ActiGraph activity counts have become one of the most common metrics and building blocks of health outcome measures used in wearable research, with >24,000 journal articles published (based on a Google Scholar search in 2023). Recently, this field of research has been moving increasingly toward clinical application, with wearable-derived metrics growing in use in industry-sponsored clinical trials, including several use cases endorsed by regulatory authorities. We celebrate this emerging trend, as these patient-generated measures help reduce trial burden and enhance the meaningfulness of developed medical products to patients. However, true adoption of digital measures in industry research is only in its infancy and still faces many challenges. As a digital health technology provider, ActiGraph has launched several strategic initiatives to support the research community in overcoming these challenges and accelerating the translation of research to clinical application. The open-source release of the ActiGraph count algorithm was one of those initiatives. In this commentary, we take the opportunity to share our perspective on supporting the research community with this metric over the last 20 years, the motivation for making it open source, and what we are building to accelerate clinical adoption and realize the promise of better patient care.

A Brief History of Counts in Research

The ActiGraph “activity count” was first conceptualized in the original article by Tryon and Williams (1996). Counts were created as summary metrics of movement acceleration to assess the intensity of physical activity. By design, accelerometers are sensitive to a wide range of motions, from slow human movements (e.g., standing up from a seated position) to fast mechanical vibration (e.g., in a vehicle). The count algorithm introduces a simplified data reduction process to isolate signals related to human movement and produce summary metrics proportional to movement intensity. This process preserves acceleration in the low-frequency band characteristic of human movement (center frequency = 0.75 Hz) and attenuates higher-frequency signals such as mechanical vibration. Data are further compressed by summing the filtered acceleration values over fixed time intervals (epochs, typically 1 min long for adult research and shorter for pediatric research), thereby producing an epoch-level (e.g., minute-level) summary view of human movement.
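To make this data reduction concrete, the sketch below illustrates the general pattern of band-pass filtering followed by epoch aggregation in Python. It is a simplified illustration only: the Butterworth coefficients, pass band, and scaling shown are assumptions for demonstration and are not the actual ActiGraph count filter (the open-source implementation is discussed later in this commentary).

# Simplified illustration of accelerometer data reduction into epoch summaries.
# NOTE: generic band-pass coefficients and scaling chosen for illustration;
# this is NOT the ActiGraph count filter.
import numpy as np
from scipy.signal import butter, lfilter

def epoch_summaries(accel_g, fs=30, epoch_s=60, band=(0.25, 2.5)):
    """Band-pass filter a single-axis acceleration signal (in g) and sum the
    rectified output over fixed-length epochs."""
    # Band-pass filter centered on typical human-movement frequencies,
    # attenuating both the static gravity component and fast vibration.
    b, a = butter(N=4, Wn=band, btype="bandpass", fs=fs)
    filtered = lfilter(b, a, accel_g)

    # Rectify, then sum within non-overlapping epochs (e.g., 60 s).
    rectified = np.abs(filtered)
    samples_per_epoch = int(fs * epoch_s)
    n_epochs = len(rectified) // samples_per_epoch
    trimmed = rectified[: n_epochs * samples_per_epoch]
    return trimmed.reshape(n_epochs, samples_per_epoch).sum(axis=1)

# Example: 10 minutes of synthetic data at 30 Hz (gravity plus noise, in g).
rng = np.random.default_rng(0)
signal = 1.0 + 0.1 * rng.standard_normal(30 * 60 * 10)
print(epoch_summaries(signal))  # one summary value per 60 s epoch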

Under the business name “Computer Science and Applications, Inc.” (CSA), Tryon and Williams worked to create a version of an “ActiGraph,” the 7164 model, which incorporated this counts algorithm onboard. By the time ActiGraph was incorporated in 2004, many peer-reviewed publications had already been produced on the basis of this metric. Freedson and colleagues, for instance, published the first cut-point calibration in 1998 (“Calibration of the Computer Science and Applications, Inc., accelerometer”), deriving intensity cut points for the count values. Others followed suit, including the National Institutes of Health, which leveraged these data in the National Health and Nutrition Examination Survey in 2002–2003. To support the scale of global research using these products, we developed a desktop application (ActiLife) in 2004 that incorporated these count-based algorithms into the software (originally through Excel macros and later through a graphical user interface [GUI]).
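As an illustration of how such cut points are applied in practice, the short sketch below classifies minute-level count values using the commonly cited Freedson et al. (1998) vertical-axis thresholds of 1,952 and 5,725 counts/min for moderate and vigorous activity; the thresholds should be verified against the original calibration paper before use, and the function name is ours.

# Illustrative cut-point classification of minute-level activity counts.
# Thresholds are the commonly cited Freedson et al. (1998) vertical-axis
# values (moderate >= 1952, vigorous >= 5725 counts/min); confirm them against
# the original calibration paper and your study population before applying.
def classify_minute(counts_per_min: int) -> str:
    if counts_per_min >= 5725:
        return "vigorous"
    if counts_per_min >= 1952:
        return "moderate"
    return "light_or_sedentary"

minutes = [130, 2400, 6100, 800]
print([classify_minute(c) for c in minutes])
# ['light_or_sedentary', 'moderate', 'vigorous', 'light_or_sedentary']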

As needs and technology evolved, we continued to refine our products. In 2005, we built the GT1M as a digital replica of the original 7164 model. The GT1M leveraged a single-axis microelectromechanical system (MEMS) accelerometer (replacing the problematic piezoelectric substrate present in the 7164 model), a rechargeable battery, a USB interface, and a digital version of the eight-pole analog filter used in the counts algorithm. When we released our first device with a three-axis accelerometer (GT3X) in 2009, we adapted the count algorithm to the vector magnitude across the three axes. To give researchers the flexibility to customize the calculation of counts to their specific needs, we built in options such as sample rate, epoch length (which had already been contemplated in various publications), and the selection of the axis in the software, along with various controls related to the behavior of the device, including flashing the LED and updating the firmware remotely. As the research community evolved, we incorporated additional cut points and algorithms derived from peer-reviewed publications. In 2013, when we developed CentrePoint for multisite clinical trials, we ported the most popular count-based algorithms into a cloud-based system to allow for decentralized deployment of the activity monitors and quasi-real-time monitoring of data collection.

Throughout these 20 years of hardware and software evolution, we have learned to place backward compatibility of the data, especially activity counts, at the center of our strategy. Scientific research, especially clinical research, places particularly high demands on the longevity of data. Clinical evidence takes a long time and many iterations across studies to generate and solidify. Therefore, our community is often concerned about whether data generated from one study can be compared with studies conducted in the future, much more so than users in the consumer wearables space. To meet this requirement, the hardware used to generate the sensor data and the software (algorithms) used to process the sensor data would ideally not change. On the other hand, technology evolves at a much faster pace than clinical research, and changes are required to take advantage of the benefits of new technology (hardware and software). In the early days, when accelerometer technology was evolving and improving rapidly, we incorporated new features into new hardware models when they offered substantial benefit to researchers. Some of these evolutions introduced changes between devices (Kozey et al., 2010; Rothney et al., 2008; Tanha, Tornberg, Dencker, et al., 2013; Tanha, Tornberg, Wollmer, et al., 2013), but we always tried to learn from each product release and find better solutions for subsequent iterations. When onboard storage of raw sensor data became technically feasible, thanks to the introduction of NAND flash memory and microcontrollers with high-speed USB, we introduced it into our product line in 2011. The retention of raw accelerometer data provided a robust solution to long-term data comparability and the flexibility to adopt different algorithms, and it became the default configuration among research-grade device manufacturers, such as Activinsights, Axivity, and PAL Technologies. To further facilitate comparability across devices and backward compatibility over time, we are collaborating with academic researchers to develop testing standards for raw accelerometer data, building on our internal processes (LaMunion et al., 2022).

Similarly, we have tried to keep “Activity Counts” backward compatible. While there have been critiques of the methodology used to compute ActiGraph counts, and other types of summary metrics have been developed by the research community (Bai et al., 2016; Brønd & Arvidsson, 2016; John et al., 2019; Migueles et al., 2019), we have chosen to keep the computation of counts the same and have taken great care to ensure its consistency across generations of hardware and software. Its robustness to noise and ease of use serve the purpose of capturing the general activities of daily living that concern most clinical and public health researchers. To date, activity counts have been incorporated and used in tens of thousands of research studies. Based on a rough estimate generated by searching the full text on Google Scholar in 2023 for the keywords “ActiGraph” and “Counts,” counts were used in hundreds of research studies each year in the 2000s, and this number has steadily increased to an impressive 3,000+ per year over the last 5 years, covering a diverse array of topics across public health, clinical trials, rehabilitation, behavioral intervention, and even animal research (Figure 1). The consistency of counts makes it easy to compare findings across these generations of research studies.

Figure 1. Number of journal articles that include “ActiGraph” and “Counts” in the full text by year, based on Google Scholar.

Forward-Looking Perspective

Despite flourishing productivity in academia, actigraphy and wearables were rarely included in industry-sponsored clinical research for most of the early years. Around 2015, a small yet steady trend toward using wearables started to appear in the clinical trial industry, primarily driven by a motivation to replace traditional clinical endpoints, which are often decades old and have known limitations in their psychometric properties (Figures 2 and 3; Masanneck et al., 2023; Mittermaier et al., 2023; Woelfle et al., 2023). This need is particularly high in neurology and rare diseases, where drug development is critically hindered by the lack of clinical endpoints that can accurately capture the course of disease progression. In addition, wearable-derived disease measures are also on the rise in respiratory, cardiovascular, and rheumatology clinical trials, among other chronic conditions. Driven by the Food and Drug Administration’s patient-focused drug development guidance, drug developers are now expected to demonstrate treatment benefits that are meaningful to patients. The ability of wearable devices to monitor patients’ function in their daily lives makes them an attractive tool to meet this need.

Figure 2. Cumulative number of industry-sponsored clinical trials (ActiGraph internal database).

Figure 3. Number of clinical trials that include data collection using actigraphy or wearable devices by year, based on the ClinicalTrials.gov database.

To truly leverage wearable-derived measures as clinical endpoints, however, researchers and regulators require fit-for-purpose evidence supporting their validity. Despite the extensive use of wearables in academic research, there are still substantial evidential gaps in the scientific literature. This evidential gap and barrier to clinical adoption were highlighted and examined in a recent systematic review by Woelfle et al. (2023), with a focus on multiple sclerosis. The authors identified >300 studies published on the use of wearables to assess motor performance in multiple sclerosis, with rapid growth in the last 3 years. ActiGraph counts were used in almost half of this literature. On the other hand, only a very small number of industry studies in multiple sclerosis have included sensor data as clinical endpoints, and few have been applied to decision making in clinical trials or care (Morant et al., 2019; Woelfle et al., 2023; Zhang et al., 2019). The authors cited many barriers to the full clinical adoption of digital monitoring, including the reproducibility of results, lack of external validation, and cost of digital devices, and suggested a potential guideline to bridge these gaps.

ActiGraph’s Mission

ActiGraph’s mission is to pioneer the digital transformation of clinical research, ultimately making a real-world impact that can benefit people’s lives. This mission can only be accomplished by working closely with our research community. We hereby offer some perspectives as a technology company and describe the initiatives we have put in place to accelerate this translation and bridge the evidential gaps to clinical adoption.

Research Comparability and Longevity

As discussed above, it can take multiple iterations and years of evidence generation to validate a new tool for clinical application, and technology is likely to evolve during this time span. Therefore, it is essential to anticipate, and ideally control for, technology evolution during study planning. In support of this, we will continue to prioritize the following features in our services.

  a. Access to raw data. We have discussed the importance of raw data retention and the limitations of digital health research without it (Guo, 2022; Lee et al., 2023). We will continue to make raw data available to our users, including data from new sensor modalities in our devices, such as photoplethysmography.
  b. Version control and documentation. Our cloud-based solution, CentrePoint, can lock the firmware and software version for the duration of a study. Hardware models and software versions should be reported in the Methods section of a publication.
  c. Backward compatibility. To provide our users with the benefits of new technology, we need to release new device models, data processing algorithms, and software platforms. As discussed previously, we always have and will continue to lean toward backward compatibility. However, in cases where the benefit of new technology is clear, as is often the case with processing algorithms, we will recommend adopting the new technology along with documenting the updates and comparing them to previous generations.
  d. Open-source algorithms. We will continue our efforts to make some of our algorithms open source, such as ActiGraph counts, and those updates will be made available on our GitHub page (https://github.com/actigraph/); a usage sketch follows this list.
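For readers who want to try the open-source counts implementation, the minimal sketch below shows how it can be called from Python. It assumes the pip-installable agcounts package from our GitHub organization exposes a get_counts(raw, freq, epoch) helper in agcounts.extract; please verify the module path, signature, and supported sampling rates against the repository README for the version you install.

# Minimal sketch of computing open-source ActiGraph counts from raw
# tri-axial acceleration data (in g). Assumes the pip-installable
# `agcounts` package exposes `get_counts(raw, freq, epoch)`; verify the
# module path and signature against the repository README.
import numpy as np
from agcounts.extract import get_counts  # pip install agcounts

fs = 30      # sampling rate of the raw data, in Hz
epoch = 60   # epoch length, in seconds

# Placeholder: 10 minutes of synthetic (n_samples, 3) acceleration in g.
raw = np.random.default_rng(0).normal(0.0, 0.1, size=(fs * 60 * 10, 3))

counts = get_counts(raw, freq=fs, epoch=epoch)
print(counts.shape)  # expected: one row per epoch, one column per axis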

Algorithm Library and Scalability

The demand for algorithms is growing rapidly as wearables are applied to capture different health concepts such as gait, tremor, and scratch. To meet this demand, we modularized the data processing back end of our cloud-based platform, CentrePoint, in 2023. This new system allows us to implement algorithms as containers: lightweight, standalone, executable software packages that are not constrained to a specific programming language. Furthermore, these containers are versioned, meaning that new versions of an algorithm can be introduced without impacting ongoing studies. This has substantially sped up the expansion of our algorithm library, especially for algorithms incorporating new sensor modalities such as photoplethysmography for vital sign monitoring.
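As a purely hypothetical illustration of this pattern, the sketch below shows what a containerized algorithm entry point could look like: a standalone script with a pinned version identifier that reads raw data from an input path and writes epoch-level results to an output path. The file layout, names, and version constant are illustrative assumptions and do not describe the actual CentrePoint container interface.

# Hypothetical, illustrative entry point for a containerized algorithm:
# a standalone script that reads raw sensor data from an input path,
# applies a versioned algorithm, and writes summary results.
# Names, paths, and the version constant are assumptions for illustration.
import json
import sys

ALGORITHM_NAME = "example-step-count"
ALGORITHM_VERSION = "1.2.0"  # pinning a version protects ongoing studies

def run(input_path: str, output_path: str) -> None:
    with open(input_path) as f:
        samples = json.load(f)  # e.g., a list of per-sample sensor records
    # ... algorithm logic would go here ...
    result = {
        "algorithm": ALGORITHM_NAME,
        "version": ALGORITHM_VERSION,
        "n_samples": len(samples),
    }
    with open(output_path, "w") as f:
        json.dump(result, f)

if __name__ == "__main__":
    run(sys.argv[1], sys.argv[2])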

We understand that there is significant interest from the research community in implementing many newly developed algorithms. In the near term, this can be done by our data science team upon service request. In the long term, we hope to extend this capability to the community so that researchers can implement their own algorithms with flexibility. We also plan to expand the algorithm library in ActiLife, but such expansion in a desktop solution like ActiLife is unlikely to match the pace and scalability enabled by the cloud-based container solution in CentrePoint.

Scientific Partnerships

We have the privilege of working closely with researchers in both academia and industry, as well as colleagues in regulatory agencies. We believe this puts us in a unique position to bridge the evidential gaps to clinical application by uniting our scientific community through partnerships. For many years, we have implemented algorithms developed by academic researchers in our platform to provide broad access to the community, including in the hundreds of industry-sponsored clinical trials we support. In 2022, we launched our first partnership initiative, the Digital Endpoint Accelerator Research program, to specifically tackle the gaps in the validation of processing algorithms in various clinical indications. We have also launched an annual scientific conference, the ActiGraph Digital Data Summit, to bring researchers together to network, exchange ideas, and inspire partnerships. Beyond these dedicated initiatives, we are always keen to hear ideas on scientific partnerships, joint funding efforts, and other ways to work together. Researchers are welcome to send their ideas to dear@theactigraph.com.

Conclusions

It is an exciting time for the field of digital health sciences. The technology is established and well accepted by society, the regulatory environment is maturing, and the need to transform clinical trials and care is greater than ever. We believe that by working together and being intentional about evidence generation, we will see not only continuing growth in digital health research but also the incorporation of digital data into decision making in clinical development and care, and ultimately faster and better care for the people who need it.

References

Bai, J., Di, C., Xiao, L., Evenson, K.R., LaCroix, A.Z., Crainiceanu, C.M., & Buchner, D.M. (2016). An activity index for raw accelerometry data and its comparison with other activity metrics. PLoS One, 11(8), Article 160644.

Brønd, J.C., & Arvidsson, D. (2016). Sampling frequency affects the processing of Actigraph raw acceleration data to activity counts. Journal of Applied Physiology, 120(3), 362–369.

Guo, C. (2022). Maximize your R&D investment in DHTs through the collection and retention of raw sensor data. https://6407355.fs1.hubspotusercontent-na1.net/hubfs/6407355/White%20Papers/AG_WhitePapers_RawData.pdf

John, D., Tang, Q., Albinali, F., & Intille, S. (2019). An open-source monitor-independent movement summary for accelerometer data processing. Journal for the Measurement of Physical Behaviour, 2(4), 268–281.

Kozey, S.L., Staudenmayer, J.W., Troiano, R.P., & Freedson, P.S. (2010). Comparison of the ActiGraph 7164 and the ActiGraph GT1M during self-paced locomotion. Medicine & Science in Sports & Exercise, 42(5), 971–976.

LaMunion, S., Nguyen, J., Brychta, R., Troiano, R., Friedl, K., & Chen, K. (2022). Comparing ActiGraph CentrePoint Insight Watch, GT9X Link, and wGT3X-BT accelerometers to NHANES 2011–2014 GT3X+ devices using an orbital shaker. National Institute of Diabetes and Digestive and Kidney Diseases, 5, 300–400.

Lee, P.H., Neishabouri, A., Tse, A.C.Y., & Guo, C.C. (2023). Comparative analysis and conversion between Actiwatch and ActiGraph open-source counts. Journal for the Measurement of Physical Behaviour, 1, Article 54.

Masanneck, L., Gieseler, P., Gordon, W.J., Meuth, S.G., & Stern, A.D. (2023). Evidence from ClinicalTrials.gov on the growth of digital health technologies in neurology trials. NPJ Digital Medicine, 6(1), Article 1.

Migueles, J.H., Rowlands, A.V., Huber, F., Sabia, S., & van Hees, V.T. (2019). GGIR: A research community–driven open source R package for generating physical activity and sleep outcomes from multi-day raw accelerometer data. Journal for the Measurement of Physical Behaviour, 2(3), 188–196.

Mittermaier, M., Venkatesh, K.P., & Kvedar, J.C. (2023). Digital health technology in clinical trials. NPJ Digital Medicine, 6(1), Article 1.

Morant, A.V., Jagalski, V., & Vestergaard, H.T. (2019). Labeling of disease-modifying therapies for neurodegenerative disorders. Frontiers in Medicine, 6, Article 223. https://www.frontiersin.org/articles/10.3389/fmed.2019.00223

Rothney, M.P., Apker, G.A., Song, Y., & Chen, K.Y. (2008). Comparing the performance of three generations of ActiGraph accelerometers. Journal of Applied Physiology, 105(4), 1091–1097.

Tanha, T., Tornberg, Å., Dencker, M., & Wollmer, P. (2013). Accelerometer measured daily physical activity and sedentary pursuits—Comparison between two models of the Actigraph and the importance of data reduction. BMC Research Notes, 6, Article 439.

Tanha, T., Tornberg, Å.B., Wollmer, P., & Dencker, M. (2013). Head-to-head comparison between Actigraph 7164 and GT1M accelerometers in adolescents. Clinical Physiology and Functional Imaging, 33(2), 162–165.

Tryon, W.W., & Williams, R. (1996). Fully proportional actigraphy: A new instrument. Behavior Research Methods, Instruments, & Computers, 28(3), 392–403.

Woelfle, T., Bourguignon, L., Lorscheider, J., Kappos, L., Naegelin, Y., & Jutzeler, C.R. (2023). Wearable sensor technologies to assess motor functions in people with multiple sclerosis: Systematic scoping review and perspective. Journal of Medical Internet Research, 25(1), Article 44428.

Zhang, Y., Salter, A., Wallström, E., Cutter, G., & Stüve, O. (2019). Evolution of clinical trials in multiple sclerosis. Therapeutic Advances in Neurological Disorders, 12, Article 547.