Using Machine Learning and Behavioral Patterns Observed by Automated Feeders and Accelerometers for the Early Indication of Clinical Bovine Respiratory Disease Status in Preweaned Dairy Calves

Melissa C. Cantor, Enrico Casella, Simone Silvestri, David L. Renaud, Joao Costa

Research output: Contribution to journal › Article › peer-review

4 Scopus citations


The objective of this retrospective cohort study was to evaluate a K-nearest neighbor (KNN) algorithm for classifying and indicating bovine respiratory disease (clinical BRD) status using behavioral patterns in preweaned dairy calves. Calves (N=106) were enrolled in this study, which occurred at one facility for the preweaning period. Precision dairy technologies were used to record feeding behavior with an automated feeder and activity behavior with a pedometer (automated features). Daily, calves were manually health-scored for bovine respiratory disease (clinical BRD; Wisconsin scoring system, WI, USA), and weights were taken twice weekly (manual features). All calves were also scored for ultrasonographic lung consolidation twice weekly. A clinical BRD bout (day 0) was defined as 2 scores classified as abnormal on the Wisconsin scoring system and an area of consolidated lung ≥3.0 cm2. There were 54 calves diagnosed with a clinical BRD bout. Two scenarios were considered for KNN inference. In the first scenario (diagnosis scenario), the KNN algorithm classified calves as clinical BRD positive or as negative for respiratory infection. In the second scenario (preclinical BRD bout scenario), the 14 days before a clinical BRD bout were evaluated to determine whether behavioral changes were indicative of calves destined for disease. Both scenarios investigated automated features, manual features, or both. For the diagnosis scenario, manual features offered negligible improvement over automated features, with accuracies of 0.95 ± 0.02 and 0.94 ± 0.02, respectively, for classifying calves as negative for respiratory infection. Automated and manual features achieved equal accuracy (0.98 ± 0.01) for classifying calves as sick. For the preclinical BRD bout scenario, automated features were highly accurate at -6 days prior to diagnosis (0.90 ± 0.02), while manual features had low accuracy at -6 days (0.52 ± 0.03). Automated features were near perfectly accurate at -1 day before clinical BRD diagnosis, compared to the high accuracy of manual features (0.86 ± 0.03). This research indicates that machine-learning algorithms can accurately predict clinical BRD status up to 6 days before diagnosis using a combination of feeding behaviors and activity levels in calves. Precision dairy technologies hold the potential to indicate BRD status in preweaned calves.
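The classification approach described above can be sketched in a few lines. The snippet below is a minimal illustration of KNN classification on standardized behavioral features, not the study's actual dataset, feature set, or preprocessing pipeline; the feature names, distributions, and parameter choices (e.g. `n_neighbors=5`) are hypothetical assumptions for demonstration only.

```python
# Minimal sketch of KNN classification of calf health status from
# behavioral features. All feature names and data are simulated
# illustrations, not the study's actual dataset.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Simulated "automated features" per calf-day: milk intake (L), drinking
# speed (mL/min), unrewarded feeder visits, and daily step count.
# Sick calves are simulated with depressed intake and activity.
n = 400
healthy = rng.normal([8.0, 700, 4, 1500], [1.5, 120, 2, 300], size=(n, 4))
sick = rng.normal([5.5, 550, 2, 1000], [1.5, 120, 2, 300], size=(n, 4))
X = np.vstack([healthy, sick])
y = np.array([0] * n + [1] * n)  # 0 = negative, 1 = clinical BRD

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# Standardize features so Euclidean distance is not dominated by
# large-magnitude features such as step count.
scaler = StandardScaler().fit(X_train)
knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(scaler.transform(X_train), y_train)

accuracy = knn.score(scaler.transform(X_test), y_test)
print(f"held-out accuracy: {accuracy:.2f}")
```

Standardization before the distance computation matters here because KNN is scale-sensitive; without it, a feature like step count would dominate milk intake in the neighbor search.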

Original language: English
Article number: 852359
Journal: Frontiers in Animal Science
State: Published - 2022

Bibliographical note

Funding Information:
The authors also thank IceRobotics for the pedometer data. The authors would like to thank Megan Woodrum Setser, Charlotte Pertuisel, Justine Alary, Clemence Dudoit, Mathilde Campedelli, Giulia Gobbo Rodrigues, Anna Hawkins, Gustavo Mazon, Emily Rice, Maria Eduarda Reis, and all other staff and students for assistance with the trial.

Funding Information:
The research for this study was funded by a United States Department of Agriculture NIFA Hatch Grant Project KY007100 at the University of Kentucky and by the National Science Foundation Smart and Connected Communities grant nr. 1952045 “SCC-IRG Track 2: Smart Integrated Farm Network for Rural Agricultural Communities (SIRAC).”

Publisher Copyright:
Copyright © 2022 Cantor, Casella, Silvestri, Renaud and Costa.

Keywords

  • activity
  • cattle
  • disease detection
  • pneumonia
  • precision livestock farming
  • sickness behavior
  • technology

ASJC Scopus subject areas

  • Animal Science and Zoology
