Animal Behavior Reliability
  • Home
  • About
  • Foundations
    • Proposal
    • Measurements >
      • Definitions
    • Team makeup
    • Training >
      • Features of test subsets
      • Assessment
    • Metrics
  • Diving deeper
    • Iterative training processes >
      • Tasks and techniques
      • Categorical data
      • Continuous data
      • Rare outcomes
    • Timeline
    • Troubleshooting
    • Reporting
  • Checklist
  • Resources

Reporting

Reliability training has many layers, and it can be difficult to decide how much of that detail to report in publications. Our opinions on this matter have evolved over time, and we now consider the following features to be best practice, some of which we are still striving to include in our own manuscripts.
  • How many observers were used?
  • What metric was used, and what was the cutoff value that observers had to meet for each outcome to complete training?
  • What was the actual reliability value for each outcome of interest? 
  • Did the cutoff value or training process change for any behaviors or outcomes (e.g., ones that were rare or difficult to train)?
  • How robust was the training subset? Here, we consider the number of photos or videos, the duration of videos, the modality of the training subset compared to the true data collection modality, whether the photos or videos captured the full array of outcomes that observers may encounter, and whether each outcome was represented multiple times.
  • Was any initial orientation training used, or was there a retraining process if observers did not meet the cutoff value for each outcome?
  • If multiple levels of training were used, which dataset was used to generate the reported metric?
  • When did training occur: before, during, and/or after data scoring?
  • If we found poor reliability at any stage of our project timeline that affects data being presented in the manuscript, is it acknowledged and discussed?
  • Can we provide photo or video examples of our definitions to demonstrate what they look like in practice? Photo examples, such as photos documenting wound progression for a scoring system, are often included directly in the manuscript, while video examples are typically hosted in an external repository, like Dryad.
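For categorical outcomes, one commonly reported reliability metric is Cohen's kappa, which corrects observed agreement for the agreement expected by chance. As a minimal sketch (the observer scores below are invented for illustration, not taken from our publications), kappa between two observers can be computed as follows:

```python
from collections import Counter

def cohens_kappa(obs1, obs2):
    """Cohen's kappa for two observers' categorical scores.

    kappa = (p_o - p_e) / (1 - p_e), where p_o is the observed
    proportion of agreement and p_e is the agreement expected by
    chance, based on each observer's marginal score frequencies.
    """
    assert len(obs1) == len(obs2), "observers must score the same items"
    n = len(obs1)
    # observed agreement: fraction of items both observers scored identically
    p_o = sum(a == b for a, b in zip(obs1, obs2)) / n
    # chance agreement: product of each observer's marginal proportions
    c1, c2 = Counter(obs1), Counter(obs2)
    p_e = sum(c1[k] * c2[k] for k in c1) / n**2
    return (p_o - p_e) / (1 - p_e)

# hypothetical scores from two observers on 10 video clips
a = ["lying", "standing", "lying", "walking", "lying",
     "standing", "walking", "lying", "standing", "lying"]
b = ["lying", "standing", "lying", "lying", "lying",
     "standing", "walking", "lying", "walking", "lying"]
print(round(cohens_kappa(a, b), 2))  # → 0.67
```

Reporting the metric alongside the cutoff (e.g., "observers had to reach kappa ≥ 0.8 on each outcome before scoring began") gives readers enough information to judge the training process.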

Examples

Here are some examples from our own publications where we have described aspects of our reliability process. Our examples are not flawless; we strive for continuous improvement with each publication. Each photo below shows our description of what we did well. Beneath the photo, we describe what information we ideally would have included. Hindsight is 20/20.
Example 1: Downey and Tucker, 2023a
[Photo: excerpt from Downey and Tucker, 2023a]
Elsewhere in this paper, we described additional data collection, including weighing calves with a scale and measuring daily feed and water intake. We did not include information on how we trained our team to accurately collect these data, despite having detailed SOPs and training processes for these tasks. In the future, including that information would make the reporting more robust.
Example 2: Harmon et al., 2023 
[Photo: excerpt from Harmon et al., 2023]
Elsewhere in this paper, we described how we obtained weekly photos to score. We noted that researchers were trained on that photography using a task-based standard operating procedure that demonstrated required photo angles and distance from the calf, to generate consistent images. We also noted that all photos were taken with a ring light, to ensure consistent lighting across all photos.
Example 3: Downey and Tucker, 2023b
[Photo: excerpt from Downey and Tucker, 2023b]