Abstract

While some studies in kinesiology have investigated the accessibility of educational materials and electronic applications used to promote physical activity (eg, ease of perceiving and navigating content), few studies report on the accessibility of survey tools before their use in research. PURPOSE: The purpose of this study was to measure the accessibility of one survey website used in a study directly investigating comprehension of mock physical activity promotion material created by Thomas et al. (2023). METHODS: The website enabled study participants to complete an online version of a cloze form (for a visual, see Nielsen, 2011; eg, see Cardinal et al., 1995). The accessibility check, conducted before the research website launched, had two phases: (1) a valid and reliable quantitative accessibility rating form was administered by the research team (Jul - Aug 2022; Wu et al. 2022a & 2022b), and (2) after edits were made based on the rating form findings, a pilot test of the website survey instruments was conducted with mock end users (Feb - Mar 2023). Mock end users (n = 12) were volunteers from the first author's research lab who were invited to give qualitative feedback on the site's usability before leaving the website (eg, on webpage/site navigation, on instruction clarity). Ten gave website feedback. The analytic plan was to (1) identify descriptive trends in the mock end user feedback and (2) relate feedback trends to rating form criteria scores with unanimous consensus, as a measure of similarity between the findings of the two phases (ie, Phase 1 vs Phase 2). RESULTS: Phase 1 data analysis suggested each webpage fully met accessibility standards in 7 subareas (eg, plain language use, clear navigation).
Phase 2 analysis supported most conclusions derived from the rating form results (eg, clear instructions and layout) but challenged others (eg, the cloze form was only somewhat accessible because the website platform, Canvas, required scrolling once finished, not because of a missing frequently asked questions page). The pilot test showed that text at a lower reading level (before edits) yielded lower comprehension, whereas text at a higher level showed good validity. CONCLUSION: Our results suggest the rating form can help ensure websites have adequate accessibility. Findings also underscored the importance of pilot testing research instruments with mock end users outside the research team.

