
V. DATA COLLECTION INSTRUMENTS FOR SITE VISITS

Five States (Delaware, Idaho, Maryland, Nevada, and Oregon) were selected for site visits because of their high overall promising-practices scores for rider education and licensing (see table 2).2 The purpose of the site visits was to gather detailed information from administrators, instructors, and students about the features of programs that deliver high-quality, effective training. Because extensive data about the characteristics of rider education and licensing programs had already been collected in phase 1, the site visits focused on the specific processes and policies through which the five promising-practices States achieve a quality rider training program and rigorous licensing procedures.

Data collection instruments were developed for each of the three groups of respondents: State motorcycle rider education administrators, instructors, and students. The instruments were written to capture information about the key features of motorcycle rider education and licensing identified in the promising-practices model, which comprises 13 features spanning program administration, rider education, and licensing (see figure 1). The first step in creating the instruments was to identify the group of respondents (administrators, instructors, or students) best positioned to provide insight into each feature. The goal of collecting detailed information had to be balanced against the burden placed on respondents. State administrators, for example, have a broad range of knowledge that spans all the key promising-practices features, yet an instrument that addressed all 13 features would reach a point of diminishing returns because of the interview length and the resulting respondent fatigue.

Table 4 links the key promising-practices features to the three groups of respondents. As the table indicates, 5 of the 13 features were assigned to two or more groups of respondents. This strategy not only yielded more extensive data about particular features but also allowed comparisons across respondents with different views and experiences. Administrators were linked to the greatest number of promising-practices features (10), followed by instructors (6) and students (5).

Table 4. Type of Data Collected by Respondent Type

Feature                                                            State Administrator  Instructor  Student
1. Integration between rider education and licensing                        ✓
2. An adequate, dedicated funding source                                    ✓
3. Collection of rider training, licensing, and crash data                  ✓
4. Comprehensive curricula                                                                    ✓
5. Effective training and delivery                                                            ✓          ✓
6. Outreach and information efforts                                         ✓                 ✓          ✓
7. Incentives for training                                                  ✓                 ✓          ✓
8. Regular program assessments and quality control                                            ✓          ✓
9. Instructor education and training                                        ✓                 ✓          ✓
10. Graduated licensing                                                     ✓
11. Comprehensive testing                                                   ✓
12. Comprehensive procedures for obtaining and renewing a license           ✓
13. Incentives for licensing                                                ✓

Once the features were linked to the respondents, the data collection instruments were created. To capture the greatest diversity of perspectives, focus groups were used to collect information from instructors and students. Two separate moderator's guides were developed, one for instructors and one for students. The guides posed open-ended questions designed to spark discussion and to encourage participants to reflect on their experiences with their State's rider training program. The goal of the focus groups was to capture the specific steps the promising programs take to deliver training to students and how those steps produce effective rider education across the features of the promising-practices model. Each focus group lasted approximately one hour.

Data from program administrators were collected through one-on-one interviews conducted with a detailed interview protocol. Like the focus group moderator's guides, the interview protocol comprised questions linked to the corresponding features of the promising-practices model. Questions posed to administrators centered on the structure of the State program, from coordination with licensing agencies to the strategies used to meet the demand for training and to recruit new instructors. The administrator interviews lasted approximately 30 to 45 minutes.


2 Although New Mexico scored higher than Maryland, that score was based on 2001 data. At the time of the site visits (2004), New Mexico had completely reorganized its motorcycle rider education program, awarding a contract to the Motorcycle Safety Foundation for day-to-day administration. For these reasons, we did not include it in our analyses.
