
Keywords: cross-sectional study, engineering persistence, recruitment, survey methodology

Scaling Up: Taking the Academic Pathways of People Learning Engineering Survey (APPLES) National

The Academic Pathways of People Learning Engineering Survey (APPLES) was deployed for a second time in spring 2008 to undergraduate engineering students at 21 US universities. The goal of the second deployment of APPLES was to corroborate and extend findings from the Academic Pathways Study (APS; 2003-2007) and the first deployment of APPLES (spring 2007) on factors that correlate with persistence in engineering on a national scale. The APPLES2 set of deployments, which surveyed over 4,200 students, was among the largest and broadest cross-sectional surveys focusing on undergraduate education ever undertaken.

Lessons Learned from the APPLES2 Deployment
Development and deployment of the second APPLE Survey added to the many lessons learned from the deployment of the PIE survey and the first APPLE Survey in 2007:

Using individual URLs for each school provided many benefits. The team had used institution-specific URLs for the APPLES1 deployment to avoid the survey appearing spam-like to students. The individualized URLs more thoroughly protected the identity of APPLES institutions from those outside each participating institution. From an IT standpoint, it was easier to track and correct site-specific technical problems, the scalable architecture allowed additional sites to be added easily, and the overall structure was more robust in supporting server load. Finally, because the instances of large-scale fraud started at individual schools, the fraudulent data and incentive claims were easily isolated and did not affect the larger deployment or data sets.

More was learned about recruiting non-persisters, transfer students, and part-time students. The team had difficulty recruiting these groups of students with the APPLES1 survey, but hypothesized then that their low response rates were due to the institutional characteristics of the four core APS institutions. However, the team found these same groups difficult to recruit even with the larger sample and greater diversity of the APPLES2 institutions. Non-persisters were most successfully recruited at institutions that sent a recruitment email to technical non-engineering departments, though some institutions had no non-engineering majors and others had internal constraints on recruiting outside engineering. Transfer students were most easily recruited at large public institutions and at those that enrolled 3+2 engineering students. Part-time students were difficult to recruit at all institutions, even those with large part-time enrollments.

An open-ended question—“Is there anything else you want to tell us that we didn’t already cover?”—at the end of the survey produced rich data. Students voiced passions, concerns, and experiences in answering this question that could not be easily captured in standard survey format.

Staggering deployments was logistically beneficial. The four APPLES2 deployments were staggered every other week starting in late January 2008, with four institutions participating the first week followed by eight, seven, and two institutions in the subsequent deployments, respectively. This arrangement ensured that the team was able to meet participating campus coordinator needs, allowed for last-minute survey extensions, and provided the flexibility to address anomalies with incentive claims.

Design and Logistical Decisions
Due to the Academic Pathways Study focus on persistence in engineering education, three groups of undergraduate students were recruited to take APPLES: (1) engineering students—those who had declared an engineering major or had already committed to engineering programs; (2) pre-engineering students—those who intended to declare an engineering major; and (3) non-persister students—those who were initially interested in majoring in engineering but decided to pursue a non-engineering major.

The APPLES survey was administered online and took approximately 10 minutes to complete. Respondents were offered an incentive of $4 paid to them through an online financial transaction company.

Sampling plan. The study population was undergraduate engineering students at US institutions. However, with no readily available list of US undergraduate engineering students to sample randomly, the team sampled by institution.

The team limited their scope to four-year institutions with at least one ABET-accredited undergraduate engineering program in 2004. To ensure a balanced national sample of engineering students and institutions, the team stratified the selection using several institutional characteristics, ranked in order of importance.

The team estimated that 21-25 institutions would be needed with at least 1,080 total participants to attain statistical significance. For a detailed explanation of the stratification and sampling plan, please refer to the full text article at the link below.
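The brief does not reproduce the stratification characteristics (see the full text article), but the mechanics of sampling by institution within strata can be sketched as follows. The institution records and the `public`/`size` characteristics here are purely illustrative placeholders, not the study's actual strata:

```python
import random

# Hypothetical institution records; the study's real stratification
# characteristics are given in the full text article.
institutions = [
    {"name": "School A", "public": True,  "size": "large"},
    {"name": "School B", "public": True,  "size": "small"},
    {"name": "School C", "public": False, "size": "large"},
    {"name": "School D", "public": False, "size": "small"},
    {"name": "School E", "public": True,  "size": "large"},
    {"name": "School F", "public": False, "size": "large"},
]

def stratified_sample(records, strata_keys, per_stratum, seed=0):
    """Group records into strata by the given keys, then sample
    up to `per_stratum` records from each stratum."""
    rng = random.Random(seed)
    strata = {}
    for rec in records:
        key = tuple(rec[k] for k in strata_keys)
        strata.setdefault(key, []).append(rec)
    sample = []
    for members in strata.values():
        sample.extend(rng.sample(members, min(per_stratum, len(members))))
    return sample

picked = stratified_sample(institutions, ["public", "size"], per_stratum=1)
print([r["name"] for r in picked])
```

Sampling one institution per stratum guarantees that every combination of characteristics is represented, which is the point of stratifying rather than drawing a simple random sample.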

Selection of Institutions. Institutions were invited to participate in APPLES based on “strategic sampling.” US institutions with at least one ABET-accredited engineering major were ranked using their primary and secondary characteristics. An institution was then selected based on these characteristics and whether a member of the research team had personal contacts there that could facilitate participation.

The team invited 25 institutions to participate in APPLES. Institutional recruitment was initiated in mid-2007 with invitation letters sent out to each institution’s dean and a special session held at the American Society for Engineering Education Conference. As an incentive to participate in the study, the APPLES2 institutions would each receive a complimentary report highlighting their institutional data relative to the rest of the APPLES2 cohort. Twenty-one institutions accepted the invitation to participate. Refer to the full text article for more details; the names of the participating schools are not being made public.

Design for Deployment. Each campus was asked to appoint a local coordinator to assist the team in understanding the local institutional culture, provide updated institutional data, and plan and implement campus-specific recruitment.

APPLES2 deployments were planned to last five days (Monday through Friday) and a total of four deployments were offered to the participating institutions.

The survey and its implementation were designed in such a manner as to secure an umbrella Institutional Review Board (IRB) approval for all participating students and institutions through Stanford University. However, two APPLES2 institutions with APS researchers on their staff were required to obtain IRB approval in addition to the umbrella approval through Stanford. An additional four institutions voluntarily obtained local IRB approval with support from the research team.

To ensure a diversity of students from each of the participating APPLES institutions, including over-sampling of specific student groups, the team defined student strata groups at each institution based on the research goals.

The APPLES2 survey underwent one round of piloting with 52 undergraduate engineering students at three institutions neither participating in APPLES nor affiliated with APS. Based on the piloting results, the team added new motivation variables and additional items, and streamlined the survey as a whole.

The main participant recruitment method was an email sent from a senior administrator to undergraduate engineering students at the institution. Recruitment also included posters, and in some cases, directed advertisements on a popular social networking Web site. Each APPLES institution had its own institution-specific survey URL, and two dedicated computer servers were used to achieve high confidence in operating reliability.

What We Found
The total response for the survey was 4,266 subjects from the 21 institutions after data cleaning. The average survey response rate relative to the undergraduate engineering populations at the participating institutions was 14 percent. Eighty-five percent of the APPLES participants claimed the $4 incentive, although only 76 percent of the claimants followed through to collect it.
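Assuming the reported percentages are rounded from raw counts, the incentive figures work out to roughly the following back-of-envelope numbers (approximations, not figures from the paper):

```python
respondents = 4266     # total subjects after data cleaning
claim_rate = 0.85      # share of respondents who claimed the $4 incentive
collect_rate = 0.76    # share of claimants who followed through to collect

claimants = round(respondents * claim_rate)
collected = round(claimants * collect_rate)
payout_usd = collected * 4

print(claimants, collected, payout_usd)  # prints: 3626 2756 11024
```

So roughly 3,600 students claimed the incentive, of whom about 2,800 actually collected it, for an approximate total payout on the order of $11,000.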

Eight out of the 21 institutions met all their strata targets. The most commonly missed targets were non-persisters and ethnic minority students. There were two cases of attempted large-scale fraud. Fraud was defined as a large number of ineligible submissions during one institution’s survey deployment. Using a combination of IP tracking and timing data, the team was able to identify these submissions for removal from the data set.
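The brief does not publish the team's screening procedure beyond naming IP tracking and timing data. As an illustrative sketch only, one common approach is to flag IP addresses that produce a burst of submissions within a short rolling time window; the log entries and thresholds below are hypothetical:

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical submission log: (ip_address, submission_time).
submissions = [
    ("10.0.0.1", datetime(2008, 2, 4, 10, 0, 0)),
    ("10.0.0.1", datetime(2008, 2, 4, 10, 0, 40)),
    ("10.0.0.1", datetime(2008, 2, 4, 10, 1, 15)),
    ("10.0.0.2", datetime(2008, 2, 4, 11, 30, 0)),
]

def flag_suspicious(log, max_per_window=2, window=timedelta(minutes=5)):
    """Return IPs with more than `max_per_window` submissions
    inside any rolling `window` of time."""
    by_ip = defaultdict(list)
    for ip, ts in log:
        by_ip[ip].append(ts)
    flagged = set()
    for ip, times in by_ip.items():
        times.sort()
        for i in range(len(times)):
            j = i
            while j < len(times) and times[j] - times[i] <= window:
                j += 1
            if j - i > max_per_window:
                flagged.add(ip)
                break
    return flagged

print(flag_suspicious(submissions))  # prints: {'10.0.0.1'}
```

Flagged submissions would then be reviewed and removed from the data set rather than discarded automatically, since a shared campus computer lab can legitimately produce many responses from one IP.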



Authors: Krista M. Donaldson, Helen L. Chen, George Toye, Mia Clark, and Sheri D. Sheppard
Source: Proceedings of the 38th ASEE/IEEE Frontiers in Education Conference 2008

Brief created December 2008

