
Keywords: survey, variables

From PIE to APPLES:
The Evolution of a Survey Instrument to Explore Engineering Student Pathways

The Academic Pathways Study (APS) of the NSF-funded Center for the Advancement of Engineering Education (CAEE) is a cross-university study that systematically examines how engineering students navigate their education, and how engineering skills and identity develop during their undergraduate careers.

The APS has utilized a variety of methods, including surveys, structured interviews, ethnographic methods, engineering design tasks, and academic transcripts, to gain a broader and richer picture of students’ undergraduate engineering experiences. A key component of this portfolio of methods has been the survey, and through the collective work of the APS, two instruments have emerged: the Persistence In Engineering (PIE) survey and the Academic Pathways of People Learning Engineering Survey (APPLES).

Building upon the PIE survey and the findings from the other APS methods, APPLES was designed to look at academic and professional persistence in two different cross-sectional populations of American undergraduate engineering students. The main objective of APPLES was to corroborate earlier findings from the APS on a larger scale and in particular, to explore the generalizability of findings from the PIE survey to engineering students at a greater number of American higher education institutions.

Background of Survey Development
The evolution from the PIE survey to APPLES was guided by the need for comparability of findings while ensuring the integrity of each individual survey. Decisions about which PIE variables to include or exclude in the APPLES instruments were made according to methodological, institutional, and practical constraints.

The PIE survey was designed to identify and characterize the fundamental factors that influence students’ intentions to pursue an engineering degree over the course of their undergraduate career and, upon graduation, to practice engineering as a profession. First administered in winter 2003 of the students' freshman year, the PIE survey was deployed seven times to approximately 160 students at four institutions (the Longitudinal Cohort), with the final deployment in spring 2007 of their senior year.

The first administration of APPLES (APPLES1) was conducted in April 2007 and focused on the broader population of students (the Broader Core Sample) at the same four core institutions where the PIE survey had been administered. These students had not previously taken the PIE survey and represented a second sample from these institutions, comparable to the Longitudinal Cohort.

The second administration of APPLES (APPLES2) was conducted from January to March 2008 with a carefully selected, stratified sample of students at 21 universities in the US (the Broader National Sample). Sampling was done at the institution level, with institutions stratified according to their characteristics.
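The brief does not describe the sampling mechanics in detail, so the following is only an illustrative sketch of a stratified institutional draw; the strata variables (institution type and size band) and the institution list are hypothetical, not the actual APPLES criteria.

```python
import random
from collections import defaultdict

# Hypothetical institution records: (name, institution type, size band).
# These strata variables are illustrative, not the actual APPLES criteria.
institutions = [
    ("University A", "research", "large"),
    ("University B", "research", "small"),
    ("College C", "masters", "medium"),
    ("College D", "masters", "medium"),
    ("Institute E", "special-focus", "small"),
]

def stratified_sample(records, n_per_stratum, seed=0):
    """Group institutions by their characteristics, then draw up to
    n_per_stratum institutions at random from each stratum."""
    rng = random.Random(seed)
    strata = defaultdict(list)
    for name, inst_type, size in records:
        strata[(inst_type, size)].append(name)
    selected = []
    for members in strata.values():
        selected.extend(rng.sample(members, min(n_per_stratum, len(members))))
    return selected

print(stratified_sample(institutions, n_per_stratum=1))
```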

Constraints and Process of Survey Development
In an ideal world, the team would simply have administered the PIE survey instrument to both the Broader Core Sample and the Broader National Sample. However, cost was one constraint: students in the Longitudinal Cohort were paid $175 annually to fill out the PIE surveys and participate in other data collection activities, while APPLES respondents were compensated $4 to complete the survey. In addition, the PIE survey's language and length also had to be modified for a cross-sectional population.

The obvious contrast in both time commitment and monetary incentive spurred the redesign and streamlining of the PIE instruments (which took students 20 to 40 minutes to complete) into a leaner survey that could be completed in approximately 10 minutes. The APPLES1 instrument was created using the PIE survey as a foundation. In reducing the length of the survey, a major design consideration was that APPLES still meet the research objective of testing the generalizability of preliminary results from the PIE survey. Other factors considered in the development of the APPLES instrument included scale reliability and validity, evidence of viability, and the target audience's interest in the findings.

At their core, the PIE and APPLES instruments share a common set of variables representing the key concepts that researchers have suggested influence undergraduates’ persistence in the engineering major. (To view the table organizing the core variables according to the relevant APS research question categories, see the full paper referenced at the end of this brief.)

From PIE to APPLES
Three factors guided the modifications and streamlining from the PIE survey to APPLES. The first involved checking the language of the questions, given that the targeted audience for the survey had broadened to include freshmen, sophomores, juniors, and seniors (traditional and those taking 5+ years), as well as transfer and part-time students. The second set of changes focused on identifying which variables should be kept or eliminated. Third, emerging findings from the other APS data collection methods were considered in order to identify additional aspects of the undergraduate engineering experience that might influence persistence.

From APPLES1 to APPLES2
One of the first steps in refining the APPLES1 instrument for deployment to the national sample of 21 American institutions was to ensure that the demographic questions were appropriate and detailed enough to capture the diversity of institutions and student respondents. Changes made during the transition from APPLES1 to APPLES2 were minimal and focused on bolstering the internal reliability of several items. New items addressing psychological and behavioral perspectives on intrinsic motivation were added after reviewing student responses to an open-ended question in APPLES1 about their experiences in engineering education.

One of the incentives for institutions to participate in the APPLES research study was the offer of an individualized campus report summarizing their students’ responses. As a result, the selection of questions for the APPLES instrument was also influenced by the interests of the anticipated audience for APPLES findings—deans, department chairs and faculty involved in ABET accreditation efforts, etc.—and by how these research findings might be interpreted and put into practice.

Internal and external piloting of the survey questions proved essential both for checking the overall time to complete the survey online and for informing the choice of items to include. In addition, features could be programmed into the survey tool to capture an estimated completion time for each potential survey item, which helped guide decisions about which items to retain.
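As a rough illustration of how such per-item timing estimates might guide trimming decisions, consider the sketch below; the item names, timings, and 10-minute target are hypothetical, and the brief does not describe the actual survey tool's features.

```python
# Hypothetical per-item completion-time estimates (seconds),
# of the kind a survey tool might capture during piloting.
item_times = {
    "motivation_items": 180,
    "confidence_items": 150,
    "demographics": 90,
    "open_ended_experience": 200,
    "extracurricular_items": 120,
}

TARGET_SECONDS = 10 * 60  # aim for roughly a 10-minute survey

total = sum(item_times.values())
print(f"Estimated completion time: {total / 60:.1f} minutes")

# If the estimate exceeds the target, flag the longest items
# as candidates for shortening or removal.
if total > TARGET_SECONDS:
    for item, secs in sorted(item_times.items(), key=lambda kv: kv[1], reverse=True):
        print(f"Review candidate: {item} ({secs} s)")
```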

Although these analyses are ongoing, the integration of findings from both quantitative and qualitative APS methods represents a valuable contribution and a useful model for future research.



Authors: Helen Chen, Krista Donaldson, Özgür Eriş, Debbie Chachra, Gary Lichtenstein, Sheri Sheppard, and George Toye
Source: Proceedings of the 2008 American Society for Engineering Education Conference

The full paper, including references, is available via ASEE proceedings search.


Brief created June 2008
