
Keywords: undergraduate, student experience, beliefs

More to Say: Analyzing Open-ended Student Responses
to the Academic Pathways of People Learning Engineering Survey

The Academic Pathways of People Learning Engineering Survey (APPLES2) was administered in Spring 2008 to undergraduate engineering students at 21 American universities. Students took the 10-minute online survey, which asked mainly multiple-choice questions about their undergraduate engineering experience. A final, optional open-ended question asked, “Is there anything else you want to tell us that we didn’t already cover?” This paper explores participants’ responses to that open-ended question.

Implications of Findings
It is illuminating to read student responses and hear the voiced passions, concerns, and experiences that could not easily be captured solely in a standard multiple-choice question format. The situations and barriers that students describe in their comments are at times wrenching, but for each extremely negative or frustrated comment there is usually a paired positive one. In reviewing the student responses, it seems that for each “[my institution] sucks” there is a complementary “[my institution] rocks.” Given the open-ended nature of the question examined in this paper, such a wide range of responses is perhaps to be expected.

These open-ended responses provided a rich addition to the emerging quantitative research findings from the APPLES2 survey instrument. Issues important to students, such as advising and gender, were not probed by the multiple-choice survey questions. There also remains a considerable number of untapped student experiences outside the classroom to be understood.

These open-ended student responses add qualitative description and substantiation to the other survey data collected, offering personal and sometimes passionate accounts of the student experience, and will inform further iterations of the survey instrument.

Methods and Background
This paper analyzes student responses to the open-ended question “Is there anything else you want to tell us that we didn’t already cover?” A total of 4,266 participants from 21 sites submitted responses to the APPLES2 deployment, and 37 percent of them (1,578 participants) provided free-form responses. The remaining 2,688 participants (63 percent) left the question response box blank.

Survey responses were removed from the data set if they had no applicable content. To maintain student anonymity within individual school reports, a data-cleaning schema was developed and applied to each set of school survey records. Student names were redacted and replaced with a generic placeholder indicating the removed information, i.e., [name], with brackets and italics marking the edit. The same was done for any potentially identifying organizations, companies, or other affiliations. For further cross-school analysis of student responses (and to prepare the same data set for archiving), the aggregate data set was anonymized for school-specific information such as school names, individual names, and course numbers and names; these identifiers were given generic replacements such as [institution], [name], and [introductory computer science course].
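The redaction step described above can be sketched as a simple substitution pass. The patterns and placeholder strings below are hypothetical illustrations (the paper does not publish its actual redaction list); only the bracketed-placeholder convention, e.g. [name] and [institution], comes from the brief.

```python
import re

# Hypothetical redaction map: identifying text -> generic bracketed
# placeholder, following the paper's convention (e.g., [name]).
# The specific patterns here are invented for illustration only.
REDACTIONS = {
    r"\bStanford\b": "[institution]",
    r"\bCS ?106A\b": "[introductory computer science course]",
    r"\bJane Doe\b": "[name]",
}

def redact(comment: str) -> str:
    """Replace identifying strings in one survey comment with placeholders."""
    for pattern, placeholder in REDACTIONS.items():
        comment = re.sub(pattern, placeholder, comment)
    return comment

print(redact("Jane Doe taught CS106A at Stanford."))
# -> [name] taught [introductory computer science course] at [institution].
```

In practice, such a pass would be run once per school report and again on the aggregate data set before archiving.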

Student responses were rated by two coders for the negative or positive valence of each comment. A scale of 1 through 5 indicated how negative (criticizing), neutral, or positive (complimenting) the response was: 1 for very negative, 2 for slightly negative, 3 for neutral, 4 for slightly positive, and 5 for very positive. Comments judged to be at the extremes (1 or 5) used exclamation marks or notably damning or laudatory language, and otherwise conveyed strong displeasure or excitement about the topic described. Comments without tone or opinion were marked neutral, and comments that were only slightly negative or positive were categorized as 2 or 4. To make meaning of students’ responses, comments were read multiple times to generate and refine an emerging thematic coding scheme. These topics were grouped by whether they were comments about the School or about Individual Beliefs. Issues emerging from comments at the School level are those that could be addressed by institutions but not easily by the student; issues emerging at the Individual Beliefs level are those that may be more difficult to alter even if the institution attempted to address them.
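The 1-to-5 valence scale and the theme/topic grouping can be combined into a simple tally. The sample records below are invented for illustration; only the scale labels and the two theme names come from the brief.

```python
from collections import Counter

# Hypothetical coded comments: (theme, topic, valence score 1-5),
# using the paper's scale (1 = very negative ... 5 = very positive).
# These example records are invented, not actual APPLES2 data.
coded = [
    ("School", "advising", 1),
    ("School", "advising", 5),
    ("School", "teaching (curriculum)", 2),
    ("Individual Beliefs", "money", 3),
    ("Individual Beliefs", "calling", 3),
]

LABELS = {1: "very negative", 2: "slightly negative",
          3: "neutral", 4: "slightly positive", 5: "very positive"}

def valence_distribution(theme: str) -> Counter:
    """Count valence labels across all comments coded under one theme."""
    return Counter(LABELS[score] for t, _, score in coded if t == theme)

print(valence_distribution("School"))
print(valence_distribution("Individual Beliefs"))
```

A tally like this makes it easy to see, per theme, whether comments cluster at the extremes or at the neutral midpoint, which is the comparison reported in the findings below.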

What We Found
Student responses (scored along the 1 to 5, negative to positive, scale) were coded to a schema that included the following 12 codes: advising, co-op, gender, social, teaching (curriculum), and teaching (language) as part of the School theme; calling, challenge, future, lifestyle, money, and understanding as part of the Individual Beliefs theme (for more detail, please see the full paper at the link below).

The topics under the School theme were found mostly at the extremes of the positive/negative scale, while responses under the Individual Beliefs theme were found mostly at the neutral point of the scale. The number of comments per topic is generally similar, with the exception of the Teaching (curriculum) topic, which received 324 comments.

The School-themed topics are generally more negative than the Individual Beliefs topics. Interestingly, Co-op and Money are both exceptions in this data set. It may be that these two topics are much more concrete than the other, more abstract items, or, on reflection, that the categorization of each should be reconsidered. In other words, finding benefit in a co-op experience and worrying about the financial burden of tuition may each have been miscategorized.




Authors: Micah Lande, Sarah Parikh, Sheri Sheppard, George Toye, Helen Chen, and Krista Donaldson
Source: Proceedings of the 2009 American Society for Engineering Education Conference

The full paper, including references, is available via ASEE proceedings search.


Brief created April 2009
