Utilizing Student Satisfaction Data: Three Campuses Share Their Stories
RNL hosted a live webinar in September to hear directly from three of our campus partners on why they regularly assess student satisfaction on their campuses (this session is also available to listen to on-demand). We know from the research that student satisfaction has been linked to higher retention and higher graduation rates, but it is helpful to understand from client institutions not only why they chose to assess student satisfaction, but specifically why they chose to work with RNL, and how these data have informed and guided decision making on their campuses.
Widener University, PA
Stephen Thorpe, recently retired Director of Institutional Research and Effectiveness
Widener University is a four-year private institution with the main campus located in Chester, Pennsylvania. Their fall 2023 enrollment was 5,600. They administer the Student Satisfaction Inventory in the spring of even years, a pattern they have had since 2008. Hear from Steve in his own words:
Two of our strategic priorities are student success and creating a sense of belonging for our students. This means that student satisfaction data is critical. How can you talk about student success without measuring student satisfaction? How can you talk about student belonging without knowing how your students feel about belonging on your campus? We measure these factors every two years and benefit from the external regional and national comparisons with four-year private schools that the SSI offers.
Because the SSI is consistent over time, we can longitudinally compare students’ perceptions regarding everything from academic advising to instructional effectiveness, campus services, and safety and security. We opt to survey every other year to give ourselves space between administrations to implement strategies, which invariably helps us see positive change.
To build response rates, we work with our student affairs team, who can reach students and catch their attention in ways that the Institutional Research office cannot. They send out a “warm up” email focused on WHY students should participate. Their message seeks to demonstrate that we use student feedback to improve their own experiences as students. People respond when they know they’re being heard.
We also involve our student government in generating action items after the report is delivered. They review every dimension of the results and help to develop solutions for our identified challenges. This demonstrates that participation actually improves the student experience. Incentives for participation are also highly effective, because every student can use an unexpected treat once in a while.
We organize and segment the data so that each dean gets only the results for their students along with the external comparison data. Our Student Affairs department is also a heavy user of the survey findings, with many of their plans being driven by measures from the SSI. Our administration and finance leaders provided their teams with relevant sections on how they could improve services. The student comment section also provides a valuable qualitative dimension.
The demographic data slices that I find most helpful are class level, residency, and areas of DEI. I look for nuggets where a group of students feel differently about something than other groups of students. I am especially looking for groups of students who feel more or less of a sense of belonging on campus.
Massasoit Community College, MA
Mary Goodhue Lynch, Associate Dean, Institutional Research
Located in southeastern Massachusetts, this two-year institution serves 9,000-plus credit and non-credit students annually. They have surveyed their students once every four years since 2015 with the Student Satisfaction Inventory. Mary shares her experience here:
It’s important to keep our students engaged and satisfied so they keep coming back. Whether they’re taking one or two courses or pursuing a degree, we want to make sure they succeed and that they feel like they belong.
Every time we do the RNL SSI, the faculty and staff say “Oh I like that one” because it asks students how important things are along with how satisfied they are.
We want to know our challenges but also our strengths. The SSI data shows us exactly where we excel (the areas we can celebrate) and where we fall short of meeting the students’ expectations. The gap score (importance minus satisfaction) is something that our folks value, and we’re able to wrap our heads around it a little bit better than some of the other data points we collect.
Like many institutions, we struggle with response rates. It is therefore imperative to ensure that students know that the results are being actively used to address their areas of concern. The email invitation comes from the Dean of Students (as opposed to the Office of Institutional Research), and our faculty and advisors also share the survey link with their students. This support in reaching students is crucial.
We present the results to our enrollment management intake committee. Members then build the findings into their discussion topics to ensure they are digested. In addition, I had ambassadors from academic affairs and student services look at the results and share them with their different departments to consider how best to move forward.
We use the data for our strategic plan and accreditation self-study, and the fact that the findings map to the regional accreditation criteria has been very helpful. The Executive Summary view gives us a wide lens on our overall results, but the ability to drill down into demographic subgroups provides even more help in ensuring equitable experiences. Our accreditors want us to focus on the student voice, and the SSI allows for that. What we value the most about RNL’s SSI is the ability to disaggregate the data by different populations, given our diverse enrollment. Our faculty appreciate that the survey is “sector-specific” (for community colleges in this case) because our students are different from those at a university, and the ability to compare nationally and regionally with other community colleges gives us an excellent perspective on where we are as an institution.
Colorado Technical University, CO
Ada Uche, Director of Assessment and Institutional Effectiveness
With more than 26,000 students on campuses in Colorado Springs and Denver and fully online, CTU administers the Adult Student Priorities Survey and the Priorities Survey for Online Learners in the fall of odd years. Ada shares her experience with RNL in her own words:
RNL is well-known for assessing the student perspective. As an independent organization, they can also layer in peer-to-peer (institutional) comparisons and benchmarking, which is immensely helpful to us, our Board of Directors, and our accrediting organizations. The findings also show us our longitudinal progress by comparing current results with previous administrations.
Survey items help us see the student perspective on a broad set of issues: instructional services, student services, and various academic services. The combination of assessing both the importance of a factor and satisfaction with that factor, which is not available on other survey instruments, gives us a unique lens to see what students value and where we are performing satisfactorily or not.
We also really like being able to customize the survey by adding ten unique items, which allows us to capture feedback about specific services where we have recently put some funding. It gives us the opportunity to survey students about what we’re working on, which is super helpful.
We have increased our response rates (by more than 20 percent) by making the survey available on our CTU mobile app, in addition to the email invitation. Students tell us it is easier to complete the survey on the app with their mobile phones than when we had it available only by email. We are also continuing to be intentional about communicating the results to our students so they know what we have done with their feedback.
Upon delivery of the report, we discuss the findings and implications in our Institutional Effectiveness and Assessment Committee. I also present the data at the Academic Leadership Committee where decisions are made. Presenting the data and findings in these ways and focusing discussions on what we are going to do to improve the student experience has sparked a lot of good discussions. We make sure that the data goes all the way to the top. Our president really looks at the data and we can be confident that actions will be taken.
The Executive Summary is a great snapshot of the results, and the year-over-year comparisons are incredibly helpful for seeing how results are similar to or different from the last administration. The side-by-side demographic comparisons are also valuable for age, gender, and race/ethnicity; you can identify the disparities and target your resources. But the richest information may come from the qualitative student comments that are included in the report.
Conclusion
We are grateful to Steve, Mary, and Ada for sharing their stories with us. To hear the full session, you can listen to the recording on-demand. Please contact me if you would like to learn more about administering an RNL Student Satisfaction-Priorities Survey on your campus, or if you have a success story to share about your work with the data.