(AAIR Best Presentation) Engaging the GenY Student Voice
Time Slot: Fri 06/02/17 09:00 AM, Room: University of DC, Session Code: 107834
Session Type: Affiliated Organization Best Presentation, Track: Decision-Support

Achieving a high response rate to online surveys is challenging for many institutions. Engaging Generation Y students requires a different approach because of the changing ways this generation communicates. This presentation reports on a push-pull strategy implemented at Otago Polytechnic to boost response rates to online surveys, with a focus on Generation Y students, who constitute the largest cohort of current students. Changing communication methods to suit Generation Y students, together with supporting teaching staff to engage with student feedback more effectively, has resulted in a marked increase in student participation rates.

(AIRUM Best Presentation) Data Visualization and Reporting Feedback: Inside What Users Really Think
  • by Lesley Lydell, University of Minnesota  
Time Slot: Thu 06/01/17 10:45 AM, Room: Monument, Session Code: 89152
Session Type: Affiliated Organization Best Presentation, Track: Operations

As the volume of available institutional data and reporting tools continues to increase, coupled with additional reporting demands, institutional research offices can face a daunting challenge in determining how to communicate information effectively and meet the needs of diverse stakeholders. Although the management of data collection, compliance, and time-sensitive data requests can take priority over other considerations, as IR/IE offices increase their agency and data engagement with decision makers across campus (Swing & Ross, 2016), learning more about data users’ needs and how current reporting does or does not address those needs takes on added importance. User experience analysis can provide a useful tool to both broaden and deepen IR/IE’s ability to inform decisions on campus and beyond. The field of user experience analysis has roots across college campuses in diverse fields of study, from design, rhetoric, and psychology to human factors research, architecture, and computer engineering (Morville, 2014). Yet user experience analysis has not been widely incorporated into institutional research and reporting. A growing interest in data visualization, however, has created an opportunity for greater synergy between the fields.

This presentation seeks to strengthen that connection by applying best practices from user experience research to institutional reporting in order to help IR/IE offices: 1) deepen their impact by providing information in a way that is accessible and empowering to decision makers; 2) improve their own efficiency by decreasing the need for iterative reporting and analyses; and 3) anticipate the information needs of decision makers and lessen last-minute information pressures. This session builds on the current discussion of best practices in data reporting and visualization by including valuable insights from user experience research in making those best-practice determinations.

The session will begin with a brief overview of the field of user experience analysis, with particular attention to applications in data reporting and visualization for institutional research. The presenter will then detail a case study at a large Midwestern research university of designing a web-based framework for data reports and visualizations that incorporated user feedback throughout the design, planning, and implementation process. The presenter will discuss the specific methodologies used, examples of user feedback gathered, and integrations of user-based research at various stages of the project, including illustrations of changes made based on feedback from both the “hits” and the initial “misses,” and the resulting lessons learned.

The presentation will address free or low-cost resources, both on campus and online, that participants can use to collect and incorporate user feedback in their own institutional reporting. It will also include strategies for incorporating and managing end-user feedback from diverse stakeholders on an ongoing basis, through both formal and informal methodologies. The presenter will engage participants in an active discussion of current reporting projects on their own campuses and potential strategies for incorporating and managing user feedback. The presentation will conclude with ample time for further questions and discussion.

(CAIR Best Presentation) Data Mining to Identify Grading Practices
Time Slot: Wed 05/31/17 02:30 PM, Room: Mint, Session Code: 107116
Session Type: Affiliated Organization Best Presentation, Track: Technologies

To enhance student success and to build inclusive classrooms, researchers have used technical tools and expertise to yield insight into problems many undergraduate students experience in higher education. A collaborative research effort explored how k-means cluster analysis can reveal contrasting patterns in the distribution of letter grades among large course offerings. Consistent with previous studies, our findings showed that norm-referenced grading practices exacerbate an existing achievement gap, while a criterion-referenced grading approach enhances student learning and overall success in school. This session will provide a hands-on opportunity for attendees to learn statistical techniques using SPSS, with datasets and syntax available during the session for data mining skills development.
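
The session demonstrates these techniques in SPSS; as a rough, non-authoritative illustration of the same idea, the Python sketch below clusters hypothetical grade-distribution vectors with scikit-learn. The grade shares and the choice of k are illustrative assumptions, not the presenters' data or syntax.

    # A minimal sketch: each row is one large course; each column is the
    # share of students receiving that letter grade [A, B, C, D, F].
    # The numbers are invented for illustration.
    import numpy as np
    from sklearn.cluster import KMeans

    grade_shares = np.array([
        [0.15, 0.25, 0.30, 0.18, 0.12],  # stretched, curve-like distribution
        [0.42, 0.35, 0.15, 0.05, 0.03],  # compressed, criterion-like distribution
        [0.12, 0.22, 0.33, 0.20, 0.13],
        [0.45, 0.32, 0.14, 0.06, 0.03],
    ])

    # k=2 to separate contrasting grading patterns among the courses.
    km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(grade_shares)
    print(km.labels_)           # cluster assignment for each course
    print(km.cluster_centers_)  # average grade distribution per cluster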

(CIRPA Best Presentation) Collaborating with Student Services to Improve Retention
Time Slot: Wed 05/31/17 11:15 AM, Room: Mint, Session Code: 104589
Session Type: Affiliated Organization Best Presentation, Track: Decision-Support

Nova Scotia Community College (NSCC) has a regular practice of surveying incoming students to create a first year student profile for each of our 13 campuses. In 2016, Institutional Research collaborated with Student Services to revise and strengthen the purpose and use of the Incoming Student Survey. The new instrument was split into two surveys and is administered both before and after the start of September classes. Students identified as potentially at-risk are contacted and offered early support. This cross-departmental project is a key initiative in a suite of college-wide efforts to develop targeted retention strategies. The focus of the survey is to collect data that can be used to forecast students who may be at risk of withdrawing within the first few months of enrollment. This session will provide an overview of what questions are asked, what flags a student as being “at-risk,” and how our Student Success Surveys have turned into effective retention tools.
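
The abstract does not publish NSCC's survey items or thresholds, so the sketch below is only a hypothetical illustration of rule-based flagging from survey responses; every item name, cut point, and data value is an assumption.

    # Hypothetical rule-based at-risk screen over incoming-survey responses.
    import pandas as pd

    responses = pd.DataFrame({
        "student_id": [101, 102, 103],
        "confidence_1to5": [2, 4, 5],   # invented self-rated confidence item
        "hours_working": [30, 10, 0],   # invented weekly paid-work-hours item
    })

    # Flag a student when any indicator crosses an illustrative threshold.
    responses["at_risk"] = (
        (responses["confidence_1to5"] <= 2)
        | (responses["hours_working"] >= 25)
    )
    print(responses.loc[responses["at_risk"], "student_id"])  # students to contact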

(CUNY Best Presentation) Assessment Best Practices from Diverse Institutional Perspectives
Time Slot: Wed 05/31/17 03:30 PM, Room: Mint, Session Code: 109040
Session Type: Affiliated Organization Best Presentation, Track: Assessment

Assessment best practices provide a framework for answering the question: How well are we meeting our goals? Answering this question relies on institution-wide involvement in systematic and sustained assessment. This session focuses on how several colleges within the City University of New York system engage constituents from academic and non-academic units in the assessment process. Examples are provided from the perspectives of a community college, comprehensive college, senior college, and the CUNY Graduate Center. Presenters share their assessment best practices and provide suggestions for overcoming challenges.

(FAIR Best Presentation) Early Warning System for Identifying and Monitoring Potential Drop-outs
  • by Ching-Hua Huang, Chang Jung Christian University and Chia-Liang Lin, Chang Jung Christian University  
Time Slot: Wed 05/31/17 10:15 AM, Room: Mint, Session Code: 107831
Session Type: Affiliated Organization Best Presentation, Track: Decision-Support

This paper presents an alternative approach for evaluating the performance of student learning. We account for differences between warning groups to construct a framework of the factors that influence students' learning. Specifically, we use a two-stage analysis to form an early-warning matrix based on the number of absences and credits earned.
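
One plausible reading of the two-stage process, offered only as a sketch: band each indicator, then cross-tabulate the bands into the warning matrix. The thresholds and data below are assumptions, not the authors' published cut points.

    # Hypothetical two-stage early-warning matrix.
    import pandas as pd

    students = pd.DataFrame({
        "student_id": [1, 2, 3, 4],
        "absences": [0, 3, 8, 12],
        "credits_earned": [15, 12, 6, 3],
    })

    # Stage 1: band each indicator (illustrative cut points).
    students["absence_band"] = pd.cut(students["absences"], [-1, 2, 6, 99],
                                      labels=["low", "medium", "high"])
    students["credit_band"] = pd.cut(students["credits_earned"], [-1, 6, 11, 99],
                                     labels=["low", "medium", "high"])

    # Stage 2: cross-tabulate the bands into the warning matrix; the
    # high-absence / low-credit cell signals the greatest drop-out risk.
    print(pd.crosstab(students["absence_band"], students["credit_band"]))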

(IAIR Best Presentation) Everything in Life Has a Seinfeld Reference … Even Predictive Modeling
  • by David Rudden, Elgin Community College 
Time Slot: Thu 06/01/17 10:45 AM, Room: Salon I/J, Session Code: 105999
Session Type: Affiliated Organization Best Presentation, Track: Decision-Support

There’s a saying that “everything in life has a Seinfeld reference.” The classic TV sitcom about nothing was popular because it was irreverent and offered a fun take on many of life’s most common questions and situations, to which the viewing audience could relate at some level. This session will attempt to borrow that wit and irreverence while focusing on a simple question to which all IR folks can relate: How can you effectively use predictive modeling to identify and evaluate student success initiatives at your institution? We’ll highlight the process Elgin Community College has used for answering such questions, and how past attempts at exploring this topic have helped shape the “backwards” approach we’re currently exploring. There will be time at the end of the session for attendees to share their own thoughts and experiences with using statistical modeling to facilitate institutional discussions around student success initiatives. All Seinfeld fans are welcome!

(INAIR Best Presentation) Guide to Understanding Why Your Admitted Students Rejected You
  • by Robert Wade 
Time Slot: Thu 06/01/17 10:45 AM, Room: Union Station, Session Code: 107712
Session Type: Affiliated Organization Best Presentation, Track: Technologies

Just imagine: It’s mid-September, and your freshman class has finally arrived. Whether or not enrollment goals were achieved, a substantial number of admitted students decided to attend elsewhere. To meet future recruitment benchmarks, you need to know where they went and why. Hear how Purdue University confronts this issue by enabling recruiters and marketing professionals across campus to better understand the market demand for specific academic programs. SPOILER ALERT: It’s a dashboard: a dynamic, user-friendly dashboard that uses National Student Clearinghouse data to create a more complete picture of the recruitment funnel from application to matriculation. This presentation covers everything from data prep to dashboard design to uncovering actionable insights. If you want to make the most of your Clearinghouse data and/or enjoy dad jokes, you are strongly encouraged to attend.
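
Purdue's actual pipeline is not described in the abstract; the sketch below only illustrates the kind of data prep involved, joining an admit file to National Student Clearinghouse StudentTracker results to see where non-matriculating admits enrolled. File names and column names are hypothetical.

    # Hypothetical data prep: where did our non-enrolling admits go?
    import pandas as pd

    admits = pd.read_csv("admits_fall2017.csv")  # one row per admitted student
    nsc = pd.read_csv("nsc_detail.csv")          # NSC StudentTracker return file

    # Keep admits who enrolled elsewhere, then attach their NSC enrollments.
    lost = admits[admits["enrolled_here"] == 0]
    lost = lost.merge(nsc[["student_id", "college_name", "enroll_begin"]],
                      on="student_id", how="left")

    # Rank competitor institutions within each intended academic program.
    print(lost.groupby(["intended_program", "college_name"])
              .size().sort_values(ascending=False).head(20))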

(MAIR Best Presentation) Propensity Score Matching: Moving Toward Causality in Education
  • by Kenneth Thompson 
Time Slot: Thu 06/01/17 03:00 PM, Room: Mint, Session Code: 108600
Session Type: Affiliated Organization Best Presentation, Track: Analysis

As a statistical method for addressing selection bias, propensity score matching is gaining popularity within the education arena. Although experimental design remains the gold standard for causal inference, experimental design is not always possible in education. Instead, education researchers often rely on quasi-experimental designs lacking the randomization required for causal inference. Propensity score matching provides a statistical technique for addressing systematic differences in groups by matching participants on their likelihood of group assignment. Grounded in the Rubin counterfactual framework, this session will provide an introduction to and overview of propensity score matching.
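
Since the session is an introduction to the method rather than a specific dataset, a minimal generic sketch may help ground it: model the probability of treatment from observed covariates, then match treated cases to controls on that score. The covariates and simulated data below are hypothetical.

    # Propensity score matching in two steps, on simulated data.
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LogisticRegression
    from sklearn.neighbors import NearestNeighbors

    rng = np.random.default_rng(0)
    n = 500
    df = pd.DataFrame({
        "hs_gpa": rng.normal(3.0, 0.4, n),  # invented covariates
        "pell": rng.integers(0, 2, n),
    })
    # Simulated self-selection: stronger students opt into treatment more often.
    df["treated"] = (rng.random(n) < 1 / (1 + np.exp(-2 * (df["hs_gpa"] - 3.0)))).astype(int)

    # Step 1: estimate each student's probability of treatment (the propensity score).
    X = df[["hs_gpa", "pell"]]
    df["pscore"] = LogisticRegression().fit(X, df["treated"]).predict_proba(X)[:, 1]

    # Step 2: match each treated case to its nearest control on the score.
    treated = df[df["treated"] == 1]
    control = df[df["treated"] == 0]
    nn = NearestNeighbors(n_neighbors=1).fit(control[["pscore"]])
    _, idx = nn.kneighbors(treated[["pscore"]])
    matched_controls = control.iloc[idx.ravel()]
    # Outcome comparisons would now use `treated` vs. `matched_controls`.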

(MdAIR Best Presentation) Making the Shift from Assessment to Student Learning Improvement
  • by Courtney Sanders and Nicholas Curtis 
Time Slot: Thu 06/01/17 03:00 PM, Room: Salon I/J, Session Code: 107900
Session Type: Affiliated Organization Best Presentation, Track: Assessment

Though assessment practice is increasingly prevalent in higher education, clear expectations for what constitutes quality assessment are less common. Less common still is the use of assessment results to make pedagogical or curricular changes to enhance student learning. Through measurement of quality assessment practices on our campus, we aim to show that using specific resources and understanding the critical role of re-assessment can augment faculty development in the area of assessment and lay the foundation for student learning improvement. Participants will reflect on the assessment culture at their own institutions and identify developmental resources appropriate for faculty and staff on their campuses.
