
Understanding the Impact of Data-Driven Tools on Advising Practice and Student Support

Sarah Blanchard Kyte*, Celeste Atkins, Elizabeth Collins, & Regina Deil-Amen

The University of Arizona

Abstract

Universities are increasingly turning toward data-driven technologies like data dashboards to support advisors’ work in student success, yet little empirical work has explored whether these tools help or hinder best practices in advising, which is in many ways a relationship-based enterprise. This mixed-methods study analyzed whether and why the release of a student success dashboard impacted proactive and/or developmental advising at a large public university. After quantitative analyses demonstrated no measurable changes in advising practice following the release of the dashboard, qualitative evidence from interviews with advisors was used to interpret and explain the disconnect between the tool and the advising community. This study directs scholarly attention, from a practitioner perspective, to the largely structural challenges of integrating retention software into advisors’ work in supporting their students’ success, with implications for the successful implementation of data-driven student success tools more broadly.

Keywords: student success, technology, dashboard, risk-management, academic advising

* Contact: skyte@arizona.edu

© 2023 Kyte et al. This open access article is distributed under a Creative Commons Attribution 4.0 License (https://creativecommons.org/licenses/by/4.0/)

Understanding the Impact of Data-Driven Tools on Advising Practice and Student Support

Academic advisors are often characterized as at the front lines of student success efforts, tasked with helping students navigate college (Drake, 2011; Troxel et al., 2021). Though this is a fundamentally interpersonal enterprise where relationships between advisors and students are crucial, advisors often have large caseloads and finite resources for serving student populations that are increasingly diverse in their needs (Hughey, 2011; Thomas & McFarlane, 2018; Vianden, 2016). In response, many universities are making substantial investments of funds and staff time in technologies meant to help advisors better serve and retain their students (Krumm et al., 2014; Parnell, 2020; Phillips, 2013). These include tools that allow advisors to more easily identify students’ needs and streamline the most prescriptive aspects of advising, thus “freeing up time for advisors to provide more sustained, holistic support” and leading to greater student success (Kalamkarian et al., 2018, p. 6; see also Mattei et al., 2014).1 Specifically, data-driven technologies, like dashboards, that present factors thought to place students at risk of stop-out have the potential to facilitate more proactive and developmental advising. Proactive advising involves deliberate and early intervention to address students’ needs (Varney, 2012), whereas developmental advising takes a holistic approach and focuses on students’ academic, extracurricular, and personal development (Gordon et al., 2008; Grites, 2013). Yet, little is known about whether data-driven technologies related to student risk impact the practices of academic advisors in situ or how, as an intended user-base, they view these tools.

To this end, this study examines through a mixed-methods lens whether and how the launch of one such student success dashboard impacted advising practice and in turn, the support that students receive at a large public university. The advising data dashboard is a locally developed, interactive webpage that presents information about students’ academic performance, behavior, and background (e.g., residency, high school achievement) thought to impact student success. Advisors are then able to review the number of risk factors identified for each student in their caseload as well as a profile of their caseload as a whole. By comparing advisors’ self-reported advising behaviors before and after the launch of the tool, this study quantitatively identifies whether there was an observable shift in proactive and/or developmental approaches to advising following its release. Next, this study draws on interviews with advisors to qualitatively understand these potential changes and how advisors view the impact of data tools in the context of their work more generally. This blending of quantitative and qualitative inquiry offers a deeper understanding of potential changes in practice following the introduction of a data-driven advising tool and the meaning of those changes for advisors and the students they serve.

This study is important because increasing investment in technology and tools, like the one at the center of this study, has been redefining the work of student success practitioners, like advisors, and the influence of these tools is only expected to grow (Calhoun-Brown, 2023; Parnell, 2020). However, existing research has yet to adequately address the impact of retention-focused tools on how practitioners support their students. An optimistic reading positions these tools as potentially facilitating both proactive and developmental advising among busy advisors, allowing them to better serve individual students (Felten & Lambert, 2020; Parnell, 2020; Thomas & McFarlane, 2018). On the other hand, it is also possible that something like a new dashboard may have a minimal impact on practice within the larger context of advisors’ competing responsibilities or perhaps even undermine the relational aspect of advisors’ work with students (Coffin, 2018; Cuseo, 2003a). Therefore, this study examines the impact of a student success dashboard on the proactive and developmental aspects of advisors’ work with students and draws out their perspective on integrating these tools with practice.

Literature Review

The Rise of Student Success Technology and the Role of Advising

As universities increasingly turn toward data-driven solutions to support student success, examples abound of institutions making major gains in persistence and graduation following the adoption of new technologies (Calhoun-Brown, 2023; Campbell et al., 2007; Gardner, 2019). One standout example is that of Georgia State University, which received national attention by nearly doubling its graduation rate over a decade by using a slew of strategies informed by predictive analytics including providing advisors with early alerts about students at risk of stopping out (McMurtrie, 2018; PBS NewsHour, 2016). A 2015 study done in collaboration between the National Academic Advising Association (NACADA), The Bill and Melinda Gates Foundation, the National Association for Student Personnel Administrators (NASPA), and others found that 44% of surveyed colleges and universities had made recent increases in spending on advising technologies (Tyton Partners, 2016). To illustrate, between 2013 and 2015, the percentage of universities reporting using software to identify students at risk rose from 50% to 84% (Pasquini & Steele, 2016; Tyton Partners, 2016). As of a 2021 update, large advising caseloads continue to be cited by colleges as one of the top three barriers to improving advising with more than 80% of institutions adopting technological solutions to address caseload management (Shaw et al., 2021, fig. 7). At their core, these early-alert technologies, like the dashboard at the center of this study, flag students as at risk when they meet certain criteria based on their institutional data profile. These solutions are often promoted within higher education as placing useful information at advisors’ fingertips so that they can spend more time on the student-facing activities that advisors are uniquely able to do (Kalamkarian et al., 2018; Mattei et al., 2014; Phillips, 2013; Steele, 2018). Touting the potential gains of these tools, Civitas Learning, a student success software vendor, concludes a recent report on effective student success strategies by quoting a partner, saying “When those [practitioners] have the data and tools they need to do their jobs well, ‘the impact is palpable.’” (Civitas Learning, 2019, p. 10).

Despite this potential, we know very little about the adoption of these new technologies from an advising perspective. This is particularly problematic given advisors’ role as front-line professionals tasked with supporting student success (Wallace & Wallace, 2016). A study from Pasquini and Steele (2016) takes the first steps of considering the landscape of technology usage in advising by reporting results from a broad survey of NACADA members, arguing “it is also critical to look at the design and delivery of our advising models, to best understand how technology impacts our user experiences and the barriers to innovating our institutional functions for student support” (pp. 12–13). The authors report that new tools are typically brought to advising by campus leadership, suggesting that there could be much to learn from advisors about how new technologies impact practice and the factors impacting adoption.

Consistent with the idea of an external impetus for introducing new technologies, much of the literature on the emergence or impact of these technologies tends to pay minimal attention to advisors as key stakeholders (Goodman & Cole, 2017; Kalamkarian et al., 2018). A smaller body of empirical literature explores the development or deployment of new advising tools from an analytics and innovation perspective (Faulconer et al., 2013; Krumm et al., 2014; Mattei et al., 2014). Work that does engage with academic advising trends toward very general reports of usage and perceptions (Klempin et al., 2018; Pasquini & Steele, 2016), or is more conceptual in nature, making the case for how advising might be reimagined to incorporate new technologies for student success (Joslin, 2018; Steele, 2016). Thus, these broader perspectives tend to reinforce the view that risk-management technologies, like data dashboards, hold significant promise for advisors’ role in supporting student success.

Two studies begin to unpack advisor perspectives of these tools in the context of implementation. In 2018, Klempin and colleagues explored the general perceptions of college personnel around risk-management software for advising using interviews at nine institutions at various stages of implementation. In contrast to the generally positive way these technologies are portrayed in the media and in most higher education discourse, when the authors disaggregated practitioner perceptions by role (i.e., administrators, development team, advisors), they found that academic advisors were the most removed from decision-making around the tools and the most negative about them (Klempin et al., 2018). Moreover, a 2018 dissertation considered whether advisors and administrators retrospectively perceived changes in broader advising philosophies following the implementation of risk-management software at two institutions (Coffin, 2018). Drawing on interviews with academic advisors and administrators, the study reported no meaningful changes in advising approaches (Coffin, 2018). However, both of these studies rely on cross-sectional designs, with interviews generally focused on implementation at the institutional level rather than advising practice. Therefore, we were unable to identify any prior empirical studies examining potential change over time, using behavioral measures, in the work everyday advisors do to support students.

Centering Advising Within Data-Driven Advising and Student Risk

An advising perspective is sorely needed in this area. As academic advising has evolved as a profession, advisors have typically been tasked with large caseloads, limited time, and increasing responsibility for student success as front-line professionals (Aiken-Wisniewski et al., 2015; Kuh, 2008; Ohrablo, 2018). While new tools could potentially help advisors streamline time-consuming work in identifying students who need support, it is also possible that new technologies could face barriers to adoption given the competing pressures advisors face and their more general orientations towards their work (Thomas & McFarlane, 2018). For example, building strong relationships with students and recognizing students as unique individuals are central to the core values of advising (Drake, 2011; Hughey, 2011; Vianden, 2016). Risk-management software, by contrast, aims to sift through institutional data in order to flag students as “at risk” of stopping out when they share similar characteristics with students who have struggled in the past (Attewell et al., 2022). Though some advisors may appreciate these insights, others may find it problematic when their students are presented as at risk in this way (He et al., 2020). Given advisors’ positionality as “street-level bureaucrats,” they may exercise discretion in adopting tools that conflict with their beliefs (Howard, 2017; Karp & Fletcher, 2014; Lipsky, 2010). Lastly, advising has well-established best practices which may be impacted, positively or negatively, by new tools. Thus, we see an opportunity to explore whether advisors do indeed spend their time differently following the rollout of a data-driven advising tool.

Two of the advising approaches that could potentially be most impacted by new software for identifying students at risk are proactive and developmental advising. Proactive and developmental approaches to advising are not mutually exclusive, yet each is potentially facilitated by data-driven tools in distinct ways. In short, proactive advising asks advisors to pre-emptively intervene on behalf of students by anticipating their challenges or needs and working to educate them on all options before a situation develops (Varney, 2012). Tools designed to offer advisors multi-faceted information about their students may facilitate proactive advising by highlighting indicators that a student may be struggling and making it easier for advisors to intervene on behalf of their students (Faulconer et al., 2013). Developmental advising requires a holistic approach attentive to the educational, extracurricular, and personal dimensions of students’ lives (Gordon et al., 2008; Grites, 2013). Technology and tools that offer greater efficiency in handling the most prescriptive aspects of advising would free up time to address a wider range of students’ individual needs. Thus, offering advisors actionable information about student challenges may on the one hand enhance proactive advising, and on the other hand, allow more time during advising sessions to focus on developmental dimensions of the student experience in higher education. Yet, no studies to date have closely examined the impact of these increasingly popular tools on these approaches to advising practice and student support.

Research Questions

The research questions addressed through this study are:

  1. Do advisors engage in more frequent proactive and/or developmental approaches to advising following the release of a student success dashboard?
  2. How do advisors make sense of the impact of a student success dashboard on their work in supporting students?

Context, Data, and Methods

Institutional Setting

The institutional setting of this study adds to its impact in several ways. First, the largest percentage of U.S. students attending four-year colleges (39%) enroll in semi-selective, public universities like the one in which this study takes place; yet, these institutions graduate only 59% of new students within six years, according to the latest data (Carnevale & Van Der Werf, 2017; National Center for Education Statistics, 2022, table 326.15). Second, advising at the focal university reflects a decentralized model where department- and college-based advisors typically serve large caseloads of hundreds of students each, a common advising format that has been shown to offer advisors less time to work one-on-one with students (Cuseo, 2003a; Fosnacht et al., 2017). Third, a large proportion of the 40,000 students at this university are the first in their families to attend college (28%),2 from a racial or ethnic group minoritized within higher education (43%), and receive Pell grants (29%; Institute of Education Sciences, 2020). Thus, this setting is in many ways typical and also plays a critical role supporting the success and retention of underserved groups within higher education more broadly (Tinto, 2012). Therefore, to the extent that new advising technologies impact advising practice and in turn, student success, the institutional context for this study is a useful one.

Dashboard Development and Implementation

The advising data dashboard was developed primarily in 2017 in partnership between staff in academic affairs, advising, and university analytics as a way to offer advisors actionable information in support of student success. A small handful of advisors were asked to provide regular feedback on its design during development. A recent landscape analysis showed that most institutions similarly take the “homegrown approach” to modeling student success tools and that these are primarily geared toward advisors (Parnell et al., 2018). Though many of the data points within the dashboard (e.g., student GPA, registration status, residency) are available to advisors in other areas of the university analytics platform, the dashboard presented a broader range of indicators, as well as some measures newly created for the dashboard (for example, a downward trending GPA) in one place that could be filtered to the advisor’s caseload or used to look up a single student. To encourage dashboard adoption leading up to and during implementation, updates on the dashboard were presented frequently to advising leaders within the colleges, hands-on workshops were offered for the advising community, and the new tool was highlighted periodically in the university’s advising newsletter.
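To make the dashboard's core logic concrete, the sketch below illustrates how per-student risk flags might be tallied and summarized for an advisor's caseload. It is a hypothetical illustration only: the flag definitions, thresholds, and column names are ours and are not drawn from the university's actual tool.

```python
import pandas as pd

# Hypothetical illustration: the flag definitions, thresholds, and column names
# below are assumptions, not the university's; they sketch how a dashboard might
# count risk factors per student and summarize them for one advisor's caseload.
def add_flag_counts(students: pd.DataFrame) -> pd.DataFrame:
    flags = pd.DataFrame(index=students.index)
    flags["low_gpa"] = students["gpa"] < 2.0                        # assumed threshold
    flags["gpa_trending_down"] = students["gpa"] < students["prior_term_gpa"]
    flags["not_registered"] = ~students["registered_next_term"]
    return students.assign(flag_count=flags.sum(axis=1))

def caseload_profile(students: pd.DataFrame, advisor_id: str) -> pd.Series:
    """Distribution of flag counts across one advisor's caseload."""
    caseload = students[students["advisor_id"] == advisor_id]
    return caseload["flag_count"].value_counts().sort_index()
```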

Analytical Approach

We addressed our two research questions using a sequential explanatory mixed-methods design (Creswell, 2009). In doing so, we took a quantitative approach to answering our first question and used surveys of advisors, administered before and after the launch of the advising data dashboard, to identify any changes in self-reported proactive and/or developmental advising practices. We then took a qualitative approach via in-depth interviews to answer our second question and understand the impact of this tool on advisors’ work supporting students from an advising perspective.

Quantitative Methods

Survey Data Collection

All members of the advising community were invited via email in 2017 and 2018—before and after the launch of the dashboard—to participate in a short, online survey about their work supporting students and the tools and technology they use. The focal university had approximately 150 advisors in both years; response rates were 66% in 2017 and 63% in 2018. Participating advisors were compensated with a $10 gift card. In all, 58 advisors—38% of those in the 2018 wave—took both surveys, allowing for individual-level comparisons of advising practice before and after the tool’s launch.

Instrument Design and Measures

Embedded within the surveys were a number of items adapted from the extant advising literature to identify proactive and developmental approaches to advising (Cuseo, 2003b; Grites, 2013; Szymanska, 2011; Winston & Sandor, 1984). Proactive advising is measured with a three-item scale averaging how often advisors report using student data outside of appointments to identify and respond to various student needs (0 = never to 4 = daily) and with a single item capturing the extent to which they agree that “I often reach out to students without them contacting me first” (0 = strongly disagree to 4 = strongly agree). Developmental advising is measured analogously using a series of items asking advisors how often they cover a range of topics during advising appointments related to students’ academic planning and support, extracurriculars, and wellbeing (0 = never to 4 = always) and a single item asking whether they agree that “Academic advising contributes to my students’ personal growth and development” (0 = strongly disagree to 4 = strongly agree). Statistical details for all scales, including Cronbach’s alpha as a measure of reliability, are included in Appendix A. Finally, the fall 2018 survey included a measure of advisors’ self-reported use of the newly released advising data dashboard (0 = never to 4 = daily), but was otherwise identical to the 2017 wave.
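For readers interested in constructing similar measures from their own survey data, the following minimal sketch shows the general approach: averaging item responses into a scale score and computing Cronbach's alpha for a block of items. The column names are hypothetical placeholders rather than the actual survey variables.

```python
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha for a block of Likert items (one column per item)."""
    k = items.shape[1]
    item_variances = items.var(ddof=1)               # variance of each item
    total_variance = items.sum(axis=1).var(ddof=1)   # variance of the summed score
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical variable names standing in for the three proactive data-use items.
PROACTIVE_ITEMS = ["data_use_incomplete", "data_use_struggling", "data_use_excelling"]

def proactive_scale(responses: pd.DataFrame) -> pd.Series:
    """Scale score per advisor: the mean of the three 0-4 item responses."""
    return responses[PROACTIVE_ITEMS].mean(axis=1)
```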

Data Analysis

The online survey interface used to distribute the survey allowed advisors’ responses to the fall 2018 survey (after the release of the dashboard) to be linked to their previous responses to the fall 2017 survey (before the release of the dashboard) using a unique identifier. A paired-sample t-test was used to identify whether average year-over-year changes in proactive and developmental approaches were statistically different from zero in the analytic sample, i.e., the 58 advisors who completed both surveys. We also disaggregated the data by dashboard usage to examine whether changes in practice were concentrated only among those advisors who reported using the tool (N = 33, 57% of the analytic sample).
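A minimal sketch of this paired comparison, assuming a data frame with one row per advisor who completed both waves and hypothetical year-suffixed column names, might look like the following.

```python
import pandas as pd
from scipy import stats

def paired_change(linked: pd.DataFrame, measure: str):
    """Paired-sample t-test of pre/post means for one advising measure.

    `linked` is assumed to hold one row per advisor who completed both waves,
    with hypothetical columns such as 'proactive_data_use_2017' and
    'proactive_data_use_2018' produced by joining the surveys on the identifier.
    """
    pre, post = linked[f"{measure}_2017"], linked[f"{measure}_2018"]
    result = stats.ttest_rel(post, pre)  # is the mean year-over-year change nonzero?
    return post.mean() - pre.mean(), result.statistic, result.pvalue

# Restricting the same test to dashboard adopters (reported use above "never"):
# adopters = linked[linked["dashboard_use_2018"] > 0]
# change, t, p = paired_change(adopters, "proactive_data_use")
```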

Qualitative Methods

Data Collection

Advisors were invited to participate in interviews using a purposeful selection approach (Maxwell, 2012) based on their 2018 survey responses to ensure that a range of colleges within the focal university, levels of reported familiarity with the dashboard, and proactive and developmental advising practices were represented among our interviewees. A total of 36 advisors were invited, and 27 agreed to be interviewed. Interviews were semi-structured with open-ended questions that allowed for focused but conversational dialogue. Our protocol included 21 general questions, including 6 about technology and the dashboard (see Appendix B), and the interviewer asked more specific follow-up questions arising from the conversation to probe for additional detail and allow a more organic conversation around the most relevant issues. The protocol was designed by the research team and guided by our research questions, as well as our combined institutional knowledge regarding academic advising and the dashboard. It inquired about advising roles and philosophies, work with students, the challenges and rewards of advising, and advising technologies including the new dashboard. Participating advisors were compensated with a $20 gift card. Interviews lasted about an hour and were transcribed verbatim and imported into NVivo qualitative software for coding and analysis.

Data Analysis

Our qualitative analytical approach was iterative and based largely on the principles of flexible coding proposed by Deterding and Waters (2018), which is particularly useful for collaborative coding using qualitative data analysis software. The transcripts were first indexed using broad categories based on the major sections of the interview protocol. Next, three members of the research team coded clean versions of several transcripts, developing new sub-codes within the pertinent sections of the interviews (dashboard reactions and use, advising and job challenges, advising philosophy, advising rewards, advising technology, dashboard challenges, and students’ needs and issues). For example, “advising rewards” was divided into student success, helping students, and relationships with students. This coding process used first-cycle coding strategies including concept, emotion, and in vivo coding, as well as a final layer of overarching themes emerging across the interviews (Saldaña, 2015). The coders met multiple times to compare codes and come to agreement around a strategy for using, modifying, and adding to the initial codes. Following this consensus, and consistent with the approach described by Saldaña (2015), the second author took the lead on coding the remaining transcripts in frequent consultation with their co-authors. Thus, multiple researchers were involved at each point during coding and theme development, which enhanced the validity of the findings by reducing the impact of any biases or assumptions brought to bear on the data by any one researcher (Maxwell, 2012). Finally, we used some of the analytical tools within NVivo to explore patterns in the prevalence of our various codes among the advisors we interviewed.

Results

Quantifying Change in Advising Practice

Our first research question asks whether the advisors at the focal university engaged in more proactive and/or developmental advising following the rollout of the advising data dashboard. Table 1 presents a pre- and post-launch comparison of advising approaches for all 58 advisors taking both surveys (left side of table) and for the 33 advisors (57% of all advisors in both waves) taking both surveys who also reported at least occasional use of the advising data dashboard (right side of table). Neither set of comparisons shows significant differences on any of the measures of proactive and developmental advising according to a paired-sample t-test (p < .05). Thus, the quantitative portion of our study suggests both meager adoption of the dashboard (i.e., only about half of advisors reported using it at all) and no measurable, significant changes in advising practice among the entire advising community or the subset of dashboard adopters.

Table 1. Proactive and Developmental Advising Before and After the Release of the Advising Data Dashboard Among All Advisors and Those Using the Dashboard

| Measure | All advisors: Pre Mean (SD) | All advisors: Post Mean (SD) | p | Dashboard users: Pre Mean (SD) | Dashboard users: Post Mean (SD) | p |
| --- | --- | --- | --- | --- | --- | --- |
| Proactive advising | | | | | | |
| “I often reach out to students without them contacting me first” | 2.83 (1.08) | 2.88 (0.90) | | 2.94 (1.00) | 3.06 (0.70) | |
| Proactive data use | 2.29 (1.02) | 2.12 (0.97) | | 2.31 (0.85) | 2.21 (0.91) | |
| Developmental advising | | | | | | |
| “Academic advising contributes to my students’ personal growth and development” | 3.66 (0.51) | 3.69 (0.47) | | 3.70 (0.47) | 3.70 (0.47) | |
| Developmental advising scales | | | | | | |
| Academic planning | 2.46 (0.50) | 2.50 (0.55) | | 2.54 (0.46) | 2.60 (0.54) | |
| Academic support | 1.76 (0.51) | 1.81 (0.62) | | 1.78 (0.54) | 1.87 (0.58) | |
| Extracurriculars | 1.69 (0.61) | 1.73 (0.68) | | 1.68 (0.63) | 1.76 (0.62) | |
| Student wellbeing | 1.49 (0.58) | 1.57 (0.86) | | 1.60 (0.66) | 1.56 (0.73) | |
| Number of advisors | 58 | | | 33 | | |

Note. Tests report whether advisors completing both surveys showed a change in practice according to a paired-sample t-test.

*p < .05

Qualitative Explanations for a Lack of Impact

Our second research question asks how advisors make sense of the impact of the advising data dashboard on their work. We turn to the qualitative data within our mixed-methods design to help us understand the perhaps surprising finding that the dashboard appeared to have no impact on how advisors support students. Of the 27 advisors we interviewed, 9 were unfamiliar with it, 13 were familiar with it but did not use it, and only 5 reported using it. Yet, when discussing the dashboard, advisors—regardless of their level of familiarity with the new tool—repeatedly described a set of barriers that undermined its impact on their work with students. These barriers pertain to problems with functionality, a reluctance to invest scarce time in learning new—and likely to change—technologies, and a disconnect between the tool and the challenges and rewards of advising within a decentralized system.

A Work in Progress

The most prevalent theme within advisors’ largely negative reaction to the tool during our interviews had to do with its functionality. The majority of the respondents who were familiar with the dashboard (10 respondents of 18) viewed the information presented as redundant with information they could access through other, more user-friendly tools. In that same vein, about half of all of our respondents (13) continued to prefer materials they had created or customized—such as reports and spreadsheets—over the new system. Seven advisors felt that it was missing key pieces of information or functionality needed for their roles. For example, one advisor shared, “[The dashboard is] okay, because it’s pretty, and it’s visually good. It doesn’t have all the information I want or that I use, and so I just continue to use my reports.” Another advisor said that many of their peers were holding off on using the tool until it was improved, saying, “I think most of us are just waiting for it to get worked out before using it.” Dashboard users encountered additional problems with the perceived accuracy of the data (N = 4) as well as transparency around the flags used to identify students who may need extra support (N = 2). One advisor discussed their discomfort with the dashboard flagging a recent change of major as a risk factor, saying,

Yeah. I understand that as far as graduating in four years. Yes, that’s going to be a hindrance, but I think changing your major is part of college. You know what I mean? It’s part of exploring what you enjoy, so I personally don’t know that, that should be a flag, and am I going to reach out to a student who has three flags and one of them is that?

Taken together, dashboard adoption was undermined by advisors’ continued preference for their existing systems and lack of confidence in the functionality and validity of the dashboard and the data it presented.

Time and Cost of Early Adoption

Going a bit deeper, a second theme that advisors frequently pointed to was a reluctance to invest precious time in new technologies, particularly when there was concern that the dashboard might be replaced by yet another new tool and therefore become obsolete. Throughout our in-depth interviews, time management and large caseloads were frequently discussed as major stressors, arising in 16 and 17 of our interviews, respectively. Most of the advisors we spoke with had caseloads of a few hundred students, which they described as undermining both their adoption of the tool and their ability to work with students in the ways they would like.

To illustrate, advisors did link their shortage of time and their large caseloads to their difficulty engaging in developmental and proactive advising with students, but not in a way that suggested a student success dashboard would be especially relevant. One advisor shared their frustrations about not being able to offer more developmental opportunities to students:

I would love to have more time to do things specifically for my students. Things like workshops, things like bring in guest speakers from industry . . . something that students need, I’ve had them ask personally for it, and it’s just like there’s no time in the day for certain things.

Another advisor discussed how this scarcity of time undermines their ability to be proactive in their role, saying, “I think if I could spend more time checking in with students, that would be really nice, but just with the amount of students we have, it’s not possible.” Therefore, rather than see the tool as opening up time that the advisor could use for developmental or proactive engagement with students, advisors saw scarce time as limiting both their desired approaches to advising and their adoption of the tool.

A final consideration among the advisors related to their reluctance to invest time in learning the new technology was their perception that new technologies were continually coming and going. Twenty advisors pointed to past experiences of being trained or encouraged to adopt new technologies only to have those technologies replaced or withdrawn relatively quickly. One advisor summarized their reluctance saying, “That’s the other thing, you don’t want to invest your time really becoming proficient in a program, and then it being dismantled once you’ve mastered everything.” Overall, the approach for most seemed to be a “wait and see” philosophy in which they were reluctant to invest time or energy into a program unless they were directly asked to, or it had proven to be worthwhile.

A Disconnect Around Supporting Students

Taking a step back, additional clues as to why advisors were reluctant to adopt the advising dashboard or see it as a solution surfaced in the wider conversations we had with advisors around the rewards and frustrations within their roles. We learned that although advisors’ roles and needs varied substantially within the decentralized structure of the university, the advisors we spoke with were all primarily motivated by serving students within the one-on-one context of the advising relationship. This is evidenced by the primary rewards advisors reported from their roles: student success (30 references), helping students (21), and relationships with students (20). Further, when advisors reflected on challenges in their roles advising students, they identified 16 unique types of challenges and most often pointed to policies and procedures within the university (65 references), inconsistency across units (61), and caseloads (42), and never to issues related to data availability (0). Thus, the dashboard was largely not seen as relevant to the benefits and challenges within advisors’ work with students.

Going a bit deeper into the content of our conversations, almost all of the advisors we spoke with—22 of 27—spent some time breaking down how their roles and responsibilities were shaped by the department or college in which they worked. One advisor explained how this often made it difficult to adopt across-the-board changes within advising, even relatively small changes in business practices. In sharing an anecdote related to introducing walk-in times for students, they shared,

Because there was something like, “Oh, we can like have the advising coordinator for each college designate walk-in times,” and this and that. And I’m like, “Wait, [our college’s] advising is decentralized.” And I’m by myself, but [another major] has two advisors and they work off of the same appointment and walk in schedule, which is great for them. But what they do is different than what I do. So . . . we need to set our own calendars.

Further, when advisors identified challenges in their work, they pointed to policies that created persistent roadblocks for students and a lack of power for advisors within decision-making. When asked about the biggest frustrations in their role, one advisor shared,

It’s related to policies. I just feel like that’s another challenge for advisors in general. It’s that most of the time we get no say, we have no votes, we have no voting members. We’re not kept in the loop about anything and then we’re the final enforcers.

Although advisors described a wide range of roles and challenges, all but two advisors we spoke with described engaging with students as their greatest reward in ways which could present a barrier to adoption of the new dashboard. For some, it was about being able to help students: “Well, I think ultimately helping students . . . [it’s] certainly rewarding to really feel like you’ve made a difference in their day.” For others, it was about working with a wide variety of students and seeing them succeed,

But I get joy out of the differences and the type of students I get in my office. They’re all across the board. Even if I get frustrating ones, I think it’s rewarding when you actually break through with them and help them learn.

Several advisors specifically discussed helping students overcome barriers. For example, one shared that the best part of their work was, “students being successful, in even the tiniest ways. They’re on probation and they get one B and the rest are C’s, it’s still success. You celebrate that success with them and congratulate them on it.” Another advisor told us, “High performers are great. I like to see them be successful too. But there’s something about when a student sort of stumbles but then finishes really strong, that I like.” Others distilled the best part of their jobs to graduation, the ultimate symbol of student success: “It’s when they graduate. I love to see them graduate. I really do.” Another shared that they cry every year at commencement. Once again, advisors did not connect their enthusiasm for their role in student success to the advising dashboard or other data-driven tools.

Discussion

A growing focus on student success and institutional interest in leveraging student data have led to increasing investments in software solutions meant to identify students at risk (Attewell et al., 2022). Yet, no longitudinal studies to date have brought a practitioner lens to the open question of whether new technologies fulfill their promise to allow advisors to work more efficiently and effectively in supporting student success. This mixed-methods study takes a first step towards filling this gap by gathering quantitative and qualitative evidence during the rollout of a new data dashboard at a large public university. After first quantifying whether advisors were able to engage with their students using more proactive or developmental advising approaches following the rollout of the advising data dashboard (RQ1), a series of in-depth interviews allowed us to make sense of advisors’ experience of adopting the new technology (RQ2).

We learned that the data dashboard was only minimally adopted by the advising community. Moreover, no changes were observed in advisors’ reports of engaging in proactive or developmental advising behaviors. Our qualitative findings underscored dissatisfaction with the functionality of the tool, a hesitancy to adopt a new tool that could soon be phased out,3 and perhaps most importantly, a disconnect between the tool and the challenges and rewards of advising. Specifically, advisors saw the dashboard as largely unrelated to aspects of their work they would like to improve and instead, as potentially undermining their ability to work with and support students as unique individuals by flagging predetermined risk factors.

In our study, individual advisors’ accounts of their use of the data dashboard pointed to a set of barriers to adoption that were largely structural. Some insights from the social construction of technology and more recent frameworks from higher education help make sense of these findings. Early on, the success of new technologies was thought to hinge on the relevant groups of social actors reaching agreement around their own flexible interpretations of a new technology (Pinch & Bijker, 1984). Though influential, this view has been criticized for inadequate attention to asymmetries of power in the design and negotiation process and other constraining structural factors (Klein & Kleinman, 2002). From this perspective, structures represent the “rules of play” which define capacities, opportunities, and dynamics of power (Klein & Kleinman, 2002, p. 35). In our case, though the dashboard was developed for advisors, with a select subset of advisors invited to participate in the process, it ultimately struggled to gain adoption in large part because of structural barriers rooted in the larger advising community’s day-to-day work and orientation toward supporting students.

More practically speaking, the Community College Research Center introduced a framework for the adoption of advising technologies, like the one in our study, that brings organizational behavior insights to how college and departmental cultures influence whether individuals adopt new technologies (Karp & Fletcher, 2014). Their readiness framework considers the technological and organizational cultural aspects of readiness at the institution and project level and also points to some of the barriers influencing the lack of adoption we observed. For example, their framework considers the motivational readiness of potential adopters, including their perceived need for reforms and vision of the benefits of the new technology (Karp & Fletcher, 2014). This framework should be of keen interest to others considering adopting similar technologies.

Implications for Research and Practice

Our study underscores the value of mixed-methods approaches to studying changes in practice and within higher education more generally. Our quantitative finding that the launch of the student success dashboard had no measurable impact on advising guided our attention towards an analysis of underlying barriers to adoption. Specifically, we linked advisors’ lack of engagement with the tool to realities within their day-to-day work, their needs and challenges, and how advisors think about student risk. A recent framework for interpreting null results within educational research argues that null results can point towards contextual factors—related to systems, people, and policies—that may help or hinder implementation (Jacob et al., 2019). In our case, our sequential mixed-methods design allows us to offer larger insights by qualitatively addressing the contextual “why not” behind the lack of impact.

It is important to note that as the first longitudinal, empirical study in this area, the work presented here does not definitively show that advising technologies like student success dashboards or risk-management software are incompatible with supporting best practices in advising. Additional work is needed to understand whether and why these patterns might vary across contexts with attention to institutional settings, implementation strategies, and the design and functionality of new tools. For example, it may be the case that significant retention gains are more feasible when institutions commit to more complete, centralized redesigns of advising services and the student-success strategy as a whole (Civitas Learning, 2019; McMurtrie, 2018). Instead, our findings are more in line with the practitioner, and especially advisor, dissatisfaction highlighted in retrospective work on new advising technologies (Coffin, 2018; Klempin et al., 2018). Given that these tools are very likely to remain a growing trend, this is a pressing area for future research. Furthermore, studies should also examine how asymmetries of power in a university’s organizational structure impact the advising function more generally and, in turn, the adoption and impact of new tools and student success strategies.

Growing interest in student success technologies and the challenges to adoption detailed here also highlight important considerations for practice. Universities considering adoption—and especially tools related to managing risk—would do well to consider any and all barriers to implementation across their campuses like those discussed here and within Karp and Fletcher (2014). In doing so, it is critical that university leaders create authentic opportunities for a large and diverse group of advisors to contribute early and often to these discussions during piloting and afterwards so that there can be a shared perspective on how to best equip advisors to support their students. It is possible that the structural barriers identified here—including dissatisfaction with the functionality of the tool, limited time to learn new technologies, and a disconnect around how to best support students—may be surmountable with more authentic collaboration between advisors and technology decision-makers.

Finally, scholars, practitioners, and assessment professionals could consider drawing on the quantitative measures used here to capture developmental and proactive advising approaches to learn about existing practices within their institutions and explore how these might shift over time as institutions work to address barriers to effective advising. Though novel to this paper, the scales presented here are derived from the advising scholarship and held together well statistically, in terms of Cronbach’s alpha as a measure of internal consistency, across two cross-sections of a large advising population with quite a bit of variability in advising roles and philosophies. Validation and refinement of these measures across advising contexts within different institutions would be valuable as well.

Conclusion

Given the growing trend of new technologies intended to help advisors identify and support at-risk students, empirical research from an advising perspective is sorely needed. This paper examined through a mixed-methods lens how a newly introduced student success dashboard impacted proactive and developmental advising at a large public university. Our findings point to significant structural barriers that undermined measurable adoption and change in advising practice. In advancing technologies to support student success, universities should carefully partner with advising communities to understand their needs and priorities in the context of their day-to-day work with students. Particularly as engagement with students becomes increasingly digital, scholarship that considers structural aspects of technology and adoption is needed for the wellbeing and success of practitioners and the students they serve.


Author Note: We have no conflicts of interest to disclose. This work was funded by a grant from the National Academic Advising Association (NACADA).

1 For a comprehensive review of advising technologies, see Parnell et al. (2018).

2 Per the university’s assessment and research website.

3 To date, no significantly modified version of the tool has been developed.

References

Aiken-Wisniewski, S. A., Johnson, A., Larson, J., & Barkemeyer, J. (2015). A preliminary report of advisor perceptions of advising and of a profession. NACADA Journal, 35(2), 60–70. https://doi.org/10.12930/NACADA-14-020

Attewell, P., Maggio, C., Tucker, F., Brooks, J., Giani, M., Hu, X., Massa, T., Raoking, F., Walling, D., & Wilson, N. (2022). Early indicators of student success: A multi-state analysis. Journal of Postsecondary Student Success, 1(4), 35–53. https://doi.org/10.33009/fsop_jpss130588

Calhoun-Brown, A. (2023, January 9). How data and technology can improve advising and equity. The Chronicle of Higher Education. https://www.chronicle.com/article/how-data-and-technology-can-improve-advising-and-equity

Campbell, J. P., DeBlois, P. B., & Oblinger, D. G. (2007). Academic analytics: A new tool for a new era. EDUCAUSE Review, 42(4), 40–57.

Carnevale, A. P., & Van Der Werf, M. (2017). The 20% solution: Selective colleges can afford to admit more Pell grant recipients. Georgetown University Center on Education and the Workforce. https://cew.georgetown.edu/wp-content/uploads/The-20-Percent-Solution-web.pdf

Civitas Learning. (2019). What really works: A review of student success initiatives.

Coffin, A. (2018). Implementing academic analytics and the impact to academic advising [Unpublished doctoral dissertation]. University of Kansas.

Creswell, J. W. (2009). Research design: Qualitative, quantitative, and mixed methods approaches (3rd ed.). Sage Publications.

Cuseo, J. (2003a). Academic advisement and student retention: Empirical connections & systemic interventions. NACADA. https://www.nacada.ksu.edu/Resources/Clearinghouse/Retention.aspx

Cuseo, J. (2003b). Assessment of academic advisors and academic advising programs. NACADA. https://www.nacada.ksu.edu/Portals/0/CandIGDivision/documents/assessment%20of%20advising%20resources/Cuseo_Marymount1.pdf

Deterding, N. M., & Waters, M. C. (2018). Flexible coding of in-depth interviews: A twenty-first-century approach. Sociological Methods & Research, 50(2), 708–739. https://doi.org/10.1177/0049124118799377

Drake, J. K. (2011). The role of academic advising in student retention and persistence. About Campus, 16(3), 8–12. https://doi.org/10.1002/abc.20062

Faulconer, J., Geissler, J., Majewski, D., & Trifilo, J. (2013). Adoption of an early-alert system to support university student success. Delta Kappa Gamma Bulletin, 80(2), 45–48.

Felten, P., & Lambert, L. M. (2020). Relationship-rich education: How human connections drive success in college. Johns Hopkins University Press.

Fosnacht, K., McCormick, A. C., Nailos, J. N., & Ribera, A. K. (2017). Frequency of first-year student interactions with advisors. NACADA Journal, 37(1), 74–86. https://doi.org/10.12930/NACADA-15-048

Gardner, L. (2019, October 13). Students under surveillance? Data-tracking enters a provocative new phase. The Chronicle of Higher Education. https://www.chronicle.com/article/Students-Under-Surveillance-/247312

Goodman, K. M., & Cole, D. (Eds.). (2017). Using data-informed decision making to improve student affairs practice. Jossey-Bass.

Gordon, V. N., Habley, W. R., & Grites, T. J. (2008). Academic advising: A comprehensive handbook (2nd ed.). Jossey-Bass.

Grites, T. J. (2013). Developmental academic advising: A 40-year context. NACADA Journal, 33(1), 5–15. https://doi.org/10.12930/NACADA-13-123

He, Y., Hutson, B. L., Bloom, J. L., & Cuevas, A. P. (2020). Advisor beliefs, practices, and perceptions of well-being: Development of an advisor self-evaluation instrument. NACADA Journal, 40(1), 23–35. https://doi.org/10.12930/NACADA-18-02

Howard, F. (2017). Undocumented students in higher education: A case study exploring street-level bureaucracy in academic advising [Unpublished doctoral dissertation]. Virginia Commonwealth University.

Hughey, J. K. (2011). Strategies to enhance interpersonal relations in academic advising. NACADA Journal, 31(2), 22–32. https://doi.org/10.12930/0271-9517-31.2.22

Institute of Education Sciences. (2020). College navigator. National Center for Education Statistics: College Navigator. https://nces.ed.gov/collegenavigator/

Jacob, R. T., Doolittle, F., Kemple, J., & Somers, M.-A. (2019). A framework for learning from null results. Educational Researcher, 48(9), 580–589. https://doi.org/10.3102/0013189X19891955

Joslin, J. E. (2018). The case for strategic academic advising management. New Directions for Higher Education, 2018(184), 11–20. https://doi.org/10.1002/he.20299

Kalamkarian, H. S., Boynton, M., & Lopez, A. G. (2018). Redesigning advising with the help of technology: Early experiences of three institutions. Community College Research Center, Teachers College, Columbia University.

Karp, M. J. M., & Fletcher, J. (2014). Adopting new technologies for student success: A readiness framework. Community College Research Center, Teachers College, Columbia University.

Klein, H. K., & Kleinman, D. L. (2002). The social construction of technology: Structural considerations. Science, Technology, & Human Values, 27(1), 28–52. https://doi.org/10.1177/016224390202700102

Klempin, S., Grant, M., & Ramos, M. (2018). Practitioner perspectives on the use of predictive analytics in targeted advising for college students (No. 103; CCRC Working Paper). Community College Research Center, Teachers College, Columbia University.

Krumm, A. E., Waddington, R. J., Teasley, S. D., & Lonn, S. (2014). A learning management system-based early warning system for academic advising in undergraduate engineering. In J. A. Larusson & B. White (Eds.), Learning analytics: From research to practice (pp. 103–119). Springer.

Kuh, G. D. (2008). High-impact educational practices: What they are, who has access to them, and why they matter. Association of American Colleges and Universities.

Lipsky, M. (2010). Street-level bureaucracy: Dilemmas of the individual in public service. Russell Sage Foundation.

Mattei, N., Dodson, T., Guerin, J. T., Goldsmith, J., & Mazur, J. M. (2014). Lessons learned from development of a software tool to support academic advising. arXiv, 1–8.

Maxwell, J. A. (2012). Qualitative research design: An interactive approach (Vol. 41). Sage Publications.

McMurtrie, B. (2018, May 25). Georgia State U. made its graduation rate jump. How? The Chronicle of Higher Education. https://www.chronicle.com/article/Georgia-State-U-Made-Its/243514

National Center for Education Statistics. (2022). Digest of education statistics 2020. U.S. Department of Education. https://nces.ed.gov/pubsearch/pubsinfo.asp?pubid=2022009

Ohrablo, S. (2018). High-impact advising: A guide for academic advisors. Academic Impressions.

Parnell, A., Jones, D., Wesaw, A., & Brooks, D. C. (2018). Institutions’ use of data and analytics for student success. EDUCAUSE: Center for Analysis and Research.

Parnell, A. (2020). Advancing from prediction to prescription: Strategies for proactively and thoughtfully addressing students’ needs. Journal of Postsecondary Student Success, 2(1), 1–11. https://doi.org/10.33009/fsop_jpss131554

Pasquini, L. A., & Steele, G. E. (2016). Technology in academic advising: Perceptions and practices in higher education (NACADA Technology in Advising Commission Sponsored Survey, 2013).

PBS NewsHour. (2016, June 28). Innovative program helps even the playing field for poor students—And boost graduation rates. PBS News Hour. https://www.pbs.org/newshour/show/innovative-program-helps-even-the-playing-field-for-poor-students-and-boost-graduation-rates

Phillips, E. D. (2013). Improving advising using technology and data analytics. Change: The Magazine of Higher Learning, 45(1), 48–55. https://doi.org/10.1080/00091383.2013.749151

Pinch, T. J., & Bijker, W. E. (1984). The social construction of facts and artefacts: Or how the sociology of science and the sociology of technology might benefit each other. Social Studies of Science, 14(3), 399–441. https://doi.org/10.1177/030631284014003004

Saldaña, J. (2015). The coding manual for qualitative researchers. Sage Publications.

Shaw, C., Atanasio, R., Bryant, G., Michel, L., & Nguyen, A. (2021). Driving toward a degree: Caseload’s impact on advising practices and student success. Tyton Partners. https://drivetodegree.org/wp-content/uploads/2021/06/TYT105_D2D21_01_Caseload_Rd9.pdf

Steele, G. E. (2016). Technology and academic advising. In T. J. Grites, M. A. Miller, & J. G. Voller (Eds.), Beyond foundations: Developing as a master academic advisor (pp. 305–325). Jossey-Bass.

Steele, G. E. (2018). Student success: Academic advising, student learning data, and technology. New Directions for Higher Education, 2018(184), 59–68. https://doi.org/10.1002/he.20303

Szymanska, I. (2011). Best practices for evaluating academic advising. UNC Charlotte. https://studylib.net/doc/18701779/best-practices-for-evaluating-academic-advising

Thomas, C., & McFarlane, B. (2018). Playing the long game: Surviving fads and creating lasting student success through academic advising. New Directions for Higher Education, 2018(184), 97–106. https://doi.org/10.1002/he.20306

Tinto, V. (2012). Leaving college: Rethinking the causes and cures of student attrition (2nd ed.). University of Chicago Press.

Troxel, W. G., Bridgen, S., Hutt, C., & Sullivan-Vance, K. A. (2021). Transformations in academic advising as a profession. New Directions for Higher Education, 2021(195–196), 23–33. https://doi.org/10.1002/he.20406

Tyton Partners. (2016). Driving toward a degree: Establishing a baseline on integrated approaches to planning and advising. https://drivetodegree.org/report-archive/driving-toward-degree-establishing-baseline-integrated-approaches-planning-advising/

Varney, J. (2012). Proactive (intrusive) advising. Academic Advising Today, 35(3), 1–3.

Vianden, J. (2016). Ties that bind: Academic advisors as agents of student relationship management. NACADA Journal, 36(1), 19–29. https://doi.org/10.12930/NACADA-15-026a

Wallace, S. O., & Wallace, B. A. (2016). Defining student success. In T. J. Grites, M. A. Miller, & J. G. Voller (Eds.), Beyond foundations: Developing as a master academic advisor (pp. 83–106). Jossey-Bass.

Winston, R. B., & Sandor, J. A. (1984). The academic advising inventory. NACADA. http://www.nacada.ksu.edu/Portals/0/Clearinghouse/links/documents/AAI-Inventory-Master.pdf

Appendix A. Proactive and Developmental Advising Scales

PROACTIVE ADVISING BEHAVIORS

How often advisor uses student data outside of appointments to identify students

Proactive Data Use (Cronbach’s alpha = 0.85)

| Item | Correlation with Total |
| --- | --- |
| Who had not yet completed something important | 0.68 |
| Who may be struggling or need support | 0.84 |
| Who are improving or excelling | 0.85 |

DEVELOPMENTAL ADVISING BEHAVIORS

How often advisor typically discusses each topic with students during advising appointments

Academic Planning (Cronbach’s alpha = 0.67)

| Item | Correlation with Total |
| --- | --- |
| Major/minor exploration | 0.67 |
| Progress toward their degree | 0.63 |
| Academic standing or probation status | 0.62 |
| Dropping or adding courses | 0.62 |
| Planning courses for future terms | 0.62 |
| Content of courses | 0.60 |

Academic Support (Cronbach’s alpha = 0.75)

| Item | Correlation with Total |
| --- | --- |
| Academic performance in class | 0.62 |
| Study skills | 0.73 |
| Concerns related to instructors | 0.69 |
| Academic policies | 0.70 |
| Transfer credit and policies | 0.75 |

Extracurriculars (Cronbach’s alpha = 0.85)

| Item | Correlation with Total |
| --- | --- |
| Participation in extracurriculars | 0.81 |
| Internships or engagement opportunities | 0.81 |
| Career goals or alternatives | 0.80 |
| Going to graduate school | 0.81 |
| Paid work outside of school | 0.85 |

Student Wellbeing (Cronbach’s alpha = 0.93)

| Item | Correlation with Total |
| --- | --- |
| Family or relationship issues | 0.90 |
| Personal concerns or problems | 0.89 |
| Physical or emotional health and wellbeing | 0.92 |
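For reference, the reliability values reported above follow the standard formula for Cronbach's alpha for a scale of k items, which compares the sum of the individual item variances to the variance of the total scale score:

$$\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right)$$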

Appendix B. Interview Protocol

  1. Can you begin by telling me what you understand to be your main purpose, or function, as an advisor here at this university?
  2. What is your own personal approach or philosophy about your role as an advisor?
    a. How has this changed over time, if at all?
  3. Can you walk me through a typical advising meeting and what it would entail?
  4. How would you describe an ideal advisor/student relationship? An ideal advisor/student interaction?
  5. In terms of your job duties, describe your duties and the time you tend to spend on each type of task as an advisor.
  6. Is there anything you do to prepare for your advising meetings with students?
  7. Can you describe what you do to prepare?
  8. Is there any way you ever reach out to students and why do you do that?
  9. What do you see as students’ greatest needs?
  10. In what ways do you feel capable or not capable to meet those needs?
  11. Can you talk with me about the challenges you face in your efforts to advise students here?
    a. What types of things have you done in the past in order to address any of those challenges?
  12. What are the greatest rewards you get from your position as an advisor?
  13. What technologies do you routinely use in working with students?
  14. Can you talk about your familiarity with or awareness of the advising dashboard?
  15. Can you remember for me how you were initially introduced to the data dashboard and what your first thoughts were about it?
  16. How do you feel about this whole process?
    a. Changes like this at this university?
    b. The rolling out of new technologies?
    c. This dashboard in particular?
  17. Given your thoughts and feelings on this, to what extent have you tried to implement the data dashboard into your advising practices?
  18. If you could change the data dashboard or how it’s used in any way, what would you change?
  19. If you could change anything about your job, what would that be?
    a. In what ways do you feel able to, or unable to, make some of these changes?
  20. If you could imagine a change in how students approach their advising meetings, what do you wish could change about students and what they do or how they think?
  21. What could the university do to make the advisor/student relationship more effective?