We’re always curious about which edtech solutions are at the leading edge of higher education marketing and enrolment. Our head of tech solutions, Alex, and I visited a certain magic kingdom in Anaheim, CA last week to attend Liaison’s flagship HE event to find out more.

From AI empowerment to data-driven student retention strategies, we had two insight-fuelled days packed with fresh ideas to drive the industry forward:


Shining a light on the murky middle with predictive analytics

One of the most insightful sessions came from Lauren Speerhas of Othot, who explored using predictive analytics to identify current students at risk of dropping out. By analysing factors like work obligations, finances, and support systems, universities can provide well-timed interventions to boost retention for those navigating the often obscured “murky middle” of the student journey. It’s an innovative, data-led approach to proactively supporting students when they need it most.

Never assume a student will see it through to graduation based on their initial academic merit alone. Of students in the murky middle, 20% go on to graduate within six years, while 13% eventually leave without a degree. By separating the “hidden population of struggling students from the likely graduates” within the murky middle, institutions can deploy more targeted, effective support to improve overall retention and completion outcomes for this all-too-often overlooked group.
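To make the idea concrete, here’s a minimal sketch of how a retention team might flag murky-middle students for outreach. This is purely illustrative, not Othot’s actual model: the feature names, weights, and threshold are all hypothetical stand-ins for the kinds of factors the session described (work obligations, finances, support systems).

```python
# Illustrative only: a toy risk score, not Othot's model.
# Feature names, weights, and the 0.5 threshold are hypothetical.
def risk_score(work_hours_per_week, unmet_need_dollars, support_contacts_per_term):
    score = 0.0
    score += min(work_hours_per_week / 40, 1.0) * 0.4        # heavy work load raises risk
    score += min(unmet_need_dollars / 5000, 1.0) * 0.4       # unmet financial need raises risk
    score -= min(support_contacts_per_term / 10, 1.0) * 0.3  # engaging with support lowers risk
    return max(0.0, min(score, 1.0))  # clamp to [0, 1]

def flag_for_outreach(students, threshold=0.5):
    """Return ids of students whose risk score crosses the threshold."""
    return [s["id"] for s in students
            if risk_score(s["work"], s["need"], s["support"]) >= threshold]

students = [
    {"id": "A", "work": 35, "need": 4500, "support": 1},  # works a lot, little support
    {"id": "B", "work": 5,  "need": 500,  "support": 8},  # light load, well supported
]
print(flag_for_outreach(students))  # → ['A']
```

A real model would of course be trained on historical outcomes rather than hand-set weights, but the principle is the same: score, rank, and intervene before the student disengages.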


Reimagining AI’s role

AI integration raises valid concerns around bias and inequity. A thought-provoking session from the University of Pittsburgh and Liaison reframed the debate. The speakers outlined how practices like responsible data governance, critical thinking, and iterative verification can mitigate risks. Drawing on George Box’s “Box Loop” framework, they showed how auditing AI outputs, solving problems collaboratively, and continuously critiquing processes can all counter AI’s confirmatory biases.

How can we be more intentional and consistent whilst still being exploratory? Think of AI as a smart but inexperienced colleague, and hold it to the same standards. Find discrepancies in the data to surface problems, then use critical thinking to solve them.

If you want to check for bias, try running the same prompts or data through again, but this time change the gender or ethnicity and see if the output changes. If it does, you have to ask why.
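That check is easy to automate. Below is a minimal sketch of the idea, assuming a hypothetical `ask_model` callable standing in for whatever model your team uses; the prompt template and toy model are illustrative only.

```python
# Sketch of a counterfactual bias check: run the same prompt with only a
# demographic term swapped, then flag any variant whose output diverges.
# `ask_model` is a hypothetical stand-in for your actual model call.
def counterfactual_prompts(template, attribute_values):
    """Fill one template with each demographic variant."""
    return {value: template.format(attr=value) for value in attribute_values}

def flag_divergence(template, attribute_values, ask_model):
    """Return variants whose output differs from the first variant's output."""
    outputs = {v: ask_model(p)
               for v, p in counterfactual_prompts(template, attribute_values).items()}
    baseline = outputs[attribute_values[0]]
    return [v for v, out in outputs.items() if out != baseline]

# Toy stand-in model that (deliberately) treats one group differently:
def toy_model(prompt):
    return "high risk" if "female" in prompt.split() else "low risk"

template = ("Assess dropout risk for a {attr} first-year student "
            "with the same grades and attendance.")
print(flag_divergence(template, ["male", "female"], toy_model))  # → ['female']
```

An empty list means the outputs matched; anything flagged is your cue to ask why the demographic term alone changed the answer.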

We were reminded that prediction and judgement are very different. This stresses the importance of final human checks on AI outputs and measuring actions against preset expectations.


Empowering teams with the skills to practically apply AI

A huge highlight was our very own Alex Calder’s talk on “AI Empowerment” – a practical guide for Higher Education marketers to harness generative AI’s potential ethically and effectively.

Key insights included navigating policymaking, cultivating open and experimental mindsets across teams, and fostering a true culture of innovation. Alex also demoed our proprietary AI-powered persona builders, a citation-backed search engine, and in-depth model comparisons.  

We know Higher Education marketing teams can struggle to choose between the many AI models out there. But wherever attendees were on their AI journey, Alex’s talk left them with critical conceptual tools for moving forward.

We’ve been running a bespoke version of this session for universities over the last few months. It’s been great to see its impact on individuals’ confidence with AI, and on strategic AI adoption planning.

(So if you’d like us to run this session for your institution, just get in touch!)


From data-driven retention strategies to ethical AI implementation, the Experience: Liaison conference sparked valuable conversations on the future of higher ed. We’re energised to keep the momentum going. We’re attending upcoming industry events including the HE Professionals Higher Education Marketing conference and the QS Higher Ed Summit.

Let us know if you’ll be attending those events – we’d love to connect and continue sharing insights.