Demonstration Sessions at LAK12

A few notes here based on the Demonstration Sessions at LAK12. See previous post for context!


A 43K-student community college faces a course completion rate below 50%. SHERPA is a software architecture to address this: a class search engine, a personalization engine, and connections to various vendor systems (Banner etc.). It includes a course recommendation system to support full course selection. Seems to me that this could have XCRI / Course Data Programme implications.
It provides ‘nudges’ based on the selection criteria students declare in their ‘profiles’. The intention is to nudge students toward a more informed decision: it alerts students to course selection choices that may be ‘nudging’ them toward failure.
The system is designed for a single institution.
Nudges can be delivered by various methods, automated by time, or task-based via institutional services such as the students’ ‘To do list’, ‘Academic Plan’, and ‘Calendar’.
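The post doesn't describe SHERPA's internals, so purely as illustration, here is a minimal sketch of the nudge idea: check a student's course selections against their own declared criteria and emit a warning for each conflict. The `Profile` and `Course` fields and the rule thresholds are hypothetical, not taken from SHERPA.

```python
from dataclasses import dataclass, field

@dataclass
class Profile:
    """Hypothetical student profile with declared selection criteria."""
    name: str
    preferred_times: set = field(default_factory=set)  # e.g. {"morning"}
    max_commute_minutes: int = 60

@dataclass
class Course:
    code: str
    time_of_day: str
    commute_minutes: int

def nudges_for(profile: Profile, selection: list) -> list:
    """Return nudge messages for selections that conflict with the
    student's own declared criteria (choices 'nudging' them toward failure)."""
    messages = []
    for course in selection:
        if profile.preferred_times and course.time_of_day not in profile.preferred_times:
            messages.append(f"{course.code} meets in the {course.time_of_day}, "
                            f"outside your preferred times.")
        if course.commute_minutes > profile.max_commute_minutes:
            messages.append(f"{course.code} exceeds your stated commute limit.")
    return messages
```

In a real system these messages would then be routed through whichever delivery channel (to-do list, calendar, timed email) fits the student.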
Students were used as part of the service design group and in video and voice-over to promote new services.

E2 (Expert Electronic Coaching)

The ‘Better than Expected Project’ generates actionable intelligence based on a single course, paired with an adaptation of public health communication tools to act on it.
Student grades can be predicted accurately before students start the course, based on an analysis of 14 years of data on 48.5K students taking the course, drawn from an institutional data warehouse.
Their Grade Point Average is as good an indicator, though!
They took the quantitative analytics and used them in student interviews to add qualitative data, generating actionable intelligence.
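The post doesn't say which model the project uses, so as a minimal sketch of the underlying idea only: fit a simple relationship between a prior-performance indicator (here, prior GPA) and the grade earned in the course, using historical records, then predict before the term starts. The model choice (ordinary least squares) and the data are illustrative assumptions.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a*x + b with one predictor."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    a = cov / var
    b = mean_y - a * mean_x
    return a, b

# Hypothetical historical records: (prior GPA, grade earned in this course).
history = [(3.8, 3.9), (2.1, 2.0), (3.0, 3.1), (2.5, 2.4), (3.5, 3.6)]
a, b = fit_line([g for g, _ in history], [c for _, c in history])

def predict_grade(prior_gpa: float) -> float:
    """Predict a course grade before the term starts."""
    return a * prior_gpa + b
```

With 14 years of warehouse data behind it, even a model this simple can be accurate enough to flag at-risk students early, which is the project's point of departure for intervention.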
John Campbell’s ‘obligation of knowing’: how we expect students to perform, and what leads to success. How can approaches be tailored to optimize success? Computer Tailored Communication is one opportunity (see the Michigan Open Source Tailoring System). The effort lies in determining what you need to say. Input from students was of more use than input from staff. It includes testimonials about how students were successful, matched to individual students.
First measure of impact: 2.3% higher student achievement (it’s early days).

Instructional Designers to mediate the locus of control in adaptive eLearning Systems

The goal of Learning Analytics tends to be adaptive modelling. But who validates these models? The proposition here is that there is a tension between adaptable (HCI) and adaptive (AI) systems. Instructional Designers can play a role as pattern finders and as mediators of adaptation in learning-analytics-driven adaptive systems. Should we be building analytics systems to help instructional designers (not students)?
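The session doesn't specify an architecture, but the mediation idea can be sketched as follows: an adaptive engine only *proposes* changes, and each proposal must pass an instructional designer's review before it reaches the learner. All the names here (`Adaptation`, `MediatedAdaptiveSystem`) are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Adaptation:
    """A change an adaptive engine proposes for a learner's path."""
    learner_id: str
    action: str     # e.g. "insert remedial module"
    rationale: str  # the pattern the engine detected

class MediatedAdaptiveSystem:
    """Adaptive (AI) proposals pass through an instructional designer
    (the adaptable, HCI side) before they are applied to learners."""

    def __init__(self):
        self.pending = []
        self.applied = []

    def propose(self, adaptation: Adaptation) -> None:
        """The engine queues a proposal; nothing reaches the learner yet."""
        self.pending.append(adaptation)

    def review(self, approve) -> None:
        """Apply the designer's judgement (Adaptation -> bool) to each
        pending proposal: approved ones are applied, the rest are dropped."""
        for adaptation in self.pending:
            if approve(adaptation):
                self.applied.append(adaptation)
        self.pending = []
```

The design choice this makes concrete is exactly the session's question of the locus of control: the engine finds patterns, but the human designer decides which adaptations are valid.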
