
2014 EUNIS BI Conference Discussion Session

The conference so far has been entirely plenary lecture based. I think we’re all looking forward to some new formats. First up we have an expert-led discussion session. Our panel comprises Ora Fish, Stefano Rizzi, Bodo Rieger and Elsa Cardoso. Of the four, I’ve managed to make contact with three regarding the work Jisc and HESA have planned on the national business intelligence service for the UK.

The panel were asked to comment on the following:

1. Critical Success Factors for BI in HE
2. Vision for BI in the next 5 – 10 years
3. Recommendations and expectations for EUNIS Services

My own comments are marked >

1. CSFs for BI in HE
Ora Fish – The CSF is down to the individuals involved in the initiative; their energy, adaptability, relationships and knowledge.
> Nice one Ora – the capability of the team, its leadership and their efficacy are clearly key. So what attributes does a successful BI service implementation require? What distinguishes this from other projects? Presumably quite a lot, as this bridges the divide between IT and business leaders, tackles cultural issues of data use, touches on records management and attempts to translate data into knowledge palatable to a wide range of roles.

Hans Pongratz – agile systems and the business value they bring; how do they meet the needs of consumers (students, lecturers, managers)?

Alberto – the content of the data warehouse – look toward the employability of graduates, as this is your final product
> Absolutely true, but attributing graduate employability to a BI implementation seems a little on the tenuous side. Maybe here we’re touching on which data sources could assist in predicting employability, providing actionable insights and giving people the opportunity to act on them. I quite like this as an alternative to analytics that predict success at university.

Elsa – People, strategy, technology and processes are the four areas. Ora covered people; the BI strategy should map to institutional strategy; technologies should map to the architecture; and processes should be embraced and enhanced to smooth the way.

> Can’t fault that as a response! From the floor came a note of a key failure factor: over-reliance on a single person who became so integral that the implementation was entirely dependent on them.

Stefano Rizzi – the importance of a functioning BICC (Business Intelligence Competency Centre) and its role in data cleanliness and quality
> Definitely a required capability, but the BICC seems to me an ethereal entity, comprising many different capabilities to smooth the way to BI uptake.

Ora Fish – The data warehouse is the key to success. Other system aspects can come and go.
> Isn’t that the case – the most difficult things are often a) getting your data (and processes) in order and b) cultural change and people. Technologies merely enhance.

2. Next 5 – 10 years
A growth in demand and supply of data scientists capable of providing analytical services
> but also the incorporation of data literacy into existing roles across the institution, including senior managers
From the floor –
The UK Moodle community are looking (via the University of London) at analytical insight across the 137 institutional Moodle instances. The panel are concerned about the masking and security of data.
Analytical prediction of likely employability is a capability we might focus on, as well as academic performance. Put simply: based on the student data exhaust across a range of indicators, what is the likelihood of good employability? (A toy sketch of this idea follows the panel comments below.)
Ora Fish – Look to enhance the value we can bring to Universities, take the focus away from the technologies. More analytics, more ‘what if’ scenario solutions, better examples of Big Data value.
Stefano Rizzi – Delivery technologies (mobile), personalisation and recommendation services based on what others who ran this query did, collaborative business intelligence – a network of university consortia sharing data for BI
Incorporation of cross-sector data to provide insight into future performance. In a nutshell: analyse school, college and pre-university data exhaust to provide insight into changes that could be made to enhance the chance of success.
> Yeah yeah yeah…. Minority Report?! In the UK, schools already do a bit of this, but pretty basically/badly. Do we really want to model success based on life experiences? Really?
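To make the prediction idea above a little more concrete, here’s a minimal sketch of the kind of model being talked about, assuming a hypothetical extract of graduate outcome data. The indicator names, the CSV file and the choice of a scikit-learn logistic regression are all my own illustrative assumptions, not anything proposed by the panel.

```python
# Hypothetical sketch: predicting 'good employability' from student indicators.
# The CSV file and column names are invented for illustration only.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Assumed extract of the student 'data exhaust': one row per graduate,
# with a known employment outcome six months after graduation.
students = pd.read_csv("graduate_outcomes.csv")

indicators = ["attendance_rate", "vle_logins_per_week",
              "average_module_mark", "placement_undertaken"]
X = students[indicators]
y = students["employed_within_6_months"]  # 1 = good outcome, 0 = not

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42)

model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

# ROC AUC gives a rough sense of whether the indicators carry any signal at all.
probs = model.predict_proba(X_test)[:, 1]
print("ROC AUC:", roc_auc_score(y_test, probs))
```

Whether we should build such models is, as the Minority Report aside above suggests, a rather different question from whether we can.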

And so my attendance at the EUNIS BI Taskforce 2014 event must conclude. I’ve a Eurostar to catch. It’s been a long old week. I left home on Monday and will be home tonight at 21.30. It’s Friday. I hope my presence here has helped others via the blogs. I hope the contacts I’ve forged are able to move to action with me and with Jisc and the Jisc / HESA BI Project.

It’s been emotional…

March 2014 EUNIS BI Conference Day 2 Session 1

It’s been a great event so far. Let’s start with a summary slide of the BI capability achieved at Turin – something for us all to aspire to….

Turin

I’m back for the start of day 2 where we’re hearing about analytics and predictive analysis.
A quick update on yesterday: I’ve agreed to get more involved with the EUNIS BI Taskforce. It seems we share an aim of increasing the maturity of BI capability across higher education institutions. I’m now in touch with Elsa Cardoso to see what opportunities we can identify to collaborate.

I’m also in touch with Bodo Rieger regarding the large-scale, mature Cognos implementation he described yesterday, and to explore the overlaps between our respective national data collections: HESA in the UK and HIS in Hannover. Here’s a great summary of the national data collection system of HESA in the UK.

So on with analytics. Or specifically learning analytics (of which there is quite a bit of information on this blog).

Not surprisingly, the Educause definitions were given an airing

Educause definitions

and of course one cannot hear of learning analytics without mention of the Purdue University Course Signals system and John Campbell.

Check My Activity from the University of Maryland, Baltimore County, on the other hand, is a new one to me.

Here’s a shot of a smartphone app from the University of Kentucky.

Kentucky

This reminds me of some of the work submitted to the Jisc Elevator the last time we ran the Student Innovator initiative, which has just opened again this year. It’s a great initiative, tapping into the power of students as innovators to enhance the student experience through technology, and linking to entrepreneurship as a key graduate attribute.

There’s a lot of work from Educause available via their Analytics papers. I’m talking to Susan Grajek, Educause Vice President for Data and Analytics, later this month about the BI initiative, and to Malcolm Brown, Director of the Educause Learning Initiative, in April.

Back to EUNIS, and something that strikes me is the lack of concern about privacy issues.

Another key contact I’ve made here is Ora Fish. Ora gave yesterday’s opening keynote, which is covered earlier in this blog. She’s expressed an interest in the Jisc / HESA project. We’re talking about the Educause Conference 2015 (for which I’m fortunate to have been invited onto the organising committee) and whether there should be opportunities to air some of the BI for HE issues there.

Exciting stuff…..!

March 2014 EUNIS Business Intelligence Conference Session 3

Next up at the EUNIS BI Conference it’s ‘Targets, Implementation, Experiences, Recommendations’ from Osnabrueck University, presented by Bodo Rieger and Sonja Schulze.

15 years of BI initiatives to report in 30 minutes. I’m strapped in and ready!

This is an IBM Cognos implementation over Oracle, with five objectives:

Address all levels of decision making
Integrate qualitative and quantitative data
Draw on all operational systems
Provide standard and analytics reporting
Support operational and strategic decisions

Not all have been met!

There are four target user groups: students (seen as the most important), lecturers, faculty management and top management.

A popular dashboard allows students to benchmark themselves against cohort exam performance, updated when new results are added (rather than on demand).

student dashboard
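As an aside, the core of a benchmarking view like this can be very simple. Here’s a rough sketch, entirely my own rather than anything Osnabrueck showed, of computing a student’s mark, the cohort mean and their percentile standing with pandas; the table layout is an assumption.

```python
# Illustrative sketch of per-student benchmarking against cohort exam results.
# The DataFrame layout is an assumption; the real warehouse model will differ.
import pandas as pd

results = pd.DataFrame({
    "student_id": [101, 102, 103, 104, 105],
    "module":     ["BI1", "BI1", "BI1", "BI1", "BI1"],
    "mark":       [72, 55, 64, 81, 58],
})

def benchmark(results: pd.DataFrame, student_id: int, module: str) -> dict:
    """Return a student's mark, the cohort mean and their percentile rank."""
    cohort = results[results["module"] == module]
    own = int(cohort.loc[cohort["student_id"] == student_id, "mark"].iloc[0])
    return {
        "mark": own,
        "cohort_mean": round(float(cohort["mark"].mean()), 1),
        "percentile": round(float((cohort["mark"] < own).mean()) * 100, 1),
    }

print(benchmark(results, student_id=103, module="BI1"))
# e.g. {'mark': 64, 'cohort_mean': 66.0, 'percentile': 40.0}
```

The interesting part, as described above, is refreshing the view as new results land rather than recomputing on demand.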

Here’s one for lecturers, showing red where incoming students have a lower level of prior education and will need additional support.

lecturer

Accurate prediction of likely demand for course registration allows forward resource planning. Decision support systems improve resource allocation by assigning scarce resources to students to assist in timetabling.

There’s something in here on utilising a mixture of visualisation, text and hypertext to allow querying and presentation of the data. Another interesting feature is that nominated data owners are both able and expected to clean up the data where errors are identified. The system has over 10,000 users (a figure skewed by student access).

They started with analytical reporting, giving all users cubes to analyse the data as they wanted. This was NOT successful. Standard reporting was the way to go. They’re now bringing cubes back in as readiness is achieved.

Here are some examples of strategic decisions

operational and strategic

Benchmarking is done using national data sets based on mandatory returns. This is a parallel with the UK situation and one for follow up.

They have an example of complex real-time modelling to optimally assign exam dates and rooms – smart predictive timetabling ensures students aren’t overburdened and room allocation is optimal.
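Out of interest, here’s a toy sketch of the kind of assignment problem that sits underneath this: a greedy pass that places each exam into the smallest free room on a date where none of its students already has an exam. It’s purely my own illustration of the idea; the Osnabrueck system is described as far more sophisticated, real-time modelling.

```python
# Toy greedy assignment of exams to (date, room) slots.
# Rooms, dates, exams and constraints are invented; the real system optimises far more.
from collections import defaultdict

rooms = {"R1": 50, "R2": 120, "R3": 300}           # room -> capacity
dates = ["2014-06-02", "2014-06-03", "2014-06-04"]

# exam -> (number of candidates, ids of enrolled students)
exams = {
    "Databases":  (110, {1, 2, 3}),
    "Statistics": (45,  {2, 4}),
    "BI":         (280, {1, 5}),
}

schedule = {}                                       # exam -> (date, room)
used_rooms = defaultdict(set)                       # date -> rooms already taken
student_dates = defaultdict(set)                    # student -> dates with an exam

# Place the largest exams first, since they have the fewest viable rooms.
for exam, (size, students) in sorted(exams.items(), key=lambda kv: -kv[1][0]):
    for date in dates:
        # Don't overburden students: skip dates where any of them already sits an exam.
        if any(date in student_dates[s] for s in students):
            continue
        # Smallest free room on this date that still fits the exam.
        free = [r for r, cap in sorted(rooms.items(), key=lambda kv: kv[1])
                if cap >= size and r not in used_rooms[date]]
        if free:
            schedule[exam] = (date, free[0])
            used_rooms[date].add(free[0])
            for s in students:
                student_dates[s].add(date)
            break
    else:
        print(f"Could not place {exam}")

print(schedule)
```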

Here’s the Osnabrueck BI maturity assessment:

maturity

Here’s an example of success factors based on user acceptance

critical success

March 2014 EUNIS BI Conference Day 1 Session 2

Next up we have a talk on BI maturity models from Elsa Cardoso of the University Institute of Lisbon, Portugal.
Elsa again kicks off with her own definition of business intelligence, based on enhancing enterprise decision-making capability. She also notes a distinction in time frame: BI tends to give insight based on what has happened in the past; we also need alerts about the real-time present; and for the future we’re into the realms of predictive analytics. Jisc did some work on this in 2013.

So, on with BI maturity models. Elsa states that these are usually used to analyse the ‘as is’ situation in terms of organisational (and individual?) capabilities to undertake BI. There are many BI maturity models, but EUNIS has focused on the OCU BI Maturity Model. I can’t find this anywhere other than on our own site, but I think it’s been updated since we published.

She reported on the recent EU-wide survey, which aims to give an indication of the ‘as is’ situation, identify gaps and hence suggest potential offers of help.

The survey had 66 respondents across 9 countries. 13 responses from the UK, 12 from France.
26% have no dedicated BI staff, which maps well to the number of respondents reporting no BI capability! On the profile of respondents, 50% were from IT, which either echoes the fact that BI tends to start there or reflects how the survey was targeted. In the UK it went out through the Universities and Colleges Information Systems Association (UCISA).

Anyway, responses aside, there are 9 dimensions to the model and 5 levels of maturity (see the previous link for a diagram; the diagrams below include summaries of findings). Elsa outlined the questions within the survey. I can’t find those, but here’s the rationale.

There are representations of the analysis for each country polled. Within the Jisc / HESA / HESPA BI project we’re considering how we can best use maturity models. Current thinking is that we should stick with the OCU one and encourage UK institutions to create a picture of the ‘as is’, to give us a snapshot of capacity to assist different aspects of the project as well as profiling our potential customer base. I’m also interested in how we might work with institutions to play with the ‘to be’ state: what are their aspirations, and how can they be mapped to show how the new BI service can assist in the transition? Key to this, I think, will be something I’m calling satellite services for now. These will be broad, covering advice, case studies, training masterclasses, working groups and maybe even on-site consultancy offers.
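For my own planning notes, here’s a trivial sketch of how an ‘as is’ profile might be compared with a ‘to be’ profile to surface the biggest gaps. The dimension names are placeholders (I haven’t reproduced the nine OCU dimensions), and the scores are simply the 1–5 maturity levels.

```python
# Sketch of comparing 'as is' and 'to be' BI maturity profiles (levels 1-5).
# Dimension names are placeholders, not the actual OCU model dimensions.
as_is = {"data quality": 2, "governance": 1, "reporting": 3, "analytics": 1}
to_be = {"data quality": 4, "governance": 3, "reporting": 4, "analytics": 3}

gaps = {dim: to_be[dim] - as_is[dim] for dim in as_is}

# Largest gaps first: the obvious candidates for 'satellite service' support.
for dim, gap in sorted(gaps.items(), key=lambda kv: -kv[1]):
    print(f"{dim:<14} as-is {as_is[dim]}  to-be {to_be[dim]}  gap {gap}")
```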

Anyway, back to the survey. We’re being shown examples of survey analysis by country. It’s a little like Eurovision. I’m afraid I missed a few of these (Germany and Finland, to name but two). Elsa suggests there would be benefit in delving deeper to determine why the results are as they are. Here are a few examples:

Aggregated


Sweden has problems with funding yet all respondents have each of the capabilities. There’s no indication of why.

sweden

France has a good deal of BI activity: provision is rated non-existent, yet delivery and the overall rating are high.

france

Ireland managed to make 8 returns, yet has only 7 public universities. Some problem with the underlying data?! The picture here looks rather rosy, yet opinions in the room varied.

ireland

Drum roll, please, as we reveal the UK:

uk

EUNIS are keen to engage better with the UK. Indeed I’m the only British delegate.

March 2014 European University Information Systems (EUNIS) Business Intelligence Conference – Keynote

I’m very excited about this event. I’m currently working with the UK Higher Education Statistics Agency (HESA) and HESPA, the Higher Education Strategic Planners Association, on a collaborative project. We’re planning on building a national business intelligence system for UK higher education.
We’re in the planning phase of that when along comes this event, the EUNIS BI Conference. Full programme here.
So this blog post may be a little selfish; rather than reporting the entire content, I’ll be cherry-picking tips, tricks, issues and solutions in BI implementation for HE.

The Louvre

More than half of European countries are represented here today. Three consortia of universities collaborating on BI-related initiatives have sponsored the event. Our top table comprises Jean Francois Desnos (EUNIS), Jan Madey (MUCI, Poland), Ramon Grau (SIGMA, Spain), Elsa Cardoso (University Institute of Lisbon, Portugal) and Michele Menvielli (Cineca, Italy).

We’re already hearing about the need to reduce cost, increase productivity and optimise budgets, and the opinion that BI solutions are imperative in helping universities achieve these.

The EUNIS task force plans the following activities:
A review of the current state of BI in European HEIs (I know this is based on the OCU Maturity Model, as my colleague Steve Bailey of Jisc InfoNet contributed)
A LinkedIn group, a BI maturity survey, case studies and training sessions on specific topics (I’m interested in this, as we intend to put on something similar via a virtual national BICC, or Business Intelligence Competency Centre)

Here’s a quick summary of Jisc involvement in BI

Jisc BI Journey to Date

I’ll blog the keynote here and other sessions on separate pages.

Keynote: Ora Fish – A Practical Approach to Implementing Business Intelligence in Higher Education (HE)
HE is a late adopter of BI. The reasons are many and include: the breadth of operations not driven by short-term objectives; difficulties in KPI identification (though the UK HESPA group has done recent national work on this); a lack of analysts and of analytical culture (there are some of these in the UK and across the EU, but this is not the case in the US); years of service and loyalty to old ways of operating; academic freedom; and complicated governance with an over-reliance on committees, making change slow.
The EUNIS Top Technology report (Feb 2014) reported BI as the number one concern.
Ora outlines the readiness-for-change agendas with this slide:

readiness for change

Ora reports on the HEDW – an Educause working group of 2,100 people focused on data warehousing. I’m in touch with Educause, so that’s one for me to pursue. Looking forward, the US is seeing HE BI job advertisements and more interest from BI vendors in the sector. ‘Start with the questions’ is a message Gartner presses. Here’s a great slide on the sorts of questions HEIs want answered.

questions HEIs want answered

On we go through the fragmented nature of data storage across silos of legacy ERP systems, and the fact that these have not been designed for information retrieval and analysis. A data warehouse is defined as a solution to this: a collection of data from multiple sources, tuned for easy analysis to support decision making. Ora defines BI as the processes, tools and technologies required to turn data into information, information into knowledge, and knowledge into plans that drive effective business activity.

A little gem from Ora – how to measure the success of a BI implementation? Look to the volume and variety (but not velocity, as that would be the three Vs of big data) of end users. I like this – in our national BI project we aim to expand the user base of HESA (and non-HESA) data to far wider roles than at present (it tends to be information specialists). Diversity of your BI community is key!

So, summing up for the Jisc project, it seems as though we’re on the right trajectory:
1. Problems of data location, accuracy and definitions – sorted by using HESA data sets (our Level 1 service), problematic when expanding to other non-HESA data sets (our Level 2 service), very problematic when including institutional data (Level 2 plus)
2. BI as a programme not a project – yep
3. Leadership buy in, governance and sponsorship – at national level – check
4. Funding – tick
5. Technology – to be addressed
6. Change management, training and support – develop from our BICC

I think we also need to add a dashboard prioritisation matrix here: wow factor versus usefulness for competitive edge (and supporting evidence). We’ll be looking to the BICC for that. It seems important that the BICC includes the consumers of BI – the range of roles, the language they use and the questions they hold dear. One for our project there – diversity of role inclusion and effectiveness in prioritising dashboard and visualisation features.

Top tips on implementation approaches:
Plan Big, Deliver Small and Fast. Be iterative and agile in development. Focus on business questions to answer
Integrated data warehouse – Kimball Bus Architecture is recommended here
Star schemas and conformed dimensions (a rough sketch follows this list)
Capture business meta data
Capture business definitions
Testing, Validations and Certification (What, how, when)
Rollout (What, how, when and what training)
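For anyone who, like me, needs a reminder of what a Kimball-style star schema with conformed dimensions looks like in practice, here’s a minimal sketch in pandas. All tables and columns are invented for the example; the point is simply that the same student dimension is shared (conformed) across two separate fact tables, so results from both can be combined consistently.

```python
# Minimal illustration of a star schema with a conformed dimension, using pandas.
# Tables, columns and values are invented for the example.
import pandas as pd

# Conformed dimension: one agreed definition of 'student', shared by every fact table.
dim_student = pd.DataFrame({
    "student_key": [1, 2, 3],
    "domicile":    ["UK", "UK", "EU"],
    "mode":        ["full-time", "part-time", "full-time"],
})

# Fact table 1: assessment results (grain: one row per student per module).
fact_assessment = pd.DataFrame({
    "student_key": [1, 1, 2, 3],
    "module":      ["BI1", "STATS", "BI1", "BI1"],
    "mark":        [72, 65, 58, 81],
})

# Fact table 2: admissions (grain: one row per applicant).
fact_admissions = pd.DataFrame({
    "student_key": [1, 2, 3],
    "offers_made": [2, 1, 3],
})

# Because the dimension is conformed, measures from both stars roll up
# consistently against the same attributes, e.g. by domicile.
marks = fact_assessment.merge(dim_student, on="student_key")
offers = fact_admissions.merge(dim_student, on="student_key")

print(marks.groupby("domicile")["mark"].mean())
print(offers.groupby("domicile")["offers_made"].sum())
```

In a real warehouse these would of course be database tables fed by ETL, with the business metadata and definitions captured alongside, as the list above suggests.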

Ora presented a few dashboards for specific roles. Wouldn’t it be great if we shared these for higher education via a Flickr account? At Jisc we made a start on this here. I’ll add it to the Jisc/HESA BI Project Plan.

Lastly, Ora concentrated on change management issues. The big one, BI exposing poor data, was tackled by procedures to raise awareness of inaccuracies, with timescales given for correction. Sounds simple. Bet it isn’t!

Questions here were very much focused on institutional BI implementations. There was one on predictive analytics, and Ora described student performance (what I know as learning analytics) but said little else about predictions for enterprise intelligence going forward. Scenario planning is still prevalent. Master data management also got a mention.