Reconciling VLE Minimum Standards

I am torn by VLE minimum standards. On the one hand, I see their purpose and utility; on the other, I see a management tool implemented primarily for compliance. It is with this in mind that I titled this post ‘Reconciling VLE Minimum Standards’. Yesterday, I took part in a UCISA Digital Education Group webinar discussion on VLE Minimum Standards. An hour is not enough time to discuss the complexities of this subject, so this post is an attempt to answer the questions we didn’t get time to cover and to reconcile my internal conflict.

A recording of the webinar, VLE Minimum Standards – Lessons from the Sector, and a Padlet containing the questions and resources are available.

For context, the University of Warwick does not have a VLE minimum standard. It is the direction we are moving in, but we have yet to begin. Our VLE is still young, having been in ‘pilot’ only around five years ago. We’re at a stage now where its importance is recognised by the ‘University’ and an appetite is growing for increased consistency. The answers to the questions below are based on my experience of implementing minimum standards at my previous institution.

How do you monitor and measure compliance with the standard? Are there any automated ways of doing this? What, if any, are the consequences or sanctions for non-compliance? What flexibility do your standards offer?

Monitoring – We took a sample across the faculties.

Automation – Can data gained through automation be meaningful? We can automatically trawl the database and say module x contains a forum, x number of files, a quiz etc. But what can’t be measured through automated means is whether those things were meaningful or successful. Is having a forum with nothing in it enough to pass the standard? Often, it is. But that’s a tick box measure, not something that will change practice. There are too many variables, too much context and too much open to interpretation to automatically measure whether a module page has met the standard.
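To illustrate the limits of that kind of check, here is a minimal sketch in Python of what an automated audit can actually tell you. It assumes a hypothetical `get_module_contents` function standing in for whatever your VLE’s API or database query returns (the required item types are invented for the example):

```python
# A minimal, hypothetical sketch of a "tick box" automated audit.
# `get_module_contents` is a placeholder for whatever your VLE's API or
# database query provides; it is assumed to return a list of dicts,
# each with a 'type' key (e.g. 'forum', 'file', 'quiz').

REQUIRED_TYPES = {"forum", "file", "quiz"}  # an assumed minimum standard


def audit_module(module_id, get_module_contents):
    """Report which required item types are present on a module page."""
    items = get_module_contents(module_id)
    present = {item["type"] for item in items}
    return {required: required in present for required in REQUIRED_TYPES}


def meets_standard(module_id, get_module_contents):
    """A module 'passes' if every required type exists, even an empty forum."""
    return all(audit_module(module_id, get_module_contents).values())
```

Note what this cannot tell you: whether the forum has any posts, whether the files are current, or whether the quiz was ever used. That contextual judgement is exactly what resists automation.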

Consequences – We thought it best to allow each department to decide how it enforced the standards and what the consequences of non-compliance would be. We wanted to position ourselves as a source of support: the carrot, not the stick.

Flexibility – We used lots of phrases such as “as appropriate” and “where necessary” to enable staff to apply only what was relevant to their teaching context. There were also standards that were not optional and applied across the institution regardless of context.

How have you evaluated the impact of the standards?

This is where standards fall down. What is the measure of success? What is the desired impact? This needs to be clear early. Is impact to be measured through measures of student satisfaction? Is 80% of modules meeting the standard a success? We hadn’t got that far and I’m not sure we really had this clear either. Sadly, standards are often something we have to do, so thinking about this can become a lower priority than delivering the standard itself.

How have your standards evolved over time? What would you change about them now?

I think minimum VLE standards have to be part of a wider, holistic approach to improving the student experience. All VLE standards do is improve one small aspect of that experience. The VLE is a small part of a big picture; if you’re clinging to the hope that improving the VLE will fix the student experience, I fear you’ll be disappointed. I like this description from Reed and Watmough (2015).

If these are truly hygiene factors in Herzberg’s use of the word, they will not necessarily make students’ HE experience completely satisfactory; rather, they will reduce the likelihood students will be dissatisfied in a preventative sense. These factors could be present but other aspects central (or intrinsic) to the teaching and learning experience could be missing, thus preventing students from extreme satisfaction. (Reed and Watmough, 2015)

I found a dichotomy between compliance and use which I still find difficult to reconcile. The more time I spent developing and auditing the standards, the more I felt they needed to be broader to suit different disciplines. BUT whilst auditing them, I thought they were too open to interpretation and needed to be more specific. How do you measure, for example, “Appropriate learning content available through a structured content organisation”? My interpretation was different to my colleague’s.

What sort of approaches to reporting on the standards have you done?

We performed an audit on a sample of modules across all faculties. We created an infographic showing three positive areas and three areas for improvement, which was visible to the whole University. We also sent the raw data to each Head of Department. We felt it was important for departments to decide on the best approach for monitoring and taking action.
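As a rough illustration (not our actual process), the headline figures for that kind of infographic can be pulled from the sampled audit data in a few lines of Python. The criteria and pass/fail records below are invented purely for the example:

```python
# A rough sketch of turning sampled audit results into headline figures.
# In a real sample there would be one row per module per criterion.
from collections import defaultdict


def compliance_rates(audit_rows):
    """audit_rows: iterable of (criterion, passed) pairs from the sampled modules."""
    totals = defaultdict(lambda: [0, 0])  # criterion -> [passes, checks]
    for criterion, passed in audit_rows:
        totals[criterion][1] += 1
        totals[criterion][0] += int(passed)
    return {criterion: passes / checks
            for criterion, (passes, checks) in totals.items()}


sample_rows = [
    ("Module handbook available", True),
    ("Reading list linked", False),
    ("Assessment dates published", True),
    ("Contact details listed", False),
    ("Welcome message present", True),
    ("Past papers provided", False),
]

rates = compliance_rates(sample_rows)
ranked = sorted(rates, key=rates.get)
areas_for_improvement = ranked[:3]   # lowest compliance rates
positive_areas = ranked[-3:]         # highest compliance rates
```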

What have institutions found successful in raising adoption of the standards, including staff development and communication approaches? Do staff really understand the impact of meeting/not meeting the standards on students?

How can they understand the impact of not meeting them if there are no consequences? If we want compliance then there should be a compliance mechanism, and we should be honest that the standards will be measured and reported on. We can use the student experience as a motivator, but without seeing it surface in a tangible way, e.g. in module evaluations or the NSS (by which time it’s too late), where are staff going to see the impact? So we’re back to someone policing the standards and there being clearly articulated consequences.

Have you updated your standards? Do you have a schedule as to when they will be updated?

I didn’t get that far at my previous institution, but I would expect any standards we develop to be reviewed and updated annually. Technologies, policy and focus may change from year to year, and the standards should reflect where the institution wants to go, not where it is now.

What balance have you taken in producing minimum standards for your VLE between functional standards, e.g. put up the handbook, and broader pedagogic or principle-based standards, e.g. accessibility?

This is a question that has stuck with me. I want to create a standard, baseline or whatever you want to call it that changes behaviour and practice whilst improving consistency, the student experience and use of the VLE. I don’t want to create a tick list of things to do to avoid scrutiny from the institution. Would a standard be more successful if it was focussed on the change in practices we want to achieve rather than just a tick-box list of things that should be on the module page?

It can still be linked to specific, measurable criteria and further guidance, but the emphasis would be on pedagogy and practice first. The platform, features and functions are immaterial. For example: “Teach inclusively – ensure your materials follow accessibility guidelines; ensure a variety of assessment types and materials.” We can combine VLE compliance with pedagogical practice. I’m thinking along the lines of nudge theory. So watch this space; the University of Warwick VLE minimum standard might be a little different.

Ultimately, I want to support good teaching, whether that is through the VLE or not. Online practice can benefit face-to-face teaching and vice versa. We want to get people thinking about practice, not about ticking a compliance box.

 


Reed, P. and Watmough, S. (2015) ‘Hygiene factors: Using VLE minimum standards to avoid student dissatisfaction’, E-Learning and Digital Media, 12(1), pp. 68–89. First published 29 January 2015.
