Learning analytics in secondary schools

What can be learnt from the challenges faced in the use of learning analytics in tertiary institutions, when considering its application in secondary education?

Photo by Frank Dabek

I posed this question after reading several sources regarding the use of learning analytics in education. As a secondary school teacher I was interested in finding out whether there was anything to be learnt from the application of analytics in a tertiary setting, before it is embedded into secondary schooling. The NMC Horizon Report (2013) claims that within 2-3 years learning analytics will have developed beyond the 20% penetration point. After summarising the sources I found common themes in the challenges faced when utilising learning analytics:

  • Driving forces behind analytics
  • Error correction and data override
  • Collection of valuable data
  • Ethics, morals and privacy

I will evaluate each of these challenges in the secondary context, considering how learning analytics might be applied to raise the achievement of learners and inform successful teaching.

Driving forces behind analytics

The first area is a personal concern of mine: I think it is extremely important to create a system which is predominantly beneficial to the learner and teacher. Siemens and Long (2011) draw a comparison between learning analytics and academic analytics, with learning analytics benefiting the learner and academic analytics benefiting the academic institution. Recent policy change in the UK (Department for Education, 2013) has allowed schools to introduce performance-related pay for secondary school teachers, and academic analytics would fit neatly with assessing teachers' and students' results. Supported by investment from software companies, the emphasis would be on end results and analysis of the value added to education, rather than the real-time guidance and support that learning analytics can achieve, as Siemens and Long (2011) reinforce. In this case the analytics would be driven by an academic motive and would be of less benefit to the learner. For learning analytics to be used in secondary schooling to raise the success of learners, a policy framework must be established that supports learning analytics, as opposed to academic analytics. The use must be driven by pedagogy rather than institutions.

Error correction and data override

Ferguson (2012), Pea (2013), and Siemens and Long (2011) all specify that learning analytics must appreciate the ‘softer’ side of the learning process for it to be beneficial to the learner. ‘Learning is messy’ (Siemens and Long, 2011, p. 8): effective teachers relate to the learners and appreciate the challenges they face, in a professional context as well as in the wider pastoral role. The algorithms used in learning analytics cannot encompass such requirements at this stage, so there must be some level of override to avoid errors. Pariser (2011) highlights that we are now being pushed into a “filter bubble”, where the data collected from our online activities actually restricts our horizons. As educators we cannot allow the analysis of data to narrow learning choices and minimise progression. There is a potential danger of seriously limiting the learning journey for younger pupils, with a greater impact on lifelong learning pathways.

Pea (2013) also touches on the risks of stereotyping students with analytics. He establishes that we already have labels for students (ADHD, autism, dyslexia), and with the increased amount of analysed data available we will have the potential to create even more boxes to place students in. How do we overcome the stigma attached to becoming a certain type of learner, as determined by learning analytics? Do we allow students, parents and other teachers access to this data, or could it have a detrimental effect on learning progress? This also relates to the common ethical theme that ran through the artefacts.

It is essential at this stage in the development of learning analytics that we proceed cautiously with the data used in the secondary context. Hopefully we will be able to adapt algorithms successfully from the tertiary level and ensure that the type of data collected is as accurate as possible. If we could attain the level of the ‘whole person analytics’ that Siemens discusses in his interview with Watters (2011), I would be more confident in the use of learning analytics in schools.

Collection of valuable data

Brown (2011) states that most schools have had an LMS since 2005, but there is a wide range of varying systems and platforms. For us to move forward in the secondary context we must insist on data interoperability, not only between the LMSs but also between any other systems (Khan Academy, Knewton) that a student may use to guide learning. As a Google Apps for Education (GAFE) user I know that there are complete districts in the US that share the same GAFE LMS; if the APIs could be made available to enable software developers to create a program to analyse all data from all students, then we would be moving closer to the LMS 3.0 that Brown proposes. As Dawson states in Clarke and Nelson's (2012) article, schools need to buy a complete system which works across all platforms - perhaps this is an area that the Network for Learning (N4L, 2013) will incorporate in their plan for New Zealand.
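To make the interoperability problem concrete, here is a minimal sketch in Python of what a shared schema could look like. Every system name, field name and record below is invented for illustration - real platforms expose their own export formats - but the idea is the same: map each system's records into one common shape so activity from different tools can be analysed together.

```python
# A minimal sketch of data interoperability: normalising activity records
# from two hypothetical systems (an LMS export and a practice app) into
# one shared schema. All field names and records are invented.

def normalise_lms(record):
    """Map a hypothetical LMS export row to the shared schema."""
    return {
        "student_id": record["user"],
        "activity": record["event_type"],
        "minutes": record["duration_sec"] / 60,
        "source": "lms",
    }

def normalise_practice_app(record):
    """Map a hypothetical practice-app row to the shared schema."""
    return {
        "student_id": record["learner"],
        "activity": record["exercise"],
        "minutes": record["time_spent_min"],
        "source": "practice_app",
    }

lms_rows = [{"user": "s001", "event_type": "quiz", "duration_sec": 600}]
app_rows = [{"learner": "s001", "exercise": "muscles-quiz", "time_spent_min": 15}]

combined = [normalise_lms(r) for r in lms_rows] + \
           [normalise_practice_app(r) for r in app_rows]

# Once everything shares a schema, cross-system analysis becomes trivial,
# e.g. total learning minutes per student across all platforms.
total = {}
for row in combined:
    total[row["student_id"]] = total.get(row["student_id"], 0) + row["minutes"]

print(total)  # {'s001': 25.0}
```

The hard part in practice is not the code but the agreement: every vendor would need to publish an API and a mapping to the common schema, which is exactly the kind of whole-system procurement Dawson argues schools should insist on.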

As immersive environments like the virtual laboratories mentioned in the NMC Horizon Report (Johnson et al., 2013) and augmented reality (Martin et al., 2011) are used more in education, it will become easier to collect data, but it will still be difficult to capture data on the complete learner. Perhaps the introduction of galvanic skin response (GSR) bracelets and webcams to measure emotions could create such data, but this is controversial: when the Gates Foundation funded a project to use GSR to track physiological responses, there was uproar from teachers as well as parents (Kroll, 2012). This type of data may become available voluntarily from students as wearable technology becomes more prolific, but it is still difficult to force learners to submit personal data.

A less intrusive approach, though still a challenge to freedom and privacy rights, is the use of virtual machines within a student's learning device; this would go some way towards creating a profile of the learner (Pardo & Kloos, 2011). An important consideration that intrigues me, especially with teenagers (who we know are very aware of what people think of them), was mentioned by Siemens in his interview with Watters (2011): the Hawthorne effect may also make collecting valuable data complicated.

Ethics, morals and privacy

The general consensus between Siemens and Long (2011), Ferguson (2012) and Dringus (2012) is that the collection and use of data should be transparent. In a secondary context it would have to be made available to everyone involved in the student's learning. The bureaucracy involved could slow the development of learning analytics, so perhaps schools could look to other sectors for guidance: health care overcame privacy issues to successfully use personal data and predictive modelling from patients to reduce illness and disease (Cortada et al., 2012).

Integration of learning analytics in a secondary context 

Learning analytics is in its infancy in the secondary sector. If the predictions in the Horizon Report (Johnson et al., 2013) are correct, then we will be looking at over 20% penetration within the next 2-3 years. If this is to happen, I believe we need to develop the LMS 3.0 that Brown (2011) emphasises. It needs to incorporate successful data collection techniques and transparent privacy policies, and be applied by the teachers who know their students - it needs to have a metacognitive element for it to work effectively.

“the way forward is not to delve into our toolkit of existing solutions and applying them to problems taking a known shape. We must walk forward with an adaptive mindset—recognizing pattern changes and adjusting as the environment itself adjusts.” Siemens (2006)

The integration must be a collaborative effort between the institutions driving the change, the teachers applying the change, and the learners who are creating the change. We must understand the analytical processes and be aware of their limitations and implications when applied to learners. We can then move forward, promoting an informed change in secondary education.


Anderson, C. (2008). The End of Theory: The Data Deluge Makes the Scientific Method Obsolete. [online] Retrieved from: [Accessed: 19 Aug 2013].
Brown, M. (2011). Learning analytics: the coming third wave. EDUCAUSE Learning Initiative Brief, 1-4.
Clarke, J. and Nelson, K. (2012). Perspectives on Learning Analytics: Issues and challenges. Observations from Shane Dawson and Phil Long. The International Journal of the First Year in Higher Education, 4 (1), pp. 1-8. Retrieved from: [Accessed: 28th July, 2013].
Cortada, J. W., Gordon, D., & Lenihan, B. (2012). The value of analytics in health care. IBM Institute for Business Value IBM, Global Business Service. 
Department for Education. (2013). School teachers' pay and conditions 2013. [online] Retrieved from: [Accessed: 1 Sep 2013].
Dringus, L. P. (2012). Learning Analytics Considered Harmful. Journal of Asynchronous Learning Networks, 16(3).
Ferguson, R. (2012). The state of learning analytics in 2012: A review and future challenges. Knowledge Media Institute, Technical Report KMI-2012-01.
Johnson, L., Adams, S., Cummins, M., Estrada, V., Freeman, A., & Ludgate, H. (2013). The NMC Horizon Report: 2013 Higher Education Edition. 1-40.
Kroll, L. (2012). Gates Foundation Responds To GSR Bracelets Controversy. [online] Retrieved from: [Accessed: 1 Sep 2013].
Martin, S., Diaz, G., Sancristobal, E., Gil, R., Castro, M., & Peire, J. (2011). New technology trends in education: Seven years of forecasts and convergence. Computers & Education, 57(3), 1893-1906.
N4L. (2013). N4L | About. [online] Retrieved from: [Accessed: 1 Sep 2013].
Pardo, A., & Kloos, C. D. (2011). Stepping out of the box: Towards analytics outside the learning management system. Proceedings of the 1st International Conference on Learning Analytics and Knowledge. ACM.
Pariser, E. (2011). Beware online "filter bubbles". [video online] Available at: [Accessed: 1 Sep 2013].
Pea, R. (2013). Emerging opportunities in K-12 learning analytics. [video online] Available at: [Accessed: 31 Aug 2013].
Siemens, G. (2006). Knowing knowledge. Lulu. com.
Siemens, G. and Long, P. (2011). Penetrating the fog: Analytics in learning and education. Educause Review, 46 (5), pp. 30-32. Retrieved from: [Accessed: 24th July, 2012].
Watters, A. (2011). How data and analytics can improve education. [online] Retrieved July 19, 2013, from

