Data-driven organisations

We hear a lot about Big Data and analytics these days. Data-driven organisations use analysed data to predict the future behaviour of society as a whole, of groups of people, and of individuals. They use it to intervene in ways that decrease the likelihood of the ‘worst’ happening and increase the likelihood of the ‘best’ happening.

“Big Data is considered a key part of social reform. Eventually, it will predict those most likely to suffer social ills as children, those educationally deprived, those in households where violence is most likely to emerge. Early interventions can be staged - even before the problems emerge.” (NZ Herald, 2014)

“When it comes to reaching your fitness goals, steps are just the beginning. Fitbit tracks every part of your day—including activity, exercise, food, weight and sleep—to help you find your fit, stay motivated, and see how small steps make a big impact.” (Fitbit, 2016)

What are Big Data and analytics?

Big Data describes the huge sets of electronic data that are now available for analysis. Industry analyst Doug Laney articulated the now-mainstream definition of Big Data as the three Vs.

  • Volume - high volumes of data can be collected relatively cheaply, for example through smartphones and smartphone apps or through social media, and they need new ways of being stored.
  • Variety - the data comes in a variety of formats – from structured, numeric data in traditional databases to unstructured text documents, email, video, photos, audio, stock ticker data and financial transactions – and it comes from multiple sources, which makes it difficult to link, match, cleanse and transform across systems. Its quality and quantity can vary over time.
  • Velocity - data arrives at unprecedented speed and must be dealt with in a timely manner.

Analytics, according to Wikipedia, is "the discovery and communication of meaningful patterns in data."

New technologies make it all possible: they provide massive storage for any kind of data, enormous processing power, and the ability to handle virtually limitless concurrent tasks or jobs. Hadoop, an open-source software framework, is an example of this technology.
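As a rough illustration only, the sketch below shows the map/reduce pattern that frameworks such as Hadoop scale out across many machines. It is a minimal, self-contained Python example using assumed, made-up data; the text chunks and function names are invented for illustration and are not taken from Hadoop itself.

    from collections import Counter
    from multiprocessing import Pool

    # Hypothetical text "chunks" standing in for blocks of a large dataset
    # that a framework like Hadoop would distribute across many machines.
    CHUNKS = [
        "big data needs new ways of storing data",
        "data arrives at speed and in many formats",
        "analytics looks for patterns in data",
    ]

    def map_chunk(chunk):
        """Map step: count words within one chunk, independently of the others."""
        return Counter(chunk.split())

    def reduce_counts(partial_counts):
        """Reduce step: merge the per-chunk counts into a single result."""
        total = Counter()
        for counts in partial_counts:
            total.update(counts)
        return total

    if __name__ == "__main__":
        with Pool() as pool:  # process the chunks concurrently
            partials = pool.map(map_chunk, CHUNKS)
        print(reduce_counts(partials).most_common(3))

The point is simply that the work is split so each chunk is processed independently and the partial results are then combined; the hard part in real systems is doing this reliably across thousands of machines.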

In New Zealand we can see this trend towards data-driven organisations. The government has set up the New Zealand Data Futures Forum to guide thinking about the use of data, in response to questions such as “Who has what data about me/us and what will they be doing with it?” and “What data do I/we have access to that can help us?”. The Forum’s vision is to set an “agenda to significantly advance New Zealand’s ability to unlock the latent value of our data assets and position us as a world leader in the trusted and inclusive use of shared data to deliver a prosperous society.” The Forum identified that “Harnessing the benefits of data sharing and use requires a trusted, transparent, and balanced environment – one where privacy is paramount and trust maintained.” It recommended four principles to help New Zealanders navigate the data future.

Value: New Zealand should use data to drive economic and social value and create a competitive advantage.

Inclusion: All parts of New Zealand society should have the opportunity to benefit from data use.

Trust: Data management in New Zealand should build trust and confidence in our institutions.

Control: Individuals should have greater control over the use of their personal data.

What does this mean for education?

Analytics profoundly shape the educational reality that they measure. What is measured and reported through infographics or dashboards becomes more important than what is not reported. All levels of education are becoming data-driven organisations. ‘Big Data’ and the use of analytics can provide insights into some of the gnarly challenges associated with improving equity and excellence. Governments are using these tools and businesses are selling them. For example:

  • The OECD uses PISA data (mainly achievement data associated with conditions of learning) to tell the world which countries have the best education systems.
  • Individual countries use achievement data to show which schools have high and low levels of achievement.
  • Online learning tools can be bought that provide moment-by-moment information for students, teachers and school leaders about individual students’ performance.

And in each case the expectation is that the analytics will guide next steps; there is an expected ‘now what?’ response. Governments are guided to improve their education systems, schools are guided to know which students need targeted support, and teachers are guided to know what the focus of that support should be.

The key thing is that human assumptions underpin data collection, analysis, interpretation and reporting, and these assumptions are then built into the tools and analytics. For example, the national and international analytics assume that literacy, mathematics and science achievement are essential life skills and signal that a country is preparing young people for the future. One problem with this is that readers of the reports may ‘forget’ that literacy is in service of the curriculum (it is not the curriculum itself). In New Zealand, for example, student success is about students being “confident, connected, actively involved, lifelong learners”, and achievement both leads to and follows from that success (New Zealand Curriculum, expanded in the ERO indicators).

For learning organisations to be data-driven organisations, all assumptions should be transparent and checked for alignment with the purpose of education and the outcomes we want for our young people:

  • Who are the ‘we’? (does it include the young person, their family and their teachers?)
  • How can we ‘tell’? (are we relying on what appears easy to measure or are we gathering a deep picture; and how much confidence can we place in what this data tells us?)
  • What information will be used in the ‘digital profile’? (does it reflect the whole person; and what information will not be used?)
  • What counts as learning and achieving? (does it reflect our vision for young people and the experiences we wish them to have; and how is ‘what counts’ decided?)

The assumptions brought to these questions reflect embedded worldviews and beliefs about young people, education, pedagogy and curriculum. We need to ask ourselves:

“Do the analytics we use support all learners to develop the knowledge and capabilities needed to be confident, connected, actively involved, lifelong learners, and all educators to provide young people with holistic educational experiences, or are we using them in ways that will perpetuate inequality and fail to realise potential?”


If the purpose of analytics is to provide feedback on whether the intent behind teachers’, leaders’, implementers’ and policy makers’ actions is reflected in learner outcomes, then analytics can change education in at least three positive ways.

  1. Analytics can provide faster feedback loops for students: teaching actions can be adapted rapidly, so students learn more quickly. In the past, research knowledge was not easily accessible to teachers, and sometimes teacher practice did not reflect best practice. Now this knowledge is used to design the analytics of online courses for students. For example, if an online reading programme’s analytics notice whether students answer questions that require inference, it can present learning activities to the students who have been getting these questions wrong (a rough sketch of this kind of rule follows this list). Just like all effective learning activities, these online ones must be engaging and worthwhile, and the feedback needs to help learners know where they are at, where they are going and how they are going. These programmes are designed to be part of a rich learning experience for students (i.e. they do not replace the teacher). Lovett, Meyer and Thille (2008) (PDF, 1.7 MB) found in their study that “OLI-Statistics students [blended learning] learned a full semester’s worth of material in half as much time and performed as well or better than students learning from traditional instruction over a full semester.”
  2. Educators and researchers can develop a shared knowledge base when analytics are developed through collaborative discussions about the intent of actions, who determines what the outcomes are, and who receives what feedback. For example, Tony Bryk’s Carnegie Foundation team of researchers helps teachers discuss the assumptions associated with the question “Can we tell from your digital profile whether you are at risk of not learning and achieving?” They reduced complex literature into driver maps, developed related analytics and tested personal theories. One collaboration found that the most powerful predictor of student success in college remedial mathematics courses is students’ sense of belonging to their mathematics class. The teachers have now been trialling different ways to improve students’ sense of belonging.
  3. Analytics can be used to shift the focus from what is ‘easier’ to measure to what is important. As educators we know the importance of wellbeing, capabilities, competencies and mindsets – the characteristics of 21st-century learners and leaders – but we have not had ways of making these things visible. Now the technology allows us to focus on what is important. Learning organisations will be able to provide parents and whānau with information about their young person that is important to them. Literacy and mathematical achievement can be among the indicators of learning and achievement rather than the only ones. For example, since 2009 Danish exams have included analytics to assess students’ ability to explore and sift information for sense-making, so students need to use the internet to undertake the exams. New Zealand’s qualifications authority has a programme to shift from internal and external assessment to one that emphasises assessment for learning.
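To make the feedback loop in point 1 concrete, here is a minimal, hypothetical sketch in Python of the kind of rule an online reading programme might apply. The skill labels, threshold and suggested activities are invented for illustration; they are not drawn from any particular product.

    from dataclasses import dataclass

    @dataclass
    class QuestionResult:
        skill: str       # e.g. "inference" or "vocabulary"
        correct: bool

    def recommend_activities(results, skill="inference", threshold=0.6):
        """If a student's success rate on a skill falls below the threshold,
        suggest targeted practice; otherwise offer an extension task.
        The threshold and activity names are illustrative only."""
        attempts = [r for r in results if r.skill == skill]
        if not attempts:
            return ["continue current programme"]
        success_rate = sum(r.correct for r in attempts) / len(attempts)
        if success_rate < threshold:
            return ["guided " + skill + " practice",
                    "teacher conference on " + skill]
        return ["extension reading task"]

    # Example: a student who has missed most inference questions so far
    history = [QuestionResult("inference", False),
               QuestionResult("inference", False),
               QuestionResult("inference", True),
               QuestionResult("vocabulary", True)]
    print(recommend_activities(history))

As point 1 notes, such feedback sits alongside the teacher’s own judgement: the analytics surface the pattern, and the teacher and learner decide what to do next.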

Questions to consider

What does being a data-driven organisation mean for our organisation? The following questions may be a useful guide for these conversations.

  • What data do you collect and how do you use it?
  • Does it make a difference for young people?
  • What are the assumptions behind what you collect (and what you don’t collect), how you analyse it, and how you use the analysis?
  • What protocols do you have around the collection, access and use of any data, to ensure it is safe and that people can trust you with it?
  • How collaborative is the process - with young people, parents and whānau, teachers and leaders - in deciding what to collect, how to use it, how to feed back findings, and what protocols apply?