Tackling Digital Inequality: Why We Have to Think Bigger

Numerous academic studies have highlighted the significant differences in the ways that young people access, use and engage with the Internet, and the implications this has for their lives. While the majority of young people have some form of access to the Internet, for some their connections are sporadic, dependent on credit on their phones, an available library, or Wi-Fi open to the public. Qualitative data from a variety of countries has shown that such limited forms of access can create difficulties for these young people, as an Internet connection becomes essential for socialising, accessing public services, saving money, and learning at school.

While the UK government has financed technological infrastructure and invested in schemes to address digital inequalities, the outcomes of these schemes are rarely uniformly positive or transformative for the people involved. This gap between expectation and reality demands theoretical attention, with more focus on the cultural, political and economic contexts of the digitally excluded, and on the various attempts to "include" them.

Focusing on a two-year digital inclusion scheme for 30 teenagers and their families initiated by a local council in England, a qualitative study by Huw C. Davies, Rebecca Eynon, and Sarah Wilkin analyses why, despite the good intentions of the scheme's stakeholders, it fell short of its ambitions. It also explains how the neoliberal systems of governance that increasingly shape the cultures and behaviours of Internet service providers and schools — systems that incentivise actions counterproductive to addressing digital inequality — cannot solve the problems they create.

We caught up with the authors to discuss the study’s findings:

Ed.: It was estimated that around 10% of 13-year-olds in the study area lacked dependable access to the Internet and had no laptop or PC at home. How does this impact educational outcomes?

Huw: It's impossible to disaggregate technology from everything else that can affect a young person's progress through school. However, one school in our study had transferred all its homework and assessments online, while the other schools were moving towards this model. The students we worked with said that doing research for homework is synonymous with using Google or Wikipedia, and that it's the norm to send homework and coursework to teachers by email, upload it to Virtual Learning Environments, or print it out at home. Students who don't have access to the Internet therefore have to spend time and effort finding work-arounds, such as using public libraries. Lack of access also excludes them from learning casually from online resources or pursuing their own interests in their own time.

Ed.: The digital inclusion scheme was designed as a collaboration between a local council in England (who provided Internet services) and schools (who managed the scheme) in order to test the effect of providing home Internet access on educational outcomes in the area. What was your own involvement, as researchers?

Huw: Initially, we were the project’s expert consultants: we were there to offer advice, guidance and training to teachers and assess the project’s efficacy on its conclusion. However, as it progressed we took on the responsibility of providing skills training to the scheme’s students and technical support to their families. When it came to assessing the scheme, by interviewing young people and their families at their homes, we were therefore able to draw on our working knowledge of each family’s circumstances.

Ed.: What was the outcome of the digital inclusion project — i.e. was it "successful"?

Huw: As we discuss in the article, defining success in these kinds of schemes is difficult. Subconsciously, many people involved in such schemes expect technology to be transformative for the young people, yet in reality the changes you see are more nuanced and subtle. Some of the scheme's young people found apprenticeships or college courses, taught themselves new skills, used social networks for the first time and spoke to friends and relatives abroad by video for free. These success stories definitely made the scheme worthwhile. However, despite the significant goodwill of the schools, local council, and the families to make the scheme a success, there were also frustrations and problems. In the article we talk about these problems and argue that the challenges the scheme encountered are not just practical issues to be resolved, but systemic issues that need to be explicitly recognised in future schemes of this kind.

Ed.: And in the article you use neoliberalism as a frame to discuss these issues..?

Huw: Yes. But we recognise in the article that this is a concept that needs to be used with care. It's often used pejoratively and/or imprecisely. We have taken it to mean a set of guiding principles that are intended to produce a better quality of services through competition, targets, results, incentives and penalties. The logic of these principles, we argue, influences the way organisations treat individual users of their services.

For example, for Internet Service Providers (ISPs) the logic of neoliberalism is to subcontract out the constituent parts of an overall service provision to create mini internal markets that (in theory) promote efficiency through competition. Yet this logic only really works if everyone comes to the market with similar resources and abilities to make choices. If customers are well informed and wealthy enough to remind companies that they can take their business elsewhere, those companies will have a strong incentive to improve their services and reduce their costs. If customers are disempowered by a lack of choice, the logic of neoliberalism tends to marginalise or ignore their needs. The families in our scheme were low-income families with little or no experience of exercising consumer choice and rights; for them, therefore, these mini markets didn't work.

In the schools we worked with, the logic of neoliberalism meant staff and students felt under pressure to meet certain targets — they all had to prioritise things that were measured and measurable. Failure to meet these targets would mean having to account for what went wrong, losing out on a reward, or facing disciplinary action. It therefore becomes much more difficult for schools to devote time and energy to schemes such as this.

Ed.: Were there any obvious lessons that might lead to a better outcome if the scheme were to be repeated: or are the (social, economic, political) problems just too intractable, and therefore too difficult and expensive to sort out?

Huw: Many of the families told us that access to the Internet was becoming ever more vital. This was not just for homework but also for access to public and health services (which are increasingly delivered online) and for finding the best deals for consumer services online. They often told us, therefore, that they would do whatever it took to keep their connection after the two-year scheme ended. This often meant paying for broadband out of their social security benefits or income that was too low to be taxable: income that could otherwise have been spent on, for example, food and clothing. Given its necessity, we should have a national conversation about providing this service to low-income families for free.

Ed.: Some of the families included in the study could be considered “hard to reach”. What were your experiences of working with them?

Huw: There are many practical and ethical issues to address before these sorts of schemes can begin. These families often face multiple intersecting problems that involve many agencies (who don't necessarily communicate with each other) intervening in their lives. For example, some of the scheme's families were dealing with mental illness, disability, poor housing, and debt all at the same time. It is important that such schemes are set up with an awareness of this complexity. We are very grateful to the families that took part in the scheme and for the insights they gave us into how such schemes should be run in the future.

Ed.: Finally, how do your findings inform all the studies showing that “digital inclusion schemes are rarely uniformly positive or transformative for the people involved”. Are these studies gradually leading to improved knowledge (and better policy intervention), or simply showing the extent of the problem without necessarily offering “solutions”?

Huw: We have tried to put this scheme into a broader context to show that such policy interventions have to be much more ambitious, intelligent, and holistic. We never assumed digital inequality is an isolated problem that can be fixed with a free broadband connection; when people are unable to afford the Internet, it is an indication of other forms of disadvantage that have to be addressed simultaneously, in a sympathetic and coordinated way. Hopefully, we have contributed to the growing awareness that attempts to ameliorate the symptoms may offer some relief but should never be considered a cure in themselves.

Read the full article: Huw C. Davies, Rebecca Eynon, Sarah Wilkin (2017) Neoliberal gremlins? How a scheme to help disadvantaged young people thrive online fell short of its ambitions. Information, Communication & Society. DOI: 10.1080/1369118X.2017.1293131

The article is an output of the project “Tackling Digital Inequality Amongst Young People: The Home Internet Access Initiative”, funded by Google.

Huw Davies was talking to blog editor David Sutcliffe.

What are the limitations of learning at scale? Investigating information diffusion and network vulnerability in MOOCs

Millions of people worldwide are currently enrolled in courses provided on large-scale learning platforms (aka ‘MOOCs’), typically collaborating in online discussion forums with thousands of peers. Current learning theory emphasizes the importance of this group interaction for cognition. However, while a lot is known about the mechanics of group learning in smaller and traditionally organized online classrooms, fewer studies have examined participant interactions when learning “at scale”. Some studies have used clickstream data to trace participant behaviour, even predicting dropouts based on engagement patterns. However, many questions remain about the characteristics of group interactions in these courses, highlighting the need to understand whether — and how — MOOCs allow for deep and meaningful learning by facilitating significant interactions.

But what constitutes a “significant” learning interaction? In large-scale MOOC forums, where socio-culturally diverse learners have different motivations for participating, this is a non-trivial problem. MOOCs are best defined as “non-formal” learning spaces, where learners pick and choose how (and if) they interact. This kind of group membership, together with the short-term nature of these courses, means that relatively weak inter-personal relationships are likely. Many of the tens of thousands of interactions in the forum may have little relevance to the learning process. So can we actually define the underlying network of significant interactions? Only once we have done this can we explore, firstly, how information flows through the forums and, secondly, how robust those interaction networks are: in short, the effectiveness of the platform design for supporting group learning at scale.

To explore these questions, we analysed data from 167,000 students registered on two business MOOCs offered on the Coursera platform. Almost 8000 students contributed around 30,000 discussion posts over the six weeks of the courses; almost 30,000 students viewed at least one discussion thread, totalling 321,769 discussion thread views. We first modelled these communications as a social network, with nodes representing students who posted in the discussion forums, and edges (ie links) indicating co-participation in at least one discussion thread. Of course, not all links will be equally important: many exchanges will be trivial (‘hello’, ‘thanks’ etc.). Our task, then, was to derive a “true” network of meaningful student interactions (ie iterative, consistent dialogue) by filtering out those links generated by random encounters (Figure 1; see also full paper for methodology).

Figure 1. Comparison of observed (a; ‘all interactions’) and filtered (b; ‘significant interactions’) communication networks for a MOOC forum. Filtering affects network properties such as modularity score (ie degree of clustering). Colours correspond to the automatically detected interest communities.
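
For readers who want to experiment with this kind of modelling, here is a minimal sketch (in Python, using networkx) of how a co-participation network might be built from forum post records. The toy input format and the fixed weight threshold standing in for the filtering step are illustrative assumptions; the full paper describes the actual statistical filtering methodology.

```python
from collections import defaultdict
from itertools import combinations

import networkx as nx

# Toy post records: (student, thread_id). In the real data these would come
# from the Coursera forum logs.
posts = [
    ("alice", "t1"), ("bob", "t1"), ("carol", "t1"),
    ("alice", "t2"), ("bob", "t2"),
    ("dave", "t3"), ("carol", "t3"),
]

# Group posters by discussion thread.
threads = defaultdict(set)
for student, thread in posts:
    threads[thread].add(student)

# Link every pair of students who posted in the same thread, weighting the
# edge by how many threads they share.
G = nx.Graph()
for members in threads.values():
    for a, b in combinations(sorted(members), 2):
        weight = G[a][b]["weight"] + 1 if G.has_edge(a, b) else 1
        G.add_edge(a, b, weight=weight)

# Crude stand-in for the filtering step: keep only repeated co-participation.
significant = nx.Graph()
significant.add_edges_from(
    (a, b, d) for a, b, d in G.edges(data=True) if d["weight"] >= 2
)
print(list(significant.edges(data=True)))  # [('alice', 'bob', {'weight': 2})]
```
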
One feature of networks that has been studied in many disciplines is their vulnerability to fragmentation when nodes are removed (the Internet, for example, emerged from US military research aiming to develop a disruption-resistant network for critical communications). While we aren’t interested in the effect of a missile strike on MOOC exchanges, from an educational perspective it is still useful to ask which “critical set” of learners is most responsible for information flow in a communication network — and what would happen to online discussions if these learners were removed. To our knowledge, this is the first time the vulnerability of communication networks has been explored in an educational setting.

Network vulnerability is interesting because it indicates how integrated and inclusive the communication flow is. Discussion forums with fleeting participation will have only a very few vocal participants: removing these people from the network will markedly reduce the information flow between the other participants — as the network falls apart, it simply becomes more difficult for information to travel across it via linked nodes. Conversely, forums that encourage repeated engagement and in-depth discussion among participants will have a larger ‘critical set’, with discussion distributed across a wide range of learners.

To understand the structure of group communication in the two courses, we looked at how quickly our modelled communication network fell apart when: (a) the most central nodes were iteratively disconnected (Figure 2; blue), compared with when (b) nodes were removed at random (ie the ‘neutral’ case; green). In the random case, the network degrades evenly, as expected. When we selectively remove the most central nodes, however, we see rapid disintegration: indicating the presence of individuals who are acting as important ‘bridges’ across the network. In other words, the network of student interactions is not random: it has structure.

Figure 2. Rapid network degradation results from removal of central nodes (blue). This indicates the presence of individuals acting as ‘bridges’ between sub-groups. Removing these bridges results in rapid degradation of the overall network. Removal of random nodes (green) results in a more gradual degradation.
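
The sketch below gives a rough sense of how such a robustness experiment can be run: nodes are removed from a synthetic network either in order of centrality or at random, and the size of the largest connected component is tracked after each removal. The synthetic graph, the use of degree centrality and the single precomputed removal order are simplifying assumptions rather than the paper's exact procedure.

```python
import random

import networkx as nx


def degradation_curve(G, removal_order):
    """Fraction of nodes remaining in the giant component after each removal."""
    H = G.copy()
    fractions = []
    for node in removal_order:
        H.remove_node(node)
        giant = max(nx.connected_components(H), key=len) if H.number_of_nodes() else set()
        fractions.append(len(giant) / G.number_of_nodes())
    return fractions


# A synthetic stand-in for the forum interaction network.
G = nx.barabasi_albert_graph(500, 2, seed=1)

centrality = nx.degree_centrality(G)
by_centrality = sorted(centrality, key=centrality.get, reverse=True)
at_random = random.Random(1).sample(list(G.nodes), k=G.number_of_nodes())

targeted = degradation_curve(G, by_centrality[:100])  # cf. the blue curve in Figure 2
neutral = degradation_curve(G, at_random[:100])       # cf. the green curve in Figure 2
print(targeted[-1], neutral[-1])  # targeted removal shrinks the giant component faster
```
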

Of course, the structure of participant interactions will reflect the purpose and design of the particular forum. We can see from Figure 3 that different forums in the courses have different vulnerability thresholds. Forums with high levels of iterative dialogue and knowledge construction — with learners sharing ideas and insights about weekly questions, strategic analyses, or course outcomes — are the least vulnerable to degradation: a relatively high proportion of nodes has to be removed before the network falls apart (rightmost blue line). Forums where most individuals post once to introduce themselves and then move their discussions to other platforms (such as Facebook), or cease engagement altogether, tend to be more vulnerable to degradation (leftmost blue line). The different vulnerability thresholds suggest that different topics (and forum functions) promote different levels of forum engagement. Certainly, asking students open-ended questions tended to encourage significant discussions, leading to greater engagement and knowledge construction as they read analyses posted by their peers and commented with additional insights or critiques.

Figure 3. Network vulnerabilities of different course forums.

Understanding something about the vulnerability of a communication or interaction network is important, because it will tend to affect how information spreads across it. To investigate this, we simulated an information diffusion model similar to that used to model social contagion. Although simplistic, the SI model (‘susceptible-infected’) is very useful in analyzing topological and temporal effects on networked communication systems. While the model doesn’t account for things like decaying interest over time or peer influence, it allows us to compare the efficiency of different network topologies.

We compared our (real-data) network model with a randomized network in order to see how well information would flow if the community structures we observed in Figure 2 did not exist. Figure 4 shows the number of ‘infected’ (or ‘reached’) nodes over time for both the real (solid lines) and randomized networks (dashed lines). In all the forums, we can see that information actually spreads faster in the randomised networks. This is explained by the existence of local community structures in the real-world networks: networks with dense clusters of nodes (i.e. a clumpy network) will result in slower diffusion than a network with a more even distribution of communication, where participants do not tend to favor discussions with a limited cohort of their peers.

Figure 4 (a) shows the percentage of infected nodes vs. simulation time for different networks. The solid lines show the results for the original network and the dashed lines for the random networks. (b) shows the time it took for a simulated “information packet” to come into contact with half the network’s nodes.
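
A toy version of such an SI simulation is sketched below, run on a synthetic ‘clumpy’ network and on a degree-preserving randomisation of it. The infection probability, the seeding, and the planted-partition stand-in for the forum network are illustrative assumptions, not the parameters used in the study; the sketch simply mirrors the comparison in Figure 4(b) by reporting how many steps it takes to reach half the nodes.

```python
import random

import networkx as nx


def si_spread(G, seed_node, beta=0.2, steps=30, seed=0):
    """Fraction of 'infected' nodes after each step of a simple SI process."""
    rng = random.Random(seed)
    infected = {seed_node}
    history = []
    for _ in range(steps):
        newly = set()
        for u in infected:
            for v in G.neighbors(u):
                if v not in infected and rng.random() < beta:
                    newly.add(v)
        infected |= newly
        history.append(len(infected) / G.number_of_nodes())
    return history


def steps_to_reach(history, frac=0.5):
    """First time step at which the given fraction of nodes has been reached."""
    return next((t for t, f in enumerate(history) if f >= frac), None)


# A 'clumpy' stand-in for the forum network: dense groups, few links between them.
observed = nx.planted_partition_graph(10, 30, p_in=0.3, p_out=0.005, seed=2)

# A degree-preserving randomisation that destroys the community structure.
randomised = observed.copy()
nx.double_edge_swap(randomised, nswap=2000, max_tries=20000, seed=2)

degrees = dict(observed.degree())
start = max(degrees, key=degrees.get)  # seed the spread at a high-degree node

print(steps_to_reach(si_spread(observed, start)),
      steps_to_reach(si_spread(randomised, start)))  # randomised is typically faster
```
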

Overall, these results reveal an important characteristic of student discussion in MOOCs: when it comes to significant communication between learners, there are simply too many discussion topics and too much heterogeneity (ie clumpiness) to result in truly global-scale discussion. Instead, most information exchange — and by extension, any knowledge construction in the discussion forums — occurs in small, short-lived groups, with information “trapped” within them. This finding is important as it highlights structural limitations that may affect the ability of MOOCs to facilitate communication amongst learners who are looking to learn “in the crowd”.

These insights into the communication dynamics motivate a number of important questions about how social learning can be better supported, and facilitated, in MOOCs. They certainly suggest the need to leverage intelligent machine learning algorithms to support the needs of crowd-based learners; for example, in detecting different types of discussion and patterns of engagement during the runtime of a course to help students identify and engage in conversations that promote individualized learning. Without such interventions the current structural limitations of social learning in MOOCs may prevent the realization of a truly global classroom.

The next post addresses qualitative content analysis and how machine-learning community detection schemes can be used to infer latent learner communities from the content of forum posts.

Read the full paper: Gillani, N., Yasseri, T., Eynon, R., and Hjorth, I. (2014) Structural limitations of learning in a crowd – communication vulnerability and information diffusion in MOOCs. Scientific Reports 4.


Rebecca Eynon holds a joint academic post between the Oxford Internet Institute (OII) and the Department of Education at the University of Oxford. Her research focuses on education, learning and inequalities, and she has carried out projects in a range of settings (higher education, schools and the home) and life stages (childhood, adolescence and late adulthood).
