2019 – Proposal for Policy regarding Information Communications Technology in Education
A Human-centred agenda
Consistent with the ILO, we propose a human-centred agenda
for the future of work (and specifically education) that strengthens the social
contract by placing people and the work they do at the centre of economic and social
policy and business practice (ILO, 2019). The three principles required to achieve
this are:
1. Increasing investment in people’s capabilities
In regard to investment in people’s capabilities, we propose that, because lifelong learning will be an expectation and requirement of the future world of work, the role of teachers and educators more broadly will continue to be important to the conditions of work. This includes funding and supporting the work of organisations involved in upskilling people, especially those providing transition support for people as they move between careers across their lives. Actively working to address the disparities that exist between people, and placing those most able to assist in their development in front of them, is key.
2. Increasing investment in the institutions of work
The conditions of human work continue to be important, and certain rights and responsibilities need to become part of a universal labour guarantee. Technology’s role here allows the blurring of work and non-work hours; we suggest that these changes be monitored, and that time sovereignty and flexibility remain the most important elements in this exchange. “Harnessing and monitoring technology for decent work” (ILO, 2019) is the role of all institutions: to track the impact of technology on workers’ hours and ongoing wellbeing.
3. Increasing investment in decent and sustainable work
Consistent with the UN’s Sustainable Development Goals, we propose a continued commitment to sustainable and decent work, especially work that uses technology in line with these principles. The impact on the environment and the ability to extend human experience beyond normal means are both worthy of consideration and must be carefully balanced.
We promote a human-centred agenda because history shows that technology can and often does widen the divide between genders, social classes and so forth. We promote human beings as central to education going forward because of our ability to motivate, to develop and maintain relationships, and to adapt our demeanour and approach to the needs of our students. We see the role of technology as supporting humans to better leverage our core humanity in the educative process, rather than replacing or bypassing teachers.
Teacher-mediated technology
The core message of this policy is that equity and thoughtful use are the most important elements of technology adoption. Disparities in equity most often emerge as a result of one of the two following divides:
The Digital Divide | The Human Divide
1. The digital divide concerns the hardware and software that students have available, as well as the provision of, and access to, high-speed internet and quality tools both at home and at school.
2. The human divide concerns the degree to which education is mediated by a human. We can foresee a near future in which countries at different levels of economic development produce different educational outcomes, based upon the differing levels of involvement with fully qualified teachers that students receive.
Both the digital and the human divide occur at the class, school and global level. Classes deemed ‘difficult’ could be provided with different forms of technology and with more adults in class, but with fewer fully qualified teachers in front of them. At the school level, those in remote and rural areas receive inferior internet access, inferior devices and less face-to-face instruction from fully qualified and experienced teachers (via Distance Education, or as a result of difficulty sourcing teachers). At the global level, whole nations or geographical areas may decide that their human capacity cannot keep pace with the population boom and opt for a ‘teacherless’ form of education that is ‘scaleable’ and, above all, affordable. Each of these illustrations of the digital and human divide shows how equity issues can result in a broader level of inequality, something Neil Selwyn (2019) refers to as a ‘two-tiered system’. We seek to complicate this two-tiered system further, into four forms or levels of teaching and learning, whilst noting that this list will likely become dated and redundant in the near future. It does, however, illustrate different forms of instruction, rather than pedagogies, that provide differing outcomes and means of engaging with the educational process.
Four levels / forms of engaging with teaching and learning:
1. Face-to-face teaching
2. Video-mediated face-to-face teaching (Distance Ed. etc.)
3. Group or peer-to-peer online text-mediated discussion
4. Entirely digital forms of teaching (without qualified teacher mediation)
This admittedly crude overview of the forms of teaching and learning available to students illustrates the difference primarily between the top two forms and the bottom two. We expect a significant drop-off at the final form, ‘entirely digital forms of teaching’, which is not mediated by a qualified teacher. This is due to the centrality of teacher-student relationships and their impact on positive student outcomes.
In line with our VGSA document, the teaching and learning that students can expect to receive within schools is consistent and standardised, often to the minute. Regardless of the forms of technology used, we expect that students should receive the same level of face-to-face teaching, in line with these core governing documents.
Andragogy versus Pedagogy: Definitions and relevance to Technology use in schools
When considering the current means by which digital forms of teaching are applied without qualified teachers, it is worth noting that there are two broad forms of pedagogy which, though overly reductive, can give some shape to the discussion that follows (Reid, 2018). Invariably, the means of teaching without qualified human teachers rely heavily on instructivist approaches, because these traditional forms of direct address are the easiest to adapt into online and technological spaces.
Instructivist – Direct Instruction:
- Explicit
- Direct
- Testable

Constructivist – Project Based Learning:
- Humanist
- Group-based
- Collaborative
We can foresee a future where technology-mediated instruction begins to replicate ‘constructivist’ ideas more convincingly. At that point it will be worth considering which of the four forms of instruction is being used to deliver them, as each may produce differing outcomes for quality and engagement.
Before moving on from the means of engaging with learning environments, it is prudent to discuss the difference between andragogy and pedagogy. ‘Andragogy’ literally means “leading man”, whereas ‘pedagogy’ means “leading children”. This etymological difference is not mere semantics. MOOCs and similar online learning pathways are made available to adults, who often trade away some of the quality of education (fewer social aspects, fewer relationships) for the benefit of an ‘available-any-time’ approach. It is crucial that these two approaches to learning are not conflated: students should be taught in ways consistent with pedagogical approaches, and adults should not make decisions that they themselves would accept but which would negatively impact students who are most in need of the core social, participatory and democratic elements of their schooling.
‘Datafication’ and ‘learnification’ of education systems
Biesta (2009, 2015) describes the ‘learnification’ of education as an element of a broader neoliberal and performativity agenda (Ball, 2003, 2016). In short, it means that the ultimate goal of learning is success on tests, largely standardised tests, and as such schools are becoming ‘learnified’ around these concepts. This is a dominant ideology, despite running directly counter to other narratives calling for broader ranges of skills such as ‘21st-century skills’, ‘general capabilities’ and so forth. The ‘datafication’ of schools runs along similar lines (Lycett, 2013; Stevenson, 2017).
A brief summary of three major movements within technology in Education
This diagram indicates three major considerations for the future of education mediated by technology. It is not an exhaustive list, but it provides key talking points about the impact upon students and teachers, as well as upon educational systems as a whole. These three elements rely upon one another, and separating them is illustrative rather than actual: algorithmic thinking, personalisation and artificial intelligence all rest on big data and learning analytics. The most noticeable and most problematic feature, however, is that there is no human, teacher element within this triptych; this conflict is explored below. For each of these sections the focus is on ethical considerations and framing, with monitoring to ensure these principles are being met.
Algorithmic thinking and personalisation – It is worth noting that ‘Artificial Intelligence’ (AI) is simply an advanced form of algorithm that learns as it operates. Nonetheless, algorithmic thinking is by itself a threat to the human-mediated curriculum that we deliver, through concepts such as ‘management by algorithm’, ‘incomprehensible assessment by algorithm’ and ‘personalisation without person’. To take each in turn: management by algorithm is a means by which human behaviour is managed without direct interaction with a human being, and it is something that should be challenged and repelled from within educational spaces. Assessment practices in schools that focus on rubrics and move away from immediate comprehensibility for students can make the jump to algorithmic thinking, whereby the result gained cannot be ‘reverse-engineered’ by students; such results are therefore less clear and widen the divide between students who are more literate and their peers. Personalisation as an agenda emerged from the colloquially known ‘Gonski 2.0’ report (DET, 2018) and is at the ideation phase, framed as a tool for teachers rather than a replacement (Donovan, 2019). The personalisation agenda is being actively pushed by many ‘Big Tech’ agents and in most situations minimises the teacher’s role, or dramatically changes it from the traditional ‘sage-on-the-stage’ to the ‘guide-on-the-side’. This shift is not inherently bad, and by many reckonings is a positive, but it remains a shift that needs closer consideration and awareness from educators, unions and international bodies.
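To make the idea of ‘reverse-engineerable’ assessment concrete, the following is a minimal sketch in Python (hypothetical criteria and weights, not drawn from any named product) of a rubric score that records each criterion’s contribution, so a student or teacher can trace exactly how the result was produced:

# A minimal, hypothetical sketch of 'comprehensible' algorithmic scoring:
# every criterion's contribution to the final mark is recorded, so the
# result can be reverse-engineered by the student or the teacher.

RUBRIC = {            # hypothetical criteria and weights (assumptions)
    "argument": 0.4,
    "evidence": 0.4,
    "expression": 0.2,
}

def score_with_trace(marks: dict) -> tuple:
    """Return a weighted score out of 100 plus a human-readable trace."""
    total, trace = 0.0, []
    for criterion, weight in RUBRIC.items():
        contribution = marks[criterion] * weight
        total += contribution
        trace.append(f"{criterion}: {marks[criterion]} x {weight} = {contribution:.1f}")
    return total, trace

final, explanation = score_with_trace({"argument": 80, "evidence": 70, "expression": 90})
print(final)                   # 78.0
print("\n".join(explanation))  # each criterion's contribution, line by line

The design point is the trace itself: an assessment algorithm that cannot emit such an account for each result is, by this policy’s terms, incomprehensible to the student it judges.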
Big Data and learning analytics
Schools are awash with data (Balacco, 2010; Hattie, 2005), and big technology companies, together with a whole suite of unregulated and uncontrolled apps, routinely collect copious amounts of often sensitive data from our students. Data is widely considered ‘the new gold’ of modern commerce: traded, scraped and analysed to improve products and profit margins. The groups and people who command the largest quantity and highest quality of data are, and will continue to be, the most powerful agents in the global space. The idea that teachers and students are being used as guinea pigs, viewed as mere data points for information that informs improvements to products and educational offerings, is problematic. It runs directly counter to inalienable human rights and undermines the humanity and democratic nature of our societies. It is generally understood that data use is now beyond the pale of comprehensibility, in that it is no longer realistically possible to unravel the ways data has been collected and used to deliver customised offerings to an individual. This has ramifications for how the concept is challenged: it is too late to put the genie back in the bottle. What must instead occur is a re-emphasis on the ethical use of this data and on the rights of individuals in the face of complex algorithms that re-constitute data in ways incomprehensible to the lay person. Implicit within this discussion is the exceptional value of students’ data, and the patterning of students as committed consumers of various software and hardware in their most formative developmental years. As such, this space is an area for ‘white-hat’ experts to be trained and supported by governments or NGOs to monitor these complex and powerful forces and protect citizens from them. Grand standardised and sample-based tests fit within this umbrella, with NAPLAN, PISA, TIMSS and PIRLS all being a more global manifestation of this ‘learnification’ and ‘datafication’ of schools.
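As one concrete example of what ‘ethical use’ might look like in practice, the following is a minimal sketch (hypothetical field names; no real app or vendor is implied) of data minimisation and pseudonymisation applied before student data leaves a school system:

# A minimal, hypothetical sketch of data minimisation and pseudonymisation:
# only the fields needed for the stated purpose leave the school system,
# and direct identifiers are replaced with salted, irreversible hashes.

import hashlib

ALLOWED_FIELDS = {"year_level", "quiz_score"}  # purpose-limited whitelist (assumption)
SALT = "school-secret-salt"                    # assumption: held on-site, never shared

def pseudonymise(student_id: str) -> str:
    """Replace a direct identifier with an irreversible token."""
    return hashlib.sha256((SALT + student_id).encode()).hexdigest()[:12]

def minimise(record: dict) -> dict:
    """Strip a raw record down to the whitelisted fields plus a pseudonym."""
    out = {k: v for k, v in record.items() if k in ALLOWED_FIELDS}
    out["pseudonym"] = pseudonymise(record["student_id"])
    return out

raw = {"student_id": "s-042", "name": "Alex", "address": "12 Example St",
       "year_level": 8, "quiz_score": 74}
print(minimise(raw))  # name and address never leave the system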
Artificial Intelligence
“Australia is in a unique situation as the only Western
democracy without comprehensive enforceable protection of human rights (that
is, no bill of rights, no comprehensive constitutional protection of rights).”
(Daly et al., 2019)
The increasing importance and emergence of AI seem unlikely to ease, and “Blocking all of these technologies is not an option, any more than cutting off access to the internet would be, but there may be scope to ban particularly harmful technologies if they emerge” (emphasis added; Dawson et al., 2019, p. 15). The identification obligation is key: “People should always be aware when a decision that affects them has been made by an AI, as difficulties with automated decisions by government departments have already been before Australian courts” (Dawson et al., 2019, p. 7). For monitoring purposes, these types of issues should be made visible and transparent long before litigation is required.
The questions of expertise, knowledge and transparency are key: “Full transparency is sometimes impossible, or undesirable (consider privacy breaches). But there are always ways to achieve a degree of transparency. Take neural nets, for example: they are too complex to explain, and very few people would have the expertise to understand anyway.” (Dawson et al., 2019, p. 11). As with the above examples, what is needed is a ‘human-in-the-loop’ (Dawson et al., 2019, p. 33) or, in this case, a qualified expert in the loop monitoring these technologies.
A more targeted set of AI-responsive goals is expanded in the ‘rights-based, human-focused agenda’ section below.
Concluding the three major movements within technology in Education
The low status of the teaching profession in nations around the world holds significant threats in an era where teacher replacement is technologically possible and, economically, a great deal more cost-effective. In this respect, the two issues of low teacher status and technological advancement in AI, data and algorithmic thinking must always be considered alongside one another. There is also a greater need for quality teachers, given the move towards lifelong learning and career flexibility generally suggested by future prognostication.
Three proposed solutions
As Reid (2018) posits, more time needs to be spent focusing on the goals of particular interventions, beyond the earlier Gillard-era goals of improving PISA results and bringing in one-to-one devices. Too often in the technology space, little thought is given to the ‘why’ of technological developments. Many elements of technology adoption in schools have simply been assumed; the introduction and continued rollout of AI and machine learning into schools offers a unique opportunity to set up ethical frameworks and to problematise this technology within schooling. Here are three proposed solutions that allow for monitoring, and pause for consideration, at this timely moment.
Firstly, Dawson, Schleiger, Horton et al. (2019) suggest that all AI systems should and must have a ‘human-in-the-loop’ (HITL); in educational settings this is not sufficient, and what is required is a ‘Qualified-Teacher-in-the-Loop’ (QTIL), as sketched below. Secondly, there is room for a new role within schools: an empathetic, human data manager responsible for upskilling teachers around data and for its humane collection and use, monitoring, security and stewardship. Lastly, there is space for a monitoring group responsible for overseeing these three forces, and others as relevant.
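As an illustration of the first proposal, the following is a minimal sketch (hypothetical names and data; not a production design and not drawn from Dawson et al.) of how a QTIL gate might sit between an AI recommendation and any action affecting a student: the system may propose, but only a registered, qualified teacher can approve.

# A minimal, hypothetical sketch of a 'Qualified-Teacher-in-the-Loop' (QTIL)
# gate: an AI system may only *recommend*; a qualified teacher must sign off
# before any decision affecting a student takes effect.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Recommendation:
    student_id: str
    action: str       # e.g. "assign remedial module"
    rationale: str    # plain-language basis for the recommendation

@dataclass
class Teacher:
    name: str
    registration_id: str  # e.g. a teacher-registration number (assumption)

def apply_decision(rec: Recommendation, approver: Optional[Teacher]) -> str:
    """Enact a recommendation only if a qualified teacher signs off."""
    if approver is None:
        # No teacher in the loop: the AI output remains a suggestion only.
        return f"BLOCKED: '{rec.action}' queued for teacher review"
    return (f"APPROVED by {approver.name} ({approver.registration_id}): "
            f"{rec.action} for student {rec.student_id}")

rec = Recommendation("s-042", "assign remedial module", "low quiz scores, weeks 3-5")
print(apply_decision(rec, None))                               # blocked
print(apply_decision(rec, Teacher("A. Nguyen", "VIT-12345")))  # approved

The key design choice is that the blocked path is the default: absent a qualified teacher, the system can queue but never act.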
A rights-based, human-focused agenda
Generic human rights
Rights to:
- Disappear
- Disconnect
- Manage own data
Teacher rights
Rights to:
- Be unavailable / off-the-clock from work requirements
- Choose appropriate tools without bias
- Refuse to use dictated tools
- Control own resources and materials
- Avoid double-handling
- Produce content and share it willingly via Creative Commons
- Share all intellectual property as freely as possible
- Be tech-company blind and agile around tool use
- Receive recognition for resources created and work completed
- ‘Whistle-blow’ around poor practices in the tech space
- Avoid commercial incursion into professional identity
- Avoid conflicts of interest
Student rights
Rights to:
- Not appear on social media without express permission
- Own their classwork, and have agency and influence over such works’ wider use in regard to promotion
- Agency around key educational decisions, within reason
- Trust in the safety, protection and safe stewardship of their data
A rights-based, human-focused agenda on AI
The following ethical guidelines are modified from Dawson, Schleiger, Horton et al. (2019) and have been shaped and edited to more closely suit the focus of educational AI and the problems that it poses.
1) Right to Transparency. All individuals have the
right to know the basis of an AI decision that concerns them. This includes
access to the factors, the logic, and techniques that produced the outcome.
2) Right to Human Determination. All individuals have
the right to a final determination made by a person.
3) Human-in-the-loop. All AI systems must have a human person within the system (Dawson et al., 2019).
4) Identification Obligation. The institution
responsible for an AI system must be made known to the public.
5) Fairness Obligation. Institutions must ensure that AI systems do not reflect unfair bias or make impermissible discriminatory decisions.
6) Assessment and Accountability Obligation. An AI
system should be deployed only after an adequate evaluation of its purpose and
objectives, its benefits, as well as its risks. Institutions must be
responsible for decisions made by an AI system.
7) Accuracy, Reliability, and Validity Obligations.
Institutions must ensure the accuracy, reliability, and validity of decisions.
8) Data Quality Obligation. Institutions must
establish data provenance and assure quality and relevance for the data input
into algorithms.
9) Termination Obligation. An institution that has
established an AI system has an affirmative obligation to terminate the system
if human control of the system is no longer possible.
10) Comprehensible origins. As all AI systems replicate
and magnify human biases and subjective decisions, each AI system needs to
provide a logical thought piece, or literature review explaining the thinking and
ideas that underpin its processes.
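To illustrate how guidelines 1, 4 and 6 might be operationalised in software, the following is a minimal sketch (hypothetical fields and names, not taken from the Dawson et al. framework itself) of a decision record that an educational AI system could emit alongside every decision, capturing the factors used, the technique applied and the institution answerable for it:

# A minimal, hypothetical sketch of a 'decision record' supporting the
# transparency (1), identification (4) and accountability (6) obligations:
# every AI decision ships with the factors behind it and the name of the
# institution responsible for it.

from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class DecisionRecord:
    subject_id: str               # who the decision concerns
    outcome: str                  # what was decided
    factors: dict                 # inputs and their values/weights
    responsible_institution: str  # identification obligation (4)
    technique: str                # e.g. "weighted rubric", "decision tree"
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

    def explain(self) -> str:
        """Plain-language account for the person the decision concerns (1)."""
        used = ", ".join(f"{k}={v}" for k, v in self.factors.items())
        return (f"{self.responsible_institution} decided '{self.outcome}' "
                f"for {self.subject_id} using {self.technique} on: {used}")

record = DecisionRecord(
    subject_id="s-042",
    outcome="recommend extra literacy support",
    factors={"reading_score": 41.0, "attendance_rate": 0.93},
    responsible_institution="Example State Education Dept (hypothetical)",
    technique="weighted rubric",
)
print(record.explain())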
Students
Students need to be offered agency and co-agency over their learning and their environment. In addition, the OECD identifies three further Key Competencies: ‘creating new value’, ‘reconciling tensions and dilemmas’ and ‘taking responsibility’ (Howells, 2018, p. 5). In particular, Howells stresses that “dealing with novelty, change, diversity and ambiguity assumes that individuals can think for themselves and work with others” (p. 6). This agency and co-agency exist alongside teacher agency, whereby teachers are empowered to use their professional judgement, skills, expertise and pedagogical freedom to choose the pedagogical and classroom approaches most suitable in their context. Whilst agency exists on both sides, the core element that ties teachers and students together is a shared professional relationship through which the two parties mediate a mutually beneficial environment, meeting needs consistent with the teacher’s worldview and that of broader society. Notably, we as a union and as a society disagree with any pedagogical approach that does not hold this as central to its beliefs. Indeed, “To help enable agency, educators must not only recognise learners’ individuality, but also acknowledge the wider set of relationships – with teachers, peers, families and communities – that influence their learning. A concept underlying the learning framework is ‘co-agency’ – the interactive, mutually supportive relationships that help learners to progress towards their valued goals. In this context, everyone should be considered a learner, not only the students but also teachers, school managers, parents and communities.” (Howells, 2018, p. 4). This broadening of everyone as learners is important, but considering the current climate around the role of teachers and educators, we also strongly emphasise teachers’ expertise and agency as professionals. In short, “It is time to shift the focus of our students from ‘more hours of learning’ to ‘quality learning time’” (p. 6).
Teachers
Teachers have never been consulted or worked alongside in regard to hardware or software in anything beyond the most tokenistic fashion, and this is not a situation we should accept or promote. Rather, there is space for regulation and monitoring of the global edu-business and ‘Big Tech’ or ‘Big Data’, in both the positives (UNESCO Institute, 2019) and, crucially, the negatives. What is required and asked of teachers may be untenable or unrealistic. In an ideal world, without the restrictions of time, teachers would be immune to ‘platform capitalism’ and able to pick up and drop different apps, software and tools with equal skill and ability. They would find no need for accreditations, for unpaid or paid endorsement deals, or to supplement their income or profile by such means. As these things may be too difficult to achieve, we instead propose the ethical approach and rights-based ideas outlined above. One key innovation in this space will be the generation and extension of the above rights-based approach into an agreed-upon ethical model that teachers can opt into, as a ‘Big Tech’-neutral type of accreditation. Such an accreditation would allow teachers to show their commitment to the SDGs and to the active monitoring and challenging of ‘Big Tech’ and ‘Big Data’ in their schools. These accredited teachers would assist and tap into the activities of the broader regulatory bodies, serving as ‘on-the-ground’ experts who can communicate and exchange ideas with such crucial regulatory bodies.
References:
Adams, S. (2019) Can AltSchool—The Edtech Startup With $174M From Billionaires Like Zuckerberg And Thiel—Save Itself From Failure? Accessed on 26/8/2019, from: https://www.forbes.com/sites/susanadams/2019/01/30/can-altschoolthe-edtech-startup-with-174m-from-billionaires-like-zuckerberg-and-thielsave-itself-from-failure/#6f6ce8ad1997
Balacco, D. (2010). Using school data to inform students' learning. Curriculum & Leadership Journal, 8(3).
Ball, S. J. (2003). The teacher's soul and the terrors of performativity. Journal of Education Policy, 18(2), 215-228.
Ball, S. J. (2016). Subjectivity as a site of struggle: refusing neoliberalism?. British Journal of Sociology of Education, 37(8), 1129-1146.
Balkan, S. (2019) Online Safety in An A.I. World. Accessed on 1/3/2019, available from: https://www.fosi.org/about/press/online-safety-in-an-ai-world/#
Biesta, G. (2009). Good education in an age of measurement: On the need to reconnect with the question of purpose in education. Educational Assessment, Evaluation and Accountability (formerly: Journal of Personnel Evaluation in Education), 21(1), 33-46.
Biesta, G. (2015). What is education for? On good education, teacher judgement, and educational professionalism. European Journal of Education, 50(1), 75-87.
Caruba, L. (2018) Carpe Diem to close at end of school year. MySanAntonio.com. Accessed on: 26/8/2019, from: https://www.mysanantonio.com/news/education/article/Carpe-Diem-to-close-at-end-of-school-year-12753477.php
Daly, A., Hagendorff, T., Li, H., Mann, M., Marda, V., Wagner, B., ... & Witteborn, S. (2019). Artificial Intelligence, Governance and Ethics: Global Perspectives. The Chinese University of Hong Kong Faculty of Law Research Paper, (2019-15).
Dawson, D., Schleiger, E., Horton, J., McLaughlin, J., Robinson, C., Quezada, G., Scowcroft, J., & Hajkowicz, S. (2019). Artificial Intelligence: Australia's Ethics Framework. Data61 CSIRO, Australia.
Department of Education and Training. (2018). Through growth to achievement: report of the review to achieve educational excellence in Australian schools.
Dobo, N. (2017) Students sat in cubicles using computers. It wasn’t popular. The Hechinger Report. Accessed on 25/6/2019, from: https://hechingerreport.org/students-sat-cubicles-using-computers-wasnt-popular/
Donovan, J. (2019) Where are we going with the National Learning Progressions and Online Formative Assessment Initiative? ResearchEd conference.
Gorur, R. (2011). ANT on the PISA trail: Following the statistical pursuit of certainty. Educational Philosophy and Theory, 43(sup1), 76-93.
Gorur, R., & Wu, M. (2015). Leaning too far? PISA, policy and Australia's ‘top five’ambitions. Discourse: studies in the cultural politics of education, 36(5), 647-664.
Harari, Y. N. (2016). Homo Deus: A brief history of tomorrow. Random House.
Harari, Y. N. (2018). Lessons for the 21st Century. Spiegel & Grau.
Hattie, J. (2005). What is the nature of evidence that makes a difference to learning?. 2005-Using data to support learning, 7.
Howells, K. (2018). The future of education and skills: education 2030: the future we want.
ILO. (2019). Work for a brighter future-Global Commission on the Future of Work.
Jobin, A., Ienca, M., & Vayena, E. (2019). Artificial Intelligence: the global landscape of ethics guidelines. arXiv preprint arXiv:1906.11668.
Lycett, M. (2013). ‘Datafication’: Making sense of (big) data in a complex world.
Melendez, S. (2018) After rapid growth, Zuckerberg-backed school program faces scrutiny over effectiveness, data privacy. Accessed on: 26/8/2019, from: https://www.fastcompany.com/90269809/after-rapid-growth-zuckerberg-backed-school-program-faces-scrutiny-over-effectiveness-and-data-privacy
Messenger, J. C. (2018). Working time and the future of work. Future of Work Research Paper Series, (6).
Morrison, N. (2014) Fewer Teachers, More Data In The Schools Of The Future. Accessed on 26/8/2019, from: https://www.forbes.com/sites/nickmorrison/2014/12/18/fewer-teachers-more-data-in-the-schools-of-the-future/
Peacock, M. N. (2019). Is constructivism a prerequisite to unlock the power of web based platforms in teacher training?: A case study on the enablers for web based learning platforms for teacher training in Cambodia.
Puentedura, R. (2010). SAMR and TPCK: Intro to advanced practice. Retrieved February 12, 2013.
Reid, A. (2018). Beyond certainty: A process for thinking about futures for Australian education. Report Commissioned by the Australian Secondary Principals’ Association. ISBN, 978-0.
Rieland, R. (2016) How AltSchool Is Personalizing Education By Collecting Loads of Data on Its Students. Accessed on 26/8/2019, from: https://www.smithsonianmag.com/innovation/how-altschool-personalizing-education-by-collecting-hordes-data-on-students-180960463/
Riep, C. (2019) What do we really know about Bridge International Academies? A summary of the research findings. Education International Research.
Romrell, D., Kidder, L., & Wood, E. (2014). The SAMR model as a framework for evaluating mLearning. Online Learning Journal, 18(2).
Sjorberg, S. (2019) The PISA-syndrome – How the OECD has hijacked the way we perceive pupils, schools and education.
Stevenson, H. (2017). The “datafication” of teaching: can teachers speak back to the numbers? Peabody journal of education, 92(4), 537-557.
Tate, E. (2018) ‘Dear Mr. Zuckerberg’: Students Take Summit Learning Protests Directly to Facebook Chief. Accessed on 26/8/2019, from: https://www.edsurge.com/news/2018-11-15-dear-mr-zuckerberg-students-take-summit-learning-protests-directly-to-facebook-chief/
Tucker, N. (2011) Carpe Diem Marketing Video. Accessed on 26/8/2019, from: https://vimeo.com/23834061
UNESCO Institute (2019) Artificial Intelligence in Education: Compendium of Promising Initiatives. UN. Accessed on 11/9/2019, available from: https://iite.unesco.org/publications/ai-in-ed-compendium-of-promising-initiatives-mlw-2019/
Unwin, A. & Yandell, J. (2016) PISA-envy, Pearson and Starbucks-style schools. Accessed on 13/9/2019. Available from: https://newint.org/features/2016/04/01/edu-businesses-impact
Zalnieriute, M., & Gould-Fensom, O. (2019). Artificial Intelligence: Australia’s Ethics Framework Submission to the Department of Industry, Innovation and Science. UNSW Law Research Paper, (19-40).