The obsession with production.
Artificial intelligence as fuel for unhealthy educational perspectives.
I’ll know my song well before I start singin’ - Bob Dylan
Listen to the podcast segment of Ideology in Education (at 12:50) based on this blog post for the Teacher’s Education Review below 👇.
Following an apparent “first wave” of artificial intelligence (AI) discussions, fears and concerns continue to circulate about the impact of AI on education as a whole.
In this post, I aim to contribute to these continuing discussions by redirecting our gaze towards educational aims and purposes.
It is my argument that current discourse surrounding AI limits education to a process of production, whereby “outputs” are valued above all else as evidence of education. There is presently a growing trend for fields to become defined in terms of outputs or “content” (elsewhere referred to as “contentification”). Education is not immune to this turn and as a result, current discussions of AI seem consumed by an ideological focus on the product of education as the end goal (i.e. the grade, essay, assignment, ATAR, etc.).
In fuelling an obsession with production, such ideological perspectives obscure important educational concerns about how these outputs are achieved. Furthermore, there seems to be little evidence of discussions critiquing the way these educational perspectives impact upon the existential possibilities of individuals. Educational perspectives concerned with ethical action, individual freedom and responsibility are subsequently disregarded as we come to value what we produce, rather than produce what we value.
In considering alternative ideological perspectives on education, we may find that AI is not necessarily the threat to (nor the saviour of) education as it is currently made out to be.
A turning point.
I recently signed up to ChatGPT.
But let me be clear, it was not used as a means of populating this post.
I signed up more out of novelty than any real desire to investigate its use in the classroom. When it comes to technologies promising educational revolution, I tend to feel the need to hold an obstinate position. Nevertheless, my curiosity overtook me after some discussions with teaching colleagues. I guess you could say I wanted to see what all the fuss was about.
I will admit that it was a very helpful tool in providing summary explanations of some concepts that I needed to get my head around for a new course I have been teaching this year. It is also refreshing to not be bombarded with hundreds of advertisements clawing for my attention on any given website.
But am I concerned about it? Not really.
However, the day after, I read this opinion piece, and it unnerved me for a number of reasons (which I’m sure was the point!). I found the following statement rather telling:
‘As any former student knows, one of the main challenges of writing an essay is just thinking through the subject matter and coming up with a strong, debatable claim. With one snap of the fingers and almost zero brain activity, I suddenly had one’ (Terry, 2023, emphasis added).
A critical examination of the above statement indicates that it is reflective of a number of dominant, unquestioned and ideological assumptions regarding the purpose of education. Leaving these assumptions unexamined has the potential to result in some fairly unhealthy educational perspectives.
Ideologies of output.
Ideologies not only impact on how we understand education, but also how we act within it (Ostrowski, 2022). How we interact and engage with AI in education is then also dependent on the ideologies we hold.
I see the above statement as symptomatic of particular ideological frameworks that understand the purpose of education as a means of producing an output. As a result, education becomes a performative endeavour whereby one’s level of “education” is determined by the artifacts produced by the individual as a result of educational interventions.
One does not have to look too far to see that Australian education currently values the definable “output” of education much more highly than other, less measurable qualities. This has led to a focus on meeting “outcomes”, reaching ever-higher final year scores and producing the educational product. The ever-increasing demand on teachers to present “evidence” of learning is another example of these ideological beliefs at work.
The issue in privileging the outputs of education is that the means to achieve such outputs are often left open to abuse, such that the product is simply a performative measure that gives the appearance of educational value. One need only think about the questionable ways schools have manipulated data to present a more favourable account of their community, whether “teaching to the test”, removing “difficult” students or moving growing numbers of final year high school students to complete an “unscored” final year.
I’m sure you can think of a few too.
Look around at any advertisements and promoters of AI and you will find two central themes: productivity and efficiency. AI privileges the end product above all else, rather than the value of creating and reflecting on knowledge for oneself. In a world of education where the end product is the purpose, it is perfectly valid for an individual to avoid effort with AI if it still results in the same outcome. On the other hand, we may feel that to be educated means more than just producing something.
If we continue to understand education as simply a means of producing some output in the form of a particular skill, conceptual understanding or predefined citizen, AI will continue to disrupt the work of teachers in a negative way. It is not too difficult to see how the teacher might become replaced by AI in this scenario.
That being said, it is comforting to know that there are less fatalistic positions to take.
Finding ourselves.
Here’s a scenario to ponder...
You’ve given your class a number of tasks to engage with and they’re mostly working away. After some time, you find one of your fairly diligent students consistently chatting to their peer next to them. You find this disappointing, so you pull them up on it. They respond by saying:
“But sir/miss, I’ve done all the work.”
With a careful look over, you see that they are telling the truth.
How do you respond?
If it were me (and I have had this experience more than a few times), I would reiterate that this student has missed the point. There is a risk that, in valuing the product over the process as in the case above, we not only miss the point but reduce education to a shallow, performative act of production.
What’s worse, we may even obscure the person.
When we centre on the product, we move our gaze away from the individuals involved and their choices. It is almost as if the student/producer is an unimportant, apolitical and amoral being.
Regardless of whether an educational artifact was developed or inspired by AI, we can ask a few questions to bring the individual back into view. Such as:
How has the person been shaped by this experience?
What relevance does the artifact have for the person’s life and future?
What kinds of ethical decisions have/have not been made?
What value does the artifact have for the person?
It is questions like these that help bring other, less measurable educational concerns into view.
Valuing what we produce, or producing what we value?
Ideologies that understand education solely as a means of producing outcomes or educational artifacts do not consider the transformational potential of the process of creating and personally investing in one’s education.
It has been said that the increasing standardisation of education has led to valuing what can be measured, rather than measuring what we value (Biesta, 2015). In the case of AI, we may find that we’re so obsessed with efficiency that we come to value what is produced, rather than produce what we consider valuable.
If we simply seek the quickest way to produce an output with AI, there is little space for critical examination of whether these outputs, and the methods used to achieve them, are desirable. When we use AI in education this way, we are not creating, possessing or reflecting on knowledge.
We are transporting it.
Let’s keep this in mind.
Till next time,
References
Biesta, G. J. (2015). Good education in an age of measurement: Ethics, politics, democracy. Routledge.
Ostrowski, M. S. (2022). Ideology. Polity Press.
Terry, O. K. (2023). I’m a student. You have no idea how much we’re using ChatGPT. The Chronicle of Higher Education. https://www.chronicle.com/article/im-a-student-you-have-no-idea-how-much-were-using-chatgpt