The title talks about "AI Tutors", yet reading the whole paper it's just very general and obvious statements about the use of AI in education / by students in general, nothing to do with "tutors". If you ask someone else to do your homework and they do it for you, no one would call them a tutor.
I am not sure whether just shouting "AI literacy" will cut it, given the sad quotes presented.
The problem I have with this is the same one I have with every AI criticism. None of these problems are specific to AI. All of these problems ... are money/effort problems.
They start by discussing the difference between teachers who teach a subject ... and teachers who will discuss changing the foundations of a subject and the implications of big changes. That kind of discussion is of course required for critical thinking.
But that's a BIG step up in skill from what normal teachers bring to the table. At that point you should be so versed in the subject that you can discuss how the subject is constructed, and why (e.g. the connection between calculus and war). That, at minimum, requires knowledge of the subject itself, its history including its failed history (which paths were not taken or were abandoned, like, say, the axiom of choice, and why), and its current research directions (like what the arguments are for and against various kinds of large cardinal numbers; hell, what large cardinal numbers even are, the continuum hypothesis, ...).
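For concreteness, here is a minimal sketch (my notation, not the paper's) of two of the set-theoretic statements alluded to above, written as plain LaTeX:

    % Axiom of choice: every family of nonempty sets admits a choice function
    \forall \mathcal{F}\,\bigl(\varnothing \notin \mathcal{F} \rightarrow
      \exists f\colon \mathcal{F} \to \textstyle\bigcup\mathcal{F}\ \
      \forall A \in \mathcal{F}\; f(A) \in A\bigr)

    % Continuum hypothesis: no cardinality strictly between the naturals and the reals
    \neg\,\exists S\ \ \aleph_0 < |S| < 2^{\aleph_0}
      \qquad\text{(equivalently } 2^{\aleph_0} = \aleph_1\text{)}

A teacher at the level I mean can not only state these but also explain why anyone cared enough to fight over them.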
I have a master's in math, and in 27 years I've had 4 teachers who had anything approaching that level of knowledge. I remember each of them vividly. And I agree with the article: with such teachers you learn 10x what you learn with "normal" teachers. But they are so uncommon that they are a rarity even in pure math university departments (which have also gotten worse by choosing cheaper candidates over better ones). Frankly if you have that level of knowledge you leave the teaching profession unless you're insane, because you can do so much better.
In other words: AI can be a pretty sizeable improvement on the average teacher, and this paper makes the traditional argument against AI. The argument goes "AI doesn't (yet) beat the very best humans at X, so it is totally unusable for X", when AI easily beats average humans. If anything, this is an argument to have those very best teachers switch to teaching AIs, and to get rid of the average ones.
And of course there's the undertone in the article that teaching children provides a measure of social control over society. Which of course is already a problem. Every subject has extremely controversial parts, like the first applications of calculus: calculating ballistic trajectories, in other words killing people from as large a distance as possible. THAT is what much of early applied calculus was developed for, and that is what it does very, very well.

And if it's that controversial for math ... well, in social science papers European scholars were arguing for a holocaust (removing "bad genes" by terminating incurable patients) at the beginning of the 20th century, when Hitler was a baby crying on his mother's lap. In fact, Hans Asperger's description and popularization of autism had the singular purpose of "purifying the genes of the great German people", not helping the patients suffering from it. His words, not mine. In other words, really discussing a subject requires coldly and matter-of-factly discussing incredibly bad political ideologies, including when such ideologies are held by scientists/teachers, and pointing out just how bad they can get, how much damage they can do, and how science enables such ideologies to do incredible damage.
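To make the ballistics aside above concrete, here is a minimal textbook-style sketch in LaTeX, under idealized assumptions (point mass, no air resistance, flat ground; v_0 and \theta are placeholder launch speed and angle):

    % Newton's second law with gravity as the only force
    \ddot{x} = 0, \qquad \ddot{y} = -g

    % Integrate twice, with x(0) = y(0) = 0,
    % \dot{x}(0) = v_0\cos\theta, \ \dot{y}(0) = v_0\sin\theta
    x(t) = v_0 t\cos\theta, \qquad y(t) = v_0 t\sin\theta - \tfrac{1}{2} g t^2

    % Solve y(T) = 0 for T > 0 and plug into x(t): the gunner's range
    R = x(T) = \frac{v_0^2 \sin(2\theta)}{g}

The whole thing is two integrations and a root-finding step; real exterior ballistics adds air resistance and needs numerical integration, but this is the calculus core that early artillery tables were built on.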
The main criticism is skill atrophy, which you don't acknowledge; instead you talk about literal Nazism.
Society is a construct, and we teach children to be a part of it. If society is shit, we teach our children shit. That's it.
Can AI retrieve the knowledge for that? I guess that's possible.
Can AI make it meaningful and actually transfer that knowledge?
Human contact is wired deep into our DNA as part of how we learn something and truly make it our own. Trying to replace that with some form of text will rob learning of an essential part, with consequences we can't properly measure. We can only watch what happens to humans as they have less contact: loneliness and a lack of role models, both of which would likely increase.
> Frankly if you have that level of knowledge you leave the teaching profession unless you're insane, because you can do so much better.
Honestly: where in the job market are such knowledge or such skills actually appreciated?
My life experience says that at least in academic teaching these skills are more appreciated than nearly anywhere else in industry, but if you know better, I'm interested to hear your perspective.
«Skunkworks», MIC, three letter agencies, problem-solving contractors, staff-level employees
> My life experience says that at least in academic teaching
If you mean financially compensated, just about anywhere. There are only a few places where academic teaching is well compensated, and even in those places it doesn't compare to the private sector. Oh, and there's an enormous but not very visible industry of putting university professors on company boards as advisors, to make sure there's a very easy transition for them into private industry.
> If you mean financially compensated, just about anywhere.
I explicitly wrote "appreciated".
There are of course a lot of jobs that pay a lot better than some lowly teaching position in academia, but in those jobs, great knowledge nearly always gets you a lot of hate and resistance. Nearly all of them are rather about "shut down your brain, keep your mouth shut, and take the money".