4 Comments

Ben Riley condemns AI chatbots because a school-based product called Khanmigo could not solve this problem correctly:

"If a pen and crayon together cost $2.50, and the crayon costs $2 less than the pen, how much do the pen and crayon each cost?"

Apparently, it stumbled over adding 2 and 2.5.

I am puzzled why Ben didn't just ask OpenAI's GPT-3.5, which was freely available in April when he tested Khanmigo. I provided the same prompt, verbatim, to ChatGPT and got this:

Let's denote the cost of the pen as p and the cost of the crayon as c.

According to the problem, we have two pieces of information:

1. The total cost of the pen and crayon is $2.50.

2. The crayon costs $2 less than the pen.

We can set up the following equations based on this information:

p + c = 2.50

c = p - 2

We can substitute the second equation into the first equation to solve for p:

p + (p - 2) = 2.50

2p - 2 = 2.50

2p = 4.50

p = 2.25

Now that we have the cost of the pen, we can find the cost of the crayon:

c = p - 2

c = 2.25 - 2

c = 0.25

So, the pen costs $2.25 and the crayon costs $0.25.
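For anyone who would rather check the arithmetic than take my transcript at face value, a short sanity check in plain Python (my own, not part of either chatbot's output) confirms the answer:

# Check ChatGPT's answer against the two conditions in the word problem.
# (My own quick verification, not output from Khanmigo or ChatGPT.)
pen, crayon = 2.25, 0.25

assert abs((pen + crayon) - 2.50) < 1e-9   # together they cost $2.50
assert abs((pen - crayon) - 2.00) < 1e-9   # the crayon costs $2 less than the pen

print(f"pen = ${pen:.2f}, crayon = ${crayon:.2f}")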

ChatGPT has no problem adding 2.25 and 0.25, nor any problem explaining it to a student. Feel free to ask ChatGPT "What is the solution for x in 4x^2 - 5x - 12 = 0?"
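And if you're curious what a correct answer to that quadratic looks like, the roots aren't tidy. Here is the standard quadratic formula worked out in a few lines of Python (again mine, purely for illustration):

import math

# Solve 4x^2 - 5x - 12 = 0 with the quadratic formula.
# (My own illustration, not output from ChatGPT or Khanmigo.)
a, b, c = 4, -5, -12
discriminant = b**2 - 4*a*c                 # 25 + 192 = 217, not a perfect square
x1 = (-b + math.sqrt(discriminant)) / (2*a)
x2 = (-b - math.sqrt(discriminant)) / (2*a)

print(f"x ≈ {x1:.4f} or x ≈ {x2:.4f}")      # x ≈ 2.4664 or x ≈ -1.2164

The exact roots are (5 ± √217)/8, roughly 2.47 and -1.22.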

I might be missing the point here, but it seems Ben himself illustrates the problem perfectly: he's confining himself to the School World and missing out on what the Real World already offers.


I find it disturbing that in both articles, this one and The 74's, short shrift is given to students actually learning how to use AI chatbots. That's actually an overstatement: there really is no focus on expanding the knowledge and skills of the student.

Even the articles listed at the bottom of this page (Welcome to Think Forward: Learning with AI, Think Forward: AI and educational assessments, and Can we educate every child "to the max"?) assume that children should be taught essentially the same knowledge and skills they acquire in today's classrooms. It's legitimate for an organization or writer to focus only on a specific segment of the business, teacher tech in this case, so criticizing the lack of discussion of young humans learning to use this new technology in higher ed or at work might be unfair. (I'll click through and read the other CRPE articles to more fully understand the mission of this Substack.)

Admittedly, the article's title, "Is AI 'Destined to Fail'?", prompted me to think it was about teaching and education fads, and my observations of teacher resistance to change in the classroom over 20 years of teaching kicked in. It's hard not to notice that the classrooms of the 2020s look little different from Miss Landers's 1950s classroom. The social and technological achievements and tools that transformed retail stores, airplanes, communications, mass media, family, etc. have bypassed schools, leaving them essentially unchanged. Unfortunately, this means students today are living in pretty much the same educational structure their great-grandparents "enjoyed".

There's a reason the word "fad" is used in this article.

Employers are eager to hire talent with ChatGPT experience. Gartner estimates that 70% of white-collar workers use ChatGPT daily in their work tasks. Yet whole school divisions are banning access to AI chatbots for both students and staff. Color me more than a little puzzled.


There are good reasons why teachers as a group might be more resistant to change than those in most other professions. I used to joke that school teachers and farmers are the most knowledgeable job applicants: farmers because they've spent their whole lives at their future workplaces, and teachers aren't far behind. Consider that every teacher comes to their first job after spending 17 years in a classroom. Armed with that cultural experience, what sane teacher would want to change a workplace they likely find great comfort in?

Should we be surprised that when it comes to policy and practices guidance, "state teams continue to diverge in what they prioritize—and what they are proposing schools do"? They have little incentive to change teaching practices, or the learning that follows from them, much less the culture. And those classrooms are sacrosanct.

If I'm a teacher reading your final declaration — "AI’s potential to make education more personalized and accessible..." — I see only the end of my world.


My response to Ben Riley's "Another Mindless Mistake? We humans haven’t learned how humans learn" is that humans learn when they have a reason to learn, such as for a job or something else tangible.
