The Death of the Student Essay—and the Future of Cognition
-
Argues for the importance of student essays, and then:
When artificial intelligence is used to diagnose cancer or automate soul-crushing tasks that require vapid toiling, it makes us more human and should be celebrated.
I remember student essays as being soul-crushing vapid toiling, personally.
The author is very fixated on the notion that these essays are vital parts of human education. Is he aware that for much of human history - and even today, in many regions of the world - essay-writing like this wasn't so important? I think one neat element of AI's rise will be the growth of some other methods of teaching that have fallen by the wayside. Socratic dialogue, debate, personal one-on-one tutoring.
I've been teaching myself some new APIs and programming techniques recently, for example, and I'm finding it way easier having an AI to talk me through it than it is grinding my way through documentation directly.
It IS easier than reading the documentation, just like using a GPS is easier than reading a map.
In both cases, the harder task helps you build a mental model much better than the easier task.
For the GPS it doesn't really matter much, since the stakes are low--it's not important to build a mental model of a city if you can always use GPS.
With programming I'm more cautious -- not knowing what you're doing can lead to serious harms. Just look at the Therac.
-
This post did not contain any content.
I loved writing essays and see the value for a student in knowing how to state a case and back it up with evidence, what counts as evidence, and the importance of clearly communicating the ideas.
That said, I also use AI to write copy daily and the most important thing for anyone's cognition is critical thinking and reading comprehension, both of which AI is going to teach us whether we want it or not. Critical analysis is the only way we can navigate the future.
Maybe this is another Great Filter for technologically advancing critters?
-
There are kids who find exercise soul-crushing vapid toiling too.
Just for some perspective on “what’s good for you.” I personally think I’d have been more successful in life if I was better at essay writing. But I’m not sure if it’s a practice thing, or an innate ability thing. I have to assume I just need(ed) lots more practice and guidance.
I’m also on a similar path right now learning more about programming. AI is helping me understand larger structures, and reinforcing my understanding and use of coding terminology. Even if I’m not writing code, I need to be able to talk about it a bit better to interact with the AI optimally.
But this need to speak in a more optimum way may go away as AI gets better. That’s the thing I worry about, the AI crossing a threshold where you can kind of just grunt at it and get what you want. But maybe Idiocracy is on my mind there.
… just some random thoughts.
-
This post did not contain any content.
I'm still looking for a good reason to believe critical thinking and intelligence are taking a dive. It's so very easy to claim the kids aren't all right. But I wish someone would check. An interview with the gpt cheaters? A survey checking that those brilliant essays aren't from people using better prompts? Let's hear from the kids! Everyone knows nobody asked us when we were being turned into ungrammatical zombies by spell check/grammar check/texting/video content/ipads/the calculator.
-
I loved writing essays and see the value for a student in knowing how to state a case and back it up with evidence, what counts as evidence, and the importance of clearly communicating the ideas.
That said, I also use AI to write copy daily and the most important thing for anyone's cognition is critical thinking and reading comprehension, both of which AI is going to teach us whether we want it or not. Critical analysis is the only way we can navigate the future.
Maybe this is another Great Filter for technologically advancing critters?
I hated writing pointless essays about topics I don't care about, and yet I still like to research and debate.
-
There are kids who find exercise soul-crushing vapid toiling too.
Just for some perspective on “what’s good for you.” I personally think I’d have been more successful in life if I was better at essay writing. But I’m not sure if it’s a practice thing, or an innate ability thing. I have to assume I just need(ed) lots more practice and guidance.
I’m also on a similar path right now learning more about programming. AI is helping me understand larger structures, and reinforcing my understanding and use of coding terminology. Even if I’m not writing code, I need to be able to talk about it a bit better to interact with the AI optimally.
But this need to speak in a more optimum way may go away as AI gets better. That’s the thing I worry about, the AI crossing a threshold where you can kind of just grunt at it and get what you want. But maybe Idiocracy is on my mind there.
… just some random thoughts.
The problem with AI here is that it tends to prefer agreeing to you over being correct and it's very likely that it teaches patterns and terminology to you that doesn't exist.
For example, I just asked ChatGPT to explain a "backflip" in the context of agile development (I claimed I was an university student and that a teacher used that term in the context of moving tickets), and it came up with this:
If your teacher linked "backflip" to moving tickets in a predictable fashion, they might have been emphasizing:
The importance of minimizing rework or regressions.
Understanding why work items move backward (if they do) and making that visible in your process.
Managing workflow policies so that tickets don’t repeatedly “backflip” between stages, which can disrupt predictability.
How This Might Help You Move Tickets More Predictably:
If "backflip" means work moving backward:Track the Cause of Backflips:
Identify why tickets are moving backward (incomplete acceptance criteria, insufficient definition of done, unclear requirements).Improve Definition of Ready/Done:
Tighten entry/exit criteria to reduce backflows.Add Checkpoints:
Build small validation steps earlier to catch issues sooner.Visualize Flow:
Use a Kanban board to visualize backward movement and analyze bottlenecks or quality gaps.It just takes the nonsensical word, makes something up, and claims that it's right.
-
This post did not contain any content.
Another look at students, AI, and Essays on the Search Engine podcast. "What should we do about teens using AI to do their homework?"
Opinions from students and experts.
-
This post did not contain any content.
Once again I'll say, I'm perfectly fine with the death of the essay as viable school homework.
In my experience, teachers graded only on grammar and formatting. Teaching - and more to the point, grading - effective writing skills is harder than nitpicking punctuation, spelling and font choices, so guess what happens more often?
You want school to mean anything, you're going to have to switch to verbal or demonstrable skills instead of paperwork. Which society probably needs to do anyway.
-
The problem with AI here is that it tends to prefer agreeing to you over being correct and it's very likely that it teaches patterns and terminology to you that doesn't exist.
For example, I just asked ChatGPT to explain a "backflip" in the context of agile development (I claimed I was an university student and that a teacher used that term in the context of moving tickets), and it came up with this:
If your teacher linked "backflip" to moving tickets in a predictable fashion, they might have been emphasizing:
The importance of minimizing rework or regressions.
Understanding why work items move backward (if they do) and making that visible in your process.
Managing workflow policies so that tickets don’t repeatedly “backflip” between stages, which can disrupt predictability.
How This Might Help You Move Tickets More Predictably:
If "backflip" means work moving backward:Track the Cause of Backflips:
Identify why tickets are moving backward (incomplete acceptance criteria, insufficient definition of done, unclear requirements).Improve Definition of Ready/Done:
Tighten entry/exit criteria to reduce backflows.Add Checkpoints:
Build small validation steps earlier to catch issues sooner.Visualize Flow:
Use a Kanban board to visualize backward movement and analyze bottlenecks or quality gaps.It just takes the nonsensical word, makes something up, and claims that it's right.
I believe you and agree.
I have to be carful to not ask leading questions of the AI to much. It’s very happy to go off and fix things that don’t need fixing when I suggest there is a bug, but in reality it’s user error or a configuration error on my part.
It’s so eager to please.
-
I believe you and agree.
I have to be carful to not ask leading questions of the AI to much. It’s very happy to go off and fix things that don’t need fixing when I suggest there is a bug, but in reality it’s user error or a configuration error on my part.
It’s so eager to please.
Yeah, as soon as the question could be interpreted as leading, it will directly follow your lead.
I had a weird issue with Github the other day, and after Google and the documentation failed me, I asked ChatGPT as a last-ditch effort.
My issue was that some file that really can't have an empty newline at the end had an empty newline at the end, no matter what I did to the files before committing. I figured, that something was adding a newline and ChatGPT confirmed that almost enthusiastically. It was so sure that Github did that and told me that it's a frequent complaint.
Turns out, no, it doesn't. All that happened is that I first committed the file with an empty newline by accident, and Github raw files has a caching mechanism that's set to quite a long time. So all I had to do was to just wait for a bit.
Wasted about an hour of my time.
-
The problem with AI here is that it tends to prefer agreeing to you over being correct and it's very likely that it teaches patterns and terminology to you that doesn't exist.
For example, I just asked ChatGPT to explain a "backflip" in the context of agile development (I claimed I was an university student and that a teacher used that term in the context of moving tickets), and it came up with this:
If your teacher linked "backflip" to moving tickets in a predictable fashion, they might have been emphasizing:
The importance of minimizing rework or regressions.
Understanding why work items move backward (if they do) and making that visible in your process.
Managing workflow policies so that tickets don’t repeatedly “backflip” between stages, which can disrupt predictability.
How This Might Help You Move Tickets More Predictably:
If "backflip" means work moving backward:Track the Cause of Backflips:
Identify why tickets are moving backward (incomplete acceptance criteria, insufficient definition of done, unclear requirements).Improve Definition of Ready/Done:
Tighten entry/exit criteria to reduce backflows.Add Checkpoints:
Build small validation steps earlier to catch issues sooner.Visualize Flow:
Use a Kanban board to visualize backward movement and analyze bottlenecks or quality gaps.It just takes the nonsensical word, makes something up, and claims that it's right.
The joke is on you (and all of us) though. I'm going to start using "backflip" in my agile process terminology.
-
I'm still looking for a good reason to believe critical thinking and intelligence are taking a dive. It's so very easy to claim the kids aren't all right. But I wish someone would check. An interview with the gpt cheaters? A survey checking that those brilliant essays aren't from people using better prompts? Let's hear from the kids! Everyone knows nobody asked us when we were being turned into ungrammatical zombies by spell check/grammar check/texting/video content/ipads/the calculator.
IMO, kids use ChatGPT because they are aware enough to understand that the degree is what really matters in our society, so putting in the effort to understand the material when they could put in way less effort and still pass is a waste of effort.
We all understand what the goal of school should be, but that learning doesn't really align with the arbitrary measurements we use to track learning.
-
-
-
-
An Alabama City Recommends Changing Its Laws to Accommodate One of the Country’s Largest Proposed Data Centers
Technology1
-
YouTube relaxes moderation rules to allow more controversial content. Videos are allowed if "freedom of expression value may outweigh harm risk"
Technology1
-
-
-
Microsoft’s vast advertising business is target of Irish Council for Civil Liberties (ICCL) Enforce application for class action launch under EU data law
Technology1