AI in Education: Kevin Shindel on #UMustLearn Radio Show
A friend of the show, Kevin Shindel, was on a radio program called You Must Learn Radio the other day, and, uh, good stuff.
So I thought it would be good, with his permission,
to add his podcast here to artificial education, or Artificial Intelligence: Real Talk, and I hope you, uh, enjoy it.
Check it out and lemme know what you think.
Radio Show, hashtag You Must Learn.
Hashtag You Must Learn is the Liberatory Education Platform for those who seek a better understanding of all things education and enlightenment.
I'm Dr. Darrel Howard, and I'm one of the members of the hashtag You Must Learn Collective here to give you an hour's worth of intellectual and practical insight from individuals in our community who seek to advance the cause of education, equity, leadership, and liberation.
In the words of Sonia Sanchez's 1975 play, Uh Huh:
but how does it free us?
Similarly, I'm always seeking to understand with educational and social change agents: uh huh, but how does your work advance the culture?
And speaking of advancing the culture, I wanna begin by highlighting this week's black history fact.
A man who can be described as a peacemaker.
Dr. Ralph Johnson Bunche became the first African American to receive the Nobel Peace Prize on December 10th, 1950, for the 1949 Armistice Agreements.
Bunche was born on August 7th, 1904 in Detroit, Michigan.
He was a diplomat and political scientist
who is remembered internationally for his work as the chief mediator for the United Nations
during the Arab-Israeli conflict, for which he received the Nobel Peace Prize. Bunche was also involved as a mediator in other major conflicts around the world.
In 1968, he became the United Nations Under-Secretary-General.
President John F. Kennedy awarded Bunche the nation's highest civilian award, the Presidential Medal of Freedom, in 1963.
Early in life, Bunche received his undergraduate degree from UCLA, graduating summa cum laude and class valedictorian in 1927.
He earned his master's in 1928 and doctorate in 1934 from Harvard in political science.
He was the first African American to receive his doctorate in political science.
He later did postgraduate anthropological work at the London School of Economics and the University of Cape Town.
Shout out UCT.
Before completing his graduate degrees from Harvard, Bunche started teaching political science at Howard University, the real HU, where he served as chair of the Department of Political Science from 1928 to 1950.
Bunche was active in civil rights and served for over 20 years on the NAACP board.
He and his wife, the former Ruth Harris, had three children.
Bunche died on December 9th, 1971 in New York City, at age 67.
Let me just say something about Brother Bunche.
Here we're talking about a person who received some real peace prize acknowledgements.
I just wanted to make sure that we are clear on how these things were earned back in the day.
Um, let's give a shout out to this legend of our culture.
And speaking of those legends, I have two legends on here with me tonight.
I have, uh.
Two friends, two comrades, two colleagues who are doing some amazing things and, um, they're joining the show.
And we'll be learning a little bit about what they do in advancing the culture in our first segment.
Social science educator, uh, philosophy teacher, media literacy and AI guru Kevin Shindel will talk to us about what we need to know and consider about artificial intelligence in education.
That's a big topic, y'all make sure you're paying attention for that one.
And in the second segment, Omar Barlow, the founder of the Salt and Light School for Boys, will share his first-year experiences in the new school.
So I'm, I'm excited also to hear that.
So WPFW listeners, I know you're excited as I am, so go ahead, take a moment and share about these incredible guests that we have coming up next. Text or call a few of your friends, let them know: Hey, we got some dynamic guests on hashtag You Must Learn tonight.
Tell 'em to put on our radio dial WPFW 89.3, or they can go online, wpfwfm.org.
Tell 'em across, share across social media as well.
Hashtag You Must Learn is the platform where we put folks on game to what's new and relevant in the education and equity space.
As always, we wanna hear from our guests too.
So call us at (202) 588-0893 or you can use the hashtag you must learn on social media, and we might read your comment or question on the air.
Be sure to include @WPFWFM and @DarrelHowardPhD in your post so you can be duly recognized.
Alright folks, I told you I got a good, I got a good lineup today.
So I'm excited to get started with the first guest, the one and only Kevin Shindel.
Kevin Shindel,
tell my WPFW audience a little bit about who you are.
Thank you, Dar.
I appreciate you, uh, having me on tonight to talk about, uh, this. Uh, my name's Kevin Shindel, and I've been a teacher at Montgomery Blair High School for about the last 25 years, uh, 24, 25 years.
So, um, I, I have to give a shout out to all my Blair High School students if I forget that they're gonna let me know tomorrow.
Um, and so yeah, I've been a, uh, social studies teacher.
I've taught history and government philosophy, media literacy.
Um, AI impact, um, a number of courses.
Um, and so yeah, that's, um, that's kind of it.
I'm just, you know, excited to get into it with you tonight.
Fantastic.
Alright, well let, let's start, um, let's start in the past a little bit.
Mm-hmm.
Because I know prior to kind of like your interest in your learning and teaching and AI, I know you, you were already doing some stuff around
looking at students' cell phone use and just media literacy.
And you had a, you had a digital detox program, um, that you did with your students.
So let's talk about like the nature of students' reliance on technology and the challenges that you address.
Like how has that changed over the past decade?
Let's start with, with that.
Yeah, so I've been, um, I, I started that digital detox in, in the spring of 2012.
Um, and at that point, you know, social media had been around for a little bit.
Um, I, I remember vividly, you know, my first introduction to social media when one of my colleagues came in and, um, this was in the, uh, fall of 2005, probably late September, 2005.
And he came in and he said, um, Hey, have you ever heard of MySpace?
And I said, ah, what is MySpace?
And he said, man, you need to get on this and, and we gotta figure this out.
He said, three of my, uh, daughter's friends at their school just got suspended for something they posted on MySpace, and he said, you should get on and, and check it out.
I bet your students have pages.
And so I, I did that.
Um, right before class.
And, um, I can tell you within about 10 minutes, I just, you know, I, I was intuitively feeling this is going to go off the rails.
This could get really, really bad.
Um, and then it took me some years and, and, you know, I was always interested in studying it and talking about it with kids and things like that.
And then one day in 2012, um, I read an article that said more than 50% of teens will wake up,
um, from, from sleep in the middle of the night to either send or receive a text message.
And I went in and I read the headline to the students.
I said, how many of you do this?
And, um, I'll be damned if every hand didn't go up in the air.
And at that point the digital downtime project was born.
Right?
And so it was kind of, we really need to dive in and take a look at the, um, at the effects of screen technologies.
And, you know, I think there was probably a moral panic about it around that time.
Right.
Um, I think that, that a lot of people were really concerned and, and, and the idea that kids were addicted to screens.
Um, and, and we didn't have all the information, but I think some of that panic was justified, right?
And some of us could see it early on.
Um, and even though the research, you know, over the last couple years has really started to kind of drill in to show, show the long term effects, some of us saw really early what, what was, what was down the road.
Um, but as far as the nature of change.
I think social media began, and it was really social media, right?
It was about how kids could connect with each other.
It was a lot about social connection.
Um, and, and you know, one of the quotes that I, I'll never forget is, like I said, there was all this, you know, concern about the kids being addicted to the screens.
The kids are spending too much time on the screens.
Um, and there was a fantastic researcher.
Her name was danah boyd.
She's worked for Microsoft and she's done her own stuff.
And she said, you know, these kids aren't addicted to their screens.
They're addicted to each other.
And so, you know that that put it kind of in a different light.
You know, I'm looking at the three of us.
We're about the same age.
If you give us a lifeline to our friends 24/7 at the age of 13, 14, 15, we're gonna take that lifeline.
Right.
Um, you know, I remember in high school, the person that had the most social status was the one who got their own landline.
Right.
Or the 10-foot cord to wrap around three rooms, or go down the hall into a room by yourself.
Um, you know, we didn't have, we didn't have access to this, but if we did, we would've been using it.
Just, you know, just like they were.
And so the, the, the nature of the online communication then was just kind of an extension of, of their relationships in, in the physical world.
Right.
Um, and then I think it started to change and, and it started to change.
We, I don't know whether we were, whether it was destined to change, whether it had to change, but we'll never know that, because once the social media platforms
understood that they could use AI algorithms to make money,
once they understood that they could monetize those algorithms, then they became weaponized.
Right?
And so then it's just how much attention can we capture?
Um, and there's been a decided change in the last five years, right?
I mean, Facebook is terrible.
Now, let's be honest.
Every other thing that you see is an ad. You might get a feed with five or six people of your, of your friends.
Um, but it's all taken up by reels, you know, the short form videos.
And so I think that's kind of where we've gone.
We've gone from a time where it was really kind of about social connection, connecting with people around the country, around the world, establishing relationships.
Um, but once the attention, you know, got ramped up.
Now it's just all about, you know, um, trying to keep kids and, and people on as much as they can with short-form videos that, you know, really,
they've had a devastating impact, you know, on, on students academically, socially, cognitively.
It's, it's, you know, so there's been a big change over the last 10 years, I think.
Yeah.
And, and, and the attention piece is, is what stood out to me and what you just said, um, attention's like the new currency.
Like that's, that's the thing folks want.
Um, and, and I often think about it myself, like how am I using the, the, the focus and attention that I have and what am I using it for?
Um, so it's impacting adults just as it is impacting, impacting our students.
Mm-hmm.
Um, but, but to the fact that you're talking about adults and students: like, you, you described this as like an, an ecosystem.
Mm-hmm.
It's not just a tool, like it's an ecosystem.
Can you elaborate a little bit more on what you mean by that?
And then particularly with students, like what are the academic and non-academic changes you foresee in the, uh, the, the ecosystem
Mm-hmm.
In, on the, on the student experience.
Right.
So, I mean, you know, now here comes AI, right?
Um, AI, um, in a different format, right?
That we, we've already, kids have already been exposed to AI, and, and they've been abused by AI for 10 years, right?
But now we're going to invite AI into our classrooms, uh, more formally.
And I think that the, the metaphors matter, man.
The metaphors matter a lot.
And, and we, we want to say that AI is a tool, right?
So, Darryl, you tell me when I say tool, what do you think of?
What's the first tool?
Something that you can use to improve something, but you can, you control it.
Okay.
You control it.
Um, mm-hmm.
So usually when you say tool, people say a hammer, right?
A screwdriver.
Um, and, and AI is not a hammer, right?
Um mm-hmm.
So the idea that AI is not a tool and it's more of an ecosystem, it's related to the work of Neil Postman.
Um, Neil Postman was an old cultural critic and media theorist, and author and professor.
Um, I would urge everybody to dive into some Neil Postman.
Um, he wrote a book in 1992 called Technopoly, and I'm, I'm convinced if, if, if everybody in America read Technopoly, we would solve at least half of our technological issues.
Um, but what Neil Postman said was that, you know,
technology is not additive or subtractive.
It's ecological.
It changes the entire environment.
It doesn't just add to what we can do and subtract other things.
It fundamentally alters the environment and we've seen this time and time and time again.
Tools don't do that, right?
When, when you think of a tool, you think of a hammer.
A hammer has a limited usage, right?
It can be used for good or bad.
Sure, but there's nobody programming that hammer so that you sit there and look at it all day long.
There's nobody telling you that you can use this hammer to write an essay, and you could use this hammer to order your groceries, and you can use this hammer to buy your concert tickets, and you can use this hammer to improve your productivity.
Right.
So tools are very limited.
Yep.
We, we do have multipurpose tools and that's great.
But AI is not a tool.
AI is an entire ecosystem, um, and is being built as such so that, um, once you are in that ecosystem, it impacts every aspect of that ecosystem.
And so when you're talking about, you know, how has this played out in, in classrooms?
You know, when I started at Blair in, in the fall of 2002, a classroom of 30 students had one screen.
It was the teacher's desktop.
Right.
You fast forward 20 years and now a classroom has more than two screens per person.
Everybody's got a Chromebook.
I still have my laptop.
Everybody's got a phone, and there's a big Boxlight or Promethean, you know, big smart board at the front of the room.
That technological change didn't add to what we did.
It fundamentally altered every aspect of that classroom.
Um, and, and you know, to be clear, I'm not opposed to technology.
I'm not opposed to Chromebooks.
I'm not opposed to AI.
Um,
I'm opposed to the developers of AI.
That's a whole different story.
We could get into that.
I'm, I'm opposed to, you know, those guys.
That'd be part, that'd be part two.
Part two, I'll fight those guys to the hilt.
Right.
Um, but I'm not opposed to technology.
But, but the problem is we didn't look at the comprehensive effects of that.
And I will, I will say that giving every kid a Chromebook
in a classroom has been an unmitigated disaster, right?
Um, it's affected their relationships, it's affected their ability to communicate.
It has affected their attention span, it's affected their memory.
Um, it's, it's had a total impact on every aspect of that classroom.
And AI, um, AI is going to be, uh, more profound than that, right?
Um, AI has the potential, um, and again,
I'm not opposed to AI.
I think there are really, really good use cases for AI, even though the research hasn't necessarily found them all out yet.
Um, but I, I think that there are some real positive use cases of AI, but we have to know what's coming.
We have to know, um, not just how to use AI, but, but if we don't study how AI is planning to use us,
Um, we're going, we're gonna be at the whim of, of somebody else's goals.
And, and that's a big problem.
So I think AI is gonna transform student relationships.
It's gonna transform their ability to communicate, um, their ability to think, um, through problems, right?
If you have an AI and, um, you get stuck on a problem,
are you gonna sit there and deliberate?
Are you gonna reflect, are you going to take a break and, and come back to it?
No, you're just gonna go to ChatGPT and say, I'm stuck.
Help me out.
Right.
Um, and so it's going to be transformative, and, and unless we know, um, you know, the potential, um, I don't think any of this stuff is necessarily inevitable.
Uh, but unless we understand the potential impact, we're, we're really gonna set ourselves up for, uh, for some potential harm for students and teachers.
So, so in thinking about some of these negative impacts, um, like from a practical sense mm-hmm.
Like, we know that there's significant hurdles for, um, for school faculty to, to overcome, to ensure responsible and effective implementation of ai.
Um, is it a matter of a training policy or something else?
Yeah, it's, it's all the above, right?
I mean, it's training, it's policy.
It is.
Um, right now, I mean, there's such a, a lack of training, um, in, in the impacts of AI, how to use AI.
Um, we're getting, uh, there's a shortage of policies, um, around AI that's coming from districts.
Um, and, you know, I think that, you know, the, the biggest thing, the, the biggest hurdle, I think, might be relationships themselves, right?
It all comes back to relationships.
In that classroom, you, you will accomplish nothing.
As a teacher, you'll accomplish nothing positive if you don't have a, a good relationship with those students.
Right?
And so the biggest hurdle to really implementing, um, any, any positive, you know, AI integration and deployment, um, could be those relationships.
And, and I think there's a lot of fear.
You know, when, when, when AI first started and kids started to use it and some kids started to cheat, and, you know, um, I think that's more overblown than people believe, but maybe I'm wrong about that.
I don't, I think the perception is everybody, all the kids are using AI to write all their essays.
That's just not true.
Right?
Um, but I do think that that's the perception among some teachers.
And so if you're talking about the positive integration of ai.
And teachers are now going to say, well, I'm assuming that the kids are going to use AI to write all of their papers.
So I'm gonna now ask them questions about their papers.
We're gonna have time set aside, and they have to prove to me that they wrote their paper and that they know what they're talking about.
How you gonna start and establish a good relationship with the premise that you are cheating?
And it's my job to investigate whether you are actually telling the truth.
So you are fundamentally harming that relationship.
And once that happens, in fact, there, there was a study that came out a couple, um, couple months ago from the Center for Democracy and Technology:
half of students felt that their relationship with their teachers was harmed by the presence of AI in the classroom.
Right.
Um, and so we just got big questions that, that need to be answered, uh, before we kind of integrate this.
And so, you know, I know that, um, you know, a lot, a lot of companies, a lot of platforms are approaching systems outta fear saying, you know, you're gonna harm your kids.
They're not gonna be prepared for the work world and all this stuff.
Um, and so, you know, they're, they're trying to ramp up, um, the speed at which we deploy and integrate ai.
And I, I think that's not, that shouldn't be part of the framework, right?
The, you know, the, um, the purposes have to be aligned.
The relationship has to be there.
That, that purpose alignment is really important because our purposes as educators probably, um, are in conflict with the purpose of, of for-profit corporations that have, you know, uh, different incentives than we do.
Right.
Yeah.
So I, I would say that, you know, what we're trying to do at Blair is we're trying to establish an AI impact and integration team.
And everybody has to be at that table.
Parents have to be there, students have to be there, teachers have to be there, admin has to be there.
Um, so that we decide before we, before we bring some platform in, before we bring some corporation in, we decide what has value in our classrooms.
We decide what the purposes of education are, and at that point, then you unlock the door and you bring, you invite people in, you say.
We want our students to communicate with each other.
We don't want them on screen all day.
We want our students to, um, to enhance their critical thinking, their conceptual thinking.
We don't want the, um, we don't want that stuff outsourced to, to, you know, a large language model chatbot.
And so, you know, that, that has to be part of the framework.
Yeah.
So, but, but from what I heard you say, it sounds like,
like you mentioned teachers and parents and administrators, and you said students should be involved in that too, but how are you, like, intentionally helping the students understand some of the ethics of, of, of AI and trust and privacy?
Like how do you, how do you start having the conversation with them so they understand the impact?
Darrel, they already know, right?
They, they, if you want to talk to a student about social media, that may be the one time where they won't stop talking in high school, right?
It's tough to wake kids up these days.
You get 'em in a conversation about social media and the impacts of social media and what they see and, and how they understand those messages.
They will talk for days.
Right.
So the students are, you know, I think there was all this talk about students are digital natives.
They know more about the technology than we do, and I, I don't necessarily know that that's true, right?
But students are already fluent in a lot of this stuff.
They've been, they've been the victims, they've been the users of this, you know, and so when it comes to talking to students about AI, um, and cheating, and, and do you find purpose and value in what you do?
And if you don't,
why wouldn't you use ChatGPT to write your essay if this has no relevance to your life or what you want to do?
Um, and so they have to be at the table.
They, they, they should be the first ones with the seat at the table, right?
Um, the question that gets asked most these days: what should I do if AI,
as we're being told, is going to cause massive, massive unemployment?
You got kids getting ready to plop down hundreds of thousands of dollars potentially on college educations.
You got kids that, that are wondering whether, um, the trades is the best avenue now.
Right?
Should I be a journalist or should I be a plumber?
That's a good question.
Right.
So the, the kids are ready for these conversations and the kids need to be empowered to drive some of that.
I don't think the kids want to be on a Chromebook all day long.
I don't think the kids want to have, um, a large language model as their best friend, but we're not, we are not engaging them in this conversation enough that they can have some agency.
Right.
And, and so, so much of what we're being told is:
this is what AI is going to do.
This is why you have to use AI.
Um, and whatever we decide, you know, whatever the big six, ChatGPT and Grok and, you know, Claude, whatever they decide, you know, that becomes inevitable, and, and you just completely rob kids of their agency.
We shouldn't accept that.
Nothing is inevitable.
Mm-hmm.
Mm-hmm.
So, so we're, we're running outta time.
We got about five more minutes left before we go to break.
Mm-hmm.
If we're thinking about, like, if somebody's listening and, and they're in a different school district, um, different state, different part of the world,
and they want careful consideration of AI adoption, what are, what are the top three non-negotiable criteria that need to be in that, in that conversation right there?
Top three? There, there's like 10, right?
Well, gi gimme as many as you can
in the next 30
minutes.
I can, sure, I can, I can rattle off three.
So number one is, um, you, you have to have your purposes aligned, um, among educators, students, you know, teachers, and, and that has to be aligned before you adopt the technology, right?
Um, number two, there has to be design features that do not allow students off the cognitive hook.
Right, and that's, I think that's the biggest fear around AI: that even if students don't use it to flat out plagiarize and cheat, they'll be offloading too much of that cognition, the productive struggle that it takes.
It's really hard to be an educated person.
There are no shortcuts and, and the idea that we have this shiny, beautiful tool.
Right.
That is going to make your learning easier.
Um, it's just, you know, it's fool's gold, right?
And so I think there has to be design features such that a student must demonstrate some metacognition and some reflective capacity before the AI goes into deeper explanations.
The AI should really kind of be,
you know, a scaffolded, you know, Socrates.
But first and foremost, the AI needs to ask, ask questions more than anything else, right?
Let the students do the heavy cognitive lifting, and the AI can guide them in that as far as scaffolding goes.
Um, but the AI can't do the cognitive work.
Um, and another thing that, that I'm sure kids and, and the companies will push back on: hey, I want every design feature stripped from any AI
that humanizes that AI.
In fact, maybe every fifth comment, the AI should say, you know, I'm just a, a computer crunching numbers and data here.
Right?
Um, I want, I love that.
I love that.
Yeah.
Because the, the, the danger of anthropomorphizing these things, right, and treating it as your friend: the number one use, Darrel, the number one use of, of large language models today is as therapy.
Right, and I understand that there's an absolute lack of therapist therapy is expensive, but man, we're going, we're going to, we're gonna wonder what the costs are of having large language models providing therapy to potentially millions of Americans and, and kids.
And so.
Um, yeah, I want, I want humanity,
I want emotion stripped from that thing; it crunches numbers and data and can help us with logic.
I want a Mr. Spock AI, right?
I, I don't want the one that, um, that, that says, oh, I really care about you.
And, um, if nobody else is listening to you, I will listen to you.
I think that's a, that's a dangerous, dangerous slope.
Um, and, and real quick, finally, there has to be verifiable checks that protect students' privacy and data.
Students are, are not to be taken advantage of and not to be exploited for, um, for a corporation to study all their clicks, how long it takes 'em to do something, what they're doing, what their flaws are, all these things, so that they can maximize their profits.
And, um, and that's, you know, there needs to be verifiable, uh, checks in there because these, the companies can't be trusted.
Google, a dozen times over the last 10 years, right,
has been sued by school districts for, uh, misusing, um, and exploiting students' data, uh, contrary to their terms of agreement.
And we know this.
So there you go.
There's, there's three.
We keep going.
Equity.
You wanna talk equity?
So drop, drop, drop a second of equity on me.
A second of equity.
Um, so these tools, um, will most likely dramatically expand the opportunity gap.
Right.
Unless there's some real checks in there, they're gonna, you know, if they don't run on older hardware and they only run on newer hardware, the kids that can't afford that, the kids that don't have that, they're at a loss.
And so I know that there are systems that are saying, you know, we're gonna have checks in to make sure that, you know, everybody has the same thing.
Um, but, but the, the biggest probability is that that gap gets widened, you know, dramatically.
Yeah.
Yeah.
WPFW audience,
we have just been listening to a lecture by the Kevin Shindel, talking about all the considerations around AI and education.
We're so, so let's say this first, let's just establish this.
We need a part two.
That's, that's clear that we need to, we need you to come back and we need to have a, a part two about this.
Um, because there's so many things that we still have to consider, and a lot of people have a lot of curiosity around, um, around AI.
In this show, we talk about education.
So of course we're going to, we're gonna blend the two.
But WPFW, You Must Learn show, we are dropping the new information on you that you are not getting anywhere else.
Um, so.
I'm going to ask Kevin this last question.
What else do you want the WPFW audience to know?
Um, you know, caution, right?
Keep your eyes open.
Um, you, you have to be skeptical.
The, the, we're being, you know, sold these tools, um, quote unquote, right?
As, as the panacea, and, and just understand that the people that are selling these things are
oftentimes gonna profit from 'em, and they're trying to give you an idealistic vision of the world.
And we should know that the reality is never the ideal.
Right?
And, and we look back to social media.
If we could have done things differently in 2010, 2012 with regard to social media, what would we have done?
And once you answer that question, then you have to do that as, as AI emerges, because I see us heading down the same path.
Right.
The same uncritical, non-skeptical path.
Um, and, and we need to maintain, you know, our, our agency over this.
Yeah.
Yeah.
Alright.
You heard it from Kevin Shindel, folks.
