Emily Musil reviews her 11- and 13-year-old children’s report cards, scanning through the typical categories: language arts, math, social studies. But she is hopeful that one day there will be a new metric for assessing all children: artificial intelligence literacy.

“Yes, I do think we'll get there,” Musil says, pointing to the rapid progression in what elementary schools offer, from typing class to computer literacy instruction to teaching coding.
But currently, “as a parent, I’m not seeing how my children are doing in understanding deep computing and AI tools — and that needs to shift.”

Musil is a managing director of social innovation at the Milken Institute, a nonprofit think tank. She served as the head researcher for a report released in November on building the nation’s talent engine in the age of AI.

“If you care about economic mobility, what choices do you need to make?” she asks.
“We are behind, because technology is so rapidly advancing and is so tied to all jobs.”

The report pushed for K-12 institutions to emphasize AI literacy alongside critical thinking and decision-making skills. But getting to that expanded curriculum — let alone one that tackles the nuances of AI technology — may prove difficult, because it will take concerted effort from institutions, schools and leaders to make it happen.
Lack of Standards and Expertise

Federal standards for AI education began under the Obama administration and were most recently pushed again by the Trump administration, with the presidential action “Advancing Artificial Intelligence Education for American Youth.” Yet when it comes to local implementation, decisions are largely left to individual schools or administrators — and more than half of U.S. schools or districts, many of them rural or classified as Title I, have no standards at all. According to the report, 60 percent of U.S. schools or districts have no guidance for generative AI usage.
Many schools have previously told EdSurge that because the technology is changing so rapidly, decisions are often left to teachers’ discretion. The lack of standards may also reflect a lack of expertise in AI, and in technology as a whole, in the classroom. For example, according to the Milken report, only 17 percent of current computer science teachers have computer science degrees.
While the report did not delve into what those teachers majored in instead, Musil added that some teachers are asked to cover the subject as the scope of their workload increases. The same phenomenon could happen with AI literacy curricula.

“If you've been a teacher for 20 years, suddenly, you may not be an expert in medieval history, but you had to do something on it,” she says. “So, they're teaching something they're not necessarily deeply skilled in.”

Collective Action

The report laid out four specific focus points for K-12 schools: developmentally appropriate instruction in AI; ethical and critical use of AI tools; pairing human cognition with AI use; and learning through human interaction rather than screens alone.

For students, “K–12 education is often the first place they encounter STEM and computing topics,” the report says.
“As an AI-driven workforce demands specialized skills earlier and earlier, K–12 has become an even more crucial intervention point. By building future-ready curricula and support systems, we can address gaps early and support student flourishing.”

These are lofty goals. A related challenge is the dearth of girls pursuing STEM fields.
The report found that roughly half (49 percent) of elementary school computer science students are girls. That share drops to 44 percent in middle school, 33 percent in high school and about 20 percent by college graduation.

The Milken report acknowledges there is no easy, silver-bullet solution to achieving these goals. Necessary federal efforts are already at play.
And Musil suggested that employers or individual philanthropists could help schools by funding, advocating for and collaborating on curriculum changes, to the benefit of both students and hiring organizations.

“This report makes it clear that the challenge is national in scope and the solutions must be collective,” says Michael Ellison, co-founder and CEO of CodePath, a nonprofit focused on diversifying the technology industry. The organization supported the Milken Institute in producing its report.
“Philanthropists, industry leaders, policymakers, and educators all must act to rewire education and workforce systems for an AI-driven world.”

Risks of AI Integration

But there are also risks to weigh when integrating the rapidly changing technology. A report released last month by the Center for Democracy & Technology found that schools’ embrace of AI was connected to an increased risk of worse outcomes for students: Half of surveyed students said using AI in class makes them feel less connected to their teachers.

“As many hype up the possibilities for AI to transform education, we cannot let the negative impact on students get lost in the shuffle,” Elizabeth Laird, director of the Equity in Civic Technology Project at CDT, said in a statement.
“Our research shows AI use in schools comes with real risks … Acknowledging those risks enables education leaders, policymakers, and communities to mount prevention and response efforts so that the positive uses of AI are not overshadowed by harm to students.”

And in a 2023 report titled “Artificial Intelligence and the Future of Teaching and Learning,” the Department of Education warned against unchecked usage.

“We especially call upon leaders to avoid romancing the magic of AI or only focusing on promising applications or outcomes, but instead to interrogate with a critical eye how AI-enabled systems and tools function in the educational environment,” the report says.

But Musil points out that whether or not schools have specific rules for AI integration, students will be using the technology in their free time, and it is best to teach them how to avoid those negative outcomes.
“My daughter is told AI is cheating, but there’s lots of things to do with pedagogy with AI; that piece of it is going to be their future,” she says. “When I’m hiring, I want someone to use AI and know when it is cheating, when it isn’t, and when it supports human thinking and when it supplants it.”