Jim Franco/Albany Times Union via Getty Images
Since ChatGPT was released a little over two years ago, universities have struggled to figure out generative artificial intelligence's place on campus. But the State University of New York, which invested heavily in AI research early on, has now made the technology a required subject of study for all undergraduates seeking a degree.
The university system announced earlier this month that it would adjust one of its “core competencies,” a set of general education requirements that all undergraduate students must fulfill, to include education on AI. The change comes alongside other updates to the system’s general education program, including the addition of a new civics education core competency.
Starting in fall 2026, courses that satisfy the Information Literacy core competency will include lessons on AI ethics and literacy. The learning outcomes for the requirement include a clause stating that students’ ability to “demonstrate an understanding of the ethical aspects of the use, production, and dissemination of information” should extend to “emerging technologies such as artificial intelligence.”
The new requirement arrives amid questions and concerns about the ethics of AI across sectors and industries, from worries about the prevalence of AI-enabled fraud to fears that companies will use the technology to replace much of their workforce, especially in creative roles.
“SUNY is committed to academic excellence, including a robust general education curriculum,” SUNY Chancellor John B. King said in the system’s press release announcing the changes. “We are proud to help students recognize and use AI ethically when considering different sources of information.”
There are no uniform requirements for how professors should incorporate AI into their lessons, since a wide range of courses across the system’s 64 campuses can satisfy the information literacy requirement. Those decisions will be left to individual institutions and departments, which will work together to develop curricula and assignments over the next year and a half.
Lauren Bryant, a lecturer in the University at Albany’s Department of Communication, has already built lessons about AI into her Introduction to Communication Theory course, a large lecture for communication majors that meets the information literacy requirement. When introducing the weekly diary assignments her students must complete throughout the semester, she shows the class several sample entries and asks them to pick their favorite. One of the samples, though she doesn’t say so at first, was written entirely by AI.
Many students choose the AI-written sample, saying it is better written, more polished, and more professional-sounding than the others. The exercise kicks off a discussion about what AI does well, but also where it stumbles, such as omitting details the diary assignment called for.
“I think it’s important to teach them that this is something that isn’t going away. And I think this is a technology that we have to learn to use, and we have to learn how to use effectively,” she said.
Bryant, like other professors across the SUNY system, is in the very early stages of figuring out how her courses will meet the new AI-related learning outcomes. One topic she wants to tackle is citation: when students use information provided by AI, they need to cite it accurately, just as they would information gleaned from a textbook.
“Prerequisite skills” for working with AI
Many experts have called for AI literacy education since generative AI went mainstream in late 2022, but some worry that universities will struggle to educate students about the technology’s most pervasive pitfalls.
Sam Weinberg, the Margaret Jacks Professor of Education at Stanford University and co-director of the Digital Inquiry Group, a nonprofit that researches digital literacy and develops curricula and materials, has conducted experiments showing that a majority of high school students cannot complete even basic media literacy tasks, such as distinguishing between news articles and advertisements. (Weinberg prefers the term “civic online reasoning,” meaning the ability to examine claims that affect civic life, over “digital” or “media literacy.”)
He believes that students who enter higher education without mastering these skills are especially vulnerable to generative AI’s “hallucinations,” the term for the technology presenting completely fabricated information as if it were true.
“There is no indication that students have the prerequisite skills to check the veracity of large language model responses,” Weinberg says. “Offering AI literacy education before higher education first devises measures that establish those prerequisite skills only adds insult to injury.”
But Billy Franchini, director of the University at Albany’s Center for the Advancement of Teaching, Learning, and Online Education and a member of the working group that developed the new general education requirements, pointed out that the broad wording of the AI requirement will allow universities to keep adjusting as the technology develops.
“The way this new core competency was written within the general education framework at SUNY, we wrote it pretty broadly, recognizing that things would change, right?” she said. “So we didn’t name specific AI tools or types of AI tools, and that’s important.”