Since the launch of ChatGPT a little over two years ago, universities have struggled to figure out generative artificial intelligence’s place on their campuses. But the State University of New York—which, early on, invested heavily in AI research—has given the technology a place of prominence as a key subject every undergraduate student will be required to study to earn their degree.
The university system announced earlier this month that it would adjust one of its “core competencies”—general education requirements that all undergraduate students must fulfill—to include education about AI. The change comes alongside other updates to the system’s general education program, including the addition of a new civic education core competency.
Starting in fall 2026, courses that satisfy the Information Literacy core competency will be adding lessons about AI ethics and literacy. The learning outcomes for the requirement now include a clause stating that students’ ability to “demonstrate an understanding of the ethical dimensions of information use, creation, and dissemination” should extend to “emerging technologies, such as artificial intelligence.”
The new requirement comes at a time when questions and worries are bubbling to the surface about the ethics of AI across every sector and industry, from concerns about an epidemic of AI cheating to fears that companies will use the technology to replace much of their workforces, especially in creative roles.
“SUNY is committed to academic excellence, which includes a robust general education curriculum,” SUNY chancellor John B. King said in the system’s press release regarding the changes. “We are proud that … we will help our students recognize and ethically use AI as they consider various information sources.”
Because a wide swath of courses across the system’s 64 colleges and universities can satisfy the Information Literacy requirement, there are no across-the-board requirements for how professors must incorporate information about AI into their lessons. Those choices will fall to institutions and departments, which will coordinate to develop curricula and assignments over the coming year and a half.
Lauren Bryant, a lecturer in the Department of Communication at the University at Albany, already includes a lesson on AI in Introduction to Communication Theory, a large lecture course for communication majors that fulfills the Information Literacy requirement. When introducing students to a weekly journal assignment they must complete throughout the semester, she shows the class various examples and asks them to select their favorite—without noting that one is entirely written by AI.
Some choose the AI writing, saying it’s better written, more polished and more professional-sounding than the other options. The exercise opens up discussions about what AI does well—but also where it falters, such as leaving out details that are supposed to be included in the journaling assignment.
“I think it’s important to teach them this is not going anywhere, and I think it’s a technology that we have to learn how to use—and learn how to use effectively,” she said.
Bryant, like other professors across the SUNY system, is in the earliest stages of figuring out how her course will satisfy the new AI-related learning outcome. One topic she hopes to tackle is citation—teaching students that if they use information provided by AI, they should cite it accurately, just as they would information gleaned from a textbook.
‘Prerequisite Skills’ for Tackling AI
Though many experts have called for AI literacy education since generative AI became mainstream in late 2022, some worry that universities may struggle to guide students through the technology’s most pervasive pitfalls.
Sam Wineburg, the Margaret Jacks Professor of Education at Stanford University and the co-principal of the Digital Inquiry Group, a nonprofit that researches digital literacy and develops curricula and classroom materials, has run experiments showing that a majority of high schoolers are unable to complete even basic media literacy tasks, like differentiating between a news article and an advertisement. (Wineburg prefers the term “civic online reasoning,” which refers to one’s ability to verify a claim that impacts civic life, rather than “digital” or “media literacy.”)
He believes that students who move on to postsecondary studies without first mastering those skills are doomed to fall for generative AI “hallucinations”—instances in which the technology presents entirely made-up information as if it were true.
“There’s no indication that students have the prerequisite skills in order to check out the veracity of a large language model response,” Wineburg said. “Until higher education comes up with metrics that first establish the prerequisite skills,” offering AI literacy education “is adding insult to injury.”
But Billie Franchini, director of UAlbany’s Center for the Advancement of Teaching, Learning, and Online Education and a member of the working group that developed the new general education requirements, noted that the broad nature of the AI requirement will allow the university to continue adjusting it as the technology develops.
“The way that this new core competency was written within the SUNY general education framework—we wrote it pretty broadly, with the recognition that the landscape is going to change, right?” she said. “So, we didn’t address, let’s say, a specific AI tool or type of AI tool. It’s really about recognizing AI as one potential source of information and really ensuring that students know how to treat that source of information responsibly.”