Friday, November 22, 2024

Are AI skills a key part of career preparation in college?

A May 2024 survey by Inside Higher Ed and Generation Lab asked students if they knew when, how or whether to use generative artificial intelligence to help with coursework. Student responses revealed the importance of faculty communication around generative AI policies in the classroom but also highlighted some learners’ disdain for using the technology in any capacity.

Among the survey's 5,025 respondents, around 2 percent (n=93) provided free responses to the question on AI policy and use in the classroom. Over half (55) of those responses were flat-out refusals to engage with AI. A few said they don't know how to use AI or are unfamiliar with the tools, which limits their ability to apply them appropriately to coursework.

But as generative AI becomes more ingrained into the workplace and higher education, a growing number of professors and industry experts believe this will be something all students need, in their classes and in their lives beyond academia.

Methodology

Inside Higher Ed’s annual Student Voice survey was fielded in May in partnership with Generation Lab and had 5,025 total student respondents.

The sample includes over 3,500 four-year students and 1,400 two-year students. Over one-third of respondents were post-traditional (attending a two-year institution or 25 or older), 16 percent were exclusively online learners and 40 percent were first-generation students.

The complete data set, with interactive visualizations, is available here. In addition to questions about their academics, the survey asked students about health and wellness, the college experience, and preparation for life after college.

“The big picture is that it’s not going to slow down and it’s not going to go away, so we need to work quickly to ensure that the future workforce is prepared,” says Shawn VanDerziel, president and CEO of the National Association of Colleges and Employers (NACE). “That’s what employers want. They want a prepared workforce, and they want to know that higher education is equipped to fill those needs of industry.”

Students Say

The Student Voice survey reflects other national studies on student perceptions of generative artificial intelligence. While some learners are ready to embrace the technology head-on, they remain in the minority.

A summer 2023 study by Chegg found 20 percent of students in the U.S. (n=1,018) say they've used generative AI for their studies, the second-lowest adoption rate among the countries surveyed. A majority of U.S. students believe use of generative AI tools should be limited in assessed work (53 percent), and 10 percent believe it should be banned.

Fewer than half of U.S. learners said they want their curriculum to include training on AI tools (47 percent). One-quarter of respondents indicated AI would not be relevant to their future career, and 17 percent said they don’t want the training at all.

What’s the Holdup?

Student Voice survey participants indicated a variety of reasons why they didn’t want to use AI tools. Some were disdainful of the technology as a whole, and others indicated it wasn’t appropriate to use in higher education.

When asked their top three concerns about using generative AI in their education, Chegg’s survey found students were worried about cheating (52 percent), receiving incorrect or inaccurate information (50 percent), and data privacy (39 percent).

“Whether you’re very leery of this for a variety of reasons—whether they be ethical, environmental, social, economic—or enthusiastic, I think we have to occupy the space for a while and recognize it’s going to be odd and complicated,” says Chuck Lewis, writing director at Beloit College in Wisconsin.

In a study recently published on ScienceDirect, researchers at the University of California, Irvine, surveyed 1,001 students to understand their usage of and concerns about ChatGPT. Among students who held concerns, the top themes were ethics, quality, careers, accessibility, and privacy or surveillance.

Some survey respondents indicated they were concerned about unintentional plagiarism or use of ChatGPT compromising their work, which could lead to consequences from their institution.

“I am afraid to be flagged, so I refrain from utilizing it at all,” a junior from Florida Gulf Coast University wrote in the Student Voice survey.

Others surveyed by Irvine researchers were worried about the quality of the output ChatGPT provides, which could impact students’ creativity or result in inaccurate information.

“I do not see any application in a chat bot. I spend more time fixing its mistakes than I would actually writing the thing,” a junior at the University of Wisconsin–Milwaukee said in the Student Voice survey.

Additionally, some students shared in the Irvine study that they were worried a reliance on ChatGPT could erode their critical thinking skills or make them feel “too comfortable” sidestepping learning processes, which could harm their job prospects.

Reversing the Trend

Afia Tasneem, senior director of strategic research at the consulting firm EAB, points to institutional hesitancy to respond to AI and a negative stigma around the technology as reasons students may be anti-AI. In fall 2022, colleges and universities were quick to implement anti-AI policies to limit plagiarism and other academic misconduct, which instilled fear in students.

Lewis finds that learners' inclinations toward or against the technology can be tied in part to their field of study. His humanities students are much more likely to express disdain for AI than those in STEM, for example.

“I’ve sensed a kind of bi-modality in student attitudes,” Lewis says. “Some are like, ‘Ooh, ick, that’s not why we’re here’ … For example, when you talk about AI to creative writers, they feel really like, ‘This is just bad news. No fun.’ And yet, on the other extreme, you have a lot of students who are like, ‘Why would I not want to use a tool that’s going to make my getting this task done faster and easier?’”

Now, as more industry professionals consider AI literacy and skills essential, universities have to turn culture on its head, which isn’t an easy task. But some think higher education is doing students a disservice if it allows them to opt out of AI use entirely.

A May survey from Cengage Group found 70 percent of recent college graduates (n=1,000) believe basic generative AI training should be integrated into courses, and 69 percent say they need more training on how to work alongside new technologies in their current roles.

“While there are certainly objections to the use of AI in many circumstances, we need to put guardrails around AI clearly, but we also, as instructors, as mentors, as professionals, need to help the next generation of workers apply other kinds of skills … to be able to make wise decisions related to AI,” NACE’s VanDerziel says.

Looking to the Future

Generative AI tools have exploded in capability and availability since 2022, stirring excitement among institutions and employers about the next evolution.

“Businesses, for good reason, want to embrace it, and embrace it in a way that helps their bottom line, helps them be more competitive, helps them be more efficient. All those things that typically are reasons why technology is adopted in the first place, this is just, in some respects, another technology that companies will have to adopt,” says James DiLellio, professor of decision sciences at the Graziadio School of Business at Pepperdine University.

Understanding the future impact of AI on today’s college students, though, is like looking into a crystal ball—mostly unclear and up to interpretation.

“I think a lot of universities moved fairly quickly to start thinking of this as a new competency and a kind of essential workforce skill,” says Dylan Ruediger, senior program manager for research at Ithaka S+R. “Whether that will prove to be true or not, is still, I think, kind of hard to know. There seems to be a little bit of a disillusionment going on around the technology in the business world. Whether that’s a blip or, you know, a permanent trend, I don’t know.”

VanDerziel emphasizes that employers, by and large, are not currently requiring workers to use AI; instead, they consider AI part of a larger technology competency that students will need for the future and will apply alongside other skills.

A May survey by NACE found 75 percent of employers hadn’t used AI in the past year, and only 3 percent planned to use AI within the next year for workplace tasks.

“We learned from our internship study that we published in the spring that less than 10 percent of interns learned AI skills in their internships,” VanDerziel says. “I thought that was really telling … of how employers are using AI currently. That’s such a small portion of students [who] actually probably even touched it in their internship, which is where you would expect the application to actually happen. It’s just not happening yet.”

David Syphers, a physics professor at Eastern Washington University, sees generative AI as a fad that has been getting too much attention recently in higher education.

“It’s not what most people think it is. It’s not intelligent, it’s not conscious, it’s not going to take our jobs,” Syphers says. “It’s a really interesting piece of software.”

To Syphers, the conversation around AI and preparing students for the workforce feels like a direct response to national pressures to justify the value of higher education. But making students AI competent is a moving target because of how fast generative AI tools are evolving.

Instead, Syphers argues, higher education's role should be to equip students with enduring tools for careers, not just their next job, by promoting communication, critical thinking and other lasting skills.

Considering Pedagogy and Curriculum

If, as some experts believe, AI skills are critical for the future of work, the question becomes how to deliver these skills equitably across academic programs. Recent trends in higher education have seen institutions engage with students earlier on career development and planning, to ensure every student receives personalized support and assistance as they begin their journey after college.

“To level the playing field and ensure that there aren’t students who are being left behind with AI, we need to integrate [it] throughout disciplines and throughout the curriculum,” VanDerziel says. “That’s the only way to do it, so that students, no matter what course load they have, we know that they are going to have exposure to technologies that large portions of our population are using and that will be required by the workforce of the future.”

But teaching generative AI in the classroom is trickier than teaching teamwork or communication skills.

“As long as individual instructors have ultimate say over how it gets used in their classroom, it’s likely that there will be instructors who prefer not to allow the use of generative AI,” says Ruediger. “I would be surprised to see that disappear on its own any time soon.”

As a faculty member at Pepperdine, DiLellio sees his mission as preparing students to bring what they've learned into the workforce immediately, and that includes using new technologies.

“I want students to be able to take advantage of that [generative AI], because I know in the workplace, these tools are not going to go away,” DiLellio says. “We’ve got to figure out ways to encourage students to be willing to embrace the technology, and faculty can help.”

Some of DiLellio’s M.B.A. students use ChatGPT to run analytical calculations, much as they would in Excel, for faster and more efficient computation. “It’s very valuable—you could find software that could allow them to think more critically about the results, as opposed to just figuring out how to generate those results,” DiLellio says.

Syphers, on the other hand, sees the rigor of working through calculations as the very reason for learning and attending college.

“I’m not asking my Introductory Physics students to solve problems because the world needs to know the answer to those problems,” he says. “They’ve been solved many, many times before. I’m asking them to solve those problems as intellectual exercise, to better themselves.”

Ultimately, understanding where AI belongs in the curriculum requires instructors to distill their courses to their core learning outcomes, whether that’s creative thinking, problem-solving, communication, analysis or research, says Beloit’s Lewis.

“I think that we’re, as educators, in an uncanny valley, where we really don’t know what we think we mean by what should be human or what should be machine,” Lewis says.

This article has been updated to correct the spelling of David Syphers’s name.