California State University’s controversial $17 million deal to provide ChatGPT across all of its campuses has produced mixed results: widespread but uneven use across the system, high distrust of AI-generated content and widespread fear that the technology could jeopardize job security – even as respondents say they want more training in tools they believe will be “essential” to their fields.
Those complicated feelings were among the findings of the largest study to date of artificial intelligence in higher education, which surveyed 94,000 students, faculty and staff at 22 CSU campuses from San Diego to Arcata.
The survey, conducted last year by researchers at San Diego State University, shows CSU grappling with how AI is affecting assignments, classroom instruction, competition for jobs and academic integrity. It found that almost every respondent had used AI at some point, with personal use more common than educational use.
According to the survey results, released Wednesday, employees are the most enthusiastic about the technology, followed by students and then faculty – the most divided group. Majorities of all three groups also said they believe AI can boost creativity and innovation.
In a statement, CSU Chancellor Mildred Garcia said she viewed the results “not merely as a measure of current attitudes” but as a “call to action.”
“CSU has an opportunity to lead higher education by shaping how AI can be incorporated thoughtfully, equitably, and responsibly,” she said. “And we will answer that call.”
AI in the crosshairs
The new CSU data comes at a critical moment for AI in education.
The university’s 18-month contract with OpenAI to license the ChatGPT chatbot for 460,000 students and 63,000 faculty members and staff expires in July. A petition with more than 3,300 signatures – more than half of them from CSU students, staff or faculty – is circulating to call for an end to the partnership.
At the same time, other universities are striking similar deals. In December, USC announced it would provide ChatGPT to its 80,000 students, staff and faculty members at a cost of $3.1 million per year. Some campuses, including Caltech, are also using AI tools to screen applicants.
A CSU spokesperson declined to say whether administrators would renew the ChatGPT deal.
“We are considering all options that will allow CSU to continue to provide students, faculty and staff access to AI tools, resources and training,” the spokesperson said.
The survey found that despite mixed views on AI, more than 70% of faculty want formal training on it, and nearly half of students do as well.
How do students use AI?
The CSU survey was not specific to ChatGPT but found it to be the most popular AI tool by far: over 84% of students, staff and faculty said they use it to some extent. Other tools such as Gemini and Canva also ranked highly, while the writing tool Grammarly was the second most popular among students.
Of those who named ChatGPT as their top tool, about 30% of students and 40% of employees said they use it daily. Nearly two-thirds of students and staff, and more than half of faculty, reported using it at least weekly.
The majority of students – 80% – say they would not submit AI-generated classwork as their own. Nearly 9 in 10 students also said they believe it is “necessary” for humans to check AI-generated content for accuracy. Staff and faculty said the same at similarly high rates.
Landon Block, a senior studying political science at Cal Poly San Luis Obispo, said he “rarely” uses AI for several reasons, including “profound environmental impacts, local consequences for data centers across the country, ethical issues over training and deployment, and critical skills being lost/underdeveloped.”
Block, who did not participate in the survey, said he has used his university-distributed ChatGPT account only once.
“However, I have several friends in STEM-heavy courses who consistently, yet responsibly, use AI to help code and implement class material. I’ve also seen classmates use AI irresponsibly to cheat or otherwise get work done,” he said.
Katie Caroum, a Cal State Northridge senior majoring in communication studies, said AI has been “inconsistently used and applied.” That perception is reflected in the survey results, which found wide variation in whether faculty members address AI in their syllabi or encourage or discourage its use in their classrooms.
“The thing I hear most from students is that they are struggling with AI detectors and how they can get it very wrong,” said Caroum, vice president of systemwide affairs for the Cal State Student Association.
Faculty Division
Employees – non-instructional staff in areas such as finance, information technology, clerical roles and food service – view AI most favorably, with more than 70% saying the technology has a “positive” impact on their work. About 64% of students said they believe the same is true for their education.
Faculty members are more divided. “56% report a positive impact on their teaching and research, and 52% report a negative impact,” the study states. “Faculty are the only group in the survey where a majority reports both.”
Still, more than half of faculty, 55%, said they use AI to develop course content.
Martha Lincoln, an associate professor of medical anthropology at San Francisco State, is among those who oppose AI. Lincoln – together with Martha Kenny, a professor in the university’s Department of Women’s and Gender Studies – is behind the petition asking CSU to “invest in humans” and “reject Silicon Valley’s AI hype.”
“The way I cope with AI is that I now have to dedicate time in my courses to making clear to my students that they are not allowed to use AI on homework assignments,” Lincoln said. “I have to read my students’ work to see if I can spot clear signs of AI use, which is a very frustrating and wasteful way to spend time.”
Lincoln said she has had to “redesign a lot of assignments and assessments so that they can’t be easily hacked with the use of AI,” turning instead to in-class or multiple-choice exams and creative presentation projects.
Zach Justus, the director of faculty development at Chico State, said he has heard such views among the 900 faculty members he works with, but he has also seen many who are excited about AI.
“We still have people who want to pretend it doesn’t exist. We still have people who are adapting in real time and doing amazing work. And we have people who would prefer to keep it out of their classrooms,” Justus said. “I always tell faculty, ‘Don’t outsource what you love.’ If you like reading and then creating visuals for a complex article, great, keep doing that. But if it was something you hated doing and you weren’t good at, you could get some help with it.”
That tension is one that Cal Poly Maritime Academy professors Taiyo Inoue and Sarah Senk explore in “My Robot Teacher,” a podcast they launched last year.
“We wanted a faculty-led space that would make room for more than just propaganda or depressing narratives,” said Senk, a professor of literature, whose project is funded by the California Education Learning Lab and looks at “how AI could push higher education toward better forms of learning than we’ve settled for.”
“The big question for me is how to teach students to control their attention, decisions, and thoughts in a society that treats them as rapidly extractable resources,” Senk said. “Over the last 20 years, it has become easier and easier to put your thinking away. Companies compete for attention, platforms compete for your attention, and now AI makes cognitive outsourcing frictionless. Higher education should be one of the few places still committed to helping students keep a grip on their brains.”
