Big Tech’s New Classroom: How Cal State Is Embracing AI
Spurred on by titans like Amazon and OpenAI, California State University wants to become the nation’s ‘first and largest AI-empowered’ institution.
The inaugural AI Camp at California State University kicked off with students like Savannah Bosley diving into Amazon Bedrock, a platform for building artificial intelligence applications. Ms. Bosley, a recent computer science graduate of California Polytechnic State University (a Cal State campus in San Luis Obispo), saw the camp as a practical opportunity: ‘I figured it wouldn’t hurt to put it on the résumé, to learn a new tool that’s maybe marketable.’
The intensive five-day program, hosted at Cal Poly, was prominently ‘powered by’ Amazon Web Services, the cloud computing arm of the e-commerce giant. Participants received Amazon-branded merchandise and practiced AI skills in AWS Jam, a dedicated training application. Throughout the camp, Amazon employees also shared the company’s core principles, such as ‘Think Big.’ Ms. Bosley, 25, likened the experience to a ‘timeshare presentation,’ remarking, ‘You get the vacation — but you also have to sit through the propaganda.’


This AI Camp is part of a much larger public-private initiative by Cal State, the nation’s largest university system with 460,000 students. Partnering with corporate giants like Amazon, OpenAI, and Nvidia, the university aims to become the country’s ‘first and largest AI-empowered’ institution. Key objectives include making generative AI tools — capable of producing human-like text and images — accessible across its 22 campuses, integrating chatbots into teaching, and preparing students for the evolving, AI-driven job market.
A significant piece of the effort is a $16.9 million agreement with OpenAI to bring ChatGPT Edu to more than half a million students and staff, which OpenAI announced as the largest deployment of ChatGPT in an educational setting anywhere. Cal State has also formed an AI committee, with representatives from a dozen major tech companies, to identify the skills California’s workforce will need and to improve students’ career prospects.
Cal State’s expanding collaborations with the tech industry signal a profound shift in power dynamics across American universities. Traditionally, tech companies have provided infrastructure like computers and email. Now, they are being invited to serve as influential ‘thought partners,’ AI instructors, and even curriculum developers, deeply shaping the educational landscape.
This deeper integration means powerful tech firms are increasingly dictating how an entire generation learns about and uses AI. The push is proceeding with little concrete evidence of broad educational benefit, amid growing concerns that chatbots can spread misinformation and erode critical thinking skills. Partners like Amazon say their goal is to help students explore a wide range of AI tools, not just retrieve information from chatbots.

Kim Majerus, Amazon Web Services’ vice president for global education, said students in this changing environment will need to keep developing their ‘problem-solving skills, their strategic thinking, their ability to communicate.’
Cal State’s approach mirrors initiatives elsewhere. The California Community Colleges system recently announced a partnership with Google to provide AI tools and training to 2.1 million students and faculty, and Microsoft has committed $4 billion to AI skills education for students and adult workers. Critics, however, argue that Silicon Valley’s push to embed AI chatbots into education amounts to a vast, uncontrolled experiment on young people.

Since ChatGPT’s debut in 2022, widespread student use of chatbots, including for cheating, has left many institutions scrambling to set guidelines for the technology. As universities like Cal State embrace what they call an ‘AI-driven future,’ researchers warn that educational institutions risk ceding their autonomy to Silicon Valley’s agenda. Olivia Guest and Iris van Rooij, computational cognitive scientists at Radboud University, have argued against rapid AI adoption in academia: ‘Universities are not tech companies. Our role is to foster critical thinking, not to follow industry trends uncritically.’
Cal State’s AI initiative traces back to state officials fielding complaints from prominent tech companies that California students were not being adequately prepared with AI skills. Edmund Clark, Cal State’s chief information officer, confirmed the industry pressure: ‘They were getting complaints from California’s A.I. giants that we weren’t doing a good job in preparing our students for this evolving workforce.’

Cal State’s comprehensive AI initiative was officially unveiled in February. Internal university documents, obtained via records requests by a former student, reveal that leaders always intended for major tech companies to be central to this effort, allowing the university to project an image of technological leadership. One document explicitly stated the university’s intent to ‘collaborate with industry giants’ to construct an ‘AI-empowered higher education system that surpasses any existing model in both scale and impact.’
The AI initiative has met resistance from some faculty, however, particularly given the university system’s severe budget cuts. Critics called the multimillion-dollar deal with OpenAI, struck without competitive bids from rivals like Google, financially imprudent. Faculty senates on several Cal State campuses passed resolutions criticizing the initiative and the university’s inadequate response to chatbot-enabled cheating. Professors also argued that administrators’ plans played down AI’s risks to critical thinking and ignored ethical concerns about the industry’s labor practices and environmental impact.
Martha Kenney, a professor of women and gender studies at San Francisco State University, characterized the AI program as merely a ‘marketing vehicle’ for Cal State, serving to legitimize unproven chatbot technologies for tech companies. Professor Kenney sharply distinguished this arrangement from a genuine collaboration: ‘It’s not a “partnership.” If you switch out the product, we would never say, “Xerox is collaborating with San Francisco State to offer photocopiers to all the members of its community.”’
In response, Jason Maymon, a Cal State spokesman, said the university has a duty to prepare students and faculty for a fast-changing world. ‘Like the rise of the internet, artificial intelligence is another technological revolution, and higher education can’t simply stand by and watch,’ he said. Clark added that Cal State intends to teach students to critically evaluate AI and has hired an independent firm to assess the initiative’s effectiveness. While defending the OpenAI deal as ‘unusually low-priced,’ he acknowledged the contrast with California’s community college system, which secured free AI chatbot services from Google for more than two million students and faculty, nearly four times the number of users Cal State is paying OpenAI for.
The Amazon-partnered AI Camp was a cornerstone of Cal State’s broader initiative, with administrators from various campuses submitting proposals for student-led projects. The camp grew out of the Digital Transformation Hub (DxHub) at Cal Poly, a university founded in 1901 as a vocational training school, where students work with university and Amazon staff to build applications for nonprofit and government clients.


The camp’s goal was to have students from across the Cal State system use AI to solve real administrative problems, such as placing transfer students into the right math courses. Eighty students from 19 campuses, with majors ranging from computer science to zoology, came with the same objective: to pick up AI skills for their careers.

Aiman Madan, a recent computer science graduate of Cal State San Marcos, put the urgency plainly: ‘AI is the next phase of life, just like the internet, which changed everything. It’s a race, and we need to know how to get ahead.’ At the camp’s outset, Cal Poly and Amazon employees gave an overview of generative AI and its applications in fields such as business and medicine. Ryan Matteson, DxHub’s technology director, also flagged concerns like AI bias and environmental impact, advising students to ‘Make sure you are laser-focused on actually trying to solve a real problem for real human beings and not just chasing shiny technology.’

Student teams then took on a challenge: design an AI-powered video game gatekeeper responsible for vetting players seeking entry into a mystical realm. Dianella Sy’s team used Amazon Nova, an Amazon foundation model, to give their gatekeeper ‘an assertive personality in a strict and stern tone.’ Ms. Sy, 20, a computer science major from Cal State Fullerton, was excited: ‘I never built an A.I. before.’
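The students’ own code was not shared, but an exercise like the one Ms. Sy’s team described can be sketched in a few lines against Amazon Bedrock’s Converse API. The model ID, persona wording and sample question below are illustrative assumptions, not the team’s actual implementation.

```python
# Minimal sketch of a persona-driven "gatekeeper" character on Amazon Bedrock.
# Assumes AWS credentials are configured and access to an Amazon Nova model is
# enabled; the model ID, persona text and player message are illustrative only.
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")

# System prompt giving the character its assertive, strict-and-stern tone.
PERSONA = (
    "You are the gatekeeper of a mystical realm. You speak with an assertive "
    "personality in a strict and stern tone. Pose one riddle to each entrant "
    "and admit only those who answer it correctly."
)

def ask_gatekeeper(player_message: str) -> str:
    """Send a player's message to the gatekeeper and return its reply."""
    response = bedrock.converse(
        modelId="amazon.nova-lite-v1:0",  # one of the Amazon Nova models
        system=[{"text": PERSONA}],
        messages=[{"role": "user", "content": [{"text": player_message}]}],
        inferenceConfig={"maxTokens": 300, "temperature": 0.7},
    )
    return response["output"]["message"]["content"][0]["text"]

if __name__ == "__main__":
    print(ask_gatekeeper("I seek entry to the mystical realm."))
```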

Later in the week, students viewed a futuristic video from Matter and Space, an AI education startup. The video depicted Native American youth wearing AI glasses and tracking devices that continuously monitored their activities and moods. Paul J. LeBlanc, a co-founder of the startup, then informed students that AI technologies would soon surpass doctors in diagnosing illnesses, effectively displacing human medical expertise. This presentation prompted some students to draw parallels with the themes of the dystopian television series ‘Black Mirror.’ Charles Walker Cano, a biology major at Stanislaus State aspiring to be a physician, expressed his concern: ‘I don’t want A.I. to create more inequalities and disparities.’

For the culminating activity, student teams tackled campus administrative problems. One team, assigned to streamline procurement, used AI to rapidly prototype an app that automatically scores vendor proposals against university criteria. Students including Arash Peighambari, a recent computer science master’s graduate of Cal State San Marcos, valued the chance to solve practical problems for real clients. Mr. Peighambari, whose graduate research focused on building an AI system to detect power grid issues, found the AWS camp ‘more client and product and commercial-oriented,’ a distinct shift from his academic work.
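The procurement prototype is described only in broad strokes. The sketch below shows one common pattern for that kind of rubric scoring, in which a language model is asked to rate a proposal on each criterion and return JSON; the criteria and the ask_model callable are assumptions for illustration, not details from the students’ app, and the model call itself could be any text model (for example, the Bedrock helper shown earlier).

```python
# Sketch of LLM-based rubric scoring for vendor proposals.
# CRITERIA and ask_model are illustrative assumptions, not the students' design.
import json
from typing import Callable

CRITERIA = ["cost", "security_compliance", "implementation_timeline", "support"]

def score_proposal(proposal_text: str, ask_model: Callable[[str], str]) -> dict:
    """Ask a model to rate a proposal 1-5 on each criterion and return parsed scores."""
    prompt = (
        "Score the following vendor proposal on a 1-5 scale for each criterion. "
        f"Criteria: {', '.join(CRITERIA)}. "
        'Respond with JSON only, e.g. {"cost": 3, ...}.\n\n'
        f"Proposal:\n{proposal_text}"
    )
    raw = ask_model(prompt)
    scores = json.loads(raw)  # assumes the model returned valid JSON
    return {c: int(scores.get(c, 0)) for c in CRITERIA}

if __name__ == "__main__":
    # Stubbed model response so the sketch runs without any API access.
    stub = lambda _prompt: (
        '{"cost": 4, "security_compliance": 5, '
        '"implementation_timeline": 3, "support": 4}'
    )
    print(score_proposal("Acme Corp offers cloud hosting for $12,000 a year...", stub))
```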
