Students and Professors Speak On AI

Catherine Lang | September 9, 2025


Many students at Warren Wilson College (WWC) avoid using ChatGPT and similar artificial intelligence (AI) tools, citing their environmental costs and their effects on critical thinking skills. Through a digital survey, 137 students were polled on their use of ChatGPT.

Seventy-two percent of students reported that they avoid ChatGPT entirely. Campbell Humphrey, a new transfer student, developed a critical perspective on AI after the company xAI began construction of a supercomputer and data center in her hometown of Memphis, Tennessee. 

“It was not announced publicly until they had already started to build [the data center], and there [were] a lot of under the table deals with the city,” Humphrey said.  “So the citizens of Memphis didn't get much of a heads up on this until it was already being built.”

For Humphrey, choosing to avoid AI is her response to the ecological footprint of Memphis’s data center, which requires large quantities of clean water to cool its computers and relies on smog-emitting gas combustion turbines to power them.

“This is in a very poor neighborhood in Memphis where all this industrial pollution has already caused a lot of health issues for the people who lived in that neighborhood,” Humphrey said. “There's a much higher rate of asthma in those neighborhoods than in the rest of the city.” 

In June 2025, the Southern Environmental Law Center, on behalf of the NAACP, sent a letter to xAI expressing its intent to sue over the company’s alleged violations of the Clean Air Act.

Humphrey said that she did not anticipate seeing a strong anti-AI student culture when she arrived at Warren Wilson. 

“I was actually surprised about how many people I've spoken to who are also against the use of AI,” Humphrey said. “From my experience, a lot of people who aren't directly impacted by it don't tend to have very strong opinions on it, and often don't mind using it. I was very happy to see that there were a lot more people than expected who were very against using AI.”

Among students who reported using the platform occasionally, most placed their usage at less than five times per month. Two percent of students reported using the platform every day. In an open-ended prompt, several students expressed that while their use of ChatGPT was infrequent, Google’s built-in AI search assistant makes engaging with AI difficult to avoid.

When asked to describe their political views on AI, 91% of students cited concern about its environmental cost, a group overlapping with the 75% of students concerned about anticipated cuts to the job market. Four percent of students did not think they were informed enough to comment, and 3% shared that they were indifferent to the subject. Twelve percent of students believe that AI is a positive innovation that will serve future generations.

In “The Climate Cost of AI,” a webinar sponsored by WWC, Dr. Ahmad Antar of Digital Emissions claimed that over one trillion dollars has already been invested in AI data centers. As global societies enter the age of AI, communities must decide how they will interact with, or resist, the technology.

Many WWC students expanded on their perspectives through anonymous comments. 

“I think that like any technology, AI comes in a variety of forms and can be used in a variety of ways,” wrote one student.  “The way that it is being integrated despite environmental costs is heartbreaking, but also inevitable. While I avoid using it for many reasons, I think the next societal moves are to develop it into something more energy efficient and concrete, well-established, meaning it's time to make moves with regulations before it's even more privatized.”

Ninety-one percent of students claim they use ChatGPT less than their peers, compared to 4% who believe they use the service more often than their peers. Three percent of students said that they have submitted schoolwork written by ChatGPT, and 2% reported that they rely on it to meet their goals at school. The most common AI service used after ChatGPT was Grammarly.

Ben Feinberg, a professor of anthropology at WWC, believes consistent use of AI is undermining the values of higher education. Feinberg countered the idea that Grammarly, a popular AI-powered editor, is a benevolent educational resource. 

“People say, ‘Oh, I'm just using Grammarly,’ but [Grammarly’s] advanced features rewrite [papers],” Feinberg said. “They're not just proofreading.”

For Lena Nelson, a senior working as an anthropology tutor, ChatGPT’s use in the classroom represents a decline in students’ engagement with their school community. Nelson said that her department saw a record-low number of students seeking academic assistance last semester, with only two students utilizing the free service.

“It is so much easier [to use ChatGPT] than to talk to a tutor, and so nobody gets tutoring,” Nelson said. 

For some of her peers, AI is simply a tool, not something to categorically reject as evil or immoral. Nelson sympathized with this attitude but emphasized that ChatGPT cannot substitute for the social element of direct peer support.

Professors at WWC have continued to adapt their teaching praxis since ChatGPT’s public launch in November 2022. While the school’s administration holds no explicit AI policy, professors were provided with a syllabus template offering three options: strict (no use of AI for class assignments), moderate (AI can be used in some situations but not others), and open (no defined limits on AI use). Individual professors weighed both personal convictions and academic standards in making their choices.

Dr. Hayley Joyell “Dr. J” Smith, a visiting assistant professor of environmental science, took a new approach for her first year of teaching at WWC. Since her last college class in 2023, she has seen the availability of open-source AI spread dramatically. Smith does not think that a strict policy on ChatGPT will prevent students from using it in her class. Instead, she is shifting her assessments from Moodle to in-class blue books.

“It'll be interesting to see if colleagues might follow suit, just going back to having students do things [with] pen and paper, where AI is not an option,” Smith said.

Feinberg would like to engage students in thinking critically about AI. In a previous anthropology class, he asked students to critique AI-generated writing. With several exceptions, he believed most of the class did not understand the purpose of the assignment. Feinberg also reflected on AI’s place in the arc of technological shifts at WWC.

“I've been at Warren Wilson for a really long time, and our students have generally been late adopters,” Feinberg said. “The first flip phones started to appear here several years after everyone else had them, [along with] the first smartphones. Student culture tended to shame people who were using their smartphones in public in the initial period. So just as smartphones became taken for granted here despite the late adoption, [...] once people get accustomed to [AI], it’s hard for them to pull back.”

Within the sample of WWC students who participated in the survey, a majority expressed negative views on AI. However, the 137 students polled represent less than 20% of the total student body. Within a school culture that favors progressive politics, students who hold contrarian views may feel obliged to withhold their true opinions. For these reasons, it is challenging to capture the full range of student perspectives. This article is a snapshot, not a comprehensive report. In consideration of its limits, any student or faculty member who would like their voice heard in a follow-up article may contact the author.

Author’s note: The interviews for this article were transcribed using Otter.ai.
