The library hosted “AI Made Simple: A Workshop for the Curious Mind” on Oct. 16 to teach students how to use artificial intelligence as an academic tool while considering the ethics of AI.
Librarians Rochelle Perez and April Austin led the workshop with 12 students in attendance.
“AI’s already affecting not just the faculty, but also the students in teaching and learning,” Perez said. “It’s important that students are aware of what’s out there and the development of it.”
AI can be used as a tool to help students evaluate new information, bad information and sources on the internet, Perez said.
“There’s some faculty allowing different ways of using AI,” Perez said. “Some can be very restricting, some faculty are permissive and some are in the middle.”
Perez and Austin discussed various productivity tools for students, the top three being Grammarly, QuillBot and Turbolearn.ai.
“It is important that people are able to give credit where credit is due,” Perez said. “There’s a lot of information and because of AI, there’s a tendency where people steal and claim it as their own.”
By including role, context, task and format, students can engineer prompts that produce more useful output from generative AI, Perez said.
AI models absorb the stereotypes, discrimination and bias that pop up in internet forums, such as Reddit, Austin said.
“An example of it is from West Point. They used Copilot to generate images of what a cadet at West Point, which is a military academy, would look like,” Austin said. “They did it in different styles, but you’ll notice every single one of them is a white male.”
Copilot is an AI tool developed by Microsoft that students can access with their student ID number.
The rest of the students at West Point were not represented in the generated images, Austin said.
“One of the things I want to think about when I’m using generative AI is ‘Who owns the stuff once I post it?’” Austin said.
The Los Rios Community College District shares a corporate license for Copilot, which ensures that information a student shares remains private and the student’s intellectual property, Austin said.
“If you put your medical records into ChatGPT, then OpenAI owns that information now and they will use it to generate other people’s responses,” Austin said. “There’s no accountability in how they are going to use it.”
AI may also generate inaccurate information, so students should ultimately verify what they find with a human source, Austin said.
“There’s an overhanging question of how they’re making money,” Austin said. “They’re relying on investment capital and not producing profit.”
AI tools may not be transparent about the data they were trained on, so a student doesn’t know where a source is coming from, Austin said.
“If you are having a mental health crisis, go to a human and not ChatGPT,” Austin said. “It is using statistics to figure out the next word, it’s not using morality or ethics.”
Timothy Baker, a 29-year-old film major, said he attended the workshop to take initiative to learn more about AI.
“I find myself in a situation where AI is extremely prevalent in everything I do, but I’m completely naive to it,” Baker said.
Baker said he uses Siri and hears people talk about ChatGPT.
“I’m learning that AI can be used as a short-term, quick fix,” Baker said. “It doesn’t mean your job is complete when you use an AI prompt.”
Baker said after attending the workshop, he plans on using prompt engineering to generate a concept as a guideline for his papers.
“I don’t want to be fully AI,” Baker said. “I’d rather take the advice of the librarians here in this workshop and still use my critical thinking skills and not use robots to replace my brain.”
The next AI workshop will be hosted on Oct. 30, starting at 11 a.m. in the Cosumnes River College library computer lab L-245.