Hancock College Hosts Inaugural AI Summit Highlighting Ethics, Innovation, Education

Allan Hancock College held its inaugural Artificial Intelligence Summit on Friday, ushering in a new era of conversation and collaboration around AI’s transformative role in education, workforce development, and society at large.

Hosted at the Boyd Concert Hall on Hancock’s Santa Maria campus, the day-long event brought together more than 200 students, educators, technology experts, and community members for a series of keynote presentations, panels, and hands-on workshops.

“This event happened through our connection with the Chancellor’s Office, our connection with our regional consortium — which is the eight community colleges in our vicinity — who have championed AI. We see it as so necessary for skill development for our students,” said Nancy Jo Ward, a faculty member in media arts at Hancock and one of the summit’s lead organizers.

The summit opened with a welcome address from Hancock College Superintendent/President Kevin Walthers and representatives from the California Community Colleges Chancellor’s Office, followed by keynote presentations from LinkedIn and Moorpark College speakers. The morning sessions delved into how AI is reshaping the classroom and job market, setting the tone for a day of reflection and engagement.

Keynote speaker Don Daves-Rougeaux, a senior advisor for workforce development at the Chancellor’s Office, underscored the essential role of AI across all areas of higher education. “AI is here; it’s in everything we are doing now, and it’s really critical for us to explore the use of AI in our operational areas, our curriculum development, our teaching and learning, our student support and even our infrastructure,” Daves-Rougeaux said.

A highlight of the summit was the mid-morning panel discussion, where academic leaders and technologists tackled pressing questions about AI’s ethical use, its integration into instruction, and the responsibility institutions have in preparing students to use AI tools wisely.

“We have to teach our students AI literacy. Everyone should go through it. It should be part of their freshman seminar, part of onboarding. Every student should do a module on this,” said Dr. Alegría Ribadeneira, professor and department chair at Colorado State University, Pueblo.

Echoing the call for AI transparency, Jason Gulya, PhD, professor of English at Berkeley College, emphasized modeling over monitoring. “We really need to give students models of ethical behavior. I don’t think we should go too far down the path of AI detection or even process-tracking,” he said. “I actually show my students how I use AI in my own writing process … I ask them to break their process down and give me an AI transparency statement.”

Panelists agreed that while generative AI tools like ChatGPT are powerful, students often misuse them, sacrificing their authentic voices. “When they get this technology and start experimenting with it, [students] don’t use it particularly well. They often give up their voice in the process,” Gulya added.

Workshops later in the day provided hands-on exploration of topics like bias in machine learning, classroom applications of AI, and ethical data use. The sessions built on the summit’s central theme: fostering AI literacy through openness, experimentation, and critical thinking.

“Humans have to intervene with any kind of AI result. It’s not just about accepting what comes — we have to read it, see if it’s correct, maybe adjust it, make it our own,” said Ward, who has included an AI policy in her syllabus for the past two years.

Another critical voice, Trudi Radtke, instructional technologist at Moorpark College, reminded attendees of the importance of transparency in AI systems themselves.

“Claude … is run by Anthropic. They’re more transparent about their training data, their models, and the architecture. We know more about how it works, and that makes a difference,” they said.

Panelists also examined how pedagogical shifts — like flipped classrooms and collaborative learning — can help reduce overreliance on AI and build real-world skills.

“We’ve seen that flipping the classroom … can help facilitate creative thought,” said Cecily Hastings, of LinkedIn Learning and the Council of Chief Librarians’ AI Task Force.

Librarian Danielle Kaprelian also offered practical guidance for navigating the evolving norms around citing AI: “There are current guidelines for citing AI … You could cite from AI for maybe five to six sentences, but you don’t want the entire paper to be AI-generated.”

As part of the summit, students in Hancock’s Media Arts Club created an AI art pop-up gallery in the foyer of the college’s Fine Arts Complex. The students’ art will be on display through April 25.

For more information about the AI Summit and upcoming professional development events at Allan Hancock College, visit www.hancockcollege.edu.