A Future-Proof Education: How Dickinson College Teaches Students to Thrive in the AI Age

During Dickinson’s first AI Symposium last spring, Eren Bilen, assistant professor of data analytics, presented research exploring how critical thinking impacts human-AI collaborations.

A human-centered approach to a fast-changing future

by MaryAlice Bitts-Jackson

How do you prepare students for the AI age without sacrificing the foundational skills they need to navigate it wisely? At Dickinson, the answer goes deeper than simple tech literacy and academic-integrity concerns. The college is taking a thoughtful, human-centered approach to the mighty challenges and opportunities AI presents—and defining what it means to be future-ready in the AI age.

“In true Dickinson fashion, we are leaning into AI across our curriculum, because teaching our students how to optimize its use is essential,” affirms President John E. Jones ’77, P’11. “Injecting these skills into our world-class liberal-arts education will produce graduates fully prepared to enter a constantly changing job market.”

AI AS A PARTNER

A key part of the college’s approach to AI is a new, dedicated course, AI as a Partner. Rather than wandering the wilds of AI on their own, students are invited to peek under the hood of emerging technologies and learn strategies to problem-solve with AI assistance. It’s open to students in any major who’ve already completed their first year at Dickinson.

“The course isn’t about learning a specific tool,” says James D’Annibale, director of academic technology, who developed the curriculum and is teaching the course for the second time this spring. “It’s about learning how to think with AI—how to use AI as a problem-solving partner, instead of an omniscient answer machine.”

Twenty-five students took the first class this fall. First, they learned to problem-solve by identifying root causes, considering stakeholder perspectives, identifying assumptions and anticipating second-order effects. They then learned how AI systems are trained and why AI tools behave differently.

As they practiced working with AI, the students recorded their screens and narrated their decision-making. This helped them focus on their processes, rather than their outputs, and allowed for instructor and peer feedback on how they could improve their approach.  

Students enrolled in this past fall's AI as a Partner course pose after their Dec. 15 research presentation on campus. Representing a spectrum of majors, they shared the results of their group project with the campus community.

The class culminated in a group project. Working in partnership with the college’s Center for Sustainability Education, the students used AI to analyze a huge set of data about energy use in every building on campus during the past year.

But they didn’t just load in the stats and click “enter.” The students developed a clear goal and critically examined the information they were provided. When identifying missing information that could influence individual buildings’ energy usage—factors such as building insulation, roofing types, windows and occupancy—they partnered with campus departments to gather that information themselves. They then wrote informed, contextualized prompts, keeping an eye out for common AI pitfalls. And they evaluated the AI outputs as well.

The biggest takeaway? “AI can help you move faster, but it doesn’t replace judgment,” D’Annibale says. “It can enhance work, but it can’t replace humans.”

TIME-TESTED STRATEGIES FOR A NEW ERA

That human-centered idea lies at the heart of the college’s overall approach to AI as well. Dickinson isn’t banning AI outright, and it isn’t adopting AI uncritically. Instead, Dickinson asks: What, exactly, do students need to learn in order to thrive and lead in a future spurred by technological change?

“AI is raising fundamental questions for higher education—about what we teach, how we teach and what skills our students need—and we’re not shying away from that,” says Renée Cramer P'28, dean of the faculty and provost. “Our goal is to help students understand when AI gets in the way of learning and working, when it is useful and how to tell the difference.”

And while that challenge might be new, the solution is time-tested. Dickinson continues to emphasize fundamental skills like critical thinking, clear communication and interdisciplinary approaches to problem-solving—all essential for effective AI use, development and management. The college also maintains small class sizes and a culture of mentorship, ensuring that students continue to benefit from human-led personalization.

Dickinson's rigorous writing and language requirements and Dialogues Across Differences initiative provide the raw materials for productive collaboration—a must for developing and rolling out strategies on uncharted terrain. The college’s Ethics Across the Curriculum program brings ethical literacy into the mix, ensuring that students have a framework to address the thorny ethical issues AI creates.

And, critically, Dickinson students must complete introductory liberal-arts courses—like the first-year seminar, which helps build strong research and writing skills—before they take the college's new dedicated class on AI. Only then will they be able to use AI to greatest effect.

EXCITING APPLICATIONS

Across campus, students and professors are putting that multilayered approach to workplace readiness into practice in exciting ways.

Akiko Meguro, senior lecturer in Japanese, uses traditional methods to teach students grammar and syntax—and then invites students to use a proprietary chatbot to practice conversational skills outside of class. Assistant Professor of English Chelsea Skalak and her students use large-language models to winnow medieval texts. Data-analytics faculty are continually adding new technologies to the curriculum.

Students across majors are also completing class and research projects that develop workplace-ready skillsets and experiences. Some, working closely with faculty, are using AI to analyze large and complex data sets that would otherwise take years to comb through.

Epitomizing AI-enhanced active learning at Dickinson, students learning about forensic psychology recently used AI to practice interviewing techniques—then shared what they’d learned during the college’s AI Symposium. Math majors use AI tools to visualize abstract concepts and investigate the limitations and real-world applications of mathematical theories. Several students deepened their understanding of AI’s potential to streamline processes last summer through internships with proprietary AI use baked in.

Dickinson computer-science students are diving even deeper. This past fall, two won an award at the world’s largest collegiate hackathon for their AI-powered tool. Hemanth Kapa '27 partnered with John Lee ’27 to streamline blood-sugar management for people with diabetes. 

That honor arrived just a few months after Kapa participated in Dickinson’s 2025 AI Symposium, which highlighted how students, faculty and staff are incorporating AI tools in ways that enhance teaching and learning. During that daylong event, Kapa shared findings from his campus survey investigating how instructors’ AI policies influence students’ use of AI in their classes. Afterward, he said he appreciated the chance to bring student voices into the conversation.

“It wasn’t just about presenting findings, but it was about bringing students, faculty and staff together to reflect, question and collaborate on ideas about how to embrace new tools while staying rooted in the values of critical thinking, creativity, and integrity,” Kapa says.

CLEAR GOALS, FLEXIBLE METHODS

These collaborations and innovations are only possible because of (human!) professors who are passionate about their work—and because of forward-looking leadership that supports professional development and cross-campus collaboration.

Faculty workshops, study groups, symposia and one-on-one consultations with Dickinson’s academic-technology department help educators brainstorm and realize innovative projects that enhance traditional teaching and learning, rather than replace it. Professionals in Dickinson’s Center for Teaching & Learning and Multilingual Writing Center help faculty with writing pedagogy and provide faculty and students with strategies to prevent misuse of generative AI.

New academic-integrity policies address AI explicitly, lighting the way for faculty and staff. This includes a syllabus statement on AI: Faculty are strongly encouraged to state clearly in their syllabi when AI use is and is not permitted so that students don’t need to guess about whether and how they may use AI for a given class.

Dickinson leaders also recognize that not every profession or academic discipline is affected by AI technologies to the same degree and in the same ways. Each academic department is therefore empowered to develop its own AI guidelines that dovetail with overarching collegewide recommendations but also reflect their students’ current learning needs and their future work.

“Individual faculty members may decide to bring AI into class projects in creative ways, or they may decide not to use it at all,” Cramer says. “What we all agree on is that we want to challenge them, we need them to be aware of ethical concerns related to the use of AI and we want our students to learn fundamental skills.”

Published January 26, 2026