Artificial Intelligence

Colleges’ Strategies on AI Really Should Be Comprehensive, Not Piecemeal

March 10, 2026

What happens to a college education when a chatbot can draft an essay, summarize a reading and generate computer code in seconds? The arrival of artificial intelligence in college classrooms has been swift and, for many schools, disorienting.

As professors of economics and business management, and of biology, at liberal arts colleges, we are confronting a question that now cuts across all colleges and universities: What is the purpose of a college education when AI is rapidly reshaping how students think, learn and prepare for careers?

This article by Vicki Baker and Linda M. Boland originally appeared on The Conversation, a Social Science Space partner site, under the title “Colleges face a choice: Try to shape AI’s impact on learning, or be redefined by it.”

While much of the public debate has focused on plagiarism and credit for student work, the deeper issue extends beyond rule-setting.

Across higher education, most schools have issued guidance on how students should use AI rather than adopting sweeping mandates.

Liberal arts colleges, like the University of Richmond, Bard College and Trinity College, tend to emphasize the importance of students using AI ethically and responsibly, and typically allow students to use AI when they cite it and their instructor permits it. These schools also allow professors to individually determine their own AI policies.

A 2024 study of 116 research universities found similar patterns, with instructors largely determining course policies and few campus-wide bans.

What’s unsettled is not whether students can use AI, but how institutions want students to use it. In our view, unless colleges clearly shape AI’s role in teaching and learning, fast-moving technologies may begin to redefine education by default. The risk isn’t more AI, but a gradual shift in what counts as learning.

Students may spend less time asking hard questions, making their own judgments and building real expertise. In that case, college risks becoming less about understanding and more about producing papers and other content quickly.

Letting AI into the classroom

When generative AI tools first became widely available in late 2022 and early 2023, most professors focused on detecting and preventing their use in student work. They looked for signs of AI use, including generic phrasing, fake citations, sudden shifts in tone or unusually polished writing that didn’t match a student’s prior work. Some faculty also used AI-detection software to identify computer-generated text.

But it is often difficult to tell when someone has used AI, in part because the detection software is unreliable. As a result, many faculty have shifted from bans to more structured guidance.

Some faculty now allow students to use AI for specific tasks, such as brainstorming, outlining or debugging code.

The rationale is practical: AI is everywhere and already embedded in professional settings. College graduates are likely to use AI in the workplace.

Accepting AI is here to stay

More recently, college faculty at a range of schools have shifted the focus from whether students are using AI at all to whether students using AI can still analyze, question and justify their own research and conclusions.

At the University of Michigan, for example, some faculty are redesigning assessments to include live debates and oral presentations.

And across the U.S., professors are reviving oral exams, since live questioning makes it harder for students to rely solely on AI. Students must then verbally explain their reasoning and defend their work.

Different academic fields, though, are approaching AI in various ways.

Many business programs, like the University of Pennsylvania’s Wharton School, have moved quickly to bring AI into coursework and degree programs, often framing these efforts as workforce preparation.

A recent analysis of more than 31,000 syllabuses at a large research university in Texas showed a growing number of faculty in the fall of 2025 allowed students to use AI. Business courses allowed the greatest use of AI, while humanities courses allowed it the least. The physical and life sciences fell in between.

Across disciplines, AI was most often allowed at this school for editing, study support and coding. It was most commonly restricted for drafting, revising and reasoning or problem-solving.

AI’s role in higher education is not settled. Instead, it is evolving, shaped by differing academic cultures.

Different schools, different approaches

Colleges’ and universities’ institution-level approaches to AI vary as well.

Research universities like Carnegie Mellon University and Stanford University are expanding on their long-standing investments in AI, moving quickly to develop new research centers, hire faculty with AI expertise and create new degree or certificate programs.

Liberal arts colleges are moving too, but often with a different emphasis.

The Davis Institute for AI at Colby College supports AI work across disciplines through new courses, faculty development and entrepreneurship. At the University of Richmond, a new center links AI to critical thinking and human values, so students can study AI’s impacts and help shape it intentionally.

All of these schools are determining AI policy course by course. But these plans are not part of a comprehensive, school-wide strategy.

Few schools have articulated coordinated, institution-wide plans on AI. Arizona State University is one example of a broader AI integration strategy, which spans academics and campus operations.

Comprehensive AI strategies are expensive. Meaningful integration may require campus licenses for AI services, upgraded computing systems and faculty training. These investments are difficult at a time when many colleges face enrollment declines and financial strain.

Public trust in higher education is another concern that makes enacting broad change difficult. Gallup surveys in 2023 and 2024 found that only 36 percent of Americans had high confidence in colleges and universities.

Against this backdrop, AI is raising questions about how colleges prepare students for their careers. Employers still prize critical thinking and communication. Yet generative AI can mimic the appearance of thinking even when real understanding is absent.

The tension is clear: If AI does the writing, coding or analysis, where do students do the thinking?

Rethinking learning

Rising use of AI is forcing colleges and universities to revisit what students should learn, how to measure this and the enduring value of a college degree.

That shift moves the conversation beyond course-by-course changes to a shared strategy on what forms of knowledge and thinking are developed in college. Colleges may redesign assignments, expand oral and project-based assessments, and integrate AI literacy across disciplines. They may also clarify learning outcomes, invest in faculty development and find new ways to document students’ judgment and problem-solving in an AI-assisted world.

The question is no longer whether AI belongs in higher education. The real question is whether colleges and universities will shape its role – or allow AI to quietly reshape them.

Vicki Baker (pictured) is the E. Maynard Aris Endowed Professor in Economics and Management at Albion College, where she serves as associate dean, strategic partnerships & innovation, and chair of the Economics & Management Department. Her books and monographs include Charting Your Path to Full: A Guide for Women Associate Professors, Success After Tenure: Supporting Mid-Career Faculty, Developing Faculty in Liberal Arts Colleges and a New Directions in Higher Education volume, Bridging the Research-Practice Nexus: Resources, Tools, and Strategies to Navigate Mid-Career in the Academy. Linda M. Boland is professor of biology at the University of Richmond, where she recently served as associate provost for faculty. Boland currently leads a professional development program for department chairs through the Associated Colleges of the South, serves as the University of Richmond’s institutional co-representative to the ACE Women’s Virginia Network and co-chairs a Women in Leadership group for staff and faculty.

