
Braid, Don’t Build: Why Community Colleges Must Stop Siloing AI and Start Integrating It

GenAI Trends in Community College Education

Nancy Miller, AI Innovator, Forsyth Technical Community College

I sat in a keynote recently and felt that particular kind of discomfort that only comes when someone names a problem you have been watching build quietly, systematically, and at scale across the institutions you care most about.

The presenter described a pattern that is accelerating across community college systems nationwide. Colleges, responding to the undeniable urgency of the AI moment, are doing two things simultaneously: launching new, standalone AI programs with significant fanfare, and quietly allowing or actively engineering the enrollment decline and eventual elimination of the very programs those AI courses are supposed to complement. Web Development. Computer Programming. Programs built over decades, staffed by expert faculty, serving the exact populations community colleges exist to serve.

We are, in the name of innovation, cannibalizing our own infrastructure.

I need to say that plainly, because I do not think we are fully reckoning with it.

The Silo Fallacy: Why Standalone AI Programs Are Not Enough

Let me be clear about what I am not arguing. I am not arguing that community colleges should avoid building AI-specific programs. There is a legitimate and urgent need for machine learning technicians, data annotators, AI model auditors, prompt engineers, AI risk managers, and applied AI specialists, and two-year colleges are uniquely positioned to train that workforce. That work matters.

What I am arguing is that the creation of those programs cannot come at the expense of the programs already delivering on the community college promise. And yet, in institution after institution, that is precisely what is happening. Enrollment pressures, budget constraints, and the gravitational pull of “AI” as a branding opportunity are converging in ways that are quietly devastating programs with proven workforce outcomes.

The structural problem is this: when institutions treat AI as a subject rather than a skill layer, they build silos. They create “Introduction to Generative AI” as a standalone elective that nursing students, IT students, and business students may or may not elect to take, rather than weaving AI competencies into the fabric of every program. They launch an “AI and Data Technology” degree while the Computer Programming program down the hall loses three sections. They announce a partnership with a major tech firm to deliver AI boot camps while the faculty in Early Childhood Education are still waiting for guidance on whether students can use AI tools at all.

This is not integration. This is substitution dressed up as innovation.

What “Braiding” Actually Means

I use the term braiding deliberately, because it captures something that integration does not. Braiding implies that each strand retains its identity, its strength, and its purpose, and that the result is more resilient precisely because the strands are interwoven rather than replaced. You do not braid by removing threads. You braid by learning how they work together.

In practical terms, braiding AI into existing programs means asking a different set of questions at the curriculum design table:

  • In the Networking Capstone: How does a network technician use AI-assisted log analysis, anomaly detection, and automated remediation workflows? What does responsible AI-augmented network monitoring look like? (These are not hypothetical competencies; they are industry expectations today.)
  • In Medical Assisting: How does a clinical assistant navigate AI-generated patient summaries, flag potential AI errors in documentation, and communicate AI tool outputs to supervising clinicians with appropriate epistemic humility?
  • In Early Childhood Education: How do early childhood educators evaluate AI-generated developmental screening tools for bias? How do they communicate to parents about the appropriate and inappropriate use of AI in child assessment?
  • In HVAC Technology: How does a service technician use AI-powered diagnostics, predictive maintenance alerts, and natural language troubleshooting assistants, while retaining the hands-on judgment that no model can replicate?

In every case, the answer is not a new program. It is a redesigned course with updated learning outcomes, an AI-aware assessment strategy, and faculty who have been equipped, not just exposed, to facilitate that kind of learning. The most forward-thinking institutions are already recognizing that AI work cannot happen in silos, establishing cross-functional structures that simultaneously address infrastructure, data governance, security, curriculum, policy development, and student engagement.

That is the model. Not a new AI building erected next to the old vocational wing. A rewired institution.

The Enrollment Trap and the Program Death Spiral

Here is the mechanism I am most concerned about, because it operates beneath the level of deliberate decision-making.

When a college launches a new AI program, especially one with industry partner branding and a press release, it draws enrollment. Students who might have entered Networking, or Information Systems, or Business Administration, now enroll in “AI and Data Technology.” The existing programs see enrollment dip. Dipping enrollment triggers low-enrollment reviews. Low-enrollment reviews generate recommendations for consolidation or elimination. Programs are sunset. Faculty lines are not replaced. Institutional knowledge, sometimes decades of it, walks out the door.

Meanwhile, the new AI program, stripped of the contextual industry depth that made the legacy programs valuable, produces graduates who understand how to prompt a model but cannot configure a switch, read a balance sheet, or manage a patient intake workflow.

As more firms eliminate jobs because of AI, especially at the entry level, community colleges need to “be realistic” about which jobs will still exist and which skills remain marketable. That realism cuts both ways. It means we should not train students for roles that are genuinely disappearing. But it also means we should not eliminate programs training students for roles that are transforming, where the human expertise remains essential, augmented by AI rather than replaced by it.

The vast majority of the occupations served by community college technical programs fall into the second category. And we are eliminating them as if they belong to the first.

The Equity Dimension We Are Not Talking About Enough

There is an equity argument embedded in this structural critique that the field is only beginning to surface.

Community colleges serve high proportions of low-income and first-generation students, and leaders are acutely aware of the risk that AI’s rapid rise could further stratify opportunity rather than democratize it. But the equity conversation in our sector has been almost entirely focused on access to AI tools: do our students have accounts, devices, connectivity, and permission to use these platforms?

That is the wrong level of analysis, or at minimum, it is an incomplete one.

The deeper equity question is: who gets the integrated AI education? When AI competencies are siloed into standalone programs, programs that often require prerequisite math, prior tech experience, or full-time enrollment patterns that working adult students cannot sustain, we are building a two-tier AI education system inside the very institutions designed to dismantle educational stratification.

The student who takes Certified Nursing Assistant in the evenings while working full-time does not have the schedule flexibility to add a standalone AI elective. But she absolutely needs to know how AI is reshaping clinical documentation, care coordination, and patient communication. She needs that knowledge in her CNA program. Braiding it in is not a luxury; it is an equity obligation.

What Systemic AI Governance Actually Requires

The institutions getting this right are not doing so by accident. They are doing it because someone at the institutional leadership level made an explicit decision that AI would be treated as a cross-cutting competency rather than a departmental specialty. Mississippi Gulf Coast Community College, for example, revamped its required computing course within the general education core, shifting the focus toward AI and cybersecurity while still covering essential computing skills, ensuring that every student in the system would have a foundational understanding of AI, how to use it, and how to use it ethically.

That is a governance decision, not a curriculum decision. And it required institutional courage.

In my work across community college systems, I have articulated this through a Five-Part AI Governance Model: Communicate, Equip, Train, Govern, Partner. Every element of that framework assumes that AI is embedded in the institution’s existing work, not offloaded into a new program that allows everyone else to abdicate responsibility for engaging with it. Governance that lives only in a new AI program is not governance. It is containment.

The braiding imperative belongs explicitly in the Equip and Train dimensions: equipping faculty in every discipline to integrate AI-relevant outcomes into their existing courses, and training them not just on the tools but on the pedagogical judgment required to do that well. It belongs in the Govern dimension as well: curriculum review processes must explicitly ask, for every program under review, whether AI has been braided in before any enrollment-based elimination decision is made.

A program that is struggling with enrollment because it has not been updated in five years to reflect AI-augmented industry practice is not a failing program. It is an underfunded, under-supported program that deserves redesign, not elimination.

A Call to Action for Curriculum Leaders, Deans, and System Administrators

If you are reading this in a curriculum committee meeting, a dean’s council, or a system-level workforce development planning session, I want to offer you a concrete diagnostic question:

Before we eliminate any technical or career program for low enrollment, have we asked whether the program has been redesigned to reflect AI-augmented industry practice and have we given faculty the professional development resources to make that redesign possible?

If the answer is no, then enrollment numbers are not measuring program value. They are measuring institutional neglect.

The community college sector is at a genuinely consequential moment. The coming AI economy will produce a new and consequential divide, not between those with college degrees and those without, but between those who are trained to work with AI and those who are not. We have the opportunity to be the institutions that close that divide at scale, across every sector, for every learner who walks through our doors.

But not if we silo the technology, defund the programs, and call it transformation.

Braid, don’t build. Integrate, don’t eliminate. Redesign before you retire.

The work is harder that way. It is also the only way.

Nancy Miller is a Professor of Information Technology at Forsyth Technical Community College and AI Innovator in Residence for the North Carolina Community College System. She publishes GenAI Trends in Community College Education on Substack.
