Southern Illinois University
Carbondale AI Task Force
An independent review of the SIU System AI Implementation Plan — examining its current strengths, areas for development, and opportunities to strengthen the proposal before it advances.
Current Implementation Status
The plan is actively being implemented as of February 2026. Goal teams are formed and recruiting, with co-chairs confirmed and chancellor-level support engaged. A Board of Trustees presentation is being planned for April 2026. The goal numbering issues present in the original December 2025 draft appear corrected in the current working version.
Goal team staffing is uneven. Goal 1 has a full team and a named lead. Goal 2 has 2 members for 4 objectives. Goal 3 has 2 members. Goal 4 currently has no team members — the most urgent organizational gap before the April BOT deadline.
Faculty Capacity: The Foundation of Every Goal
The Task Force Report explicitly identified faculty hiring at SIUC as "critically important," noting that the campus has fewer AI faculty than R1 peers. SIUC's School of Computing currently has approximately 9 faculty with AI or machine learning as a primary research area — a strong foundation, but one that cannot support a simultaneous expansion of a bachelor's program, a research collaborative, and institution-wide curriculum redesign without additions.
For comparison, the Siebel School at U of I hired 16 new faculty in 2024 alone and 18–19 more in 2025 as part of a sustained AI growth strategy. Peer institutions including Boston University, UGA, and Penn State have all launched formal AI cluster hiring initiatives. Adding faculty hiring as an explicit objective — with a target number and timeline — would give every other goal in this plan a more solid foundation.
Add faculty hiring as a named objective. Even a modest 3–5 targeted cluster hires across Computer Science, Education, and a professional school would meaningfully expand capacity for the bachelor's program, research, and redesign work simultaneously.
[Chart: Estimated AI-focused faculty, SIUC vs. R1 peer range. Sources: SIUC CS, Siebel School.]
What's Still Missing
The current plan addresses several concerns raised in the original Task Force Report. Goal 2 now includes a stackable certificate pathway (Obj 2.1), the AI+ bachelor's program (Obj 2.2), a systemwide AI minor (Obj 2.3), and GE integration (Obj 2.4) — a meaningful response to the undergraduate gap. The governance structure is actively forming with co-chairs and goal team leads. Two gaps remain worth naming explicitly.
Consider expanding Obj 3.4 to explicitly include student retention analytics alongside enrollment forecasting. Consider adding a Goal 2 objective in partnership with the Graduate Schools targeting a concrete graduate-level credential by AY 2027–28.
Goal-Level Opportunities to Strengthen
The plan proposes using Google Microcredentials and LinkedIn Learning for AI literacy training. Micro-credentials are reasonable starting points and widely available, but these particular products are general-audience tools not tailored to higher education contexts. For the credential to carry weight with employers and regional accreditors, SIU would benefit from either developing its own curriculum or co-developing content with a partner better matched to the academic context.
Consider piloting the off-the-shelf tools in Year 1, then evaluating whether SIU-developed or co-developed content is warranted based on completion rates and employer feedback.
Goal 3 is the broadest in student data scope, covering four distinct systems: AI chatbots for student services (3.1), AI peer tutoring in writing, math, and gateway courses (3.2), degree planning and advising copilots (3.3), and predictive analytics for enrollment and resource planning (3.4). Each of these systems touches student records in different ways — admissions data, academic history, course performance, financial aid — and each carries distinct FERPA implications. The Future of Privacy Forum's 2024 AI-FERPA compliance checklist and the EDUCAUSE 2024 AI Action Plan both provide vendor evaluation frameworks that should be applied to each system independently, not once at the goal level.
The plan currently has only 2 team members for these 4 objectives. Registrar, VC of Student Success, and IT have been identified as target additions — the right offices, and recruiting them quickly will be important given the 2027 timelines on multiple objectives.
Add a FERPA/data governance review gate before each student-facing system enters production — not a single gate for the goal. Prioritize recruiting from Registrar, Student Success, and IT to bring the team to functional size before milestone planning begins.
Goal 4 currently has no team members — the only goal in this state — and has been identified as the area most appropriately led by the Graduate Schools. The objectives are substantial: a systemwide research collaborative and faculty directory (4.1), competitive seed grants for cross-campus AI research (4.2), an AI innovation lab and GPU cluster plan (4.3), and coordinated external grant and industry partnership development (4.4).
On infrastructure: the plan targets a GPU cluster for AY 2029–2030 at $1.5M–$3M. For context, Princeton's 300-GPU H100 cluster (2024) required a multimillion-dollar endowment investment, with individual H100 GPUs priced around $30,000 each. The budget should also account for power, cooling, and ongoing staffing. Cloud platforms — AWS, Azure, Google Cloud — all offer academic research credit programs and could bridge the gap while the cluster is planned and procured.
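As a rough sanity check on that budget range, the arithmetic below shows how many H100-class GPUs the AY 2029–2030 target could cover on hardware alone, using the ~$30,000-per-GPU figure cited above. This is an illustration, not a procurement estimate; power, cooling, networking, storage, and staffing would reduce the count an actual budget supports.

```python
# Hardware-only scale check for the proposed $1.5M-$3M GPU cluster budget,
# using the approximate per-H100 price cited in the text.
H100_UNIT_COST = 30_000                    # ~price per H100 GPU (cited above)
budget_low, budget_high = 1_500_000, 3_000_000

gpus_low = budget_low // H100_UNIT_COST    # GPUs at the low end of the budget
gpus_high = budget_high // H100_UNIT_COST  # GPUs at the high end

print(f"Hardware-only: {gpus_low}-{gpus_high} H100-class GPUs")
```

At 50–100 GPUs versus Princeton's 300, the budget buys a meaningful but modest cluster, which strengthens the case for cloud research credits as a bridge.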
Goal 4 needs a team before any of its objectives can move. Engaging Graduate School leadership at each campus — identified as the right entry point — is the immediate next step. Cloud research credit applications can begin without a full team and would demonstrate forward motion on Obj 4.3 while the cluster plan develops.
The plan would be stronger with a brief competitive context section. U of I is aggressively expanding AI capacity; Illinois State has launched an NIH-funded AI health research lab; NIU has a dedicated AI initiative with industry engagement programming. Articulating SIUC's specific strengths and differentiated position — not just listing goals — would help make the case to leadership and prospective students for why SIU's approach is worth investing in.
A one-page competitive context appendix — covering what peers are doing and where SIUC has distinct advantages — would significantly strengthen the proposal for a senior audience.
[Chart: Milestone distribution as currently written in the plan, compressed into 2027 with no quarterly breakdown.]
Faculty Development: A Coherent Strategy
Four interrelated mechanisms are missing from the plan but are well-established at peer institutions: hiring AI-specific faculty across programs (not limited to Computer Science), offering course release time for faculty redesigning courses, hiring temporary NTT lecturers to cover vacated sections, and providing summer pay stipends as a flexible alternative. Used together, these create a sustainable pipeline for curriculum redesign that scales with the institution's needs.
Peer programs offer a useful benchmark: Ohio University's CTLA offers $1,000 per redesign ($500 on completion, $500 on assessment report submission); UNC Chapel Hill's CFE provides up to $7,500 per grant for more substantive curricular development. Both are housed in instructional support centers — the model SIUC's Center for Teaching Excellence (CTE) is well-positioned to replicate.
For governance, the oversight body should be positioned at the system level with the Core Curriculum Director in a standing seat. Core Curriculum is the right institutional home for general education-level redesign, but a system-level body is needed to cover upper-division, professional, and graduate courses that fall outside Core Curriculum's jurisdiction. The Center for Teaching Excellence should provide operational staffing and instructional design support.
All four faculty development mechanisms are well-tested at peer institutions and implementable within existing structures. Adding them to the plan closes its largest strategic gap.
Scale and Cost Estimates
The research literature does not yet establish how many courses need redesign, but it does establish high and growing student AI use across virtually all disciplines. A 2025 HEPI survey found 88% of students had used generative AI for assessments (up from 53% in 2024), and 92% use AI in some form. A 2024 Ithaka S+R national survey of 2,654 instructors found 72% had experimented with AI as an instructional tool, but no single use case is well-established — meaning most faculty are navigating this without institutional scaffolding. A 2023 Scientific Reports study tested ChatGPT against student work across 32 university courses spanning multiple disciplines and found AI performed comparably to or better than students in the majority of them.
The closest proxy for discipline-level exposure comes from labor market research. OpenAI's "GPTs are GPTs" study (2023) found that 80% of the U.S. workforce has at least 10% of their tasks affected by LLMs, with impact concentrated in higher-education-aligned, white-collar roles. This suggests significant AI exposure is distributed across most academic disciplines — not concentrated in one area — but the translation from workforce tasks to specific course redesign needs has not been formally studied.
What a responsible plan should include: a methodology for identifying which of SIUC's courses require redesign — whether through a faculty survey, syllabus review, or structured pilot — rather than relying on inferred national averages. The scope of the redesign program should follow that assessment, not precede it.
Ohio University pays $1,000 per redesign ($500 on completion + $500 on assessment report), run through its Center for Teaching, Learning, and Assessment with AI Faculty Fellows as facilitators. 14 faculty from 10 departments completed the inaugural summer cohort. SIUC's CTE already has an equivalent operational structure.
| Component | Estimate | Basis |
|---|---|---|
| Stipends (75–150 redesigns/yr) | $75K–$150K | $1,000/redesign, scaled from Ohio pilot |
| CTE staffing (1–2 FTE coordinators) | $65K–$130K | Estimated; SIUC FTE cost not public |
| Instructional design support | $30K–$50K | Estimated |
| Ohio-style annual total | ~$170K–$330K | Sum of components above |
UNC's Center for Faculty Excellence awards up to $7,500 per grant as salary support for faculty developing new AI-integrated curriculum, tracks, or high-enrollment course redesigns. Multiple instructors can be part of a single proposal. This model suits deeper structural work: new specializations and cross-disciplinary AI literacy modules.
| Component | Estimate | Basis |
|---|---|---|
| Grants (30–60 projects/yr, avg $4.5K) | $135K–$270K | Based on UNC's up-to-$7,500 range |
| CTE grant administration (1 FTE) | $65K | Estimated |
| Instructional design support | $30K–$50K | Estimated |
| UNC-style annual total | ~$230K–$385K | Sum of components above |
A faculty member receives a course release while a non-tenure-track (NTT) instructor covers the vacated section. The marginal cost is the NTT replacement pay, not the full faculty salary. Per AAUP 2023–24 data, the 20th-percentile assistant professor salary at doctoral universities is $82,391 on a 9-month contract — a reasonable lower-bound benchmark for SIUC as an R2 institution. SIUC-specific NTT pay rates are not publicly published; actual figures should be confirmed with SIUC Human Resources. National ranges for NTT per-course pay run $3,000–$5,500 per the AAUP Part-Time Faculty Pay data.
| Component | Estimate | Basis |
|---|---|---|
| NTT replacement per course | $3,000–$5,500 | AAUP national NTT pay range; SIUC rate unconfirmed |
| Summer pay alternative (1/9 salary) | ~$9,155 | 1/9 of AAUP 20th-pct $82,391; no coverage needed |
| 100 releases at NTT rate | $300K–$550K | Course-release-based scale |
| Release model annual total | ~$365K–$620K | Includes CTE coordination |
All three models are defensible starting points. The right choice depends on whether SIUC prioritizes breadth (Ohio-style, lower per-unit cost, lighter redesign), depth (UNC-style, larger grants, more structural change), or protected time (course release, highest per-unit cost but fullest faculty commitment). A hybrid — stipends for most courses, grants for high-impact structural redesigns — is likely most practical. Total program cost depends on the scope SIUC's own course assessment determines.
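The three annual totals can be reproduced with simple back-of-envelope arithmetic. The sketch below is a planning illustration, not a budget: every component figure comes from the tables above, and the staffing lines carry the same "estimated" caveat flagged there.

```python
# Back-of-envelope annual cost ranges for the three faculty development
# models. Component (low, high) dollar figures are taken directly from
# the tables in this section.

def model_total(components):
    """Sum a list of (low, high) dollar ranges into one range."""
    low = sum(lo for lo, _ in components)
    high = sum(hi for _, hi in components)
    return low, high

# Ohio-style: 75-150 redesigns/yr at $1,000 each, plus staffing.
ohio = model_total([
    (75_000, 150_000),   # stipends
    (65_000, 130_000),   # CTE staffing, 1-2 FTE (estimated)
    (30_000, 50_000),    # instructional design support (estimated)
])

# UNC-style: 30-60 projects/yr averaging ~$4.5K, plus administration.
unc = model_total([
    (135_000, 270_000),  # grants
    (65_000, 65_000),    # CTE grant administration, 1 FTE (estimated)
    (30_000, 50_000),    # instructional design support (estimated)
])

# Course-release model: 100 releases covered at NTT per-course rates.
release = model_total([
    (300_000, 550_000),  # 100 releases x $3,000-$5,500 NTT replacement
    (65_000, 70_000),    # CTE coordination (estimated)
])

# Summer pay alternative: 1/9 of the AAUP 20th-percentile 9-month salary.
summer_stipend = round(82_391 / 9)  # per faculty member

for name, (lo, hi) in [("Ohio-style", ohio), ("UNC-style", unc),
                       ("Course release", release)]:
    print(f"{name}: ${lo:,}-${hi:,} per year")
print(f"Summer pay alternative: ~${summer_stipend:,} per faculty member")
```

A hybrid budget would mix these lines, e.g. stipends for most courses plus a smaller pool of UNC-style grants, and the same arithmetic applies.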
What Peer Institutions Are Already Doing
These four institutions represent the most directly useful comparisons for SIU's planning process — not because they are similar in size or mission, but because each has already executed a specific piece of what SIU is currently proposing. In each case the gap between "task force report" and "operational program" was 12 months or less. The section closes with a cross-cutting finding that has no equivalent in SIU's current plan.
The most directly comparable peer document. RIT released its AI Task Force Final Report in April 2024 and moved to execution immediately — the first recommendation was creation of a permanent RIT AI Hub, converting the task force into a standing institutional unit in one step rather than leaving coordination informal.
- 40+ new faculty positions tied to AI across computing, engineering, business, humanities, and design — not confined to technical fields
- TutorBot and AdvisorBot deployed for student-facing AI support (directly parallel to SIU Goals 3.1–3.3)
- Five-tier AI governance classification system for tools and data handling
- Faculty Summer Institute on AI in teaching, run through the AI Hub
- All of this operational within 12 months of the task force report
UNR's PACK AI initiative is student-driven and required rather than optional. Rather than hoping students encounter AI literacy through electives or department programs, it embeds a mandatory AI ethics and use module directly into first-year orientation — ensuring 100% exposure before students reach gateway courses.
- Required AI ethics module through the NevadaFIT first-year orientation program — not elective
- Structured faculty symposium series and curated teaching resources
- Institutional Microsoft Copilot and Apple Intelligence access for all students
- Nevada Online three-course AI certificate
- Single coherent web presence unifying all AI programs — a structural contrast to SIU's four-goal, sixteen-objective architecture
The most directly relevant model for SIU's data governance question. UC built BearcatGPT — a private GPT platform running inside UC's Azure infrastructure — meaning student and faculty data never trains Microsoft's public models. It launched to all faculty and staff in December 2025, with student access planned for Spring 2026.
- AI-driven Socratic Tutors deployed in math and statistics courses (parallel to SIU Obj 3.2)
- Four-tier AI fluency framework starting with "AI Essentials" for all staff
- AI Enablement Community of Practice with four subcommittees: teaching & research, ethics & inclusion, operations, and policy — governance more granular than anything in SIU's current plan
- Private cloud model directly addresses the data privacy concern that faculty most often raise about commercial AI tools
CSU's RamGPT uses the same Azure infrastructure as UC's BearcatGPT — and uniquely, the cost is publicly reported. This is the most useful benchmark for any SIU budget conversation about a secure institutional AI environment. Faculty raised concerns about environmental impact and academic integrity during rollout — honest context for SIU planning.
- $120,000 in 2025, scaling to $142,000/year in 2026 and 2027 — the full public cost of an Azure-hosted institutional AI environment
- Phased rollout: CSU-GPT (existing) operational while RamGPT (next-generation) launches
- User data stays within the university's cloud environment; does not feed Microsoft's public training models
- Addresses the faculty concern about commercial AI tool use in instruction and research more directly than any policy alone can
Both UC and CSU have built private, Azure-hosted AI environments where student and faculty data stays within the institution's cloud — not feeding public AI training pipelines. The cost is now publicly benchmarked: $120,000–$142,000 per year at CSU's scale.
For context, $120K–$142K per year is roughly the cost of two of the FTE staffing lines estimated in Section 06, and less than the annual cost of any of the three course redesign models estimated there. At that price, SIU would give every faculty member a safe tool to experiment with, give students a privacy-compliant AI environment, and give the institution visibility into how AI is actually being used across campus: data that would directly inform Goal 1 policy development and Goal 3 chatbot scoping.
SIU's current plan has no equivalent objective. Adding one — even as a Goal 1 or Goal 3 deliverable — would close the most significant structural gap between SIU's plan and what comparable institutions have already deployed.
Add an objective for a secure, university-hosted AI environment (Azure or equivalent). Budget $120K–$142K/yr based on CSU's public figures. Assign it to Goal 1 (policy & standards infrastructure) or Goal 3 (student services tools) — either is a defensible home. This is the single most concrete, costed, peer-validated addition available to the plan right now.
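If the objective is added, CSU's public figures give a ready three-year benchmark for the budget line. The sketch below simply totals those reported costs; SIU's actual pricing would depend on its own scale and the Azure (or equivalent) contract it negotiates.

```python
# Three-year budget benchmark from CSU's publicly reported RamGPT costs:
# $120,000 in 2025, then $142,000/year in 2026 and 2027. These are
# CSU-scale figures, used only as a planning anchor.
annual_costs = {2025: 120_000, 2026: 142_000, 2027: 142_000}

three_year_total = sum(annual_costs.values())
print(f"Three-year benchmark total: ${three_year_total:,}")
```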