The Delaware Cost Study has been discontinued

Instructional Cost
Benchmarking Reimagined

Free, instant benchmarking for every U.S. institution. Compare your instructional costs, faculty composition, and productivity against national peers — powered by IPEDS data and advanced analytics.

4,000+
Institutions Covered
8
Key Metrics
10+
Years of Trend Data

700+ institutions lost their benchmarking tool overnight

The University of Delaware's Cost Study — the only national program-level instructional cost analysis — shut down in December 2025. Here's what that means.

No Replacement Exists

There is no other national study that benchmarks instructional costs at the program level. IPEDS reports at the institutional level only.

Accreditation at Risk

Many institutions relied on Cost Study data for accreditation evidence, program reviews, and external reporting requirements.


Budget Decisions in the Dark

Provosts and deans used benchmark data to allocate resources across programs. Without it, budget decisions lack external context.


We Built the Replacement

Using public IPEDS data, advanced statistical models, and interactive dashboards — we deliver what the Cost Study never could.

Search any institution. See where you stand.

Instant benchmarking against Carnegie classification peers. No login required. No data submission needed.

Oklahoma State University
Doctoral University: Very High Research · Public · Stillwater, OK
  • Instructional Cost per FTE Student: $8,432 (Peer Median: $9,105 · 38th percentile)
  • Personnel Cost %: 87.3% (Peer Median: 84.0% · 72nd percentile)
  • FTE Students per Faculty: 16.2 (Peer Median: 14.8 · 61st percentile)
  • Full-Time Faculty %: 62.1% (Peer Median: 68.4% · 41st percentile)
  • Tenured/Tenure-Track %: 48.5% (Peer Median: 52.1% · 39th percentile)
  • Research Exp. per T/TT Faculty: $142K (Peer Median: $168.5K · 35th percentile)
  • Instruction as % of E&G: 31.2% (Peer Median: 28.7% · 58th percentile)
  • Degrees per 100 FTE Students: 24.8 (Peer Median: 22.1 · 68th percentile)

Go beyond benchmarks. Understand why.

Ph.D.-level statistical models that the original Cost Study never offered. From cost drivers to efficiency frontiers.

Cost Drivers Analysis

A multiple regression model that identifies which institutional characteristics — enrollment size, faculty mix, research intensity, program composition, and geographic location — significantly predict your instructional cost per FTE student. We decompose each predictor's contribution using dominance analysis (Shapley values) so you know how much each factor explains, not just whether it's significant.

Multiple Regression Dominance Analysis Learn more →
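For the technically curious, here is the core idea of the Shapley decomposition that dominance analysis rests on, as a minimal pure-Python sketch with invented figures (the production model uses the full IPEDS predictor set): each predictor's share is its average marginal gain in R² over all orderings, and the shares sum exactly to the full-model R².

```python
from itertools import permutations

def r_squared(X_cols, y):
    """R^2 of an OLS fit of y on the given predictor columns (plus an
    intercept), solved via the normal equations with Gaussian elimination."""
    n = len(y)
    if not X_cols:
        return 0.0  # intercept-only model explains nothing
    X = [[1.0] + [col[i] for col in X_cols] for i in range(n)]
    k = len(X[0])
    A = [[sum(X[i][p] * X[i][q] for i in range(n)) for q in range(k)] for p in range(k)]
    c = [sum(X[i][p] * y[i] for i in range(n)) for p in range(k)]
    for p in range(k):  # forward elimination with partial pivoting
        piv = max(range(p, k), key=lambda r: abs(A[r][p]))
        A[p], A[piv], c[p], c[piv] = A[piv], A[p], c[piv], c[p]
        for r in range(p + 1, k):
            f = A[r][p] / A[p][p]
            for q in range(p, k):
                A[r][q] -= f * A[p][q]
            c[r] -= f * c[p]
    b = [0.0] * k
    for p in reversed(range(k)):  # back-substitution
        b[p] = (c[p] - sum(A[p][q] * b[q] for q in range(p + 1, k))) / A[p][p]
    yhat = [sum(X[i][q] * b[q] for q in range(k)) for i in range(n)]
    ybar = sum(y) / n
    return 1 - sum((y[i] - yhat[i]) ** 2 for i in range(n)) / sum((v - ybar) ** 2 for v in y)

def shapley_r2(cols, y):
    """Each predictor's average marginal contribution to R^2 across all
    orderings of entry into the model (the Shapley decomposition)."""
    names = list(cols)
    contrib = {name: 0.0 for name in names}
    perms = list(permutations(names))
    for order in perms:
        included = []
        for name in order:
            before = r_squared([cols[m] for m in included], y)
            included.append(name)
            after = r_squared([cols[m] for m in included], y)
            contrib[name] += (after - before) / len(perms)
    return contrib

# Invented institutions: enrollment (000s FTE), research intensity, cost ($K/FTE)
enroll   = [12, 18, 25, 31, 40, 22, 28, 35]
research = [1.0, 2.5, 3.0, 4.2, 5.5, 2.0, 3.8, 4.9]
cost     = [7.8, 8.4, 8.9, 9.6, 10.8, 8.2, 9.3, 10.1]
shares = shapley_r2({"enrollment": enroll, "research": research}, cost)
total = r_squared([enroll, research], cost)  # the shares sum to this
```

The telescoping sum inside each ordering guarantees the additivity: unlike raw standardized coefficients, the shares partition the full-model R² with nothing left over.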

State & Sector Context

Institutions are nested within states that have different funding models, cost of living, and regulatory environments. Our multilevel (hierarchical) model separates state-level variance from institution-level variance and tests cross-level interactions — for example, whether the impact of faculty mix on cost depends on state funding levels.

Multilevel Modeling (HLM) Random Effects Learn more →
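As a toy illustration of the variance split a multilevel model performs, this sketch computes one-way random-effects ANOVA variance components and the intraclass correlation (the share of cost variation attributable to state membership) in pure Python. All state and cost figures are invented; the full model adds predictors and random slopes.

```python
def variance_components(groups):
    """Between-group and within-group variance via the classic one-way
    ANOVA moment estimators, plus the intraclass correlation (ICC)."""
    k = len(groups)
    all_vals = [v for vals in groups.values() for v in vals]
    N = len(all_vals)
    grand = sum(all_vals) / N
    ss_between = ss_within = 0.0
    for vals in groups.values():
        m = sum(vals) / len(vals)
        ss_between += len(vals) * (m - grand) ** 2
        ss_within += sum((v - m) ** 2 for v in vals)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (N - k)
    # effective group size (handles unbalanced groups)
    n0 = (N - sum(len(v) ** 2 for v in groups.values()) / N) / (k - 1)
    var_between = max(0.0, (ms_between - ms_within) / n0)
    icc = var_between / (var_between + ms_within)
    return var_between, ms_within, icc

# Invented instructional cost per FTE ($K) for institutions nested in states
costs_by_state = {
    "OK": [8.4, 8.9, 7.8, 8.6],
    "AR": [9.1, 9.7, 9.4, 9.9],
    "TX": [10.8, 11.2, 10.5, 11.0],
}
var_b, var_w, icc = variance_components(costs_by_state)
# a high ICC means state membership explains much of the cost variation
```

A large ICC is precisely the situation where pooling all institutions into one national regression misleads, and a multilevel model is warranted.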

Efficiency Frontier

Data Envelopment Analysis (DEA) is the method the original Delaware Cost Study cited but never implemented. It constructs an empirical efficiency frontier and scores each institution from 0 to 1, with a reference set of efficient peers and specific improvement targets.

Data Envelopment Analysis Malmquist Index Learn more →
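In the special case of one input and one output, the constant-returns DEA score needs no linear programming: it is each institution's output-to-input ratio divided by the best ratio observed. A toy sketch with invented figures (the full multi-input, multi-output model requires an LP solver):

```python
def dea_single_ratio(units):
    """CCR (constant returns to scale) efficiency with one input and one
    output. Scores lie in (0, 1]; a score of 1 marks the frontier.
    units: dict name -> (input, output)."""
    ratios = {name: out / inp for name, (inp, out) in units.items()}
    best = max(ratios.values())
    return {name: r / best for name, r in ratios.items()}

# Invented: instructional spend ($M, input) vs. degrees awarded (output)
units = {"A": (40, 900), "B": (55, 1100), "C": (30, 750), "D": (70, 1200)}
scores = dea_single_ratio(units)
# "C" has the best degrees-per-dollar ratio, so it defines the frontier;
# every other score reads as "fraction of frontier productivity achieved"
```

A score of 0.8 means the institution produces 80% of the output per input dollar that its best-practice peer does; closing that gap is the "specific improvement target."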

True Peer Groups

Data-driven peer groups based on your actual cost and productivity profile — not just Carnegie classification. Research from the Cost Study's own presentations demonstrated that data-driven groups outperform Carnegie-based ones. Includes your 10 nearest neighbors by Mahalanobis distance.

K-Means Clustering Mahalanobis Distance Learn more →
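The nearest-neighbor step can be sketched in miniature. With two metrics and invented profiles, Mahalanobis distance rescales differences by the sample covariance so that correlated metrics are not double-counted (the real tool uses the full metric panel):

```python
def mahalanobis_peers(profiles, focal, k=3):
    """Nearest peers to `focal` by Mahalanobis distance on two metrics.
    profiles: dict name -> (metric1, metric2). The covariance matrix is
    estimated from the whole sample and inverted in closed form (2x2)."""
    pts = list(profiles.values())
    n = len(pts)
    m1 = sum(p[0] for p in pts) / n
    m2 = sum(p[1] for p in pts) / n
    s11 = sum((p[0] - m1) ** 2 for p in pts) / (n - 1)
    s22 = sum((p[1] - m2) ** 2 for p in pts) / (n - 1)
    s12 = sum((p[0] - m1) * (p[1] - m2) for p in pts) / (n - 1)
    det = s11 * s22 - s12 * s12
    i11, i22, i12 = s22 / det, s11 / det, -s12 / det  # inverse covariance
    fx, fy = profiles[focal]
    dists = []
    for name, (x, y) in profiles.items():
        if name == focal:
            continue
        dx, dy = x - fx, y - fy
        d2 = i11 * dx * dx + i22 * dy * dy + 2 * i12 * dx * dy
        dists.append((d2 ** 0.5, name))
    return [name for _, name in sorted(dists)[:k]]

# Invented profiles: (cost per FTE in $K, degrees per 100 FTE)
profiles = {
    "OSU": (8.4, 24.8),
    "A": (8.6, 24.1),
    "B": (12.9, 18.0),
    "C": (8.1, 25.5),
    "D": (14.2, 30.2),
    "E": (9.0, 23.9),
}
peers = mahalanobis_peers(profiles, "OSU", k=3)
```

Because the distance is covariance-aware, two metrics that move together count roughly once rather than twice, which is why Mahalanobis peers often differ from naive Euclidean ones.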

Distribution Analysis

What drives cost at the extremes? Quantile regression estimates separate models at the 10th, 25th, 50th, 75th, and 90th percentiles. The factors that matter for median institutions may differ entirely from those driving the most expensive ones.

Quantile Regression Conditional Distributions Learn more →
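The engine of quantile regression is the asymmetric "check" (pinball) loss: minimizing it with covariates fits a conditional quantile, and with no covariates it recovers the plain empirical quantile. A minimal pure-Python sketch with invented cost figures:

```python
def pinball_loss(q, tau, values):
    """Check (pinball) loss of the constant prediction q at quantile tau:
    under-predictions are weighted tau, over-predictions 1 - tau."""
    return sum(tau * (v - q) if v >= q else (1 - tau) * (q - v) for v in values)

def quantile_via_loss(values, tau):
    """Restricting candidates to the observed values, the check-loss
    minimizer is an empirical tau-quantile: the intercept-only case of
    quantile regression."""
    return min(values, key=lambda q: pinball_loss(q, tau, values))

# Invented instructional costs ($K per FTE)
costs = [7.8, 8.2, 8.4, 8.9, 9.3, 9.6, 10.1, 10.8, 12.9]
median_cost = quantile_via_loss(costs, 0.5)   # the 50th-percentile fit
p90_cost = quantile_via_loss(costs, 0.9)      # the 90th-percentile fit
```

Swapping tau changes which institutions dominate the fit, which is exactly why the cost drivers estimated at the 90th percentile can differ from those at the median.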

Anomaly Detection

Four complementary methods — Mahalanobis distance, Cook's distance, DBSCAN, and IQR fencing — systematically flag institutions with unusual patterns and generate diagnostic narratives explaining which metrics are unusual and why.

Mahalanobis Distance Cook's Distance DBSCAN Learn more →
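Of the four methods, IQR fencing is simple enough to show in full: any value outside Tukey's fences [Q1 − k·IQR, Q3 + k·IQR], with k = 1.5 by convention, is flagged. A sketch with invented figures:

```python
def iqr_flags(values, k=1.5):
    """Flag values outside Tukey's fences [Q1 - k*IQR, Q3 + k*IQR]."""
    s = sorted(values)

    def quantile(p):
        # linear interpolation between adjacent order statistics
        idx = p * (len(s) - 1)
        lo = int(idx)
        hi = min(lo + 1, len(s) - 1)
        return s[lo] + (idx - lo) * (s[hi] - s[lo])

    q1, q3 = quantile(0.25), quantile(0.75)
    spread = q3 - q1
    return [v for v in values if v < q1 - k * spread or v > q3 + k * spread]

# Invented instructional costs ($K per FTE); one institution is extreme
costs = [8.1, 8.4, 8.6, 8.9, 9.0, 9.2, 9.4, 9.7, 22.5]
flagged = iqr_flags(costs)
```

Because the fences are built from quartiles rather than the mean, a single extreme institution cannot drag the threshold toward itself, which is what makes this the robust baseline the other three methods are checked against.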

10 Premium Analytics Reports

Each model produces a standalone report: Cost Drivers (P-1), State Context (P-2), Efficiency Assessment (P-3), True Peer Group (P-4), Distribution Analysis (P-5), Anomaly Flags (P-6), Causal Pathways (P-7), Cost Trajectory (P-8), Program Prioritization (P-9), and Workload Scenarios (P-10). Every report includes full methodology documentation, assumption testing, model diagnostics, and effect size reporting — publication-quality work prepared by a journal associate editor.

Publication Quality

Executive Summaries

4-page visual briefs designed for provosts, VPs, and CFOs. Each summary distills the statistical results into plain-language key findings, contextualized peer comparisons, and 3–5 specific recommended actions. No jargon, no p-values — just "here's where you stand, here's why, and here's what to do about it." Includes data visualizations designed for non-technical audiences.

Board Ready

Board Presentations

Interactive Power BI decks or exported PowerPoint/PDF formatted specifically for Board of Trustees and Regents meetings. Includes narrative talking points, annotated visualizations, benchmarking context slides, and an appendix with methodology for the detail-oriented board member. Designed to support the provost or CFO presenting to governance.

Governance Package

Data Workshops

Half-day on-site or virtual workshops for your IR, finance, and academic affairs teams. We walk through the methodology, interpret the results together, demonstrate the dashboards, and train your staff to use the ongoing tools. Includes hands-on exercises with your institution's actual data and a Q&A session with both the Power BI lead and the Ph.D. statistician.

Capacity Building

Annual Monitoring

Quarterly refresh of key metrics with automated trend alerts. When your institution crosses a percentile threshold, shifts trajectory class, or shows a significant change in efficiency score, you'll know immediately. Includes an annual re-run of the full model suite with updated IPEDS data and a summary of what changed and why.

Ongoing Subscription

Accreditation Evidence Packages

Pre-formatted evidence packages aligned to common accreditation standards (HLC, SACSCOC, MSCHE, WASC). Includes instructional cost benchmarks, productivity evidence, faculty composition analysis, and efficiency documentation — ready to drop into your self-study or compliance reports.

Compliance Ready

Three tiers. One mission.

Start with free benchmarking and unlock deeper analytics as your needs grow. No data submission required — everything is powered by public IPEDS data. Pricing details coming soon.

Explore
Free Forever
Instant benchmarking for any institution. No login, no data submission required.
  • Search any U.S. institution
  • 8 key instructional cost and productivity metrics
  • Carnegie classification peer group comparison
  • Percentile rankings on every metric
  • Updated annually with each IPEDS release
Try It Now
Advanced Analytics
Custom Engagement
Ph.D.-led statistical modeling and strategic consulting tailored to your institution.
  • Everything in Dashboard
  • DEA efficiency frontier analysis
  • Regression-based cost driver identification
  • Structural equation modeling (causal pathways)
  • Data-driven peer group clustering
  • Program prioritization matrix
  • Faculty workload simulation scenarios
  • Executive summary and board-ready presentation
  • Dedicated Ph.D. analyst throughout engagement
  • Optional on-site data workshop
Let's Talk

Built by people who live in this data

Two Ph.D.s with decades of combined experience in higher education data, analytics, and statistical modeling.

RP
Robert Pilgrim, Ph.D.
Product & Data Architecture
  • Associate Director of Data Strategy & Insights, University of Arkansas (R1)
  • Ph.D. Space & Planetary Sciences — quantitative research background
  • Power BI expert since 2013 — Desktop, Embedded, DAX, M, Dataflows
  • IPEDS, HERD, and federal survey expert (the exact data powering this tool)
  • Recreated Carnegie Research Classifications for institutional benchmarking
  • 15 years SAS, plus SQL, Python, R, Tableau, Azure
  • NCURA national conference presenter on data visualization and AI
  • NSF co-investigator — Research Analytics Summit ($100K)
JK
Jam Khojasteh, Ph.D.
Lead Statistician & Advanced Analytics
  • Professor of Research, Evaluation, Measurement, and Statistics, Oklahoma State University
  • Ph.D. Educational Statistics and Research Methods, University of Arkansas
  • Associate Editor, Structural Equation Modeling journal
  • 40+ peer-reviewed publications in SEM, multilevel modeling, and psychometrics
  • $6M+ in funded grants (NIH, NSF, DOJ, state agencies)
  • Teaches: SEM, Multiple Regression, Multilevel Modeling, Nonparametric Statistics
  • Consulting: State DOE evaluations, DOJ program evaluation, NIH biostatistics
  • Signature method: Structural Equation Modeling with measurement invariance testing

Ready to see where your institution stands?

Start with the free tool or talk to us about a deeper engagement. No pressure, no data submission required.