Free, instant benchmarking for every U.S. institution. Compare your instructional costs, faculty composition, and productivity against national peers — powered by IPEDS data and advanced analytics.
The University of Delaware's Cost Study — the only national program-level instructional cost analysis — shut down in December 2025. Here's what that means.
There is no other national study that benchmarks instructional costs at the program level. IPEDS reports at the institutional level only.
Many institutions relied on Cost Study data for accreditation evidence, program reviews, and external reporting requirements.
Provosts and deans used benchmark data to allocate resources across programs. Without it, budget decisions lack external context.
Using public IPEDS data, advanced statistical models, and interactive dashboards, we deliver what the Cost Study never could.
Instant benchmarking against Carnegie classification peers. No login required. No data submission needed.
Ph.D.-level statistical models that the original Cost Study never offered. From cost drivers to efficiency frontiers.
A multiple regression model that identifies which institutional characteristics — enrollment size, faculty mix, research intensity, program composition, and geographic location — significantly predict your instructional cost per FTE student. We decompose each predictor's contribution using dominance analysis (Shapley values) so you know how much each factor explains, not just whether it's significant.
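As a minimal sketch of the dominance-analysis step, the snippet below computes the Shapley decomposition of R² for a small ordinary least squares model. The data and variable names (enrollment, faculty_mix, research) are synthetic stand-ins, not IPEDS fields:

```python
from itertools import combinations
from math import factorial

import numpy as np

def r2(X, y):
    """R-squared of an OLS fit with intercept (empty X gives 0)."""
    Z = np.column_stack([np.ones(len(y))] + ([X] if X.size else []))
    beta, *_ = np.linalg.lstsq(Z, y, rcond=None)
    resid = y - Z @ beta
    return 1.0 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

def shapley_r2(X, y, names):
    """Shapley (dominance) decomposition: each predictor's average
    marginal R-squared gain over all orderings of entry."""
    p = X.shape[1]
    shap = dict.fromkeys(names, 0.0)
    for j, name in enumerate(names):
        others = [k for k in range(p) if k != j]
        for r in range(p):
            w = factorial(r) * factorial(p - r - 1) / factorial(p)
            for S in combinations(others, r):
                gain = r2(X[:, list(S) + [j]], y) - r2(X[:, list(S)], y)
                shap[name] += w * gain
    return shap

# Synthetic stand-ins for institutional characteristics (not IPEDS fields).
rng = np.random.default_rng(0)
n = 200
enroll, fac_mix, research = rng.normal(size=(3, n))
cost = 3.0 * enroll + 1.0 * fac_mix + 0.5 * research + rng.normal(scale=0.5, size=n)
X = np.column_stack([enroll, fac_mix, research])
shap = shapley_r2(X, cost, ["enrollment", "faculty_mix", "research"])
```

By construction, the per-predictor contributions sum exactly to the full model's R², which is what lets the method say how much each factor explains rather than just whether it is significant.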
Multiple Regression · Dominance Analysis

Institutions are nested within states with different funding models, costs of living, and regulatory environments. Our multilevel (hierarchical) model separates state-level variance from institution-level variance and tests cross-level interactions to determine whether the impact of faculty mix on cost depends on state funding levels.
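A minimal sketch of the variance-partitioning idea behind a random-intercept multilevel model, using a method-of-moments (one-way ANOVA) estimator on synthetic, balanced state-by-institution data. The intraclass correlation (ICC) is the share of cost variance attributable to states; a full HLM would also estimate predictors and cross-level interactions:

```python
import numpy as np

def variance_components(groups):
    """Method-of-moments estimates for a balanced one-way random-effects
    layout: rows are states, columns are institutions within a state."""
    k, n = groups.shape
    group_means = groups.mean(axis=1)
    grand_mean = groups.mean()
    msb = n * ((group_means - grand_mean) ** 2).sum() / (k - 1)         # between states
    msw = ((groups - group_means[:, None]) ** 2).sum() / (k * (n - 1))  # within states
    between = max((msb - msw) / n, 0.0)
    icc = between / (between + msw)   # share of variance at the state level
    return between, msw, icc

# Synthetic balanced data: 50 states x 20 institutions, true ICC = 4 / (4 + 1).
rng = np.random.default_rng(1)
state_effect = rng.normal(scale=2.0, size=(50, 1))   # state-level sd 2 (var 4)
noise = rng.normal(scale=1.0, size=(50, 20))         # institution-level var 1
cost = 10.0 + state_effect + noise
between, within, icc = variance_components(cost)
```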
Multilevel Modeling (HLM) · Random Effects

Data Envelopment Analysis (DEA) is the method the original Delaware Cost Study cited but never implemented. It constructs an empirical efficiency frontier and scores each institution from 0 to 1, with a reference set of efficient peers and specific improvement targets.
Data Envelopment Analysis · Malmquist Index

Data-driven peer groups based on your actual cost and productivity profile — not just Carnegie classification. Research from the Cost Study's own presentations demonstrated that data-driven groups outperform Carnegie-based ones. Includes your 10 nearest neighbors by Mahalanobis distance.
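The nearest-neighbor step can be sketched as follows. Mahalanobis distance rescales the metric space by the inverse covariance matrix, so correlated metrics are not double-counted. The data here are synthetic; in practice the rows would be institutions' cost and productivity profiles:

```python
import numpy as np

def mahalanobis_neighbors(X, i, k):
    """Indices of the k nearest rows to row i by Mahalanobis distance."""
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    diff = X - X[i]
    d2 = np.einsum("nj,jk,nk->n", diff, cov_inv, diff)  # squared distances
    order = np.argsort(d2)
    return [int(j) for j in order if j != i][:k]

# Synthetic profiles; row 1 is built to be nearly identical to row 0.
rng = np.random.default_rng(2)
X = rng.normal(size=(100, 3))
X[1] = X[0] + 0.01
peers = mahalanobis_neighbors(X, 0, 10)
```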
K-Means Clustering · Mahalanobis Distance

What drives cost at the extremes? Quantile regression estimates separate models at the 10th, 25th, 50th, 75th, and 90th percentiles. The factors that matter for median institutions may differ entirely from those driving the most expensive ones.
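Quantile regression minimizes the asymmetric "pinball" (check) loss rather than squared error. As a minimal illustration with synthetic data, minimizing pinball loss for a constant-only model recovers the sample quantile, here the 90th percentile of a skewed cost-like distribution; a full model would make the quantile a function of predictors:

```python
import numpy as np

def pinball_loss(y, pred, tau):
    """Asymmetric check loss; its minimizer is the tau-th quantile."""
    err = y - pred
    return float(np.mean(np.maximum(tau * err, (tau - 1) * err)))

# Skewed synthetic 'cost' data; fit an intercept-only 90th-percentile model
# by grid search over candidate constants.
rng = np.random.default_rng(3)
y = rng.exponential(scale=5.0, size=2000)
tau = 0.9
grid = np.linspace(y.min(), y.max(), 4001)
best = grid[int(np.argmin([pinball_loss(y, c, tau) for c in grid]))]
```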
Quantile Regression · Conditional Distributions

Four complementary methods — Mahalanobis distance, Cook's distance, DBSCAN, and IQR fencing — systematically flag institutions with unusual patterns and generate diagnostic narratives explaining which metrics are unusual and why.
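Of the four methods, IQR fencing is the simplest to sketch: flag any value beyond 1.5 interquartile ranges outside the quartiles. The figures below are illustrative only:

```python
import statistics

def iqr_flags(values, k=1.5):
    """Flag values outside [Q1 - k*IQR, Q3 + k*IQR]."""
    q1, _, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < lo or v > hi]

# Illustrative cost-per-FTE figures ($000s); one institution is extreme.
costs = [9.8, 10.1, 10.4, 9.9, 10.0, 10.2, 9.7, 10.3, 25.0]
flagged = iqr_flags(costs)
```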
Mahalanobis Distance · Cook's Distance · DBSCAN

Structural Equation Modeling (SEM) maps the full web of causal pathways showing how research intensity, faculty mix, and enrollment interact to produce cost. It estimates direct, indirect, and total effects simultaneously. Led by Dr. Khojasteh, Associate Editor of the journal Structural Equation Modeling.
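The direct/indirect/total decomposition can be sketched with a single-mediator path model, a special case of SEM. With OLS estimates, the total effect from the simple regression exactly equals the direct effect plus the product of the two mediation paths. The data and variable names below are synthetic, chosen only to illustrate the identity:

```python
import numpy as np

def ols_slopes(x, y):
    """OLS coefficients with intercept dropped; x may be 1-D or 2-D."""
    X = np.column_stack([np.ones(len(x)), x])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta[1:]

# Synthetic single-mediator path model:
#   research -> faculty_mix (path a), both -> cost (paths c' and b).
rng = np.random.default_rng(4)
n = 5000
research = rng.normal(size=n)
faculty_mix = 0.6 * research + rng.normal(scale=0.5, size=n)
cost = 1.2 * research + 0.8 * faculty_mix + rng.normal(scale=0.5, size=n)

a = ols_slopes(research, faculty_mix)[0]
c_direct, b = ols_slopes(np.column_stack([research, faculty_mix]), cost)
indirect = a * b                              # effect routed through faculty mix
total = c_direct + indirect
total_check = ols_slopes(research, cost)[0]   # simple-regression total effect
```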
Structural Equation Modeling · Measurement Invariance

Growth Mixture Models identify distinct subpopulations following different cost trajectories — rising, stable, or declining. Your institution is classified into a trajectory group with 3-year forecasts and confidence intervals. Parallel process models test whether enrollment and cost co-evolve.
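A minimal sketch of the trajectory idea (a full growth mixture model estimates latent classes jointly): fit a per-institution linear growth curve, label the trend, and extrapolate three years. The cost series and the "flat" threshold are illustrative assumptions:

```python
import numpy as np

def trajectory(years, costs, flat_tol=0.5):
    """Fit a linear growth curve and forecast three years ahead.
    flat_tol (slope threshold for 'stable') is an illustrative choice."""
    slope, intercept = np.polyfit(years, costs, 1)
    if abs(slope) < flat_tol:
        label = "stable"
    else:
        label = "rising" if slope > 0 else "declining"
    forecast = [slope * (years[-1] + h) + intercept for h in (1, 2, 3)]
    return label, slope, forecast

# Hypothetical cost-per-FTE series ($000s), 2019-2023.
years = np.array([2019, 2020, 2021, 2022, 2023])
costs = np.array([100.0, 103.0, 105.0, 109.0, 112.0])
label, slope, forecast = trajectory(years, costs)
```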
Latent Growth Curves · Growth Mixture Models

A composite decision framework plotting each academic program on a cost-efficiency vs. demand matrix. Four quadrants: Invest, Sustain, Monitor, or Restructure — with drill-through evidence and scenario modeling for program consolidation.
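One plausible reading of the four-quadrant rule classifies each program by whether its efficiency and demand scores clear a threshold. The cutoffs, the quadrant mapping, and the program scores below are assumptions for illustration, not the product's actual scoring:

```python
def quadrant(efficiency, demand, eff_cut=0.5, demand_cut=0.5):
    """Hypothetical quadrant rule: both scores in [0, 1], cutoffs assumed."""
    if efficiency >= eff_cut and demand >= demand_cut:
        return "Invest"
    if efficiency >= eff_cut:
        return "Sustain"      # efficient, but demand is soft
    if demand >= demand_cut:
        return "Monitor"      # in demand, but costly to deliver
    return "Restructure"

# Illustrative (efficiency, demand) scores per program.
programs = {
    "Nursing": (0.9, 0.8),
    "History": (0.7, 0.3),
    "New Media": (0.3, 0.7),
    "Classics": (0.2, 0.2),
}
labels = {name: quadrant(e, d) for name, (e, d) in programs.items()}
```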
Composite Decision Matrix · Multi-Criteria Analysis

An interactive scenario engine modeling the cost impact of faculty composition changes over 1–5 years. "What if we replace 5 retiring tenured/tenure-track faculty with lecturers?" See the impact on cost, ratios, and peer percentile rank with Monte Carlo uncertainty estimates.
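A minimal Monte Carlo sketch of that retirement scenario: draw salaries for the departing tenured/tenure-track lines and their lecturer replacements from assumed distributions (all dollar figures are hypothetical) and summarize annual savings with a 95% interval:

```python
import random

def simulate_replacement(n_lines=5, n_sims=10_000, seed=42):
    """Monte Carlo draws of annual savings when retiring tenured/tenure-track
    lines are refilled with lecturers. All salary figures are hypothetical."""
    rng = random.Random(seed)
    savings = []
    for _ in range(n_sims):
        tt = sum(rng.gauss(110_000, 15_000) for _ in range(n_lines))
        lect = sum(rng.gauss(65_000, 8_000) for _ in range(n_lines))
        savings.append(tt - lect)
    savings.sort()
    mean = sum(savings) / n_sims
    ci = (savings[int(0.025 * n_sims)], savings[int(0.975 * n_sims)])
    return mean, ci

mean_savings, (lo, hi) = simulate_replacement()
```

The interval, not just the point estimate, is the payoff: it shows how much the savings figure could move under salary uncertainty before the decision changes.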
Monte Carlo Simulation · What-If Modeling

Each model produces a standalone report: Cost Drivers (P-1), State Context (P-2), Efficiency Assessment (P-3), True Peer Group (P-4), Distribution Analysis (P-5), Anomaly Flags (P-6), Causal Pathways (P-7), Cost Trajectory (P-8), Program Prioritization (P-9), and Workload Scenarios (P-10). Every report includes full methodology documentation, assumption testing, model diagnostics, and effect size reporting — publication-quality work from a journal associate editor.
Publication Quality

4-page visual briefs designed for provosts, VPs, and CFOs. Each summary distills the statistical findings into plain-language key findings, contextualized peer comparisons, and 3–5 specific recommended actions. No jargon, no p-values — just "here's where you stand, here's why, and here's what to do about it." Includes data visualizations designed for non-technical audiences.
Board Ready

Interactive Power BI decks or exported PowerPoint/PDF files formatted specifically for Board of Trustees and Regents meetings. Includes narrative talking points, annotated visualizations, benchmarking context slides, and an appendix with methodology for the detail-oriented board member. Designed to support the provost or CFO presenting to governance.
Governance Package

Half-day on-site or virtual workshops for your IR, finance, and academic affairs teams. We walk through the methodology, interpret the results together, demonstrate the dashboards, and train your staff to use the ongoing tools. Includes hands-on exercises with your institution's actual data and a Q&A session with both the Power BI lead and the Ph.D. statistician.
Capacity Building

Quarterly refresh of key metrics with automated trend alerts. When your institution crosses a percentile threshold, shifts trajectory class, or shows a significant change in efficiency score, you'll know immediately. Includes an annual re-run of the full model suite with updated IPEDS data and a summary of what changed and why.
Ongoing Subscription

Pre-formatted evidence packages aligned to common accreditation standards (HLC, SACSCOC, MSCHE, WASC). Includes instructional cost benchmarks, productivity evidence, faculty composition analysis, and efficiency documentation — ready to drop into your self-study or compliance reports.
Compliance Ready

Start with free benchmarking and unlock deeper analytics as your needs grow. No data submission required — everything is powered by public IPEDS data. Pricing details coming soon.
Two Ph.D.s with decades of combined experience in higher education data, analytics, and statistical modeling.
Start with the free tool or talk to us about a deeper engagement. No pressure, no data submission required.