Modular Data Center Manufacturer vs Provider vs Integrator: How to Select the Right Delivery Model

February 15, 2026

A procurement framework for system integrators and infrastructure agencies to select between manufacturer, provider, and integrator delivery models for modular data centers. Covers qualification checklists, contract clauses, and acceptance logic to prevent delivery-model mismatch.

You've won a project that requires edge compute infrastructure. The client expects delivery in six months. Traditional construction would take two years minimum. Modular data centers solve the timeline problem—but now you face a different challenge: which delivery model actually fits your project?

This isn't a vendor selection question. It's a procurement architecture decision that determines who owns what risk, when acceptance happens, and where the blame lands when something goes wrong at site.

Most delivery-model mismatches trace back to one root cause: procurement language implies single-point, end-to-end accountability while the selected counterparty operates as a component supplier (or vice versa). In modular programs, this mismatch becomes expensive fast. Factory-built systems compress schedules and shift risk earlier, concentrating exposure at five critical interfaces: design authority, integration, testing, logistics, and permitting.

This guide uses three procurement-defined roles to force explicit responsibility splits. It's written for system integrators coordinating turnkey projects and infrastructure engineering agencies specifying MDC solutions into larger builds.

The Three Delivery Models: What You're Actually Buying

Before diving into qualification checklists and contract clauses, let's establish what each delivery model actually means in procurement terms.

Manufacturer (MDC Module OEM / Fabricator)

A manufacturer builds and factory-tests physical modules or skids and sells them as goods—typically under Incoterms—with limited system integration outside the module boundary.

What you get: Hardware that meets a specification at the factory gate.

What you don't get: Site integration, commissioning, system-level performance accountability.

Documentation and warranty: Usually stops at the shipped product boundary unless explicitly expanded.

Best fit: When you have strong in-house engineering and general contractor capability. You need standardized modules quickly and can handle interface and commissioning risk internally.

Provider (Turnkey MDC Provider / Prime Contractor)

A provider contracts to deliver a working facility or a defined "ready for IT load" outcome. They act as the single point of responsibility for design coordination, procurement, manufacturing oversight, logistics, site-works integration, commissioning, and handover.

What you get: An operational outcome with demonstrable performance.

What you don't get: Flexibility to swap in alternative vendors mid-project without renegotiation.

Documentation and warranty: Single-point accountability across the full scope.

Best fit: When you need someone else to own the outcome and the commissioning evidence. You're willing to pay for reduced internal resource demands, and earlier design lock protects your schedule.

Integrator (Multi-Vendor Systems Integrator / Prime for Integration)

An integrator coordinates multiple OEM modules and site contractors into one system, owning interface management and integrated test execution. Module fabrication may sit with multiple manufacturers.

What you get: Interface risk management and integrated system testing across vendors.

What you don't get: Single-point product warranty without explicit back-to-back arrangements.

Documentation and warranty: Consolidated across OEMs plus interface as-builts plus integrated test reports.

Best fit: Multi-vendor module mix (power from OEM A, cooling from OEM B) or phased brownfield integration where modules must be evaluated together.

Modular Data Centers by ModulEdge

Selecting a delivery model? We work as manufacturer, provider, or integrator — matched to your acceptance requirements and project structure.

  • 5–150 kW per rack, engineered for edge compute and AI
  • Integrated power, air/water cooling, fire, monitoring, and security
  • Climate- and site-specific customization, including free cooling
  • Designed to meet Tier III/Tier IV principles
  • Typical custom build cycles: 3–6 months

The Selection Principle That Prevents Most Disputes

Choose the delivery model that matches the level at which you need a demonstrably testable performance outcome.

If acceptance is "module meets spec at the factory," manufacturer-led procurement fits.

If acceptance is "site meets availability objective under real-world demonstrations," you need a prime provider or an integrator with explicit design and commissioning scope.

The Uptime Institute's Tier Certification of Constructed Facility process makes this concrete: live demonstrations and commissioning evidence are required to validate that a facility performs to its stated objective. System-level acceptance cannot be reliably purchased as "equipment only."

A second principle prevents most white-label failures: branding can be negotiated; accountability cannot. When you want to white-label an MDC solution under your own brand, the contract still needs unambiguous "design authority" and a single owner of integrated acceptance. Dual identification in documentation—"branding name" (partner) and "legal manufacturer / responsible contractor" (accountable party)—keeps certification marks and compliance dossiers traceable for authority and insurer review.

Responsibility Comparison: What Changes by Model

Modular delivery reshapes the risk map. "Construction" becomes manufacturing plus logistics plus integration. Acceptance must split into factory acceptance (FAT) and site acceptance (SAT), then extend into integrated systems testing where the facility operates as a system under load and failure conditions.

| Dimension | Manufacturer | Integrator | Provider |
|---|---|---|---|
| Contract object | Goods/modules delivered to defined boundary (often ex-works or similar) | Integrated package (modules + interface works + integrated commissioning scope) | Operational facility (or "ready for IT load") with defined performance outcome |
| Design responsibility | Module internal design only; site design remains with buyer/GC unless explicitly added | Coordinates design interfaces; may own integration design package; OEM retains internal design | Owns end-to-end design coordination and design compliance evidence |
| FAT ownership | OEM defines and executes FAT; customer witnesses/accepts per agreed protocol | OEM FAT plus integrator-run "neutral site" or combined FAT for multi-vendor evaluation | Provider orchestrates OEM FAT and adds system-level factory or staging tests |
| SAT/commissioning ownership | Often excluded or limited to advisory; buyer runs SAT and commissioning unless separately contracted | Integrator runs SAT plus integrated systems testing across module and site interfaces | Provider runs SAT plus integrated systems testing and delivers commissioning documentation |
| Acceptance logic | Pass/fail to module spec at factory plus incoming inspection at site; system performance risk retained by buyer | Multi-stage: module FAT → interface SAT → integrated systems testing; integrator carries defined system-level accountability | Outcome-based acceptance tied to OPR and demonstrable facility behavior under defined test conditions |
| Warranty boundary | Product warranty to module boundary; interface failures frequently excluded | Split warranties persist; integrator must bridge via back-to-back warranties and single front-end claim process | Single-point warranty feasible (prime holds subs), subject to negotiated caps/terms |
| Documentation pack | Manufacturing QA, FAT reports, module drawings/manuals; site as-builts sit elsewhere | Consolidated documentation across OEMs plus interface as-builts plus integrated test reports | Full handover pack: design + as-builts + test reports + O&M/training, aligned to OPR |
| Schedule risk drivers | Component lead times plus factory slots; buyer controls site readiness and approvals | Synchronization of multiple factories plus logistics plus site readiness; interface changes are high-cost | Earlier design lock plus procurement release; provider manages critical path but prices risk into contract |

Design Responsibility: What Must Be Explicit

ISO/IEC 22237 frames procurement and integration as lifecycle phases with multiple parties—owners, main contractors, commissioning agents, suppliers. A procurement document that doesn't name a single design authority will default into ambiguity.

Your procurement language must explicitly assign three things:

1. Requirements-to-design translation. Who owns the traceability from project requirements to engineered solution?

2. Interface definitions. Who controls utilities, structural loads, fire systems, controls/BMS, and network demarcations?

3. Code-compliance packages. Who holds responsibility for design packages submitted to authorities having jurisdiction?

Standards acknowledge that AHJ requirements supersede recommendations. This triggers the contractual requirement for a jurisdiction-specific compliance plan and named responsible party.

For white-label arrangements, this becomes critical. The partner may own the client relationship and brand presence, but someone must be contractually named as design authority with liability to match.

FAT/SAT/IST: Turning Tests into Enforceable Procurement Language

Commissioning frameworks define performance testing as verifying a product or system meets defined criteria using test protocols. This is the legal and operational foundation for acceptance criteria that don't collapse into subjective "works as intended."

Factory Acceptance Testing (FAT)

FAT is most valuable when it reduces site uncertainty through pre-integration and controlled-environment testing. Equipment should arrive as prefabricated, factory-tested modules to enable parallel installation.

Manufacturer model: OEM-run FAT is the primary acceptance gate.

Integrator model: OEM FAT plus integrated FAT or neutral-site testing where multi-vendor modules must be evaluated together.

Provider model: Provider orchestrates FAT and adds system-level staging tests where practical.

Modern modular programs need enhanced FAT protocols that handle multi-vendor, multi-location manufacturing and remote witnessing. UL Solutions' guidance on prefabricated modular data centers emphasizes that documentation must be suitable for code authorities.

Site Acceptance Testing (SAT)

SAT verifies installation correctness and interface connectivity through functional performance tests against sequences of operation tied to the Owner's Project Requirements (OPR).

Manufacturer model: Often not included; buyer runs SAT. This creates high mismatch risk when implied but not contracted.

Integrator model: Included and owned by prime integrator.

Provider model: Included and owned by prime provider; aligns to OPR and any Tier objective.

Integrated Systems Test (IST)

IST is a multi-day, system-level demonstration that simulates operating and failure conditions and validates that the facility operates as a system. It produces a formal test report and is critical evidence of actual system operation.

The Uptime Institute's constructed facility certification process explicitly includes live system demonstrations under real-world conditions and reviews commissioning documentation. SAT must evidence system behavior, not only component start-up.

Practical Procurement Split

Include these elements in contract language, with scope sitting with the accountable party per model:

Test protocol hierarchy: OPR → test plan → scripts → data capture requirements → pass/fail thresholds.

Witnessing and records: Define witness rights (on-site or remote), required evidence (raw logs, calibrated instrument records, configuration snapshots), and signing authority.

Deficiency handling: Define NCR/deficiency log ownership, closure criteria, retest rules, and who pays for retest under which condition (manufacturing defect versus site readiness).

Taking-over trigger: Acceptance occurs only after defined "tests on completion" are passed—or an explicit commercial workaround is used with agreed mitigation and holdback.
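The "pass/fail thresholds" requirement above is the part most often left subjective. A minimal sketch of what an objective test record looks like in data terms; step names, units, and thresholds here are hypothetical, not from any real protocol:

```python
from dataclasses import dataclass

@dataclass
class TestStep:
    """One scripted measurement with an agreed acceptance band."""
    name: str
    measured: float
    low: float    # lower pass threshold from the agreed test plan
    high: float   # upper pass threshold from the agreed test plan

    def passed(self) -> bool:
        # Pass/fail is objective: the measured value sits inside the band.
        return self.low <= self.measured <= self.high

def evaluate_sat(steps: list[TestStep]) -> tuple[str, list[str]]:
    """Return ('ACCEPT', []) or ('NCR', [names of failing steps])."""
    deficiencies = [s.name for s in steps if not s.passed()]
    return ("ACCEPT", []) if not deficiencies else ("NCR", deficiencies)

# Illustrative run: temperature passes, UPS transfer time breaches its threshold.
steps = [
    TestStep("supply_air_temp_C", measured=22.4, low=20.0, high=24.0),
    TestStep("ups_transfer_ms", measured=12.0, low=0.0, high=10.0),
]
status, open_items = evaluate_sat(steps)
```

The point is structural: when every step carries a numeric band agreed before testing, "works as intended" disputes reduce to reading the log.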

Warranty Boundaries: Where Disputes Concentrate

Warranty is primarily a boundary problem.

Manufacturer-led deals tend to warrant the module "as built." Performance issues at site often manifest at interfaces: incorrect site utilities, controls integration, environmental conditions, installation workmanship, or interactions between multiple modules.

Incoterms selection matters more than most procurement teams realize. EXW (Ex Works) transfers risk very early; buyer owns logistics and damage risk from factory pickup. D-terms transfer risk late, keeping it with the seller until destination. For heavy modular shipments, logistics and damage responsibility must be written explicitly regardless of which term you select.

Warranty Matrix Requirement

Include a table that maps each subsystem to:

  • Warrantor
  • Warranty period
  • Response time
  • Exclusions
  • Required operating conditions

Subsystems to cover: power train, cooling, controls, fire detection/suppression interfaces, structure/enclosure, monitoring.
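The warranty matrix is easy to check mechanically once it exists as structured data. A sketch, with subsystem names taken from the list above but warrantors, periods, and exclusions as illustrative placeholders:

```python
# Subsystems the article says must be covered.
REQUIRED_SUBSYSTEMS = {
    "power train", "cooling", "controls",
    "fire detection/suppression interfaces",
    "structure/enclosure", "monitoring",
}

# Hypothetical (deliberately incomplete) warranty matrix.
WARRANTY_MATRIX = {
    "power train": {"warrantor": "OEM A", "period_months": 24, "response_hours": 8,
                    "exclusions": ["utility supply faults"],
                    "operating_conditions": "per ICD power quality spec"},
    "cooling": {"warrantor": "OEM B", "period_months": 24, "response_hours": 4,
                "exclusions": ["site water quality outside spec"],
                "operating_conditions": "ambient within design envelope"},
    "controls": {"warrantor": "Integrator", "period_months": 12, "response_hours": 4,
                 "exclusions": [], "operating_conditions": "per ICD controls baseline"},
    "monitoring": {"warrantor": "Integrator", "period_months": 12, "response_hours": 24,
                   "exclusions": [], "operating_conditions": "network per demarcation"},
}

def uncovered_subsystems(matrix: dict) -> list[str]:
    """Subsystems with no named warrantor; each one is a future dispute."""
    return sorted(REQUIRED_SUBSYSTEMS - set(matrix))
```

Running the check against this example flags the structure/enclosure and fire-interface gaps before signature, which is exactly when you want to find them.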

Interface Liability Rule

Define a default: the party owning the Interface Control Document (ICD) owns interface defects unless they prove noncompliance by another party against the ICD baseline.

Defects Period Alignment

For Provider and Integrator primes, align warranty and defects regimes to a defined Defects Notification Period and remedy obligations—consistent with FIDIC and similar international contract structures.

White-Label Boundary

Require dual identification in all documentation: the branding name (partner) plus the legal manufacturer or responsible contractor (accountable party). Certification marks and compliance dossiers must remain traceable.

Documentation Pack: Minimum Enforceable Deliverables

Use a contractually controlled document register with "required at gate" timing. ASHRAE commissioning guidelines emphasize verification and documentation as part of acceptance.

| Category | Minimum Contents | Due At | Notes |
|---|---|---|---|
| Design drawings | GA layouts, single-lines, schematics, controls architecture, interface/demarcation drawings | Design freeze; updated at as-built | Procurement must assign ownership of "record" documents |
| Test documentation | FAT scripts/results; SAT scripts/results; IST report; deficiency logs with closure evidence | FAT sign-off; SAT/IST sign-off | Must be suitable for authority review |
| O&M manuals | Operating procedures, maintenance schedules, safety procedures, configuration backups, recommended consumables | Handover | Training plans link here |
| Spares list | Critical spares, recommended quantities, lead times, storage conditions | Pre-handover | Supply chain volatility makes lead-time disclosure material |
| Certificates and compliance | Product certifications, lifting/structural docs, inspection records, jurisdictional compliance dossier | Before shipment and before energization | Adapt to local regimes |
| Training records | Training plan, attendance, competency sign-off | Handover | Commissioning standards treat training as a defined deliverable |
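"Required at gate" timing only bites if someone actually checks it at each gate. A minimal sketch of a gate-controlled register; gate names and document titles are illustrative, loosely following the table above:

```python
# Hypothetical register: which documents must exist before each gate can close.
REQUIRED_AT_GATE = {
    "FAT sign-off": {"FAT scripts", "FAT results", "deficiency log with closures"},
    "pre-shipment": {"product certifications", "lifting/structural docs"},
    "handover": {"O&M manuals", "as-built drawings", "training records", "spares list"},
}

def missing_at(gate: str, delivered: set[str]) -> list[str]:
    """Documents still outstanding at a gate; a non-empty result blocks sign-off."""
    return sorted(REQUIRED_AT_GATE[gate] - delivered)
```

For example, `missing_at("handover", {"O&M manuals", "spares list"})` flags the as-builts and training records, turning a vague "documentation pack" obligation into a named punch list.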

Lead-Time Drivers: What Actually Controls Your Schedule

Modular delivery can reduce on-site duration by shifting work to factories. But it doesn't eliminate long-lead equipment constraints—it front-loads design lock and procurement releases.

The promise of modular is parallelization: modules fabricate in the factory while site pads are poured concurrently. But this only works when design locks early enough for procurement to release.

Supply Chain Reality

Critical equipment—transformers, switchgear, generators, cooling systems—can face lead times exceeding 12 months in strained markets. Turner & Townsend's Data Centre Cost Index tracks these constraints across regions. This hasn't changed post-pandemic; it's become structural.

Regulatory and Power Constraints

Permitting authority practices vary by jurisdiction. Power availability can become the real critical path. A module sitting complete in the factory doesn't help when grid connection takes 18 months—a constraint Reuters has reported is actively slowing EMEA data center rollouts.

Indicative Timeline Bands

These vary by size, topology, jurisdiction, and procurement readiness:

| Phase | Manufacturer | Integrator | Provider | Primary Drivers |
|---|---|---|---|---|
| Requirements + RFP | 2-6 weeks | 4-10 weeks | 4-12 weeks | Requirements clarity; standards baseline selection |
| Design freeze / ICD baseline | 2-8 weeks (module spec only) | 6-16 weeks | 8-20 weeks | Early design lock needed for modular sequencing |
| Component procurement | 8-60+ weeks | 8-60+ weeks | 8-60+ weeks | Critical equipment manufacturing delays |
| Module fabrication + FAT | 6-20 weeks | 8-24 weeks (multi-factory) | 8-24 weeks | Factory slots; test protocol complexity |
| Site works | Buyer/GC dependent (4-20+ weeks) | 6-24+ weeks | 6-24+ weeks | Permits; utility readiness; inspections |
| Delivery + install | 1-6 weeks | 2-10 weeks | 2-10 weeks | Heavy logistics, cranes, staging |
| SAT + IST + handover | Buyer-led (2-12+ weeks) | 4-16 weeks | 4-20 weeks | OPR-driven testing; integrated demonstrations |

The speed advantage of modular comes from parallel workflows and controlled factory builds—not from magically shorter equipment lead times.
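A back-of-envelope sketch makes the parallelization arithmetic concrete. The durations below are hypothetical mid-band values picked from the table above, not quotes:

```python
# Phase durations in weeks (illustrative mid-band values).
fabrication_fat = 16      # module fabrication + FAT, runs in the factory
site_works = 14           # pads, utilities, permits; runs at site
delivery_install = 4
sat_ist_handover = 8

# Stick-built logic: everything sequential.
sequential = fabrication_fat + site_works + delivery_install + sat_ist_handover

# Modular logic: fabrication and site works overlap; the longer one sets the pace.
parallel = max(fabrication_fat, site_works) + delivery_install + sat_ist_handover

# The saving equals the shorter of the two overlapped phases.
weeks_saved = sequential - parallel
```

With these numbers the overlap recovers 14 weeks, which is the whole of the site-works phase; nothing about component lead times changed.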

Vendor Qualification Checklists by Delivery Model

Qualification here is structured as procurement gates with evidence requirements.

Manufacturer Qualification Checklist

| Gate | Requirement | Evidence to Demand | Mismatch Prevented |
|---|---|---|---|
| Responsibility boundary | Documented module boundary and exclusions (site works, SAT, controls integration beyond module) | Boundary drawings, demarcation list, exclusions schedule | Buyer assumes system integration unknowingly |
| Quality system | Demonstrable manufacturing QA process suitable for repeat builds | Quality certificates (e.g., ISO 9001), ITP/QA plan | "Prototype quality" shipped into production |
| FAT protocol | FAT test plan with pass/fail criteria and witness method | FAT script; instrumentation list; acceptance thresholds; sample FAT report | FAT becomes "demo" instead of acceptance gate |
| FAT nonconformance control | Formal NCR workflow with closure before shipment | NCR log template; closure evidence; hold-point policy | Issues discovered at site without remedy path |
| Logistics and risk transfer | Incoterms selection aligned to project risk appetite; packaging and lifting plan | Incoterms clause text; packing method statement; lifting points/rigging plan | Transit damage and schedule blame disputes |
| Documentation pack completeness | Minimum documentation set delivered at shipment | Drawing list; O&M manual outline; as-built promise language | Late or missing O&M and test records |
| SAT support definition | Explicit scope/pricing for site support (optional) | Optional SAT support SOW with rates and response times | Buyer expects commissioning included "by default" |

Integrator Qualification Checklist

| Gate | Requirement | Evidence to Demand | Mismatch Prevented |
|---|---|---|---|
| Interface governance | Formal Interface Control Document (ICD) ownership and change-control authority | ICD template; interface register; RACI for interfaces | Interface gaps between OEMs and site contractors |
| Multi-vendor FAT strategy | Plan for integrated FAT when modules must be evaluated together | FAT Master Plan; remote witness plan; combined test matrix | "Each module passed FAT" but system fails at site |
| SAT/IST competence | Demonstrated ability to execute OPR-driven functional tests | Commissioning plan; sample IST report; test protocols | SAT devolves into start-up, not acceptance |
| Warranty bridging | Back-to-back warranties and explicit "single front door" defect process | Warranty matrix; claims workflow; OEM support agreements | Buyer trapped between OEMs blaming each other |
| Documentation consolidation | Single, indexed documentation pack across OEMs and site works | Master document register; controlled transmittals; as-built plan | Fragmented documentation undermines operations |
| Authority/approval strategy | Plan to package testing and compliance records for code authorities | Compliance dossier list; inspection plan; certification strategy | Approval delays due to missing records |

Provider Qualification Checklist

| Gate | Requirement | Evidence to Demand | Mismatch Prevented |
|---|---|---|---|
| Outcome definition | Contract defines "ready for IT load" in measurable terms linked to OPR | OPR baseline; acceptance test matrix; performance criteria | Acceptance becomes subjective and disputed |
| End-to-end design authority | Provider owns design coordination and compliance evidence across disciplines | Design responsibility statement; standards compliance matrix | Design gaps hidden until commissioning |
| FAT-to-SAT continuity | Single master test plan spanning factory and site, with traceability | Test traceability matrix; FAT/SAT scripts; deficiency closure gates | "Passed FAT" but site performance unproven |
| Live demonstration readiness | Ability to run real-world demonstrations and produce commissioning evidence | Demonstration scripts; load bank plan where applicable; commissioning dossier | Operational risk not de-risked before go-live |
| Defects and warranty regime | Single-point warranty with clear caps, exclusions, and defect remedy timelines | Warranty statement; defects process aligned to taking-over/defects period | Buyer faces fragmented warranty enforcement |
| Handover completeness | Full documentation pack plus training plus spares strategy at handover | Document register; O&M + training plan; spare parts list and lead times | |

Decision Matrix: Matching Use Case to Delivery Model

| Use Case / Constraint | Default-Fit Model | Risk Posture Purchased | Typical Trade-off | Must-Write Clauses |
|---|---|---|---|---|
| Strong in-house engineering + GC capability; need standardized modules quickly | Manufacturer | Buyer retains interface + commissioning risk | Lower prime margin; higher buyer management load; faster factory throughput if specs fixed | Incoterms + insurance; FAT protocol + witness; site interface spec; exclusion list for SAT/IST; documentation minimums |
| Multi-vendor module mix or phased brownfield integration | Integrator | Interface risk shifted to integrator within defined scope | Added integration fee; reduced rework/punch-list churn if integrator controls interfaces | ICD; integrated FAT/neutral-site testing plan; back-to-back warranties; single claims process |
| Single-point accountability for "ready for IT load" and commissioning evidence | Provider | Outcome risk shifted to prime provider | Higher price; fewer internal resources needed; earlier design lock to protect schedule | OPR-based acceptance tests; SAT/IST scope; liquidated damages / schedule remedies; warranty boundaries and caps |
| Third-party resilience validation required (Tier objective or equivalent) | Provider (or Integrator as prime with explicit commissioning authority) | Demonstrable facility behavior under real-world conditions | More testing time + documentation cost; reduced operational risk | Align tests to live demonstrations; require commissioning documentation package; define retest and deficiency closure gates |
| White-label / channel partner wants to resell under own brand | Provider or Integrator preferred; Manufacturer only if partner owns full integration capability | Brand risk minimized by clear accountability map | Branding adds documentation overhead; liability remains with accountable party | Branding vs certification marks; warranty pass-through; report ownership; customer-facing escalation tree |

Decision Flowchart

Start here: Define your required acceptance outcome.

Question 1: Is acceptance only "module meets spec at factory"?

  • Yes → Select Manufacturer model. Write: FAT protocol + Incoterms + interface spec + exclusions.
  • No → Continue to Question 2.

Question 2: Does acceptance require site-level performance demonstration?

  • Yes → Continue to Question 3.
  • No → Select Manufacturer model.

Question 3: Is single-point accountability required?

  • Yes → Select Provider model. Write: OPR + SAT/IST + taking-over + defects regime + SLAs.
  • No → Continue to Question 4.

Question 4: Are there multi-vendor modules or complex brownfield interfaces?

  • Yes → Select Integrator model as prime. Write: ICD + integrated FAT/SAT/IST + back-to-back warranties.
  • No → Select Provider model.
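For readers who think in code, the four questions reduce to a tiny function. A sketch only: the parameter names are ours, and real selection involves judgment these booleans flatten:

```python
def select_model(factory_acceptance_only: bool,
                 site_performance_demo: bool,
                 single_point_accountability: bool,
                 multi_vendor_or_brownfield: bool) -> str:
    """Direct transcription of the four-question flowchart."""
    if factory_acceptance_only:
        return "Manufacturer"      # Q1: acceptance stops at the factory gate
    if not site_performance_demo:
        return "Manufacturer"      # Q2: no site-level demonstration required
    if single_point_accountability:
        return "Provider"          # Q3: one party must own the outcome
    if multi_vendor_or_brownfield:
        return "Integrator"        # Q4: interface risk dominates
    return "Provider"              # default when none of the above applies
```

Note the asymmetry the flowchart encodes: "Manufacturer" is only reachable when acceptance never leaves the factory, and "Provider" is the fall-through answer whenever accountability or outcome demonstration is in play.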

Clauses That Prevent Mismatch: Quick Reference

| Clause Area | Manufacturer | Integrator | Provider |
|---|---|---|---|
| Scope boundary definition | Define module boundary and explicit exclusions for site works/commissioning | Define integration boundary plus included interface works | Define outcome boundary ("ready for IT load," performance metrics, documentation) |
| Design responsibility | Module internal only unless expanded | Owns ICD and integration design; OEM retains internal design | Owns end-to-end design coordination and compliance evidence |
| FAT | OEM-run FAT is primary acceptance gate | OEM FAT plus integrated FAT/neutral site where needed | Provider orchestrates FAT plus adds system-level staging tests |
| SAT/IST | Often not included; buyer runs; high mismatch risk if implied | Included and owned by prime integrator | Included and owned by prime provider; aligns to OPR and Tier objective |
| Warranty and defect remedy | Product warranty; transport damage risk depends on Incoterms | Back-to-back OEM warranties plus integrator obligation to manage defects end-to-end | Single-point warranty feasible; prime manages subs; align to defects notification period |
| Documentation pack | Minimum manufacturing plus FAT docs unless expanded | Consolidated pack across OEMs plus integration evidence | Full handover pack plus commissioning evidence tied to OPR |

Converting Partner Intent: Making the Selection Explicit

Channel conversion succeeds when intake data forces an explicit delivery model before pricing. Don't let sales qualification stay ad hoc.

Operationally, control mismatch by making the partner select, contractually, one of three acceptance endpoints:

Endpoint A (Manufacturer fit): Module-level FAT acceptance and goods delivery. All site performance risk retained by partner/buyer. Aligns with Incoterms-centric contracting and factory test evidence.

Endpoint B (Integrator fit): Integrated SAT/IST acceptance for a defined scope of interfaces and multi-vendor integration. Requires ICD ownership and integrated testing plan.

Endpoint C (Provider fit): Outcome acceptance tied to OPR and demonstrable facility behavior. Aligns with constructed-facility validation logic and commissioning evidence.

Once the endpoint is selected, everything else follows mechanically: design authority, test ownership, warranty boundary, documentation pack timing, and lead-time responsibility become enforceable rather than implied.

The Bottom Line

Modular data center procurement isn't about finding the "best" manufacturer, provider, or integrator. It's about selecting the delivery model that matches your acceptance requirements, then ensuring your contract language enforces that model consistently.

The manufacturer model works when you can own integration risk. The integrator model works when multiple vendors must perform as a system. The provider model works when you need someone else to own the outcome.

Get the model selection wrong, and you'll spend the project arguing about whose fault the site issues are.

Get it right, and the accountability structure does the work for you.

Standards referenced: ISO/IEC 22237 (facilities/infrastructure lifecycle), ANSI/TIA-942 (physical infrastructure requirements), Uptime Institute Tier programs (design and constructed-facility validation), ASHRAE commissioning guidelines (OPR-driven testing), FIDIC conditions of contract (acceptance and defects), ICC Incoterms (risk transfer).

Yuri Milyutin

Commercial Director at ModulEdge