Delivery Methodology v1.0 · April 2026

Clouditive Foundations Framework

A 5×5 delivery matrix: five lifecycle phases, five capability pillars, 25 delivery cells. Every engagement produces a maturity radar, DORA baselines, and capabilities that survive after Clouditive rolls off.

Typical Platform Vendor

  • Sells hours and profiles
  • Ships tools, not outcomes
  • Metrics are reports, not baselines
  • Knowledge leaves when the consultant does

Clouditive Foundations

  • Sells phases with defined outcomes
  • Delivers measured capabilities
  • DORA + DX baseline in every engagement
  • Methodology survives the engagement

Four Design Principles

Operational checks that govern every engagement, not aspirational values.

Outcome-Driven

Every initiative traces to a DORA metric or business outcome. No tool is implemented because it is modern, only because it moves a measured number.

ADR-Governed

Every architectural decision of consequence is documented, reviewed, and approved as an Architecture Decision Record before execution. The governance process is as important as the decision.

Evidence-Based

Baseline before intervening. Measure during. Demonstrate impact after. Nothing is declared successful without data. Declaring success without it is the most common Platform Engineering failure mode, and we prevent it.

Progressive Delivery

Capabilities ship in phases. Every phase produces independent value. No big-bang migrations. The client can stop at any phase with working assets.

Five Phases. Each Closes with Exit Artifacts.

The lifecycle of every Clouditive engagement, from initial assessment to measured ROI.

01 · Horizon · 3–6 weeks · Where are we?

We baseline your current state across all five pillars using structured interviews, tooling audits, and DORA measurement. The output is a scored radar chart and a prioritized findings register, the shared map both teams work from.

Exit artifact: Maturity radar + prioritized findings

02 · Blueprint · 4–8 weeks · What should it look like?

We design the target state: architecture decisions documented as ADRs, a roadmap with measurable phase gates, and a team model for the delivery phase. Nothing is built before the design is approved.

Exit artifact: Target architecture + approved ADRs

03 · Forge · 3–9 months · How do we build it?

Delivery phase. Each capability is built against its approved ADR, with adoption evidence collected as we go. Every deliverable is production-grade and paired with runbooks targeted at the teams who will own it.

Exit artifact: Production capabilities + runbooks

04 · Sustain · Ongoing · How do we keep it reliable?

Operationalize what was built: SLO reviews, postmortem cadence, on-call optimization, and adoption dashboards. This is the phase that turns platform investment into a defensible operational practice.

Exit artifact: SLOs met + adoption metrics

05 · Ascend · Quarterly · How do we measure ROI?

Re-assessment against the original baseline. The maturity radar is updated, ROI is documented against DORA deltas, and the next phase of expansion is proposed with data rather than opinion.

Exit artifact: Updated radar + expansion proposal

The Full Picture

Every phase × every pillar produces a defined deliverable.

Platform
  • Horizon: Cloud audit + IaC baseline
  • Blueprint: Module library design
  • Forge: Terraform / Crossplane build
  • Sustain: Drift detection
  • Ascend: Cost & policy governance

Delivery
  • Horizon: Pipeline inventory
  • Blueprint: GitOps blueprint
  • Forge: CI/CD implementation
  • Sustain: Release automation
  • Ascend: Self-service upgrades

Reliability
  • Horizon: SLO gap analysis
  • Blueprint: Error budget design
  • Forge: Runbook + alert build
  • Sustain: On-call optimization
  • Ascend: Chaos program

Observability
  • Horizon: DORA baseline
  • Blueprint: Tracing architecture
  • Forge: Dashboard rollout
  • Sustain: Cost visibility
  • Ascend: Executive reporting

DX
  • Horizon: Cognitive load survey
  • Blueprint: IDP blueprint
  • Forge: Golden path build
  • Sustain: Portal launch
  • Ascend: SPACE benchmarking

25 delivery cells · each with defined artifacts, exit criteria, and a maturity rubric

Five Pillars. The Capability Surface.

Each pillar is owned by a named Clouditive role. ADRs and deliverables are scoped to one pillar — no mixed artifacts.

01 · Platform Foundations

IaC, Kubernetes, cloud infrastructure, networking, identity, secrets, policy-as-code.

Terraform · Crossplane · Kubernetes · AWS/GCP/Azure · Vault

02 · Delivery Engineering

CI/CD pipelines, GitOps, release strategies, environment management, developer tooling.

GitHub Actions · GitLab CI · Azure DevOps · ArgoCD · FluxCD · Feature Flags

03 · Reliability & Operations

SLOs, SLIs, error budgets, incident management, on-call, runbooks, capacity planning.

PagerDuty · Runbooks · Chaos Engineering · SLO frameworks

04 · Observability & Insight

Metrics, logs, distributed tracing, cost visibility, DORA dashboards, executive reporting.

Datadog · New Relic · Grafana/LGTM · OpenTelemetry · DORA Pipelines

05 · Developer Experience

Internal Developer Platform, self-service workflows, golden paths, docs, onboarding.

Backstage · Self-service templates · Cognitive load surveys · SPACE metrics

Four Maturity Levels

Every delivery cell is scored against this rubric. A level-up movement in any cell is a legitimate engagement objective.

1 · Ad-hoc: Informal, heroics-dependent. Not documented. No baseline metrics.

2 · Managed: Formalized for a subset of teams. Documentation exists. Some manual measurement.

3 · Defined: Standardized across the org. ADR-governed. Automated where feasible. Metrics tracked continuously.

4 · Optimized: Continuously improved based on measured outcomes. Integrated with developer workflows. Demonstrable ROI.
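As an illustration of how the rubric and the 5×5 matrix combine into the maturity radar, the sketch below scores delivery cells 1–4 and averages each pillar's cells into one radar value. The function name, data layout, and sample levels are invented for this example; they are not part of any Clouditive tooling.

```python
from statistics import mean

PHASES = ["Horizon", "Blueprint", "Forge", "Sustain", "Ascend"]
PILLARS = ["Platform", "Delivery", "Reliability", "Observability", "DX"]

def radar(scores: dict[tuple[str, str], int]) -> dict[str, float]:
    """Average the 1-4 maturity level of each scored pillar's cells."""
    present = [p for p in PILLARS if any(pillar == p for pillar, _ in scores)]
    return {
        p: round(mean(lvl for (pillar, _), lvl in scores.items() if pillar == p), 2)
        for p in present
    }

# Invented sample: one pillar scored across all five phases.
scores = {("Platform", phase): lvl for phase, lvl in zip(PHASES, [3, 2, 2, 1, 1])}
print(radar(scores))  # {'Platform': 1.8}
```

A level-up objective then reads directly off this structure: moving ("Platform", "Sustain") from 1 to 2 lifts the pillar's radar value by 0.2.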

Five Entry Points

Any rectangular slice of the 5×5 matrix is a valid engagement. These are the canonical commercial patterns.

01 · Foundations Assessment · Recommended First Step · 4–6 weeks

Horizon × all pillars

Entry point for new clients. Produces the maturity radar and a prioritized roadmap. Priced to be approved at director level without a lengthy procurement cycle.

02 · Platform Foundation Build · 6–9 months

Blueprint + Forge × Platform Foundations + Delivery Engineering

Core platform engagement for organizations with an existing but fragmented stack.

03 · IDP Delivery · 6–12 months

Blueprint + Forge × Developer Experience + Observability

Flagship offer for mature clients with existing infrastructure but weak developer experience.

04 · Reliability Program · 6–12 months + retainer

Blueprint + Forge + Sustain × Reliability + Observability

For clients with stability pain and mature platforms. Often transitions into a Sustain retainer.

05 · Staff Augmentation with Framework · Ongoing

Any cells, embedded delivery

The white-label partner model. Clouditive engineers embedded in partner teams, bringing the methodology as a differentiator.

How We Measure Every Engagement

Three complementary metric sets. No engagement relies on a single set — the combination is what produces a defensible picture of platform health.

Primary · DORA Metrics

The industry standard for delivery performance. Baselined in Horizon, tracked through every Sustain cycle.

  • Deployment Frequency
  • Lead Time for Changes
  • Change Failure Rate
  • Mean Time to Restore
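As a sketch of what a Horizon-phase DORA baseline involves, the snippet below derives all four metrics from a small deployment log. The record layout and the sample data are assumptions made for illustration, not a prescribed schema.

```python
from datetime import datetime, timedelta

# One record per production deploy. The tuple layout
# (commit_time, deploy_time, caused_failure, restored_at)
# is invented for this sketch.
deploys = [
    (datetime(2026, 4, 1, 9), datetime(2026, 4, 1, 15), False, None),
    (datetime(2026, 4, 2, 10), datetime(2026, 4, 3, 11), True,
     datetime(2026, 4, 3, 13)),
    (datetime(2026, 4, 6, 8), datetime(2026, 4, 6, 12), False, None),
]
window_days = 7

# Deployment Frequency: deploys per day over the measurement window.
freq = len(deploys) / window_days

# Lead Time for Changes: mean commit-to-production delay.
lead = sum((dep - com for com, dep, _, _ in deploys), timedelta()) / len(deploys)

# Change Failure Rate: share of deploys that caused a failure.
failed = [rec for rec in deploys if rec[2]]
cfr = len(failed) / len(deploys)

# Mean Time to Restore: mean deploy-to-restore delay for failed deploys.
mttr = sum((rest - dep for _, dep, _, rest in failed), timedelta()) / len(failed)

print(f"freq/day={freq:.2f} lead={lead} cfr={cfr:.0%} mttr={mttr}")
```

Re-running the same computation each Sustain cycle is what turns these numbers from a one-off report into a baseline with measurable deltas.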
Supplementary · Clouditive Metrics

Two metrics that surface failure modes DORA alone misses — platform friction and review bottlenecks.

  • DevOps Ticket Ratio: platform support tickets per active developer
  • PR Cycle Time (p95): open to merge; surfaces review bottlenecks invisible in mean lead time
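A minimal sketch of both supplementary metrics, with invented helper names and sample data. The percentile uses the nearest-rank method, which is one reasonable choice among several:

```python
import math
from datetime import timedelta

def p95(durations: list[timedelta]) -> timedelta:
    """Nearest-rank 95th percentile: the slow tail a mean would hide."""
    ordered = sorted(durations)
    rank = max(0, math.ceil(0.95 * len(ordered)) - 1)
    return ordered[rank]

def devops_ticket_ratio(platform_tickets: int, active_devs: int) -> float:
    """Platform support tickets filed per active developer in the period."""
    return platform_tickets / active_devs

# Invented sample: most PRs merge within hours, a few sit for days.
cycle_times = [timedelta(hours=h) for h in (2, 3, 3, 4, 5, 6, 8, 30, 52, 70)]
print(p95(cycle_times))               # dominated by the review-bottleneck tail
print(devops_ticket_ratio(120, 80))   # 1.5 tickets per developer
```

The sample illustrates the point of the p95 cut: the mean cycle time here is about 18 hours, while the p95 is 70 hours, which is the bottleneck a team actually feels.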

Qualitative · DX Signals

Quarterly surveys and structured interviews to capture what quantitative metrics miss.

  • Cognitive load
  • Tool satisfaction
  • Deployment confidence
  • On-call burden

Ready when you are

Start with the Radar.

A four- to six-week Foundations Assessment produces a shared maturity baseline and a prioritized roadmap. It is priced to be approved at director level, and it is the artifact that justifies everything that follows.

Questions? Reach out to the team