
Engineering Quality Into a Blue Prism CoE

Introducing development standards, a full SDLC, design authority, peer review, and shared process and object libraries into a growing Blue Prism CoE — none of which existed at the start.

Design & Development Quality · SDLC · Development Standards · CoE · Financial Services
Context

In the early stages of a Blue Prism programme, delivery pressure dominates. Processes get built and promoted to production quickly, standards are informal or non-existent, and the team operates on shared understanding rather than documented practice. That approach works when the estate is small and everyone who built it is still in the room.

As the programme scales — more processes, more developers, more system dependencies — the absence of a formal quality infrastructure becomes a compounding liability. Technical debt accumulates quietly. Object duplication proliferates. Processes built to different standards become increasingly expensive to maintain. And when something breaks in production, the absence of a runbook or exception guide means the time to resolution is longer than it needs to be.

The challenge

The programme had no formal intake process, no defined development lifecycle, no shared object or process libraries, and no quality gates between design and production. Requests arrived informally, development started without completed designs, and processes went live having been seen only by the developer who built them.

Introducing structure without disrupting a programme already in delivery required each piece to be designed carefully — sequenced in a way that improved quality without creating friction that slowed the team down.

Three layers of quality infrastructure

Quality in a Blue Prism CoE isn't a single initiative — it's a set of interlocking practices that each reinforce the others. Introducing them in isolation produces limited results; building them as a coherent system changes how the whole programme operates.

01

Structured intake and full SDLC

Jira was introduced to track automation requests through the full development lifecycle — from intake through design, build, testing, and production handover. Separate, defined phases for development testing, UAT, and regression replaced the single informal test phase that had preceded them. Each process went live with a formal handover pack: runbook, exception guide, and support contact. The full lifecycle was tracked and visible, rather than existing only in the heads of the people who delivered it.
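The gate logic described above — a process may not go live until its handover pack is complete — can be sketched in code. This is a purely illustrative model: the actual tracking was done in Jira, and every name here (`Phase`, `AutomationRequest`, the artefact fields) is a hypothetical stand-in, not real tooling.

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical model of the lifecycle phases tracked in Jira.
class Phase(Enum):
    INTAKE = 1
    DESIGN = 2
    BUILD = 3
    DEV_TEST = 4
    UAT = 5
    REGRESSION = 6
    LIVE = 7

@dataclass
class AutomationRequest:
    name: str
    phase: Phase = Phase.INTAKE
    runbook: bool = False           # handover pack artefacts
    exception_guide: bool = False
    support_contact: str = ""

    def advance(self) -> Phase:
        # A process may only be promoted to LIVE once the handover pack is complete.
        if self.phase is Phase.REGRESSION:
            missing = [label for label, ok in [
                ("runbook", self.runbook),
                ("exception guide", self.exception_guide),
                ("support contact", bool(self.support_contact)),
            ] if not ok]
            if missing:
                raise ValueError(f"Cannot go live; missing: {', '.join(missing)}")
        self.phase = Phase(self.phase.value + 1)
        return self.phase
```

The point of the sketch is that the gate is structural: the promotion step itself refuses to complete without the artefacts, rather than relying on a developer remembering to produce them.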

02

Standards, templates, and libraries

Development standards were documented and applied consistently across the team — naming conventions, exception handling patterns, credential management, version control. Process development templates ensured every solution was designed to the same structure before a single stage was built. A full map of shared object and process libraries was created and maintained: developers were required to check the library before creating new objects, eliminating duplication and keeping the estate coherent. None of this existed at the start — each piece was introduced as the programme reached the maturity that made it necessary.

03

Design authority and peer review

A design authority gate was introduced before development began: solution designs were reviewed and signed off at the design stage, with exception handling approaches defined in the design rather than decided during build. Peer review was introduced before production promotion, so no process went live having been seen only by the developer who built it. These gates catch problems at the right stage. A design flaw found before build starts costs an hour. The same flaw found in production costs significantly more — in time, in operational disruption, and in the credibility of the programme.
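The design-gate idea reduces to a checklist evaluated before build is allowed to start. A minimal sketch, assuming a simple dictionary-shaped design record — the field names here are invented for illustration:

```python
def gate_check(design: dict) -> list[str]:
    """Return reasons a design cannot pass to build (empty list = approved).

    Hypothetical checks mirroring the gates described in the case study:
    exception handling defined at design time, design authority sign-off,
    and a peer reviewer assigned before production promotion.
    """
    issues = []
    if not design.get("exception_handling"):
        issues.append("exception handling approach not defined")
    if not design.get("signed_off_by"):
        issues.append("no design authority sign-off")
    if not design.get("peer_reviewer"):
        issues.append("no peer reviewer assigned for promotion")
    return issues
```

The checklist form makes the gate cheap to run and impossible to skip silently: an empty list is the only pass condition.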

How it came together

Each element of the quality infrastructure was introduced progressively — not as a wholesale transformation imposed on the team at once, but as a deliberate addition at the point where the programme had grown enough to need it. Standards were documented before inconsistent practice could take hold. Libraries were built before duplication became a maintenance problem. Design authority was introduced before the volume of concurrent deliveries made ad hoc design decisions unmanageable.

The result was a development practice where quality was structural rather than dependent on individual discipline. Any developer working to the standards, using the templates, checking the libraries, and passing through the design and review gates would produce a process that met the programme's quality bar — regardless of their level of experience.

That consistency matters most when the team changes — when a senior developer moves on, when new resource joins, or when the programme scales faster than it can onboard people carefully. The quality infrastructure carries the standard forward independently of who is in the team at any given time.

Outcome

A development practice where quality was structural — not dependent on individual discipline or institutional memory.

Processes built to a consistent standard. An estate that remained maintainable as it grew. Problems caught at design and review rather than in production. And a programme that could onboard new developers and bring them up to standard through the practice itself, rather than relying on the knowledge of whoever happened to be senior at the time.

ROM2 dimension
Design & Development Quality

Design & Development Quality in ROM2 asks whether solutions are designed before they are built, and built to a consistent standard. The standards, gates, and practices that determine this aren't overhead — they are what separates an automation estate that remains maintainable at scale from one that accumulates debt quietly until it becomes unmanageable.
