Enterprise Architecture Case Study

A Teaching and Learning Roadmap in Under Two Months

What Worked and Why

NSW Education · 2020-2021 · 1,000+ Schools · 200,000+ Students

How a sprint-based approach, stakeholder co-creation, and domain grounding delivered a comprehensive digital transformation roadmap that educators actually owned.

A few years ago I was brought in as an enterprise architect on one of Australia's most ambitious public education programmes. The government had committed hundreds of millions of dollars to close the digital divide between rural and metropolitan schools - over a thousand schools, more than two hundred thousand students, and tens of thousands of teachers. The programme was organised into digital focus areas covering everything from connectivity and devices to teaching and learning tools.

My brief was the teaching and learning technology layer - understanding how digital capabilities could genuinely support the teaching and learning journey at that scale. The opportunity was significant: a chance to map the entire teaching and learning experience from pre-planning through to assessment and reporting, and to define a digital roadmap grounded in how educators actually work rather than how technologists think about systems.

This is the story of how we reframed the programme around the teaching and learning experience, delivered a comprehensive roadmap in under two months, and handed the project managers a decision-making framework they could actually use.

It Started with a Plan - for the Engagement Itself

Before conducting a single interview, we did two things that shaped everything that followed.

First, we structured the engagement as three sprints. Sprint One was research and discovery. Sprint Two was synthesis - vision, strategic themes, value stream, user stories. Sprint Three was consolidation - programme increment plan, journey maps, and stakeholder feedback. Weekly status reports kept artefacts visible throughout. No one was waiting six months for a big reveal.

Second, we mapped the structure of the deliverable before starting the research. A mind map defined what the roadmap report would contain and how its sections would relate to each other - the programme overview, the solution overview with as-is and to-be states, the strategic themes, the value stream, the journey maps, and the programme increment plan. This structure became a contract with our sponsor about what they would receive, and it gave every piece of research a clear home.

"This sounds simple. In practice, most consulting engagements do it the other way around - gather information first, figure out the structure later. Defining the structure first forces clarity of thinking and naturally controls scope creep."

Research First, Architecture Second

🎯 Critical Success Factor

A key factor in the engagement's success was the support of an education researcher from within the department - an expert in the teaching and learning lifecycle and teacher practices. Their domain expertise grounded every architecture decision in educational reality and ensured we were working with the language and frameworks that educators actually used, not approximations of them. This collaboration meant we could move quickly without sacrificing depth.

Sprint One was almost entirely outward-facing. We built a comprehensive research wall - a visual workspace in Miro that mapped the entire programme landscape. Every existing delivery stream and its dependencies. The department's own teaching and learning frameworks. The programme horizons. The strategic alignment to government priorities. The relationships between all the digital focus areas.

Critically, we invested time understanding how the department itself described the teaching and learning cycle. Rather than arriving with an external framework and mapping the department's world into it, we studied the language already in use - including the school excellence framework that principals used for self-assessment and improvement planning.

We connected with stakeholders across the education and technology communities in the first week. These conversations surfaced what no desktop research could: the real pain points, the workflows that actually happened in classrooms, and the language that educators used to describe their work.

It was from these interviews that the value stream approach crystallised. Teachers kept describing their work as a cycle - planning, programming, teaching, assessing, reflecting, reporting - and it became clear that this cycle, not the technology landscape, was the right organising frame.

By the end of Sprint One, we had a validated scope, a comprehensive research base, and a first draft of the programme overview. No one had asked us to draw a solution architecture yet. That was exactly right.

Five Decisions That Made the Difference

Looking back, five methodological decisions elevated this from a standard roadmap exercise into something that genuinely landed with stakeholders.

1. Using the department's own framework as architecture scaffolding

The department's school excellence framework covered three domains - Learning, Teaching, and Leading - with detailed elements under each: everything from curriculum provision and assessment through to data literacy, professional standards, and school resource management. Schools already used this framework for self-assessment and improvement planning.

Rather than inventing a new capability model, we used the SEF as the scaffolding for mapping every proposed technology capability. We created a matrix where each row was a framework element (say, "Formative Assessment" or "Data Use in Planning") and each column represented a maturity stage. Into each cell we placed specific capability cards describing what a teacher or student would be able to do - tagged with the systems required to deliver it.
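The matrix described above can be sketched as a simple data model. This is an illustrative sketch only - the element name, maturity stage, and capability wording below are placeholders, not the department's actual SEF content:

```python
from dataclasses import dataclass

@dataclass
class CapabilityCard:
    """What a teacher or student would be able to do at this maturity stage."""
    description: str
    systems: list[str]  # systems tagged as required to deliver the capability

# Matrix keyed by (framework_element, maturity_stage) -> capability cards.
# Element and stage names here are illustrative placeholders.
matrix: dict[tuple[str, str], list[CapabilityCard]] = {}

def add_card(element: str, stage: str, card: CapabilityCard) -> None:
    matrix.setdefault((element, stage), []).append(card)

add_card(
    "Formative Assessment", "Sustaining and Growing",
    CapabilityCard(
        description="Teacher reviews live checkpoint results and adjusts the next lesson",
        systems=["assessment platform", "learning analytics"],
    ),
)

# Coverage check: which framework elements still have no capability at any stage?
# Working from the framework's full element list makes gaps visible immediately.
elements = {"Formative Assessment", "Data Use in Planning"}
covered = {element for (element, _stage) in matrix}
gaps = elements - covered
```

The coverage check is the point of the structure: because the rows come from an established framework rather than a brainstorm, an empty row is a visible gap rather than a silent omission.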

This achieved three things at once:

1. Comprehensive coverage: working through an established framework is far less likely to miss capabilities than brainstorming from a blank sheet.

2. Immediate legibility: every capability was expressed in educators' existing planning language.

3. Natural alignment: the capability map created synergy between the digital programme and existing school improvement processes.

Most EA consultants would have brought TOGAF or APQC and translated the department's world into it. That always creates friction. Using the framework educators already owned meant there was no translation layer to argue about.

2. Quantifying benefits in teacher-hours, not technology metrics

We decomposed the programme's technical streams into four outcome-oriented strategic themes - each with a human tagline that teachers could relate to:

  • Teacher Digital Agency - "Empower me"
  • Teacher Efficiency - "Free me"
  • Personalised Learning - "Know me. Show me. Value me."

But naming themes is easy. Making them concrete is harder. For each theme, we defined specific capability statements and mapped them to measurable benefits using the department's own research on teacher administrative burden:

  • Streamlining routine admin: Reduced time spent on routine administrative processes
  • Centralised resource discovery: Significantly less time spent searching for teaching resources
  • Integrated reporting (enter once, use many times): Reduced administrative burden per teacher

When a school leader can say "this programme will reduce the administrative load and give my teachers more time with students," that is a fundamentally different conversation than "this programme will implement a digital learning resource hub."

3. Co-creating priorities through stakeholder voting

User stories were put to stakeholder vote in a live Miro workshop, with vote counts visible on each story. Separately, nine stakeholders each independently prioritised the full set of technology enablers in their own workspace, arranging them by high, medium, and low priority.

This dual mechanism - collaborative voting for stories, independent prioritisation for enablers - produced something fundamentally different from the typical consulting model of "we interviewed people and here's what we think they said."
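The independent-prioritisation half of this mechanism reduces to a simple aggregation: each stakeholder assigns high, medium, or low to every enabler in their own workspace, and the views are combined afterwards. A minimal sketch, with hypothetical stakeholder and enabler names and an assumed 3/2/1 weighting (the source does not describe how views were combined):

```python
from collections import Counter

# Each stakeholder's independent high/medium/low view of the enablers.
# Names and ratings here are hypothetical.
rankings = {
    "stakeholder_a": {"machine-readable curriculum": "high", "recommendation engine": "medium"},
    "stakeholder_b": {"machine-readable curriculum": "high", "recommendation engine": "low"},
    "stakeholder_c": {"machine-readable curriculum": "medium", "recommendation engine": "medium"},
}

WEIGHT = {"high": 3, "medium": 2, "low": 1}

def consensus(rankings: dict[str, dict[str, str]]) -> list[tuple[str, int]]:
    """Combine independent prioritisations into one ranked list, highest score first."""
    scores: Counter[str] = Counter()
    for stakeholder_view in rankings.values():
        for enabler, priority in stakeholder_view.items():
            scores[enabler] += WEIGHT[priority]
    return scores.most_common()

ranked = consensus(rankings)
```

Because every input ranking is preserved, the combined order stays traceable back to individual stakeholders - which is what allowed the final plan to show not just the priority but whose views produced it.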

Stakeholders owned the priorities because they had literally created them.

When the programme increment plan landed, nobody could say "the consultants decided this" - the traceability ran from vision through strategic themes through voted stories through independently prioritised enablers to the delivery timeline.

4. Bringing the future to life through journey maps

The most technically ambitious elements of the roadmap - machine-readable curriculum, recommendation engines, learner profiles, education telemetry - were concepts that could easily alienate non-technical stakeholders if presented as architecture diagrams or system specifications. We needed a vehicle that could explain what the future would actually feel like for a teacher, a school leader, or a student, without requiring them to understand the technology underneath.

That vehicle was a set of highly visual six-panel journey maps, one for each phase of the teaching and learning value stream. Each panel told a story: a narrative paragraph written in educator language describing what happens at that stage, with the enabling systems woven naturally into the text. A teacher doesn't "interact with the curriculum management system" - they see syllabus changes automatically highlighted in a knowledge navigator that surfaces linked learning resources. A school leader doesn't "consume analytics from an operational data store" - they augment their school improvement plan with student insights drawn from across the school.

Each panel was backed by a system diagram showing the technology touchpoints, colour-coded by scope status, so that delivery teams could see exactly which systems were required. But the narrative came first. The technology was in service of the story, not the other way around.

These journey maps became the primary communication tool for explaining complex technical concepts to non-technical stakeholders. When a principal read the pre-planning journey map, they could see their own start-of-year process reflected back - but enhanced. When a teacher read the programming journey map, they could see how finding resources, building programs, and differentiating for students would actually work in practice. The future state stopped being abstract and became tangible.

5. Synthesising everything onto a single page

The Solution Overview was perhaps the most effective communication artefact in the whole engagement. Three layers on one page:

  • Layer 1 - What teachers do: the value stream with activities per phase
  • Layer 2 - What we build to help them: epics and enablers organised as digital focus areas
  • Layer 3 - What we depend on: key dependencies highlighted

Any stakeholder - from a classroom teacher to a minister - could understand this page in under two minutes. It answered the only three questions that matter: what is the work, what are we building, and what could block us.

The Handoff

The programme increment plan - a delivery matrix with streams as rows, school terms as columns, and items colour-coded by scope status - was the culminating deliverable. But it was not a delivery commitment. It was handed to the project managers who then worked out what could realistically be included given time and budget constraints.

This is the proper role of an enterprise architecture roadmap: not to dictate delivery, but to ensure delivery decisions are informed.

The project managers could make trade-offs with full visibility of what they were deferring and why it mattered. Every item traced back to a strategic theme, a stakeholder vote count, and a quantified benefit. "We're deferring the recommendation engine to 2023" is a very different statement when you can see that it traces to the personalised learning theme, received moderate stakeholder votes, and depends on the machine-readable curriculum enabler that won't be ready until late 2022.
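A traceability record of this kind is easy to make concrete. The sketch below shows the shape of the chain - theme, vote count, benefit, dependency, target term - for a single item; the specific values are illustrative, not the programme's actual data:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class RoadmapItem:
    """One programme-increment item with its traceability chain.

    All field values used below are illustrative examples.
    """
    name: str
    strategic_theme: str
    stakeholder_votes: int
    quantified_benefit: str
    depends_on: tuple[str, ...]  # enablers this item cannot ship without
    target_term: str

item = RoadmapItem(
    name="Recommendation engine",
    strategic_theme="Personalised Learning",
    stakeholder_votes=7,
    quantified_benefit="Less time spent searching for teaching resources",
    depends_on=("Machine-readable curriculum",),
    target_term="2023 T1",
)

def deferral_note(item: RoadmapItem) -> str:
    """Explain a deferral in terms of what the item traces back to."""
    return (
        f"Deferring '{item.name}' ({item.strategic_theme}, "
        f"{item.stakeholder_votes} votes) - depends on: {', '.join(item.depends_on)}"
    )
```

The point is that a deferral decision carries its own justification: the note is generated from the chain, so a project manager never has to reconstruct why an item mattered.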

Several concepts from the roadmap - including the learner profile - later became standalone departmental initiatives in their own right, which speaks to the lasting influence of the architecture thinking beyond the immediate engagement.

The full traceability chain - from vision to strategic themes to capability mapping to value stream to journey maps to voted user stories to prioritised enablers to programme increment plan to project manager handoff - was completed in under two months.

Not because we cut corners, but because the method was designed for pace: structure first, research second, synthesis third, consolidation fourth.

Key Outcomes

  • Comprehensive roadmap delivered in under two months - from vision to programme increment plan
  • Stakeholder-validated priorities across four strategic themes through voting and independent prioritisation
  • Full traceability from vision through strategic themes, value stream, voted user stories, and prioritised enablers to delivery timeline
  • Decision-making framework handed to project managers with visibility of trade-offs and impact
  • Lasting influence - several roadmap concepts (including the learner profile) became standalone departmental initiatives

Reflections

I've had the opportunity since this engagement to reflect on what made it work - and to compare the approach against other EA roadmap engagements I've seen across government, health, and not-for-profit sectors.

The honest reality is that most roadmap engagements follow a predictable pattern. A consultant interviews a few people, disappears for six weeks, and comes back with a slide deck that reflects their pre-existing views dressed up in the client's terminology.

The deck contains a technology architecture diagram, a capability model borrowed from TOGAF or APQC, a Gantt chart, and maybe some generic user stories. Stakeholders nod politely. The deck goes into a SharePoint folder. Nothing changes.

What made this engagement different was three things: method discipline, genuine co-creation, and domain grounding.

On Method Discipline

Defining the deliverable structure before starting research, running three timeboxed sprints, and keeping artefacts visible to stakeholders every week meant we never lost momentum.

Most consultants don't commit to a deliverable structure before they start research because they're afraid of scoping something they don't yet understand. The confidence to do that - and to hold to it - comes from experience. A tight timeframe forces prioritisation and keeps stakeholders engaged rather than fatigued.

On Co-Creation

Having nine stakeholders independently prioritise enablers in their own workspaces, and having user stories voted on in a live workshop, produced priorities that stakeholders owned because they had literally created them.

Many engagements claim to be collaborative but actually amount to "we showed them the deck and they didn't object." When the programme increment plan landed, nobody could say "the consultants decided this." The traceability ran from the vision they had helped shape, through the stories they had voted on, to the delivery timeline.

On Domain Grounding

Using the department's own school excellence framework as the architecture scaffolding - rather than arriving with an external framework and translating the department's world into it - was the single smartest call in the engagement.

Every capability was expressed in the language educators already used for school planning. There was no translation layer to argue about. When principals looked at the capability map, they saw their school improvement plan populated with digital opportunities, not a technology architecture they needed someone to interpret.

This kind of decision looks obvious in hindsight. In practice, it almost never happens - consultants are perpetually tempted to bring their own framework because it's what they know.

The collaboration with the department's education researcher was essential to this. Enterprise architecture without domain expertise produces technically sound roadmaps that miss the point.

Domain expertise without architecture discipline produces wish lists that cannot be delivered.

Bringing both together - and giving the domain expert genuine influence over the architecture decisions, not just a review role - is what made the capability mapping, the value stream, and the journey maps credible to both educators and technologists.

And knowing where the job ended mattered. The programme increment plan was a decision-making framework, not a delivery commitment. Handing it to the project managers with full traceability - so they could make realistic scoping decisions with visibility of what they were deferring and why it mattered - is the proper role of an enterprise architecture roadmap. It is not there to dictate delivery. It is there to ensure delivery decisions are informed.

If I had to distil what set this engagement apart into a single observation, it would be this:

The working artefacts - the Miro boards, the research wall, the voted stories, the independently prioritised enablers - show thinking in systems and relationships, not just slides. That is the difference between enterprise architecture that facilitates genuine transformation and enterprise architecture that produces documents.

And if there is one principle I would take from this engagement to any similar programme, it is this: start with the practitioners' experience of their own work, use their own frameworks and language, and make them the authors of the priorities.

The architecture method that made this engagement successful - the research, the framework mapping, the value stream decomposition, the capability definition - is increasingly something organisations can do for themselves with AI-assisted approaches rather than relying solely on expensive consulting engagements. The domain expertise lives inside the organisation. The architecture discipline can be augmented. What matters is bringing the two together in a way that produces something practitioners recognise as their own.

Stakeholder Testimonials


Simon Maizels

Group Director, ITD Teaching and Learning Experience

Vinod is an outstanding talent that helped the Department with fresh thinking on large, complex digital transformation programs. He is a skilled design thinking practitioner and has the ability to...


Dr Pedro Harris PSM

Chief Digital Education Delivery Lead at NSW Department of Education

Vinod is someone who thinks outside the box and always has fresh insights to share. I am always attracted to work with people with a high ethical frame and social conscience - Vinod ticks both boxes....

I work as an enterprise architect specialising in mission-driven organisations - education, health, aged care, and community services. I'm increasingly focused on how AI can augment enterprise architecture practice, enabling organisations to develop and maintain their own capability models and roadmaps with less reliance on traditional consulting. If that's a conversation that interests you, I'd welcome it.