Framework

A structured seven-phase framework from orientation through expansion. One workflow proven before the next begins. The right tool selected for each problem.

The seven phases

Phase 0: Orientation

Who are we working with and what are the ground rules?

Scope, compliance constraints, data access, stakeholder map, communication strategy. Nothing proceeds without this foundation.

Phase 1: Workflow Discovery

What does the team actually do?

Interviews across leadership, team leads, and individual contributors — separately. The real process rarely matches the documented one. Shadow systems, workarounds, and single points of failure surface here.

Phase 2: Assessment

Which workflows are worth targeting?

Every discovered workflow scored across eight dimensions: labor intensity, current cost, return potential, frequency, AI suitability, implementation complexity, data availability, and error tolerance. The matrix drives the recommendation; the conversation around divergent scores drives the insight.
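In code, an assessment matrix of this shape can be sketched as a weighted score per workflow. The eight dimension names come from the framework; the weights, the 1–5 scale, and the example workflow are invented for illustration:

```python
# Hypothetical sketch of the eight-dimension assessment matrix.
# Dimension names are from the framework; weights and scores are illustrative.

DIMENSIONS = [
    "labor_intensity", "current_cost", "return_potential", "frequency",
    "ai_suitability", "implementation_complexity", "data_availability",
    "error_tolerance",
]

# Equal weights here for simplicity; a real matrix would tune these, and
# invert dimensions where "more" is worse (e.g. implementation complexity).
WEIGHTS = {d: 1.0 for d in DIMENSIONS}

def score(workflow: dict) -> float:
    """Weighted average of a workflow's 1-5 scores across all dimensions."""
    total = sum(WEIGHTS[d] * workflow[d] for d in DIMENSIONS)
    return total / sum(WEIGHTS.values())

# Example workflow: mostly average, but highly AI-suitable and frequent.
invoice_triage = {d: 3 for d in DIMENSIONS} | {"ai_suitability": 5, "frequency": 4}
print(round(score(invoice_triage), 2))  # prints 3.38
```

The number is the recommendation input; as the text notes, the divergent per-dimension scores are where the conversation happens.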

Phase 3: Selection & Prescription

What are we building, and who owns it?

One workflow selected. The right technical approach prescribed from the full range — simple automation, ML, LLM, or hybrid. A Process Champion assigned from inside the organization. Success criteria agreed before a line of code is written.

Phase 4: PRD

What exactly are we building?

A product requirements document detailed enough to build from without ambiguity. Current state, future state, user stories with acceptance criteria, data requirements, integration points, edge cases, failure modes. The PRD is also the onboarding document for the tool — it specifies how the AI is trained for this workflow.

Phase 5: Build — Test — Iterate

Does it work?

Three structured cycles: internal validation, champion review with real data, then live use. An iteration log tracks every change — what changed, why, who decided, and what happened. Prompt changes, threshold adjustments, process refinements — all versioned. MVP acceptance is a defined gate, not a feeling.
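An iteration log of this kind can be sketched as an append-only record keyed by version, with one field per question the text asks (what changed, why, who decided, what happened). Field names and the example entry are hypothetical:

```python
# Hypothetical sketch of a versioned iteration log: append-only entries,
# one field per question in the framework. All names here are illustrative.
from dataclasses import dataclass, field
from datetime import date

@dataclass(frozen=True)
class LogEntry:
    version: int
    changed: str      # what changed: prompt, threshold, or process step
    why: str          # rationale for the change
    decided_by: str   # who made the call
    outcome: str      # what happened after the change went live
    when: date = field(default_factory=date.today)

log: list[LogEntry] = []

def record(changed: str, why: str, decided_by: str, outcome: str) -> None:
    """Append the next versioned entry; versions are never rewritten."""
    log.append(LogEntry(len(log) + 1, changed, why, decided_by, outcome))

record("confidence threshold 0.8 -> 0.9",
       "too many false positives in champion review",
       "process champion",
       "false positives dropped; recall unchanged")
```

Frozen entries and monotonically increasing versions make the log an audit trail, which is what turns MVP acceptance into a checkable gate rather than a feeling.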

Phase 6: Expand

Where do we go next?

Retrospective on the completed workflow. Lessons documented. Assessment matrix revisited with new information. Next workflow selected. Parallel tracks assessed when the team is ready for them — they are earned, not assumed.

Three principles

First-Win Principle

The first workflow selected optimizes for speed to demonstrated value — not maximum theoretical ROI. One workflow proven in production is worth ten workflows on a roadmap. A working result builds the trust and momentum to expand.

Process Champion

Every workflow initiative has an internal owner: someone who knows the workflow deeply, has authority to act, and is accountable for the outcome. Tools don't get adopted — people adopt tools. The champion is why it sticks after Mimir leaves.

Right Tool for the Job

The intervention fits the problem — not the other way around. Simple automation when the problem is deterministic. Machine learning when there's a pattern to find. A language model when judgment and language are involved. One of the case studies here uses Python matching logic, not an LLM, because that was the correct tool.
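Deterministic matching logic of the kind this principle describes might look like the sketch below: normalize both sides, then match on an exact key. The normalization rules and record shapes are assumptions, not details of the actual case study; the point is that no model is involved, so the rule either fires or it doesn't:

```python
# Hypothetical sketch of deterministic record matching (simple automation,
# not ML or an LLM). Normalization rules and record shapes are illustrative.
import re

def normalize(name: str) -> str:
    """Lowercase, strip punctuation, collapse whitespace."""
    return re.sub(r"\s+", " ", re.sub(r"[^\w\s]", "", name.lower())).strip()

def match(records_a: list[dict], records_b: list[dict]) -> list[tuple]:
    """Pair each record in A with the B record sharing its normalized name."""
    index = {normalize(r["name"]): r for r in records_b}
    return [(a, index.get(normalize(a["name"]))) for a in records_a]

pairs = match([{"name": "ACME, Inc."}], [{"name": "acme inc"}])
```

When the problem is this shape, a few lines of matching logic are auditable, instant, and free to run, which is exactly why they beat a language model here.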

What Mimir does not do

  • Deploy generic AI tools and walk away.
  • Produce strategy documents without an implementation path.
  • Build without establishing a baseline to measure against.
  • Start the next workflow before the first one is proven.