Six tools for implementing active contract management

Technical Guide
This technical guide comprises six tools that the Government Performance Lab (GPL) has created to help governments use active contract management (ACM) strategies to produce better results from their contracted services. It includes:
1. A worksheet with ten planning questions for designing and launching a new ACM practice
2. Examples to help agencies select leading and lagging performance metrics
3. Guidance for prioritizing a roadmap of performance topics for in-depth analysis
4. Three simple data techniques for revealing performance patterns
5. Strategies for fostering a collaborative, trusting ACM practice
6. A checklist of elements for maintaining an effective ACM practice
Active contract management: What it involves and why do it
Active contract management is a set of strategies, developed by the GPL in partnership with government clients, that pairs high-frequency use of data with purposeful management of agency–provider interactions to improve outcomes from contracted services. In practice, ACM consists of frequent, data-informed meetings between government agencies and social service providers designed to produce action that improves performance. ACM empowers leaders to detect and rapidly respond to problems, make consistent improvements to performance, and identify opportunities for reengineering service delivery systems. For more on ACM, visit https://govlab.hks.harvard.edu/active-contract-management.
Design worksheet: Planning a new ACM practice

Performance improvement opportunities

1. What is the motivation for regularly reviewing performance data?

2. What are the most important leading indicators, outcome metrics, or other performance measures that we want to frequently track and review with providers? Identify up to five.

3. Against what benchmarks will provider performance be compared? Potential benchmarks may include historical outcomes, peers, specified targets, third-party standards, national best practices, or others.

4. In human services, how are we going to match and refer clients to services? How will we check if matching and referral procedures are working?

5. On what topics do we anticipate needing in-depth analysis of provider performance and client outcomes to proactively support system improvement?

Implementation

6. What is the appropriate cadence for meeting with providers to review real-time performance data and promote continuous learning and improvement?

7. Who needs to regularly be “in the room” to enable rapid barrier busting when performance lags? How can sufficient participation by senior leadership be assured to support these efforts?

8. What data sources are available – or need to be developed – to generate performance information for frequent review? How reliable is this data?

9. Who from the agency will perform the necessary data analysis and develop meeting materials? Who will be responsible for directing further analytical needs and identifying the practice implications raised by the data?

10. How will the agency support regular follow-up and action based on dashboard and roadmap information? Potential solutions may include ad hoc working groups, individual case pulls, and/or dedicated follow-up time on meeting agendas.
Identifying the right metrics to review: Leading and lagging indicators

Depending on available data, dashboards often include two types of metrics:

Toolbox: Choosing performance metrics

Leading indicators
- Description: Early warning signs indicating whether a program is on track to achieve its ultimate results
- Benefits: Can serve as early proxy measures for results; often faster to observe or easier to measure than results; sometimes necessary to make sure data are available for other metrics
- Weaknesses: Alone, rarely offer insight into efficacy or opportunities for improvement; may be misleading because they never perfectly predict results
- Examples: Proportion of people who graduate from job training; percentage of prisoner assessment data entered into the system; time from child referral to when services begin

Lagging (outcome) indicators
- Description: The ultimate results a program aims to achieve
- Benefits: Explicitly linked to the purpose of the program; can capture whether the program has lasting impact
- Weaknesses: Often time delayed; may require matching data to other systems
- Examples: Wages 1 year after training completion; recidivism 3 years post release from prison; child removals after stabilization services
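To make the distinction concrete, here is a minimal sketch in Python (using pandas) of how one leading and one lagging indicator for a job training program, like the examples above, might be computed. The participants table, its column names, and all figures are hypothetical, invented purely for illustration.

```python
# Minimal sketch: one leading and one lagging indicator for a
# hypothetical job training program. The table, column names, and
# numbers are invented for illustration only.
import pandas as pd

participants = pd.DataFrame({
    "provider": ["A", "A", "B", "B", "B"],
    "graduated": [True, False, True, True, False],        # observable quickly
    "wage_1yr_later": [18.50, None, 22.00, 19.25, None],  # arrives with a lag
})

# Leading indicator: proportion of participants who graduate training.
# Fast to observe, but only a proxy for the program's ultimate goal.
graduation_rate = participants["graduated"].mean()

# Lagging (outcome) indicator: average wage one year after completion.
# Explicitly tied to the program's purpose, but time delayed and often
# dependent on matching to external data such as wage records.
average_wage = participants["wage_1yr_later"].mean()  # missing values skipped

print(f"Graduation rate (leading): {graduation_rate:.0%}")
print(f"Average wage 1 year out (lagging): ${average_wage:.2f}/hr")
```

In practice the leading figure would be available months or years before the lagging one, which is why dashboards typically track both.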
Prioritizing topics for in-depth attention: Developing a performance improvement roadmap

Performance improvement roadmaps usually focus on practices that are critical to success and include questions to guide in-depth analysis of these practices; a brief illustration of how data can begin to answer the enrollment questions follows the list.

Toolbox: Guiding questions to develop a performance improvement roadmap

☐ What practices are most critical to the success of the project? How can we identify best practices and areas for improvement?
☐ Are providers capturing the whole target population? Are there new ways to identify and recruit individuals in need who may not be on the City's or providers' radar?
☐ To what extent are providers focusing resources on priority sub-populations?
☐ At key case transition points (e.g., referral, opening, closure), are the right decisions being made about who needs what services? How can we know?
☐ How effective are hand-offs between the department and providers, or between different providers? How can we minimize losing individuals during hand-offs?
☐ What proportion of referred clients enroll in services? How quickly are clients enrolled after referral? How can we improve the speed and proportion of enrollments?
☐ What proportion of referred/enrolled clients are completing services? How can we increase the completion rate? If relevant, how are case closure decisions made?
☐ Where do we consistently see patterns of strong or weak long-term results? Are there common demographic or provider characteristics associated with strong results? Can we apply lessons of strength to areas of weakness?
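For illustration, here is a hypothetical sketch (again in Python with pandas; the referrals table, its column names, and all dates are invented) of how the enrollment questions above might be answered: what share of referred clients enroll, and how quickly, broken out by provider.

```python
# Hypothetical sketch: enrollment rate and speed by provider. The
# referrals table and all dates are invented for illustration only.
import pandas as pd

referrals = pd.DataFrame({
    "provider": ["A", "A", "A", "B", "B"],
    "referred_on": pd.to_datetime(
        ["2019-01-02", "2019-01-05", "2019-01-09", "2019-01-03", "2019-01-08"]),
    "enrolled_on": pd.to_datetime(
        ["2019-01-10", None, "2019-01-30", "2019-01-06", None]),  # NaT = not enrolled
})

# Days from referral to enrollment (stays missing for non-enrollees).
referrals["days_to_enroll"] = (
    referrals["enrolled_on"] - referrals["referred_on"]
).dt.days

summary = referrals.groupby("provider").agg(
    referred=("referred_on", "count"),
    enrolled=("enrolled_on", "count"),         # count() ignores missing dates
    median_days=("days_to_enroll", "median"),  # median() ignores missing values
)
summary["enrollment_rate"] = summary["enrolled"] / summary["referred"]
print(summary)
```

A per-provider table like this makes it easy to spot where enrollment stalls and to bring a concrete question, rather than a vague concern, to the next ACM meeting.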
Revealing performance patterns: Three simple data techniques

Toolbox: Data techniques to reveal patterns

1. Visualize the data: charts and maps
2. Disaggregate the data: by provider, geography, or client characteristics
3. Create ratios: unit costs, caseloads, and throughput
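As a concrete (and entirely hypothetical) example, the short Python sketch below applies all three techniques to an invented caseload table: it disaggregates spending by provider, creates a cost-per-client ratio, and charts the result.

```python
# Minimal sketch of the three techniques on an invented caseload table.
# Provider names, spending, and client counts are hypothetical.
import pandas as pd
import matplotlib.pyplot as plt

cases = pd.DataFrame({
    "provider": ["A", "A", "B", "B", "C"],
    "region":   ["North", "South", "North", "North", "South"],
    "spend":    [12000, 9000, 15000, 11000, 8000],
    "clients":  [10, 8, 9, 7, 10],
})

# Technique 2 – disaggregate: break totals out by provider (or region,
# or client characteristics) so that averages don't hide variation.
by_provider = cases.groupby("provider")[["spend", "clients"]].sum()

# Technique 3 – create ratios: cost per client served puts providers
# of different sizes on a comparable footing.
by_provider["cost_per_client"] = by_provider["spend"] / by_provider["clients"]

# Technique 1 – visualize: patterns jump out of a chart faster than a table.
by_provider["cost_per_client"].plot(kind="bar", title="Cost per client by provider")
plt.tight_layout()
plt.show()
```

The same pattern (group, divide, plot) works for caseloads per staff member or clients served per month; the ratio chosen should match the operational question on the roadmap.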
Building trust: Strategies to foster a collaborative ACM practice

☐ Build a collaborative vision of success: By articulating goals and metrics, you define what success looks like. Use early meetings to establish a shared vision of what you and providers are working toward.

☐ Set the table deliberately: ACM requires the right people to drive from data → analysis → insights → action. Expect that partners may need to adjust initial attendees to cover this array of competencies.

☐ Be solutions-oriented, not punitive: Focus on identifying and sharing best practices, rather than singling out low performers.

☐ Acknowledge differences: It can be useful to acknowledge that some providers assist harder-to-serve populations, while reminding everyone that we are all still trying to achieve the same vision of success.

☐ Avoid surprises: Share analysis with providers in advance of meetings so they can correct data errors and prepare for productive conversations.

☐ Remember that learning is a two-way street: Build trust by also addressing department or division opportunities for improvement.

☐ Be adaptable: Don't let the perfect be the enemy of the good. It's okay to refine your dashboard metrics and deep-dive plans over time. We want to be learning and adjusting.
Elements of effective active contract management: A checklist

1. Analytically valid data

✓ Providers enter data in a reliable manner
✓ Program staff can request and view the data they need to manage performance
✓ Data analysis is held to a high standard of analytical rigor
✓ Trends and implications are explained in a meaningful way for providers and agency staff
✓ Reviews of individual cases are periodically conducted to help explain trends observed in the data

2. Operationally purposeful insights

✓ Data analysis is driven by operational questions – things providers and staff may be able to influence
✓ Data is produced and reviewed in a timely manner to enable real-time troubleshooting
✓ Data is presented to uncover operational and outcome differences between providers to facilitate peer learning
✓ Providers are encouraged to think about outliers – why do some cases do better (or worse) than others?

3. Action-oriented meetings

✓ Differences in provider performance are discussed in depth, with the goal of discovering potential process and practice improvements to spread
✓ Providers and agency staff drive discussion together and collaboratively generate and prioritize performance solutions
✓ Strategy meetings with executives end with clear, practice-related action steps
✓ Staff-level working group meetings are used to check on the status of implementation
Resources

For more information on active contract management, see the resources below:

Active Contract Management: How Governments Can Collaborate More Effectively with Social Service Providers to Achieve Better Results

Video case study: Using Active Contract Management to Support Real Jobs Rhode Island

Find ACM case studies, policy briefs, and video from a provider meeting on the GPL website: https://govlab.hks.harvard.edu/active-contract-management
The Government Performance Lab is grateful for support from Bloomberg Philanthropies, the
Corporation for National and Community Service Social Innovation Fund, the Dunham Fund, the
Laura and John Arnold Foundation, the Pritzker Children’s Initiative, and the Rockefeller
Foundation. © Copyright 2019 Harvard Kennedy School Government Performance Lab