Episode Two: Testing 1, 2, 3...
Susan Johnston: Welcome back to Track Changes / Changer de Voix, a podcast series exploring public sector policy innovation from a number of different angles: from demystifying new policy instruments, bringing you examples of innovation that worked and, well, innovation that... didn’t...
Dan Monafu: ...and from conversations and chats with previous generations of public service innovators to interviews with current leaders and practitioners from Canada and beyond. Episodes are produced and created by us, your interdepartmental team of public servants, with support from the Innovation Hub within the Government of Canada’s PCO or Privy Council Office.
Susan Johnston: We’re delighted to be back and welcome to episode two!
Dan Monafu: Yes, we survived it, and it’s been quite the journey. Thank you for your enthusiastic responses on Twitter: both your praise and your suggestions for things we should be doing differently. We’re very excited to work on this project, and we know that we can continually make this better. This is ultimately a policy innovation podcast by practitioners, for practitioners at all levels, so please, please keep the feedback coming!
Susan Johnston: We’ve definitely learned a lot, and we’re grateful for those opportunities. We learned, for example, that Track Changes may already be a podcast on Soundcloud and iTunes. So what have we done about it? Well, we’ve changed our name from Track Changes to one that reflects both our French and our English names: Track Changes / Changer de Voix.
Dan Monafu: Yup, that was a good one, and we are getting there. In the meantime, we have lots in store for you this episode, and we’re really thrilled to get started. So, first, we’re gonna start off with a segment we’ve introduced already in our first episode, and it’s called ‘tl;dr’. And this is Internet slang for ‘too long; didn’t read’. So for this piece, we’ll get a little teaser on the topic of experimentation.
Susan Johnston: Okay. Our first segment brings in Karen Hall. She is the former - and founding! - Director of the Canada Revenue Agency’s innovation lab, called the Accelerated Business Solutions Lab, to talk about experimentation. Karen, thanks so much for joining us. Can you tell our listeners about your current role?
Karen Hall: I’m the Director General of the Benefits Programs Directorate at the CRA, and we’re the group that pays out benefits to Canadians through the tax system: the Canada Child Benefit, GST credit, etc.
Dan Monafu: Great. Thanks, Karen. Now it might be helpful if we provide some context for this edition of tl;dr: the Canadian federal government made a specific commitment to experimentation and measurement in its programs. So, it’s in the mandate letter from the Prime Minister to one of the Cabinet ministers - specifically the President of the Treasury Board. The Canada Revenue Agency has recent experience putting experimentation into practice in a very rigorous way. Karen, as per tl;dr’s fairly brief tradition (two episodes and counting), you have 100 seconds to explain a complex topic. Please tell our listeners what experimentation means in a public sector context... and, go!
Karen Hall: I think that’s a big question. What experimentation means in the public sector context, I think, is testing approaches and interventions that we’re going to put into practice in a low-cost way, to ensure their effectiveness before we go to large-scale implementation. So, there are some examples of what those could be - often we think about Randomized Controlled Trials that have experimental design and control groups. That’s definitely one way to go. It can be piloting a new program or process -- or new approaches, research methods, etc. that we haven’t used before. And I think what’s really key in those is the evaluation part. So we run these experiments, and we evaluate them so we have rigorous evidence about what works and what doesn’t work.
Experiments let us test our interventions in the real world. If an experiment fails, it’s not necessarily a bad thing. In fact, it can be good, because it lets you know, early on, before you make major investments or roll out a program nationally, that it’s not going to work. And it’s better to know that at the beginning than $50 million later.
Dan Monafu: That’s really well said. In blatant disregard for our own segment format, Karen - we’ll ask you a follow-up question. So what has this looked like at CRA?
Karen Hall: So at the CRA, we’ve used experiments to test different wording and different ways of delivering messages, you know, whether postcards or letters, for example, make an impact in our ability to communicate important messages to taxpayers. And we’ve also used experiments in terms of different research methods, broadening our scope and looking at different ways of learning about what taxpayers are thinking about and, you know, ways to deliver our business. One factor that’s helped my department get started on this path was strong senior management support and respect for the rigor of experimentation and what it can tell us. So that it’s not about what we think, it’s about what we know. And that’s made just an enormous impact, having that support.
Dan Monafu: Okay. Thank you very much, Karen.
Susan Johnston: Karen really stressed the importance of ethics for public sector experimentation and ensuring that when experiments are underway, they are conducted in a way that lives up to that public trust. We asked her to elaborate on the point she raised in the context of experimentation and failure. We felt it was too important not to share:
Karen Hall: I think failure has to be really reframed in this context. In my view, the only failure is when you have a major, major flaw -- a mistake -- in your experimental design, or if your experiment isn’t well conceived in the first place. But if you test something and it doesn’t work, I see that as a contribution to the evidence base. And really, we’ve probably saved taxpayer money, rather than, you know, spending significant amounts of public funds on something that’s not going to work. We might as well test it and see.
Dan Monafu: This is music to our ears. Ok. As usual, we’ll post more resources on our website and social media, and we’ll also link to a few of the things Karen may have mentioned or alluded to in this brief segment. So thanks so much again, Karen.
Susan Johnston: As a nod to those of you who are familiar with these concepts, we hear you that you may want to go a little deeper than a simple introduction -- which is what we’ve tried to do with some of the other questions we asked Karen. If that’s you, and you’d like more depth, we’d like to tell you about an initiative from within the federal family. Our colleagues at In.Spire, the innovation hub at Natural Resources Canada, have been pulling together a number of resources, curated by authors from across the Public Service. We’ve worked with our NRCan colleagues to make some of them available for the first time outside the government firewall, and we’re really excited to share them with you -- you can have a look at our website, pco-bcp.gc.ca/trackchanges.
Dan Monafu: Our second segment today brings our listeners inside the Deputy Ministers’ Committee on Policy Innovation, or D-M-C-P-I. DMCPI is an interdepartmental senior management committee, but it has some fairly distinctive features. The Committee includes Reverse Mentors: non-executive departmental representatives who participate as full committee members alongside their Deputies.
And DMCPI makes the vast majority of what it discusses available to any federal public servant through its internal wiki, GCpedia. Now it’s planning to move from internal “open by default” to sharing more publicly, which is why we were invited in to record this month.
Susan Johnston: So the audio you’re about to hear is from a DMCPI meeting that took place in late September. The guest speaker was Carolyn Curtis, the head of the Australian Centre for Social Innovation (called TACSI). She was here on a Canada-wide tour, and stopped by to give a presentation to the group. Oh, and full disclosure: Anatole Papadopoulos, one of the Executive Producers of our podcast, is also Secretariat to this Committee.
Dan Monafu: As Carolyn speaks, we’ll preface some of her remarks with our own thoughts or explanations. Here’s Carolyn with an introduction to TACSI and its main objectives.
Carolyn Curtis: So TACSI, the Australian Centre for Social Innovation, we are seven years old. We are positioned outside of government, but we were started up by government. And we were started up by a state government.
We basically exist to develop and spread innovations that change lives. So we believe that innovation runs from idea right through to implementation. We’re not just about the ideation phase. We’re right through to the scaling phase. We work with partners at different stages of that process. I think what we’re seeing in Australia is that there were often great ideas or even prototypes, but then after that prototyping phase things just weren’t ever continuing on to scale or implementation. And so that’s why we’ve built our business across the whole spectrum.
We are now what we would refer to as a not-for-profit social enterprise. We have a commercial business model and we have 50 staff working across the country, with home bases in South Australia and also New South Wales, in Sydney. So we’re growing, and I think the need for social innovation in Australia is vast.
Dan Monafu: So, many important threads to pull on even in that little blurb: most of us in the policy innovation space spend time worrying about how to scale and how to better connect policy development with, you know, implementation, how to create solutions that work in the real world.
Susan Johnston: Before we continue, let’s define social innovation. As the government doesn’t have a pre-set definition, we’ll build on one from Social Innovation Generation, a Canadian organization supporting new ways to examine and address massive social problems such as poverty, homelessness, and violence.
According to SiG, social innovation is a way to create better outcomes by applying new learning and strategies to large-scale social problems. For outcomes to be both successful and durable, they should have a measurable impact on the broader political, economic, and cultural context that created the problem in the first place.
Okay. So DMCPI then heard about how TACSI thinks about innovation -- and why it’s important to innovate.
Carolyn Curtis: For us, you know, what is innovation? And, in Australia we can be lulled into it being a very, I guess, exciting, creative, playful term. But for us, innovation is just absolutely about outcomes. It’s not just about different, it’s about better. It also has to be about a more attractive proposition to your customer or to the person you are serving. It has to be more effective and efficient. It has to be sustainable, and it has to be spreadable. And it’s only when we start to tick those different boxes that we’re really going to achieve the impact that we want.
Dan Monafu: Again, it resonates: innovation for better outcomes and impacts, not innovation for innovation’s sake. As Carolyn notes, it’s not playful -- TACSI approaches all of this pretty seriously.
Remember a few minutes ago when Karen linked experimentation with evidence? So Carolyn’s organization, TACSI, is also very methodical about evidence-making. It’s a way to avoid making mistakes where you only get part of the evidence-experimentation-scale chain right. Carolyn told the committee a story where a public sector organization ran a number of trials - which is good - except that the trials weren’t scalable. So when it came time to scale, they had to go bigger with something unproven instead. The common mantra you might be familiar with is “start small but think big”. TACSI has a system for all this.
Carolyn Curtis: So the four stages are about a staged and gated pathway. So at each stage you’re producing a new level of evidence.
So at the end of the first stage, you want to have clearly tested your assumptions about both the problem that you think it is you’re looking at, and the opportunities. You want to test your assumptions about what you believe to be true about the system, about the people you are serving, about the policy. So at the end of the first stage, you want to have a clear set of evidence around what it is that’s true and what it is that you’ve tested and what you know.
The second stage is about doing the designing and prototyping and learning, and so by the end of the second stage you should be able to show to your -- whoever your client is or whoever is commissioning your work that you’ve got a clear theory of change, that you clearly understand how what you’re doing is leading to change for people. And you should have been able to test that with people in quite an iterative way.
By the end of the third gate, you should have a clear business case and data that supports your theory of change that you’ve developed in stage two. And that’s what sets you up for scale. So I think when social innovation doesn’t work, it’s when we’re not clear about what the gateways are, and when we sort of get quite wishy-washy about how you move from idea through to implementation. And so for us with government, one of the reasons we’ve been able to get so much traction is because in the absence of being able to describe the solution, we can very clearly and in a lot of detail describe how we’re going to arrive at a different solution -- whether that be a policy, service, commissioning solution. And that is really starting to be embraced across government in Australia.
Susan Johnston: Now, as Karen said in our tl;dr segment, there are different ways to experiment and to use what you learn. Carolyn said something along the same lines. She makes the point that prototyping itself can be considered a very good way to reduce and mitigate risk in any organization.
Carolyn Curtis: I think the fundamental issue is that government, or business, is not designed for innovation. It’s designed for performance and efficiency. It’s repeatable, replicable activities that you do every day to deliver the value that you’re there to deliver. Now the measures of success around innovation are fundamentally at odds with that. The measures of success around innovation are around learning, testing your assumptions, iterating -- completely at odds with efficiency measures. Just a point on accelerating learning, in terms of risk again. And this just goes back to the point I was making earlier around the staged and gated process. You know you would use RCTs, pilots, and prototyping at different stages of any process, but I can’t stress enough how much prototyping is a risk-mitigating strategy for government. It would de-risk any strategy, policy, or framework you want to bring to bear, because effectively it enables you to learn very quickly and to adjust very quickly. But it is a different capability.
And I’m not quite sure what the situation here is in Canada, but we were really developing a big track record, particularly in social policy areas in Australia, of leaking millions of dollars on big expensive pilots -- big expensive pilots that would never scale, and before you know it, they’d disappear. So really embracing a prototyping approach is something we’re starting to get leverage with across government.
Susan Johnston: That set up for prototyping was interesting - the tension between how we traditionally design and measure for organizational performance, versus how you build and assess for the innovation health of an organization. I suspect Carolyn isn’t suggesting that those are incompatible. And in fact, the central thesis is that a learning, iterating, prototyping government or business will be a higher-performing one. It takes work, though, to realign how we think about what matters and consequently what we’re looking for from our institutions.
Dan Monafu: Ok, so if we look at this in a different way, what does this all look like in practice? Carolyn told the committee about one of TACSI’s signature initiatives, called Family by Family. This is a great story. It’s got everything: citizen-centric design and co-creation, prototyping, trials, measurement... and all of it to address very real, stubborn social problems. Here it is in Carolyn’s own words.
Carolyn Curtis: I wanted to give you a sense of some of the things that we’ve designed that are now scaling. So this was the very first program or product that came out of TACSI, and it was a response to looking at our failing child protection systems in Australia. And also in Australia, we’re building a very thick service system -- so the answer in Australia, if things aren’t going well for a family, is always another social worker, or another professional service. We’re investing very little in building community, in building community resilience, building people’s ability to actually be able to solve their own problems when things happen. And so when we did our research with families, one of the things, or a couple of things, that stood out is families across the country were saying to us, who do I call after 5 o’clock? Or who do I call on the weekend? Or I’ve got 13 services involved in my life, but I’ve got no one to talk to. And so, we’ve gotten ourselves into this situation where we’ve seen services as the whole pie. So Family by Family is a peer model, and it’s really looking at finding families who have been through tough times but have come out on the other side and are doing well, and linking them with families that are really struggling to put one foot in front of the other.
So this model is creating huge cost savings for government. For every dollar spent, government saves $11, and that’s largely around child protection. It’s also demonstrated that it’s clearly reducing families’ interaction with state services and the child protection system. The business model for this is a co-operative business model, and it’s now scaling across Australia.
So we would often talk about Family by Family as being designed by families. They were engaged in the process at every step of the way. And that’s partly why it’s generating the results that it’s generating now. So it’s a program that hasn’t been built on untested assumptions; those assumptions are constantly tested.
Susan Johnston: We’ll leave it there for now, thanking Carolyn very much for her willingness to let us share her thoughts with you. We are also thankful to the DMCPI co-chairs and membership for allowing us to be “in the room” with them for this meeting.
Dan Monafu: Well, there’s one last piece we wanted to squeeze in there, especially as we think it’s more of an invitation to collaborate and to partner across sectors. We thought it was very fitting with our own mandate here at Track Changes, where we’re also, or we try to be, very committed to seeing things from multiple sectors and working across fields. So we’re trying to bring our listeners more segments that highlight these opportunities to partner across sectors, really.
Carolyn Curtis: I think the thing we would advocate really strongly for when it comes to social innovation is that it’s just critical to have empathy for the system, the way you have empathy for the people you are serving. If you don’t go to great lengths to understand the system and what drives the system’s behaviour, then you’re really going to have limited success in changing it. So when we’ve got people who sort of look at systems mapping or talk to people about who is in their kind of sphere of influence, often government here has been quite removed from that, particularly for the labs that sit outside government currently. So instantly I saw an opportunity, knowing that innovation and social innovation is something you’re really embracing now, which is about how you bring those worlds more closely together.
Susan Johnston: Agreed - an invitation to bring worlds together is a much better note to end on. Thanks again, Carolyn - we’ll post further links on our website to TACSI and the great work your team is doing in Australia.
Dan Monafu: Well, this was our second episode. We’re nearing the end, and we’d love to hear from you: how is this podcast going? Are you enjoying these segments? What would you like us to tackle next? We have lots of questions, and we’re listening; we really do want to make you part of the design of this show. If you want to reach out to me directly, I’m Dan Monafu and that’s @danutfm on Twitter.
Susan Johnston: I’m Susan Johnston, and on Twitter I’m @engagequestion. Our podcast handle is @trackchangespod.
Dan Monafu: Yup. And this would be a good time to mention that as part of our Official Languages commitments -- and aspirations -- we’ve landed on a model where we have two co-host teams: one anglophone (that’s me and Susan here), and one francophone (that’s Kaili Levesque and Francis Nolan-Poupart). We’ll build episodes in either English, French, or a mix first, and then the episodes will be dubbed in the second language.
Susan Johnston: Thanks for listening. We’ll leave you with Mark Matz’s Journey of the Mandarin. Join us again in a month or so, and please do subscribe to us on iTunes or Soundcloud; or you can visit our website: www.pco-bcp.gc.ca/trackchanges.
Dan Monafu: Thank you very much, see you next month.
Susan Johnston: Bye now.