Why NSW's evaluation drive flopped, and Treasury's fix coming next

By Stephen Easton

November 4, 2016

A drive to improve and expand the evaluation of New South Wales public services has had little effect in over four years, according to the auditor-general, mainly because central agencies failed to take charge.

In a stinging report published Thursday, Margaret Crawford found the service delivery evaluation push that followed criticism in the state government’s 2012 commission of audit has been “largely ineffective” in terms of investigating value for money and whether programs achieve their intended outcomes.

The audit leads Crawford to believe ministers do not get “enough information to make evidence-based investment choices” from the public service. And there is “little assurance that the right programs are being evaluated” either, she added. Only one of five departments scrutinised by the audit — the Department of Industry, Skills and Regional Development — had a process in place to select appropriate evaluation subjects.

The audit’s key findings section includes details on how the Industry cluster set up its evaluation program, which has been in place since 2012, as well as the shortcomings of the Planning and Environment and Justice clusters, which only “partially” met what the audit office considers good practice in evaluation.

The report’s second key finding was that the “limited” oversight of Treasury and the Department of Premier and Cabinet (DPC) did not constitute “the comprehensive understanding of programs across NSW government, as well as what will contribute to achieving NSW government priorities” that the central agencies should possess, in the auditor-general’s view.

This financial year, the kinds of service delivery programs covered by the audit will cost the NSW taxpayer more than $73 billion. The stated aim of the 2012 evaluation drive was to put in place more rigorous checks on what these large amounts of funding actually achieve, by putting Treasury and DPC in charge.

The original idea was that the central agencies would keep an eye on which programs agencies selected for evaluation, as well as the outcomes of the assessments. But according to the audit report’s bottom-line conclusion, it has not turned out that way:

“No information is provided on the performance of programs that have been evaluated. The information that is provided is limited to a list of programs being evaluated in the upcoming financial year, with little assurance that the right programs are on the list.

“NSW Treasury and DPC are not using evaluation outcomes to analyse agency funding proposals in their advisory role to the NSW government.”

Crawford reports DPC and Treasury only started requiring agencies to provide completed evaluation reports in May this year, and agencies still don’t have to explain how they have fixed, or will fix, any problems that come up. The report argues:

“For program evaluation to be effective, agencies should demonstrate they are evaluating the right programs, and the outcomes from completed evaluations should inform advice to the NSW Government on investment decisions.”

Link analysis to future funding

Crawford sees a way to rectify the problem, however. She proposed that Treasury and DPC implement a review process to provide assurance that evaluation schedules submitted to the Expenditure Review Committee (ERC) contain valid information that can be analysed and linked to government priorities. Additionally, finalised program evaluation reports and agency responses should be used to provide evidence-based advice to government on future funding bids.

While, technically, the recommendations to lift their game only apply to the five departments that were audited, Crawford made it clear that all departmental clusters should follow a “good practice model” for selecting programs to be evaluated — containing these four elements:

  1. having an evaluation centre of excellence;
  2. ensuring that agency and cluster strategic planning processes align programs and program evaluations to NSW Government priorities;
  3. developing a master list of all current cluster agency programs with their tier ranking and linkage to NSW Government priorities; and
  4. objectively prioritising programs across the cluster for evaluation and inclusion in evaluation schedules, taking account of the department’s capacity and capability to conduct evaluations.

Only the first has been required by DPC in the past; the audit office says it came up with the other three based on its “observations and research” into how each cluster can choose “the right mix of programs” according to “size, strategic significance and degree of risk, the priority for evaluation and the department’s evaluation capacity and capability”. The report suggests:

“Having these good practices in place would give a department secretary confidence that their cluster evaluation schedule meets the criteria for program selection and prioritisation.”

“NSW Treasury and DPC should participate in the nomination process to give assurance that the right programs are evaluated, and should use findings from the final reports, and agency responses, to provide evidence-based advice to NSW Government to inform decisions on future funding of programs,” Crawford added in a statement accompanying the report.

NSW Treasury has confirmed it will develop an evaluation framework to support the budgeting and reporting component of the Financial Management Transformation (FMT) program which it aims to implement in time for the 2017–18 state budget.
