The evidence for early intervention to improve children’s lives is irrefutable. Early investment creates opportunities and prevents problems later in life.
It follows that if we are to make headway in overcoming indigenous disadvantage, we have to invest in improving life for Aboriginal and Torres Strait Islander children. We cannot do that effectively unless we know what programs actually work; alas, at present, we don’t. Governments are spending billions on indigenous programs without, except in the rarest of cases, evaluating the results.
The Productivity Commission report released on Thursday shows progress in some areas. There is lower infant mortality and improvement in associated health indicators such as antenatal care and birth weight. Ear health is improving, more indigenous kids are completing school through to year 12, and there is more post-secondary education. These are positive changes.
However, some other important indicators are not improving.
The saddest piece of data is that the indigenous suicide rate is double that of the rest of Australia, and those dying are mostly young people. Every suicide is a tragedy that devastates the child's family and community.
Other areas of concern include juvenile detention: although down slightly, the rate at which indigenous young people are detained remains 24 times that for non-indigenous youth. Many of these kids need help to escape poverty and despair. Instead we lock them up. Evidence shows that once someone enters the criminal justice system, he or she is likely to stay there for life. Prevention is not about being "soft on crime" but about helping kids avoid being drawn into criminal behaviour in the first place. Work on this, including learning, meeting basic material needs, and building pride in culture and identity, has to happen much earlier in life.
In a PC report full of data, one of the most worrying statistics is that out of more than a thousand indigenous programs, the commission could find only 34 that had been properly evaluated.
Those highlighted by the PC have evidence behind them. Many (for example, Care for Kids Ears, Home Interaction Program for Parents and Youngsters, Healthy for Life, Families as First Teachers, Yiriman Project and others) are improving indigenous kids’ life chances.
They are cause for optimism. Yes, it is possible to improve well-being through well-designed early intervention programs. Indigenous policy is not all problems; there are solutions.
However, there were more than a thousand other programs where the PC wasn’t able to find any evaluation. That is a failure of both policy and implementation.
It is possible some of the thousand unevaluated programs have had audits and surveys, but these are not even close to rigorous evaluation.
When programs are audited to check that money has been spent in the right place, without fraud, that tells us nothing about whether they have produced results. Other times, what is passed off as "evaluation" is simply participant feedback, which tells us nothing about whether outcomes have improved.
Evaluation looks at the state of things in an indigenous community or population cohort before and after a program is introduced, to determine whether the program made any difference. Proper evaluation can be expensive, but it is massively cheaper than continuing to spend on programs that aren't working.
The inescapable conclusion from the PC report is that without data, analysis and evaluation we have little chance of overcoming indigenous disadvantage. There are some pretty basic questions that program managers simply aren’t asking: is this working? Are we achieving results?
We do know some basic principles of what is likely to help programs succeed: they have to be specific to community and need, and be driven by indigenous people themselves. What might work in remote Western Australia could be completely unsuited to urban Sydney or to rural Victoria — and vice versa. Indigenous leadership is essential if programs are to be implemented effectively.
Even with these basics in place, there is still no substitute for evaluation — finding out if the program has actually made a difference.