So I am only going to touch on a couple of issues mentioned in the study (which, I am proud to say, I have actually read from end to end). Here's the first one:
The annual report on crime statistics, known as Juristat, routinely revises crime statistics from previous years upward in any given year’s report, making annual crime decreases appear more significant than they are.
OK. Below is the relevant time series from the Juristat report in question. It displays the ebbs and flows in Canada's Crime Severity Index (note: a chart showing the more traditional measure, crime by volume, can be found here. It is not reproduced below, but it looks generally similar to the graph that is):
What the analysts from MLI are saying is that, when StatsCan reports the data point for 2009, it will be measured against a point on the graph representing 2008 that has been upwardly revised. Thus any decline year-over-year will be accented, and any increase muted. That's not surprising: an initial StatsCan crime report is issued, more data comes in that was not available at the publication date, and the report and accompanying graph are updated. There is nothing that could be done to get around this, other than making the publication date so late that revisions are unnecessary.
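To make the arithmetic of that complaint concrete, here is a toy sketch. The numbers are invented for illustration, not actual StatsCan figures: it simply shows that measuring a new data point against an upwardly revised prior-year figure produces a larger apparent percentage decline than measuring it against the figure as first published.

```python
# Toy illustration (all values hypothetical, not StatsCan data) of how
# an upward revision to last year's figure inflates an apparent decline.

def pct_change(old, new):
    """Year-over-year percentage change from old to new."""
    return (new - old) / old * 100

initial_2008 = 90.0   # hypothetical index value as first published
revised_2008 = 92.0   # hypothetical value after upward revision
value_2009   = 88.0   # hypothetical new data point

# The same 2009 value looks like a steeper drop against the revised base.
print(f"Decline vs. figure as first published: {pct_change(initial_2008, value_2009):.1f}%")
print(f"Decline vs. upwardly revised figure:   {pct_change(revised_2008, value_2009):.1f}%")
```

With these made-up numbers the reported decline grows from roughly 2.2% to roughly 4.3%, which is the effect MLI is describing: the revision changes the base of comparison, not the new data point itself.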
But, more importantly, the issue MLI identifies only really applies to the last couple of data points on the X-axis of the graph above. StatsCan figures might be revised once or twice, but eventually they become pretty much set in stone, so the revisions MLI notes will not alter the overall shape of the graph. Which is to say, the revisions will not show a long-term downward trend if there is an upward trend in the underlying crime data. The downward trend in crime that StatsCan notes really is there.
There's one other point in the MLI study that I would like to touch on, because it seems representative of a number of the criticisms that they make:
This is, as the MLI admits, a methodological complaint. StatsCan chops up and recombines its numbers based on a series of methodological decisions. But these are not the only decisions that could have been made, and indeed when you re-cut the numbers as per MLI, you get table 7 above. The question then becomes whether this is, as MLI asserts, a "much better" way of chopping up and recombining the numbers, and whether the result is worthy of highlighting in the StatsCan report.
I will not defend the agency's various methodological decisions in this space, for it has already done so on its own behalf, here.