Human rights chief praises police oversight report’s focus on race and diversity

It all starts with data:

The provincial government’s commitment Thursday to require police watchdogs to collect race-based statistics is evidence Ontario is in “a very unique moment” when it comes to recognizing the need for such data, says Ontario’s chief human rights commissioner.

One day after the release of Ontario Justice Michael Tulloch’s broad-ranging report on police oversight in Ontario, Renu Mandhane said the judge’s work provides a detailed road map to rebuild trust between community and police oversight agencies at a time of “historic levels of distrust.”

Shortly after the release of Tulloch’s 129 recommendations — many aimed at increasing transparency within the police watchdogs — Ontario Attorney General Yasir Naqvi committed to act on the key recommendation that civilian oversight bodies, including the Special Investigations Unit (SIU), begin collecting demographic data such as statistics on race, ethnicity and indigenous status.

Currently, none of Ontario’s civilian watchdogs — the SIU, the Office of the Independent Police Review Director (OIPRD) and the Ontario Civilian Police Commission (OCPC) — collect statistics on race or any demographic data on religion, age, mental health status, disability, or indigenous status of complainants and alleged victims.

That move shows “the conversation has shifted in terms of the collection of data,” Mandhane said in an interview Friday, but she stressed that the data must be both collected and publicly reported. Tulloch recommended an advisory board be established to develop “best practices on the collection, management, and analysis of relevant demographic data.”

“There needs to be real thought about who is going to receive the data and making sure they have the resources to effectively analyze the data,” Mandhane said.

Tulloch’s race-based statistics recommendation was one of several praised by rights groups and advocates, who appreciated the emphasis placed on diversity, cultural training and the focus on indigenous communities. The report states Ontario’s oversight bodies must be “both socially and culturally competent.”

During consultations with First Nations communities in particular, Tulloch said there was consensus that the oversight bodies “lack cultural sensitivity and often are disrespectful of Indigenous peoples.” He was told of cases where an SIU investigator arrived in a First Nations community following an incident, spoke briefly with someone from the community, and had no further contact.

“Equally troubling, some First Nations communities in the north described having to wait days for SIU investigators to arrive on scene. In some cases, matters were closed without talking to members of the community and the leadership,” the report states.

To begin to remedy fraught relationships, Tulloch recommended mandatory social and cultural competency training for watchdog staff — developed and delivered in partnership with the communities they serve.

The report also says Ontario’s police watchdogs should reflect their diverse communities, meaning the oversight bodies must take initiatives to hire people from communities currently under-represented within the organizations.

“This includes all individuals at the oversight bodies: the directors, the investigators, the adjudicators, and the staff dedicated towards outreach, communications, administration, affected persons services, and so forth,” the report states.

The move toward greater diversity is long overdue, said Julian Falconer, a Toronto lawyer who has represented many families of people killed by police and who also practises in Thunder Bay.

In his submission to Tulloch during the review process, he said he was “quite blunt” about the lack of diversity when it comes to the director of the SIU.

Source: Human rights chief praises police oversight report’s focus on race and diversity | Toronto Star

Andrew Coyne: Politicians need to forget about polls and do the right thing

Great column by Coyne, who would “rather trust the data”:

Liberals used to take a dim view of this sort of perception-based decision-making. When the Harper government claimed it didn’t matter if the official statistics showed crime rates falling to their lowest levels in decades, because people felt as if crime was rising, Liberals rightly scoffed. Now a similar fact-free feeling — the middle class is getting nowhere — is the foundation of their whole economic platform.

Liberals are by no means the only ones playing this game. Rather than answer questions raised by her signature proposal to subject every refugee, immigrant or tourist to a quiz on their belief in “Canadian values” — questions such as why this is needed, what it would accomplish, and what it would cost — Kellie Leitch refers to polls showing sizeable majorities of Canadians support the idea.

Likewise, those raising the alarm over Motion 103, unable to answer how a parliamentary motion with no legal force or effect could restrict free speech, have lately taken to citing polling data showing a majority of Canadians with varying concerns about the motion.

It’s easy enough to gin up a poll in support of just about anything, of course, depending on how you ask the question. The people waving them about today are in many cases the same ones who not long ago were railing about all the pollsters who failed to call Donald Trump’s victory (in fact, they called the popular vote to within a percentage point: Clinton beat him by two points, instead of the three points in the consensus forecast).

But let’s suppose these polls are genuine reflections of current public opinion. That’s a good answer to the question: what does the public think on these issues? It’s no answer at all to the question: are they right to think so? Yet that is how they are invoked: if that’s how the public feels, it must be true.

Skeptics are challenged, in tones of indignation: what, so you’re saying that millions of Canadians … are wrong?

Well, yes. What of it?

“Millions of people” are quite capable of believing things that aren’t true, particularly on matters to which they have given very little thought and with which they have little personal experience. The political science literature is filled with examples of people cheerfully offering their opinions to pollsters on entirely fictional events and people. As Will Rogers used to say, “there’s lots of things that everybody knows that just ain’t so.”

Climate skeptics rightly make the point that the overwhelming consensus of expert opinion on global warming is not enough, in itself, to prove it is right. Science is not a popularity contest: throughout history, individuals have stood against conventional opinion, and been vindicated. But let 1,340 randomly selected Canadians have their dinner interrupted to answer a question from a telemarketer about a subject they’ve barely heard of, and suddenly it’s gospel.
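Coyne’s sample sizes are worth a quick back-of-the-envelope check. A minimal sketch, assuming simple random sampling (real polls weight and stratify, so this is only indicative); the function and figures are illustrative, not drawn from any specific poll:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a simple random sample
    of size n, at the worst case p = 0.5."""
    return z * math.sqrt(p * (1 - p) / n)

# The n=1,340 poll Coyne describes:
print(f"n = 1,340: +/- {margin_of_error(1340):.1%}")  # roughly +/- 2.7 points

# A typical n=1,000 national poll, for comparison:
print(f"n = 1,000: +/- {margin_of_error(1000):.1%}")  # roughly +/- 3.1 points
```

On this arithmetic, the one-point miss in the 2016 popular-vote forecast sits comfortably inside ordinary sampling error, which is exactly Coyne’s point about the pollsters.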

Experts, it is true, can sometimes be mistaken. But if experts can get it wrong, the public is at least as capable of it. And yet these days we are enjoined to reflexively reject the former, and just as reflexively to believe the latter. Perhaps we should rather trust the data.

Reevely: Massive collection of race-based data part of Ontario’s anti-racism strategy

It all starts with having more and better data and ensuring that the data is consistent and reliable.

There will be differing interpretations of what the data means, but without good data, society is flying blind when dealing with complex issues. Data and evidence are never perfect, but they provide a sounder basis for policy choices and political discussion:

Ontario will start collecting masses of race-based data on the programs in its biggest ministries this year, hoping to use the information to find and help stamp out systemic racism.

That’s a big deal in the provincial government’s new anti-racism strategy, a three-year plan that took a year to create.

Much of the strategy is high-level stuff, scooping together things particular ministries were doing and calling it a plan. That includes a training program for staff in the courts system so they better understand aboriginal culture, trying to make the boards of Children’s Aid Societies more diverse and having the first black judge on the Ontario Court of Appeal assess the way police forces are overseen. All of it noble, some of it genuinely consequential, most of it already underway.

There’s also this: “To address racial inequities, we need better race-based disaggregated data — data that can be broken down so that we further understand whether specific segments of the population are experiencing adverse impacts of systemic racism,” the strategy says.

They’re going to start with health, primary and secondary education, justice and child welfare. That is, in the areas where government policy really makes and breaks lives.

The systems in those various ministries generate boatloads of data already, from wait times for surgeries to rates of readmission for patients in particular hospitals, from school occupancy numbers to results from Grade 6 math tests, from trial times to recidivism rates. “Disaggregating” that data means pulling apart the stats by race, routinely, in a way that typically raises more questions than it answers.

So if 15 per cent of the Queensway Carleton Hospital’s patients are back in hospital within 30 days of being discharged, we’ll monitor whether the stat is the same for members of different racial groups. If not, why is that?
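To make “disaggregating” concrete, here is a minimal pandas sketch. The records, column names and categories are invented for illustration; they are not Ontario’s actual schema.

```python
import pandas as pd

# Entirely invented patient records for illustration.
df = pd.DataFrame({
    "hospital": ["QCH"] * 6,
    "race": ["White", "White", "Black", "Black", "Indigenous", "Indigenous"],
    "readmitted_30d": [0, 1, 1, 0, 1, 1],
})

# The aggregate rate: the single number usually reported.
print(f"Overall 30-day readmission rate: {df['readmitted_30d'].mean():.0%}")

# Disaggregated: the same statistic broken down by race.
print(df.groupby("race")["readmitted_30d"].mean())
```

The second print is where the questions start: the overall rate can look unremarkable while one group’s rate sits well above the others.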

Pulling all this together means devising a consistent approach so the information is collected, crunched and presented in a standard form, while protecting privacy. Which is hard enough, and that’s before we get to what we’ll do with the information.

This is, historically, very touchy. Systemic racism “can be unintentional, and doesn’t necessarily mean that people within an organization are racist,” the government says, but being accused of systemic racism sets off the same sorts of reactions as being accused of the traditional kind.

Here in Ottawa, the police spent two years tracking race-related data on their traffic stops, following a human rights complaint by a black teenager who said he’d been pulled over only because an officer was suspicious of him driving a Mercedes (which was his mother’s). When researchers managing the study released their findings last fall, they reported that drivers the police identified as black or Middle Eastern were stopped at rates many times their population shares.

A companion study found some officers deliberately misrecording the races of people they’d stopped, staying away from some parts of town and otherwise behaving differently to shift the stats so they’d suggest less racism. To whatever extent police officers changed their behaviour so as to actually behave less racistly when they knew their work was being measured, that’s a good thing in itself, of course.

Ontario’s chief human-rights commissioner, Renu Mandhane, argued the stats are consistent with racial profiling; Ottawa police Chief Charles Bordeleau defended his officers, saying there’s nothing going on in the police force beyond what’s normal in society at large.

(Something similar happened when the Toronto police released statistics on the people they “carded” — stopped in the street to ask for their ID papers. Way more black and brown people than whites, for reasons that were argued about for years. Yasir Naqvi, the then-provincial minister responsible for policing, imposed new rules scaling the practice back.)

You can use such statistical findings in a lot of ways, including flatly racist ones. Maybe the police are irrationally suspicious of certain visible minority groups. Maybe certain visible minority groups are worse drivers. Maybe they’re more likely to be driving in areas patrolled by police — a possibility that opens whole vistas of speculation about why either of those things might happen. Maybe it’s a combination of things. Collecting the data doesn’t solve the problem.

We can argue about why people in different ethnic groups have different dealings with the authorities, and heaven knows we do. Sometimes to a fault. But at least with traffic stops and carding, nobody can say any longer that it doesn’t happen, and that’s a step forward.

Source: Reevely: Massive collection of race-based data part of Ontario’s anti-racism strategy | Ottawa Citizen

ICYMI – Hans Rosling: A truth-teller in an age of ‘alternative facts’ – Macleans.ca

Good profile of Rosling, who was so creative and insightful in his presentation of data. Particularly important in the context of “alternative facts” and fake news:

… Rosling joined the Karolinska Institute in Stockholm, where he discovered his students, and even his colleagues, had crude and misinformed ideas about poverty. To them, and most policy makers in the western world, the “third world” was one uniform mess of war and starving orphans. They did not understand the vastly different experiences of a family living in a Brazilian favela and one living in the Nigerian jungle, nor did they realize how rapidly these countries and economies were evolving. Without this knowledge, how could people make informed decisions about diseases, aid, or economics? “Scientists want to do good, but the problem is that they don’t understand the world,” he said.

Working with his son, he developed software that explained data through easily understood graphics. He launched the Gapminder site, which allowed people to explore and play with data that was otherwise hidden in the archives of the OECD, the World Bank and the United Nations. And Rosling began giving increasingly popular public lectures. His TED Talks and online videos went viral, as he explained global population growth using plastic boxes, or the relationship between child mortality and carbon dioxide emissions with Lego bricks.

https://embed.ted.com/talks/hans_rosling_shows_the_best_stats_you_ve_ever_seen
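A toy Gapminder-style bubble chart takes only a few lines of matplotlib; all the figures below are invented, standing in for the UN, World Bank and OECD data Gapminder actually used.

```python
import matplotlib.pyplot as plt

# Invented figures for four hypothetical countries.
countries = ["A", "B", "C", "D"]
income = [1_200, 5_800, 14_000, 42_000]   # GDP per capita, USD
life_exp = [58, 67, 74, 81]               # life expectancy, years
population = [8, 45, 120, 30]             # millions, drawn as bubble size

plt.scatter(income, life_exp, s=[p * 20 for p in population], alpha=0.5)
for x, y, name in zip(income, life_exp, countries):
    plt.annotate(name, (x, y))
plt.xscale("log")  # Gapminder plots income on a log axis
plt.xlabel("Income per person (log scale)")
plt.ylabel("Life expectancy (years)")
plt.title("Gapminder-style bubble chart (invented data)")
plt.show()
```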

Rosling had a natural charm and a ready sense of humour that grabbed your attention and kept it, even if he was talking about the most esoteric elements of statistical science. His message—that the world is getting better but we need to understand the data if we want to help those being left behind—resonated not just with the public, but among philanthropists and government leaders.

From Davos, to the White House, to the offices of the World Bank, Rosling could be found tirelessly preaching the gospel of facts, data, and truth. For generations, aid and charity decisions were taken for reasons of vanity, simplicity or self-interest. Billionaires gave money in ways that would grant them the most publicity. Bureaucrats channelled aid dollars to projects that were the easiest to administer. And western governments built dams in Africa solely to help their own construction companies. The real impact of aid on poverty was rarely considered and almost never measured. Rosling helped change that, by explaining to donors that ignorance is the first battle that must be fought in the war against extreme poverty.

This idea, as obvious as it seems in hindsight, was new. And it mattered. Governments listened. Donors became converts to Rosling’s religion of evidence-based policy. He was not its only apostle, but he was among its best-known, and the only one with millions of views on YouTube.

Ironically, Rosling had a much more critical assessment of his own influence on the world. He called himself an “edutainer”, and in a 2013 interview he bemoaned the fact that the average Swede still overestimated the birth rate in Bangladesh: “they still think it’s four to five.”

“I have no impact on knowledge,” he said. “I have only had impact on fame, and doing funny things, and so on.”

The deputy prime minister of Sweden, Isabella Lövin, disagreed. After Rosling’s death was announced, she wrote: “He challenged the whole world’s view of development with his amazing teaching skills. He managed to show everyone that things are moving forward … I think the whole world will miss his vision and his way of standing up for the facts—unfortunately it feels like they are necessary more than ever at the moment.”

A few hours after the announcement of Rosling’s death, Betsy DeVos, a Republican donor who believes the American school system should be reformed to “advance God’s Kingdom” was confirmed as Donald Trump’s Secretary of Education.

Source: Hans Rosling: A truth-teller in an age of ‘alternative facts’ – Macleans.ca

Garbage in, garbage out: Canada’s big data problem

A reminder that despite the restoration of the Census, there still remain significant gaps in the collection, methodologies and dissemination of statistical data by the government:

In a recent article in the Toronto Star, Paul Wells lays out what he sees as Prime Minister Trudeau’s game plan for slowing Canada’s brain drain and making science pay. “Over the next year,” he writes, “the Trudeau government will seek to reinforce or shore up Canada’s advantage in three emerging fields: quantum tech, artificial intelligence and big data and analytics.”

As he should. If that’s the plan, it’s a good one. Canada’s future prosperity depends on our ability to innovate and retain the best talent in those three fields.

What we call “big data analytics” works by finding previously unknown patterns in the huge blocks of data that very large organizations — governments, for example — grow around themselves constantly, like coral. Finding those patterns can point the way to new efficiencies, new ways to fight crime and disease, new trends in business. But as with any complex system, what you get depends on what you put in. If the inputs aren’t accurate, the results won’t be, either. So before we embrace the “big data revolution”, we may want to look first at the worsening quality of the data our federal government produces, and that businesses, activists and social planners use.

Take something as basic as divorce. Statistics Canada began reporting marriage rates in 1921 and divorce rates in 1972; it stopped collecting both data streams in 2011, citing “cost” concerns.

Marriage and divorce rates are exactly the kinds of data streams consumers of big data want collected, because they affect so many things: government policies, job markets, the service sector, housing starts — you name it. StatsCan abandoned the field five years ago, and its data on marital status is now far less useful than it might have been.

Take wildlife conservation. Recently an Ontario provincial backbencher proposed a private member’s bill to allow for unlimited hunting of cormorants. The bill’s proponent says the species is experiencing a population explosion. And we don’t know if he’s right or wrong — because the feds stopped collecting that data in 2011.

Canada used to publish statistical reports that were every bit as good as the Americans’ — in some cases, better. Then we stopped.

Here’s another big data blind spot: gasoline imports. After having reported data on gasoline imports regularly since 1973, StatsCan has been suppressing the numbers since 2013 due to what it calls “privacy” concerns. In the last reporting year, 2012, a staggering amount of imported gasoline came into the country — almost 4 billion litres.

Now, if you were thinking of expanding your oil refinery, or wanted to know more about how dependent this country is on foreign fuel, this would be pretty precious data — the kind you’d probably pay for. But the data aren’t reliable — any more than the StatsCan data on gasoline demand by province, which we use to work out whether carbon taxes are actually reducing demand for gasoline. It’s bad data; it has been for years. You’d think someone in the higher echelons of the federal or provincial governments would get annoyed.

Combing through StatsCan’s archive of reports can be a bewildering experience, even for experts. Its online database, CANSIM, is easy enough to use. It’s the reports themselves that sometimes fail you.

Say you want to understand trends in Ontario’s demand for natural gas. You’d start by looking at CANSIM table 129-0003, which shows an increase in sales of natural gas in 2007 over 2006 of 85 per cent. “Ah,” you think to yourself, “that must be because of the conversion of coal-burning plants to gas.” But no, that change occurred years later. Ask StatsCan and they’ll tell you that they changed their methodology that year — but didn’t bother re-stating the previous years’ numbers under the same methodology. Individually, the numbers are accurate — but the trend stops making sense.
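StatsCan didn’t restate the earlier years, but analysts sometimes splice such series themselves. Here is a rough sketch of chain-linking with invented numbers; it assumes the break is a pure level shift, which may well not hold for the actual CANSIM series.

```python
import pandas as pd

# Toy series with a methodology break between 2006 and 2007: the jump
# reflects the new method, not an 85 per cent surge in gas sales.
old = pd.Series({2004: 100.0, 2005: 102.0, 2006: 104.0})  # old methodology
new = pd.Series({2007: 192.0, 2008: 195.0, 2009: 199.0})  # new methodology

# Chain-link: rescale the old series to the level of the new one. Without
# an overlap year this is crude; it assumes no real change from 2006 to 2007.
factor = new.iloc[0] / old.iloc[-1]
spliced = pd.concat([old * factor, new])
print(spliced)
```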

StatsCan changed its methodology again this year; it now warns researchers to take care when comparing current and historical data. That’s an improvement over changing the methodology without telling anyone, but it isn’t very helpful for understanding long-term trends.

And this isn’t just StatsCan’s problem. The National Energy Board published an excellent report showing where Canada’s crude ends up in the United States. Industry analysts use the numbers to understand the reasons why light and heavy crude are selling for what they’re selling for south of the border.

The NEB stopped reporting the data after September 2015. Ask why, and this is the response you get: “The Board has decided to discontinue publication of this data while we re-evaluate our statistical products.” That, of course, was a year ago.

Source: Garbage in, garbage out: Canada’s big data problem

Public servants scramble to fill data deficit on Liberals’ priorities

Understandable, given the difficult cuts recommended by the public service and approved at the political level (reflecting the previous government’s anti-evidence, anti-data bias), with a predictable impact on the quality of analysis:

If Prime Minister Justin Trudeau really is a data geek, he couldn’t have been encouraged by what some federal departments had on hand.

Internal documents obtained by the Star suggest years of belt tightening have led to a data deficit in Ottawa, gaps that may “create challenges” in delivering on the Liberal government’s priorities.

Early childhood learning and child care, expanding parental leave, increasing youth employment, and expanding training for apprentices and post-secondary students all figured prominently in the Liberals’ election platform.

But as of November, the department responsible for making good on those promises was worried it didn’t have enough concrete data to deliver.

“Spending on surveys has been reduced over the last several fiscal years and has been concentrated on priority areas to help manage financial pressures,” read documents prepared for the senior public servant at Employment and Social Development Canada (ESDC).

The Liberals have made “evidence-based decision-making” a watchword for their early days in office, and senior staff in the Prime Minister’s Office are known for their attachment to data-driven strategy.

A spokesperson for Families, Children and Social Development Minister Jean-Yves Duclos said the issue is government-wide, not isolated in their department.

“This is an issue that all ministers are facing right now. We do know that there are gaps in the data the government owns,” Mathieu Filion told the Star in an email.

“There are many discussions on the matter with different minister’s offices as to see what will be done to acquire more data.”

According to the November documents, Statistics Canada was largely preoccupied with the restoration of the long-form census, but had identified a number of priority files.

Along with ESDC, StatsCan was looking to revive “longitudinal surveys” to fill in gaps. Longitudinal surveys are more expensive and time-consuming than other methods of collecting data, but the documents suggest they can give greater insight into “the dynamics of life events” and have a greater payoff when continued over a number of years.

StatsCan’s wish list includes greater labour market information (specifically aboriginal participation, unpaid internships, temporary foreign workers, and worker mobility), better information on children’s physical and mental health development, and more data on Canada’s aging population and the resulting effect on the economy and the health-care system.

The agency says the digital economy remains largely in the dark, as well.

“The use of digital technologies is an important and growing phenomenon and stakeholders are increasingly demanding statistical products to address questions on the topic,” the documents read.

“While the agency has been doing some feasibility work on Internet use by children, the incidence of cybercrime amongst Canadian businesses, and has developed some questions for the inclusion in various surveys, there remain important data gaps.”

ESDC is also interested in learning more about Canadians’ “computer literacy” and use of the Internet.

Source: Public servants scramble to fill data deficit on Liberals’ priorities | Toronto Star

How the Big Red Machine became the big data machine: Delacourt

As someone who likes playing with and analyzing data, I found Delacourt’s recounting of how the Liberals became the most data-savvy political party interesting:

The Console, with its maps and myriad graphs and numbers, was the most vivid evidence of how far the Liberal party had come in its bid to play catch-up in the data war with its Conservative and NDP rivals. Call it Trudeau 2.0. Just as the old Rainmaker Keith Davey brought science to the party of Trudeau’s father in the 1960s and 1970s, the next generation of Trudeau Liberalism would get seized with data, science and evidence in a big way, too.

And in the grand tradition of Davey, Allan Gregg and all the other political pollsters and marketers who went before them, this new squad of strategists set about dividing Canada’s electoral map into target ridings, ranked according to their chances of winning in them. In a 21st-century-style campaign, though, the distinctions would be far more sophisticated than simply “winnable” and “unwinnable” ridings. Trudeau’s Liberals divided the nation’s 338 electoral districts into six types, named for metals and compounds: platinum, gold, silver, bronze, steel and wood.

Platinum ridings were sure bets: mostly the few dozen that the Liberals had managed to keep in the electoral catastrophe of 2011. Gold ridings were not quite that solid, but they were the ones in which the party strategists felt pretty certain about their prospects. Silver ridings were the ones the Liberals would need to gain to win the election, while bronze ridings, the longer shots, would push them into majority government territory. Steel ridings were ones they might win in a subsequent election, and wood ridings were the ones where the Liberals probably could never win a seat, in rural Alberta for instance.
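Delacourt doesn’t describe the model behind the tiers, but the classification itself is easy to picture. A toy sketch with invented thresholds and win probabilities:

```python
# Toy version of the six-tier scheme; the thresholds and probabilities
# are invented, not the Liberals' actual model.
TIERS = [
    (0.90, "platinum"),  # near-certain holds
    (0.75, "gold"),      # strong prospects
    (0.55, "silver"),    # needed to win the election
    (0.40, "bronze"),    # majority territory
    (0.20, "steel"),     # next-election prospects
    (0.00, "wood"),      # probably never winnable
]

def classify(win_prob):
    """Map an estimated win probability onto a tier name."""
    for threshold, tier in TIERS:
        if win_prob >= threshold:
            return tier

for riding, p in {"Riding A": 0.95, "Riding B": 0.60, "Riding C": 0.05}.items():
    print(f"{riding}: {classify(p)}")  # platinum, silver, wood
```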

The Console kept close track of voter outreach efforts on the ground, right down to the number of doorsteps visited by volunteers and what kind of information they had gathered from those visits — family size, composition, political interests, even the estimated age of the residents. By consulting the Console, campaigners could even figure out which time of day was best for canvassing in specific neighbourhoods or which voters required another visit to seal the deal.

When the Liberal team unveiled the Console to Trudeau, he was blown away. He told his team that it was his new favourite thing. He wanted regular briefings on the contents of the program: where it showed the Liberal party ahead, and where fortunes were flagging and volunteers needed to do more door-knocking. Actually, he wondered, why couldn’t he be given access to the Console himself, so that he could consult it on his home computer or on his phone while on the road?

And that, Trudeau would say later, was the last he ever saw of the Console. “My job was to bring it back, not on the analysis side, but on the connection side — on getting volunteers to go out, drawing people in, getting people to sign up,” Trudeau said. Clearly he was doing something right on that score — Liberal membership numbers had climbed from about 60,000 to 300,000 within Trudeau’s first 18 months as leader.

Volunteers for the party would learn — often to their peril — that the leader was fiercely serious about turning his crowd appeal into useful data. Trudeau wasn’t known for displays of temper, but the easiest way to provoke him was to fall down on the job of collecting data from the crowds at campaign stops. Few things made Trudeau angrier, for instance, than to see Liberal volunteers surrounding him at events instead of gathering up contact information. “That was what I demanded. If they wanted a visit from the leader they had to arrange that or else I’d be really upset,” Trudeau said.

Source: How the Big Red Machine became the big data machine | Toronto Star

Charts: No, the Y-Axis Doesn’t Always Need to Start at Zero | Re/code

Having spent more than a year finding the right charts for my book, Multiculturalism in Canada: Evidence and Anecdote, I liked this little video on when an axis should start at zero and when it shouldn’t (the same logic applies to the x-axis; for median income data, starting at 0 made no sense):

If you write things on the Internet, angry email is inevitable. If you make charts on the Internet, angry email about those charts is inevitable. Especially if your charts sometimes use a y-axis that starts at a number other than zero. You see, an old book called “How to Lie With Statistics” has convinced people that truncated axes are a devilish tool of deception.

The truth is that you certainly can use truncated axes to deceive. But you can also use them to illuminate. In fact, you often have to mess with the y-axis in order to craft a useful image — especially because data sometimes reaches into negative territory and sometimes goes nowhere near zero. The right principle is that charts ought to show appropriate context. Sometimes that context includes zero, but sometimes it doesn’t. It’s long past time to say no to y-axis fundamentalism.
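The point is easy to demonstrate. A minimal matplotlib sketch with invented income figures: the same series plotted twice, once with a zero-based y-axis and once with the axis fitted to the data.

```python
import matplotlib.pyplot as plt

# Invented median-income-style series: the movement matters, but the
# values sit nowhere near zero.
years = [2010, 2011, 2012, 2013, 2014]
income = [68_200, 68_900, 69_800, 70_100, 71_400]

fig, (ax0, ax1) = plt.subplots(1, 2, figsize=(9, 3))

ax0.plot(years, income)
ax0.set_ylim(0, 80_000)  # the zero-based axis flattens the trend away
ax0.set_title("Axis starts at zero")

ax1.plot(years, income)  # default limits hug the data range
ax1.set_title("Axis fitted to the data")

for ax in (ax0, ax1):
    ax.set_xlabel("Year")
    ax.set_ylabel("Median income ($)")

plt.tight_layout()
plt.show()
```

Neither panel is lying, but only the second shows the roughly five per cent rise a reader would actually care about.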

Source: Charts: No, the Y-Axis Doesn’t Always Need to Start at Zero | Re/code

Long-form census could be reinstated for 2016, experts say

An early test of the incoming Liberal government, one that looks like it could be done:

The return of the long form, promised by Justin Trudeau during the election campaign, would yield vastly more reliable data and cost less than running another national household survey, the former heads of the agency say.

“It should be possible. I am certainly very hopeful. But [the decision] needs to be done very soon. It’s an enormous logistical operation,” said Ivan Fellegi, chief statistician from 1985 to 2008.

It’s “no problem” to reintroduce the long form in time for the 2016 census, said Munir Sheikh, head of the agency from 2008 to 2010. The questions needn’t change, he said – just the instructions at the top. “All they need to do is put on the front page that this is mandatory.”

The other step is for “cabinet to approve it as a census, which they can do at any time – it would take a matter of seconds.”

Researchers are already pressing for action. “Undoing these mistakes cannot wait; the time for action is now as Statistics Canada is on the cusp of launching the 2016 census,” says a letter signed this week by 61 academics and directors of research centres, including Statscan’s former assistant director Alain Bélanger.

Issuing an immediate order in council “is the only way to implement the long form in time for the census six months from now,” they said. “This must be one of the first moves made by the Liberal government of Mr. Trudeau. It would mark a clear break with the previous government and ensure that future social policies can be made on scientific grounds rather than ideological dogmatism.”

… The Liberal platform pledges to “immediately” restore the mandatory long form – and make Statistics Canada “fully independent.”

Mr. Sheikh, who resigned over the controversy in 2010, said having the agency operate at arm’s length from the government is an even more crucial step. “I would say that is more important than restoring the long-form census, because that really was the cause of the problem, that the government can interfere with Statscan on issues like this. If you have an independent agency, the census in the future wouldn’t be the cabinet or minister’s problem, it would be the chief statistician’s problem.”

Mr. Sheikh said “anyone who uses data” will benefit from the return of the census. The biggest beneficiaries would be governments at all levels, “which have to base their policies on reliable data. And then of course researchers, who use this data to determine social outcomes, the condition of households in terms of income, poverty, unemployment, the state of housing, transportation needs, the needs of ethnic minorities, language, the employment equity act. Any kind of social and economic policy issues you can think of really are related to the census.”

As well, “the census provides an anchor to all other surveys; [we] will have much more reliable data to check all other survey results against.”

Both former chief statisticians said the switch could save money by reducing printing costs and expenditures associated with the labour required to administer and analyze the separate household survey. The NHS was sent to about 4.5 million Canadian households while the 2006 long-form census was sent to 2.5 million dwellings. Running any census is a massive undertaking that typically takes years to plan. The total projected budget for the 2016 census – which had been planned as a mandatory short form and voluntary NHS – is $701.8-million.

Statistics Canada wouldn’t comment on whether it’s possible to make the changes in time for the 2016 census. “It’s a policy matter, and we can’t comment,” said spokesman Peter Frayne.

Other experts say it can be done. “It is inherently easier to return to a well-tested methodology” such as the traditional census, said Ian McKinnon, chair of the National Statistics Council. “If any statistical agency in the world can do it, Statistics Canada can.”

Reinstating the census “soon, both sends a signal of change of policy, and interest in basing policy on evidence – evidence-based decision-making, which I think is very healthy,” said Charles Beach, professor emeritus at Queen’s University and head of the Canadian Economics Association. Moreover, “doing something that is both cost effective and more useful, it’s an economic no-brainer.”

Source: Long-form census could be reinstated for 2016, experts say – The Globe and Mail

The Data Behind Radicalization

Findings of a recent study of 1,500 individuals radicalized in the US since WW II:

Compared to violent domestic terrorists on the Far Left and Far Right, Islamists stand out. They’re more likely to be young (between 18 and 28 years old), unmarried and unassimilated into American society. They are also more likely to be actively recruited to an extremist group.

But in other important ways, Islamist extremists in the U.S. as a whole — violent and nonviolent — are not so different from other extremists. People in the three groups were equally likely to have become radicalized while serving time in prison — complicating the narrative that Muslim prisoners are unusually likely to commit to extremism from behind bars — and to be composed of individuals who have psychological issues, are loners, or have recently experienced “a loss of social standing.”

“Social networks are incredibly important to radicalization, but that’s not unique to Islamists at all,” [researcher Patrick] James said. “There’s almost always a facilitator — a personal relationship with a friend or family member who’s already made that leap.”

Source: The Data Behind Radicalization « The Dish