Ministry asks $30 per minute for data, despite Brison’s order to drop ATIP fees

As a reasonably heavy user of IRCC data, both that released on Open Data and that obtained through specific requests, I understand and appreciate both the cost-recovery aspect (responding takes time and resources) and the public-interest aspect (data helps inform discussion and debate).

But $30 per additional minute of search time? That is hard to justify on cost-recovery grounds, given that only staff time should be counted: $100 for the first 10 minutes plus $30 per minute thereafter works out to $1,600 per hour!
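The fee schedule's arithmetic can be sketched in a few lines (the function name is mine; the dollar figures are the ones reported in the article):

```python
def search_fee(minutes: int) -> int:
    """IRCC data-search fee as described above: $100 covers the
    first 10 minutes of search time, then $30 for each additional
    minute."""
    if minutes <= 10:
        return 100
    return 100 + 30 * (minutes - 10)

# One hour of an employee's search time:
print(search_fee(60))  # 100 + 30 * 50 = 1600
```

That $1,600-per-hour figure is what makes the cost-recovery rationale hard to credit.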

The federal immigration ministry is asking up to $30 per minute to process a public request for immigration data, despite the Liberal government’s directive last year to waive extra fees for access to information requests and commitment to making government information open by default.

One advocate of government transparency said the $30-per-minute proposed charge thwarts the intent of Treasury Board President Scott Brison’s fee-waiving directive, and another said such fees could work as a “deterrent” to members of the public looking for government information.

The request related to information that factored into a change in the government’s visa policy that allowed the passage of Canada’s trade deal with Europe.

Immigration, Refugees, and Citizenship Canada officials initially declined to make public the rate at which visa applications from Bulgaria and Romania were refused, unless the requester—The Hill Times—agreed to pay $100 for a 10-minute search of the department’s records, plus $30 for each additional minute it would take IRCC employees to find the data.

Mr. Brison (Kings-Hants, N.S.) instructed all government entities last year to waive fees associated with access to information requests—used by businesses, media, and the general public to obtain government information—beyond an initial $5 filing fee, as part of the government’s transparency platform.

The Hill Times used the Access to Information Act to request the most recent three-year visa refusal rate for Romania, Bulgaria, and Mexico, countries for which the Liberal government has scrapped or has pledged to scrap visa requirements since it came to power in 2015. The government has been criticized, including by the opposition Conservatives, for deciding to drop those requirements to grease the wheels of international relations, despite evidence that in the months leading up to the visa-lifting decisions none of the three countries satisfied some of the government’s formal criteria for eliminating a visa, including thresholds for rates of refused visa and asylum claims.

The immigration ministry provided some data for Mexico, but none for Romania and Bulgaria, citing a clause in the Access to Information Act that says the access law does not apply to “material available for purchase by the public.” The ministry’s response also cited regulations, specific to that department, which allow it to charge large sums for “statistical data that have not been published by the department.”

In effect, the data—which should be at the fingertips of decision makers in IRCC—was considered to fall outside of the scope of the Access to Information Act because the Immigration and Refugee Protection Regulations allow the department to charge money for data searches.

“It’s obviously an illegitimate interpretation of the act,” said Toby Mendel, executive director of the Centre for Law and Democracy in Nova Scotia, and an advocate for government transparency.

The Access to Information Act clause excluding material available for purchase “means material that you are selling, like a book,” not government data, said Mr. Mendel, who called it a “dishonest” interpretation of the act by the department.

The three-year visa refusal rate is a key figure used by the government to decide whether or not citizens of a particular country need to apply for a visa before travelling to Canada. Canada decided last year to waive the visa requirement for Romania and Bulgaria by December 2017, as part of what is widely seen as a quid pro quo for those two countries’ support of the Comprehensive Economic and Trade Agreement with Europe.

The government has not disclosed the latest visa refusal rate for those countries, but an April 2015 report from the European Commission, citing Canadian statistics, said the refusal rates in the first half of 2014 had been 16 per cent for Bulgaria and 13.8 per cent for Romania, which made hitting the target of four per cent over three years “quite difficult.”

…The fee starts at $100 for the first 10 minutes departmental employees spend searching for the requested information in their databases. After that, it increases to $30 per minute.

After being initially contacted by The Hill Times on April 20, the immigration department’s media relations team promised to provide the visa refusal rate for Romania and Bulgaria and respond to a series of questions about the fees charged under the Immigration and Refugee Protection Regulations in relation to requests under the Access to Information Act. The department had not responded by filing deadline May 2.

Source: Ministry asks $30 per minute for data, despite Brison’s order to drop ATIP fees – The Hill Times

True test of Trudeau’s expensive data devotion will be whether he follows the numbers – Politics – CBC News

Better data may not guarantee better policy and outcomes, but at least it can ensure more informed discussion and debate:

Justin Trudeau’s Liberals are a group that enthuses about “evidence-based policy” and “smarter decisions” and has concerned itself with “deliverology.”

And they are apparently hungry for more data.

“The challenge that we’re facing is one of — and we saw this more acutely a year ago in Vancouver — a dearth of data,” the prime minister said recently when asked about what his government might do about Toronto’s heated real estate market.

Adam Vaughan, a parliamentary secretary and Liberal MP in downtown Toronto, says there are theories about what’s happening within the city’s real estate market, but not enough is known about what’s actually going on.

“We’ve got to … get the data,” he told CBC’s Power & Politics. “We have to manage the data so that we can understand where the problems are emerging and deal with them quickly.”

Such concerns follow a spring budget that, between the promises of jobs and roads and social assistance, included new commitments to data: tens of millions of dollars to be spent collecting new numbers on health care, housing, transportation and other concerns.

More money for more data

The Liberals have promised $39.9 million for the creation of a new “Housing Statistics Framework,” while another $241 million will go to the Canada Mortgage and Housing Corporation to, in part, “improve data collection and analytics.”

The Canadian Institute for Health Information will receive $53 million to address “health data gaps” and strengthen “reporting on health system performance,” while $13.6 million will go to Statistics Canada to “broaden tourism data collection.”

Developing a “Clean Technology Data Strategy” will cost $14.5 million and Transport Canada will receive $50 million to establish a new “Canadian Centre on Transportation Data.”

Meanwhile, the new infrastructure bank will be committed to working with other levels of government and Statistics Canada to “undertake an ambitious data initiative on Canadian infrastructure.”

A week after the budget’s release, the government announced $95 million would be spent gathering data on the availability of child care.

So what might all these numbers add up to?

At a minimum, more data might give government a better understanding of what’s happening across the country. As one senior Liberal official notes, more data can also lead to the discovery of previously unrecognized problems.

Such data would then, in theory, inform and guide government decisions.

That’s an ideal of evidence-based policy, an aspiration for more rational politics that has arisen in recent years and might now be viewed as a technocratic rival to the emotional, anti-establishment populism that brought Donald Trump to the White House. Witness this month’s marches for science across the United States, which echoed a similar protest on Parliament Hill in 2012.

Marchers advance toward city hall during the March for Science Los Angeles on April 22. (Kyle Grillot/Reuters)

“Data allows you to know what is the scope of the problem you’re trying to solve, or is there a problem that actually needs solving, and to measure how you’re doing and if [your policy] is working,” said the senior Liberal official.

If all that data is made public, it could also foster a better policy debate.

Incomplete record on evidence

The signature first act of Trudeau’s government was to restore the mandatory long-form census, the cancellation of which galvanized concerns about the former Conservative government’s approach to evidence and policy.

The rest of their agenda in this regard remains a work in progress. A chief science adviser has not yet been appointed. Still pending are improvements to the annual reporting on the performance of government programs and reform of the estimates process, through which Parliament approves the government’s spending plans. New legislation for the parliamentary budget officer has been panned as too weak and restrictive.

Liberals nonetheless express interest in focusing on outcomes, not inputs: on what is accomplished with public money, not just what is spent. More information about what’s happening in and around the areas touched by public policy would help with that.

“Collecting the data is the first step in making policies that are informed by evidence and, even more importantly, actually evaluating public policies to see if they are doing what we hoped they would,” said Katie Gibbs, executive director of the group Evidence for Democracy.

“So it’s certainly important, but it’s still just the first step. Will they use the data? Will they listen to it? Even if it shows that some of their policies aren’t working? That will be the true test.”

Source: True test of Trudeau’s expensive data devotion will be whether he follows the numbers – Politics – CBC News

Human rights chief praises police oversight report’s focus on race and diversity

It all starts with data:

The provincial government’s commitment Thursday to require police watchdogs to collect race-based statistics is evidence Ontario is in “a very unique moment” when it comes to recognizing the need for such data, says Ontario’s chief human rights commissioner.

One day after the release of Ontario Justice Michael Tulloch’s broad-ranging report on police oversight in Ontario, Renu Mandhane said the judge’s work provides a detailed road map to rebuild trust between community and police oversight agencies at a time of “historic levels of distrust.”

Shortly after the release of Tulloch’s 129 recommendations — many aimed at increasing transparency within the police watchdogs — Ontario Attorney General Yasir Naqvi committed to act on the key recommendation that civilian oversight bodies, including the Special Investigations Unit (SIU), begin collecting demographic data such as statistics on race, ethnicity and indigenous status.

Currently, none of Ontario’s civilian watchdogs — the SIU, the Office of the Independent Police Review Director (OIPRD) and the Ontario Civilian Police Commission (OCPC) — collect statistics on race or any demographic data on religion, age, mental health status, disability, or indigenous status of complainants and alleged victims.

That move shows “the conversation has shifted in terms of the collection of data,” Mandhane said in an interview Friday, but stressed the importance of ensuring the data is both collected and then publicly reported; Tulloch recommended an advisory board be established to develop “best practices on the collection, management, and analysis of relevant demographic data.”

“There needs to be real thought about who is going to receive the data and making sure they have the resources to effectively analyze the data,” Mandhane said.

Tulloch’s race-based statistics recommendation was one of several praised by rights groups and advocates, who appreciated the emphasis placed on diversity, cultural training and the focus on indigenous communities. The report states Ontario’s oversight bodies must be “both socially and culturally competent.”

During consultations with First Nations communities in particular, Tulloch said there was consensus that the oversight bodies “lack cultural sensitivity and often are disrespectful of Indigenous peoples.” During consultations, he was told of cases where an SIU investigator arrived in a First Nations community following an incident, spoke briefly with someone from the community, and had no further contact.

“Equally troubling, some First Nations communities in the north described having to wait days for SIU investigators to arrive on scene. In some cases, matters were closed without talking to members of the community and the leadership,” the report states.

To begin to remedy fraught relationships, Tulloch recommended mandatory social and cultural competency training for watchdog staff — developed and delivered in partnership with the communities they serve.

The report also says Ontario’s police watchdogs should reflect their diverse communities, meaning the oversight bodies must take initiatives to hire people from communities currently under-represented within the organizations.

“This includes all individuals at the oversight bodies: the directors, the investigators, the adjudicators, and the staff dedicated towards outreach, communications, administration, affected persons services, and so forth,” the report states.

The move toward greater diversity is long overdue, said Julian Falconer, a Toronto lawyer who has represented many families of people killed by police and who also practices in Thunder Bay.

In his submission to Tulloch during the review process, he says he was “quite blunt” about the lack of diversity when it comes to the director of the SIU.

Source: Human rights chief praises police oversight report’s focus on race and diversity | Toronto Star

Andrew Coyne: Politicians need to forget about polls and do the right thing

Great column by Coyne, whose bottom line is that perhaps we should “rather trust the data”:

Liberals used to take a dim view of this sort of perception-based decision-making. When the Harper government claimed it didn’t matter if the official statistics showed crime rates falling to their lowest levels in decades, because people felt as if crime was rising, Liberals rightly scoffed. Now a similar fact-free feeling — the middle class is getting nowhere — is the foundation of their whole economic platform.

Liberals are by no means the only ones playing this game. Rather than answer questions raised by her signature proposal to subject every refugee, immigrant or tourist to a quiz on their belief in “Canadian values” — questions such as why this is needed, what it would accomplish, and what it would cost — Kellie Leitch refers to polls showing sizeable majorities of Canadians support the idea.

Likewise, those raising the alarm over Motion 103, unable to answer how a parliamentary motion with no legal force or effect could restrict free speech, have lately taken to citing polling data showing a majority of Canadians with varying concerns about the motion.

It’s easy enough to gin up a poll in support of just about anything, of course, depending on how you ask the question. The people waving them about today are in many cases the same ones who not long ago were railing about all the pollsters who failed to call Donald Trump’s victory (in fact, they called the vote to within a percentage point: Clinton beat him by two points, instead of the three points in the consensus forecast).

But let’s suppose these polls are genuine reflections of current public opinion. That’s a good answer to the question: what does the public think on these issues? It’s no answer at all to the question: are they right to think so? Yet that is how they are invoked: if that’s how the public feels, it must be true.

Skeptics are challenged, in tones of indignation: what, so you’re saying that millions of Canadians … are wrong?

Well, yes. What of it?

“Millions of people” are quite capable of believing things that aren’t true, particularly on matters to which they have given very little thought and with which they have little personal experience. The political science literature is filled with examples of people cheerfully offering their opinions to pollsters on entirely fictional events and people. As Will Rogers used to say, “there’s lots of things that everybody knows that just ain’t so.”

Climate skeptics rightly make the point that the overwhelming consensus of expert opinion on global warming is not enough, in itself, to prove it is right. Science is not a popularity contest: throughout history, individuals have stood against conventional opinion, and been vindicated. But let 1,340 randomly selected Canadians have their dinner interrupted to answer a question from a telemarketer about a subject they’ve barely heard of, and suddenly it’s gospel.

Experts, it is true, can sometimes be mistaken. But if experts can get it wrong, the public is at least as capable of it. And yet these days we are enjoined to reflexively reject the former, and just as reflexively to believe the latter. Perhaps we should rather trust the data.

Reevely: Massive collection of race-based data part of Ontario’s anti-racism strategy

It all starts with having more and better data and ensuring that the data is consistent and reliable.

While there will be differing interpretations of what the data means, without having good data, society is flying blind when dealing with complex issues. While data and evidence are never perfect, they do provide a sounder basis for policy choices and political discussion:

Ontario will start collecting masses of race-based data on the programs in its biggest ministries this year, hoping to use the information to find and help stamp out systemic racism.

That’s a big deal in the provincial government’s new anti-racism strategy, a three-year plan that took a year to create.

Much of the strategy is high-level stuff, scooping together things particular ministries were doing and calling it a plan. That includes a training program for staff in the courts system so they better understand aboriginal culture, trying to make the boards of Children’s Aid Societies more diverse and having the first black judge on the Ontario Court of Appeal assess the way police forces are overseen. All of it noble, some of it genuinely consequential, most of it already underway.

There’s also this: “To address racial inequities, we need better race-based disaggregated data — data that can be broken down so that we further understand whether specific segments of the population are experiencing adverse impacts of systemic racism,” the strategy says.

They’re going to start with health, primary and secondary education, justice and child welfare. That is, in the areas where government policy really makes and breaks lives.

The systems in those various ministries generate boatloads of data already, from wait times for surgeries to rates of readmission for patients in particular hospitals, from school occupancy numbers to results from Grade 6 math tests, from trial times to recidivism rates. “Disaggregating” that data means pulling apart the stats by race, routinely, in a way that typically raises more questions than it answers.

So if 15 per cent of the Queensway Carleton Hospital’s patients are back in hospital within 30 days of being discharged, we’ll monitor whether the stat is the same for members of different racial groups. If not, why is that?
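As a sketch of what “disaggregating” such a statistic involves, consider computing a readmission rate overall and then per group (the records, group labels, and function are purely hypothetical, for illustration only):

```python
# Hypothetical patient records for illustration only.
# Each record: (self-identified group label, readmitted within 30 days?)
records = [
    ("A", True), ("A", False), ("A", False), ("A", False),
    ("B", True), ("B", True), ("B", False), ("B", False),
]

def readmission_rate(records, group=None):
    """Share of patients readmitted, optionally restricted to one group."""
    subset = [readmitted for g, readmitted in records
              if group is None or g == group]
    return sum(subset) / len(subset)

print(readmission_rate(records))       # 0.375 overall
print(readmission_rate(records, "A"))  # 0.25
print(readmission_rate(records, "B"))  # 0.5 -- a gap that raises "why?"
```

The mechanics are trivial; as the article notes, the hard parts are consistent collection, privacy protection, and deciding what to do when the disaggregated numbers diverge.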

Pulling all this together means devising a consistent approach so the information is collected, crunched and presented in a standard form, while protecting privacy. Which is hard enough, and that’s before we get to what we’ll do with the information.

This is, historically, very touchy. Systemic racism “can be unintentional, and doesn’t necessarily mean that people within an organization are racist,” the government says, but being accused of systemic racism sets off the same sorts of reactions as being accused of the traditional kind.

Here in Ottawa, the police spent two years tracking race-related data on their traffic stops, following a human rights complaint by a black teenager who said he’d been pulled over only because an officer was suspicious of him driving a Mercedes (which was his mother’s). When researchers managing the study released their findings last fall, they reported that drivers the police identified as black or Middle Eastern were stopped at rates many times their population shares.

A companion study found some officers deliberately misrecording the races of people they’d stopped, staying away from some parts of town and otherwise behaving differently to shift the stats so they’d suggest less racism. To whatever extent police officers changed their behaviour so as to actually behave less racistly when they knew their work was being measured, that’s a good thing in itself, of course.

Ontario’s chief human-rights commissioner Renu Mandhane argued the stats are consistent with racial profiling; Chief Charles Bordeleau of the police defended his officers, saying there’s nothing going on in the police force beyond what’s normal in society at large.

(Something similar happened when the Toronto police released statistics on the people they “carded” — stopped in the street to ask for their ID papers. Way more black and brown people than whites, for reasons that were argued about for years. Yasir Naqvi, the then-provincial minister responsible for policing, imposed new rules scaling the practice back.)

You can use such statistical findings in a lot of ways, including flatly racist ones. Maybe the police are irrationally suspicious of certain visible minority groups. Maybe certain visible minority groups are worse drivers. Maybe they’re more likely to be driving in areas patrolled by police — a possibility that opens whole vistas of speculation about why either of those things might happen. Maybe it’s a combination of things. Collecting the data doesn’t solve the problem.

We can argue about why people in different ethnic groups have different dealings with the authorities, and heaven knows we do. Sometimes to a fault. But at least with traffic stops and carding, nobody can say any longer that it doesn’t happen, and that’s a step forward.

Source: Reevely: Massive collection of race-based data part of Ontario’s anti-racism strategy | Ottawa Citizen

ICYMI – Hans Rosling: A truth-teller in an age of ‘alternative facts’ – Macleans.ca

Good profile of Rosling, who was so creative and insightful in his presentation of data. Particularly important to note in the context of “alternative facts” and fake news:

… Rosling joined the Karolinska Institute in Stockholm, where he discovered his students, and even his colleagues, had crude and misinformed ideas about poverty. To them, and most policy makers in the western world, the “third world” was one uniform mess of war and starving orphans. They did not understand the vastly different experiences of a family living in a Brazilian favela and one living in the Nigerian jungle, nor did they realize how rapidly these countries and economies were evolving. Without this knowledge, how could people make informed decisions about diseases, aid, or economics? “Scientists want to do good, but the problem is that they don’t understand the world,” he said.

Working with his son, he developed software that explained data through easily understood graphics. He launched the Gapminder site, which allowed people to explore and play with data that was otherwise hidden in the archives of the OECD, the World Bank and the United Nations. And Rosling began giving increasingly popular public lectures. His TED Talks and online videos went viral, as he explained global population growth using plastic boxes, or the relationship between child mortality and carbon dioxide emissions with Lego bricks.

https://embed.ted.com/talks/hans_rosling_shows_the_best_stats_you_ve_ever_seen

Rosling had a natural charm and a ready sense of humour that grabbed your attention and kept it, even if he was talking about the most esoteric elements of statistical science. His message—that the world is getting better but we need to understand the data if we want to help those being left behind—resonated not just with the public, but among philanthropists and government leaders.

From Davos, to the White House, to the offices of the World Bank, Rosling could be found tirelessly preaching the gospel of facts, data, and truth. For generations, aid and charity decisions were taken for reasons of vanity, simplicity or self-interest. Billionaires gave money in ways that would grant them the most publicity. Bureaucrats channelled aid dollars to projects that were the easiest to administer. And western governments built dams in Africa solely to help their own construction companies. The real impact of aid on poverty was rarely considered and almost never measured. Rosling helped change that, by explaining to donors that ignorance is the first battle that must be fought in the war against extreme poverty.

This idea, as obvious as it seems in hindsight, was new. And it mattered. Governments listened. Donors became converts to Rosling’s religion of evidence-based policy. He was not its only apostle, but he was among its most well-known, and the only one with millions of views on YouTube.

Ironically, Rosling had a much more critical assessment of his own influence on the world. He called himself an “edutainer”, and in a 2013 interview he bemoaned the fact that the average Swede still overestimated the birth rate in Bangladesh: “they still think it’s four to five.”

“I have no impact on knowledge,” he said. “I have only had impact on fame, and doing funny things, and so on.”

The deputy prime minister of Sweden, Isabella Lövin, disagreed. After Rosling’s death was announced, she wrote: “He challenged the whole world’s view of development with his amazing teaching skills. He managed to show everyone that things are moving forward … I think the whole world will miss his vision and his way of standing up for the facts—unfortunately it feels like they are necessary more than ever at the moment.”

A few hours after the announcement of Rosling’s death, Betsy DeVos, a Republican donor who believes the American school system should be reformed to “advance God’s Kingdom” was confirmed as Donald Trump’s Secretary of Education.

Source: Hans Rosling: A truth-teller in an age of ‘alternative facts’ – Macleans.ca

Garbage in, garbage out: Canada’s big data problem

A reminder that despite the restoration of the Census, there still remain significant gaps in the collection, methodologies and dissemination of statistical data by the government:

In a recent article in the Toronto Star, Paul Wells lays out what he sees as Prime Minister Trudeau’s game plan for slowing Canada’s brain drain and making science pay. “Over the next year,” he writes, “the Trudeau government will seek to reinforce or shore up Canada’s advantage in three emerging fields: quantum tech, artificial intelligence and big data and analytics.”

As he should. If that’s the plan, it’s a good one. Canada’s future prosperity depends on our ability to innovate and retain the best talent in those three fields.

What we call “big data analytics” works by finding previously unknown patterns in the huge blocks of data that very large organizations — governments, for example — grow around themselves constantly, like coral. Finding those patterns can point the way to new efficiencies, new ways to fight crime and disease, new trends in business. But as with any complex system, what you get depends on what you put in. If the inputs aren’t accurate, the results won’t be, either. So before we embrace the “big data revolution”, we may want to look first at the worsening quality of the data our federal government produces, and that businesses, activists and social planners use.

Take something as basic as divorce. Statistics Canada first started reporting marriage rates in 1921 and divorce rates in 1972; it stopped collecting both data streams in 2011, citing “cost” concerns.

Marriage and divorce rates are exactly the kinds of data streams consumers of big data want collected, because they affect so many things: government policies, job markets, the service sector, housing starts — you name it. Having abandoned the field now for five years, StatsCan’s data volume on marital status isn’t nearly as useful as it might have been.

Take wildlife conservation. Recently an Ontario provincial backbencher proposed a private member’s bill to allow for unlimited hunting of cormorants. The bill’s proponent says the species is experiencing a population explosion. And we don’t know if he’s right or wrong — because the feds stopped collecting that data in 2011.

Canada used to publish statistical reports that were every bit as good as the Americans’ — in some cases, better. Then we stopped.

Here’s another big data blind spot: gasoline imports. After having reported data on gasoline imports regularly since 1973, StatsCan has been suppressing the numbers since 2013 due to what it calls “privacy” concerns. In the last reporting year, 2012, a staggering amount of imported gasoline came into the country — almost 4 billion litres.

Now, if you were thinking of expanding your oil refinery, or wanted to know more about how dependent this country is on foreign fuel, this would be pretty precious data — the kind you’d probably pay for. But the data aren’t reliable — any more than the StatsCan data on gasoline demand by province, which we use to work out whether carbon taxes are actually reducing demand for gasoline. It’s bad data; it has been for years. You’d think someone in the higher echelons of the federal or provincial governments would get annoyed.

Combing through StatsCan’s archive of reports can be a bewildering experience, even for experts. Its online database, CANSIM, is easy enough to use. It’s the reports themselves that sometimes fail you.

Say you want to understand trends in Ontario’s demand for natural gas. You’d start by looking at CANSIM table 129-0003, which shows an increase in sales of natural gas in 2007 over 2006 of 85 per cent. “Ah,” you think to yourself, “that must be because of the conversion of coal-burning plants to gas.” But no, that change occurred years later. Ask StatsCan and they’ll tell you that they changed their methodology that year — but didn’t bother re-stating the previous years’ numbers under the same methodology. Individually, the numbers are accurate — but the trend stops making sense.
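The trap described above can be sketched with a toy series (the numbers here are hypothetical, chosen only to mirror the 85 per cent jump; they are not actual CANSIM figures): values measured under an old methodology through 2006 and a new one from 2007 on, with earlier years never restated.

```python
# Hypothetical sales figures for illustration only -- not CANSIM data.
# Old methodology through 2006; new methodology from 2007, with the
# earlier years never restated under the new approach.
sales = {2005: 98, 2006: 100, 2007: 185, 2008: 187}

def yoy_change(series, year):
    """Year-over-year percentage change, naively spanning any
    methodology break in the series."""
    return 100 * (series[year] - series[year - 1]) / series[year - 1]

print(round(yoy_change(sales, 2007)))  # 85 -- the methodology break,
                                       # not a real jump in demand
print(round(yoy_change(sales, 2008)))  # 1 -- the within-methodology trend
```

Each individual number is accurate; it is only the year-over-year comparison across the break that stops making sense, which is exactly the complaint about table 129-0003.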

StatsCan changed its methodology again this year; it now warns researchers to take care when comparing current and historical data. That’s an improvement over changing the methodology without telling anyone but it isn’t very helpful for understanding long-term trends.

And this isn’t just StatsCan’s problem. The National Energy Board published an excellent report showing where Canada’s crude ends up in the United States. Industry analysts use the numbers to understand the reasons why light and heavy crude are selling for what they’re selling for south of the border.

The NEB stopped reporting the data after September 2015. Ask why, and this is the response you get: “The Board has decided to discontinue publication of this data while we re-evaluate our statistical products.” That, of course, was a year ago.

Source: Garbage in, garbage out: Canada’s big data problem

Public servants scramble to fill data deficit on Liberals’ priorities

Understandable, given the difficult cut choices recommended by the public service and approved at the political level (compounded by the previous government’s anti-evidence and anti-data bias), with predictable impact on the quality of analysis:

If Prime Minister Justin Trudeau really is a data geek, he couldn’t have been encouraged by what some federal departments had on hand.

Internal documents obtained by the Star suggest years of belt-tightening have led to a data deficit in Ottawa, gaps that may “create challenges” in delivering on the Liberal government’s priorities.

Early childhood learning and child care, expanding parental leave, increasing youth employment, and expanding training for apprentices and post-secondary students all figured prominently in the Liberals’ election platform.

But as of November, the department responsible for making good on those promises was worried it didn’t have enough concrete data to deliver.

“Spending on surveys has been reduced over the last several fiscal years and has been concentrated on priority areas to help manage financial pressures,” read documents prepared for the senior public servant at Employment and Social Development Canada (ESDC).

The Liberals have made “evidence-based decision-making” a watchword for their early days in office, and senior staff in the Prime Minister’s Office are known for their attachment to data-driven strategy.

A spokesperson for Families, Children and Social Development Minister Jean-Yves Duclos said the issue is government-wide, not isolated in their department.

“This is an issue that all ministers are facing right now. We do know that there are gaps in the data the government owns,” Mathieu Filion told the Star in an email.

“There are many discussions on the matter with different minister’s offices as to see what will be done to acquire more data.”

According to the November documents, Statistics Canada was largely preoccupied with the restoration of the long-form census, but had identified a number of priority files.

Along with ESDC, StatsCan was looking to revive “longitudinal surveys” to fill in gaps. Longitudinal surveys are more expensive and time-consuming than other methods of collecting data, but the documents suggest they can give greater insight into “the dynamics of life events” and have a greater payoff when continued over a number of years.

StatsCan’s wish list includes greater labour market information (specifically aboriginal participation, unpaid internships, temporary foreign workers, and worker mobility), better information on children’s physical and mental health development, and more data on Canada’s aging population and the resulting effect on the economy and the health-care system.

The agency says the digital economy remains largely in the dark, as well.

“The use of digital technologies is an important and growing phenomenon and stakeholders are increasingly demanding statistical products to address questions on the topic,” the documents read.

“While the agency has been doing some feasibility work on Internet use by children, the incidence of cybercrime amongst Canadian businesses, and has developed some questions for the inclusion in various surveys, there remain important data gaps.”

ESDC is also interested in learning more about Canadians’ “computer literacy” and use of the Internet.

Source: Public servants scramble to fill data deficit on Liberals’ priorities | Toronto Star

How the Big Red Machine became the big data machine: Delacourt

As someone who likes playing with and analyzing data, I found Delacourt’s recounting of how the Liberals became the most data-savvy political party interesting:

The Console, with its maps and myriad graphs and numbers, was the most vivid evidence of how far the Liberal party had come in its bid to play catch-up in the data war with its Conservative and NDP rivals. Call it Trudeau 2.0. Just as the old Rainmaker Keith Davey brought science to the party of Trudeau’s father in the 1960s and 1970s, the next generation of Trudeau Liberalism would get seized with data, science and evidence in a big way, too.

And in the grand tradition of Davey, Allan Gregg and all the other political pollsters and marketers who went before them, this new squad of strategists set about dividing Canada’s electoral map into target ridings, ranked according to their chances of winning in them. In a 21st-century-style campaign, though, the distinctions would be far more sophisticated than simply “winnable” and “unwinnable” ridings. Trudeau’s Liberals divided the nation’s 338 electoral districts into six types, named for metals and compounds: platinum, gold, silver, bronze, steel and wood.

Platinum ridings were sure bets: mostly the few dozen that the Liberals had managed to keep in the electoral catastrophe of 2011. Gold ridings were not quite that solid, but they were the ones in which the party strategists felt pretty certain about their prospects. Silver ridings were the ones the Liberals would need to gain to win the election, while bronze ridings, the longer shots, would push them into majority government territory. Steel ridings were ones they might win in a subsequent election, and wood ridings were the ones where the Liberals probably could never win a seat, in rural Alberta for instance.
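The six-tier scheme described above is, at bottom, a ranking of ridings by estimated winnability. A minimal sketch of that idea in Python — with probability thresholds entirely invented for illustration, since the book doesn’t say how the Liberal strategists actually scored ridings:

```python
# Hypothetical tier boundaries: (minimum estimated win probability, tier).
# The tier names come from Delacourt's account; the numbers are made up.
TIERS = [
    (0.90, "platinum"), (0.75, "gold"), (0.55, "silver"),
    (0.35, "bronze"), (0.15, "steel"), (0.00, "wood"),
]

def classify(win_probability):
    """Map an estimated win probability to a riding tier."""
    for threshold, name in TIERS:
        if win_probability >= threshold:
            return name
    return "wood"

print(classify(0.95))  # platinum -- a sure bet
print(classify(0.60))  # silver   -- needed to win the election
print(classify(0.05))  # wood     -- probably never winnable
```

The useful property of a scheme like this is that it turns a continuous estimate into a small set of categories that volunteers and organizers can act on directly.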

The Console kept close track of voter outreach efforts on the ground, right down to the number of doorsteps visited by volunteers and what kind of information they had gathered from those visits — family size, composition, political interests, even the estimated age of the residents. By consulting the Console, campaigners could even figure out which time of day was best for canvassing in specific neighbourhoods or which voters required another visit to seal the deal.

When the Liberal team unveiled the Console to Trudeau, he was blown away. He told his team that it was his new favourite thing. He wanted regular briefings on the contents of the program: where it showed the Liberal party ahead, and where fortunes were flagging and volunteers needed to do more door-knocking. Actually, he wondered, why couldn’t he be given access to the Console himself, so that he could consult it on his home computer or on his phone while on the road?

And that, Trudeau would say later, was the last he ever saw of the Console. “My job was to bring it back, not on the analysis side, but on the connection side — on getting volunteers to go out, drawing people in, getting people to sign up,” Trudeau said. Clearly he was doing something right on that score — Liberal membership numbers had climbed from about 60,000 to 300,000 within Trudeau’s first 18 months as leader.

Volunteers for the party would learn — often to their peril — that the leader was fiercely serious about turning his crowd appeal into useful data. Trudeau wasn’t known for displays of temper, but the easiest way to provoke him was to fall down on the job of collecting data from the crowds at campaign stops. Few things made Trudeau angrier, for instance, than to see Liberal volunteers surrounding him at events instead of gathering up contact information. “That was what I demanded. If they wanted a visit from the leader they had to arrange that or else I’d be really upset,” Trudeau said.

Source: How the Big Red Machine became the big data machine | Toronto Star

Charts: No, the Y-Axis Doesn’t Always Need to Start at Zero | Re/code

Having spent more than a year finding the right charts for my book, Multiculturalism in Canada: Evidence and Anecdote, I liked this little video on when to start the axis at 0 and when not to (the same applies to the x-axis; for median income data, starting at 0 made no sense):

If you write things on the Internet, angry email is inevitable. If you make charts on the Internet, angry email about those charts is inevitable. Especially if your charts sometimes use a y-axis that starts at a number other than zero. You see, an old book called “How to Lie With Statistics” has convinced people that truncated axes are a devilish tool of deception.

The truth is that you certainly can use truncated axes to deceive. But you can also use them to illuminate. In fact, you often have to mess with the y-axis in order to craft a useful image — especially because data sometimes reaches into negative territory and sometimes goes nowhere near zero. The right principle is that charts ought to show appropriate context. Sometimes that context includes zero, but sometimes it doesn’t. It’s long past time to say no to y-axis fundamentalism.
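The effect of a truncated axis can be made concrete with a little arithmetic. The values below are invented, not taken from the Re/code video; the point is how the choice of baseline changes the *visual* size of the same difference:

```python
# Two data points with a 10 per cent real difference.
low, high = 50.0, 55.0

def visual_ratio(value_a, value_b, baseline):
    """Ratio of plotted bar heights when the y-axis starts at `baseline`."""
    return (value_b - baseline) / (value_a - baseline)

print(visual_ratio(low, high, 0))   # 1.1 -> bars look 10% apart
print(visual_ratio(low, high, 49))  # 6.0 -> same data looks 6x apart
```

Cut the other way, the same arithmetic shows why zero-baseline fundamentalism fails: if the meaningful variation in a series lives between 50 and 55, forcing the axis to zero flattens it into near-invisibility.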

Source: Charts: No, the Y-Axis Doesn’t Always Need to Start at Zero | Re/code