Aug. 9, 2019
Analyzing big data – and lots of it – streamlined by new research computing cluster
How do we turn Calgary into the next Silicon Valley? What’s the impact of a university on a city? How does a city’s economy change when it increases emphasis on discovery and rapid technological advancement?
Dr. Alex Whalley, PhD, associate professor in the Department of Economics, is using big data to answer these questions, and figure out how we can turn innovation into good jobs.
“Why do we have a lot of highly skilled workers in Silicon Valley, but we don’t have the same sort of things in other areas? What can policy-makers do to try to move from one situation to another? We have to be creative to tease out what the effect of different policy choices would be,” Whalley says.
Teasing out those effects requires data, and lots of it. Whalley takes into account everything that makes a city what it is — climate, infrastructure, population, neighbourhoods, services — to determine what impact an innovation policy has on productivity, versus one of a city’s inherent qualities, like the appealing sunny climate in Silicon Valley.
“The amount of data is just taking off, and we have so much more granular, high-quality data than we’ve had before,” he says. This is great news for Whalley, but analyzing this newfound wealth of data requires serious computing power.
“You’re always bumping up against computing capacity,” he explains. “Sometimes you have to make compromises. You don’t have the capability to process all the datasets that you want, or it may take too much time to process, and you can’t spend two weeks on this one dataset, so you have to abandon that part of the project.”
To test his hypotheses, Whalley must run a large number of simultaneous calculations to model the economic shifts of a city over a long period of time. “Economists would love to do experiments in a lab, but we can’t randomly assign many economic policies. Instead, we try to find real-world events that approximate those experiments to see what the effects of those policies are.”
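To make the idea of many simultaneous calculations concrete, here is a minimal Python sketch of that kind of workload: a toy city-growth model (entirely hypothetical, not Whalley’s actual model, with made-up growth rates) evaluated under thousands of policy scenarios spread across CPU cores.

```python
import random
from multiprocessing import Pool


def simulate_city(policy_strength, years=50, seed=0):
    """Toy model: compound a city's output under an innovation policy.

    The growth rates and noise are illustrative placeholders only,
    not figures from any actual research.
    """
    rng = random.Random(seed)
    output = 100.0
    for _ in range(years):
        growth = 0.02 + 0.01 * policy_strength + rng.gauss(0, 0.01)
        output *= 1 + growth
    return output


if __name__ == "__main__":
    # 11 policy strengths x 200 random draws = 2,200 independent simulations
    scenarios = [(p / 10, 50, s) for p in range(11) for s in range(200)]
    with Pool() as pool:  # spread the independent runs across available CPU cores
        results = pool.starmap(simulate_city, scenarios)
    print(f"Ran {len(results)} policy scenarios in parallel")
```

On a research cluster, the same pattern scales up: each scenario is independent, so adding more processors directly shortens the time to an answer.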
Enter the newly built research computing cluster, an IT infrastructure undertaking that has vastly increased the university’s computing power. The new capacity adds 4,560 central processing units (CPUs), bringing the total to 16,000, along with 24 graphics processing units (GPUs), the critical components for the kind of intensive calculations that many projects require.
CPUs are the equivalent of the brain of a computer — they perform calculations via math and logic operations. GPUs perform parallel operations on multiple sets of data, doing repetitive calculations concurrently.
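As a rough illustration of that serial-versus-parallel distinction, here is a minimal Python sketch; NumPy’s vectorized arithmetic stands in for the kind of data-parallel work a GPU accelerates, and the data are made up.

```python
import numpy as np

data = np.random.rand(1_000_000)      # one million simulated observations

# CPU-style serial work: one arithmetic/logic step at a time, in order
serial_total = 0.0
for x in data:
    serial_total += x * x

# Data-parallel style: the same repetitive calculation applied to the whole
# array at once; a GPU would farm these operations out to thousands of
# cores working concurrently
parallel_total = float(np.sum(data * data))

print(serial_total, parallel_total)   # same answer, very different speed
```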
“This increased computing infrastructure will enable UCalgary researchers to work on problems not feasible before, and allow students to gain valuable skills in how to use High Performance Computing infrastructure as a tool to solve current and future grand challenge problems,” says Robert Fridman, senior analyst in Research Computing Services.
For Whalley, more computing power means more complete research projects finished in much less time: no more skipped analyses, and he can take full advantage of the growing databases of microeconomic data available.
“We can see what’s really driving the economic outcomes in a way we never have,” he says. “I’m excited to get to work and see what the new cluster is capable of.”
The Research Support Fund assists Canadian post-secondary institutions and their affiliated research hospitals and institutes with the expenses associated with managing research funded by the Tri-Council agencies (CIHR, NSERC and SSHRC). The annual grant helps institutions manage the financial demands of the indirect costs of research. By easing the institutional financial burden, the Research Support Fund helps the university create an environment where researchers can focus on their research, collaborate with colleagues, and translate their discoveries and innovations.