Aug. 1, 2018

What do we do with all this data? How joining smart sensors and smart data analysis can make data more useful

All around us, devices and sensors collect data. Not just our personal devices that can track our movements and our online behaviour, but large-scale data systems that monitor weather trends, health-care usage, traffic patterns and more. UCalgary researchers look at how we harness this data and make it useful, and what impact advanced data analytics can have on our daily lives.

To get a sense of how sensors can affect our daily lives, take a walk through the Robotics and Sensor Network Group lab at the University of Calgary. Behind a nondescript door in the high-tech Information and Communication Technology building, a dozen PhD and master's students pore over computer monitors filled with formulas, algorithms, databases, satellite images, 3D renderings and more. Robotic components – circuit boards, a set of camera “eyes”, a partially disassembled orange robot – are scattered about. Everyone is focused. The only sound is the clatter of fingers on keyboards.

Studying under Dr. Henry Leung, PhD, a professor in the Department of Electrical and Computer Engineering in the Schulich School of Engineering, these sensor-savvy scientists are fusing data and robotics to make our lives, cities and industries smarter, safer and more efficient. They're working on everything from fundamental research and conceptual frameworks to proofs of concept and working prototypes.

In one corner of the lab, one of the “robot guys” is fussing with two synced camera lenses set in rectangular sockets like a pair of eyes in glasses. He wants them to move and function like human eyeballs – the overlapping fields of view of two movable cameras can produce better 3D depth perception – while minimizing component costs.

Nearby, a master’s student is working on 3D mapping using parts from a Kinect for Xbox. It’s for a five-foot-tall black-and-orange robot (which looks a little like Rosie the robot maid from “The Jetsons”). The robot relies on sensors to self-map and self-navigate its way around an apartment without having been pre-programmed with a layout. One of its intended functions is to act as a low-cost home assistant for seniors, performing tasks like pouring a glass of water or using voice controls to call for help if a person falls.

The robot isn't yet strong enough to help a person get up from a fall, but that is a longer-term goal. With a support robot like this, a senior could potentially live independently at home for longer, rather than relocating to a care facility. It’s just one example of the different ways sensors can help improve quality of life.

How sensors create artificial intelligence

Leung’s lab specializes in data fusion – combining different types of data. “It’s just like the brain,” he says. “Your brain can effectively combine your vision, your hearing, your smell, your taste, so we are using technology to develop the human process – how we can interpret, combine information, make sense of sensor information, and put it together in a consistent way to make decisions.” This is, essentially, artificial intelligence, says Leung: “It’s how you can explore the information from the sensors and understand it as a human.”

How sensors disrupt industries

There are hundreds of billions of sensors on our planet – so many that it's virtually impossible to count them all. Multiple sensors are embedded in everything from our smartphones, cars and homes to devices in our workplaces, cities and industrial infrastructure.

Many of these embedded sensors are connected. They make up the Internet of Things – an ever-expanding assortment of “smart” consumer devices and industrial machines that “talk” by sending and receiving data over the Internet. The U.S. National Science Foundation estimates there will be 50 billion of these smart, connected things by 2020 – and not long after that, the number of sensors in the world is expected to exceed 1 trillion.

Beyond the tech world, sensor data is disrupting industries around the globe including manufacturing, automotive, aviation, health care, energy and utilities, and more. Businesses are investing billions into the research and development of sensor hardware, data collection, cloud platforms, software and analysis. The global Internet of Things market is projected to grow from US$157 billion in 2016 to US$457 billion by 2020, according to forecast firm GrowthEnabler. A global sensor economy is taking shape.

How to fix the Internet of Things

While the exponentially expanding number of Internet of Things devices implies the world is becoming increasingly interconnected – and it is – the connections aren't happening as easily or as universally as it may seem.

The biggest problem with the Internet of Things, says Dr. Steve Liang, PhD, is that most sensor data is used for only one purpose. “Most devices and applications are one-off,” says Liang, an associate professor in the Department of Geomatics Engineering at the Schulich School of Engineering and director of the GeoSensorWeb Laboratory.

For example, a thermostat in a baby’s bedroom. Or a video doorbell. Or a moisture detection system that can alert you to leaky pipes. These siloed sensor systems don’t – and typically can’t – talk to each other. Neither do sensor systems built for similar tasks, such as motion detectors from different manufacturers. While combining data from different sensors may sound simple, it's actually quite complex.

“Technically, it can be very difficult to aggregate different kinds of information – the semantics of what we are measuring is very important,” says Liang. “When you want to design a system that accommodates for any sensor, lots of contextual information is needed: Time, location, unit of measurement, observed property, these are just some examples. And then there are multiple times and locations.”

Consider a sample of water or blood. It's collected at a certain time and location, then sent to a lab and tested a few days later. The time the sample was taken differs from the time the result was produced, and the test happens at a different location from where the sample was collected. To use this data, all of that information matters. What if the testing sensor in the lab wasn’t calibrated? Then a different sensor is needed, and the retest generates a new sampling time and location. All of this must be aggregated for samples to be comparable.
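
To make that bookkeeping concrete, here is a minimal sketch of a single observation record (the field names loosely echo the kinds of context Liang lists – time, location, unit, observed property; all values are invented for illustration):

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """One measurement plus the context needed to compare it with others."""
    observed_property: str   # what was measured, e.g. "lead concentration"
    result: float            # the measured value
    unit: str                # unit of measurement
    phenomenon_time: str     # when the sample was taken (ISO 8601)
    result_time: str         # when the lab produced the result
    sample_location: tuple   # where the sample was collected (lat, lon)
    test_location: tuple     # where the test was performed (lat, lon)

# A water sample collected on a Monday but tested at a lab days later:
obs = Observation(
    observed_property="lead concentration",
    result=4.2,
    unit="ug/L",
    phenomenon_time="2018-06-04T09:00:00Z",
    result_time="2018-06-07T14:30:00Z",
    sample_location=(51.05, -114.07),
    test_location=(51.08, -114.13),
)
# A retest on a recalibrated instrument would be a *new* Observation with
# its own result time and test location -- both records are needed to keep
# samples comparable.
```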

“We have tons of examples like this,” says Liang. “And the more complexity you add in – consider data from cameras that move, like in a self-driving vehicle – the more a proper architecture is required.” Aggregating sensor data is not only technically difficult, it’s made even more challenging by the lack of global standards for how to do it.

How to create international data standards

Liang has long advocated for an open industry standard for how Internet of Things devices, applications and data communicate. He is one of the project editors for the Open Geospatial Consortium standard SensorThings API, an international consensus framework that provides an open and uniform way to connect devices and to handle device-to-cloud and cloud-to-cloud communications. “The more we can coordinate the data, the more useful we can make it,” he says.
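
In practice, a SensorThings-compliant server exposes sensor data through a uniform REST interface. The sketch below queries recent observations from a hypothetical server – the server URL and datastream ID are made up, while the entity names and `$`-prefixed query options come from the published standard:

```python
import requests

# Hypothetical SensorThings service; any compliant server exposes
# the same entity names and query syntax.
BASE = "https://sensors.example.org/v1.0"

# Fetch the ten most recent observations from one datastream.
resp = requests.get(
    f"{BASE}/Datastreams(42)/Observations",
    params={"$top": 10, "$orderby": "phenomenonTime desc"},
)
resp.raise_for_status()

for obs in resp.json()["value"]:
    print(obs["phenomenonTime"], obs["result"])
```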

Liang is also the founder and CEO of SensorUp Inc., a Calgary-based startup that is using SensorThings API’s open standards for its non-proprietary system that aggregates all kinds of sensor information into one cloud platform.

The company was part of the 2018 Creative Destruction Lab Rockies, a Haskayne School of Business-based bootcamp helping science-based startups commercialize their technologies. SensorUp has raised $2 million in seed funding – and it’s just one of several CDL-Rockies businesses founded by UCalgary researchers that are creating products using sensors, Internet of Things and artificial intelligence technologies.

How the live data trend is taking shape

One of SensorUp’s most innovative products so far is an air pollution sensor, which costs less than $100. People can build the device themselves in workshops put on by the company and then install it outside their homes. The SensorUp program is called Smart Citizens for Smart Cities and it has already recruited more than 500 people in 12 cities across Canada. Live air quality data from dozens of neighbourhoods in Calgary, for example, is already available online in easy-to-understand charts that are colour coded, time coded by the hour and searchable by date. 

The SensorUp device, which looks a little like a square white beehive, measures temperature, humidity and atmospheric particulate matter with a diameter of less than 2.5 micrometres (PM2.5), a measure of fine pollutants. It has a central processing unit and Wi-Fi. And the data is available almost immediately: the sensor collects readings every five minutes and sends them to a cloud server, which then publishes the data online.
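
The device's read-and-upload loop might look something like this minimal sketch (the upload endpoint, payload shape and sensor-reading function are all hypothetical):

```python
import time
import requests

UPLOAD_URL = "https://cloud.example.org/observations"  # hypothetical endpoint

def read_pm25() -> float:
    # On the real device this would poll the particulate sensor over its
    # serial or I2C interface; a dummy value stands in here.
    return 8.5  # ug/m3

while True:
    reading = {
        "pm25": read_pm25(),
        "time": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
    }
    # Push the reading to the cloud server, which publishes it online.
    requests.post(UPLOAD_URL, json=reading, timeout=10)
    time.sleep(300)  # one reading every five minutes
```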

Most cities have only one or a few air monitoring devices and each can cost upwards of $1.5 million, says Liang. Poor air quality is a rising cause of death around the world. Based on estimated exposure to PM2.5, the World Health Organization has determined that ambient air pollution is one of the largest health and environmental risks we face.

“We want to know the quality of the air that we breathe in our own backyard, not the reading from one sensor in our city that is 10 kilometres from our house,” says Liang. “The problem we want to solve is to develop a system that empowers anyone to quickly deploy an open-source sensor that is readily available. The sensors can be used from different locations and the data from all the sensors can be joined together in our cloud system. That way we know the ambient air quality of the day. It’s dependable and it’s way cheaper.”

The ability to create such a network is a result of the availability of cheaper and higher-quality open-source hardware. “It’s the democratization of equipment,” says Liang. “The quality of the sensors we are using can never beat the results of a $1.5-million sensor. However, when we have a lot of the cheaper sensors, we can use this data to have the sensors calibrate themselves.”
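
One common way to do that kind of self-calibration – not necessarily SensorUp's exact method – is to co-locate a cheap sensor with a trusted reference for a while and fit a correction, as in this sketch with invented readings:

```python
import numpy as np

# Readings from a low-cost sensor while co-located with a trusted
# reference monitor (values invented for illustration).
cheap = np.array([12.1, 18.4, 25.0, 31.2, 40.5])
reference = np.array([10.0, 15.8, 22.1, 28.0, 36.9])

# Fit a linear correction: reference ~= a * cheap + b.
a, b = np.polyfit(cheap, reference, deg=1)

def calibrate(raw: float) -> float:
    """Map a raw low-cost reading onto the reference scale."""
    return a * raw + b

print(calibrate(20.0))  # corrected estimate for a raw reading of 20
```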

The SensorThings API framework is another piece of the data democratization trend that is underway. Beyond being an open system that can provide real-time results and user-friendly open data anyone can access – we’ll get to data ownership issues later in the story – it was also specifically created to work with devices that are low-power, low-bandwidth and rely on intermittent connectivity.

How sensors minimize inefficiencies

What excites Liang the most is the power of sensors and sensor-cloud networks to provide almost immediate access to data. “When you can have factual information in near real-time you can identify inefficiencies and start to eliminate them,” he says.

To illustrate this, Liang likes to share an agricultural example. Consider a square metre of farmland: “If you use sensors to know the exact soil moisture, the wind levels and the soil pH in this block, then you can fertilize for the exact amount needed. We can react in real-time and eliminate waste and fertilizer runoff.”
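
A toy version of that decision logic might read as follows; the thresholds and application rates are invented purely for illustration:

```python
def fertilizer_rate(soil_moisture: float, soil_ph: float) -> float:
    """Return a nitrogen application rate (g per square metre).

    The numbers are hypothetical; a real system would use agronomic
    models calibrated for the crop and soil type.
    """
    rate = 5.0                 # baseline, g/m2
    if soil_ph < 6.0:          # acidic soil takes up nitrogen poorly
        rate *= 0.8
    if soil_moisture > 0.35:   # wet soil: cut back to limit runoff
        rate *= 0.5
    return rate

print(fertilizer_rate(soil_moisture=0.40, soil_ph=5.8))  # 2.0 g/m2
```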

Another example comes from Liang’s SensorUp, which is working on a project with the U.S. Department of Homeland Security called Next Generation First Responder. The goal is to create an application, embedded in a shirt or vest, that integrates all kinds of sensor information to help a firefighter, paramedic or police officer stay safe and work more efficiently. The application could include a wearable camera, a sensor that shows the level of an oxygen tank and how many minutes of air remain, and the responder’s location. This information would need to be comparable and shareable between agencies and connected to an emergency operations centre. That system would also need to be built on a common standard so that emergency response units from different cities and municipalities – even those across an entire country – could easily communicate and share data during disasters, according to Liang.

“The idea behind this project is to save 60 seconds for every 911 call," says Liang. "What kind of impact could we have if firefighters could start fighting a fire 60 seconds earlier?”

How sensor networks make cities smarter

Back on the UCalgary campus, down the hall from the Robotics and Sensor Network Group, Dr. Dean Richert, PhD, a postdoctoral fellow in electrical and computer engineering, is working with Leung on another sensor network that could, one day, take a different approach to saving first responders some precious seconds here in Calgary.

Leung and Richert are developing a network of low-cost palm-sized acoustic sensors that could be installed across a city to monitor noise levels and sounds. They're working on a final prototype before installing five test sensors in Calgary this summer.

It’s all part of a pilot project for the City of Calgary’s Smart City initiative, run through Urban Alliance, a research partnership between the University of Calgary and the City of Calgary. The intent is for the system to monitor noise levels in real-time, particularly during construction projects or big events. Essentially, it would be able to hear anything that might violate the city’s noise code – and also detect the sound of gunshots.

“What makes our design different from most other sensors is that a lot of the data processing happens at the local level on the sensor itself," says Richert. "A microcontroller processes the acoustic signal locally, which allows it to be battery powered and limits the amount of data that needs to be transmitted, stored and processed at a data centre. At the end of the day, we don’t have a gigantic database of data that we need to process in some fancy way. The data is processed locally and then the results are aggregated at a central location. It saves energy and time.”
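
A minimal sketch of that edge-processing idea – compute a sound level on the microcontroller and transmit only threshold-crossing events – could look like this (the threshold and message format are invented):

```python
import math

NOISE_LIMIT_DB = 85.0  # hypothetical bylaw threshold

def sound_level_db(samples: list[float]) -> float:
    """Root-mean-square level of one audio frame, in (relative) decibels."""
    rms = math.sqrt(sum(s * s for s in samples) / len(samples))
    return 20 * math.log10(max(rms, 1e-12))

def process_frame(samples: list[float]) -> dict | None:
    """Runs on the sensor itself: returns an event only when worth sending.

    Most frames produce nothing, so almost no data leaves the device --
    only summaries and threshold-crossing events go out over the radio.
    """
    level = sound_level_db(samples)
    if level > NOISE_LIMIT_DB:
        return {"event": "noise_violation", "level_db": round(level, 1)}
    return None  # quiet frame: transmit nothing
```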

These sensors will communicate with a LoRaWAN (Long Range Wide Area Network) radio gateway network. Installed in September 2017, the network was created specifically to enable Internet of Things innovations in Calgary. Because of their long range (approximately eight kilometres), a few strategically placed gateways are enough to cover a city.

If acoustic sensors were installed across the city, the scalable network could provide real-time noise monitoring and a live noise map of the whole city. The sensors would use passive listening to identify sounds – like a gunshot – and an event notification could be transmitted to a control centre and first responders, perhaps even before a 911 call is placed.

As cities strive to become “smarter,” Leung believes we are going to see more sensor networks everywhere – embedded in our buildings, our public spaces, our streets and transportation systems and more. “I’m optimistic that the more data we have and the more we analyze it, the more it is going to help society,” he says. “The challenge is, how do you extract information and improve it by combining the sensor information? And as the amount of data being collected increases, how do you make sense of that data?

“I think that is why we are starting to see a lot of collaborations to combine data into different applications. In the next 10 or 15 years I can see only more and more applications using artificial intelligence in sensor processing and also incorporating non-sensor information as well.”

How the democratization of data is evolving

Thanks to innovative, cost-saving technological advancements, data generation and collection is no longer the hardest part of working with big data. For years, it has been estimated that 90 per cent of the world’s data was generated in the most recent two years, and that just one per cent of that data is ever analyzed. The amount of data in the world is expected to continue to more than double every two years. The challenge now is how that data is combined, analyzed, used and shared.

Today, the power resides with the people who hold the data – and know how to use it, says Dr. Qiao Sun, PhD, a professor in the Department of Mechanical and Manufacturing Engineering at the Schulich School of Engineering at UCalgary. “Have you heard the saying ‘Data is the new oil’? It means if you own the data, you own the future. So who wants to give up the ownership of the data?”

Sun is like a doctor for machines – she works on machine failure detection. When a piece of machinery is operating, sensors collect all kinds of data, including sound, vibration, temperature and pressure. If you understand the normal condition of a machine, you can use the data to determine if there is a problem – or to try to predict when a problem could arise. She diagnoses if and when action needs to be taken.

But machines are not built to universal standards – so there is no general solution. “It’s similar to how doctors diagnose patients’ problems,” Sun says. “There is some general sense of how the body works, or how machines work. But as a doctor needs to learn the issues with individual patients, we have to learn the issues for individual machines.” To do this, she needs data – lots of data, including historical data that reveals how a machine has performed in the past. Think of it like a patient history: to fix a pipeline or a drilling rig, or to predict when it will next need to be upgraded, you need its operating history.
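
The simplest version of that idea – learn a machine's normal behaviour from its history, then flag departures from it – can be sketched in a few lines (this z-score check is a generic illustration, not Sun's actual method):

```python
import numpy as np

# Vibration readings from the machine running normally -- its
# "patient history" (values invented for illustration).
history = np.array([0.52, 0.49, 0.55, 0.51, 0.48, 0.53, 0.50, 0.54])
mean, std = history.mean(), history.std()

def is_anomalous(reading: float, threshold: float = 3.0) -> bool:
    """Flag readings more than `threshold` standard deviations from normal."""
    return abs(reading - mean) / std > threshold

print(is_anomalous(0.51))  # False: consistent with normal operation
print(is_anomalous(0.95))  # True: far outside the learned baseline
```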

“But companies don’t want to share this sensitive industry data because they don’t want to lose their competitive edge,” Sun says.

How sensor data can be amplified

Sun is working on a framework that fuses whatever real-life machine operating data she can get her hands on with physics-based models – because physics principles don’t require data, and combining the two can help her create a model, or digital twin, that reflects a real machine.

A digital twin is essentially a virtual duplicate of a machine, which can be run and tested just like the real machine, except algorithms and machine learning provide the freedom to push the digital twin past its limits to better understand the real machine’s limitations.
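
As a rough illustration of the hybrid idea – a physics model provides the backbone, and sparse operating data corrects it – consider this sketch (the turbine numbers and data points are invented; this is not Sun's framework):

```python
import numpy as np

def physics_power(wind_speed: float) -> float:
    """Idealized turbine output from physics: P = 0.5 * rho * A * Cp * v^3."""
    rho, area, cp = 1.225, 5027.0, 0.40  # air density, rotor area (m2), efficiency
    return 0.5 * rho * area * cp * wind_speed**3

# Sparse real operating data: (wind speed m/s, measured power W) pairs.
speeds = np.array([5.0, 7.0, 9.0, 11.0])
measured = np.array([140e3, 410e3, 870e3, 1.55e6])

# Learn a simple correction from the gap between physics and reality.
residuals = measured - np.array([physics_power(v) for v in speeds])
correction = np.polyfit(speeds, residuals, deg=1)

def twin_power(wind_speed: float) -> float:
    """Digital-twin estimate: physics prediction plus learned correction."""
    return physics_power(wind_speed) + np.polyval(correction, wind_speed)

print(twin_power(8.0))  # estimate at a wind speed not in the data
```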

Sun is developing a digital twin model for a wind turbine, which has large, expensive components that are not easy to access or fix. She wants to help the renewable-energy-generating machines function more efficiently.

“You need real data to improve your model, so that it reflects reality," she says. "But the model is also a living model and it has to evolve together with the real system. So the challenge is we don’t have a lot of real industrial application data to work with. The work I have done is mostly lab work – data was provided from a test. I’ve worked on projects with power companies and they have given me their operating data after I sign a non-disclosure agreement, but it’s still only their surface data.”

How sensor data can empower change

The good news though, Sun says, is that the world is moving in a direction where there is more incentive for businesses to minimize potential environmental damage, increase productivity and look for cost savings. “If people can get away without serious accidents, they don’t always invest in solutions," she says. "But sensors are so cheap now and everyone is doing it. If you don’t you’re going to get left behind."

Sun wants to see more companies move from reactive, scheduled and proactive maintenance efforts toward predictive maintenance. Think of an oil change in a car. That is scheduled maintenance – owners are expected to change the oil at fixed intervals according to the terms of their warranty. A predictive maintenance approach, by contrast, would be based on how the car is actually performing: the car’s sensor data would be analyzed to decide what parts or fluids need attention, and when. “There could be huge cost savings,” says Sun.
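
A bare-bones sketch of the predictive version of that oil change – extrapolate a sensed degradation trend to find when it crosses a service threshold – might look like this (all numbers invented):

```python
import numpy as np

# Oil-quality index reported by the car's sensors over recent driving
# (100 = fresh oil; numbers invented for illustration).
km = np.array([0, 2000, 4000, 6000, 8000])
quality = np.array([100, 93, 87, 80, 74])

SERVICE_AT = 60.0  # hypothetical quality level that triggers a change

# Fit a linear degradation trend and solve for the crossing point.
slope, intercept = np.polyfit(km, quality, deg=1)
km_due = (SERVICE_AT - intercept) / slope

print(f"Oil change predicted around {km_due:,.0f} km -- "
      "driven by actual wear, not a fixed schedule.")
```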

In his own data fusion work, Leung has encountered two different viewpoints on sharing data. With projects related to government defence work and the fibre optics industry, “the data is very valuable so they don’t want to share too much,” he says. “But we are also working with some companies who have a different logic. They want to make big data usable to everybody, and they are developing databases and data repositories so many people can use their data.”

For Liang and his open-source and open standards efforts, data transparency is key. “To be able to analyze a large amount of data is not new," Liang says. "Governments and large corporations have been able to do that for a long time. They have had more computing power and more access to data. But now ubiquitous cloud computing and the availability of open data has empowered more people to have access. So I would say today the situation is better. I think it’s the government’s responsibility to make a lot more data, where appropriate, more accessible. It’s a trend now.”

Making data and computing power available to everyone reduces the barriers to analysis, lets people make their own data-informed decisions, and gives almost anyone the ability to use significant data to question and challenge those in power. “With more data, we can make people more accountable,” says Liang.

Why data has the potential to change our world

Data access issues, integration obstacles and analysis efforts are a challenge for everyone, whatever their field of research, business or industry. But despite all the hurdles to advanced data analysis, useful real-time data is being shared, efficiencies are appearing, real-world problems are being solved, cost savings are being found, and global standards are beginning to emerge. Interdisciplinary data collaborations are ramping up across industries and across the globe. More data continues to be collected and enhanced in innovative ways.

If you can think of a problem that needs a solution, there is a scientist somewhere developing new ways to harness and analyze data to solve it. Analytics-enabled data will one day be able to tap into predictive capabilities in ways that we can still only imagine.

In the meantime, despite sometimes being frustrated by the current limitations of data access and analysis tools, researchers like Dr. Quazi Hassan, PhD, a professor in the Department of Geomatics Engineering at the Schulich School of Engineering, continue to creatively come up with relatively simple solutions and strategies to make our world a safer and better place to live.

Hassan, along with two other UCalgary engineering researchers, recently studied data from the 2016 fire that devastated the city of Fort McMurray in northern Alberta. The fire forced 88,000 people from their homes, and more than 10 per cent of the city was destroyed. They also looked at data from another serious fire in Slave Lake, Alberta, in 2011, which forced the entire town of 7,000 to evacuate and destroyed about a third of the community.

The researchers used super-high-resolution satellite images and spatial data modeling techniques, combining them with Statistics Canada data as well as data from Canada’s National Fire Information Database, among other sources, to assess how the fires spread – and to generate some straightforward, but very useful, ways to help minimize a community’s future fire risk.

They concluded that if cities are built (or rebuilt) with open spaces such as ring roads, school playgrounds and parking lots located on their outskirts, these spaces could act as buffers to protect buildings from fire.

It sounds so simple – and perhaps one of very few good scientific arguments for bigger parking lots.

“I always try to make a simple solution,” says Hassan, who leads UCalgary’s Earth Observation for Environmental Laboratory in efforts to use spatial data modeling to help monitor, forecast and mitigate all kinds of natural disasters. “Because let’s say you have one variable versus 1,000 variables. Maybe one variable can predict 80 per cent of the cases. Maybe 1,000 can predict 82 per cent. I will never want to go to 1,000 variables because it wastes my time and energy. I want to use data in the simplest ways that I can, especially because working with data can be so complex.”

–  –  –  –  –


ABOUT OUR EXPERTS

Dr. Henry Leung, PhD, is a professor in the Department of Electrical and Computer Engineering at the Schulich School of Engineering at the University of Calgary. His research interests include big data analytics, cognitive robots, information fusion, machine learning, signal/image processing and wireless communications. He leads UCalgary’s Robotics and Sensor Network Group.

Dr. Dean Richert, PhD, is a postdoctoral fellow working with Dr. Henry Leung in the Department of Electrical and Computer Engineering at the Schulich School of Engineering at the University of Calgary. His research interests include distributed data analytics and control systems. He is focused on developing distributed algorithms for sensor networks and cooperative control applications.

Dr. Steve Liang, PhD, is an associate professor in the Department of Geomatics Engineering at the Schulich School of Engineering at the University of Calgary and director of the GeoSensorWeb Laboratory. He is also the founder and CEO of SensorUp Inc., a Calgary-based startup that offers a data exchange platform based on international open standards for the Internet of Things. Dr. Liang wants to disrupt the silos of the Internet of Things by creating, using and implementing open and interoperable standards for device and cloud communications.

Dr. Qiao Sun, PhD, is a professor in the Department of Mechanical and Manufacturing Engineering at the Schulich School of Engineering at the University of Calgary. Her research interests include optimizing mechanical systems to the highest levels of efficiency, reliability, robustness and intelligence. She is focused on the modeling, dynamics, control and fault diagnosis of mechanical systems. She is also the associate dean of diversity and equity, as well as teaching and learning, at the Schulich School of Engineering.

Dr. Quazi Hassan, PhD, is a professor in the Department of Geomatics Engineering at the Schulich School of Engineering at the University of Calgary. His research interests include finding ways to use technology – including integrating remote sensing and GIS techniques – to help mitigate, forecast and monitor natural disasters caused by fire, drought and flooding. He leads the Earth Observation for Environmental Laboratory.

