Aug. 1, 2018

Can machines make us more human? Digital technology and the future of work

UCalgary researchers look at the ways technology is transforming the workplace and reinforcing the fundamental things that make us human.

Anyone who watches Westworld or Star Trek knows that humans are obsessed with the commingling of flesh and machine. It’s an obsession that raises fundamental questions about our relationship with technology: Are we using technology or is it using us? Will artificial intelligence become more human than humans? Which leads to the most uncomfortable of questions: What separates humans from machines?

We watch with awe and trepidation as our evolutionary process immerses us in the digital world, searching for the ultimate fit between body and bot. Meanwhile, technology creeps steadily into our homes and offices.

Workspaces are evolving rapidly as technology changes our workflow processes, communications practices, and final outputs. Some business functions, such as demand forecasting, are both saved and crippled by technology. Meanwhile, some of the most human aspects of labour, such as human resources, are increasingly reliant on digital tools. And some of the most technology-driven workplaces – think app design – are returning to their human roots.

What’s behind this turnabout? Is technology charting the evolutionary roadmap, or is it a mirror revealing who’s the most human of us all?

By examining technological shifts in hiring practices and immersive digital tools, we can delve into the contradictory nuances of the search for the perfect human-tech fit.

Shifting the human in Human Resources

Derek Chapman, an associate professor of industrial and organizational psychology at the University of Calgary, works on digitizing recruitment and selection processes. He forecasts a shift in the human factor of Human Resources management. Whether HR is evolving away from or toward technology is a question with no definitive answer.

“Just in the past 15 years there’s been a revolution in how organizations recruit and select employees,” says Chapman. This revolution brings benefits to both job-hunters and employers. Technology makes the process more targeted and more active than traditional methods like a generic “Careers” ad in a newspaper.

“Now you can have people fill out surveys about what they're looking for in jobs,” says Chapman. “So potential employees are more likely to find work that aligns with their specific values and goals.” This is much less labour-intensive, according to Chapman. “You can have thousands of applicants for positions, and effectively have them screened for pennies instead of having someone read through resumes. You now have automated procedures and algorithms that identify the best candidates.”
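Chapman doesn’t detail the internals of these screening algorithms, but a minimal sketch conveys the idea: score each application against weighted criteria, then rank the pool so recruiters read only a short list. The criteria, weights and applicant fields below are hypothetical, invented purely for illustration.

```python
# Hypothetical sketch of rule-based applicant screening, in the spirit of the
# automated procedures Chapman describes. The criteria, weights, and applicant
# fields are illustrative, not drawn from any real hiring system.
from dataclasses import dataclass

@dataclass
class Applicant:
    name: str
    years_experience: float
    skills: set[str]
    survey_fit: float  # 0-1 values-fit score from the intake survey

REQUIRED_SKILLS = {"sql", "forecasting"}

def score(applicant: Applicant) -> float:
    """Weighted score: skill match, capped experience, survey-based fit."""
    skill_match = len(applicant.skills & REQUIRED_SKILLS) / len(REQUIRED_SKILLS)
    experience = min(applicant.years_experience / 10, 1.0)  # cap at 10 years
    return 0.5 * skill_match + 0.2 * experience + 0.3 * applicant.survey_fit

applicants = [
    Applicant("A", 4, {"sql", "python"}, 0.9),
    Applicant("B", 12, {"sql", "forecasting"}, 0.4),
]
# Rank the whole pool automatically, so recruiters read only the short list.
shortlist = sorted(applicants, key=score, reverse=True)[:10]
for a in shortlist:
    print(a.name, round(score(a), 2))
```

The pennies-per-applicant economics Chapman mentions come from exactly this kind of automation: the loop costs roughly the same whether the pool holds ten applicants or ten thousand.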

Employers have a larger pool of candidates from which to pull the best employee, and they are better able to foster a more diverse workforce. Plus, the technology frees them up to focus on the most desirable applicants.

But finding the best candidate is a process plagued by the frailties of human nature. In online applications or interviews, people can exaggerate their strengths or downplay their weaknesses. The new hire who seemed outstanding at the fit interview may let the team down three months into the job. It’s not easy to tell from a selection process – no matter how thorough it is – whether that applicant is the sort of person who will excel at their job.

“Organizations often look at personality when they hire,” says Chapman. “A candidate may have great cognitive ability, which suggests they can do the job well. But personality helps you predict how frequently that person will do the job well. These are factors that are harder to evaluate.”

One selection tool Chapman is exploring is game play. “It’s a way to measure personality indirectly,” he says. “You can set up environments where the applicant is a bit off-guard. They get through a difficult puzzle, and you assess their strategies. Did they cooperate with others or compete for the goal? Were they persistent? Did they use a sophisticated approach? Were they patient or impatient?” When applicants immerse themselves in a multimedia environment, they are less likely to simply give the answer they think the employer wants to hear. The hope is that the digital game reveals the true self.
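To make the idea concrete, here is a toy sketch of how in-game telemetry might be turned into trait signals like the ones Chapman lists. The event names and trait mappings are invented; a real assessment would need to be psychometrically validated.

```python
# Hypothetical sketch: converting game telemetry into personality signals,
# in the spirit of Chapman's indirect measurement. Event names and trait
# mappings are invented for the example.
from collections import Counter

telemetry = ["retry", "retry", "ask_teammate", "retry", "rush_move", "retry"]

counts = Counter(telemetry)
signals = {
    "persistence": counts["retry"] / len(telemetry),         # kept trying
    "cooperation": counts["ask_teammate"] / len(telemetry),  # worked with others
    "impulsivity": counts["rush_move"] / len(telemetry),     # acted without planning
}
print(signals)  # e.g. {'persistence': 0.67, 'cooperation': 0.17, ...}
```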

The game element, Chapman suggests, could also take the bias out of cognitive testing. A game can test cognitive ability as well as personality traits, and can do it with less bias than standard cognitive tests. “Some demographic groups tend to score poorly on those cognitive tests,” says Chapman. “It’s due to a whole host of factors, such as motivation or what we call ‘stereotype threat,’ where people feel they're being stereotyped.” These biases could pertain to gender, race, sexual orientation or disability, for example. So far, Chapman’s research is showing that games can level the playing field for cognitive testing by removing some of the bias from traditional measurement systems.

A digital hiring process can also increase the fairness factor for applicants. Imagine going through a hiring process where you’re required to role-play a key workplace situation, like an argument with a customer. Chapman suggests that technology can take the inconsistencies out of this all-too-human drama. “If you use an online video for role-playing, everyone gets the same customer complaining to them, with the same inflection in their voice. It doesn't depend on the mood of the other person doing the role play, or whether they're paying attention or not. It’s a standardized approach, and it’s much fairer.”

But technology has its downsides in the selection process: it can give employers too much access to the personal side of an applicant. “The problem arises when technology gets ahead of people, in terms of something like privacy,” says Chapman. “There are all sorts of legal problems associated with selection when you use a site like Facebook. People are still working out whether it’s okay for an employer to evaluate you based on your social media presence.”

Indeed, the job-search industry has noted this trend. A 2013 Workopolis.com article titled “The top three things employers want to see in your social media profiles” advises readers that “what you post and how you behave on these sites can create a first impression of the sort of person you might be.” The article warns that potential employers see certain types of posts as red flags, including those about illicit drug use, overt sexuality, profanity and, last but not least, imperfect grammar.

This urge to hire and retain employees with a desirable value system is nothing new. In the early 1900s, Henry Ford gave his employees a bonus based in part on whether they were “doing things the ‘American way’” (Forbes.com). Ford’s Sociological Department would visit employees’ homes to ensure they were not partaking in activities such as gambling or drinking. If Ford were alive now, he might well be poring over Facebook posts to weed out employees who seem less than American.

But of course we are much more aware of privacy concerns in the 21st century. Despite our willingness to share personal information online, we are unsure of how much of our lives we want potential employers to evaluate. The implication here is that the digital factor in HR processes must retain its human touch. Chapman echoes this sentiment, warning against the temptation to hand over online recruitment and selection responsibilities to IT specialists. “It’s a bit like handing over the keys for decision-making to people who are really good at building the machine, but not necessarily the decision-making process itself.” 

Plunging into data

Frank Maurer also thinks about how technology is changing the way we work. He’s a professor in the University of Calgary’s Department of Computer Science, and currently serves as associate dean, innovation and strategic partnerships, in the Faculty of Science. He envisions a world where we are surrounded by information, thanks to immersive technologies like augmented or virtual reality. In this world, data is everywhere, informing the way we live and work.  Above a dial in a power plant hover words and numbers – the purpose and metrics of that dial. In a greenhouse, data about a cucumber glows amongst leafy vines.  

It’s all right there at our fingertips. Or rather, in our line of vision if we’re using something like Microsoft's HoloLens.

Maurer’s work focuses on engineering applications where digital surfaces meet immersive analytics and machine learning. Picture the 3D Ops Center in the movie Avatar, with its topographical holo-maps floating in space. These immersive environments could have profound effects on our workspaces and decision-making processes. According to Maurer, “they will radically change how we perceive and understand information.”

Maurer’s research team has already created a prototype 3D virtual hologram that could change how the city of Calgary is perceived. The team combined augmented reality and custom-built software to create an immersive map of the downtown core, designed to help emergency responders coordinate evacuations. “You feel as if you’re inside the skyscrapers,” says Maurer. “And the real benefit is that you can move around the image and see it from different perspectives.”

This combination of movement, visualization and understanding was once crucial to survival. “Our brain is built for detecting patterns in three-dimensional environments,” Maurer explains. “In the early days of human history, if you were able to detect a tiger before it jumped at you, you could probably survive. So we are hard-wired for understanding spatial data.” Since computer screens are two-dimensional, Maurer feels compelled to bring a third dimension of data into the workplace.

One of the spaces he’s interested in re-shaping is the control center environment, which could derive particular benefits from immersive data. “It would be used by a team like the Calgary Emergency Management Agency,” says Maurer. “Say we have another flood, but this time there are digital tools like big screens, a great network connection, digital control panels and 3D interaction with data. By introducing an immersive workspace, there is more potential for collaboration and improved decision-making processes. You get a richer perspective on the information at hand – and share it with remote team members who can’t get to the control center during a civic emergency.”

Embracing technology in the workplace will lead to more than efficiencies and improved effectiveness. “Even now, more and more jobs have a digital component,” says Maurer. “Software is changing everything, removing the driver from a car or truck, for example.” And the computer itself is evolving, developing humanesque skills.

“Computers used to be terrible at perceiving things,” says Maurer. “Anything that had to do with seeing or hearing was fundamentally hard. With people it’s intuitive; we don’t even think about something like recognizing faces. But computers have to think about how to see.” This, however, is an area where machine is merging with its human colleague. “Through machine learning, computers are gaining the ability to see, hear, understand – to put things in context,” says Maurer. And machine learning will have an impact on jobs that rely on these abilities – jobs such as radiology. “This is not a career most people think of as being threatened by technology,” says Maurer. “But wait and see.”

When you ask Maurer which jobs are not likely to be threatened by a rapidly evolving digital world, he points to those that involve abstract thought and emotion. The human touch, in these cases, would trump the digital click. “Tasks that deal with caring, or politics, or social interactions,” says Maurer. “These are functions where we want to have humans in the loop.” 

Embracing the human algorithm

Greg Hart is likely to agree with Maurer on the importance of emotion. Founder of an innovative learning centre called Inception University, Hart brings his expertise in entrepreneurialism, ergonomics, critical thinking and software user experience into the classroom. His pedagogical strategy centres on design thinking – an imaginative, solutions-based approach to innovation.

Design thinking involves empathizing with the people who will use a product or service, to better anticipate their problems or define new opportunities. Design thinkers don’t just imagine new applications and products; they imagine what other people are thinking and feeling. So while Frank Maurer suggests that emotion-laden tasks like caring for others are best left to humans, Greg Hart reminds us that creating technology is a human process, right down to its core.

“Technology, after all, is the interface between us and the rest of the world,” says Hart. “I can’t think of a more profound challenge than creating that interface. So it’s not about knowing how to write code for an application. What’s most vital is understanding the problem that the application is trying to solve. The fundamental human tendencies behind that interaction. That’s the gold.”

Part of acknowledging the human underpinnings of technology is recognizing the uncertainty of a digital world. Courses at Inception University train learners to focus on “building our competence to meet the demands of an uncertain future.” Instead of being overwhelmed by the pace of change proposed by technology, Hart suggests embracing it. “We have rapidly scaling technologies and cities – this is our new reality,” says Hart. “The challenge is to develop an expertise at things you didn’t even know about the day before.”

It helps to focus on process rather than traditional hierarchies, says Hart. Resisting that human temptation to build layers of authority is crucial to building processes based on purpose – processes that help organizations stay flexible as they work toward their goal. “Which reminds me of evolution,” says Hart. “That search for the perfect fit. Biological evolution is just one form of this very basic algorithm. It runs experiments. It keeps the things that fit best and loses the things that don't."

And the fit must enable humans to find the ideal role for technology in their processes, to define – or blur – the boundary between machine and human. “Today it’s the search for the organizations and individuals that are responsive,” says Hart. “The ones that can adapt to uncertainty. We’re inhabiting an evolutionary search algorithm.”
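Hart’s “very basic algorithm” can be written down in a few lines. The sketch below is a toy evolutionary search with an invented fitness function: it runs experiments, keeps what fits best, and loses what doesn’t.

```python
# A minimal evolutionary-search sketch of the "basic algorithm" Hart
# describes. The fitness function and mutation scheme are toy stand-ins.
import random

def fitness(x: float) -> float:
    return -(x - 3.0) ** 2  # best "fit" at x = 3; anything else scores lower

population = [random.uniform(-10, 10) for _ in range(20)]
for generation in range(50):
    # Keep the best-fitting half of the experiments...
    survivors = sorted(population, key=fitness, reverse=True)[:10]
    # ...and run new experiments as small variations on the survivors.
    population = survivors + [s + random.gauss(0, 0.5) for s in survivors]

print(round(max(population, key=fitness), 2))  # converges near 3.0
```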

Herding humans, Facebook-style

As Hart points out, technological change is contributing to the increasingly uncertain climate of the 21st century. An organization’s ability to adapt and embrace this tenuous state may have a huge impact on its success. This principle is especially true for companies trying to forecast product demand in a market that zigs and zags along social media buzz-lines. It’s difficult to keep up with consumer desires that move at the speed of Facebook – especially since it normally takes years of planning to bring a new product to market. Suppliers need to be signed on; facilities might need building. Getting product numbers wrong can be disastrous.

Mozart Menezes, formerly a professor in UCalgary's Haskayne School of Business, has been tackling the problem of product demand forecasting along with current Haskayne professor Giovani da Silveira and Renato Guimaraes of the ICN Business School in Nancy, France. They recently published their findings in the International Journal of Production Economics.

Menezes, who specializes in supply chain complexity, notes that forecasting challenges stem in part from a depressed global marketplace. “There are new pressures for a company to grow, so the company develops new products, new markets, new channels,” says Menezes. "It’s hard to convert demand into profit, especially when there are so many new products and fragmentation.” But there’s more than economics to blame.

A new digital demon is wreaking havoc with the forecasting process, according to Menezes. “Social media is making forecasting so much harder, even though computer power has increased substantially,” he says. “Social networks create ‘herd behaviour,’ which means your product could become a huge hit or a major flop, because customers are so influenced by their peers.”

The digital demon has its angelic side, though. Businesses certainly use data collected through social media platforms to decide how much product to produce. But social media remains a sizeable problem for predictability. “These extreme outcomes are the new normal,” says Giovani da Silveira. “Businesses can only dream about that ‘average’ range of the past.”

Technology platforms, then, herd humans into groups that dash one way or the other, but not down the middle of the field. To mitigate the effects of this overly human – but technologically supported – erratic behaviour, Menezes and da Silveira have come up with an alternative way to forecast product demand.

“We propose a system for projecting the increase in uncertainty due to social network influences,” says Menezes. “Instead of using the usual bell curve method to predict demand, we use a beta-binomial distribution model, which we’ve tested using computer simulations. Our model allows businesses to forecast demand with higher variability. So we’re helping them adjust for this new expected uncertainty in a scientific manner.”

The team’s forecasting model considers three factors that inform customers’ purchasing decisions: individuals’ personal preferences, the influence of their close social connections, and the product’s market share – or its “coolness.”
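The paper’s exact parameterization isn’t reproduced here, but a short simulation shows why a beta-binomial captures herd behaviour where the bell curve fails. The numbers below are invented: personal preference sets the average purchase probability, while a single shared “buzz” draw stands in for social influence, sweeping every customer in the same direction.

```python
# Illustrative simulation: why herd behaviour breaks bell-curve forecasts.
# All parameters are invented; the published model's calibration differs.
import random

N = 1_000        # potential customers for a product launch
P_MEAN = 0.05    # baseline purchase probability (personal preference)

def independent_demand() -> int:
    """Buyers decide independently: demand clusters near N * P_MEAN."""
    return sum(random.random() < P_MEAN for _ in range(N))

def herded_demand(herding: float) -> int:
    """Beta-binomial: one shared 'buzz' draw sways all buyers at once.
    Smaller `herding` means stronger peer influence, wilder outcomes."""
    alpha = P_MEAN * herding
    beta = (1 - P_MEAN) * herding
    buzz = random.betavariate(alpha, beta)  # market-wide sentiment
    return sum(random.random() < buzz for _ in range(N))

launches = 200
indep = [independent_demand() for _ in range(launches)]
herd = [herded_demand(herding=2.0) for _ in range(launches)]
print("independent buyers:", min(indep), "to", max(indep))  # narrow band
print("herded buyers:     ", min(herd), "to", max(herd))    # hits and flops
```

With strong herding, the simulated demand swings between near-zero flops and runaway hits – the “extreme outcomes” da Silveira describes – while independent buyers cluster tightly around the old average.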

Menezes and da Silveira's theory is just theory for now; they haven’t yet accessed the right data set to test it properly. But they feel it has legs, and they know that businesses are thirsty for ways to mitigate the unpredictability of product demand. Social media, after all, is where those human tendencies aggregate, echo and ripple, carried by digital waves through the webmosphere.

Moving toward a fit

If, as Greg Hart suggests, the human evolutionary process is an algorithmic search for fitness, what does that fit look like, where technology is concerned? Technologies are created by humans, so they are – in theory – fundamentally human. But as machine learning advances, so does the potential for artificial intelligence to overtake its creators.

Perhaps it’s only by embracing the rapidity of technological change that humans can make intelligent decisions about how best to employ digital tools, and when to blur the lines between flesh and machine. Whether we are stepping inside immersive technologies, harnessing the eccentricities of social media, or testing our personalities against digital simulations, we are evolving toward a new fit with technology. Which is nothing new. As any Westworld or Star Trek fan will tell you, human nature has never been much good at standing still. And, as Frank Maurer points out, sometimes you have to keep moving to see the whole picture.



–  –  –  –  –


ABOUT OUR EXPERTS

Derek Chapman is an associate professor in the Department of Psychology in UCalgary's Faculty of Arts. His research interests lie in industrial-organizational psychology, recruiting and selection.

Frank Maurer is a computer science professor, associate dean of innovation and strategic partnerships in UCalgary's Faculty of Science, and head of the Agile Surface Engineering (ASE) group. ASE conducts industry-oriented research on immersive analytics and application engineering for digital surfaces.

Giovani da Silveira is a professor in UCalgary's Haskayne School of Business. His research interests lie in operations strategy, supply chain management, and mass customization.

