
Does Grandma need a social support robot?

By: Maryse Guénette
Photo credit: Alex Knight (Unsplash)

The robots are coming – and the manufacturers are taking square aim at seniors. Is this a good thing? A study reveals how the target audience feels about all this.

As the COVID-19 crisis sheds new light on how isolated many seniors feel, technology companies are working on devices and applications specifically designed to break their isolation. “There’s a certain momentum right now,” says Dr. Andrea Slane, professor of legal studies at Ontario Tech University and coauthor of a study on the subject. The proportion of seniors in the population continues to rise, and the pandemic has helped them learn to use new technologies, since they want to stay in their own homes (“aging in place”) as long as possible. Those who will become seniors over the next 10 years are already relatively adept at using technology.

The study, Involving Seniors in Developing Privacy Best Practices: Towards Responsible Development of Social Technologies for Seniors, was conducted at Ontario Tech University (University of Ontario Institute of Technology).

“We wanted to collect seniors’ views on devices and applications they use now and those that may be created in the future. We also wanted to know what they’re doing to protect their privacy, since they’ll proceed in the same way with new devices.”

Dr. Andrea Slane, professor of legal studies at Ontario Tech University and coauthor of the study  

Revealing discussions

As part of the study, focus group participants exchanged views on factors that help them use new technologies. They also watched promotional videos for a personal domestic robot and shared their reactions. Those who attended workshops were asked to create a fictional character they could refer to later for answers to their questions.

What emerged was a range of diverse experiences. Using technology to make appointments and set up reminders was fine for those who were already doing it, but seemed too complicated for those who had never tried it. Many participants were accustomed to using technology to keep in touch with family and friends. Many also noted they were afraid that using technology was reducing their cognitive abilities.

Participants were extremely wary of using technology to make purchases or perform financial transactions online. Half the participants were not performing any transactions online, and many of those who were preferred to use their computer rather than a smartphone or tablet. Quite a few participants preferred to go into their financial institution and talk to a teller. This behaviour was not always driven by a greater need for security: several participants expressed concern about the impact new technologies were having on job prospects for staff of financial institutions.

After watching promotional videos for personal domestic robots, many of the seniors said that they did not want this type of device in their homes, except for calling for help in the event of a fall. Some believed, however, that these devices could be useful for others who needed more help (due to cognitive disorders or mobility problems, for example).

Participants deplored the fact that new technologies were expensive, hard to configure and required constant updates; they also mentioned wanting easy-to-understand instructions. They added that they would be more open to using support robots if these problems were resolved.

Many participants expressed concern about the dependency new technologies could bring, as well as the potential impact on their ability to remember things and make their own decisions. Nor did they believe a robot could express real empathy or show genuine signs of friendship.


What about privacy?

The researchers found a range of viewpoints on this topic. Some participants feared losing control of their financial information, or even falling prey to hacking or fraud. Others were concerned about the confidentiality of personal data in health-related uses, such as reminders to take their medication. Finally, though few participants had had any contact with a digital assistant, many feared that this type of device would eavesdrop on their conversations and share any information disclosed that way.

Workshop participants indicated that the fictional character they had created for their own reference should be able to choose how their information would be handled. They noted the importance of transparency and meaningful consent.

“Developers who want to create devices and robots for seniors need to take into account what seniors agree to reveal and what they want to keep confidential.”

Dr. Andrea Slane

While some participants were highly sensitive about the confidentiality of their information, others were simply cautious about trusting developers. Participants did see major advantages in using devices for social support, and were relatively comfortable with having information collected and shared in such situations.

It’s worth noting that participants preferred the collection and storage of information to be handled by a personal robot rather than a digital assistant. Similarly, they seemed to have more confidence in businesses that might be created in the future (which they seemed to idealize) than in existing businesses (of which they had a more realistic conception).


Only under certain conditions

Will devices and robots that truly meet seniors’ needs and provide sufficient protection of their personal information be successfully launched on the market? Dr. Slane believes the answer is yes. “I’m fascinated by certain devices, but I’m not burying my head in the sand,” she says. “I want to see this industry develop in a responsible manner.”

In her view, we still need to remain cautious. “Companies are investing a lot of money in making high-performance robots because the market is there,” she notes. “But also because they believe in them. The developers who are working on this are idealists. They’re asking themselves what they need to do to make sure users’ personal information is protected. But they’re telling themselves that this is a problem others can solve. That needs to change.” It also depends on the type of business involved.

“If a product is designed by a medical organization, that instills a certain confidence. But if the manufacturer is a social media company, we might well ask questions. Will the company be able to offer such a service without being contaminated by the way it usually operates?”

Dr. Andrea Slane

This is all the more important given that the data to be protected can be particularly sensitive. “We’ll need to discuss how a robot can notify users if something fraudulent occurs,” says Dr. Slane.

She also stresses the need to take the target audience, or rather, target audiences, into account. “We need to determine what level of protection seniors need,” says Dr. Slane. “That won’t be easy because many are not comfortable with technology. Not all seniors are vulnerable, but some are. And we need to act accordingly.”

The study

The study, Involving Seniors in Developing Privacy Best Practices: Towards Responsible Development of Social Technologies for Seniors (Ontario Tech University, 2020), was conducted by Drs. Andrea Slane, Isabel Pedersen and Patrick C.K. Hung. It was designed to collect seniors’ views on the technological devices and applications currently on the market, which could become even more popular over the years, and on their attitudes to and concerns about privacy. “All the participants were recruited at seniors’ centres,” says Dr. Andrea Slane, professor of legal studies at Ontario Tech University and coauthor of the study. “So they were socially active people. While many were connected (to the Internet), others only had a landline.”

The team of researchers held six focus groups across Canada and two exploratory workshops in Ontario. The focus group moderator started by gauging participants’ attitudes to technology used for social support. Next, after presenting a video promoting a personal domestic robot, he noted their reactions to the social support functions that could be performed by a robot. He also looked at the strategies participants were using to protect their personal data, and their views on data protection. In the workshops, the researchers focused on confidentiality problems and solutions relating to existing and potential technologies designed to provide social support.