How can we rebuild trust in the digital world? Part 1


“If you could speak to the ‘digital twin’ of a deceased family member, would you want to do so?”

With provocative questions like this, Professor Michael Sandel led a frank and wide-ranging discussion that struck at the heart of many of the problems we face in today’s digital world.

Speaking at the Fujitsu Forum Tokyo 2019 Frontline session earlier this year, Professor Sandel was joined by Yoshikuni Takashige, Fujitsu’s Chief Strategist for Global Marketing, and eight other panelists in a lively discussion of the ethical challenges surrounding personal data and AI, as well as the responsibility of corporations in society.

The eight panelists included:

  • Yuko Yasuda, Public Affairs Specialist at the United Nations Development Programme (UNDP) Representative Office in Tokyo
  • Edmund Cheong, Deputy Director of the Rehabilitation Center of the Malaysian Social Security Organization
  • Hazumu Yamazaki, Co-founder and CSO of the speech and emotion analysis startup Empath
  • Yasuhiro Sasaki, Director and Business Designer at the design innovation company Takram
  • Yumiko Kajiwara, Fujitsu Corporate Executive Officer, in charge of industry-academia-government collaboration and diversity
  • Ian Bradbury, CTO of Financial Services, Fujitsu UK & Ireland
  • Sebastian Mathews, Process Manager at the Global Delivery Center, Fujitsu Poland
  • Mika Takahashi of the Global Marketing Unit, Fujitsu

Audience members attending the discussion at the Tokyo International Forum were also encouraged to participate, and were provided with red and green cards that they could hold up to vote yes or no on various questions that Professor Sandel asked.

The following is a lightly edited transcript of the conversation.

Yoshikuni Takashige

Hello everyone. Today we would like to discuss how we should rebuild ‘trust’ in the digital world with Professor Michael Sandel, who is well-known for his lectures at Harvard University.


Digital technologies such as the internet and smartphones have made our everyday lives more convenient, and have enabled us to generate new businesses like e-commerce and social networks.

But on the other hand, don’t you feel something is wrong? The leakage of Facebook users’ personal data last year is one example that comes readily to mind for many of us.

In Fujitsu’s global survey, more than 70% of respondents were concerned about the trustworthiness of online data and the risk of their privacy being compromised. People are losing confidence in digital technologies and trust in the companies that use them.

What do you think about this, Professor Sandel?

Michael Sandel

I think it’s one of the biggest ethical questions we face today. If we don’t think through the ethical implications of new digital technologies, then companies will have problems.

There will be a loss of trust, as you said, and citizens will view technology as a threat rather than as an opportunity. So it’s important to have public debates, even about some of the hard ethical questions posed by digital technologies, and that is what we’re going to do today.

How much are we willing to compromise our privacy?

Michael Sandel

Let’s begin with one of the big ethical questions we face today — the use of personal data. Now companies often offer services or benefits in exchange for our personal data.


The question is: Are you willing to give your personal data in exchange for better service? Let’s begin with the question of health insurance. Now, you’re probably all familiar with wearable devices that measure how long you sleep at night, what you eat, whether you often eat broccoli, which is healthy for you, or how much alcohol you consume.

Now suppose that this wearable device could measure all of these things, and would send that data to your health insurance company. In exchange for sharing that personal data, if your behavior is healthy, they will give you a big discount on your insurance policy.

How many would wear the device to send the data to the insurance company? Those who say yes, raise the green card; those who say no, raise the red card, and we will see what people think.

About 70 to 80% of the audience seems willing to share their data. And on our panel, six are willing to share and two are not. Yuko, you indicated you would not take this deal. Tell us why.

Yuko Yasuda

There are two reasons. Although I like broccoli and I do yoga, I’m not comfortable sharing my personal data with companies. I feel exposed, as if I’m being watched wherever I go. The fear I feel outweighs the benefit of any discount.

Also, if only healthy people are eligible for the discount, the end result might be, “You cannot get insurance because you have a genetic abnormality.” That kind of society seems quite frightening.


Michael Sandel

So let’s now hear from most of the panelists who are willing to send in the health data. Yasu, what do you think, and in response to Yuko’s point, would you also send in your genetic data?

Yasuhiro Sasaki

I feel that anything is okay if it actually makes the insurance cheaper. I don’t think data about my diet or sleeping habits will reveal anything significant, so I’m willing to provide such data if I can get something in return.

And in the case of genetic information, if many people share their genetic information, it may contribute to the discovery of new therapies and advances in medicine.

Michael Sandel

All right, so you would share everything, because you don’t see that data as an essential aspect of your identity. In that case, what type of personal data would you not want to share?

Yasuhiro Sasaki

I wouldn’t want to share private conversations that I have with my family, friends, and other people who are important to me. I wouldn’t want to involve these people.

Michael Sandel

Yuko will never send that information, even if it is favorable, and Yasu says it’s okay because that kind of information is not personal. Ian, what do you say? You were willing to send it in.

Ian Bradbury

Yes, but I would expect the insurance company not to share my data, and to use it only for its intended purpose.

Michael Sandel

Let me ask you then, do you trust Facebook to keep your data private?

Ian Bradbury

I only put information that I’m willing to share on Facebook. I think it has value as a platform, but I don’t put all my information on it. That’s the difference between the insurance company and Facebook.

Michael Sandel

Is there anyone on the panel who doesn’t use Facebook? Yumiko... why don’t you use it?

Yumiko Kajiwara

I don’t like to post personal information somewhere that can be viewed by anyone.

Michael Sandel

Now I want to ask about another use of personal data. How do you feel about providing personal data to an automobile insurance company? What if the company offers you a discount if you put a device on your car that tracks how you drive, whether you exceed the speed limit, whether you brake suddenly, whether you make sharp turns, and what time of night you drive? Let’s get the audience and panelists to vote on this as well.

Now we’re seeing some disagreement. About 90% of the audience is willing to share their driving data, whereas only four out of our eight panelists are willing to do so.

Edmund, you were willing to share your health data, but you’re not willing to share your driving data. Why is that? Is driving more personal and intimate to you than your health?

Edmund Cheong

It is not that driving is more intimate, but I don’t think my driving data would result in a discount. It might even cause my insurance rates to go up instead.

Michael Sandel

Because you are a bad driver?

Edmund Cheong

No, I’m a reasonably competent driver. But I often have to drive at night, and I often drive long distances.


Michael Sandel

I see. So the data might not work in your favor. Who else would not send the driving data, but did send the health data? Mika?

Mika Takahashi

My reason is similar to Edmund’s. I’m actually a terrible driver, so I don’t think I would get a discount. Also, I’m concerned about the use of location data. I voted “No” because it scares me to think that I could be tracked everywhere I go.

Michael Sandel

So Mika, you consider location data to be personal, just as Yuko considers health data to be personal. And Yasu said he doesn’t care if companies access his health data.

But what about location data? Yasu, how do you feel about companies having your location data and knowing where you go? That seems quite personal.

Yasuhiro Sasaki

It’s personal, but I don’t see it as a problem because I trust the car insurance company not to leak that information. If I thought the company wasn’t trustworthy, I wouldn’t use its services.

Michael Sandel

Now, this issue about location data has come up in the debate about Uber, because they need to access users’ location data. For example, they did a study and found that a certain area was visited by many men late at night, and the suspicion arose that prostitution was going on in that area.

Eric Schmidt, the former CEO of Google, said that if you don’t want people to know about something you’ve done, maybe you shouldn’t do it in the first place. This is a very sensitive issue. It is a question of ethics and privacy. What do you think about this, Yoshi?

Yoshikuni Takashige

It is certainly sensitive. It’s rare in Japan, but if a terrorist were to get the location data of a specific person, it would make it easier to kidnap or murder that person. Sharing can certainly be dangerous if you don’t know how a company will manage and handle your data.

Edmund Cheong

Speaking from my own perspective, I’m here in Japan now, but my wife and two children are in Malaysia. So I would worry about my family if people can easily find out when I am away.

Michael Sandel

Because your family is more vulnerable...

Edmund Cheong

Exactly! That’s my point.

Michael Sandel

So let me summarize what we’ve been saying. It’s clear there are certain types of personal data that we don’t want other people to know. One principle that has begun to emerge from the panel discussion is that we are bothered if the data is somehow deeply connected to who we are, to our personal identity. But it is also clear that people have different views about what sort of data that is.

This post continues in Part 2 and Part 3.

Inspired to find out more about Fujitsu's vision to build a Trusted Future? Visit the Fujitsu Technology & Service Microsite to learn more.

And we'll be continuing the discussion at Fujitsu Forum Europe in Munich, November 6–7, 2019. Click below for all the details.

