The Stanford Responsible Digital Leadership Project helps industry foster an ethical attitude to data privacy and artificial intelligence. Søren JØRGENSEN, the Project Director, explains how to find a balance between privacy and data use.
Technology and data have transformed the lives of many, but their misuse can be detrimental to personal privacy and safety. For this reason, Søren Jørgensen, a research fellow at Stanford University in the US, joined with Dr Elise St John of the Digital Transformation Hub at California Polytechnic State University and Radhika Shah, an angel and impact investor from Silicon Valley and also a fellow at Stanford University, to launch the Responsible Digital Leadership project at the end of 2019.
The aim of the project is to provide guidelines for the responsible use of new technologies such as Artificial Intelligence (AI) and data, and to provide a learning platform for global businesses to exchange ideas and establish better practices. Working with a group of banks and insurance companies, the project hopes to find ways for companies to implement ethical principles and guidelines when using data. The project also considers data ethics in the context of responsible digital leadership, especially its impact on society and human rights, and assesses how the use of data complies with the United Nations Sustainable Development Goals (SDGs).
“Globally, it is a critical time, as we are starting to realize the challenges that the use of technology brings,” says Jørgensen. “We realize that responsible digital leadership is about culture and learning, and about promoting a mindset of responsible behavior. We must develop the ability to face these challenges when they come up. We need guidelines, but guidelines alone won’t achieve that; it comes down to how we behave.”
In what could be classified as the largest global project of its kind, Jørgensen says that the Responsible Digital Leadership Project is the culmination of the ideas of more than 70 PhD, MBA and master’s level students from some of the world’s top universities. HKUST is included in that list.
This global effort “will define the risks of technology use, with a mission to future-proof the financial sector against getting the ethics of data and AI wrong.” Jørgensen says that the project has so far uncovered around 60 concrete dilemmas and challenges when it comes to the ethical use of data and technology.
Such challenges include looking at how companies can use technology and data to build trust, while also protecting the interests, privacy and the rights of the individual. Companies must strive to be flexible enough to comply with local regulations, laws and cultural expectations. They must also drive innovation while benefitting society as well as themselves.
Tough Challenges
Jørgensen says these are difficult challenges, and there are no straightforward solutions. But the project is certainly taking a step in the right direction by creating useful frameworks. First and foremost, the project has put privacy at the top of the agenda, and made it the central pillar of its activities.
The question is, Jørgensen says, how can we enable those in the financial sector to benefit from the power of data without risking the privacy and fundamental rights of customers? “At the core, it’s about how we relate to data and manage it. That concerns privacy, and how the data is being used, and manipulated. That is an important issue,” he says. Another issue is bias and discrimination in the way the data is used.
A central concern is the contradiction between an individual’s feelings about privacy issues and the way they take care of their personal data. For example, Jørgensen notes that people are hesitant to leave social media platforms such as Facebook, even though they are aware of the problems such platforms may cause them due to the access the companies have to their data. “People have just accepted the trade-off, as it’s convenient,” Jørgensen says. We all look for convenient platforms and solutions, and the most convenient solutions are those where your data is being collected, even though giving access to that data may be a risk, Jørgensen notes.
Awareness is Growing
The good news is that there is now a growing awareness of security and privacy concerns. Jørgensen notes that people are becoming more mindful and quicker to react to instances of abuse. A Pew Research Center survey conducted in April 2021 shows that 56 per cent of Americans think that major technology companies should be regulated more than they are now, and 68 per cent believe that some of the tech firms have too much power.
Humankind has taken an enormous digital leap during the COVID-19 pandemic. The world got to see the enormous advantages and possibilities in using digital platforms, but the pandemic also highlighted tech’s limitations and shortcomings, such as a lack of human connection.
As the world journeys further into the age of digital transformation, education must play a key role in meeting the demand for new skills and competences, and providing relevant research, Jørgensen says. He thinks that education can help companies to set the right boundaries, and allow them to find a balance between privacy and data use.
Responsible digital leadership is about culture and learning
Søren Jørgensen
Director of the Responsible Digital Leadership Project
Blockchain and Cryptocurrency - Food for Thought
The Responsible Digital Leadership project hosted an inspiration session in mid-2021, with a focus on blockchain and cryptocurrency regulation. The project’s researchers presented a number of ethical dilemmas posed by the adoption of Central Bank Digital Currencies (CBDCs). Questions and ethical challenges that were raised included:
- Since the “right to privacy” and the “right to be forgotten” do not apply to the traceable crumbs left behind by digital payments, what recourse do customers have to protect themselves from the state and the disproportionate power of corporations?
- Will adoption of central bank digital currencies decline because people are afraid to use them?
- Is the future of transparency about transforming all information into “open data” so that there isn’t an imbalance of information?
- Will we see a more or less equitable world with the increased adoption of cryptocurrencies?