Data ethics and what we should demand when entrusting personal information to a device 


Institutional Communication Service

30 September 2022

Conscious digitisation also calls for greater awareness of how data are processed, explains Marta Fadda, postdoctoral researcher and lecturer in the Biomedical Ethics course at the Faculty of Biomedical Sciences of Università della Svizzera italiana. She starts from an experience we all share: the passive acceptance of user licences.

 

"I accept the privacy policy"

It is a familiar routine. Pick up your smartphone. Open the latest app downloaded to shop or to monitor your health. Scroll for a few seconds through the page on the privacy policy. Skim the main headings - "data protection," "collection of personal data," "data processing," "transmission of personal data to third parties." Sense their ethical and legal weight. And then, partly for lack of time and partly because the language is hard to understand, click "I accept the privacy policy."

Through what lens can we understand the implications of this experience?

 

Data ethics

When we talk about data ethics, we refer to the responsible and sustainable collection, processing, storage and, where applicable, sharing of data, with respect for individuals and society. Data ethics, however, goes a step further than simply complying with data protection laws. At a minimum, all data processing must meet the requirements of the EU's General Data Protection Regulation (GDPR) and of the data protection regulations of the countries where data are collected, processed, stored, or shared.

 

The promises we should demand

What should we demand from those who obtain, process, and store our data? Research shows that most users do not understand how their data are processed and feel they have little control over what and how much information they share on websites and smartphone apps. In recent years, several companies have begun experimenting with a people-centric approach to developing systems that ensure security, transparency and control. Such an approach can be broken down into the following promises:

 

The human being at the centre

Users must clearly benefit from sharing their data. People's interests must prevail over institutional and commercial interests. Users must be put in a position to understand, in a straightforward way, how their data are being used and for what purpose, taking into account both the context they are in and what they are doing.

 

Individual control of data

An individual's self-determination must take priority in all data processes. The user must have primary control over how, where, when and which personal information is used and for what purposes – including the ability to browse incognito without sharing any data. Users must be able to personalise their browsing experience, with repeated opportunities to control their data. In addition, the user must be able to decide how long the company may keep their data.

 

Transparency

Data processing activities and automated decisions must make sense to the user and be fully transparent and explained. In addition, the purposes and interests behind data processing must be disclosed, together with its legal, ethical and social risks and consequences.

 

Accountability

Companies must comply with standards and codes, avoid adverse consequences and risks to the user, and mitigate the social and ethical implications of all aspects of data processing. This means extending their responsibility to cover data processing by the third parties and partners with whom data are shared.

 

Equality

Equality-based data processing is premised on an awareness of the social power relations on which data systems rest, and which they can create or reproduce. In data processing, special attention should be paid to vulnerable people, who are particularly exposed to negative influences on their self-determination and control, or to discrimination or stigmatisation, for example because of their economic, social, or health conditions. Paying attention to vulnerable people also means actively working to reduce bias in algorithm development.
