Autonomy & Freedom
Digital self-determination
https://en.m.wikipedia.org/wiki/Digital_self-determination
Digital self-determination is a multidisciplinary concept derived from the legal concept of self-determination and applied to the digital sphere, to address the unique challenges to individual and collective agency and autonomy arising with increasing digitalization of many aspects of society and daily life.
Data sovereignty
https://en.m.wikipedia.org/wiki/Data_sovereignty
Data sovereignty is the concept that data is subject to the laws and regulations of the country or region where it was generated or is stored. It essentially means that a country has the authority to govern the data within its borders, including how it's stored, processed, and protected.
The use of cloud services can make it difficult to determine where data is physically located, and therefore which jurisdiction's laws apply.
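Where a specific cloud provider is in use, the declared region of a storage resource is at least a starting point for this question. Below is a minimal sketch in Python, assuming an AWS S3 bucket and the boto3 SDK; the bucket name is hypothetical, and the declared region says nothing about where replicas, backups, or cached copies end up.

# Minimal sketch (Python + boto3): query the declared region of an S3 bucket.
# "example-bucket" is a hypothetical name used only for illustration.
import boto3

s3 = boto3.client("s3")
response = s3.get_bucket_location(Bucket="example-bucket")

# AWS reports the legacy us-east-1 region as a null LocationConstraint.
region = response.get("LocationConstraint") or "us-east-1"
print(f"Declared bucket region: {region}")

Even with this information in hand, determining which law actually applies to the data remains a legal question, not just a technical one.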
Freedom & Privacy
Intellectual freedom & censorship
Censorship online can be carried out (to varying degrees) by actors including totalitarian governments, network administrators, and service providers. These efforts to control communication and restrict access to information will always be incompatible with the human right to Freedom of Expression.
Censorship on corporate platforms is increasingly common as platforms like Twitter and Facebook give in to public demand, market pressures, and pressures from government agencies. Government pressure can take the form of covert requests to businesses, such as the White House requesting the takedown of a provocative YouTube video, or of overt mandates, such as the Chinese government requiring companies to adhere to a strict regime of censorship.
People concerned with the threat of censorship can use technologies like Tor to circumvent it, and support censorship-resistant communication platforms like Matrix, which doesn't have a centralized account authority that can close accounts arbitrarily.
You must always consider the risks of trying to bypass censorship, the potential consequences, and how sophisticated your adversary may be. You should be cautious with your software selection, and have a backup plan in case you are caught.
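As a concrete illustration of the circumvention tools mentioned above, the sketch below routes a single web request through a locally running Tor client via its default SOCKS5 proxy on 127.0.0.1:9050. This is a minimal example, assuming Python with the requests and PySocks packages installed and a Tor daemon already running; it is not a substitute for Tor Browser, which also mitigates browser fingerprinting.

# Minimal sketch: sending an HTTP request through Tor's local SOCKS5 proxy.
# Assumes a Tor client is running locally and the "requests" and "PySocks"
# packages are installed (pip install requests[socks]).
import requests

TOR_PROXY = "socks5h://127.0.0.1:9050"  # socks5h resolves DNS through Tor too

session = requests.Session()
session.proxies = {"http": TOR_PROXY, "https": TOR_PROXY}

# The Tor Project's check service reports whether the request arrived via Tor.
response = session.get("https://check.torproject.org/api/ip")
print(response.json())  # e.g. {"IsTor": true, "IP": "..."}

The socks5h scheme matters here: with plain socks5, DNS lookups would be performed locally and could reveal which sites you are visiting even though the traffic itself goes through the proxy.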
Personal privacy
A common counter-argument to pro-privacy movements is the notion that one doesn't need privacy if they have "nothing to hide." This is a dangerous misconception, because it creates a sense that people who demand privacy must be deviant, criminal, or wrong.
You shouldn't confuse privacy with secrecy. We all know what happens in the bathroom, but we still close the door. That's because we want privacy, not secrecy. There are always certain facts about us (personal health information or sexual behavior, say) that we wouldn't want the whole world to know, and that's okay. The need for privacy is legitimate, and it's part of what makes us human. Privacy is about asserting your rights over your own information, not about hiding secrets.
Control over your privacy inside most apps is an illusion. It's a shiny dashboard with all sorts of choices you can make about your data, but rarely the choices you're looking for, like "only use my data to help me." This kind of control is meant to make you feel guilty about your choices: you "had the choice" to make the apps you use more private, and you chose not to.
Privacy is something we need to have baked into the software and services we use by default; you can't bend most apps into being private on your own.
Corporate surveillance
For many people, tracking and surveillance by private corporations is a growing concern. Pervasive ad networks, such as those operated by Google and Facebook, span the internet far beyond just the sites they control, tracking your actions along the way.
Additionally, even companies outside the AdTech or tracking industry can share your information with data brokers (such as Cambridge Analytica, Experian, or Datalogix) or other parties. You can't automatically assume your data is safe just because the service you're using doesn't fall within the typical AdTech or tracking business model.
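To make the tracking mechanism concrete, here is a simplified, hypothetical sketch of the "tracking pixel" technique: many unrelated sites embed a tiny resource served from the same third-party domain, so that one domain can set a single long-lived cookie in your browser and log every page (via the Referer header) on which it appears. The example assumes Python with Flask, and every name in it is illustrative rather than taken from any real ad network.

# Simplified, hypothetical tracking endpoint (Python + Flask) illustrating
# how a third party embedded on many sites can correlate your visits.
import uuid
from flask import Flask, request, make_response

app = Flask(__name__)

@app.route("/pixel.gif")
def pixel():
    # Reuse the browser's existing ID cookie, or mint one on first contact.
    visitor_id = request.cookies.get("visitor_id") or str(uuid.uuid4())
    page = request.headers.get("Referer", "unknown")

    # In a real ad network, this log line would feed a cross-site profile.
    print(f"visitor {visitor_id} viewed {page}")

    # Answer with an empty response (some beacons do exactly this) and
    # refresh the long-lived identifier; SameSite=None enables cross-site use.
    response = make_response("", 204)
    response.set_cookie("visitor_id", visitor_id,
                        max_age=60 * 60 * 24 * 365,
                        samesite="None", secure=True)
    return response

Every page that embeds this one resource adds another entry to the same profile, which is how a single network can follow you far beyond the sites it controls.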
Technological Stewardship and Responsible Innovation
The call for responsible innovation is a call to address and account for technology’s short- and long-term impacts within social, political, environmental, and cultural domains.
Technological stewardship stands as the corollary of this mindset: a commitment to anticipate and mitigate technology’s potential for disruption, and especially for harm, and to guide innovation toward beneficial ends.
Adopted from the domain of engineering ethics, where it “involves taking a value sensitive approach to embedding ethics, sustainability, and EDI (equity, diversity, and inclusivity) principles into the practice and culture of engineering” [3, p. 34], technological stewardship belongs to any who would take up the mantle.
To be a technological steward means to be committed to an ethos of care: to understanding the systematic impacts of innovation and its philosophical dimensions, and to cultivating the capacity to apply this knowledge to drive development into new realms of humane engineering and design. To be a steward is to see care as the foundation of one’s professional identity and not as an add-on or bonus. Technological stewardship is therefore about an optimistic turn toward the future of more responsible innovation—about recognizing the possibility for technology to transform culture and civilization and to steer away from the maelstrom of unfettered innovation.
Dialogue and collaboration across diverse perspectives are essential for developing actionable technological solutions that attend in responsible ways to the evolving needs of society.