In March 2018, the world was rocked by the Facebook scandal involving Cambridge Analytica and allegations that the personal data of tens of millions of users had been harvested without their knowledge. If ethics in information technology wasn't a hot topic before, it became an explosive one after. In business, there has always been a gap between what is legal and what is ethical, and nowhere is that gap starker than in the world of privacy and data. Beyond data collected by businesses, casual internet users are frequently warned by industry watchdogs that in an information-based economy they are the product, and their data is business gold.
It's entirely possible for things to be legal yet completely unethical, and that's the conundrum the IT industry has found itself in as privacy and data commerce have come to the fore in the wake of Facebook's scandal.
If you struggle to see the difference between legality and ethics, consider a hypothetical scenario: a man dying on the side of the road calls for help and clearly poses no threat to your safety. Do you stop to help, or drive on by? You're under no legal obligation to stop; it's perfectly legal to mosey on. But nearly every religious and moral authority would say it's your ethical responsibility to stop.
There is no legally wrong answer, but being the drive-on-by type versus the one who stays to help marks a wide gulf in personal ethics. And that's the issue facing the IT industry today, because it turns out those 56-page terms-of-use agreements usually contain privacy stipulations making it legal to sell and use personal data as brands see fit. But is it ethical?
Social networking user terms may be a huge ethical quandary for the industry, but there are many other ethical questions for IT professionals to wrestle with today. Some of those include:
- Security: From e-commerce sites to banking and government databases, the public trusts that their information is secure once they've set up a password-protected account. When data breaches occur, they can cause a domino effect of security issues, especially for people who use a site like Facebook or Google as their master-key log-in under the presumption it's more secure. Even when breached, users often don't find out until long after the fact, as with Equifax, where attackers went undetected for 76 days in a breach affecting 147 million Americans. How soon should the public be notified, and what recourse should they have?
- Proprietary Software: Software made for a company or organization's private purposes goes through no external oversight. A house under construction must be approved against the building code by a municipal inspector; software faces no such check, even though it can conceivably affect far more people. In proprietary work, if only the client and the developers know about a possible ethical conflict in the software, those developers are left to choose between quitting their jobs and doing what they're asked.
- Deep Learning & Artificial Intelligence: AI algorithms now underpin much of daily technology, from the assistant on your phone to your smart TV and your car's cruise control, right through to safety mechanisms on aircraft. What happens when the AI makes a choice that involves an ethical conundrum? AI is designed to make the call, but what if it's a questionable call its designers never considered?
- Parental Ignorance: Today's parents record much of their children's lives, uploading everything from deeply personal moments that amuse them to posts rallying support for a child's medical issues to hashtags built around a child's full name. These records can follow the children for the rest of their lives. At what point do parental rights trump the child's, and whose interests should the IT world be protecting?
These dilemmas barely scratch the surface of questions worth posing about information technology and moral philosophy.
The ethics of biometrics are more complicated still. When people buy a fitness tracker, they're thinking, "I can track my steps and pay attention to my health." Many don't realize this data can be used in ways they've never considered. If insurance companies finance the apps and claim the data as their intellectual property, what will that mean for the user's future health claims?
But it goes beyond personal use, too, especially with GPS-enabled devices. Strava, a tracker popular with cyclists, came under fire when its published heatmap inadvertently revealed the locations of remote U.S. military installations around the world. The company responded by pointing users to settings for turning off sensitive location sharing, but critics asked why the onus is on users to opt out rather than on the company to make location privacy the default.
The trouble with data is that it lives on fragile systems that can easily be damaged or stolen. In March 2019, MySpace, the "original Facebook," reported that a botched server migration had destroyed all data uploaded prior to 2016: more than a decade of material, some 50 million songs by independent artists, countless photos and much more, gone forever from what was often its only home on the internet.
Unfortunately, there's a delicate dance between those who use information technologies and those who profit from them. Should companies like MySpace warn their users before porting data to new servers, so users can take their own precautions? Right now that isn't done, partly because companies are confident in their processes, but also because the stock market tends to punish those who advise caution or warn of what could go wrong.
As time goes on, more and more platforms and companies will see their technologies face obsolescence, forcing them to migrate stored data. In a world where nothing lasts forever, what is the ethical responsibility for safeguarding this information, and what is the timeline: a decade, a century? These questions don't have answers yet because neither the industry nor the public takes the long view; hence parents posting so much about their children with little regard for how it will affect those kids' futures.
The trouble with deciding what is ethical sometimes comes down to a philosophical conflict. Is the guideline "the greater good," accepting that a small number of people will suffer repercussions so that thousands or millions ultimately benefit? The greatest good for the greatest number is the hallmark of the "Utilitarian" school of ethics.
In contrast stands one of the giants of philosophy, Immanuel Kant, whose "Categorical Imperative" holds that one should act only on principles one could will to become universal law, binding on oneself as much as on everyone else. That would mean never lying or manipulating and never prioritizing one person above another, and of course it would mean Utilitarianism goes right out the window, because no one's rights may exceed anyone else's.
But ethics are complicated, and more than 3,000 years of writings debate whose rights supersede whose. Does the government have more rights than its citizens? For corporations, do laws even apply if no one's enforcing them?
Automotive: The next great technology will be the self-driving car. Programmers designing autonomous vehicles have been wrestling with the ethical dilemma of whose life matters more, the pedestrian's or the passenger's. When a car faces two options with potentially fatal outcomes on both sides, how should the AI be designed, and whose life should the programming protect? When such decisions are made by a computer running a mathematical equation in the blink of an eye, there is no easy answer.
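To see how mechanically such a "mathematical equation" resolves a moral question, here is a deliberately oversimplified toy sketch, not any manufacturer's actual logic: the maneuver names and harm weights are invented for illustration, and the rule shown is a purely utilitarian one (minimize expected harm), which is exactly the kind of design choice the dilemma is about.

```python
# Toy illustration only: the options and their "harm" weights are invented,
# and real autonomous-vehicle planners are far more complex than this.

def choose_maneuver(options):
    """Pick the option with the lowest expected harm (a utilitarian rule)."""
    return min(options, key=lambda o: o["probability"] * o["harm"])

options = [
    # probability of a bad outcome x severity of that outcome
    {"name": "swerve", "probability": 0.9, "harm": 1},   # risks the passenger
    {"name": "brake",  "probability": 0.3, "harm": 10},  # risks the pedestrian
]

print(choose_maneuver(options)["name"])  # prints "swerve"
```

Notice that the "ethics" live entirely in the numbers a designer assigns: change the harm weights and the same one-line rule sacrifices the other party, which is why who sets those weights is the real question.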
Medicine: The American medical system accounts for nearly a fifth of GDP, so medicine isn't just big business; it drives the American economy. In a digital world, far more information can be compiled about people, how they eat, how they exercise, where they spend their time, and all of it can be parsed to make judgment calls about whether they're a good bet for insurance or employment. Where will the lines be drawn on data collection and sharing, and should IT personnel follow their moral imperative or simply do as much as they legally can?
Borders: When data crosses borders, how should it be governed? Europe decided that existing data laws too heavily favored industry, whose largest players were American companies, and with a borderless cloud-based world becoming a reality, it created the General Data Protection Regulation (GDPR) to protect the processing and export of Europeans' private data. This may foreshadow how data everywhere is handled and sold.
Public Life: In Toronto, Canada, an ambitious dream of creating a "smart city" is coming to life through "Sidewalk Labs," a sister company of Google under the Alphabet umbrella, in a project where everything would be monitored and stored, from bench use and air quality to foot traffic and who's present. Numerous ethical questions have been raised: how the data would be distributed and to whom, whether facial-recognition software would be used, how much person-specific data would be collected, and more. It may be the most ambitious urban-tracking program of its kind, but tools like these are in use the world over, all part of a brave new world whose ethical conundrums aren't close to being resolved.
In the movie "The Matrix," screens fill with endless data scrolling past, and in a way, that's life today. Every time you sign up for a store or airline member card, your data gets a little easier to track and sell. Every time you use GPS, strap on a biometric product like a Fitbit, or bark orders at Amazon's Alexa or Apple's Siri, data is being collected, some of it anonymously but much of it tied to your name.
The problem right now is that we assume this data is stored and traded ethically, and the reality is that it often isn't. From Cambridge Analytica to black-hat hacker breaches, there are no promises your data will be protected or secure. The greater problem is that no single body has oversight of who holds which data and to what ends. Then again, would you even want one organization to have that much access or power?
Data collection, information technology and the internet remain practically as unregulated as the Wild West, and there are no easy answers about ethics in IT, nor about who should arbitrate whether those ethics are applied. Ultimately, the problem is that technology moves too fast and government moves too slow. Laws and lawmakers can't possibly keep up with the pace of innovation, and society is left hoping that those who write the programs err on the side of what's ethical, not merely what slow-moving governance has failed to regulate.