Turkish Law Blog
Computer Ethics and Privacy
1. Introduction
The phrase “computer ethics” was first highlighted in the 1970s by Walter Maner. In 1977, Weizenbaum, while working on artificial intelligence, recognized that communication between computers and humans was increasing and that people were pleased with this interaction. Several authors, including Milton Wessel, Deborah Johnson, James Moor and Donn Parker, examined and developed the field of computer ethics through the 1980s. In 1985, Moor described computer ethics as “the study of situations where a computer is essentially involved”. In the 1990s the field broadened considerably to cover its relationship with many other areas, among them privacy, which was first framed as the “Big Brother watching” concern. Broadly, “computer ethics” (also called “cyber ethics” or “internet ethics”) addresses the effect of technology on human behavior, and computer ethics is therefore accepted as a form of practical ethics. It also reflects the growing interest of policy makers, academics and professionals in these ethical concerns.
Over the years, information technology has given rise to many ethical problems. As mentioned above, computer ethics, being a practical or applied ethics, poses several ethical concerns. One of these problem areas is privacy. Indeed, it has been stated that “privacy proved to be the dominant ethical concern” in the discourse of computing and ethics. Privacy is a morally valuable concept, nourished by ethical notions of autonomy. Privacy is not only a broad and general concept, however; when it intersects with computer technologies, it divides into several sub-categories.
This paper examines the field of data privacy in light of the relationship between computer ethics and privacy. Within data privacy, big data bears on targeted advertising and the internet of things; biometric technology concerns identity management and other tracking services; and, finally, profiling and surveillance activities relate to artificial intelligence.
2. Data Privacy
The first thing that comes to mind regarding these new developments is the computer’s ability to store, process and retrieve information more easily and faster than ever before. This ability generates data privacy concerns, since all the information that computers process is data. Data privacy is therefore an evolving field that today gives rise to direct privacy-related human rights concerns.
2.1. Big Data
Computer ethics was a specialized area for many years. That changed with new developments in information technology, computational science and computing devices. Computer ethics is now a spreading debate in political and social circles, no longer confined to technology-oriented communities. The first development leading to this ubiquitous debate is big data.
Computers’ ability to collect, process and store individuals’ data is a principal ethical issue in terms of privacy. Ubiquitous computing devices and online services can collect vast amounts of data because of how much data humanity shares: the more data is shared online, the more is amassed by computing devices. This vast amount of data is generally called big data. Big data does not merely relate to volume, however; the concept is well known for its four V’s: variety, velocity, veracity and volume. Big data, with its four V’s, is now a principal resource that many corporations use to survive in various business sectors. Companies across different sectors, whether or not they deliver their services through technology, use big data as a means of keeping pace with, catching or creating new consumer trends.
Big-data-related cyber-ethical issues concern the use cases of the data collected by computers and other technological devices, such as targeted advertising and the internet of things.
2.1.1. Internet of Things
The internet continues to develop globally every day, and the internet of things (IoT) is one of the latest advances. IoT is said to have brought about “Internet 2.0”, since it creates a system in which electronic devices (computers, smartphones, drones, smart home systems, wearable gadgets and so on) can link to one another via the internet. The interoperability of IoT devices challenges ethical norms of privacy.
IoT devices demonstrate how much more intrusive the internet can become in an individual’s private life. The explanation lies in the growing sharing culture and the trend of globalism. Today, individuals share more than ever before; the more they share, the more data is generated. Computers, in turn, become ever more capable of processing data, since they can consume more and more of it every day.
For IoT devices to create linked services, they must share a specific user’s data with one another; for this reason the system is also called identification technology. Linked services create a personalized network that serves each user’s particular needs. This system nonetheless poses a greater risk to the user’s privacy in terms of personal data processing, since IoT devices operate on the personal data shared by the user.
2.1.2. Targeted Advertising
According to Mintel, statistics on online shopping and e-commerce reveal that computers play a substantial role in consumers’ daily lives. The internet has changed how goods and services are distributed. Targeted advertising is one consumer trend that uses big data to personalize advertisements online. It relates directly to the field of consumer ethics, in terms of consumers’ e-commerce activities and all the consumer data collected by providers of goods and services. Consumer-tracking activities began in an uncontrolled way and have since become an ethical concern.
To illustrate, in 2007 Facebook launched Beacon, software that connected external purchases with Facebook profiles. By linking datasets automatically, the software made them usable for targeted, personalized advertising. The software, however, revealed sensitive data about data subjects, such as health conditions and sexual preferences. Even if the software had not revealed such data, the mere collection of personal data (not even sensitive data) without the knowledge and consent of individuals raises ethical concerns about privacy.
Another example concerns location-awareness systems, which link a user’s profile to in-store computer systems and advertising boards. Through this service, Apple’s iBeacon software provided personalized shopping and monitored consumers even inside physical stores. Information that consumers had provided to Apple was thus used for entirely different purposes, such as integrating offline experiences into the online domain. Again, personal data processed by a computer program for personalized advertising led to ethical problems on privacy grounds.
The examples above concern how targeted advertisements are generated and were given to illustrate the harmful consequences for individuals’ privacy. Beyond that, personalized or targeted advertising is problematic in itself for its impact on consumers’ decision-making. Providers of goods and services use the internet to track consumers’ behaviors, preferences and tastes and then to manipulate them into buying what is on offer. Online personalized advertising therefore has a substantial impact on consumer behavior. The seller’s aim is to sell under any circumstances, and the consumer’s own previously supplied data is the means. Using a consumer’s collected data as a stimulus against that consumer raises a further cyber-ethical privacy concern, since it undermines the consumer’s autonomous decision-making.
2.1.3. Data Processing
As explained above, IoT devices and companies that use targeted advertising process data. Today, however, almost every industry and every company processes data, not only IoT devices or big companies. Data processing has accordingly become a central topic of legal and political studies. Following these studies, unions and countries began adopting legislation to control data processing activities and to protect individuals’ privacy. As pioneers, the European Union (EU) adopted the General Data Protection Regulation (GDPR) in 2016, and the State of California adopted the California Consumer Privacy Act (CCPA) in 2018.
Pursuant to the GDPR, personal data processing covers any operation performed on personal data, such as collection, recording, structuring, storage, adaptation, retrieval, use, disclosure, erasure or destruction. Processing need not be automated or carried out by technological devices; given today’s technological developments, however, most processing takes place online. Personal data, in turn, is defined as “any information relating to an identified or identifiable person”; information outside this scope is anonymized data. Anonymization plays a significant role in protecting the data subject’s privacy: where anonymization is impossible, the risk of harm to privacy increases, since most data processing regulations require anonymization in specific circumstances. Moreover, the online collection and storage of personal data demand stronger security precautions, because weak cybersecurity systems can expose personal information and leave the user without privacy.
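To make the distinction concrete, the following Python sketch (with hypothetical field names) masks direct identifiers with a salted hash. It is worth stressing that under the GDPR this amounts to pseudonymization rather than true anonymization, since whoever holds the salt can still re-identify the data subject:

```python
import hashlib
import secrets

def pseudonymize_record(record, salt, direct_identifiers=("name", "email")):
    """Replace direct identifiers with a salted hash.

    Note: this is pseudonymization, not anonymization -- the controller
    holding the salt can still link the token back to the subject.
    """
    out = dict(record)
    for field in direct_identifiers:
        if field in out:
            digest = hashlib.sha256((salt + str(out[field])).encode()).hexdigest()
            out[field] = digest[:16]  # shortened token for readability
    return out

# Hypothetical record, purely for illustration
salt = secrets.token_hex(16)
record = {"name": "Jane Doe", "email": "jane@example.com", "purchases": 3}
masked = pseudonymize_record(record, salt)
```

The non-identifying field (`purchases`) survives intact, which is precisely why regulators worry: combinations of such "harmless" attributes can themselves re-identify a person.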
Another ethical concern regarding personal data processing arises when data is collected without the data subject’s consent. It should be noted that the right to privacy is a human right, protected under Article 8 of the European Convention on Human Rights: “Everyone has the right to respect for his private and family life, his home and his correspondence”. While the protection of privacy as a human right is nothing new, its emergence as a widespread ethical concern is recent. The right to private life is the subject of so much discussion today because of the impact of computer technologies and the internet on private life.
In the ideal scenario, data subjects’ privacy and data processing remain in balance. To achieve this, data processing must be lawful, which is why the GDPR introduced the consent requirement into the EU system. Lawful processing must satisfy Article 7, which regulates the requirements for consent given by data subjects.
Notwithstanding the above, the current data protection legislation adopted in the USA (the CCPA) and the EU (the GDPR) may not fully cover every situation that can be encountered in processing. Moreover, before these pieces of legislation were adopted, privacy problems for data subjects already existed, and the ethical systems and values of specific databases governed the processing of such data, especially sensitive data such as internet-based health data.
2.2. Biometric Technologies
Biometrics is the study of the analysis of biodata, that is, the biological and physical data obtained from a human. Biometric technology can be integrated into many systems, such as wearable electronic devices, smartphones, computers and similar devices.
2.2.1. Identity Management
The use of biometric technology raises ethical privacy concerns where it relates to authentication and verification systems, collectively known as identity management. These systems use identification technologies able to assign, manage or verify identities in order to establish a trust relationship between users and IoT devices. Identity management systems are necessary for protecting device users’ privacy: without any cybersecurity protection, the information stored on a user’s personal electronic devices could easily be stolen over the internet. Users therefore rely on the authentication systems provided by the developers of these devices’ software.
Verification and authentication systems require the user to provide a password or a fingerprint, or to use face recognition. Although the user has a choice, face recognition seems the easiest of these systems, and typing a password every time the hardest. Biometric authentication technology was developed by processing human beings’ physical information, such as fingerprints (smartphones; Identilock, which prevents unauthorized gun firing), face shape, eye shape and other physical characteristics. Biometric technology thus produced the following result: the easier it becomes to access electronic devices, the more sensitive data the user must supply. It should be highlighted that the only way to take precautions against cyber or offline intrusions is to provide personal and even sensitive data to software development companies. As has been observed, “While potentially more secure than traditional username/password combinations, these data sources introduce new risks of invasive inferences due to the inherent sensitivity of health data.” From the perspective of ethical and privacy-related concerns, then, this system does not seem efficient.
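The trade-off can be sketched in a few lines of Python (the feature vectors and threshold below are purely hypothetical). A password check stores only a hash, so nothing sensitive survives at rest; a biometric check must store a template derived from the body itself and match it approximately, by a distance threshold, rather than exactly:

```python
import hashlib
import math

def check_password(stored_hash: str, attempt: str) -> bool:
    # Only a one-way hash is stored; the secret itself is never kept.
    return hashlib.sha256(attempt.encode()).hexdigest() == stored_hash

def check_biometric(template: list, sample: list, threshold: float = 0.25) -> bool:
    # The stored template IS sensitive data about the person's body,
    # and matching is fuzzy: a fresh scan never reproduces it exactly.
    return math.dist(template, sample) <= threshold  # Euclidean distance
```

A stolen password hash can be rotated by changing the password; a stolen biometric template cannot, because the user cannot change their fingerprint. This asymmetry underlies the ethical concern described above.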
2.2.2. Other Biometric Technologies
Biometric technology can also be integrated into an individual’s daily activities and into behavioral modelling systems. It is thus involved not only in authentication transactions but also in resource and service discovery, such as tracking objects and communication activities. The Apple Watch, for instance, can detect electrocardiogram signals and call for help in case of possible interruptions or disorders. This health data is processed for the user’s benefit, so that help can be summoned immediately. At the same time, it is strictly sensitive data: the daily heartbeat readings it collects reveal how that individual’s cardiovascular system works.
Another example again concerns wearable technology devices and smartphones, whose accelerometers track users’ daily walking and running activities. Although this information is very useful for an individual who wants to follow his activity cycle, the processed data is again highly sensitive, being medical data by nature.
Overall, although these systems offer users many opportunities, they deliver their services only by using users’ sensitive biometric and health data. This directly creates an ethical concern, as it puts users’ privacy at risk.
2.3. Profiling and Surveillance in relation to Artificial Intelligence
New technologies have developed far beyond computers and now involve artificial intelligence (AI), robotics and other machine learning systems. Machine learning systems, which imitate the neurological system of human beings, are integrated into many different technologies. They come in several varieties, such as deep learning, reinforcement learning and unsupervised learning, and they are now able to train themselves and improve on their own. Although the source code is known to their creators, engineers and software developers, the way these systems process data resembles a black box. This obscurity leads to ethical privacy concerns when it comes to profiling.
Profiling means categorizing a person’s data according to specific characteristics determined without his knowledge. The first concern is data subjects’ lack of self-control at the start of profiling, a stage at which AI can already be involved: fed with data, an AI can construct profiled or mapped datasets through data mining. The problem arises from the ambiguity of the AI’s processing, such as its method of profiling and whether it produces discriminatory results. Even in the absence of discriminatory results, profiling data subjects through an automated process creates ethical concerns in itself. AI can also be involved at later stages, such as analyzing the profiled data; on the basis of these analyses, particular groups can be formed and services provided to them in a discriminatory way.
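Even a trivial automated categorizer illustrates the concern. The toy Python sketch below (attribute names and thresholds are invented for illustration) sorts users into marketing segments from behavioral data alone; note how it infers a potentially sensitive trait the subject never disclosed:

```python
def profile_user(user: dict) -> str:
    """Assign a marketing segment from behavioral signals alone.

    The data subject neither sees nor consents to these rules,
    which is the core ethical problem with automated profiling.
    """
    if user.get("purchases_per_month", 0) >= 10:
        return "high-value"
    if user.get("health_site_visits", 0) > 0:
        return "health-interested"  # an inferred, potentially sensitive trait
    return "general"

# Hypothetical tracked users
users = [
    {"id": 1, "purchases_per_month": 12, "health_site_visits": 0},
    {"id": 2, "purchases_per_month": 1, "health_site_visits": 5},
]
segments = {u["id"]: profile_user(u) for u in users}
```

Real profiling pipelines replace these hand-written rules with opaque learned models, which only deepens the black-box problem described above.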
The third and most dangerous ethical concern is that profiling can result in surveillance. Considering that even automated profiling by itself risks the value of privacy, conducting surveillance using those profiling measures creates a far greater risk. As mentioned in the introduction, “Big Brother watching” is one of the most troubling ethical concerns regarding privacy; surveillance conducted by these methods actualizes the Big Brother scenario.
3. Conclusion
Computer ethics engages many related fields of study, and privacy is one of them. Privacy-related concerns have the most dominant impact on computer ethics, since the distinction between public and private has almost been lost in the online domain; data privacy is concerned with protecting that balance. Big data has led to several privacy-related ethical concerns, and data processing is a substantial concern that needs to be regulated. Connected to data processing, personalized activities such as targeted advertising and biometric technologies pose risks through activities harmful to privacy. Acknowledging these problems is the first step that must be taken. Second, ethical awareness and regulation could balance the opportunities provided by computer technologies against the privacy of individuals.
References
- K. Bergstedt, T. Tran & K. Waller (2018). Biometrics: Ethical Implications of Future Authentication Systems. Retrieved from https://medium.com/var-city-uw/biometrics-ethical-implications-of-future-authentication-systems-b0ac833b53a7
- A. Chatzidakis & D. Mitussis (2015). Collaborative consumption: determinants of satisfaction and the likelihood of using a sharing economy option again. Journal of Consumer Behaviour, 14, 305–320. https://doi.org/10.1002/cb
- European Commission “Factsheet on the ‘Right to be Forgotten’ ruling (C-131/12)” 4 https://www.inforights.im/media/1186/cl_eu_commission_factsheet_right_to_be-forgotten.pdf
- Google Spain ECJ (C-131/12) (2014) http://curia.europa.eu/juris/liste.jsf?language=en&num=C-131/12
- K. E. Himma & H. T. Tavani (2009). Moral Methodology and Information Technology. In The Handbook of Information and Computer Ethics. https://doi.org/10.1002/9780470281819.ch3
- Intersoft Consulting. (2019). GDPR Consent. Retrieved from https://gdpr-info.eu/issues/consent/
- E. A. Kallman (1985). Computer Ethics: Two Complementary Perspectives. Business Ethics Quarterly, 1(3), 319–331.
- B. D. Mittelstadt & L. Floridi (2016). The Ethics of Big Data: Current and Foreseeable Issues in Biomedical Contexts. Science and Engineering Ethics, 22(2), 303–341. https://doi.org/10.1007/s11948-015-9652-2
- S. Mohanty (2015). The Four Essential V’s for a Big Data Analytics Platform. Dataconomy. Retrieved from http://dataconomy.com/2015/06/the-four-essentials-vs-for-a-big-data-analytics-platform/
- SB-1121 California Consumer Privacy Act of 2018 https://leginfo.legislature.ca.gov/faces/billTextClient.xhtml?bill_id=201720180SB1121
- R. A. Spinello (2013). The Cambridge Handbook of Information and Computer Ethics, ed. Luciano Floridi (Cambridge: Cambridge University Press, 2010), 327 pp., 978-0-521-88898-1. Business Ethics Quarterly, 23(1), 154–161. https://doi.org/10.5840/beq20132319
- B. C. Stahl, J. Timmermans & B. D. Mittelstadt (2016). The Ethics of Computing. ACM Computing Surveys, 48(4), 1–38. https://doi.org/10.1145/2871196
- S. Wachter (2016). Normative challenges of identification in the Internet of Things: Privacy, profiling, discrimination, and the GDPR. Internet of Things: Principles and Paradigms, 44(0), 3–27. https://doi.org/10.1016/B978-0-12-805395-9.00001-0