The importance of ethics in smart technology
Why is ethics important?
Smart technology has entered many different areas. From smartphones, smart health and robots to smart cities and driverless cars, this innovative technology affects many parts of our daily lives. However, the smarter technology gets, the more important moral and ethical codes become. Since some of these smart devices not only collect and process data but are also partially capable of making their own decisions, they have to “learn” how to make the right decisions without causing harm.
A good example of a smart device that can make its own decisions is the smart car. Smart cars are still being developed; however, it is only a matter of time before they are available to many people worldwide. A smart car, a car without an actual driver, has to learn how to react in dangerous and possibly life-threatening situations. For example, if a driverless car suddenly notices a dog in front of it, how is it supposed to react? Hit the dog, or swerve into the parallel lane and hit a car carrying a family of four? Should it run over one person, child or adult, to save several people in the other lanes? Should it crash into a tree to save others while possibly killing its own passenger? These are some of the situations in which moral and ethical codes become relevant.
What moral and ethical codes are necessary?
For people, the most important moral values are enshrined in law. Obvious examples are the laws that forbid people to kill, rape, steal or harm the environment. In the case of driverless cars, the cars are supposed to obey speed limits, keep a safe distance and avoid harming other people on the road. Nevertheless, as the previous examples show, smart devices need many other moral and ethical values to make sure that they do not harm their users and others. So how do we make sure that smart devices behave and react appropriately?
In their paper “AI assisted ethics”, Amitai Etzioni and Oren Etzioni mention two different approaches to determining which values are relevant: the communitarian and the libertarian approach. The communitarian approach includes not only moral codes manifested in laws but also moral values the community expects parents to pass down to their children. The problem with this approach is deciding which values to include. Although most people agree on general moral values, their opinions may differ on the details. For example, most people will agree that it is important to protect the environment, yet they might disagree on which measures are necessary to do so. It can therefore be difficult to decide which moral and ethical values to “teach” smart devices.
The second approach Etzioni and Etzioni mention is the libertarian approach. The idea behind it is that every person should define for themselves what is good and which values matter to them; the state is supposed to stay neutral. Applied to smart devices, users would be able to choose which options and moral values they want their devices to have. The issue with this approach is the sheer number of possible options, which makes it seem very impractical.
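To see why the libertarian approach quickly becomes impractical, consider a minimal sketch in which a user configures a driverless car's ethical preferences. The preference dimensions and their names below are purely illustrative assumptions, not taken from the paper; the point is only that the number of distinct “ethical profiles” grows multiplicatively with each new setting.

```python
# Hypothetical ethical preference dimensions a user might configure
# on a driverless car (all names here are illustrative assumptions).
preference_options = {
    "collision_priority": ["protect_passenger", "protect_pedestrians", "minimize_total_harm"],
    "animal_avoidance": ["always_swerve", "never_swerve", "swerve_if_safe"],
    "speed_policy": ["strict_limit", "match_traffic"],
    "data_sharing": ["none", "anonymized", "full"],
}

def count_profiles(options):
    """Count the distinct combinations of user choices."""
    total = 1
    for choices in options.values():
        total *= len(choices)
    return total

print(count_profiles(preference_options))  # 3 * 3 * 2 * 3 = 54
```

Even these four toy settings already yield 54 distinct profiles; a realistic device with dozens of morally loaded settings would face a combinatorial explosion that ordinary users could hardly be expected to review.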
Five levels of ethical programming
Whereas these approaches aim at deciding which moral values to give smart devices, other work shows how, once chosen, these values are “taught”. Gartner analyst Frank Buytendijk distinguishes five levels of programming. Level 0, dubbed non-ethical programming, is not yet concerned with ethical values. Level one, ethical oversight, also does not require specific ethical programming; however, at this stage manufacturers have to start thinking about how people might use the smart device and what ethical consequences this could have. Level two, ethical programming, addresses the challenges that arise when providing smart devices: the responsibility for ethical behaviour is shared among users, service providers and manufacturers, with users responsible only for the content they produce, not for the possible outcome. Level three, evolutionary ethical programming, would embed ethical programming in a smart device that learns and evolves; although the user is still in control, the device would gain a certain degree of autonomy. Level four, machine-developed ethics, deals with the hypothesis that connected machines develop self-awareness and therefore need to be raised and taught like children. At this point this is, of course, purely hypothetical.
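The five levels above can be summarised as a simple ordered enumeration. The member names below are my own shorthand for Buytendijk's labels, not an official taxonomy:

```python
from enum import IntEnum

# The five levels of ethical programming described by Gartner's
# Frank Buytendijk, encoded as an ordered enumeration
# (member names are shorthand, chosen here for illustration).
class EthicalProgramming(IntEnum):
    NON_ETHICAL = 0          # no ethical values considered yet
    ETHICAL_OVERSIGHT = 1    # manufacturers anticipate use and consequences
    ETHICAL_PROGRAMMING = 2  # responsibility shared by users, providers, manufacturers
    EVOLUTIONARY = 3         # the device learns and evolves, gaining some autonomy
    MACHINE_DEVELOPED = 4    # hypothetical: self-aware machines develop their own ethics

for level in EthicalProgramming:
    print(level.value, level.name)
```

Using an ordered type captures the idea that each level builds on the previous one: a device at level three still presupposes the oversight and shared responsibility of levels one and two.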
Medicine, smart technology, and ethics
Medicine is one of the areas where ethics plays a relevant part. For example, smart pills, pills with very small chips inside them, may help doctors treat patients more effectively, but at the same time they carry risks for the patient, including data misuse. When a patient takes a smart pill, the chip inside it enables both patient and doctor to monitor regular intake and make sure that the medication is used as prescribed.
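The monitoring idea can be sketched as a small adherence check: compare chip-reported ingestion times against the prescribed schedule and flag doses for which no report arrived. This is a hypothetical illustration, not a real smart-pill API; the two-hour tolerance window and all names are assumptions.

```python
from datetime import datetime, timedelta

# Hypothetical sketch: flag scheduled doses with no chip-reported
# intake within a tolerance window (window and names are assumptions,
# not part of any real smart-pill system).
def missed_doses(scheduled, recorded, tolerance=timedelta(hours=2)):
    """Return scheduled times that have no recorded intake nearby."""
    missed = []
    for due in scheduled:
        if not any(abs(taken - due) <= tolerance for taken in recorded):
            missed.append(due)
    return missed

schedule = [datetime(2023, 5, 1, 8), datetime(2023, 5, 1, 20)]
reports = [datetime(2023, 5, 1, 8, 30)]  # evening dose never reported
print(missed_doses(schedule, reports))  # [datetime(2023, 5, 1, 20, 0)]
```

Even this toy version makes the ethical stakes concrete: the same record that helps a doctor confirm adherence could, if leaked or misused, reveal exactly when a patient did or did not take their medication.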
However, not every patient might be thrilled about the prospect of having a very small machine travel through their body. Moreover, in terms of data security, patients may have doubts about how their data is used and whether information about their health might be misused. Patients may also fear being manipulated by the chip they swallow every day, or even being compelled to take medication against their will. It is therefore necessary to ensure that data collected by smart devices is protected and that the developers of these devices consider every relevant aspect of ethics, so that smart devices serve human beings rather than harm them.