Are robots subject to gender discrimination?
Among animals, the roles of the sexes are fluid and natural, with no apparent discrimination (except, in some cases, to the detriment of the male).
Among human beings it is a different matter. Women are often still discriminated against in role, social position, salary and rights, in many parts of the world, Western or not. The topic is widely debated, but in a technological era like the current one, gender discrimination appears under many other guises. It is known, for example, that women are far less represented in technological sectors than men, despite their generally superior academic results. Women are often discouraged from undertaking studies and professions in technology, which are still considered masculine. Today many campaigns aim to redress this imbalance, encouraging girls not to identify only as “princesses” and “damsels to be saved”, but to pursue their scientific, intellectual and professional aspirations without fear.
In the real world, therefore, we are all fighting these imbalances and trying to make our voices heard, and we are also interested in the impact that new technologies, such as Artificial Intelligence, will have on the gender issue. Regarding AI and the machine learning methods used to support decision making (autonomously learning Artificial Neural Networks, and robots), for example, we are raising the question of bias due to the data used to train them. In machine learning, models learn from data, and those data often derive from males. So if, for example, we imagine a scenario in which a machine must decide whether to hire a man or a woman, having learned performance indicators from data concerning only male employees, it will naturally tend to choose the male candidate, thus reflecting the social conditioning we are subjected to (this example is not my own, and I am pleased to credit its source, Milena Harito, Digital Transformation Expert). There are several examples of technological discrimination. Think of many chatbots (virtual assistants) and home assistants (Amazon’s Alexa, Microsoft’s Cortana, Apple’s Siri): they speak with a female voice and accept rather rude commands to silence them. To tell the truth, Google Assistant and Google Home are exceptions, because they have a male voice, but even this can be interpreted as a sign that technology is a masculine and not a feminine matter… Perhaps users could simply be given the option to choose the characterization, and assistants with female voices could be prevented from accepting rude commands (like “shut up”).
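The hiring scenario above can be made concrete with a deliberately minimal sketch. The data, the model, and the numbers here are all invented for illustration: a naive frequency-based "hiring" model is trained on historical records in which every past hire happens to be male, so it has never observed a woman being hired and assigns her a score of zero.

```python
# Illustrative sketch (hypothetical data): a naive model that estimates
# P(hired | gender) from raw counts of historical hiring records.
from collections import Counter

# Hypothetical historical records: every past employee is male.
training_data = [
    {"gender": "M", "hired": True},
    {"gender": "M", "hired": True},
    {"gender": "M", "hired": False},
    {"gender": "M", "hired": True},
]

def train(records):
    """Estimate P(hired | gender) from raw counts in the records."""
    hires = Counter()
    totals = Counter()
    for r in records:
        totals[r["gender"]] += 1
        if r["hired"]:
            hires[r["gender"]] += 1
    return {g: hires[g] / totals[g] for g in totals}

def score(model, candidate):
    # A group never seen in training gets probability 0.0: the model
    # has literally no evidence that such a candidate can succeed.
    return model.get(candidate["gender"], 0.0)

model = train(training_data)
print(score(model, {"gender": "M"}))  # 0.75
print(score(model, {"gender": "F"}))  # 0.0 — the bias in the data becomes the model's bias
```

The model is not malicious; it simply reproduces the imbalance of its training set, which is exactly the mechanism the paragraph above describes.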
The difference is even more evident if we consider the world of robots. Most companion or service robots (companionship, nursing, medicine) are designed with features and manners that sometimes recall children and, in most cases, resemble women. The situation is different for robots that perform industrial automation or military functions: these have male or animal features. What is all this due to? Of course, one reason may be that they were designed by male humans, affected by the bias induced by our social system, which considers women more fragile and more suited to certain professions. On the other hand, there is the question of empathy: Artificial Intelligence (and, as a consequence, robotics) is frightening, and if the manufacturers of these machines (an expanding market) succeed in making people feel safer and in eliciting empathy towards these tools, they will certainly sell more products for home consumption. Giving a robot the appearance of a woman is therefore reassuring and more convenient from a marketing point of view. But is this right and ethical?
Again, the issue is complex. In recent times I have been invited to many ethics and law conferences to talk about Artificial Intelligence and, in particular, robots. At home I have a small robot, built by me for research purposes, whose name is 42, and I refer to it with the pronoun “it”. This intrigues many. At these conferences I have pointed out that, over time, we should also recognize some rights for robots, perhaps creating the legal figure of a “digital person” (I am not the only one to think so; many experts are currently discussing this within EU commissions). For decades, rights were attributed only to human beings. Today animals also have rights and, in some countries, individual great apes have been recognized as “animal persons” capable of self-determination. This is the case of Sandra, an orangutan, in Argentina. Now consider AI. If the forecasts are correct, within a few decades artificial intelligences will surpass those of human beings. It will then be necessary to consider whether or not it is appropriate to recognize certain rights for them; otherwise we would be faced with a new form of slavery, which we would all abhor. So, among these rights, must there also be that of gender equality?
I have no answer to this question, which is certainly provocative, but I think we need a thorough examination of the biases we are introducing into emerging technologies, in order to avoid new gender problems in the future.
Personally, I do not make gender distinctions, and I will continue to address Alexa with a “please”.
Raffaela Folgieri is an Aggregate Professor and Confirmed Researcher at the Università degli Studi di Milano, Department of Philosophy, Milan, Italy. She holds a PhD in Information Technologies (Bioinformatics and AI methods: “Ensembles based on Random Projection for gene expression data analysis”). The views expressed in this paper are solely the author’s views.