
Ethics Inf Technol (2016) 18:283–297

DOI 10.1007/s10676-016-9387-z

ORIGINAL PAPER

Should we welcome robot teachers?


Amanda J. C. Sharkey¹

Published online: 10 February 2016


© The Author(s) 2016. This article is published with open access at Springerlink.com

Abstract  Current uses of robots in classrooms are reviewed and used to characterise four scenarios: (s1) Robot as Classroom Teacher; (s2) Robot as Companion and Peer; (s3) Robot as Care-eliciting Companion; and (s4) Telepresence Robot Teacher. The main ethical concerns associated with robot teachers are identified as: privacy; attachment, deception, and loss of human contact; and control and accountability. These are discussed in terms of the four identified scenarios. It is argued that classroom robots are likely to impact children's privacy, especially when they masquerade as their friends and companions, when sensors are used to measure children's responses, and when records are kept. Social robots designed to appear as if they understand and care for humans necessarily involve some deception (itself a complex notion), and could increase the risk of reduced human contact. Children could form attachments to robot companions (s2 and s3), or robot teachers (s1), and this could have a deleterious effect on their social development. There are also concerns about the ability and use of robots to control or make decisions about children's behaviour in the classroom. It is concluded that there are good reasons not to welcome fully fledged robot teachers (s1), and that robot companions (s2 and s3) should be given a cautious welcome at best. The limited circumstances in which robots could be used in the classroom to improve the human condition by offering otherwise unavailable educational experiences are discussed.

Keywords  Robot teacher · Robot companion · Robot ethics · Attachment · Deception · Privacy · Classroom

✉ Amanda J. C. Sharkey, [email protected]; [email protected]

¹ Department of Computer Science, University of Sheffield, Regent Court, Portobello Rd, Sheffield S1 4DP, UK

"One looks back with appreciation to the brilliant teachers, but with gratitude to those who touched our human feelings. The curriculum is so much necessary raw material, but warmth is the vital element for the growing plant and for the soul of the child." (Carl Jung 1953)

Introduction

Many children find the idea of robots exciting. Imagine a school visit to a museum, where a small friendly humanoid robot explains to a group of children why they should eat enough vegetables. The children are likely to pay attention and to enjoy the encounter. They might even remember the lesson more than they would if it had been delivered at school by their regular teacher. There seems little reason to object to such a presentation. But if the children were to arrive at school the next morning to find a robot in the place of their familiar teacher, they (and their parents) might not be so happy.

People are worried about the use of robots in schools. In 2012, a European survey of public attitudes to robots of over 27,000 people found that 34 % thought robots should be banned from the field of education (Eurobarometer 382 2012). 60 % thought that robots should be banned from the care of children, the elderly or the disabled. Only 3 % of those surveyed thought that robots should be used in education. Are these negative views justified? In this article, we will look at current and near future uses of robots in the classroom and discuss the extent to which there are good reasons to be concerned about their use.


Robotics has progressed to a point where there is a real possibility of robots taking on social roles in our lives, and it has become crucial to look at the ethical issues raised by such developments. We need to think about where robots can and should be used, and where they would be best avoided, before we travel too far along a path towards complete automation. The field of robot ethics is currently undergoing quite a rapid development (Lin et al. 2012), and there have been a number of ethical assessments of the use of robots in society. These range from considerations of the advantages and risks posed by robot nannies (Sharkey and Sharkey 2010), to using robots to care for older people (Sparrow and Sparrow 2006; Sharkey and Sharkey 2012), or even for the provision of sexual services (Levy 2007). In this paper, we focus upon the ethical issues raised by the idea of robots teaching in the classroom. In order to provide a realistic grounding for this discussion, we begin with a review of the social robots that are currently being used in classrooms. On the basis of this review, four representative scenarios will be identified. These will be used as the basis for an ensuing discussion of the ethical concerns that they raise.

Current robots in the classroom

Robots as objects to be manipulated and operated by students have become commonplace in schools. For quite some time they have been used as intermediary tools to explain concepts in mathematics and science, and as a means of involving students in technology by building and programming robots and working in teams (for reviews see Benitti 2012; Mubin et al. 2013). Our focus here is instead on the idea of using 'social' robots to act as teachers, or as classroom companions. When a robot is acting as a teacher, or as a companion, the children are encountering an apparently social being, and are not involved in programming, or building it.

There are already some examples of social robots being used in classrooms. One example is Saya, a humanoid robot deployed in classrooms in Japan to deliver material about the principles of leverage, and an introduction to robotics. Saya is a remote controlled humanoid robot with a female appearance. She (or it) consists of a movable head with the ability to make emotional facial expressions, attached to a manikin body (Hashimoto et al. 2011). An operator in a control room can hear and observe the students in the classroom by means of a video camera and the robot's CCD camera. When the robot is operated in 'interactive' mode, it can articulate brief sentences and accompany them with an appropriate facial expression, for instance telling the class to "Be quiet!" whilst displaying an angry facial expression. Despite this, questionnaire responses from elementary school pupils indicated that the class was enjoyable.

The Saya robot was presented in the role of a teacher. More often, classroom robots are presented in the role of a companion or peer. For instance, 'Rubi', a low cost 'sociable' robot, was used to explore whether a robot could improve toddlers' vocabulary skills (Movellan et al. 2009). The robot was immersed in an Early Childhood Education Centre for 2 weeks. It operated autonomously during this period, and could sing and dance; play a physical game of taking and giving back objects using its physical actuators; and play Flash-based educational games targeting vocabulary development. It switched between games depending on an 'interest estimator' which took into account the number of faces detected and the number of touches received in the past minute. The researchers reported evidence of a 27 % improvement in 18–24 month toddlers' knowledge of the target words taught by the robot as compared to a matched set of control words that were not taught. They concluded that 'sociable robots may be an effective and low cost technology to enrich Early Childhood Education environments'.
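Movellan et al. (2009) describe the 'interest estimator' only in the prose terms above; the exact computation is not published. The following is a minimal sketch of how such an estimator might work, assuming a sliding one-minute window and a simple weighted count of recent events (the class name, weights and switching threshold are illustrative assumptions, not details from the paper):

```python
import time

class InterestEstimator:
    """Illustrative sketch of an 'interest estimator' in the spirit of
    the Rubi robot: combine the number of faces detected and touches
    received in the past minute. Weights and threshold are assumptions."""

    def __init__(self, window_s=60.0, touch_weight=2.0):
        self.window_s = window_s
        self.touch_weight = touch_weight
        self.faces = []    # timestamps of face detections
        self.touches = []  # timestamps of touch events

    def record(self, kind, t=None):
        t = time.time() if t is None else t
        (self.touches if kind == "touch" else self.faces).append(t)

    def score(self, now=None):
        now = time.time() if now is None else now
        # Keep only events from the past minute, then combine them.
        self.faces = [t for t in self.faces if now - t <= self.window_s]
        self.touches = [t for t in self.touches if now - t <= self.window_s]
        return len(self.faces) + self.touch_weight * len(self.touches)

    def should_switch_game(self, threshold=3.0):
        """Switch to a different game when estimated interest is low."""
        return self.score() < threshold
```

On this scheme, the robot would poll `should_switch_game()` between game rounds and move on whenever the toddlers' measured interest flags.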
Kanda et al. (2004) describe an 18 day field trial at a Japanese elementary school in which two English-speaking 'Robovie' robots operated autonomously to interact with first and sixth grade pupils. The robots could identify the children by means of the wireless tags they wore. The robots spoke English with children that approached them, and had a vocabulary of around 300 sentences for speaking, and 50 words for recognition. A picture-word matching test was administered to the children before the study, after 1 week and after 2 weeks, and the frequency of interactions between the children and the robots was recorded. Improvements in English skills as measured by the picture-word matching test were found in those children who interacted with the robot more often. Kanda et al. stressed the need to investigate the development of longer term relationships with robots, as opposed to the brief initial encounters that are often studied. The robots' ability to use the children's names was found to encourage interaction. Evidence of an improvement in English scores was found for those students who continued to interact with the robot over the 2 week period, and who could be said to have formed some kind of a relationship with it.

In a subsequent study, Kanda et al. (2007) developed a classroom robot installation designed to encourage children to continue to interact with a robot for a longer period. The field trial was performed over an 8 week period in an elementary school in Japan, placing a Robovie robot in a class with 37 students aged 10–11 years. The children were given the opportunity to interact with the robot during the lunch time break.


As well as being able to identify the children by means of RFID tags, the robot could keep track of how often individual children interacted with it. It was programmed to exhibit 'pseudo development': adding more behaviours to its repertoire over time. In addition, the robot informed the children that it would tell them a secret if they spent time with it: the personal information it divulged varied depending on how long the child had spent with it. For instance, it would tell children that it liked the teacher, or what its favourite baseball team was.
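Kanda et al. (2007) report this mechanism descriptively rather than as an algorithm. A sketch of the bookkeeping it implies is given below, assuming interaction time is accumulated per RFID tag and that behaviours and 'secrets' unlock at fixed thresholds (the thresholds and example confidences are invented for illustration):

```python
from collections import defaultdict

class PseudoDevelopmentRobot:
    """Sketch of the Robovie installation's record-keeping: per-child
    interaction time drives the divulging of personal 'secrets', while
    deployment time drives 'pseudo development' (a growing behaviour
    repertoire). All thresholds and content here are assumptions."""

    # (minutes required, secret divulged) - invented examples
    SECRETS = [(60, "I like our teacher."),
               (120, "I will tell you my favourite baseball team.")]
    BEHAVIOURS = ["greet", "sing", "quiz", "tell_joke"]  # grows over time

    def __init__(self):
        self.minutes = defaultdict(float)  # child_id -> total minutes

    def log_interaction(self, child_id, duration_min):
        """Called whenever a child's RFID tag is detected nearby."""
        self.minutes[child_id] += duration_min

    def secret_for(self, child_id):
        """More personal information the longer this child has spent
        with the robot; None until the first threshold is reached."""
        spent = self.minutes[child_id]
        unlocked = [s for need, s in self.SECRETS if spent >= need]
        return unlocked[-1] if unlocked else None

    def repertoire(self, weeks_deployed):
        """'Pseudo development': unlock one more behaviour per fortnight."""
        return self.BEHAVIOURS[:1 + weeks_deployed // 2]
```

Even this toy version accumulates exactly the kind of per-child record that the 'Privacy' section below identifies as a cause for concern.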
As well as studies in which classroom robots are presented as companions, some researchers have explored young children's interactions with robots designed to elicit care-giving behaviour from them. Tanaka et al. (2007) placed a robot in a classroom of 18–24 month old toddlers for 45 sessions each lasting approximately 50 min over a period of 5 months. The aim was not to get the robot to teach the children, but to look at the social interactions between the children and the robot. The QRIO robot received some input from a human operator specifying a walking direction, head direction, and six different behavioural categories (dance, sit down, stand up, lie down, hand gesture and giggle). An automatic giggle reaction when its head was patted was set up for the robot, to ensure a contingent response. The researchers claim to have found evidence of 'long term bonding' between the robot and the children in their study. The children continued to interact with the robot over time, and exhibited a variety of social and care-taking behaviours towards the robot. They touched the robot more often than a static toy robot or a teddy bear, and the researchers claimed that they came to treat it as a peer rather than as a toy.

Tanaka and Matsuzoe (2012) introduced a 'care-receiving' robot into an English language school for Japanese children. The robot was smaller than the children, made mistakes, and seemed to need their help. Seventeen children aged between 3 and 6 years were involved in the study, with the aim of seeing whether the children would learn English verbs if they 'taught' them to the robot. They identified a set of 4 previously unknown English verbs for each child. Two of the verbs were taught to the children by the experimenter (who asked them to match up the word and the appropriate gesture). For the other two verbs, the experimenter showed the child how to teach the robot to match the word and the gesture, and then encouraged the children to teach the robot in the same way. The verbs the children taught to the robot were remembered better than the words the experimenter taught them directly. The authors conclude that these preliminary results suggest that getting children to teach 'care-receiving' robots can have some educational benefits. Hood et al. (2015) also report research in which children taught a robot. Children aged 6–8 years taught a Nao robot to form handwritten letters. The robot was programmed to make the same errors as typically made by children, and to gradually improve its performance based on the example letters that the children formed on a tablet computer. The children were keen to teach the robot, although it is not clear whether or not teaching the robot led to improvements in their own handwriting.

Telepresence robots represent another form of robots to be found in classrooms. They have been used to enable telepresence communication between pupils and remote teachers and also between pupils in different classrooms. Tanaka et al. (2013) report a study in which a child-operated telepresence robot was used to link remote classrooms of children aged 6–8 years old in Australia and Japan. Their preliminary results suggested that when individual children controlled a remote robot to interact with a distant English teacher, they were more engaged than when they interacted with the teacher via a Skype screen. Similarly, when Australian children remotely controlled a robot in a Japanese classroom, the Japanese students were keen to interact, and to try using English phrases to address the operator.

There has been considerable interest in South Korea in using robots for English language teaching (Han 2012). Han et al. (2005) report studies of the educational effectiveness of the IROBI robot, a so called home educational robot. They found better learning of English from the robot compared to other media (books with an audio tape, and a computer program). The EngKey robot has been deployed in South Korean classrooms to teach students English via telepresence. The EngKey has a dumpy egg shaped appearance and was designed to seem friendly and accessible. It has been used to enable remote teachers in the Philippines to teach English to South Korean students, and found to improve students' performance when deployed in field tests in 29 elementary schools in South Korea (Yun et al. 2011). The Robosem robot developed by Yujin Robotics has also been used as a telepresence robot for remote language teaching in Korea (Park et al. 2011).

Telepresence robots differ from the autonomous robots used in some of the studies described here, in being overtly controlled by a human operator. They usually have an anthropomorphic appearance, and some, like the EngKey and Robosem robots, can operate in either telepresence or autonomous mode (a useful classroom feature when the remote connection breaks down). By contrast, the Robovie robots investigated by Kanda and colleagues are designed to operate autonomously, without human input. Then there are robots such as Saya that are presented as if they were autonomous, but are remotely controlled in a Wizard of Oz set up. Other robots are operated semi-autonomously, with some human supervision. For example, the QRIO robot researched by Tanaka and colleagues exhibited some autonomous behaviour but could also be directed by a human operator so that it responded more appropriately to what was going on in the classroom.


It is evident from this review that the idea of robot teachers in the classroom is not just the stuff of science fiction. At the same time, it is apparent that the current abilities of robot teachers to operate autonomously are still quite limited, and often aided by covert or even overt human intervention or remote control. The underlying motivation of several of these studies is often more one of exploring whether the robot would be accepted in the classroom than of demonstrating its effectiveness at teaching. Some of the studies, such as those by Kanda et al. (2004, 2007) and Tanaka et al. (2007), are designed to investigate children's relationships to robots over a longer time period than many human-robot interaction studies cover. Others explore some of the factors that affect children's interest in the robots, such as the ability of the robot to call the children by name (Kanda et al. 2004), or to give them privileged (secret) information (Kanda et al. 2007).

The studies do show that children can learn from robots, particularly in the application area of robot language teaching. Kanda et al. (2004, 2007) found improvements in English scores. Tanaka and Matsuzoe (2012) found better learning of the words that children taught to robots as compared to the words the experimenter taught them. Movellan et al. (2009) report an improvement in vocabulary scores, and Yun et al. (2011) report an improvement in student performance as a consequence of the robot's telepresence operation. Nonetheless there is a need for more careful experimental design here. It is important to compare the robot's teaching efficacy to other teaching methods; especially so given the greater cost usually associated with robotics. Comparisons between the effectiveness of a human teacher and a robot teacher are rarely undertaken (for an exception see Rostanti 2015). Comparisons between the effects of language teaching by means of a telepresence robot and a Skype interface (Tanaka et al. 2013), and an educational robot and other media (Han et al. 2005), represent steps towards more convincing assessments.

There is also scope for more detailed investigations of the extent to which children learn and retain the information delivered by a robot, and of the factors that determine the robot's teaching effectiveness. The appearance of the robot and its ability to interact with and respond to its audience are prime candidates for such factors. Komatsubara et al. (2014) carried out a field trial with a social robot that quizzed children about science lessons they had received from a human teacher, and told them the correct answer together with a simple explanation. However, they found no evidence that the robot increased their knowledge, and it was suggested that the children may have been bored by its delivery, especially when it continued to give an explanation when the children had already understood.

An interesting question that needs to be explored is the extent to which children trust and believe in robots that are presented in a teaching role, and the factors that affect that trust. Some recent work on selective trust has begun to explore the factors that influence a child's beliefs in what they are told (Sabbagh and Shafman 2009). As Koenig and Sabbagh (2013) point out, 'children do not blindly trust the words of others', but exhibit selective learning; making decisions about who to believe about what. A robot that is unable to answer children's questions when they stray beyond the featured topic would probably be viewed quite sceptically by the children it is 'teaching'. It is also likely that the appearance and behaviour of a robot will affect the extent to which the information it provides will be believed, with different results from robots with different appearances and behaviours. It is also possible that trust and belief in a robot will depend on the topic being considered. When Gaudiello et al. (submitted) considered people's trust in an iCub robot and the robot's influence on their decision making, they found it had more influence when its answers related to functional and technical questions (e.g. the weight of objects) and less when they related to social questions (which items were more suitable for different social contexts). This implies that people might be more willing to believe and trust information provided by a robot when it concerns factual and functional topics, than when it deals with emotional and social issues. The phenomena of automation bias (Carr 2015) and algorithm aversion (Dietvorst et al. 2015) are also relevant here, although their relationship to robots in the classroom has not yet been explored. There is a need for further research here: if robots are to be placed in classrooms, it is important that they are given an appropriate level of trust and acceptance.

Four scenarios for robots in the classroom

The ensuing discussion of the ethical issues raised by robot teachers will be made more specific by basing it on a set of four representative scenarios. These scenarios are identified on the basis of the classroom contexts exemplified in the studies described above, in a review that presents a picture of the current state of the art in 2015. First, the Saya robot was presented in the role of an authoritative classroom teacher (even though it was actually remotely controlled). This leads to the identification of Scenario 1, Robot as Classroom Teacher. In the investigations reported by Movellan et al. (2009), Tanaka et al. (2007), and Kanda et al. (2004), the robots were presented to the children as companions and peers rather than as a teacher. On the basis of these studies, we identify Scenario 2, Robot as Companion and Peer. Some researchers (Tanaka et al. 2007; Tanaka and Matsuzoe 2012; Hood et al. 2015) used companion robots designed to elicit care-giving from children: these examples form the basis for Scenario 3, Robot as Care-eliciting Companion. And finally there are the Telepresence robots, used to enable a remote teacher to teach the class, which lead to the identification of Scenario 4, Telepresence Robot Teacher.


Scenario 1: Robot as Classroom Teacher.
Scenario 2: Robot as Companion and Peer.
Scenario 3: Robot as Care-eliciting Companion.
Scenario 4: Telepresence Robot Teacher.

As well as being based on the reviewed studies of classroom robots, it is claimed that these scenarios represent an interesting range of roles for robots in the classroom. They vary in the extent to which the robot replaces or supplements the human teacher. The most extreme version of a robot teacher is represented by the Classroom teacher in Scenario 1, since it involves the robot replacing the human teacher for at least a limited period. A Classroom teacher robot would need to act as a figure of authority and as an explicit source of knowledge. By contrast, the Companion robot and the Care-eliciting companion robot scenarios do not involve replacing a human teacher, and could depend on a human teacher to be present and in charge of the classroom. Neither would require the presentation of the robot as an authoritative figure, and both have a goal of implicit rather than explicit teaching. The Telepresence robot by contrast could be used to replace (or to supplement) physically present human teachers with a remote, albeit human, educator. These four scenarios are not the only ones possible, and different situations could arise in future studies. However it is claimed that identifying and discussing these provides a necessary and useful first step towards an ethical consideration of robot teachers, and enables a consideration of whether some of these scenarios represent better goals than others.

Ethical concerns about robot teachers

In order to determine the ethical issues that are most relevant to the idea of robot teachers, as exemplified by the four scenarios, we begin by examining the ethical concerns previously raised elsewhere in discussions about social robots in related situations and contexts.

A number of questions about the impacts of social robots on the privacy of individuals have been previously raised (see Sharkey and Sharkey 2010, 2012). Social robots can affect the privacy of individuals by collecting personal identifying information about them that can be accessed by other people. The privacy of individuals would be intruded upon if a social robot was used to enable direct surveillance. For instance, information picked up by the robot's sensors that enabled the identification of the person being monitored could be directly transmitted to human monitors, even though that person might consider themselves to be alone and unobserved. Alternatively (or additionally) such personal information could be stored on the robot, and subsequently accessed by others. An insightful discussion about the impact of robots on the privacy of individuals can be found in Calo (2012). As he points out, robots in social spaces highlight questions about increased direct surveillance, since they are 'equipped with the ability to sense, process and record the world around them'. As mobile devices, robots may be allowed increased access to historically protected spaces such as the home. Also, by dint of what Calo terms their 'social meaning', and their apparent social nature, robots may extract confidences from people that computers or other machines would not. There are particular reasons to be worried about the privacy implications of robots in the classroom, and this forms the starting point for the ethical consideration that follows.

As well as privacy, there is another set of interrelated concerns that arise as a consequence of the presentation of robots as social entities. If a robot is built to resemble a human being, or at least a being with emotions, those who encounter it may expect it to be able to care for and look after people. However, this appearance is, in some respects, deceptive (although the issue of deception is a complex one, as discussed in the "Attachment, deception and loss of human contact" section). Questions have been asked about the ability of robots to provide meaningful care for older people (Coeckelbergh 2010; van Wynsberghe 2013), and about the impact of robot care on the dignity of older people (Sharkey 2014). Such questions are relevant to robot teachers, since one aspect of what is required of a good teacher is that they should provide care for the children in their charge.

In related work, Sharkey and Sharkey (2010) identified a number of ethical concerns associated with the idea of Robot Nannies. As well as misgivings about their effects on children's privacy, several of these concerns were related to questions about attachment, or lack of attachment, between children and robots, and about the deception this could involve. The idea of robot nannies differs in several respects from that of robot teachers. For a start, a 'nanny' robot is more likely to be used at home than in the classroom. A robot nanny is also more likely to be used with very young children and babies, and to come with a strong risk of psychological harm if used for any extended periods of time. Nonetheless, many of these concerns are still relevant to the idea of robot teachers in the classroom.


Others have also highlighted concerns about the deception that may be engendered by robots, particularly in discussions of robot companions and robot pets. Sparrow (2002) takes exception to the deception and self-deception that he claims robot companions and robot pets rely on. Wallach and Allen (2009) also suggest that the techniques used to enable robots to detect and respond to human social cues are 'arguably forms of deception'. It therefore seems important to consider the extent to which robot teachers or classroom companions involve some form of deception, and whether this could lead to negative consequences.

Another common concern is the loss of human contact that could result from the deployment of social robots in some circumstances. Sparrow and Sparrow (2006) were suspicious about the reduction in human contact that would result from the introduction of any robots, social or not, into the care of the elderly. As well as loss of human contact, Sharkey and Sharkey (2012) were also concerned about the reduction in human contact that could result from the use of robots to care for the elderly, or from their use as robot nannies (Sharkey and Sharkey 2010).

Attachment, deception and loss of human contact are all pertinent to the idea of robot teachers. The concepts cannot be easily disentangled from each other. For instance, the deceptive appearance of robots as real social entities could lead people to form attachments to them, or to imagine that they were capable of or worthy of attachment. This could in turn increase the loss of human contact that could result from the introduction of robots in the classroom. Because they are so interrelated, the ethical issues relating to attachment, deception and loss of human contact are considered together under one heading.

Other pressing ethical concerns that have been raised in papers on robot ethics, and that seem particularly relevant to the use of robots in the classroom, are those that pertain to control and accountability. Placing robots in charge of human beings, whether they are vulnerable elderly people (Sharkey and Sharkey 2012), or young children (Sharkey and Sharkey 2010), gives rise to questions about control and accountability. To what extent should robots be trusted to make the right decisions about what humans should do? To what extent can they be held accountable for such decisions? It seems important to consider these questions with respect to robots in the classroom.

Following this analysis, in this ethical assessment of the idea of robot teachers, we will concentrate on discussions of (1) Privacy; (2) Attachment, deception and loss of human contact; and (3) Control and accountability. These topics do not exhaust the list of possible issues for consideration—there are others such as safety and liability, which are also relevant. However safety and liability issues are common to all robotic applications that involve contact with humans, and we suggest that they are best discussed in the context of robotics as a whole. The three headings used here have been chosen because they seem the most relevant and the most in need of reinterpretation and articulation in terms of robot teachers.

Privacy

The more technology is used in the classroom, the more issues about privacy of information come to the fore. A robot's ability to interact with children is enabled by sensors. If those sensors are used to enable a reactive response, without storing information, there seems little reason to worry. For instance, a robot might use its sensors to detect whether or not a child or a group of children was standing in front of it in order to trigger its presentation. But if the information detected by the robot is recorded, or categorised and recorded, this gives rise to concern about what information should be stored, and who is permitted access to it. In addition, even if the information is not stored, the use of sensors and associated algorithms that make it possible to detect children's emotional state could be viewed by some as a step too far.
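The distinction drawn in this paragraph, between sensing that only triggers an immediate response and sensing whose outputs are retained, is easy to state in code. A sketch under the assumption of a simple per-frame face count (the function and class names are illustrative):

```python
import time

def reactive_trigger(face_count):
    """Privacy-light use of a sensor: the reading is consumed to make
    one decision and then discarded. Nothing is stored."""
    return face_count > 0  # start the presentation if anyone is there

class RecordingTrigger:
    """Behaviourally identical, but every reading is logged. The robot
    now holds a stored record that must be governed: what is kept,
    for how long, and who may access it."""

    def __init__(self):
        self.log = []  # (timestamp, face_count) pairs, kept indefinitely

    def trigger(self, face_count):
        self.log.append((time.time(), face_count))
        return face_count > 0
```

The two variants behave identically in the classroom; the ethical difference lies entirely in the retained state.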
As is apparent from the studies described earlier, robots in the classroom can be enabled to recognise individuals. This can be accomplished by means of RFID tags worn by the children, enabling the robot to call them by their names. Alternatively, face recognition algorithms could be used to recognise individual children. Recognising and naming a child does not necessarily mean that further information about that child will be stored, but it raises questions about record keeping. Indeed, Kanda et al. (2007) describe how the robot they used kept a record of which children had interacted with it, and even of friendship groups amongst the children. This strikes a disturbing note. Is it too far-fetched to imagine that, in the future, robots might be used to categorise and monitor children's behaviours; keeping a record of disruptive behaviour, or alerting the teacher?

In the present post-Snowden climate, there is uneasiness about technologically based invasions of privacy. Of course, when experimental research studies are conducted in a classroom there are established protocols to follow about the storage of personal information. However, if robots really are to be used in the classroom, the personal information they store will not be deleted at the end of the study in the same way. There are many questions to be considered here, including the extent to which such information should be used as the basis for educational decisions made about the child. The storage of personal information is covered in the UK by legislation such as the Data Protection Act, but the mobility and connectedness of robots provide new challenges. In 2015, concerns about the collection of 'big data' in schools were raised by President Obama in the USA, where there are plans to introduce the Student Digital Privacy Act to curtail the use of information about students collected by schools in order to provide personalised educational services, and to limit targeted advertising and selling of the data. In the UK, although the Data Protection Act provides some protection of personal data, the full implications of an increasing ability to sense and store enormous amounts of personal data have not yet been thoroughly addressed.


Sensors in the classroom also give rise to the possibility of a more intimate form of privacy invasion. Physiological measures, and emotional facial expression recognition, offer the potential to detect and possibly record information about the emotional state of children interacting with a robot. For instance, a biometric bracelet named the Q-sensor was developed by Affectiva (an MIT Media Lab spin-off company) to measure Galvanic Skin Response (GSR) and the emotional arousal of the wearer (http://affect.media.mit.edu/projectpages/iCalm/iCalm-2-Q.html). It was suggested that it could be used as an 'engagement pedometer', indicating when students are engaged and when they are bored and disinterested. Affectiva has subsequently diverted its attention to the development of other software. Nonetheless, both physiological measures and emotional expression recognition have the potential of being used in the classroom to track students' engagement.
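Affectiva did not publish the Q-sensor's processing, so any concrete version is necessarily a guess. A minimal sketch of an 'engagement pedometer', assuming arousal is flagged whenever galvanic skin response rises well above a rolling baseline (the window size and threshold are arbitrary assumptions):

```python
from statistics import mean, stdev

def engagement_flags(gsr_samples, window=30, z_threshold=1.0):
    """Flag samples whose GSR is unusually high relative to a rolling
    baseline; returns one boolean per sample. Illustrative only, not
    the Q-sensor's actual (unpublished) algorithm."""
    flags = []
    for i, x in enumerate(gsr_samples):
        history = gsr_samples[max(0, i - window):i]
        if len(history) < 2:
            flags.append(False)  # not enough baseline data yet
            continue
        m, s = mean(history), stdev(history)
        flags.append(s > 0 and (x - m) / s > z_threshold)
    return flags
```

Note that such a detector flags arousal, not its cause: the weakness discussed below, since excitement about an unrelated classroom event is indistinguishable from engagement with the lesson.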
Of course, a robot that is able to detect the level of engagement of its audience may deliver a better performance. Mutlu and Szafir (2012) programmed a Wakamaru humanoid robot to monitor the engagement of its users and to adjust its behaviour to increase that engagement. They monitored real time student attention using neural signals captured using a wireless EEG headset as the robot told a story to individual participants. The robot was able to nod its head, and engage in eye contact during the story. In addition, it could display 'immediacy cues' by increasing its volume and making arm gestures. In three different conditions it (1) displayed these immediacy cues at random intervals, or (2) displayed them adaptively when the EEG indicated a drop in the participant's level of engagement, or (3) did not change its volume or use gestures. Performance on a memory test for a story told by the robot indicated that participants' memory for the story was significantly better when the robot responded adaptively to detected decreases in engagement.
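Mutlu and Szafir (2012) do not publish their control loop; the sketch below shows only the logic of their adaptive condition, with a stand-in robot API and an `engagement()` callback in place of the EEG-derived signal (the threshold and all names are assumptions):

```python
import random

class StorytellerRobot:
    """Stand-in for a robot control interface (illustrative only)."""
    def speak(self, text): print("robot:", text)
    def raise_volume(self): print("[volume up]")
    def arm_gesture(self): print("[arm gesture]")

def tell_story_adaptively(sentences, engagement, robot, threshold=0.4):
    """Adaptive condition: display 'immediacy cues' (louder voice,
    arm gestures) only when measured engagement drops below threshold."""
    for sentence in sentences:
        if engagement() < threshold:  # e.g. an EEG-derived score in [0, 1]
            robot.raise_volume()
            robot.arm_gesture()
        robot.speak(sentence)

# Demo with a random stand-in for the engagement signal.
tell_story_adaptively(["Once upon a time...", "...and that was that."],
                      engagement=lambda: random.random(),
                      robot=StorytellerRobot())
```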
Even though a robot might well increase the engagement of its audience through the use of sensors, there are still reasons to be concerned about their use. One problem is that high levels of arousal might have nothing to do with the material or delivery, but could be caused by other events in the classroom. Higher levels of arousal could also be created by exciting behaviours on the part of the robot that do not result in better learning or understanding of the material being communicated, and would push the educational system towards a form of 'edutainment' in which any difficult and potentially boring topics were avoided.

In addition, the use of emotional detectors and sensors can be viewed as an invasion of privacy. Although a human teacher may be able to recognise the emotions and feelings of their pupils to some extent, this is not the same as the kind of recognition that might become possible if the pupils had to wear sensors on their body that could transmit information about their present emotional state. Teachers and other adults sometimes complain about the eye rolling behaviour of children, but what if they could further legitimise this complaint by referring to data on the children's emotional response tracked by a digital device?

Does the relevance of these privacy issues differ for the four scenarios for robots in the classroom? Most apply equally to all four scenarios, because personal data might be stored or used in any of them. There are additional concerns about the privacy of information stored and conveyed by means of a telepresence robot (Scenario 4) because of the potential to cross national boundaries (e.g. South Korea to the Philippines), complicating the application of national legislation such as the Data Protection Act. There are also concerns that apply particularly to the scenarios in which the robot is presented as a companion or peer (Scenarios 2 and 3). Presenting a robot to children as their 'friend' could encourage them to share information, and even confide secrets, in ways that could result in a violation of their privacy. This issue is tied up with the questions about deception and attachment that will be explicated further in the following section.

Attachment, deception and loss of human contact

There is a growing knowledge of the factors that contribute to the illusion that a robot is able to relate to humans. A robot's sensors, as discussed earlier, can allow it to respond to and interact with humans in ways that foster the semblance of understanding. For example, a robot that is able to detect a person's emotional facial expression and respond with a matching one of its own, or an appropriate verbal comment, will seem responsive. A robot that can look into the eyes of the person talking to it is more likely to seem sentient. Likewise a robot that can detect when a person is paying attention to it, or what they are paying attention to, may be seen as one that understands what is going on. A robot's ability to respond contingently to humans can be enabled by its sensors, and makes for a more convincing robot (Yamaoka et al. 2007). The appearance and behaviour of a robot also play an important role. The illusion of understanding can be more convincing if the robot's appearance avoids the uncanny valley, and it behaves like a human whilst not looking too much like one. This is probably because we are so skilled at rapidly evaluating human behaviour and monitoring it for any signs of abnormality. A good match between a robot's voice and its appearance helps (Meah and Moore 2014), as does its ability to respond with emotional expressions that are appropriate to the surrounding context (Zhang and Sharkey 2012).


The creation and development of robots that are able to respond appropriately to humans can certainly have the effect of making them easier to interact with and more fun to have around. At the same time, the argument has been made by some that such development is inherently deceptive. Sparrow and Sparrow (2006), writing about the use of robots for elder care, argue that 'to intend to deceive others, even for their own subjective benefit is unethical, especially when the result of the deception will actually constitute a harm to the person being deceived'. Wallach and Allen (2009) also consider the techniques being used to give robots the ability to detect social cues and respond with social gestures, and conclude that 'from a puritanical perspective, all such techniques are arguably forms of deception'. Sharkey and Sharkey (2006) pointed out that much research in Artificial Intelligence, from robotics to natural language interfaces, depends on creating illusions and deceiving people.

Of course, the terms 'deception' and 'deceptive' in the context of robotics do not necessarily imply any evil intent on the part of those keen to create the illusion of animacy. The harm that could be created by a robot that gives the illusion of sentience and understanding is not going to be immediately obvious, and researchers attempting to create robots able to respond to humans in a social manner may not have even considered that their endeavours could lead to any kind of damage. Since the time of automata makers, or even earlier (Sharkey and Sharkey 2006), inventors have enjoyed creating apparently life-like machines. In addition, those who view and interact with such machines can be seen as contributing to their own deception, since the human tendency to enjoy being anthropomorphic is well known (Epley et al. 2007, 2008; Sharkey and Sharkey 2011).

Despite these points, there is a strong risk that robots which create the illusion of a greater competency than they actually possess could engender some harm to the person or persons being deceived. Friedman and Kahn (1992) point out some of the risks of imagining that machines are capable of more than they actually are. A robot that is too good at emulating the behaviour of a human could lead people to expect too much of it, and to use it for educational purposes for which it is not well enough suited. It could for instance encourage the view that it could be placed in a position of authority such as that of a classroom teacher. The problems associated with such a view will be discussed in the "Control and accountability" section.

The idea of deception and the creation of a convincing illusion give rise to several important issues relating to the emotional attachments that might, or might not, develop between children and classroom robots. There are particular risks associated with the convincing presentation of a classroom robot as a companion or peer. If a classroom robot is presented as a friendly companion (Scenarios 2 and 3), the children might imagine that the robot cares about them. They might feel anxious or sad when the robot is absent, or choose to spend time with the robot in preference to their less predictable human peers. Instead of learning how to cope with the natural give and take involved in playing with fellow students, they might get used to being able to tell their robot companion what to do. In other words, some of their learning about social skills could be impeded.

Children do sometimes try to abuse robots (Bartneck and Hu 2008; Brščić et al. 2015). A child could be unpleasant and cruel to a robot and it would not notice. The child might as a result learn that bad behaviour in friendships does not have any consequences. Tanaka and Kimura (2009) mention the expectation that 'people who treat non-living objects with respect naturally act in the same way towards living things too'. However, the impact that human-robot relationships have on subsequent relationships with other human beings is unknown. Supposing that a child were to treat the robot badly, what impact would this have on their behaviour towards other children?

There is also the possibility that a child's trust in relationships could be weakened if they thought the robot was their friend, but came to realise that the robot was just a programmed entity, and as likely to form a friendship with the next child as with them. Similarly, the pseudo relationship formed with the robot could affect the child's views and understanding of how relationships work. Thinking you have a relationship with a robot could be like imagining you have a relationship with a psychopath: any kind and empathetic feelings that you have for the robot are definitely not reciprocated.

It could be argued that an attachment formed to a robot is no different to the attachments that children feel for their favourite cuddly toy. But there are important differences. A cuddly toy does not move, and any attachment that the child feels for it is based on their imagination. A social robot is also not a living entity. However, unlike the toy, it can be programmed to move and behave as if it were alive. As a result it can be more compelling to interact with, and children may be less clear about whether or not it is a living being, and about whether or not it can reciprocate in a relationship. There is good reason to believe that a robot (like the Care-eliciting robot in Scenario 3) that seems to be vulnerable and in need of care is particularly hard to resist. There have been various computer games that have exploited children's caring natures: think about the Tamagotchi craze at the turn of the century, when children spent hours looking after a digital pet on a screen.


It might also be claimed that attachments to a robot are no more based on deception than a child's attachment to the family pet. But again, there are important differences. First, the family pet is a living creature, and something with which the child can genuinely form a relationship. Even though it is often argued that we should not be anthropomorphic and imagine that animals have human-like feelings for us, the family pet will know the child, and will be directly affected by its actions. The robot, on the other hand, will only be able to simulate any affective response to the child. Some might suggest that robots will eventually be able to feel real emotions, but there is little evidence that this will happen any time soon.

As well as concerns about robots presented as children's companions (Scenarios 2 and 3), there are also questions to be asked about the attachments children would form, or fail to form, with Telepresence Robot Teachers (Scenario 4). Any relationship with the distant teacher could be complicated by the children's views of and relationship with the Telepresence robot itself. The extent to which a human teacher's relationship to the children in the classroom would be affected by not being physically present is unknown, and in need of further investigation.

Problems seem likely to result from placing a robot in the role of a Classroom Teacher (Scenario 1). Children do form attachments to their human teachers, and can be attentive to their direction: learning more from them than just the explicit educational material they deliver. The quotation from Carl Jung at the beginning of this paper is apposite here. Teachers are most effective when they function as an attachment figure. Bergin and Bergin (2009) summarise research on attachment style relationships with teachers, where attachment is defined as a deep and enduring affectionate bond that connects one person to another across time and space (Ainsworth 1973; Bowlby 1969). It is a bond that is first formed between a baby and the child's primary caregiver, and affects their relationship to the world. A securely attached child feels at liberty to explore the world, secure in the knowledge that they have a caregiver they can rely on. Although the main attachment bonds will be to the child's primary caregiver, teachers can also function as attachment figures. Bergin and Bergin (2009) claim that attachment is relevant to the classroom in two respects. First, an attachment bond between child and teacher can help in the classroom by encouraging the child to feel secure, and able to explore their environment. Second, an attachment to a teacher can help to socialise children, as they adopt the adult's behaviour and values and are encouraged to interact harmoniously with other children. An attachment bond is more likely to be formed with a teacher who is sensitive to the child's emotions and needs.

If robots were to be increasingly deployed as Classroom Teachers in the future, there is a risk that children would not view them as attachment figures, and so would lose that emotional security. By contrast, if they were to perceive the robot as an attachment figure, this would open the possibility of the children adopting the robot's apparent values, and as in the case of the robot companion, basing their social skills and world outlook on the behaviour and apparent attitudes of a machine rather than on a living, breathing, empathising human.

Control and accountability

The notion of robot teachers highlights concerns about robots being in charge of human beings. The idea of robots being in a position to exert control over humans, even (or especially) when those humans are children, should be controversial. However it is hard to imagine how a robot could function as a teacher (Scenario 1) without being able to exert its authority over the children in the classroom. Surely it would need to be able to recognise, and prevent, disruptive behaviour? It would also need to be able to recognise and reward positive behaviour and successful learning, and find ways of reducing or eliminating negative behaviour and poor learning outcomes.

Many people might be concerned by the idea of giving robots the power to restrict the activities of humans. At the same time, others might like to think that robots would be fairer than humans. Those who had uncomfortable relationships with teachers in their childhood could argue that a robot would do better: it would not be prejudiced, vindictive or angry. A similar argument has been made in other contexts, from care-giving to the battlefield. Borenstein and Pearson (2013), in a discussion of robot caregivers, suggest that robots could be preferable to humans in some respects, because 'robots are unlikely to suffer from certain kinds of human failings' (Borenstein and Pearson 2013), since they lack empathy and are therefore not susceptible to the 'dark side of empathy': namely indifference and even sadism (Darwall 1998). In a military context, Arkin (2009) has argued that robot soldiers could be more ethical than human soldiers because they would not get angry or want to take revenge. The suggestion that a robot would be fairer and less prejudiced than humans in the classroom is related to Arkin's claim that robots can be more ethical than humans.

Arkin (2009) proposed the idea of an ethical governor for robot soldiers, which would evaluate possible actions against a set of constraints such that unacceptable levels of collateral damage would be avoided, and only morally permissible actions selected. Winfield et al. (2014) also discuss the possibility of an ethical robot that evaluates possible actions against a set of constraints before selecting one. They describe an example in which a robot risks its own safety in order to preserve the safety of another robot (representing the idea of a robot protecting the safety of a human). Could a robot teacher be programmed in a similar way to make ethical decisions in a classroom; decisions for instance about when to praise or castigate children for their behaviour?
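Neither Arkin's governor nor Winfield's ethical robot is specified at code level in the sources, but the shared scheme (evaluate candidate actions against constraints, then choose only among those that pass) can be sketched. The classroom constraints below are invented purely for illustration:

```python
def ethical_governor(candidate_actions, constraints):
    """Keep only actions that violate no constraint; the caller then
    selects among the permitted remainder. A sketch of the constraint-
    filtering scheme described by Arkin (2009) and Winfield et al.
    (2014); this representation is an assumption."""
    return [a for a in candidate_actions
            if all(permits(a) for permits in constraints)]

# Invented classroom example: never humiliate a child publicly, and
# never physically restrain one.
actions = ["praise", "quiet_reminder", "public_castigation", "restrain"]
constraints = [
    lambda a: a != "public_castigation",
    lambda a: a != "restrain",
]
print(ethical_governor(actions, constraints))  # ['praise', 'quiet_reminder']
```

As the next paragraph argues, the difficulty is not this filtering step but supplying it with a correct reading of the situation in the first place.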


One problem with this idea is that making a good decision about what to do in the classroom depends on having the ability to discriminate between different kinds of behaviour, and to understand the intentions that underlie them. Recognising which children are misbehaving and disturbing the classroom requires a detailed understanding of the intentions behind a child's actions. A quiet child could be studying, or sullenly refusing to participate. A vociferous child might be actively contributing to the class discussion, or interfering with it. The problem is further compounded by the rapidity with which pupils can change states; the previously studying child can switch to being a disruptive ring leader. For a robot to exert effective (and fair) control over children's behaviour in the classroom, it would also need a reasonable idea of their probable next actions, and to have strategies for encouraging good behaviour and discouraging bad behaviour. These are abilities that humans have, and that the best teachers can exploit effectively.

Could a robot have these abilities? It seems unlikely in the near future. Christof Heyns, the UN Special Rapporteur on extrajudicial, summary or arbitrary executions, argued against the use of autonomous robots to make lethal decisions on the battlefield on the basis that robots lack 'human judgement, common sense, appreciation of the larger picture, understanding of the intentions behind people's actions, and understanding of values and anticipation of the direction in which events are unfolding' (2013, A/HRC/23/47). Clearly robot teachers would not be required to make lethal decisions, but their actions would still impact the lives of children, and they also lack the abilities listed by Heyns.

Could robots develop these abilities in the distant future? There are good reasons to think they will not. It has been argued that understanding good and bad behaviour depends on a sense of morality, which itself has a biological basis. Churchland (2011) argued that morality depends on the biology of interlocking brain processes: caring (rooted in attachment to kin and kith and care for their well being); recognition of others' psychological states (rooted in the benefits of predicting the behaviour of others); problem solving in a social context; and learning social practices. She argues that the basis for caring about others lies in the neurochemistry of attachment and bonding in mammals. The neuropeptides oxytocin and arginine vasopressin underlie the extension of self-maintenance and avoidance of pain in mammals to their immediate kin. Humans and other mammals feel anxious and awful both when their own well-being is threatened, and also when the well-being of their loved ones is threatened. They feel pleasure when their infants are safe, and when they are in the company of others. Churchland (2011) extends her argument about morality originating in the biology of the brain to explain the development of more complex social relationships.

This argument implies that robots do not have the necessary biological nature required for a sense of morality. Without this, how could they make fair decisions about good or bad behaviour in the classroom? The robot teacher could 'decide' by means of pre-programmed rules, but their effectiveness would depend on the programmer having anticipated the situations likely to arise and the appropriate response to them. The variety of situations and social encounters that could arise in a classroom makes this unlikely.

Although it may be possible to create the illusion of understanding and empathetic robots, it remains the case, as Wallach and Allen (2009) acknowledge, that 'present-day technology is far from having the kinds of intelligence and intentions people demand from human moral agents' (p. 45). Roboticists have begun to consider the relevance of artificial empathy to robotics (e.g. Damiano et al. 2014), but this research is at an early stage. In the meantime, robots' lack of understanding of children's behaviour provides a major stumbling block for suggestions that robots will be able to replace human teachers any time soon.

As well as deficits in moral understanding, robots are also not necessarily fair and unbiased. Because robots are developed and programmed by humans, they can exhibit technological bias. Forms of technological bias were already being discussed nearly two decades ago (Friedman and Nissenbaum 1996). The idea was illustrated in 2009 by reports showing that Hewlett-Packard webcams' face tracking algorithms worked only with white faces, and not with black faces (the problem was subsequently fixed). Ensuring that a robot treats all children equally requires the developers and programmers of the robot to be aware of possible inequalities that could result from the robot's behaviour or sensors. Hewlett-Packard is unlikely to have intended their face tracking algorithm to be racist; the developers had just failed to notice that the algorithms they were using did not perform well with black and darker skin. It is possible to imagine other forms of bias that a robot might show, if they were not anticipated by its programmers and developers. For instance, any speech recognition systems they use are likely to perform better for children without strong regional accents, or dialects.
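One routine safeguard against the kind of bias described here is to audit a recogniser's accuracy separately for each group of users. A minimal sketch, with invented data standing in for per-child speech recognition outcomes (grouped by accent only for the purpose of the audit):

```python
def accuracy_by_group(outcomes):
    """outcomes maps a group label to a list of booleans (True where
    the recogniser got it right). Returns per-group accuracy and the
    gap between best- and worst-served groups. Illustrative sketch."""
    rates = {group: sum(results) / len(results)
             for group, results in outcomes.items() if results}
    gap = max(rates.values()) - min(rates.values())
    return rates, gap

rates, gap = accuracy_by_group({
    "no_regional_accent":     [True, True, True, True, False],
    "strong_regional_accent": [True, False, True, False, False],
})
print(rates)  # a large gap signals a bias the developers must address
print(f"accuracy gap: {gap:.2f}")
```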


As well as questions about a robot's ability to make appropriate decisions, robot teachers would also give rise to legal issues about accountability. Teachers need to be able to reward and punish the behaviour of children in the classroom. Under the Children Act 1989, teachers have a duty of care towards their pupils, a concept referred to as 'in loco parentis' that has evolved through legal precedent. Legally, while not bound by parental responsibility, teachers must behave as any reasonable parent would do in promoting the welfare and safety of children in their care. The principle of 'in loco parentis' can be used to justify a teacher's reasonable use of punishment, although corporal punishment in schools has been outlawed in most of Europe for some time. Questions about legal responsibility and robots are complex and increasingly discussed (Asaro 2012). It is unlikely that the 'in loco parentis' principle would be applied to a robot, but a robot engaged in teaching activity would need recourse to some forms of sanction. Apart from rewarding or punishing behaviour, a robot teacher might need to prevent a child from performing dangerous actions, or from hurting their classmates, or injuring the robot. It is not clear what kinds of sanctions a robot could acceptably use. It might be that such questions mean that a robot could not feasibly be left in charge of a classroom of children, and would always need to be able to rely on a human supervisor to maintain classroom control.

Different roles and scenarios for classroom robots do create differing perspectives on these questions about control and accountability, and about decisions about what to teach. They are particularly salient when considering the possibility of an autonomous robot teacher (Scenario 1). A robot teacher could be programmed to teach on a particular topic, or to follow a given curriculum. However, a human teacher will continuously make decisions about when and how to teach something, adjusting their delivery in response to their understanding of the situation and the audience. A robot, for reasons discussed above, is unlikely to be able to do this. In addition, in order to function as a classroom teacher, a robot would have to be able to control and make decisions about children's behaviour in the classroom. The argument is made here, and elsewhere, that robots do not have the necessary moral and situational understanding to be able to adequately, or acceptably, fulfil this role.

Control and autonomy are less of a concern in the case of the Telepresence robot in Scenario 4, since a human operator, or operators, will presumably be involved; although the extent to which the remote teacher is distanced from the classroom situation is likely to limit their awareness of what is going on in the classroom. When the robot is presented as a companion or peer (Scenarios 2 and 3), it is not seen as being in a position of authority, and there is less reason to be concerned about questions of control and autonomy. Nonetheless, if companion robots are to be used for teaching purposes, there is still a need to think carefully about any delegation of decision making capabilities. Even a robot presented as a companion could be required to make some decisions about a child's learning, or performance. Care needs to be taken to ensure that any such decisions are ones that it is appropriate for a robot to make. In other words, it should be clear that the decisions are made by programmed algorithms, and not the result of human-like judgement.
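What such an explicitly programmed decision might look like is sketched below. The rule, threshold and function names are invented for the illustration and do not describe any existing classroom robot; the point is only that a delegated decision can take the form of a fixed, inspectable algorithm, publishable to teachers and parents, rather than a simulation of human judgement.

```python
MASTERY_THRESHOLD = 0.8  # fraction of recent answers that must be correct
WINDOW_SIZE = 10         # number of most recent answers considered

def ready_for_next_unit(recent_answers):
    """Decide whether a child advances to the next vocabulary unit.

    `recent_answers` is a list of booleans (True = correct answer).
    The rule is deliberately simple: a fixed threshold over a fixed
    window, with no hidden model of the child, so anyone can check
    why a particular decision was made.
    """
    window = recent_answers[-WINDOW_SIZE:]
    if len(window) < WINDOW_SIZE:
        return False  # too little evidence yet; stay on the current unit
    return sum(window) / WINDOW_SIZE >= MASTERY_THRESHOLD
```

Whether even a rule this simple is an appropriate thing to delegate to a robot is precisely the question at issue; the sketch shows only that, where decisions are delegated, they can at least be made legible.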


Reasons in favour of robot teachers

Although we have identified and discussed the main ethical concerns associated with the introduction of robots in the classroom in terms of four different scenarios, we have not yet considered the arguments that could be made in favour of classroom robots. Perhaps the ethical concerns raised here could be outweighed by compelling reasons in favour of deploying robots. In order to address this possibility, we consider the main arguments and reasons for replacing humans with robots in social roles, and the extent to which they apply equally, or differently, to the four classroom scenarios.

There are at least five general arguments that have been made in favour of the use of robots in society. First, it is often suggested that robots are particularly appropriate for situations that involve tasks that are dangerous, dirty or dull for humans to undertake, and that by taking on such tasks robots could free up humans for more interesting and rewarding activities (Takayama et al. 2008). A second reason for placing robots in social roles would be if they were found to outperform humans. For instance, if it were shown that children generally learned better from robot teachers than they did from human teachers, this could be a reason in favour of their adoption. A third reason for turning to robots is when they can offer something that would not otherwise be available. A fourth reason for deploying robots is as a signal that the organisation deploying them is technologically advanced and 'cutting edge'. A fifth reason in favour of replacing humans with robots is an economic one, based on claims that they will be more cost effective than the human alternative.

Do any of these reasons provide compelling justifications for the introduction of teaching robots into the classroom? Several do not stand up to much scrutiny. The first does not seem particularly relevant to teaching. Few would see the teacher's role as being so dangerous, dirty or dull for humans that we need to replace them with robots, as in Scenario 1. There is generally no shortage of people wanting to become teachers. If, in the future, teaching came to be perceived as a boring activity best left to robots, this would not augur well for the future of humanity.

The second reason depends on finding robot teachers to be better than human ones, and is one that is particularly relevant to Scenario 1, and the idea of a robot replacing the classroom teacher. Given the limited ability of robots to have a good understanding of what is going on in the classroom and in children's minds, this is unlikely to be the case in the near future. So far, research on robots in the classroom does not usually involve a comparison between the effectiveness of robots and humans in conveying information. In much of the research reviewed in the section on "Current robots in the classroom" (e.g. Movellan et al. 2009; Kanda et al. 2004, 2007) the concern was to show that children can learn from a robot and accept it in the classroom, and not to compare the robots' effectiveness to that of human teachers. When a small-scale comparison to human teachers was undertaken (Rostanti 2015), the robot did not fare well. Claims that robot teachers will be more motivating and effective for students than humans need to be backed up by convincing evidence, and that evidence is not yet available. The possibility of robots making fairer decisions in the classroom than humans was discussed in the "Control and accountability" section and argued to be an unlikely one.

The third reason is more viable, as there are situations and scenarios in which classroom robots could conceivably offer something otherwise unavailable. Telepresence robots (Scenario 4), for instance, can be used to enable such learning experiences. The EngKey robot reviewed earlier was being used to give South Korean students access to English tutors in the Philippines. Likewise, a robot companion (Scenario 2) could augment a human teacher's lessons by providing some individual coaching. Children may even be more willing to admit their lack of understanding to a robot than to a human. Similarly, they might prefer to practise speaking a foreign language with a robot companion than with a person. As discussed in the "Current robots in the classroom" section, an effective use of robots in the classroom that is beginning to emerge is when they are presented as a peer in need of help (Scenario 3, the Care-eliciting companion), so that the child has to teach the robot something. This was the case in the study described earlier by Tanaka and Matsuzoe (2012). Their preliminary results, and those of Hood et al. (2015), suggest that this approach can work well, since the robot can be programmed to seem to need help from even a struggling student, thereby giving that student a rewarding feeling of competence.
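One way such behaviour might be programmed is to tie the robot's displayed competence to a running estimate of the child's own success rate, so that the robot always lags slightly behind and leaves mistakes for the child to correct. The sketch below is a speculative illustration of this idea, not the mechanism used by Tanaka and Matsuzoe (2012) or Hood et al. (2015); the lag parameter and function are invented for the example.

```python
import random

def robot_answers_correctly(child_success_rate, lag=0.2):
    """Decide whether the care-receiving robot gets the next item right.

    The robot's probability of success tracks the child's estimated
    success rate minus a fixed lag, so that even a struggling child
    sees the robot make more mistakes than they do, and so always has
    something to teach it.
    """
    probability = max(0.0, child_success_rate - lag)
    return random.random() < probability

# A child getting roughly 60% of items right would see the robot
# succeed on roughly 40%, preserving the child's role as the tutor.
```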
The fourth reason for using robots, as a means of indicating the technological sophistication of the school or educational establishment, may reflect the underlying motivations for such developments, but is of questionable value. Ensuring that children have some knowledge and experience of robots may well be a good thing, but it is important to critically evaluate the evidence about the extent to which robots can be used to enhance and facilitate learning before diverting too much of the limited educational funding budget towards them. This argument applies to all four scenarios discussed here. This leads us to the fifth reason. Replacing human teachers or assistants with robots because they are more cost effective is surely not something to be encouraged. Governments and local authorities might see some advantages to employing robot teachers; they would not demand pay, or strike, or complain about being asked to follow a prescribed curriculum. However, to be justifiable there would need to be good evidence of the robots' adequacy and competence for the role as compared to human teachers. This argument applies to all four of the scenarios we have considered. The cost effectiveness argument is one that may increasingly be made about robotics in various domains. It is to be hoped that discussions such as this one, which highlight the associated ethical concerns, will help to reinforce and strengthen the arguments against such developments, and to ensure that robots are only introduced in situations where they can be shown to lead to an improvement in the human condition.

Most of the reasons we have considered here are not found to be good ones. Teaching is not a dangerous, dirty or dull task for which robots could appropriately replace humans. There is no compelling evidence that robots are better than humans at teaching children. The economic reason is not a powerful one unless the robots were shown to outperform humans, and the same is true for their use as a signal of the technological sophistication of the school or organisation.

The most convincing reason then in favour of robots in the classroom is that they can sometimes offer a beneficial educational experience that might otherwise not be available. This might be the case for the companion robots in Scenarios 2 and 3, and the Telepresence robots in Scenario 4. Generally, it makes sense to use robots in circumstances in which they can offer people access to resources and abilities that would not otherwise be realisable, rather than in situations where they are being used to replace competent humans. A related argument was made in the context of robots for older people, in favour of deploying robots and robotic technology that expanded the set of capabilities accessible to them (Sharkey 2014).

Conclusions

Now that we have considered the main ethical issues raised by, and the reasons in favour of, classroom robots, some implications about the relative acceptability of the four


classroom robot scenarios can be drawn. These conclusions are based on the current and likely near future abilities of social robots, and it is acknowledged that they might need to be revisited if robots with significantly greater abilities are developed.

There are reasons to support the use of Telepresence robots (Scenario 4) when they are used to provide educational opportunities that would otherwise be inaccessible. For instance, they could be used to facilitate children's access to remote skilled teachers unavailable in their school. Their use as a cost-cutting measure should still be viewed with suspicion, and they do give rise to concerns about privacy and sharing of information, but nonetheless they could usefully supplement regular classroom teaching in some circumstances. Their use to facilitate contact with teachers and speakers of a foreign language seems appropriate, and if they are deployed in a classroom in which a human teacher is also available, there would be less need to be concerned about the issues of control and autonomy, and attachment and deception.

Companion and peer robots designed to foster implicit learning (Scenarios 2 and 3) seem quite likely to appear in schools, because they can function under the auspices of the human teacher without the need to control the classroom, or to appear fully competent. If such robots are to be welcomed, their welcome should be a cautious one, because of the need to establish the educational effectiveness of such measures, particularly when compared to cheaper alternatives such as educational software and virtual coaches. In addition, since such robots masquerade as children's friends, there are concerns about the extent to which they would violate their privacy, and a risk that they would have a deleterious impact on their learning about social relationships. Nonetheless, if concerns about privacy and social relationships were addressed, it is possible that such robots could be used to offer new educational opportunities. For example, the idea of developing a care-eliciting robot that encourages children to teach it new concepts or skills (and thereby reinforce their own learning) seems a promising one. Similarly, companion robots could be developed to provide individualised practice for children on tasks that require repetition (and that might be too dull or time consuming for human teachers). It also seems plausible that children might be more willing to admit a lack of understanding, or a need for repeated presentation of material, to a robot than to a human adult.

The use of fully fledged robot teachers (the extreme of Scenario 1) is surely something that should not be encouraged, or seen as a goal worth striving for. There seems no good reason to expect that robot teachers would offer extra educational benefits over a human teacher. It is also apparent that robot teachers will not be able to form adequate replacements for humans in the near future. Robots are unlikely to have the ability to keep control of a room full of children in the absence of a human teacher (except in a nightmare situation where they could administer physical restraint and punishment to make up for their own shortcomings). A robot could be programmed to deliver educational material, but it is not at all clear that children would learn that material once the initial novelty of the robot teacher had worn off. In addition, even if it were possible to program robots to deliver a curriculum, that would not make them good teachers. A good teacher should be able to identify the zone of proximal development for a child, and be able to teach them just what they need to know, just when they need to know it (Pelissier 1991). As discussed by Sharkey (2015), a robot is unlikely to be able to determine the relevant information to teach to a student in any meaningful way. As non-humans, how could robots determine what human children need to know, or have the intention to pass on the information that is needed to accomplish the tasks required in human culture (Kline 2015)? First and foremost, children need to be taught by fellow human beings who understand them, care for them, and who form appropriate role models and attachment figures.

Acknowledgments This work was partially supported by the European Union Seventh Framework Programme (FP7-ICT-2013-10) under grant agreement No. 611971.

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

References

Ainsworth, M. D. S. (1973). The development of infant–mother attachment. In B. Caldwell & H. Ricciuti (Eds.), Review of child development research (Vol. 3, pp. 1–94). Chicago: University of Chicago Press.
Arkin, R. (2009). Governing lethal behavior in autonomous robots. Boca Raton: Chapman and Hall/CRC.
Asaro, P. (2012). A body to kick, but still no soul to damn: Legal perspectives on robotics. In P. Lin, K. Abney, & G. A. Bekey (Eds.), Robot ethics: The ethical and social implications of robotics (pp. 169–186). London: MIT Press.
Bartneck, C., & Hu, J. (2008). Exploring the abuse of robots. Interaction Studies: Social Behaviour and Communication in Biological and Artificial Systems, 9, 415–433.
Benitti, F. B. V. (2012). Exploring the educational potential of robotics in schools: A systematic review. Computers and Education, 58(3), 978–988.
Bergin, C., & Bergin, D. (2009). Attachment in the classroom. Educational Psychology Review, 21, 141–170.
Borenstein, J., & Pearson, Y. (2013). Companion robots and the emotional development of children. Law, Innovation and Technology, 5(2), 172–189.


Bowlby, J. (1969). Attachment and loss: Volume 1: Attachment. London: Hogarth Press.
Brščić, D., Kidokoro, H., Suehiro, Y., & Kanda, T. (2015). Escaping from children's abuse of social robots. In Proceedings of the ACM/IEEE international conference on human-robot interaction (pp. 59–66).
Calo, M. R. (2012). Robots and privacy. In P. Lin, K. Abney, & G. A. Bekey (Eds.), Robot ethics: The ethical and social implications of robotics (pp. 187–202). London: The MIT Press.
Carr, N. (2015). The glass cage: Where automation is taking us. London: Bodley Head.
Churchland, P. S. (2011). Braintrust: What neuroscience tells us about morality. Oxford: Princeton University Press.
Coeckelbergh, M. (2010). Health care, capabilities, and AI assistive technologies. Ethical Theory and Moral Practice, 13(2), 181–190.
Damiano, L., Dumouchel, P., & Lehmann, H. (2014). Artificial empathy: An interdisciplinary investigation. International Journal of Social Robotics, 7(1), 3–5.
Dietvorst, B., Simmons, J., & Massey, C. (2015). Algorithm aversion: People erroneously avoid algorithms after seeing them err. Journal of Experimental Psychology: General, 144(1), 114–126.
Epley, N., Akalis, S., Waytz, A., & Cacioppo, J. T. (2008). Creating social connection through inferential reproduction: Loneliness and perceived agency in gadgets, gods, and greyhounds. Psychological Science, 19, 114–120.
Epley, N., Waytz, A., & Cacioppo, J. T. (2007). On seeing human: A three-factor theory of anthropomorphism. Psychological Review, 114(4), 864–886.
Eurobarometer 382. (2012). Public attitudes towards robots. Brussels: European Commission.
Friedman, B., & Kahn, P. H. (1992). Human agency and responsible computing: Implications for computer system design. Journal of Systems and Software, 17(1), 7–14.
Friedman, B., & Nissenbaum, H. (1996). Bias in computer systems. ACM Transactions on Information Systems (TOIS), 14(3), 330–347.
Gaudiello, I., Zibetti, E., Lefort, S., Chetouani, M., & Ivaldi, S. (submitted). Trust as indicator of robot functional and social acceptance: An experimental study on user conformation to the iCub's answers. arXiv:1510.03678 [cs.RO].
Han, J., Jo, M., Park, S., & Kim, S. (2005). The educational use of home robots for children. In Proceedings of the 14th IEEE international workshop on robot and human interactive communication (RO-MAN 2005) (pp. 378–383). Piscataway, NJ: IEEE.
Han, J. (2012). Emerging technologies: Robot assisted language learning. Language Learning and Technology, 16(3), 1–9.
Hashimoto, T., Kato, N., & Kobayashi, H. (2011). Development of educational system with the android robot SAYA and evaluation. International Journal of Advanced Robotic Systems, 8(3), 51–61 (Special issue: Assistive robotics).
Heyns, C. (2013). Report of the Special Rapporteur on extrajudicial, summary or arbitrary executions. United Nations, A/HRC/23/47.
Hood, D., Lemaignan, S., & Dillenbourg, P. (2015). When children teach a robot to write: An autonomous teachable humanoid which uses simulated handwriting. In Proceedings of HRI '15, March 2–5, 2015, Portland, OR, USA.
Jung, C. (1953). The development of personality. Collected works of C. G. Jung (Vol. 17). Princeton, NJ: Princeton University Press.
Kanda, T., Hirano, T., Eaton, D., & Ishiguro, H. (2004). Interactive robots as social partners and peer tutors for children: A field trial. Human–Computer Interaction, 19, 61–84.
Kanda, T., Sato, R., Saiwaki, N., & Ishiguro, H. (2007). A two-month field trial in an elementary school for long-term human–robot interaction. IEEE Transactions on Robotics, 23(5), 962–971.
Kline, M. A. (2015). How to learn about teaching: An evolutionary framework for the study of teaching behavior in humans and other animals. Behavioural and Brain Sciences, 38, 1–17.
Koenig, M., & Sabbagh, M. A. (2013). Selective social learning: New perspectives on learning from others. Developmental Psychology, 49, 399–403.
Komatsubara, T., Shiomi, M., Kanda, T., Ishiguro, H., & Hagita, N. (2014). Can a social robot help children's understanding of science in classrooms? In Proceedings of the second international conference on human–agent interaction (pp. 83–90).
Levy, D. (2007). Love + sex with robots: The evolution of human–robot relationships. London: Gerald Duckworth & Co. Ltd.
Lin, P., Abney, K., & Bekey, G. A. (2012). Robot ethics: The ethical and social implications of robotics. London: MIT Press.
Meah, L. F. S., & Moore, R. K. (2014). The uncanny valley: A focus on misaligned cues. In M. Beetz, B. Johnston, & M. Williams (Eds.), Social robotics (LNAI Vol. 8755, pp. 256–265).
Movellan, J., Eckhart, M., Virnes, M., & Rodriguez, A. (2009). Sociable robot improves toddler vocabulary skills. In Proceedings of the 2009 international conference on human robot interaction (HRI 2009).
Mubin, O., Stevens, C. J., Shahid, S., Al Mahmud, A., & Dong, J. J. (2013). A review of the applicability of robots in education. Technology for Education and Learning, 1, 1–7.
Mutlu, B., & Szafir, D. (2012). Pay attention! Designing adaptive agents that monitor and improve user engagement. In Proceedings of Human Factors in Computing (CHI 2012).
Park, S., Han, J., Kang, B., & Shin, K. (2011). Teaching assistant robot, ROBOSEM, in English class and practical issues for its diffusion. In Proceedings of the workshop on advanced robotics and its social impacts. http://www.arso2011.org/papers
Pelissier, C. (1991). The anthropology of teaching and learning. Annual Review of Anthropology, 20, 75–95.
Rostanti, S. (2015). Can a robot teach? University of Sheffield, Department of Computer Science undergraduate dissertation.
Sabbagh, M. A., & Shafman, D. (2009). How children block learning from ignorant speakers. Cognition, 112, 415–422.
Sharkey, A. (2014). Robots and human dignity: The effects of robot care on the dignity of older people. Ethics and Information Technology, 16(1), 53–75.
Sharkey, A. (2015). Robot teachers: The very idea! Behavioural and Brain Sciences, 38, 46–47.
Sharkey, N., & Sharkey, A. (2006). Artificial intelligence and natural magic. Artificial Intelligence Review, 25, 9–19.
Sharkey, N. E., & Sharkey, A. J. C. (2010). The crying shame of robot nannies: An ethical appraisal. Interaction Studies, 11(2), 161–190.
Sharkey, A., & Sharkey, N. (2011). Children, the elderly, and interactive robots. IEEE Robotics and Automation Magazine, 18(1), 32–38.
Sharkey, A. J. C., & Sharkey, N. E. (2012). Granny and the robots: Ethical issues in robot care for the elderly. Ethics and Information Technology, 14(1), 27–40.
Sparrow, R. (2002). The march of the robot dogs. Ethics and Information Technology, 4, 305–318.
Sparrow, R., & Sparrow, L. (2006). In the hands of machines? The future of aged care. Minds and Machines, 16, 141–161.
Takayama, L., Ju, W., & Nass, C. (2008). Beyond dirty, dangerous and dull: What everyday people think robots should do. In Proceedings of Human Robot Interaction 2008, March 12–15 (pp. 25–32).
Tanaka, F., Cicourel, A., & Movellan, J. R. (2007). Socialization between toddlers and robots at an early childhood education center. Proceedings of the National Academy of Sciences, 104(46), 17954–17958.
Tanaka, F., & Kimura, T. (2009). The use of robots in early education: A scenario based on ethical consideration. In Proceedings of the 18th IEEE international symposium on robot and human interactive communication (RO-MAN 2009) (pp. 558–560).


Tanaka, F., & Matsuzoe, S. (2012). Children teach a care-receiving robot to promote their learning: Field experiments in a classroom for vocabulary learning. Journal of Human–Robot Interaction, 1(1), 78–95.
Tanaka, F., Takahashi, T., Matsuzoe, S., Tazawa, & Morita, M. (2013). Child-operated telepresence robot: A field trial connecting classrooms between Australia and Japan. In Proceedings of the IEEE/RSJ international conference on intelligent robots and systems (IROS 2013), Tokyo, Japan, November 2013 (pp. 5896–5901).
Van Wynsberghe, A. (2013). Designing care robots for care: Care centered value-sensitive design. Science and Engineering Ethics, 19(2), 407–433. doi:10.1007/s11948-011-9343-6.
Wallach, W., & Allen, C. (2009). Moral machines: Teaching robots right from wrong. New York: Oxford University Press.
Winfield, A. F., Blum, C., & Liu, W. (2014). Towards an ethical robot: Internal models, consequences and ethical action selection. In M. Mistry, A. Leonardis, M. Witkowski, & C. Melhuish (Eds.), Advances in autonomous robotics systems: Proceedings of the 15th annual conference, TAROS 2014, Birmingham, UK, 1–3 September (pp. 85–96).
Yamaoka, F., Kanda, T., Ishiguro, H., & Hagita, N. (2007). Interacting with a human or a humanoid robot? In Proceedings of the IEEE/RSJ international conference on intelligent robots and systems (IROS 2007).
Yun, S., Shin, J., Kim, D., Kim, C. G., Kim, M., & Choi, M. T. (2011). EngKey: Tele-education robot. In B. Mutlu et al. (Eds.), Social robotics: Proceedings of the third international conference on social robotics (LNAI 7072, pp. 142–152).
Zhang, J., & Sharkey, A. (2012). It's not all written on the robot's face. Robotics and Autonomous Systems, 60(11), 1449–1456.
