Human-Robot Interaction acceptance variables

The variables examined below fall into four groups:
- utilitarian factors that determine the acceptance or use of a robot;
- hedonic factors, or intrinsic motivations, in robot acceptance;
- social beliefs and social factors influencing acceptance;
- contextual factors that play a role when a robot is used.
The Perceived Ease Of Use (PEOU) has been discussed since the first technology acceptance models, such as Davis’ TAM (1989). PEOU is a determinant of the Intention to Use (ITU), as validated by the studies of Lee et al. (2003) and Yuanquan et al. (2008). The construct also appears in the UTAUT, where it is defined as Effort Expectancy (EE). According to some studies (Davis, 1993; Venkatesh et al., 2003), the perception of ease of use (PEOU) can influence perceived usefulness (PU).
That the Intention to Use (ITU) is a reliable indicator of actual use (USE) is a fundamental assumption of any acceptance model (Lee et al., 2003).
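Taken together, these paragraphs describe a simple path structure (PEOU → PU, PEOU → ITU, PU → ITU, ITU → USE). As a minimal sketch of how such hypotheses are commonly tested, the snippet below fits that path model with the Python package semopy; the CSV file and column names are illustrative assumptions, with each column holding one respondent's aggregated score for one construct.

```python
# Sketch: estimating the TAM-style path structure described above
# (PEOU -> PU, PEOU -> ITU, PU -> ITU, ITU -> USE) with semopy.
import pandas as pd
import semopy

# Each row: one respondent; each column: a construct score
# (e.g. the mean of that construct's Likert items).
# "acceptance_survey.csv" is a hypothetical file name.
data = pd.read_csv("acceptance_survey.csv")

model_desc = """
PU  ~ PEOU
ITU ~ PEOU + PU
USE ~ ITU
"""

model = semopy.Model(model_desc)
model.fit(data)
print(model.inspect())  # path coefficients and significance levels
```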
Social Influence (SI), or social norm, did not prove reliable in Heerink’s (2010) experiments; nevertheless, according to McFarland & Hamilton (2006), it is a determining element of the intention to use (ITU) and therefore of actual use (USE). It is likely that this construct can be included in experiments in relation to enjoyment or attitude and, therefore, that it can influence the pleasure felt while using the robot or the users’ attitude towards it.
Facilitating Conditions (FC), as factors facilitating the use of the technology, are only relevant when the technology is actually available to users. They influence PU and PEOU as well as ITU and, hence, actual use. They may vary according to the users: according to Chen & Chan (2014), for elderly users they are specific and particular, as they include factors such as health status, cognitive and physical-functional abilities, social relationships, etc.
Perceived Enjoyment (PENJ) is the pleasure experienced during interaction with a robot or a social and companion agent, and can influence acceptability as a determining factor of the ITU (Sun & Zhang, 2006). According to Van der Heijden (2004), the use of hedonic systems for entertainment is strongly influenced by the pleasure of use; even if assistive robots for the elderly are not specifically designed for entertainment, the pleasure of use combined with entertainment can facilitate their interaction with humans, becoming a relevant factor even for more utilitarian systems.
Perceived Adaptiveness (PAD), the degree to which a robotic system can adapt to the different needs of older users, influences PU and thus acceptance. As stated by Forlizzi et al. (2009), older people relate to new technological systems in specific ways, and their needs and expectations change over time: to ensure that people remain independent at home for as long as possible, it is important that these systems are flexible enough to adapt to these changes, supporting “the ecology of ageing”.
Perceived Behavioral Control (PBC) is users’ perception of their ability to carry out a certain behaviour and is determined, according to Ajzen (1991), by the set of control beliefs, i.e. beliefs about the presence of factors that may facilitate or hinder the execution of the behaviour. PBC is a predictor of behaviour, as it influences perceived ease of use, intention to use and actual use (Venkatesh, 2000).
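In Ajzen’s expectancy-value formulation this can be summarized compactly; the notation below follows the standard Theory of Planned Behaviour presentation:

```latex
% Expectancy-value formulation of Perceived Behavioral Control (Ajzen, 1991):
%   c_i = strength of control belief i
%   p_i = perceived power of control factor i to facilitate or impede the behaviour
PBC \propto \sum_{i=1}^{n} c_i \, p_i
```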
Related experiences (EXP) are past experiences with robots, both direct and indirect (through media, articles and other sources). They can influence the intention to use and actual use (Bartneck et al., 2006).
Attractiveness (ATR) is understood as the positive evaluation of the robot resulting from the combination of pragmatic and hedonic elements (Hassenzahl et al., 2003). It is among the variables that most influence acceptability and perceived usability (Gerlowska et al., 2018).
Likeability (LIKE) is introduced as an influential factor for the acceptability and quality of human-robot interaction in the Godspeed questionnaire by Bartneck et al. (2009). The likeability dimension rests on the importance of first impressions in forming judgements in human relations: it is strongly influenced by the behaviour perceived through live interaction and voice communication. It has been shown, in fact, that positive first impressions lead to a more positive evaluation of a person (Robbins et al., 1994). This also applies to first impressions of robots, since robots are treated as social entities (Reeves & Nass, 1996).
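To make the measurement concrete: Godspeed dimensions such as likeability are rated on five-point semantic differential items, and a participant’s dimension score is typically the mean of those items. The sketch below illustrates this scoring for the likeability items; the variable names are illustrative.

```python
# Sketch: scoring Godspeed-style semantic differential ratings.
# The Godspeed likeability dimension uses five 1-5 items
# (dislike/like, unfriendly/friendly, unkind/kind,
#  unpleasant/pleasant, awful/nice).
from statistics import mean

# One participant's ratings, keyed by item (values: 1-5).
likeability_ratings = {
    "dislike_like": 4,
    "unfriendly_friendly": 5,
    "unkind_kind": 4,
    "unpleasant_pleasant": 4,
    "awful_nice": 5,
}

# Dimension score = mean of its items, as is common for Godspeed data.
likeability_score = mean(likeability_ratings.values())
print(f"Likeability: {likeability_score:.2f} / 5")
```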
Perceived security (SEC) is a fundamental element of human-robot interaction, in the industrial and health sectors as well as in domestic and private settings. According to Bartneck et al. (2009), perceived security describes the user’s perception of the level of danger and the level of comfort experienced during the interaction. This is essential if robots are to be accepted as co-workers or as valuable service tools.
Robustness (ROB), together with learnability, is analyzed by Weiss et al. (2009) within the USUS model. This variable indicates the level of support provided to the user to enable the successful completion of tasks and objectives. Inexperienced users are likely to make mistakes when interacting with assistive robots: an effective HRI should allow people to correct these errors on their own, and the robotic system itself should prevent errors by being reactive and safe.
Perceived Usefulness (PU) has been a key variable since the earliest technology acceptance models, such as the TAM developed by Davis in 1989. PU is a key determinant of the Intention to Use (ITU), as validated by the studies of Lee et al. (2003) and Yuanquan et al. (2008). PU is also present in the UTAUT, where it is called Performance Expectancy (PE). According to some studies (Davis, 1993; Venkatesh et al., 2003), the perception of ease of use (PEOU) can influence perceived usefulness (PU). Moreover, attitudes towards the use of technology (Attitude, ATT) are a determinant of perceived usefulness (PU) according to some authors (Yang & Yoo, 2004).
Attitude (ATT) can have a direct influence on Intention to Use (ITU) and Perceived Usefulness (PU), according to Yang & Yoo (2004). These researchers further distinguished between an affective attitude (how much one likes an object), which has little influence on acceptance, and a cognitive attitude (which depends on an individual’s internal and subjective beliefs about a specific object), which can be a determining and direct factor of the ITU, as confirmed by Wu & Chen (2005).
Anxiety toward robots (ANX) underlies the studies of Nomura et al. (2006) and the construction of the Robot Anxiety Scale (RAS), which show that this construct strongly influences interaction with robots. Anxiety is also crucial in the testing of acceptability models, although it is closely tied to the world of HCI: according to Montazemi et al. (1996), it can directly influence PEOU and PU. In HRI, the influence of ANX on PEOU has been verified by Nomura et al. (2008) but remains the subject of further research.
Trust (TRUST) is related to social skills (De Ruyter et al., 2005) and directly influences the ITU (Wu & Chen, 2005; Cody-Allen & Kishore, 2006). This means that a robot with strong social skills can gain more trust from users who, as a result, will rely more on its advice and give more weight to its claims. Trust has raised important ethical issues, especially in relation to older users: it can generate emotional attachment and, consequently, so-called emotional deception, i.e. over-reliance on the robot by individuals who risk exercising less and less judgement of their own (Sharkey & Sharkey, 2012).
Perceived Sociability (PS), i.e. the degree to which the user perceives the robot’s behaviour as social, was introduced by Heerink (2010), although previous studies (Forlizzi, 2007; Mitsunaga et al., 2008) had shown that, in the case of assistive robots, social skills influence the robot’s overall appreciation. This construct is influenced by TRUST.
Social Presence (SP) rests on the “media equation” (Reeves & Nass, 1996), according to which people tend to anthropomorphize machines and robots and treat them as social entities, as if they were people. Heider & Simmel (1944) had already shown that people attribute motivations, intentions and goals to inanimate objects based on their movement patterns.
Anthropomorphism (ANTR) influences the acceptability of robots (Duffy, 2003) as a determinant of PENJ: according to Duffy, anthropomorphic characteristics must be carefully balanced with the robot’s technological capabilities so as not to raise expectations that cannot be met. Moreover, anthropomorphism, if not excessive (and therefore not at risk of falling into the Uncanny Valley), can positively influence PU and PEOU, but also the Intention to Use (ITU) directly if people have previous experience with technology and robots (Goudey & Bonnin, 2016).
Realism (REAL) indicates the degree to which users believe that the robot behaves realistically. Realism manifests itself in behaviour, aesthetic appearance and visual fidelity to humans. It is as important as the robot’s social capabilities in determining the quality and type of human-robot interaction (Paauwe et al., 2015). Aesthetic realism concerns the physical appearance of the robot (embodiment) and relates to similarity, i.e. the extent to which a robot resembles the entity it embodies in the real world (anthropomorphism, zoomorphism, etc.) (Fong et al., 2003). Behavioural realism concerns the degree to which the robot’s behaviour corresponds to that of what it is meant to resemble. Higher behavioural realism leads to an increase in perceived sociability (Paauwe et al., 2015).
Animacy indicates the degree to which people think a robot looks “alive” (Bartneck et al., 2009). A robot’s perceived animacy can be influenced by perceived intelligence but also by social presence (Okita & Schwartz, 2006).
Perceived intelligence (INT) is part of the Godspeed questionnaire for HRI evaluation (Bartneck et al., 2009). The dimension of intelligence represents a great challenge for robotics and is necessarily related to artificial intelligence (AI), on whose knowledge and methodologies robots’ behaviours are based. The Wizard-of-Oz method is often used to simulate intelligent robotic behaviour during experiments, but the limits of robotic intelligence become evident during interaction in a real environment. When using a robot, users perceive its level of intelligence, which in turn influences their relationship with it. The perceived intelligence of a robot is likely to depend on its competence, knowledge and reactivity (Koda & Maes, 1996).
Learnability (LRN) is an indicator of usability originating from software engineering and introduced by Weiss et al. (2009) within the USUS model for HRI assessment. Learnability indicates the ease with which inexperienced users can learn to use a system or product. It is a key factor for usability, and therefore for the acceptability of assistive robots, since these are devices with which people usually have little experience. Learnability includes several elements, such as familiarity, consistency, predictability and ease of use, and also depends on the complexity of the system, the user’s personality and learning preferences, and perceived control (Neunast et al., 2010).
Dependability (DEP) is a determining factor for trust in HRI (Cramer et al., 2010). It is decisive for the acceptability of assistive robots by older people and includes the perceived reliability of the system, i.e. whether it works safely, quickly and accurately (Neunast et al., 2010).