

The British sci-fi TV series Humans, which has just ended its first run, has created a bit of a stir with its depiction of domesticated androids. In the show robotic technology has advanced to the point where “synths” are almost indistinguishable from human beings in their physical appearance and are competent enough to perform many lower-status tasks. It is also implied that they could probably surpass humans in most other areas as well.

The show has probably hit a nerve partly because of the clever way it plays with the “uncanny valley”, and partly because of the highly ambiguous role the synths play: it is never quite clear to what extent they are threatening, oppressed or caring.

As with much sci-fi, this does not so much represent a realistic future for technology as an extrapolation of our current hopes and fears.

One of the most interesting relationships is that between a retired scientist, played by William Hurt, and his robotic carer. This synth has been forced on him by the health service and is played with a perfect Nurse Ratched-style authoritarianism by Rebecca Front.

There is a lot of optimism about the potential of caring technologies, and one of the most significant areas for this is the care of older people. Many devices and systems have been proposed and introduced which will purportedly ensure safety, reduce isolation and increase independence. Some of the more commonplace are sensors (such as those designed to detect falls) which automatically alert a call centre.

The robotic carer in Humans always ensures her charge complies with his medication regime, prepares nutritionally appropriate meals (rather than the ones he wants) and physically forces him back into the house when he leaves at night.

Unlike the other characters, who get domestic help from their “dollies”, the scientist is not the “primary user” of his, as it is provided by the health authority. This means that his synth is programmed to obey their orders rather than his. In the show this situation is presented as a means for bureaucratic control to extend more fully into the home.

In other popular culture, if a family member or professional carer engaged in behaviour similar to that of the synth carer in Humans, they would probably be portrayed as benevolent, or at worst fussy and annoying, rather than creepy and controlling. But as Hurt’s character tells the robot:

“You’re not a carer, you’re a jailer.”

While lifelike robots capable of complex human interaction, and others useful as domestic servants, are currently being developed, the scenario in Humans is perhaps most useful as an allegory of our current relation with caring technology and the fears people have of it.

When security is provided through monitoring technologies, this can feel like surveillance. Similarly, when isolation is ameliorated through emergency call centres, some people become dependent on these for social interaction and trigger the emergency systems just to talk to someone, as Maggie Mort has previously observed.

I do not believe that care can never be delivered by non-human means, but care itself is a much more complex and delicate relation than some of the proponents of these devices would have us believe.

What the fictional and real-life cases of these relationships reflect is an inherent tension in the caring relationship: care is not always welcomed. As anyone who has ever performed any caring activity will know, the person being cared for does not always welcome it, even if it is what is best for them. Therefore, care cannot simply be delivered from one person (or thing) to another; it is a negotiated, processual relationship.

For this reason feminist thinkers have emphasised that care must be seen as a material practice: it is a practical thing which we do, not an abstract feeling. The caring relationship also constitutes our identity and morality (Kittay et al., 2005). We learn what it means to be a moral person through caring and being cared for, because care is a process of ‘the weaving of a common fate with others’ (Adam and Groves, 2011: 23).

An important part of care is doing what is best for someone regardless of whether they want it or not. This is only acceptable when done through an attachment and relationship with another person. This is not to say that it is impossible to form attachments to machines, but these are certainly of a different kind from those with other humans and animals.

Prior to being given his Nurse Ratched, Hurt’s character had a less sophisticated, defective carer (Odi) who was much loved because he served as an external replacement for Hurt’s own failing memory, in particular helping him to feel more connected to his dead wife. Hurt says:

“When I look at him I see the years of care he has given me. He can’t love me but I see the love looking back at me. He carried memories for me when I couldn’t.”

Despite his technical deficiencies, Odi was a more effective carer than the more sophisticated robot.

Our behaviour towards technologies needs to be reflected on as much as the capacities of the technologies themselves. Indeed, perhaps it is the robots which need to be concerned. A recent Canadian experiment to investigate robot–human relations ended in tragedy when “hitchBOT” was sent out onto the road to hitch rides from drivers in order to experience human life and interact with people. Sadly, the robot was ultimately found decapitated and no longer functional.


Adam, B. and Groves, C. (2011) ‘Futures Tended: Care and Future-Oriented Responsibility’, Bulletin of Science, Technology & Society 31(1): 17-27.

Kittay, E.F. with Jennings, B. and Wasunna, A.A. (2005) ‘Dependency, Difference and the Global Ethic of Longterm Care’, The Journal of Political Philosophy 13(4): 443-469.