News

Now that the entire book is here (see here), new posts will consist of discussion and debate about the book, the odd poem, and bits about ethics. Stay tuned, and feel free to go to the Contact page if you have anything to say about the discussion.

Edit: the book is now also available as an eBook on Amazon Kindle, in fully proofread and formatted form!


Open Access: Really?

There was a time when publishers ruled the roost. University libraries paid exorbitant fees to stock paper journals across shelf after shelf. Earnest academics spent hours retrieving journals from the stacks and laboriously photocopying them page by page. Authors received packages of paper offprints and posted them out in response to postcards from foreign lands appearing in their post-trays.

But with the arrival of electronic journals we rarely sit in libraries poring over stacks of paper. Shelves shrink, and where paper copies are still required we turn the handles of mobile storage units that have been moved to library basements. Now universities pay even larger fees to subscribe to electronic journals, or to packages of journals offered by publishers. The product model has become a subscription model. Access depends on being a member of staff or a student at an institution. Individuals and the wider public cannot see the fruits of academic labour.

Enter open access. Instead of the consumer paying, the provider pays. The market shifts. It looks, smells and tastes like vanity publishing but it isn’t … or is it?  The publishers require a profit. They draw their profit from institutions, primarily in the public sector.

Previously publishers marketed to libraries – sometimes indirectly, by getting us to recommend their journal to the library; sometimes directly, through special offers and trial subscriptions. But authors were then shielded from commercial pressures. As long as enough libraries subscribed to a journal to keep it viable, whether you were published depended solely on the refereeing process, flawed as it is. At least you had some chance of publication, provided you aligned with the views and direction of your academic discipline and your colleagues. No longer.

Open access brings with it a pernicious new market logic which could threaten academic integrity and independence. Institutions still pay, but they pay through their authors. So an author must find the money to pay for publication from somewhere. Either the institution must maintain a substantial slush fund to hand out to academics, or publication will be limited to those who can command funds from research bodies. For many jobbing academics, such funds are not available. So publication, the ultimate judge of productivity, depends on the wealth of the institution or the ability to conform to research council agendas. An underclass of academics emerges for whom open access publication is out of reach.

And as with any market, competition develops. The more prestigious publications will raise their page fees; you will have to be very rich to access the Rolls-Royce journals. Pressures will be applied to the editors and managers of journals. If an author is paying, she will want fast publication, so open access papers will be turned around much faster than traditional ones. I have seen open access papers appear which have been turned around in a month. However good the paper, it would take more than a month to read and understand it, let alone review it. Special offers start to appear: reduced fees for papers which the publisher thinks might get more hits; the more controversial the better. Open access publishing will come to look like Tesco or Walmart: publisher clubcards, special offers, money-off vouchers for a second publication, extra points for more citations.

So what is the answer? Open access looks like vanity publishing. But why not self-publish? Blogs are starting to be cited in papers. Ideas can be spread much more quickly. You don't have to wait years for referees to wield their swords, producing a paper which has had its guts removed to satisfy their whims. However, the clout of an editorial board and thoughtful feedback are important. Can we achieve these outside the grip of commercial publishers?

There are costs to publishing. Editorial assistants may need to be employed. There are the practical issues of proofing, formatting, structuring, managing the journal website, and publishing hard copies if they still exist. But much of the editorial process is administrative and mechanical. There is no reason why AI should not be employed in formatting, proofreading and, to some extent, even reviewing. Algorithms and agents could become publishers. Universities could host a small number of high-quality journals, publishing carefully selected papers in the disciplines and academic areas in which they excel, while recruiting external auditors to check for institutional nepotism.
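To show just how mechanical some of this is, here is a toy sketch in plain Python. The checks are a handful of illustrative patterns of my own devising, not a real proofreading engine, but they stand for the kind of pass that could run before a human editor ever sees a manuscript.

    import re

    # A few deliberately simple, mechanical copy-editing checks.
    # A real system would use far richer language tools.
    CHECKS = [
        (re.compile(r" {2,}"), "multiple consecutive spaces"),
        (re.compile(r"\b(\w+) \1\b", re.IGNORECASE), "repeated word"),
        (re.compile(r"\s+[,.;:]"), "space before punctuation"),
    ]

    def proof(text):
        # Return (line number, description, offending snippet) for each finding.
        findings = []
        for lineno, line in enumerate(text.splitlines(), start=1):
            for pattern, description in CHECKS:
                for match in pattern.finditer(line):
                    findings.append((lineno, description, match.group(0)))
        return findings

    sample = "The the results  were ,surprisingly, robust."
    for lineno, description, snippet in proof(sample):
        print(lineno, description, repr(snippet))

Trivial, of course, but it is exactly this layer of drudgery – consistency, formatting, reference checking – that machines can absorb, leaving editors and reviewers free to judge the ideas.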

Open access is not a gateway to freedom in publishing. Quite the contrary: it promises less access and offers a different constraint on the promulgation of ideas and diversification of research. It’s time for universities to take control of publishing.

Why Robots and Autism Don’t Mix

Robots are increasingly being used as replacements for human contact. Researchers are proposing autonomous robots for providing therapy to autistic children. This may be causing more harm than good. Indeed, certain uses of robots could cause long-term post-traumatic stress disorder.

At the University of Hertfordshire, a robot called Kaspar i is used to teach social skills. The child sits in front of the robot and is encouraged by researchers and carers to interact with it. For an autistic child already struggling to cope with sensory stimulation, this may only raise her anxiety to the point at which she hits out at the robot. ii

The European DREAM project iii promised to go further. The human therapist is to be replaced by robots which will train autistic children to behave in a socially acceptable way. The child will spend hours alone with the robot, practising repetitive behaviour dictated by the robot, which adjusts its responses using sensors in a smart room that detect the child's reactions.

The DREAM project uses a NAO robot to implement Applied Behaviour Analysis (ABA) iv, a controversial behavioural conditioning therapy which involves hours of repetition based on rewards and punishments. ABA seeks to shape external behaviour to be socially convenient. v It does not matter what is going on inside the child's mind or what the cause of the distressing behaviour is.

In the 1930s B. F. Skinner developed operant conditioning vi, demonstrated in the behavioural conditioning of starved rats in cages. This radical behaviourism vii is based on the dated idea that the goal of psychology is to predict and control behaviour. The DREAM project aimed to predict and control the behaviour of autistic children: the cage is the smart room, and the conditioning is rewards and punishments administered via a robot which calls out instructions. viii

In the 1940s autistic children were locked up in institutions and fed and watered like animals. Autistic children exhibit repetitive self-stimulatory behaviour called stimming. This behaviour, such as hand flapping, is their attempt to manage the over-stimulation they feel in their sensory world. Children would gouge out their eyes, chew fingers off or chew arms down to the bone in an attempt to cope with the overwhelming distress caused by the sensory overload of a crowded institutional ward.

In the 1960s Ivar Lovaas ix tried to deal with this by applying operant conditioning, hitting the child every time they attempted to stim. Instead of being chained in a ward, the child would endure forty hours or more a week of behavioural conditioning, repeating socially acceptable behaviour. The effect could be dramatic. The child would outwardly look normal and fit in with what we expect. But it is only the outward behaviour that has been controlled, just like a robot's, by forcing the child to obey and stay in a situation which is overwhelming for them.

But since Lovaas viewed the autistic child as non-human x – merely a physical human shell, but psychologically not human – conditioning of the same type that was used to train monkeys for space travel was quite acceptable. And forty hours of stressful conditioning, however inhuman, was better than being chained to a bed gnawing your finger off. The fact that ABA resulted in lifelong post-traumatic stress disorder xi was neither here nor there.

Far from lacking empathy and emotions, the autistic child's problem is more one of a flooding of the emotions and an over-stimulation from more sensory input than the child can manage. The autistic child lacks the skills and brain pathways to manage the emotional and sensory input that a normal child can. xii

Autistic children feel their own emotions, and those of others, strongly, but the wiring between the emotional and cognitive parts of the brain is weak. Emotions are hard to label, difficult to trace to their source and extremely difficult to calm. This leads to anxiety, panic and a fight-or-flight response. Indeed, the autistic child is the opposite of robotic: overwhelmed by human emotion, he tries to shut down and manage his brain activity.

Social robots act on a limited set of social rules. Faced with the almost infinite variety of human social behaviour, the limited repertoire of a social robot, even if supplemented by machine learning, will be unable to cope. For a successful interaction, the variety of social responses the robot can exhibit must match that of the human.

If we perceive autistic children as having a limited social repertoire we can view them as robotic and as a suitable case for social robots. But treating a robot and an autistic child as equivalent demeans the humanity of the child who may have skills and insights beyond the neurotypical.

Isolating an autistic child in a smart room with an autonomous robot xiii, on the excuse that autistic children prefer machines to people – which is the ambition of the DREAM project – can only succeed in dehumanising the child.

The child needs more sensitive and supportive human contact, not none, and certainly not to be treated like the husk of a machine. What is required is not a blanket treatment of autistic children as severely disabled, but a sensitive understanding of the sensory issues, preferences and problems of the individual child, supported by gentle and patient human interaction. Of course, in a world where gentleness and patience are swamped by the need for efficiency and quick results, the temptation to adopt a mechanistic, industrial approach to autism is strong.

There may well be a role for robots in providing social distraction in the classroom and home and changing the social atmosphere to take the focus off the child. But the idea of replacing human contact with robots or attempting to programme human behaviour with a robot promises a questionable brave new world.

How to use robots with autistic children:

  • Teach the child to do simple programming to make the robot say things or dance, thus giving the child a sense of control (see the sketch after this list).
  • Insert the robot in a discussion group with friends and therapists as a social discussion point. Such a distraction will reduce the social pressure and support interaction.
  • Identify soothing movements and patterns which the child likes and program these into the robot to support the child in calming down and dampening meltdown.
  • Deploy the robot as an option for time out in a classroom situation such that the child can go and sit with the robot for a while and hence avoid a meltdown.
  • Develop programs for using the robot in music therapy.
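
As a concrete illustration of the first suggestion, here is a minimal sketch of the sort of simple programming a child might do with a helper, assuming a NAO robot (the platform used by the DREAM project) and its naoqi Python SDK. The address, phrases and movements below are placeholders, not a tested programme.

    # Minimal sketch: assumes a NAO robot on the local network and the
    # naoqi Python SDK. The address and the movements are placeholders.
    from naoqi import ALProxy

    ROBOT_IP = "192.168.1.10"   # wherever the robot happens to be
    PORT = 9559                 # default NAOqi port

    tts = ALProxy("ALTextToSpeech", ROBOT_IP, PORT)
    posture = ALProxy("ALRobotPosture", ROBOT_IP, PORT)
    motion = ALProxy("ALMotion", ROBOT_IP, PORT)

    def say(phrase):
        # The child chooses a phrase and the robot speaks it.
        tts.say(phrase)

    def little_dance():
        # A very simple "dance": wake up, stand, sway the head, sit back down.
        motion.wakeUp()
        posture.goToPosture("Stand", 0.5)
        motion.angleInterpolation("HeadYaw", [0.5, -0.5, 0.0], [1.0, 2.0, 3.0], True)
        posture.goToPosture("Sit", 0.5)
        motion.rest()

    say("Hello, I am your robot. You are in charge of me.")
    little_dance()

Even something this small reverses the DREAM arrangement: the robot does what the child tells it to do, not the other way round.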

  i. https://www.herts.ac.uk/kaspar/meet-kaspar
  ii. https://humanargument.wordpress.com/2018/07/27/robots-and-autism-is-kaspar-just-a-sophisticated-doll/
  iii. https://www.dream2020.eu/
  iv. https://www.bacb.com/about-behavior-analysis/
  v. https://autisticuk.org/does-aba-harm-autistic-people/
  vi. https://www.simplypsychology.org/operant-conditioning.html
  vii. http://old.behavior.org/item.php?id=529
  viii. https://www.degruyter.com/view/j/pjbr.2017.8.issue-1/pjbr-2017-0002/pjbr-2017-0002.xml?format=INT
  ix. http://thelovaascenter.com/about-us/dr-ivar-lovaas/
  x. http://neurodiversity.com/library_chance_1974.html
  xi. https://www.emeraldinsight.com/doi/full/10.1108/AIA-08-2017-0016
  xii. https://www.jkp.com/uk/the-autism-discussion-page-on-the-core-challenges-of-autism-2.html/
  xiii. https://ieeexplore.ieee.org/document/8307127/

Robots and Autism: Is Kaspar just a sophisticated doll?

 

Kaspar

In the BBC documentary Six Robots and Us, Ethan Docherty, a four-year-old boy on the autistic spectrum, is introduced to a robot called Kaspar at his home.

Kaspar, a short robot who sits on a table dressed in jeans and a casual shirt, is able to respond to simple instructions, speak in simple sentences and display a limited set of facial expressions. The idea is that Ethan can learn basic social communication from interacting with Kaspar.


Ten Commandments of Robotics

My Ten Commandments for human interaction with AI.

  1. You shall not worship your robot.
  2. You shall not relinquish responsibility to machine intelligence, robots or any other internet-enabled thing. You will suffer if you do; your resilience and ability to adapt will degrade quickly, but if you retain control of the technology, generation after generation will benefit.
  3. You shall not confer human characteristics on a machine or look to machines for your salvation.
  4. Remember the limits of AI. Don’t assume that the machine will not break, or will always give a right answer. Remember that humans are not machines.
  5. You shall not use robots as tools to steal, kill and destroy.
  6. You shall not have sex with a machine.
  7. You shall not blame robots for your own shortcomings, stupidity and sin.
  8. You shall not use AI to enslave people, to degrade them or to remove their moral responsibility for decision making, to render a person powerless, passive or purposeless, or to turn that person into an object to be controlled and manipulated.
  9. You shall not use AI to create false correlations, justify predetermined decisions, present opinions as fact, suggest that because the data says so it is so, justify exclusion, prosecution or guilt, or claim that the interpretation of data is scientific fact.
  10. You shall not use AI to measure others’ performance, possessions and prestige.

 

Towards a Christian 10 Commandments of AI

 


The Ten Commandments of AI

Artificial Intelligence (AI) is changing the world we live in. Self-driving cars may take over the roads. Chips implanted in our fingers may let us make contactless payments. Facebook data can be analysed by AI to judge our personalities more accurately than any member of our family. The Internet of Things will enable gadgets and sensors to communicate with each other. Blockchain may change the way we handle money. Delivery drones may soon be flying over our heads. And as for robots, they are already running retail warehouses.
