Can robots coach?

This tweet from IFA Claire Walsh is something I commonly hear from planners.

Could a computer ever help someone understand their needs as well as a human adviser?

Chris Budd says no. I disagree, although it will take some time and has little to do with artificial intelligence.

If you subscribe to bouffant-haired Frenchman Jacques Lacan’s theory of how and why therapy works, and apply this to financial planning and business coaching, it becomes clear what is required.

Lacan, rogue analyst and one of the most influential thinkers of the 20th century, uses ‘person-who-knows’ to describe the relationship of the analyst to the analysand (or client). Put (very) simply, for the process of analysis to work, the client must assume that the analyst (or adviser) ‘knows’ the answer. This allows the client to explore and uncover the answers for themselves. For clients to understand themselves, they must first think that the person sitting opposite them already has the answers but refuses to share them just yet.

Sounds weird, but the skills and techniques for establishing this position of authority are very similar in good business coaching and lifestyle planning: use of silence, refusal to validate emotions and opinions through non-verbal gestures, subtraction of the coach’s personal values from the discussion, and so on. The planner is the ‘person-who-knows’, and given the right amount of space the client will keep talking, way beyond initial stock responses, until they come up with what sounds like an authentic answer.

The accuracy of the answer is, I would argue, impossible to measure by either a human or a computer. It is entirely subjective. We can only measure its efficacy by the client’s commitment to it as truth.

Could a machine ever replicate this? The questions designed to disturb respondents out of standard responses (“What would you do if you had 5 years left to live?” etc) could be fired out by a computer. Maybe you’d even be more honest with a computer than a real person. Psychometric testing already mandates no human involvement in the questioning. At the moment, however, would you really believe that a computer knows the real answer?

I’ve seen rooms full of people duped by online personality profiling tools which purport to analyse a person’s social media outputs and assess their personality. A couple of quick examples and the room is awed. 10 minutes playing with it afterwards and the superficiality of the tool is easily exposed. They are marketing gimmicks. However, the growth of genuine artificial intelligence is likely to be exponential in the years to come.

Crucially, if we accept Lacan’s model, the ability of a robot (or human) to coach has little to do with its own analytical or deductive powers. At no point does the coach know the client’s true desires. Success lies in the ability to make the client believe that they do. This perceived omniscience then allows the client to generate, and believe in, their own answers.

It’s a Wizard of Oz machine.

The success of the machine known as robo-advice is harder for those of us inside the financial services industry to believe in: we peeled back the curtain long ago and saw an actuary on a bicycle working for a life company. We don’t believe.

That doesn’t make it impossible. But humans and robos alike are yet to realise that the answer lies not in their algorithms, but in the relationship between an investor and a person (or computer)-who-knows.

3 Comments
  1. Chris Davies 2 months ago

    Just back from a Roboinvestor conference where I was discussing this very subject with a number of delegates from around the EU. The common consensus is that, yes, AI will bring an engaging service proposition that moves beyond the current ‘bot-advice’ process to a holistic consultative process that can involve a form of coaching. As Jan mentions, the coach’s job is to facilitate a journey for the coachee to explore their own ideas and subjective reasoning around potential solutions to the issues at hand.

    IPsoft’s Amelia is a great example, currently trialled with help desks, financial ops and call centres; its algorithm for processing complex problems and natural language means that, for the first time, AI can mirror human interaction. The problem (as I see it) comes down to emotional intelligence (EQ): can AI really mirror empathy, which is the key to EQ? Coaching and good quality financial planning rely a lot on this, and it’s something I’m not convinced AI can replicate any time soon.

  2. Jan Bowen-Nielsen 2 months ago

    Hi Phil,
    Interesting article. This was something I had to respond to 🙂
    The fundamental presumption that the coachee must believe the coach knows the answer for the process to work is, in my opinion and experience, wrong.
    This may be true for some psychological and therapeutic approaches, but not for coaches. I (crudely) describe psychologists and psychotherapists as “service engineers” who just happen to work with the mind rather than machinery. The therapist and their patient enter their conversation believing that something is wrong with the patient that needs fixing. The therapist’s role is then to first identify the cause of the problem and then use their tool set to fix it.
    This is not how a coach or their client approaches a coaching conversation. A coach fundamentally believes that the coachee has the answers themselves, and the coach’s role is to help them find those answers. A good coach explains this to their client before the process starts, and this empowers the coachee. Upon finding their own answers, the coachee gains a strong sense of personal success and ownership of the learning and actions they derive from them.
    So, can a computer help someone understand their needs as well as a human adviser? I think there are many facets to this:
    – What is the context? Is it just a question of narrowing some options down to arrive at a selection of possible products?
    – How good is the human adviser? Is the human adviser genuinely interested in understanding the client and building a relationship or is he/she just focused on selling products?
    – How good are the robo-adviser’s questions?
    – etc.

    Can a computer be as good as a competent human coach? In my mind – no! Humans are not rational, but highly complex, emotional, irrational, social, flawed beings who are part of complex systems. The AI required would be mind-boggling and sometimes it is the flaws of the complex, emotional, irrational, social coach that create break-through moments for the coachee.

    • Phil Young (Author) 2 months ago

      Thanks for the reply, Jan, and having experienced your coaching via one of your coaches I do believe it works. I disagree, though, as an analyst (and I accept this doesn’t apply to all therapists, e.g. CBT practitioners) adopts the same approach. If the answer lies in the coachee’s own subjective truth, then its production isn’t dependent on the analytical skills of the coach (the coach isn’t producing the answer) but on the skill in coaxing it out in a way that the coachee accepts and identifies as truth. That is to say, it is the relationship between the two, rather than the analytical brain of the coach, which is most important. Would you agree with that?

Copyright © 2017 Adviser Lounge. All rights reserved. Intended for UK Financial intermediaries only. Terms and Conditions