Decision Nerds

Paul Richards & Joe Wiggins
We talk about human behaviour and decision-making with an investment slant. And tell terrible jokes. Join us as we dive into the trenches with industry innovators, academics and mavericks.

Available Episodes

Showing 5 of 16 episodes
  • Room 101: Project Coldplay
    As the worldly philosophers of Coldplay suggest, getting what you want, but not what you need, might leave you in need of fixing. Leaps in investment platform technology give investors more information, more choice and the ability to act more quickly and easily. We want that, but is it what we need? As Joe points out, many of the positive developments in tech are double-edged swords. He thinks that, from a behavioural perspective, now is one of the worst times ever to be an investor.
    Key takeaways:
    #1 Lack of friction – it takes me less than 10 seconds from launching my platform app on my phone to being able to deal. Is that a good thing? In one dimension yes, but the overarching story of behavioural finance is people doing irrational things that create bad outcomes. Slick and seamless tech, combined with noise, FOMO and a constant barrage of stimulus, has the potential to exacerbate these problems.
    #2 A competitive doom loop – tech providers exist in a highly competitive environment, and 'faster, easier, more' are key facets of the battleground. No one wants to lead a pitch with, ‘and… this is how we reduce the information available to clients and make it harder for them to trade’.
    #3 Reframing the game – whilst it might be possible to get providers around a table to agree a common approach that helps investors manage their worst impulses, a market-based solution is likely more workable. This needs those who advise on these platforms to change the conversation and include behavioural design as part of any selection process. Imagine a world where providers compete on how they help clients beat their biases as much as on how slick the tech itself is.
    Phrase of the day? Intelligent friction
    “Intelligent friction” is a concept from the payments industry which focuses on interventions based on the risk level of a transaction. It aims to balance a good user experience with effective security: buy a coffee in a new country when you land there, fine; buy a laptop, expect an intervention. There are some obvious investment analogies here. And of course this is only one tool in the arsenal; getting better at education and helping clients help themselves is also pivotal.
    We don’t want to lose all the good things that tech brings, but, to mangle Coldplay, we should perhaps be trying to help people want what they need.
    Duration: 8:22
  • Room 101: Performance fees - heads I win...???
    Joe doesn’t like performance fees - but why? Are they innately problematic, or just badly and perhaps cynically implemented in the mutual fund industry? In this bitesize episode, we get into alignment of interests, bad design and investor behaviour.
    The most interesting part of the chat?
    Performance fees are most often discussed in the context of alignment and risk sharing. Joe raises an interesting question - can they also be used to nudge investors away from unhelpful behaviour?
    Duration: 8:31
  • Room 101: Chartcrime?
    We all have things in our working lives that drive us insane. Anyone who regularly listens to the pod will know that there are a few subjects that consistently raise Joe’s blood pressure to unhealthy levels… In the interest of Joe’s and our future guests’ wellbeing, we wanted to find a way of dealing with these issues productively. Our solution: Decision Nerds: Room 101.
    Room 101 is the torture chamber in George Orwell’s classic book, 1984. For those who cross its threshold, it contains ‘the worst thing in the world’. Many Brits will remember the Room 101 TV and radio shows where celebrities suggested what they thought was the worst thing in the world and competed to have their pet hate consigned to oblivion (my personal favourite being Jimmy Carr and tax avoidance schemes). Our take on Room 101 is slightly different. Like the celebrities, Joe, our guests and I will discuss the issues that make our eyes roll. But it won’t be just a whinge-a-thon; we’ll try to get to the heart of the issue and start a productive discussion.
    Chartcrime
    We're kicking off with ‘chartcrime’ and something that particularly riles Joe - the overlaying of time series, such as inflation, from different periods and looking for predictive patterns. Are these charts a problem, or is it how they are used and framed? In the episode, we discuss:
    Punditry and prediction - how different investor types might confuse the two
    Texas sharpshooters and chart creators - are they the same thing?
    The awkward question that might stop people producing these charts
    Our Room 101 episodes are bite-sized and designed to provoke a conversation. Hot takes or deeply considered meditations are both welcome. https://www.linkedin.com/posts/paul-richards-34965883_%F0%9D%97%A5%F0%9D%97%BC%F0%9D%97%BC%F0%9D%97%BA-%F0%9D%9F%AD%F0%9D%9F%AC%F0%9D%9F%AD-%F0%9D%97%96%F0%9D%97%B5%F0%9D%97%AE%F0%9D%97%BF%F0%9D%98%81%F0%9D%97%B0%F0%9D%97%BF%F0%9D%97%B6%F0%9D%97%BA%F0%9D%97%B2-we-activity-7282682076963733504-tB9s?utm_source=share&utm_medium=member_desktop
    And of course, feel free to submit the most egregious example of chartcrime you have ever seen (if you want to raise Joe’s blood pressure).
    Duration: 11:47
  • May Contain Lies...
    If you’ve been around the block, you will likely have seen some eye-rolling use of evidence during meetings. Evidence can be used badly for many reasons: a misunderstanding of what conclusions can be drawn from it, or perhaps it has been cherry-picked to support a particular position.
    In this episode, we unpick these issues with Professor Alex Edmans of London Business School. Alex recently published a book, ‘May Contain Lies’, which discusses the methodological, psychological and incentive problems surrounding evidence use.
    We spend a decent amount of time on a core idea from the book, ‘The Ladder of Misinference’. If you think scientifically, there are no earth-shattering revelations here, but I really like it because it is a simple, teachable framework that groups can adopt. Alex gives some great examples that everyone can understand and internalise. The Ladder deals with the challenges of method, but that’s only half the story. We also have to beat the behavioural cards that nature has dealt us, e.g. confirmation bias. And even if we beat the first two traps, incentives can nudge us away from saying what we really believe.
    Key insights:
    - How Alex tries to move beyond black-and-white thinking and engage with complexity - getting the right mix of data and stories
    - Why bad ideas stick - do you still 'Power Pose'?
    - Changing minds - the power of good questions (there’s a great experiment on pianos and toilets that you can try at home)
    - Trading off the short and long term - why he chose the most critical agent to help him publish his book
    - Understanding neurological carrots and sticks - what happens when we put people in a brain scanner and give them statements they like and don’t?
    - The state of debate around ESG and DEI - ideology, identity and pressures to conform
    Duration: 48:21
  • Pants on fire
    Lying – it’s something that all humans do. Most of the lies we tell are small and harmless. But deceptive behaviour in the investment industry lowers trust and increases costs and complexity.
    We are deceptive for many reasons, and one of them is that we can get away with it. This is because, despite what we might believe, most of us are pretty terrible at spotting lying – including highly experienced financial analysts.
    But what would happen if we all had access to AI-powered technology on our phones that could spot deception with a high degree of accuracy? Would that change how the industry behaves? This is no idle speculation – in this episode of Decision Nerds, we explore research that suggests that AI is significantly better at spotting lying than humans. And as we all know, AI has a habit of surprising us by appearing in the wild far faster than we might expect.
    How would this technology impact the investment industry? We discuss:
    The motivation for deception in the industry – the entirely logical reasons that we don’t always tell the truth
    Different kinds of deception and their relative impacts – what are the traps that managers fall into, and why?
    Just how much better is AI? – the results might surprise you
    Would a truth machine destroy the industry or make it better? – our take on ‘creative destruction’
    There’s no truth machine yet – we discuss a few better questions that we can use today.
    Affectiva facial recognition demo
    Paper on analysts' ability to spot CEO deception
    Paper on AI's ability to spot CEO deception
    Lying on CVs
    Duration: 40:52


About Decision Nerds

We talk about human behaviour and decision-making with an investment slant. And tell terrible jokes. Join us as we dive into the trenches with industry innovators, academics and mavericks.
