Weeknotes (2024-06-28)

Number 5. Do it however you want, but do it really well; why people don't adopt AI tools; the changing nature of trust; and help me with my remote workshop vision.
I love a good mechanical pencil. Upgraded my Kuru Toga to the Advance 🤤
  • Work-wise, I've been figuring out approaches to taking on some meaty CX challenges for a big energy company and a big manufacturing company. Similar challenges, different approaches. Which reminds me of this great Morgan Housel/Dave Perrell conversation:
    • DP: The major lesson of doing this podcast so far has just been everything works, but you need to just be really, really, really, really, really, really good at whatever it is that you do.
    • MH: I like that. Yeah. And in some ways, that's obvious, but it's easy to overlook. But people are always like, "hey, how should I do it?" You can do whatever you want. Yeah. But you've got to be really good at it. That's actually really good advice. Because there's a million ways to write. It's an art, not a science.
  • I'm writing my last essay for MSc Behaviour Change this year on why people don't adopt AI tools and how we could encourage it. It will make a good blog post, so I'll rewrite it once I've finished reading the thirty or so papers (with the help of Elicit).
    • Happily, Cass Sunstein and Jared Gaffe just wrote a paper on 'algorithm aversion', summarising a lot of the research so far. People are less likely to use algorithms (i.e. AI/ML recommendations) in decision making when:
      • They don't believe the algorithm will produce a better recommendation than a human
        • Either because it hasn't been shown OR
        • Because they believe the decision requires unique skills
      • In the workplace, people may want to receive credit for the decision
      • There's also an identity piece - if you've spent years building up your skills and now an AI can do it, what do you do now? (See one side of the reaction to the Figma AI announcements this week)
      • They don't understand how the algorithm works and/or don't trust the way the algorithm works
      • They want to express agency or control
      • Or they just enjoy the process of making that decision, or they want to learn or express their skill
      • It's an emotionally charged decision or has moral consequences
      • The decision may lead to unintended consequences which the algorithm doesn't/couldn't consider
      • They believe the decision is highly subjective
  • Related to the above, I mentioned a couple of weeks ago that I was going to dig deeper into trust. I'm still figuring out what the output of this looks like–I think it might be fun to make a documentary-style podcast–but I've started exploring.
    • I have a hypothesis that an economic perspective on trust–that it is an inefficiency–is leading us down a road where we use technology to eradicate the need for it. That has profound implications for interactions that aren't mediated by technology; so profound, in fact, that the eradication is unlikely to fully come to pass. But the ramifications of the philosophy will still be felt. And, as with all things, there are examples that show the opposite.
    • This Pew Research report from 2017 posited that the nature of trust will change, and focused heavily on the fact that when people have no choice but to trust technologies or systems, well, that ain't trust. "Those who have doubts about progress say people are inured to risk, addicted to convenience and will not be offered alternatives to online interaction."
  • Lastly, some geekery. I have a vision for remote workshops where I can fade music in and out using a MIDI controller (so I don't have to switch windows), but I can't quite get it working consistently enough to use in client settings. Here are my explorations (plus a rough sketch of the routing I'm after, below the list):
    • OBS does mixing and, with a plugin, MIDI control, but doesn't provide a virtual mic.
    • Loopback gives me virtual mics I can route Spotify into, and it's good, but it doesn't support MIDI control.
    • There are some apps that control application audio levels via MIDI, but those level changes aren't reflected in a virtual microphone.
    • This week I've been experimenting with Ginger Audio Caster Live which does virtual mics, provides a mixer interface, and supports MIDI control. But I keep getting audio dropouts/digital artefacts.
    • I think I could set up a StreamDeck Live to do this, but would face the same problems as above and it wouldn't have faders (which the part of me that, aged 16, thought very hard about studying sound engineering feels is very important).
    • I've also heard of people using a DAW to mix, then using Loopback to expose the DAW's output as a virtual microphone. It sounds CPU-intensive, but possibly worth exploring.
    • If anyone has made a setup that does this (and it works) I would love to hear about it!
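For the curious, here's the rough shape of the routing I'm imagining, as a Python sketch rather than a working setup. All the device and port names are assumptions: it assumes Spotify is playing into a virtual device (e.g. BlackHole or a Loopback device), that a second virtual device is what Zoom/Meet sees as the microphone, and that the controller's fader sends CC 7. It uses mido to read the fader and sounddevice to pass the music through a gain stage between the two devices:

```python
# Rough sketch only: MIDI fader -> music gain -> virtual "mic" device.
# Assumes two virtual audio devices already exist (e.g. via BlackHole or
# Loopback) and that the controller's fader sends control-change 7.
import mido               # pip install mido python-rtmidi
import sounddevice as sd  # pip install sounddevice

MIDI_PORT = "nanoKONTROL2"   # assumption: whatever mido.get_input_names() reports
FADER_CC = 7                 # assumption: the fader's control-change number
MUSIC_IN = "BlackHole 2ch"   # virtual device Spotify plays into
MIC_OUT = "Workshop Mic"     # virtual device Zoom/Meet uses as its microphone

gain = 1.0  # linear gain, updated from the fader position

def music_bus(indata, outdata, frames, time, status):
    # For every audio block, scale the music by the current fader gain.
    outdata[:] = indata * gain

midi_in = mido.open_input(MIDI_PORT)

with sd.Stream(device=(MUSIC_IN, MIC_OUT), channels=2,
               samplerate=48000, callback=music_bus):
    while True:
        for msg in midi_in.iter_pending():
            if msg.type == "control_change" and msg.control == FADER_CC:
                gain = msg.value / 127.0  # map 0-127 fader to 0.0-1.0 gain
        sd.sleep(10)  # don't spin the CPU between MIDI messages
```

My own voice would still need to be merged with that music bus somewhere (Loopback could probably do that part), and I've no idea yet whether this avoids the dropouts I'm getting elsewhere.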