We’re All Somebody’s Fool | Dan Simons & Chris Chabris

There may be errors in spelling, grammar, and accuracy in this machine-generated transcript.

Earmark CPE: If you'd like to earn CPE credit for listening to this episode, visit earmarkcpe.com. Download the app, take a short quiz, and get your CPE certificate. Continuing education has never been so easy. And now, on to the episode.

Caleb Newquist: This is Oh My Fraud, a true crime podcast where victims get fooled instead of filleted. I'm Caleb Newquist.

Greg Kyte: And I'm Greg Kyte.

Caleb Newquist: Greg, I am dispensing with the usual [00:00:30] pleasantries because we've got a longer episode today, and I'm gonna get right into it.

Greg Kyte: Sounds great.

Caleb Newquist: Okay. Uh, when you and I were conceiving this podcast, we talked quite a bit about the psychological aspects of frauds and why those were interesting to us. And so it's pretty exciting that today on the podcast, we are going to talk to two psychologists: Professor Dan Simons of the University of Illinois [00:01:00] Urbana-Champaign, and Chris Chabris, professor and director of decision sciences in the Department of Bioethics and Decision Sciences at the Geisinger Research Institute.

Greg Kyte: So fancy.

Caleb Newquist: These are very fancy. Some fancy guests. I told the guys, I don't know if this is in the conversation, I don't remember, but we were kind of giddy to talk to these guys. Yeah. And it kind of shows. But anyway, [00:01:30] Dan and Chris's most recent book is Nobody's Fool: Why We Get Taken In and What We Can Do About It. It's from Basic Books, and you can get it wherever you get books. They're also the authors of The Invisible Gorilla: And Other Ways Our Intuitions Deceive Us, and you should definitely check that out. You may have actually heard of that one. That book was based on a paper and an experiment they did back in the late '90s called Gorillas in Our Midst. And, [00:02:00] Greg, had you heard of the experiment before we talked to Dan and Chris today?

Greg Kyte: Yeah, I absolutely had. And actually, in the interview, I had a light bulb moment where I was going, oh shit, you guys are the ones who did the gorilla experiment, which I think might have been insulting to them, that I didn't know that before the interview. But I think they rolled with it.

Caleb Newquist: They were. Yeah. You know what's so [00:02:30] funny about it? Dan and Chris are, like, funny guys. They were great to talk to. And what's kind of hilarious about the Invisible Gorilla is that it won them the Ig Nobel Prize, and you've probably never heard of that. But the Ig Nobel Prize is a satirical prize awarded to celebrate unusual or trivial achievements in scientific research. [00:03:00] On its website, it says its aim is to honor achievements that first make people laugh and then make them think. Dan and Chris won that award in 2004.

Greg Kyte: Because of the gorillas.

Caleb Newquist: Because of the invisible gorilla. Yeah, yeah, yeah.

Greg Kyte: And we do have a link to the YouTube video if you want to check out the Invisible Gorilla video. If you have no idea what it is, maybe go watch that real quick, because I don't know if we explain it very well. I think we just [00:03:30] assume everybody knows what it is during the conversation.

Caleb Newquist: Definitely. Yeah, that's a good call. So if you haven't seen the Invisible Gorilla video, pause the podcast, go watch the video. It's very easy to find, and then come back and you'll be sufficiently primed for this conversation. But the reason we had them on the show is because of their new book, Nobody's Fool, and that's also linked in the show notes. And we [00:04:00] did some homework, Greg. We kind of divided and conquered in preparing to talk to these guys. And we had a great time talking to them. They were super fun.

Greg Kyte: Fantastic conversation. A couple of good guys who know how to roll with my sometimes caustic sense of humor.

Caleb Newquist: Absolutely. Um, so, yeah, here we go. Let's get into our conversation between Greg and me and Dan Simons and Chris Chabris. So, [00:04:30] Dan, Chris, thanks for coming on. Generally, the way we start this is we ask people to give us their background, kind of their life story, and try to summarize it in about two or three minutes. And, uh, don't leave out the awkward phase if you feel so inclined. Oh, and I flipped a coin, and Dan, you won. So, Dan, over to you.

Dan Simons: Okay, well, if I win, does that mean Chris goes first? Uh, yeah. [00:05:00] So, where to start, exactly? I was an undergraduate at Carleton College in Minnesota, and I went into my undergraduate years thinking I would major in maybe physics, or maybe English, or maybe French, and physics didn't work out very quickly. Um, French: I took a couple of semesters of French, and then I was in the first literature class, and there was this one person in the class who was just so fluent and so quick on the draw that I was intimidated out of the class. [00:05:30] Uh, I'm married to her now.

Greg Kyte: Oh my gosh, that's fantastic. That's a great story.

Dan Simons: And then I dropped that class, uh, and ended up taking a psychology class and really liked it, and then took another one and liked that, and kind of almost ended up defaulting into psychology and cognitive science as an undergraduate. Um, I had a lot of broad interests in psychology, and went to graduate school for a year at Harvard. My advisor ended up going on maternity leave and then leaving for Northwestern in my first [00:06:00] year there. So I ended up transferring to Cornell, where I started out doing work on cognitive development: how kids understand different kinds of objects and categories, and how they reason about what's inside different kinds of things based on what they look like.

Greg Kyte: Oh, interesting.

Dan Simons: So that was the first couple of years of graduate school. I was working with preschoolers, and ended up shifting more to adult human perception. So that was what got me into studying perception and attention more generally. And then, [00:06:30] when I went to Harvard as faculty and worked with Chris when he was there as a graduate student finishing up, we continued doing a lot of work on those sorts of topics, and did work on how we fail to notice things that are entirely unexpected. So, people in gorilla suits, right? Kind of. Yep.

Greg Kyte: Yeah, right. I've seen that video where you're trying to count the number of times they pass the basketball. Yep. And the gorilla. And I'm pretty sure I was one of the idiots. I'd like to think I'm smarter than that, but in [00:07:00] hindsight I was like, I guess I'm not smarter than that.

Dan Simons: Yeah. I mean, that video kind of has a life of its own. It ends up in all sorts of strange places, but it was actually an undergraduate class project that we did. Oh, no kidding. Um, and, uh, you know, the finding is that about half the time people don't notice. And because it's right at that sweet spot of half the people noticing and half not, you can watch it in a room of people, and two people will see it and two people won't. And then they'll argue about how clueless the people who missed it are, [00:07:30] right?

Greg Kyte: Or how they lack focus. That's how I defend myself: I'm so laser-focused on the task, I'm so effective at my job, that I don't care about a gorilla. And wait, but are you saying that you made that video?

Dan Simons: Yeah, that's Chris and I. Chris and I did that as a study.

Greg Kyte: Caleb, we've got celebrities on the show. That's amazing.

Caleb Newquist: I didn't want to say anything, Greg, because I knew you would react this way.

Greg Kyte: Oh my gosh, that's so cool. Well, now I'm gonna fanboy. I'll try to chill more.

Caleb Newquist: Try to chill, Greg. Okay.

Greg Kyte: Yeah. Amazing. [00:08:00] Yeah.

Dan Simons: So that's pretty much my central line of research over the last twenty-some-odd years: trying to understand, you know, when are the cases that we do miss something, when do we not, and are there differences between people? Right. Are there some people who are missers and other people who are noticers? And the short answer, as best we can tell so far, is no. Whether or not somebody misses it seems to be mostly due to chance at that moment. And as long as you're trying to do the task and doing it passively, you've [00:08:30] got a chance of noticing it and a chance of missing it. So cool. Um, yeah. So that's kind of where I came from.

Caleb Newquist: Nice. Chris. You're up.

Chris Chabris: Well, I was deterred from biology at an early age when I had trouble with the frog dissection in ninth grade, and so on. And then I was deterred from physics pretty early in college when I actually failed the course, which was the only college course I ever failed, thankfully. Um, but I did major in [00:09:00] computer science, you know, where there are no guts or anything like that to worry about. And I was really interested in computers for a long time. I had a computer as early as junior high or something like that. And this was a long time ago, so, you know, where you, like, had cassette tapes to save the programs and things like that.

Greg Kyte: Yeah, I had a VIC-20. I know what you're talking about.

Chris Chabris: Yes, exactly. An Atari 400 was my first computer. And my fondest memory from that time was all the arguments over which was better, you know. Um.

Greg Kyte: Oh, as [00:09:30] a VIC-20 owner, I knew mine was not better. We were very well aware that we were the Walmart of the computer world.

Chris Chabris: Yeah, but if you had a Commodore 64, then we could have argued.

Greg Kyte: That would have been an argument.

Caleb Newquist: I had a Commodore.

Greg Kyte: Oh, look, he chimes in.

Dan Simons: I played with the Commodore PET in middle school. That was fun.

Chris Chabris: Okay, that was even earlier. Um, well, so, I majored in computer science in college, and, um, you [00:10:00] know, this was way back in the history of computer science. I mean, not all the way back, I'm not that old, but there weren't enough computer science courses for an undergraduate degree. So what you had to do as an undergraduate at Harvard was take some courses in a related field. And one of the related fields was physics, and I said, nope, that's not for me. Um, one of them was math, also not good enough at that. And one of them was psychology. And I had a friend who had been on that same track, and he said, oh, you should try this. So I did that. I wound up taking more psychology courses than I needed to. [00:10:30] And one of my professors at the end of college offered me a job working in his lab. That was Stephen Kosslyn, the former chair of the psychology department at Harvard and dean of social studies; he's now retired.

Chris Chabris: And he was also the one who talked me into going to graduate school. So I kind of had to be dragged into all this stuff all along. And then, as Dan says, we met when I was finishing up grad school and Dan was starting as a faculty member. And I had been working on, well, [00:11:00] not entirely different things, I mean, some different things from what Dan was. But then, when we worked together teaching this course, that's what really got me into this same line of research, when we had to come up with this class project, which was really Dan's idea. But in making that film with the people passing the basketballs and so on, my job was choreographer, which was the only time I've ever choreographed anything. I couldn't even handle the dance lessons for my own wedding. Um, so I was [00:11:30] in charge of, like, making sure people didn't run into each other and came out at the right time, and so on.

Dan Simons: So in the video, somebody almost gets hit by a ball, so.

Greg Kyte: Oh, nice.

Chris Chabris: So you can see why I never had that job again, you know. Um, so I got into sort of cognitive psychology, cognitive science, around the same time. But I think I've probably worked on more different topics since then than Dan has. And I've worked a lot with economists and other social scientists and [00:12:00] so on. And right now I'm doing research at a health care organization in Pennsylvania, really focused on applying ideas from psychology, decision making, and behavioral economics to helping people make better decisions in health-related matters, and also continuing to work on many other topics. Cool.

Caleb Newquist: And so, just to set the stage for the audience: the invisible [00:12:30] gorilla started out as an experiment, right? And then eventually, to make a long story short, it evolved into... you guys wrote a book together?

Dan Simons: Yeah, we did, and it was actually an experiment that was replicating much earlier work and trying to build on it. So there was earlier work from Ulric Neisser in the 1970s doing a similar sort of task, and we wanted to get the same thing and then make it a little more in-your-face, to see whether people really would miss it even if it was not a hard-to-see thing. So that's where that came from. The [00:13:00] book that we wrote was inspired by the study, but it wasn't actually about that specifically. So the interesting thing from our perspective is that we've known for decades that people can miss unexpected things; that wasn't incredibly new in the field. It was not as vivid, but it was not a brand-new finding. The interesting thing for me, and I think for Chris as well, was how much of a discrepancy there was between what people actually see, which we've known for a long time, and what [00:13:30] they think they'll see. So if you show people something like that gorilla video, they're convinced that, of course, they would see if a person in a gorilla suit walked into the scene and thumped its chest and walked off the other side. Yeah, and it's that intuition, that mismatch between our beliefs about how our minds work and the reality of what we see and remember and notice. It's that discrepancy about our intuitions that was the motivation for what we wrote about in the book, which was much broader than just that one example.

Caleb Newquist: Right? Right.

Greg Kyte: Based on [00:14:00] that, because, well, I guess just so you know, both Caleb and I are very much, uh, psychology turns us on. So you guys are really a couple of the sexiest guests that we've had on this podcast. Um, one of the things I love thinking about is the different psychological biases that have been, you know, labeled and identified, one of which, and I think this actually applies [00:14:30] pretty strongly to Nobody's Fool, is the bias blind spot, where you're like, okay, it makes sense that people would be tricked by that. But not me. Not this guy over here. Have the two of you gotten to the point where you're like, yeah, we're just as susceptible, whether it's things like the fraud we're talking about in Nobody's Fool, or whether it's, you know, [00:15:00] not being able to see the gorilla behind the people playing basketball? What's your thought about the bias blind spot, and do you still have it?

Chris Chabris: Well, speaking for myself, I have not defeated it. Um, I doubt anyone can. Uh, one way I know that is that when the gorilla experiment had first been published, or a few years after that, I would occasionally show it to people and actually watch the video along with them and try to count the passes that the people were making. And I noticed a couple of [00:15:30] times that I would get to the end of the video and not have seen the gorilla, and I would wonder why the video stopped, and so on. So even when I was focused enough on one task, I could miss other things that were happening on the screen in front of me, even though I knew they would be there.

Caleb Newquist: Dan, what about you? Have you conquered your bias?

Dan Simons: I mean, I think one way that I know that I haven't is that when we first started doing these studies, even though we knew that people would miss the gorilla, you still kind of held your breath every time you showed it, because you were convinced that, of course, everybody would [00:16:00] notice it. Right? And this is the same sort of experience that I think magicians often have when they're just starting out. They're convinced that, because the way they're doing their trick is right there in the open and everybody can see it, everybody's going to see it. And it takes a lot of practice and rehearsal to get to the point where you say, okay, I know they're not going to see it, even though intuitively it feels like everybody's going to notice this thing.

Greg Kyte: Right?

Greg Kyte: So I guess if we're saying everybody's susceptible to the bias blind spot, meaning one of the many psychological biases, [00:16:30] and you do cover this in the book, because part of your thesis is that everyone can fall victim to a fraud. We actually just did a podcast recently about Darren Berg, who's been called, like, the mini-Madoff, or the Madoff of [00:17:00] the West Coast. He had a big old Ponzi scheme. And very interestingly, one of his victims was a guy whose job used to be as an investigator, where he would investigate and prosecute people who were running Ponzi schemes. And so you would think, if anyone's not going to get suckered into this, it'd be him. And he absolutely was, and lost a bunch of money. So, yeah, can you speak to that? Like, to what extent is everyone susceptible? Because that's the flip side of your book: you're giving us tips for how to protect ourselves, but is that just futile? Was that just a cute idea to put in there? You're all gonna lose all your money anyway, but here's [00:17:30] maybe a couple of ways you could lose less.

Chris Chabris: Well, I bet that the investigators of Ponzi schemes are still less likely to be victims of Ponzi schemes than, you know, people who don't investigate Ponzi schemes, right? So maybe this was the one case that was, ironically, interesting. I suspect that lawyers are less likely to be defrauded in these cases than non-lawyers. But it's true. I think you're right. One of our main points is that it's naive [00:18:00] and dangerous to think that you can't be the victim of these cons and scams that look, in retrospect, obvious and simplistic. Uh, and one reason why we feel that way is that we only see the cases that have been solved, and where we can completely lay out the story in a nice movie or podcast or book or something like that. And in retrospect, all the signs look obvious. Whereas, you know, when time is moving forward, it's hard to know what everything means [00:18:30] and what's legitimate and what's not. And that's part of what we're trying to do in the book: give some advice about what kind of questions you can ask yourself or others, and what warning signs you might want to look for that you could potentially see in advance, rather than just in hindsight.

Dan Simons: Yeah, I'd say a broader point is that, you know, the central theme is that it's not just gullible people who fall for these things. Really highly educated, critical-thinking people can fall for them when they're targeted. And maybe some people are [00:19:00] more likely to fall for completely untargeted scams than others are. But for the ones that are targeting people, you can have people who have tremendous amounts of experience and expertise and are world leaders in their own fields, and if the scam targets them in an effective way, they can fall for it. I mean, there are scams that target scientists and professors who are heading to conferences. There are scams that have fooled, you know, former secretaries of state. So it's not that [00:19:30] it's only clueless or naive people who fall for these. Yeah.

Caleb Newquist: So here's kind of a... I don't recall if you addressed this specifically in the book, but is there something you've come across, something that people who perpetrate frauds seem to understand about humans? What do they understand about humans that the rest of us don't understand about ourselves? Is there one thing, or a couple of things, that they [00:20:00] have come to learn that lets them swindle people? Are they just aware of these biases that we have, or do they kind of stumble across them, and then it's more of a, you know, comedy of errors? I'm just curious whether there are any patterns in what fraudsters understand about us.

Chris Chabris: I don't think they're reading books and psychological literature and so on to figure out how to do this. I don't think they [00:20:30] take online courses or whatever. But I do think, if you look at what scams have been working for a long period of time, and what kinds of patterns of fraud have been working for decades, generations, and so on, they do have some things in common that relate to human psychology. And then also, I've been thinking about this more recently, I think for each con artist or fraudster or whatever, there's a learning process where they're not very good at the beginning. So, for example, those famous Nigerian [00:21:00] scammers who send out the weird emails about a prince, you know, and his fortune, and you can send me money to help me get it, and so on. Apparently, at first they don't actually succeed very well. And the ones who learn, and perhaps the ones who just have better talent for it in the first place, wind up sticking with the business and making money with it. A lot of people drop out of it. As for what they know about human psychology, I think that's, in a sense, a good way to describe the contents of this book, [00:21:30] actually: we have sort of tried to reverse-engineer the key things they know about human psychology that help them succeed. And I'll mention one of them, and then maybe Dan can contribute some other ones. The first one is they actually know, in a sense, about the invisible gorilla effect, so to speak.

Chris Chabris: They know that when people focus on one thing, they can be blind to things that happen that they're not focusing on. Or, more generally, when people [00:22:00] focus on one thing and process the information that's in front of them, or that's been put before them, they're very unlikely, or at least less likely, to go and look elsewhere for other kinds of relevant information. People will often make a decision based just on what's in front of them, even when other information they don't have could be just as important or more important to making the decision. And this works on the level of salespeople. Salespeople don't tell you what you could buy at the other stores, that the price is lower online, that you maybe don't need this thing at all, [00:22:30] you know, that you might be impulse buying or whatever. They don't tell you any of that stuff. They tell you the stuff they want you to hear that would contribute to you deciding to buy their product. And they're not being unethical in that way; they're doing their job, and it's an ethical job in society. But in many ways, fraudsters are relying on the same principle. They don't have to hide that much if they can count on you not looking for it in the first place. So that's what we call the principle of focus. They know about focus and how it limits our thinking and affects our decision making.

Dan Simons: Yeah, in a sense, it's effective storytelling, [00:23:00] right? I mean, if you're telling somebody a story, you don't necessarily have to fill in every background detail. You just tell them what you want them to take home from it. And, you know, fraudsters are really good at that. They're good at giving you a narrative that puts you into it, right? It makes you part of it and makes it compelling. But they're not telling you all of the things that would take you out of that story. They're not breaking the fourth wall, right? They're making sure that you're immersed in their narrative. And there are lots of ways of doing that. And they've probably [00:23:30] developed those through practice and evolution, in the same way that, you know, television psychics do the same thing, right? They have a rapid patter that forces you to pay attention to what they're saying, and not to have time to focus on what they're not saying, or how likely it would be that a guess would be right just by default. Um, they don't give you that opportunity.

Speaker6: Yeah.

Greg Kyte: One of the things that you established pretty early in the book, that I thought was fascinating and [00:24:00] interesting but also made a lot of sense, was, and I can't remember exactly what you call it, the fact that as humans, our basic assumption is truthfulness. If somebody says something, if we're presented with information, our default is: this is true, this is accurate, things like that. And I know that factors in tons, even in the stuff you guys were just talking about. Because, again, I like to think of myself, especially because we've got a freaking fraud podcast, as being less susceptible [00:24:30] to fraud. But I can think of two times where I've gotten text messages where I was moments away from clicking the link that, I'm sure, would make it so that I transferred my mortgage to a Russian hacker or something like that. One of them was like, hey, there's a problem with your Mountain America Credit Union account, we need you to click this link to fix it, or something like that. And at that point, I had not too long ago opened an account at Mountain America Credit Union. [00:25:00]

Greg Kyte: And, funny thing, as I was going through it: I knew the account had hardly any money in it, and I was like, well, if there's a problem, they're talking about like $18. That was really why I didn't click the link, because I'm going, this is not a life-changing amount of money by any means, and I'm busy right now. And then later I happened to call them, for something unrelated, and asked them about that text, and they said, oh, that's 100% a scam. Um, so that's an example of me just [00:25:30] going, oh, this must be true. And I guess there was a little bit of verification: if they had said Wells Fargo, I would have known it was a scam, because I don't have an account at Wells Fargo. But they got lucky with what I had. So, in terms of that, one of the things that I really thought was interesting is that you also defended that assumption of truth, in that society couldn't function without it. Could you guys speak a little bit to that?

Chris Chabris: So there are a couple of interesting things going on in [00:26:00] your example. One is that they're sending messages to a lot of people about a lot of bank accounts and so on, and they don't care if some people get a message about a bank they don't have an account at, or some people have only $18, whatever. They just want to get a few people who will actually follow through and give them money. So most scams profit from the very few people who are actually willing to go all the way, and their profitability often comes from being able to filter out the people like you who wouldn't bother. So you didn't bother to answer them; you cost them nothing. [00:26:30] So then we get to this concept of truth bias that you were mentioning. Truth bias, I think, is a concept that seems kind of obvious once you've heard about it, but we don't really think about it in advance. And truth bias is simply the idea that we tend to assume that everything we see, hear, read, or are told is true, as opposed to assuming everything is false, which would be a hard way to get through life.

Chris Chabris: If you just thought everything you ever saw was false, you'd never find the exit to a building. [00:27:00] Uh, I guess you could look for a sign that said entrance, and then that would be the exit or something, but you would never even get out of a building. The other option is marking information as uncertain or unknown and then waiting for more evidence to come in, and it seems like we don't even do that. We generally assume it's all true, and then maybe later we will re-mark it as false or unknown or something like that. But that takes extra effort, extra time, maybe more information. And if we're in a hurry, or we just don't bother to get that information, or we're distracted, we don't take that step. And that, I think, leaves us open to a lot of these other techniques of [00:27:30] scamming us. It's the default belief that things are true to start with that can start the whole thing off.

Dan Simons: And it can be a momentary thing, right? It can be that you just initially accept it as true and very quickly say, no, that can't be right. Which is, you know, the ideal situation, especially if it's something that you are disinclined to trust in the first place. Yeah. So that can make it a lot easier. But, you know, I hear from a lot of people who consider themselves to be skeptics and critical thinkers, who are like, well, I could never fall for these sorts of things because [00:28:00] I'm skeptical, you know, I'm a good critical thinker. And yeah, maybe you're less likely to fall for an obvious scam, but I can guarantee you that there are scams that could target that skeptical thinking, and that you will pass along things you see on social media without vetting them fully because they're aligned with your worldview. So in that sense, you've contributed to deception without necessarily even knowing it. Right? So, you know, I don't think there's some sort of universal protection against this, and I don't think [00:28:30] you'd want it. If you're a perpetual cynic about everything, you don't have a lot of friends, right?

Speaker6: Right.

Greg Kyte: Chris, you likely wouldn't have gotten married if your future spouse was like, I love you, and you were like, hmm, let's see if we can find some evidence to back this up.

Speaker6: Can you?

Chris Chabris: Well, but you make a good point, because actually, in the whole category of romance scams, people do say I love you when it actually is not true. And it's a high-stakes evaluation [00:29:00] to make: should I believe that, should I not believe it, or maybe I should gather some more evidence? And people do that all the time, right? It's one thing to say I love you. It's another thing to buy an expensive ring, another thing to schedule a wedding, another thing to move in. But some romance scammers will go to all those lengths, right? So it's not necessarily even easy in that case. For those high-stakes decisions, you should be gathering more evidence and making a better judgment, maybe even getting some outside views, right? Should I really marry this guy? That [00:29:30] kind of thing. What do you think of this stuff he's been doing? Is this shady? We do those kinds of things often, but not necessarily enough, or in the right circumstances.

Dan Simons: And the most common form of that, of course, is the person online that you've never actually met in person. Right. Um, and that's a good situation to be asking yourself: if it's somebody online and I've never met them in person, are they really who they say they are? That should always be a question we're asking about people we know only online.

Caleb Newquist: So I want to ask you about, because you're right, I mean, I think we all know people [00:30:00] who fancy themselves to have, like, highly acute bullshit detectors, right? And so my question for you guys is: does that kind of overconfidence sometimes make those people, like, the perfect mark, in a way?

Dan Simons: You know, I've talked to a couple of magicians who have commented that some of the easiest people to fool are professors who pride themselves on being critical thinkers. Right. Um, and the reason is [00:30:30] magicians are very good at giving people a false story. They give you a narrative. They make you think, here's what I'm doing, when they're actually doing something totally different. So all they have to do to fool somebody who's a good critical thinker is give them a possible explanation for the magic effect that's wrong. And if they think they've discovered it themselves, they lock on to it. And by the time they reevaluate and realize that couldn't have been it, it's too late. They've missed the other opportunity. [00:31:00] Um, it's much harder to fool kids, who aren't paying as much attention to the story you're giving them. So that's a case where it might well be that critical thinkers are actually more susceptible. Mhm. Because they think they've got the answers, they're quick to find the problems, find the loopholes. And if a magician is good enough at subtly giving that cue, here's what I might be doing, so that they discover it themselves, then they're lost.

Speaker6: Yeah.

Greg Kyte: So, again, back [00:31:30] to the general concept of skepticism that we were just touching on. Caleb and I both have backgrounds in accounting. And one of the things that is actually, literally, an ethical requirement of accountants is something we call professional skepticism, which is basically trust but verify, except just without the trust part. That's what accountants are supposed to be doing. If I go in to [00:32:00] audit your books, you're going to tell me a lot of stuff, which is great, because I need to gather information, but I need to verify everything that you're saying. But the problem that we find throughout, you know, when we're talking financial statement scams, and a lot of times with the embezzlement cases that we see, is: if any auditor was doing the basics of their job, they should have found this. But they got fooled, because [00:32:30] of whatever. Um, I guess, have you guys heard of the concept of professional skepticism for accountants before? Or is that a new thing for you?

Chris Chabris: I don't think I had heard of it in those terms, but it certainly makes a lot of sense. That's exactly what you would expect an accountant is there to do: to verify what's been said, not to, you know, just believe it. And likewise, you would think that scientists are there to verify the data that they're getting [00:33:00] and the results that they're getting and so on. But often we can let our guard down when it's our own favorite hypothesis, our own study. We analyze the data, we get some results we want, we don't check our work, we don't verify that the calculation was done correctly. But if someone finds results that are completely against what they're expecting, or against their hypothesis, or bad for their career, they might really dig in and try to figure out what went wrong in the process. And that's the kind of asymmetry you don't want to have, because then you'll [00:33:30] publish mistakes that are in your favor, which makes your work look better than it is.

Speaker6: Yeah.

Greg Kyte: Well, and then that gets to, and I think this is a lot of what you're getting at in the chapter on hooks, about the butterfly effect, where it's like, you've got to make sure that they're not promising too much. And what I was reading into that is, with a lot of people who get fooled, they want so badly for the thing to be true that, [00:34:00] just like you're saying, Chris, they're like, okay, this is my favorite hypothesis, so I'm going to go ahead and run with it. Dude, that's me all day long with Dan Ariely. Dan Ariely is the guy who researched all the ethics, and then they found out that he was lying about his research about ethics. And I hate that, because I've been teaching ethics to accountants for a decade or more now, using all his stuff, and I love it. And so even [00:34:30] now I'm going, okay, he faked that data, but, I mean, the bottom line is his points are still valid.

Greg Kyte: That's still where I'm at. So I'm still a sucker. Even though, is he in jail? I don't think he is. Maybe he's in behavioral economics jail, at least. Um, but I think it's the same with auditors. Caleb and I were just talking about this last night: if you think about an auditor, yeah, I'm supposed to be the skeptic, but I also have all this pressure from my company to make sure that I [00:35:00] get this audit done quickly and with few hours. So I have all these incentives to want your books to be awesome and great and super clean and perfect, and then I just move on. So that's kind of the same thing that you're talking about. And maybe even, you know, expanding that to the entire population: we need to be more skeptical about the stuff that we're falling prey to, but we're not. And a lot of it, what would you call it? It's not [00:35:30] self-deception, but it's like self-deception.

Dan Simons: It is a form of self-deception. But I think the bigger issue is that we all have these tendencies to see what we expect to see, and to not look hard enough for things we don't expect to see. And scientists are subject to this as well, as Chris was saying. Yes, we're trained to be critical thinkers. Yes, we're trained to think about alternative hypotheses and explanations and to try and knock down the alternatives. But we tend to do that much better when we disagree with something than when we agree with it. And [00:36:00] one of the things that you can do, that scientists have started to do, is to put practices in place that essentially force you to do those things along the way. So there's a movement toward specifying in advance exactly what your hypotheses are supposed to be, and how you're going to analyze the data, and how you will interpret the outcomes. And if people do that consistently, then if it comes out the opposite of what you expected, you still have to report what came out, right? And you have to think about, okay, maybe it came out this [00:36:30] way for some other reason, and maybe I can find out why. But you're having to dig into it. You're not just dismissing it.

Speaker6: Right.

Greg Kyte: So wait, so you're saying scientists are being held to use the scientific method?

Speaker6: Finally.

Dan Simons: You would think that this would be kind of obvious. It's kind of amazing that this isn't the norm. It's kind of what's taught in second and third grade, when you learn about what a scientist does.

Greg Kyte: That's exactly what I was thinking when you were talking about this stuff. It's like, I swear, in fifth grade I had to write a report about, or, you know, pass a test on, exactly [00:37:00] that.

Dan Simons: And it's, you know, the sort of trend that seems like the sort of thing that everybody should have been doing all along. Um, but there are a lot of other incentives in science that lead people to do things that really aren't the best for finding, or better approximating, what's true. Yeah.

Greg Kyte: And that's the same with the professional skepticism that we see: everybody who's an accountant knows they're supposed to do it. And even people who aren't accountants, like Chris, like you were saying, it's like, well, that seems obvious, that's what an audit is supposed to be, right? But then we just don't do [00:37:30] it. And I guess what you're saying, Dan, is that's basically just a weakness that humans have in general.

Dan Simons: And the question is, can you put in place preventative measures that keep you from falling for it? Because that's going to get you a lot of the way there if you can set up those sorts of conditions in advance, those checks, so that you are automatically doing them regardless of how your data came out. So, for example, in my lab, we have a policy that we don't publish a result unless I [00:38:00] and somebody else in the lab, or two other people in the lab, independently analyze the data and see if we come up with the same numbers. Okay? And you find errors that way. I mean, everybody makes mistakes. If you don't do that, then the errors you notice are the ones that make your result come out not the way you wanted it to, and you miss the ones where it came out exactly the way you expected, because you don't check further there. But if you set up that procedure so that you're automatically analyzing the data multiple times with different people, you find out if you got something wrong.

Caleb Newquist: So, if I may, and [00:38:30] we haven't mentioned this yet, and I know this can be a very technical aspect in your field, but how much of this is our egos working against us? Whether it happens to be thinking that I have a good bullshit detector, or thinking, oh, I don't make mistakes, I'm very careful. Is it just our wiring, and we don't really notice it, [00:39:00] or is it that overconfidence I kind of mentioned earlier? How big of a factor is that when people are getting fooled?

Chris Chabris: The overconfidence can be a very big factor. Um, but it is true that many people are good bullshit detectors in areas where they have expertise. So I think the mistake is to think that, in general, I'm a good bullshit detector, or, in general, I can read people well and know when they're trying to cheat me and know when something legitimate is happening. You might be good within your own field, [00:39:30] because if you know a lot about finance, you can probably more easily tell when someone is trying to sell you a scam or a fraud rather than something legitimate. But you may have no expertise when it comes to whether medical advice you're getting is good or is pseudoscience, or something like that. Um, so I think that generally thinking you're really good at detecting bullshit is a bad strategy. You should know what your limitations are, where you're an expert and where you're not. We did come across a lot of stories, [00:40:00] though, where people seem to have been taken in by scams targeted at, you know, intelligent, educated, wealthy people, because those people were apt, perhaps, to make quick decisions and to trust their instincts and to think they were good readers of people, and they were very experienced and worldly, so they should be able to tell when they're being scammed and when they aren't. So, bullshit detection is great. You just have to know when you're good at it and really be aware when you're not.

Caleb Newquist: Right. And I guess, and you addressed [00:40:30] this in the book, it's essentially the curse of knowledge, in the sense that, as you say, you have this expertise, but that expertise isn't transferable to these other areas, and that's the vulnerability.

Dan Simons: Yeah. And I'd say there's also a second issue, which is that we often don't get feedback when we make mistakes. Right? So, um, in science, if you're not checking and rechecking your results every single time, you're not necessarily going to know if you made a mistake unless somebody else points it out. And people [00:41:00] don't like to make mistakes, they don't like to admit that they might make mistakes, and they don't...

Caleb Newquist: Like being told that they've made a mistake.

Dan Simons: They definitely don't like being told they're making mistakes. And you see this all the time when a paper is corrected or retracted: somebody finds out that they calculated something completely wrong and the key result doesn't hold, and they'll say, well, yeah, that's right, we made a mistake there, but still, the conclusions all hold. Which is kind of like, really? If the conclusions still hold, then why did we take that initial result to be meaningful at all? Yeah. Right. Um, and [00:41:30] it's a really common retort, because people don't like to admit that, yep, I could have been wrong, and maybe I got that completely backwards. Yeah, it happens. But there's a lot of science being produced in the same way. There are a lot of, you know, business reports being produced and financial statements being produced. And how often do we miss the mistakes that are out there? Probably quite a bit.

Greg Kyte: Well, I guess first off, just to round up the thing about how, if you're an expert in the field, you're more likely [00:42:00] to actually detect it: I'm a CPA, I'm an expert when it comes to accounting, and like I said, we have the fraud podcast. So even though I'm not impervious to fraud, I think I'm more likely to detect it. I'm also an expert in my own body. I've been in this body for 50, almost 52 years now, so I'm an expert in that. So when I did give myself the ivermectin enema, I felt it. I felt the COVID go.

Speaker6: Did you try the bleach?

Greg Kyte: I didn't, well, I mixed that [00:42:30] with it. It was kind of a whole compounding pharmacy kind of thing. Anyways. Um, but changing, well, not really changing subjects, because you guys were talking about being able to set up processes, put in place different procedures that you need to follow that are going to allow you to be duped less. It [00:43:00] seemed like you had something in every single chapter that was like, here's the problem, and here's a question you need to ask yourself that's going to help you be duped less. So, just real quick, could you give us a rundown of what people should be doing, uh, to help them get defrauded less?

Chris Chabris: Well, it's hard to go through all of them, but a couple of the best ones. Yeah.

Greg Kyte: Do your faves. What are the... [00:43:30]

Chris Chabris: Well, I'll kind of start at the beginning. So we started with truth bias. It's important to more often ask ourselves: is this really true? Simple question. It's amazing; again, it seems obvious in hindsight, and we can't ask it all the time. But when things are important, you know, like you got this text message saying your bank account has been locked or something like that: is that really true? Is that really what it seems to be? Then, when it comes to focus, before making a big decision, [00:44:00] we have to ask about the information we have in front of us: what's missing? What information is not here that I would really like to have in order to make this decision more accurately? And when it comes to situations where there could be a prior belief that we have that is really influencing our decision making, it's often good to ask specifically: what am I assuming is true that could influence my decision making here? So often it's not just, is [00:44:30] the thing in front of me true, but is there some other belief I have that is really critically determining everything else I'm believing?

Chris Chabris: A cute example of that is this idea called the Mandela Effect, where people believe that, like, history is being rewritten, timelines are forking, physics has gone wild, because they have a memory that, for example, Nelson Mandela died in the 1980s in prison, as opposed to dying in 2013 after being president of South Africa. Right. And [00:45:00] they are assuming that their memories are infallible, at least on this topic, and then changing all their other beliefs about how the world works, and physics and so on, to conform to that. Now, I'm not saying asking this question would have stopped these people from coming up with this hypothesis, but it can happen a lot in everyday situations. A simple one: oftentimes we proceed as though we assume that the people we're dealing with are not criminals, or have not been convicted of crime or fraud. But as you would know better than us, a lot of [00:45:30] fraud is done by people who've done it before, and in fact sometimes done it over and over and over again in very similar patterns. So perhaps looking them up before making a deal with them.

Speaker6: Right.

Chris Chabris: On an important matter, you know, might be good, right?

Greg Kyte: We talk about that: background checks. It's like, yes, you've got to do them. Just do them.

Chris Chabris: Yeah. And the reason we don't do them is we're assuming that the people we're dealing with are legitimate. And that's often an assumption we don't even realize we made at some point. We just made that assumption, yeah, in the back of our minds, and never [00:46:00] questioned it. So we have to surface it and question it explicitly. I don't know, Dan, you want to mention a couple of others?

Dan Simons: Yeah. Well, I'd take a step up a little bit and say, okay, the most important thing to think about first is: is this a situation where I should bother looking for more information? So you gave the example of this text, where you only had $18 in your bank account, so you decided you wouldn't worry about clicking the link, because it wasn't enough money to worry about. The time when you should worry about it, of course, is when you actually do have a lot at stake.

Speaker6: Yeah. And yeah.

Dan Simons: You know, [00:46:30] thinking about when to spend the time and effort to ask matters, because it's hard to ask additional questions, to constantly be double-checking: is this true? You know, should I trust this person who I've known for a long time online? Those things take a lot of effort, and if the stakes are really low, it's probably not worth your effort. When the stakes are high, the strategy that I like, that kind of encompasses most of the guidance we give, is: imagine that you were trying to defraud yourself. Right? The stakes are high. How much effort would [00:47:00] you go to? If somebody wanted to pull a con on you, like they want to get you to give them your bank account number, what kind of lengths would they go to if they knew that you had a lot of money in that bank account? What lengths would they go to to get you to, you know, enter into a business deal that's questionable? What lengths would they go to to cover the fact that they were embezzling, if they were somebody who was trying to deceive? Right? And if there's a lot of money at stake, and they [00:47:30] at least have some experience doing what they do, because people who commit fraud tend to commit fraud, how much effort will they go to? If somebody is trying to pass off a forged painting, and it's a high-value painting, they're going to go to a lot of effort, and it might be worth them spending six months or a year to set up the background for that, in order to make it more and more compelling.

Dan Simons: Right. In the same way that, you know, magicians often will go through Herculean efforts to set up an effect, so much so that nobody would think they could possibly have spent six months doing that [00:48:00] in order to make this effect work. It's so outlandish that nobody suspects it. But if it's central to their career, they're going to. Right? So I think thinking like a con artist brings out a lot of these things. If you think about somebody who is sending out texts saying your bank account might be compromised, okay, well, if they were going to try and make money doing that, how would they do it? Well, they'd send that text out to tons and tons of people, in which case any one recipient doesn't necessarily matter. They're trying to catch people who happen to have an account at that bank [00:48:30] and happen to have a lot of money in it.

Greg Kyte: And that's funny, again, even with that specific example, because what you're saying is basically: ask the questions to the extent that the stakes warrant it. Like you said, if the stakes are low, you probably can just not even worry about it. But for me, with that text, it was almost like I did it exactly the opposite, where I didn't fall for it because there was so little at stake. Whereas if I had had, you know, $40,000 in that account, I'd go, oh crap, that's my $40,000 [00:49:00] account, let's click that link and figure out what's going on.

Dan Simons: So when you first gave us that example, I was thinking, okay, you did that exactly backwards, right? The time when you should be thinking critically about it is the time when there is a lot at stake. Yeah.

Greg Kyte: And you need to slow it down, because there's so much at stake. Because, and that's another thing you guys pointed out in the book, I can't remember exactly, but there were some celebrities that got tricked into donating money to some scam website kind of thing, and they [00:49:30] just clicked through because they're busy. They're just busy and they're going fast. And I think that's part of it. I mean, I think that's kind of behind every single protocol: it's really just slowing down a little bit, or a lot. Is that fair to say?

Chris Chabris: Yes. And never believe an exploding offer, right? Like, can you guys think of any exploding offer, where you must respond within a certain number of minutes or hours or whatever, that's [00:50:00] actually important? Not just, you know, free shipping expires at midnight, but something really important that you should respond to that quickly? I don't think there is one, except in very specialized cases like job offers, right, where, you know, they'll keep the job offer open for a while. And even then, you could probably ask for an extension, right? But so often we somehow get tricked into thinking we have to respond quickly, and that's behind a lot of scams, a lot of the sort of high-volume scams [00:50:30] that try to ensnare a lot of people, as opposed to very bespoke, you know, million-dollar painting scams, where they know there are only a few potential buyers. But anyone can go to a Walmart and buy gift cards and start reading the numbers over the phone. So, yeah, just don't do anything that quickly. That's a simple heuristic for not being scammed. You won't miss out on anything important, right?

Dan Simons: And exploding threats, not just exploding offers. Right? I mean, a lot of the call center scams involve threats: the cops are going to come to your door unless you go buy gift [00:51:00] cards. And nothing ever works that way.


Caleb Newquist: Yeah, right. Uh, Greg, I have one question, and then I know you have a last question. I'm curious about what you guys think. I mean, this is your field, so you probably have a bias toward yes, more people need to learn about this; you wrote a book about it and you're trying to educate people about it. But Greg brought up auditors earlier, and I think that's a perfect example. And journalism is another field where skepticism is supposed to be really important. I don't really get the sense [00:51:30] that it's explicitly part of the background and training in those kinds of critical fields, where skepticism and asking more questions is crucial to doing the job well. There seem to be people who find a knack for it, and they learn those skills on the job. But as far as training and education, just simple awareness around this stuff in fields like, [00:52:00] again, I'm just using two examples, auditing and journalism, where this stuff could be really valuable, I don't get the sense it's even on the radar of those fields. So I just wonder if you guys can talk about that aspect, and whether you see any hope for weaving this kind of thinking into those kinds of fields.

Dan Simons: I would say those are a really good illustration of how we all have the same tendencies. Yeah. So [00:52:30] auditors and journalists and scientists are all supposed to be trained in critical thinking. They all get some training in critical thinking and how to ask questions and when to dig further. But they're all subject to the same sorts of biases that we all have, and the fact that we get some training about this doesn't necessarily immunize us against all of the ways in which we can make mistakes. Right? So that's where I really think having procedures in place that help prevent you from fooling yourself can make a big difference. [00:53:00] So having external vetting of sources is a good thing to do in journalism. I often tell science journalists, one of the things you should do if you're interviewing me for something is ask somebody about it who disagrees with me completely. And a good question to ask is: are there people out there who would completely disagree with this conclusion? And if they say no, everybody agrees with me, then it's like, well, then why are you doing the work? Because if it's so obvious that everybody will agree with you, then it's not interesting [00:53:30] or revolutionary in that way. Yeah. Um, and you can't market it as a big revolutionary finding or idea if everybody agrees with it to begin with. So I think that sort of questioning, who would disagree, why would they disagree, is an easy next step for a journalist to take in the context of science journalism. And if you make that part of your procedures, then you're going to catch yourself, because if you find somebody really compelling and a great storyteller and really charismatic, it's very easy to not question them.

Chris Chabris: It sounds like a [00:54:00] good example of where, you know, checklists can help, right? If you diligently go through the things on a checklist and you don't make a decision before you reach the end. And of course, you still have to actually go through the checklist, right? It's hard to force yourself to do that. So I like Dan's idea of having other people who have to do the same thing, where you have to reach agreement. We found many cases in our research for the book where other people saw a scam in progress and said, wait a minute, that's a scam, and in a couple of cases talked someone out of actually [00:54:30] going through with it. So that's a great idea. I wouldn't want to pretend to know how journalists should be trained better, you know, in the details. But I do think what our book can contribute is a framework for understanding why and when and how that additional skepticism and questioning might be warranted, and maybe some patterns of how people try to lead you not to do it. Right? So I think that could [00:55:00] be very helpful, providing a little bit of psychological framework and backbone to these common-sense ideas that are already embedded in a lot of professions, as Dan said.

Greg Kyte: Right on. Well, I actually have two last questions for you guys before we wrap this up. So the first one is: on page 150 of your book, Nobody's Fool, you use the word oeuvre. And I was wondering if you used that just to make me feel stupid, or if it actually [00:55:30] had some meaning for the book itself.

Chris Chabris: No, it was to make you look stupid. Okay.

Greg Kyte: All right. Okay, okay.

Dan Simons: We anticipated that you would interview us about the book, and we put it on that page just for you.

Greg Kyte: Well, then you nailed it. Well done. Well played.

Greg Kyte: Um, the other question I have, and it's funny, this kind of falls back to the scientific method stuff we were talking about earlier: [00:56:00] based on my own studies, I have formed a hypothesis. I made some observations, I tested my hypothesis, and I have drawn some conclusions. I want to see if you agree with me about this. Um, uh, fuck Malcolm Gladwell, right? Am I right?

Chris Chabris: I didn't see that coming. No. Um, I wouldn't use exactly those words, but. Okay.

Greg Kyte: I would. And I did.

Dan Simons: Just now, in fact. Yeah.

Chris Chabris: Yes, [00:56:30] we heard.

Chris Chabris: I don't know. Even I wouldn't, I mean, I think I've been in a somewhat bigger fight with Malcolm Gladwell than Dan has. Um, but you might be surprised to hear that I've read almost everything he's written, and I actually enjoy reading his stuff. But, in my view, you have to approach it with the right mindset. So what can you get out of Malcolm Gladwell? You can get some interesting stories and ideas that you then have to corroborate from other sources and from your own expertise, and so [00:57:00] on, before deciding whether to believe them or not. And you can also get some lessons in good, persuasive writing. He does a great job of describing science in a way that seems to make millions and millions of people interested in it and believing in it, and probably communicators and scientists should learn a little more from that, but at the same time adhere to the ethic of: we have to describe the science in the most fair and accurate way, and not pretend we know more than we do, and so on. So I don't hate [00:57:30] Malcolm Gladwell. I wish he would do a little better job, though.

Greg Kyte: Gotcha. So personally, I do hate Malcolm Gladwell. I know hate's a strong word, but I'm going to lean into it. I read the big three: you know, Blink, Outliers, and The Tipping Point. I read all three of those. And, Chris, I would agree with you: they were very enjoyable books to read. But then I felt like I got duped, [00:58:00] because with each one of them, after I finished reading the book, I was going, there is nothing actionable for me to do in this book. Like Outliers, where he was talking about the hockey players, where so many professional hockey players were born between the months of October and December. It's like, I can't go back and get born between October and December; this doesn't help me at all. And then I think, just by and large, the 10,000 hours thing has been shown [00:58:30] to be bullshit, because we've all spent 10,000 hours driving, and I still drive with a bunch of fucking idiots on the freeway all day long, so.

Chris Chabris: Well, you could still influence the birth date of your children.

Greg Kyte: I guess that's true.

Chris Chabris: Think about that.

Chris Chabris: Um, as far as the 10,000 hours thing goes, you know, someone who was a big believer in that theory might say, well, we aren't really doing deliberate [00:59:00] practice when we're driving. We're just trying to get to work on time; we're not actually trying to get feedback. And there are some people, of course, who like to honk a lot and basically give people feedback on their driving. But that's still a minority, you know.

Greg Kyte: And despite how many times I yell out my window, "Get off your cell phone, you asshole!" it seems like people are still on their cell phones. I mean, I guess I'm doing what I can.

Dan Simons: What's interesting about those cases is, of course, there's often some evidence behind the claim, and [00:59:30] then it gets exaggerated beyond the level that's actually supported. Right? So there's plenty of evidence for the effects of deliberate practice; generalizing that to say that talent doesn't matter is maybe a bit too much. Yeah. Right. And for me, that's the issue I have with a lot of popular science writing: it doesn't have the right level of humility. It states a claim with certainty, and if you're an expert in the area and you read those claims, you can see all [01:00:00] the problems with them, you know, with the research that's involved, and you realize it's a misrepresentation of the consensus, if there is a consensus. It's not representing alternative views or ways that it might be wrong. And Gladwell is a master at making people feel like they're having great insights, that they're smart for reading his work. It's a real skill to make people feel like they're getting a deep understanding of something. But he does that by cherry-picking, right? He does that by [01:00:30] exaggerating results, by not interviewing people who disagree with the people he's interviewing, because the people he's interviewing give him a good story. And that story is often there to support a broader narrative.

Greg Kyte: Um, okay. So then let's move on from Malcolm Gladwell to, uh, Billy Joel, Neil Diamond and Josh Groban, three other people that I hate.


Caleb Newquist: This is now the part of the show where we kind of phase into Greg's grievances. [01:01:00] Yeah. And I know this isn't necessarily your area of expertise, but, um, I don't know what your hourly rate is. Greg needs some serious help.

Greg Kyte: We'll have you guys back on another podcast to talk about those guys. Uh, I think our discussion of Malcolm Gladwell was probably sufficient for what we're going for today.

Caleb Newquist: Thanks, guys. There you go. Greg, [01:01:30] did we learn anything? Were you listening during that conversation? I feel like we were listening.

Greg Kyte: I was focused, man. I think the biggest takeaway for me from the interview was when we were talking about how the silver bullet to not getting defrauded is to just slow down. Like, yeah, just take your time, slow down, do more due diligence; that's what you're slowing down to do. Um, and that's definitely [01:02:00] an oversimplification, because, you know, you and I were talking about this in preparing for the interview, and having read the book: we do have this truth bias, and if we just abandon our truth bias and make sure that we confirm everything that everyone ever tells us, we will never make a decision, ever.

Greg Kyte: And we won't have jobs, and we'll [01:02:30] be broke, because we'll just be spending all of our time trying to figure out whether every statement that's been made to us is actually true. One of the other things I liked was the idea, in one of the chapters, about consistency. As humans, we're risk-averse, we're loss-averse, so we want consistency. We want to go: okay, if I do this, then I know these following things are going to happen. And [01:03:00] if somebody promises consistent results, and the actual results are too consistent, you need to be going, wait a second, why is there not any variability in these results? So I thought that was very interesting. I also liked "compared to what" as one of the questions they want you to ask. And that goes back to, if you remember, Charles Ponzi's original Ponzi scheme. Yeah. It [01:03:30] was some astronomical rate of return, like a 50% return in six months or something.

Caleb Newquist: Oh, I think it was shorter than that.

Greg Kyte: Yeah, it was ridiculous. But I think he broke it down so you'd get like a 6% return every month, and you just go, oh, 6%, okay. But then you go, compared to what? Any other investment is going to be like nothing compared to that. And that's [01:04:00] how you can tell if something's too good to be true.


Caleb Newquist: Right. Yeah. I think, um, for me, the thing this underscored, or validated, is that, and we like to pick on auditors here, as people well know, yes, but to me it seems like some enterprising accounting school should put [01:04:30] psychology into its curriculum for people who are going to be auditors. Because if you're going to have this trope of professional skepticism, what is that based on? You just throw people out in the field and they learn how to be professionally skeptical? There's no foundation for what that even means. Right? Right. And when you read about professional skepticism [01:05:00] in your auditing classes, yes, they define it and they explain it. But what's the practical application of that in the real world? How do you do that? And, you know, Chris and Dan had suggestions, like checklists and things like that, and those are all good. And famously, or infamously, auditors have shitloads of...

Greg Kyte: Checks and checklists. Yeah, exactly.

Caleb Newquist: But the thing is, that actually has to be rigorous, right? You have to somehow make it rigorous instead of monotonous, [01:05:30] which is what it actually feels like in practice. It has to have rigor so that it actually works. And I think that's something auditing sorely lacks. And I'm sure, like we've joked before, the very last auditor that's listening to this podcast, this will be the last episode they listen to. But in any case, I think if I were going to design a curriculum for auditors, there'd [01:06:00] be some heavy-duty psychology curriculum, something that helps with that.

Greg Kyte: And I totally hear what you're saying. When they were like, checklists are a good idea, I'm like, you're talking to us about fucking checklists? We're checklist fucking people, okay? You psychologists don't even know checklists. Listen, I got checklists on checklists, motherfucker. Um, but I also think, in terms of the psychology, because [01:06:30] I felt like the concept of professional skepticism is very easy to understand. Sure. And even Chris copped to that; he's like, yeah, isn't that what auditors are supposed to do? So for him it was very intuitive that we should be like that. But the problem is that we're supposed to be professionally skeptical, but then we also, like I was trying to say, have these requirements. [01:07:00] It's like, here's our billable-hour budget. So you can be professionally skeptical, but only if you've got time to burn, that kind of stuff. And so the psychology I would want to see would be psychology saying: hey, listen, you're motivated because your paycheck is coming from the person you're supposed to be calling bullshit on, and that's a conflict of interest. So let's make sure you understand that, and that [01:07:30] you understand how to deal with conflicts of interest, and that you can clearly identify that you've got perverse incentives, really, as an auditor. Yeah. You need a heavy dose.

Caleb Newquist: You need a heavy dose of psychology, and you need a heavier dose of ethics. Yeah. Uh, and, I don't know, Kant or whoever. I don't know Kant. He's one of them.

Greg Kyte: Kierkegaard.

Caleb Newquist: He's one of those, too.

Greg Kyte: Yeah. So let's just throw out some [01:08:00] philosophers.

Greg Kyte: The ones everybody's heard of, but nobody's actually read. Yeah, yeah.

Caleb Newquist: The ones with K's. Those are good ones.

Greg Kyte: Right? They're the best.

Caleb Newquist: The best.

Greg Kyte: If your last name starts with a K, I mean, I think that goes without saying. Yeah, that's absolutely smart.


Greg Kyte: Well that's it for this episode. Uh, remember.

Caleb Newquist: Wait a minute, wait a minute. We can't... wait, wait, wait. So Malcolm Gladwell is not going to hear this, ever. No. But you [01:08:30] went pretty hard at him, Greg. So.


Caleb Newquist: But here's the other thing, which the audience may or may not know, because we did not explain it in the podcast: a while ago there was, let's say, a small beef or a small feud between Chris and Dan and Malcolm Gladwell. And I [01:09:00] found it fascinating. I think there are valid points on both sides, but Chris and Dan are scientists, so that's going to be their perspective, and Malcolm Gladwell is a, uh, storyteller slash bullshit artist. So I can understand that. Right. And so, anyway, if you're so inclined, investigate on your own; you won't be disappointed. [01:09:30] Um, and so that's where that all came from.

Greg Kyte: Yeah. Well, and I stand firmly with what I said, all right? Yeah. You all heard it.

Greg Kyte: All right. Well that's it for this episode. And remember, if you think your bullshit detector is amazing, well that's bullshit. So apparently it isn't as good as you think it is.

Caleb Newquist: And also remember to be more skeptical, especially of Malcolm Gladwell's books.

Greg Kyte: If you want to drop us a line, uh, send us an email at ohmyfraud@earmarkcpe.com. And [01:10:00] Caleb, if people want to get ahold of you, how can they do so?

Caleb Newquist: They can find me on LinkedIn. Uh, my full name, Caleb Newquist. Greg's also on LinkedIn: Greg Kyte, CPA.

Greg Kyte: That's me. Yeah. Bald guy, glasses and a beard.

Caleb Newquist: As we said at the top of the show, there are links in the show notes to Dan and Chris's books, Nobody's Fool and The Invisible Gorilla. Get those wherever you get books. Uh, and check them out on Twitter and LinkedIn. Uh, the [01:10:30] Invisible Gorilla video is on YouTube. Uh, all kinds of good stuff. And yeah, enjoy. Oh My Fraud is written by Greg Kyte and myself. Our producer is Zach Frank. Rate, review, and subscribe to the show wherever you listen to podcasts. If you listen on Earmark, you can get CPE. That's a thing.

Greg Kyte: It's so nice.

Caleb Newquist: It is nice. Yeah. Join us next time for more avarice, swindlers, and scams from stories that will make you say, oh my fraud!

Greg Kyte: oh my fraud!

Creators and Guests

Caleb Newquist
Host
Caleb Newquist
Writer | Content at @GustoHQ | Co-host @ohmyfraud | Founding editor @going_concern | Former @CCDedu prof | @JeffSymphony board member | Trying to pay attention.
Greg Kyte, CPA
Host
Greg Kyte, CPA
Mega-pastor of @comedychurch and the de facto world's greatest accounting cartoonist.
Christopher Chabris
Guest
Christopher Chabris
Professor & Director of Decision Sciences, Department of Bioethics and Decision Sciences, Geisinger Research Institute; Faculty Co-director, Geisinger Behavioral Insights Team
Daniel Simons
Guest
Daniel Simons
Professor of Psychology; Head of the Visual Cognition Laboratory, Department of Psychology, University of Illinois Urbana-Champaign