Hey Facebook: What’s that Smell? Part 2 Calling Bullsh!t
Stated purpose: To empower all of us to build community and bring people closer together.
The Wall Street Journal dropped conclusive proof that Facebook knew its algorithms were dangerous all along. Whistleblower Frances Haugen went public with allegations that, among other things, Facebook routinely places profits over public safety. The congressional investigation into the Capitol insurrection got underway. Oh, and Facebook ignored all of that and blithely announced the name change to “Meta”.
Join us to hear what UCLA Professor and author Ramesh Srinivasan had to say about all of this. And for a revision of Facebook’s BS score. Spoiler alert: it didn’t go down.
I’m just gonna keep feeding you conspiratorial and outrageous content so that you go crazy with dopamine firing in your head. It’s like staring at a burning car.
– Ramesh Srinivasan
Zuckerberg: I started Facebook, I run it, and I’m responsible for what happens here.
Whistleblower Frances Haugen, insisting Congress must act against a company she says is misleading the public and promoting hateful and harmful content.
She laid ultimate responsibility on Facebook founder and CEO Mark Zuckerberg.
TY MONTAGUE (VO): Since we wrapped our first episode on FB, it has been revealed that, hell-bent on growth, Facebook leadership actually knew of the harm they were causing to users – and they didn’t care. So, in this episode, we continue our exploration of events and learn how this new information affects Facebook’s BS score.
And if you haven’t listened to part one, it’s in your feed now. We’ll be right here when you come back.
TY MONTAGUE (VO): Welcome to Calling Bullshit, the podcast about purpose-washing…the gap between what companies say they stand for and what they actually do — and what they would need to change to practice what they preach.
I’m your host, Ty Montague. I’ve spent over a decade helping companies define what they stand for — their purpose — and helped them to use that purpose to drive transformation throughout their business.
Unfortunately, at a lot of organizations today, there’s still a pretty wide gap between word and deed. That gap has a name: we call it Bullshit.
But — and this is important — we believe that Bullshit is a treatable disease. So when the bullshit detector lights up, we’re going to explore things that a company should do to fix it.
TY MONTAGUE (VO): Facebook says its mission is to give people the power to build community and bring the world closer together. And we’ve called Bullshit on that. In fact, in our previous Facebook episode, we gave them a sky-high BS score of 72.
But some pretty important news has dropped since that episode wrapped. So, let’s do a quick catch up.
Since episode one, there has been both good news and some seriously bad news from Facebook.
TY MONTAGUE (VO): First, former data scientist Sophie Zhang (rhymes with Mahjong) came forward.
ABC NEWS: Sophie says she saw evidence that political parties across 25 countries had been manipulating Facebook to mislead and in some cases harass their own citizens.
TY MONTAGUE (VO): However, in fairness, Facebook also took a step in the right direction by creating an “independent advisory board,” similar to what Kamran Asghar, founder and CEO of Crossmedia, suggested in our first Facebook episode. When I heard of its formation, I thought it might be a turning point.
But then, the hammer fell.
WSJ: This is the Facebook Files. A series from the Journal. We’re looking deep inside Facebook from its own internal documents.
TY MONTAGUE (VO): Frances Haugen – a product manager in Facebook’s civic integrity unit – left the company in May, taking internal documents with her. She delivered those documents to the Wall Street Journal and law enforcement, giving us an unprecedented look inside Facebook.
Haugen: The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook over and over again chose to optimize for its own interests, like making more money.
TY MONTAGUE (VO): In October Haugen testified to Congress that Facebook had extensive research on ALL of the problems and negative effects of their platforms – from misinformation on Facebook to toxic Instagram content targeting young female users.
Not only did they know, they chose to do nothing about it.
Haugen: The only way we can move forward and heal Facebook is we first have to admit the truth. The way we’ll have reconciliation and we can move forward is by first being honest and declaring moral bankruptcy.
TY MONTAGUE (VO): And the way Mark Zuckerberg chose to move forward in the midst of their very public “trial” for moral bankruptcy was to change the company’s name.
Mark Zuckerberg: To reflect who we are and what we hope to build I am proud to announce that starting today, we are now Meta.
TY MONTAGUE (VO): It was a head scratcher. It’s a move right out of the Big Tobacco playbook (remember when Philip Morris became Altria?). And even if it was genuine, it was pretty much guaranteed to raise suspicion about the company, not lower it.
TY MONTAGUE (VO): Zuckerberg isn’t going anywhere, and he has surrounded himself with a tight group of people at the top who literally don’t care that they’re harming the welfare of users. Which is why we’re seeing Facebook employees like Zhang and Haugen go public, joining former co-founder Chris Hughes, former investor Roger McNamee, and others in calling for Congress to step in. But, as we’ve seen lately, Congress finds it hard to agree on some pretty basic stuff… like that an attack on the Capitol is bad and voting is good, for instance. So I’m not holding my breath.
So here’s the question: what the fuck do we do?
TY MONTAGUE (VO): To try to get to the bottom of this, I decided to reach out to an expert who has some unusual and, I think, sorely needed attributes: he’s a trained software engineer with a decidedly humanist point of view.
My guest today is twenty-first-century renaissance man Ramesh Srinivasan. A Silicon Valley native, UCLA professor, AI engineer, anthropologist, and author of several books, including:
Beyond the Valley: How Innovators around the World are Overcoming Inequality and Creating the Technologies of Tomorrow
Whose Global Village? Rethinking How Technology Shapes Our World.
Ty Montague: First I wanna thank you for being here, Ramesh, and welcome to Calling Bullshit.
Ramesh Srinivasan: Thank you for having me, Ty. I’m really excited to be part of this.
Ty Montague: So before we get into the topic of the day, which is Facebook, I’d love for you to just introduce yourself.
Ramesh Srinivasan: So I’m a very strange cat in the sense that I’m a mix of a humanist and a social scientist, right. I’m trained in anthropology, cultural studies, all these issues, as well as engineering. And I think we need all three of these spheres to be in conversation with one another to get us out of the mess, the many messes, we find ourselves in right now.
My work is really rooted in the larger question of how we humanize technology. How do we transform technology so it serves the best purposes of humanism? And when I say humanism, I mean that idea that we all feel, and sometimes forget, that we’re all in it together, you know. That your well-being might be connected to my own.
So it’s that sort of respect for peoples and their lives across our planet, particularly in the continents of Asia, Africa, South America, and so on. You know, these people together represent the vast majority of technology users, yet they’re conspicuously absent from almost anything involving major decisions around big technology platforms. So this idea of dignity and respect for all peoples, which has such promise in relation to the internet, my work is really trying to drive us back toward that as our north star.
Ty Montague: Yeah. And I saw somewhere that you described your work as the intersection of technology, innovation, politics, business, and society. What was it that drew you to that topic?
Ramesh Srinivasan: So I am a late-nineties Stanford graduate in engineering. I worked for two years right after my undergrad in Amsterdam, in the Netherlands, developing machine learning technologies. And all the while, both in my classes as an undergraduate and in my own personal life, because of my family being from India and traveling to different parts of the world, I saw an incredible disconnect between the priorities, the lives, the values, the belief systems of people around our beautiful planet and where we were driving technology. Even in my context as an AI developer in the late nineties, there was a disconnect with the realities I was witnessing and the insights that came from other fields and many other parts of the world.
So you can’t really understand technology and what it means without thinking about things that are not about technology, if you know what I mean. But all the while, what we tend to do is elevate, if not fetishize, technology. And that becomes the object of our gaze and our attention, rather than what it actually means for people, for our planet, for diverse cultures, for democracy, for economic justice, for racial justice. All these issues, those are not questions fundamentally about technology.
And I think I’ve always been sort of a humanist at heart. Someone who cares about these values more than anything.
Ty Montague: We need more people like you with that incredibly diverse background. Right. It’s very rare, it seems to me, particularly in the Valley.
Ramesh Srinivasan: That’s true. Yeah. And I’m in Silicon Valley right now. I’m from Silicon Valley. I went to the same high school as Steve Jobs, then Stanford in the late nineties. So this world is very close to my own personal world. Yet I’ve also seen how incredibly opaque and limited and actually nearsighted this world is. It’s very disconnected. That’s why I called my last book Beyond the Valley, in both a literal and metaphorical sense.
Ty Montague: Yeah. So let’s pivot to Facebook. This is our second episode on Facebook. In the first one, the inciting incident really was the Capitol riot, and we talked a lot about Facebook’s role in it. And the big change since we wrapped episode one is that we’ve learned, via whistleblower Frances Haugen, that Facebook knows. Facebook knows that they are endangering mental health and, in some cases, physical wellbeing. And they choose to ignore it. And so for us at Calling BS, that really raises the stakes. So first I’d just love to hear what you think about all of that.
Ramesh Srinivasan: I mean, what Frances Haugen leaked, what she was privy to as a whistleblower, was what a number of scholars, you know, not just me, but dozens of us, and journalists by the way, were pointing out. We all saw these things. We saw the effects in various parts of the world. We saw the effects with January 6th. We saw the effects with the incredible divisiveness associated with Brexit and the 2016 Trump election. So we saw, wait a second, what’s going on here? These algorithms are manipulating us psychometrically to be more divided than ever.
So we were seeing this. It was extremely helpful for our cause as scholars to get that corroboration from internal studies Facebook was doing, showing, for example, how Instagram, and any of us who’ve used Instagram can see this, and you can imagine being a younger person in their teens or early twenties, is correlated with the feeling that I’m never good enough. You know, I’m never good enough. I’m never beautiful enough. I’m never hot enough. I’m never smart enough,
Ty Montague: Skinny enough, yeah.
Ramesh Srinivasan: Yeah. Because so much of Instagram is artificially filtered, right? So basically these revelations were extremely important because they showed what Facebook is attempting to do, which other big tech companies are also trying to do in their own realms: basically dominate our lives in relation to all things socially oriented, right.
They wanna basically be the place we go to communicate with, know, and socialize with one another. But in the process, what’s occurring is not you and I talking with one another, Ty. It’s some weird stuff that’s going on in the middle that’s manipulating what you see and when you see it, and what I see and when I see it. And there’s this massive acquisition of data that is being used to target us as psychographic subjects. Not demographic, but psychographic: what will drive you crazy? What will get you aroused? What will get you to be more extroverted?
Ty Montague: A hundred percent. As I was preparing for this, I ran across a quote from Sean Parker, Facebook’s first president, in Axios. And in this interview, he said that Facebook and Instagram constantly ask themselves one question: how do we consume as much of the user’s time and attention as possible? And then he went on to say: we were exploiting a vulnerability in human psychology. The inventors, creators, Mark Zuckerberg, Kevin Systrom, all of these people understood it consciously. And we did it anyway. And I guess my question is: why is this even legal? It’s odd to me that we allow this to go on.
Ramesh Srinivasan: Well, partly because we’re only wrapping our heads around what’s going on recently. And our lawmakers are even more slowly wrapping their heads around what’s happening here. I mean, I’m trying to work with as many of them as I can. One piece of good news is that lawmakers across the political spectrum are concerned, for various reasons. The thing is, what is occurring here is leveraging the kind of principles of free speech. And I’m all for free speech.
These technology companies use that as their sort of crutch, and then they do whatever they want with us.
I don’t mind that the internet supports the capacity for all types of speech. But what I do mind deeply is that the most heinous forms of speech are what are being most prioritized. So, you know, Ty, it’s one thing for me to say: you should be able to speak how you wish and you should be able to read what you want.
It’s another thing to say: Ty, I’m just gonna keep feeding you insane, crazy, and at times conspiratorial and outrageous content, so that you go crazy with dopamine firing in your head. And it’s like staring at a burning car the entire time.
So actually these companies are not prioritizing actual free speech, in the sense that all forms of speech are, you know, maybe even equally accessible.
Ty Montague: No, I’m sorry, I’m interrupting. You go. I’m excited.
Ramesh Srinivasan: No, just the last thing I would say is: in democratic societies, there have always been voices that are a little extreme on various margins. That’s part of democracy. Here, what’s happening is we’re all being presented with completely different worlds at our fingertips. And we’re all glued to our phones, on these apps and on these websites, all the time. So our entire experience is completely different. Say you and I were the same demographically, the same politically, the same in terms of economic class. We could still be presented with completely different worlds on these platforms.
Ty Montague: Right. We live in completely different reality bubbles on the platform. And that’s the thing: they claim that they shouldn’t be responsible because they exercise no editorial control on the platform, that they are not a media platform. But the algorithms exercise editorial control. That’s the point. It’s just non-human. But that shouldn’t absolve them. They created the algorithms.
Ramesh Srinivasan: That’s the point. Opt-in is the default on every level. And it’s not just opt-in. You know, people like Edward Snowden have made the point that even if our phones were on airplane mode, we’re being recorded all the time. You know? Say I’m hanging out with you, Ty, but I left my phone at home. Data about me is still being gathered through your phone.
Ty Montague: Exactly. Which is creepy.
Ramesh Srinivasan: So you can stitch together data points and triangulate.
Ty Montague: Yeah. Okay. So let me shift gears for a sec here and get into another aspect of this. My sense, given Facebook’s scale, is that we’re all affected by Facebook in some way today. But I’m wondering what you would say to somebody who says, well, if you don’t like what they do, just delete the platform, don’t use the product?
Ramesh Srinivasan: Easier said than done.
If you’re in a country in the world where Facebook is the media network. They say we are not a media network. They’re the biggest media network in the history of the world, period. Right. You know, I mean, they’re only vying with YouTube, which is of course part of Google. And if you want to look at the technologies, Instagram, Facebook, and WhatsApp, if you look at the union of those three groups, we’re talking about 3.5 billion people, out of 8 billion or so people in the world. They’re all media platforms, and if you put it all together, the biggest media network in the history of the world.
So saying to someone, oh, well, too bad, you can just choose to get off our platform and use something else, is really unrealistic on so many levels. Right. I mean, because of what we call network effects.
I’m on Facebook not because I care about Facebook. I’m on Facebook to connect with my relatives in India, with people I worked with in South America. That’s why I’m there. So it’s too late: they are essentially a utility, if not something approaching a monopoly. They are the de facto place for socializing. And I don’t mean Facebook the technology. I mean Facebook the company, which again owns Oculus, which is their gateway to Meta by the way, plus Instagram, WhatsApp, and Facebook.
Ty Montague: Yep. And in terms of Facebook’s attitude about all this, I read another thing that I found sort of outrageous. This long-time Facebook executive and now new chief technology officer, CTO Andrew Bosworth, came right out and said that he blames users for the choices that they make on the platform.
Ramesh Srinivasan: oh, come on.
Ty Montague: Yeah,
Ramesh Srinivasan: That’s insane.
Ty Montague: He said a bunch of crazy things. He said that rather than social media, people are to blame for the proliferation of misinformation online. He said the onus should be on the individual in any meaningful democracy. He called Facebook a fundamentally democratic technology, despite the recent revelation that the platform allowed high-profile users to break its policies. Asked whether vaccine hesitancy would be the same with or without social media, he said that individuals choose what sources to trust, that’s their choice, they’re allowed to do that, and if you have an issue with those people, you don’t have an issue with Facebook, you can’t put that on me. And that attitude seems, I don’t know, anti-human in a way. It just seems wrong.
Ramesh Srinivasan: Yeah, it’s uninformed also.
Ty Montague: In addition to just being untrue.
Ramesh Srinivasan: It’s uninformed. And sometimes these are perspectives you hear from people who are just so ensnared within the tech bubble that they somehow think that what they’re creating is an open portal to the world.
How is that possible, if what I see on Facebook is determined computationally and algorithmically by Facebook? Can they tell me how I can experience my Facebook feed in a way that is more open, or more self-directed by my own choices, my own values? Of course not.
Ty Montague: Yes, indeed. So, spending another minute outside the US: as you pointed out, the vast majority of users are outside the US. And one aspect is that the US is where most of their moderation happens. In other words, they pay much more attention to the US platform than they do outside the US. Which means that most of their users are looking at a completely unmoderated platform in terms of hateful content. So I just wondered if you could talk about the impact in places like the global south, India, for example, and what the effects are on democracy worldwide?
Ramesh Srinivasan: Yeah. I mean, Facebook, the company, understands that the most crap they’re gonna get, or will have to deal with, is in the US. Right.
So in other countries, this approach of just getting people content that rattles them, that outrages them, that drives them a little crazy, is the norm, right? So we’ve seen how that manifests in genocides, in Myanmar with the Rohingya community. We’ve seen how it’s manifested in the more recent attack on Burmese people by the state government. We’re seeing it right now in the Tigray region of Ethiopia. We’ve seen this in Sri Lanka. We’ve seen it right here at home on January 6th. We see it all around the world. And we can also see how despots and authoritarians, people like Rodrigo Duterte in the Philippines and Jair Bolsonaro in Brazil, have been able to use social media to divide and conquer, to polarize people. And our former president was a master at being a central node of digital activity on many different platforms. He put out content and messaging that worked perfectly well for these dystopic forms of virality and visibility that I’m speaking about.
TY MONTAGUE (VO): Ramesh’s take on Mark Zuckerberg, obstacles to regulation and Facebook’s revised BS score – After this.
TY MONTAGUE (VO): Back with Ramesh Srinivasan, author of Beyond the Valley: How Innovators around the World are Overcoming Inequality and Creating the Technologies of Tomorrow
Ty Montague: So, a different topic, but relevant I think, and some of our listeners may not understand this. I didn’t for a long time. Mark Zuckerberg is an unusual CEO in that he holds absolute power at Facebook.
Ramesh Srinivasan: That’s right.
Ty Montague: Because he owns the majority of the voting shares. So unlike most normal companies, he doesn’t report to the board as most CEOs do. The board doesn’t decide; Mark does, on everything. And when you look at his actions, personally, and I guess professionally, they’re hard to peel apart. What do you think it tells us about his goals for the future?
Ramesh Srinivasan: So the way Zuckerberg is able to get away with this is through what’s called a dual-class share structure. That basically means Zuckerberg owns a different class of equity, one whose voting power can’t be overcome by other people on the board’s decisions. Facebook basically has a strategy, from this example to other ones, including their so-called independent advisory board. They basically say: hey, we’re down with regulation, we’re down with your opinions, your voices. But at the end of the day, these are just little advisory groups, or a board that doesn’t actually have any power. Why would the board even care to change the status quo, unless they have more humanist or humanitarian sensibilities, given that they’re making more money than ever?
So what you actually see is, again, a bait and switch. They say they’re innovative, but actually they’re really anti-innovation, in the sense that they’re not resourceful at all. They say they’re supporting democracy; really what they’re doing is figuring out a masterful model of the triumph of a kind of oligarchic capitalism over democracy. So when we talk about governance of these technology companies, you can’t expect them to do any sort of self-regulation at all. Because it’s actually just little fake agreements to the public, like: hey, okay, next time we’ll have better AI.
Okay, next time we’re part of this consortium with the ACLU.
None of those things ultimately change the decisions that are made on a business or technological level by the company itself.
Ty Montague: Right, yeah. I think you don’t agree with this, but obviously you’ll correct me. So let me take a run at this. Mark has absolute power, and he doesn’t seem to care whether Facebook is good for people or good for the world at all. In fact, we know that he is now knowingly pursuing revenue growth at the expense of people’s mental health, young people, children in some cases. He knowingly lies in public, occasionally to Congress, about his intentions to change. And so I look at that behavior and it raises the question: is Mark a sociopath? Does he have some kind of clinical condition?
Ramesh Srinivasan: Yeah. I mean, you were right to say that I would answer no to that.
Ty Montague: I heard that you disagreed with this, and I just wanted to hear you talk more about why.
Ramesh Srinivasan: I think the system is sociopathic, right. I think the system is deeply inhumane, actually. And as far as Mark Zuckerberg himself is concerned, I actually met him a little over a decade ago. I mean, I wouldn’t call him shiny, happy people, but I thought he was just kind of that way a lot of engineers can be. I mean, this was a long time ago. He just sort of wants data and evidence and analysis, and is interested in the growth of his platform.
Right. So it’s sort of like we have created a system where growth, no matter what, is what is valued. And some of these people, and this is a very humane guess, are just overwhelmed, not only with growing the platform, but with all the hysteria and criticism and all of it. It just feels like they’re not the people we should rely on to do much of anything. They have to be forced to be accountable to a public that they monetize. And also remember, all of these companies live on an internet that was paid for, I know it was a long time ago, in 1969, by US taxpayers. They’re exploiting the public on multiple levels, in terms of public investments that they’ve monetized for their own private benefit, just like Pfizer, which seems to be like the state corporation right now, and Moderna. You know, I’m all for vaccination, but I don’t like this. I don’t like that we are having a privatization of public life on every single level.
Ty Montague: Right. Yeah, I agree that the system is sociopathic. But given the amount of power and wealth that he has accumulated, how do we begin to think about protecting ourselves from Mark and other people like him?
Ramesh Srinivasan: It’s time for actual decision-making power to rest in the hands of third parties that represent people. And there needs to be more of a relationship between private technology companies, which have created certain types of spaces and technologies that they should certainly benefit from on an economic level, and the rest of us. You can’t expect this to occur within the bubble of a tech company, within the bubble of this hyper-growth-oriented, valuation-oriented, toxic business model. So I think we’re just gonna have to try our best. I mean, I tend to be someone who really deeply admires Cornel West, actually. He’s like my hero.
And he just calls everyone “my brother.” So let’s try to say: hey Mark, you’re my brother, but we’re gonna call stuff out. We’re gonna keep it real. And we’re gonna have to ensure that this kind of egregious offense against people around the world, especially the people who are most vulnerable, working-class people, gets called out.
Ty Montague: So one of the other levers I’ve been thinking about, as I’ve been circling this problem and trying to figure out how we begin to get out of this, is the employees. I mean, Frances Haugen is one of a handful of employees, which now also includes co-founder Chris Hughes, who have spoken out, which is incredibly courageous, particularly for her. I assume that Facebook, like most companies, is full of good people. How do we encourage more of them to either whistleblow if they can, or just vote with their feet and go do something else?
Ramesh Srinivasan: I think some of the greatest activism in terms of actually raising our awareness as a public around these issues has come from employees within these companies. So I have so much appreciation for people who work in these companies. You know, a lot of engineers are very good people who just weren’t necessarily trained in all these other issues, but that may not necessarily be their job.
Ramesh Srinivasan: The problem is, someone needs to have that awareness, that kind of more holistic, interdisciplinary, humanist sensibility. And they need to have fiat decision-making power. That’s the issue here. But when I look toward the future, believe it or not, I have a lot of, I don’t know if it’s optimism as much as determination, that we can move things in a much better direction. In a much more humane direction.
Ty Montague: I wanted to spend a little bit more time talking about regulation.
Ty Montague: One of the things that you’ve said is that we need legislation that, to quote you, “sets the right balance between free speech and algorithms that make hate speech and blatantly false information from unreputable sources go viral.” And obviously I totally agree with that. What do you see as the major challenges to regulation?
Ramesh Srinivasan: What a great question. I think in this particular case, in a sort of strange way, I do have a lot of hope, because progressives, people in the center, and Republicans are all pretty miffed by the status quo when it comes to big tech companies. A lot of Republicans like to claim that the tech companies are biased against conservatives and conservative interests, and that they actually don’t support free speech. A lot of people on the more progressive side of the political spectrum are pretty outspoken about how these oligopolistic, if not monopolistic, tendencies can hijack an economy and hijack equality. So whether it’s equality, or whether it’s “you guys are biased against us,” everyone’s got a problem.
Ramesh Srinivasan: So I think there are certain things we can do. And I’ve been calling for what I call a digital bill of rights. Senator Klobuchar just introduced a privacy bill; it has some aspects of what I’ve been calling for, which I think is more progressive. Representative Ro Khanna, who I think is really brilliant on these issues, just released a book on them and has been an interesting figure in these discussions. I think there’s fertile ground here for us to do something about the status quo.
Ty Montague: That’s good news.
Ty Montague: So another thing I wanted to talk about is the frame of reference, I guess, that we bring to regulation. Because for some reason, when we look at tech companies, it’s viewed as a big deal to regulate them. But we regulate for the common good all the time. Certain drugs are illegal, and there are age limits on things like drinking or driving a car or flying an airplane. Why do you think we don’t automatically view Facebook or other big technology companies through that same lens?
Ramesh Srinivasan: I think it’s because of a couple of things. I mean, first of all.
Ramesh Srinivasan: It seems different than a car, which we use in very specific, defined kinds of actions and activities. And in those activities, we’re like, yeah, there should be seat belts in the car. Right? But our digital lives are our lives, increasingly.
Ty Montague: Right.
Ramesh Srinivasan: For better or worse, right? So it’s sort of like saying our lives should be regulated. And I don’t think people have an easy time wrapping their heads around that.
Ramesh Srinivasan: That’s why we need actual regulators, right? Remember, regulation is not the enemy of business. Regulation actually allows businesses to maintain their competitive edge.
Ty Montague: Right.
Ramesh Srinivasan: If you look at, for example, the last time a major company was broken up in United States history, that was AT&T. And last time I checked, AT&T is doing pretty well right now. Right?
Ramesh Srinivasan: So it forced the company to actually compete. And that is what regulation, at least in that particular case, antitrust action, is about. In a way it’s asking the company to maintain integrity and actually innovate, right? Innovation, in this case, isn’t just creating a new iPhone that’s going to die in three years so we all have to buy another one.
Ty Montague: So I want to spend the last few minutes that we have trying to light a few candles here because, you know, obviously this is a tough problem. And we really believe in the power of action to create change, both inside companies and outside them. So in the book, you talk about a future where technology becomes stakeholder-centered instead of shareholder-centered. To get there, though, you point out that people should have a hand in designing their own technology. How do we get there from here?
Ramesh Srinivasan: It’s such a good question. So I think what we can do is actually think, like we started our conversation, Ty, not about technology, but about what kind of world we wish to live in.
Ty Montague: Hm. Right.
Ramesh Srinivasan: And I think that question is the most important question that should drive all our discussions about many things, not only technology or Facebook. So when I think about that world, it would be a world where there’s dignity and respect for all peoples, right? It would be one where our individual rights would be respected as sovereign human beings. We wouldn’t be subjects of manipulation, or engines that fuel our depression and anxiety and trauma and divisiveness.
So I think that’s one. Second is, when we talk about stakeholders, we need to understand what populations within the country, and around the world, are affected by decisions made about a large-scale technology platform, or the platforms owned by Facebook, right? And that’s 3.5 or so billion users. So in that case, we have a huge opportunity to resolve some of the problems with the digital economy. Instead of hiring a few exploited, traumatized content moderators in the Philippines, what if Facebook partnered with independent journalists in every single country where it operated, so that those journalists could actually have power over mediating, auditing, tweaking, and working with Facebook technologies, so they could reach people in those countries in ways that are more fair? I mean, bipartisan here, you know. So that’s a huge idea, right? This could be a huge way forward for Facebook, which is, you know, a trillion-plus-dollar-valued company. Right. We haven’t even talked about Meta. Next time, I guess.
Ty Montague: I guess next time.
Ramesh Srinivasan: So when we say stakeholders, we need to think about those groups. We have to identify populations, right? We have to identify the groups whose voices should be empowered. That can be done with a great advisory board in relation to Facebook, and then real steps need to be taken to arm those different groups with greater power. That means we move from, you know, disorientation to real collaboration, real engagement. That’s what I’m proposing.
Ty Montague: Yeah.
Ramesh Srinivasan: And in the meantime, in our schools and our universities, we need our STEM curricula to not just be STEM, but to also be STEAM, including the arts and the humanities. I think more than ever, people need to really study the humanities and the social sciences so they can engage with the questions: what kind of human being do I want to be? What kind of world do I want to live in? Those are not questions, quite honestly, that are usually asked in engineering classes.
Ty Montague: Yep. No, that’s such a great point.
Ty Montague: Okay, Ramesh, last question. On this show, we have something called the BS scale. It goes from zero to 100: zero being the best, zero BS, and 100 being the worst, 100% BS. So Facebook says that their purpose is to empower all of us, to build community and bring the world closer together. Where do we think Facebook falls on that scale in terms of actually living that purpose right now?
Ramesh Srinivasan: 80. I’ll give them an 80 and be nice that way. Because I do have hope, you know, because there are good people who work at Facebook. You know them, I know them.
Ty Montague: Absolutely.
Ramesh Srinivasan: I’m always willing to hope. And in this moment, where people are so quickly calling one another out, I try to follow a practice of compassion. I think there’s a huge opportunity here, where we could be at an inflection point for real transformative change. That is great for businesses, but great for everyone else too. Right? And that is what I want us to get to. And I’m determined. That’s why I’m here with you. I’m determined to do everything I can to take us there, because I’m a veteran of this world in my own way. We can do a lot about this. We can do a lot about this, but we need to start with real dialogue and giving up some power and trusting one another. Real collaboration.
Ty Montague: I love that vision. Based on that vision, what’s the one thing that you would tell Mark to do to actually enact it, which would of course lower that score?
Ramesh Srinivasan: Sit down with the right kinds of scholars and journalists. Let’s all get together. Let’s agree to some real actions that you can take based on what our conversations are, and actually enact those actions. Let’s do some A/B testing, right? If you were to implement some of these changes, let’s see how much that actually detracts, if at all, from your, you know, what they call engagement. So, you know, attention is their currency, as you alluded to earlier. Yes. So, is that really the case?
Ramesh Srinivasan: You know, what if I were exposed to interesting content that’s not just, you know, crazy trauma-hype-machine content? Content that was just, you know, maybe interesting in other ways?
I don’t think, as human beings, we only like to look at burning cars. I really don’t. I think human beings are more complex than that. And I think human beings are also activated by things that spark their interests and expand their imagination. And call me a hopeful optimist about humanity, but you know what, we’ve got to fight for what we mean by humanity here, moving forward.
Ty Montague: That’s right. A hundred percent. It reminds me of one of the ideas that someone put forward in the first episode: instead of a “like” button, let’s try different kinds of buttons, like maybe a “made me think” button. Things that encourage you to think critically about what it is that you’re looking at. Things that encourage you to engage in healthier behaviors and with healthier content.
Ramesh, this was so great. I so appreciate your being here with us today. Thank you.
Ramesh Srinivasan: It’s my pleasure. This was a great conversation, and thank you for having this podcast.
Ty Montague: Thanks.
TY MONTAGUE (VO): I had no idea when I started looking at Facebook that they would turn out to be such a complicated story, or that their score would be so high. But the BS score goes way up when you know about the BS and decide to look the other way. And it goes up even further if you decide to try to hide it.
To me the tragedy here is that Facebook has so much potential to do good. The ability to support and promote great causes at scale. A way to share important information (about a pandemic, for instance), at scale. Connectivity in all parts of the world. That’s what I WANT from my social media.
The really big lie is that they have to pursue an antiquated and dying form of capitalism that puts short-term profitability over humanity. I’m convinced it’s ultimately bad for business, and that the market will prove this out over the long term.
So I’m going to raise Facebook’s BS Score. Remember, the scale is from zero to 100: zero being the best, zero BS; 100 being the worst, total BS. In episode one we gave them a 72. We’re raising it 20 points to 92. Some might say even that is too low. We’re going to spare them those eight points because, like Ramesh, I believe the company has a lot of good people working inside trying to change it. So there’s always hope.
And if you’re running a purpose-led business, or you’re thinking of beginning the journey of transformation to become one, here are three things you should take away from this episode. Today I’m going to switch it up and do these in the form of predictions.
1) With a BS score of 92, I predict that Facebook is going to begin to lose the war for talent. Young people today want to work in organizations that align with their values. The very best people won’t be interested in joining Facebook, and those already there will leave. If you’re running a purpose-led business, keeping your BS score as low as you can is a vital competitive advantage in recruiting the best.
2) Being an ultra-high BS company attracts regulators. I predict that, at the very least, the days of Facebook’s anti-competitive behavior of simply acquiring or copying competitors are over. Regulators will prevent Facebook from pursuing that “monopolistic” strategy in the future. If you’re lucky enough to be running a purpose-led business that gets big, low BS keeps Congress away.
And 3) Being an ultra-high BS company actually destroys shareholder value. I predict that Facebook investors, who have benefited in the short term from all this bad behavior, will begin to suffer. There is a massive shift taking place in capitalism right now. We are seeing the largest intergenerational transfer of wealth in history, as boomers age out and transfer their wealth to the next generations. Those generations want to invest that money in ways that align with their values. If you’re starting or leading a purpose-led business, low BS will improve your ability to access capital to grow that business, privately or in the public markets.
Usually, at this point in the show, I would extend an invitation to Mark Zuckerberg if he ever wants to join us to chat. And that door will always be open, Mark. But today I’d also like to extend an invitation to Facebook employees who continue to fight the good fight. If any of you ever want to come on our show to discuss any of these ideas, or any other aspect of today’s episode, you have an open invitation.
TY MONTAGUE (VO): Thanks to our guest today, Ramesh Srinivasan. You can find him on social media linked on our website: callingbullshitpodcast.com. If you have ideas for companies or organizations we should consider for future episodes, you can submit them on the site too.
And I recommend you check out Ramesh’s book, Beyond the Valley.
Subscribe to the Calling Bullshit podcast on the iHeartRadio app, Apple Podcasts or wherever you get your podcasts.
Thanks to our production team: Hannah Beal, Jess Fenton, Amanda Ginzburg, Andy Kim, D.S. Moss, Mikaela Reid, and Basil Soper.
Calling Bullshit was created by co:collective and is hosted by me, Ty Montague. Thanks for listening.