spk_0 [0:00] Is your Twitter feed sick these days? Harassment, hate speech and misinformation are poisoning social media, and Twitter CEO Jack Dorsey seems to know this.
spk_0 [0:10] He's trying to cure Twitter of these ills. But how? In a rare interview at Twitter headquarters in San Francisco, I asked him: what is broken about Twitter today?
spk_1 [0:23] What is broken about Twitter? I mean, I think it really depends on who you follow, and your perception of what you see, and how you feel about that.
spk_1 [0:35] I mean, there's a lot of emphasis today on politics, on Twitter and politics.
spk_1 [0:41] Twitter tends to be pretty divisive, and it tends to be pretty contentious, and you see a lot of outrage.
spk_1 [0:48] And you see a lot of, um, a lot of unhealthy debate that you probably want to walk away from.
spk_1 [0:56] Tangibly, if you go to other Twitters, like NBA Twitter or K-pop Twitter, you see the complete opposite.
spk_1 [1:03] You see a lot of empowering conversation. So we do have a lot of focus right now on some of the negative things, given the current environment, and I believe it's important to see those.
spk_1 [1:15] I believe it's important to see the dark areas of society so that we can acknowledge them and address them.
spk_1 [1:20] And I think the only way to address them is through conversation.
spk_1 [1:22] But it is hard, especially when it feels toxic, and you want to walk away from it.
spk_0 [1:27] What about incentives that encourage the extremes, that encourage polarisation?
spk_1 [1:32] Just backing up a bit: when we started the company and the service 12 years ago, we weren't necessarily thinking about some of the repercussions of our actions.
spk_1 [1:45] And they looked quite small at the time. For instance, we thought, you know, well, people are following you, so we should count them, and then we should put that count right on your profile page.
spk_1 [1:58] And obviously people care about that, so we should make it big.
spk_1 [2:02] But that one small choice, and it felt very small at the time, and it felt obvious at the time, put an incentive on growing that number.
spk_1 [2:11] Is that the right attention? Is that the right incentive that we should be driving?
spk_1 [2:16] I don't think it is today. I don't think it matters as much in the context of how many conversations you have, or how much you contribute back to the network. And another good example that I think will help a lot of what we're trying to do in health is what we see with echo chambers.
spk_1 [2:32] We only give people one tool right now, which is to follow an account.
spk_0 [2:37] Do you want people to be able to follow stories or subjects or hashtags?
spk_1 [2:42] Yeah. I mean, we've been focusing a lot on the service today, biasing it more towards topics, more towards interests.
spk_0 [2:49] It sounds like you are ready and willing to rebuild the entire house, to renovate everything.
spk_1 [2:55] We're ready to question everything. I mean, we've changed so much in Twitter over the past 12 years, and I know it doesn't always feel that way, but we've changed a lot, but we haven't changed the underlying fundamentals.
spk_1 [3:06] We haven't changed some of the incentives that we probably took for granted because they were easy when we built it.
spk_1 [3:13] Um, and they felt obvious when we built it, but they may not be relevant today.
spk_0 [3:17] When you say health, is that a euphemism for something?
spk_1 [3:21] Well, you know, we've seen all these issues on the service.
spk_1 [3:25] We've seen abuse. We've seen trolling. We've seen harassment. We've seen misinformation. And it came to a point where we felt we were playing whack-a-mole, just addressing the surface-level behaviours, the symptoms, rather than looking deeper at the second-order drivers.
spk_1 [3:44] What's behind all these actions? And we wanted something that was really tangible that could be comprehensive of everything that we're seeing.
spk_1 [3:53] So we asked the question: what if you could measure the health of a conversation? And we think we can.
spk_1 [4:01] Because we all know when we've been in a conversation that has felt toxic, that we want to walk away from. And that's an indicator.
spk_1 [4:08] We've been in conversations that don't feel toxic, that feel empowering, that we want to stay in.
spk_1 [4:13] That's an indicator. So if we can measure that, then we can measure our progress and then we can actually understand if we're helping.
spk_1 [4:20] We're asking ourselves the question like, how do we earn people's trust?
spk_1 [4:24] It's actually one of our operating principles, which is to earn people's trust. And we do that because we realise that more and more people have fear of companies like ours, and of the perceived power that companies like ours have over how they live and even think every single day.
spk_1 [4:44] And that is not right. And it is not fair.
spk_0 [4:47] Well, you're right that a lot of Americans, a lot of people around the world, fear the power of these Silicon Valley giants.
spk_0 [4:52] Are they right to fear your power? Do you feel as powerful as they think you are?
spk_1 [4:58] I don't feel as powerful as they think we are, but I do understand the sentiment.
spk_1 [5:06] I do understand how actions by us could generate more fear.
spk_1 [5:13] And I think the only way we can disarm that is by being a lot more open, explaining in a straightforward way why we make decisions, how we make decisions.
spk_0 [5:22] You all are, every day, taking down botnets and suspicious accounts and trying to stamp out harassment and abuse. That is happening every day.
spk_0 [5:31] But I wonder if users don't see it happening enough.
spk_1 [5:34] It's an amazing point, and a lot of the output of our health initiatives is pretty invisible.
spk_1 [5:43] In the short term, we have had people, um, some of your colleagues, for instance, say that you know, I've noticed it improved.
spk_1 [5:54] It's still there, but it improved. And you, I think, see the brunt of the negativity. Journalists get, you know, an unfair dosing of a lot of the contention, just based on what you're reporting on.
spk_1 [6:10] And I think, you know, we need to do a better job at protecting you and ensuring that you can do your work without distraction.
spk_1 [6:23] But over the short term, a lot of this work is invisible, and over the long term, it starts to add up.
spk_0 [6:30] What is the timeline for re-examining how you show follower counts, or the use of the like button?
spk_1 [6:36] You know, I mean, we're looking and thinking about all these things right now.
spk_1 [6:40] We've definitely had conversations about them.
spk_0 [6:40] But would you say, like, by the end of the year there are going to be those fundamental changes to Twitter?
spk_1 [6:48] I don't. I worry about a time frame like that, because we also need to take into consideration that we're a small company.
spk_1 [6:54] I mean, in comparison with our peers, um, we're a small company.
spk_1 [7:00] But we have this outsized impact and, I believe, importance. And a lot of what's in Twitter is what you would find in a public square, to use the old analogy.
spk_0 [7:14] You mean all the graffiti on the walls?
spk_1 [7:14] Well, there's part of that.
spk_1 [7:19] But there's also really amazing open conversation, and there is the ability to walk up to anyone and strike something up.
spk_1 [7:25] So there are positives, and there's what people perceive to be negative as well.
spk_0 [7:28] Look, I met my wife on Twitter. I'm always gonna love Twitter, but I kind of feel like it's a garden that's been overrun by weeds.
spk_0 [7:34] Do you feel like you're the gardener, just struggling to keep up?
spk_1 [7:38] There are certainly times where we've felt like we're behind, but that goes back to my point: we need to be really good at prioritising and sequencing and understanding what matters most.
spk_1 [7:50] So if the incentives are going to have the greatest impact, then we should prioritise that.
spk_0 [7:50] Is it your job to make sure people are not misinformed on Twitter?
spk_1 [8:00] I think we need to be really thoughtful about what that means.
spk_1 [8:03] Like, what is misinformation? And how do we help people determine, um, credibility?
spk_0 [8:11] The classic example from the 2016 election was "the Pope endorses Donald Trump," right?
spk_0 [8:17] That was a popular article that spread around on Twitter and Facebook.
spk_0 [8:20] Wouldn't it be pretty easy just to make sure that lie doesn't spread?
spk_1 [8:24] I don't think it's pretty easy because you have to extend it and you have to generalise it to everything that could happen along that vector.
spk_1 [8:31] So I think what we could do is help provide more context, whether it be showing all the different perspectives,
spk_1 [8:38] and the people who are saying this is fake, and the people who are believing it, to actually advance that conversation.
spk_1 [8:45] That's one way. I'm not assuming that's going to solve everything.
spk_1 [8:49] But it gives journalists more opportunity to actually remove some of that bias and call it out for what it is.
spk_1 [8:56] Um and I I think we can do a lot to help there.
spk_1 [8:59] But also identifying more credible voices in real time and amplifying that credibility is something we can do.
spk_1 [9:06] We have not figured this out. But I do think it would be dangerous for a company like ours,
spk_1 [9:11] going back to that fear point, to be the arbiters of truth.
spk_0 [9:14] Did Twitter make mistakes around Alex Jones and Infowars? Around the initial announcement that no, he has not been abusive,
spk_0 [9:22] no, he has not been over the line, but then a few days later giving him a time out?
spk_1 [9:28] Well, our system works by people reporting content. So we, um, we're not in a place to proactively review everything.
spk_1 [9:40] Um, and we act when we receive reports. That is just consistently enforcing our approach and our rules.
spk_1 [9:47] People may disagree with that approach. People may say you should be a lot more proactive around all the content, and we could do that, but it just requires so many resources.
spk_1 [9:59] I mean, hours and hours and hours of looking through video content.
spk_1 [10:04] So at the time, we did not receive reports that we felt we could take any action on, that violated our terms of service.
spk_1 [10:12] Your colleagues at CNN pointed out a number of them. We took action on one, and then we noticed that all the others, likely because they were made known to Alex Jones and Infowars, were being deleted.
spk_1 [10:25] As we receive reports, we take action, and there are varying degrees of enforcement action, starting with warnings, to temporary suspensions, which the account is now in, all the way to a permanent suspension.
spk_0 [10:40] Is it possible that he'll change his behaviour on Twitter?
spk_1 [10:42] I think he really might do that. I don't know. I mean, just stepping back,
spk_1 [10:49] we have evidence that shows temporary suspensions, temporary lockouts, will change behaviour.
spk_1 [10:56] It will change people's approach. I'm not naive enough to believe that's going to change it for everyone, but it's worth a shot.
spk_1 [11:07] But more importantly, it's consistent with our enforcement. We can't just keep changing randomly based on our viewpoints, because that just adds to the fear of companies like ours making these judgments according to our own personal views of who we like and who we don't like, and taking that out upon those people. Those viewpoints change over time, and that just feels random. It doesn't feel fair, and it doesn't earn anyone's trust, because you can't actually see what's behind it.
spk_0 [11:41] Do you miss the days when you could just use Twitter to meet up with your friends?
spk_0 [11:47] Because now we're talking about how it's used to cause violence.
spk_1 [11:52] I mean, okay, I think it's just so important to see the world for what it is.
spk_1 [11:58] And I don't want to live in a world where we only see the happy things and we only focus on what makes us feel good, because we've got a lot of stuff to figure out.
spk_1 [12:10] So no, I don't miss them, because we're seeing a lot of important things that we need to finally discuss.