Cyber.RAR

Is Big Tech Imploding? Cybersecurity and Content Moderation

Episode Summary

Big Tech, honey, are you doing okay? Whether we like it or not, large technology platforms and the for-profit institutions that make them are here to stay in our society and economy. Governments are starting to craft often-overlapping regulations to try and fix the problems that come up, but instead of looking at issues one by one, let's look at these organizations as a whole - fundamentally “grow fast and break things” companies who somehow ended up shouldering a lot of our national security, growing the international economy, and protecting the values that underpin our Western society. How well does big tech help or hinder our security, privacy, and social fabric, and how will that change as the economy slows down?

Episode Notes

Big Tech, honey, are you doing okay?

 

Whether we like it or not, large technology platforms and the for-profit institutions that make them are here to stay in our society and economy. Governments are starting to craft often-overlapping regulations to try and fix the problems that come up, but instead of looking at issues one by one, let's look at these organizations as a whole - fundamentally “grow fast and break things” companies who somehow ended up shouldering a lot of our national security, growing the international economy, and protecting the values that underpin our Western society. How well does big tech help or hinder our security, privacy, and social fabric, and how will that change as the economy slows down?


Episode Transcription

 

Winnona: Hello and welcome to Cyber.RAR, a podcast by Harvard Kennedy School students. My name is Winnona, and I'm joined today by Dani, Sophie, Grace, and Bethan. Today we will be talking about the current big tech landscape and how it impacts our security and cyber policy.

 

Whether we like it or not, large technology platforms and the for-profit institutions that make them are used to influence elections and amplify conspiracy theories just as easily as they are used to promote freedom of speech and build community. Governments are starting to craft often-overlapping regulations to try and fix the problems that come up, but I figured it'd be a good idea to, instead of looking at the issues one by one, look at these organizations as a whole system: fundamentally "grow fast and break things" companies who somehow ended up shouldering a lot of our national security, our international economy, and the protection of the values that underpin our Western society. You know, light stuff.

 

Dani: A light topic for a light week, you know, can't wait to bring this up at the Thanksgiving dinner table.

 

Bethan: Yeah, who's ready to fight?

 

Winnona: Well, you know, just to lay some groundwork for our audience, I did wanna focus on the domestic big tech companies headquartered in the US rather than foreign big tech. So Huawei rip-and-replace, or TikTok, can be a whole separate set of episodes. But you know, we have everything from Twitter being owned by Elon Musk.

 

Now the former Uber CISO is going to jail. We might be headed into a recession, question mark. So, where do we wanna start? What's most top of mind for everybody?

 

Bethan: Winnona, this is very classic: you throwing out a million questions that are all really difficult, and now we have to find some coherent and smart way to respond to them all. That's why I love this podcast, and I hope our listeners love it as well. What I get nervous about, or am interested in seeing, is what happens now that we're in this moment where a lot of things are changing, right? We have a fundamental shift in the economy after a very long bull market. We have this big shift in how the US population views tech companies. We're seeing a lot [00:02:00] of growing rhetoric pushing back on big tech. And we have Elon Musk coming in, like, throwing a grenade, trying to blow everything up. There are so many winds of change happening right now. And the question is, what happens to this fundamental investment in cybersecurity? And remembering, amidst all this noise, that there are real, major threat actors who are targeting American citizens and American companies, and not losing sight of those pacing threats in the midst of all this noise.

 

Dani: Yeah, Bethan, that's a really great point, the way you put your finger on the winds of change being the thing that you're noticing right now. That is a hundred percent the thing I'm feeling. So it's pretty huge that of the two platforms that have defined US public discourse for the last 15 years, Twitter and Facebook, one of them is now private and therefore beholden to the whims of one person.

 

Without even noticing, we'd been considering sort of the pure social [00:03:00] media companies as all having an obligation to keep pace with one another when it comes to privacy and security, because they were all in the same boat of having shareholders, having whatever public accountability you want to believe that creates.

 

This is a pretty big sea change, then, to have Twitter be private and not accountable in that same way. And it's coming in a moment where we've been, year on year, redefining what is acceptable in public discourse under the Trump administration. In business, you have the Mudge allegations about

 

Twitter knowingly allowing espionage within its employee base, foreign espionage. You know, I don't know where that's gone; those headlines sort of faded. And is Elon gonna be tracking that? Probably not. He doesn't even know which employees he has, so there's a lot of chaos going on.

 

I feel like we're at a chaotic-neutral to chaotic-evil inflection point. And, you know, your callout of the recession [00:04:00] is a big piece of that, or is of a piece with it, rather, insofar as scarcity makes people panic, makes them cut back on the kinds of investments that are long-term investments and pay attention only to short-term, quarter-by-quarter returns.

 

And those long-term investments are things like security and privacy.

 

Winnona: Something you said really struck me, Dani. I was thinking a lot about this, particularly with regards to Mudge: when you think about the whistleblower allegations that have come out of big tech, Mudge wasn't the first. Facebook's Frances Haugen was what, one year, two years prior? And you see a lot of issues in a similar vein there, not just from a domestic perspective, but also that Facebook posts played a role in the Myanmar Rohingya crisis and other forms of influence campaigns.

 

So I guess it's interesting to see Twitter taking a completely different approach from Facebook. You know, Facebook instituted the Oversight Board; they have a lot of their security teams working hand in glove to make sure that the teams handling coordinated campaigns, which range [00:05:00] from active hacking to pure influence or trust and safety concerns, are working together.

 

Whereas we're not really sure what's happening with the content moderation side at Twitter right now. There was a good Washington Post article that came out a couple days ago about how the trust and safety team went through multiple back-and-forths with Musk on, you know, how do you implement blue check marks in a way that doesn't result in Eli Lilly losing 15% of their value through a tweet.

 

Grace, I know you had some stuff to say on content moderation and de-platforming. I'm super interested to hear your thoughts.

 

Grace: Yeah, I mean, I think some of the biggest anxieties I've personally had are around Elon Musk taking over Twitter. Not to mention, I don't know, 75% of the company no longer having jobs; the fact that Ye, formerly Kanye West, is coming back; and President Trump being invited back, though perhaps not coming back. I think you could speak more broadly to that. But thinking more generally about [00:06:00] the public conversation having been, I think, more in the hands of sort of the elites, slash a smaller group of people: one of the powers of the presidency used to be that you had a direct line to press releases or press conferences, that sort of thing. And now anyone can really do that, just depending on who is paying attention. And to have that moderation of who has that voice, who can be on the platform, regardless of how much, you know, hate speech or rhetoric that can bring up, as we've seen; to have that moderation be in the hands of a private company. Typically these types of decisions aren't made by one person, and I'm not sure exactly how much of that is really in Elon Musk's hands at this point for Twitter. But at least that perception, I think, is jarring, and it is scary in what we'd like to consider a democratic society.

 

Sophie: Yes, I think it's definitely clear that there need to be some kind of [00:07:00] stricter regulations limiting especially things like hate speech and speech that incites violence in these kinds of public fora, and it is genuinely concerning that Musk is now looking to automate much of what the trust and safety team has been doing

 

to police content on Twitter. But I also wonder what the effect of de-platforming is, and the ways in which it might be negative and in some ways even counterproductive. Because if you de-platform folks with radical views, I don't think they will stop having those radical views just because you removed the outlet for them to express them. What you might see is a situation where those folks find other outlets that are even more of an echo chamber to get those thoughts out, and you could see smaller niche groups or, you know, fora collect those viewpoints in a way that might actually be counterproductive. [00:08:00] Dani, do you have a thought about that?

 

Dani: I do. So I understand that argument, and I think there's one of two arguments I hear for not de-platforming. One is that you risk creating even more of an echo chamber: if they stay on the big platform, on Twitter, they at least have some exposure to non-conforming views. The other is that, in cases of really violent extremist views, it's harder to track folks who are on small niche platforms.

 

To me, the first holds less water than the second. When they're on Twitter, they're in as much of an echo chamber as they wanna be. I think as long as we have business models that prioritize engagement, and we haven't figured out a different model for revenue, then the spiciest, most extremist tweets are gonna be the ones that show up in their feed.

 

And so I don't think there's a true exposure and mitigating effect that comes from being on a platform that has all the world's population, cuz they're really just gonna see their [00:09:00] ideological kin. The other thing is that if you think of the small niche spaces they can find, to me it feels a lot like what we had before

 

there was social media. If you were an extremist in your small community, you had access to a limited number of other people, and you couldn't easily find them; there was no social media. So you, you know, person by person, found your extremist network and kind of coexisted in your little cell and didn't impact the broader world.

 

What we have now is that ideology having a much larger platform and headline, and having many more non-extremist eyeballs on it thinking, holy moly, is that what my whole country believes? When in fact it's really just a really vocal 10% that dominate the trending topics. But it shapes the country, because then you think, okay, that is what everybody believes.

 

Maybe I'd better go in the other direction, or, oh, I'd better give up voting cuz this country's so far gone. To me, the effect of siloing [00:10:00] people is that, yes, they may not change, but you also limit their exposure to the broader public.

 

Winnona: So Dani, I actually disagree with both of your points, both Dani's points and then also Sophie's, a little bit. I'll say why. Sophie's talking about de-platforming, which is a tactic used to take some of the most extreme people off of the platforms, right? And I think it's a very solid approach. The one I can remember most is kicking Infowars off of YouTube and other forms of social media, and there were certainly fewer followers that ended up following Alex Jones onto other sites, and that's a win from a trust and safety perspective. But it's also not enough, because there are still going to be people who will follow him onto that site. And I think it's a mistake to conflate a win from a corporate [00:11:00] risk perspective, which is a numbers game, we've gotten as many people spewing hateful rhetoric off of our site as possible, with a national security or Western values proposition, which is: how do we help these people understand that the rest of the world doesn't agree with this? On Dani's point, I guess I both agree and disagree. When we're talking about content moderation and algorithms surfacing content, the infinite scroll of content curation, for a user that is, yes, going to be an echo chamber. But the problem then lies with the fact that, again, that goes back to corporate interests, where you want people to be on the site more. That's where you're getting your metrics; they're seeing more ads. And fundamentally, if they stopped that and ended up curating a feed for you that was based off of people you actually [00:12:00] interacted with, and made it more difficult for you to find people, I don't know if that would work. I don't know if it would provide the right type of corporate revenue that an entity as big as Twitter or Meta or other forms of social media would want, and frankly, we have no idea if the benefits of social media, which is being able to connect people of similar interests, would go away.

 

Sophie: I'd be really interested to read some research on the effectiveness of de-platforming, and whether there's been a decrease in violent rhetoric as a result of it. But yeah, I take both of your points, and there are clearly trade-offs here that are somewhat inevitable. I think my overarching point is that big tech executives should not be the ones making them.

 

Grace: I'm taking it to kind of a different dimension here, thinking not only about the people who look at the points that maybe a very vocal minority makes and might agree with those points, or make bigger judgments about what to make of the world, or how much we [00:13:00] disagree as a country, et cetera. Something I think about is the number of children who are on these apps. It's growing, and as people, you know, come of age on the internet, a lot of these kinds of views that I think at some point were very fringe, very far right or even very far left, have become mainstream, normal ideas.

 

And for people who are extremely young to be faced with these kinds of ideas, and to be shaped so early on, is for me a reason why I definitely support de-platforming, while definitely agreeing with your general point, Sophie: I'm also not sure who should be doing that moderation, but it doesn't seem like an unelected, very rich person should be the one doing it.

 

Winnona: I agree, Grace. At the same time, though, hasn't that always been the case? Ultimately you have people in Silicon Valley that are, granted, a team of over 500 people, but ultimately it's those 500 people that are determining what stays up and what comes [00:14:00] down. So what, then, would be the number of people sufficient for content moderation that reflects all views of all people? Musk has been reinstating voices that feel like they've been marginalized on the platform, that feel like they've been censored. You also see the failures of content moderation in different languages, where Twitter and Meta have been unable to shut down content because it uses slang in a different language that, ultimately, a Western-headquartered institution won't have the capability to censor. So what are the right moderation techniques, tactics, and staffing? That, I think, is a large set of questions that we can't answer on this podcast, but that are worthwhile to bring up.

 

Bethan, I know you had some points, and then Dani.

 

Bethan: If the argument is, okay, de-platforming isn't the cure-all and we need to invest more in moderation, we're now at this point where a lot of these tech companies are strapped for [00:15:00] revenue.

 

When I say strapped for revenue, I mean relative to their history. And content moderation is not a moneymaker, right? It's not a revenue-generating area. A lot of the way that ads are cultivated is, again, from this self-reinforcing echo chamber.

 

That's how a lot of these tech companies make money. And so now that we're having this economic squeeze, there's obviously way less incentive to invest in content moderation; as we saw, Elon cut a huge amount of the content moderation team at Twitter. Again, what makes me really nervous is that, yes, we're having this conversation about the importance of content moderation, but if we're just cutting out that talent, and we're losing a lot of the foundational work that's been done by advocates, what will that do to the core infrastructure of content moderation as these tech companies go into a new economic cycle that's far less forgiving than the market that has allowed these companies to flourish and grow into the behemoths they are now?

 

Dani: I agree with a lot of those points, Bethan. And I wanted to go back, I'm gonna find [00:16:00] myself agreeing with Sophie now as well, to who should be doing the regulating. For a long time, our answer to the question of who should decide, you know, is this product good, is it bad, what should be on it,

 

was always: the market will tell us. And that's not just true for social media; it's true, I'd say, broadly across the American economy. The consumer will vote with their pocketbook and with their feet; they will go to the platform or the product that is best for them, and the market will result in the highest utility. Which I think is

 

bullshit. You know, we don't get to the best, most optimized product for everyone simply via a free market. But bizarrely, we are seeing that experiment play out in the case of Twitter, where you're watching advertisers withdraw their revenue as they start to anticipate the kinds of content that will be on there, and the ways in which it'll create a product that is bad for their brand.

 

Of course, you have it now owned by the richest person on earth (I mean, if you're buying Tesla's market cap, [00:17:00] the richest person on earth), and so the withdrawn revenue, you know, we'll see how much that matters. I wish we would come out of this experiment thinking about how often we rely on free-market principles as the guiding governance for what's acceptable.

 

Bethan: Dani, you're not sounding like an MBA graduate.

 

Dani: Excuse me, Sloan's slogan is "principled, innovative leaders." Alright? There's nothing in there about maximizing shareholder return.

 

Winnona: I'll argue the maximizing-shareholder-return point, cuz I think that currently users are voting with their feet. You see people leaving Twitter for other alternatives at this point. The Mastodon server that I'm on, if people wanna join our little federated server, is at 17,000 users.

 

It was 300 like eight weeks ago. So ultimately, yes, people are leaving Twitter. Like, follow, subscribe, and join my Mastodon, please do. I also [00:18:00] wanna push back on some assumptions that I think we're making here, which is: is it really the companies' fault for ending up in this ecosystem, given the incentives they've been provided and our very willing rush to just hop onto the same platforms? There's a lot of anti-big-tech sentiment, but ultimately this is the hand they were dealt, and it spiraled out of control.

 

You could see that, in some senses, big tech is very well versed to do a lot of this, more so than any alternative that exists. So when we're thinking about de-platforming, the fact that the audiences are currently concentrated in one area, on YouTube or Twitter or Facebook, means it does pack a huge punch when someone is de-platformed.

 

Sophie: Are you saying that the platforms are the best candidates to do the content moderation?

 

Winnona: I think that we give big tech a lot of flak for [00:19:00] what is ostensibly some of the best solutioning, but we also have an assumption that content moderation is ultimately the right choice. Allegedly. I know, I know, we make a lot of fun of Elon. But ultimately, the alleged policy he has for Twitter right now is that anything that is ostensibly threatening speech under the First Amendment will be de-platformed, and not anything else. Which is, like, that's how we operate under the First Amendment.

 

Sophie: I mean, I think the point there is that a lot of these tech executives see some of these problems as engineering problems, when a lot of what goes into effective content moderation has more to do with social engineering questions than, you know, technical engineering questions. It's sort of: how do you organize large groups of people in a way that keeps those people safe while allowing them to express their views, on a platform [00:20:00] whose intended purpose is for those people to express their views, right?

 

So, I mean, that's why content moderation at this scale is incredibly hard, and that's why privacy regulations are so, so important. It's inspiring to see that we are now, for the first time, really starting to get serious about those privacy regulations. I think it's, you know, five or ten years too late, but

 

I'm happy to see the progress that's been made on it in the past two years or so.

 

Winnona: You're absolutely right, Sophie. Given that this is such a hard problem, it's worth talking about, and I think maybe we should also broaden it out into the wider systemic issues. We have the big tech issues internally, with regards to content moderation and internal corporate policy, but we also have governmental regulation and the wider capitalist market. I wanna hear more about what you guys think of not just the content moderation issue, but also security, as both of these are [00:21:00] compliance issues; they're not exactly moneymakers, as we know. And given the overarching regulation, kind of like Sophie said, we have what, five states now with different patchwork privacy regulations? We'd love a federal one at some point, but who knows when that's gonna come. And then there's the tightening of belts. How do you guys think this is gonna play out, given the slump we're entering?

 

Dani: Can I take us to a related area? Not only does this belt-tightening period have implications for the staffing of content moderation and security positions, but actually for the overall health of companies. And when you start to be unable to sustain operations overall, you start to look at what assets you can sell off.

 

And one of the things tons of companies have is data on all of us. So there was a case from 2015, when RadioShack filed for bankruptcy and listed among its assets 13 million [00:22:00] customer email addresses and like 65 million phone numbers. And that was 2015. So extrapolate out how many more companies have adopted the practice of data collection since then.

 

I'm really curious to see how much that starts to crop up, now that that industry, data collection and data aggregation, has really matured. There's a strong market for it, and we're headed into a space where a lot of companies may feel like they need to find some other revenue sources.

 

Winnona: Radio Shack was in business until 2015.

 

Bethan: Yeah, wow. Dani, I think the RadioShack example is really compelling, because it shows that there is a new economic value in our data, and obviously that's been the underpinning of the growth of companies like Facebook and Google, et cetera.

 

But then there are the brick-and-mortar companies that weren't built on leveraging your data, who now realize: oh, I have this entire area of economic opportunity. Another facet of that is, what about threat actors who say, oh, you know, RadioShack, they [00:23:00] probably don't have the best cybersecurity infrastructure?

 

Dani: I just wanted to correct myself: I said 65 million customer phone numbers; it was names and addresses. But Bethan, that point about threat actors is really well made. I mean, I do not trust the cybersecurity posture of the ten consumer goods companies I've interacted with this week to retain my information.

 

And yet there just seems to be no way to get consumer goods without giving up significant parts of your data. And it's such a good point that, when you hit a time of scarcity, it's not just legitimate businesses that start to get creative about where they find new revenue sources.

 

Winnona: You know, both of you have touched on something that I think is really important: the fact that, as people are moving off of social media onto more federated or smaller organizations, you know, like BeReal or Mastodon, what kind of data retention policies, content moderation, and security teams do they have? [00:24:00] You know, it's probably a lot easier to coerce, bribe, or bully a single dedicated moderator than an entire multimillion-dollar company, for example. It's probably a lot harder for five content moderators to be sitting on a server, looking at what is sometimes probably really terrible content, than to have a whole team of individuals that does this. And it's probably a lot easier to look for threat actors when you have better data to do it across millions of users rather than a couple thousand. It'll be interesting to see, not just as the recession starts hitting these big tech firms and they have to make cuts, but also as people move onto alternative sites, how scrutinized those smaller organizations are and how many of them fall between the cracks.

 

Dani: That's spot on, Winnona. Think about when GDPR was passed: the panic that companies with unlimited resources went through, thinking [00:25:00] about how are we gonna implement this, and the immense amount of time, effort, and resources it took to do that, and continues to take. And then you think about the splintering into much smaller companies. How are they gonna navigate that one particular piece of regulation, let alone the patchwork you have to navigate in the US, let alone whatever is to come in the future? As we said, a time of chaos.

 

Sophie: Yeah, there are definitely a lot of trade-offs. One example as well is in some of the recent antitrust legislation. There are pieces of it that target this idea of self-preferencing, which is basically that these big tech platforms can't promote their own content or their own microservices on their platform.

 

Companies like Google do this frequently, but at the same time, Google's Threat Analysis Group tracks, you know, several hundred government-backed threat actors, and they use that data, which [00:26:00] is proprietary data, to make their product, Gmail, safer for their users. And so I wonder, or I guess the concern might be, whether some of these self-preferencing pieces of these bills would make something like that illegal, and what the consequences of that may be.

 

Also, countries like China are not gonna play by these rules. So I wonder whether some of the interoperability requirements from these antitrust regulations, which as a point of departure are going in the right direction, could actually serve to be not in the interest of our national security, or even counteract some of the intentions of the antitrust legislation itself.

 

Grace: I think the only thing I was gonna say about places like Mastodon, or new social media with smaller content moderation teams, is that to me that goes into the antitrust stuff.

 

If these are smaller platforms and smaller communities, I think the [00:27:00] threshold for quality of moderation, for me personally, in my opinion, is lower, in that it's not shaping a global or an entire nation's conversation. It's more niche, and I think maybe that's why I support antitrust laws when it comes to

 

social media and where we're having these sorts of public conversations. But what I do think is interesting is, as those platforms grow, how will they solve these problems? I would say the first generation of social media companies maybe gets more of a pass.

 

Because, you know, we didn't know that Facebook would affect the 2016 elections. Who would've thought that when he made this algorithm in his Harvard dorm room, you know, I don't even know, like 20 years ago at this point? But the thing is, now these are in the public consciousness.

 

We know that this is possible when platforms get enormous and there are millions and billions of people [00:28:00] using them. So I would say that that's the task at hand for growing social media companies.

 

Bethan: Yeah, Grace, I wanna follow up on that, or maybe provide a pushback or caveat: some of these newer social media companies, or startups, or outgrowths, whatever you wanna call them, are anti-content-moderation, right? They are created to go around the structures that have been built by the mainstream platforms; that is their value proposition. For example, Truth Social or Parler.

 

Their goal, their narrative, their ethos, is to push back against the mainstream content moderation that they see as specifically targeting certain voices, and people are moving over to those platforms. So I think that's where it starts to get complicated: yes, we do wanna encourage robust antitrust regulations.

 

However, what are the second-[00:29:00]order effects when we're encouraging these new social media platforms that are actually against content moderation to the extent that current regulations allow?

 

Winnona: So I'm not sure I agree with either of you, Grace and Bethan, mostly because I think that when platforms are smaller, that doesn't necessarily mean they're going to allow diversity in thought. Quite the opposite: there's more of a risk of echo chambers, and that's one of the arguments against Parler, right?

 

Where it's freedom of speech, but most of the speech is one type of view. And that goes back to one of the benefits of a larger firm that has guidelines, because then you can at least get some diversity in thought, is the theory, right? All of these discussion points have really brought me back to this: big tech

 

has a [00:30:00] lot of really difficult, somewhat unsolvable problems, especially in the system in which they're operating, where they are fundamentally profit-making institutions and security and content moderation are loss leaders. At the same time, smaller companies, because they don't have the economies of scale, don't have the resources or the experience to really do the same things and promote the same goals. And at the same time: is content moderation, especially given the American Constitution, something that we want to enforce heavily, or more lightly? And what do we do with people, from a corporate-risk perspective versus a freedom-of-speech and Overton-window perspective, when they have or develop extremist views? I think all of these are problems that we haven't quite solved.

 

And I think we're also only thinking of this in the American context, which is entirely biased, [00:31:00] and doesn't even take into account all of our friends and partners globally, and how Western companies think about doing the same protections in a more global capacity, which is a whole different problem in itself.

 

Sophie: And also, in countries where the media is basically state-owned, Twitter is, like, completely revolutionary. So we've been having this discussion very much from an American point of view, but it's not the same, I guess, in other parts of the world.

 

Winnona: Certainly. Do we wanna go around and maybe say, do we have any sort of recommendations that we would make for this?

 

Dani: Sure, I have thoughts. I think we've kind of gone round and round on the crux issue of who should be moderating. And in the US in particular, as Winnona just described, we're gonna really struggle to get to the bottom of that. So that's a fight and a discussion we should keep having, but we need a short-term fix.

 

And [00:32:00] I think that short-term fix lies in behavioral engineering: building platforms in a way that puts impediments in the path of accessing harmful or dangerous information. So whether that's, you have to click through, and then you have to say, yes, I've read this; you know, all the things we find annoying to do, like declining cookies.

 

All the things that make us less likely to do the behavior that's healthy for us. Putting that kind of junk in the system, so that people can access the harmful content less easily, I think is a short-term fix that doesn't compromise free speech. It allows the user to access things if they're really deliberate and intentional about doing it, but also protects them from inadvertent access.

 

And that's what we implement while we get to the bottom of this constitutional and capitalist nightmare of a debate.
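A minimal sketch of the click-through friction Dani describes, purely illustrative and not any real platform's system: a flagged post is only shown after the user actively confirms a warning. The content store, flag set, and warning text here are hypothetical stand-ins.

```python
# Hypothetical toy example of interstitial "friction" before flagged content.
POSTS = {"p1": "example post body"}   # toy content store
FLAGGED = {"p1"}                      # post IDs flagged by moderation

def view_post(post_id: str, confirm) -> str:
    """Return a post, inserting a deliberate click-through when flagged."""
    if post_id in FLAGGED:
        # The interstitial: the user must actively acknowledge a warning
        # before the content is shown, which deters inadvertent access.
        if not confirm("This post may contain harmful content. View anyway?"):
            return "[post hidden]"
    return POSTS[post_id]

# A user who doesn't bother clicking through never sees the content.
print(view_post("p1", confirm=lambda msg: False))  # -> [post hidden]
```

The design point is that access stays possible for a deliberate user; the gate only adds effort, the same way cookie banners add effort to declining.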

 

Bethan: If I'm trying to think of a takeaway: I think policymakers need to be even more aware of how the market is changing. We still [00:33:00] cannot trust or assume that these big tech companies, or even smaller platforms, are self-regulating or even following existing regulations. Policymakers need to be very diligent about making sure that companies are staying up to date with strong cyber and content moderation infrastructure. I'm not entirely sure what that looks like, because obviously we can't have a committee hearing every five minutes. Congress has done a really good job of being more active about listening to whistleblowers, giving them a platform, and responding. But really, we can't take anything for granted, particularly as the economy gets worse and companies feel the pressure. That's where policymakers and the government really need to step in to make sure they're holding companies accountable.

 

Winnona: As a final takeaway, again, this has given me a lot to think about, and hopefully you guys as well. There's the devil we know and the devil we don't, and [00:34:00] as we're moving away from the general trust and safety models, or as some of the big tech companies are moving away or cutting down, or as we're entering this recession and laying off people in the security and trust and safety industry, I think that society as a whole is starting to realize that the way we conduct discourse over the internet is something that is going to stay around, and potentially become more acerbic than we would like, and more extreme. And content moderation, as well as de-platforming, are two tools that are useful, but they focus on corporate risk and aren't

 

enough, or adequate, for actually having a democracy where we think about each other's values. And I think that simply pulling out the big regulation hammer isn't enough, and I don't know if content [00:35:00] moderation is the tool for this; I don't think it is. I think we need to think about other forms of building societal infrastructure that don't rely on big tech, but also on whatever comes after.

 

All right, so for our show and tell this episode, we have Dani and Bethan, who are gonna talk about crypto again.

 

Bethan: Okay, so I hate to be the person who said, "I told you so."

 

Grace: Get outta here.

 

Bethan: No, no, no, it's not even "I told you so," it's "we told you so," because we had this conversation on our podcast about how cryptocurrency is a volatile, speculative asset. And I think the implosion of FTX, and the massive waves that has sent throughout the crypto market, speaks to the major risks, financial, cyber, et cetera, et cetera, that cryptocurrency poses. That's my first take on what's happening with FTX. [00:36:00]

 

Dani: To be fair, when we said it was a volatile asset, I don't think we anticipated the possibility that someone would form two companies and use the capital in one to cover the debts of the other.

 

Winnona: So walk us through FTX, for someone like me who has not been following the situation as closely.

 

Dani: So, FTX is a cryptocurrency exchange. That means you can go on there and buy lots of different kinds of cryptocurrency. And this is a really big deal, because it was not easy to do that all in one place.

 

It was not easy to do that from different countries, and FTX really made that an accessible activity; it's a very accessible platform. Its founder also started another company, Alameda Research, that invests in pretty high-yield but also high-risk cryptocurrency trades. And in fact, part of what [00:37:00] they originated as was a way to execute arbitrage, when they realized that the price of Bitcoin in some countries was less than the price of Bitcoin in others. And so they realized, okay, if I can buy it in one country and sell it in another, I can just make money off the arbitrage.
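A minimal sketch of the arbitrage math Dani describes, with made-up prices, quantity, and fee rate for illustration; real cross-exchange arbitrage would also have to account for transfer times, withdrawal limits, and currency conversion.

```python
# Hypothetical numbers only: profit from buying BTC where it's cheap
# and selling where it's expensive, net of a percentage fee per trade.

def arbitrage_profit(buy_price: float, sell_price: float,
                     quantity: float, fee_rate: float = 0.001) -> float:
    """Net profit from buying `quantity` BTC on one exchange and
    selling it on another, paying a percentage fee on each trade."""
    cost = buy_price * quantity * (1 + fee_rate)       # fee on the buy
    proceeds = sell_price * quantity * (1 - fee_rate)  # fee on the sell
    return proceeds - cost

# Example: Bitcoin at $10,000 on one exchange, $10,500 on another.
print(arbitrage_profit(10_000.0, 10_500.0, quantity=5.0))  # > 0: profitable
```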

 

That is where the FTX platform came from: they needed a way to do that, so they figured out how to do it, and then they spun out a whole separate company, which is now the exchange. Fast forward: the investment arm is executing trades on the exchange platform, and as it turns out, it was heavily in debt, and FTX used its other, consumer-currency capital to cover the debts of the investment arm, or the investment company; I shouldn't say arm, it's a whole separate company. It just happens to have the same founder, and his girlfriend runs the investment company piece. I'm sure I'm gonna get eaten alive by the [00:38:00] crypto people.

 

They're on-again, off-again; who knows if she's currently his girlfriend. Romantic past affiliation, possible status, friends with benefits, whatever. So that's the super light, high-level version. The obvious thing here is, you don't use commercial assets to cover investment liabilities.

 

Like, that is just a recipe for disaster, whether you're talking about traditional banking or cryptocurrency. And so that's where we are. The one piece I will tack onto this, the sort of separate story, is the thrall that people have found themselves in with the founder, Sam, SBF as he's often referred to; Sam Blankman-Fine.

 

And his story is one that I think is extremely familiar to us now, as we put all these Silicon Valley founders on a pedestal. You know, he's a rock star: MIT, goes off for a few years to Jane Street trading, and then [00:39:00] goes off and founds this brilliant company and spins off another one, does something nobody else has done, immediately builds this network of politicians and other Silicon Valley leaders, and is in t-shirts in meetings with investors. There's one story about how he had investors come into a conference room where they could see him sleeping, so that he could then wake, go to the meeting, and, you know, just be on it immediately from slumber. Playing video games and music; all of the things that you've come to associate with the sort of Mark Zuckerberg, young-people-running-the-tech-world type thing.

 

As I was catching up on this, so again, caveat, I was not an FTX user, I'm catching up after the fall, so the media I'm reading is probably biased in many ways. But one of the things I was struck by is how familiar this character feels. I really loved MIT and I loved Sloan, but the amount of money that was sloshing around and being offered to [00:40:00] anybody who could put together a coherent pitch. Never mind if there was proof of the product; never mind if they had a history of acting with integrity and respect towards their customers' fiduciary needs. It is just stunning how much we value pedigree. His trust and credibility, and his story, really reinforce that for me: here's somebody who had a foot in the door with certain networks and people, and what's brilliant is brilliant.

 

And that alone conveyed credibility. And to me, network and intelligence are not sufficient to convey credibility.

 

Bethan: Yeah, a hundred percent. Particularly when we look at this in terms of a white male founder: this kind of implicit trust that investors, or people with capital, give to founders who look like them or have a background that they believe to be legitimate or trustworthy.

 

This is very indicative [00:41:00] of how much dry powder there was, right? Extra capital that venture capital and private equity had to throw around, given that yields were low and there was so much desire for returns that weren't just from your vanilla stock market. And that's how you've ended up in this situation where angel investors, VCs, and private equity firms get attached to founders and their background and end up giving them a ton of money, when the reality is certain investors may not have done the right due diligence. I'm not saying that's entirely the case here, I don't know, obviously, but I think this speaks to the broader issue.

 

Winnona: We like to vilify certain founders, and I wonder how much of this is just the self-perpetuating system of individuals idolizing and then enjoying the fall: watching these characters succeed and then fail. Part of me wants to figure out how much of this is systemic, in terms of the funding cycles and how we want startups that have dynamic founders, and how [00:42:00] much of it is just these individuals in particular. I admit that I only really found out about FTX after it was alleged that they had been hacked.

 

And then as it turns out, most of that money had actually been seized by the Bahamian government for safekeeping during their bankruptcy process.

 

Grace: Ha. Good for them.

 

Bethan: Yeah. I mean, grab it while it's hot, while it still exists. But the fact that a nation-state took, what, 500 million of assets from this failing company just shows how porous and opaque this entire setup is, both from an asset security perspective and a cybersecurity perspective.

 

Grace: Well, I guess I would wonder what the general landscape is, because I think Binance looked into the books to see if they would bail FTX out, and their public stance is that FTX was run with far, far less [00:43:00] competence and fewer of the checks and balances than one might expect of such a company.

 

Maybe this is me being hopeful or optimistic, but not all companies are run like this in a fairly unregulated business; I think it's pretty clear that FTX, at least, is one of the worst examples of it. The second point, to this idea of these famous sort of CEOs: I first heard about Sam Bankman-Fried through effective altruism. He was lauded as the golden boy of effective altruism, because he was this self-made billionaire who'd made it all in order to give it all away.

 

There was this sheen, or this glow, of: oh, he must be a good person; he wants to make all this money to give it all away.

 

And I won't say that he said this publicly, but there was this general idea even that the company was founded in the Cayman Islands to skirt tax on purpose, so that he could give more to the most effective nonprofits and maximize the utilitarian effect that money can have on the world.

 

[00:44:00] I think in a lot of ways the reason his particular fall feels so emotional for a lot of people is that he's not just a crypto bro who became a billionaire; he's a crypto bro who was also, supposedly, ethically way better than all of us. And it turns out that someone who was lauded this highly as an ethical person turns out to be one of the worst among us.

 

That's, for me, the drama here, and that's part of where the story comes from. I see your point, Winnona, about what our obsession is with these superstar CEOs: turning them into celebrities and then loving to be in the audience as they crash and fall.

 

Dani: Grace, that was such a cogent and great point; I wish I could take my point back and just plus-one all of yours. As you were saying that, I was just a million snaps. I also misspoke at the beginning and gave Sam the wrong name, so apologies, and thank you, Grace, for correcting me.

 

Bethan: That was a great show and tell. For those listening, if you haven't listened to our crypto and low-[00:45:00]rise jeans episode, you should go listen; it's even more relevant as we watch this massive crypto exchange implode. Again, these questions of cryptocurrency, digital currency, and cybersecurity are all incredibly relevant in this moment, and always.

 

Thanks to Winnona for putting together such a challenging episode, as usual.

 

Winnona: Thanks for listening to Cyber.RAR, a podcast by Harvard Kennedy School students. Given that this is a student-led program, this podcast does not represent the views of any institution or school, or even our own views after we finished recording this episode on Tuesday, November 22nd. We're just students, learning every day, trying to navigate this murky area of cyber policy. Stay tuned for more episodes.

 

Sophie: Do you have two-factor authentication enabled?

 

Grace: I think that was a... yes.

 

Bethan: These are the real hard-hitting questions we ask here on Cyber.RAR. Are your pets being [00:46:00] cybersecure?