Explainer Episode 63 – Super Elections Year

The 2024 super election year has captured the world’s attention, with the US elections playing a central role in shaping global politics. Join Kathryn Ciano Mauler and Katie Harbath as they delve into the complexities of elections worldwide and discuss how to recognize, and respond to, the ways these elections will intersect with emerging technologies like AI.

Transcript

Although this transcript is largely accurate, in some cases it could be incomplete or inaccurate due to inaudible passages or transcription errors.

[Music and Narration]

 

Introduction:  Welcome to the Regulatory Transparency Project’s Fourth Branch podcast series. All expressions of opinion are those of the speaker.

 

Marie Blanchard:  Hello, and welcome to the Regulatory Transparency Project’s podcast. My name is Marie Blanchard, and I am an Assistant Director of the Regulatory Transparency Project.

 

Today, on this podcast, we’re delighted to host a discussion on the Super Elections Year of 2024. To address this topic, we have with us today a stellar pair of experts who I will briefly introduce before we jump into the discussion.

 

First, we’re joined by Kathryn Ciano Mauler, and second, we’re delighted to have Katie Harbath joining us today. We’re glad to have both of you with us. In the interest of time, I’m introducing you guys very briefly. And if you’d like to know more about our guests today, please feel free to visit regproject.org—read their impressive and full bios. And with that, I’ll hand it over to our host, Kathryn Ciano Mauler.

 

Kathryn Ciano Mauler:  Katie, thank you so much for being here. I’ve been excited to talk to you about this ever since I started following your work on elections, back when you were at a company that was called Facebook. I’m a lawyer in tech; I used to work on elections but haven’t for some time. So I'm very interested to talk through this.

 

I’ve been trying to follow the super election year, and I realized that most of the coverage that I was finding was from you, Katie. And so I thought that it was easier instead of trying to triangulate all of the research to come straight to the source. So I’d love to hear your background, where you’re coming from on it, how you got into this in the first place.

 

Katie Harbath:  Yeah. Thank you so much for having me. And I’m really excited to have two Kathryns that spell their name the same way—K-A-T-H-R-Y-N. And so anyways, thanks for having me.

 

So my background is I initially started my career working in Republican digital politics in the early 2000s. Facebook is having its twentieth birthday here in February, and 2004 was my first campaign. Back then, it was called e-Campaign. It was the very early days of getting politicians thinking about using the internet and digital. I then, like you said, spent 10 years at Facebook, where I built the teams globally that worked with politicians and governments on how to use the platform and coordinated all the company’s work on elections globally.

 

So over that 10-year span I saw quite a bit, from just trying to convince people to use the platform to a lot of the topics that we talk about today: foreign interference, mis- and disinformation, how to think about transparency, and a lot of these content moderation issues. I left Facebook in March of ‘21 and started my own consulting firm and newsletter called Anchor Change. And I most recently joined a tech consulting firm called Duco Experts, where we work with a variety of tech companies of all sizes on a lot of these tech policy and trust and safety issues.

 

And so, yeah, I’ve been talking about this big year of elections since 2020, when I realized how many were going to be happening this year. And that was even before AI and all these other things sort of exploded on the scene.

 

Kathryn Ciano Mauler:  Yeah. So what makes this year so notable? Is it 82 elections happening, and in how many countries?

 

Katie Harbath:  It depends on how you count it, because everybody has a little bit of a different way. So a lot of people might be like, “Why am I seeing all these different numbers? Why is it hard?” It’s somewhere in the upper 80s if you count by election dates, across about 80 different countries.

 

A lot of these countries are having sometimes multiple elections as part of it. But it’s the first time ever in history that in the same year as a U.S. presidential election, you have elections in places like India, Indonesia, Taiwan, maybe Ukraine, Mexico, the United Kingdom, and the European Parliament. The U.S. is on a four-year cycle. Places like India and Indonesia are on a five-year cycle. Mexico is on a six-year cycle. So it’s pretty rare when all of those line up.

 

And because those are such populous countries, depending upon the numbers you go off of, The Economist says over four billion people will be impacted by elections this year.

 

Kathryn Ciano Mauler:  That’s hard to imagine, and hard to imagine how much public opinion there is to capture. There must be a ton of polling, a ton of opinion gathering right now, trying to understand where people stand.

 

Katie Harbath:  Yeah, I think it’s different in different countries. I always tell people the U.S. is the exception rather than the rule in terms of how elections are run. We just have much longer cycles, and a permanent political class around them.

 

But yeah. People are really trying to get a sense of it, because this could potentially change the world order in terms of these major countries that are part of the G7, the G20, BRICS. And so a lot of people are trying to figure out: which way is the world going to go, and what type of leaders are they going to be electing?

 

As of our recording this, Taiwan and Bangladesh are among the first elections that have happened. And the Taiwanese people elected somebody that’s very much pro-Taiwan versus pro-China. I think they’re showing quite a bit of resilience, because talk about a people that have been bombarded with propaganda and other things like that. There’s actually quite a bit that a lot of countries can learn from how that population and that culture has inoculated itself against many attempts to influence their vote.

 

Kathryn Ciano Mauler:  Yeah, that was incredible to see, and that was also part of what made me want to talk to you more about the rest that are coming up. We’ll talk more about the propaganda bombardment in a little bit. But I’m also thinking about the changes in alliances with the changes in leadership that we’re seeing. Do we have any sense yet as to how those alliances may wind up lining up, or does that all remain to be seen?

 

Katie Harbath:  I think it’s really remaining to be seen because, frankly, at the end of the day, I think it’s going to really depend on who wins the U.S. election and what that looks like. And if you look at the conversations that are happening in international corridors, places like Davos and other things, all eyes are on November in the U.S. because I think that will then kind of determine where people might fall into line on some of this. So yeah, it’s hard for me to predict that right now.

 

Kathryn Ciano Mauler:  That’s interesting that the U.S. election is so late in the year, but it will be so determinative as to a lot of the alliances globally.

 

Katie Harbath:  I think it goes to the traditional role that America has played on the international stage around all of it. There’s also just such a divergence, I think, of foreign policy approaches between those two—assuming that it’s a Biden/Trump race—and how they would approach this, that I think a lot of folks are just kind of keeping their powder dry to wait and see what that looks like, because that’s going to impact what their own strategies are for their own countries.

 

Kathryn Ciano Mauler:  Yeah, totally. So I see our discussion as sort of two forks. One is the social side, and one is the security side. On the social side, I’m curious. What kinds of incentives do we see, and how do people respond to those when it comes to an election year coming up, lots of signaling, lots of discussion, lots of regulatory changes? What should we expect to see?

 

Katie Harbath:  Do you mean in terms of — because I hear “social,” and I think social media just because of what my background is. Is that what you mean, or do you mean more broadly?

 

Kathryn Ciano Mauler:  I think both. I feel like social media often measures, or is a reflection of, what the larger social responses are. And we’re obviously seeing changes across which platforms serve which roles and those sorts of things. So I’m just curious what role you think social media plays in reflecting that side of it. I know it’s evolved over the years, but I’m curious to hear your thoughts.

 

Katie Harbath:  Well, there are a lot of things that are really different about 2024 than past elections. And I think the first of those is that you actually have a lot of different online platforms that people are using to get information and to communicate.

 

You have your legacy platforms—your Facebooks, your Googles, stuff like that. And even though a lot of people, like if you’re an American listening to this, you may be like, “I don’t use the blue app anymore, the Facebook app,” it’s actually still quite popular in many places around the world. But you also have newer apps like TikTok; you have Telegram; you have messaging apps that people are using a lot more to communicate with people. I find that podcasts are starting to grow in popularity.

 

So it’s a much more fragmented environment that people are using to communicate. You have kind of different communities in each of those, but then you also have information being spread across them.

 

As part of that, I do think that people have gotten a lot more resilient, just more aware that there might be mis- and disinformation out there. Obviously, AI is pouring an accelerant on that, maybe making it even a little bit harder. But then you also really have this much more pronounced fragmentation around how people feel about how content should be moderated online.

 

And at least in the U.S., when you look at polling, it’s very different between the left and the right. Democrats tend to be more willing to have more content be taken down, whereas folks on the right tend to want to keep more content up. And this is going to just continue to be a constant tension, especially with the companies trying to figure out what to do in this space and also kind of waiting to see where there’s a bunch of Supreme Court decisions that could come down on this.

 

Another aspect of this, too, that I think is interesting is unlike in previous election cycles, you’re also going to have candidates on different sides of the aisle kind of having this moral debate with themselves about which platforms they’re even on.

 

So I know many Republicans are hesitant to be on TikTok because of the Chinese backing of their parent company. I know on the left, some folks very much disagree with decisions Elon Musk is making on X and other things like that and will be like, “I’m going to leave that.” There’s been issues with Substack. And so it’s becoming a part of the debate in a different way than I’ve necessarily seen in the past.

 

Kathryn Ciano Mauler:  Yeah, that makes sense. I also wouldn’t be surprised if those preferences, who wants content removed and who wants it left up, aren’t static over time, if they sort of shift depending on what’s going on. So it’ll be interesting to watch. And my understanding has been that social—by which I mean platforms—have played a much bigger role in elections overseas than they have in the U.S. Is that accurate?

 

Katie Harbath:  Yes. I mean, you’ll see, for instance, an app like WhatsApp; it’s actually growing quite a bit now in the U.S. But previously, in 2019, there was another big wave of elections in places like India, Indonesia, Mexico. And what makes it really tough with WhatsApp is that when you have an encrypted platform, you have to think about how to mitigate these harms differently because you can’t see the content. And so you have to look a bit more at behavior and stuff like that.
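
To make the “look at behavior, not content” idea concrete, here is a minimal, purely illustrative sketch; the signal names and thresholds are hypothetical and do not reflect any platform’s actual system. It scores an account using only metadata, which is all that is visible on an end-to-end encrypted service.

# Illustrative sketch only: hypothetical thresholds and field names.
# The score relies purely on account metadata ("behavior"), never on
# message content, which is the constraint described for encrypted apps.
from dataclasses import dataclass

@dataclass
class AccountActivity:
    account_age_days: int          # how new the account is
    messages_per_hour: float       # sending volume
    forward_ratio: float           # share of messages that are forwards (0-1)
    distinct_groups_messaged: int  # breadth of distribution

def behavior_risk_score(a: AccountActivity) -> float:
    """Combine simple behavioral signals into a 0-1 risk score."""
    score = 0.0
    if a.account_age_days < 7:
        score += 0.25              # very new accounts are higher risk
    if a.messages_per_hour > 100:
        score += 0.25              # unusually high sending volume
    if a.forward_ratio > 0.9:
        score += 0.25              # almost everything is a forward
    if a.distinct_groups_messaged > 50:
        score += 0.25              # blasting many groups at once
    return score

if __name__ == "__main__":
    suspicious = AccountActivity(account_age_days=2, messages_per_hour=400,
                                 forward_ratio=0.97, distinct_groups_messaged=120)
    print(behavior_risk_score(suspicious))  # -> 1.0, flag for review

Real systems combine many more signals, often with machine-learned models; the point is simply that nothing here requires reading the messages themselves.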

 

In other places, you have other apps, like WeChat and other Chinese apps. In some places, Instagram is still much more popular. You also have places like India, where things like TikTok are banned and not available.

 

And so one of the things that you’ll see—like I know I did when I was at Facebook and other companies do—when they’re looking at trying to do the risk analysis for their own platforms in these countries, one of the things they definitely have to look at is how much people are using them in those different places to try to then decide how much time and effort they’re going to actually put into some of their election integrity efforts.

 

Kathryn Ciano Mauler:  That makes a lot of sense, and that’s really interesting. On the security side, how do you think about that? I know that’s a huge topic. Where do you think there are opportunities, and where do you think we stand now in terms of election security this year?

 

Katie Harbath:  Do you mean security in terms of how the elections are administered, or do you mean security online?

 

Kathryn Ciano Mauler:  I think I mean a little bit of both again. Yeah, I see them as being largely connected, but I’m curious about the distinctions.

 

Katie Harbath:  Well, the reason I was asking the question is that I do think they’re blended. There’s a lot of work that election officials all around the world continue to do to secure not only the process of helping people know where, when, and how to vote, but also the process of actually collecting those votes, the process of announcing the results, and then the appeals and court processes people have after that.

 

Prior to 2020, most people—not just Americans, but I think a lot of citizens—did not necessarily pay super close attention to how these things were administered. Now, after 2020, we’re getting a lot more into the details. And so election officials are trying to make sure that it is all, frankly, secure, and as transparent as possible about how those votes are counted, so there can be fewer questions about it. If there are problems, they can be transparent about that.

 

But then they also have to think about communicating that well to the public. And one of the biggest challenges—this is in the U.S.—is that through the Bipartisan Policy Center, we’ve done some polling, both in ‘22 and more recently, that has shown people actually generally trust how the votes in their own community are counted. They don’t trust how they’re counted in other places in the states, because different states have different rules. And so that’s a new challenge: thinking about both the actual security of elections and the perception of that security, and how you build it across the entire country.

 

And then I think the other thing, too, as we’re thinking about this, is that foreign interference is still a problem. One of the things that we oftentimes see is that bad actors who want to negatively influence the information environment around these elections sometimes don’t even have to do anything. They just have to get stories planted and narratives out there that make people think that something bad has happened.

 

The term we used to use at Meta for that was perception hacking. And I actually worry about that, and about the overall narrative around AI and what could happen versus what might actually happen. As we continue into this year, we need to be really careful about separating the signal from the noise, because I worry about what that could do to further erode people’s trust in the overall process and security of the elections. I hope that was the angle that you were hoping to get at.

 

Kathryn Ciano Mauler:  No, no, no. That makes complete sense. And that certainly tracks with what I observe: perception is at least half the problem, and the challenge is addressing it in a way that’s consistent and coherent. Are there solutions that you have in mind or that you’re aware of being floated, or are we more identifying issues as we go through the year?

 

Katie Harbath:  I think people are still working on it, but there have been some solutions. Frankly, going into the midterms in the U.S. in 2022, a lot of the partnerships that election officials had with the tech companies, with civil society, with academia centered on this concept called pre-bunking: letting people know some of the narratives they might hear and getting some of that education out sooner about how these election processes work. That really helped to inoculate people and just helped them to better understand how things worked. And so I think a lot of folks want to continue that.

 

One challenge, though, is that there are these Supreme Court cases, one of which is about government engagement with tech companies, and that’s had a bit of a chilling effect. So that is throwing a bit of a wrinkle into how quickly some of this stuff can get spun up.

 

Kathryn Ciano Mauler:  Yeah. No, that makes complete sense. So let’s talk about AI. What do you think are the real facts that we should be thinking about, both in terms of opportunities and in terms of any issues that we can expect?

 

Katie Harbath:  Yeah. I think first and foremost, my mantra for this year—and this is particularly about AI—is to panic responsibly about all of it, because, again, I talked about the overall narrative and how I’m worried about that. But there are some really cool use cases I’m seeing of AI, around imagery and around being able to help. I really think about campaigns and issue groups that don’t have a ton of resources being able to use this to create content, to do translations, to do a whole bunch of things.

 

But I also liken it very much to social media in 2008. It was used by the Obama campaign and others, but it really wasn’t until 2012 that it was really integrated into campaigns. So I really expect more AI in future elections, because there’s still an adoption process and stuff that we have to think about.

 

We are going to see it used negatively. We already have seen instances. I think the big question, though, is how much impact that’s actually going to have on a lot of this. This is very much an issue where all of these platforms are really trying to, again, build the plane while it’s in the air of thinking about, “Do we label this stuff? How do you label it? Do you watermark it? What is the best way to let people know what might be happening and even being able to detect it?” A lot of these platforms are just straight out saying no political use whatsoever of our AI tools. But people are already finding ways around that.

 

What I’m most worried about right now is audio deepfakes, because on the audio side, you just have fewer contextual clues than you do with a video. And then I’m also most worried about this in down-ballot races.

 

So listen. There’s been deepfakes of Joe Biden and Taylor Swift—very prominent people. The media is going to debunk that pretty darn fast. You’re not going to be able to get that out.

 

But there’s an interesting case happening in Baltimore right now where there’s a principal, and there’s audio that came out of him saying some unsavory things. He’s claiming it’s AI. Other people are not. What’s the truth? What happens when that comes out about a congressional candidate or a city council candidate, where, with local news so decimated and fact checkers completely overwhelmed, are you going to have the mechanisms to show whether or not something’s true?

 

That is something I’m particularly worried about, what that could look like. And in those down-ballot races, you have more of an opportunity where just a few votes swinging could really decide the winner.

 

Kathryn Ciano Mauler:  Yeah, that makes complete sense. And in terms of the perception, I have wondered whether people are receptive to some of these things. I feel like maybe it’s just from where I sit in Washington, D.C. Everything I see, I wonder, “Is this real? What’s really behind this?” But I don’t know if that holds broadly and globally. I’m sure that this election will be one where there’ll be a lot of testing to see what perception is available.

 

Katie Harbath:  Yes. And I think that — and I think there’s going to be a lot of experimentation over what moves the needle, what doesn’t. And again, going back to that perception hacking thing I was saying, if I’m somebody that’s trying to negatively — well, not even negatively, but influence and just get coverage, I just have to do one ad or one thing, and then you get all this media coverage around it.

 

So you don’t necessarily have to spend a lot of money right now to get attention for that and make people think that these things are getting spread a lot further than they are. So I just think we’re going to see a lot of experimentation right now from a lot of different people.

 

Kathryn Ciano Mauler:  Yeah. So it may be a year when we see the full gamut of opportunities, and then we’ll be able to see how it plays out from there.

 

Katie Harbath:  Yeah, absolutely.

 

Kathryn Ciano Mauler:  Yeah. Do you think that the systems that we have in place will be able to evolve in response to some of that? I always wonder, “Will we be able to keep the actual systems for voting secure against any testing, against any kinds of attempts to breach or whatever?”—I guess I could rephrase that—to breach or to interfere with any of the actual systems.

 

I guess it wouldn’t be a Federalist Society podcast if I didn’t ask about the strength of the private sector versus the public sector. Do you anticipate some of the response to these attempts at interference or meddling to come from the private sector in a way that will bolster the public sector systems that support elections more broadly?

 

Katie Harbath:  Well, I think that, at least in terms of protecting these systems, you’re going to see the self-regulatory component from the private sector first, versus any sort of regulation from the public sector, because the public sector generally moves a little bit slower on this.

 

But this is where I think the partnership between the two is really important, because they each have roles that they have to play and differ in what information they can see and what they know. And so that’s why I think a lot of people are concerned about this chilling effect, again, of those folks not partnering as much at the moment. So I think we should see examples from both.

 

I would be remiss, though, not to say, too, that we oftentimes are like, “There’s no regulation in the U.S.” There is at the state level, just not necessarily at the federal level. So I’m curious to see what that will look like. There are also new regulations in places like Europe that are starting to get implemented, with the Digital Services Act and Digital Markets Act, and a lot of people are looking to see how those will impact things happening here in the States and in other countries.

 

But I also want to flip it over. On the private sector side, there’s also the nefarious side of this. There are a lot of vendors popping up that will help campaigns or others use AI to create fake news stories, fake videos, a lot of these different types of things. And that’s an area that I don’t think we’ve paid enough attention to: how to regulate it, what should be allowed or not allowed, and what that looks like in that vendor community that is adjacent to the platforms the stuff might appear on. And I think that you’ll start to see even more actors trying to utilize those vendors to cover their tracks while getting this information out there.

 

Kathryn Ciano Mauler:  Yeah, that makes total sense. And I wonder if we even know enough to be able to regulate meaningfully on some of this yet. It seems like we’re still in experimentation phase or still a learning phase.

 

Katie Harbath:  The biggest challenge about regulating around tech anyway is that it’s just constantly evolving. And one of my worries is that from a regulatory standpoint, some people are just going to be like, “Let’s just put a label on it. We’re just going to label it and be done.”

 

But labels actually have their own issues, too. There’s a lot of label fatigue. People don’t read them. They may not understand them. There’s also only so much we can label because most people are on their phones. That’s a very small screen space in which to see things. And I think about it. Think about the number of cookie banners you get that are like, “Accept all cookies or don’t.” That’s from a European regulation from a long time ago. And most people are just freaking annoyed by those types of things.

 

And so I think that is a real challenge that we’ve not fully grappled with: what this looks like for types of regulation, and how that might impact what companies are doing and who’s moving fastest. So I agree with you. I just think it’s going to be very hard to do this right now.

 

Kathryn Ciano Mauler:  Yeah, I think that makes sense. And it goes also to something that, just as a hobby, I’ve been watching, which is the tradeoffs of connected election systems. You want them to have some kind of connection so that you can count more efficiently. I’m from Florida, and so I still have fatigue over 2000 and the recount. So you want to have a trustworthy and quick counting system.

 

But there’s also, of course, risks that come with having that connection. So I can see that there would be a lot of attention being paid to how to best hit the right balance there and improve that in a way that works. Do you see any sort of tech changes coming to the systems themselves in terms of that connection through AI or otherwise?

 

Katie Harbath:  I don’t know. I’m sure there will be, and people are thinking about what that looks like. And I think there is a lot of promise for AI to help make these systems more efficient, to do some of the analysis and stuff that election officials or others would not otherwise have had the expertise or the resources and time to do, but it is still just very early on.

 

And I think, also, people are fearful. We’ve been through a lot of change in the last almost-decade. When all we can say is, “We don’t know. We don’t know,” that can make people really fearful and give them a tendency to not want to use any of the new technology at all. Whereas, again, going back to that panic responsibly line I had, I think it’s really important to recognize the genie is out of the bottle. We’re not putting it back in. It’s coming.

 

So how do we think about not just this year and how these tools might affect it, but going forward? How do we help get people up to speed and trained quickly on what these large language models are, how they could be applied, how you might think about guardrails? That’s some of the work I’m doing right now that I just find utterly fascinating, because it’s a whole different problem set, a new way of having to think about problems that we haven’t had to deal with before. And there’s no playbook for it.

 

That’s why I think conversations about this are really important, and not just behind closed doors about what’s happening at each of the different platforms and other companies that are developing this kind of technology.

 

Kathryn Ciano Mauler:  Yeah. Marie, I hope you’re able to filter out some of the noise. No, I think that’s really interesting. Your “Panic Responsibly” mug has been in my cart for a while. I really like that phrase. I think it’s really helpful.

 

When you say “guardrails,” are there any kinds of recommendations that you would be able to share at this stage? I’m always thinking about self-regulation and principles and the way that those apply between private companies and potential regulations. I’m so curious about what sorts of principles you’re seeing.

 

Katie Harbath:  Right now, a lot of the principles that I’m seeing from folks are just to ban political use of their tools until they figure out what those guardrails should be. They’re also really thinking about, I think first and foremost, how to incorporate authoritative information into their tools: where you’re pulling that from, how you define that, different questions like that, because we are still very much debating other guardrails and what should be allowed or not allowed.

 

I also think, too, that there’s this question of a lot of these companies wanting to think about the policies but being limited in how proactively they can enforce them. And so they might have to be more reactive. So one thing I would at least like to see a bit more from the companies is more transparency into not only their decision-making process on that, but also some level setting with people about how effective they’re going to be in proactively finding these things, because I think that a lot of folks just have this expectation.

 

They think everything should be able to be proactively found right away. They don’t understand that these tools are still evolving. They’ve come a long way, but they still have a long way to go. And you’re never going to hit a hundred percent accuracy on that. And that’s a really hard conversation to have when you’re also in this bigger techlash environment that’s been going on since 2016. So it makes it really hard to have some of these nuanced conversations, at least in public.

 

You are seeing a bit more in private. And this is something, too, where I really think there’s opportunity for civil society and academia. If there’s a silver lining to the tech layoffs, it’s that you have a lot of people that have been inside these companies who are now outside these companies and who have worked on this stuff a ton.

 

Two folks I know really well just wrote a piece through Stanford about using large language models for content moderation. And that’s the kind of stuff that I think can be built once and then incorporated into many platforms’ thinking, rather than everyone thinking about it individually.
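
For readers curious what that looks like in practice, here is a minimal, purely illustrative sketch of LLM-assisted content moderation; the policy text, labels, and the stubbed call_llm function are hypothetical stand-ins, not the Stanford approach or any platform’s actual system. The basic pattern is to send the written policy plus the post to a model and ask for a structured verdict that a human reviewer can check.

# Minimal illustrative sketch of LLM-assisted content moderation.
# The policy text, labels, and call_llm() stub are hypothetical; a real
# system would plug in an actual model API and a much richer policy.
import json

POLICY = """Label the post with exactly one of:
- ALLOW: ordinary political speech, even if heated
- REVIEW: possible voter-suppression claim (wrong dates, fake polling places)
- REMOVE: direct threat of violence
Return JSON: {"label": ..., "reason": ...}"""

def call_llm(prompt: str) -> str:
    # Stub standing in for a real model call; returns a canned verdict
    # so the sketch runs end to end without any external dependency.
    return json.dumps({"label": "REVIEW",
                       "reason": "Claims the election date has moved."})

def moderate(post: str) -> dict:
    """Ask the model for a verdict, then parse it for a human review queue."""
    prompt = f"{POLICY}\n\nPost:\n{post}"
    verdict = json.loads(call_llm(prompt))
    # Defensive check: only accept labels the policy actually defines.
    assert verdict["label"] in {"ALLOW", "REVIEW", "REMOVE"}
    return verdict

if __name__ == "__main__":
    print(moderate("Heads up, the election got moved to Wednesday this year."))

The design point is that the model only drafts a verdict against a written policy; the final call, especially anything close to removal, stays with human reviewers.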

 

Kathryn Ciano Mauler:  Yeah. No, that’s really interesting. I’ve been thinking about this mega election year as being the culmination of a lot of work, but actually, what it sounds like is that it’s the bottom layer. We’re in a tech universe that we couldn’t have anticipated four years ago. A lot of the risks and benefits and tools are fundamentally different from what we are used to seeing or used to using. And so it seems like, instead of a major output year, it’s going to be a major intake year in terms of learning and figuring out what the responses will be.

 

Katie Harbath:  A hundred percent. And there’s no finish line to this work. There’s always going to be new problems and twists and turns. And as you were talking about that, the metaphor of Mario Kart came to my head. Let me explain for a second: there are levels of Mario Kart that are super easy, where you don’t have that many obstacles. And I kind of think about that as early-days social media.

 

And then I feel like with every election around the world, every year, we’re leveling up, and we’re now at a Rainbow Road type of level where it’s just super hard and there are all different sorts of things that could happen and unknowns that might come at us. And so that’s what makes it really hard.

 

You learn from each election cycle, and you have to keep building upon that. And so I do think there will be a lot of outputs from this cycle, but I also think there will be a lot of inputs. And this is, for me, just going to be a very formative, milestone year that’s really, I think, going to set at least the next decade of how we think about technology and elections in particular, and the new problem sets that we have—again, how AI is incorporated into it.

 

I think there’ll be a lot more conversations about how speech is handled, how we hold people accountable for that speech, and how the relationships between governments and these companies work. It’s just a whole new era that we’re going into. So that was my long-winded answer of saying, “I think it’s going to be a little bit of both.”

 

Kathryn Ciano Mauler:  Yeah, I guess that’s always true. But it’s helpful to think of it in those terms. Also, the Mario Kart analogy works because you have lots of dramatic and intense flips as a matter of course.

 

Katie Harbath:  Yeah, you do. And you could go off the road. Literally, it just popped into my brain, so I’m probably going to use that metaphor a lot more.

 

Kathryn Ciano Mauler:  I actually like that more and more for elections because I also think a lot of the players in Mario Kart are similar to the players in elections. I can now see a lot of —

 

Katie Harbath:  We’re going to take this way too far.

 

Kathryn Ciano Mauler:  It’s not on point as [inaudible 34:03], but it’s not off point.

 

Katie Harbath:  Yeah, exactly.

 

Kathryn Ciano Mauler:  What questions am I not asking? Anything else that you want to share from your wisdom and experience?

 

Katie Harbath:  I think the one thing, especially for the group that’s probably listening to this type of podcast, which I’m going to guess is a bit nerdier than a lot of other folks, is that as we’re having these conversations around content moderation, censorship, things of that nature, I would really like to see the conversation focus a little bit more on how we draw the lines on some of these policies, how we think about what the penalties might be, and what our expectations should be, not just for the role of the tech companies but, again, for regulators and others. And also really making sure that we’re studying, as a lot of people are trying to do, what the actual impacts of a lot of this stuff are.

 

So I’d just encourage folks, if you have that opportunity, to really try to get into the nuance of this stuff. I have my own podcast I call Impossible Tradeoffs, because these are really hard questions that these companies and regulators and others are trying to answer. And there really aren’t easy answers when you start digging underneath the surface. And I think it’s worth us trying to do a little bit more of that digging as we’re trying to figure out the right path forward.

 

Kathryn Ciano Mauler:  I think that’s really interesting, and I’m glad you put the podcast name in there so we can make sure that gets out there. This is really, really interesting. I think we may even talk further later in the cycle as we watch these go.

 

Katie Harbath:  Yeah.

 

Kathryn Ciano Mauler:  Are there any particular elections that you’re watching? Taiwan was obviously top of mind for a lot of folks. But are there any in particular that you think will be telling or indicative of where things will go?

 

Katie Harbath:  Yes. So Indonesia is here on February 14. And how the campaigns are using AI (not for nefarious purposes, just overall) is unlike anything I’m seeing anywhere else around the world. And so I’m really keeping an eye on that.

 

India’s elections are always huge, always fascinating. Those will happen over multiple election days, six or eight of them, over the course of six or eight weeks. We don’t know their dates yet, but that’ll probably be here in the March, April, May timeframe for sure.

 

I’m definitely watching Mexico as well, just because, given the conversation in the U.S. around the border and everything else like that, I think that will be a huge one as well. But listen, there’s practically an election almost every single week happening all around the globe. And so I think the important thing, too, as folks are watching this, is trying to see how different issues and themes travel across these elections and whether there are similarities or not. But those are a couple of the ones that I would definitely keep an eye on.

 

Kathryn Ciano Mauler:  It’s like an Olympics year for a very particular type of nerd.

 

Katie Harbath:  For me, it is my Olympic year, one that I have been working towards for so long, and I can’t believe it’s finally here. The other metaphor I’ve been using for this year is a kaleidoscope, because there are just so many different pieces and types of things happening all around the globe that it’s really hard to predict what our world is going to look like in 2025. All I know is that I think it’s going to look drastically different.

 

Kathryn Ciano Mauler:  Yeah. I think there’s no way around it: some amount of change will cause other change in other places. And so even if a lot of things stay the same, a lot of things will certainly be different.

 

Katie Harbath:  Yes, a hundred percent.

 

Kathryn Ciano Mauler:  Yeah. Well, thank you so much. This has been really interesting, really, really fascinating. So thanks so much for joining.

 

Katie Harbath:  Yeah, thanks for having me.

 

Kathryn Ciano Mauler:  And thanks, Marie.

 

Marie Blanchard:  Thank you so much to you both. This has been really, really interesting to listen to. And so thanks for recording with us.

 

[Music]

 

Conclusion:  On behalf of The Federalist Society’s Regulatory Transparency Project, thanks for tuning in to the Fourth Branch podcast. To catch every new episode when it’s released, you can subscribe on Apple Podcasts, Google Play, and Spreaker. For the latest from RTP, please visit our website at www.regproject.org.

 

[Music]

 

This has been a FedSoc audio production.

Katie Harbath

Chief Global Affairs Officer

Duco


Kathryn Ciano Mauler

Corporate Counsel

Google



The Federalist Society and Regulatory Transparency Project take no position on particular legal or public policy matters. All expressions of opinion are those of the speaker(s). To join the debate, please email us at [email protected].
