President Biden’s Executive Order on Foreign-Controlled Apps

In June, President Biden revoked a Trump-era executive order that sought to ban TikTok and WeChat, and replaced it with a new executive order directing the government to review the security threats posed by foreign-controlled software applications. “The Federal Government should evaluate these threats through rigorous, evidence-based analysis,” Biden’s order dictated, “and should address any unacceptable or undue risks consistent with overall national security, foreign policy, and economic objectives, including the preservation and demonstration of America’s core values and fundamental freedoms.”

An expert panel joined us to break down the order and its implications for the apps it targets as well as for future relations between the United States and its foreign adversaries, such as China.

Transcript

Although this transcript is largely accurate, in some cases it could be incomplete or inaccurate due to inaudible passages or transcription errors.

[Music and Narration]

 

Nathan Kaczmarek:  Hello, and welcome to this Regulatory Transparency Project webinar. This afternoon, we’re pleased to be discussing President Biden’s executive order on foreign-controlled apps. My name is Nate Kaczmarek. I am Vice-President and Director of RTP. As always, please note that all expressions of opinion on this program are those of our speakers. 

 

Today, we’re happy to have with us Matthew Feeney from the Cato Institute to serve as our moderator. Matthew is the Director of Cato’s Project on Emerging Technologies, where he works on issues concerning the intersection of new technologies and civil liberties. He previously worked at Reason magazine, The American Conservative, the Liberal Democrats, and the Institute of Economic Affairs. 

 

If you’d like to learn more about Matthew and our panelists today, you can visit our website. That’s RegProject.org — R-E-G project.org — where we have all of their complete bios. In a moment, I’ll turn it over to Matthew. Once our panel has completed their discussion, we’ll go to audience Q&A, so please think of the tough questions you’d like to ask them. Audience questions can be submitted by Zoom using the raise-hand function, and we will call on you directly.

 

With that, we are excited for a great discussion. Matthew, the floor is yours.

 

Matthew Feeney:  Great. Thank you, Nathan. And thank you all for joining. I’m excited for this conversation on a topic that I think is increasingly on everyone’s minds. Some of you may remember during the Trump administration that the president signed a number of executive orders aimed at certain Chinese-owned apps — the most popular, perhaps, being TikTok. And what we’ve seen during the Biden administration is the revoking of the relevant orders and the imposition of new orders. 

 

And I think this raises an opportunity for all of us to discuss the national security risks of these social media apps, as well as the role of executive orders and the executive — the process surrounding how these are signed. And, also, how the current controversy may affect commerce, privacy, and other important features of our ongoing relationship with China.

 

We have an excellent panel here today to discuss these issues. What I’ve asked the panelists to do — when they first speak — is to briefly introduce themselves and to mention their expertise. And, as Nate mentioned, if you have questions, feel free to throw them into the chat, or to raise your hand. My goal here is to make sure that we have at least the last 15 to 20 minutes for questions.

 

So, with that, I thought I would start with Jennifer and ask: what are the kinds of risks we’re talking about here? What do we know about how our foreign adversaries use American data? What about TikTok is particularly concerning? And what new and emerging technologies and techniques, like AI, are being used?

 

Jennifer Hay:  Matthew, thank you. It’s a pleasure to be here. I’m Jennifer Hay. I serve as the Senior Director for National Security Programs at DataRobot, an enterprise AI platform. Previously, I served in the Department of Defense, and I’ve held positions on the National Security Council staff, as well as at senior levels at the Department of Defense.

 

In terms of how AI is being used by our adversaries, we can take China as a perfect example of what they’re doing. It’s a country that has no issues with being able to collect data. They are not concerned about the privacy of their citizens, or anything like that. And so they have the ability to gain access to billions of data points that they can use to target their own citizens, as we’ve seen with what they’ve done with facial recognition against the Uighurs, using that to identify, you know, members of the Uighur population and be able to target them, put them in jail, and things like that. And there is a concern that they may be able to do that with U.S. data, using platforms like TikTok and WeChat and various other platforms that are being used by individuals outside of China.

 

And, being able to use that data — as people know, artificial intelligence is dependent on vast amounts, vast amounts of data. So the more data that our adversaries are able to collect, the better their AI will be, the smarter the AI will become. This is important, as China has announced — the president, Xi Jinping, has announced — that he wants China to be the world leader in AI by 2030. And, in order to do that, he needs data. And collecting data from worldwide sources is key to that.

 

As AI becomes smarter — many people are currently doing research on what we call artificial general intelligence, which is the AI that they make movies about. We’re not there yet. But to get that AI to think, they need access to vast amounts of data. So, while we think that TikTok is just a platform where you can record your children lip-syncing to Lady Gaga, in reality, China would be able to access that data, and use that information to improve their facial recognition, as well as their natural language processing. And then, also, specifically target individuals that they want to target, in terms of national security purposes. That means that they can target specific individuals that they want to either recruit for intelligence purposes or create intelligence applications targeting U.S. citizens.

 

And so that’s all stuff that we want to avoid. And what we do know about the way China operates is that their civil-military fusion is much different from the way things operate in the United States. Here, there’s essentially a firewall between the two, while, in China, the military and the PLA can compel Chinese companies to turn over their data for national security reasons. So, even though TikTok is not owned by the PLA right now, there is evidence that the PLA may request data from platforms like TikTok and WeChat in order to further their AI.

 

We saw a little bit of that with Grindr. People may remember this story, back in 2019, where CFIUS — the Committee on Foreign Investment in the United States — compelled the Chinese company that owned Grindr to sell off the U.S. portion of it, because there were Chinese engineers accessing the Grindr data, and we were concerned about the national security implications of that.

 

So, I’ve gone on a little long, but I want to turn it over to somebody else. But there’s really — we’re really concerned about how the data can be used by the Chinese government.

 

Matthew Feeney:  Yeah. Thank you. Thank you for that. Your comments prompted me to think of a question for Professor Jaffer, actually. Because, when you think about the scale of the issue here, I think the scale of executive authority comes to mind. So, maybe, Professor, given your work on national security, could you give the audience an idea of how much authority the executive branch has here to mandate either the sale of, or the prohibition of commerce with, foreign-owned businesses? What’s the scope of that authority?

 

Jamil N. Jaffer:  Yeah, that’s a great question, and I would say it’s somewhat unclear. Right? I mean, I think that the executive orders that we saw coming out of the last administration in many ways were unprecedented in scale and scope. Now, obviously, we have laws and rules around investments into the United States. And requiring companies to divest, or prohibiting transactions where foreign investment is coming in — that’s the entire CFIUS regime, the Committee on Foreign Investment in the United States.

 

And, in this case, of course — at least if you think about, for example, the TikTok acquisition of Musical.ly, which is an American company. There, they acquired an American company in a transaction that was not submitted for CFIUS review. We require that to take place voluntarily. It’s required under law, but it takes place at the request of the purchaser. That didn’t happen. They made the acquisition, and then, later on, we saw an effort to unwind it through this process, or at least to look at it for consideration to be unwound. But then, what you saw was a really interesting effort by the administration to force the divestiture of all their U.S. properties — which is primarily the Musical.ly acquisition, but there could have been more — and so it attempted to leverage their access to the U.S. market in that manner.

 

And, so, in a lot of ways, the claimed scope of authority is both breathtaking, but also critical to national security. And so — so it’s really — it will be really interesting to see how it plays out. We know now, obviously, that President Biden has withdrawn both the WeChat and TikTok executive orders. There’s now a process ongoing over the next six months to try and figure out what they’re going to do, and what tools they’re going to utilize to effectuate the same goal. They do seem to have the same concerns. Right? I mean, I think it’s worthwhile to talk about, as Jennifer did, and as I know Margaret may, going forward, you know, what those concerns are. Right? Because a lot of people out there are saying, “I don’t get it. Why is it that the U.S. government cares at all about some dancing videos of kids on TikTok? And so what? So the Chinese have access to it. Great. They have access to the data. Why does — why does that matter? Why should we care?” 

 

WeChat — all right, I get it. You know, it’s a messaging platform, and maybe they have access to what’s going on in there, and they want to use it as a means of control on their own population, and maybe they want to expand it over here. Fine. WeChat seems to make a lot more sense to people, but TikTok weirds people out. Right? And they don’t really understand it. And I want to sort of highlight, you know, part of the reason why this raises such important concerns. And, I think, Margaret, having actually lived through some of this, can really give you some detail on it.

 

But, from my perspective, part of the thing that’s going on here is you have to remember that the Chinese are collecting huge amounts of data, as Jennifer pointed out. Right? We know about their now well-understood attempt to — or, successful attempt to take huge amounts of data about American security-clearance holders, through their attack on, or successful infiltration of the Office of Personnel Management. Again, by the way, that was not a cyberattack. That was just really good cyber intelligence collection. If we could have done it, we would have done it too. Right? They obtained huge amounts of data on all of our security-clearance holders. But that’s not the only one. 

 

We have now publicly disclosed that the Chinese government was responsible for Anthem. They were responsible for Marriott. They were responsible for one of the big credit-reporting agencies. So they had a large amount of data on Americans — American citizens, American Green Card holders, people in the United States — a vast trove of data, in addition to whatever they’re natively collecting. You add on to that what’s happening on platforms like TikTok and WeChat, and who these kids or their parents are friends with, who they share videos with, how they behave in those videos, and the like. And now you’re talking about huge amounts of data combined together. So it’s not just this one TikTok thing in isolation. It’s the combination of all the data the Chinese government’s collecting on Americans, and how that data might be tied together and utilized. Not just for the data itself, but for how that data can train supervised machine learning algorithms.

 

So you can attempt to identify how people might behave, or who they might talk to, or who they might communicate with, because you can see who their connections are here. You’ve got their entire credit reporting record. You’ve got some of their hotel stays, some of their health records. You’ve got some of their data about where they’ve lived, who their associates are, who their family is, very detailed reporting about their travels. And when you combine all that information, that’s where it becomes really powerful. And so it’s not just WeChat or TikTok in isolation, which is important enough on its own, but it’s the way that data can be utilized in combination with the other data sets to train this sort of larger-scale machine-learning AI.
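To make that data-fusion point concrete, here is a minimal, purely hypothetical sketch, not drawn from the panel or from any real dataset: it merges invented record sets keyed by a shared identifier and fits a simple classifier, illustrating why joined datasets can support predictions that no single dataset would. All names, fields, labels, and values are made up for illustration, and it assumes pandas and scikit-learn are available.

```python
# Hypothetical illustration only: invented records showing how disparate
# datasets, once joined on a shared identifier, become more predictive
# together than any single source would be on its own.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Invented record sets, each keyed by the same (made-up) person identifier.
clearance = pd.DataFrame({"person_id": [1, 2, 3, 4],
                          "holds_clearance": [1, 0, 1, 0]})
travel = pd.DataFrame({"person_id": [1, 2, 3, 4],
                       "overseas_trips": [7, 1, 5, 0]})
social = pd.DataFrame({"person_id": [1, 2, 3, 4],
                       "contacts_abroad": [12, 2, 9, 1]})

# Fuse the sources into one profile per person.
profile = clearance.merge(travel, on="person_id").merge(social, on="person_id")

# Fit a toy classifier on the joined features (labels are invented).
features = ["holds_clearance", "overseas_trips", "contacts_abroad"]
X = profile[features]
y = [1, 0, 1, 0]  # hypothetical "of interest" labels
model = LogisticRegression(max_iter=1000).fit(X, y)

# Score a new, previously unseen profile built from the same joined fields.
new_profile = pd.DataFrame([[1, 6, 10]], columns=features)
print(model.predict_proba(new_profile))
```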

 

So I think that’s why it’s important. And that’s why — while these claimed authorities by the government are very aggressive, and very strong, and might have some questions in terms of their legal footing — I think it’s critical, as President Biden now looks at this and decides how to do it, that he recognizes the magnitude of the threat and takes very clear, direct action to ameliorate that threat, whatever methodology that might be through. So let me stop there.

 

Matthew Feeney:  Yeah. Thank you for all that. I think that’s given all of us a lot to think about. I would, though, like to turn over to Margaret, because I know that she will have some views on what has been discussed so far. But I also think it might be worth discussing, perhaps, Margaret, the process of these EOs. You know, what, perhaps, goes on behind the scenes, the implementation of them. And then also feel free to add any thoughts that you have on the other comments so far.

 

Margaret Peterlin:  Great. Thank you. And I know you had asked us to give a brief intro at the start, so I will just say that I’m currently in academia. I’m teaching at George Mason Law, and at Texas A&M University in their master’s program. And, before that, I had worked in all three branches of government at the federal level: I clerked for the Fifth Circuit, worked on Capitol Hill, and worked in the executive branch for two different Republican administrations. And I worked at two global U.S. companies. So, I do have a perspective on how data is produced, how it’s used and valuable for business, and why there might be some very legitimate concerns, domestically, about who has access to our data, and even more pressing concerns when a foreign adversary has it.

 

But, let me answer your question about the process. So, when I hear your question, I hear two parts. One could be, what was the process of the Biden administration in producing the EO? That is less interesting to me, because the administration would have a lot of leeway in the process that they used. But, in terms of the process that President Biden’s EO lays out, I think, in many ways, it’s addressing some of the concerns that came up in the court cases immediately following the Trump administration’s EO. And, in that way, when you read President Biden’s EO, you actually hear him use very adoptive language. He talked about the ongoing emergency declared in Executive Order 13873, which was a Trump administration EO. So that’s a continuation of policy.

 

And, as someone in this space, it’s actually nice when the U.S. can have a very consistent, predictable, continued policy toward somebody who’s been declared a national security competitor by the National Defense Strategy, and somebody who has been determined to be a foreign adversary by the Department of Commerce. So, the process that is laid out in the EO is thorough. And I think that bodes well for the cases that are ongoing — I know the DOJ has asked that many of them be stayed — and the ones that may come, once these determinations happen in December, as Jamil mentioned.

 

So, the process is interesting, because it lays out eight factors. And what I loved about the factors is they really focused on whether or not there’s control, whether or not there’s influence, whether or not there’s ownership, whether or not there’s management by a foreign adversary. And three of those eight factors specifically use that language. And all you have to do is read China’s very own 2017 intelligence law to say, “Check, check, check,” for those three, because the language is very clear in Articles VII, X, and XVIII, which say, “All organizations and citizens shall support, assist, and cooperate with national intelligence.”

 

So, imagine if you had that requirement on U.S. companies like Facebook and Twitter and PayPal. That’s what we’re talking about here, except we’re talking about what I like to call real data hogs. The truth of the matter is, TikTok had reached 2 billion downloads in 2020. That’s more than Twitter, LinkedIn, Reddit, and Pinterest. TikTok has 689 million active users. WeChat has 1.25 billion, and Ant — which is the financial-services arm, Alipay, so think of it as the PayPal equivalent — has 1.3 billion.

 

So, when Jennifer and Jamil talk about, “This is the issue of the data,” we’re talking about true data behemoths. One of the stats that I read recently was that Ant, which is the Alipay service — and WeChat actually has a WeChat Pay feature, as well — has more users outside China — so, non-Chinese-citizen users — than PayPal has globally. So we’re talking about, not just a scale of access to data, but, to Jennifer’s point, we’re talking about people who have said, “We’re going to be the world leader in understanding the ‘so what’ about this data.”

 

And you can have lots of data, but if you can’t parse through it, then it’s not really concerning or impressive, other than as a general liberty concern. But the point is, you’re talking about people who are saying, “When it comes to 5G, we’d like to lay the infrastructure at a 30 percent discount around the world, so we don’t even have to hack into your telecommunication systems. We’re just riding alongside your information. Then, as many of you as we can convince to opt in to our apps, we’ll get your data that way. And then we are going to be the world’s experts on knowing the ‘so what’ about the data.”

 

So, if you think about this, and you think, “Okay, that’s all in the hands of a foreign adversary,” that’s a very different feeling than when you think, “Well, Facebook has a lot of users.” Yes, they have about 1.8 billion daily users. That’s a lot. Twitter has about 10 percent of that, 187 million daily users. But imagine if you took PayPal’s 361 million, Twitter’s 187 million, and Facebook’s 1.84 billion, and said, “Those aren’t three different companies anymore. That’s the U.S. government. And we’re going to do what we’d like with the data, how we’d like it, to include policing, and, as Jennifer said, we’ll sort you on the basis of religion, and send some of you to camps.”

 

And so that’s the risk that we’re thinking about. We’re thinking about, “What does this mean for U.S. citizens? What does this mean for our allies?” And, quite frankly, what Jamil said is critical. They are learning on our data. We’re helping them be better at attacking us and understanding us by the access to this data — sorry. That’s a little bit of the Alabama accent coming through — the “dater.” 

 

So, I guess I would say that the EO has a process that I think will fare better in the courts. And we’ll see how the process goes. They’re still at the beginning of it. But I did want to touch on this issue: the data we’re talking about isn’t 17 videos. That might be what you’re uploading, but what the Chinese government is downloading is massive. They’re getting a complete circumnavigation of your life if they’re getting your social data, your work data, your communications that are happening over Huawei systems.

 

Matthew Feeney:  Yeah, no. Thank you for that. I want to invite Jennifer and Jamil to jump in if they want, but I’ll just use moderator’s prerogative to just ask what the panel — I mean, anyone can feel free to jump in — but, what are we to make of, at least the — the supposed reassurances of TikTok, ByteDance, the executives in these companies, who have tried to reassure Congress and American lawmakers by saying, “Look, the data isn’t even housed in China. It’s either housed in the U.S. or Singapore or elsewhere.” All of this kind of raises the question of, you know, is there any degree of reassurance that these company executives could make, or the Chinese state could make, that would alleviate any of the concerns that you’ve all outlined so far?

 

Jamil N. Jaffer:  No. 

 

Matthew Feeney:  Okay.

 

Jamil N. Jaffer:  I said it jokingly — I think the challenge here is that, with WeChat, there are no pretenses about what’s going on with the data. Right? Everyone understands what WeChat is, what the design is, and what it’s aimed to do. And, in fact, the Chinese government effectuates a lot of its policies, as we were talking about, through these social credit scores and the like, which WeChat helps implement.

 

But, on the TikTok front, they have made this claim that, well, you know, the data is stored elsewhere, and it’s all fine. Don’t worry about it. It’s all good. But, you know — as Jennifer and Margaret both pointed out — they’re subject to very clear Chinese law. And, it is what it is. Right? I mean, you know, they’re a Chinese company, and they operate in China, and they have to comply with that law. Now, whether or not we think they would or they shouldn’t, or whether they say they will or they won’t, or the data’s somewhere else — the reality is what it is. 

 

And, you know, not to draw out a current analogy, but trusting the Chinese government on this front, or these companies that are heavily Chinese-influenced because they have members of the Chinese Communist Party on their board of directors, is like trusting the Taliban to keep up their commitments. It’s stupid when you do it. It was stupid when the Trump administration did it. It was stupid when the Biden administration did it. They shouldn’t have done it. And the Taliban didn’t live up to their commitments. The same has been true at every turn with the Chinese government when it’s come to these types of issues. It’s like playing Charlie Brown and Lucy with the football. Right? Eventually you’ve got to figure out that it’s not going to work. You can’t trust these folks, and you’ve got to act accordingly in your own national interests.

 

The Australians have figured this out well ahead of us. We kept footsying around while the Australians actually did the right thing on Huawei. We talked about it, talked about it, and finally we were pushed into it by our own allies. And we have now finally brought the British around. They kept — they kept playing the game of Charlie and Lucy around the football when it came to Huawei. Finally, the British have come around, albeit too late. BT’s already got Huawei in their networks. So, I think, at least on this front, it is worth saying that trusting the Chinese, or their companies, is not a route to success. At least, that’s my perspective on it. But I’m interested in Jennifer’s thoughts.

 

Jennifer Hay:  I wholeheartedly agree with everything you said, Jamil. I come back to the fact that there is a law that compels these companies to provide any data requested by the Chinese government for national security purposes. And that could be anything. So they have to turn over the data when the government requests it. And I get stuck on that. And do we really want to trust, as Jamil said, the Chinese government to allow these CEOs to uphold these commitments that they’ve made to the U.S. government? I don’t see that. I don’t see that happening, in reality. So, for me, the risk is still there, no matter what they say.

 

Matthew Feeney:  I had a question associated with one of the Trump executive orders, the one that targeted WeChat. Because a federal judge did block at least some of it, I believe, on First Amendment grounds, saying that there were, you know, First Amendment interests of U.S. citizens at stake with something like this. So, if any of you have thoughts on that, I’d be interested to hear them. Because it strikes me that — certainly there are national security concerns — but aren’t there also concerns about Americans who are using these services to communicate with each other, to share their thoughts, and to engage in other First Amendment-protected activities?

 

Jamil N. Jaffer:  Yeah, but I saw that ruling. I don’t — I’m not sure I agree with the ruling. I mean, this idea, somehow, that we’re limiting U.S. person First Amendment rights by prohibiting the use of a particular platform. It’s not like there aren’t plenty of alternatives out there. Right? And, it’s not like — in this case, I believe — I don’t remember if this was the TikTok case or the WeChat case. I thought it was the WeChat, but I could be wrong. But it’s not like you don’t have alternative mechanisms. Right? The idea that this is some significant restriction by restricting the sale of a particular platform, I think — I saw that ruling as a stretch. But, you know, I don’t know if Margaret or Jennifer has thoughts on this.

 

Margaret Peterlin:  I think I would agree with Jamil. I saw it as a stretch, because the right to use a certain app is not one that is a well-established right under the First Amendment, and so it starts to take you down that path. Right? That’s like saying, you know, what if they had required MCI to sell off a piece of it, and now they’re forcing me to go on, you know, a Sprint telephone? I just don’t understand how that would be something that would be upheld. I don’t have it in front of me, so, as a lawyer, I’m hesitant to speak to the case. But my instinct was, I didn’t follow the reasoning.

 

Matthew Feeney:  What does the panel think about the, I suppose, unique features of social media here? So, obviously, we’re using a lot of these platforms to share personal information, to express thoughts. The companies collect all kinds of data about, you know, geolocation, age, race, political affiliations, all the rest of it. But, obviously, social media is not the only industry that is collecting a lot of data on people. You can think of Chinese companies engaged in, you know, smart city technology, robotics. Other industries, of course, use all kinds of artificial intelligence.

 

So is there an argument to be made that there’s something kind of unique about social media? Or do you think that maybe these executive orders didn’t go far enough, and there’s potential for restrictions with other industries? Because it strikes me as kind of interesting that they’re just focusing on social media apps at the moment. 

 

Margaret Peterlin:  So, if I may go first on this one.

 

Matthew Feeney:  Sure.

 

Margaret Peterlin:  So, I think the answer to this has to be really looking at what China has said about their strategic intentions. Right? Again, they have been determined by the U.S. government — both by the Department of Commerce and by the DOD’s National Defense Strategy — to be a strategic competitor of the U.S. and to be a foreign adversary. So, those are the two descriptions that we have, formally, of them. Their strategic intention is to become a world leader in these technologies. Their use of the technologies demonstrates that they believe control of the governed through their data is appropriate. And one of the things — and Jennifer mentioned it — the data that they’re collecting on the Han and the Uighurs in Xinjiang Province is really intimate. One of the things that they collect is how often the front door opens and closes.

 

And so part of that is — I always think about it from a Bill of Rights perspective. If you can track who I’m assembling with, if you can track what I’m saying — I mean, go through our Bill of Rights and think to yourself, “Do they have access to the data where they can track and trace me in a way that, if they wanted to intercept those rights, it would be very easy for them to?” And then ask yourself, “What have they said, explicitly?” I mean, one of the things that I admire about Xi’s exuberance is that he states what he’s going to do. He states, “We’re going to be world leaders. We’re going to be Made in China 2025,” and then he proceeds to execute on it in a very public way. Putting 2 million Uighurs in 380 concentration camps is, to me, a suggestion of how he looks at governing, and how he looks at data. And he’s moving that through his entire society through the social credit system.

 

So, when you ask whether I think it’s under-responsive — that wasn’t quite your question, but that’s how I heard it — because there are these data concerns, I would say yes. I want to talk about telecommunications, in part because it’s something I understand, having worked at a telecommunications company. The shift from 4G to 5G — I always say, “It’s not just another G.” It wasn’t just another G. You go from having meter-level specificity about where somebody is located to centimeter-level specificity.

 

So, yes, I would be concerned about who has access — whether my phone is by my head or by my heart, in a way that could interfere with a pacemaker, or wherever else it is. I would be concerned about a foreign adversary being able to pinpoint somebody’s child and their insulin pump, and what interference could occur. And these concerns become real when you’re talking about a foreign adversary. Right? That’s when you have to say, as a government protecting its citizens and working with its allies, “What restrictions should we look at?”

 

So, my concern about President Biden’s EO is not the process. I actually prefer the process they’ve done this year. My concern isn’t the criteria. I think the criteria are very sensible. My concern is, is it strategically responsive to what the Chinese have articulated their intentions are? And is it strategically responsive to the concerns that we should have about all of these sources of data being vacuumed in by a foreign adversary, when the foreign adversary puts on full display, “This is what we do to people when we have this much data on them”?

 

And so, I always like to say, “These are not accusations.” Right? This is — I’m quoting back to you what their strategic intentions are, and what their execution — their observable execution models are. And that’s one of the things I think the U.S. government needs to take very seriously. The Chinese have a full-spectrum disagreement with us on the issues of data management, how people should be governed, and where the person begins and the state ends. And we need to accept that. You know, people say, “When someone tells you who they are, believe them.” This is very true, in the strategic sense, with China and Xi. I’d actually love Jamil and Jennifer to respond to that if you can — if you can turn to them.

 

Matthew Feeney:  Oh yeah. Please do. I’m just going to interject briefly to say to the audience, I am seeing the questions being thrown into chat. I promise I’ll get to them in about eight to ten minutes, but for the rest of the next ten minutes, I want to still focus on the discussion we have here. So, yeah, Jamil, Jennifer, please feel free to jump in if Margaret prompted any thoughts.

 

Jennifer Hay:  I also agree that the EO is a good start. We need to start someplace, and focusing on social media makes sense, because that is a current and immediate threat, because it is so widely used inside the United States. But the EO definitely does need to be expanded to other types of platforms. You know, I’m concerned about the surveillance platforms that they’re developing, and their use of them to collect data, and then even to apply AI. And, if those start to be exported to police departments inside the U.S., or even companies — you know, Amazon is creating the ability where you walk into a store and they immediately know who you are, and they can target ads to you while you’re in the store, know what you want to buy. You know, that kind of surveillance technology, in turning over that amount of data, really infringes on our privacy. And so, as citizens, we need to be able to make an educated decision on how our data is being used, who is using it, and whether we want to turn over that data.

 

And part of that is to understand, like, the company that owns these applications, and where that data can ultimately be used. So, I do think it needs — we’ve taken a step in the right direction. But, ultimately, as we continue to learn more about software and applications that are collecting data, the Department of Commerce needs to be empowered to be able to look into, ultimately, what is behind the scenes on these applications and software platforms.

 

Jamil N. Jaffer:  Yeah. And I’ll just hit a different piece of what Margaret was talking about. I think she’s exactly right to point out that this is not some sort of made-up concept about where the Chinese are going with this and where this comes from. It comes directly from their strategic doctrine. Right? And, you know, one of, I think, the challenges the United States has — and I really do think the Trump administration should get credit for their highlighting of the very real threat that China poses to our nation. I think that, between their advocacy on this issue, and the reality of the Covid-19 pandemic, and the realization amongst the American people about our supply-chain dependence on China for PPE and pharmaceutical precursors and the like, I think we finally have awoken a bipartisan concern about China and the strategic threat it poses to our nation. But they’ve been saying this for years, for decades. Right? They’ve been putting it in their writings.

 

And part of the problem is — it’s really interesting, there’s a recent piece of legislation out there that actually would create a translation center to take Chinese strategic documents in native language and translate them into English, so that scholars in the United States can read them, and talk about and describe what they’re writing about. We lack a real deep-seated understanding of Chinese strategy, in part, because we don’t read and understand the language. And, you know, they don’t have that problem. Right? China has a lot of English speakers, and it’s one of the benefits of English being the lingua franca of the world. Right? At the same time, you know, we don’t really understand, fundamentally, at a deep level, what our adversaries are thinking. And they’ve been telling us for years, as Margaret points out. We just haven’t been paying attention. 

 

So, it’s good, now, that we are paying attention. It’s good, now, that we’re engaged in this discussion, that we’re rallying our allies around it. I worry that it’s very late in the process to be getting to this point, and they’re well ahead of us. And, to be candid, you know, we’re not helping ourselves in the global effort to confront China as an adversary, because we are not doing a good job. Afghanistan is just one example. You know, it goes back to what the Trump administration did with the Kurds, what the Obama administration did with — on the Syria issue. We are not making our allies comfortable with us, and that we’ll be there to back them. And we are not making our adversaries afraid of us. 

 

And, let’s be clear, the Chinese and the Taiwanese, and all of our allies in Asia and elsewhere around the globe, are watching very closely what we are doing in Afghanistan, our utter failure, as a policy matter — what we did in Afghanistan. And let’s be clear. It was not an intelligence failure, it was not a — and I know we’re not here talking about Afghanistan — but it was not a logistics failure. It was not a planning failure. This was a decision made by policy makers — first in the Trump administration, and then, again, in the Biden administration — about what to do. And everyone, China, our friends in Asia are watching. And it is not a good situation.

 

Matthew Feeney:  We have a few minutes before going over to audience questions. I do want to ask, though, given that I spend a lot of time thinking about privacy: when these EOs came down the line, I remember thinking about whether the executive branch was the correct branch of government here. In the sense that, obviously, there is a national security role within the remit of the executive. But isn’t Congress around to maybe write privacy legislation to at least add some kind of protections here? Is the fact that we’re even having a conversation about a string of EOs a symptom of congressional failure, or lack of movement here, or is this something that is actually best placed within the executive purview?

 

Jennifer Hay:  I do want to clarify that I’m not a lawyer, but, in looking at the privacy implications of this collection of data, I think Congress does need to step in and create some sort of national regime for privacy protection when it comes to collection of data. Because, if we leave it up to the states to do it, then that’s 50-plus different privacy regimes that companies need to comply with. And the inadvertent effect — the second-order effect — of that is that companies will ultimately have to collect more data than they would have collected initially.

 

For example, companies that don’t need location information will now have to collect location information on their users in order to make sure that the company is compliant with the privacy regime that governs each user. So, my personal opinion is that, yes, Congress does need to step in and start to have that discussion on what privacy means when it comes to data collection, and start to create some sort of regime for the United States. You know, maybe not going as far as GDPR, but something similar is what we need.

 

Jamil N. Jaffer:  I think it’s so interesting that Jennifer has that perspective. I don’t agree. I actually think that the — that these issues are best sorted through in the private marketplace. Right? I don’t think that we need to create this massive privacy regime. I agree with her; GDPR is not the way to go. There have been all sorts of problems with it. It really — in a lot of ways, GDPR is not really about protecting European privacy. It’s really about changing the game so that American companies can’t be as competitive in Europe as they are today. GDPR really doesn’t effectively protect privacy in any substantive way, in my view. 

 

And this idea, you know, that all the states should have these privacy laws, and now they’re talking about federal privacy laws — the idea that the government can regulate or legislate in this space in any way that’s effective, with technology changing so quickly, to me is not realistic. Legislation takes forever. It’s oftentimes both overprotective and under-inclusive. And technology moves so quickly, and we’ve been so innovative. Frankly, the technology space has been so productive in those technologies precisely because we have not legislated and not regulated. And the Europeans have been such a failure because that’s exactly what they’ve done: legislate and over-regulate.

 

So, the idea that we should now come in and adopt the European approach on privacy, or adopt the European approach on whatever — I mean, if we want to tank the American technology economy, okay. And I get that there’s a lot of conservatives who think, “Oh, it’s a good idea to regulate big tech and its platforms because we don’t like how they’re treating us on social media.” And liberals say, “Well, they’re not treating workers fairly, so we should regulate them.” So there’s this really nasty cabal of conservatives and liberals coming together, uniting over, you know, perceived concerns, in a way that would bring the government into regulating and legislating in this space. Right?

 

And, really, what that’s going to ultimately do is undermine our economic security, undermine our most innovative sector, and, ultimately, undermine our national security competitiveness, because we’re undermining what is, at the end of the day, the heart of the American economy and our national security establishment going forward. That would be a huge mistake, you know, putting members of Congress in charge of technology, or worse, bureaucrats at the FCC. That’s a train wreck. We shouldn’t do that.

 

Matthew Feeney:  Margaret, any thoughts? If not, happy to turn over to the audience for questions.

 

Margaret Peterlin:  Yeah. I have two quick thoughts. The first — this data might be two years old, but there was a review done in Europe on what the effect of the GDPR was, and what they discovered is that it was doing what Jamil said. Small and medium enterprises were unable to keep up with the compliance costs, and so they saw a reduction in the number of small and medium enterprises over there. So, I think that that was an unintended consequence. I don’t think that was something Europe was trying to achieve. So, I do think the regulatory concerns are significant, in terms of how you could implement it.

 

But, what I find remarkable — and it’s kind of ironic, in a bad way — is that all of this debate, and what Jamil called the big cabal coming together — this is how we feel about privately owned companies that resist, to the hilt, requests for information from the U.S. government. There are people in the U.S. who are very concerned about how much data Facebook has on them, how much data Twitter has on them, all of this data collection. And yet, we’re stalling. We’re stalling out on the issue of this adversary — determined both in the sense that it has been determined to be a foreign adversary, and in the sense that it is a determined foreign adversary — collecting data on us through infrastructure and telecommunications, through requirements on companies, and through surveillance. And yet, we, for some reason, don’t exhibit the same passion to address that concern.

 

So, I would just say, we should address these in the order of greatest concern. And for me, I am more worried about a foreign adversary having information on me than a private U.S. company having it. And, so, for the people who fall in the category of being concerned about all of the above, it’s remarkable to me that we’re trying to piecemeal a response to a centrally planned attack.

 

Matthew Feeney:  Yeah. Thanks for that. I do want to turn over now to some of the audience questions. A reminder that you can throw questions into the chat, but also feel free to use the Raise Hand function. I’m going to take these chronologically. I wanted to start with a question from Ethan Meredith, who writes, “Do the panelists have thoughts on how the restrictions could be drafted to avoid Berman?” I’m also not a lawyer, so I’m going to reveal my ignorance here. I think this is a reference to the Berman Amendment, which has some implication I hope one of the panelists can outline. He goes on to write, “Social media apps seem to be particularly facing difficulties re the Berman, as opposed to, say, financial apps.” Does this prompt any response from any of the panelists? Jamil? Sorry, no? Okay. We can move on if there are no thoughts on that particular one.

 

Jeffery Wood writes, “There is an inverse condemnation argument here. The U.S. federal government owes citizens, and residents, aliens, a duty to protect their privacy. Free speech, freedom of assembly, blah, blah, blah, blah . . . and permitting the —” I’m not sure I quite see a question there.

 

There’s a comment here from Clay Albert about the concern sounding a lot like overstepping on Know Your Customer — on transaction tracking for cryptocurrency and people in those industries. I guess, you know, there might be a question in here that actually occurred to me in the conversation, which is: what kind of data could these companies be collecting that wouldn’t be concerning, but would also keep them functional as social media apps? Or, as alluded to in the other conversation, maybe there’s no amount of assurance about data that would be reassuring to those of us worried about national security.

 

Jamil N. Jaffer:  I mean, I guess you could imagine a world in which there were some limited amounts of data that they collected that you weren’t as concerned about on the national security front. But the typical stuff that these social media platforms gather — you know, Matthew, you ran through a laundry list of them: location, patterns of behavior, content, at times. I mean, these are things that, when aggregated — and, in particular, when aggregated, as I described earlier, with other sources of data and fed into machine learning algorithms to make them more performant — it’s hard to imagine, you know, what you would be okay with them collecting. Right?

 

I mean, I suppose, if they were very transparent with what they collected, and how it’s being utilized, and what it was being combined with, and yada, yada, yada, you might get some level of comfort. But, you know, I just think, at the end of the day, it’s different than my perspective on American marketplace competitors who provide privacy policies. And look, if we waive it, we waive it. How many of us go on our phone every day and say, “Oh yeah, use my location whenever you want, however you want, whatever, only when using the app.” Well, I think that’s a good one, right? Only using the app every five minutes when using Google Maps or whatever, so, okay. Right?

 

But, I think, at the end of the day, it would be hard to imagine them functioning as normal commercial companies that collect a significant amount of data for your free app. Remember that whole idea that, you know, if you’re not paying for the thing, it’s because you’re the commodity. The old saw, now, at this point. It’s hard to imagine me being comfortable. I’m comfortable — unlike Europeans, I’m comfortable with Google having it if I tell them they can have it. Right? I’m not comfortable with the Chinese government having it. Europeans have a different view, right. They’re really comfortable with their governments having it. They’re uncomfortable with the private sector companies having it. It’s sort of the difference between Americans and Europeans, at least, that I’ve perceived. So, on this front, though, I don’t think any of us, Europeans or Americans alike, want the Chinese government having it.

 

Matthew Feeney:  Okay. Thanks for that. There’s a question from Elizabeth Eason (sp) here about how much of the concern we’ve been discussing could be addressed via the Committee on Foreign Investment, if that’s an appropriate body. I don’t know if that’s a body that any of the panelists have views on, but, you know, might that be one way to address some of the concerns here? Or not, that’s fine — I’m sorry. Go ahead, Jennifer. Sorry, I saw someone speaking, but —

 

Jennifer Hay:  Margaret can go ahead.

 

Margaret Peterlin:  No, I mean, CFIUS is actually reviewing some of these transactions right now, separate from the EOs. So, that is a proper mechanism, I think. If you ask me whether I think there should be amendments, I would want to sit down with a CFIUS lawyer and say, “Okay, what are the, you know, three or four things that we think we could do?” My observation of the CFIUS process, if you go back to issues like Qualcomm and others, is that it is a process that can be very cumbersome. And so, if we’re going to send a bunch more traffic through it, then I think we actually have to look at, you know, reforming how it works.

 

But it exists, and it addresses some of these issues. And, in fact, it’s looking at some of the very questions the EO raised. So there may be a requirement for the divestiture to go forward under the CFIUS process right now. And I think that’s a great recommendation to the Biden administration, which is: how much of this are you going to try to do through EOs and other authorities? And how much of this are you going to leave to CFIUS review? Because it’s actually been discussed for years that there need to be improvements to the process — but I know Jennifer had some thoughts as well.

 

Jennifer Hay:  I agree that taking a review of the CFIUS process is important, but one reason why I liked the EO is that I think it fills a gap that CFIUS doesn’t cover at the moment, where it’s looking at applications that are solely owned by a foreign government or by an adversary, as opposed to the adversary purchasing a U.S. company. And so, in my mind, that’s the delineation between this EO and CFIUS. Sometime in the future, I think, you know, as we do that CFIUS reform and potentially look at foreign-owned apps that are used inside the United States, it’s something that CFIUS could potentially take on, but we’re not there yet.

 

Margaret Peterlin:  And, just to be clear, these would be amendments in addition to the FIRRMA amendments that just happened about three years ago. So, I just wanted to point out that this did go through a series of reforms. But I don’t think that it is — even with the FIRRMA reforms — capable of addressing all of the questions that we’re now trying to grapple with through the EO. And part of it is just the traffic, the time that it takes to get through the process.

 

Matthew Feeney:  Yeah. I see a question from Jeffery Wood, who writes, “I agree with the impulse of seeking a private market solution, but will the market protect U.S. national security interests, or only private economic interests? Is the proper model import/export controls, and does that invite a new mercantilism?” And I guess I’ll use that question to use moderator’s prerogative to follow with my own, namely: is there a risk here that we’re inviting some sort of reciprocal response from other countries, and we end up in a situation where American companies make social media products for Americans, and the Chinese have to make theirs for Chinese people? Doesn’t this potentially have an impact on global trade and transactions, or is that concern overblown?

 

Jamil N. Jaffer:  I just love this, like, you know, this idea that somehow, you know, it’s all reciprocal. Right? Let’s be clear. The United States and our government, when it seeks data, operates under a system of laws. A system of laws passed by a transparent process in the Congress, signed by the president, and put out in public for everyone to see, then implemented by judges who are appointed to life terms, and whose processes also are open for the public to see. When the Chinese government operates, it’s the exact opposite. There is no transparent decision-making process. There’s just the Chinese Communist Party. Right?

 

And so this idea, somehow, that when we do something, and we have surveillance laws, or authorities, or the like — and they have surveillance laws, too, and they — you know, it’s the same thing. It’s not the same thing. And so when we decide we’re going to behave in a certain manner because the Chinese don’t have a process that’s transparent and understandable, and, as a result, our data is being collected and used in ways that we don’t know about or understand well, it’s fundamentally different. It’s fundamentally different when it’s the private sector, which is not — doesn’t have to give data to the government unless they’re required to under U.S. law, whereas, there, the private sector and the Chinese government are enmeshed. 

 

Members of the Chinese Communist Party sit on the boards of these companies. They’re influential in the decision-making processes of these companies. It is not the same thing. And so we treat Chinese companies one way because of the way they behave, but the idea that it’s fine for us all to accept that they, or our friends, or our allies will treat us the same way — it’s just not accurate, and they’re not the same thing. Right? And, so, yes, to your question, there is a possibility that there might be reciprocal action. But is reciprocal action the correct and appropriate thing, and should we push back on it? It’s not the right thing, and we should absolutely push back on it. The questioner’s question, though — remind me again, Matthew, what the questioner’s question was? I apologize.

 

Matthew Feeney:  Oh no, I should apologize for piggy-backing off it. The specific question was that Jeffery expressed that he was in agreement with the impulse of seeking a private-market solution, but will the market protect U.S. national security interests, or only private economic interests? I guess the question here is, you know, we can say that private market solutions will emerge, but how much does that market care about national security?

 

Jamil N. Jaffer:  Well, I think that, at least in this realm, there’s a couple places where they’re aligned. Right? Our economic interests are aligned with our private sector. Where our private sector’s successful, the American economy grows. That’s important for America in the national security realm. Right? Because, as the Trump administration correctly pointed out, economic security is national security. We’ve always all known that. They put it in writing. Right? And then, I think that was correct. But, at the same time, you know, it doesn’t mean we can’t regulate the national security space. We can and we should. And we are sort of — there is a little bit of mercantilism going on. I do think that there’s a little bit of that going on when we say we’re not going to allow Huawei in. Right? Will that happen worldwide? Again, I think there’s reasons to think it shouldn’t. But I’d be lying if I said to you that wasn’t what — a little bit of what’s going on here. 

 

But the Chinese have been doing it for years. What did we think Huawei or ZTE was? That’s literally mercantilism writ large. Right? We’ve been treating China like we’re all working in the same capitalist environment — they’re playing fair, we’re playing fair. As far as that goes, we’re just the idiots, watching what’s going on and not paying attention to what they’re actually doing. They are not operating in a capitalist way. They’re operating in a capitalist market. Right? But they’re stealing intellectual property from us and giving it to private companies to prop them up, along with low-interest loans and the like. This idea that somehow they’re playing fair and we’re just losing the competition — it’s not true. They’re playing unfair. We’re playing fair. And we’re just the morons in the game.

 

Matthew Feeney:  Jennifer and Margaret, if you want to add, feel free to jump in. But I’m happy to go on to other questions if you don’t.

 

Margaret Peterlin:  I’d like to reinforce a point that Jamil started, and that was about the Chinese government and how it operates. Because I think it’s really important for us to identify those points of intersection, so that it doesn’t lapse into accusations. Right? Because you don’t set foreign policy on the basis of raw accusations. And so, he talked about the fact that they have requirements to have CCP party members in the businesses. Let’s be clear. It’s in the C-suite. I’m also aware, from talking to people in U.S. companies where there’s a joint venture in China, that there has been pressure for U.S. companies to accept a C-suite-level person from the CCP in the joint venture in China. I’m aware of specific examples of that. So there’s that.

 

Then there’s the 2017 Chinese intelligence law that we keep referring to. And it’s really worth reminding ourselves of what it says. If you’re an organization, you have to support, assist, and cooperate with national intelligence efforts. That isn’t the same thing as the FBI coming with a request that you resist, then they get a warrant, and you keep resisting it, which is what our model is. So that’s another example. Then there is the issue of Huawei and ZTE, and I want to make sure we all realize Huawei is in the U.S. It’s in regional telecommunications companies. The limitation that was put in place actually affected the four major carriers at the time, because the way the prohibition was put into place had to do with whether you were doing work with the U.S. government.

 

So we have Huawei equipment in U.S. regional carriers at the moment. So their approach is a true belt-and-suspenders approach, in terms of their access to information, their commitment to getting it, the number of ways they come in. The point I always try to make is, we are beyond the place where they have to be good at hacking into our systems. They are in the backbone of the system. They are in the C-suite meeting. I mean, ask yourself, as an American citizen, how you would feel if the U.S. Congress passed a law that said, “Here we go. We’re democratically controlled — bicamerally, right now — and we’re democratically controlled in the administration, and we’re going to require a senior party official to be in the C-suite of Google and Facebook and others.” What would the response be?

 

And so I think Jamil is absolutely right. We need to stop equating what’s going on, as if who has your data doesn’t matter. Who has your data absolutely matters. Who knows my secrets matters to me. And so I do think it’s really important that we understand the structure of the Chinese engagement with the data that comes into their companies. And I do think we need to be honest with ourselves that there is a mercantile system and there is a free-market system, and they are not the same. And Jamil is absolutely right. We’re walking around saying, “We’re following the rules.” Great. China isn’t following the rules. In fact, they’re doing their level best to change them. And, again, Xi is explicit. So, when he wants to change the rules of financial management in the world, which way do you think those are going to go? I mean, so, again, what I say is, I think it’s critically important that we take a fulsome strategic view of all of these, literally, points of data, in terms of the relationship of China with our information. And the fact that they have been determined to be a foreign adversary should matter to us differently than a company that isn’t determined to be a foreign adversary. I just —

 

Matthew Feeney:  Yeah. Yeah.

 

Margaret Peterlin:  That’s baseline stuff for me, in my analysis.

 

Matthew Feeney:  Yeah. Thank you for that. We’re running up to the last minute here. And, yeah, sorry I couldn’t get to all the questions. But I would just like to turn over to our host Nathan for a few concluding remarks.

 

Nathan Kaczmarek:  Well certainly, we’ll have to have everyone back to continue the discussion. There’s lots more to say. But, for now, our thanks to Matthew, Jennifer, Jamil and Margaret for their time and expertise today. Audience feedback is always welcome to RTP at [email protected]. Have a great day.

 

[Music]

Jennifer Hay

Senior Director for National Security Programs

DataRobot


Jamil N. Jaffer

Founder & Executive Director, National Security Institute

Director, National Security Law & Policy Program and Assistant Professor of Law, Antonin Scalia Law School


Margaret Peterlin

Adjunct Lecturer

The Bush School of Government & Public Service, Texas A&M University


Matthew Feeney

Head of Tech & Innovation

Centre for Policy Studies



The Federalist Society and Regulatory Transparency Project take no position on particular legal or public policy matters. All expressions of opinion are those of the speaker(s). To join the debate, please email us at [email protected].
