Technology Creates Opportunities for Democracies

Jigsaw Policy Director Scott Carpenter explains how the technology community is responding to the challenges that disinformation campaigns, deepfakes, and manipulated images present to democracies.

Scott Carpenter serves as Policy Director at Jigsaw, formerly known as Google Ideas. A graduate of Hope College and Johns Hopkins University, Carpenter held a number of posts in Washington, including at the State Department, where he served as Deputy Assistant Secretary of State and worked on issues pertaining to democracy, human rights, and the Middle East. He also was an advisor to the Coalition Provisional Authority in Iraq and worked on democracy issues for the International Republican Institute.

A member of the George W. Bush Institute’s Human Freedom Advisory Council, Carpenter spoke in late September with Lindsay Lloyd, the Bradford M. Freeman Director of the Human Freedom Initiative at the Bush Institute, Chris Walsh, Senior Program Manager in the Human Freedom Initiative, and William McKenzie, Senior Editorial Advisor at the Bush Institute. He detailed the threats democracies face from disinformation campaigns, manipulated images, and deepfakes. He explained how the technology community, including his company, is trying to address these modern challenges. And he discussed how technology actually can strengthen communities, despite the way some actors use technologies to undermine democratic stability.

We often hear terms today like fake news and disinformation. Can you talk about the difference between them and what people should know about that difference?

For us, disinformation has a very technical meaning. It’s defined as “coordinated inauthentic behavior online,” and is generally generated by states to confuse, divide, or manipulate. As such, it’s a very specific threat.

The term we increasingly use is misinformation, which our team is particularly focused on. Misinformation is information that might be harmful, such as inaccurate health information.

Fake news traditionally was considered news that was entirely made up out of whole cloth, where neither the source nor the content were in any way true. You have a completely made-up news organization that’s generating completely made-up news. Today, however, the term is more of an epithet that is hurled at one side or the other.

Jigsaw’s online journal, The Current, recently said that there’s no silver bullet solution for stopping disinformation flows. How do you see policy and technology efforts coming together to stem that flow, especially from places like Russia and China that are using it as a geopolitical tool or strategy?

Our threat assessment group has a fairly good handle on various state-backed hacking groups and is increasingly sophisticated in identifying and taking down influence campaigns. Some of the other big social media platforms have developed teams in that same direction.

The broader question that you’re asking is an epistemological one. In my view, the key to protecting an information space lies in creating or developing robust resilience among informed citizens. We need to find ways to increase trust and believability. You can’t simply tell people what to believe or what not to believe. They have to be able to explore, and they have to be able to come to their own conclusions. But you can, in a technical sense, provide additional context when they’re doing so.

Other people we’ve talked to for this series have pointed to the risk of deepfakes and how dramatic and large their impact can be. What is your view on deepfakes and what we can do to get ahead of their viral spread?

Deepfakes, which use artificial intelligence (AI) to create video that is indistinguishable from actual video, have gained a lot of attention in the popular press. But they are not as proximate a threat to the information space as many fear. Out-of-context and manipulated images are much more problematic today.

Last February, for example, India and Pakistan almost got into an altercation over a picture shared of a captured Indian fighter pilot. As it turns out, that photo was simply taken out of context; it had been taken much earlier at an Indian air show where two planes had crashed. At the moment, it is more important to provide additional context in real time around such images before they go viral and spark offline conflict.

That said, we do need to get ahead of the future threat of deepfakes. One challenge in doing so is that there are very few good deepfakes out there. Without good examples, you can’t develop detectors to defend against them. This is why we’ve been working with Google Research to generate the largest dataset of deepfakes for researchers to use in developing detectors.

It will, of course, be a cat-and-mouse game, but we should remember that humanity has been through this before. We had the emergence of the printing press in the 15th century and the mass-produced yellow journalism of the 19th, among others. These were not altogether peaceful evolutions or times, but the technologies that were developed ultimately helped to advance and democratize society. They will do so again.

Like deepfakes, artificial intelligence is not going away. In fact, it’s going to grow. Are there ways free societies can capitalize on or benefit from it?

We are already doing so. One reason I have hope is that open societies are way out in the lead on AI. We are deploying AI to predict wildfires and other natural disasters, improve crop yields, and develop ways to prevent blindness or diagnose cancer. Those are all social goods.

AI also has a lot of promise for accelerating how we learn. It already has expanded our access to other languages and cultures, for instance. AI is obviously really good at recognizing patterns that we can’t and, therefore, might help improve data-driven decision-making in open societies. But it must be carefully and thoughtfully developed. In this regard, Google has developed a set of AI principles to guide how we deploy AI, which I think are also broadly applicable to democratic societies.

The fourth one is most important in the democratic governance context: AI must be accountable to people. That means we will design systems that provide appropriate opportunities for feedback, relevant explanations, and even appeal, and that remain subject to appropriate human direction and control.

These principles are democratic and widely applicable to open societies around the world, in my view.

Thinking back to the Arab Spring, there was a lot of positivity associated with social media. Dissidents used social media to keep each other informed about real-time developments and to coordinate with and rally each other. That seems missing nowadays. How do we get back to that positive connection between social media and the growth of democracy?

Your bias is showing! I don’t think we’ve ever lost that connection. Social media now is helping people register to vote. It has led to a dramatic expansion of access to information and exposure to different points of view — for good and ill.

Social media also has accelerated the sharing of information and transparency. And, significantly, following George Floyd’s murder, a combination of smartphone footage and social media has driven the broadest, most hopeful effort for social justice in the US since the 1960s.

I remember an Egyptian activist telling me shortly after [Hosni] Mubarak fell that “We would use Facebook to plan, Twitter to communicate in real time, and YouTube to tell our story to the world.” That was in 2011, and as I think about it in 2020, that is still what’s happening around the world today. Of course, dissidents now have safer, more private means to plan and communicate than they did back then, which is why you’re increasingly seeing governments shut down the entire internet when threatened.

Where else do you see this positive use of social media?

The most promising recent example has been in Belarus, where just through using Telegram and VPNs, organizers were able to create the largest mass protests that the country has ever seen. Also, think about Iran, where, even during the pandemic, the protests were propelled largely by the internet and by apps like WhatsApp and Telegram.

Sorry to go negative again, but you brought up WhatsApp and Telegram. In some autocratic nations, leaders have used private chat apps to spread disinformation. At the same time, as you’re saying, these apps have helped dissidents organize protests. What are we to make of these competing realities?

This is a fascinating question. Why were Telegram and Psiphon even available in a place like Belarus or Iran? Leaders and elites, especially in authoritarian countries, are increasingly relying on secure forms of communication as much or more than average citizens. I don’t know why, but maybe they don’t want the Ayatollah knowing what they’re saying to one another. Or maybe they want to watch cat videos on YouTube, like everyone else, even in Russia.

For this reason, when governments resort to shutdowns, even the elites complain, and the shutdowns are for the most part short-lived. They are also short-lived because they are so costly, which has given rise to the idea of “collateral freedom”: the more broadly used a set of tools is, the more difficult and costly it is for a government to shut them down. Only Venezuela has been able to sustain its internet shutdown, but its economy is in shambles.

Thinking about your industry as a whole, what kind of conversations occur or might be occurring about how to use technologies to strengthen local communities?

As platforms that serve billions of people around the world, we tend not to think about tailoring the technologies for specific groups. But what we are increasingly trying to do is understand their issues at the local level and expose municipal leaders, local NGOs [non-governmental organizations], smaller businesses, and others to how the tools work. Hopefully, we can help them generate fresh thinking about how they can use the tools to solve problems in local contexts.

Free people are endlessly creative and are already using these tools in amazing ways to serve their communities. We look for places where we can have impact at the local level where people are gathering information. For instance, in the U.S. we’re partnering with the American Library Association to increase digital literacy and skills.

Imagine what small businesses around the world would be doing without the internet and our respective tools in this time of pandemic. NGOs, too, have benefited enormously from these free tools. In the past it was relatively easy for a government to block their websites. But when an NGO has its presence on Facebook, the government has to block all of Facebook to block it.

What are your thoughts about how technology has been used to help curb the spread of COVID-19, especially through contact tracing?

We have been collaborating with Apple to provide “exposure notifications” that are available to those who download the app. The app has been released by a number of states, including New York and New Jersey, to stop the spread, but this is different from contact tracing. Contact tracing enables centralized monitoring, typically by state-based organizations like a local CDC, which can access that information to track cases and therefore check in on specific individuals.

In either case, whether it’s exposure notification or contact tracing apps, the applications are predominantly opt-in, especially in democratic societies. There have been concerns that this is going to limit their effectiveness. How many people will be willing to download an app if they’re not willing to wear a mask, they ask. But a recent study shows that even a 15 percent adoption rate could save lives.

In this series, we have spent a lot of time looking at how technologies are presenting new challenges for democracies, such as through deepfakes and disinformation campaigns. So, let’s flip this around: What opportunities do you see modern technologies offering democracies in the future?

We’re still in the early days for these technologies, particularly AI, so we remain in an emergent and hopeful moment. The pace of change over the past, say, 15 years has been mind-numbingly fast, and our institutions and society have been challenged. But they’ve also been empowered. Imagine a pandemic without Zoom, for instance.

From my point of view, the technologies so far have been democratizing in open societies. Individuals today and in the future will have an increased sense of agency in open societies. And knowledge today is not only for the elites, but for anyone with access to a smartphone. Elites can’t control the narrative any longer, which is a challenge, but also democratizing.

As we’re experiencing at Google, knowledge workers are increasingly able to live and work from wherever. This may allow cities to de-densify or bring back small towns. All of this dynamism creates enormous opportunities for closing equity gaps within our society in education, healthcare, and justice, among other areas.

Democracies are always evolving and, like sharks, if they’re not moving forward, they’re dying. My expectation is that we will continue to evolve toward a more participatory democracy, and that the dynamism of small-d democrats, together with their endless creativity, will combine with technology to fuel a revolution in democratic governance.

This will begin not at the national level but at the local level and build out from there. Local communities are more nimble, and the democratization of these technologies can empower town councils in ways that are more difficult to achieve at the national level.

It’s always darkest before dawn, as they say. But for me, despite everything going on, this remains a hopeful moment. The fact that democracies don’t try to control everything means these technologies will find a way to help us strengthen our societies.