Since its founding in 2013, Civis Analytics has become a leading data analytics firm for progressive campaigns, nonprofits and other private clients. Led by former Obama campaign data guru Dan Wagner, and with backing from Google billionaire Eric Schmidt, the firm was indispensable to Democrats in 2020. Civis Analytics claimed to have worked with every Democratic primary candidate, while the Biden campaign used the Civis platform for all of its data support. The firm surveys some 75,000 people on a monthly basis, testing a vast array of political messages, policy ideas and hot-button issues.
As Civis Analytics’ senior partnerships manager, Jenn Cervella helps the firm’s clients in the progressive advocacy world make sense of their data. She is also vice president of the board at Change the Game, an organization trying to advance diversity and opportunity in the progressive data space.
In an interview with Blue Tent, Cervella discussed advances in analytical work, data advice for progressive stakeholders, Civis’ big lessons from the 2020 campaign, and more.
This interview has been edited for length and clarity.
BT: What are the major differences between how campaigns use data now compared to maybe five years ago?
JC: In 2012, teams spent a lot of time hacking together backend data infrastructure, which meant that there was less time for the more impactful analytics work and data was often out of date. It was very innovative for the time, and it was definitely better than 2008—when the voter file basically existed as Post-It notes and index cards—but it also functioned like a start-up.
Because they were essentially building the plane while flying it, everything was very inefficient and there was little collaboration. This meant that the possibility for innovation was limited to the teams working on particular projects, and those projects would only live as long as the campaign. Each campaign was tasked with building this infrastructure for itself, which meant that duplicative tools and processes were the norm across a variety of organizations and campaigns.
Now, data and analytics have become a well-oiled machine. This past cycle, teams knew what they needed, from specific job functions to technology, and got it from the get-go. As a result, they were able to spend more time innovating and helping their campaigns make better data-informed decisions.
BT: What about nonprofits like advocacy organizations and think tanks—how should they be trying to utilize their data?
JC: It’s a really boring answer, but the first step should be getting all of your data cleaned and in one place. Most people don’t know how much time and effort it takes to clean and unify data, so I always encourage analytics teams to set expectations with their colleagues.
If you don’t have a single, unified view of your organization’s outreach efforts and donor history, you can’t see how your efforts performed across all of your tools (email, texting, digital ads, direct mail, etc.), and which individuals respond best to which messages or forms of outreach. This makes it impossible to create a scalable personalized outreach program, and without that, mobilizing supporters becomes difficult, costly and time-consuming. Once you have that single source of truth, though, there’s a lot you can do: Fundraising teams can create personalized donor campaigns at scale and build models to identify new donors. You can truly analyze what’s working and what’s not, and if something isn’t working, determine why. You can mobilize supporters and reach them where and when they prefer to be contacted, and test messaging so you know what’s going to work for whom.
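To make that “single source of truth” concrete, here is a minimal sketch, not Civis code, of pulling exports from separate tools into one person-level table; the file names and columns are hypothetical.

```python
import pandas as pd

# Hypothetical exports from three separate tools, each keyed by email address.
email = pd.read_csv("email_tool.csv")          # email, opens, clicks
texting = pd.read_csv("texting_tool.csv")      # email, texts_sent, replies
donations = pd.read_csv("donation_tool.csv")   # email, total_donated

# Normalize the join key so the same person matches across tools.
for df in (email, texting, donations):
    df["email"] = df["email"].str.strip().str.lower()

# One person-level view: people missing from a tool get zeros instead of gaps.
unified = (
    email.merge(texting, on="email", how="outer")
         .merge(donations, on="email", how="outer")
         .fillna(0)
)

# Cross-channel questions become one line, e.g. responsive texters who have never donated.
prospects = unified[(unified["replies"] > 0) & (unified["total_donated"] == 0)]
print(prospects.head())
```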
It's important for organizations advocating for policy to have a good sense of where public opinion is on their issues and what messages work best for which people. Technology makes it possible to organize your data so it can answer the question, “How has this issue moved over the course of time, or in relation to the events that happen in the world?”
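As a hedged illustration of that trend question, assuming you already hold individual survey responses with dates, a simple monthly aggregate is often enough to see movement; the file and column names here are invented.

```python
import pandas as pd

# Hypothetical survey file: one row per respondent, with a date and a 0/1 support flag.
responses = pd.read_csv("issue_survey.csv", parse_dates=["response_date"])

# Share of respondents supporting the issue, by month, so shifts line up with real-world events.
trend = (
    responses.set_index("response_date")["supports_issue"]
             .resample("MS")          # monthly buckets
             .mean()
)
print(trend)
```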
BT: One challenge progressive organizations and political campaigns face is tighter budgets compared to their competition, and they may see data analytics as a luxury they simply can't afford. How can these groups put their data to good use without breaking the bank?
JC: Running regular reporting workflows and processes can take forever, but automating some of those processes can make analytics teams way more efficient. For example: Do you really need someone spending hours on end heads-down on an annual report? If you can automate some kind of progress-to-goal dashboard, people can view and share the most up-to-date data.
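As one hedged sketch of that idea, the script below recomputes a progress-to-goal number from the latest donation export each time it runs; the file name, columns and goal are made up, and a real setup would push the result into whatever dashboard tool the team already uses.

```python
import pandas as pd
from datetime import date

GOAL = 250_000  # hypothetical annual fundraising goal, in dollars

# Re-run on a schedule (e.g., nightly) against the most recent export.
donations = pd.read_csv("donations_export.csv", parse_dates=["gift_date"])
ytd = donations.loc[donations["gift_date"].dt.year == date.today().year, "amount"].sum()

summary = pd.DataFrame([{
    "as_of": date.today(),
    "raised_ytd": ytd,
    "goal": GOAL,
    "pct_to_goal": round(100 * ytd / GOAL, 1),
}])

# A small, always-current file (or a push to a BI tool) replaces the hand-built report.
summary.to_csv("progress_to_goal.csv", index=False)
print(summary.to_string(index=False))
```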
Another advantage of having analytics that can see across the organization is the opportunity to find efficiencies and cost savings, whether that’s purging incorrectly formatted addresses from mailing lists or cutting ineffective programs—seeing inefficiencies like these wouldn’t be possible without first investing in data.
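For the mailing-list example, assuming an email list for simplicity, a basic hygiene pass might look like the sketch below; the pattern only catches obviously malformed addresses, and the file name is invented.

```python
import pandas as pd

mailing_list = pd.read_csv("mailing_list.csv")  # hypothetical export with an "email" column

# Very loose shape check: something@something.tld. A real program might add a
# verification service, but even this catches rows that waste sends.
valid = (
    mailing_list["email"]
    .fillna("")
    .str.strip()
    .str.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
)

print(f"Dropping {(~valid).sum()} malformed rows out of {len(mailing_list)}")
mailing_list[valid].to_csv("mailing_list_clean.csv", index=False)
```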
Surveys are also good tools that provide a window into what stakeholders care about—and that knowledge is really helpful in shaping organizational initiatives, dispelling myths and arming grassroots teams. Just make sure the person writing the survey understands the principles of social science because it’s really easy for bias to creep in.
BT: Let's talk about what Civis’ team took away from the 2020 cycle. What messages worked best for Democrats and progressives, and do we have any idea as to why?
JC: For background, Civis tested the persuasiveness of thousands of TV ads, radio spots, talking points and other types of messages using a control vs. treatment framework (considered the scientific gold standard; a bare-bones sketch of that setup appears after the takeaways below). A few key takeaways:
When looking at political races, negative ads were less effective than positive pro-candidate ads, or ads that compared the two candidates—likely because persuadable voters wanted reasons to vote for one candidate vs. reasons not to vote for the other candidate.
Messages that mentioned COVID typically performed better, but really, what we learned was that it’s impossible to identify what works and what doesn’t before testing. Many times, something that worked with one demographic group backfired with another, and our intuition was often wrong. Maybe that’s because the content often comes from consultants who look nothing like the people they’re trying to reach.
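Here is the bare-bones sketch referenced above of how a control vs. treatment message test estimates persuasive lift; the numbers are simulated, and real tests involve far more careful design and modeling than this.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Simulated survey outcomes: 1 = respondent supports the candidate after exposure.
# The treatment group saw the ad being tested; the control group saw a placebo.
control = rng.binomial(1, 0.46, size=1500)
treatment = rng.binomial(1, 0.49, size=1500)

# The estimated persuasive effect is simply the difference in support rates.
lift = treatment.mean() - control.mean()

# A two-sample test indicates whether that lift is distinguishable from noise.
t_stat, p_value = stats.ttest_ind(treatment, control)

print(f"Control support:   {control.mean():.3f}")
print(f"Treatment support: {treatment.mean():.3f}")
print(f"Estimated lift:    {lift:+.3f} (p = {p_value:.3f})")
```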
BT: What about on the right—what worked well for conservatives, and why? What trends did you see?
JC: In general, negative ads were also less effective for the right, probably for similar reasons as mentioned above. One specific finding was that ads about protests were usually less effective than other types of ads.
BT: What might the future of political data analytics look like? Are there any big breakthroughs on the horizon, either in terms of access to data or innovations in analyzing data itself?
JC: There are a few things I’m excited about.
Because there is now better technology that’s becoming standardized, there’s a lot more collaboration amongst progressive analytics teams. People are sharing their code. They’re teaching each other what worked and what didn’t, what code it takes to get the information they need, and how to communicate difficult results to superiors. They’re consistently learning from each other, and as a result, the community of data professionals will just get stronger.
I also think the community will get more diverse over time. If you look at the Biden analytics team as an example, many of their data leads were these really awesome, badass women. It’s exciting to think about how many different groups are leveraging technology, whether it is for email programs, fundraising, or direct contact with supporters. This kind of diversity in data and data practitioners means different perspectives will be introduced and analytics will leave fewer people behind.
I also think we’ll start to see analytics teams more involved in messaging and creative development. As I mentioned earlier, one key takeaway from our message tests was that it’s really hard to predict what works. When campaign teams consistently tested their messaging in advance, they could optimize their spending effectively and, at the same time, start to see themes in what worked and what didn’t. As more people realize that, I think we’ll see message testing become a regular part of ad and message development. That’s really exciting, because creative teams get the opportunity to try out messages that might fail, without consequence, and that gives them more creative freedom.