Research, in general, is a fairly well-understood concept in the advancement world. Most people can describe it and even provide a reason or two why it's useful to integrate it into your work (regardless of the field). But what does it mean to have a research mind? Or to approach things with a research mindset?
"Having a research mindset doesn't necessarily mean you're an expert number cruncher or a master of Excel," says Megan Stevens, director of regional and affinity strategy at Lehigh University in Pennsylvania. "It's knowing that you need to look at evidence and data to structure your strategy."
In other words, being research-minded simply means you are objective-driven and intentional in making decisions. After all, if you don't have a question you're trying to answer or a goal you're trying to meet, how do you know what you're doing or what outcome you're looking for?
In Score! Data-Driven Success for Your Advancement Team, Peter Wylie writes: "In a world that increasingly runs on data and the interpretation of data, university advancement and nonprofit work still tends to follow a combination of gut instinct, intuition, experience, and methods that have worked in the past."
Ann Kaplan, who oversees the Voluntary Support of Education Survey at CASE, agrees.
"Things change, and it's good to be able to prove your instincts or have data to back you up when you propose a change," she says.
Kaplan gives the example of a pre-college institution that started a capital campaign to raise money for a library. The board felt the gift officers should be approaching corporate donors to fund the campaign, but Kaplan's intuition was that the money would come from a different source. When she went back through survey data from previous years, she discovered that when it came to funding property and equipment, parents were the major donors, not corporations. She was able to share that information with a staff member, who was then able to refocus the board.
When a colleague approached Sandra Campero with a desire to fill upcoming school board vacancies with more diverse faces and voices, the assistant vice chancellor of advancement operations at the University of California, Irvine, saw a perfect opportunity for her team to draw upon their research skills and the best data available to provide a targeted list of qualified alumni.
But that's not what her colleague wanted. "I need all the names I can get," Campero was told.
"But what does diversity mean to you?" Campero asked. "Are you looking exclusively for industry leaders? Are you focusing on gender diversity? Or do you mean ethnic diversity?" Campero understands that data without context are not particularly useful.
The original list that her team pulled, before any segmentation was done, yielded about 4,000 names. But then Campero had her team pull out anyone who was listed at a specific capacity level, and then from there, anyone who was already involved with the institution in some way, and so on, continuing to segment until she had a much more manageable 80 names.
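The successive filtering Campero describes can be sketched in a few lines. This is an illustration only: the field names (`capacity`, `is_engaged`) and the threshold are invented, and a real prospect pull would come from a CRM or advancement database rather than a Python list.

```python
# Hypothetical sketch of successive segmentation: start with a broad
# alumni pull, then apply filters until the list is small enough to act on.
# Field names and ratings are invented for illustration.

def segment_prospects(alumni, min_capacity, exclude_engaged=True):
    """Narrow a broad alumni list with successive filters."""
    # First pass: keep only records at or above a target capacity rating.
    shortlist = [a for a in alumni if a["capacity"] >= min_capacity]
    # Second pass: optionally keep only those not already involved
    # with the institution (the "flying under the radar" group).
    if exclude_engaged:
        shortlist = [a for a in shortlist if not a["is_engaged"]]
    return shortlist

alumni = [
    {"name": "A. Lee",  "capacity": 5, "is_engaged": False},
    {"name": "B. Cruz", "capacity": 2, "is_engaged": False},
    {"name": "C. Park", "capacity": 5, "is_engaged": True},
]
print([a["name"] for a in segment_prospects(alumni, min_capacity=4)])
# ['A. Lee']
```

Each filter is applied in sequence, mirroring the way Campero's team kept segmenting until 4,000 names became 80.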
This process also produced a list of people who were currently flying under the radar but who, with a little more cultivation, could be ready to give or volunteer in three to five years.
"It's a win-win," says Campero.
"My colleagues are getting the list that they need while I'm training my team and building their skill sets for applying their instincts to the reports," she continues. "To me, having a research mind means that there's intelligence added to the data. There's some synthesis happening; context and ideas are being offered on top of the raw data."
Dale Wright, associate dean for advancement for the College of Engineering at the University of Illinois at Urbana-Champaign, echoes the importance of providing context for research.
"It's not just about having the list of prospects," he says. "What do we know about these people or their motivations? What do we know about best practices for making donors feel appreciated? We need to take this data and actually let it help drive our future conversations."
While not every institution has a research team, Wright is fortunate to work with one. The team provides information about event attendees, the areas of the country that have the most alumni, how many of those alumni the development team can realistically reach, how they should be contacted—through email or direct mailings—and what information these alumni want to receive in a newsletter. Armed with this information, Wright and his team are able to make informed decisions about their portfolios and how they are going to spend their time and resources each quarter or each year.
"Data on its own is not as useful; it's when you add that [context] that data can help you reflect on your work and spot trends," says Lori Houlihan, vice provost for advancement at University College London and co-chair of CASE's AMAtlas Advisory Committee.
Providing context is what CASE hopes to do with its newly launched AMAtlas, a global resource for advancement-related metrics, benchmarks, and analytics for schools, colleges, and universities. The goal of AMAtlas isn't to simply collect data, but also to create value for CASE members out of what they can learn from the data.
For Houlihan, this means providing the information to put her institution in a global context. She wants to compare her school with other British universities, but also with Australian universities and American or Canadian universities. "Context is important," she says. "Where you think you are isn't necessarily where you actually are, so it's great to get a sense of how you stand compared to other institutions."
In order to create an environment where you can successfully integrate research into your advancement work, your institution should embrace a culture of research or analysis. Generally, a culture of research places a high value on data-driven decision making. It provides a space for advancement teams to not only collect data, but also to synthesize the data for use. While there may be pushback, the beauty of data is that it can be used to justify itself.
Before becoming associate vice president of development and campaign director at California State University, San Marcos, Kyle Button served as the vice president for institutional advancement at Cal State LA for 18 years. While there, he created a culture of research from the ground up.
"We went from a system that undervalued data to one where we realized that it's our first and best tool to identify capacity and that mentality only grew as our shop matured," he says. Resistance and budgetary concerns were raised, but Button was able to use short-term successes to justify longer-term capital investments in infrastructure.
At the University of Illinois at Urbana-Champaign, Wright stresses the importance of collaboration in creating a research culture.
"We should all be doing a better job of helping everyone understand how important our roles are to each other," he says. "We have sessions with the research team so that we can give input back and forth."
For example, if the researchers tell Wright that his team asked for a certain number of new prospects in a year but they only reached out to 2 percent of those prospects, they can have a conversation about their strategy. Do they have good contact information for those people? Do they have enough additional information to make meaningful asks?
"Everyone is well-meaning in both requests and use of the data, but people may be inadvertently frustrated, so we talk these things out and make sure everyone is on the same page," he explains.
Collaboration shouldn't be limited to just advancement staff either. At Lehigh University, Megan Stevens is seeing the benefit of working with other parts of the institution. The university's student newspaper, The Brown and White, is celebrating its 125th anniversary in 2019. In preparation for the celebration, Stevens' team met with a faculty member to discuss the potential of hosting an event or series of events aimed at getting alumni who had worked on the paper while they were students back in touch with one another.
"We've been able to share that data with him to make these decisions. And we've looked at what we've done before that worked well," Stevens says. "It's a big deal, a big anniversary, and the newspaper is well-loved."
It's been helpful to Stevens and her team to have not only the data about the alumni (some of whom are now high profile in their fields), but also information about the types of events that have worked well in the past and how those events were structured. As other departments and faculty members start to recognize that Stevens' team has this type of data and can help provide context around it, they're beginning to seek out these collaborative efforts, which benefits everyone in the long run.
As advancement teams embrace a culture of research, it naturally follows that more work is created and teams grow. When Houlihan started at University College London eight years ago, there were 30 staff members on the development team. Today, there are 75. "We used the Ross-CASE Survey [to benchmark ourselves against other European institutions in terms of philanthropic performance] to make decisions about where to put our resources," she says. "You want that reassurance [from the data] that you're making a good choice."
CASE's Ann Kaplan has numerous examples of institutions that have made hiring decisions based on fundraising data. For example, a two-year college with limited resources was able to hire a full-time employee to the development team. When trying to decide if the employee should work on major gifts or the annual fund, the school calculated its 12 largest gifts as a percentage of all gifts and discovered it was well behind the national average for two-year colleges. Because of this, the vice president of development decided to hire a major gifts officer and focus resources in that area. Within two years, the institution had increased its major gifts, had data to benchmark against, and could justify keeping the major gifts officer on staff.
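The benchmark Kaplan describes is simple arithmetic: express the 12 largest gifts as a share of all giving, then compare that share to a peer average. A quick sketch, with invented figures (the gift amounts and benchmark below are not the college's actual data):

```python
# Illustrative only: gift amounts and the benchmark are invented.
gifts = [50_000, 40_000, 30_000] + [25_000] * 9 + [500] * 400  # all gifts in a year

# Share of total giving represented by the 12 largest gifts.
top_12_share = sum(sorted(gifts, reverse=True)[:12]) / sum(gifts)
print(f"Top 12 gifts = {top_12_share:.0%} of total giving")
# Top 12 gifts = 63% of total giving

national_average = 0.75  # hypothetical peer-group benchmark
if top_12_share < national_average:
    print("Behind the benchmark: a case for investing in major gifts.")
```

The same one-line ratio works in a spreadsheet; the point is that a single comparison against a benchmark can anchor a staffing decision.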
Data can help not only allocate resources, but even create new roles entirely. At CSUSM, as the institution was getting ready to bring its campaign public, Button looked at the staff structure and realized they needed a senior director of principal gifts to meet the demand they were projecting for the campaign. Without that forecasting data, they would have been short a critical staff position and would have had to scramble to fill it while simultaneously juggling the needs of the campaign.
Stevens faced a similar situation at Lehigh University. She saw a need to engage alumni in a professional space.
"We looked at data to say here's how many people we have as members of these [professional alumni] groups now, and here's how many people are in those career fields," she says. "It was the justification that this should be someone's full-time job."
Additionally, they looked at how rapidly the student body was diversifying based on admissions data and saw that the student affairs team had made new hires in the multicultural center. Lehigh's student body, and therefore alumni, was getting more diverse, but the alumni relations team did not have the specialized knowledge, training, or background to know how to best engage them. This led to a new hire on Stevens' team who supports diversity programs.
Knowing the value of research and what it can bring to an advancement team is one thing, but what can you do when you don't have the resources to support new hires or grow a large research division? "It's about being realistic about what [you] can deliver and staying organized," says Campero.
Look at what data you have and see how it can work for you. At a minimum, your institution probably has information on who is giving, where they're giving, what college they graduated from, if they played sports or were in clubs when they attended school, and where they live and work now, says Stevens. That's enough information to begin to craft an engagement strategy, create prospect lists, and fill portfolios. How many alumni can you realistically engage through events? Now that you've identified prospects, how many can you actually visit?
One easy way to cut down on time spent combing through reports is to ensure that your data is clean, notes Campero. It's much easier to work with data that already has duplicates (or entries without contact information) removed. "Maintaining clean, standardized, and up-to-date data is crucial and should be part of an organization's routine business procedures," she says.
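As a rough sketch of the cleanup step Campero recommends, here is one way to drop records with no contact information and collapse duplicates. The field names and the use of a record ID for duplicate matching are assumptions for illustration; real deduplication in an advancement database is usually fuzzier than an exact ID match.

```python
# Minimal sketch of routine data hygiene: remove entries without contact
# information and collapse duplicate records. Field names are invented.

def clean_records(records):
    seen = set()
    cleaned = []
    for r in records:
        # Skip entries with no way to reach the person.
        if not (r.get("email") or r.get("phone")):
            continue
        # Treat records sharing an ID as duplicates; keep the first.
        if r["id"] in seen:
            continue
        seen.add(r["id"])
        cleaned.append(r)
    return cleaned

records = [
    {"id": 1, "name": "A. Lee",  "email": "alee@example.edu", "phone": None},
    {"id": 1, "name": "A. Lee",  "email": "alee@example.edu", "phone": None},  # duplicate
    {"id": 2, "name": "B. Cruz", "email": None, "phone": None},  # no contact info
]
print(len(clean_records(records)))
# 1
```

Running a pass like this before building reports means less time spent combing through noise later.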
Wright recommends first accepting the fact that you can't do it all and then using your annual plan or yearly goals to become more focused. "We know we can accomplish on average one to three things and then we narrow those down to smaller elements," he explains. Each year, Wright and his team build on the previous year's annual plan.
Once you've prioritized and focused your goals, you can tap into free resources, such as webinars or even your local library, says Campero. Many databases are available online with a library card.
"Excel is free and widely available," says Kaplan. "Just download your data to that format and once it's there you can graph things, and even people with no background in statistics or economics can eyeball the data and get a sense of what it's telling you."
She also recommends calling other institutions. People are generally eager to talk about what they're doing and share their knowledge, Kaplan says. Think about reaching out to institutions that are similar to yours as well as institutions that may be aspirational for you. The California State University system, for example, is set up as tiered peer groups.
Younger or emerging schools can collaborate with one another, but they can also look to developing or mature universities—those that have been around longer and have established successful advancement offices—as examples and resources as they build out their own campaigns and development teams, explains Button.
"People talk about differences from continent to continent, but a lot of the basics are the same," says Houlihan. "I get a lot from just talking to other universities and people in similar roles from across the globe."
You can also call CASE or the VSE directly, notes Kaplan. "We're all used to answering questions and we're happy to talk through issues with our members and participants."
The main thing is not to get discouraged by how much you could be doing, says Stevens. It doesn't have to be fancy, "but there is no excuse for not taking a look at the data that you do have to inform the decisions that you're making."
Whether you're doing long-range planning or starting a short-term project or campaign, you should use a similar research-minded approach. Call it a logic map, a framework for action, a blueprint, or simply realistic goal-setting. Whatever the name, it should include these components:
Purpose or goal. What is the problem you're trying to solve? What question are you trying to answer?
Inputs or resources. What people or tools can you use for this project?
Action or intervention. What steps are you proposing you take? Can the actions be broken down into smaller goals or milestones?
Outputs or results. Having worked through the process, what is the effect? What changes occurred because of your actions?
AMAtlas serves as a global resource for educational advancement-related metrics, benchmarks, and analytics, providing a comprehensive, data-rich resource for schools, universities, and colleges. CASE conducts more than 20 surveys globally, including the Ross-CASE Survey and the Charitable Giving to Universities in Australia and New Zealand Survey. These surveys, along with the Voluntary Support of Education Survey, are key parts of AMAtlas.
Caitlin Lukacs is the CASE manager of editorial content.