Awards
Information Systems: McMaster University - Silver Medal

Category 3D: Advancement Programs – Information Systems
McMaster University – Performance Management Report

Contact: Cathy Collins, associate director of research – University Advancement, 50 Main Street East, DTC 125, Hamilton, Ontario, Canada L8N 1E9; phone (905) 525-9140 ext 20286; e-mail: ccollin@mcmaster.ca

Department Staff: Four full-time systems staff. Advancement Services also has four full-time research staff, two full-time biographical update staff, four full-time donations staff, a director, an associate director and one full-time administrative support position.

McMaster’s Fundraising Environment: Advancement services supports both a centralized and a decentralized fundraising operation, with 16 full-time fundraising staff in the central office and seven full-time fundraising staff in the decentralized offices. The university advancement department consists of development, alumni, international advancement, public and government relations, advancement services, and donor relations. We pride ourselves on having a truly integrated operation.

McMaster University Advancement Mission: University advancement is accountable for creating a dynamic environment for strong public and private support for McMaster University through an innovative and integrated program for the university community, our alumni and the public. Our values and principles help us to achieve this. They are: Integrity, Respect, Strategy, Quality, Service and Teamwork.

Why a Performance Management Report?

For the past few years, McMaster’s only performance measure for fundraising staff was the number of visits they completed. The rationale was that if development staff were all “out there” and moving their prospects appropriately, the money would follow. And it did. What changed, however, was the university’s decision to embark on our most ambitious fundraising campaign ever, a comprehensive campaign that would more than double the amount of money we are required to raise each year. In real terms, it meant that we needed to raise two and a half times what we had been raising: from a 2003-2005 average of $16M annually to $41M annually in each of 2006-2009.

The challenge was to create some performance measurements and benchmarks that would ensure campaign success, without also creating an unmanageable amount of “administrivia” for our frontline fundraising staff. That’s where information systems came in.

While systems would ultimately be in charge of creating the report, its mandate did not include defining what those fundraising benchmarks should be. To assist in determining the performance benchmarks, McMaster worked with a consulting firm, KCI Canada, which, based on its significant campaign experience, suggested a number of potential indicators.

These included not only the fairly typical measures of dollars raised and number of visits, but also benchmarks for portfolio size, stages of movement within a portfolio, number of outstanding requests, realization rate for gifts, and the number of potential prospects identified by development staff themselves to enhance the overall prospect pool.

We have various categories of fundraising staff. As an example, principal giving officers have an annual goal of raising a minimum of $5M each. They are expected to work only with donors at the $250K and above giving level. Their target portfolio size is 60-80 prospects, and they are expected to identify five new prospects each month as a result of their visits with current prospects. As a second example, senior development officers have an annual goal of $1M, and work with donors at the $25K-$249K giving level. Much of their work involves discovery calls, so they have a larger target portfolio size (e.g. 120-150 prospects).
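
Expressed as data, those parameters lend themselves to a simple per-category configuration. The sketch below, in Python, is purely illustrative: the structure and field names are our own assumptions, and only the figures quoted above come from the actual benchmarks.

# Illustrative benchmark configuration; field names are hypothetical,
# figures are the ones quoted in the text above.
BENCHMARKS = {
    "principal_giving_officer": {
        "annual_goal": 5_000_000,        # minimum dollars raised per year
        "gift_level_min": 250_000,       # works only with donors at $250K and above
        "portfolio_size": (60, 80),      # target number of prospects
        "new_prospects_per_month": 5,    # identified through visits with current prospects
    },
    "senior_development_officer": {
        "annual_goal": 1_000_000,
        "gift_level_min": 25_000,
        "gift_level_max": 249_000,       # $25K-$249K giving level
        "portfolio_size": (120, 150),    # larger portfolio: much of the work is discovery calls
    },
}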

It was obvious, given all the different parameters for different categories of fundraising staff, that there needed to be a systems-generated solution for tracking progress.

The ultimate goal was to provide a concise document for each individual staff person to measure their progress, without a significant increase in their workload to generate it. Equally important was ensuring that the central database was used as the primary source of information, so that the information would be both easily accessible and as current as possible.

University advancement uses Advance C/S, by SunGard Higher Education, as its database. All members of the department have access to it, and utilize it on a daily basis. It is used for all contact management, gift processing and reporting.

There was no specific budget for this report’s development; it was managed as part of the everyday request workload.

The Challenges: As with any systems project, the biggest challenge of all is to design the output. It’s wonderful for people to have a sense of what they want to measure in “big picture” terms, but the challenge for systems is to translate that into the tiniest detail, and then figure out how to present it.

Stakeholders in this consultation process included the vice president of university advancement, with ultimate responsibility for reporting to the board of governors; the campaign director; our research team, which is primarily responsible for the prospect management processes; our gift processing team, which records all gifts to the university; and, of course, all front line fundraising staff.

We knew that we didn’t want a multi-page report, or a long column of meaningless or hard-to-understand figures, so we came up with the idea of presenting the data as charts in Excel. That way, the report would be a true snapshot, enabling someone to review a large volume of data, or the reports of a large number of staff, and quickly identify potential issues.

As we worked through the process, it became apparent that this was not a report we would be able to run directly in Advance. Our main issue was a limitation of PowerBuilder DataWindows, which did not allow us to format several reports on one page using the layout features that Excel provides.

In the end, we wrote a custom report in PowerBuilder that is run in Advance C/S and allows the user to save a comma-delimited text file containing all the relevant data. The user then opens an Excel file, which is the master template, and, at the click of a button, activates a macro that imports the data and formats the report in a manner that is simple to read and pleasing to the eye. The entire process takes about one minute. Documentation for this, and all other processes, is stored on a shared drive available to all of university advancement.
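
For illustration only, the sketch below approximates the second half of that pipeline in Python rather than the Excel macro we actually use: it reads a comma-delimited export and rolls it up into the per-officer figures a one-page summary would chart. The file name and column names are assumptions, not the real export layout.

import csv
from collections import defaultdict

EXPORT_FILE = "performance_export.csv"  # hypothetical name for the file saved from Advance C/S

def load_export(path):
    # Read the comma-delimited text file into a list of dictionaries, one per row.
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def summarize(rows):
    # Roll the raw rows up into per-officer totals suitable for charting.
    totals = defaultdict(lambda: {"dollars_raised": 0.0, "visits": 0})
    for row in rows:
        officer = row["officer_name"]  # hypothetical column name
        totals[officer]["dollars_raised"] += float(row.get("gift_amount") or 0)
        totals[officer]["visits"] += int(row.get("visit_count") or 0)
    return dict(totals)

if __name__ == "__main__":
    for officer, figures in summarize(load_export(EXPORT_FILE)).items():
        print(officer, figures)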

The other challenge, not to be underestimated, was significant inconsistency in coding. Fundraisers had historically been measured on visits, so their coding around visits was perfect. The other details, such as where dates, amounts, visit stages and other notes were supposed to be stored, and at which level of the prospect and proposal windows, had not been taken as seriously.

As a result, the first several attempts to test the report proved difficult, as the results often came back vastly over-inflated or under-reported. What this did, however, was give us the opportunity to review coding procedures in a non-threatening way with individual staff members. It also provided the opportunity to reinforce the importance of regularly running proposal integrity checks, which help staff spot errors in their coding.
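
A proposal integrity check of this kind can be as simple as flagging records whose required fields are empty. The sketch below is a hypothetical Python example of what such a check might test; it is not the actual Advance routine, and the field names are assumptions.

def proposal_integrity_issues(proposal):
    # Return a list of coding problems found on one proposal record.
    issues = []
    if not proposal.get("ask_amount"):
        issues.append("missing ask amount")
    if not proposal.get("stage_date"):
        issues.append("missing stage date")
    if proposal.get("stage") == "solicited" and not proposal.get("ask_date"):
        issues.append("solicited proposal has no ask date")
    return issues

# Example: flag every proposal in a list that has at least one issue.
proposals = [{"ask_amount": 250000, "stage": "solicited", "stage_date": "2006-05-01"}]
for p in proposals:
    problems = proposal_integrity_issues(p)
    if problems:
        print(p, problems)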

As we resolved the coding issues, however, other issues were raised, and there was one that we ultimately decided not to address, at least in this version: crediting a gift where multiple people had been involved in the ask. It would have required significant modifications to both the technical and functional sides of the system to attach shared gift credit. The final decision was to have development staff record this on a separate spreadsheet that could be printed off and reviewed in conjunction with the Performance Management Report. This turned out to be an excellent decision, as after six months of use there are only two entries on the spreadsheet. Attempting to program that process would likely have delayed the creation and use of this report by months.

The Benefits: The major benefit of this report is that the user can run one report, instead of having to run five or more, and view it all on one page. As a result of this process, fundraising goals and objectives are clearly laid out and clearly understood by our fundraisers.

Rather than requiring detailed data analysis, the visual presentation allows a quick scan to determine whether there are significant issues that need to be addressed. Because all the data in the report pulls directly from Advance, it is a dynamic report that refreshes itself each time it is run, without any additional effort on the part of development staff.

The key surprise benefit was that once we started measuring more than just visits, data entry by the development staff improved significantly, as it was now in their best interest to record information correctly to show the results of their hard work. As a result, we are looking to explore additional ways of benchmarking and reporting at the campaign, rather than the individual, level, now that we trust the coding is more accurate and complete.

As people came forward to ask questions about why their report didn’t show what they expected it to, it gave us the opportunity to explore other training needs with them, and to help them better understand some of the processes, and their underlying principles. The increased dialogue between the advancement services team and the development team really helped to foster the sense that we are all working together to achieve the same overall goal.

As a result of this consultation, and of development staff wanting the report to show all their work, we added a component showing confirmed planned gifts and the number of requests outstanding.

We are thrilled with the results. This was an extremely useful and rewarding process. What started out simply as a performance management tool for development resulted in a significant improvement to our overall prospect management system and a greater sense of team spirit across the entire university advancement department. And it happened because systems captured all the data, facts, questions and feelings; defined the problem in many different ways; identified the real issues as well as strengths and weaknesses; and generated the solution!