The Official Baldrige Blog
Every year a new cohort of Baldrige Executive Fellows gains intensive knowledge about leading organizations to excellence through cross-sector, peer-to-peer learning hosted at the sites of Baldrige Award recipients.
The following interview with Rand Jerris, Ph.D., highlights how a recent Baldrige Fellow applied learning from the leadership-development program to bolster the performance measurement practices of his own organization. Jerris is the senior managing director of public services for the United States Golf Association.
Founded in 1894, the United States Golf Association (USGA) is the governing body for golf in the United States and Mexico. Together with The R&A in St. Andrews, Scotland, we write and interpret the Rules of Golf for golfers around the world, including rules of play, equipment standards, and a code of amateur status. The USGA also maintains a handicap and course rating system that makes the game fair and enjoyable for golfers of differing abilities.
The association has also developed strategic goals of making golf more environmentally and economically sustainable, as well as making it more welcoming to a wider, more diverse population of participants and fans. Our most visible activities are our 14 national championships, including the U.S. Open, U.S. Women’s Open, and U.S. Senior Open Championships, as well as the recently announced U.S. Senior Women’s Open, which will launch in 2018.
First contested in 1895, the U.S. Open is among the premier events in golf; it is also the economic engine that fuels our investment in the game, generating well over half of our $200 million in annual revenue. With such resources, we are able to support substantive programs that advance the game’s economic and environmental sustainability, as well as programs that make golf welcoming for all audiences. We’re proud to have invested more than $1 billion in the game over the past 12 years. In my role as senior managing director of public services, I oversee the association’s sustainability and community programs, as well as the USGA Museum (the oldest sports museum in the United States).
I spend a majority of my time engaged in strategic and operational planning. For the past three years, I’ve led our efforts to improve organizational effectiveness and to create a culture of continuous improvement. While we’re unlikely to pursue a national Baldrige Award anytime soon, we have found that the Baldrige Criteria for Performance Excellence provide an outstanding framework for elevating our performance.
Together with the members of our leadership team, I am thrilled with all that we have accomplished over the past three years. Although the association had existed for more than 116 years, we did not have a formal strategic plan until 2012 (substantively revised in 2014). Since starting our journey in 2012, we’ve also aligned our team behind a renewed mission statement. We’ve articulated our core values for the first time, with active participation by more than 200 members of our staff. We’ve identified, with clarity and purpose, our key customers and initiated key customer feedback mechanisms for all core programs.
Last fall, we gathered every member of our staff from around the country for the first time and staged an all-staff retreat to build alignment around our strategic plan, our values (Lead, Serve, Inspire), and a new brand vision. We also conducted our first-ever employee engagement survey. It’s been quite a ride—and the results have been dramatic (and measurable!).
Most recently, we created and introduced a scorecard of organizational measures for the first time. This has been a transformational process. When we started our journey three years ago with the help of Bo McBee (former chair of the Judges Panel of the Malcolm Baldrige National Quality Award), it’s fair to say that the organization was focused on activities, rather than results. In those first days of working with Bo, we conducted interviews with departmental leaders across the USGA.
A few of the questions focused on outcomes and measures (e.g., Are you having a good year? How do you know? What metrics do you use to evaluate your progress?). The response most frequently given went something like this: “We don’t have measures; we’re not a corporation.” In our mission-driven culture focused on service to the game, the very notion of metrics and measurement was foreign. Yet we intuitively understood that any organization—for-profit or not-for-profit—could benefit from a clearly articulated set of measures that reflected our strategic priorities.
The organizational scorecard that we developed is structured around the four core pillars of our strategic plan: Championships, Governance, Health of the Game, and Community. Our Championships pillar centers on the concept of inspiration—how can we elevate our championships to inspire present and future golfers, as well as golf fans around the world?
The Governance pillar speaks to the integrity of the game and reflects one of our foundational purposes—to ensure that the game is fair for all who play, and that skill, not technology, determines success on the golf course.
The Health of the Game pillar speaks to the sustainability of golf, in particular, to the economic and environmental viability of golf courses.
Finally, our Community pillar guides our efforts to foster a game that is accessible and welcoming to all who wish to play. A fifth section of our scorecard speaks to the effectiveness of our operational activities (“support functions,” in USGA parlance).
In considering the metrics that would be most meaningful to our staff and most effective in capturing our desired outcomes, we found it helpful to think about various categories of measurements: metrics that speak to the reach of our programs; metrics that reflect customer satisfaction; metrics that capture engagement with our programs, and as such are reflections of the quality of our programs; metrics that reflect the effectiveness of our products and standards; and metrics that reflect the brand of our programs, which we view as essential to their sustainability.
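To make these categories concrete, here is a minimal sketch, in Python, of how scorecard metrics might be tagged by pillar and category so they can be grouped on a one-page view. This is illustrative only: the metric names are paraphrased from examples in this article, and the tagging scheme is an assumption, not the USGA’s actual scorecard.

```python
from dataclasses import dataclass

# Illustrative only: metric names are paraphrased from this article, and
# the tagging scheme is an assumption, not the USGA's actual scorecard.

@dataclass
class Metric:
    name: str
    pillar: str    # Championships, Governance, Health of the Game, Community, Support
    category: str  # reach, satisfaction, engagement, effectiveness, brand

SCORECARD = [
    Metric("U.S. Open TV audience (gross ratings points)", "Championships", "reach"),
    Metric("USOPEN.com unique visitors", "Championships", "reach"),
    Metric("Spectator satisfaction score", "Championships", "satisfaction"),
    Metric("Courses with a formal USGA course rating", "Governance", "engagement"),
    Metric("Avg. correlation of skill factors, PGA Tour", "Governance", "effectiveness"),
]

# Group metrics by category for a quick one-page view.
by_category = {}
for m in SCORECARD:
    by_category.setdefault(m.category, []).append(m.name)

for category, names in by_category.items():
    print(f"{category}: {names}")
```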
Reach metrics are common within our Championships strategy; for example, we measure the size of the television audience for the U.S. Open (in gross ratings points) and the number of users engaged with the digital products that surround our championships (unique visitors to USOPEN.com, as well as the “Championships” tab of USGA.org).
Customer satisfaction scores are also critical for our championships because we want to ensure that the best players in the world have the best experiences when they compete in a USGA championship—it’s essential to our efforts to attract the strongest fields. Finally, because significant percentages of annual revenue derive from ticket sales and corporate hospitality sales, we measure customer satisfaction for key spectator audiences at each of our championships.
Engagement metrics speak to the adoption rates of our programs. Within our Governance strategy, for example, we need to know that golfers—from beginners to experts—are engaging with Rules education content. Another key initiative is the international growth of a single handicap and course rating system, so that a golfer from Australia can play a golfer from Austria and a golfer from Argentina and have a fair and fun match. For this to happen, the golf courses in each of these nations must have a course rating that is determined with a consistent and accurate formula. So we’ll measure the number of golf courses worldwide that have a formal USGA course rating.
Among our engagement metrics, none may be more significant than measuring the percentage of golf courses that have reduced maintained and irrigated acreage. As we look out 25 to 50 years at the future of golf, the USGA believes that the most significant threat to the game is water—the cost of water, access to water, and the availability of water. In some regions of the country (southern California being the prime example), water issues are so extreme today that golf courses are being shuttered. Having studied the issue and the various mitigation strategies, we believe that there is one lever that has more impact than any other: reducing the amount of land we irrigate as an industry. To this end, we are elevating awareness of the issue aggressively, and we need to understand how individual golf facilities are responding.
An illustrative example of an effectiveness measure can be found in the governance section of our scorecard: the average correlation coefficient of skill factors on the PGA Tour. As noted above, one of our fundamental responsibilities in setting and maintaining golf equipment performance standards—an important component of USGA Rules—is to ensure that technology never replaces skill as the primary determinant of success in the game. In other words, we don’t want you to be able to buy a better score.
To this end, we invest more than $5 million annually through the work of the USGA Research and Test Center to test golf equipment for conformance to established standards (e.g., clubhead size and length; the coefficient of restitution of clubface materials; and the size, weight, and initial velocity of golf balls).
To confirm that these standards are effective and to understand whether new materials or designs are providing unfair advantages, we measure the correlation between discrete skills (driving distance, driving accuracy, greens in regulation, average putts per hole, etc.) and overall performance, measured by position of finish, on the PGA Tour—the world’s premier tour and the stage on which the world’s most skilled players appear. Elevating this metric to the organizational scorecard ensures that we keep a close watch on this correlation.
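As a worked illustration of this effectiveness measure, the sketch below computes the Pearson correlation between each discrete skill and position of finish, then averages the coefficients. The player data are invented, and averaging absolute values is one plausible design choice (some skills, like putts per hole, improve as they decrease); the USGA’s exact computation is not described in this article.

```python
from statistics import correlation  # Pearson's r; Python 3.10+

# Invented sample data for five hypothetical players. Real inputs would be
# season-long PGA Tour statistics.
finish_position = [1, 2, 3, 4, 5]  # lower is better
skills = {
    "driving_distance": [312, 305, 298, 291, 287],        # yards
    "driving_accuracy": [0.64, 0.61, 0.58, 0.62, 0.55],   # fraction of fairways hit
    "greens_in_regulation": [0.72, 0.70, 0.66, 0.63, 0.60],
    "putts_per_hole": [1.70, 1.73, 1.75, 1.78, 1.80],     # lower is better
}

# Correlate each skill with position of finish. A strong relationship
# (large |r|) suggests skill, not equipment, is determining success.
coefficients = {name: correlation(vals, finish_position) for name, vals in skills.items()}
average_abs_r = sum(abs(r) for r in coefficients.values()) / len(coefficients)

for name, r in coefficients.items():
    print(f"{name}: r = {r:+.2f}")
print(f"average |r| across skills: {average_abs_r:.2f}")
```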
Finally, we have brand metrics, which speak to the relevancy of our organization and its programs. While we are recognized as the “governing body” for golf, the truth is that we hold no legal authority. Golfers choose to play by USGA Rules and choose to play with conforming equipment only because they respect the game and they respect the USGA. For us to be effective, we need to know that we are relevant to golfers—and we believe that the best measures of relevancy are brand metrics.
Of the many opportunities that we first identified to improve organizational performance, the one that we feared could cause the most disruption was the introduction of organizational measures. It was clear that some within the organization viewed the adoption of measures as another step toward the “corporatization” of the USGA, in which we would progress from being a nonprofit organization driven solely by the good of the game to becoming a for-profit entity focused purely on the bottom line.
There were also those who feared that the introduction of metrics would jeopardize individual job security. From such sentiments, we understood that cultural evolution—building an environment of trust across the organization—would be critical for the success of these efforts.
The formal path toward creating an organizational scorecard began with a capstone project that I completed as a Baldrige Fellow in 2013–2014—an invaluable program enhanced by the involvement of Harry Hertz (Baldrige director emeritus) and Bob Fangmeyer (Baldrige director); the talented facilitation of Pat Hilton of the Baldrige staff; and the contributions of Bob L. Barnett as executive in residence. [Editor’s note: As a former member of the Baldrige Program’s Board of Overseers, Barnett played a key role in establishing the Baldrige Fellows program.]
Through the Fellows program, I was first exposed to the power of organizational metrics. I would never have imagined that Lockheed Martin Missiles and Fire Control (2012 Baldrige Award recipient in manufacturing) or Advocate Good Samaritan Hospital (2010 Baldrige Award recipient in health care) could have much relevance to the USGA, but exposure to their measurement-driven cultures opened my eyes to the power of metrics.
The success of Lockheed Martin demonstrated clearly how metrics could be aligned from the organizational level to the individual level to drive both continuous improvement and results. From Advocate Good Samaritan, we learned that a system that advocates complete transparency around metrics to all key customers (patients, their families, their doctors, and all employees) can create a powerful alignment of customer satisfaction and employee engagement that, literally, saves lives.
To improve internal communications, foster greater alignment across the organization, and ultimately impact employee engagement, our executive director, Mike Davis, introduced a quarterly calendar of “town hall meetings” when he assumed his responsibilities in early 2011. We leverage these town halls to communicate important news about the organization, to solicit questions and feedback from the staff, and to advance initiatives for organizational improvement.
One particular theme has been incorporated into every meeting for the past three years: our strategic plan. It is also our intention to elevate discussion of our core values at every town hall following their introduction last December. It was only appropriate, then, that the formal introduction of the scorecard occurred at our most recent town hall in mid-February (where we also discussed the results of our first employee engagement survey). A member of our staff (but not a member of our leadership team) introduced the scorecard, discussing its structure, demonstrating specific measures, and explaining its purpose for driving organizational improvement.
At the end of the meeting, a hard copy of the scorecard, including 2014 baselines, was distributed to every member of the staff; the scorecard also has been posted to our employee intranet site. The first revision of the scorecard, incorporating data through February 28 (the close of our first quarter), was posted to the intranet on March 16. Naturally, the scorecard has also been shared with our board, which will use it to evaluate the annual performance of the leadership team moving forward.
As noted above, we initially thought that our greatest challenge might come from employee resistance to measurement. Thus far, this has not proven to be the case—the great majority of staff members are excited and eager to engage. Rather, the greatest challenge has proven to be the time and effort required to build a system that supports the aggregation and reporting of data on a regular basis. As a small (300-employee) nonprofit organization, we do not have a centralized technology platform (e.g., a customer relationship management database) that consolidates data for our key activities.
Our solution—perhaps temporary—is manual and simple: an extensive Excel spreadsheet that sits behind a single-page report. We realized quickly that the maintenance of the scorecard (and associated Excel spreadsheet) would require considerable effort. To support this work, we restructured the responsibilities of a member of our staff (manager level), and that person is now held accountable for working with individual department leaders or their team members who have been assigned ownership for reporting individual metrics.
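As an illustrative analogue of that spreadsheet-plus-report arrangement (the owners and figures below are hypothetical), the core of such a system is simply a table of metrics, each row carrying an accountable owner and a reported value, rolled up into a one-page view:

```python
import csv
from io import StringIO

# Hypothetical rows standing in for the Excel workbook; in practice, each
# metric owner reports a value into the shared sheet on a set schedule.
WORKBOOK = """metric,owner,pillar,value
U.S. Open TV audience (GRPs),A. Smith,Championships,412
Courses with USGA course rating,B. Jones,Governance,15200
Courses reducing irrigated acreage (%),C. Lee,Health of the Game,18
"""

# Group each owner's reported value under its strategic pillar.
report = {}
for row in csv.DictReader(StringIO(WORKBOOK)):
    report.setdefault(row["pillar"], []).append((row["metric"], row["owner"], row["value"]))

# Render the single-page report, pillar by pillar.
for pillar, rows in report.items():
    print(pillar)
    for metric, owner, value in rows:
        print(f"  {metric}: {value}  (owner: {owner})")
```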
This latter point should not be taken lightly: We have come to understand that each metric needs to be assigned an owner, and that each owner has individual accountability for ensuring that the system or tool is in place to collect the necessary data and ensuring that the data is being reported. For example, one of our organizational priorities is to drive greater diversity and inclusion in golf. To this end, we are seeking to understand the penetration of our core programs to diverse audiences, in particular, women and persons of color.
As we do so, we recognize that we need to start the actual collection of information on gender and ethnicity, so we realize that we need to introduce relevant questions into championship entry forms, volunteer and committee biographical forms, program registration forms and systems, etc. It’s one thing to say you are going to measure diversity and inclusion in core programs, but it’s a far more complicated process to ensure that the data are actually being collected, aggregated, reported—and, we hope, used to drive improvement.
Finally, for the scorecard to be effective as we roll it out, there is also the need for baseline data. Once we had identified the relevant metrics, assigned ownership, and confirmed the processes for collecting and reporting data, we made an effort to identify relevant 2014 baselines. In some cases, this required staff members to revisit 2014 records and documents to reconstruct data, but the effort has proven worthwhile. In the end, the scorecard contains only five metrics for which a 2014 baseline could not be established, but more than 30 where baselines could be identified.
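A brief sketch of how those baselines come into play (values invented): once a 2014 baseline exists for a metric, each update can be expressed as change against it, while metrics without baselines are flagged rather than silently dropped.

```python
# Invented figures; None marks a metric for which no 2014 baseline
# could be reconstructed.
metrics = {
    "USOPEN.com unique visitors (millions)": {"baseline_2014": 20.0, "current": 23.5},
    "Spectator satisfaction score": {"baseline_2014": 4.1, "current": 4.3},
    "Reach of programs to diverse audiences (%)": {"baseline_2014": None, "current": 12.0},
}

for name, m in metrics.items():
    if m["baseline_2014"] is None:
        print(f"{name}: {m['current']} (no 2014 baseline; tracking begins this year)")
    else:
        change = (m["current"] - m["baseline_2014"]) / m["baseline_2014"] * 100
        print(f"{name}: {m['current']} ({change:+.1f}% vs. 2014 baseline)")
```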
As work on the scorecard proceeded, one of the primary challenges was to build alignment across the leadership team as to the metrics that matter most. Based on my conversations with colleagues in other organizations and industries who had embarked on similar efforts, I quickly learned that this difficulty was no cause for concern: everyone I spoke to about the process assured me that selecting the best metrics is the hardest part of the project.
But I also learned another important lesson that I shared over and over with my colleagues on our leadership team: it’s far more important to get started with good metrics than to wait to identify the perfect metrics. In fact, you cannot truly understand what the perfect metrics might be until you get started. The key is to identify measures that you think are appropriate and then be willing to refine them once you start to use the results and understand how they are affecting the operation. The process of selecting metrics is itself one that can and should be subject to continuous improvement.
So as we move forward, I fully expect that continuous assessment and refinement of our metrics will occur. As we get underway, I also have my eye on identifying relevant external benchmarks. While it is one thing to compare the performance of the U.S. Open against the U.S. Women’s Open, it would be another to benchmark the U.S. Open against other elite tournaments in golf or, even better, against the very best championships in all sports.
Unfortunately, the golf industry does not yet have a culture of transparency, so it is hard to find relevant industry data. But we will not let that stop us from trying. One of our objectives for the near future is to initiate an industry-wide dialogue about improving the game and bringing greater value to golfers. And we’re going to suggest that sharing data across the industry might be a powerful way to drive change.