Medical translational science, the process of converting scientific discoveries into improved health, requires innovation, analysis, adaptability, and dissemination of information.(1) Tracking and evaluating translational science endeavors, which is necessary to support strategic planning, measure progress, and ensure compliance with various governance requirements, demands similarly dynamic oversight.
The evaluation unit at the University of California San Diego (UCSD) Altman Clinical and Translational Research Institute (ACTRI) supports the ACTRI’s programs and unit leaders in developing goals, targets, and relevant metrics of success; assessing progress; reviewing performance and management; and establishing continuous quality improvement methods.
Finding existing logic models(2) lacking, the evaluation unit conceived a novel application of the balanced scorecard (BSC), a tool that businesses traditionally have used for strategic management and performance tracking, by incorporating non-economic performance metrics to supplement conventional financial criteria.(3)
The evaluation unit painstakingly developed, refined, and implemented a BSC tailored to the academic research environment. The big-picture value of the scorecard approach has been its ability to incorporate direct “customer/stakeholder” insights (through satisfaction metrics); increase our capacity to align and re-align our initiatives with our funding institution’s changing goals and priorities; and strengthen inter-group collaborations and an emphasis on “learning from within.”(4)
In practical terms, the BSC informs the ACTRI strategies that drive adherence to grant requirements and effective outcomes, and thus serves as an integral tool contributing to continued funding. Here, we present an overview of the basic attributes and benefits of the BSC in our application and the process of piloting novel change, with a focus on the actors involved, the lifecycle of adoption, lessons learned, and a path forward.
In fall 2012, after experiencing earlier BSC successes in the academic department of medicine,(5,6) ACTRI implemented an electronic in-house version of the balanced scorecard for strategic management purposes.
Not to be confused with a data dashboard, the original BSC product of Drs. Robert Kaplan and David Norton was created for the purpose of unifying mission, vision, and planning, while offering a high-level view of how the institution or system is performing in regard to its main domains, key stakeholders, and available assets.
Other advantages of the BSC framework include its focus on cross-functional relationships; ability to present strategic objectives tied into actionable measures; and capacity to visually communicate organizational strategy in an adaptive, continuous cycle process of planning, doing, monitoring, learning, and acting.(7) Although the balanced scorecard has gradually become better known in healthcare settings,(8) we have not seen it applied to translational research institutions such as ours.
Context of Adoption and Implementation
A relevant starting point for analyzing the BSC implementation process at the ACTRI is the lifecycle of technology adoption; its four phases are formative, growth, mature, and decline.(9)
The formative phase is defined as a period of little growth and small numbers of actors engaging the novel technology. The second stage is characterized by high growth and high entry rates, a period of rapid expansion whereby the system starts enjoying a critical mass of users and standardization processes begin.
Stage three is characterized by a high degree of specialization but low growth in user numbers, a period of stabilization and institutionalization of the technology. In the fourth and final phase, decline, no new users are expected; rather, increasing numbers of users abandon the technology in search of new options. The technology eventually loses its relevance and established value, and a final breakup may take place.
To fully understand how a new technology is adopted and implemented, we need to consider the actors involved in the process as well as their particular nature in relation to the ecosystem where they interact. The work of sociologist Everett Rogers(10) is particularly enlightening. Rogers detailed five categories of technology “adopters”:
Innovators: Risk-takers, usually the first individuals to adopt an innovation, youngest, highest social class, have the financial resources to tolerate and absorb failures, have closest contact to scientific sources and interactions with other like-minded entrepreneurs.
Early Adopters: Opinion leaders among adopters, higher social status, financial liquidity, more socially forward than the next categories, more discreet in their adoption choices than the innovators.
Early Majority: Those who adopt the innovation significantly later than the two previous groups, above-average social status, some contact with early adopters but seldom opinion leaders in the system.
Late Majority: Accept the innovation well after the average participant, naturally skeptical toward innovation, have below-average social status, little financial liquidity, in contact with late majority and early majority, little to no opinion leadership.
Laggards: Last to partake of the innovation, no opinion leadership, tend to be risk- and change-averse, focused on traditions, lowest social status, lowest financial liquidity, oldest among adopters, little contact with anyone outside their immediate circle.
Although our institute is not a classic representation of society, there are internal and external hierarchies among our various units and their leaders (Figure 1). During the implementation of the balanced scorecard,(5) we have witnessed well-segmented and identifiable periods of BSC adoption and various “types” of innovation adopters among unit leaders and their staff that align with the above-noted classifications.
Figure 1. Timeline of BSC Adoption and Phases of Technological Innovation
Additionally, our lifecycle of technology adoption did, indeed, loosely correlate with the previously referenced observations of Markard.(9) Detailed below are our observations.
Formative Phase (2012–2015)
All ACTRI units were mandated to utilize the BSC beginning in fall 2012.(5) The scorecards, as initially implemented, were a slightly revised version of the original tool, the most noticeable difference being the addition of a radio tab labeled “Customer and Stakeholders” to provide greater inclusivity and accountability to the decision-maker domain.
The formative stage was characterized by significant pushback and few truly committed users (better depicted as a combination of “innovators” and “early adopters”). Our evaluation unit conducted one-on-one training sessions with unit leaders and their staff, providing extensive “how-to” and FAQ documentation, as well as other learning materials, including background knowledge on the original Kaplan and Norton(3) work.
Adoption was easiest for leaders and staff members in the more technology-intensive units, as compared with service/people-oriented entities.
A recurrent problem encountered at this juncture was the lack of specificity in the information entered. Often, units left one (or several) items blank; the “Lead” category was frequently unpopulated, which presented a hurdle in terms of accountability. Similarly, as Figure 2 shows, measures and targets were often not S.M.A.R.T. (specific, measurable, achievable, relevant, and time-bound).
Figure 2. Sample BSC Formative Phase
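For readers who maintain similar tools, a minimal sketch of the kind of automated completeness check that would flag these formative-phase gaps appears below. The field names and the S.M.A.R.T.-style rules are hypothetical illustrations, not the ACTRI production schema.

```python
# A minimal sketch (not the ACTRI production code) of a completeness check
# for the formative-phase problems described above: blank "Lead" fields and
# measures/targets that are not S.M.A.R.T. All field names are hypothetical.
from dataclasses import dataclass
from typing import Optional, List

@dataclass
class ScorecardItem:
    initiative: str
    lead: Optional[str]      # accountable person; often left blank early on
    measure: Optional[str]   # how progress is quantified
    target: Optional[str]    # e.g., "80% satisfaction by Q4 2015"
    due_date: Optional[str]  # ISO date string, e.g., "2015-12-31"

def completeness_issues(item: ScorecardItem) -> List[str]:
    """Return human-readable gaps for one scorecard entry."""
    issues = []
    if not item.lead:
        issues.append("no Lead assigned (accountability gap)")
    if not item.measure:
        issues.append("no measure (not Measurable)")
    if not item.target:
        issues.append("no target (not Specific)")
    if not item.due_date:
        issues.append("no due date (not Time-bound)")
    return issues

item = ScorecardItem("Expand pilot grant program", lead=None,
                     measure="Applications per cycle", target=None,
                     due_date=None)
print(completeness_issues(item))
# ['no Lead assigned (accountability gap)', 'no target (not Specific)',
#  'no due date (not Time-bound)']
```

In practice, such a check could run on save or before each quarterly review, surfacing unpopulated “Lead” fields before they become accountability gaps.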
Toward the end of the second year, and with the help of the early adopters and their targeted efforts in diffusing the tool, we managed to increase acceptance and trust, as shown in the dashboard presented in Figure 3.
Figure 3. BSC User Adoption Dashboard
Growth Phase (2016–2019)
After the initial buy-in process, we progressed to the second phase (growth), in which early majority users were brought in. Although we believed our electronic scorecards were designed to best suit our internal needs, in reality few units maintained continuous updating (i.e., entering the most recent data on their self-determined Key Performance Indicators, or KPIs).
Most of these difficulties stemmed from users’ misperception that projects, ongoing collaborations, and similar items were static (i.e., that once an initiative was placed within a domain, that placement was permanent), as opposed to the purposefully designed ability of items to move dynamically through various fields as they evolved.
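To make the intended dynamism concrete, the sketch below models an initiative whose domain assignment is mutable state with an audit trail. The domain names are illustrative placeholders loosely based on the classic BSC perspectives, not our production schema.

```python
# A minimal sketch of the dynamic behavior users misunderstood: an
# initiative's domain assignment is mutable state with a history, not a
# permanent placement. Domain names are hypothetical.
from dataclasses import dataclass, field
from typing import List, Tuple

DOMAINS = {"Financial", "Customer and Stakeholders",
           "Internal Processes", "Learning and Growth"}

@dataclass
class Initiative:
    name: str
    domain: str
    history: List[Tuple[str, str]] = field(default_factory=list)

    def move_to(self, new_domain: str, reason: str) -> None:
        """Reassign the initiative to another domain, keeping an audit trail."""
        if new_domain not in DOMAINS:
            raise ValueError(f"unknown domain: {new_domain}")
        self.history.append((self.domain, reason))
        self.domain = new_domain

pilot = Initiative("Community engagement pilot", "Learning and Growth")
pilot.move_to("Customer and Stakeholders", "pilot matured into a service")
print(pilot.domain, pilot.history)
```

Modeling movement as a first-class operation, rather than a fixed placement, is exactly the behavior users initially missed.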
Some of our unit leaders were, perhaps, still wedded to the linearity of the previously used logic-model strategic management tool. A reluctance to change, coupled with incomplete recognition of the BSC’s flexibility to accommodate fluid KPIs, also contributed to the updating deficiency.
The evaluation unit used one-on-one meetings to resolve the confusion, although this remains the most recurrent issue we encounter with every BSC review. (Every quarter, our evaluation team meets once or twice with each unit’s management to review their individual scorecards and “refresh” or update them in a more systematic way. Although each unit decides how often to modify its specific KPIs, it must update them at least quarterly to comply with this standard institutional requirement.)
The few late majority arrivals were, for the most part, new unit leaders or administrative staff for whom the process of quantifying and measuring their strategic goals proved more challenging, given their lack of familiarity with our processes and our electronic software tool.
Beyond technological competency, some of these late majority arrivals were not necessarily the most open to change, which is why leadership buy-in was essential to our success. Ultimately, the consistent support for the BSC from our institute’s leadership guaranteed its full adoption.
To gauge BSC impact, the evaluation unit conducted three voluntary and anonymous standard satisfaction surveys of all unit leaders in 2013, 2015, and 2016, as well as follow-up semi-structured participant interviews with unit leaders and their staff over a span of three years corresponding to the growth phase. (The 2013 and 2015 surveys took place during the directors’ retreat and, as such, had high participation rates of 93% each, while the 2016 survey was a standalone effort with a participation rate of 62%.)
Mature Phase (2020–)
We currently are in the third phase (mature) of BSC implementation. Quality improvement emphasis has been focused on scorecard redesign to match our most recent CTSA grant goals (Workforce Development, Collaboration and Engagement, Integration, Methods and Processes, and Informatics) while incorporating a parallel mechanism of accountability.
Figure 4 illustrates the differences in completion and detail for the scorecard entries. In addition, day-to-day operations/project management software is being used to track progress toward balanced scorecard targets.
Figure 4. Current View of BSC
While the evaluation unit continues to advocate for the strategic management nature of the BSC, the unit leaders in our center requested help creating more accountability mechanisms; consequently, project management software now serves as a complement to our BSC but is limited to detailed day-to-day operations (Figure 5).
Figure 5. Current View of Project Management Board for One Unit
In creating novel, complementary adjuncts (Figure 6), we believe the logical decline of the BSC strategic management tool has been delayed, at least for the time being. Any and all technologies and software are, at least theoretically, bound to decline, regardless of how dynamic and adaptive they may be. This is due to the ever-increasing likelihood that new technologies and software options will come to market and disrupt the status quo, as well as emerging threats to security and privacy that the original tools were not designed to address. Policy changes and funding opportunities at the institutional level may also threaten the survival of any tool.
Figure 6. Workflow of Delegation Lines/Accountability for Strategic Management in Our Institute
We are currently in the process of standardizing the interaction and cross-software connectivity of the BSC and project management platforms to facilitate data input and visualization for unit leaders and their staff.
Discussion: Lessons Learned
The balanced scorecard was introduced to better align our institute’s research mission and vision with our actual initiatives and performance, following the established commitments of our foundational grant. Before the balanced scorecard, this connection between our mandate and our work was not as clearly delineated, and neither were the variety of stakeholders involved in each of the translational continuum stages. Through the scorecard, we have rendered more visible the inner workings of our organization.
Our implementation and quality improvement processes for the balanced scorecard have taught us three primary lessons.
1. You get what you put in.
The balanced scorecard is a living document. As such, it requires relatively constant interaction and flexibility, usually at every reporting deadline for the near term (annual progress reports) and for mid-term grant progress milestone setting.
Strategic management under this approach is brought online by the individual user through our in-house developed, maintained, and copyrighted software that is password-protected, easy to access, and self-explanatory. Users arrive at a landing page that has training materials, inter-unit message boards (for internal communication), an institutional level message board (for general announcements), and a drop-down menu to access each unit’s specific scorecard(s). Having gained access to their scorecards, units have editing privileges and control of their views.
Each scorecard has a general format (four or five tabs, each pertaining to the main programmatic domains, depending on the scorecard year) and as many items (initiatives) as unit leaders decide. As explained, the frequency of access to scorecards is primarily left to unit leaders and their staff, except for the minimum requirement to have updated information every quarter. Given the “high level” of information captured in the scorecard, there is no need to view the BSC daily.
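A minimal sketch of that one hard rule follows, assuming a hypothetical schema in which each unit's scorecard records its last update; the 92-day threshold stands in for "within the current quarter" and is not our production logic.

```python
# A minimal sketch, with hypothetical names, of the single hard requirement
# described above: whatever cadence a unit chooses, every scorecard must
# show an update within the current quarter.
from dataclasses import dataclass
from datetime import date, timedelta
from typing import List

@dataclass
class Scorecard:
    unit: str
    last_updated: date

def stale_scorecards(cards: List[Scorecard], today: date,
                     max_age_days: int = 92) -> List[str]:
    """Return units whose scorecards have not been touched this quarter."""
    return [c.unit for c in cards
            if (today - c.last_updated) > timedelta(days=max_age_days)]

cards = [Scorecard("Biostatistics", date(2021, 1, 15)),
         Scorecard("Pilot Grants", date(2020, 6, 2))]
print(stale_scorecards(cards, today=date(2021, 4, 1)))
# ['Pilot Grants']
```

A report of this kind could feed the quarterly review meetings described earlier, focusing the evaluation team's attention on the units most in need of a "refresh."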
Those unit leaders and managers who routinely record entries in their particular scorecards (i.e., major accomplishments, new initiatives, completed tasks, required grant-based KPIs, and unit-created original metrics) tend to do significantly better at BSC upkeep than those who need to be coerced into compliance. Consequently, a well-maintained BSC facilitates preparing for annual reports and external advisory board meetings.
2. There are many drivers of implementation.
Several factors have influenced BSC adoption in our environment. In the broader picture, the BSC has a well-established reputation as a strategic management tool and is employed by our parent institution, the University of California San Diego.
The primary drivers of local implementation were the most frequent users (unit managers) and/or tech-savvy unit leaders who were early adopters. Word of mouth among unit leaders and their opinion leadership with their own staff has been the single most important diffusion mechanism.
A key component in ongoing applicability has been a regularly scheduled twice-yearly meeting between the evaluation unit and individual units to review the scorecard; an additional review is initiated when preparing for institution-wide reporting deadlines, and ad-hoc meetings are conducted upon request.
3. Tool flexibility and adaptability are important.
We have learned that constant solicitation of feedback from users, implementation of requested changes, and introduction of updated/novel features are requisite for survival. As an example, our most recent balanced scorecard revamp occurred two years ago when we created a more customizable, better-looking interface following findings from a 2016 survey.
Specifically, the original four domains were modified to five strategic aims as outlined by the National Center for Advancing Translational Sciences in our Request for Application. Additionally, several extra “edit” features were introduced, such as the ability to move initiatives across quadrants/domains, the capability to renumber items according to reprioritization, and, finally, the removal of unused features.
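A sketch of the renumbering behavior is below; it assumes a simple list-of-dicts representation rather than our actual implementation, and shows how display numbers are rewritten after reprioritization so the visible numbering always matches the new order.

```python
# A minimal sketch of the reprioritization feature described above: items
# are reordered by a unit-assigned priority, then renumbered sequentially.
# Field names are illustrative, not the production schema.
from typing import List, Dict

def renumber(items: List[Dict]) -> List[Dict]:
    """Sort items by their new priority and rewrite the display number."""
    ordered = sorted(items, key=lambda it: it["priority"])
    for n, it in enumerate(ordered, start=1):
        it["number"] = n
    return ordered

items = [{"name": "KPI dashboard refresh", "priority": 2, "number": 1},
         {"name": "DEI recruitment metric", "priority": 1, "number": 2}]
print([(it["number"], it["name"]) for it in renumber(items)])
# [(1, 'DEI recruitment metric'), (2, 'KPI dashboard refresh')]
```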
We are committed to continuous process improvement, and the most recent addition of project management is having a clear, immediate effect in increasing unit leader and staff engagement with both strategic and project management software.
The decision to integrate these two very different tools was a response to our commitment to user satisfaction and stakeholder accountability; some unit leaders consistently requested added functionality from our balanced scorecard interface that, by design, was not available in a strategic management tool. Adding a link within our in-house scorecard platform to a commercial project management tool allows a “one-stop shop” where a bird’s-eye view of plans, milestones, targets, and accountability is presented together with direct access to delve into the minutiae of daily operations.
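At its simplest, the linkage amounts to storing a deep link alongside each scorecard item. The sketch below illustrates the idea with an entirely hypothetical URL pattern; no real project management endpoint is implied.

```python
# A minimal sketch of the "one-stop shop" linking described above: each
# scorecard item stores the URL of its board in the commercial project
# management tool, giving the high-level BSC view direct access to
# day-to-day detail. The URL pattern is hypothetical.
from urllib.parse import quote

PM_BASE_URL = "https://pm.example.org/boards"  # placeholder, not a real endpoint

def board_link(unit: str, initiative_id: str) -> str:
    """Build the deep link stored alongside a scorecard item."""
    return f"{PM_BASE_URL}/{quote(unit)}/{quote(initiative_id)}"

print(board_link("Workforce Development", "mentor-training-2022"))
# https://pm.example.org/boards/Workforce%20Development/mentor-training-2022
```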
Although these changes have not yet been assessed for their impact on user satisfaction, we will engage in such efforts in the near future. Evidence suggests that the BSC continues to provide unit and institutional leadership with high-level visual organization, understanding, and cross-unit sharing of institutional plans. It constitutes a consolidated roadmap for success, while the project management tool is a daily log of day-to-day tasks.
There is no confusion about the benefits of each platform. The updating required for effective record keeping has driven a division of labor: unit leaders focus more on the BSC while unit managers/other staff handle the project management tool.
Future Work
An external advisory board composed of national subject-matter experts meets annually to review our ACTRI. Year-over-year progress in meeting the goals of translational science has established the balanced scorecard as an integral component of successful outcomes and continued grant funding.
To avoid the decline phase of the technology adoption lifecycle, the UCSD ACTRI evaluation unit has begun the process of incorporating major changes to our balanced scorecard. Opportunely, we are beginning our third grant-renewal year, and as such, most current initiatives are new or have a renewed scope. This novelty, coupled with an updated set of strategic aims, has allowed additional latitude in adjusting to recent changes in mission and vision, i.e., increased responsiveness to issues of diversity, equity, and inclusion (DEI).
Our BSC is undergoing adaptations that will expand the pool of stakeholders and incorporate performance metrics to track societal impact. These changes will be documented and evaluated, and our findings disseminated over the next few months.
In our application, the balanced scorecard has proven its value; as such, we aim to extend its lifecycle. Now coupled with project management software, the BSC remains responsible for ensuring that key performance indicators match the larger values and mission of our translational science research institute in the quest to bring innovative research to clinical fruition.
References
1. National Center for Advancing Translational Sciences. Translational Science Spectrum. National Institutes of Health, National Center for Advancing Translational Sciences. March 12, 2015. https://ncats.nih.gov/translation/spectrum
2. Frechtling JA. Logic Models. International Encyclopedia of the Social & Behavioral Sciences. 2015:299–305. Accessed February 17, 2022.
3. Kaplan RS, Norton DP. Using the Balanced Scorecard as a Strategic Management System. Harvard Business Review. July–August 2007. https://hbr.org/2007/07/using-the-balanced-scorecard-as-a-strategic-management-system
4. Bersin J, Zao-Sanders M. Making Learning a Part of Everyday Work. Harvard Business Review. February 19, 2019. https://hbr.org/2019/02/making-learning-a-part-of-everyday-work
5. Fontanesi J, Alexander A, Dworschak D, et al. Balanced Scorecards: An Alternative to the Logic Model Assessment Framework for CTSAs. Unpublished manuscript; 2011.
6. Bouland D, Fink E, Fontanesi J. Introduction of the Balanced Scorecard into an Academic Department of Medicine: Creating a Road Map to Success. Journal of Medical Practice Management. 2011;26(6):331–335.
7. Kopecka N. The Balanced Scorecard Implementation, Integrated Approach and the Quality of Its Measurement. Procedia Economics and Finance. 2015;25:59–69.
8. Mutale W, Stringer J, Chintu N, et al. Application of Balanced Scorecard in the Evaluation of a Complex Health System Intervention: 12 Months Post Intervention Findings from the BHOMA Intervention: A Cluster Randomised Trial in Zambia. Carlo WA, ed. PLoS ONE. 2014;9(4):e93977. doi:10.1371/journal.pone.0093977
9. Markard J. The Life Cycle of Technological Innovation Systems. Technological Forecasting & Social Change. 2020;153.
10. Rogers E. Diffusion of Innovations. New York: The Free Press; 1962.