Planning for Higher Education Journal

Modifying the Strategic Planning Engine

A Case Study
From Volume 46 Number 4 | July–September 2018
By Christina Juris Bennett, Sharyl Kinney
Planning Types: Strategic Planning

Academic strategic planning can be difficult given the bevy of stakeholders and often multiple sets of accreditation criteria. Recognizing the limits of the traditional SWOT model, our program chose to use the Strategic Planning Engine model. The model itself is quite laborious, so to increase its usability we simplified the environmental assessments. Our results proved to be useful and relevant, and we developed a series of feasible objectives. In this article, we describe and evaluate our experience. In comparison to SWOT, we found this process to be more objective and replicable, scalable and responsive to multiple criteria, flexible enough to accommodate changing strategic plans or criteria, and transparent. That said, we caution readers about the level of labor required and the organization and communication needed. Finally, we recommend implementing a leadership team, a communication plan, and a plan for responding to uncontrollable circumstances, as well as developing a level of comfort with allocating limited resources.


Introduction

Our graduate program initiated a strategic planning process in response to changes in program leadership, competencies, and accreditation criteria. In the past, the program used a traditional linear planning process driven by the faculty determining necessary changes and implementing those changes without regard to an overall strategic plan. In contrast, the new department chair and program director chose to use an iterative and synthesizing non-linear process. Known as the Strategic Planning Engine (SPE), this approach incorporated a traditional SWOT analysis and a performance-based decision-making matrix that included weighted stakeholder input to align program goals and objectives to meet accreditation criteria (Rowley, Lujan, and Dolence 1997). While the SPE has been described since 1997, there is very little literature discussing an academic program’s experience with it. Our article serves to help fill that gap. Additionally, as in any testing of a model, we determined we needed to modify the proposed approach to better fit our limited resources. We describe these modifications, and we demonstrate that the modifications produced the desired results (a list of planning objectives based on qualitative and quantitative evaluations). This streamlined process fills an existing gap in SPE knowledge and experience. Overall, our replicable approach can serve as a model for other academic programs seeking to quantify and increase the systematic nature of their strategic planning process.

Background

The business world has long used the strengths-weaknesses-opportunities-threats (SWOT) model for organizational strategic planning (Camden et al. 2009; Higginbottom and Hurst 2001; Ng et al. 2013). This model uses stakeholder feedback to identify internal strengths and weaknesses and external opportunities and threats (Bialek, Duffy, and Moran 2009). Although SWOT analysis is a common tool used in strategic planning, scholars have criticized its limited effectiveness because of the variety of ways in which it may be conducted and used (Hill and Westbrook 1997; Pickton and Wright 1998; Ruocco and Proctor 1994; Valentin 2001; Walston 2013). In essence, its replicability is limited. An effective SWOT analysis is a dynamic process that involves a broad representation of internal and external stakeholders (Pickton and Wright 1998). Scholars have suggested increasing the value of the SWOT process by explicitly defining specific factors, prioritizing them according to criteria, and identifying relationships between the strengths, weaknesses, opportunities, and threats (Houben, Lenie, and Vanhoof 1999; Lee and Sai On Ko 2000; Piercy and Giles 1989; Ruocco and Proctor 1994; Valentin 2001).

Within the world of academics, university strategic planning is regarded as a relatively new process. Traditionally, whatever planning existed was linear and heavily based on budgets and finances (Presley and Leslie 1999). In 1983, Keller questioned the budget-guided linear planning process on the grounds that it did not address the specific challenges facing higher education, such as declining enrollment, increasing costs, and changing academic priorities. To that end, he promoted using traditional strategic planning tools like SWOT in higher education (Keller 1983). Consequently, universities began shifting to the business world’s strategic planning tools to develop their missions and visions.

In 1997, Rowley, Lujan, and Dolence further developed the concept of university-based strategic planning by assuming that universities existed in rapidly changing environments and positing that traditional short-term operational decision making is not optimally effective in those environments. They advocated for a strategic planning process that is flexible and consumer driven and also includes a continuous improvement process. To this end, they developed the Strategic Planning Engine, a 10-step theoretical model (Rowley, Lujan, and Dolence 1997). The SPE links key performance indicators (KPIs) to external and internal assessments through a SWOT analysis, and then a subsequent cross-impact analysis assesses the organization’s ability to achieve its KPIs through the proposed goals and objectives. From there, the SPE directs the organization to implement the goals and objectives that most impact its KPIs and subsequently monitor and evaluate the results. The organization should continue to monitor, evaluate, and tweak its actions via a quality improvement process to remain able to impact the KPIs. Through this process, an organization can gain agility to quickly address any previously unknown challenges instead of waiting for the next planning process.

Rowley, Lujan, and Dolence leave the monitoring and evaluation process as a choice for the organization. One widely accepted method is Bialek, Duffy, and Moran’s (2009) four-step cyclical model for quality improvement based on Deming’s plan-do-check-act model. This model monitors performance while providing a continuous feedback loop.

Moving from theory to practice, in 2013 our graduate program engaged in a strategic planning process to develop a new mission, vision, value statement, and competencies to meet organizational goals and accreditation criteria. Our program is a graduate-level health administration program situated within a College of Public Health, so we needed to meet the criteria for both the Commission on Accreditation of Healthcare Management Education (CAHME) and the Council on Education for Public Health (CEPH). Stepping forward, our new leadership embraced the non-linear aspect of the SPE and the cyclical quality-improvement nature of plan-do-check-act.

Methods

Six of the 10 steps of the SPE (Rowley, Lujan, and Dolence 1997) provided the components of this project:

  1. We determined a finite and reasonable number of KPIs based on the university, College of Public Health, and departmental strategic plans and the CEPH and CAHME accreditation criteria. In particular, we looked for overlapping criteria or goals to serve as our KPIs to ensure and maximize alignment.
  2. We performed an external environmental assessment through interviews and focus groups with recent alumni (graduated within the past two years), early and mid-career alumni (graduated five to 10 years prior), executives of health care organizations in the local health care market, the program’s executive advisory board, and preceptors for student internships. Questions to these groups included traditional SWOT questions, e.g., What are the program’s strengths and weaknesses? What are the external opportunities and threats to the program? Of note, the full SPE external assessment also includes a PEST (political, economic, social, and technological) trend analysis; a collaborator analysis (shareholders’ KPIs and stakeholders’ KPIs); and a competitor analysis (direct KPIs and indirect KPIs). Given limited resources, we focused on the external environmental assessment of opportunities and threats and greatly limited its scope.
  3. We performed an internal environmental assessment through interviews with current faculty members, including adjuncts, and students. Questions to these groups included traditional SWOT questions (see above) and questions related to CAHME curricular criteria. The criteria-based questions focused on curricular recommendations to increase competency in communication, interpersonal effectiveness, critical thinking, problem solving, management, leadership, professionalism, and ethics. The SPE model also includes an internal analysis of organizational performance (productivity, benchmarks, policies, and procedures); an analysis of organizational design (structure, function, infrastructure, and integration); and an analysis of organizational strategies (strategies, goals, objectives, and resources). As with the external analysis, we focused primarily on the core internal environmental assessment and greatly simplified its scope.
  4. We performed a SWOT analysis using the information from both the external and internal environmental assessments. Themes were identified using word repetition, key words in context, and overlapping words (Ryan and Bernard 2003); a minimal sketch of this word-counting step appears after this list. Common themes then became the categories for analysis.

    The themes also informed additional interviews with both external and internal stakeholders; these interviews asked increasingly specific questions derived from CAHME criteria, CEPH criteria, university and college requirements, and the initial SWOT analysis. Additional questions of external stakeholders included, “Where are open jobs right now in hospital administration in our local market?” and “How can the program better prepare the preceptor for an intern?” Student-specific questions included, “How would you increase student involvement?” and “What organizations do you want to have connections with?” The results were then analyzed for themes using word repetition, key words in context, and overlapping words.

    The themes from the various cycles were then crossed in a matrix with the KPIs. The program director, department chair, and vice chair assigned impact ratings in the matrix based on the influence the themes had on the program’s achievement of the KPIs. The ratings, adopted from Rowley, Lujan, and Dolence (1997), were

    1 = Strong negative influence
    2 = Moderate negative influence
    3 = Weak negative influence
    4 = Neutral, do not know impact
    5 = Weak positive influence
    6 = Moderate positive influence
    7 = Strong positive influence
    NA = No impact or not applicable

    Entries rated 1, 2, 6, and 7 were color coded to flag their moderate or strong influences.

  5. Focusing on the moderate and strong influences, the program director, department chair, and vice chair brainstormed ideas to improve the performance of the department and generated performance improvement strategies to address weaknesses and threats and capitalize on strengths and opportunities. The brainstorming process included considering specific suggestions provided through the SWOT analysis and subsequent interviews. Consensus was reached on a finite list of performance improvement strategies.
  6. The performance improvement strategies were then crossed in a matrix with the KPIs, and the program director, department chair, and vice chair assigned impact values using the same seven-point scale; the second sketch after this list illustrates this scoring. The impact values were used to reach consensus about which ideas would produce the most program improvement. These matrices were then presented to the faculty members, who affirmed the proposed performance improvement strategies and provided input on implementing corresponding plan-do-check-act cycles.
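
To make the qualitative step concrete, here is a minimal sketch of the word-repetition pass described in step 4. It is illustrative only: the transcripts, stop-word list, and function names are invented, and in our process the counting and the subsequent coding of themes were done by hand.

```python
# Hypothetical sketch of the word-repetition technique (Ryan and Bernard 2003)
# used to surface candidate themes from interview transcripts.
from collections import Counter
import re

STOP_WORDS = {"the", "a", "an", "and", "or", "to", "of", "in", "we", "our", "is", "are", "on", "but"}

def candidate_themes(transcripts, top_n=10):
    """Count repeated content words across transcripts; frequent words become
    starting points for human coding, not finished themes."""
    words = []
    for text in transcripts:
        words += [w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOP_WORDS]
    return Counter(words).most_common(top_n)

# Invented example transcripts.
transcripts = [
    "The curriculum needs more real-world application and Excel skills.",
    "Our curriculum is strong on critical thinking but light on communication.",
]
print(candidate_themes(transcripts, top_n=5))
```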
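
For the cross-impact scoring in steps 4 and 6, a second minimal sketch follows. The theme and KPI labels are abbreviated stand-ins, the ratings are invented for the example, and our actual values were assigned by hand by the program director, department chair, and vice chair. Summing across KPIs is one plausible way to aggregate a row; it is not necessarily the arithmetic we used to reach consensus.

```python
# Minimal sketch of a cross-impact matrix (hypothetical labels and ratings).
# Rating scale from Rowley, Lujan, and Dolence (1997): 1-3 negative,
# 4 neutral/unknown, 5-7 positive, None = not applicable.

KPIS = ["Preceptor evals", "Accreditation", "Completion rate"]  # abbreviated examples

# Each row crosses one SWOT theme (or improvement strategy) with every KPI.
MATRIX = {
    "Need Excel":         {"Preceptor evals": 2, "Accreditation": 3, "Completion rate": 4},
    "Case studies":       {"Preceptor evals": 6, "Accreditation": 6, "Completion rate": 5},
    "Demand for leaders": {"Preceptor evals": 5, "Accreditation": None, "Completion rate": 7},
}

def strong_or_moderate(rating):
    """Flag the cells we color coded (ratings 1, 2, 6, or 7)."""
    return rating in (1, 2, 6, 7)

def total_impact(row):
    """Sum a theme's ratings across KPIs, skipping not-applicable cells."""
    return sum(r for r in row.values() if r is not None)

for theme, row in sorted(MATRIX.items(), key=lambda kv: total_impact(kv[1]), reverse=True):
    flagged = [k for k, r in row.items() if strong_or_moderate(r)]
    print(f"{theme}: total impact {total_impact(row)}; strong/moderate cells: {flagged}")
```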

Results

We developed the KPIs to reflect the various levels of strategic plans and accreditation criteria (a sketch of how such targets might be tracked follows the list):

  1. Preceptor evaluations: 80 percent rating of good or excellent on all criteria
  2. Accreditation status: accredited and in good standing with CAHME
  3. Student completion rate: 80 percent completion rate for both full-time and part-time programs
  4. Diversity of graduate placements: one or two students placed out of state
  5. Faculty retention: seven or more full-time faculty members
  6. Faculty balance between practice and academia: 50 percent practitioner faculty, 50 percent academicians
  7. Post-graduation placement rate: 80 percent of students have secured employment within 90 days of graduation
  8. Alumni rating on competency achievements: 80 percent rating of good or excellent on all competencies in all core courses based on applicability to job duties
  9. Student diversity: race as identified by applicants reflects state racial demographics
  10. Student self-evaluation of competency achievement: 80 percent good or excellent rating of all competencies in all required courses
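
As an illustration of how targets like these could feed the plan-do-check-act monitoring described earlier, the sketch below encodes two of the percentage-based KPIs as data with thresholds. The field names and observed values are invented for the example, and count-based KPIs (e.g., seven or more full-time faculty members) would need a different representation.

```python
# Hypothetical sketch: encoding percentage-based KPI targets so attainment
# can be checked during each plan-do-check-act cycle.
from dataclasses import dataclass

@dataclass
class KPI:
    name: str
    target: float          # threshold as a proportion (0.80 = 80 percent)
    observed: float = 0.0  # latest measured value (invented here)

    def met(self) -> bool:
        return self.observed >= self.target

kpis = [
    KPI("Student completion rate", 0.80, observed=0.84),
    KPI("Post-graduation placement within 90 days", 0.80, observed=0.76),
]

for k in kpis:
    status = "on target" if k.met() else "needs attention"
    print(f"{k.name}: {k.observed:.0%} vs {k.target:.0%} target -> {status}")
```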

The SWOT analysis included feedback from the internal and external environmental analyses and interviews. The themes identified are presented in figure 1.

Figure 1 SWOT Analysis Themes

Strengths—Internal

  • Curriculum: professionalism; critical thinking; health care application; breadth of curriculum; adapting to change; case studies
  • Faculty: experience with practice; focus on improvement; available to students

Weaknesses—Internal

  • Curriculum: courses need updating; need more real-world application; need Excel; need more presentation experiences; need more communication skill development
  • Job placement: need career counseling and job placement support; lack of participation in national case competition; lack of participation in ACHE Congress

Opportunities—External

  • Changing health care environment: demand for health care leaders; increase in number and types of health care facilities; increased use of technology; requirements and provisions of the Affordable Care Act
  • Availability of professional experiences: availability of national case competitions; availability of ACHE and other health care professional meetings

Threats—External

  • Competition: competition from other MHA programs; limitations of local internships
  • Change in knowledge and skills: rapidly changing need for new skill sets for health care administrators

Figure 2 depicts the cross-impact matrix of KPIs and SWOT common themes. The program director, department chair, and vice chair determined the impact ratings. Ratings 1 and 2 are shaded dark gray and reflect strong or moderate negative influence on the KPIs. Ratings 3, 4, and 5 are lightly shaded and represent weak negative influence, neutral or unknown impact, or weak positive influence. Ratings 6 and 7 are shaded medium gray and reflect moderate or strong positive influence.

Figure 2 Cross-Impact Matrix of Key Performance Indicators and SWOT Common Themes

The impact ratings indicate the influence of each of the SWOT common themes on achievement of the KPIs:

1 = Strong negative influence; 2 = Moderate negative influence; 3 = Weak negative influence; 4 = Neutral, do not know impact; 5 = Weak positive influence; 6 = Moderate positive influence; 7 = Strong positive influence; NA = no impact or not applicable

The identified performance improvement strategies were

  • Addition of Excel requirements for finance courses
  • Communication and professionalism course requirements
  • Case competition requirement
  • Lean Six Sigma requirement
  • Case studies and practice projects with health care organizations
  • Elective courses in leadership

Figure 3 depicts the cross-impact matrix of KPIs and performance improvement strategies. Scores of 5, 6, and 7 are shaded in dark gray and reflect a positive impact, whether weak, moderate, or strong, on the KPIs. The score of 4 is unshaded and reflects being unsure of the strategy’s impact on the KPIs. These scores gave our program an estimate of the value of implementing a particular change and directed us on how to best use our resources.

Figure 3 Cross-Impact Matrix of Key Performance Indicators and Performance Improvement Strategies

Our final list of objectives, in ranked order of importance (tied items share a rank; a scoring sketch follows the list), was

  1. Communication and professionalism course requirements (tied)
  1. Addition of Excel requirements for finance courses (tied)
  1. Case studies and practice projects with health care organizations (tied)
  2. Elective courses in leadership
  3. Case competition requirement
  4. Lean Six Sigma requirement
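
The shared first rank above comes from ties in total impact. As an illustration of how such a ranking could be computed, the sketch below applies dense ranking to invented totals; our figure 3 ratings, not these numbers, produced the actual order.

```python
# Dense ranking of improvement strategies by total impact score.
# Scores here are invented; the article reports only the resulting ranks.
scores = {
    "Communication and professionalism courses": 58,
    "Excel requirements for finance courses": 58,
    "Case studies and practice projects": 58,
    "Elective courses in leadership": 52,
    "Case competition requirement": 47,
    "Lean Six Sigma requirement": 41,
}

rank, last_score = 0, None
for strategy, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    if score != last_score:  # only advance the rank when the score drops
        rank += 1
        last_score = score
    print(f"{rank}. {strategy} (total impact {score})")
```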

From this list, the program worked to implement the changes in a timely fashion.

Discussion

Using the SPE with cross-impact analysis, our program was able to improve upon the traditional linear strategic planning process. There are several key takeaways from our experience. First, this replicable process uses objective criteria based on a program’s mission, vision, values, and KPIs. By using objective criteria, the process has less subjectivity; in turn, this makes it inherently more replicable and less dependent on the leader or leadership team conducting it. Compared with other strategic planning methods, this modified SPE offers greater mapping capability and more room to work incrementally and iteratively. Moreover, in terms of the results, the replicability improves a program’s ability to monitor progress and adjust accordingly.


Second, this process incorporates university, college, and departmental strategic plans and two sets of accreditation criteria to create KPIs. It can be daunting to incorporate all the criteria with which a program must comply along with all the goals and objectives set by the multiple layers within an organization. The modified SPE process recognizes that difficulty and provides a framework for selecting particular measurements. Programs may select many measurements, or they may select fewer. This scalability grants programs the ability to weigh and allocate their resources to the process so that they can finish it successfully without becoming overwhelmed.

Relatedly, the modified SPE process is flexible, allowing for the incorporation of changes in accreditation requirements, strategic goals and objectives, and KPIs. During the few years our SPE was being built and used, the accreditation requirements for the program and college changed dramatically, and the department, college, and university all developed new strategic plans. Had we been locked into a more rigid framework, we would have had to dispose of the planning already done and begin anew. The modified SPE provided sufficient flexibility that we could adjust to and incorporate those changes in a timely manner.

Finally, this process is refreshingly transparent because it uses stakeholder feedback throughout. Stakeholder feedback helps derive goals, objectives, and KPIs; helps measure effectiveness; and helps develop potential solutions. Historically, strategic planning has often been completed in a vacuum, with stakeholders, particularly the faculty, only seeing the final product. This process includes stakeholders of all kinds at multiple points.

For any program seeking to use the modified SPE model, we offer two key points related to requirements. First, while this process is less labor intensive than the entire model described by Rowley, Lujan, and Dolence (1997), it is more intensive than a simple SWOT analysis. Accordingly, it requires a time commitment from the program director, faculty, students, and other stakeholders like alumni. Time must be allocated for planning, preparing documents, engaging stakeholders, and developing analyses. Given these time requirements, program, departmental, and college leaders should adjust performance expectations accordingly.

Second, this process requires significant planning and communication among participants. As a corollary, leaders of this process must have strong communication skills, knowledge of multiple strategic plans, and knowledge of accreditation requirements. Additionally, leaders should have good organizational skills, high attention to detail, and the ability to collect and analyze qualitative and quantitative data. Although the department chair and program director were new to their roles, this process opened the door to developing relationships with key stakeholders and was not hindered by “the way we have always done it.”

Considering our outcomes and experiences, we offer several recommendations for improvement. First, returning to the issue of labor and time requirements, this process was labor intensive for the program director. Consequently, we recommend developing a leadership team and dividing tasks based on the knowledge and skills of the team members. Relatedly, if there are gaps in needed skills within the team, programs may need to seek expertise elsewhere. Areas such as conducting focus groups and replicable qualitative analysis may not be faculty strengths, and consulting with professionals who have experience with qualitative research may be useful.

Second, a communication plan should be part of the planning process. The leadership team should develop channels and protocols for sharing information with stakeholders, particularly those not involved in the program on a daily basis. It is difficult to re-engage stakeholders once they have disengaged, and so developing a published plan that sets expectations for reports, input, or other engagement would be very helpful.

Third, we recommend that programs anticipate how to deal with circumstances beyond their control. For example, in our case a leadership-oriented course was not indicated by our analysis but was mandated by higher authorities. The leadership in our program knew immediately to support and implement that proposed change, and other programs will certainly face similar issues. Leadership teams would do well to discuss in advance how they might react and adjust to such immovable obstacles.

Fourth, programs should become comfortable with the idea that time and resources are inherently limited, and it may not be feasible to address all needs identified in the analysis. There were many improvement opportunities we would have liked to address; however, we existed within a bounded reality and had to plan accordingly. Operationally, this meant we did not develop strategies that had high impact on student completion, faculty retention, or student diversity. This made sense because we chose to focus our energy on remedying the nine deficiencies found in our previous accreditation site visit, and those nine deficiencies did not touch on the areas mentioned. The priority ratings will help programs focus on the highest priorities and best use their limited resources. This does not mean that conversations about determining those priorities and allocating those resources will be easy, but by using this modified SPE model, they should be easier.

As with any study, there are limitations to what we present. One such limitation is that the program director was responsible for conducting interviews and compiling all stakeholder input; correspondingly, this meant that the process was very labor intensive for one person and also introduced potential bias. Of course, there is always the opposing argument that the concentration of one person’s efforts creates consistency. Additionally, as we mentioned, we did modify the model, and while the leadership team is confident that the improvement initiatives were not ultimately affected by these modifications, our results might have been different using the full model. Finally, while we believe this process is highly replicable, we did experience it with our small group of faculty members (core faculty of seven with adjunct faculty of seven) within our supportive college and university. This environment no doubt impacted the experience we had.

Conclusion

The bulwark nature of universities past can no longer thrive in today’s constantly changing environment. While universities have shifted to incorporate SWOT and other strategic planning processes, we advocate, along with Rowley, Lujan, and Dolence (1997), using the Strategic Planning Engine method to improve flexibility and responsiveness. Our experience provides a case study of a modified SPE approach that offers four key benefits: (1) a more structured process that limits subjectivity; (2) a flexible and iterative process that increases the relevance of results; (3) a simplified process that is easier to explain to stakeholders and supported by data; and (4) a process that increases alignment between multiple strategic plans, accreditation criteria, and KPIs. With these benefits, a program’s strategic plan should no longer sit on the proverbial shelf but be included in ongoing improvement efforts.


References

Bialek, R., G. L. Duffy, and J. W. Moran. 2009. The Public Health Quality Improvement Handbook. Milwaukee: ASQ Quality Press.

Camden, C., B. Swaine, S. Tétreault, and S. Bergeron. 2009. SWOT Analysis of a Pediatric Rehabilitation Programme: A Participatory Evaluation Fostering Quality Improvement. Disability and Rehabilitation 31 (16): 1373–81.

Higginbottom, M. J., and K. Hurst. 2001. Quality Assuring a Therapy Service. International Journal of Health Care Quality Assurance 14 (4): 149–56.

Hill, T., and R. Westbrook. 1997. SWOT Analysis: It’s Time for a Product Recall. Long Range Planning 30 (1): 46–52.

Houben, G., K. Lenie, and K. Vanhoof. 1999. A Knowledge-Based SWOT-Analysis System as an Instrument for Strategic Planning in Small and Medium Sized Enterprises. Decision Support Systems 26 (2): 125–35.

Keller, G. 1983. Academic Strategy. Baltimore: Johns Hopkins University Press.

Lee, S. F., and A. Sai On Ko. 2000. Building Balanced Scorecard with SWOT Analysis, and Implementing “Sun Tzu’s The Art of Business Management Strategies” on QFD Methodology. Managerial Auditing Journal 15 (1/2): 68–76.

Ng, G. K., G. K. Leung, J. M. Johnston, and B. J. Cowling. 2013. Factors Affecting Implementation of Accreditation Programmes and the Impact of the Accreditation Process on Quality Improvement in Hospitals: A SWOT Analysis. Hong Kong Medical Journal 19 (5): 434–46.

Pickton, D. W., and S. Wright. 1998. What’s SWOT in Strategic Analysis? Strategic Change 7 (2): 101–09.

Piercy, N., and W. Giles. 1989. Making SWOT Analysis Work. Marketing Intelligence & Planning 7 (5/6): 5–7.

Presley, J. B., and D. W. Leslie. 1999. Understanding Strategy: An Assessment of Theory and Practice. In Higher Education: Handbook of Theory and Research, ed. J. C. Smart and W. G. Tierney, 201–39. New York: Agathon Press.

Rowley, D. J., H. D. Lujan, and M. G. Dolence. 1997. Strategic Change in Colleges and Universities: Planning to Survive and Prosper. San Francisco: Jossey-Bass.

Ruocco, P., and T. Proctor. 1994. Strategic Planning in Practice: A Creative Approach. Marketing Intelligence & Planning 12 (9): 24–29.

Ryan, G. W., and H. R. Bernard. 2003. Techniques to Identify Themes. Field Methods 15 (1): 85–109.

Valentin, E. K. 2001. SWOT Analysis from a Resource-Based View. Journal of Marketing Theory and Practice 9 (2): 54–69.

Walston, S. L. 2013. Strategic Healthcare Management: Planning and Execution. Chicago: Health Administration Press.

Author Biographies

Christina Juris Bennett, JD, is an assistant professor in the Department of Health Administration and Policy at the University of Oklahoma and an adjunct professor in the College of Law. She is also the director of the master of health administration program.

Sharyl K. Kinney, DrPH, is an assistant professor in the Department of Health Administration and Policy at the University of Oklahoma, and she also serves as the department vice chair.