TIES Inclusive Education Roadmap

Data-Based Decision-Making

Continuous Improvement Cycles

Plan-Do-Study-Act (PDSA) Cycles (Deming, 1986) provide EILTs with a process for working through the barriers that arise in every Implementation Stage. Running into barriers is the nature of the change process. EILTs, leaders, and those who are implementing the change employ PDSA Cycles to intentionally identify and address these barriers to implementation. Continuous learning through PDSA Cycles becomes a regular part of EILT meeting agendas and a way to monitor and adjust implementation of the action plan.

The PDSA Cycle consists of four phases:

  1. Plan - Specify the new practices and processes to be implemented and identify strategies to measure intended outcomes,
  2. Do - Implement the new practices and processes as specified,
  3. Study - Use data to assess progress and make adjustments as needed, and
  4. Act - Implement adjustments to improve implementation and continue to collect data.

PDSA Cycle for Initial Implementation of Inclusive Practices

This is a circle-shaped diagram whose middle is broken into four sections representing the four phases of the PDSA Cycle: Plan, Do, Study, and Act. In the middle of the circle are two arrows, showing that these four phases form a continuous and iterative process. The pop-out box next to Plan says: write SMARTIE goals based on RISE takeaways, and complete the inclusive education action plan. The pop-out box next to Do says: implement the action plan for your SMARTIE goals and collect data. The pop-out box next to Study says: analyze data, present data summaries to the EILT, discuss findings, and identify next steps. Lastly, the pop-out box next to Act says: implement next steps, continue data collection, and return to Study as needed.

Data collection is an important part of initial implementation efforts. If baseline data related to your SMARTIE goals has not been collected, it is important to begin that collection as soon as possible. In order to measure progress in the implementation of new processes and practices, the EILT will need to compare data collected during the initial implementation phase with the baseline data.

Once data is collected, the EILT needs to analyze and interpret it. The usability of this data is a key consideration: data collected on inclusive practices and processes only becomes usable information when decision-makers and stakeholders can understand what it represents (Wilcox et al., 2021). Is the data clear rather than overly complicated? Is it presented concisely? In order to present data in ways that are useful to others, the EILT will want to:

  1. Collect data at the frequency specified in the Action Plan and organize the data related to each SMARTIE goal,
  2. Create a concise summary of the most relevant findings, and
  3. Prioritize presentation of the data most relevant to current questions and concerns.

Role of the EILT

Reviewing data should routinely be on the agenda for EILT meetings. With the data organized in usable formats, the EILT can use data-based decision-making to determine the effectiveness and fidelity of implementation of new inclusive processes and practices. Not every aspect of the Action Plan needs to be discussed at every meeting, but all parts should be revisited with implementation (process) data and outcome data (if available) at least quarterly. The presentation of clear and concise data summaries is followed by a discussion that includes both celebrations of successes and identification of “stuck spots.” It is not uncommon during systems change efforts to have occasional stuck spots where implementation is lagging. As part of the data review process, the EILT may decide to address these stuck spots by allowing additional time for implementation to improve. The EILT may also decide to adapt the Action Plan in some way, perhaps by drawing on a greater variety of implementation drivers.

Sharing Data with Stakeholders

As the EILT reviews data, they will need to make decisions about when to share data, what data to share, and with whom to share it. For example, there may be too few data points to tell whether a trend is moving in a positive direction. When the decision is made to share data more broadly, consider the "talking points" that will accompany the data. What does the data tell you? Are you continuing the work as is or adjusting what you do? This level of transparency helps build trust and can help develop a culture of data-based decision-making at all levels of the system.