Dr. Robert Wachter, Editor, AHRQ WebM&M: Why don't you start by you telling us what a PSO is?
Dr. William B. Munier: A Patient Safety Organization is a new or existing organization that applies to be officially listed by the Department to engage in Patient Safety Activities, reviewing the quality and safety of care. What makes PSOs different from before is that through a federal statute passed in 2005, PSOs get special protections. They get legal protections in terms of privilege, meaning that the deliberations of the PSO are protected from discovery in court, and those deliberations carry with them certain obligations to be maintained in confidence. What this does for the people doing the analysis, and for the institutions where they work, is create a culture of safety where people feel free to report incidents that go wrong, near misses, any kind of concern that they might have about quality or safety of care. PSOs can receive reports on quality and safety from any provider of health care. So it's not limited to hospitals—it can be nursing homes, ambulatory surgery centers, doctors' offices, or free-standing clinics; any provider of care that wishes to align with a PSO can be assured that it can review quality and safety of care with those legal protections.
RW: Are there any other benefits to being in a PSO in terms of support that AHRQ gives these organizations, or networking across organizations?
WM: Perhaps the most significant kind of support has to do with providing common definitions and reporting formats, so that PSOs can collect information in a standard way to allow Patient Safety Events to be aggregated and analyzed. So somebody collecting information on medication errors at one PSO can be sure that they're collecting information in a standardized way. The information that is submitted at a national level into something called the Network of Patient Safety Databases, by statute and by regulation, must be de-identified very completely before it is submitted for aggregation. De-identification can be expensive, and it could be done in a number of different ways, which again might make data not able to be compared easily. So AHRQ is supporting an organization at a national level that will de-identify any PSO's data, free of charge, before sending it in to the Network of Patient Safety Databases. That represents significant assistance to PSOs. We also provide quite a lot of technical assistance free of charge. People call our offices and we take whatever time is necessary to help people understand the law, the regulation, and what is necessary to become a PSO, remain a PSO, and operate as a PSO.
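The de-identification step described above—stripping identifying details from event data before they are submitted for national aggregation—could be sketched roughly as follows. This is a minimal, hypothetical illustration; the field names and event structure are invented for the example and are not AHRQ's actual schema or de-identification method.

```python
# Hypothetical sketch: remove identifying fields from a patient safety
# event record before it leaves the PSO for national aggregation.
# Field names are illustrative only.

# Fields that could identify patients, providers, reporters, or facilities
IDENTIFYING_FIELDS = {"patient_name", "mrn", "date_of_birth",
                      "provider_name", "reporter_name", "facility_id"}

def deidentify(event: dict) -> dict:
    """Return a copy of the event with identifying fields removed."""
    return {k: v for k, v in event.items() if k not in IDENTIFYING_FIELDS}

event = {
    "event_type": "medication_error",
    "severity": "near_miss",
    "patient_name": "Jane Doe",
    "mrn": "123456",
    "facility_id": "H-042",
}
clean = deidentify(event)
# 'clean' retains only the clinical content needed for aggregate analysis
```

Because every PSO's data would pass through the same step, the de-identified records remain comparable when aggregated—which is the point of having a single national service do it.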
RW: Let me see if I can better understand the national piece. So if I'm a PSO, let's say in a region, and we have created a network where we're sharing data within our PSO across hospitals and nursing homes, does all that information flow to a national database, or is it my choice as to how much of it does? Are there lessons learned at the national level that flow back to the individual PSOs? How does that interaction between individual PSOs and the national network work?
WM: The entire PSO program is voluntary for everyone except AHRQ, which obviously administers the program. There is no mandate that makes PSOs exist, either in a certain number or with a certain geographic spread. The establishment of PSOs, and providers working with established PSOs, is all on a voluntary basis. The attraction is that providers would like to have the protections afforded by PSOs, and they would like to be able to compare their performance, understand what other people are doing, share lessons learned, and essentially participate in a larger learning community. With respect to what gets sent in or doesn't get sent in, the boundaries for PSOs encompass not only any provider organization or institution but also all of quality and safety. The Patient Safety Act is very broad, and it essentially says any quality or safety activities can be considered. For the purposes of this conversation, we can say that the Network of Patient Safety Databases will collect information from any PSO that wishes to send it on, initially starting with acute care hospitals.
RW: Tell us about the PSOs that have already formed. What are these organizations like?
WM: We really have a fairly broad mix of organizations. We do have a couple of IT vendors that have become PSOs. We have components of state hospital associations, provider groups, and consortiums looking at quality and safety for a number of different provider groups across the country. We have organizations that were doing quality and safety analysis before, sort of proto-PSOs if you will, and new organizations have been formed and incorporated solely to become PSOs. We have some that are broad PSOs that cover all kinds of quality and safety issues, and some that are focused on very specific areas of care.
RW: You mentioned that what appears to be the greatest motivation so far is the legal protections offered under the legislation. In many states, you already have significant legal protections around data that you're sharing—at least within your organization—around quality and safety. Can you talk a little about the variations in those kinds of laws from state to state and how that is interacting with the amount of enthusiasm that you're seeing for PSOs?
WM: Because AHRQ is defining common definitions and reporting formats, we have provided an organizing focus so that people from across the country, and even potentially across the world, can report information on patient safety in a way that is defined clinically in a congruent way and defined electronically in an interoperable way so that information can be aggregated at a larger scale. We're told that a lot of the appeal for the organizations that have signed up to become PSOs comes from trying to aggregate information and learn more rapidly from pattern analysis of aggregate data on a larger scale than has been possible before. Now, that advantage is linked to a legal advantage because the states do differ a great deal in the level of legal protection offered by various state statutes. Some offer little or no protection for review of the quality and safety of health care.
For example, the state of Florida had a ballot initiative where the constitution was amended by referendum, and peer review protections that had existed were stricken, so that Florida does not have state-level peer review protection for reviews of quality and safety. That has left the quality and safety risk management departments of hospitals, nursing homes, and so on somewhat nervous about conducting reviews, because the minutes and any of the deliberations or reports are all discoverable in court, if I understand the Florida constitution as it affects this issue correctly. So that information, used in a quality committee to try to make care safer in an institution, could be used by people to try to establish a basis for a malpractice suit. This situation can have a substantial chilling effect on the review process. There are many other states where the protection is less than fully adequate. Most states, even if they do have peer review protection, provide it within institutional walls, and it may be limited to just hospitals or to other specific health care organizations. It's a very common attribute of that kind of legislation that the information is protected when it's within a hospital or within the peer review committee of a hospital, but were it to leave the hospital, it would no longer be protected. Moreover, by definition, any state protection ceases once the information leaves the state. For instance, if you have a hospital chain with hospitals in many states, they cannot aggregate information across state lines easily or even across hospitals within a state without, in most cases, losing legal protection. So what the Patient Safety and Quality Improvement Act of 2005 does is provide protections that allow a rational aggregation and analysis of data on clinical units that make sense, whether they cross state lines or whether they cross institutional lines. So people can aggregate data regionally or nationally and look at performance on a broader scale.
RW: In some ways, what PSOs will allow us to do is look at quality issues and safety hazards on a much larger scale—as you said, across hospital and health care organization lines. And yet, even working within a single hospital, I'm not sure we've completely figured this out. What gives you enthusiasm that this new organizational framework, which is much larger and in many ways more complex, will be able to solve some of these very sticky problems?
WM: I believe we see the world the same way you do in terms of individual institutions that have not necessarily solved all of the issues of integrating analysis and quality and safety improvement activities, even within the institution. To give you one example, we did a survey of about 2,000 hospitals a few years ago. The whole survey was about incident reporting systems in those hospitals, to find out how many were automated, how many weren't automated, how many had them, what they reported on. And to your point, we found that 98% of hospitals had an event reporting system. So that was good news. We found that at only 40% of those hospitals did the event reporting system receive any reports on health care-associated infections, which meant that the patient safety team running the event reporting system wasn't talking to the infection control team. This is a perfect example of a fragmentation of efforts at the hospital level.
We have tried to address that problem in the PSO program through the common definitions and reporting formats. Speaking now just about the area we have addressed, which is patient safety in hospitals, we have created common definitions and reporting formats (termed Common Formats) that will cover all patient safety events, no matter how rare or common. We have designed a common way of reporting any adverse event, including incidents, near misses, and unsafe conditions, and it will be standardized throughout the hospital regardless of where it occurs or what type of event gets reported. Basic information is standardized for all events. In addition, more detailed information is standardized for the more common types of events, such as health care–associated infections, medication errors, pressure ulcers, falls, and a few others. Hence the information will be clinically comparable no matter where in the institution an event occurs—or even in what institution it occurs. Information can be reported by different areas in the hospital, and shed light on where events are occurring, as well as aggregated across institutions, by PSO, and nationally.
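The layered design just described—a set of basic fields standardized for every event, plus additional standardized detail for the most common event types—could be sketched as a pair of record types. This is a hypothetical illustration of the structure only; the field names are invented for the example and are not the actual Common Formats.

```python
from dataclasses import dataclass

@dataclass
class PatientSafetyEvent:
    # Basic information standardized for ANY reported event—incident,
    # near miss, or unsafe condition—wherever in the hospital it occurs
    event_id: str
    event_category: str   # e.g. "fall", "medication_error", "pressure_ulcer"
    report_type: str      # "incident" | "near_miss" | "unsafe_condition"
    date_discovered: str
    location: str

@dataclass
class MedicationErrorDetail:
    # Additional standardized detail collected for one of the common
    # event types; links back to the basic record by event_id
    event_id: str
    drug_name: str
    error_stage: str      # e.g. "prescribing", "dispensing", "administering"
    reached_patient: bool
```

Because every event carries the same basic record, reports from any unit of the hospital—or any hospital—can be pooled, while the per-type detail records support deeper analysis of the common event categories.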
One might think that AHRQ would have specified what the common formats would be for national reporting, and just define the data that we want to see come in at the national level. We didn't do that. We defined what should be collected at the local level, where care is being delivered at the hospital. The reason for that is two-fold. The first is that if you only define it at the national level and don't specify the details necessary at the point of care, leaving a degree of freedom in terms of what gets reported at the local level, then people may report things differently so that when they're aggregated at the national level they don't mean the same thing. We thought that the opportunity to define things in a common way had to begin where the data are initially collected. That also gave us the opportunity to say, we're going to define a system that doesn't work just at the national level, it works at the local level. So we are defining the queries that providers or institutions want to make, we're defining the data that need to be collected to answer those queries, and we're defining those reports that need to be run at the local level to report on what has been found. We're also issuing technical specifications so software developers can develop systems both to collect and to report on patient safety events and unsafe conditions at the local level, and transmit to the PSO and the Network of Patient Safety Databases. That will allow local institutions to run their own reports without having to wait to get them back from the national level. Then the aggregate reports can be done on all medication errors in the institution over some time period so that the system should work to support risk management and quality improvement in local institutions. 
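As an illustration of the local-reporting idea—defining the query, the data needed to answer it, and the report that runs at the institution—a hospital's own system could answer a predefined question such as "medication errors by month" directly from locally collected events. This is a hypothetical sketch, not AHRQ-specified software, and the field names are invented for the example.

```python
from collections import Counter

# Hypothetical local event log; each record carries the standardized
# fields collected at the point of care.
events = [
    {"event_category": "medication_error", "month": "2010-01"},
    {"event_category": "medication_error", "month": "2010-01"},
    {"event_category": "fall",             "month": "2010-01"},
    {"event_category": "medication_error", "month": "2010-02"},
]

def medication_errors_by_month(log):
    """Local report: count medication-error events per month."""
    return Counter(e["month"] for e in log
                   if e["event_category"] == "medication_error")

report = medication_errors_by_month(events)
# report counts 2 medication errors in 2010-01 and 1 in 2010-02
```

Because the same standardized records feed this local report, the identical logic can run at the PSO or national level simply by pooling event logs from many institutions.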
The same kinds of reports that are available for the local institution provide the foundation for rolling up at the PSO level and to the national level, which also provides the ability to benchmark or compare findings at one hospital with another, one PSO with another, and nationally. So that's how we address the needs of the local institution to rationalize and harmonize what they're doing and integrate it at higher levels of aggregation.
RW: When the legislation was passed, there were maybe only a couple of state-required reporting systems on the NQF "Never Events" List. The HITECH law, which is infusing billions of dollars to promote health care IT, hadn't been passed, and health reform, which changes a number of things related to quality and safety, hadn't been passed either. So how has the agenda for PSOs evolved in light of these really remarkable changes in the landscape of safety, quality, and health care more generally?
WM: The simple answer is that none of those activities actually address the need that PSOs fill. In other words, none of them, at least to the best of my knowledge, address the issues of privilege and confidentiality in a way that sets up the protected space that enables the creation of a culture of safety and free reporting of adverse events and unsafe conditions. To a large extent, the PSO law that passed 5 years ago still retains its relevance and its uniqueness. I could add to the list the increasing requirement of CDC's National Healthcare Safety Network (NHSN) system for health care-associated infections, which I believe is now required by more than 20 states; there were few, if any, mandatory requirements for NHSN back in 2005. I think the CDC was operating at about 350 hospitals then, and now they're in several thousand. Those are all events that obviously have changed the reporting landscape, as you correctly note.
The HITECH Act and other initiatives dealing with IT try to simplify providers' lives by automating things with the computer. However, if you have two different systems and they're on paper, it's almost easier to sit down and knock heads in a room and get some kind of collaborative agreement on how you're going to unify them. If you've got two different systems that are supported by computer code, it's a rather more expensive and lengthy process. So to some extent, the increasing automation of everything has on the one hand offered the opportunity for harmonization, more efficient data collection, reporting, and much more powerful ability to analyze. But it's also increased the likelihood of creating silos encased in concrete. Recognizing that fact, and recognizing that the end users of all of this—the provider, the hospital, the doctor's office—are finding reporting burdens increasing and becoming more of a problem, with different people requiring the same event to be reported in different ways, we are trying to work with other organizations to begin to foster harmonization and standardization in the way data are collected. So for instance, we're working closely with the CDC to harmonize what we do in PSOs with the NHSN system, and in fact, we have an interagency committee that includes all of the major health agencies in the Department of Health and Human Services as well as the Department of Defense and the Department of Veterans Affairs. These agency representatives work with us to achieve consensus on the Common Formats. We also use the National Quality Forum to obtain public feedback, so we've been able to get hundreds and hundreds of comments on the Common Formats from interested parties, including states and professional associations as well as the public and providers. The NQF also convenes an expert panel that provides AHRQ with advice on public comments received and on the Common Formats themselves.
We're trying to the best of our ability to act as an agent to work collaboratively with the health care community to begin to represent things in a fashion that is scientifically supportable, represents the consensus of the stakeholder communities, and begins to rationalize both substantively and electronically how things are represented. The ultimate goal is that the end user, where care is delivered, would ultimately report only what's important. Data would be collected only once. If reports are needed by the CDC, the FDA, the state, and a PSO, data get collected once and sent wherever they need to go. A provider is not burdened with multiple data reporting requirements from multiple parties regarding the same events.
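The "collect once, report everywhere" goal described above could be sketched as a simple routing step: a single standardized event record is delivered to every agency that needs it, rather than being re-entered separately for each one. This is a hypothetical illustration; the recipient names and record fields are invented for the example.

```python
# Hypothetical sketch: one collected record, routed to all required
# destinations so the provider reports each event only once.

def route_event(event: dict, recipients: list) -> dict:
    """Deliver one standardized record to each required destination."""
    deliveries = {}
    for recipient in recipients:
        # A real system would transmit the record over a secure channel;
        # here we just note what each recipient would receive.
        deliveries[recipient] = event
    return deliveries

record = {"event_category": "device_malfunction", "report_type": "incident"}
sent = route_event(record, ["PSO", "FDA", "state_registry"])
```

The design choice the interviewee describes is exactly this separation: data collection happens once at the point of care, and distribution to the CDC, the FDA, the state, or a PSO becomes a routing problem rather than a fresh reporting burden.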
RW: You are in a position where you're likely going to be asked to report something about rates of safety events, even though in a voluntary system those are always going to be biased. You see examples of this all the time—private and governmental organizations that report the number of incidents in their systems, and one doesn't know whether these reports reflect their actual safety or their reporting practices. Have you come up with an answer to that conundrum?
WM: My preference is that we won't report anything that isn't valid, and we'll caveat it however we need to. I think the reliability of the information and not exaggerating what we have is essential to our credibility. In addition, we're very sensitive to legitimate concerns on the part of the people who have to collect this information about the volume of data collection. We are guided by a principle that nothing should get collected unless one has already decided exactly why it's important to collect it, and how it's going to be used in a report. We're defining clinical questions for which we want answers, and then we're stipulating how the data that get collected to answer those questions are actually used in a report that will run at the institutional level. Any data that might be collected—if it's not going to be used in a report, we don't collect it. We also recognize that even with those strict rules, there can be things that are nice to have, but there's a tradeoff between the value of the information collected and reported, and the time it takes to collect it. We're trying to get that balance right. In the end, if the whole collection, analysis, and reporting effort is not improving care to patients, then it needs to be changed until it is, and nothing should get collected or reported that doesn't drive toward that end. We keep that as our goal.