Appendix B: Excess Cost and Length of Stay Associated with Voluntary Event Reports in Hospitals

Cost of Poor Quality or Waste in Integrated Delivery System Settings

 

Andrew R. Paradis, MBA
Valerie T. Stewart, Ph.D.
K. Bruce Bayley, Ph.D.
Allen Brown
Andrew J. Bennett

Address for Correspondence:
Andrew Paradis
Providence Health System
5211 NE Glisan St., Building C
Portland, OR 97213
(503) 215-7174
E-mail address: Andrew.Paradis@providence.org

Affiliations: Center for Outcomes Research and Education

Context: Voluntary event reporting has proliferated in hospitals yet little is known about the cost and length of stay associated with events captured through this means.

Objective: To quantify excess costs and length of stay associated with voluntary patient safety event reports.

Study Design: Patient safety events were captured in an electronic registry at three urban, community hospitals in Portland, Oregon. Information was collected on a voluntary basis from any hospital care provider. All reported events were anonymous. Cost and length of stay were assessed by linking event reports to risk-adjusted administrative data from a well-known national vendor-supported data set, CareScience.

Principal Findings: Hospital stays with any event report were 17% more costly and 22% longer than those without events. Medication and treatment errors were the most expensive and most common unplanned events, representing 77% of all unplanned event types and 77% of added costs. There was no significant difference in additional cost or length of stay by the outcome designated on the report.

Conclusions: Though rarely utilized to measure patient safety related costs, the events captured by voluntary reports add significantly to the cost and duration of hospital care.

Key Words: Patient Safety, Cost, Length of Stay, Voluntary Reporting

Although the cost of adverse events has been well established for some time (Thomas et al. 1999), few studies have quantified the cost of events captured by general purpose, voluntary event reporting systems.

Since the Institute of Medicine (IOM) report To Err Is Human (Institute of Medicine 1999), patient safety has gained greater national prominence and, as recommended in that publication, voluntary event reporting systems have started to proliferate at hospitals around the country (Martin et al. 2005; Mekhjian et al. 2004). Prior work to quantify patient safety related cost and length of stay has focused on adverse events or injuries that are defined by some level of patient harm (Classen et al. 1997; Bates et al. 1997; Einbinder and Scully 2001; Leape 2002; Samore et al. 2004; Senst et al. 2001; Suresh et al. 2004; Zhan and Miller 2003).

By comparison, voluntary event reporting systems capture a broad range of unsafe conditions, events and patient outcomes that in many cases do not involve patient harm. As such, estimates of excess cost and length of stay using voluntary event reports measure different kinds of patient safety related costs than those associated with adverse events or injuries.

To our knowledge, few studies have used data from these types of unspecialized event reporting systems to examine the additional costs and lengths of stay associated with the events they collect. By using a multivariate regression model with case matching, risk adjustment, and log transformation of highly skewed dependent variables, this study represents a methodological improvement on the sole study (Nordgren et al. 2004) to estimate the excess cost and length of stay associated with voluntary event reports.

Methods—Data Sources

The Providence Center for Outcomes Research and Education (CORE) analyzed data from three Providence Health System (PHS) community hospitals in the Portland, Oregon metropolitan area. Providence St. Vincent Medical Center operates 451 beds, Providence Portland Medical Center operates 483 beds and Providence Milwaukie Hospital operates 77 beds.

To create a dataset containing patient encounter cost and length of stay data along with any associated event information, we extracted and linked data from two sources. One source was an administrative database and the other source was a database of voluntary event reports.

The administrative database, developed by CareScience (CareScience 2006), a benchmarking vendor, contained cost, actual length of stay, age, sex, payer, DRG, predicted cost and length of stay for each hospitalization, the unit of analysis. As a result, a patient with multiple hospitalizations during the study period would appear once for each hospitalization. In the CareScience database, cost is derived by applying cost to charge ratios to patient charge data. This database uses proprietary, diagnosis specific, risk adjustment models calculated from their client database of more than 200 hospitals representing over 4 million discharges. The risk models use variables for chronic diseases, comorbidities, principal diagnosis, major procedures, urgency of admission, age, sex, race, median household income in patient zip code, relative travel distance to facility, admission source and transfer status to provide patient specific estimates of expected cost and length of stay. Data for the present study included all 123,281 discharges between 4/1/2002 and 4/30/2004 from the three Providence hospitals.

The voluntary event report database contained 29,019 submissions from the three hospitals between 4/1/2002 and 4/30/2004 related to different event "types": medication errors, patient falls, treatment events, equipment problems, behavioral issues and loss/exposure events. Throughout this analysis these are referred to simply as event reports.

Beginning in the third quarter of 2001, event reports could be submitted to a centralized database using machine-readable paper forms, and in late 2003 an online system was added. Hospital leadership has strongly encouraged the reporting of as many close calls and unsafe practices as possible rather than limiting reporting to actual incidents. Managers receive regular feedback about the events that occur in the areas for which they are responsible.

In our system, event reports could have more than one type-category assigned to them and multiple causes related to each type. Within each type-category there were also between 5 and 15 subtypes. For example, medication event reports could describe a variety of missteps, including subtypes such as omission, patient misidentification, dosage, timing, and adverse reactions (Table 4). Fall event reports included whether or not the event was observed and whether or not the fall was assisted, from a variety of locations such as bed, toilet, or chair (Table 5). Treatment event reports included a wide variety of event subtypes such as delayed, omitted or incorrect treatment, patient misidentification, latex sensitivity, or injury (Table 6). Equipment events included misuse or malfunction of equipment or improper disposal of supplies (Table 7). Behavioral event reports could document threats of or actual physical or verbal abuse, legal action, complaints, leaving against medical advice, or the presence of contraband such as drugs or weapons (Table 8). Loss event reports included theft and exposure to materials and fumes (Table 9).

Each report was also assigned an outcome using fourteen categories that described the event's potential impact, based on the NCC MERP scale (National Coordinating Council for Medication Error Reporting and Prevention 2006) or, for falls, the NDNQI scale (National Center for Nursing Quality 2005). An outcome could be categorized as "No Incident", "Error/No Harm", "Error/Harm", or "Error/Death". "No Incident" outcomes were those without the capacity to cause any disruption in care. There was no "No Incident" option for falls, since a fall by its nature constitutes a disruption. "Error/No Harm" outcomes included events that occurred but did not reach the patient, or reached the patient but did not cause harm. "Error/Harm" outcomes were events that occurred and resulted in additional treatment, prolonged hospitalization, permanent patient harm, or a near-death event. "Error/Death" outcomes were those where the patient died. Each report was reviewed by the manager of the department where the event occurred and by hospital quality management personnel, each of whom made corrections as needed to ensure the accuracy of the outcome and all other information.

During the time period analyzed, the account number or medical record number of the patient was manually entered into the Web-based system or stamped onto the paper version. In both submission formats, this information was optional, to encourage greater anonymity in reporting. The system also allowed reporting of a general safety issue not specific to a patient, obviating the need for patient identifiers. Records with patient identifiers made the event report specific to a particular hospital encounter. Interestingly, the percentage of reports linked to an encounter (51%) was similar in both paper and online reporting formats. Reports could be completed for events involving patients, visitors, and employees, although the majority related to patients. This study includes only events related to patients.

Merging the two data sources produced 15,851 encounters that were linked to a voluntary event report, leaving 10,352 reports that had some information entered in the patient identification fields but could not be successfully linked with the CareScience data. The event reports that were linked to an encounter in the CareScience data had a distribution of report types and outcomes similar to those that were not.

Case Matching

Patient encounters with voluntary event reports were matched against 1 to 4 controls using facility, initial department, DRG, sex, and age (±10 years). This follows the methodology used by Zhan and Miller (2003) and was done to create a dataset with controls that were similar to patients with a voluntary event report. In particular, maternity cases represented a large portion of hospital volume, had relatively few event reports, and had costs and lengths of stay that differed from other hospital cases. The initial department was the first hospital department where a room and board charge was recorded for a given hospitalization. In the hospitals' administrative data, charges were recorded at midnight. The goal of this procedure was to control for differences in hospital processes that might influence the likelihood of an error and of the event being reported. Matching was done without replacement so that each case was matched to a different control. Of the 15,851 encounters linked to an event report, 11,568 were successfully matched with at least one control case. The matched cases and controls reflected the distribution of patient types found in the hospitals' overall patient population in all areas other than obstetrics and newborns, which had a relatively small number of event reports relative to their large proportion of hospital volume (Table 1).
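The matching procedure above can be sketched as a greedy matching routine (a simplified illustration, not the study's actual implementation; the dictionary keys 'id', 'facility', 'department', 'drg', 'sex', and 'age' are assumed field names):

```python
from collections import defaultdict

def match_controls(cases, controls, max_controls=4):
    """Greedy 1-to-4 matching without replacement, sketching the
    Zhan and Miller (2003) approach described above.

    Encounters are dicts with hypothetical keys: 'id', 'facility',
    'department' (initial department), 'drg', 'sex', and 'age'.
    Controls must match the case exactly on facility, department,
    DRG, and sex, and be within +/- 10 years of the case's age.
    """
    # Index the available controls by the exact-match key.
    pool = defaultdict(list)
    for c in controls:
        pool[(c['facility'], c['department'], c['drg'], c['sex'])].append(c)

    matched = {}
    for case in cases:
        key = (case['facility'], case['department'], case['drg'], case['sex'])
        eligible = [c for c in pool[key]
                    if abs(c['age'] - case['age']) <= 10]
        chosen = eligible[:max_controls]
        for c in chosen:
            pool[key].remove(c)          # without replacement: use each control once
        if chosen:                       # keep only cases with at least one control
            matched[case['id']] = [c['id'] for c in chosen]
    return matched
```

A case is dropped if no control shares its facility, initial department, DRG, and sex within the age window, which mirrors why 11,568 of 15,851 linked encounters were retained.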

Multivariate Modeling

To isolate the influence of an event itself from patient characteristics that influence cost and length of stay, we initially constructed two linear regression models. The cost model used the logarithm of cost as the dependent variable and included as independent variables the logarithm of expected cost, the logarithm of expected LOS, age, sex, payer, a surgery indicator variable, a dummy variable indicating that an event was reported for that patient encounter, and interaction terms between payer and both the log of expected cost and the log of expected LOS. We also constructed a second model for length of stay using the same independent variables and the logarithm of length of stay as the dependent variable. In these two regression models, the coefficient of the event indicator variable was interpreted as the increase in cost or length of stay associated with an event, accounting for differences in patient characteristics.

To model the cost or length of stay of a particular event type or outcome, we replaced the event dummy variable with dummy variables for each event type or outcome category. This yielded a total of eight regression equations: two overall cost and length of stay equations, two for cost and length of stay by type, two for cost and length of stay by outcome, and two for cost and length of stay by type and outcome (Table 2). Significant differences between types and/or outcomes were identified by examining 95% confidence intervals around the parameter estimate for the respective event dummy variable. This approach is similar to a two-tailed t test.

Using the logarithm of cost or length of stay was necessary to ensure that our models satisfied the assumptions of linear regression (William 1993; Manning 1998). The result of this transformation is that the coefficient of the event indicator variable now estimates the logarithm of the proportional change in cost or length of stay and must be transformed to be more easily interpreted (Austin, Ghali, and Tu 2003). This was done by taking the anti-log of the event dummy coefficient which provided the proportional change in cost or length of stay given an event report.

This was then multiplied by the median cost or length of stay of non-event patient encounters to provide a "per event" excess cost or excess days. This "per event" estimate was then multiplied by the total number of events in the dataset after matching to calculate overall total costs and days. The cost models had an R-square of .72 and the overall length of stay model had an R-square of .51. All analyses were performed using SPSS 13.0 (SPSS 2004).
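The log-scale modeling and back-transformation described above can be illustrated on simulated data (a minimal sketch, not the study's actual model: the single covariate, the noise level, and all variable names are invented assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000

# Simulated encounters: log(cost) depends on log(expected cost) plus a
# multiplicative bump for encounters with an event report.
log_expected = rng.normal(9.0, 0.5, n)        # hypothetical log expected cost
event = rng.integers(0, 2, n).astype(float)   # 1 = event report filed
true_mult = 1.17                              # simulate a 17% cost increase
log_cost = (0.3 + 1.0 * log_expected
            + np.log(true_mult) * event
            + rng.normal(0.0, 0.2, n))

# OLS on the log scale: design matrix [intercept, log expected cost, event dummy]
X = np.column_stack([np.ones(n), log_expected, event])
beta, *_ = np.linalg.lstsq(X, log_cost, rcond=None)

# Back-transform: the anti-log of the event coefficient gives the
# proportional change in cost associated with an event report.
pct_increase = np.exp(beta[2]) - 1.0
print(round(100 * pct_increase, 1))  # close to 17
```

Multiplying `pct_increase` by the median cost of non-event encounters would then yield the "per event" dollar figure described above.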

Results

In our analysis, after controlling for patient risk factors, hospitalizations with any type of event report were 17% more expensive than those without an event report (Table 2). Similarly, length of stay was 22% longer for patients with an event report compared to those without (Table 3). Medication and Fall events were the most expensive (21% higher cost), followed by behavioral events (15%), loss/exposure (13%), treatment (12%) and equipment events (11%). Both Medication and Fall events were significantly more expensive than other event types. Medication and Treatment event reports were the most common, representing almost 77% of all event types.

Overall, there was a significant difference in cost increase between "No Incident" events (11%-15% confidence interval) and more serious events ("Error/No Harm", 18%-21% and "Error/Harm", 17%-23% confidence intervals respectively) (Table 2). There was no significant difference in cost increase between events with harm and those without harm.

Fall event reports were associated with the greatest increase in length of stay (34% longer LOS), followed by medication events (26%), loss/exposure events (25%), behavioral events (21%), treatment events (13%) and equipment events (10%) (Table 3). There was not a consistent, statistically significant pattern of greater incremental length of stay for any particular event type. Overall, there was a significant difference in LOS between "No Incident" events (16%-21% confidence interval) and more serious events ("Error/No Harm", 22%-25% and "Error/Harm", 19%-26% confidence intervals, respectively). As was observed with the cost model, there was no significant difference in length of stay increase between events with and without harm.

Extrapolation of Cost and Length of stay

Percentage increases in cost and length of stay can be translated into dollars and days by multiplying the increase in cost and length of stay by the corresponding non-event median values. This step provides a picture of the total impact of voluntary event reports since it combines both the percent increase in cost or LOS and the frequency of each event type. In the two years represented by our study, unplanned patient care events added an estimated $8.3 million in additional patient care costs and an additional 4,800 patient days (Table 3).

Medication events, which were both common and relatively expensive per event, accounted for an estimated $4 million in patient care costs and more than 2,300 bed days. Treatment events were the next most expensive, accounting for roughly $2.3 million in extra costs, followed by Fall events, which accounted for more than $900,000 in additional costs. Falls had the greatest per-event increase in length of stay and accounted for more than 1,100 additional bed days over two years.
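As a quick arithmetic check, the per-event medication figures quoted later in the Discussion ($913 and 0.52 days per medication event report) combined with the 4,543 medication reports in Table 2 reproduce the totals above:

```python
# Per-event excess figures for medication reports as quoted in the
# Discussion, and the medication report count from Table 2.
per_event_cost = 913          # dollars per medication event report
per_event_days = 0.52         # excess days per medication event report
n_medication_reports = 4543

total_cost = per_event_cost * n_medication_reports
total_days = per_event_days * n_medication_reports

print(f"${total_cost:,}")     # $4,147,759 -- the "$4 million" figure
print(round(total_days))      # 2362 -- the "more than 2,300 bed days"
```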

Discussion

In our study, the events collected through voluntary reporting add significantly to the cost of patient care. While the exact causes for greater costs are not yet known and need to be investigated further, we suspect that the additional costs are due to rework, the need for additional testing and treatment, and lengthened stays due to patient monitoring.

Our study differs from other efforts to measure patient safety related costs primarily in the means used to identify and define events. While voluntary event reporting systems have been criticized as underreporting the extent of patient harm, their strength is in the collection of data on "near misses" and unsound practices that have the potential to cause future patient harm and might not be discovered through other means (Aspden et al. 2004; Jha et al. 1998; Thomas and Petersen 2003). In contrast to other studies of adverse events and injuries, which define some level of harm, thirty percent of the reports in our study were assigned a "No Incident" outcome and ninety percent did not cause patient harm. Consequently, we are measuring a different type of event than those included in other studies. For example, Zhan and Miller (2003) report higher estimates of length of stay because Patient Safety Indicators (PSI) are generally more acute events (postoperative sepsis or accidental puncture or laceration, for example) than the majority reported in our system (90% did not involve patient harm). In that study, 13 of 18 PSIs had excess length of stay of 1.34 days or more versus 0.43 days per voluntary event report. Bates et al. (1997), Classen et al. (1997) and Senst et al. (2001) likely show greater cost and length of stay (for all ADEs, $2100-$2500 in 1997 and 2001 and 1.9-2.2 days) than those we observe ($913 and 0.52 days for all medication related voluntary event reports) because ADEs are defined with some level of patient harm. Many voluntary event reports, particularly those that do not cause harm, may not be sufficiently documented elsewhere to trigger Patient Safety Indicators, chart review criteria, or automatic detection systems described in these studies.

To date, calculations of the national cost of patient safety events have been based on studies that define some level of patient harm. Including the types of events reported here expands the picture of costs associated with lapses in patient safety to include a wider range of events. Unfortunately, we cannot make national extrapolations with our data and add these to existing national estimates, since there is undoubtedly some overlap between the types of events reported here and those in other studies. Further, our study is limited to three hospitals in one city and would need to be replicated on a much larger scale to be nationally representative.

The present study estimates a very large aggregate effect of events and near misses. However, there is reason to believe that these estimates still underrepresent total costs. They do not include the costs of review and investigation, risk management, or nonbillable costs. Taking all these costs into account, it is clear that the events documented through voluntary patient safety reporting identify a major area of waste and inefficiency.

Limitations

Our study is limited by reporting biases inherent in voluntary reporting systems that influence the type and severity of events reported. Voluntary reports do not cover the whole extent of patient injury or offer a means to assess the prevalence or incidence of errors. These biases are reflected in our much lower estimates of costs and length of stay, and they add support to our hypothesis that even the less acute events captured through voluntary reporting add to total cost and length of stay.

This study is constrained to three hospitals within a single health system. Other hospitals may have different reporting cultures and systems that capture different event types and frequencies. This limits our ability to generalize our findings but should still shed light on the costs of hospital systems and processes that may not cause major harm or death but are inefficient.

There are certainly costs associated with the events in our database that are not captured by administrative systems. For example, the cost of investigation and review of these events is not billable. Costs are also not captured for the numerous instances of miscommunication among medical care teams. Nurses are often required to resolve these communication failures, reducing clinical productivity and thereby increasing hospital operating costs. In the case of falls, sitters are often assigned to watch patients at high risk of falling. This time is also not billable and reduces efficiency. Although cost is captured only through billable activities, length of stay is not, so any patient-specific delays in care or reduced efficiency would be reflected in the length of stay measurements.

 

Table 1: Major Diagnostic Category, Age, and Sex

| Major Diagnostic Category (percent of encounters) | Encounters excluded from analysis | Matched encounters with event report | Matched encounters, no event report |
| --- | --- | --- | --- |
| Circulatory | 13.4 | 19.9 | 20.5 |
| Musculoskeletal | 7.8 | 15.2 | 15.6 |
| Digestive | 8.1 | 11.4 | 10.8 |
| Respiratory | 4.6 | 9.5 | 9.1 |
| Pregnancy | 19.7 | 9.2 | 10.7 |
| Nervous System | 3.6 | 5.8 | 5.2 |
| Newborns | 21.5 | 4.8 | 5.6 |
| Kidney | 2.7 | 3.8 | 3.5 |
| Female Reproductive | 4.2 | 3.6 | 4.0 |
| Mental Health | 1.6 | 3.1 | 3.5 |
| Hepatobiliary and Pancreas | 2.6 | 3.0 | 2.6 |
| Endocrine, Nutritional and Metabolic | 2.0 | 2.4 | 2.2 |
| Infectious | 1.1 | 2.2 | 1.7 |
| Skin | 2.1 | 1.5 | 1.2 |
| Injury | 1.2 | 1.2 | 0.9 |
| Others (< 1% each) | 3.9 | 3.6 | 2.9 |
| Mean age, years | 40.6 | 57.9 | 56.7 |
| Percent female | 62.8 | 59.5 | 60.4 |
| Number of Encounters | 72,713 | 11,568 | 39,000 |

Note: Italics indicate significant difference (p < .05, using a two tailed z-test or t-test for means) between matched encounters with and without an event report.

 

Table 2: Percent Increased Cost and Length of Stay (LOS) by Event Type and Outcome

| Event Type | All Harm Levels % Increase (95% CI) | No Incident % Increase (95% CI) | No Harm % Increase (95% CI) | Harm % Increase (95% CI) | Death |
| --- | --- | --- | --- | --- | --- |
| Medication: Cost (%) | 21.2 (19.5-23.0) | 19.2 (15.6-22.9) | 21.1 (19.0-23.2) | 26.2 (20.7-31.9) | - |
| Medication: LOS (%) | 26.0 (23.9-28.1) | 24.3 (20.1-28.7) | 25.9 (23.5-28.4) | 30.2 (23.8-36.9) | - |
| Medication: N | 4,543 | 929 | 3,167 | 447 | - |
| Treatment: Cost (%) | 11.7 (10.1-13.3) | 8.5 (6.0-11.0) | 14.1 (11.9-16.3) | 10.8 (5.8-15.9) | ns |
| Treatment: LOS (%) | 12.6 (10.7-14.4) | 11.9 (9.1-14.9) | 14.2 (11.8-16.7) | 6.9 (1.6-12.6) | - |
| Treatment: N | 4,622 | 1,687 | 2,499 | 424 | 12 |
| Fall: Cost (%) | 20.9 (17.4-24.4) | na | 22.4 (18.5-26.5) | 16.1 (8.7-24.1) | - |
| Fall: LOS (%) | 34.2 (29.8-38.7) | na | 36.4 (31.5-41.6) | 28.4 (19.2-38.4) | - |
| Fall: N | 1,025 | na | 828 | 197 | - |
| Equipment: Cost (%) | 11.4 (7.3-15.6) | 10.2 (4.0-16.8) | 9.1 (3.5-15.1) | 28.4 (13.9-44.8) | - |
| Equipment: LOS (%) | 9.8 (5.2-14.6) | 10.1 (3.1-17.6) | 6.8 (0.6-13.4) | 25.5 (9.5-43.7) | - |
| Equipment: N | 635 | 261 | 312 | 62 | - |
| Behavioral: Cost (%) | 15.3 (10.9-19.9) | 12.1 (6.3-18.1) | 19.7 (12.0-27.8) | 20.9 (6.4-37.3) | ns |
| Behavioral: LOS (%) | 20.9 (15.6-26.4) | 19.0 (12.2-26.3) | 21.4 (12.6-30.9) | 33.5 (15.5-54.2) | - |
| Behavioral: N | 569 | 316 | 198 | 53 | 2 |
| Loss/Exposure: Cost (%) | 12.8 (8.4-17.5) | 14.9 (9.1-21.1) | 9.9 (2.9-17.5) | ns | - |
| Loss/Exposure: LOS (%) | 24.5 (18.9-30.2) | 27.4 (20.1-35.2) | 21.2 (12.5-30.7) | ns | - |
| Loss/Exposure: N | 542 | 318 | 196 | 28 | - |
| All Report Types: Cost (%) | 17.4 (16.2-18.6) | 13.3 (11.4-15.2) | 19.1 (17.7-20.6) | 19.9 (16.7-23.3) | ns |
| All Report Types: LOS (%) | 21.6 (20.3-23.0) | 18.3 (16.1-20.5) | 23.3 (21.7-25.1) | 22.5 (18.7-26.4) | - |
| All Report Types: N | 11,568 | 3,432 | 6,957 | 1,165 | 14 |
| Outcome (%) | | 29.7 | 60.1 | 10.1 | 0.1 |

na = not an option on the form; ns = not significant at p < .05; - = < 15 cases
Note: Since reports can include multiple event types and only one outcome in our system, the number of cases reported by type will not match the total. Values in parentheses are 95% confidence intervals for the respective variable's coefficient.

Page last reviewed September 2008
Internet Citation: Appendix B: Excess Cost and Length of Stay Associated with Voluntary Event Reports in Hospitals: Cost of Poor Quality or Waste in Integrated Delivery System Settings. September 2008. Agency for Healthcare Research and Quality, Rockville, MD. http://www.ahrq.gov/research/findings/final-reports/costpqids/cpqidsappb.html