A local hospital publicly reported its organizational discharge data. The data were risk-adjusted to control for pre-existing comorbidities and to level the playing field with data reported by other hospitals. The report, however, shows little risk adjustment for stroke mortality. The hospital recently received stroke center designation for following best-practice protocols for stroke management. The DNP scholar and the data analyst track stroke mortality data and note a spike in mortality for a given quarter, which explains why the state public report is unsatisfactory. They drill into the data to examine the numerator (deaths due to stroke) and the denominator (the total patient population). The data analyst determines that risk of mortality is scored from one to four, with one indicating a minor risk of mortality. This explains why the hospital did not risk-adjust well: these patients did not appear to have risks associated with severe illness that would reasonably explain the associated mortality. Next, the data analyst drills further into the detailed, subject-oriented data of actual patient records and completes a full chart review to find the "sweet spot" that informs quality improvement.
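For illustration only, the sketch below (Python, with entirely hypothetical records and field names) mirrors the drill-down described in the scenario: a crude quarterly stroke mortality rate computed from the numerator and denominator, and the deaths tallied by risk-of-mortality score to show whether they cluster among low-acuity patients.

```python
# Hypothetical sketch of the drill-down in the scenario: compute the crude
# quarterly stroke mortality rate (numerator / denominator) and break the
# deaths down by risk-of-mortality (ROM) score. All values are illustrative,
# not drawn from any real hospital data set.

from collections import Counter

# Each record is one stroke discharge for the quarter: (died, rom_score),
# where rom_score runs 1 (minor risk of mortality) to 4 (extreme risk).
discharges = [
    (True, 1), (True, 2), (False, 1), (False, 2), (True, 1),
    (False, 3), (False, 1), (True, 4), (False, 2), (False, 1),
]

numerator = sum(1 for died, _ in discharges if died)   # deaths due to stroke
denominator = len(discharges)                          # total stroke discharges
crude_rate = numerator / denominator

# Deaths by ROM score: a pile-up at scores 1-2 indicates deaths in patients
# without documented severe illness, the pattern that makes the hospital
# risk-adjust poorly in the public report.
deaths_by_rom = Counter(rom for died, rom in discharges if died)

print(f"Crude stroke mortality: {numerator}/{denominator} = {crude_rate:.1%}")
for rom in sorted(deaths_by_rom):
    print(f"  ROM score {rom}: {deaths_by_rom[rom]} death(s)")
```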
Consider the scenario above to address the following.
1-Do you recommend that the data analyst examine aggregate data, detailed data, or both, to investigate this quality issue? Please explain your rationale.
2-Do you recommend that the data analyst use a retrospective data warehouse, clinical data store, or both, to investigate the mortality rate? Please explain your rationale.
3-What types of tools or analytic approaches are relevant for this analyst to use? Please explain your rationale.
Now, conduct a search for evidence. Select three scholarly sources of information describing the challenges of utilizing data in the clinical setting.
- Provide a brief overview of the findings of each source of evidence.
Please answer each question separately.