Published in

American Association of Neurological Surgeons, Journal of Neurosurgery: Spine, 26(2), pp. 235–242, 2017

DOI: 10.3171/2016.7.spine16183

An assessment of data and methodology of online surgeon scorecards

Distributing this paper is prohibited by the publisher

Full text: Unavailable

Preprint: archiving forbidden
Postprint: archiving forbidden
Published version: policy unknown
Data provided by SHERPA/RoMEO

Abstract

OBJECTIVE Recently, 2 surgeon rating websites (Consumers' Checkbook and ProPublica) were launched to allow the public to compare surgeons by surgeon volume and complication rates. Among neurosurgeons and orthopedic surgeons, only cervical and lumbar spine, hip, and knee procedures were included in this assessment.

METHODS The authors examined the methodology of each website to assess potential sources of inaccuracy. Each online tool was queried for reports on neurosurgeons specializing in spine surgery and orthopedic surgeons specializing in spine, hip, or knee surgery. Surgeons were chosen from top-ranked US hospitals, as recorded by a national consumer publication ranking system, within the fields of neurosurgery and orthopedic surgery. The results were compared for accuracy and surgeon representation, and the results of the 2 websites were also compared with each other.

RESULTS The methodology of each site was found to leave room for bias and to offer limited risk adjustment. The end points assessed by each site were not actually complications, but proxies for complication occurrence. A search of 510 surgeons (401 orthopedic surgeons [79%] and 109 neurosurgeons [21%]) showed that only 28% and 56% of surgeons had data represented on Consumers' Checkbook and ProPublica, respectively. Surgeon data were significantly more likely to be found on ProPublica (p < 0.001). Of the surgeons from top-ranked programs with data available, 17% were reported to have high complication rates, 13% to have lower volume than other surgeons, and 79% received a 3-star out of 5-star rating. No significant correlation was found between the number of stars a surgeon received on Consumers' Checkbook and his or her adjusted complication rate on ProPublica.

CONCLUSIONS Both the Consumers' Checkbook and ProPublica websites have significant methodological issues. Neither site assessed complication occurrence, but rather readmissions or prolonged length of stay. Risk adjustment was limited or nonexistent. A substantial number of neurosurgeons and orthopedic surgeons from top-ranked hospitals have no ratings on either site, or have data suggesting they are low-volume surgeons or have higher complication rates. Consumers' Checkbook and ProPublica produced different results, with little correlation between the 2 websites in how surgeons were graded. Given the significant methodological issues, incomplete data, and lack of appropriate risk stratification of patients, the featured websites may provide erroneous information to the public.