Conventionally, a serum ceruloplasmin level below the lower reference limit (0.20 g/L) is considered a diagnostic cutoff for Wilson's disease (WD). However, the lower reference limit varies with assay methodology and with the populations included in previous studies. The objective of this study was to determine the optimal cutoff value of serum ceruloplasmin for the diagnosis of WD in a large Chinese cohort and to identify factors associated with serum ceruloplasmin levels.
The cutoff value for ceruloplasmin was derived from a retrospective derivation cohort of 3,548 subjects (1,278 patients with WD and 2,270 controls) and validated in a separate validation cohort of 313 subjects (203 patients with WD and 110 controls). The diagnostic performance of the immunoassay was assessed by receiver operating characteristic (ROC) curve analysis, and differences among groups were analyzed using the Mann–Whitney U test.
The conventional serum ceruloplasmin cutoff of <0.2 g/L had an accuracy of 81.9% and a false-positive rate of 30.5%. ROC analysis identified 0.13 g/L as the optimal cutoff for separating patients with WD from other participants, with an AUC of 0.99, a sensitivity of 97.0%, and a specificity of 96.1%. Applying this cutoff would have spared 492 false-positive patients unnecessary further investigation and treatment. By examining the correlation between serum ceruloplasmin and phenotypes/genotypes in patients with WD, we found that serum ceruloplasmin levels were lower in early-onset patients and higher in late-onset patients. Interestingly, patients with the R778L/R919G genotype had higher serum ceruloplasmin levels than patients with other hot-spot mutation combinations.
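The cutoff-selection step described above can be illustrated with a minimal sketch: choosing the threshold that maximizes the Youden index (sensitivity + specificity − 1) on an ROC curve. The data below are synthetic and purely illustrative; they do not reproduce the study cohort, and the specific distributions are assumptions.

```python
# Hedged sketch of ROC-based cutoff selection via the Youden index.
# Synthetic data only: WD patients assumed low ceruloplasmin, controls
# assumed near the normal range (illustrative distributions, not cohort data).
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)
wd = rng.normal(0.06, 0.04, 500).clip(0.0, 0.5)        # simulated WD values (g/L)
controls = rng.normal(0.25, 0.06, 800).clip(0.0, 0.6)  # simulated controls (g/L)

values = np.concatenate([wd, controls])
labels = np.concatenate([np.ones(len(wd)), np.zeros(len(controls))])

# Lower ceruloplasmin indicates disease, so negate values as the score.
fpr, tpr, thresholds = roc_curve(labels, -values)
best = (tpr - fpr).argmax()          # Youden index maximum
cutoff = -thresholds[best]           # map back to the original g/L scale
auc = roc_auc_score(labels, -values)
print(f"cutoff={cutoff:.2f} g/L, AUC={auc:.2f}, "
      f"sensitivity={tpr[best]:.2f}, specificity={1 - fpr[best]:.2f}")
```

With well-separated groups like these, the Youden-optimal threshold falls between the two distributions and both sensitivity and specificity stay high, mirroring how a data-driven cutoff can outperform a fixed reference limit.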
Our work determined the optimal cutoff value of serum ceruloplasmin for the diagnosis of WD and identified differences in serum ceruloplasmin levels with respect to the age of symptom onset and genotype.