Abstract
Fifty-one laboratories from 14 countries participated in a survey on the determination of selenium (Se) in 8 bovine blood samples with Se concentrations ranging from 0.2 μmol/L (0.016 μg/mL) to 14 μmol/L (1.1 μg/mL). The methods used (and the percentage of participants using each) were fluorometry (61%), hydride-generation atomic absorption spectrophotometry (AAS) (23%), graphite-furnace AAS (6%), gas chromatography (4%), neutron activation analysis (4%), and X-ray fluorometry (2%). There was little difference between the mean Se results obtained by fluorometry and those obtained by hydride-generation AAS (P > 0.05). Mean intralaboratory coefficients of variation (CVs) from known replicates ranged from 4 to 14% across all samples. Interlaboratory CVs were related to blood Se concentration and increased to 55% at Se levels below 0.4 μmol/L (0.032 μg/mL). Laboratories that used quality control (QC) schemes had lower interlaboratory CVs than those that did not, but the advantage began to diminish at blood Se concentrations below 0.4 μmol/L (0.032 μg/mL). The high interlaboratory CVs, coupled with the false assurance offered by the low intralaboratory CVs and the ineffectiveness of the QC schemes at blood Se concentrations below 0.4 μmol/L (0.032 μg/mL), are of concern in the diagnosis of marginal Se deficiency in livestock, where the concentrations of interest lie in the range 0.15-0.5 μmol/L (0.012-0.039 μg/mL).
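To make the unit conversions explicit (a Se concentration in μmol/L multiplied by the molar mass of Se, 78.96 g/mol, gives μg/L), the following minimal Python sketch reproduces the abstract's range endpoints and shows how a coefficient of variation is computed. The replicate values and function names are illustrative assumptions, not data from the survey.

import statistics

SE_MOLAR_MASS = 78.96  # g/mol; 1 μmol/L Se = 78.96 μg/L = 0.07896 μg/mL

def umol_per_l_to_ug_per_ml(c_umol_l):
    """Convert a Se concentration from μmol/L to μg/mL."""
    return c_umol_l * SE_MOLAR_MASS / 1000.0  # μg/L -> μg/mL

def cv_percent(values):
    """Coefficient of variation: sample standard deviation as a % of the mean."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

print(umol_per_l_to_ug_per_ml(0.2))   # ~0.016 μg/mL (lowest sample)
print(umol_per_l_to_ug_per_ml(14.0))  # ~1.1 μg/mL (highest sample)

# Illustrative replicate results (μmol/L) from one hypothetical laboratory:
print(cv_percent([0.38, 0.42, 0.40, 0.36]))  # ~6.6%, an intralaboratory CV

The same cv_percent function applied across laboratory means rather than within-laboratory replicates would give the interlaboratory CV, which is why the two dispersion measures reported in the abstract can diverge so sharply at low Se concentrations.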