A Validation Study on Immunophenotypic Differences in T-lymphocyte Chromosomal Radiosensitivity between Newborns and Adults in South Africa
Children have an increased risk of developing radiation-induced secondary malignancies compared to adults, owing to their higher radiosensitivity and longer life expectancy. Despite this epidemiological evidence, only a handful of radiobiology studies have investigated the difference in radiosensitivity between children and adults at the cellular level. In this study, previous results on a potential age dependency of chromosomal radiosensitivity were validated by means of the cytokinesis-block micronucleus (CBMN) assay in T-lymphocytes isolated from umbilical cord and adult peripheral blood of a South African population. The isolated cells were irradiated with 60Co γ-rays at doses ranging from 0.5 Gy to 4 Gy. Increased radiosensitivities of 34%, 42%, 29%, 26% and 16% were observed for newborns compared to adults at 0.5, 1, 2, 3 and 4 Gy, respectively. Immunophenotypic evaluation by flow cytometry revealed a significant age-related change in the fraction of naïve (CD45RA+) cells within the CD4+ and CD8+ T-lymphocyte populations. In newborns, an average of 91.05% (range: 80.80–98.40%) of CD4+ cells co-expressed CD45RA, whereas this fraction decreased to an average of 39.08% (range: 12.70–58.90%) in adults. Similar observations were made for CD8+ cells. These findings support previously published results suggesting that the observed differences in chromosomal radiosensitivity between newborn and adult T-lymphocytes could be linked to their immunophenotypic profiles.