Adrenal Lesions: Characterization with Fused PET/CT Image in Patients with Proved or Suspected Malignancy—Initial Experience

Radiology, 2006, Vol 238 (3), pp. 970-977
Author(s): Michael A. Blake, James M. A. Slattery, Mannudeep K. Kalra, Elkan F. Halpern, Alan J. Fischman, ...
Keyword(s): CT Image

2015, Vol 54 (06), pp. 247-254
Author(s): A. Kapfhammer, T. Winkens, T. Lesser, A. Reissig, M. Steinert, ...

Aim: To retrospectively evaluate the feasibility and value of CT-CT image fusion for assessing the shift of peripheral lung cancers with and without chest wall infiltration, comparing computed tomography acquisitions in shallow breathing (SB-CT) and deep-inspiration breath-hold (DIBH-CT) in patients undergoing FDG-PET/CT for lung cancer staging. Methods: Image fusion of SB-CT and DIBH-CT was performed on a multimodal workstation used for nuclear medicine fusion imaging. The distances of intrathoracic landmarks and the positional shifts of tumours were measured using a semitransparent overlay of both CT series. Statistical analyses were adjusted for confounders of tumour infiltration. Cutoff levels were calculated for the prediction of infiltration and non-infiltration. Results: The lateral pleural recesses and the diaphragm showed the largest respiratory excursions. Infiltrating lung cancers showed more limited respiratory shifts than non-infiltrating tumours. A large respiratory tumour shift accurately predicted non-infiltration; however, the tumour shifts were small and variable, which limited the accuracy of prediction. Conclusion: This pilot fusion study proved feasible and allowed a simple analysis of the respiratory shifts of peripheral lung tumours using CT-CT image fusion in a PET/CT setting. The calculated cutoffs were useful for excluding chest wall infiltration but did not accurately predict tumour infiltration. The method can provide additional qualitative information, without the need for further investigations, in patients undergoing PET/CT whose lung cancers are in contact with the chest wall but show unclear CT evidence of infiltration. Given the small sample size investigated, further studies are necessary to verify these results.
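
The measurement described here reduces to comparing tumour positions between two co-registered CT series and applying a cutoff to the observed shift. The sketch below is a minimal illustration of that idea, not the authors' workstation procedure; the binary tumour masks, the voxel-spacing argument, the helper function names, and the 10 mm cutoff are all hypothetical.

```python
# Illustrative sketch: given co-registered tumour segmentation masks from a
# shallow-breathing (SB) and a deep-inspiration breath-hold (DIBH) CT series,
# compute the 3D centroid shift in millimetres and flag cases whose shift
# exceeds a chosen cutoff as likely non-infiltrating.
import numpy as np

def centroid_mm(mask: np.ndarray, voxel_spacing_mm: tuple[float, float, float]) -> np.ndarray:
    """Centroid of a binary mask, converted from voxel indices to millimetres."""
    coords = np.argwhere(mask)                      # (N, 3) voxel indices
    return coords.mean(axis=0) * np.asarray(voxel_spacing_mm)

def respiratory_shift_mm(mask_sb, mask_dibh, voxel_spacing_mm):
    """Euclidean distance between tumour centroids on the SB and DIBH series."""
    return float(np.linalg.norm(
        centroid_mm(mask_sb, voxel_spacing_mm) - centroid_mm(mask_dibh, voxel_spacing_mm)
    ))

# Hypothetical cutoff in mm; the paper derives its own values, not reproduced here.
SHIFT_CUTOFF_MM = 10.0

def predicts_non_infiltration(shift_mm: float) -> bool:
    # A large respiratory shift argues against chest wall infiltration;
    # a small shift is inconclusive and does not by itself prove infiltration.
    return shift_mm > SHIFT_CUTOFF_MM
```

This mirrors the asymmetry reported in the abstract: exceeding the cutoff supports exclusion of infiltration, while falling below it is not a reliable positive predictor.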


2019, Vol 14 (7), pp. 658-666
Author(s): Kai-jian Xia, Jian-qiang Wang, Jian Cai

Background: Lung cancer is one of the most common malignant tumors. Successful diagnosis of lung cancer depends on the accuracy of the images obtained from medical imaging modalities. Objective: The fusion of CT and PET combines the complementary and redundant information of both images and can increase the ease of perception. Since existing fusion methods are not perfect and the fusion effect remains to be improved, this paper proposes a novel adaptive PET/CT fusion method for lung cancer within the Piella framework. Methods: The algorithm first adopts the dual-tree complex wavelet transform (DTCWT) to decompose the PET and CT images into different components. In accordance with the characteristics of the low-frequency and high-frequency components and the features of the PET and CT images, five membership functions are combined to determine the fusion weights for the low-frequency components. To fuse the high-frequency components, the energy difference of the decomposition coefficients is selected as the match measure and the local energy as the activity measure; a decision factor is also determined for the high-frequency components. Results: The proposed method is compared with several pixel-level spatial-domain image fusion algorithms. The experimental results show that the proposed algorithm is feasible and effective. Conclusion: The proposed algorithm better retains and highlights the edge and texture information of lesions in the fused image.
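
As an illustration of this transform-domain fusion strategy, the sketch below uses a single-level standard discrete wavelet transform (via PyWavelets) as a stand-in for the paper's DTCWT, a fixed weighted average in place of the five membership functions for the low-frequency band, and a local-energy activity measure to select high-frequency coefficients. The wavelet choice, weights, and window size are assumptions, not the authors' settings.

```python
# Simplified transform-domain PET/CT fusion sketch (standard DWT stands in for
# the paper's dual-tree complex wavelet transform).
import numpy as np
import pywt
from scipy.ndimage import uniform_filter

def fuse_pet_ct(pet: np.ndarray, ct: np.ndarray, wavelet: str = "haar",
                low_freq_pet_weight: float = 0.5, window: int = 3) -> np.ndarray:
    """Fuse two co-registered, equally sized 2D slices in the wavelet domain."""
    pet_low, pet_high = pywt.dwt2(pet.astype(float), wavelet)
    ct_low, ct_high = pywt.dwt2(ct.astype(float), wavelet)

    # Low-frequency band: fixed weighted average (placeholder for the
    # adaptive, membership-function-based weights of the paper).
    fused_low = low_freq_pet_weight * pet_low + (1.0 - low_freq_pet_weight) * ct_low

    # High-frequency bands: choose, per coefficient, the source with the
    # larger local energy (activity measure).
    fused_high = []
    for p_band, c_band in zip(pet_high, ct_high):
        p_energy = uniform_filter(p_band ** 2, size=window)
        c_energy = uniform_filter(c_band ** 2, size=window)
        fused_high.append(np.where(p_energy >= c_energy, p_band, c_band))

    return pywt.idwt2((fused_low, tuple(fused_high)), wavelet)
```

The same selection logic extends to a multi-level or dual-tree decomposition; only the decomposition and reconstruction calls change.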


2021
Author(s): Eugenio Galicia-Larios, Carlos Alberto Reynoso-Mejía

2017, Vol 38 (6), pp. 471-479
Author(s): Nicholas J. Vennart, Nicholas Bird, John Buscombe, Heok K. Cheow, Ewa Nowosinska, ...

Tomography, 2022, Vol 8 (1), pp. 131-141
Author(s): Kanae Takahashi, Tomoyuki Fujioka, Jun Oyama, Mio Mori, Emi Yamaga, ...

Deep learning (DL) has recently become a remarkably powerful tool for image processing. However, the usefulness of DL in positron emission tomography (PET)/computed tomography (CT) for breast cancer (BC) has been insufficiently studied. This study investigated whether a DL model using PET maximum-intensity-projection (MIP) images at multiple projection angles increases diagnostic accuracy for PET/CT image classification in BC. We retrospectively gathered 400 images of 200 BC and 200 non-BC patients as training data. For each case, we obtained PET MIP images at four angles (0°, 30°, 60°, 90°) and built two DL models using Xception. One DL model diagnosed BC using only the 0° MIP image (the 0-degree model), and the other used all four angles (the 4-degree model). After the training phase, our DL models analyzed test data comprising 50 BC and 50 non-BC patients. Five radiologists interpreted these test data. Sensitivity, specificity, and area under the receiver operating characteristic curve (AUC) were calculated. Our 4-degree model, 0-degree model, and the radiologists had sensitivities of 96%, 82%, and 80–98% and specificities of 80%, 88%, and 76–92%, respectively. Our 4-degree model had equal or better diagnostic performance compared with the radiologists (AUC = 0.936 and 0.872–0.967, p = 0.036–0.405). A DL model similar to our 4-degree model may help radiologists in their diagnostic work in the future.
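
A model along these lines, a shared Xception backbone applied to each MIP angle with the resulting features merged for a binary prediction, can be sketched as below. The input size, the feature-averaging step, and the training configuration are assumptions for illustration; the authors' exact architecture and preprocessing are not reproduced here.

```python
# Minimal sketch of a four-view PET MIP classifier with a shared Xception backbone.
import tensorflow as tf

def build_four_view_model(input_shape=(299, 299, 3)) -> tf.keras.Model:
    backbone = tf.keras.applications.Xception(
        include_top=False, weights=None, input_shape=input_shape, pooling="avg"
    )
    # One input per MIP angle (0, 30, 60, 90 degrees); the backbone is shared.
    inputs = [tf.keras.Input(shape=input_shape, name=f"mip_{deg}deg")
              for deg in (0, 30, 60, 90)]
    features = [backbone(x) for x in inputs]
    merged = tf.keras.layers.Average()(features)
    output = tf.keras.layers.Dense(1, activation="sigmoid", name="bc_probability")(merged)
    return tf.keras.Model(inputs=inputs, outputs=output)

model = build_four_view_model()
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=[tf.keras.metrics.AUC(name="auc")])
```

Averaging the per-view features is one simple way to combine the four angles; concatenation or attention over views would be equally valid design choices under the same overall idea.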


2009, Vol 23 (3), pp. 235-243
Author(s): Keitaro Sofue, Ukihide Tateishi, Morio Sawada, Tetsuo Maeda, Takashi Terauchi, ...

2019, Vol 44 (3), pp. e168-e169
Author(s): Young-Sil An, Joon-Kee Yoon, Su Jin Lee, Eugene Jeong, Il-hyun Kim
