Computer assisted Doppler waveform analysis and ultrasound derived turbulence intensity ratios can predict early hyperplasia development in newly created vascular access fistula: Pilot study, methodology and analysis

2021, Vol 10, pp. 204800402110001
Author(s): Matthew Bartlett, Vanessa Diaz-Zuccarini, Janice Tsui

Objectives Following surgical creation of arterio-venous fistulae (AVF), the desired outward remodelling is often accompanied by the development of neointimal hyperplasia (NIH), which can stymie maturation and may lead to thrombosis and access failure. The aim of this study was to investigate the feasibility of using a non-invasive test to detect and quantify the turbulent flow patterns believed to be associated with NIH development. Design This was a prospective, observational study. Ultrasound derived turbulence intensity ratios (USTIR) were calculated from spectral Doppler waveforms recorded from newly formed AVF and were compared with haemodynamic and structural changes observed during the initial maturation period. Setting Measurements were obtained by accredited Clinical Vascular Scientists at the Royal Free Hospital, London. Participants Patients with newly created AVF were invited to participate in the study. A total of 30 patients were initially recruited, with 19 participants completing the 10-week study protocol. Outcome measures The primary outcome measure was the development of NIH resulting in a haemodynamically significant lesion. The secondary outcome was successful maturation of the AVF at 10 weeks. Results Elevated USTIR in the efferent vein 2 weeks post-surgery corresponded to the development of NIH (P = 0.02). A cut-off of 6.39% predicted NIH development with a sensitivity of 87.5% and a specificity of 80%. Conclusion Analysis of Doppler waveforms can successfully identify deleterious flow patterns and predict inward luminal remodelling in maturing AVF. We propose a longitudinal follow-up study to assess the viability of this technique as a surveillance tool.
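The abstract does not reproduce the formula behind USTIR or the analysis behind the 6.39% cut-off, so the sketch below only illustrates the general idea: turbulence intensity is commonly defined as the RMS of velocity fluctuations divided by the mean velocity, and a threshold classifier can be scored for sensitivity and specificity against observed NIH. All function names and example values are illustrative assumptions, not the study's data or method.

```python
import numpy as np

def turbulence_intensity_ratio(velocities):
    """Ratio of the RMS of velocity fluctuations to the mean velocity,
    expressed as a percentage (one common definition of turbulence intensity)."""
    v = np.asarray(velocities, dtype=float)
    fluctuations = v - v.mean()
    return 100.0 * np.sqrt(np.mean(fluctuations ** 2)) / v.mean()

def sensitivity_specificity(ustir_values, developed_nih, cutoff=6.39):
    """Classify patients as 'predicted NIH' when USTIR exceeds the cut-off
    and compare against observed NIH development."""
    predicted = np.asarray(ustir_values) > cutoff
    actual = np.asarray(developed_nih, dtype=bool)
    tp = np.sum(predicted & actual)
    fn = np.sum(~predicted & actual)
    tn = np.sum(~predicted & ~actual)
    fp = np.sum(predicted & ~actual)
    return tp / (tp + fn), tn / (tn + fp)

# Illustrative use with made-up values (not study data):
# sens, spec = sensitivity_specificity([4.1, 7.2, 8.0, 5.5], [0, 1, 1, 0])
```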

2019, Vol 101 (4), pp. 273-278
Author(s): S Arman, A Vijendren, G Mochloulis

Introduction The aim of this single-centre retrospective observational record-based audit was to assess the incidence of post-thyroidectomy hypocalcaemia. The setting was a district general hospital in Hertfordshire covering a population of 500,000 people. A total of 196 patients who had undergone total or completion thyroidectomy during a five-year period were included in the study. Materials and methods The primary outcome measure was the rate of biochemical and symptomatic hypocalcaemia in patients undergoing total or completion thyroidectomy. Secondary outcome measures assessed the time taken for biochemical and clinical hypocalcaemia to resolve, whether malignancy affected the rate of hypocalcaemia and whether removal of parathyroid glands during surgery was a predictor of hypocalcaemia. Results The overall incidence of post-thyroidectomy hypocalcaemia (PTHC) within 24 hours was 21.4%. The incidence increased from 6 hours (13.8%) to 24 hours post-thyroidectomy (15.8%) and there was evidence of both transient and delayed PTHC within the first 24 hours. By 6 months post-surgery, 3.6% remained hypocalcaemic and required continued oral supplementation. Patients with benign thyroid disease had a higher risk of PTHC (P = 0.04) and patients younger than 50 years of age had a higher risk of symptomatic hypocalcaemia (P = 0.016). Other clinical factors, including sex, type of surgery, neck dissection, oral calcium and/or vitamin D supplementation and inadvertent histological parathyroid gland excision, were not associated with an increased incidence of PTHC or symptomatic hypocalcaemia. Conclusions Our audit shows that the rate of PTHC within our population was below the national average, with higher risk in benign thyroid disease.
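The audit's comparisons (for example, benign versus malignant disease, P = 0.04) are simple proportion comparisons. As a minimal sketch of how such an analysis could be run, assuming hypothetical counts rather than the audit's raw data and Fisher's exact test as an illustrative choice of test:

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 counts (not the audit's data): hypocalcaemic vs
# normocalcaemic within 24 hours, split by benign and malignant disease.
benign = {"hypocalcaemic": 30, "normocalcaemic": 90}
malignant = {"hypocalcaemic": 12, "normocalcaemic": 64}

incidence_benign = benign["hypocalcaemic"] / sum(benign.values())
incidence_malignant = malignant["hypocalcaemic"] / sum(malignant.values())

table = [[benign["hypocalcaemic"], benign["normocalcaemic"]],
         [malignant["hypocalcaemic"], malignant["normocalcaemic"]]]
odds_ratio, p_value = fisher_exact(table)

print(f"benign: {incidence_benign:.1%}, malignant: {incidence_malignant:.1%}, "
      f"OR = {odds_ratio:.2f}, p = {p_value:.3f}")
```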


2021, pp. neurintsurg-2021-017341
Author(s): Devin V Bageac, Blake S Gershon, Jan Vargas, Maxim Mokin, Zeguang Ren, ...

Background Most conventional 0.088 inch guide catheters cannot safely navigate intracranial vasculature. The objective of this study is to evaluate the safety of stroke thrombectomy using a novel 0.088 inch guide catheter designed for intracranial navigation. Methods This is a multicenter retrospective study, which included patients over 18 years old who underwent thrombectomy for anterior circulation large vessel occlusions. Technical outcomes for patients treated using the TracStar Large Distal Platform (TracStar LDP) or earlier generation TRX LDP were compared with a matched cohort of patients treated with other commonly used guide catheters. The primary outcome measure was device-related complications. Secondary outcome measures included guide catheter failure and time between groin puncture and clot engagement. Results Each study arm included 45 patients. The TracStar group was non-inferior to the control group with regard to device-related complications (6.8% vs 8.9%), and the average time to clot engagement was 8.89 min shorter (14.29 vs 23.18 min; p=0.0017). There were no statistically significant differences with regard to other technical outcomes, including time to recanalization (modified Thrombolysis In Cerebral Infarction (mTICI) ≥2B). The TracStar was successfully advanced into the intracranial internal carotid artery in 33 cases (73.33%); in three cases (6.67%), it was swapped for an alternate catheter. Successful reperfusion (mTICI 2B-3) was achieved in 95.56% of cases. Ninety-day follow-up data were available for 86.67% of patients, among whom 46.15% had a modified Rankin Scale score of 0–2 and 10.26% were deceased. Conclusions TracStar LDP is safe for use during stroke thrombectomy and was associated with decreased time to clot engagement. Intracranial access was regularly achieved.
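The abstract reports a mean difference in time to clot engagement (14.29 vs 23.18 min, p=0.0017) without naming the test used. A minimal sketch of one reasonable analysis choice, a Welch two-sample t-test, run on hypothetical per-patient times rather than the study data:

```python
from scipy.stats import ttest_ind

# Hypothetical per-patient times (minutes) from groin puncture to clot
# engagement; the study reports group means of 14.29 (TracStar) and 23.18 (control).
tracstar_times = [12.0, 15.5, 13.8, 16.0, 14.2, 13.1]
control_times = [22.1, 25.4, 21.0, 24.6, 23.0, 24.2]

# Welch's t-test (unequal variances assumed); the abstract does not state
# which test produced the reported p value.
stat, p = ttest_ind(tracstar_times, control_times, equal_var=False)
mean_diff = sum(control_times) / len(control_times) - sum(tracstar_times) / len(tracstar_times)
print(f"mean difference = {mean_diff:.2f} min, p = {p:.4f}")
```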


BMJ Open, 2021, Vol 11 (6), pp. e047341
Author(s): Caroline Marra, William J Gordon, Ariel Dora Stern

Objectives In an effort to mitigate COVID-19 related challenges for clinical research, the US Food and Drug Administration (FDA) issued new guidance for the conduct of ‘virtual’ clinical trials in late March 2020. This study documents trends in the use of connected digital products (CDPs), tools that enable remote patient monitoring and telehealth consultation, in clinical trials both before and after the onset of the pandemic. Design We applied a comprehensive text search algorithm to clinical trial registry data to identify trials that use CDPs for remote monitoring or telehealth. We compared CDP use in the months before and after the issuance of FDA guidance facilitating virtual clinical trials. Setting All trials registered on ClinicalTrials.gov with start dates from May 2019 through February 2021. Outcome measures The primary outcome measure was the overall percentage of CDP use in clinical trials started in the 10 months prior to the pandemic onset (May 2019–February 2020) compared with the 10 months following (May 2020–February 2021). Secondary outcome measures included CDP usage by trial type (interventional, observational), funder type (industry, non-industry) and diagnoses (COVID-19 or non-COVID-19 participants). Results CDP usage in clinical trials increased by only 1.65 percentage points, from 14.19% (n=23 473) of all trials initiated in the 10 months prior to the pandemic onset to 15.84% (n=26 009) of those started in the 10 months following (p<0.01). The increase occurred primarily in observational studies and non-industry funded trials and was driven entirely by CDP usage in trials for COVID-19. Conclusions These findings suggest that in the short term, new options created by regulatory guidance to stimulate telehealth and remote monitoring were not widely incorporated into clinical research. In the months immediately following the pandemic onset, CDP adoption increased primarily in observational and non-industry funded studies where virtual protocols are likely medically necessary due to the participants’ COVID-19 diagnosis.
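The abstract describes a text search algorithm over registry records and a pre/post comparison of CDP percentages, but it does not list the search terms or the statistical test. The sketch below shows one way such a pipeline could look; the keyword list is hypothetical, and the two-proportion z-test is only an assumed choice, applied to counts implied by the reported percentages.

```python
import re
from statsmodels.stats.proportion import proportions_ztest

# Hypothetical CDP search terms; the paper's actual keyword list is not given
# in the abstract.
CDP_TERMS = [r"\btelehealth\b", r"\btelemedicine\b", r"\bwearable\b",
             r"\bremote (patient )?monitoring\b", r"\bvideo (visit|consultation)\b"]
CDP_PATTERN = re.compile("|".join(CDP_TERMS), flags=re.IGNORECASE)

def uses_cdp(trial_record_text: str) -> bool:
    """Flag a registry record whose free-text fields mention a connected digital product."""
    return bool(CDP_PATTERN.search(trial_record_text))

# Two-proportion z-test on counts implied by the reported rates:
# 14.19% of 23,473 pre-pandemic trials vs 15.84% of 26,009 post-onset trials.
counts = [round(0.1419 * 23473), round(0.1584 * 26009)]
nobs = [23473, 26009]
z_stat, p_value = proportions_ztest(counts, nobs)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")
```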


Author(s): A. Rachid El Mohammad, Sree Koneru, Richard Staelin, Kenneth McLeod, Omar Tabbouche, ...

Abstract Assess treatment superiority of pulsed shortwave therapy (PSWT) against COX-2 NSAID therapy in reducing disability and pain due to cervical osteoarthritis. Two hundred chronic pain sufferers (average pain duration about 2 years) diagnosed with cervical osteoarthritis by radiological imaging were randomized into one of two treatment arms: COX-2 NSAID treatment (etoricoxib 60 mg/day for 4 weeks) or PSWT treatment (device worn 24 h/day for 4 weeks). The primary outcome measure was the 4-week score on the Neck Disability Index (NDI), a 10-question assessment on a 50-point scale. Secondary outcome measures included pain (at rest and during activity) measured on a visual analog scale (VAS) of 0–100 mm, dose count of rescue pain medication (paracetamol) use, and a treatment satisfaction rating. These 4-week scores were compared across the two arms to assess superiority. After 4 weeks of treatment, subjects in both study arms reported statistically significant (p < 0.0001) reductions in NDI (final scores 11.24 NSAID vs 9.34 PSWT), VASrest (30.08 NSAID vs 22.76 PSWT) and VASactivity (36.40 NSAID vs 27.42 PSWT). The absolute reduction from baseline in NDI was significantly greater in the PSWT arm than the NSAID arm (3.66 points; 95% CI 2.30 to 5.02; p < 0.0001). Similarly, the reductions from baseline in VASrest and VASactivity were significantly greater in the PSWT arm than the NSAID arm (10.89 mm; 95% CI 6.90 to 14.87; p < 0.0001; and 12.05 mm; 95% CI 7.76 to 16.33; p < 0.0001, respectively). The PSWT arm used 50% less rescue pain medication. Eleven adverse effects were reported in the NSAID arm and zero in the PSWT arm. Both NSAID and PSWT treatments resulted in statistically significant improvements in quality of life (NDI) and reduction in pain (VAS) resulting from cervical osteoarthritis. However, the PSWT intervention showed superior improvements in all outcome measures when compared to the NSAID arm, with no adverse effects. Clinicaltrials.gov (NCT03542955).
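The superiority claim rests on between-arm differences in the change from baseline, reported with 95% confidence intervals. As a minimal sketch of how such a difference and interval could be computed, assuming hypothetical per-patient NDI reductions (not the trial data) and a Welch-style interval as an illustrative method:

```python
import numpy as np
from scipy import stats

def mean_diff_ci(a, b, alpha=0.05):
    """Difference in means (a - b) with a Welch-style confidence interval."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a.mean() - b.mean()
    se = np.sqrt(a.var(ddof=1) / len(a) + b.var(ddof=1) / len(b))
    # Welch-Satterthwaite degrees of freedom
    df = se**4 / ((a.var(ddof=1) / len(a))**2 / (len(a) - 1)
                  + (b.var(ddof=1) / len(b))**2 / (len(b) - 1))
    t_crit = stats.t.ppf(1 - alpha / 2, df)
    return diff, (diff - t_crit * se, diff + t_crit * se)

# Hypothetical per-patient NDI reductions from baseline (not trial data):
pswt_reduction = [14, 12, 16, 13, 15]
nsaid_reduction = [10, 9, 11, 10, 8]
diff, ci = mean_diff_ci(pswt_reduction, nsaid_reduction)
print(f"difference = {diff:.2f} points, 95% CI {ci[0]:.2f} to {ci[1]:.2f}")
```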


Trials, 2021, Vol 22 (1)
Author(s): Paul Sondo, Marc Christian Tahita, Toussaint Rouamba, Karim Derra, Bérenger Kaboré, ...

Abstract Background Malaria and malnutrition represent major public health concerns worldwide, especially in sub-Saharan Africa. Despite implementation of seasonal malaria chemoprevention (SMC), an intervention aimed at reducing malaria incidence among children aged 3–59 months, the burden of malaria and associated mortality among children below 5 years of age remains high in Burkina Faso. Malnutrition, in particular micronutrient deficiency, appears to be one of the potential factors that can negatively affect the effectiveness of SMC. Treating micronutrient deficiencies is known to reduce the incidence of malaria in zones of high malaria prevalence such as rural settings. Therefore, we hypothesized that a combined strategy of SMC together with a daily oral nutrient supplement will enhance the immune response and decrease the incidence of malaria and malnutrition among children under SMC coverage. Methods Children (6–59 months) under SMC coverage receiving vitamin A supplementation will be randomly assigned to one of the three study arms, (a) SMC + vitamin A alone, (b) SMC + vitamin A + zinc, or (c) SMC + vitamin A + Plumpy’Doz™, using a 1:1:1 allocation ratio. After each SMC monthly distribution, children will be visited at home to confirm drug administration and followed up for 1 year. Anthropometric indicators will be recorded at each visit, and blood samples will be collected for microscopy slides and haemoglobin measurement and spotted onto filter paper for further PCR analyses. The primary outcome measure is the incidence of malaria in each arm. Secondary outcome measures will include mid-upper arm circumference and weight gain from baseline measurements, coverage of and compliance with SMC, occurrence of adverse events (AEs), and prevalence of molecular markers of antimalarial resistance comprising Pfcrt, Pfmdr1, Pfdhfr, and Pfdhps. Discussion This study will demonstrate an integrated strategy for malaria and malnutrition programmes, pooling resources for best impact. By relying on existing strategies, the policy implementation of this joint intervention will be scalable at country and regional levels. Trial registration ClinicalTrials.gov NCT04238845. Registered on 23 January 2020. https://clinicaltrials.gov/ct2/show/NCT04238845
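The protocol specifies a 1:1:1 allocation across the three arms, but the abstract does not describe the randomization mechanics. A minimal sketch of one common approach, permuted-block allocation, with arm labels taken from the abstract; this is an illustrative assumption, not the trial's actual procedure:

```python
import random

ARMS = ["SMC + vitamin A", "SMC + vitamin A + zinc", "SMC + vitamin A + Plumpy'Doz"]

def block_randomize(n_children, block_size=3, seed=2020):
    """Permuted-block allocation giving a 1:1:1 ratio across the three arms.
    block_size should be a multiple of the number of arms."""
    rng = random.Random(seed)
    allocation = []
    while len(allocation) < n_children:
        block = ARMS * (block_size // len(ARMS))
        rng.shuffle(block)
        allocation.extend(block)
    return allocation[:n_children]

# e.g. block_randomize(9) -> nine assignments, three per arm
```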


2021, Vol 11 (1)
Author(s): Manaf AlQahtani, Abdulkarim Abdulrahman, Abdulrahman Almadani, Salman Yousif Alali, Alaa Mahmood Al Zamrooni, ...

Abstract Convalescent plasma (CP) therapy may improve clinical outcome in severe COVID-19. This pilot study was undertaken to inform the feasibility and safety of further definitive studies. This was a prospective, interventional and randomized open-label pilot trial in patients with severe COVID-19. Twenty COVID-19 patients received two 200 ml transfusions of CP over 24 h and were compared with 20 who received standard of care. The primary outcome was the requirement for ventilation (non-invasive or mechanical ventilation). The secondary outcomes were biochemical parameters and mortality at 28 days. The CP group was a higher-risk group, with higher ferritin levels (p < 0.05), though respiratory indices did not differ. The primary outcome measure was required in 6 controls and 4 patients on CP (risk ratio 0.67, 95% CI 0.22–2.0, p = 0.72); mean time on ventilation (NIV or MV) did not differ. There were no differences in secondary measures at the end of the study. Two patients died in the control arm and one patient in the CP arm. There were no significant differences in the primary or secondary outcome measures between CP and standard therapy, although a larger definitive study is needed for confirmation. However, the study did show that CP therapy appears to be safe in hospitalized COVID-19 patients with hypoxia. Clinical trials registration NCT04356534: 22/04/2020.
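The reported risk ratio can be recovered (up to rounding) from the stated counts, 4/20 ventilated in the CP arm versus 6/20 controls, using the standard log-normal Wald interval. The sketch below is only a check of that arithmetic, not the trial's actual analysis code:

```python
import math

def risk_ratio_ci(events_a, n_a, events_b, n_b):
    """Risk ratio (arm A vs arm B) with a 95% log-normal Wald confidence interval."""
    risk_a, risk_b = events_a / n_a, events_b / n_b
    rr = risk_a / risk_b
    se_log = math.sqrt(1 / events_a - 1 / n_a + 1 / events_b - 1 / n_b)
    z = 1.96  # ~97.5th percentile of the standard normal
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, (lo, hi)

# Reported counts: ventilation required in 4/20 CP patients vs 6/20 controls.
rr, (lo, hi) = risk_ratio_ci(4, 20, 6, 20)
print(f"RR = {rr:.2f}, 95% CI {lo:.2f} to {hi:.2f}")  # ~0.67 (0.22 to 2.01)
```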


2021, Vol 09 (04), pp. E646-E652
Author(s): Jan Rückert, Philipp Lenz, Hauke Heinzow, Johannes Wessling, Tobias Warnecke, ...

Abstract Background and study aims Due to demographic transition, neurogenic dysphagia has become an increasingly recognized problem. Patients suffering from dysphagia often get caught between different clinical disciplines. In this study, we implemented a defined examination protocol for evaluating the whole swallowing process by functional endoscopy. Special focus was put on the esophageal phase of swallowing. Patients and methods This prospective observational multidisciplinary study evaluated 31 consecutive patients with suspected neurogenic dysphagia via transnasal access using an ultrathin video endoscope. Thirty-one patients with gastroesophageal reflux symptoms served as a control group. We applied a modified approach, including standardized endoscopic positions, to compare our findings with fiberoptic endoscopic evaluation of swallowing and high-resolution manometry. The primary outcome measure was the feasibility of functional endoscopy. Secondary outcome measures were adverse events (AEs), tolerability, and pathologic endoscopic findings. Results Functional endoscopy was successfully performed in all patients. No AEs were recorded. A variety of disorders were documented by functional endoscopy: incomplete or delayed closure of the upper esophageal sphincter in retroflex view, impaired clearance of the tubular esophagus, esophageal hyperperistalsis, and hypomotility. Analysis of the results obtained with the different diagnostic tools showed some discrepancies. Conclusions Through interdisciplinary cooperation, with additional assessment of the esophageal phase of deglutition using the innovative method of functional endoscopy, the diagnosis of neurogenic disorders including dysphagia may be significantly improved, leading to a better clinical understanding of complex dysfunctional patterns. To the best of our knowledge, this is the first study to show that a retroflex view with an ultrathin video endoscope within the esophagus can be safely performed. [NCT01995929]


2021, Vol 4 (3), pp. 106-114
Author(s): Syed Khadeer, B Jagannath

Rhinitis is inflammation of the nasal mucosa which characteristically presents as a runny nose, blocked nose, nasal itching or sneezing. Allergic rhinitis is more common than non-allergic rhinitis. Antihistamines are the mainstay of treatment for seasonal allergic rhinitis (SAR). Desloratadine, rupatadine and ketotifen are the commonly prescribed antihistamines in our region. In this study, we have compared the efficacy and tolerability of desloratadine, rupatadine and ketotifen in SAR. This was a prospective, randomized, three-arm, open-label comparative study of desloratadine, rupatadine and ketotifen in SAR, conducted at the Department of ENT, Kempegowda Institute of Medical Sciences, Bangalore, between January 2014 and December 2014. The severity of patients' SAR symptoms was assessed by the Total Nasal Symptom Score (TNSS); quality of life (QoL) was measured using the Medical Outcomes Study questionnaire (SF-12). The SF-12 was administered at the start of the study and then at the end of the study. Adverse effects were monitored during clinical examination at each visit. Study subjects were systematically randomized into three groups: desloratadine (DES), rupatadine (RUP) and ketotifen (KET). Based on the assigned group, desloratadine was given orally in a dose of 10 mg OD, rupatadine orally 10 mg OD and ketotifen orally 1 mg BD. All medications were given for 4 weeks. Follow-up was done for all patients every week during the 4-week treatment period. The primary outcome measure was the change in mean TNSS from baseline; secondary outcome measures were changes in the individual nasal symptom scores, change in quality of life and tolerability of the study medications. A total of 150 patients were recruited for this study and divided into 3 groups. DES and RUP were equally effective but significantly better than KET in improving rhinorrhea, nasal congestion, TNSS and absolute eosinophil count (AEC) (p=0.05). All the drugs were equally effective, with no statistically significant intergroup difference, in improving sneezing, nasal itching and QoL. RUP appeared to have better tolerability as the total number of adverse events was marginally lower. DES and RUP are comparatively more effective and faster acting than KET. All the study medications were well tolerated, with a few mild, self-limiting, transient adverse events requiring no intervention.
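The between-group comparisons of symptom-score changes (for example, TNSS improvement in DES and RUP versus KET) could be run in several ways, and the abstract does not name the test. A minimal sketch using a one-way ANOVA across the three arms on hypothetical per-patient TNSS reductions (not the study data):

```python
from scipy.stats import f_oneway

# Hypothetical per-patient reductions in TNSS from baseline (not study data),
# one list per treatment arm.
des_change = [6, 7, 5, 6, 8]
rup_change = [7, 6, 6, 7, 7]
ket_change = [4, 3, 5, 4, 4]

# One-way ANOVA across the three arms; this is only an illustrative analysis choice.
f_stat, p_value = f_oneway(des_change, rup_change, ket_change)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```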


2018, Vol 103 (10), pp. 1395-1400
Author(s): Rashmi G Mathew, Sahar Parvizi, Ian E Murdoch

Aims To compare success proportions at 5 years in three surgical groups: group 1, trabeculectomy alone; group 2, trabeculectomy followed by cataract surgery within 2 years; and group 3, trabeculectomy performed on a pseudophakic eye. Methods A retrospective cohort study. 194 eyes of 194 patients were identified with at least 5 years’ follow-up post trabeculectomy (N=85, 60 and 49 in groups 1, 2 and 3, respectively). Main outcome measures (1) Primary outcome measure: intraocular pressure (IOP) at 5 years post-trabeculectomy surgery; (2) secondary outcome measure: change in visual acuity at 5 years. Results At 5 years, the mean IOP (SD) was 12.9 (3.5), 12.5 (4.8) and 12.7 (4.8) mm Hg in groups 1, 2 and 3, respectively. Overall success was almost identical: 58%, 57% and 59% in groups 1, 2 and 3, respectively. There was no significant difference between the groups in terms of percentage IOP reduction, number of medications, proportion restarting medication and reoperation rates at 5 years. Logistic regression for an outcome of failure showed men to be at increased risk of failure (OR 1.97, 95% CI 1.10 to 3.52, p=0.02). Nearly 80% of patients retained or improved their vision following their initial trabeculectomy. Conclusions The sequence in which surgery is carried out does not appear to affect trabeculectomy function at 5 years, success being similar to trabeculectomy alone. In our study, men may be at increased risk of failure.
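The failure analysis is a logistic regression with sex as a predictor, reported as an odds ratio with a 95% CI. A minimal sketch of that kind of model on hypothetical patient-level data (not the study dataset), using statsmodels:

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical patient-level data (not the study dataset): 1 = male,
# 1 = trabeculectomy failure at 5 years.
male = np.array([1, 1, 0, 0, 1, 0, 1, 0, 1, 0, 1, 0, 0, 1, 1, 0])
failure = np.array([1, 0, 0, 0, 1, 0, 1, 0, 0, 0, 1, 1, 0, 1, 0, 0])

X = sm.add_constant(male)
model = sm.Logit(failure, X).fit(disp=False)
odds_ratio = np.exp(model.params[1])           # OR for male vs female
ci_low, ci_high = np.exp(model.conf_int()[1])  # 95% CI on the odds-ratio scale
print(f"OR = {odds_ratio:.2f}, 95% CI {ci_low:.2f} to {ci_high:.2f}")
```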


Author(s): Ramakant Yadav, S. K. Shukla

Background: Migraine is a common health problem in children and adolescents. This study compares the efficacy and safety of propranolol and topiramate in preventing migraine among children and adolescents. Methods: Seventy-six patients (10-18 years of age) with migraine without aura, defined by the 2004 International Headache Society criteria, were included in a prospective double-blind clinical trial and allocated to receive propranolol (0.5-2 mg/kg per day) or topiramate (1-2 mg/kg per day). The primary outcome measure was a reduction of 50% or more in headache days in comparison to baseline headache frequency per month. Secondary outcome measures were headache-related disability, migraine intensity and duration. Efficacy measures were recorded at baseline and at 12 weeks of prophylactic treatment. Results: In this study a total of 76 patients with a mean age of 12.43 years were evaluated, 40 in the propranolol group and 36 in the topiramate group. At 12 weeks, the percentage of patients who had a relative reduction of 50% or more in the number of headache days was 67.5% in the propranolol group and 75.0% in the topiramate group. The monthly migraine frequency, headache-related disability, intensity and duration were significantly decreased in both the propranolol and topiramate groups when compared to baseline. No significant difference was observed between these two groups in terms of reduction in frequency, headache-related disability, severity and duration of attacks. Fatigue, hypotension and exercise-induced asthma were the main side effects in the propranolol group; weight loss, fatigue, loss of appetite and paresthesias in the topiramate group. Conclusions: Propranolol and topiramate were found to be effective and safe for the prevention of paediatric migraine.
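The primary endpoint is a responder analysis (proportion of patients with a 50% or greater reduction in headache days). A minimal sketch comparing the two responder proportions with a chi-square test, using counts implied by the reported percentages; the abstract does not state which test the authors used.

```python
from scipy.stats import chi2_contingency

# Counts implied by the reported responder rates:
# 67.5% of 40 propranolol patients = 27; 75.0% of 36 topiramate patients = 27.
table = [[27, 40 - 27],   # propranolol: responders, non-responders
         [27, 36 - 27]]   # topiramate: responders, non-responders

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p_value:.3f}")
```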

