• Stein Kirk posted an update 4 hours, 11 minutes ago

    Moreover, there was no significant association with outcomes in the in vivo T-cell-depleted (ie, serotherapy) cohort. This study, which is the largest analysis of donor KIR in the pediatric acute leukemia population, does not support the use of KIR in the selection of URDs for children undergoing T-replete transplantation. © 2020 by The American Society of Hematology.

    OBJECTIVE To rapidly deploy a digital patient-facing self-triage and self-scheduling tool in a large academic health system to address the COVID-19 pandemic. MATERIALS AND METHODS We created a patient portal-based COVID-19 self-triage and self-scheduling tool and made it available to all primary care patients at this large academic health system. Asymptomatic patients were asked about exposure history and were then provided relevant information. Symptomatic patients were triaged into one of four categories (emergent, urgent, non-urgent, or self-care) and then connected with the appropriate level of care via direct scheduling or a telephone hotline. RESULTS This self-triage and self-scheduling tool was designed and implemented in under two weeks. During the first 16 days of use, it was completed 1129 times by 950 unique patients. Of completed sessions, 315 (28%) were by asymptomatic patients and 814 (72%) were by symptomatic patients. Symptomatic patient triage dispositions were as follows: 193 emergent (24%), 193 urgent (24%), 99 non-urgent (12%), and 329 self-care (40%). Sensitivity for detecting emergency-level care was 87.5% (95% CI 61.7-98.5%). DISCUSSION This self-triage and self-scheduling tool has been widely used by patients and is being rapidly expanded to other populations and health systems. The tool has recommended emergency-level care with high sensitivity and decreased triage time for patients with less severe illness. The data suggest it also prevents unnecessary triage messages, phone calls, and in-person visits. CONCLUSION Patient self-triage tools integrated into electronic health record systems have the potential to greatly improve triage efficiency and prevent unnecessary visits during the COVID-19 pandemic. © The Author(s) 2020. Published by Oxford University Press on behalf of the American Medical Informatics Association.
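    A note on the sensitivity figure above: the abstract reports the estimate and its 95% CI but not the underlying counts, so the sketch below uses a hypothetical 14-of-16 split, which is merely one split consistent with the reported 87.5%. It shows, in Python, how a sensitivity estimate with an exact (Clopper-Pearson) confidence interval of this kind is typically computed.

```python
from scipy.stats import beta

def clopper_pearson(k, n, alpha=0.05):
    """Exact (Clopper-Pearson) binomial confidence interval for k successes in n trials."""
    lower = beta.ppf(alpha / 2, k, n - k + 1) if k > 0 else 0.0
    upper = beta.ppf(1 - alpha / 2, k + 1, n - k) if k < n else 1.0
    return lower, upper

# Hypothetical counts (not given in the abstract): 14 of 16 emergency-level
# patients correctly flagged by the tool.
k, n = 14, 16
sensitivity = k / n
lo, hi = clopper_pearson(k, n)
print(f"sensitivity = {sensitivity:.1%}, 95% CI {lo:.1%}-{hi:.1%}")
```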
    For the present study we collected 22 Lactobacillus helveticus strains from different dairy (n = 10) and cereal (n = 12) fermentations to investigate their biodiversity and to uncover habitat-specific traits. Biodiversity was assessed by comparison of genetic fingerprints, low-molecular subproteomes, metabolic and enzymatic activities, growth characteristics, and acidification kinetics in food matrices. A clear distinction between the dairy and cereal strains was observed in almost all examined features, suggesting that the different habitats are home to different Lactobacillus helveticus biotypes that are adapted to the specific environmental conditions. Analysis of the low-molecular subproteome divided the cereal isolates into two clusters, while the dairy isolates formed a separate homogeneous cluster. Differences regarding carbohydrate utilization were observed for lactose, galactose, sucrose, and cellobiose as well as for plant-derived glucosides. Enzymatic differences were observed mainly for β-galactosidase and β-glucosidase activity. Further, growth temperature was optimal in the range from 33°C to 37°C for the cereal strains, whereas the dairy strains showed optimal growth at 40°C. Taken together, the adaptations of the various biotypes result in a growth benefit in the respective environment. Acidification and growth tests using either sterile skim milk or a wheat-flour extract confirmed these results. Differentiation of these biotypes and their physiological characteristics enables knowledge-based starter culture development for cereal versus dairy products within one species. © FEMS 2020.

    BACKGROUND AND AIMS Complete histologic normalisation and reduction of inflammation severity in patients with ulcerative colitis are associated with improved clinical outcomes, but the clinical significance of normalisation of only segments of previously affected bowel is not known. We examined the prevalence, pattern, predictors, and clinical outcomes associated with segmental histologic normalisation in patients with ulcerative colitis. METHODS Medical records of patients with confirmed ulcerative colitis and more than one colonoscopy were reviewed. Segmental histologic normalisation was defined as histologic normalisation of a bowel segment (rectum, left side, or right side) that had previous evidence of chronic histologic injury. We assessed the variables influencing these findings and whether segmental normalisation was associated with improved clinical outcomes. RESULTS Of 646 patients, 32% had segmental and 10% complete histologic normalisation when compared with their maximal disease extent. Most (88%) had segmental normalisation in a proximal-to-distal direction; others had distal-to-proximal or patchy normalisation. On multivariate analysis, only current smoking (p=0.040) and age at diagnosis ≤16 years (p=0.028) predicted segmental histologic normalisation. Of the 310 patients who were in clinical remission at the initial colonoscopy, 77 (25%) experienced clinical relapse after a median of 1.3 (range 0.06-7.52) years. Only complete histologic normalisation of the bowel was associated with improved relapse-free survival (HR 0.23; 95% CI 0.08-0.68; p=0.008). CONCLUSIONS Segmental histologic normalisation occurs in 32% of patients with ulcerative colitis and is more common in current smokers and in those diagnosed at a younger age. Unlike complete histologic normalisation, segmental normalisation does not signal improved clinical outcomes. © The Author(s) 2020. Published by Oxford University Press on behalf of European Crohn’s and Colitis Organisation.
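    The relapse-free survival result above (HR 0.23 for complete histologic normalisation) is the kind of estimate usually obtained from a Cox proportional hazards model; the abstract does not describe the modelling in detail. Below is a minimal sketch of such an analysis using the lifelines library and entirely synthetic data; the column names and values are invented for illustration and are not from the study.

```python
import pandas as pd
from lifelines import CoxPHFitter

# Synthetic illustration: time to clinical relapse (years), an event
# indicator, and a binary flag for complete histologic normalisation.
df = pd.DataFrame({
    "years_to_relapse": [0.5, 0.8, 1.3, 1.9, 3.0, 0.9, 2.5, 4.0, 5.1, 6.0],
    "relapsed":         [1,   1,   1,   0,   1,   1,   0,   0,   0,   0],
    "complete_histologic_normalisation": [0, 0, 0, 0, 0, 1, 1, 1, 1, 1],
})

cph = CoxPHFitter()
cph.fit(df, duration_col="years_to_relapse", event_col="relapsed")
cph.print_summary()  # exp(coef) for the normalisation flag is the hazard ratio
```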
    BACKGROUND Poor sense of smell in older adults may lead to weight loss, which may further contribute to various adverse health outcomes. However, empirical prospective evidence is lacking. We aimed to longitudinally assess whether poor olfaction is associated with changes in body composition among older adults. METHODS 2390 participants from the Health ABC Study had their olfaction assessed using the Brief Smell Identification Test in 1999-2000. Based on the test score, olfaction was defined as poor (0-8), moderate (9-10), or good (11-12). Total body mass, lean mass, and fat mass were measured by dual-energy X-ray absorptiometry annually or biennially from 1999 to 2007. RESULTS At baseline, compared to participants with good olfaction, those with poor olfaction weighed on average 1.67 kg less (95% CI -2.92, -0.42) in total mass, 0.53 kg less (95% CI -1.08, 0.02) in lean mass, and 1.14 kg less (95% CI -1.96, -0.31) in fat mass. In longitudinal analyses, compared to participants with good olfaction, those with poor olfaction had a greater annual decline in both total mass (-234 g, 95% CI -442, -26) and lean mass (-139 g, 95% CI -236, -43).
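    The olfaction categories in the study above map directly onto the Brief Smell Identification Test score (0-12). The small sketch below encodes only the cut-points stated in the abstract; the function name is ours.

```python
def olfaction_category(bsit_score: int) -> str:
    """Map a Brief Smell Identification Test score (0-12) to the categories
    used in the Health ABC analysis: poor (0-8), moderate (9-10), good (11-12)."""
    if not 0 <= bsit_score <= 12:
        raise ValueError("B-SIT scores range from 0 to 12")
    if bsit_score <= 8:
        return "poor"
    if bsit_score <= 10:
        return "moderate"
    return "good"

print(olfaction_category(7))   # poor
print(olfaction_category(11))  # good
```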