Objective To apply a process-based communication model to health education for elderly patients undergoing video electroencephalography (VEEG) monitoring, and to explore the impact of this model on the monitoring success rate, negative emotions, nursing satisfaction, and active cooperation rate of such patients. Methods From September 2017 to September 2019, 118 patients with suspected epilepsy, encephalitis, or other diseases who required VEEG monitoring at Suining Central Hospital were selected for this study (aged 61 to 73 years; 54 males and 64 females). Patients were divided into two groups of 59 using a random number table: group A received routine nursing, and group B received health education based on the process communication model. The monitoring success rate, negative emotions, active cooperation rate, and nursing satisfaction were compared between the two groups. Results The total effective rate in group B was 86.44%, significantly higher than the 76.27% in group A (P<0.05). After the nursing intervention, anxiety and depression scores decreased significantly in both groups, with a greater decline in group B (P<0.05). The active cooperation rate and nursing satisfaction of group B were significantly higher than those of group A (P<0.05). Conclusion Compared with conventional nursing, health education based on the process communication model can significantly improve the success rate of VEEG monitoring in elderly patients, alleviate negative emotions, and improve the active cooperation rate and nursing satisfaction.
Monitoring of pregnant women is very important: it plays a key role in reducing fetal mortality, ensuring the perinatal safety of mother and fetus, and preventing premature delivery and pregnancy accidents. At present, regular in-hospital examination is the mainstream method of monitoring pregnant women, but out-of-hospital examination means are scarce, and hospital monitoring equipment is expensive and complex to operate. Intelligent information technology (such as machine learning algorithms) can analyze the physiological signals of pregnant women to realize early detection and accident warning for mother and fetus, achieving high-quality out-of-hospital monitoring. However, there are currently few published research reports on intelligent processing methods for out-of-hospital monitoring of pregnant women. Taking out-of-hospital monitoring of pregnant women as the research background, this paper therefore summarizes the published research on intelligent processing methods, analyzes the advantages and disadvantages of existing methods, points out possible problems, and describes future development trends, providing a reference for future related research.
Exercise intervention is an important non-pharmacological intervention for various diseases, and establishing precise exercise load assessment techniques can improve the quality of exercise intervention and the efficiency of disease prevention and control. Based on data collection from wearable devices, this study conducts nonlinear optimization and empirical verification of the original "Fitness-Fatigue Model". By constructing a time-varying attenuation function and specific coefficients, this study develops an optimized mathematical model that reflects the nonlinear characteristics of training responses. Thirteen participants underwent 12 weeks of moderate-intensity continuous cycling, three times per week. For each training session, external load (actual work done) and internal load (heart rate variability index) data were collected for each individual to conduct a performance comparison between the optimized model and the original model. The results show that the optimized model demonstrates a significantly improved overall goodness of fit and superior predictive ability. In summary, the findings of this study can support dynamic adjustments to participants' training programs and aid in the prevention and control of chronic diseases.
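The original "Fitness-Fatigue Model" that the study optimizes is classically expressed as baseline performance plus an exponentially decaying fitness term minus an exponentially decaying fatigue term, each driven by the daily training load. The sketch below is a minimal illustration of that classical form only; the parameter values are illustrative assumptions, not the study's fitted coefficients, and the study's time-varying attenuation function is not reproduced here.

```python
import math

def banister_ffm(loads, p0, k1, k2, tau1, tau2):
    """Classical Fitness-Fatigue (Banister) model: predicted performance on
    day t is baseline p0 plus accumulated fitness minus accumulated fatigue,
    where each past load decays exponentially with its own time constant."""
    preds = []
    for t in range(1, len(loads) + 1):
        # Load on day s (s < t) contributes exp(-(t - s) / tau) on day t.
        fitness = sum(loads[i] * math.exp(-(t - (i + 1)) / tau1) for i in range(t - 1))
        fatigue = sum(loads[i] * math.exp(-(t - (i + 1)) / tau2) for i in range(t - 1))
        preds.append(p0 + k1 * fitness - k2 * fatigue)
    return preds

# Illustrative run: constant daily load of 100 arbitrary units for 28 days.
# With k2 > k1 and tau2 < tau1, performance first dips (fatigue dominates)
# and then rises as fatigue saturates faster than fitness.
pred = banister_ffm([100.0] * 28, p0=500.0, k1=1.0, k2=2.0, tau1=42.0, tau2=7.0)
```

In practice the five parameters would be estimated per individual by nonlinear least squares against measured performance, which is the kind of fitting the study's optimized model improves upon.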
Objective This guideline is formulated to standardize the principles, procedures, and methods of developing therapeutic drug monitoring (TDM) guidelines and to promote the openness, transparency, scientific rigor, and credibility of TDM guidelines. Methods The Division of Therapeutic Drug Monitoring, Chinese Pharmacological Society, established guideline working groups and declared and managed conflicts of interest. The working groups used the Delphi method to formulate the purpose and scope of the guideline and questions in PICO format, searched and synthesized evidence, and integrated the actual situation in China with the characteristics of TDM to preliminarily develop recommendations for a guideline for TDM guideline development in China. Through internal discussion within the working groups and external peer review, the content was improved, and we eventually formulated a guideline suitable for guiding the development of TDM-related guidelines. Results The guideline provides suggestions for problems to be identified and solved in the planning, development, publishing, and updating stages of TDM guidelines, including forming guideline working groups, planning guidelines, declaring and managing interests, formulating questions and selecting outcomes, preparing the planning proposal, retrieving and synthesizing evidence, evaluating evidence, developing recommendations, drafting guidelines, external review, publishing and disseminating guidelines, evaluating guidelines, and updating guidelines. Conclusions This guideline can provide methodological guidance and reference for the development of TDM guidelines.
A hand-held electrocardiogram (ECG) monitor with capacitive coupling is designed in this study that can rapidly detect ECG signals through clothing. The new device remedies many deficiencies of traditional ECG monitors, such as the risk of infection from direct skin contact, inconvenience, and time-consuming operation. Specifically, the hand-held ECG monitor consists of two parts: a sensor and an embedded terminal. ECG signals are first detected by a sensing electrode placed on the chest through clothing, then processed by single-ended differential amplification, filtering, and main amplification, and subsequently passed through A/D conversion and transmitted by a CC2540 module. After digital filtering and data processing of the received ECG signal on the embedded terminal, the waveform and heart rate are displayed on the screen. Results confirm that the newly developed hand-held ECG monitor is capable of detecting real-time ECG signals through clothing, with the advantages of simple operation, portability, and rapid detection.
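The abstract does not specify which digital filter runs on the embedded terminal; a causal moving-average low-pass filter is one common, lightweight choice for suppressing high-frequency noise in a sampled ECG signal. A minimal sketch, assuming uniformly sampled data (the window length below is an illustrative assumption, not a value from the study):

```python
def moving_average(signal, window):
    """Causal moving-average low-pass filter: each output sample is the
    mean of the current sample and up to (window - 1) preceding samples,
    so the output has the same length as the input."""
    out = []
    for i in range(len(signal)):
        start = max(0, i - window + 1)
        segment = signal[start:i + 1]
        out.append(sum(segment) / len(segment))
    return out

# Example: a single noise spike is spread out and attenuated.
smoothed = moving_average([0, 0, 3, 0, 0], 3)  # [0.0, 0.0, 1.0, 1.0, 1.0]
```

On a real device this would typically be implemented as a fixed-point running sum for efficiency, but the averaging behavior is the same.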
Objective To understand the characteristics of and risk factors for nosocomial infection in a newly built branch of a university teaching hospital, in order to investigate measures for the prevention and control of nosocomial infection. Methods A total of 598 cases of nosocomial infection from April 2012 to June 2014 were enrolled in this study. We statistically analyzed indexes such as nosocomial infection rate, infection site, pathogen detection, and use of antibiotics. Meanwhile, an infection point-prevalence survey was conducted by means of medical record review and bedside visits. Results Among all 44 085 patients discharged between April 2012 and June 2014, there were 598 cases of nosocomial infection, an infection rate of 1.36%. Departments with a high nosocomial infection rate included the Intensive Care Unit (ICU) (9.79%), Department of Orthopedics (2.98%), Department of Geriatrics (2.62%), and Department of Hematology (1.64%). The top four nosocomial infection sites were the lower respiratory tract (45.32%), urinary tract (13.21%), operative incision (8.86%), and bloodstream (8.86%). Samples from 570 nosocomial infections were submitted for examination, a sample submission rate of 95.32%. The most common pathogens were Acinetobacter baumannii (17.02%), Klebsiella pneumoniae (14.21%), Escherichia coli (13.68%), Pseudomonas aeruginosa (11.93%), and Staphylococcus aureus (9.12%). Urinary tract intubation (42.81%), ICU admission (28.60%), and use of corticosteroids and immunosuppressive agents (26.42%) were the top three independent risk factors for nosocomial infection. Conclusion General and comprehensive monitoring is an effective method for the hospital to detect departments, factors, and patients at high risk of nosocomial infection, providing a theoretical basis for the prevention and control of nosocomial infection.
Objective To compare home blood pressure monitoring (HBPM), ambulatory blood pressure monitoring (ABPM), and office blood pressure monitoring (OBPM) in the diagnosis and management of hypertension, and to identify the optimal approach to blood pressure measurement and management. Methods The three BP monitoring methods were compared on cost-effectiveness, prognostic value for target organ damage (TOD), predictive value for the progression of chronic kidney disease (CKD), and blood pressure variability (BPV). Results Compared with OBPM, ABPM was the most cost-effective method in the primary diagnosis of hypertension, while HBPM was the optimal method for long-term management and self-management of hypertension. In hypertensive patients, HBPM and ABPM, especially HBPM, had a stronger predictive value than OBPM for cardiovascular events, stroke, end-stage renal disease (ESRD), and all-cause mortality. In hypertensive patients with renal dysfunction, blood pressure control guided by HBPM and ABPM, especially ABPM, was an effective way to slow the progression of renal dysfunction, decrease cardiovascular events, and decrease the need for dialysis. BPV derived from OBPM, ABPM, and HBPM all had predictive significance for cardiovascular events, and HBPM-derived BPV performed best. Conclusion Compared with OBPM, ABPM is the best method for the primary diagnosis of hypertension and for BP control in CKD patients, while HBPM is the best method for predicting and evaluating BPV, as well as for long-term management and self-management of hypertension.
Purpose To analyze the effect of medication withdrawal (MW) on long-term electroencephalogram (EEG) monitoring in children requiring presurgical assessment for refractory epilepsy. Methods A retrospective analysis was performed on the preoperative long-term EEG monitoring data of children with refractory epilepsy who required presurgical evaluation at the Pediatric Epilepsy Center of Peking University First Hospital from August 2018 to December 2019. Monitoring continued until at least three habitual seizures were detected, or for up to 10 days. The MW protocol followed an established plan. Results A total of 576 children (median age 4.4 years) required presurgical ictal EEGs, of whom 75 (75/576, 13.0%) needed MW to obtain ictal EEGs. Among the 75 cases, 38 were male and 37 were female, with ages ranging from 15 months to 17 years (median 7.0 years). The EEG and clinical data of the 65 children who strictly followed the MW protocol were analyzed. The total monitoring duration ranged from 44.1 h (about 2 days) to 241.8 h (about 10 days), with a median of 118.9 h (about 5 days). Interictal EEG features before MW included focal interictal epileptiform discharges (IED) in 39 cases (39/65, 60%), focal and generalized IED in 2 cases (2/65, 3.1%), multifocal IED in 20 cases (20/65, 30.7%), multifocal and generalized IED in 2 cases (2/65, 3.1%), and no IED in 2 cases (2/65, 3.1%). After MW, IED was unchanged in 18 cases (18/65, 27.7%) and changed in the other 47 cases: IED was aggravated in 46 cases (46/65, 70.8%) and decreased in 1 case. The pattern of aggravation was an increase in the original IED in 41 cases (41/46, 89.1%), while 5 cases (5/46, 10.9%) showed generalized IED not detected before MW. Of the 46 patients with IED exacerbation, 87.3% developed it within 3 days after MW. Habitual seizures were detected in 56 cases (86.2%, 56/65) after MW, within 3 days of MW in 80.4% of cases.
Eight patients (14.3%) had secondary bilateral tonic-clonic seizures (BTCS), of whom only 1 had no BTCS among his habitual seizures. Of the 56 cases, 94.6% (53/56) had seizures after withdrawal of two kinds of antiepileptic drugs (AEDs). Conclusions ① In this group, 13% of children with intractable epilepsy needed MW to obtain ictal EEGs. ② Most of them (86.2%) obtained ictal EEGs through MW, and the IED and ictal EEG after MW remained helpful for localizing the epileptogenic zone. ③ Most patients obtained ictal EEGs within 3 days after MW or after withdrawal of two kinds of AEDs. ④ New secondary generalization was extremely rare.
Objective To observe and analyze the correlation of time in target glucose range (TIR) and hemoglobin A1c (HbA1c) with the risk of diabetic retinopathy (DR). Methods A retrospective clinical study. From March 2020 to August 2021, 91 patients with type 2 diabetes mellitus (T2DM) hospitalized in the Department of Endocrinology and Metabolic Diseases, Affiliated Hospital of Weifang Medical University, were included in the study. All patients underwent Oburg non-dilated ultra-widefield laser scanning ophthalmoscopy, HbA1c testing, and continuous glucose monitoring (CGM). According to the examination results, combined with the clinical diagnostic criteria for DR, the patients were divided into a non-DR (NDR) group and a DR group, with 50 and 41 cases respectively. The retrospective CGM system was used to monitor subcutaneous interstitial fluid glucose for 7 to 14 consecutive days, and TIR was calculated. Binary logistic regression was used to analyze the correlation of TIR and HbA1c with DR in patients with T2DM. A new indicator, the predicted probability value (PRE_1), was generated to represent the combination of TIR and HbA1c in predicting the occurrence of DR. Receiver operating characteristic (ROC) curves were used to analyze the value of TIR, HbA1c, and PRE_1 in predicting the occurrence of DR. Results The TIR of the NDR group and DR group was (81.58±15.51)% and (67.27±22.09)%, respectively, and HbA1c was (8.03±2.16)% and (9.01±2.01)%, respectively. The differences in TIR and HbA1c between the two groups were statistically significant (t=3.501, -2.208; P=0.001, 0.030). Binary logistic regression analysis showed that TIR and HbA1c were significantly correlated with DR (odds ratio=0.960, 1.254; P=0.002, 0.036).
ROC curve analysis showed that the areas under the ROC curve (AUC) for TIR, HbA1c, and PRE_1 in predicting the risk of DR were 0.704, 0.668, and 0.707, respectively [95% confidence interval (CI) 0.597-0.812, P=0.001; 95%CI 0.558-0.778, P=0.006; 95%CI 0.602-0.798, P=0.001]. There was no statistically significant difference among the AUCs of TIR, HbA1c, and PRE_1 in predicting DR risk (P>0.05). The linear equation between HbA1c and TIR was HbA1c (%) = 11.37-0.04×TIR (%). Conclusions TIR and HbA1c are both related to DR and can predict the risk of DR; their combined use does not improve the predictive value for DR. There is a linear correlation between TIR and HbA1c.
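AUC values like those reported above can be computed from predictor values and DR labels via the Mann-Whitney formulation of the ROC area. The sketch below uses made-up illustrative scores, not the study's data; note that because lower TIR indicates higher DR risk, TIR would be negated (or otherwise inverted) before being used as a risk score. The reported linear relation between the two predictors is also encoded for reference.

```python
def roc_auc(scores_pos, scores_neg):
    """AUC as the Mann-Whitney probability that a randomly chosen positive
    (DR) case scores higher than a randomly chosen negative (NDR) case;
    ties count as 0.5."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))

def hba1c_from_tir(tir_percent):
    """Linear relation reported in the abstract: HbA1c (%) = 11.37 - 0.04 x TIR (%)."""
    return 11.37 - 0.04 * tir_percent

# Illustrative only: negated TIR as risk score, DR cases vs NDR cases.
auc = roc_auc([-60.0, -65.0, -70.0], [-80.0, -85.0, -62.0])
```

With perfectly separated scores the function returns 1.0; overlapping distributions, as in the illustrative call above, yield an intermediate value comparable in spirit to the 0.70-range AUCs reported.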
Objective To compare the environmental microbiological and physical monitoring parameters between a temporary extended medical area and the normal area during flexible ward allocation, summarize the patterns, and identify potential infection control risk points. Methods From April 10 to 23, 2023, prospective environmental microbial monitoring and physical parameter monitoring were carried out in a ward of Zhongnan Hospital of Wuhan University, and the monitoring results under different scenarios were compared and analyzed. Results Overall, the carbon dioxide (CO2) concentration, particulate matter 2.5 (PM2.5) concentration, temperature, and relative humidity in the temporary medical area were better than those in the inpatient rooms (P<0.05), but there was no statistically significant difference in the amount of microorganisms detected on the surfaces of environmental objects or on the hands of medical staff (P>0.05). After the temporary medical area was opened, the amount of microorganisms detected on environmental surfaces, the CO2 concentration, and the temperature in the inpatient rooms were higher than in the temporary medical area (P<0.05); the PM2.5 concentration in the inpatient rooms was lower than in the temporary medical area (P<0.05); and there was no statistically significant difference between the two areas in the amount of microorganisms detected on the hands of medical staff or in relative humidity (P>0.05).
Compared with the same area before the temporary medical area was opened, in the inpatient rooms after opening, the amount of microorganisms detected in the air, the CO2 concentration, the temperature, and the relative humidity were lower (P<0.05), the amount of microorganisms detected on environmental surfaces and the PM2.5 concentration were higher (P<0.05), and there was no statistically significant difference in the amount of microorganisms detected on the hands of medical staff between the two periods (P>0.05). In the temporary medical area after opening, the PM2.5 concentration was higher (P<0.05), the CO2 concentration and temperature were lower (P<0.05), and the differences between the two periods in relative humidity and in the amounts of microorganisms detected on environmental surfaces and the hands of medical staff were not statistically significant (P>0.05). Regardless of whether the temporary medical area was activated, filamentous fungi had the highest detection rate in air samples, and Staphylococcus epidermidis had the highest detection rate in both environmental surface samples and medical staff hand samples. Conclusion Attention should be paid to environmental risks arising from temporary medical areas, such as elevated environmental microbial load and poor ventilation.