The effectiveness of E-learning in continuing medical education for tuberculosis health workers: a quasi-experiment from China

Abstract

Background

Given the context of rapid technological change and the COVID-19 pandemic, E-learning may provide a unique opportunity for addressing the challenges of traditional face-to-face continuing medical education (CME). However, the effectiveness of E-learning in CME interventions remains unclear. This study aims to evaluate whether an E-learning training program can improve TB health personnel’s knowledge and behaviour in China.

Methods

This study used a convergent mixed methods research design to evaluate the impact of E-learning programs for tuberculosis (TB) health workers, in terms of knowledge improvement and behaviour change, during the China-Gates TB Project (May 2017–June 2019). Quantitative data were collected through staff surveys (baseline n = 555; final n = 757) and management information systems to measure demographic characteristics, training participation, and TB knowledge. Difference-in-difference (DID) and multiple linear regression models were employed to estimate the effect on knowledge improvement. Qualitative data were collected through interviews (n = 30) and focus group discussions (n = 44) with managers, teachers, and learners to explore their learning experience.

Results

Synchronous E-learning improved the knowledge of TB clinicians (average treatment effect, ATE: 7.3 points out of 100, P = 0.026). Asynchronous E-learning had a significant impact on knowledge among primary care workers (ATE: 10.9/100, P < 0.001), but not among clinicians or public health physicians. Traditional face-to-face training had no significant impact on any group of medical staff. Most learners (57.3%) agreed that they could apply what they learned to their practice. Qualitative data revealed that high-quality content is the key facilitator of behaviour change, while content difficulty, low relevance, and hardware constraints are the key barriers.

Conclusions

The effectiveness of E-learning in CME varies across training formats, organizational environments, and target audiences. Although clinicians and primary care workers improved their knowledge through E-learning activities, public health physicians did not benefit from the interventions.

Background

Continuing medical education (CME) is an “established method that can facilitate lifelong learning, which focuses on maintaining or developing knowledge, skills, and relationships to ensure competent practice” [1]. Given the context of rapid technological advancement in medicine, it has been proposed that CME may play a growing role in updating physicians’ knowledge and skills and improving the quality of care. Since the 1980s, considerable resources have been invested in developing various CME programs in China and globally [2, 3]. However, the effectiveness of CME interventions remains unclear [4,5,6,7,8]. Two systematic reviews have shown that traditional didactic sessions have little or no impact, while interactive or mixed workshops were more likely to be associated with positive effects [1, 6]. In addition, clinicians are often too busy with their daily practice to attend training. A lack of high-quality learning resources and unattractive, irrelevant training content were also key barriers to their learning [9]. Therefore, to improve the performance of physicians and the health system, it is important to find a CME training format that increases access to high-quality training resources while accommodating health workers’ needs in terms of content, timing, and location.

E-learning provides a unique opportunity for addressing these challenges in CME. Due to the COVID-19 pandemic, 2020 saw an unprecedented expansion of online and remote training [10, 11]. Existing evidence indicates that E-learning can reduce costs [12, 13], improve access to education [14], and provide more flexibility for students who have work and family commitments [15, 16]. However, whether E-learning can improve student outcomes remains controversial [16,17,18]. In addition, most of the available evidence comes from studies in higher education [19,20,21]. Little E-learning research has been conducted in CME settings [20, 22, 23], especially in low- and middle-income countries (LMICs) [24,25,26].

Despite the progress it has made in TB control, China still had the world’s third largest TB epidemic in 2019, with 833,000 new TB cases [28]. More than 20% of relapse TB patients who were previously treated in China have multi-drug-resistant TB (MDR-TB), likely due to poor earlier treatment [28]. To address this issue, the China-Gates Foundation TB Control Program (phase three) has introduced and expanded a new comprehensive model of TB control in China since 2017. E-learning for TB health workers was an integral part of that program. In the past two decades, a growing number of E-learning tools focused on TB training have become available, such as the Structured Operational Research and Training (SORT IT) course developed by the International Union Against Tuberculosis and Lung Disease and Médecins Sans Frontières (MSF) [29, 30]. However, few studies have explored the effectiveness of E-learning tools for health care providers. This study aims to evaluate whether E-learning can improve TB health personnel’s knowledge and behaviour, in order to provide policy recommendations for improving the utilization of E-learning in CME.

Methods

Intervention design

This E-learning subproject was implemented from May 2017 to June 2019 in the three project provinces (Zhejiang represents the most developed eastern area of China, Jilin the less developed central area, and Ningxia the least developed western area). In each province, we selected two cities as pilot areas for the E-learning project according to their level of socioeconomic development and TB health service capacity [number of TB health workers, gross domestic product (GDP) per capita, and information technology development].

Two key interventions were designed in the E-learning project [31]. First, the "National TB Telemedicine Consultation and Training Platform" is a live, synchronous training platform developed by the Clinical Centre on Tuberculosis (hereinafter "synchronous training"). It focused on complex clinical conditions for TB clinical staff at the county level and above, and provides multiple formats of training sessions, including lectures, case studies, and online meetings. Second, the “China TB Prevention Online Training Website” is an asynchronous training and qualification system (hereinafter "asynchronous training") developed for all TB health workers, including clinical doctors, public health physicians, and primary care staff. The online system focused on basic theory and routine treatment according to the latest version of the national TB treatment and control guideline. The website’s sessions are delivered as recorded videos, giving health personnel greater flexibility in time and location. Both synchronous and asynchronous training sessions were delivered by national-level TB experts [32].

Evaluation design

This evaluation was a convergent mixed methods study. Following the conventional “Input-Process-Output” framework, we selected representative study sites at multiple levels (provincial, city, county, township, and village). Before the intervention, in both the intervention group (pilot areas) and the control group (non-pilot areas), we selected one city in each province as a study site according to its level of socioeconomic development (for example, GDP per capita and type of TB health service delivery model). Based on similar criteria, two counties from each city and three towns from each county (including at least one remote or mountainous town) were also selected. We conducted a baseline survey before the intervention to capture the baseline knowledge level of the TB health workforce. During the intervention (May 2017–June 2019), we monitored the training process through monthly reports from the IT system (quantitative data) and interviews with organizers, lecturers, and participants (qualitative data). These data helped us open the black box of the mechanisms of E-learning. After the intervention, we re-examined the knowledge level of the TB health workforce and employed the difference-in-difference (DID) method to estimate the effectiveness of the E-learning interventions (Fig. 1).

Fig. 1 Evaluation design

We conducted three waves of field trips (baseline, process, and final evaluation) in January 2017, July–August 2018, and July–August 2019, respectively. Two types of data were collected for analysis. First, we performed two waves of questionnaire surveys among TB health workers, at baseline (2017, pre-training) and at the final evaluation (2019, post-training), which provided quantitative data on demographic information, training needs, participants' reactions to the training program, and a 10-question TB quiz. The TB quizzes were prepared by national experts from the National Clinical Centre on Tuberculosis and the Chinese Centre for Disease Control and Prevention (China CDC). Three different quizzes were used for clinicians, public health physicians, and primary care workers. The questionnaires also included several questions on behaviour change, asking participants whether they could put what they had learned into daily practice. All TB health workers on duty on the day of fieldwork were invited to fill in the questionnaire. In total, 555 TB-related health workers completed the baseline questionnaire survey and 757 completed the final questionnaire survey (Table 1, Additional file 1).

Table 1 Sample size and general information of TB health worker survey

Qualitative data were collected to explore the impacts on behaviour change. With the help of coordinators, participants were recruited for key informant interviews and focus group discussions (FGDs) at each level using purposive sampling. Key informant interviews were conducted with program officers at the national and provincial levels and with trainers and trainees of the E-learning activities. Depending on the size of the working team in each designated hospital and CDC, FGDs were convened with 6–8 TB-related doctors (clinician group), 2–3 public health physicians (public health physician group), or 1–3 primary care workers (primary care worker group). The topic guides covered participants' motivation to participate, the impact of E-learning on their knowledge and behaviours, and their learning or teaching experience (from both trainers and trainees). The interviews and FGDs were conducted in a quiet meeting room or office with no unrelated persons present. A senior researcher conducted the interviews and FGDs as interviewer or facilitator, with a junior researcher as observer and notetaker. The sessions were recorded after participants signed the consent forms. In total, we conducted 30 key informant interviews and 44 FGDs (Additional file 2).

Data analysis

For both quantitative and qualitative data, analyses were conducted along two dimensions. The first was the impact of E-learning on physicians’ knowledge of TB. Knowledge scores were compared before and after the training activities among TB health workers. In addition, we used two identification strategies to capture the association between knowledge improvement and the E-learning interventions. We employed the following DID model to control for unobserved time-invariant fixed effects and common time-varying trends (formula 1). Difference-in-difference is a widely used identification strategy in policy analysis and health services research [33,34,35,36]. We treat the training intervention as a quasi-experiment. Our sample is divided into four groups: the control group before the intervention, the control group after the intervention, the treatment group before the intervention, and the treatment group after the intervention.

$$Y={\beta }_{0}+{\beta }_{1}t+{\beta }_{2}Training*t+{\beta }_{3}X+{\alpha }_{i}+\varepsilon$$
(1)

The subscript i indicates the medical institution (for clinical and public health physicians) or township (for primary care workers). The dependent variable Y represents the TB knowledge score of TB health workers. \({\alpha }_{i}\) is a series of institution-level fixed effects (for clinical and public health physicians) or township-level fixed effects (for primary care workers) that controls for unobserved time-invariant heterogeneity across institutions or towns. ε is the error term. X is a set of covariates including the health worker’s gender, age, permanent post, professional title, education level, medical discipline, length of service, and monthly income. The key variable of interest is the dummy variable Training, which equals 1 for physicians in pilot areas after the implementation of the training program and 0 otherwise. By pooling independent cross-sectional data across the two years (one before and one after the training), we could estimate the effect of training with the following DID estimator \({\beta }_{2}\) (formula 2):

$$\widehat{{\beta }_{2}}=\left(\overline{{Y }_{t=1,treatment}}-\overline{{Y }_{t=0,treatment}}\right)-\left(\overline{{Y }_{t=1,control}}-\overline{{Y }_{t=0,control}}\right)$$
(2)
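The DID estimator in formula 2 is simply a difference of group-mean changes. A minimal sketch in Python makes the computation concrete; all scores below are hypothetical illustrations, not the study's data:

```python
from statistics import mean

def did_estimate(scores):
    """Difference-in-difference estimator (formula 2):
    (post - pre) change in the treatment group minus
    (post - pre) change in the control group."""
    treat_change = mean(scores[("treatment", 1)]) - mean(scores[("treatment", 0)])
    control_change = mean(scores[("control", 1)]) - mean(scores[("control", 0)])
    return treat_change - control_change

# Hypothetical quiz scores (out of 100) for the four groups
scores = {
    ("treatment", 0): [60, 65, 70],  # pilot areas, baseline (t = 0)
    ("treatment", 1): [75, 80, 85],  # pilot areas, final (t = 1)
    ("control", 0):   [62, 64, 66],  # non-pilot areas, baseline
    ("control", 1):   [66, 68, 70],  # non-pilot areas, final
}
print(did_estimate(scores))  # → 11.0 (15-point gain minus 4-point gain)
```

In the paper's regression version (formula 1), the same quantity is recovered as the coefficient on the Training*t interaction, with the institution or township fixed effects and covariates X absorbing baseline differences.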

Multiple linear regression was also conducted to determine which training format was most effective for knowledge improvement (formula 3) and to quantify the dose–response relationship between training participation and the accumulation of TB knowledge (formula 4). The subscripts f, s, and a indicate the three formats of training: face-to-face training, synchronous E-learning, and asynchronous E-learning. There are two variables of interest: Training is a dummy variable that equals 1 if the physician participated in a specific type of training, and \(T\_times\) is a count variable for the number of sessions of that type the physician attended. In other words, if \({\beta }_{2}\), \({\beta }_{3}\), or \({\beta }_{4}\) is statistically significant in the following model, we can infer that the training intervention may have a positive or negative impact on physicians’ knowledge.

$$Y={\beta }_{0}+{\beta }_{1}t+{\beta }_{2}{Training}_{f}+{\beta }_{3}{Training}_{s}+{\beta }_{4}{Training}_{a}+{\beta }_{5}X+{\alpha }_{i}+\varepsilon$$
(3)
$$Y={\beta }_{0}+{\beta }_{1}t+{\beta }_{2}{T\_times}_{f}+{\beta }_{3}{T\_times}_{s}+{\beta }_{4}{T\_times}_{a}+{\beta }_{5}X+{\alpha }_{i}+\varepsilon$$
(4)
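The two specifications differ only in how participation enters the model: formula 3 uses a per-format dummy, formula 4 a per-format session count. A short sketch of how both sets of regressors can be derived from the same participation data; the record layout and field names are hypothetical, not the project IT system's actual schema:

```python
# Hypothetical participation records: sessions attended per training format
# (f = face-to-face, s = synchronous, a = asynchronous).
workers = [
    {"id": 1, "f": 0, "s": 5, "a": 0},  # 5 synchronous sessions only
    {"id": 2, "f": 2, "s": 0, "a": 8},  # face-to-face and asynchronous
]

def regressors(w):
    """Build the Training dummies (formula 3) and T_times counts (formula 4)."""
    row = {}
    for fmt in ("f", "s", "a"):
        row[f"Training_{fmt}"] = int(w[fmt] > 0)  # participated at all?
        row[f"T_times_{fmt}"] = w[fmt]            # number of sessions attended
    return row

print(regressors(workers[0]))
# → {'Training_f': 0, 'T_times_f': 0, 'Training_s': 1, 'T_times_s': 5,
#    'Training_a': 0, 'T_times_a': 0}
```

Separating the dummy from the count lets the dummy capture whether any exposure helps, while the count captures the marginal effect of each additional session (the dose–response relationship).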

The second dimension was the effectiveness of E-learning on physicians’ TB control practice, which was assessed mostly with qualitative data supplemented by quantitative data. We transcribed and analysed the interviews using a hybrid approach to thematic analysis. The analytical framework was developed from both the topic guides (deductive) and emerging issues (inductive) in the interviews and FGDs [37]. The quantitative data were analysed in Stata 14.0 (StataCorp, College Station, TX, USA) and the qualitative data in MAXQDA 2018 (VERBI GmbH, Berlin, Germany).

Results

Effectiveness of E-learning on TB health workers’ knowledge

After the E-learning training, the knowledge level of TB health workers improved significantly. The average score of the clinical quiz rose from 65 to 82 points (P < 0.001), the public health quiz from 65 to 71 points (P = 0.009), and the primary care quiz from 79 to 85 points (P < 0.001).

However, further univariate analysis suggested that the score change among clinical physicians in the pilot areas (+17 points) was not statistically different from that among physicians in the non-pilot areas (+18 points, P = 0.714). Similar results were observed for public health physicians (pilot areas: +4 points, non-pilot areas: +9 points, P = 0.772). In contrast to the null effect among health workers at the county level and above, the score change of primary care personnel in the pilot areas (10 points) was significantly higher than that in the non-pilot areas (4 points, P < 0.001, Table 2).

Table 2 Descriptive statistics of TB knowledge change of pre-and post-training for TB health workers (mean ± standard deviation, full marks = 100)

The DID model showed that, after controlling for institutional fixed effects, personal characteristics, and other control variables, the improvement in knowledge in the pilot areas was slightly lower than in the non-pilot areas for both clinical and public health physicians, and the difference was not significant. The results for primary care workers showed a different pattern: their knowledge improved significantly more in the pilot areas than in the non-pilot areas, by an average of 8.0 points (P < 0.001). By contrast, the underlying time trend for primary care workers across all sample areas, before versus after the intervention, was not significant (Table 3).

Table 3 Effect of China-Gates TB training program on health workers’ knowledge: difference-in-difference model

The multiple linear regression showed that the clinical knowledge of clinical physicians who actually participated in the synchronous E-learning sessions improved, compared with colleagues who did not participate. After controlling for institutional fixed effects and personal characteristics, the TB knowledge level of clinical physicians who participated in synchronous learning activities was 7.3 points higher on average (P = 0.023). Each synchronous session attended increased their clinical TB quiz score by 0.3 points on average (P = 0.034). Asynchronous learning activities had no significant impact on the knowledge level of clinical and public health physicians, but a significant positive effect was seen among primary care staff. Compared with those who did not participate, trained primary care workers scored 10.9 points higher on average (P < 0.001). Each module attended increased their knowledge score by 1.8 points on average (P < 0.001). Moreover, after obtaining a certificate (indicating that all modules were completed), participants’ average score increased by 11.2 points (P < 0.001). Traditional face-to-face training had no significant impact on knowledge improvement for any group of medical staff. We did not find any statistically significant improvement in TB knowledge among public health physicians, regardless of the training format they had taken (Table 4).

Table 4 Effect of China-Gates TB training program on health workers’ knowledge: multiple linear regression model

We conducted three types of sensitivity analyses. For the multiple linear regression model, the dependent variable was replaced with the Z-score (\(Z=\frac{x-\mu }{\sigma }\), where \(x\) is the raw score, \(\mu\) the mean, and \(\sigma\) the standard deviation), and the results did not change significantly. For the DID model, two placebo tests were conducted (test 1: rerandomization; test 2: re-allocation according to the quality control subproject, another subproject of the China-Gates TB control program). After regrouping, the interaction coefficients between the pilot area and the time variable were no longer significant, indicating that the intervention effects we observed were not due to a random factor or to the effects of other pilot projects. In the leave-one-out analysis, we ran the DID model again, excluding one of the 13 counties each time. Although the interaction terms were only significant at the α = 0.1 level after excluding the samples of Tongxin or Nong’an counties, they remained significantly positive at the α = 0.05 level after excluding each of the other 11 counties, indicating that the improvement in the knowledge level of the medical staff was significant and robust (Additional file 3).
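The Z-score replacement and the leave-one-out resampling are both mechanical transformations. A brief sketch of the two steps in pure Python; the scores and county names are hypothetical placeholders:

```python
from statistics import mean, pstdev

def z_scores(xs):
    """Standardize raw quiz scores: Z = (x - mu) / sigma."""
    mu, sigma = mean(xs), pstdev(xs)
    return [(x - mu) / sigma for x in xs]

def leave_one_out(counties):
    """Yield one subsample per county, each excluding that county,
    as in the 13-county leave-one-out analysis."""
    for dropped in counties:
        yield dropped, [c for c in counties if c != dropped]

print([round(z, 3) for z in z_scores([60, 70, 80])])  # → [-1.225, 0.0, 1.225]
for dropped, sample in leave_one_out(["County A", "County B", "County C"]):
    ...  # refit the DID model on `sample` and record the interaction term
```

Because the Z-transform is linear, refitting the regression on standardized scores rescales the coefficients and their standard errors by the same factor, so significance levels are expected to be unchanged; the check guards against results driven by the scoring scale.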

Effectiveness of E-learning on healthcare worker’s behaviour

The E-learning project not only improved medical staff’s TB knowledge but also enabled some of them to apply that knowledge in daily practice, improving their quality of care. We measured their perceptions of the impact E-learning would have on their practice as a proxy indicator of behaviour. Overall, there was high heterogeneity: behaviour change varied among medical staff from different regions and different levels of institutions.

According to the questionnaire survey, 56.0% (93/166) of the synchronous E-learning session participants agreed that “the training knowledge can be applied to my work”. Univariate analysis suggested that medical staff at the provincial level (P < 0.001), with a bachelor’s degree or above (P < 0.001), and from pilot areas (P = 0.039) were more likely to agree. For asynchronous E-learning, 57.3% (177/309) of participants expressed a similar view. Compared with synchronous E-learning, a higher proportion of county-level health workers and those with lower education levels agreed with this statement for asynchronous E-learning (Table 5).

Table 5 Percentage of TB health workers felt the knowledge and skills they learned through E-learning program can be applied in their daily practices

In the interviews, medical staff who believed the training could change their practice behaviours mentioned that the knowledge gained through the China-Gates E-learning helped them in three ways. First, doctors at city and county-level institutions, as well as junior doctors, learned things they did not know before, such as standardized TB diagnosis and treatment, or gained a clearer and deeper understanding of this knowledge; they could then apply it when encountering the same problems in practice. Second, doctors at provincial and prefecture-level institutions had the opportunity to see more intractable cases through the training and to learn cutting-edge knowledge of TB diagnosis and treatment from national experts, both of which helped cultivate their clinical thinking and prepared them for complex clinical problems. Third, the training itself created a positive learning environment in the department and led to imperceptible improvements in doctors’ study habits.

The first time I listened to some sessions about immunization, molecular biology, and genes, I was also confused, but after listening several times, I have come gradually to understand what they are talking about. I guess it is just what we called “gradual progression”, we are cultivating a learning habit by training, and then we can make progress in our clinical (practice). (Provincial TB health professional, FGD in Jilin).

During the interviews, the medical staff who thought they could not apply the knowledge in their practice put forward three main reasons. First, some physicians complained that they had not retained the new knowledge, or that the topics were too easy for them; neither situation offered any help to their practice. Second, hardware was a constraint. Some county-level medical personnel mentioned that even if the training improved their knowledge, the equipment, available drugs, and laboratory capabilities of their institutions could not support a change in practice; they could not perform the same laboratory tests and treatments as national hospitals. Third, the knowledge learned was not relevant to their daily work. Many public health physicians reported that the training content did not cover the topics of their daily work and was not helpful.

The problems in daily work are not reflected in the training. I don't understand the problems in the statistical report, but the training has nothing about this. (County TB public health physician’s interview in Ningxia).

Discussion

Considering the lack of high-quality educational resources for grassroots doctors in LMICs, and the great challenges that traditional continuing medical education faces due to the COVID-19 pandemic, E-learning CME has important policy significance in LMICs. Our results demonstrate that E-learning can significantly improve the knowledge level of TB health workers. Moreover, it has a greater effect on primary care workers, and plays an important role in the "equalization of basic theory, knowledge, and essential skills" across different levels of institutions. However, we did not see obvious benefits for public health physicians.

Our results differ markedly from most previous studies in higher education settings, most of which identified negative effects of online courses on college student education [38,39,40,41,42,43,44,45,46,47,48,49,50,51,52]. These differences illustrate that the impact of modern educational technology on education is complex. It may improve students’ access to high-quality resources and thereby have a positive impact on learning outcomes, but it may also degrade the learning experience and have a negative impact. Most existing research compares face-to-face and online training in which students have access to the same educational resources; in that setting, the online format may inhibit students’ participation, resulting in negative learning outcomes. Few studies have explored the effect of E-learning in primary care in LMICs, where providers would have had no access to national and provincial medical education resources without E-learning.

Our results indicate that the benefits differ by training format and target audience. First, regarding the target audience, primary health workers benefited the most from training, while public health personnel benefited the least. A mismatch between the supply of and demand for training, together with environmental factors, is the main reason for these differences [31]. In the China-Gates TB control project, training supply and demand were well matched for clinical physicians and primary care personnel, while the training content for public health physicians was not consistent with what they demanded. Second, in terms of training format, this study found that both types of E-learning (synchronous and asynchronous) were effective in some respects, whereas face-to-face training was not significantly effective. This is consistent with research findings in other fields of continuing medical education [53], suggesting that the existing traditional CME model needs to be reformed.

This study has several limitations. First, we did not collect panel data at the individual level. Due to data availability, the regression analysis may suffer from omitted-variable bias; we therefore employed three different strategies in the statistical analysis to enhance the validity of the results. Second, this study did not objectively measure physicians’ behaviour to evaluate the effect of the training intervention; instead, we used self-reported behavioural change after the training. Existing evidence from LMICs has shown that training may not change providers’ prescribing behaviours even after they have learned the guidelines [54, 55], so our results need to be interpreted with caution. Third, the China-Gates TB project as a whole is a set of comprehensive and complex interventions. Many project counties had multiple interventions running at the same time, such as the development of new information systems, the application of electronic medication monitors, and health insurance payment reform for tuberculosis patients. Given the study design, it is difficult to distinguish the specific effects of each intervention. However, the sensitivity analyses showed no evidence that other interventions had a direct impact on the knowledge level of medical staff. Fourth, we did not conduct any follow-up survey after 2019, so we do not know whether E-learning would produce lasting increases in knowledge scores for health care providers.

Conclusions

This study indicates that the impact of E-learning differs by training format and target audience. For TB clinicians and primary care health workers, E-learning interventions were associated with a higher TB knowledge level. However, E-learning activities did not provide significant benefits for public health physicians, and traditional face-to-face training had no significant impact on any type of medical staff. Future E-learning activities in CME should aim to create a learning environment within the health system that realizes the full potential of E-learning. Future research should also explore the effect of E-learning on physicians’ behavioural practice, such as prescribing behaviours.

Availability of data and materials

The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.

Abbreviations

TB: Tuberculosis

CME: Continuing medical education

DID: Difference-in-difference

COVID-19: Coronavirus disease 2019

GDP: Gross domestic product

CDC: Centre for Disease Control and Prevention

FGDs: Focus group discussions

References

1. Davis D, O’Brien MAT, Freemantle N, Wolf FM, Mazmanian P, Taylor-Vaisey A. Impact of formal continuing medical education: do conferences, workshops, rounds, and other traditional continuing education activities change physician behavior or health care outcomes? JAMA. 1999;282(9):867–74.

2. Brown C, Belfield C, Field S. Cost effectiveness of continuing professional development in health care: a critical review of the evidence. BMJ. 2002;324(7338):652–5.

3. Meng Q. A pioneering and innovative course of development, remarkable and brilliant achievements—review and prospects of the establishment of China’s continuing medical education system. China Contin Med Educ. 2009;1(01):4–11.

4. Nissen SE. Reforming the continuing medical education system. JAMA. 2015;313(18):1813–4.

5. Davis DA, Thomson MA, Oxman AD, Haynes RB. Changing physician performance: a systematic review of the effect of continuing medical education strategies. JAMA. 1995;274(9):700–5.

6. Forsetlund L, Bjørndal A, Rashidian A, Jamtvedt G, O’Brien MA, Wolf FM, et al. Continuing education meetings and workshops: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2009. https://doi.org/10.1002/14651858.CD003030.pub2.

7. Rutschmann OT, Janssens JP, Vermeulen B, Sarasin FP. Knowledge of guidelines for the management of COPD: a survey of primary care physicians. Respir Med. 2004;98(10):932–7.

8. Fonarow GC, Yancy CW, Albert NM, Curtis AB, Stough WG, Gheorghiade M, et al. Heart failure care in the outpatient cardiology practice setting: findings from IMPROVE HF. Circ Heart Fail. 2008;1(2):98–106.

9. Institute of Medicine (U.S.), Committee on Planning a Continuing Health Professional Education Institute. Redesigning continuing education in the health professions. Washington, DC: National Academies Press; 2010.

10. Hamilton LS, Grant D, Kaufman JH, Diliberti M, Schwartz HL, Hunter GP, et al. COVID-19 and the state of K-12 schools: results and technical documentation from the Spring 2020 American Educator Panels COVID-19 surveys. RAND Corporation. 2020.

11. Bacher-Hicks A, Goodman J, Mulhern C. Inequality in household adaptation to schooling shocks: Covid-induced online learning engagement in real time. J Public Econ. 2021;193:104345.

12. Bettinger EP, Fox L, Loeb S, Taylor ES. Virtual classrooms: how online college courses affect student success. Am Econ Rev. 2017;107(9):2855–75.

13. Deming DJ, Goldin C, Katz LF, Yuchtman N. Can online learning bend the higher education cost curve? Am Econ Rev. 2015;105(5):496–501.

14. Goodman J, Melkers JE, Pallais A. Can online delivery increase access of education? 2016. https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2843625. Accessed 26 Apr 2021.

15. Johnson HP, Mejia MC. Online learning and student outcomes in California’s community colleges. San Francisco, CA: Public Policy Institute of California; 2014.

16. Xu D, Xu Y. The promises and limits of online higher education: understanding how distance education affects access, cost, and quality. Irvine: American Enterprise Institute; 2019.

17. Ruiz JG, Mintzer MJ, Leipzig RM. The impact of E-learning in medical education. Acad Med. 2006;81(3):207–12.

18. Hadadgar A, Changiz T, Masiello I, Dehghani Z, Mirshahzadeh N, Zary N. Applicability of the theory of planned behavior in explaining the general practitioners eLearning use in continuing medical education. BMC Med Educ. 2016;16(1):215.

19. Joyce T, Crockett S, Jaeger DA, Altindag O, O’Connell SD. Does classroom time matter? Econ Educ Rev. 2015;46:64–77.

    Article  Google Scholar 

  20. 20.

    Bernard RM, Abrami PC, Lou Y, Borokhovski E, Wade A, Wozney L, et al. How does distance education compare with classroom instruction? A meta-analysis of the empirical literature. Rev Educ Res. 2004;74(3):379–439.

    Article  Google Scholar 

  21. 21.

    U.S. Department of Education, Office of Planning, Evaluation, and Policy Development. Evaluation of evidence-based practices in online learning: a meta-analysis and review of online learning studies. Washington, D.C.; 2009.

  22. 22.

    Vaona A, Banzi R, Kwag KH, Rigon G, Cereda D, Pecoraro V, et al. E-learning for health professionals. Cochrane Database Syst Rev. 2018;1:CD011736.

    PubMed  Google Scholar 

  23. 23.

    Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ, Montori VM. Instructional design variations in internet-based learning for health professions education: a systematic review and meta-analysis. Acad Med. 2010;85(5):909–22.

    PubMed  Article  Google Scholar 

  24. 24.

    Gagnon M-P, Légaré F, Labrecque M, Frémont P, Cauchon M, Desmartis M. Perceived barriers to completing an e-learning program on evidence-based medicine. Inform Prim Care. 2007;15(2):83–91.

    PubMed  Google Scholar 

  25. 25.

    Eslaminejad T, Masood M, Ngah NA. Assessment of instructors’ readiness for implementing e-learning in continuing medical education in Iran. Med Teach. 2010;32(10):e407–12.

    PubMed  Article  Google Scholar 

  26. 26.

    Barteit S, Guzek D, Jahn A, Barnighausen T, Jorge MM, Neuhann F. Evaluation of e-learning for medical education in low- and middle-income countries: a systematic review. Comput Educ. 2020;145:103726.

    PubMed  PubMed Central  Article  Google Scholar 

  27. 27

    .World Health Organization. Global tuberculosis report 2020. Geneva: World Health Organization; 2021.

  28. 28.

    Wang Z, Jiang W, Liu Y, Zhang L, Zhu A, Tang S, et al. Transforming tuberculosis (TB) service delivery model in China: issues and challenges for health workforce. Hum Resour Health. 2019;17(1):83.

    PubMed  PubMed Central  Article  Google Scholar 

  29. 29.

    Bissell K, Harries AD, Reid AJ, Edginton M, Hinderaker SG, Satyanarayana S, et al. Operational research training: the course and beyond. Public Health Action. 2012;2(3):92–7.

    CAS  PubMed  PubMed Central  Article  Google Scholar 

  30. 30.

    Guillerm N, Bissell K, Kumar A, Ramsay A, Reid A, Zachariah R, et al. Sustained research capacity after completing a Structured Operational Research and Training (SORT IT) course. Public Health Action. 2016;6(3):207–8.

    CAS  PubMed  PubMed Central  Article  Google Scholar 

  31. 31.

    Wang ZY, Zhang LJ, Liu YH, Jiang WX, Tang SL, Liu XY. Process evaluation of E-learning in continuing medical education: evidence from the China-Gates Foundation Tuberculosis Control Program. Infect Dis Poverty. 2021;10(1):23.

    PubMed  PubMed Central  Article  Google Scholar 

  32. 32.

    China-Gates TB Control Project Office of Clinical Centre for Tuberculosis Prevention, Chinese Centre for Disease Control and Prevention. Pilot Project for a Comprehensive New Model of Capacity Building for China-Gates TB Control Project (Phase III). Beijing, China. 2017.

  33. 33.

    Bertrand M, Duflo E, Mullainathan S. How much should we trust differences-in-differences estimates? Q J Econ. 2004;119(1):249–75.

    Article  Google Scholar 

  34. 34.

    Fu H, Li L, Li M, Yang C, Hsiao W. An evaluation of systemic reforms of public hospitals: the Sanming model in China. Health Policy Plan. 2017;32(8):1135–45.

    PubMed  Article  Google Scholar 

  35. 35.

    Card D, Krueger AB. Minimum wages and employment: a case study of the fast food industry in New Jersey and Pennsylvania. Am Econ Rev. 1994;84(4):772–93.

    Google Scholar 

  36. 36.

    Wooldridge JM. Introductory econometrics: a modern approach. 6th ed. Boston, MA, USA: Cengage Learning; 2016.

    Google Scholar 

  37. 37.

    Fereday J, Muir-Cochrane E. Demonstrating rigor using thematic analysis: a hybrid approach of inductive and deductive coding and theme development. Int J Qual Methods. 2006;5(1):80–92.

    Article  Google Scholar 

  38. 38.

    Xu D, Jaggars SS. Performance gaps between online and face-to-face courses: differences across types of students and academic subject areas. J Higher Educ. 2014;85(5):633–59.

    Article  Google Scholar 

  39. 39.

    Xu D, Jaggars SS. The impact of online learning on students’ course outcomes: Evidence from a large community and technical college system. Econ Educ Rev. 2013;37(C):46–57.

    Article  Google Scholar 

  40. 40.

    Streich FE. Online education in community colleges: access, school success, and labor-market outcomes. Ann Arbor, MI: University of Michigan; 2014.

    Google Scholar 

  41. 41.

    Smith ND. Examining the effects of online enrollment on course outcomes using weighting procedures after multiple imputation on a state-wide university system [Doctoral]. Raleigh, NC: North Carolina State University; 2017

  42. 42.

    Perna L, Ruby A, Boruch R, Wang N, Scull J, Evans C, et al. The life cycle of a million MOOC users. MOOC Research Initiative Conference 2013.

  43. 43.

    Oreopoulos P, Petronijevic U, Logel C, Beattie G. Improving non-academic student outcomes using online and text-message coaching. J Econ Behav Organ. 2020;171:342–60.

    Article  Google Scholar 

  44. 44.

    Hoxby CM. The economics of online postsecondary education: MOOCs, nonselective education, and highly selective education. Am Econ Rev. 2014;104(5):528–33.

    Article  Google Scholar 

  45. 45.

    Hart CMD, Friedmann E, Hill M. Online course-taking and student outcomes in California Community Colleges. Educ Finance Policy. 2018;13(1):1–58.

    Article  Google Scholar 

  46. 46.

    Figlio D, Rush M, Yin L. Is it live or is it internet? Experimental estimates of the effects of online instruction on student learning. J Labor Econ. 2013;31(4):763–84.

    Article  Google Scholar 

  47. 47.

    Chevalier A, Dolton P, Luhrmann M. 'Making it count': evidence from a field study on assessment rules, study incentives and student performance IZA Discussion Papers. 2014. https://www.econstor.eu/bitstream/10419/104703/1/dp8582.pdf. Accessed 26 Apr 2021.

  48. 48.

    Brown BW, Liedholm CE. Can web courses replace the classroom in principles of microeconomics? Am Econ Rev. 2002;92(2):444–8.

    Article  Google Scholar 

  49. 49.

    Bowen WG, Chingos MM, Lack KA, Nygren TI. Interactive learning online at public universities: evidence from a six-campus randomized trial. J Policy Anal Manag. 2013;33(1):94–111.

    Article  Google Scholar 

  50. 50.

    Banerjee AV, Duflo E. (Dis)organization and success in an economics MOOC. Am Econ Rev. 2014;104(5):446–60.

    Article  Google Scholar 

  51. 51.

    Bambara CS, Harbour CP, Davies TG, Athey S. Delicate engagement: the lived experience of community college students enrolled in high-risk online courses. Community Coll Rev. 2009;36(3):219–38.

    Article  Google Scholar 

  52. 52.

    Alpert WT, Couch KA, Harmon OR. A randomized assessment of online learning. Am Econ Rev. 2016;106(5):378–82.

    Article  Google Scholar 

  53. 53.

    Cervero RM, Gaines JK. The impact of CME on physician performance and patient health outcomes: an updated synthesis of systematic reviews. J Contin Educ Health. 2015;35(2):131–8.

    Article  Google Scholar 

  54. 54.

    Cuevas C, Batura N, Wulandari LPL, Khan M, Wiseman V. Improving antibiotic use through behaviour change: a systematic review of interventions evaluated in low- and middle-income countries. Health Policy Plan. 2021. https://doi.org/10.1093/heapol/czab021.

    Article  PubMed  Google Scholar 

  55. 55.

    Rowe AK, De Savigny D, Lanata CF, Victora CG. How can we achieve and maintain high-quality performance of health workers in low-resource settings? Lancet. 2005;366(9490):1026–35.

    PubMed  Article  Google Scholar 


Acknowledgements

We would like to thank the key informants who participated in this study. We also thank Dr. Xiaolin Wang from the Fourth People's Hospital of Ningxia, Dr. Xiaomeng Wang from Zhejiang CDC, Dr. Yiming Han from Hangzhou Red Cross Hospital, Dr. Yanli Yuan from Jilin provincial TB dispensary, and Dr. Yang Dong from Jilin Provincial Tuberculosis Hospital for their help during the field survey. We also thank Dr. Howard Bergman and Dr. Isabelle Vedel from McGill University for their insightful comments on the manuscript.

Funding

This study is part of the Duke University research project titled “Monitoring, Learning & Evaluation for the Implementation of the Comprehensive Model of Tuberculosis (TB) Care & Control in China” and received funding from the Bill & Melinda Gates Foundation (Approval number: OPP1149395). The funders of the study had no role in the study design, data collection, data analysis, data interpretation, or writing of the report.

Author information


Contributions

XL and ST conceived and designed the study. ZW, LZ, WJ, and JJ participated in the surveys and conducted the data analysis. ZW prepared the tables and figures; ZW, LZ, and XL wrote the first draft of the manuscript. ST, YL, and JJ contributed to and critically revised the manuscript. ZW and LZ contributed equally to this work and should be regarded as co-first authors. All authors read and approved the final manuscript.

Corresponding author

Correspondence to Xiao-Yun Liu.

Ethics declarations

Ethical approval and consent to participate

The Duke University Institutional Review Board provided ethics approval of the survey (Approval number: 2017-0768). All respondents read a statement that explained the purpose of the study and gave their consent to continue.

Consent for publication

Not applicable.

Competing interests

The authors report no competing interests. The authors alone are responsible for the content and writing of this article.

Supplementary Information

Additional file 1.

Baseline characteristics for intervention and control group (2017).

Additional file 2.

Sample size for key informant interviews and FGDs.

Additional file 3.

Sensitivity analysis for multiple linear regression model.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article

Cite this article

Wang, ZY., Zhang, LJ., Liu, YH. et al. The effectiveness of E-learning in continuing medical education for tuberculosis health workers: a quasi-experiment from China. Infect Dis Poverty 10, 72 (2021). https://doi.org/10.1186/s40249-021-00855-y

Keywords

  • Continuing medical education
  • Training
  • Tuberculosis
  • E-learning
  • Program evaluation