Translate this page into:
Smartboard for PowerPoint-based lectures in undergraduate paediatric education: A randomised controlled trial
For correspondence: Dr Nikita Diwan, Department of Paediatrics, GSVM Medical College, Kanpur 208 002, Uttar Pradesh, India e-mail: angelsanddemons.nik@gmail.com
How to cite this article: Diwan N, Rao YK, Midha T, Agarwal A, Venkatesh V, Rao A. Smartboard for PowerPoint-based lectures in undergraduate paediatric education: A randomized controlled trial. Indian J Med Res. 2026;163:334-40. doi: 10.25259/IJMR_2213_2025.
Abstract
Background and objectives
Interactive digital tools such as smartboards are increasingly being used in education, yet robust evidence from low- and middle-income countries (LMICs) is limited. This study compared Smartboard- and PowerPoint-based teaching in undergraduate paediatrics and assessed their impact on knowledge retention and learner engagement.
Methods
A single-centre, open-label, parallel-group randomised controlled trial was conducted among 360 MBBS students (second, third, and final years). Participants were stratified by academic year and baseline test scores, then randomly assigned to Smartboard or PowerPoint groups. Identical paediatric topics were taught using standardised lesson plans. The primary outcome was knowledge retention, assessed through objective structured clinical examinations (OSCEs) immediately after teaching and at 1- and 2-month follow-up. Secondary outcomes included student and faculty feedback. The trial was prospectively registered with the Clinical Trials Registry–India (CTRI/2025/04/083965).
Results
Immediate pooled OSCE performance was comparable (Smartboard: 47.2% Grade A vs. PowerPoint: 41.1%, P=0.29). At 1-month follow-up (primary outcome), Smartboard students achieved higher Grade A scores across all years combined (46.1% vs. 40.3%, P=0.028). Smartboard participants also showed superior retention at 2 months in third- (P=0.002) and final-year (P=0.049) cohorts. Students reported greater engagement (66.1% vs. 51.9%; absolute difference 14.2 percentage points), attention (68.1% vs. 33.9%; difference 34.2 percentage points), and recall (46.1% vs. 22.8%; difference 23.3 percentage points). Faculty rated Smartboards higher for engagement (100% vs. 33.3%; difference 66.7 percentage points), but preferred PowerPoint for ease of preparation (83.3% vs. 33.3%; difference 50.0 percentage points).
Interpretation and conclusions
Smartboard-based teaching improved long-term knowledge retention, engagement, and attention compared with PowerPoint, particularly at 1 month and among senior students. Interactive digital tools may enhance learning outcomes in LMIC settings.
Keywords
India
Paediatrics
PowerPoint
Randomised controlled trial
Smartboard
Undergraduate
Digital tools are transforming medical education by promoting interaction, participation, and engagement, helping learners understand complex concepts more effectively.1 PowerPoint has long served as the main teaching aid, offering better structure and clarity than traditional chalk-and-talk methods.2 However, its interactive and multimedia features are rarely used in routine undergraduate classes.
Smartboards, in contrast, are touchscreen devices that integrate projection, real-time annotation, and multimedia, enabling active learner participation. This interactive approach can enhance attention and comprehension compared with conventional methods.3,4 While studies from high-income countries show improved engagement and participation with Smartboards,5 evidence from low- and middle-income countries (LMICs) remains limited, despite the continued reliance on didactic teaching.6
According to Mayer's cognitive theory of multimedia learning, combining visual, verbal, and interactive inputs strengthens understanding and recall through multiple cognitive pathways.7 In LMICs, where large class sizes limit teacher–student interaction, Smartboards could offer a scalable and cost-effective means to improve learning. The World Health Organization's Global Strategy on Human Resources for Health also recognises digital education as key to workforce development.8
This randomised controlled trial compared Smartboard-assisted and PowerPoint-assisted teaching in undergraduate paediatrics, assessing their effects on short- and long-term knowledge retention through objective structured clinical examinations (OSCEs) and evaluating student and faculty perceptions of engagement and interactivity.
Methods
This single-centre, open-label, parallel-group randomised controlled trial was conducted by the Department of Paediatrics, GSVM Medical College, Kanpur, Uttar Pradesh, India, a tertiary care medical college in North India, over 3.5 months (1 April to 15 July 2025). The trial was prospectively registered with the Clinical Trials Registry–India (CTRI/2025/04/083965) and reported in accordance with CONSORT guidelines.9 The study was approved by the Institutional Ethics Committee (For Biomedical Health and Research), GSVM Medical College, Kanpur. Written informed consent was obtained from all participants. Students were assured that participation was voluntary and would not affect their academic standing. Those who did not participate were taught the same topics separately. The study was not part of any formal educational fellowships.
Participants
A total of 360 MBBS students aged 20–24 yr, enrolled in the second, third, or final year of the MBBS programme, were recruited. Students were included if they had (i) pre-class knowledge test scores between 50% and 70% and (ii) attendance greater than 80%. Students were excluded if they had (i) prior exposure to Smartboard-based teaching on the selected topics, (ii) medical or physical conditions limiting participation, or (iii) declared unavailability for follow-up assessments (e.g., students who indicated at recruitment that they would miss sessions due to examinations, health issues, or personal commitments).
Interventions and standardisation
Students were taught identical topics using two different modalities.
Smartboard group
Sessions were delivered using interactive Smartboards (Samsung Flip Pro™, Samsung Electronics, Suwon, South Korea; 55–86″ LED/LCD displays, 4K resolution) with infrared touch technology, real-time annotation, drag-and-drop functions, and multimedia integration. Standardised Microsoft PowerPoint slide decks were projected through the Smartboard interface. Faculty and students could annotate slides directly on the board, highlight clinical diagrams, and integrate multimedia in real time, encouraging active participation.
PowerPoint group
Sessions were delivered using the same PowerPoint slides, projected via a BenQ MW560 LCD projector (1280 × 800 resolution, 3300 lumens) onto a fixed wall-mounted matte white screen (80″ diagonal) to maintain comparable viewing conditions. The setup was standardised for brightness, alignment, and viewing distance to ensure visibility comparable with the Smartboard classroom. Lectures were teacher-directed, with no touchscreen or annotation functions enabled. Although annotation tools exist in PowerPoint, they are not used in routine undergraduate teaching at our institution; therefore, the control group was designed to reflect standard practice.
Faculty training and standardisation
Six senior residents underwent a structured 4-h workshop conducted by the institutional IT team and paediatric faculty, covering device use, annotation, pacing, and multimedia integration. Only those proficient in both modalities conducted sessions. Lesson plans were adapted from a standard textbook and peer-reviewed for accuracy.
Monitoring and fidelity
Faculty completed post-session checklists, and independent observers verified adherence to assigned modalities. Each academic year had six sessions (18 total). Excluded or non-participating students received the same topics through regular departmental lectures.
Randomisation and allocation concealment
A total of 450 MBBS students were screened, of whom 360 [120 from each academic year (average batch size ∼150)] fulfilled eligibility criteria and consented. Within each cohort (n=120), stratified block randomisation was performed using baseline pre-class test scores (50–60% and 61–70%) as strata. Students were then assigned in a 1:1 ratio to Smartboard or PowerPoint groups, resulting in two equal groups of 60 students per year. A computer-generated block randomisation list (block size=4) was prepared by an independent statistician. Allocation concealment was ensured through sealed, opaque envelopes that were opened only after enrolment, preventing selection bias.
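As an illustration of the permuted-block scheme described above, the allocation sequence for one stratum can be sketched in standard-library Python. This is a minimal sketch, not the trial's actual procedure (the real list was prepared by an independent statistician); the function name and fixed seed are ours.

```python
import random

def block_randomise(n_students, block_size=4, seed=7):
    """Generate a 1:1 allocation sequence using permuted blocks.

    Each block of four contains two 'Smartboard' and two 'PowerPoint'
    slots in random order, keeping group sizes balanced throughout
    recruitment within the stratum.
    """
    rng = random.Random(seed)
    sequence = []
    for _ in range(n_students // block_size):
        block = ["Smartboard", "PowerPoint"] * (block_size // 2)
        rng.shuffle(block)
        sequence.extend(block)
    return sequence

# One stratum of 60 students (e.g. one academic year, baseline scores 50-60%)
allocation = block_randomise(60)
assert allocation.count("Smartboard") == allocation.count("PowerPoint") == 30
```

Because every block of four is balanced, the two groups never differ by more than two participants at any point during enrolment, which is the practical rationale for a small block size.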
Attendance at each session was ensured by mandatory roll call and signed registers. Classes for the two modalities were conducted simultaneously in separate lecture halls, and independent observers verified attendance and compliance. Students were not permitted to switch between classrooms, ensuring that participants remained in their allocated group.
Outcomes: Primary outcome (quantitative)
Knowledge acquisition and retention were measured using objective structured clinical examinations (OSCEs) immediately post-teaching, at 1 month, and at 2 months. Although OSCEs are traditionally used to assess clinical skills, in this study, the stations were designed to assess applied cognitive knowledge in paediatrics (e.g., growth monitoring, immunisation counselling, recognition of respiratory distress). This approach has been used in other educational studies to evaluate knowledge application in clinically oriented scenarios.
Each OSCE consisted of structured stations (e.g., growth assessment, immunisation counselling, recognition of respiratory distress, and fluid therapy planning), directly linked to the topics taught. Checklists and station content were developed by senior faculty and independently validated by three external subject experts to ensure alignment with the curriculum. Each station was scored on a 1–10 scale (1=poor, 10=excellent). Aggregate scores were converted into grades: A (8–10), B (6–7.9), C (4–5.9), and D (1–3.9). OSCE examiners were blinded to group allocation.
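The score-to-grade conversion can be expressed as a small function. This is a hypothetical helper of our own, and the treatment of scores falling between bands (e.g., 7.95) reflects our reading of the cut-offs, not a rule stated in the text.

```python
def osce_grade(score):
    """Map an aggregate OSCE station score (1-10 scale) to the grade
    bands used in this study: A (8-10), B (6-7.9), C (4-5.9), D (1-3.9).

    Scores between bands are assigned to the higher threshold not yet
    reached (our assumption, since the text does not specify).
    """
    if not 1 <= score <= 10:
        raise ValueError("OSCE scores range from 1 to 10")
    if score >= 8:
        return "A"
    if score >= 6:
        return "B"
    if score >= 4:
        return "C"
    return "D"

assert osce_grade(8.0) == "A"
assert osce_grade(7.9) == "B"
assert osce_grade(5.0) == "C"
assert osce_grade(2.5) == "D"
```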
Secondary outcomes
Structured questionnaires collected student and faculty feedback on engagement, clarity, interactivity, pacing, and satisfaction using 5-point Likert scales. Free-text comments were summarised descriptively and grouped under common themes for clarity; no formal qualitative or mixed-methods analysis was undertaken.
Sample size
A pilot study among 40 students found that knowledge retention at 1 month, assessed via OSCE, was 25% following conventional teaching. Assuming a 15 percentage-point absolute improvement with Smartboard-based teaching, a two-sided significance level of 5%, and 80% power, the required sample size was 163 students per group. Allowing for a 10% loss to follow-up, the final target was 179 per group (total=358).
This calculation was performed for the pooled sample across all academic years, as the study was powered to detect differences at the aggregate level. Stratified randomisation by year ensured balanced allocation within cohorts, but subgroup analyses by year were exploratory and not powered for definitive conclusions.
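The reported figure of 163 per group is consistent with a continuity-corrected comparison of two proportions (25% vs. 40%). The sketch below, using only standard-library Python, reproduces it; the choice of the Fleiss continuity correction is our assumption, as the exact formula is not stated in the text.

```python
from math import ceil, sqrt
from statistics import NormalDist

# Inputs stated in the text: 25% retention with conventional teaching,
# an assumed 15 percentage-point absolute improvement, alpha = 0.05
# (two-sided), power = 80%.
p1, p2 = 0.25, 0.40
d = p2 - p1
z_a = NormalDist().inv_cdf(1 - 0.05 / 2)   # ~1.96
z_b = NormalDist().inv_cdf(0.80)           # ~0.84

# Uncorrected two-proportion sample size per group (~149.1)...
n0 = (z_a + z_b) ** 2 * (p1 * (1 - p1) + p2 * (1 - p2)) / d ** 2

# ...with the Fleiss continuity correction, which yields 163 per group.
n = n0 / 4 * (1 + sqrt(1 + 4 / (n0 * d))) ** 2

per_group = ceil(n)                       # 163
with_attrition = round(per_group * 1.10)  # 10% attrition allowance -> 179
total = 2 * with_attrition                # 358
```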
Implementation
Six senior residents in Paediatrics conducted 18 sessions (six per academic year); each resident was assigned a specific teaching modality and standardised lesson plan. Sessions were held outside routine hours to avoid scheduling conflicts. Recruitment took place from 1 to 5 April 2025, teaching from 7 April to 7 May 2025, and follow-up concluded by 10 July 2025. Data were entered and analysed by an independent statistician.
Blinding
OSCE examiners and qualitative coders were blinded to group allocation. Data analysts worked with anonymised datasets. Participant and instructor blinding was not feasible due to the visible differences between modalities. To minimise bias, standardised lesson plans, structured faculty training, consistent timing, and objective OSCE scoring were employed.
Statistical methods
A per-protocol analysis was performed. Categorical variables (OSCE grades, survey responses) were summarised as frequencies and percentages and compared using Pearson’s chi-square test. Continuous variables were analysed using means and standard deviations. A P value <0.05 was considered statistically significant. All analyses were performed using SPSS version 22.0 (IBM Corp., Armonk, NY, USA).
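For readers unfamiliar with the method, Pearson's chi-square computation can be illustrated with the third-year 2-month A–D counts from Table I. This is a methodological sketch only: the small expected counts in the D column would ordinarily prompt category collapsing or an exact test, and the statistic below is not a re-derivation of the published P value.

```python
def pearson_chi2(table):
    """Pearson's chi-square statistic for an r x c contingency table,
    given as a list of rows of observed counts."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, obs in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (obs - expected) ** 2 / expected
    return stat

# Third-year 2-month counts from Table I
# (rows: Smartboard, PowerPoint; columns: grades A, B, C, D).
observed = [[30, 25, 4, 0],
            [21, 22, 14, 3]]
chi2 = pearson_chi2(observed)  # ~10.3 on (2-1)x(4-1) = 3 degrees of freedom
assert chi2 > 7.815            # chi-square critical value, df = 3, alpha = 0.05
```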
Results
The Figure (study flow chart) shows participant enrolment, randomisation, allocation, follow-up, and per-protocol analysis details.
Five students (1.4%) initially randomised to Smartboard were inadvertently scheduled into PowerPoint sessions; all were retained in the per-protocol analysis. Approximately 10% (n=18) of PowerPoint participants reported informal access to Smartboard materials, but adherence to assigned sessions was maintained, and contamination bias was considered minimal.
Baseline characteristics
Of 360 enrolled students, 188 (52.2%) were males and 172 (47.8%) females, with balanced gender distribution across academic years and groups. Mean baseline pre-test scores were comparable (Smartboard: 58.3±5.2; PowerPoint: 57.9±5.4; P=0.62). Average pre-study attendance exceeded 85% in both groups. Baseline characteristics were therefore comparable across groups.
Participant adherence
All students received their allocated intervention. Attendance exceeded 95% in both groups, confirmed by roll-call registers and independent observer verification at each session. No cross-over between classrooms occurred.
OSCE performance
At 1-month follow-up (primary outcome), when all academic years were combined, Smartboard participants achieved a higher proportion of Grade A scores (46.1%) compared with PowerPoint (40.3%) (P=0.028). Immediate post-class pooled performance was comparable between the groups (Smartboard=47.2%; PowerPoint=41.1%; P=0.29).
In subgroup analyses, Smartboard students showed greater knowledge retention at 1 month and 2 months, particularly among third-year and final-year cohorts. Significant advantages were observed in the third year at both 1 month (P=0.001) and 2 months (P=0.002), and in the final year at 2 months (P=0.049). No significant differences were found among second-year students. Extended A–D grade distributions for each group and time point are presented in Table I.
Table I. OSCE grade distribution by academic year, time point, and teaching modality

| MBBS year and time point | Grade | Smartboard, n (%) | PowerPoint, n (%) | P value* |
|---|---|---|---|---|
| Second year – immediate | A (8–10) | 26 (43.3) | 22 (36.6) | 0.231 |
| | B (6–7.9) | 28 (46.6) | 27 (45.0) | |
| | C (4–5.9) | 5 (8.3) | 8 (13.3) | |
| | D (1–3.9) | 1 (1.8) | 3 (5.1) | |
| Second year – 1 month | A | 27 (45.0) | 23 (38.3) | 0.218 |
| | B | 28 (46.1) | 27 (44.5) | |
| | C | 5 (7.8) | 9 (14.3) | |
| | D | 1 (1.1) | 2 (2.9) | |
| Second year – 2 months | A | 27 (45.0) | 21 (35.0) | 0.104 |
| | B | 27 (45.7) | 24 (40.8) | |
| | C | 5 (8.9) | 12 (19.5) | |
| | D | 0 (0.4) | 3 (4.7) | |
| Third year – immediate | A | 28 (46.7) | 25 (41.7) | 0.412 |
| | B | 24 (40.6) | 26 (42.8) | |
| | C | 6 (10.0) | 7 (11.1) | |
| | D | 2 (2.7) | 3 (4.4) | |
| Third year – 1 month | A | 27 (45.0) | 23 (38.3) | 0.001 |
| | B | 27 (44.8) | 24 (40.8) | |
| | C | 5 (8.9) | 9 (15.3) | |
| | D | 1 (1.3) | 3 (5.6) | |
| Third year – 2 months | A | 30 (50.0) | 21 (35.0) | 0.002 |
| | B | 25 (42.3) | 22 (37.0) | |
| | C | 4 (7.0) | 14 (23.6) | |
| | D | 0 (0.7) | 3 (4.4) | |
| Final year – immediate | A | 31 (51.7) | 27 (45.0) | 0.327 |
| | B | 24 (40.0) | 26 (43.9) | |
| | C | 4 (7.2) | 5 (8.9) | |
| | D | 1 (1.1) | 1 (2.2) | |
| Final year – 1 month | A | 29 (48.3) | 25 (41.7) | 0.219 |
| | B | 26 (42.6) | 25 (41.9) | |
| | C | 5 (8.3) | 8 (13.9) | |
| | D | 0 (0.8) | 2 (2.5) | |
| Final year – 2 months | A | 30 (50.0) | 23 (38.3) | 0.049 |
| | B | 25 (42.3) | 24 (39.4) | |
| | C | 4 (7.2) | 11 (18.6) | |
| | D | 0 (0.5) | 2 (3.7) | |

*Pearson's chi-square test comparing the A–D grade distribution between groups
Student and faculty perceptions
Student feedback favoured Smartboards across all domains, with higher proportions reporting positive engagement (66.1% vs. 51.9%), understanding (72.2% vs. 58.1%), recall (46.1% vs. 22.8%), and attention span (68.1% vs. 33.9%). For future lessons, 69.2% of students strongly preferred Smartboards compared with 33.1% for PowerPoint. Faculty perceptions were consistent: Smartboards were rated superior for engagement (100% vs. 33.3%), understanding (83.3% vs. 50.0%), and communication (100% vs. 50.0%), while PowerPoint was preferred for ease of preparation (83.3% vs. 33.3%) (Table II).
Table II. Student and faculty perceptions of Smartboard versus PowerPoint teaching

| Domain | Students – Smartboard (% very positive) | Students – PowerPoint (% very positive) | Faculty – Smartboard, n (%) | Faculty – PowerPoint, n (%) |
|---|---|---|---|---|
| Engagement | 66.1 | 51.9 | 6 (100%) | 2 (33.3%) |
| Understanding | 72.2 | 58.1 | 5 (83.3%) | 3 (50.0%) |
| Improved recall | 46.1 | 22.8 | NA | NA |
| Attention span | 68.1 | 33.9 | NA | NA |
| Preference for future lessons* | 69.2 | 33.1 | NA | NA |
| Communication | NA | NA | 6 (100%) | 3 (50.0%) |
| Ease of preparation/use | NA | NA | 2 (33.3%) | 5 (83.3%) |
Student values represent the percentage selecting the most positive category on a 5-point Likert scale; faculty values are n (%) of six respondents. NA, not assessed in that group
Adverse events
No adverse academic effects were reported. Minor technical issues (e.g., Smartboard calibration) occurred but did not disrupt teaching.
Discussion
This randomised controlled trial showed that Smartboard-assisted teaching led to significantly better knowledge retention at one month than PowerPoint-based instruction. Immediate post-class performance was similar, indicating that the Smartboard advantage appeared over time, possibly reflecting stronger consolidation of learning. Student and faculty feedback also supported improved engagement and attentiveness with Smartboards. Together, these findings suggest that interactive, multimedia-supported instruction can enhance long-term learning.
Student perceptions further contextualise these findings. A greater proportion of Smartboard users described sessions as very engaging, effective for understanding, and helpful for recall and attention. Faculty also rated Smartboards higher for engagement and communication, while PowerPoint remained preferred for ease of preparation. Similar benefits of interactive whiteboards on student motivation and participation have been documented in India,1,2,4 Botswana,3 and the United Kingdom.5,10
Key strengths of this trial include a relatively large stratified sample covering three academic years, use of standardised lesson plans, blinded OSCE assessment, and structured perception analysis. Several limitations should be acknowledged. The single-institution setting may limit generalisability. Although separate classrooms and attendance monitoring minimised contamination, informal peer exchange between groups cannot be ruled out. PowerPoint was used in both arms, but annotation and interactive features were deliberately disabled in the control group to reflect standard teaching practice, which may have accentuated the relative benefit of Smartboards. Instructor blinding was not feasible, and awareness of the intervention hypothesis could have influenced teaching enthusiasm. Minor variations in screen size and resolution, along with the novelty effect of a modern technology, might also have contributed to higher engagement and perceived effectiveness. Perception data were collected through structured questionnaires rather than focus groups, and subgroup analyses were exploratory with limited power. The 2-month follow-up precludes long-term conclusions, and Smartboards require additional preparation time and occasional calibration, which could affect scalability.
Equity considerations remain important. Differences in access to digital technology by gender, socioeconomic status, or geography may widen educational disparities in LMICs. Global frameworks such as the UNESCO Global Education Monitoring Report and WHO–UNICEF guidelines advocate inclusive digital learning approaches.11,12 Future research should therefore include multicentre trials across varied institutional settings, cost-effectiveness assessments, and gender- and equity-sensitive analyses to ensure that digital innovations promote rather than hinder educational equity, consistent with frameworks for 21st century skills.13
Author contributions
YKR: Conceptualisation, methodology, supervision, manuscript writing; ND: Data collection and analysis, manuscript writing; TM: Statistical analysis, data interpretation, manuscript writing; AA: Methodological guidance, manuscript writing; VV: Oversight of academic integrity, content validation; AR: Literature review, preparation of tables/figures, proofreading. All authors have read and approved the final printed version of the manuscript.
Financial support and sponsorship
None.
Conflicts of Interest
None.
Use of Artificial Intelligence (AI)-Assisted Technology for manuscript preparation
The authors confirm that an AI tool, ChatGPT (OpenAI, GPT-5), was used for language editing; all AI-assisted output was reviewed by the authors. No images or figures were generated or modified using AI.
References
1. A study on smart board effectiveness in teaching-learning experience. Mumbai: SVKM's Usha Pravin Gandhi College of Arts, Science and Commerce; 2024.
2. Is PowerPoint killing the art of medical teaching and is the interactive board the way forward? SRJHS. 2024;4:4-5.
3. Smart boards in medical education: A step toward digital classrooms in India. J Educ Technol Health Sci. 2021;8:97-102.
4. The impact of interactive smart boards on students' learning in secondary schools in Botswana: A students' perspective. Int J Educ Dev Using Inf Commun Technol. 2020;16:22-39.
5. Interactive whiteboards: boon or bandwagon? A critical review of the literature. J Comput Assist Learn. 2005;21:91-101.
6. PowerPoint presentations in the classroom: An examination of student preferences and learning outcomes. Med Educ. 2009;43:634-40.
7. The Cambridge handbook of multimedia learning (2nd ed). Cambridge: Cambridge University Press; 2014.
8. Global strategy on human resources for health: Workforce 2030. Geneva: WHO; 2016.
9. CONSORT 2010 statement: updated guidelines for reporting parallel group randomised trials. BMJ. 2010;340:c332.
10. Embedding interactive whiteboards in teaching and learning: The process of change in pedagogic practice. Educ Inf Technol. 2008;13:291-303.
11. International Society for Technology in Education (ISTE) Policy Brief. Technology and student achievement: The indelible link. Available from: https://computerexplorers.com/Student-Achievement-Brief.pdf, accessed on November 5, 2025.
12. United Nations Educational, Scientific and Cultural Organization (UNESCO). Global education monitoring report 2023: Technology in education – A tool on whose terms? Available from: https://unesdoc.unesco.org/ark:/48223/pf0000385723, accessed on November 5, 2025.
13. Comparing frameworks for 21st century skills. In: Bellanca J, Brandt R, eds. 21st century skills: Rethinking how students learn. Bloomington: Solution Tree Press; 2010. p. 51-76.
