Year : 2022  |  Volume : 8  |  Issue : 1  |  Page : 26-32

Use of Students’ Learning Outcomes as a Tool for Changing Teaching Content and Methodology: Assessment of Impact

1 Department of Pharmacology, All India Institute of Medical Sciences, New Delhi, India
2 Department of Pharmacology, Maulana Azad Medical College, Delhi University, New Delhi, India

Date of Submission: 18-Oct-2021
Date of Decision: 25-Nov-2021
Date of Acceptance: 11-Feb-2022
Date of Web Publication: 18-Feb-2022

Correspondence Address:
Dr. Vandana Roy
Department of Pharmacology, Maulana Azad Medical College, Near Bahadur Shah Zafar Marg, New Delhi-110002

Source of Support: None, Conflict of Interest: None

DOI: 10.4103/mamcjms.mamcjms_116_21


Objectives: Assessment of students’ learning outcomes in alignment with teaching goals can be a tool for modifying curriculum and teaching methods. This study was conducted to evaluate the impact of a pharmacology curriculum on students’ learning outcomes and the use of this assessment as a tool for making curriculum changes. Materials and Methods: The knowledge and skills of students exiting in 2014 were assessed at the end of their fifth-semester training in pharmacology using a questionnaire developed to test all areas underlined in the Medical Council of India’s goals and objectives of teaching pharmacology to MBBS undergraduates at the time the study was designed. Areas in which fewer than 50% of students scored well were identified for educational interventions with the next two batches of students (2015, batch B and 2016, batch C), who then took the same assessment. Results: Based on the learning outcomes, 15 areas were identified for educational interventions with the 2015 batch. Improvement in students’ learning outcomes was observed in 10 areas, ranging from 10% to 15%, in the 2015 batch, whereas in the 2016 batch an improvement of 20% was observed in three questions and >10% in six questions. Overall improvement in the intervention questions was 60% (+9 questions) in the 2015 batch and 80% (+12 questions) in the 2016 batch compared with the preintervention 2014 batch. The preintervention 2014 batch scored better overall than the intervention batches of 2015 and 2016. Conclusions: Changing teaching content and method based on assessment of students’ learning outcomes alone may not translate into an improvement in students’ learning outcomes. Teachers must look for other factors that can impact students’ learning.

Keywords: Assessment, learning outcomes, teaching, pharmacology, medical undergraduates

How to cite this article:
Naeem S, Roy V. Use of Students’ Learning Outcomes as a Tool for Changing Teaching Content and Methodology: Assessment of Impact. MAMC J Med Sci 2022;8:26-32

How to cite this URL:
Naeem S, Roy V. Use of Students’ Learning Outcomes as a Tool for Changing Teaching Content and Methodology: Assessment of Impact. MAMC J Med Sci [serial online] 2022 [cited 2022 Aug 19];8:26-32. Available from: https://www.mamcjms.in/text.asp?2022/8/1/26/337892

  Introduction

Continuous evaluation of teaching programs, with modification of areas found deficient, may help improve students’ learning outcomes. The assessment tool must be aligned with the goals and objectives of teaching that subject to an identified student population.[1] The practice of evaluating teaching programs for modification based on students’ learning outcomes is not common in India.

Over the last few years, changes in medical education in India have been taking place. The Medical Council of India (MCI), now known as the National Medical Commission, regulates medical education in the country. The goals and objectives of teaching in all subjects, which have to be achieved by the end of the course, are defined.[2] Medical colleges affiliated to universities in India are free to design their own curricula within the wider framework of goals defined by the regulator.

At the time this study was conducted (before competency-based medical education), pharmacology was taught to students from the third to the fifth semester. Almost 300 hours were dedicated to teaching and evaluating pharmacology. Teaching methods included lectures, tutorials, small group discussions, problem-based learning, practical exercises, P-drug exercises, and computer-assisted learning. The annual intake of medical undergraduates in the college is 250.

It is important that all learning activities in an educational environment be assessed to give an objective idea of how far the teaching activities are helping fulfill the learning goals. This helps identify weaknesses and their reasons, providing objective feedback for the faculty to analyze whether any changes are required to further improve the curriculum.

This study was planned to (i) analyze the extent to which teaching in pharmacology enabled the students to acquire basic knowledge and skills as per the defined objectives and goals, (ii) identify subject areas that required rectification in teaching content or method and implement educational interventions for them, and (iii) evaluate the impact of the educational interventions on the learning outcomes of subsequent new batches of students.

  Materials and Methods

This study was conducted after approval by the Institutional Human Ethics Committee, Maulana Azad Medical College, New Delhi (ref: F.I/IEC/MAMC/(44)/3/2013/No:143, dated October 29, 2014). Written informed consent was obtained from all participating students.

Study design: A prospective, educational intervention study.

Setting: The study was conducted in the Department of Pharmacology, Maulana Azad Medical College, New Delhi.

Study subjects: Medical students in their second year (third to fifth semester) were included in the study between 2013 and 2016. Three consecutive batches were evaluated: the batch that joined in 2013 and took its final examination in December 2014 (batch A); the batch that joined in 2014 and took its final examination in December 2015 (batch B); and the batch that joined in 2015 and took its final examination in December 2016 (batch C).

Study duration: The total duration of the study was four and a half years.

A survey was carried out using a pretested, structured, validated questionnaire − the Pharmacology Basic Knowledge and Skills Questionnaire (PBKSQ).

The PBKSQ was designed to test all areas defined in the MCI goals and objectives of teaching pharmacology to MBBS undergraduates [Table 1]. It was developed by teachers actively involved in teaching the students. There were a total of 50 questions, distributed according to the amount of time spent teaching a particular aspect or topic. Twenty-two questions aimed to assess students’ skills, including analytical skills in prescribing, monitoring adverse drug reactions, and critical analysis of drug formulations and experimental data. The questionnaire was validated for construct and content validity. Its internal reliability was ascertained using Cronbach alpha: a total of 51 questions were initially analyzed, and one question was removed, yielding a Cronbach alpha of 0.736.
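The internal-reliability step can be illustrated with a short sketch. This is not the study’s data or code; the response matrix below is invented, and the formula is the standard Cronbach alpha computed over a students × items score matrix:

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for a students x items matrix of scores."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # per-item variance
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of students' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical binary (1 = correct) responses: 6 students x 4 items
responses = np.array([
    [1, 1, 1, 0],
    [1, 0, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
    [0, 1, 0, 0],
    [1, 1, 0, 1],
])
print(round(cronbach_alpha(responses), 3))
```

In practice, item analysis of this kind is repeated after dropping each candidate item, keeping the set whose alpha is acceptable, which mirrors the removal of one question here.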
Table 1 Medical Council of India’s learning objectives in pharmacology and questions made


Method: The first assessment with the PBKSQ was performed with batch A. Questions answered incorrectly by more than 50% of the students were identified as areas requiring more emphasis and rectification during teaching, in content, method, or both. For these areas, new exercises were planned, or alternatively, the faculty delivering the relevant lectures was asked to emphasize these aspects.
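The >50%-incorrect selection rule might be expressed as a simple filter over a students × questions matrix of responses; the matrix and question count below are hypothetical, purely to illustrate the criterion:

```python
import numpy as np

# Hypothetical responses: rows = students, columns = questions (1 = correct)
responses = np.array([
    [1, 0, 0, 1, 1],
    [1, 0, 1, 1, 0],
    [0, 1, 0, 1, 1],
    [1, 0, 0, 0, 1],
])

pct_correct = responses.mean(axis=0) * 100   # % of students correct per question
# Deficient = answered incorrectly by more than 50% of students
deficient = [q + 1 for q in range(responses.shape[1]) if pct_correct[q] < 50]
print("Percent correct per question:", pct_correct)
print("Questions flagged for intervention:", deficient)
```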

The following year, the teaching program was modified to include the interventions planned for the areas found deficient in batch A. The same PBKSQ was administered to batches B and C at the end of their fifth semester.

The total scores of the students were tabulated and the percentages were graded using the following grading system [Table 2].
Table 2 Grades based on marks of students


Statistical analysis

The data are presented as percentages or as mean ± standard error of the mean. The tables and figure were plotted using MS Excel 2010. Statistical analysis was performed using the Chi-square test, with Phi or Cramér’s V as symmetric measures of association for nominal data, in SPSS Version 17.0 (SPSS Inc., Chicago; released 2008). One-way analysis of variance with a post hoc test was used for parametric data.
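For a single question, the batch-versus-batch comparison amounts to a chi-square test on a 2×2 table of correct/incorrect counts, with Phi as the symmetric measure of association. A sketch using invented counts (not the study’s data):

```python
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical 2x2 table for one question:
# rows = batches (A, B), columns = (correct, incorrect) counts
table = np.array([
    [90, 114],   # batch A: 90 of 204 students answered correctly
    [120, 100],  # batch B: 120 of 220 students answered correctly
])

# correction=False gives the uncorrected Pearson chi-square, from
# which the phi coefficient for a 2x2 table is derived
chi2, p, dof, expected = chi2_contingency(table, correction=False)
phi = np.sqrt(chi2 / table.sum())

print(f"chi2 = {chi2:.2f}, dof = {dof}, P = {p:.4f}, phi = {phi:.3f}")
```

With these invented counts the difference in proportions (44% vs. 55% correct) reaches significance at P < 0.05 with a small effect size, the pattern reported for several questions in Table 4.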

  Results

In batch A, a total of 204 students participated. After the initial survey of batch A’s fifth-semester students, the educational interventions were implemented with batches B and C [Table 3].
Table 3 Areas identified for educational interventions based on performance of batch A


In batch B, an improvement in the total percentage of students answering correctly was observed in nine questions, while six questions were answered correctly by a smaller percentage of students [Table 3]. In batch C, 12 questions were answered correctly by more students and 3 by fewer students compared with the preintervention batch (batch A).

In batch B, the improvement in the number of students answering correctly was very good in questions 16, 37, 41, and 50, where more than 10% more students answered correctly. In questions 20 and 49, 5% to 10% more students scored better, and an improvement of 5% or less was observed in questions 29, 33, and 34. Four questions (16, 37, 41, and 50) showed significant improvement over the preintervention group [Table 4].
Table 4 Comparison of performance of students, batch A with batches B and C after implementation of interventions in target areas


In batch C, more than 20% improvement was observed in three questions (4, 16, and 37), more than 10% improvement in six questions (12, 24, 34, 41, and 50), and less than 5% improvement in three questions (15, 33, and 49). The improvement in seven questions was statistically significant (P < 0.05) [Table 4].

The overall performance of batch A was better than that of both batches B and C [Table 4]. Overall improvement in the intervention questions was 60% (+9 questions) in batch B and 80% (+12 questions) in batch C compared with preintervention batch A.
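The 60% and 80% figures follow directly from the counts reported above: they are the share of the 15 intervention questions on which each batch improved. As a quick check:

```python
intervention_questions = 15   # areas targeted after batch A (Table 3)
improved = {"batch B": 9, "batch C": 12}

for batch, n in improved.items():
    share = n * 100 / intervention_questions
    print(f"{batch}: +{n} questions -> {share:.0f}% of intervention questions improved")
```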

There was a major shift of students from category III to category IV in batch B compared with batch A [Figure 1]. The performance of batch C was better than that of batch B. The majority of students remained in categories II and III. A positive shift from category IV to category III was observed, which is also attested by the increase in the lowest marks obtained by a student (from 17.8 in the preintervention batch to 24.9 in postintervention batch C). There was also a decrease in the number of questions answered incorrectly by more than 50% of students, from 15 in preintervention batch A to 8 in postintervention batch C.
Figure 1 Distribution of students according to their marks (batch A, 2014; batch B, 2015; and batch C, 2016).


Further analysis of the students’ total internal assessment showed that both batches B and C performed less optimally than batch A (P = 0.001 and 0.002, respectively) over the whole year [Table 5]. The average internal assessment of baseline batch A was 16.04, whereas it was 15.04 and 15.12 for batches B and C, respectively.
Table 5 Overall comparison of the results of batch A with batches B and C


Teaching time and the human resources available for teaching were the same for all batches: 6 faculty members and 11 residents. The only difference was the changes introduced to improve learning outcomes in the identified areas where students had not fared well.

  Discussion

A curriculum has been broadly defined as the totality of students’ experiences that occur in the educational process.[3] It is planned keeping in mind the learning objectives and goals of a particular course.[4] The effectiveness of any curriculum can be assessed by its impact on students’ learning, as measured by standard tools.[5] This needs to be carried out to identify how far the curriculum has fulfilled the defined learning goals and objectives in a particular subject. Poor performance in a particular area needs to be examined to identify possible reasons and to plan corrective measures to improve learning outcomes.

Although such an exercise should be part of all curricula, it is not commonly carried out in India. Most assessments are of students and how they are performing, with the aim of certifying them for promotion. That students may be performing poorly in an area because of the teaching-learning method or content is not commonly questioned. Even rarer are educational interventions planned to address the deficient subject areas.[6]

This study is the first of its kind in India to evaluate the impact of a pharmacology curriculum on medical undergraduates’ learning, implement educational interventions to improve deficient areas, and then reassess. Although the curriculum has recently been revised and changes in approach suggested, many of the basic concepts and domains of learning remain unchanged. The study remains relevant because it assesses an approach to making dynamic changes in the curriculum based on students’ learning outcomes.

The PBKSQ was developed for this purpose, keeping in mind the objectives and goals of teaching pharmacology to medical undergraduates. Every effort was made to make the questionnaire application based, involving the use of analytical skills. A broad consensus on the questionnaire was achieved among the teachers in the department.

The assessment of the curricula was questionnaire based because a batch of 250 students had to be assessed objectively. The students also had to be tested at the same time, as staggering the assessment over days would have required considerable time and a number of different questionnaires. The assessment had to be comprehensive and broad based, with representative questions covering the learning objectives of the whole subject. Hence, viva, long answer questions, and practical assessment were not possible.

This assessment was linked directly to the goals and objectives of teaching pharmacology to MBBS students as defined by the MCI. The questions were prepared with the learning objectives in mind. A comprehensive questionnaire-based summative assessment was carried out, although we also had the students’ whole-year internal assessment collected by regular methods. The internal assessment is based on a series of traditional tests carried out over the whole year, many at the discretion of individual faculty members, who may prefer to focus on areas of their choice. Because many teachers are involved in these assessments, subjective individual variability may creep in. Hence, for the purpose of our study, we needed an objective, comprehensive, feasible, and acceptable tool.[7],[8] The PBKSQ fulfilled this purpose completely. It has been said that locally developed questionnaires based on a local assessment of need may be better than standardized questionnaires.[1]

The assessment also had to be conducted across three different batches, as we were evaluating learning based on the whole curriculum; hence, it had to be at the end of the course. Batch A served as a baseline for analyzing performance; based on their performance, alternative educational interventions were developed and implemented with the next batches to see their impact.

The educational interventions implemented were based on the topics where deficiencies in students’ performance had been observed. New group exercises, tutorials, and emphasis in lectures were the methods adopted. In planning and implementing the interventions, we had to keep in mind the large number of students (250) and the time available. Hence, the best feasible methods were used: small group exercises and lectures with emphasis on certain concepts, although the limitations of lectures as a teaching method for large batches are well known.[9],[10],[11]

The observation that the performance of batches B and C improved in the majority of intervention topics (9 and 12 questions, respectively) indicates that the interventions may have helped the students. The lack of improvement in some topics, and the batches not performing better over the whole year, indicates that the department needs to further assess the reasons for the students’ performance.

Both batches B and C fared worse than the preintervention batch on different questions. This implies that students’ learning outcomes, as well as the reasons for those outcomes, must be assessed continuously.

The whole teaching environment, the number of teachers, the course, and the time spent were the same for all batches. The only differences were the teaching methods and exercises conducted with batches B and C, and the students themselves.

These observations show that assessment of students’ learning outcomes can be a catalyst for changing the content and methodology of teaching programs to improve students’ learning outcomes. They also show that merely changing teaching methodology and content may not improve all outcomes. Other student-specific variables may also affect learning outcomes.[12],[13],[14] The teaching-learning process in the department was also limited by a very disproportionate teacher-student ratio; thus, small batch teaching was not really small. It is well known that educational interventions targeted at small batches have a greater impact than the same interventions addressed to large batches of 250 students.[15]

Hence, departments must evaluate their curriculum holistically − content, method, human resources, learning environment, and student-specific reasons − and use this evaluation to make changes wherever required to improve students’ learning outcomes.

We have shown how assessment of students’ learning outcomes can be used to improve teaching and learning in a course with defined goals and objectives. The study is important because it addresses an issue not commonly examined, and it did so with a large batch of 250 students within the regular time frame, keeping the learning objectives of the course in mind.

The limitations of this study are that (i) only a direct method, based on student scores, was used to gather evidence on students’ learning outcomes; (ii) there was no control group within the same batch of students − both (i) and (ii) were due to limitations of time and feasibility, as the batch was very large, and for ethical reasons we could not withhold the intervention from a control group within a batch; and (iii) the improvement observed among the students could be the result of students’ innate abilities rather than the intervention.

  Conclusion

Objective assessment of the impact of a curriculum on students’ learning outcomes, linked to the learning goals and objectives of the course, can be used to modify teaching content and methods. This can improve educational outcomes. For overall improvement in students’ learning, teachers must also look for other factors that can impact learning.

Financial support and sponsorship

Nil.
Conflicts of interest

There are no conflicts of interest.

  References

1. Middle States Commission on Higher Education. Student Learning Assessment: Options and Resources. 2nd ed. Philadelphia, PA: Middle States Commission on Higher Education; 2007.
2. Medical Council of India. Regulations on Graduate Medical Education, 1997. p. 27. Available at: https://www.mciindia.org/CMS/rules-regulations/graduate-medical-education-regulations-1997 [Accessed January 31, 2019].
3. Kelly AV. The Curriculum: Theory and Practice. 6th ed. London: Sage Publications; 2009.
4. Harden RM. The integration ladder: a tool for curriculum planning and evaluation. Med Educ 2000;34:551-7.
5. Deno SL. Curriculum-based measurement: the emerging alternative. Except Child 1985;52:219-32.
6. Stephen RS. AMEE Guide No. 14: Outcome-based education: Part 2 − planning, implementing and evaluating a competency-based curriculum. Med Teacher 1999;21:15-22.
7. Awad SS, Liscum KR, Aoki N, Awad SH, Berger DH. Does the subjective evaluation of medical student surgical knowledge correlate with written and oral exam performance? J Surg Res 2002;104:36-9.
8. Farrell TM, Kohn GP, Owen SM, Meyers MO, Stewart RA, Meyer AA. Low correlation between subjective and objective measures of knowledge on surgery clerkships. J Am Coll Surg 2010;210:683-5.
9. Tiwari A, Lai P, So M, Yuen K. A comparison of the effects of problem-based learning and lecturing on the development of students’ critical thinking. Med Educ 2006;40:547-54.
10. Golden AS. Lecture skills in medical education. Indian J Pediatr 1989;56:29-34.
11. Letterie GS. Medical education as a science: the quality of evidence for computer-assisted instruction. Am J Obstet Gynecol 2003;188:849-53.
12. Trigwell K, Prosser M. Improving the quality of student learning: the influence of learning context and student approaches to learning on learning outcomes. High Educ 1991;22:251-66.
13. Trigwell K, Prosser M. Relating approaches to study and quality of learning outcomes at the course level. Br J Educ Psychol 1991;61:265-75.
14. Prosser M, Trigwell K. Student evaluations of teaching and courses: student learning approaches and outcomes as criteria of validity. Contemp Educ Psychol 1991;16:293-301.
15. Jalgaonkar S, Sarkate P, Tripathi R. Students’ perception about small group teaching techniques: role play method and case based learning in pharmacology. Educ Med J 2012;4:e13-8.



