Pedagogy

Team-based learning app


Team-Based Learning (TBL) is a method of small-group learning that encourages student collaboration and engagement.

Prior to the group discussion, students are assigned preparatory readings to ensure they come to class with a foundational understanding of the material. Students then individually complete an Individual Readiness Assurance Test (iRAT) followed by the same test taken with their team, the Team Readiness Assurance Test (tRAT). The tests typically consist of multiple-choice questions.

The no-tech version of tRAT involves having students use a “scratch-off” sheet to self-score their group test, facilitating immediate feedback and discussion among team members. An appeal can be made by teams to challenge questions they answered incorrectly. The process promotes critical thinking and deeper understanding of the material.

To conclude, the teacher then focuses on concepts that students found challenging during the assessments.

Since I had been experimenting with ChatGPT to make simple educational apps, I came up with an online tRAT quiz that replaces the “scratch-off” sheet. To modify the quiz options, use this. The teacher will have to edit the CSV file with the correct option for each question. The HTML and CSV files can then be uploaded to a web host such as Amazon S3 or, in the case of Singapore teachers, the Student Learning Space.
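For illustration, and assuming the two-column layout described in the prompt below (question number in the first column, correct option in the second, no header row), the answer file might look something like this:

```
1,C
2,A
3,D
4,B
```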

For those who are keen to experiment with using ChatGPT 3.5 to generate code, these are the prompts I used. Do note that your results may differ, and some customisation or refinement of the prompts might be needed. A stripped-down sketch of the scoring logic described in the prompt is included after the list.

Provide the code for the following in a single html file:

  1. Create a website for users to key in their answers to a tRAT quiz for checking.
  2. The answers will be referenced from a csv file containing the question number in the first column and the answer to the multiple choice question (A, B, C or D) in the second column.
  3. The quiz will display the question number and 4 options: A, B, C and D. The user will choose the answer from the 4 options.
  4. If the first option is the correct answer, the letter will become light green and 4 marks will be added to the overall score. If it is the wrong answer, the letter will become dark grey and no marks will be added.
  5. The user will get to try a second time for the same question. If the second option is the correct answer, the letter will become light green and 2 marks will be added to the overall score. If it is the wrong answer, the letter will become dark grey and no marks will be added.
  6. The user will get to try a third time for the same question. If the third option is the correct answer, the letter will become light green and 1 mark will be added to the overall score. If it is the wrong answer, the letter will become dark grey and no marks will be added. There will not be a fourth time for the same question.
  7. Getting it correct on the second try should only get 2 marks added. On the third try, only 1 mark will be added if correct.
  8. The total score will be shown at the bottom of the page.
  9. There will be two other buttons to move to the next question or back to the previous question. Don’t jump to the next question automatically.
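
To make the scoring rules concrete, here is a minimal sketch of just that logic in plain JavaScript. This is not the file ChatGPT produced; the function and variable names are my own, and the answer key is assumed to have already been read in from the CSV.

```javascript
// Minimal sketch of the tRAT scoring rules in the prompt above:
// 4 marks on the first attempt, 2 on the second, 1 on the third, none after that.
const MARKS_BY_ATTEMPT = [4, 2, 1];

// answerKey is assumed to come from the CSV, e.g. { 1: "C", 2: "A" }.
function createQuizState(answerKey) {
  return {
    answerKey,      // correct option per question number
    attempts: {},   // attempts used per question number
    score: 0,       // running total shown at the bottom of the page
  };
}

// Returns "correct", "wrong" or "locked" and updates the running score.
function submitAnswer(state, questionNumber, chosenOption) {
  const used = state.attempts[questionNumber] || 0;
  if (used >= MARKS_BY_ATTEMPT.length) return "locked"; // no fourth attempt

  state.attempts[questionNumber] = used + 1;
  if (chosenOption === state.answerKey[questionNumber]) {
    state.score += MARKS_BY_ATTEMPT[used];
    state.attempts[questionNumber] = MARKS_BY_ATTEMPT.length; // lock further tries
    return "correct";
  }
  return "wrong";
}

// Example: a team gets question 1 wrong once, then right (earning 2 marks).
const state = createQuizState({ 1: "C", 2: "A" });
console.log(submitAnswer(state, 1, "B"), state.score); // wrong 0
console.log(submitAnswer(state, 1, "C"), state.score); // correct 2
```

The generated page would wrap logic like this in the option buttons, the colour changes, the next/previous navigation and the score display described in the prompt.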

Edit: I have also generated a simple webpage for the iRAT assessment tool.

Team-Based Learning with Google Form

Team-based learning is a pedagogical approach that facilitates learning through individual testing and group collaboration. Students are first given time to work on answers individually in the Individual Readiness Assurance Test (iRAT). They then work in teams to discuss the same problems, arrive at a consensus, and check their answers against a pre-filled MCQ scratch card that reveals whether their selected answer is correct, giving immediate feedback. This is known as the Team Readiness Assurance Test (tRAT). If they get an answer wrong, teams get a chance either to appeal their answer or to try the same question again. A clarification session then ensues, with the teacher focusing on the questions that teams had difficulty with.

Schools that want to use Team-Based Learning might either subscribe to platforms that allow repeated attempts, such as InteDashboard, or purchase the Immediate Feedback Assessment Technique (IF-AT) scratch cards. There are also some free options, such as the one from Cosma Gottardi.

However, I was wondering if a simple version could be built with a Google Form, using quiz mode together with branching options, to achieve the same results. I tested it out last night and came up with this proof of concept. It seems possible and easy to edit.
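
For readers who prefer to see the structure spelled out, here is a rough Apps Script sketch of how one such question, with a lower-scoring second attempt, could be generated programmatically. The section titles, point values and correct option are my own assumptions, not what is in the proof of concept.

```javascript
// Rough sketch: one tRAT-style question in a Google Form, built with Apps Script.
// Quiz mode supplies the scoring; section navigation supplies the second attempt.
function buildSampleTratQuestion() {
  const form = FormApp.create('tRAT proof of concept (sketch)');
  form.setIsQuiz(true);

  // Section 1: first attempt, notionally worth 4 marks.
  const firstTry = form.addMultipleChoiceItem()
      .setTitle('Q1 – first attempt (4 marks)')
      .setPoints(4);

  // Section 2: second attempt at the same question, worth fewer marks.
  const retrySection = form.addPageBreakItem().setTitle('Q1 – second attempt');
  const secondTry = form.addMultipleChoiceItem()
      .setTitle('Q1 – second attempt (2 marks)')
      .setPoints(2);

  // Section 3: the next question would start here.
  const nextSection = form.addPageBreakItem().setTitle('Q2');

  // Branching: the (assumed) correct option C skips the retry section,
  // while wrong options drop the team into the second attempt.
  firstTry.setChoices([
    firstTry.createChoice('A', retrySection),
    firstTry.createChoice('B', retrySection),
    firstTry.createChoice('C', nextSection),
    firstTry.createChoice('D', retrySection),
  ]);
  // Every choice on the second attempt continues to Q2.
  secondTry.setChoices(['A', 'B', 'C', 'D'].map(
      opt => secondTry.createChoice(opt, nextSection)));

  // Note: createChoice() takes either a navigation target or an isCorrect flag,
  // not both, so correct answers for auto-grading still have to be marked in
  // the Forms editor.
}
```

In the form editor itself, the same effect comes from quiz mode plus the “Go to section based on answer” setting, so no scripting is needed to try the idea out.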

I created a template for anyone who is keen to try:

https://docs.google.com/forms/d/1l2msnjt2ioSWcmz4GpQWgm1_CoFBRQDmBOwZQopnefI/edit?usp=sharing

Reflection on the SLS Pedagogical Scaffold v2.0

The Student Learning Space (SLS) Pedagogical Scaffold was designed and rolled out just as I joined the Learning Partnerships in Education Branch. I was involved in some of the training sessions, such as the SLS Design Challenge in 2018, where it was shared with all the schools in Singapore. Four years on, I am about to co-facilitate training with it again – this time with version 2.0, and with our school’s teachers on its use.

The SLS Pedagogical Scaffold

Here are some quick thoughts that I wanted to jot down before that happens and that I will try to refine along the way.

  1. What is the SLS PS? It includes a series of questions intended to help both beginning and experienced teachers think about how “active learning with technology” can be achieved. Breaking that phrase down: active learning is where students are actively engaged in sense-making, as opposed to didactic teaching where the only input is auditory or visual. It is done with technology only if technology serves us well, which means technology is optional. Despite the name, the SLS PS need not be done with SLS either! Instead, it encourages us to consider SLS in light of all the other tools out there to see which best serves our purpose.
  2. One thing we kept emphasising was to make success criteria explicit:
    1. To promote metacognition (i.e. students will also know whether they have learnt).
    2. It helps us not to over-plan: squeezing too many SCs into one lesson would make it difficult to assess whether students are learning.
    3. SCs are more than just SIOs from the syllabus document; they are specific performance goals that can be observed by both learner and instructor.
  3. Even though the SLS PS’s Learning Experiences are named after Diana Laurillard’s six LEs, the “LE”s in the SLS PS work towards a lesson/unit/topical plan, each with a “flavour” or main mode of teaching. The components of this plan would include different combinations of the following types of activities, meant to:
    1. Activate Learning
    2. Promote Thinking and Discussion
    3. Facilitate Demonstration of Learning
    4. Monitor and Provide Feedback
  4. Therefore, a learning activity found inside an “Acquisition” lesson plan can be a “Discussion” or “Practice” activity as well.
  5. The Design Map gives everyone a visual overview of the timeline, types of activities, and level of interaction while highlighting the technology or resources used. This is mainly used for lesson sharing.
  6. One main purpose of the PS is to allow us to consider the key applications of technology, namely:
    1. Personalisation through fostering student agency, giving choice in the learning goals, process and pace through digital resources
    2. Differentiation through harnessing the interactivity and multimodal features of digital technologies to differentiate the
      1. nature of content,
      2. learning processes and
      3. products of learning
    3. Conceptual Change through multimodal representations of abstract concepts, allowing students to discern critical features and patterns, and infer generalisations.
    4. Scaffolding in the digital learning environment to support thinking and guide interactions between students, teachers and content.
    5. Learning Together (collaborative learning) by integrating supports for students to collectively improve their ideas over time by sharing, building on, organising, and synthesising their knowledge and developing understandings.
    6. Metacognition by integrating automated supports for students to make sense of and regulate their learning activities and group knowledge, and articulate their reflection through multiple modes.
    7. Assessment for Learning by capturing and analysing assessment data to provide student- or group-targeted feedback about their level of understanding, learning processes and progress, as well as resources for students to access an expert’s conceptual organisation and modulate their actions.