Effective classroom learning does not happen by accident, so I am intentional about the way I prepare, implement, and reflect on my instructional practices. While this has been my approach throughout my teaching career, the importance of informed decision-making was highlighted when our school transitioned to a remote learning model in March 2020 in response to COVID-19. What began as a two-week hiatus, during which teachers were instructed to provide ‘maintenance learning opportunities’ without introducing new content, quickly became an effort to develop the infrastructure for teaching and learning remotely for the remainder of the year.
As a physics teacher, I am aware that research has long supported the idea that students learn science best through personal exploration, a finding that continues to be borne out by current studies. In a typical school year, I regularly integrate lab explorations into each of my courses. A significant portion of class time is dedicated to student lab experience, and when time or resources are limited, I still prefer to show students physical phenomena through demonstration rather than simply discussing them theoretically. This became a significant challenge in a remote setting: students did not have access to lab equipment, and I did not have access to the physics demonstration supply closet in my classroom.
Being in the midst of a global pandemic did significantly lower the district’s expectations for what students would accomplish during remote learning. That said, I did not want my students to be deprived of the opportunity to explore some exciting physics topics in a way that was meaningful and resulted in real learning. To preserve the “demonstration” component of my instruction, I developed a series of video lectures, many of which model various phenomena through simulations or footage of live examples. While it would have been easier to assign a relevant chapter in the textbook, I think ‘seeing is believing’ when it comes to physics, and I wanted my students to see as much as possible.
Artifact 1: Lecture videos incorporating demonstrations
The video below is an example of a lesson that I created and distributed through Edpuzzle, which enabled me to monitor student engagement and understanding by analyzing responses to prompts embedded throughout the video. In this lesson, I utilize an interactive color wheel and sample artwork from Carnovsky, neither of which I created; the video lesson and the resulting Edpuzzle are the artifacts I produced. Unfortunately, the Edpuzzle itself cannot be shared outside of my school network, but this provides an example of my efforts to incorporate visual demonstrations into my lessons. As students watched this lesson, they scored very well on the progress check questions I had embedded in the Edpuzzle. At the conclusion of the unit, the majority performed well on the summative assessment, indicating that this approach, while not a replacement for in-person instruction, was still a useful way to communicate with students about physical phenomena.
Artifact 2: Digital Lab Guide
To provide a more authentic ‘science course experience’, I needed to develop an opportunity for lab exploration, and I was able to do so using a variety of online simulations, such as this PhET simulation illustrating Faraday’s Law. In a traditional lab, I would be present to guide students through lab protocols, but because students were working asynchronously most of the time, that was not an option in the early remote learning model. To support student exploration, I developed a PhET Magnetism Lab Activity Google Form, which served as both a guide for and an assessment of student observations. I embedded screenshots and descriptions of how the simulation worked, then asked students to make observations and draw conclusions. While this is not a completely inquiry-based lab in which students define the question and methods of exploration, I was proud of the opportunity it created for students to engage with the science in as “hands on” a manner as possible in this context.
Artifact 3: Analysis of Responses to Student Survey
To keep up with current physics pedagogy, find inspiration for new lab activities, and strengthen my understanding of course standards, I am a member of PrettyGoodPhysics (PGP), a community of physics teachers who share resources, lesson ideas, and reflections on effective teaching strategies. This network provides a rich environment for professional discourse that naturally blends technology, pedagogy, and content. While outside input is critical to making informed instructional decisions, I believe that incorporating student feedback is an equally essential component of informed instruction.

During the remote learning period of the 2019-2020 school year, I decided to chunk content into two-week units, each of which allowed students to engage with a new topic through lectures, problem solving, and lab explorations. At the end of each two-week unit, I assigned a short assessment. These assessments focused primarily on measuring content knowledge and skills, but I included a short questionnaire on each quiz so that I could better gauge student perceptions of the workload, the content I was producing, and my use of technology. I weighed student comments very heavily, and as my analysis of the results below shows, I was able to modify my instruction to better fit my students’ needs. After the third unit, I had a good measure of reasonable expectations for students in this unprecedented time, as comments shifted away from what I could do to improve the experience and toward elements of the content they found interesting. In later weeks, many of the responses simply said something to the effect of “this was good” or “I liked this week”, so I truncated my analysis in the sample below. The document includes the questionnaire that was added to each unit assessment, an analysis of student ratings, and a summary of the changes I made in response to what students shared. I shared this document through Google Classroom so that students could see the accumulated data from all of my classes and how I responded to trends in their requests. Anecdotally, several students shared that this practice of gathering student feedback enabled some of their most successful experiences in remote learning. Administrators in my district have commented on several occasions on the effectiveness of my efforts to maintain a quality student experience, and I attribute my success in this arena largely to my responsiveness to student feedback.
