Our PLN delivered yet again as we collectively developed our Annotated Bibliography for this semester’s F431. I thought there was a great diversity of topics, and yet there were some common themes: pedagogy, the history of tool development in support of education, and a willingness to share results that did not always show promise – the concept of agile testing to see whether a technique even works before spending more time and money on implementing a new strategy.
I had three primary takeaways as a result of reviewing this collection of articles.
- I discovered a need to better understand what effect size is and why it is used to evaluate whether a given educational approach has been observed to make a change. This article helped illuminate the answer. While the article focused on the nuances of medical education, it provided an excellent explanation of both the p-value and the effect size. The p-value looks strictly at whether the results from a particular data set are statistically significant. As stated in the article by Sullivan and Feinn, “Statistical significance is the probability that the observed difference between two groups is due to chance”. The effect size is independent of the p-value and, aptly named, measures the magnitude of the effect of the change. So one could have a statistically significant p-value and still see no meaningful change in the desired outcome of the educational intervention that was administered. I also learned that effect size is measured using different indices – i.e., it does not rely on a single statistical test, but leverages different approaches depending on how the experiment was constructed. That means that my comments on Melissa’s article may not be accurate, since I didn’t yet understand that there were multiple paths to this evaluation. More to learn in this space going forward.
- I appreciated the integration of pedagogy with the tools themselves – not just a focus on the tools, which are not necessarily the answer on their own. It was interesting to observe that tool selection and pedagogy did not seem to depend on each other: essentially, a tool could be identified to support the pedagogical approach of choice. I had a chance to further explore how gamification is influencing the development of edtech tools, as well as how to design studies to evaluate the impact of these tools.
- The focus on student-centered learning spanned topics including Social Emotional Learning (SEL), support for learners who are differently-abled, and the differences between using apps, artificial intelligence, and human intervention. I particularly liked this article on how edtech tools are redefining literacy to be more inclusive than just reading – exploring the “effect” that knowing how to read has on our other skills and knowledge.
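The p-value versus effect-size distinction from the first takeaway can be sketched in a few lines of Python. This is a minimal illustration with made-up scores (the groups, score values, and sample size are all hypothetical): with a large enough sample, even a tiny difference between groups yields a “significant” t statistic, while Cohen’s d, one common effect-size index, stays small.

```python
import math

def cohens_d(group_a, group_b):
    """Cohen's d: the difference in means divided by the pooled
    standard deviation (one common effect-size index)."""
    na, nb = len(group_a), len(group_b)
    mean_a = sum(group_a) / na
    mean_b = sum(group_b) / nb
    var_a = sum((x - mean_a) ** 2 for x in group_a) / (na - 1)
    var_b = sum((x - mean_b) ** 2 for x in group_b) / (nb - 1)
    pooled_sd = math.sqrt(((na - 1) * var_a + (nb - 1) * var_b) / (na + nb - 2))
    return (mean_a - mean_b) / pooled_sd

def t_statistic(d, n_per_group):
    """For two equal-sized groups, the two-sample t statistic
    equals d * sqrt(n / 2), so it grows with sample size even
    when the effect size d is fixed."""
    return d * math.sqrt(n_per_group / 2)

# Hypothetical test scores: the "intervention" group outscores
# the control group by only half a point on average.
control = [70 + (i % 10) for i in range(1000)]
treated = [70.5 + (i % 10) for i in range(1000)]

d = cohens_d(treated, control)
t = t_statistic(d, 1000)
print(f"Cohen's d = {d:.2f} (a small effect)")
print(f"t statistic = {t:.1f} (large n makes a tiny difference 'significant')")
```

With 1,000 students per group, the t statistic lands near 3.9 (statistically significant at conventional thresholds), yet Cohen’s d is only about 0.17 – exactly the “significant but negligible” scenario the article warns about.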
As I was looking for an image for this post, I googled “Edtech” and came across this article, which focuses on where edtech might go in the future. It caused me to think about the business of education, how brick-and-mortar universities may evolve after the Covid era, and what federal regulatory guidance might be helpful in this new environment. Perhaps the real recognition of edtech will come when the word is finally accepted into Merriam-Webster’s dictionary and when the idealized implementation of edtech becomes a reality for all. But that is a post for another day.
Ultimately, I left this exercise with a better understanding of the importance of having teachers in the educational process. As I’ve stated in previous posts, I have pursued this degree solely for the purpose of becoming a better teacher. I found the insights provided by veteran educational researchers and providers valuable, and reviewing their work is a practice I should continue as part of my baseline routine. What a great activity to enhance the PLN and PLE!