This week I reviewed feedback from classmates and colleagues to improve my final project for EDUC-762, Assessments in E-Learning. My final project can be accessed here, or by clicking the adjacent image.
Being aware of my strengths and weaknesses, I focused my revisions on grammar, language, and the viewer experience. After looking at text for extended periods and revising for clarity, our brains tend to read what we meant to write and overlook errors in spelling, grammar, and punctuation. I therefore enlisted additional proofreaders to review the content, identify areas in need of revision, and suggest improvements in word choice. Their feedback suggested I revise specific terminology that may be unfamiliar to the end viewer, or add links to further explanations. In its finished state, my final artifact of learning displays my understanding and effective use of assessment and evaluation within a student-centered e-learning environment.
Soon enough I will be shifting from the student role to that of online facilitator and cybercoach. I will take this knowledge and learning and extend constructivist experiences to class participants. Assessments embedded throughout the learning continuum will offer formative feedback through teacher-student, student-student, and self-assessment channels. One medium of feedback that I can offer is monitoring and responding to student discussion posts.
I have been apprised of a new method by which to glean insight into student understanding from discussion posts and online discussion board assessments, termed discourse analysis. I am aware of, and have been interested in, learning analytics as another means of gaining insight into individual and class interaction. Learning analytics focuses on patterns of activity and timeliness, whereas discourse analysis focuses on sentence structure and how meaning is presented in language. Discourse analysis begins with critical reading. Kurland (2000) identifies key components of critical reading, including recognizing the text's purpose, tone, and persuasive elements as well as identifying bias, and describes both what to look for in a text and how to interpret the content you uncover. Discourse analysis ranges from simple to complex. Determining tone in communication through the categories of certainty, activity, optimism, commonality, and realism, coupled with identifying the communication styles of informant, supporter, and/or flamer, can help gauge tone and sociability as well as build and sustain a positive online computer-mediated discussion. These are relatively simple measures; to truly begin to identify student understanding and learning, more complex analysis is required. Two methods discussed by Marra, Moore, and Klimczak (2004) that incorporate detailed analysis techniques are Gunawardena, Lowe, and Anderson's (1997) Interaction Analysis Model (IAM) and Newman, Webb, and Cochrane's (1996) Critical Thinking Model (p. 25). Both models employ coding of text to identify knowledge construction. The IAM uses five phase/category indicators, whereas the Critical Thinking Model uses more than forty. I decided to review a thread I posted during Module 3 using these models.
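To make the simpler end of this spectrum concrete for myself, a rough automated tally of tone indicators in a post might look like the following Python sketch. The indicator keywords here are invented for illustration only; real discourse analysis of the kind described above relies on trained human coding, not keyword matching.

```python
# A minimal sketch of tallying tone-category indicators in a discussion post.
# The keyword lists are hypothetical examples, not a validated coding scheme.
import re
from collections import Counter

INDICATORS = {
    "certainty": ["clearly", "definitely", "certainly"],
    "optimism": ["hope", "exciting", "great"],
    "commonality": ["we", "our", "together"],
}

def code_post(text: str) -> Counter:
    """Count indicator hits per tone category in a single post."""
    words = re.findall(r"[a-z']+", text.lower())  # simple word tokenizer
    counts = Counter()
    for category, keywords in INDICATORS.items():
        counts[category] = sum(words.count(k) for k in keywords)
    return counts

post = "We can definitely build our understanding together; I hope this is exciting."
print(code_post(post))
```

A tally like this could flag posts for closer human review, but it cannot judge meaning in context, which is why the more complex coded models below require trained coders.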
I found both the IAM and the Critical Thinking Model analyses to be very challenging, and ones for which I would need further instruction before I could feel confident coding a future student's post. Given my limited knowledge of these analysis techniques, I chose to follow the Tallent-Runnels (2004) discourse analysis model, which uses Bloom's taxonomy to identify levels of critical thought, as outlined by McFerrin and Christensen (2013) in Developing a Positive Asynchronous Online Discussion Forum. Through this activity I determined that my posts included analysis, evaluation, application, and metacognition, which was above the normative results of application and analysis found by Tallent-Runnels (2004). This offered insight into the higher-level cognitive thinking I was communicating. Tallent-Runnels (2004) suggested that a rubric for evaluating online discussions could serve two purposes: 1) an evaluation rubric for the instructor, and 2) an instructional guide for class participants (McFerrin & Christensen, 2013). Brandon (2004) concurs that a rubric can serve as a tool to help students discover how to reflect in action. Through review of these models, I will begin my journey in using learning analytics to observe frequency, timeliness, and breadth of participation; establish collaborative netiquette rules of communication; stay cognizant of social tone throughout posts; and develop and share a discourse analysis rubric (including participant interaction, audience coding, and discourse function coding) at the beginning of the course as both an evaluation tool and an instructional guide, to encourage students to extend their thinking within our computer-mediated communications and discussions.

Citations

Brandon, B. (2004, June 29). Applying instructional systems processes to constructivist learning environments. The eLearning Developers' Journal. Retrieved November 1, 2015, from http://www.learningsolutionsmag.com/articles/296/applying-instructional-systems-processes-to-constructivist-learning-environments?_ga=1.215383127.1758579424.1449102142

Gunawardena, C. N., Lowe, C. A., & Anderson, T. (1997). Analysis of a global online debate and the development of an interaction analysis model for examining social construction of knowledge in computer conferencing. Journal of Educational Computing Research, 17(4), 397-431.

Kurland, D. (2000). What is critical reading? Retrieved December 6, 2015, from http://www.criticalreading.com/critical_reading.htm

Marra, R., Moore, J., & Klimczak, A. (2004). Content analysis of online discussion forums: A comparative analysis of protocols. ETR&D, 52(2), 23-40. ISSN 1042-1629.

McFerrin, K., & Christensen, P. (2013). Developing a positive asynchronous online discussion forum. In R. McBride & M. Searson (Eds.), Proceedings of Society for Information Technology & Teacher Education International Conference 2013 (pp. 769-774). Chesapeake, VA: Association for the Advancement of Computing in Education (AACE).

Newman, D. R., Webb, B., & Cochrane, C. (1996). A content analysis method to measure critical thinking in face-to-face and computer supported group learning. Journal of the American Society for Information Science, 48(6), 484-495.

Tallent-Runnels, M. K. (2004). Raising the bar: Encouraging high level thinking in online discussion forums. Roeper Review.
I will share my final thoughts with an AudioBoom recording. As I have chosen to incorporate within my blog a variety of communication tools I had never used before, this tool was yet another new discovery during this final week. I thought it would be an appropriate closing to my learning blog.
AudioBoom: Account set-up was simple and free. Recording was a 1-2-3 step process with quick options to save, upload, and embed. This Web 2.0 tool could be useful for building background knowledge prior to course activities or for sharing a brief overview. I have also provided a transcript of the audio recording to provide access for those with hearing disabilities.