Saturday, April 27, 2019

First Class

This image shows three sticky notes labelled "no tech," "low tech," and "high tech," with a quote at the bottom: "technology for learning."


Assistive Technology: Access to Literacy

Today was our first class and I am happy to be back learning about assistive technology. This is my second course, having completed the first in the fall. Learning about assistive technology this year has been a HUGE learning curve for me and has been such an important part of my job teaching in a learning centre. After each class, I always feel invigorated and engaged, wanting to share all I've learned with my colleagues!

Three takeaways from today's class for me:

1. The idea that "21st Century Literacy" goes beyond just traditional reading and writing. Today literacy encompasses so much more, and technology plays a huge role in being literate. Digital literacy is essential for participating in many aspects of society today. There is no turning back!

2. I think having a notion of presumed competence in students (and people in general) is a major part of a universally designed classroom, or of any approach to presenting material so that all students can access it. Presumed competence is the idea that all individuals have the ability and the capacity to learn. We must provide students with tools to demonstrate their knowledge, and designing programs that allow everyone to participate is key.


This is a picture of the app Chomp.

3. For Assignment #1, Amanda and I were tasked with defining "Dual Coding" and showcasing this definition in the app Chomp. I had never used the app before, and I found it easy to navigate and fun to explore. I can see students having a lot of fun with it, and I can see how it could be incorporated into a way for them to present what they've learned.

This picture shows an outline of a human head with an image of a dog on one side and the word "dog" on the other.
Dual coding theory is a theory of cognition developed by Allan Paivio.


It theorizes that there are two ways a person can remember information: verbal associations and visual imagery. Verbal and visual information are processed differently, so when both are used, the mental codes can be acted upon, stored, and retrieved for later use.

People process information in three main ways:
1.  Representational—through verbal or non-verbal representation (i.e., a person is shown a card with either a written word or just a picture in order to process the information)

2.  Referential—the activation of the verbal system by the nonverbal, or vice versa (i.e., you are shown a picture of a dog and remember how to spell the word "dog")

3.  Associative processing—the activation of representations within the same verbal or nonverbal system (i.e., hearing the word "dog" calls up related words like "puppy" or "leash")

People may require one or all of these codes in order to remember the information being taught. I feel like this theory makes a lot of sense, knowing that brain processing is such a vital piece of how students solidify what they've learned. We know that students need to be shown material and information in multiple ways in order to really be comfortable with it and learn it in depth.

---> One more takeaway that really stood out to me today: we were tasked with reading a passage from a book and asked to write out, step by step, what we were doing as we read the excerpt...

Task analysis: 
Why do we read?
-For information, enjoyment or entertainment, to convey information

Task analysis of reading:

1. Scan the text visually to see the length of the text and the size of the text
2. Start with the first word seen; begin reading from the left and follow the words along the line to the right of the page
3. Decode the text and use reading skills to take in the information
4. Breathe deeply through the nose and try to concentrate on the words
5. Visualize the scene described in the text

It's hard to do a task analysis in such depth because reading is so innate, or seems to be. It starts at birth, and by 9 months to a year, receptive language skills are already developing. We have to engage major executive functions to attend to tasks and maintain attention, along with eye gaze, neuro-muscular functions, proprioception, self-regulation…it's all about brain processing.

We need to be able to decode text, with visual processing and auditory processing working together.
Language knowledge, prior experience, employing short-term and long-term memory…

--If people don’t have vocab or language exposure, reading and writing can be so difficult for students to manage



Reflect on Understood.org
I think this site would be a GREAT resource for teachers and parents to use to better understand struggling learners in their classrooms and/or families. My immediate thought was to share it with the staff at my school; it would be so good to present at a PD session and have teachers really think about UDL principles in their planning and lesson design.

It was eye-opening to watch the videos, but then to be put to the "test" and complete various activities that simulate how these learners feel when presented with academic tasks in their day-to-day school learning. I was so stressed trying to complete the typing activity, and I know that this was just a glimpse of the anxiety that a struggling learner would feel. It really allowed me to see these types of learning activities from another perspective.

Here's the link: https://www.understood.org/

Finally, after having learned about Universal Design for Learning and all the ways in which we can make material accessible and attainable for ALL, I think this is probably one of the most important and overlooked areas in education right now. We must design lessons while considering ALL students in our class. How can each student access the material? Will everyone produce the exact same product to show their learning? The first question seems quite common-sensical and the second quite absurd, and yet this is how many teachers present material to their students, expecting that all students will produce the same work!



This is an image of UDL practices pictured in a circle, with aspects of UDL design such as generating ideas, student engagement, sketch design, feedback, and reflection.


Review of accessibility features on my devices:

I use a MacBook and an iPhone XR for most of my day-to-day technology. While I don't have a need for most of the accessibility options to be enabled on my devices, it is pretty amazing to see all of the different features that are now embedded in the tools we use so much every day. I think some of the features available, like Switch Control or AssistiveTouch, open many doors for those with limited mobility. The hearing features, such as the LED flash for alerts and phone noise cancellation (a feature I have switched on), would benefit people using hearing aids. Guided Access is a huge benefit for parents and educators alike, who with this feature are able to control time and access to whichever app they would like the child/student to use. This is especially helpful if you are trying to assess how the child is interacting with the device, if they are being assessed for possible AAC device use in the future, or if they are working through an app that assesses knowledge on a particular subject and collects data.

I have a couple of scanner apps installed on my phone, and I am going to try to use them this week to upload some hard-copy sheets that I know will be handed out to some of the students on my caseload, so that they can access these materials and have them read aloud on their Chromebooks. I've been encouraging a classroom teacher whom I work closely with to become more digitized with her handouts, and she has been trying, but I feel like a lot of the time it's after the fact. The students will come down to my classroom with sheet in hand, no Chromebook or device in sight, needing assistance to get through the sheet and answer the questions or complete the activity. I feel like this is something that teachers have yet to really implement at the design level of their lessons. I'm hoping to gain more knowledge and familiarity myself so that I can pass along what I know and, in turn, better support students.
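As a rough sketch of what those OCR scanner apps are doing behind the scenes (and why a plain photo of a worksheet isn't enough for read-alouds), here is a minimal example in Python. It assumes the pytesseract and pyttsx3 libraries are installed, and "handout.png" is just a hypothetical scanned page; this is only an illustration, not how any particular app actually works.

    from PIL import Image          # open the scanned image
    import pytesseract             # OCR: turn the image into machine-readable text
    import pyttsx3                 # offline text-to-speech engine

    # "handout.png" stands in for a scan of a hard-copy worksheet.
    scanned_page = Image.open("handout.png")

    # Without this OCR step, the scan is just a picture and there is
    # nothing for a read-aloud tool to speak. OCR extracts the words.
    text = pytesseract.image_to_string(scanned_page)

    # A text-to-speech engine can then read the recognized text aloud,
    # much like the read-aloud tools on the students' Chromebooks.
    engine = pyttsx3.init()
    engine.say(text)
    engine.runAndWait()

The key point is the OCR step in the middle: without it, a scan is only an image, which is exactly the difference between a plain scanner app and an OCR scanning app.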




3 comments:

  1. Thanks Kate for all your thoughts. Just a note re the task analysis of reading: yes, it is quite a complex task. It is actually language that is innate; reading needs to be taught. Also, just wanted to make sure the scanning apps on your device are OCR scanning apps (Optical Character Recognition), otherwise they will not offer the same kind of capabilities when trying to provide read-alouds. Also, check the photocopiers at your school...they just might have an option to OCR scan photocopies! Then those copies could just be "popped" :):) up to Showbie or Google Classroom for students to access on their devices IN their classrooms.

  2. Kate,
    I wish more teachers were using the UDL approach. It would be so helpful when it comes to our students and their needs for success. It could be as simple as having an OCR digital copy! I think I will be using understood.org in future EPA PD days because it is so eye-opening and would help them understand what some students go through every day.

  3. Kate, I found your analysis of dual coding really interesting! Just today I had a student trying to infer a word in a text and she was saying "I know what it means... but I just can't say it" -- the word was gerbil. So on some level, by looking at the picture, she knew it was a small pet and she wanted to say it was something like a hamster but she just couldn't get the word out!

    So odd how sometimes our brains know what something is visually without having the words. Since we rely on language so heavily, having 'thoughts' without words can feel very frustrating!

