In 2016, I was profiled on this site's Community Member Spotlight. At the time, I worked at a different institution, logged in using a different account, and sported a different haircut. Recently, I moved on to a different institution and wanted to get active on the Bb Community again. I thought that a blog post on my "robot ear" and accessibility would be a good place to start.
The backstory: One fine October evening in 2014, I suddenly (and I mean instantaneously) went deaf in my right ear; all sound input was replaced by a horrible loud ringing (a condition known as tinnitus). I assumed it would correct itself, and I certainly wasn't going to the ER for hearing loss at 9PM on a Wednesday. The next morning I woke up, still half deaf and still ringing, but now with the added bonus of a terrible case of vertigo. (For those who don't know, vertigo is the sensation that the room is moving even though you know it is not. To simulate vertigo, stand up and spin around as fast as you can for about 60 seconds. Ok. That feeling? The dizziness? The inability to focus one's eyes? That's sort of what vertigo feels like, except vertigo doesn't go away so quickly.) My family doctor referred me to an ENT who performed some tests and determined that, for reasons that will never be known, the vestibulocochlear nerve in the right side of my head was no longer functioning. That nerve carries two important signals (sound and balance) from the ear into the brain. When the nerve went kaput, so too did my sense of balance. My brain, failing to receive a signal from my right ear, assumed that my body was falling... all..... the...... time. The vertigo subsided (for the most part) after about two years.
Meanwhile, although the hearing loss was irreversible, I was able to get a cochlear implant. A cochlear implant isn't something they usually give people with single-sided hearing loss, but because the tinnitus (the ringing in my deaf ear) was so intense (I could hardly sleep, it was so loud), they decided to give it a shot (it helped). After the surgery, I had to wait a month, and then they turned on my new robot ear.
So what's it like??
The long and short of it is that I wear a receiver/processor on my deaf ear. The device looks sort of like a Bluetooth earpiece, but instead of playing sound into my ear, the processor connects to an implant inside my head by way of a magnet. The processor receives audio input through a couple of microphones, converts the input into a digital signal, and transmits the signal through the magnet into the implant, which sends the data into my brain as something that resembles sound. (Yeah, it's exactly as sci-fi as it sounds, and while my brain is receiving the "sounds," they don't sound much like the sounds we hear naturally. Cochlear "sound" is garbled, robotic, and takes a lot of concentration and practice to understand.)
Although the robot ear delivers "sound" into my brain, the cybernetic appliance doesn't distinguish between what sound is important and what sound is irrelevant. To the Cochlear implant, all sounds are created equal. A human voice and the rattle of the air conditioner are of equal import to the robot ear. It is just the messenger. It makes no value judgments and simply delivers audio.
This can be a problem, especially when I'm in a room where lots of people are talking or where there is just a lot of background noise. Usually, in large group settings, while everyone else is talking and milling about, I'm trying to find a corner where I can escape the cacophony. So, if you see me at BbWorld, please don't be offended if I can't follow along in conversation. Sorry.
Okay. Cool story, but what does this have to do with teaching and learning and technology???
Ahhhh yes, the point. The ever elusive part of anything I say or write. So I say all of that stuff to provide a framework for a conversation that will go on for the rest of my life. I work in web-based Instructional Technology. That's my playground and my workplace. Over the years, I've done course design and development, content development, programming, process documentation, project management, end-user support, etc., etc., etc., and in everything I've done, I've always tried to keep accessibility in mind because, you know, that Section 508 box has to have a checkmark. Or at least I thought I did. The truth is, I had no idea what I was talking about.
Accessible content is an afterthought until you're the one needing it.
Speaking as one who now requires accommodations and accessible content, the curtain has been lifted and I can see, very clearly, just what a poor job we have done in designing and creating this digital ecosystem. Our systems and their content were designed for use by folks who have full sight and hearing. Efforts to address accessibility issues are typically slapped on after the fact, if they're slapped on at all.
Ultimately, the "slapped on after the fact" approaches rarely remove any barriers. They just reframe (or more accurately, reskin) the existing barriers. Accessibility options shouldn't need to be enabled. They should be turned on by default. By forcing end-users who need assistance to locate, then enable some feature or function, we further separate them from the content they're trying to access. Then, there is an additional learning curve, and because the accessibility features are unfamiliar to most others, the learner has fewer options when asking for assistance. Students aren't in school to learn how to use a platform. The platform should be intuitive and accommodating by default.
The Bb Learn environment isn't the only tool that puts barriers in place, but since this is a Blackboard community, here are a couple of examples from the Blackboard ecosystem...
- The webinar is a revolutionary innovation that allows us to bridge the distance and pool our collective expertise. It is an amazing tool, but here's the reality. If the audio connection is just a little bit bad, then I'm going to have a very hard time hearing it. If someone with no impediments is struggling to make out a word here and there, chances are that those of us who require accommodations are unable to make out anything. To make things worse, if the speaker isn't live and on camera, then I can't even try lipreading to get the gist of what's going on. I'm not just picking on Collaborate here; this is a problem endemic to ANY webinar system. (Yes, I know, transcripts are sometimes made available, but often too long after the fact. If we're dealing with time-sensitive information that needs a quick decision and action, I'm at a distinct disadvantage.)
- The audio and video feedback tool is great (in theory), but if the connection wasn't strong when the media was recorded, or if the person didn't speak clearly, I have absolutely no idea what's being said, and any value the multimedia feedback may have added is negated. Furthermore, there are no captions or transcripts. The responsibility lies partly with the content creator, but the tool should guide the creator through the process in a way that fosters accessible content creation.
So what to do?
We in education technology are always on new ground. It is always new and exciting and no two days at the office are ever exactly the same. We get to weave technologies together to provide content delivery solutions that were unthinkable just a few years ago.
So as I trudge along and encounter things (some new, some yet-to-be discovered), I see these innovations with new eyes (and listen with one new cybernetic ear), looking not just at what problem they solve, but also at what additional barriers they erect for others.
Because at the end of the day, that's what we do.
We remove obstacles.
I'm going to continue writing about this (I think). I've got a few ideas that I'd like to share and a few new practices that I'd like to try out.
Thanks for letting me ramble...