By Michelle Hof, AIB
An interview with Nadia Gouy (Part II)
Last week, we heard from visually impaired interpreter Nadia Gouy about how interpreting trainers can make their classrooms more accessible. This week, she talks to Michelle Hof about her experience working with online tools for distance interpreting, terminology management and computer-assisted interpreting, and gives some essential accessibility tips for platform developers.
Michelle Hof: In the first part of this interview, we talked about accessibility in training. Let’s talk now about the other main question, leaving the classroom behind and moving the conversation over to professional interpreting platforms. A lot of the work that I do on my course at Glendon involves exploring different interpreting platforms, different types of software that have been developed to help interpreters deliver their services remotely, and generally various aspects of interpreting technology. Working through that with you has helped me understand some of the things that these platforms have to keep in mind when they are developing their product to make sure that visually impaired interpreters aren’t getting left behind.
I want you to tell me about your experience with the different technologies that we worked with this year. Here I am thinking about both the platforms that deliver interpreting services – your typical RSI platform – and other types of technology, including terminology management and computer-assisted interpreting (CAI) tools. What was your experience with those tools?
Nadia Gouy: For the RSI platforms, it really varied, all the way from sheer frustration to a good level of satisfaction and ease with the platform. As for the CAI tools or interpreting banks, they were really not very accessible at all. One exception was Interpreter’s Help, which I did manage to use. Even though I wasn’t able to use the extraction feature, I was still able to use parts of it, like the glossary features, so I could read, create and share glossaries. Other tools are really not helpful. InterpretBank, for instance, is out of the question, as it is inaccessible on all levels.
MH: Going back a bit, you said that when you heard I was going to give a course on interpreting technologies, you shuddered because you thought it was just going to be a repeat of all those frustrating moments with user-unfriendly translation tools?
NG: Exactly. I thought it would be the same frustrating experience and it would remind me of how unequal the playing field is. When I get onto an inaccessible RSI platform, like WebSwitcher or VoiceBoxer, all I can think is, “Okay, here is another lost opportunity for me, another chance that is not equal.” So I am not even going to be judged on my interpreting skills, I am just out of the game, completely excluded.
MH: Please tell me there is some good news at the end of this, that it is not all terrible.
NG: Not at all, that’s why I said it ranged from sheer frustration to satisfaction. There are other, very accessible platforms, such as KUDO and QuaQua. What I liked about them is how responsive they were. I remember testing KUDO in November of last year, and they didn’t have sound cues for when I needed to do the handover. So I wrote to them and told them about it. Then, when we tested it again in March, that feature was there. There are now sound cues, a slight, non-distracting beep that will alert me to a handover request or that my boothmate has accepted my handover request.
The same happened with QuaQua. When we tested it for the first time in March, I couldn’t use the handover feature, as I had to keep interpreting and also watch out for any messages from my boothmate. There was no way I could tell otherwise, I had to keep reading with my screen reader to find out if I had to switch or accept or deny a request. Later, we tested their beta version, scheduled for release in May, and they had added sound cues for the handover. I thought that was really amazing. I love how they take the time to listen to my comments and those of other visually impaired interpreters.
MH: Of course, Zoom isn’t “just” an interpreting platform. I mean, the entire world lives on Zoom these days! So there is a large user community.
NG: Yes, and there is the Americans with Disabilities Act that they have to comply with. And as a bigger company, they are more likely to be sued than a small, specialized platform.
MH: Let’s talk about what these providers can do right. What are three tips that you can give them? It would seem to me that one would be that they must be aware of the existing standards and legislation. What do they need to be pointed towards?
NG: Well, there are the World Wide Web Consortium’s accessibility guidelines. They are clear guidelines that any web developer should be familiar with, and I don’t think they are that complicated. Basically, it is about labelling buttons, putting in alternative text, etc. For instance, on an interpreting platform, as a sighted interpreter you may know that you are broadcasting because there is a light flashing or a colour change alerting you to that, but I wouldn’t know that if it’s not written “Nadia broadcasting”, so I would just keep trying to find out if I am broadcasting or not.
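For the web developers reading along, a minimal sketch of the kind of labelling Nadia describes might look like the snippet below, written in TypeScript against the standard DOM and the W3C’s WAI-ARIA attributes. The element ID, the helper function and the exact wording are purely illustrative assumptions, not taken from any particular RSI platform.

    // A minimal sketch of accessible labelling; IDs and names are hypothetical.

    // A flashing light or colour change is invisible to a screen reader,
    // so the control needs an accessible name...
    const micButton = document.getElementById("mic-toggle");
    micButton?.setAttribute("aria-label", "Microphone: muted");

    // ...and state changes should be announced through a live region, so the
    // screen reader can read out "Nadia broadcasting" without stealing focus.
    const status = document.createElement("div");
    status.setAttribute("role", "status"); // implies polite live announcements
    document.body.appendChild(status);

    function announceBroadcastState(name: string, broadcasting: boolean): void {
      status.textContent = broadcasting ? `${name} broadcasting` : `${name} muted`;
      micButton?.setAttribute(
        "aria-label",
        broadcasting ? "Microphone: live" : "Microphone: muted"
      );
    }

    // The platform would call this whenever the interpreter's mic state changes.
    announceBroadcastState("Nadia", true);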
MH: These are existing accessibility standards that web developers have to keep in mind, and yet they don’t always do so. It’s like your typical architect designing conference rooms with interpreting booths: they are supposed to consult the ISO standards for booths, and yet we know that some of them don’t bother, and then we have to live with the results!
NG: The second thing I would ask for is shortcuts, shortcuts, shortcuts, please! It just makes life easier for me. For instance, to mute myself, I could look for the mute button, but that entails doing two things at once, and it is only a workaround. If there is no shortcut for mute, what I have to do, while I am interpreting, is hit Ctrl+F, type in “mute”, press ENTER to go to the mute button, and then press ENTER on the mute button for it to work! And that is even assuming the button is labelled “mute”.
MH: And by that time, you’ve already coughed up half a lung on a live mic!
NG: Exactly! But if there is a Ctrl+M or Alt+M shortcut, it changes everything. Does it really cost you that much to add it? This brings us to the idea of universal design – something that not only works for me, but works for you, too. It benefits all of us, sighted or otherwise.
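To see how little this costs a developer, here is a sketch in TypeScript that registers a mute shortcut. The Alt+M chord and the toggleMute stand-in are illustrative assumptions, not the behaviour of any real platform.

    // A minimal sketch of a global mute shortcut; Alt+M and toggleMute()
    // are illustrative stand-ins, not the API of an actual RSI platform.

    let muted = false;

    function toggleMute(): void {
      // In a real platform this would switch the interpreter's outgoing audio
      // and update the labelled mute button described earlier.
      muted = !muted;
      console.log(muted ? "Microphone muted" : "Microphone live");
    }

    document.addEventListener("keydown", (event: KeyboardEvent) => {
      // One key chord instead of Ctrl+F, typing "mute", ENTER, ENTER.
      if (event.altKey && event.code === "KeyM") {
        event.preventDefault();
        toggleMute();
      }
    });

A shortcut like this still needs to announce the new state, for example through the live region shown earlier, so the interpreter knows the key press actually worked.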
MH: Shortcuts as an example of universal design – they are good for everybody, but in particular they are good for you. What other tips do you have?
NG: Well, let’s look at an example of best practice now. If you could configure your platform to work with hard consoles, that would help us.
MH: We’re talking about the type of hard console that you can order in the post and then plug into your computer and it will give you access to all the functions of the platform’s soft console but with the dials and buttons of a conventional console. That would also be high up on your wish list?
NG: Yes, very high up!
MH: There are a few platforms that offer that already.
NG: Yes, there is KUDO, for instance. I say this because I tried KUDO with a colleague and we compared KUDO with a hard console against KUDO with its soft console, using all of its accessibility features and shortcuts (because they do have shortcuts), and I found that I can work faster with a hard console. So I gain time, I am more comfortable using a hard console, and I am more confident that I have pressed the right button.
MH: This also seems to be in the spirit of universal design – I know of many sighted interpreters who are happy to work remotely, but who would really love to use a hard console, even when working from home. So you’re not the only one with that at the top of your wish list!
NG: That’s good news!
MH: Well, it has been very interesting to hear your insights, as a visually impaired interpreter, into online training and best practice in the online classroom and in particular to learn about your experience working with interpreting technologies. If I were to ask you to wrap it all up in a takeaway message, what would that be?
NG: It would be that accessibility is not costly, and it doesn’t harm anyone. If you can do it for us from the very beginning, and just follow the standards and the benchmarks, it will help everybody in the long run. Also, accessibility is not a luxury for some people, it is a must. For me, if there is no accessibility built into a virtual space, I can’t be there. I may not even be able to log in to begin with!
MH: That is a very important message to wrap up with: that accessibility isn’t a luxury, or an add-on, or an afterthought. It’s not something that you do because somebody reminds you that it needs to be done. It’s a must.
NG: And it’s not costly!
MH: Accessibility should come at the beginning of the design process. What’s probably more costly is having to fix things after the fact instead of building them in right from the start.
NG: Yes, SDL keeps saying that they cannot go back to the drawing board because it would be too costly for them to build in accessibility features at this point. I think it is easier for interpreting platforms because they are web-based, so it’s less of an issue. But still, you want to get it right from the start.
MH: Thank you so much, Nadia! These are questions I’ve been meaning to ask you for a very long time. Thanks for sharing your ideas with me and with the readers of our blog.
NG: Thank you for inviting me, and for getting me to do a bit of reflection on my daily life. It was great, I enjoyed it!
This brings us to the end of our interview with visually impaired interpreter Nadia Gouy. If you are a visually impaired interpreter and would like to join Nadia and other colleagues in promoting accessibility in our profession, or if you would like to know more about how to teach and work with interpreters with visual impairment, please reach out to us at gouy.nadia@gmail.com and m.hof@aibcnet.com.