By Michelle Hof, AIB
An interview with Nadia Gouy (Part II)
Last week, we heard from visually impaired interpreter Nadia Gouy about how interpreting trainers can make their classrooms more accessible. This week, she talks to Michelle Hof about her experience working with online interpreting tools for distance interpreting, terminology management and computer-assisted interpreting, and gives some essential accessibility tips for platform developers.
Michelle Hof: In the first part of this interview, we talked about accessibility in training. Let’s talk now about the other main question, leaving the classroom behind and moving the conversation over to professional interpreting platforms. A lot of the work that I do on my course at Glendon involves exploring different interpreting platforms, different types of software that have been developed to help interpreters deliver their services remotely, and generally various aspects of interpreting technology. Working through that with you has helped me understand some of the things that these platforms have to keep in mind when they are developing their product to make sure that visually impaired interpreters aren’t getting left behind.
I want you to tell me about your experience with the different technologies that we worked with this year. Here I am thinking about both the platforms that deliver interpreting services – your typical RSI platform – and other types of technology, including terminology management and computer-assisted interpreting (CAI) tools. What was your experience with those tools?
Nadia Gouy: For the RSI platforms, it really varied, all the way from sheer frustration to a good level of satisfaction and ease with the platform. As for the CAI tools or interpreting banks, they were really not very accessible at all. One exception was Interpreter’s Help, which I did manage to use. Even though I wasn’t able to use the extraction feature, I was still able to use parts of it, like the glossary features, so I could read, create and share glossaries. Other tools are really not helpful. InterpretBank, for instance, is out of the question, as it is inaccessible on all levels.
MH: You said your experience with the interpreting platforms ranged from sheer frustration to satisfaction. Let’s talk about that, and how it feels for you in the classroom. Obviously, every student is going to have a slightly different experience when learning about these platforms. But how does it feel when you see you are being completely blocked from even trying to explore a platform because it simply hasn’t occurred to the developers that a visually impaired interpreter might want to access it? I really felt for you in class sometimes, because it seemed you were being completely left out of the conversation. Let’s talk about that frustration for a minute before we look at possible solutions.
NG: I have to say, when I saw that your course was going to be about interpreting technologies and platforms and whatnot, I really thought I would hate it! I thought it was going to be a series of frustrating experiences. One of the main reasons why I started seriously considering interpreting over translation has to do with how machine translation and CAT tools have moved the translation industry in a direction that has made it less accessible for visually impaired professionals – especially SDL, which is completely inaccessible.
MH: Really? One of the market leaders, after all these years? They’ve had 20 years to get their act together.
NG: No, they never wanted to collaborate or make it accessible. Same with memoQ and MemSource. Fluency is the only one that’s being used by blind translators, but it’s not widely used on the market. So as a blind translator, you feel disadvantaged because of a tool that has been introduced on the market.
MH: So you were basically being forced out of the market because of the adoption of CAT tools that weren’t accessible? Was there no receptivity on the part of developers? Did you not even try to go there?
NG: I read reviews by other blind translators who had tried it and said it didn’t work out. We are a minority, very few and far between, so who listens to a minority?
MH: But that doesn’t give them the right to ignore you.
NG: No, I am not justifying their actions, I am just stating the bitter realities here.
MH: Going back a bit, you said that when you heard I was going to give a course on interpreting technologies, you shuddered because you thought it was just going to be a repeat of all those frustrating moments with user-unfriendly translation tools?
NG: Exactly. I thought it would be the same frustrating experience and it would remind me of how unequal the playing field is. When I get onto an inaccessible RSI platform, like WebSwitcher or VoiceBoxer, all I can think is, “Okay, here is another lost opportunity for me, another chance that is not equal.” So I am not even going to be judged on my interpreting skills, I am just out of the game, completely excluded.
MH: Please tell me there is some good news at the end of this, that it is not all terrible.
NG: Not at all, that’s why I said it ranged from sheer frustration to satisfaction. There are other, very accessible platforms, such as KUDO and QuaQua. What I liked about them is how responsive they were. I remember testing KUDO in November of last year, and they didn’t have sound cues for when I needed to do the handover. So I wrote to them and told them about it. Then, when we tested it again in March, that feature was there. There are now sound cues, a slight, non-distracting beep that will alert me to a handover request or that my boothmate has accepted my handover request.
The same happened with QuaQua. When we tested it for the first time in March, I couldn’t use the handover feature, as I had to keep interpreting and also watch out for any messages from my boothmate. There was no way I could tell otherwise, I had to keep reading with my screen reader to find out if I had to switch or accept or deny a request. Later, we tested their beta version, scheduled for release in May, and they had added sound cues for the handover. I thought that was really amazing. I love how they take the time to listen to my comments and those of other visually impaired interpreters.
MH: That’s good news. It seems that in the translation world, you felt shut out, that you didn’t feel you had any agency, that there was nothing you could do to make sure your voice was heard, while in the very much smaller world of interpreting, your voice does carry some weight.
I have to say, in a perfect world, anybody who develops a new platform or software should not have to first develop it and then be told by people that it is not accessible. In a perfect world, accessibility would be baked in right from the beginning.
NG: Yes. Take Zoom as an example. At least it has all the basics. It will tell me, once I am in the booth, that I am being assigned to Arabic, and it will use voice prompts to tell me which shortcut to use to switch between English and Arabic. This accessibility has been built in from the start, from the early versions of Zoom.
MH: Is the difference between Zoom and some of these smaller companies just one of sheer size? Big companies will have a department that deals with accessibility, staffed by people who know the rules and standards, whereas smaller companies might not have those resources? Or where does this difference come from?
NG: Well, Zoom will have a wider community of users who are visually impaired or partially sighted.
MH: Of course, Zoom isn’t “just” an interpreting platform. I mean, the entire world lives on Zoom these days! So there is a large user community.
NG: Yes, and there is the Americans with Disabilities Act that they have to comply with. And as a bigger company, they are more likely to be sued than a small, specialized platform.
MH: Let’s talk about what these providers can do right. What are three tips that you can give them? It would seem to me that one would be that they must be aware of the existing standards and legislation. What do they need to be pointed towards?
NG: Well, there are the World Wide Web Consortium’s accessibility guidelines – the Web Content Accessibility Guidelines, or WCAG. They are clear guidelines that any web developer should be familiar with, and I don’t think they are that complicated. Basically, it is about labelling buttons, putting in alternative text, etc. For instance, on an interpreting platform, as a sighted interpreter you may know that you are broadcasting because there is a light flashing or a colour change alerting you to that, but I wouldn’t know that if it’s not written “Nadia broadcasting”, so I would just keep trying to find out if I am broadcasting or not.
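[Editor’s note: a minimal sketch of the point Nadia makes here – conveying state as text a screen reader can announce, rather than as a colour change alone. The names and labels below are purely illustrative, not taken from any real platform.]

```typescript
// Hypothetical sketch: derive a screen-reader-friendly label for a
// broadcast indicator. A flashing light or colour change is invisible
// to a screen reader; a text label like "Nadia broadcasting" is not.
type BoothState = { interpreter: string; broadcasting: boolean };

function accessibleLabel(state: BoothState): string {
  // This string could be exposed via an aria-label attribute on the
  // indicator element, so the state is announced, not just shown.
  return state.broadcasting
    ? `${state.interpreter} broadcasting`
    : `${state.interpreter} muted`;
}

console.log(accessibleLabel({ interpreter: "Nadia", broadcasting: true }));
// → "Nadia broadcasting"
```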
MH: These are existing accessibility standards that web developers have to keep in mind, and yet they don’t always do so. It’s like your typical architect who is designing conference rooms with interpreting booths who is supposed to consult the ISO standards for booths, and yet obviously we know that some of them don’t bother – and then we have to live with the results!
NG: The second thing I would ask for is shortcuts, shortcuts, shortcuts, please! It just makes life easier for me. For instance, to mute myself, I could look for the mute button, but that entails doing two things at once, and that is a workaround. If there is no shortcut for mute, what I have to do, while I am interpreting, is hit Ctrl+F, type in “mute”, then press ENTER to go to the mute button, and then press ENTER on the mute button for it to work! And that is even assuming the button is labeled “mute”.
MH: And by that time, you’ve already coughed up half a lung on a live mic!
NG: Exactly! But if there is a Ctrl+M or Alt+M shortcut, it changes everything. Does it really cost you that much to add it? This brings us to the idea of universal design – something that not only works for me, but works for you, too. It benefits all of us, sighted or otherwise.
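[Editor’s note: the shortcut idea discussed above can be sketched as a simple key-to-action map. Everything here – the action names, the Ctrl+M binding, the other combinations – is illustrative, not the shortcut scheme of any real platform.]

```typescript
// Illustrative sketch of a keyboard-shortcut layer for an RSI console:
// one global handler maps key combinations to console actions, so no
// one has to hunt for an on-screen button mid-interpretation.
type KeyCombo = { ctrlKey: boolean; key: string };
type Action = "toggle-mute" | "request-handover" | "accept-handover";

const shortcuts: Record<string, Action> = {
  "Ctrl+m": "toggle-mute",      // mute/unmute without searching the page
  "Ctrl+h": "request-handover", // ask the boothmate to take over
  "Ctrl+y": "accept-handover",  // accept an incoming handover request
};

function resolveShortcut(e: KeyCombo): Action | undefined {
  const combo = `${e.ctrlKey ? "Ctrl+" : ""}${e.key.toLowerCase()}`;
  return shortcuts[combo];
}

console.log(resolveShortcut({ ctrlKey: true, key: "m" })); // → "toggle-mute"
```

This is also universal design in miniature: the same shortcut that lets a screen-reader user mute instantly spares a sighted user the mouse trip to the button.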
MH: Shortcuts as an example of universal design – they are good for everybody, but in particular they are good for you. What other tips do you have?
NG: Well, let’s look at an example of best practice now. If you could configure your platform to work with hard consoles, that would help us.
MH: We’re talking about the type of hard console that you can order in the post and then plug into your computer and it will give you access to all the functions of the platform’s soft console but with the dials and buttons of a conventional console. That would also be high up on your wish list?
NG: Yes, very high up!
MH: There are a few platforms that offer that already.
NG: Yes, there is KUDO, for instance. I say this because I tried KUDO with a colleague and we did a comparison of KUDO with a hard console and KUDO with its soft console with all of the accessibility features and shortcuts – because they do have shortcuts – and I found that I can work faster with a hard console. So I gain time and I am also just more comfortable using a hard console, and I am also more confident that I have pressed the right button.
MH: This also seems to be in the spirit of universal design – I know of many sighted interpreters who are happy to work remotely, but who would really love to use a hard console, even when working from home. So you’re not the only one with that at the top of your wish list!
NG: That’s good news!
MH: Well, it has been very interesting to hear your insights, as a visually impaired interpreter, into online training and best practice in the online classroom and in particular to learn about your experience working with interpreting technologies. If I were to ask you to wrap it all up in a takeaway message, what would that be?
NG: It would be that accessibility is not costly, and it doesn’t harm anyone. If you can do it for us from the very beginning, and just follow the standards and the benchmarks, it will help everybody in the long run. Also, accessibility is not a luxury for some people, it is a must. For me, if there is no accessibility built into a virtual space, I can’t be there. I may not even be able to log in to begin with!
MH: That is a very important message to wrap up with: that accessibility isn’t a luxury, or an add-on, or an afterthought. It’s not something that you do because somebody reminds you that it needs to be done. It’s a must.
NG: And it’s not costly!
MH: Accessibility should come at the beginning of the design process. What’s probably more costly is having to fix things after the fact instead of building them in right from the start.
NG: Yes, SDL keeps saying that they cannot go back to the drawing board because it would be too costly for them to build in accessibility features at this point. I think it is easier for interpreting platforms because they are web-based, so it’s less of an issue. But still, you want to get it right from the start.
MH: Thank you so much, Nadia! These are questions I’ve been meaning to ask you for a very long time. Thanks for sharing your ideas with me and with the readers of our blog.
NG: Thank you for inviting me, and for getting me to do a bit of reflection on my daily life. It was great, I enjoyed it!
This brings us to the end of our interview with visually impaired interpreter Nadia Gouy. If you are a visually impaired interpreter and would like to join Nadia and other colleagues in promoting accessibility in our profession, or if you would like to know more about how to teach and work with interpreters with visual impairment, please reach out to us at gouy.nadia@gmail.com and m.hof@aibcnet.com.