
Wednesday, May 26, 2021

Learning and working online as a visually impaired interpreter (Part 2)

By Michelle Hof, AIB

An interview with Nadia Gouy (Part II)

Last week, we heard from visually impaired interpreter Nadia Gouy about how interpreting trainers can make their classrooms more accessible. This week, she talks to Michelle Hof about her experience working with online interpreting tools for distance interpreting, terminology management and computer-assisted interpreting, and gives some essential accessibility tips for platform developers.

Michelle Hof: In the first part of this interview, we talked about accessibility in training. Let’s talk now about the other main question, leaving the classroom behind and moving the conversation over to professional interpreting platforms. A lot of the work that I do on my course at Glendon involves exploring different interpreting platforms, different types of software that have been developed to help interpreters deliver their services remotely, and generally various aspects of interpreting technology. Working through that with you has helped me understand some of the things that these platforms have to keep in mind when they are developing their product to make sure that visually impaired interpreters aren’t getting left behind. 

I want you to tell me about your experience with the different technologies that we worked with this year. Here I am thinking about both the platforms that deliver interpreting services – your typical RSI platform – and other types of technology, including terminology management and computer-assisted interpreting (CAI) tools. What was your experience with those tools?

Nadia Gouy: For the RSI platforms, it really varied, all the way from sheer frustration to a good level of satisfaction and ease with the platform. As for the CAI tools or interpreting banks, they were really not very accessible at all. One exception was Interpreter’s Help, which I did manage to use. Even though I wasn’t able to use the extraction feature, I was still able to use parts of it, like the glossary features, so I could read, create and share glossaries. Other tools are really not helpful. InterpretBank, for instance, is out of the question, as it is inaccessible on all levels. 

MH: You said your experience with the interpreting platforms ranged from sheer frustration to satisfaction. Let’s talk about that, and how it feels for you in the classroom. Obviously, every student is going to have a slightly different experience when learning about these platforms. But how does it feel when you see you are being completely blocked from even trying to explore a platform because it simply hasn’t occurred to the developers that a visually impaired interpreter might want to access it? I really felt for you in class sometimes, because it seemed you were being completely left out of the conversation. Let’s talk about that frustration for a minute before we look at possible solutions.

NG: I have to say, when I saw that your course was going to be about interpreting technologies and platforms and whatnot, I really thought I would hate it! I thought it was going to be a series of frustrating experiences. One of the main reasons why I started seriously considering interpreting over translation has to do with how machine translation and CAT tools have moved the translation industry in a direction that has made it less accessible for visually impaired professionals – especially SDL, which is completely inaccessible.

MH: Really? One of the market leaders, after all these years? They’ve had 20 years to get their act together.

NG: No, they never wanted to collaborate or make it accessible. Same with memoQ and Memsource. Fluency is the only one that’s being used by blind translators, but it’s not widely used on the market. So as a blind translator, you feel disadvantaged because of a tool that has been introduced on the market.

MH: So you were basically being forced out of the market because of the adoption of CAT tools that weren’t accessible? Was there no receptivity on the part of developers? Did you not even try to go there?

NG: I read reviews by other blind translators who had tried it and said it didn’t work out. We are a minority, very few and far between, so who listens to a minority?

MH: But that doesn’t give them the right to ignore you.

NG: No, I am not justifying their actions, I am just stating the bitter realities here. 

MH: Going back a bit, you said that when you heard I was going to give a course on interpreting technologies, you shuddered because you thought it was just going to be a repeat of all those frustrating moments with user-unfriendly translation tools?

NG: Exactly. I thought it would be the same frustrating experience and it would remind me of how unequal the playing field is. When I get onto an inaccessible RSI platform, like WebSwitcher or VoiceBoxer, all I can think is, “Okay, here is another lost opportunity for me, another chance that is not equal.” So I am not even going to be judged on my interpreting skills, I am just out of the game, completely excluded.

MH: Please tell me there is some good news at the end of this, that it is not all terrible.

NG: Not at all, that’s why I said it ranged from sheer frustration to satisfaction. There are other, very accessible platforms, such as KUDO and QuaQua. What I liked about them is how responsive they were. I remember testing KUDO in November of last year, and they didn’t have sound cues for when I needed to do the handover. So I wrote to them and told them about it. Then, when we tested it again in March, that feature was there. There are now sound cues, a slight, non-distracting beep that will alert me to a handover request or that my boothmate has accepted my handover request. 

The same happened with QuaQua. When we tested it for the first time in March, I couldn’t use the handover feature, as I had to keep interpreting and also watch out for any messages from my boothmate. There was no way I could tell otherwise, I had to keep reading with my screen reader to find out if I had to switch or accept or deny a request. Later, we tested their beta version, scheduled for release in May, and they had added sound cues for the handover. I thought that was really amazing. I love how they take the time to listen to my comments and those of other visually impaired interpreters. 

MH: That’s good news. It seems that in the translation world, you felt shut out, that you didn’t feel you had any agency, that there was nothing you could do to make sure your voice was heard, while in the very much smaller world of interpreting, your voice does carry some weight.

I have to say, in a perfect world, anybody who develops a new platform or software should not have to first develop it and then be told by people that it is not accessible. In a perfect world, accessibility would be baked in right from the beginning. 

NG: Yes. Take Zoom as an example. At least it has all the basics. It will tell me, once I am in the booth, that I am being assigned to Arabic, and it will use voice prompts to tell me which shortcut to use to switch between English and Arabic. This accessibility has been built in from the start, from the early versions of Zoom.

MH: Is the difference between Zoom and some of these smaller companies just one of sheer size? Big companies will have a department that deals with accessibility, staffed by people who know the rules and standards, whereas smaller companies might not have those resources? Or where does this difference come from?

NG: Well, Zoom will have a wider community of users who are visually impaired or partially sighted. 

MH: Of course, Zoom isn’t “just” an interpreting platform. I mean, the entire world lives on Zoom these days! So there is a large user community. 

NG: Yes, and there is the Americans with Disabilities Act that they have to comply with. And as a bigger company, they are more likely to be sued than a small, specialized platform.   

MH: Let’s talk about what these providers can do right. What are three tips that you can give them? It would seem to me that one would be that they must be aware of the existing standards and legislation. What do they need to be pointed towards?

NG: Well, there are the World Wide Web Consortium’s accessibility guidelines, the WCAG. They are clear guidelines that any web developer should be familiar with, and I don’t think they are that complicated. Basically, it is about labelling buttons, putting in alternative text, etc. For instance, on an interpreting platform, as a sighted interpreter you may know that you are broadcasting because there is a light flashing or a colour change alerting you to that, but I wouldn’t know that if it’s not written “Nadia broadcasting”, so I would just keep trying to find out if I am broadcasting or not.
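[Editor’s note: Nadia’s point about labelling can be sketched in a few lines of web code. This is a hypothetical illustration, not any real platform’s code; the element, class and user names are invented. The idea is simply that the broadcasting state is exposed as text a screen reader can announce, not only as a colour change.]

```javascript
// Hypothetical sketch: expose "broadcasting" state both visually and
// programmatically, so a screen reader can announce it.
function setBroadcastingState(indicator, userName, isLive) {
  // Visual cue for sighted users: a CSS class driving the colour change.
  indicator.className = isLive ? "mic-live" : "mic-off";
  // Equivalent text cue: a polite live region whose content the
  // screen reader reads out whenever it changes.
  indicator.setAttribute("aria-live", "polite");
  indicator.textContent = isLive
    ? userName + " broadcasting"
    : userName + " not broadcasting";
}
```

With this in place, a screen reader announces “Nadia broadcasting” the moment the state changes, with no extra cost to sighted users.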

MH: These are existing accessibility standards that web developers have to keep in mind, and yet they don’t always do so. It’s like your typical architect who is designing conference rooms with interpreting booths who is supposed to consult the ISO standards for booths, and yet obviously we know that some of them don’t bother – and then we have to live with the results! 

NG: The second thing I would ask for is shortcuts, shortcuts, shortcuts, please! It just makes life easier for me. For instance, to mute myself, I could look for the mute button, but that entails doing two things at once, and that is a workaround. If there is no shortcut for mute, what I have to do, while I am interpreting, is hit Ctrl+F, type in “mute”, then press ENTER to go to the mute button, and then press ENTER on the mute button for it to work! And that is even assuming the button is labeled “mute”.

MH: And by that time, you’ve already coughed up half a lung on a live mic!

NG: Exactly! But if there is a Ctrl+M or Alt+M shortcut, it changes everything. Does it really cost you that much to add it? This brings us to the idea of universal design – something that not only works for me, but works for you, too. It benefits all of us, sighted or otherwise.
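[Editor’s note: the shortcut Nadia describes really is cheap to add. The sketch below is hypothetical, with an invented action vocabulary; keeping the key-to-action mapping as a pure function makes the shortcuts easy to document for screen-reader users and easy to test.]

```javascript
// Hypothetical shortcut table for a soft console.
const SHORTCUTS = {
  "Alt+M": "toggle-mute",
  "Alt+H": "request-handover",
};

// Translate a keydown event into a named console action, or null if
// the key combination is not a registered shortcut.
function resolveShortcut(event) {
  const combo = (event.altKey ? "Alt+" : "") + event.key.toUpperCase();
  return SHORTCUTS[combo] || null;
}

// In the browser, wiring it up is a single listener:
// document.addEventListener("keydown", (e) => {
//   const action = resolveShortcut(e);
//   if (action) { e.preventDefault(); performAction(action); }
// });
```

One listener and a small lookup table replace the Ctrl+F, type “mute”, Enter, Enter sequence described above.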

MH: Shortcuts as an example of universal design – they are good for everybody, but in particular they are good for you. What other tips do you have?

NG: Well, let’s look at an example of best practice now. If you could configure your platform to work with hard consoles, that would help us. 

MH: We’re talking about the type of hard console that you can order in the post and then plug into your computer and it will give you access to all the functions of the platform’s soft console but with the dials and buttons of a conventional console. That would also be high up on your wish list? 

NG: Yes, very high up!

MH: There are a few platforms that offer that already.

NG: Yes, there is KUDO, for instance. I say this because I tried KUDO with a colleague and we did a comparison of KUDO with a hard console and KUDO with its soft console with all of the accessibility features and shortcuts – because they do have shortcuts – and I found that I can work faster with a hard console. So I gain time and I am also just more comfortable using a hard console, and I am also more confident that I have pressed the right button. 

MH: This also seems to be in the spirit of universal design – I know of many sighted interpreters who are happy to work remotely, but who would really love to use a hard console, even when working from home. So you’re not the only one with that at the top of your wish list!

NG: That’s good news!

MH: Well, it has been very interesting to hear your insights, as a visually impaired interpreter, into online training and best practice in the online classroom and in particular to learn about your experience working with interpreting technologies. If I were to ask you to wrap it all up in a takeaway message, what would that be?

NG: It would be that accessibility is not costly, and it doesn’t harm anyone. If you can do it for us from the very beginning, and just follow the standards and the benchmarks, it will help everybody in the long run. Also, accessibility is not a luxury for some people, it is a must. For me, if there is no accessibility built into a virtual space, I can’t be there. I may not even be able to log in to begin with!

MH: That is a very important message to wrap up with: that accessibility isn’t a luxury, or an add-on, or an afterthought. It’s not something that you do because somebody reminds you that it needs to be done. It’s a must.

NG: And it’s not costly!

MH: Accessibility should come at the beginning of the design process. What’s probably more costly is having to fix things after the fact instead of building them in right from the start. 

NG: Yes, SDL keeps saying that they cannot go back to the drawing board because it would be too costly for them to build in accessibility features at this point. I think it is easier for interpreting platforms because they are web-based, so it’s less of an issue. But still, you want to get it right from the start.

MH: Thank you so much, Nadia! These are questions I’ve been meaning to ask you for a very long time. Thanks for sharing your ideas with me and with the readers of our blog.

NG: Thank you for inviting me, and for getting me to do a bit of reflection on my daily life. It was great, I enjoyed it! 

This brings us to the end of our interview with visually impaired interpreter Nadia Gouy. If you are a visually impaired interpreter and would like to join Nadia and other colleagues in promoting accessibility in our profession, or if you would like to know more about how to teach and work with interpreters with visual impairment, please reach out to us at gouy.nadia@gmail.com and m.hof@aibcnet.com.

Wednesday, May 19, 2021

Learning and working online as a visually impaired interpreter (Part 1)

By Michelle Hof, AIB

An interview with Nadia Gouy (Part I)

Have you ever wondered what it is like to work with online platforms as a visually impaired interpreter? Do you want to know how to improve accessibility for students and interpreters with a visual impairment? In this two-part interview, Nadia Gouy, a senior interpreting student at the Glendon Master of Conference Interpreting, shares with AIB member Michelle Hof her top accessibility tips for interpreting trainers and online platforms. 

In Part I, Nadia tells us of her experience as a visually impaired student of interpreting and offers guidance on how to ensure an accessible, inclusive classroom experience. Part II next week will contain accessibility tips for developers of online platforms and tools. 

Michelle Hof: We are speaking today with Nadia Gouy, a student of conference interpreting at the Glendon MCI in Toronto. Nadia and I have been working together this year on a course I teach there on interpreting technologies. Welcome, Nadia! Why don’t you start by introducing yourself?

Nadia Gouy: Thank you. I am Nadia Gouy from Morocco. I have been a translator for over 12 years and also an interpreter – mainly liaison and diplomatic interpreting for the Minister of Parliamentary Relations in Morocco – for five or six years. I also did a bit of freelance conference interpreting in Morocco. In 2017, when I arrived in Canada as a newly landed immigrant, I didn’t have a network or the luxury of choosing what to do, so I worked for the government public services here. When the Covid-19 pandemic hit and the MCI program went entirely online, I decided to sit the advanced entry exam (I was actually going to do it in 2018 but didn’t, for personal reasons).

MH: Why did you decide to pursue formal training? Was it because you had practical experience with interpreting and you wanted to know more about it, or was it that you had a lot of time on your hands due to Covid-19? What was the draw?

NG: This is actually something that has been lingering in my mind for years. In 2006, when I graduated from a school of translation in Morocco, I was awarded the Fulbright Scholarship and was accepted to the Middlebury Institute (MIIS). However, they didn’t offer Arabic, and so for that and a few other reasons, I ended up changing degree tracks and instead I did a master’s in international development and public administration – the world of NGOs and the like. That was really good, as what I learned through that program helped me expand my work opportunities while working as an interpreter for the ministry back in Morocco and helped me understand the conferences I worked at as an interpreter.

MH: It sounds like an interesting profile for a conference interpreter to have, and this somewhat wandering trajectory seems to have brought you right to where you need to be right now! Which brings me to my next question: you said you joined the Glendon MCI in the year that it went fully online due to Covid-19 restrictions. Now, we haven’t said this yet, but you are a student with a visual impairment, and this may mean that your experience with online learning might be different from that of your sighted classmates. What has your impression been of this past year of online learning at Glendon?

NG: Well, it has been very interesting, because before joining the MCI, I didn’t use Zoom. Even at work we used to hold our meetings on Teams, so moving to Zoom involved a learning curve for me, even though Zoom is very accessible. It’s a lot of things to juggle at the same time – my screen reader talking to me, the instructor talking, the chitchat going on between students in the chat box, and in the beginning I found it really annoying. I didn’t know how to stop the notifications and it was too much to juggle. There was lots going on at the same time, with so many voice feeds. But then I got used to it, I learned the shortcuts, and actually started enjoying writing and chatting while listening to the instructor and doing so much else at the same time!

MH: Well, we always talk about interpreters having to cultivate split attention, so I guess you had a bit of a head start! I sometimes think that students in an online classroom have a lot of demands on their attention. They are asked to pay attention to what the instructor is saying, they are asked to pay attention to the task, and in the world of online learning, they are often paying attention to the chat box as well. I know many of your sighted classmates have difficulties organizing their attention in such situations, and end up dedicating it either to the chat or to what is going on in the main room. Trainers often have to make a similar choice as well. Have you ever been tempted to just turn off the chat, or do you make a conscious choice to try to incorporate it into your online classroom experience?

NG: Both! It depends on my attention span, and on whether the chat is going to be useful – for instance, in your classes, I do follow the chat a lot, as students ask a lot of questions that are complementary to the main discussion. But sometimes I don’t even read the chat because it distracts me from the class, especially if we are interpreting or doing other exercises that require my attention.

MH: Did your classmates ever consider refraining from that idle chitchat, knowing that it might be drawing away some of that much needed attention for you? Sighted interpreters might take a quick glance at the chat box and decide there is nothing interesting going on there, but in your case, you have to wait until your screen reader tells you what’s being said, and then you have to decide, after you’ve heard it, whether it’s interesting or not. Did it ever change the dynamic in class, where your classmates said, “C’mon guys, Nadia has to put up with all of our nonsense, it must be really annoying, let’s hold back for her”?

NG: I never really brought it up. At the start of term, when I attended the induction session, I logged in on my phone and it was a mess, because my phone just kept reading out everything and I thought “This is going to be hell on earth! I am not going to survive!” And I thought that if it were to continue like that, I would have to bring it up. But then I started using my computer, where I have the choice of ignoring the chat completely or just checking it occasionally, and so I didn’t say anything. I didn’t want to limit people’s choices, it felt like too much to ask. Yes, I need to read everything to get the information I need, but at the same time, I have gotten used to it in many ways, so now I just scroll down quickly to what I need to read. 

MH: So you can scan as well and pick out what’s most important? 

NG: Yes.

MH: Did your experience with other online conferencing platforms like Teams help you with that, or was this a new skill set that you had to learn?

NG: I had to learn it at Glendon, because at work we didn’t have side chats. 

MH: Now I want to ask you for some tips. As trainers of interpreters, we sometimes work with students with visual impairment, either in person or online. I have worked with you and other blind students at Glendon over the years and we do receive guidance for that, but it’s not every day that we get to ask the students themselves. So what are the main things that are really important for instructors to keep in mind when working with you in a classroom?

NG: I would like to start by saying that all my Glendon instructors have been really careful with this – for instance, they take care to read out their PowerPoints. They don’t just say “this here” or “that there”, as I have had to deal with in other learning situations – this is my third master’s degree, so I’ve been in school for a long time and seen a bit of everything! And I think Glendon’s is the most accessible program I have attended, all in all. 

But I did have some unpleasant experiences with invited guests. In one of my courses, there were a few guest speakers, and nobody told them beforehand that there would be a visually impaired student. I still remember one presentation on accounting – thankfully, I had taken a graduate course on accounting so I knew what it was about – but still, I just needed the speaker to explain what he was referring to, because he just kept saying “this and that, this and that…”. I kept asking him in the chat to refrain from doing so, but he wasn’t reading the chat, nor did he see my raised hand. So one of my classmates had to step in and explain what I needed him to know.

MH: That’s a good point. Guest speakers or visiting professors need to be told there will be someone with different requirements at the lecture, and they also need to be given the guidance on best practice – for instance, don’t say “this” and “that”, always describe what is on the PowerPoint, don’t just assume everybody has had a look at it, and so on. We will have to keep that in mind at Glendon. It just becomes second nature to do these things because we work with you on a regular basis, so we forget that others may not know it. It’s the lack of communication that gives rise to these uncomfortable situations for all involved.

NG: Another thing has to do with exercises involving interpretation with text, that is, sight translation of a text straight off the page. It’s very complicated for me to do as a user of a screen reader. We did a lot of that in one of my courses. I really tried my best, but the instructor thought that my processing was too slow. I did explain once that I was using a screen reader, but I didn’t want him to think I was seeking a pretext. So instead of insisting, I just let him think that it was me trying to find the right word and not that I was being slowed down by my screen reader. Mostly, when I am slow, it’s because I can’t read ahead.

MH: So your screen reader was holding you back. Were any accommodations made for that? A few years ago, we were told we could give visually impaired students the full text in advance of such an exercise so they could scan it with their screen reader. 

NG: No, I got it at the same time as the others. I don’t mind, it’s just that you need to understand that I am going to take longer, obviously.

MH: So, another tip would be that there are particular exercises that trainers need to know are going to feel different for you working through a screen reader.

NG: Yes. Also, if any materials are going to be used in class, they need to be sent beforehand and in an accessible format: PowerPoint presentations, or documents in .pdf format, etc. If I get a .pdf at the last minute, it will take time to convert – especially if it’s in Arabic! Those are very hard to convert.

MH: Screen readers work best with .doc format, right?

NG: Screen readers can actually work with some .pdf files. There are two types of .pdfs. Some are accessible because they contain an actual text layer rather than just images of the page, and these are fine. Others are scanned images, and they need to be run through OCR software to be converted to text. In English, that is pretty easy and I can do it in a split second. But in Arabic, it’s complicated. Just generally, .pdf files are pretty horrible for Arabic. In one of my classes, we had to go over some UN documentation and I couldn’t do the exercise because my screen reader couldn’t read the Arabic in the .pdf.

MH: So the problem is with the document format? And if you get it in advance, you aren’t struggling during class time with converting files and having to adapt on the fly. You need extra lead time to prepare class materials.

NG: Exactly. 

MH: Those are some very good tips! And they are in line with the guidance we receive at Glendon on working with visually impaired students. It’s good to get it from the horse’s mouth, to hear that these things really do make a difference for you. We try at Glendon to make sure that visually impaired students have the same sort of access as other students and it’s heartening to hear that it has that effect.

NG: Truly, that is the case.

And that’s the end of Part I on fostering accessibility in training. Please check in next week for Nadia’s accessibility tips for developers of online platforms and tools for interpreters.