Speaker 1: Hello, and welcome to this class, Subtitling Basics. I'm Dr. Jolly here at Missouri State University. Thanks for joining me. This is going to be a very brief presentation on just the basics of subtitling videos. So, let's get started. We know that in today's day and age, we're seeing a lot more screens, and that means that the demand for subtitles is higher than ever. If we look around our daily lives: we're at the airport and we glance at the CNN monitor, we see the closed captions; or we might watch a television show and turn on the closed captioning. We see the screens on airplanes, or we turn on Netflix and start streaming something, and maybe the captions come on. And if you're just browsing social media like Facebook or Instagram, you'll also notice that most of the video content is captioned. And that's for a very good reason: studies show that up to 85% of all videos viewed on Facebook are viewed with the sound off. So, we sort of take these captions for granted, but they're popping up all over the place. Most video content these days is captioned or has subtitles, that is, translated captions. And one thing that I wanted to mention is that although a couple of the examples on the screen are closed captions, which originated to assist the deaf and the hearing impaired, that's no longer their only audience. Like I said, almost all video content is captioned. So, let's talk really briefly about the different types of captions and subtitles. There used to be a big distinction between closed captions, open captions, subtitles, and same language subtitles. These distinctions are kind of melting away, and now we just talk about captions, and sometimes subtitles if they're translated. But originally, in about the 1970s, when concern started to build about how to give the deaf and the hearing impaired access to video, closed captions emerged. These were captions that appeared on the screen and were originally designed for the deaf or the hearing impaired. Now, if you've ever seen closed captions, and I'm sure all of you have, you'll notice that they usually contain, sometimes in parentheses or brackets, descriptions of the sounds. So it might say something like "ominous music playing" or "crash from the kitchen," some description of sound. And again, that is for the benefit of the deaf or the hearing impaired. So you're getting the transcription of the language, but also a description of the different sounds. That's typical of closed captions, because they were designed for the deaf and hearing impaired. Another thing: the reason they're called closed captions, as opposed to open captions, is that you can turn them on or off using the settings on your device, for example on your television, your iPad or other tablet, or your phone. If you can turn the captions on or off, technically they are closed captions. Now, these types of captions can be created either in post-production, after a video, television show, or movie has already been produced, or they can happen in real time, like you'll often see on the evening news or during a live sports event. Those are called real-time or online captions, and they're done by a stenographer with the aid of computer software. Open captions, on the other hand, are captions that cannot be turned on or off.
So, lots of times when you're watching a video, maybe one that pops up on your Facebook feed, it has captions that you don't have any control over. They're basically burned onto the image, right? They're part of the video image that you're seeing, and you can't turn them on or off. That's done on purpose because, as I mentioned earlier, producers of video content for social media know that most people will view the video with the sound turned way down or off. So they layer the caption onto the video and freeze it, or burn it, into place. The fact that you can't turn it on or off is what makes it an open caption. And then, finally, subtitles. There's such a thing as same language subtitles. That's when the dialogue or the narration of the video is transcribed and appears, usually at the bottom, in the same language. These have been used in literacy programs around the world, in places where a large percentage of the populace may be illiterate, so broadcasters add same language subtitles. And you're starting to see it more and more on television here in the United States: you might be watching a reality TV show, and you don't even notice it, but everything the characters or the narrator say is appearing in English in captions at the bottom. Those would be same language subtitles. But when we think about subtitles in the context of translation, we're usually thinking about subtitles that have been translated, that is, foreign language subtitles: the translation of the dialogue or narration into a different language. That's what we're going to focus on for the rest of this video. So, I wanted to make the point that subtitling is a form of translation. Subtitling is the written translation, and remember that translation is written while interpreting is spoken, of the dialogue or, very frequently, the narration of a film or some other type of video presentation. So, if you're the translator of the subtitles, you're translating that dialogue or that narration. You might first have to create a transcript, which is tedious, time-consuming, and kind of a pain, or you may, hopefully, be given an electronic file. But then, and that's my second bullet point, what the characters actually say is often not the same as the script, so you have to make sure the script reflects what is actually being said on screen. And that's my third bullet point: your subtitles, what you're actually translating, should reflect the words that are spoken as part of the dialogue or narration of that video. Take movies, for example, or television series, the quantity of which is just exploding with streaming services like Amazon, Hulu, Netflix, Sling, and all the others, all producing their own original content, movies and series. The dialogue in this type of content often involves really informal language, even slang, right? Street language, very informal. And that poses a challenge for translation, because most of the time translators work in more structured, higher-register genres. There are also some other factors that make the translation of subtitles challenging, right?
And that's because the translation of dialogue or narration may be constrained by at least a couple of factors that we'll talk about. One is the screen width, right, the aspect ratio. You run out of screen; you just can't fit that many characters, that many words, on the screen. And also, readers can only read and mentally process a certain amount of text in a given amount of time. So we have the area on the screen, and we also have the processing time in our heads, and those are sometimes going to limit us. We may not be able to translate every single word that's said, but we still have to capture the meaning. And that's one of the differences, it's not really on the slide, between subtitling and dubbing. When you hear the dubbed language track, it might not match the subtitle translation. So if you had the foreign language audio, let's say it's Spanish, a dubbed version, and you also had Spanish language subtitles, they might not match, because they have different constraints: people can process spoken speech faster than they can read written text on the screen. Because of this, the translator often has to come up with creative solutions, and you have to be very skilled and have a sort of artistic temperament, a sensitivity, to translate this type of content effectively. You might also have to use special tools, special software, even special hardware or equipment, to place the subtitles onto the video clip. Let's talk a little bit about some of these specifications that you probably never noticed, or maybe you did notice, as you watch captioned or subtitled content. Remember, the terminology isn't so important, because it's very fluid and the distinctions are blurry. As you watch captions or subtitles, whether they're in your own language or in a foreign language, you may never have noticed some of these things, but they are constraints, technical specifications, for subtitles. And there's no master manual anywhere. Some organizations have produced guidelines; the BBC, for example, has published subtitle specifications, if you want to Google "BBC specifications for subtitles." Some countries have specifications too, but in general, there's no global standard. I've given you some generic specifications here on this slide. Subtitles should be centered and placed as low as possible, typically in the lower one-fifth or one-sixth of the screen. You may have noticed that movie screens are getting wider and wider and shallower and shallower, so that's becoming a challenge. Subtitles should be readable, so they have to stand out. Subtitles used to be produced in a yellow color, then they went to a more off-white or cream color, and now you see a lot of subtitles that are actually white. But if you look really closely on a high-resolution or high-definition television, you'll see that there's a shadow behind each letter that helps it pop, or there might be what's called a ghost box, a gray box that the subtitles stand out against. You can see a couple of examples of the shadowing here on the screen. Subtitles should also be in a clear font, typically what's called a sans-serif font, like the ones you see on the screen, something like Arial or Helvetica, as opposed to Times, okay?
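As a concrete illustration of those readability conventions, here is a minimal sketch, not from the lecture itself, of how one might express them for a web video player that supports styled WebVTT text tracks. The file name, the sample cue, and the exact style values are illustrative assumptions, not an official standard.

```python
# Sketch: write a WebVTT caption file whose STYLE block applies the
# readability conventions mentioned above: a clear sans-serif font and
# white text that "pops" via a shadow or a semi-transparent ghost box.
# File name, cue text, and exact values are illustrative assumptions.

VTT_CONTENT = """WEBVTT

STYLE
::cue {
  font-family: Arial, Helvetica, sans-serif;  /* clear sans-serif font */
  color: white;                               /* white text ... */
  text-shadow: 1px 1px 2px black;             /* ... with a shadow behind each letter */
  background-color: rgba(0, 0, 0, 0.4);       /* or a gray "ghost box" behind the line */
}

1
00:00:01.000 --> 00:00:03.000
I was hoping I could reason with them.
"""

with open("example_captions.vtt", "w", encoding="utf-8") as f:
    f.write(VTT_CONTENT)
```

By default, players render WebVTT cues centered near the bottom of the frame, which matches the "as low as possible" placement guideline above.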
Also, there's traditionally a restriction on the number of characters you can have in each line, and on the number of lines: you rarely see more than two lines of subtitles stacked on the screen, because you just can't read text that fast. The lines are typically limited to about 35 characters each, which usually works out to about five to seven words per line. Also, if you pay attention, the titles will be on the screen for different durations, but usually not very long, because people talk fast, camera angles change, and there are transitions between scenes and things like that. And, as I mentioned earlier, a person can only process visual text mentally so fast. So if you have a title that's one or two words, like "Good morning," it might be up for maybe one and a half seconds. If you have a single line, like you see down here with Keanu Reeves, "I was hoping I could reason with them," that title might be up for a maximum of about three seconds. And if you have two lines, maybe two characters who are in dialogue, or just one character saying more than six or seven words, that's going to be on the screen for a maximum of about seven seconds. These are generalizations, but typically you'll find that they hold. A little bit more on the specifications. Typically, the subtitle appears just a fraction of a second before the character's mouth starts moving, and then it lingers. So subtitles have a slight lead time and a slight lag time: as a person talks, the subtitle appears right before they speak, and it stays on the screen for just a split second after they finish speaking. When you're doing subtitle translation, you have to be watching the actual video, the scenes, right? It's not necessarily always a movie; it could be a documentary, it could be a commercial for a local car dealership. But there might be transitions, or cuts, or camera changes, and the subtitles can't stay on the screen as the camera cuts, right? So you have to be aware of camera shot changes, such as cuts and transitions. Also, if you pay attention to your captions or titles, you'll notice that certain typefaces, such as italics, and punctuation, like dashes and ellipses, have a special purpose. Typically, like you can see in the Arnold Schwarzenegger example, I think this might be from the movie Predator, if two different characters have lines on screen at the same time, you'll have dashes to indicate the dialogue. That's not necessary if only one person is talking, even if two people are present. An ellipsis, the dot, dot, dot, means that the subtitle is going to continue: the person is going to keep talking, and a new subtitle will follow. You can see, I think this movie is called Ted, down here in the lower right-hand corner, that after the word planeamos there's a dot, dot, dot, an ellipsis. That just means the same character is going to continue talking, and the line will carry over into another camera shot or another two-line stack of subtitles. Sometimes you'll hear a voice from off-camera, maybe the phone rings and you can hear the answering machine in the other room, or something like that. Typically, when the person speaking is off-screen like that, or when it's a voiceover, which is even more frequent, the titles will appear in italics. So those are just a few examples.
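To make those rules of thumb concrete, here is a minimal sketch, assuming the generic numbers quoted in the lecture (about 35 characters per line, at most two lines, and roughly 1.5, 3, and 7 second maximum durations), of a check you might run over a single subtitle cue. The function name and exact thresholds are illustrative, not an industry standard.

```python
# Sketch: check one subtitle cue against the generic specifications above.
# The thresholds mirror the lecture's rules of thumb; they are assumptions,
# since there is no single global standard for subtitle specifications.

MAX_CHARS_PER_LINE = 35
MAX_LINES = 2

def check_cue(lines, start_sec, end_sec):
    """Return a list of warnings for one cue (its text lines plus timing)."""
    warnings = []
    duration = end_sec - start_sec

    if len(lines) > MAX_LINES:
        warnings.append(f"{len(lines)} lines; more than {MAX_LINES} is hard to read")

    for line in lines:
        if len(line) > MAX_CHARS_PER_LINE:
            warnings.append(f"line exceeds {MAX_CHARS_PER_LINE} characters: {line!r}")

    # Rough maximum durations: ~1.5 s for one or two words,
    # ~3 s for a single full line, ~7 s for a two-line cue.
    word_count = sum(len(line.split()) for line in lines)
    if word_count <= 2:
        max_duration = 1.5
    elif len(lines) == 1:
        max_duration = 3.0
    else:
        max_duration = 7.0

    if duration > max_duration:
        warnings.append(f"on screen {duration:.1f} s; suggested maximum is {max_duration} s")

    return warnings

# Examples: a short greeting that passes, and a two-line cue held too long.
print(check_cue(["Good morning."], 5.0, 6.2))  # -> []
print(check_cue(["We planned this for months,",
                 "and now everything has changed."], 10.0, 18.5))
```

A real subtitling tool would also check reading speed, shot changes, and lead/lag times, but this captures the line-length, line-count, and duration guidelines just described.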
There are other examples of how translators use punctuation, and the type itself, in conventional ways in subtitles. Finally, just a few quick notes about how subtitles are produced and created these days. Movie studios, both the big ones and the little independent studios, streaming services like Netflix, Hulu, and Amazon Prime, and other video production companies, like the company down the street or a department at your university, will hopefully hire qualified translators to produce subtitles. There are numerous apps and software programs that can be used to create subtitles; just Google "subtitling software," "subtitling equipment," or "subtitling apps" and you'll find them. Also, there's a curious phenomenon that you may or may not be aware of, which I think really started in the anime community, where people wanted to collaborate on subtitles, or to subtitle a film or an animated program that didn't have subtitles in their language. So they would download the subtitles, say in English or in Spanish, a very common language, overwrite that text file with their translation, you can see an example down here in the lower right, and then upload it back to the website. That's called fansubbing, and if you Google, for example, "subtitle download sites" or "fansub sites," you'll find that there are dozens. The upper image on the screen is from one of those sites; I can't remember the name of the site. You can see that they have hundreds of different television shows and movies, and for each of those, let's just say it's The Walking Dead, for example, you will find each episode of each season in dozens of languages, and not only that, but several versions for each language, for each episode, for each season. So it's crazy, and these people are volunteer translators, right? Okay, so fansubbing, or subtitle downloads, if you want to Google that. And then finally, video aggregators, or video website services like YouTube, also strongly encourage content creators, and I think they may even start requiring it at some point, to add captions and subtitles to their videos, and they make it easy. When you upload a video, it will ask you what the video language is. If you check English, Spanish, French, or another very common language, German, Portuguese, Chinese, then the magical computers at Google and YouTube will go into the audio track associated with that video and automatically create captions. The process takes anywhere from a few minutes to a couple of days. So if you go back to that video, you'll see that automatic captions have been created, and then there are fairly easy ways to create subtitles in different languages automatically, or to download that caption file, edit it, really refine it so that it's perfect, and then translate it into different languages as well. I'm going to upload a separate video about adding subtitles in different languages to YouTube videos. So, that was a really quick intro to the phenomenon of captions and subtitles in the early 21st century. Just to recap: the increase in screens, streaming services, and content from many countries has led to an increase in demand for captions and subtitles.
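To give a sense of what those downloadable subtitle files look like, here is a minimal sketch of the fansubbing workflow just described, assuming the common SubRip (.srt) format used on most subtitle download sites: keep the cue numbers and timecodes, and replace only the text. The sample cues, the Spanish renderings, and the helper function are all illustrative.

```python
# Sketch of the fansubbing workflow: a downloaded .srt file is a plain text
# file of numbered cues with timecodes; the fansubber keeps the numbering
# and timing and rewrites only the text lines in the target language.
# The sample cues and translations below are illustrative assumptions.

import re

DOWNLOADED_SRT = """\
1
00:00:01,000 --> 00:00:02,500
Good morning.

2
00:00:03,200 --> 00:00:06,100
I was hoping I could
reason with them.
"""

# Hypothetical translations, keyed by cue number.
TRANSLATION = {
    "1": "Buenos días.",
    "2": "Esperaba poder\nrazonar con ellos.",
}

TIMECODE = re.compile(r"\d{2}:\d{2}:\d{2},\d{3} --> \d{2}:\d{2}:\d{2},\d{3}")

def translate_srt(srt_text, translations):
    """Replace each cue's text with its translation, keeping numbers and timecodes."""
    output = []
    for block in srt_text.strip().split("\n\n"):
        lines = block.split("\n")
        number, timing = lines[0], lines[1]
        assert TIMECODE.match(timing), f"unexpected timing line: {timing!r}"
        output.append("\n".join([number, timing, translations[number]]))
    return "\n\n".join(output) + "\n"

print(translate_srt(DOWNLOADED_SRT, TRANSLATION))
```

YouTube's caption editor works with similar timed-text files, which is why a downloaded automatic caption track can be refined and then translated in much the same way.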
You've never seen as many captions and subtitles in your life as you do today, and tomorrow you'll see more, and next year you'll see more. Because of this, captioning and subtitling are seen as increasingly important and valuable. Again, it's practically a requirement for anyone who knows what they're doing in social media, digital video production, or filmmaking that their content be captioned in the same language, and also be translated and given subtitles in multiple other languages. As I mentioned a few times, the distinctions we used to draw, closed captions, open captions, same language subtitles, foreign language subtitles, are all kind of blurring in this context, and I think a kind of hybrid form is emerging. So it doesn't really matter much whether we say captions or subtitles. We talked about subtitling as a form of translation, something that requires a lot of skill and creativity, in part because subtitles are subject to many technical specifications in terms of their length, number of lines, duration on the screen, font, punctuation, et cetera. And finally, and this is a good thing, technology has made it easier for any translator, whether amateur or professional, to produce and share subtitles for their favorite content. And hopefully that's an extra stream of income, if you're lucky enough to work in subtitling. I always thought that would be the dream job, right, to be hired as an in-house translator for Columbia Pictures, or Fox, or one of the major studios. That would be so cool, to just sit around subtitling films all day into your favorite language, or languages. Anyway, that's it for the lesson. I hope you've enjoyed it, and I'll see you in the next video lesson.