I don’t even notice them until I’m watching on someone else’s system. Subtitles are such a ubiquitous part of my viewing experience that they’re on even for the movies I show my students at work.
Subtitles are the difference between being totally lost and being, maybe, able to figure out the plot. I caught a glimpse of Macbeth playing. I say glimpse, perhaps without full honesty; it was the opening few scenes, which knowing the plot in advance should make easier to follow. And it does. Just not enough, not this time. I’ve read Macbeth, and I know the plot. The words are the same as ever, more or less. But without subtitles, and without the ability to ask someone what was going on, I found myself adrift.
So when a video came across my Facebook feed, as they so often do, this one in sign language (with subtitles) advocating for ubiquitous subtitles, I pondered the utility. Obviously, for those who are deaf or hard of hearing, subtitles are essential, far more so than they are for me.
Videos are everywhere. My students, when tasked with finding information, turn more often to video than to text. Would having subtitles on everything improve their use of language? Their spelling? I don’t know.
I know that subtitles help me pick up names more easily, and catch aspects of a plot I might otherwise miss. They give me an anchor, one more edge in figuring out the things the rest of the world seems to catch by reading faces. Videos, for all that I prefer to avoid them, seem to be the norm, as we shift into a world that is very, very visual.