Learn how to add subtitles or captions to your videos for accessibility and multi-language support.
Introduction to subtitles and captions
Learn about the common uses for videos with subtitle support such as accessibility and multi-language support
How to add subtitles to your video
Learn how to add subtitles to your assets
Workflow for generating subtitles
Learn how to build a workflow for generating subtitles for your assets
Subtitles and captions allow for text overlays on a video to be shown at a specified time. First, let's clarify these two terms which are often used interchangeably.
Subtitles assume the viewer can hear the audio and provide a text version of the dialog, often translated into another language. Captions assume the viewer cannot hear the audio, so they also describe non-speech sounds. If you see a line like [crowd cheers] on your screen, you are seeing captions. In any case, Mux supports both in the form of WebVTT or SRT files, and these files can be human- or computer-generated. From Mux's perspective, these files are converted into "text tracks" associated with the asset. If the text track you provide contains captions, supply the attribute closed_captions: true when creating the text track.
The rest of this guide will use the term "subtitles" to refer to adding text tracks that can be either subtitles or captions.
You can add subtitles to any video asset in Mux. To add subtitles, you will need to provide either an SRT or WebVTT file containing the subtitle information to the Mux API.
Here's an example of what a WebVTT file looks like:

WEBVTT

00:28.000 --> 00:30.000 position:90% align:right size:35%
...you have your robotics, and I
just want to be awesome in space.

00:31.000 --> 00:33.000 position:90% align:right size:35%
Why don't you just admit that
you're freaked out by my robot hand?
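To make the cue timing format concrete, here's a small Python sketch (an illustration, not part of Mux's tooling) that parses a cue timing line like the ones above into start and end times in seconds:

```python
import re

# Parse a WebVTT timestamp like "00:28.000" (MM:SS.mmm) or
# "01:02:28.000" (HH:MM:SS.mmm) into seconds.
def vtt_timestamp_to_seconds(ts: str) -> float:
    parts = ts.split(":")
    if len(parts) == 2:  # hours omitted
        hours, minutes, seconds = 0, int(parts[0]), float(parts[1])
    else:
        hours, minutes, seconds = int(parts[0]), int(parts[1]), float(parts[2])
    return hours * 3600 + minutes * 60 + seconds

# Parse a cue timing line, ignoring optional cue settings
# (position, align, size) that follow the timestamps.
def parse_cue_timing(line: str):
    match = re.match(r"(\S+)\s+-->\s+(\S+)", line)
    if not match:
        raise ValueError(f"not a cue timing line: {line!r}")
    return (vtt_timestamp_to_seconds(match.group(1)),
            vtt_timestamp_to_seconds(match.group(2)))

start, end = parse_cue_timing(
    "00:28.000 --> 00:30.000 position:90% align:right size:35%")
print(start, end)  # 28.0 30.0
```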
Mux does not have an API for automatically generating subtitles for videos.
When you create an asset in Mux, you can also include text tracks as part of the input. There's no limit on the number of tracks you can include when you make the request.
The first input in your array of inputs must be the video file. After that, append the caption tracks to the list, each with the source URL of the caption file plus additional metadata. Here's an example:
{
"input": [
{
"url": "{VIDEO_INPUT_URL}"
},
{
"url": "https://tears-of-steel-subtitles.s3.amazonaws.com/tears-en.vtt",
"type": "text",
"text_type": "subtitles",
"closed_captions": false,
"language_code": "en",
"name": "English"
},
{
"url": "https://tears-of-steel-subtitles.s3.amazonaws.com/tears-fr.vtt",
"type": "text",
"text_type": "subtitles",
"closed_captions": false,
"language_code": "fr",
"name": "Française"
}
],
"playback_policy": [
"public"
]
}
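If you assemble this request body in code, the key constraint is ordering: the video input first, then one entry per text track. Here's a minimal Python sketch; the URLs and the helper name are placeholders, not part of any Mux SDK:

```python
# Sketch: build the create-asset request body, keeping the video input
# first and appending one entry per subtitle file. All URLs and track
# metadata below are placeholders.
def build_create_asset_body(video_url, subtitle_tracks,
                            playback_policy=("public",)):
    inputs = [{"url": video_url}]
    for track in subtitle_tracks:
        inputs.append({
            "url": track["url"],
            "type": "text",
            "text_type": "subtitles",
            "closed_captions": track.get("closed_captions", False),
            "language_code": track["language_code"],
            "name": track["name"],
        })
    return {"input": inputs, "playback_policy": list(playback_policy)}

body = build_create_asset_body(
    "https://example.com/video.mp4",  # placeholder input URL
    [{"url": "https://example.com/tears-en.vtt",
      "language_code": "en", "name": "English"}],
)
```

You would then JSON-encode this body and POST it to the create asset endpoint, authenticating with your Mux access token.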
This will enable WebVTT subtitles in the stream URL, which can then be used by many different players.
You can also add text tracks using the create asset track API. This can be helpful for adding captions to live stream recordings once they have finished, or if you need to update or remove languages for a video after it was first added to Mux.
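The body for adding a single track after the fact carries the same text-track fields, without the input array. A Python sketch (the helper name and values are illustrative placeholders):

```python
# Sketch: request body for adding one text track to an existing asset.
# The fields mirror the text-track entries in the create-asset example.
def build_create_track_body(url, language_code, name, closed_captions=False):
    return {
        "url": url,
        "type": "text",
        "text_type": "subtitles",
        "closed_captions": closed_captions,  # True if the file is captions
        "language_code": language_code,      # BCP 47 tag, e.g. "en"
        "name": name,                        # label shown in player menus
    }

track_body = build_create_track_body(
    "https://example.com/tears-fr.vtt", "fr", "Français")
```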
To show subtitles by default, you can include an additional playback modifier with the HLS stream request like this:
https://stream.mux.com/{PLAYBACK_ID}.m3u8?default_subtitles_lang=en
The default_subtitles_lang playback modifier takes a valid BCP 47 language tag and sets the DEFAULT attribute to YES for the subtitles rendition in that language. If there's no exact language match, the closest match of the same language is selected. For instance, a subtitles text track with language en-US is selected for default_subtitles_lang=en. This helps with regional variations and gives more flexibility.
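The closest-match behavior can be pictured with a small sketch. This illustrates the rule described above, not Mux's actual implementation:

```python
# Sketch of closest-match selection: prefer an exact BCP 47 match,
# otherwise fall back to a track whose primary language subtag matches
# (e.g. "en-US" for a requested "en").
def pick_default_track(track_codes, requested):
    requested = requested.lower()
    by_code = {code.lower(): code for code in track_codes}
    if requested in by_code:
        return by_code[requested]
    primary = requested.split("-")[0]
    for code in track_codes:
        if code.lower().split("-")[0] == primary:
            return code
    return None  # no match: no track is marked DEFAULT

print(pick_default_track(["fr", "en-US"], "en"))  # prints "en-US"
```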
Video players will show the default text track for autoplaying videos, even when muted.
default_subtitles_lang with signed URLs

If you are using signed playback URLs, make sure you include the default_subtitles_lang parameter in your signed token.
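In practice, that means the modifier becomes a claim inside the token rather than a query parameter. The claim names below ("sub" for the playback ID, "aud" for the token type) follow Mux's signed-URL conventions as best I recall; verify them against the signed playback documentation. Only the claim set is sketched here; the token must still be signed as a JWT with your Mux signing key:

```python
import time

# Sketch: claims for a signed playback token that also carries the
# default_subtitles_lang playback modifier. Claim names are assumptions
# to verify against Mux's signed-URL docs; signing itself is not shown.
def build_playback_claims(playback_id, default_subtitles_lang,
                          ttl_seconds=3600):
    return {
        "sub": playback_id,  # the playback ID being signed
        "aud": "v",          # token audience for video playback
        "exp": int(time.time()) + ttl_seconds,
        "default_subtitles_lang": default_subtitles_lang,
    }

claims = build_playback_claims("PLAYBACK_ID", "en")
```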
The A11Y project is a community-driven effort to make digital accessibility easier and includes checking videos for accessibility.
With Mux videos, the jsx-a11y/media-has-caption rule fails because it looks for a <track> element inside the player. However, Mux videos include subtitles in the HLS manifest when you request the stream.
If you have added text tracks to your Mux videos you can safely disable this linting rule and still provide accessible video.
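For example, assuming you are using ESLint with the jsx-a11y plugin enabled, you could turn the rule off in your ESLint configuration (a sketch; adjust to your config file format):

```json
{
  "rules": {
    "jsx-a11y/media-has-caption": "off"
  }
}
```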
You may want to generate subtitle tracks for your Mux assets. These might be machine-generated or human-generated, by yourself or a third party. Some example third-party services you might use to do this are Rev.com and Simon Says.
Using static renditions and webhooks from Mux, your automated flow might look like this:
1. Create an asset (from a video file via the input parameter, or the recording of a live stream).
2. Add mp4_support to your asset, either at asset creation time or after the asset is already created. See the Download your videos guide for details about how to do this.
3. Wait for the video.asset.static_renditions.ready webhook. This lets you know that the MP4 rendition(s) are now available.
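The webhook step can be sketched in Python. The envelope fields ("type" plus a "data" object with the asset ID) follow Mux's standard webhook payload shape; the handler name is illustrative:

```python
import json

# Sketch: inspect an incoming Mux webhook and, when static renditions
# are ready, return the asset ID whose MP4 you would then download and
# send to your transcription service.
def handle_webhook(raw_body: str):
    event = json.loads(raw_body)
    if event.get("type") != "video.asset.static_renditions.ready":
        return None  # some other event; nothing to do here
    return event["data"]["id"]  # the asset ID to fetch the MP4 for

asset_id = handle_webhook(
    '{"type": "video.asset.static_renditions.ready",'
    ' "data": {"id": "ASSET_ID"}}')
print(asset_id)  # prints "ASSET_ID"
```

Once you have the transcript back from your transcription service, add it to the asset with the create asset track API described earlier.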