We're excited to announce that the Swift Upload SDK is now generally available! The 1.0.0 release includes support for standardizing input video up to 4K (2160p) resolution. Input standardization now checks average bitrate levels, and the SDK no longer attempts to initiate an upload if it was canceled during standardization. The documentation and example code have also been refreshed.
Mux’s Upload SDK for Android is now GA! Our upload SDK helps app developers upload large video files reliably, even on iffy networks, allowing uploads to resume if they are interrupted by network loss or app death. It can also “standardize” your videos to 720p, 1080p, or 4K resolution, which minimizes processing time for your Mux Video assets.
Mux Video can now generate clips of live streams and assets created from live streams instantly and at no extra cost. Instant clipping is controlled by setting `program_start_time` and `program_end_time` on the playback URL of a live stream or asset.
Mux Video now supports configuring a playback restriction to filter playback requests by the User-Agent HTTP header. It is possible to filter both traffic with no user agent and known high-risk user agents, as defined internally by Mux. This feature can be used alongside signed URLs and referrer restrictions to increase the security of your video delivery.
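A minimal sketch of what such a restriction payload might look like when created via the Playback Restrictions API; the field names under `user_agents` are assumptions, so check the API reference for the current shape:

```python
import json

# Hypothetical payload for creating a playback restriction.
# The field names under "user_agents" are assumptions; verify them
# against the Playback Restrictions API reference.
restriction = {
    "referrer": {"allowed_domains": ["*.example.com"]},
    "user_agents": {
        "allow_no_user_agent": False,        # reject requests with no User-Agent header
        "allow_high_risk_user_agent": False, # reject user agents Mux flags as high risk
    },
}
body = json.dumps(restriction)
```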
We’ve added support for error categorization in the Objective-C Core SDK. Error details are now sent alongside each error sent to Mux and are no longer deduplicated. Watch time will continue to be recorded after an error event is received based on the playhead progression.
Objective-C Core GitHub Release
Objective-C Core SDK docs: sending error events
We’ve added support for error categorization in the Java Core SDK. Error details are now sent alongside each error sent to Mux and are no longer deduplicated. The error event API now allows an error context to be set on the error event directly. Watch time will continue to be recorded after an error event is received based on the playhead progression.
Mux Video now supports receiving live streams over the Secure Reliable Transport (SRT) protocol, which can increase the reliability of live streams being sent over imperfect networks. Streams sent to Mux over SRT can optionally be sent with the HEVC codec to reduce network bandwidth requirements.
You can now sign up for our Starter plan, which is $10/mo for $100 of usage and includes access to all Mux products. Usage over $100 transitions to pay-as-you-go rates. No commitment, cancel anytime. A credit card is required to activate the plan.
Mux Node SDK 8.0 supports Deno, Cloudflare Workers, Bun, and other non-Node JS runtimes. This SDK also has improved TypeScript types, including types for webhook payloads. Version 8 solves the big pain points we had been hearing about from the community. This upgrade comes with significant syntax changes, so we have provided a codemod script for upgrading.
See the GitHub release for more details and a link to the upgrade guide:
https://github.com/muxinc/mux-node-sdk/releases/tag/v8.0.0
Automatic Cold Storage is now available. Assets automatically transition to the Cold or Infrequent storage tiers based on when they were last viewed, and become less expensive to store. Assets must have MP4s disabled to qualify.
Mux Video now supports 2 new playback modifiers, `min_resolution` and `rendition_order`. `min_resolution` allows developers to limit the available renditions by a minimum resolution, for example `720p`. `rendition_order` allows developers to set the order in which renditions appear in the manifest.
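For illustration, both modifiers are ordinary query parameters on the playback URL. The playback ID below is hypothetical, and `desc` is an assumed `rendition_order` value:

```python
from urllib.parse import urlencode

def manifest_url(playback_id: str) -> str:
    # min_resolution drops renditions below 720p; rendition_order
    # controls how renditions are ordered in the manifest.
    params = urlencode({"min_resolution": "720p", "rendition_order": "desc"})
    return f"https://stream.mux.com/{playback_id}.m3u8?{params}"

url = manifest_url("abc123")  # hypothetical playback ID
```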
Mux Video now supports a new type of on-demand asset with free video encoding. Baseline assets are a cost-effective option for video applications that have simpler quality needs. The standard tier of video encoding is now referred to as "smart" encoding, and offers superior visual quality and scalability for video-centric applications.
Auto-generated captions can now be enabled on Mux Video Assets through the API at asset creation time. Additionally, a plain text transcript of the video is available for assets where auto-generated captions are enabled.
We’re excited to announce that the Mux Data SDK for Media3 is now out of public beta! Version 1.0.0 of our SDK for Media3 has full feature parity with our ExoPlayer SDK, plus lots of internal improvements and the ability to automatically detect metadata of the media you play. If you’re ready to migrate over from the old ExoPlayer, check out our dev guide to update your Mux Data integration.
Mux Video now supports ingesting, storing, and delivering videos at 4K resolutions (2160p), allowing customers to deliver higher quality video content to viewers.
Mux Video now supports adding alternate audio tracks to a video asset. This can be used to allow users to pick between different audio languages, to add director's commentary, or to increase accessibility with descriptive audio.
We now have 2 new SDKs for handling Direct Uploads from mobile applications: Mux Uploader for iOS and Android, both released in beta. Mux Uploader handles file splitting and chunking logic to optimize upload speed, and in cases of non-standard input, it processes the video on the client to minimize the time it takes for the asset to be playable after upload.
We’ve added more features to the Build tab for both Assets and Live Streams. You can now create thumbnails, gifs, and timeline hover previews with storyboards for Assets and Live Streams right from the Dashboard.
The Build tab for Spaces has two new sections, one for integrating and another for broadcasting, that direct users on how best to build a Real-Time experience with Mux.
We’ve improved the Viewer Engagement metrics in the Monitor tab for both Assets and Live Streams by adding 7-day historical Views and Overall Viewer Experience data. We removed the “unique viewers” and “updated (x)s ago” counters for clarity; current views will continue to update live. We’ve also improved the null state for clarity.
Passthrough Data is now shown in the Details tabs in Assets and Live Streams for better visibility and use.
iOS and Android SDKs for Mux Real-Time Video (Spaces) have been released to General Availability. This 1.0.0 release of both SDKs includes new features (Custom Events, Display Names), performance improvements, and bug fixes. Please refer to the GA blog post for more details.
Version 1.2.0 of the Spaces Web SDK has been released. This version adds support for setting, updating, and retrieving participant display names.
Spaces Web SDK 1.2.0 Release Notes.
Guide to using participant display names.
A new Live Stream Health Stats API is now available in Public Beta for developers to get live ingest health status. For example, you can call the API to return the live stream stats to your streamer during a live event, so that the streamer can monitor the status and take actions when issues occur.
Read this documentation for more information.
This update to the web SDK for Spaces includes the ability to publish custom events in the session, a helper function to create `LocalTrack`s from a `MediaStream`, a change to the default number of subscriptions from 20 to 16, and enforcement of published track limits. For more details, see the Spaces Web SDK 1.1.0 Release Notes.
You can now export your Live Stream Input Health data to your infrastructure using the Streaming Exports feature. You can send the Live Stream Input Health messages, which occur every five seconds for each active Live Stream, to an Amazon Kinesis or Google Pub/Sub endpoint in your cloud account. The docs provide a how-to guide with more detailed information. Please note that this feature is available to customers on a Media plan, or if it has been added as part of a Mux Video contract. You can find this feature by going to the Settings menu in your Mux Dashboard and clicking the “Streaming Exports” option. If that option is not available in your Mux Dashboard, contact our sales team for more information.
Mux now supports WebP images for storyboards, thumbnails, and GIFs. WebP images improve website performance because of their smaller file sizes compared to JPEG and PNG. Faster page loads from lighter images reduce bandwidth and can positively impact site ranking in search engines.
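For example, the image format can be selected with the file extension on the thumbnail URL (the playback ID below is hypothetical):

```python
def thumbnail_url(playback_id: str, fmt: str = "webp") -> str:
    # Mux serves still images from image.mux.com; the file extension
    # selects the image format (webp, png, jpg).
    return f"https://image.mux.com/{playback_id}/thumbnail.{fmt}"

url = thumbnail_url("abc123")  # hypothetical playback ID
```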
This release of the Mux Data SDK for AVPlayer adds the ability to track the number of dropped video frames in a session, as well as adding 5 more custom dimensions to track whatever data is relevant to you and your audience.
Mux Real-Time Video (Spaces) has been released to General Availability. This update includes updated SDKs for all platforms, multi-region support, user experience report collection, and additional improvements and bug fixes. Please refer to the following for more details:
We are announcing the release of the Mux Data SDK v3.0.0 for ExoPlayer. This update converts most of the SDK to Kotlin, removes long-deprecated methods, and introduces API changes. A simpler SDK setup process no longer requires you to specify the screen size of your device. You may now pass your `ENV_KEY` through a constructor instead of supplying it to `CustomerPlayerData`. Finally, we removed exposed internal callback methods, which likely will not impact you.
For the full list of API changes, check out the release notes on GitHub.
We’ve released Reconnect Window support for all modes of live streams, including reduced and low latency, in Beta. The Reconnect Window is the time in seconds you want Mux to wait for a live stream to reconnect before considering it completed and generating a recorded asset. You can also add a slate image as a video frame during live stream interruptions to let your viewers know the video isn’t over and you’re trying to reconnect. You can read more in the docs and the blog post.
We’ve released Custom Domain for Mux Video to Beta. You can stream videos or serve images from your branded domain instead of from `stream.mux.com` and `image.mux.com`. You can learn more about Custom Domains, reasons to use them, and information on requesting access from the announcement blog.
By providing technical terms and proper names to us before a live stream, we can increase the accuracy of auto-generated live closed captions. Create a transcription vocabulary by making a `POST` request to the `/transcription-vocabularies` endpoint and defining the input parameters.
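A minimal sketch of such a request body, assuming a `name` and a `phrases` list (both field names are assumptions; verify them against the API reference):

```python
import json

# Hypothetical body for POST /transcription-vocabularies; the "phrases"
# field name is an assumption, so verify it against the API reference.
vocabulary = {
    "name": "Product terms",
    "phrases": ["Mux", "HLS", "SRT", "RTMP"],
}
body = json.dumps(vocabulary)
```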
Closed captions are the visual display of the audio in a program. Auto-generated live closed captions use AI-based speech-to-text technology to generate closed captions. You can enable live auto-generated captions by adding the `generated_subtitles` array at the time of stream creation or to an existing live stream.
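As a sketch, a live stream creation body with live captions enabled might look like the following; the fields inside `generated_subtitles` are assumptions, so check the Live Stream API reference:

```python
import json

# Hypothetical live stream creation body with live captions enabled.
# The fields inside generated_subtitles are assumptions; check the docs.
live_stream = {
    "playback_policy": ["public"],
    "new_asset_settings": {"playback_policy": ["public"]},
    "generated_subtitles": [
        {"name": "English CC", "language_code": "en"}
    ],
}
body = json.dumps(live_stream)
```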
We've added version 0.1.0 of our `mux-csharp` SDK. This is the initial release of the Mux C# SDK and it reflects the current state of the Mux API. You can read more on the 0.1.0 Release Notes. You can also view the NuGet package and GitHub repository. This is a release that we're confident is stable and usable in production, but we would love any customer feedback! To submit feedback, please email sdks@mux.com.
We’ve released `v2.7.0` of our Mux Data SDK for ExoPlayer, and it’s a big one. We added support for setting dimension values from HLS session data, and fixed some bugs related to CDN detection. We also added support for ExoPlayer 2.17.1, and the Official Port of ExoPlayer for Amazon Devices. You can update to either version by following Step 1 of our Dev Guide. You can read more about this update in the Release Notes.
If you are a new Mux Video user, you can enjoy an updated getting started experience. The new Assets and Live Streams pages make it easy to upload your first video, find in-context onboarding guidance and get quick links to support and documentation. Users who have already uploaded a video no longer see the Getting Started link in the navigation.
We’re pleased to announce the initial release of the Mux Data SDK for Bitmovin Player for Android. The first public version is v0.5.1, and it reports all playback events. Support for Bandwidth, Experiments, and Live Latency is being planned or investigated for the v1.0.0 release. Read more in our Integration guide for the Bitmovin Player Data SDK or the GitHub repository.
We’ve added support for tracking experiment values via metadata such as `X-SESSION-DATA` HLS tags. The tags override the values of the dimensions we track. Once the session data tags on the main playlist are loaded by your player, you may pass them to `MuxStats::setSessionData(List<SessionTag>)` in order to track the experiment values. Currently, experiments are only supported for HLS streams.
We’ve added the `max_continuous_duration` parameter to Live Streams. You can now set the maximum duration for recording a single live stream event to less than 12 hours. Set the `max_continuous_duration` parameter during Live Stream creation, or update an existing live stream. When the duration is reached, the behavior is the same as signaling that the live stream has finished. For more information, see the Live Stream API and the Signal a live stream has finished API.
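A sketch of a creation request body that caps a single broadcast at 4 hours; the surrounding fields are illustrative:

```python
import json

# Hypothetical creation body capping a single broadcast at 4 hours.
# max_continuous_duration is in seconds; 12 hours is the default maximum.
live_stream = {
    "playback_policy": ["public"],
    "max_continuous_duration": 4 * 60 * 60,  # 14400 seconds
}
body = json.dumps(live_stream)
```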
We’ve added a new `status` filter to the List Live Streams API endpoint. With this new filter, you can get a list of all live streams that have `active`, `idle`, or `disabled` status. The list is sorted by the live stream’s creation time, from most recent to oldest. You can learn more in the API Reference.
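For example, the filtered list request is a plain query parameter (the `limit` parameter below is an assumption based on Mux's other list endpoints):

```python
from urllib.parse import urlencode

def list_live_streams_url(status: str, limit: int = 25) -> str:
    # status accepts active, idle, or disabled per this changelog entry;
    # the limit parameter is an assumption based on Mux's list endpoints.
    assert status in {"active", "idle", "disabled"}
    query = urlencode({"status": status, "limit": limit})
    return f"https://api.mux.com/video/v1/live-streams?{query}"

url = list_live_streams_url("active")
```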
We’ve introduced Referrer Validation, a new method of Playback Restrictions to secure your videos. You can restrict your videos to play only on your approved websites with Referrer Validation. Mux validates the requesting website against your approved list by examining the HTTP Referrer header sent by the Web browser. This feature requires the use of Signed URLs. You can read more on the Blog Post, API Reference, and Guide.
We’ve added the PATCH method to several API endpoints for updating Assets & Live Streams. You can update the `passthrough` parameter value of Assets & Live Streams anytime after creating them. Similarly, you can update the `latency_mode` and `reconnect_window` parameter values of Live Streams. For more details, see the Live Streams PATCH API or Assets PATCH API documentation.
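A sketch of a PATCH body for a live stream, sending only the fields being changed; the `low` value for `latency_mode` is an assumption, so verify it against the Live Stream API reference:

```python
import json

# Hypothetical PATCH body for /video/v1/live-streams/{LIVE_STREAM_ID};
# only the fields being changed are sent. The "low" value for
# latency_mode is an assumption; check the Live Stream API reference.
patch = {
    "passthrough": "internal-id-123",
    "latency_mode": "low",
    "reconnect_window": 60,  # seconds
}
body = json.dumps(patch)
```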
Streamers can broadcast on the same Live Stream multiple times, each time creating a new video asset. We’ve added a new `live_stream_id` filter to the Delivery Usage API. With this new filter, you can get delivered-minutes usage information for all the video assets created from a single live stream. You can learn more from the API Reference.
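For illustration, a filtered Delivery Usage request might look like this; passing `live_stream_id` as a direct query parameter is an assumption, so check the API reference for the exact filter syntax:

```python
from urllib.parse import urlencode

def delivery_usage_url(live_stream_id: str) -> str:
    # Passing live_stream_id as a direct query parameter is an assumption;
    # check the Delivery Usage API reference for the exact filter syntax.
    query = urlencode({"live_stream_id": live_stream_id})
    return f"https://api.mux.com/video/v1/delivery-usage?{query}"

url = delivery_usage_url("abc123")  # hypothetical live stream ID
```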
You can now use Mux Data with Kaltura video players to collect engagement and quality of experience metrics. We’ve added new Mux Data SDKs for Kaltura web, iOS, and Android players. To configure and use the SDKs refer to the documentation: Kaltura web SDK, Kaltura iOS SDK, Kaltura Android SDK.