Basics of bandwidth usage
Like any other digital asset, a video is delivered over the internet to browsers and apps as data. The amount of data transferred is the bandwidth consumed, and that is what you are charged for. This bandwidth can be divided into two parts:
- Used bandwidth: data for the part(s) of the video that the viewer actually watched
- Overhead bandwidth: data for part(s) of the video that were downloaded but not viewed. There is always some overhead because:
  - Video data is buffered ahead of the playback position so that the video won't keep pausing while additional data is downloaded
  - In some cases, devices preload part or all of the video data (depending on the video file format) before the viewer tries to play it. The network bandwidth available to devices is often less than what's available to computers, so preloading allows the video to start playing more quickly when the viewer taps the play button
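As a rough illustration, the split can be expressed as simple arithmetic on byte counts. The helper function and the numbers in the example are hypothetical, for illustration only:

```python
# Hypothetical helper splitting total delivered data into "used" and
# "overhead" bandwidth; the byte counts in the example are made up.

def bandwidth_split(bytes_downloaded: int, bytes_watched: int):
    """Return (used, overhead) in bytes.

    bytes_watched covers the portion the viewer actually played;
    anything else that was downloaded counts as overhead.
    """
    used = min(bytes_watched, bytes_downloaded)
    overhead = bytes_downloaded - used
    return used, overhead

# Buffering and preloading pulled down 65 MB in total, but the viewer
# only watched 50 MB worth of the video:
used, overhead = bandwidth_split(65_000_000, 50_000_000)
print(used, overhead)  # 50000000 15000000
```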
Bandwidth usage is going up! There are several reasons why this happens:
- More viewers: overall, the number of people who connect to the internet through computers and devices is increasing, and they are watching an increasing amount of video online. Of course, this is a global increase, and it varies for particular sites and apps. You can tell whether you have more viewers using Video Cloud Analytics, by looking at metrics such as video impressions, video views, and unique users.
- More internet bandwidth: thanks to improvements in the infrastructure and network protocols that support the internet, internet users have, on average, more available bandwidth (again, this is a global average, with local variations). This has two implications for online video:
  - More video data can be downloaded faster, whether the user views the video or not
  - Faster download times mean that higher quality renditions can be viewed in many cases
- Internet usage is shifting toward devices: this includes consumption of online video, and it tends to increase bandwidth consumption in some cases because of the preloading behavior described above. In addition, improvements in screen quality, memory, and networking for devices mean that viewers are able to view higher quality renditions of videos than they could in the past.
- Changes in Video Cloud: the most significant change here is the introduction of Ingest Profiles, which give you greater flexibility in the sets of renditions you create for different kinds of video. The renditions defined in the standard ingest profiles have increased in quality over time in response to the changes described above, allowing many users to consume higher quality video.
To the extent that these factors increase used bandwidth, that is a good thing: it likely means you have more viewers watching better quality renditions of your video. These increases inevitably mean an increase in overhead bandwidth as well, however, and the added overhead may be significant enough that you would like to reduce it if possible. The sections that follow discuss the various factors that affect bandwidth consumption, and the options you have for reducing it if you wish to.
In the post-Flash world, video files are delivered to browsers and devices over HTTP connections, and the file type determines how this is done. MP4 renditions are delivered as whole files, while time-segmented renditions (HLS or DASH) are delivered in short segments as they are needed. The difference can have a significant impact on overhead bandwidth, especially on devices that preload the video. If the viewer stays on the page or app view long enough, an entire MP4 rendition will be downloaded, even if none of it is watched. HLS renditions, on the other hand, are generally broken into 10-second segments, so in the same scenario only part of the video data would be downloaded.
It is still important to have MP4 renditions for legacy platforms that cannot play segmented video formats, but you can significantly reduce overhead bandwidth by ensuring that all of your videos have a good set of HLS renditions (or DASH renditions, if you require DRM types that HLS doesn't support).
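The difference can be sketched numerically. The segment length, forward buffer, and bitrate below are illustrative assumptions; actual segment sizes and buffering behavior vary:

```python
import math

# Sketch comparing data downloaded for a whole-file MP4 vs. a
# segmented (HLS-style) rendition when only part of a video is
# watched. Segment length, buffer size, and bitrate are illustrative.

SEGMENT_SECONDS = 10  # typical HLS segment duration

def mp4_download_bytes(duration_s: float, bitrate_bps: float) -> float:
    """A preloaded MP4 is fetched in full, watched or not."""
    return duration_s * bitrate_bps / 8

def hls_download_bytes(watched_s: float, bitrate_bps: float,
                       buffer_s: float = 30) -> float:
    """HLS fetches only the segments needed: the watched portion plus
    a forward buffer, rounded up to whole segments."""
    segments = math.ceil((watched_s + buffer_s) / SEGMENT_SECONDS)
    return segments * SEGMENT_SECONDS * bitrate_bps / 8

# A 10-minute video at 2 Mbps where the viewer watches 30 seconds:
print(mp4_download_bytes(600, 2e6))  # 150000000.0 bytes (~150 MB)
print(hls_download_bytes(30, 2e6))   # 15000000.0 bytes (~15 MB)
```

In this scenario the whole-file MP4 costs roughly ten times the bandwidth of the segmented rendition, nearly all of it overhead.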
As you have already seen, by default most devices will preload the first HLS (or DASH) segment immediately, or an entire MP4 rendition if no time-segmented rendition is available. This improves the apparent performance of the video: when the viewer taps the play button, playback can begin immediately. Better performance generally increases the play rate for the video. The cost is that video bandwidth is consumed even for videos that are rarely or never played.
Whether you should turn off preloading for some or all players depends on several factors:
- If video is the focus of your page or you have good reason to believe a user will watch your video, you should leave preloading on.
- Look at your video analytics - videos with very low play rates are the worst offenders for wasting bandwidth, so consider turning off preloading for players that deliver these videos.
- Turning off preloading is recommended if you have several players on the same page, as it is less likely that all the videos will be viewed.
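To weigh the trade-off, you can estimate the bandwidth consumed by preloading alone for players with low play rates. This is a back-of-the-envelope sketch with made-up numbers; the helper function is hypothetical, not a Video Cloud API:

```python
# Rough estimate (hypothetical numbers) of bandwidth consumed by
# preloading for impressions that never turn into plays.

def preload_overhead_bytes(impressions: int, play_rate: float,
                           preload_bytes: int) -> int:
    """Data downloaded by preloading for impressions that never play.

    play_rate is plays / impressions (from your analytics);
    preload_bytes is the size of the first segment (or of the whole
    MP4 on platforms without a segmented rendition).
    """
    never_played = impressions * (1 - play_rate)
    return int(never_played * preload_bytes)

# 100,000 impressions, a 5% play rate, and a ~2.5 MB first segment:
wasted = preload_overhead_bytes(100_000, 0.05, 2_500_000)
print(wasted)  # 237500000000 bytes, i.e. ~237.5 GB
```

For a player like this one, turning off preloading would save almost all of that overhead, at the cost of a short delay when playback starts.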
Higher bitrate video renditions provide better quality video, but at the cost of consuming more bandwidth. If delivering the highest quality video possible in all cases is your top priority, then this is a cost you have to live with. However, it's worth considering whether you can save overhead bandwidth in this area by retranscoding videos and eliminating some of the highest bitrate renditions (if you are using the high-resolution profile for all your videos, for example, you are creating some very high bitrate renditions).
Your video analytics can help here. Look at the device types and device OSs where your videos are being viewed. If views are predominantly on phones, for example, smartphones on Wi-Fi connections may have enough internet bandwidth to use very high quality renditions, but on a phone screen these may not look noticeably better than lower quality renditions. The best approach is to download the various renditions for your videos from Studio and view them on your target platforms to see how much quality difference there is between the highest bitrate renditions and the lower ones.
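As a rough illustration of the potential savings, here is a per-view comparison at two illustrative bitrates. The bitrates and average watch time are assumptions, not values from your account:

```python
# Back-of-the-envelope comparison (illustrative bitrates) of per-view
# data for a high-bitrate rendition vs. the next one down.

def bytes_per_view(avg_watched_s: float, bitrate_bps: float) -> float:
    """Data consumed per view at a given rendition bitrate."""
    return avg_watched_s * bitrate_bps / 8

high = bytes_per_view(120, 4.5e6)   # e.g. a 4.5 Mbps 1080p rendition
lower = bytes_per_view(120, 2.0e6)  # e.g. a 2 Mbps 720p rendition
print((high - lower) / 1e6)  # 37.5 MB saved per two-minute view
```

Multiplied across thousands of views on small screens where the quality difference is hard to see, dropping the top rendition can add up to substantial savings.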
The Brightcove player automatically buffers video during playback, meaning that upcoming segments of a video are downloaded ahead of time so the video doesn't pause for additional data when the end of a segment is reached. By default, the player tries to maintain a buffer of video data ahead of the current playback position.
Bandwidth overhead due to buffering is proportionally greater for short videos. Consider, for example, a one-minute video of which, on average, thirty seconds are viewed. With roughly thirty seconds buffered ahead of the playback position, every view consumes about as much overhead bandwidth as used bandwidth.
The amount that is buffered is based on the player's current playback time. The reasoning behind this is that the longer a viewer watches content, the more likely they are to continue watching and complete it, so more content can be buffered without wasting bandwidth. The logic is very simple: the player linearly interpolates the buffer goal between a minimum and a maximum based on the current time. The default minimum and maximum are 30 and 60 seconds, respectively, so once the viewer has watched 30 seconds of content, the player maintains a forward buffer of approximately 60 seconds. The implementation does not track whether the viewer has scrubbed forward in the video.
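The interpolation described above can be sketched as follows. The constants match the stated defaults, but the function is an approximation of the described behavior, not the player's actual implementation:

```python
# Sketch of the buffering logic described above: the forward-buffer
# goal grows linearly with playback time, clamped between a minimum
# and a maximum. Names and structure are assumptions; the player's
# real implementation may differ.

BUFFER_MIN_S = 30  # default minimum forward buffer (seconds)
BUFFER_MAX_S = 60  # default maximum forward buffer (seconds)

def buffer_goal(current_time_s: float) -> float:
    """Linearly interpolate the buffer goal from BUFFER_MIN_S at the
    start of playback up to BUFFER_MAX_S once the viewer has watched
    BUFFER_MIN_S seconds of content."""
    if current_time_s <= 0:
        return BUFFER_MIN_S
    if current_time_s >= BUFFER_MIN_S:
        return BUFFER_MAX_S
    fraction = current_time_s / BUFFER_MIN_S
    return BUFFER_MIN_S + fraction * (BUFFER_MAX_S - BUFFER_MIN_S)

print(buffer_goal(0))   # 30
print(buffer_goal(15))  # 45.0
print(buffer_goal(30))  # 60
```

Note that because the goal depends only on current time, scrubbing forward does not reset it: a viewer who jumps to the 30-second mark immediately gets the maximum buffer goal.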