How to reduce latency for your WebRTC media server

18 April 2023

Our recent series of blogs on Web Real-Time Communication (RTC) applications has highlighted the key considerations you need to make at every stage of your RTC app's life cycle. Whether you are optimizing bandwidth, designing at the network level, or preventing DDoS attacks to keep your users happy, reducing latency remains a fundamental concern.

For clarity’s sake, latency refers to the delay between the moment a user sends a data packet and the moment the receiver gets it. High latency can degrade the quality of audio and video communication, causing disruptions, poor quality and video lag. In this article, we zoom in on the latency problem and look at the best practices and strategies WebRTC application developers can adopt at the infrastructure level to counter it.


Leveraging RTC media servers

RTC media servers act as intermediaries between two or more users communicating via WebRTC. They work by relaying audio and video data between users: when a user sends audio or video data, the server receives it and forwards it to the other participants in the conversation. This reduces the amount of direct peer-to-peer communication required, which helps lower latency and bandwidth usage. Using RTC media servers can thus help keep your users happy with uninterrupted service.
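The relay model described above can be sketched in a few lines. This is an illustration of the forwarding idea only, not a real media-server API: the participant registry, queues and packet bytes are all hypothetical.

```python
# Minimal sketch of the relay idea behind an RTC media server:
# each incoming packet is forwarded once to every other participant,
# so a sender uploads a single stream regardless of group size.

def relay_packet(sender_id, packet, participants):
    """Forward one media packet from sender_id to all other participants.

    participants maps a participant id to its outbound queue
    (a plain list here, standing in for a network send buffer).
    """
    for pid, outbound in participants.items():
        if pid != sender_id:          # never echo back to the sender
            outbound.append(packet)

# Usage: three participants, Alice sends one packet.
rooms = {"alice": [], "bob": [], "carol": []}
relay_packet("alice", b"rtp-packet-1", rooms)
```

The key property is that Alice sends the packet once; the server fans it out, so her uplink cost does not grow with the number of participants.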

Latency in RTC media servers

As we’ve seen in previous articles, network congestion can be a reason for increased latency as data packets wait to be transmitted through the network. This, however, shouldn’t usually be the case if the network is properly designed.

There are other factors that may add latency, however.

Physical distance

The actual distance between two points on a network contributes to latency, because data must travel over physical fiber and its speed is capped below the speed of light. Naturally, the further apart the two points are, the higher the latency is likely to be. Luckily, there are practices that can mitigate this. One of the more obvious ones is to choose server locations based on where your users are. The physical distance between the server and the participants is one of the key factors that influence network latency; by choosing server locations closer to the participants, you reduce the time it takes for data to travel back and forth.

Network equipment

Network equipment such as routers, switches and firewalls can have a significant impact on latency and ultimately the end-user experience on your WebRTC application. Latency can be reduced by choosing routers and switches with low latency and fast switching capabilities. To reduce firewall latency, you can configure the firewall to allow RTC traffic to bypass some of the inspection processes.
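One concrete way to help QoS-aware routers and switches treat your media packets with priority is DSCP marking. The sketch below assumes a Linux-style socket API and marks a UDP socket with the Expedited Forwarding class (DSCP 46); whether intermediate equipment honors or strips the mark depends entirely on the networks in the path.

```python
import socket

# Mark a UDP media socket with the Expedited Forwarding DSCP class (46),
# which QoS-configured routers and switches can map to a low-latency queue.
# DSCP occupies the upper 6 bits of the TOS byte: 46 << 2 == 0xB8.

DSCP_EF = 46
TOS_EF = DSCP_EF << 2  # 0xB8

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS_EF)

# Read the TOS byte back from the kernel to confirm the mark.
print(sock.getsockopt(socket.IPPROTO_IP, socket.IP_TOS))
```

Marking is cheap to apply at the server, but it only pays off on network segments whose equipment is configured to act on it.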

Improving latency for RTC media servers

To reduce latency for RTC media servers, you need to ensure that your network is optimized for real-time communication. Here are some tips to help:

Choose a server with a low-latency network connection

The first step in reducing latency on RTC media servers is to ensure that the server is connected to the internet through a low-latency network connection. This will ensure that messages are transmitted quickly between the server and the participants.

Leverage a network that is peered with local ISPs

RTC media servers work best when the network they’re connected to is peered locally. This not only means that the server should be located as close as possible to the users who are communicating, but also that the network the media server is connected to is locally peered with the main ISPs your end users are leveraging.

Look at network architecture

Beyond connections to ISPs, how a network is designed can set it up for failure or success. Are there enough redundant pathways to pick up your traffic in case of drops or unplanned incidents? Do neighboring regions have direct routes, or does traffic have to take long, circuitous routes to reach its destination? Are there regularly planned expansions to the network that can connect you to new locations and allow existing Points of Presence (PoPs) to be bolstered as a result? Speak to your infrastructure provider's network engineers to understand their strategy before committing.

Optimize the server configuration

You can optimize the server configuration by using a high-performance server with a powerful processor and sufficient memory. Additionally, you can configure the server to use efficient codecs and other settings that minimize latency. You could also set up your server to favor UDP over TCP, which can help reduce latency. UDP is a connectionless protocol within the Internet protocol suite: unlike TCP, it does not establish a connection (or perform a “handshake”) between the source and the endpoint before data transmission, and it does not retransmit lost packets or enforce ordering. Datagrams are handed to the network and delivered as-is, trading some reliability for lower delay, which is exactly the trade-off real-time media wants.
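The connectionless behavior of UDP can be seen in a tiny loopback example: the client fires a datagram at the server with no handshake, and the server replies to whatever address the datagram came from. The echo handler here is illustrative, not part of any media-server codebase.

```python
import socket
import threading

# Loopback demonstration of UDP's connectionless model: no connection
# state is set up or torn down on either side.

server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
server.bind(("127.0.0.1", 0))            # let the OS pick a free port
server_addr = server.getsockname()

def echo_once():
    data, addr = server.recvfrom(2048)   # block until one datagram arrives
    server.sendto(data, addr)            # echo it straight back

threading.Thread(target=echo_once, daemon=True).start()

client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
client.settimeout(2.0)
client.sendto(b"media-frame", server_addr)   # fire-and-forget, no handshake
reply, _ = client.recvfrom(2048)
print(reply)
```

Note that WebRTC media is normally carried over UDP (via SRTP) for precisely this reason; the configuration work is usually about making sure UDP is not blocked and forced onto a TCP fallback.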

Use a Content Delivery Network (CDN)

A CDN is a network of servers that are distributed geographically and designed to deliver content quickly to end-users. By using a CDN, you can reduce the distance between the participants and the server, which can significantly reduce latency.

Implement WebRTC data channels

WebRTC data channels allow for direct communication between participants without having to go through the server. This can significantly reduce latency as messages are transmitted directly between participants.

Use Adaptive Bitrate (ABR) streaming

ABR streaming is a technique that allows the server to adjust the quality of the video or audio stream based on the available network bandwidth. By using ABR streaming, you can ensure that the communication experience remains smooth and uninterrupted even if network conditions change.
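The core ABR decision can be sketched as picking the highest rung of a bitrate ladder that fits within the measured bandwidth, with a safety margin so short-lived dips do not immediately cause stalls. The ladder values and margin below are illustrative, not taken from any specific product.

```python
# Sketch of an ABR bitrate selection step.

LADDER_KBPS = [250, 500, 1200, 2500, 4000]  # lowest to highest quality

def select_bitrate(available_kbps: float, safety: float = 0.8) -> int:
    """Return the highest ladder rung that fits within safety * bandwidth."""
    budget = available_kbps * safety
    chosen = LADDER_KBPS[0]                 # never go below the lowest rung
    for rung in LADDER_KBPS:
        if rung <= budget:
            chosen = rung
    return chosen

print(select_bitrate(3500))  # 2500
print(select_bitrate(700))   # 500
```

Real implementations add hysteresis so the stream does not oscillate between rungs when bandwidth hovers near a boundary, but the fit-within-budget step above is the heart of it.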

Use a dedicated server for signaling

Separating signaling from media traffic can help reduce latency by offloading some of the processing from the media server. By using a dedicated server for signaling, you can ensure that messages are delivered quickly between participants.
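The separation can be sketched as a standalone service that only routes small SDP/ICE messages between named peers and never touches media packets, leaving the media server's CPU and network budget dedicated to media. The message shape below is illustrative, not a standard protocol.

```python
# Minimal in-memory sketch of a dedicated signaling service.

class SignalingServer:
    def __init__(self):
        self.inboxes = {}                  # peer id -> pending messages

    def register(self, peer_id):
        self.inboxes[peer_id] = []

    def send(self, sender, target, kind, payload):
        """Route one signaling message (e.g. an SDP offer) to target."""
        self.inboxes[target].append(
            {"from": sender, "kind": kind, "payload": payload}
        )

    def poll(self, peer_id):
        """Drain and return the peer's pending messages."""
        messages, self.inboxes[peer_id] = self.inboxes[peer_id], []
        return messages

sig = SignalingServer()
sig.register("alice")
sig.register("bob")
sig.send("alice", "bob", "offer", "v=0 ...")   # SDP body elided
msgs = sig.poll("bob")
print(msgs)
```

In production the transport would typically be WebSockets or HTTPS, but the point stands: signaling traffic is tiny and bursty, and hosting it separately keeps it off the media path.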

These are some strategies you can employ to significantly reduce latency on RTC media servers and ensure a smooth and seamless communication experience for your users. Speak to our experts for more information on this or to get detailed advice for your own RTC project.

Main Take-Aways

Online performance and latency can make or break a game. Make sure you run on a high-performance, low-latency global network with years of experience handling massive online games. With an extended suite of products tailored for seamless, safe, and fun online experiences, our Game Hosting Platform is flexible, scalable, and fast. It just works.

Get in touch with our experts and discover what our network can do for your game.
