We’ve been busy working with a leading game studio to measure how much our solutions at Edgegap can improve the player experience by lowering latency and increasing fairness. In their 1-versus-1 game scenario, we’ve demonstrated a colossal improvement for their players:
· Improved latency for 91% of the players
· Average RTT per player reduced by 46.5%
· Average RTT per match reduced by 36%
· Fairness improved from 46.5 ms to 33 ms
· 25% of players below 50ms with Edgegap, vs 2% with the cloud
· 69% of players below 100ms with Edgegap, vs 33% with the cloud
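Fairness here refers to the round-trip-time gap between the two players in a match: the smaller the gap, the more even the playing field. As a minimal sketch (the function names and sample RTT values below are illustrative, not Edgegap's actual telemetry pipeline), such a per-match metric could be computed like this:

```python
# Sketch: per-match fairness as the RTT gap between the two players
# in a 1-versus-1 match. All names and sample values are illustrative.

def match_fairness_ms(rtt_a_ms: float, rtt_b_ms: float) -> float:
    """Fairness = absolute RTT difference between the two players (lower is fairer)."""
    return abs(rtt_a_ms - rtt_b_ms)

def average_fairness_ms(matches: list[tuple[float, float]]) -> float:
    """Average fairness across a set of matches."""
    return sum(match_fairness_ms(a, b) for a, b in matches) / len(matches)

# Example: three hypothetical 1v1 matches (RTTs in ms for players A and B).
matches = [(30.0, 75.0), (42.0, 60.0), (55.0, 58.0)]
print(round(average_fairness_ms(matches), 1))  # -> 22.0 (average gap in ms)
```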
All these improvements translate into more revenue for this studio: players who don’t suffer bad experiences remain engaged. This is especially important as games become services and LiveOps becomes a crucial part of their monetization. We’ve also observed that our solution improves fairness between players, which matters for competitive play. That is highly desirable at a time when esports tournaments are forced to go online due to Covid-19. The studio did not have to modify their netcode or use an SDK; the game client remained as-is. The full case study is linked below:
Our latest case study was picked up by an analyst at Light Reading. We are surprised and happy, as we were not expecting this coverage. With cloud gaming taking off after Stadia’s launch last week, latency is seen as a major roadblock to a great player experience.
We have been working with the well-known gaming studio Ubisoft, here in Montreal, to measure how much our solution can improve the players’ experience. After a few months of work, the results are finally in, and they are beyond expectations!
We were able to reduce the average round-trip time by 58%, and we improved results in 95% of the tests. We also increased the percentage of players in the “great latency” group by a wide margin: without Edgegap’s solution, only 14% of players had latency below 50 ms, while Edgegap would have brought 78% of them under that threshold.
Huge thanks to the team at Ubisoft for helping us and providing the right feedback throughout the process!
Let’s make it simple: we are bringing the deployment of game instances and the player experience to a whole new level, reducing the average latency per match by 33% and improving stability across the network by a whopping 77%. How did we achieve this? Dynamic decision making and Edge computing, my friends!
Imagine you and your friend want to play a multiplayer online game, but there is a problem: your friend lives in San Francisco and you live in Toronto. He tells you, “Let’s go on the West Coast server!” and now he is having a lot of fun with his 30 ms latency… but you aren’t, because you lag so badly you can barely move in the game. You tell your friend, “Sorry, I can’t play, it lags too much, let’s move to the Eastern server!”… and I’ll let you guess what happens next.

The same can happen in a competitive context: you decide to play a ranked game of your favorite title, and as luck would have it, your opponent lives in one of the main cities, right beside the server, while you, peasant, live in a nice quiet town far away. Let me tell you what’s going to happen: you will turn the corner of a wall and: *DEAD*… you will hear footsteps behind you, but by the time you try to react: *DEAD*. You will get angry, the other guy will call you a “noob”, you will rage quit and stop spending time and money on this “stupid” game. In fact, this was not your fault; he simply had 25 ms of latency while you were around 65 ms…
You see, this is one of the use cases that Cloud computing can’t solve without spending billions of dollars building new infrastructure everywhere in the world… but, oh wait! That infrastructure already exists: service providers, private clouds and other Edge infrastructures are already in place. Now imagine that you can use every single one of those edge sites to deploy your server, in the blink of an eye, in the best spot for both you and your friend. You might not both get 30 ms latency, since no one can bend the laws of physics by accelerating the speed of light, but at least both of you will enjoy a great game with minimal impact from the network, and an equivalent outcome that makes it fair for everyone.
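One way to picture that “best spot for both of you”: among the available edge sites, pick the one that minimizes the worst RTT of the two players. The site names and latencies below are made up for illustration, and this is a simplified sketch, not Edgegap’s actual selection algorithm:

```python
# Sketch: choose the edge site that minimizes the worst-case RTT
# across two players. Sites and latencies are hypothetical examples.

# Measured RTT (ms) from each player to each candidate edge site.
rtt_ms = {
    "us-west":    {"toronto": 65, "san_francisco": 25},
    "us-east":    {"toronto": 20, "san_francisco": 70},
    "us-central": {"toronto": 38, "san_francisco": 42},
}

def best_site(rtt_by_site: dict[str, dict[str, int]]) -> str:
    """Pick the site with the lowest maximum RTT across all players."""
    return min(rtt_by_site, key=lambda site: max(rtt_by_site[site].values()))

print(best_site(rtt_ms))  # -> us-central: both players stay near 40 ms
```

Neither player gets their personal best latency, but the worst-off player is far better served than on a coastal server, which is exactly the fairness trade-off described above.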
But let me guess: you want proof that what I say is true? Using our decision-making software, we gathered enough data from every request made in our system to show the improvement you get by deploying game instances directly at the Edge. Using real-time telemetry from 1,000 players connected across 100 matches, we measured how much, and where, we can improve your players’ experience.
You can see below a graph of the distribution we observed in our data:
Coincidence? I think not! Those are facts, and they confirm what Edge computing can bring to your players: satisfaction and happiness! Speaking of which, we calculate the “happiness” of a user by comparing latency, jitter, packet loss and environment context during a request to our system. Click here to see the full report of our case study with 1,000 players, showing how much we can improve their experience:
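The post doesn’t publish the formula, but a weighted score over those network inputs could look like the sketch below. The weights, normalization bounds and function name are assumptions for illustration only; Edgegap’s actual calculation is not public.

```python
# Sketch: a normalized "happiness" score from network metrics.
# The weights and thresholds are invented for illustration;
# this is NOT Edgegap's actual formula.

def happiness_score(latency_ms: float, jitter_ms: float, packet_loss_pct: float) -> float:
    """Return a 0-100 score; higher means a better network experience."""
    # Normalize each metric to [0, 1], where 1 is the worst tolerable value.
    latency_penalty = min(latency_ms / 150.0, 1.0)   # assume 150 ms ~ unplayable
    jitter_penalty = min(jitter_ms / 30.0, 1.0)      # assume 30 ms jitter ~ severe
    loss_penalty = min(packet_loss_pct / 5.0, 1.0)   # assume 5% loss ~ severe
    # Weighted blend: latency dominates, then jitter, then packet loss.
    penalty = 0.5 * latency_penalty + 0.3 * jitter_penalty + 0.2 * loss_penalty
    return round(100.0 * (1.0 - penalty), 1)

print(happiness_score(40.0, 5.0, 0.1))    # crisp connection -> high score
print(happiness_score(120.0, 25.0, 2.0))  # laggy connection -> low score
```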