Ahead of the launch of 5G networks this year, the buzz around speed is palpable. It’s about more than speed for the sake of speed, though. Faster networks can transform actual user experiences, with early tests showing that an entire 4K movie can be downloaded in as little as eight seconds. But while faster speeds and more bandwidth grab the headlines, the killer feature of 5G is low latency. Mobile carriers are exploring ways to leverage low latency to bring higher-fidelity content to mobile devices, ranging from game streaming to mobile mixed reality experiences and even remote surgery.
Low latency not only reduces the lags and delays that can ruin gaming; it can also bring more power to devices with limited hardware. How? By moving more and more of that GPU power to the cloud.
Rethinking rendering with a hybrid approach
“We know we can’t do 100 percent of the rendering on the devices simply because we don’t have enough batteries to support it, and if we did, we would melt the device – we simply cannot make that much heat,” Dr. Morgan McGuire, a research scientist at Nvidia, explained. “We’ve known for a while that cloud streaming has to be the answer.”
But streaming a 4K or 8K gaming experience, especially at high frame rates, is intensive for any network, let alone a mobile network, so carriers are devising new approaches to handling immersive, interactive experiences. “We can’t just think of it like we used to do — 100ms latency for passive media and now we’re reducing it to 10ms or 1ms, and that’s what we need for [gaming and VR],” said Alisha Seam, a principal researcher at AT&T’s Foundry lab in Palo Alto, California. “It’s really not that simple.”
Depending on the research and the network, generally accepted latency targets for game streaming range from a few milliseconds to approximately 20ms.
One option, which AT&T and Verizon are exploring for gaming, is to take a hybrid approach and split the rendering duties between the cloud and the device. Essentially, hybrid rendering relies on a virtual gaming PC in the edge cloud to do most of the heavy lifting, while the client device handles some of the decoding of the rendered images. It’s a technique that Nvidia is using for its GeForce Now game streaming platform, which gives lightweight computing devices access to GeForce graphics located in the edge cloud.
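To make the division of labor concrete, here is a minimal sketch, in Python, of what a hybrid-rendering loop could look like from the client’s point of view: the edge does the expensive scene render, and the device only decodes the result and composites a cheap local layer on top. The classes, method names and timings are hypothetical stand-ins, not code from GeForce Now or any carrier’s stack.

```python
import time
import random

class EdgeRenderer:
    """Stands in for the virtual gaming PC in the edge cloud."""
    def render(self, user_input):
        time.sleep(0.004)                      # simulate heavy GPU work (~4 ms)
        return f"scene rendered for {user_input}"

class ThinClient:
    """Stands in for the phone or headset: decode plus a cheap local overlay."""
    def decode(self, encoded_frame):
        time.sleep(0.001)                      # simulate hardware video decode (~1 ms)
        return encoded_frame

    def overlay(self, frame, user_input):
        return f"{frame} + HUD({user_input})"  # light rendering stays on the device

edge, client = EdgeRenderer(), ThinClient()
for _ in range(3):
    user_input = random.choice(["turn_left", "jump", "fire"])
    start = time.monotonic()
    frame = client.overlay(client.decode(edge.render(user_input)), user_input)
    print(f"{frame}  ({(time.monotonic() - start) * 1000:.1f} ms)")
```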
“There are different approaches where you can launch everything as a web service, or virtualizing everything as a VM where you send your inputs up and your video streams down, or finally having a hybrid stack where you’re decoupling the render loops and seeing what you can actually offset,” said Raheel Khalid, the chief engineer for Verizon’s virtual reality lab, adding that each solution has its own pros and cons.
But when you’re streaming games, successful experiences have to be evaluated with new metrics. Passive video streams were fine with 100ms of latency, but for VR and gaming, the server isn’t just pushing content to your eyes; you’re constantly sending input to the server based on your movements in the game. The server then has to render new video and send it back to the device to be decoded and displayed, so latency has to be reduced even further.
“The big determining factor for user experience is responsiveness, and we’re fighting against the vestibulo-ocular reflex – the correlation between what your eyes see and where your inner ear places you when you turn your head – and that’s on the order of 7ms,” Seam explained, referring to VR experiences.
“The metric that we’re most interested in is motion to photon latency, which is measured by time-stamping packets,” Seam continued. “We measure what input the server is responding to, so the user will hit an input. We will send that packet to the server with a stamp from the user, render something with that input and send it back. And we can basically measure the time between the user making that movement, what we’re rendering, and what is actually displayed to users.”
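The measurement itself is easy to sketch. The toy loop below, in Python, follows the recipe Seam describes: stamp the input when the user makes it, carry the stamp through the pipeline, and report the gap when the frame is finally displayed. The sleeps stand in for real network and render stages, and the numbers are invented for illustration.

```python
import time

def measure_motion_to_photon(num_samples=5):
    """Stamp each input when the user makes it and report how long it takes
    for the resulting frame to reach the display."""
    latencies_ms = []
    for _ in range(num_samples):
        input_stamp = time.monotonic()   # the user makes a movement
        time.sleep(0.003)                # uplink: stamped input packet travels to the edge
        time.sleep(0.005)                # the server renders a frame for that input
        time.sleep(0.003)                # downlink plus client-side decode
        displayed_at = time.monotonic()  # the frame is presented to the user
        latencies_ms.append((displayed_at - input_stamp) * 1000)
    return latencies_ms

samples = measure_motion_to_photon()
print("motion-to-photon samples (ms):", [round(s, 1) for s in samples])
print("mean:", round(sum(samples) / len(samples), 1), "ms")
```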
AT&T predicts that game streaming will largely rely on split rendering, where the servers render the scenes and the horsepower on the client devices is reserved for clever technologies that augment the gaming experience. Seam envisions client devices filling in the gaps with tricks like foveated rendering for VR. “We don’t want the client side to be consumed with trying to make up for the network, so the more closely we’re able to couple the network performance with the performance of the software layer of these applications, the more we’re able to let them do what they should be doing and not have them be just network workarounds,” she said.
That’s different from the historical relationship cloud gaming has had with the network. In the past, cloud gaming platforms tried to compensate for the unpredictability of the network by sending through massive amounts of data in the hope that some of those packets would arrive. When they do arrive, many of these packets are delayed or out of order, placing strain on the client device to decode and rearrange them in a timely and orderly manner.
“The latency number itself is important, but even more so is the distribution of that latency, and that’s something that we can really get into with 5G and edge,” she continued, noting that jitter, or the variability of the latency, plays a significant role in streaming performance.
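A quick way to see why the distribution matters: the Python snippet below compares two invented latency traces with nearly the same mean, where the spiky one would feel far worse because consecutive frames arrive at wildly different delays. The values are illustrative, not measurements from any network.

```python
import statistics

# Two made-up latency traces with roughly the same mean but very different jitter.
steady = [12.0, 12.5, 11.8, 12.2, 12.1, 11.9, 12.3, 12.2]   # ms
spiky  = [ 5.0, 25.0,  6.0, 22.0,  4.0, 28.0,  5.0,  1.0]   # ms

for name, trace in (("steady", steady), ("spiky", spiky)):
    mean = statistics.mean(trace)
    jitter = statistics.mean(abs(b - a) for a, b in zip(trace, trace[1:]))
    print(f"{name}: mean latency {mean:.1f} ms, mean frame-to-frame jitter {jitter:.1f} ms")
```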
Verizon’s plug-in model
Verizon’s hybrid approach is largely app-based, and the carrier has worked with Unreal Engine to create an edge-based plug-in that enables split streams.
“When we have this ecosystem with very complex games — which have a lot of rendering potential that have to have a lot done in a short amount of time and you have to also maintain a 60 or 120 Hz refresh rate on your device where you have your inputs received and never have jitter or lag — we look at how you decouple these two things, and what we started to do is build a new paradigm for game engines where you can take certain rendering steps and push them into the edge, or the cloud, and decouple that from the input loops that you have on your device,” Khalid said.
“And that’s mainly where we focus on – the future of that main rendering stack and how we’re going to achieve that,” Khalid continued. “How are you going to take the input and the frame update and separate that from the things that you need? As we’ve gone and built this Unreal Engine plugin and investigated how you build split rendering stacks, we figured out what you’re going to take and move up into the edge and what you can move up into the cloud.”
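As a rough sketch of that decoupling (an illustration of the paradigm, not Verizon’s actual Unreal Engine plug-in), the Python snippet below runs two independent loops: the device keeps presenting at a fixed 60 Hz using whatever the newest edge-rendered frame happens to be, while the heavy rendering runs asynchronously and simply updates a shared slot whenever it finishes.

```python
import threading
import time

latest_edge_frame = {"frame": None}    # shared slot holding the newest edge-rendered frame
lock = threading.Lock()
stop = threading.Event()

def edge_render_loop():
    """Runs 'in the edge cloud' at whatever rate the GPU and network allow."""
    n = 0
    while not stop.is_set():
        time.sleep(0.03)               # simulate ~33 ms of heavy rendering plus transport
        with lock:
            latest_edge_frame["frame"] = f"edge_frame_{n}"
        n += 1

def device_loop(hz=60, ticks=12):
    """Runs on the device at a fixed refresh rate and never blocks on the edge."""
    for tick in range(ticks):
        with lock:
            frame = latest_edge_frame["frame"] or "locally_predicted_frame"
        print(f"tick {tick:02d}: present {frame}")   # composite local input state and display
        time.sleep(1 / hz)
    stop.set()

threading.Thread(target=edge_render_loop, daemon=True).start()
device_loop()
```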
Regardless of the approach, both Seam and Khalid agree that frame loss is a major factor in making game streaming successful. Gamers may not care if certain effects – like lighting or shadows – are delayed by a frame or two. What makes or breaks the user experience is input lag and frame loss.
“Hardcore gamers care about that. If you have a frame loss, it’s going to be the end of that service. You’re never coming back, and why would you? It’s too big an impact,” Khalid said. “The casual gaming crowd may be more accepting and tolerant, but at the end of the day, we’ve built the holy grail: How do you go and separate your game update loop and your game buffer update from the most compute-heavy operations?”
The economics of 5G
Because interactive streaming, like gaming, is more complex to deliver, carriers expect to charge a premium to gamers who demand a more stable network experience on 5G.
“It’s just so much more complicated because you can’t do something as simple as a progressive download,” said John Benko, a researcher at Orange’s Silicon Valley lab. “Since we’re talking about really pushing 10, 50, 100, 200 Mbps through the wireless channel, this is not going to be something that everyone can do and expect to pay the exact same price that they’re paying now to stream a 2 Mbps signal. So, the economics will need to be looked at carefully to see how we can make it a reality for people who want it.”
Part of the advantage of using a 5G network, Benko explained, is that operators can create network slices for particular use cases, offering more reliability and stability for customers who are willing to pay more.
Casual traffic from a mobile phone can be deprioritized during congestion, for example, but if your client device is provisioned for gaming or VR, the network can offer a guaranteed experience with a promised latency range for that use. Beyond gaming, mission-critical applications like remote surgery can be provisioned to an even higher priority tier to avoid any network interruptions that could endanger the application.
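A heavily simplified model of that tiering might look like the Python sketch below: each use case maps to a slice with a priority and a latency target, and congestion is absorbed by the lowest-priority traffic first. The slice names, numbers and logic are hypothetical illustrations, not any operator’s real configuration.

```python
from dataclasses import dataclass

@dataclass
class Slice:
    name: str
    priority: int            # lower number = higher priority
    latency_target_ms: float

SLICES = [
    Slice("remote_surgery", priority=0, latency_target_ms=5.0),
    Slice("cloud_gaming_vr", priority=1, latency_target_ms=10.0),
    Slice("best_effort_mobile", priority=2, latency_target_ms=100.0),
]

def deprioritize(slices, congestion_level):
    """Shed load from the lowest-priority slices first when the cell is congested."""
    victims = sorted(slices, key=lambda s: s.priority, reverse=True)
    return [s.name for s in victims[:congestion_level]]

print("deprioritized under light congestion:", deprioritize(SLICES, 1))
print("deprioritized under heavy congestion:", deprioritize(SLICES, 2))
```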
Though 5G promises to deliver a lot for gaming and other uses, cost remains a big factor for mobile adoption of game streaming.