Pixel Streaming
Ian Berget: Awesome. All right.
Ian Berget: Hi, everyone. This is Ian Berget XR manager at Elara Systems and I’m here with our creative director, Bob Dyce.
Bob Dyce: Hey, everybody.
Ian Berget: We are talking about pixel streaming today: what it is, how we do it, what its uses are, and so on. It's a pretty incredible technology. What we're looking at on-screen right now is Bob's screencast over Discord, and it's an Unreal Engine 4 project that we packaged. We put in all of that real-time lighting and a number of different models, put it together into an app, and we're streaming it to a web browser with pixel streaming. So what's great about this is that we have, ultimately, a desktop computer, whether a physical machine or a virtual machine, running the game, running the engine, producing all these really high-quality visuals. And then we're playing them back on, in this case, a desktop or a laptop, but it could be a phone, a tablet, anything. And it's not limited by the resources of the mobile device or the second device. It's only limited by the internet speed: how quickly that set of pixels can be streamed across.
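As a rough sketch of how a packaged build like this typically gets started: the flags below follow Epic's UE4 Pixel Streaming documentation, but exact flags and paths vary by engine version, and the host, port, and executable names here are placeholders, not the project's actual setup.

```shell
# Launch the packaged UE4 app with Pixel Streaming enabled.
# -PixelStreamingIP / -PixelStreamingPort tell the game where the
# signalling server is listening (values here are placeholders).
./MyProject.exe -AudioMixer -PixelStreamingIP=localhost -PixelStreamingPort=8888

# In a second terminal: run the reference signalling server ("Cirrus")
# that ships with the engine. It brokers the WebRTC connection between
# the viewer's browser and the running game instance.
cd SignallingWebServer   # location differs between engine versions
node cirrus.js
```

Once both are up, pointing a browser at the signalling server's page connects it to the stream.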
Bob Dyce: And really this technology was first dreamed up when streaming content services, Netflix and the like, took off, around mid-2007, I think. From that time on, there's been a dream of: can we stream games in the same way? So there have been many attempts to offload the console to the cloud. At its core, we're just using video streaming, like Netflix, to send the rendered content one way, while mouse movements and clicks are sent back the other way so that the UI works.
So really, as I'm dragging this thing around, it's sending camera-rotation input up to the cloud. That changes the camera position in the game engine there, and the engine sends me back frames. But what it opens the door for, kind of the holy grail of this whole thing, is ultra-high-quality renderings on any device with an internet connection and a browser, so an iPad or a phone, and it looks the same on all devices. It's not without its challenges: there can be lag, and it requires a pretty fast internet connection. But as you can tell here, for compute that's in the cloud, it is pretty quick and really looks amazing.
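The round trip Bob describes, input events going up and frames coming back, can be sketched on the input side. This is a hypothetical message shape, not UE4's actual Pixel Streaming wire protocol; the names (`CameraInput`, `dragToCameraInput`) and the sensitivity value are ours, purely for illustration.

```typescript
// A hypothetical input message the browser front end might send upstream
// over the WebRTC data channel. The real protocol differs; this only
// illustrates the idea of "tiny input up, heavy frames back".
type CameraInput = { kind: "rotate"; yawDelta: number; pitchDelta: number };

// Convert a raw mouse drag (in screen pixels) into camera rotation deltas.
// sensitivity maps pixels of drag to degrees of rotation (assumed value).
function dragToCameraInput(
  dxPx: number,
  dyPx: number,
  sensitivity = 0.1
): CameraInput {
  return {
    kind: "rotate",
    yawDelta: dxPx * sensitivity,
    pitchDelta: -dyPx * sensitivity, // dragging down tilts the camera up
  };
}

// Serialize for the data channel; the cloud instance applies the rotation
// and streams the newly rendered frames back as video.
function encode(input: CameraInput): string {
  return JSON.stringify(input);
}
```

The point of the sketch is the asymmetry: a drag becomes a few dozen bytes of JSON going up, while megabits per second of encoded video come back down.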
Ian Berget: In our industry, where we're doing medical demonstrations, interactive processes, getting into simulations, and doing some industrial VR work as well, this is actually even better for us than it might be for the games industry. Because in games, if you're racing a car, if you're in a sword fight, some combat, whatever it might be, you do need extremely low latency to perform those kinds of actions. In our case, as long as the visuals match and the interaction is there, latency isn't quite as severe an issue, because nothing on our end is expected to be millisecond-perfect. So we have a little more leniency there. Though there are plenty of services like Stadia that are moving in the direction of truly zero lag, insofar as that's possible, with local servers all around the world for gaming. And I'm sure it's just a matter of time before that's available for enterprise as well.
Bob Dyce: Yeah, it's really a technology we've been looking at for a long time. We've run demos internally, as we're seeing now, and watched it work. And now we're at the point where we're running it by customers and showing them that if they really want to distribute across all platforms, without being dependent on high-end graphics cards or very specific computers, if really broad distribution is the key, well then this is a viable solution. It will play on a Mac or a PC. It will play on an iPhone or an Android phone or a tablet, really any device with a nice, strong web connection. We can stream video too. And again, it's thanks to all of the streaming content services out there that really paved the way. And now it's opening the door to two-way interactive streaming.
Ian Berget: Exactly. So we'd be remiss not to speak about the relative downsides of the service compared to some other techniques, because this isn't a magic bullet; it won't solve every problem for everyone. It's definitely one to keep in mind because it adds a lot of functionality, but in terms of where it's currently not up to speed: this won't stream to a VR headset. It's definitely designed for a 2D screen right now, and I wouldn't expect VR streaming to be available this year. I know there are technologies working to change that, especially local in-house streaming, where you would stream from a nearby computer to your own headset without a cord. I know that's coming. Over the internet, that's going to be a little trickier, because the stream will need to match head rotation very quickly. That's going to take some more advanced technology.
The other relative downside is that every single user needs a server on the backend streaming their content to them, unless you have passive viewers plus a single player, where you can kind of hot-seat, going back and forth over who controls what's on-screen. But if you do want many users on this at once, the render time and server time you're paying for just keeps adding up. So if you have a thousand users and expect them all to play simultaneously, your service charge goes up by a factor of a thousand; the cost scales linearly.
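The linear scaling Ian describes is easy to put into numbers. A toy cost model follows; the per-instance hourly rate is an assumed figure for illustration, not a quote from any provider.

```typescript
// Back-of-the-envelope cost model for per-user cloud streaming.
// One GPU instance per concurrent user means total cost scales
// linearly with concurrency. ratePerHour is an assumed figure.
function streamingCost(
  concurrentUsers: number,
  hours: number,
  ratePerHour: number
): number {
  return concurrentUsers * hours * ratePerHour;
}

// One user for an hour at an assumed $1/hr instance rate: $1.
// A thousand simultaneous users for that same hour: $1000.
const oneUser = streamingCost(1, 1, 1.0);
const thousandUsers = streamingCost(1000, 1, 1.0);
```

This is also why hot-seating matters: many viewers sharing one player's stream still consume only a single instance.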
Now that will, I'm sure, depend on the service provider. As you can see here, we're working with Furioos, if that's the right way to pronounce it, but there are other services you can do this with. We've also tested this with Amazon Web Services and the UE4 pixel streaming project, and that worked fairly well. And there are others competing in the same space. We're excited about the possibility of Nvidia and AWS teaming up to create a more robust solution here, with highly optimized compression and distribution of the video stream and the input stream. So expect improvements in the future, and expect more services at better rates coming soon. But even now we see this as a viable solution that could be effective in production.
Bob Dyce: Well put. It's very exciting technology. We're happy to see that it's ready for prime time: start developing content, deploy it to the cloud, and then just spin up instances so people can hop onto a web browser and enjoy all that beautiful 3D, interactive goodness.
Ian Berget: Hear, hear. All right. We're going to sign off. Have a wonderful day, everybody, and talk to you later.
Bob Dyce: Thank you. Bye-bye.