VR Visionaries #2: Tim Sweeney, CEO, Epic Games / Unreal Engine

Tim Sweeney is the founder of Epic Games, the visionary company behind the Unreal Engine and the iPhone killer app Infinity Blade. Epic have been making waves recently with VR tech demos such as Showdown, The Hobbit: A Thief in the Shadows and Bullet Train.

Tell us about Epic's relationship with VR, how long have you been looking into it?

We’ve followed VR for as long as we’ve been doing 3D – starting in 1995 with the Unreal Engine. There were some early efforts at consumer VR, running at 20 frames per second at a resolution of 320 by 200 pixels. At the time that was more exciting for academic papers than for creating an actual viable product.

This has always been at the back of our minds as something that would eventually come to pass and have a fundamental impact on everyone’s use of 3D graphics.

We kind of forgot about it for about a decade and then Palmer Luckey and John Carmack came along and had the key recognition that it was finally possible to take existing displays, CPUs and GPUs and build an awesome VR package. As soon as they did that, for us it was an ‘aha!’ moment.

I remember talking about this with Michael Abrash when he was just starting to dig into VR. It immediately became apparent when we saw the specifications that this was going to become a big thing. For the past three and a half years we’ve been wholly immersed in it, using VR as one of the foremost drivers of our engine efforts.

All the major components of VR – frame rate, performance, visual realism, characters, networking, and social communication – are really core to all of our thinking.

Our expectation is that VR represents a small consumer market right now, but in a decade it will reach billions of users. VR is going to change the way that everybody interacts with computers.

You're bringing Unreal Engine 4 support to Google’s Daydream, does that mean you see mobile VR as the future?

There’s no question in my mind that the billion user version of this hardware is going to be a mobile form factor, but that’s a decade out.

What can mobile VR look like when you have 20 Teraflops in your pocket? When you have something that looks like Oakley sunglasses on your head, with computer imagery and the real world blended seamlessly into a completely immersive experience? That’s the long term.

In the short term there are going to be mobile initiatives to push it forward that are more convenient for consumers. They will also have significantly less computing power: about a tenth of the power of a PC.

Then there will be high-end developments on PC. It’s going to be important for developers to push on both those fronts. High-end PC is the way to push the leading edge of technology and figure out what will be possible in ten years, but mobile is where the billions of users are today. We have to do both, and the interesting part will be watching over the next decade as these two segments converge into a pervasive VR and AR future.

Do you think mobile VR and True VR as we see them today will remain distinct, or do you foresee a point where the two will merge?

I think the merge point is 10 to 12 years out. It will definitely exist, just because the mobile hardware is getting faster and faster. The other thing to consider is that your eyes have a finite amount of bandwidth. That’s the funny realisation with computer graphics – there is an amount of computing power that is enough.

Desktop PCs are approaching that point today, and mobile will approach that point somewhere in the future. I think mobile will ultimately be the solution.

However, there isn’t anything today that can beat an 11 Teraflop NVIDIA GPU when you’re looking for that level of performance.

What are the most impressive applications for VR that you've seen or heard of?

There have been a lot of developers pushing realism, which has been really impressive, and we did that ourselves recently with the Bullet Train demo. We created a realistic environment where you can teleport around and interact with enemies in a modern videogame experience.

However, with things like the social VR experiences, where you’re in a world with other people and can talk and interact with them, it’s really starting to feel like you’re actually with the other person, even if they’re halfway across the country. It’s going to be exciting to watch that develop and see it catch up with the real-time motion capture technology that we’ve been demonstrating in collaboration with our partners.

Where all of this is converging is something like the Metaverse from science fiction, where you have super-realistic computer graphics, a lot of user-created content and environments, and all kinds of interesting games and other entertainment experiences to engage in – all within a virtual space that feels as if you are sharing it physically with other people.

With Infinity Blade you created the first killer app in gaming for iPhone – is that something you're looking to do again for VR?

We aspire to! As an engine provider we have tens of thousands of developers working on VR apps, and we’ve been rapidly adding VR features to our development pipeline.

We now provide the full Unreal Editor in VR: you can run the editor and, within this immersive VR experience, reach out and move objects around to build your game scene. You can also bring up the classic 2D elements of the user interface as if you had a little VR iPad in your pocket, which gives developers even more options. We’re wholeheartedly pushing the engine in that direction.

At the same time we’ve been pushing our entertainment development expertise. We’ve shown some of that with tech demos like Couch Knights, or the Showdown demo where you fly through an interactive movie scene in slow motion. Each of those demos is as much a learning experience as it is a demonstration of the capabilities of Unreal Engine.

With each step we learn more and more. Ultimately we learn how to go about building the next generation of games by building demos first. Eventually that demo knowledge accumulates and we can build something bigger and better. We’re certainly moving in that direction now.

What lessons have you learned through making VR demos?

We’re always completely open about our learnings because we want to see our partners succeed. The biggest lesson is probably much the same as the first one we learned about smartphone games: great VR games are custom-designed for VR.

You have to employ control and locomotion schemes that are completely different from those of traditional gaming genres. On a certain level we have to throw out all of our previous knowledge from making games and relearn things.

VR filmmakers are learning the same thing. A lot of the cues that would work in a linear experience shown on a screen don’t work in VR. You need to create experiences that use new techniques, like characters that establish eye contact with you even if you move around, or dynamic experiences that will wait for you if you’re looking in the wrong direction when something is about to happen.
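
To make that concrete, here is a minimal sketch of how a gaze-gated moment could work – plain C++ rather than Unreal’s actual API, with all type and function names being illustrative assumptions. The scripted event simply waits until the viewer is looking towards it before it fires.

    // Illustrative sketch only – not Unreal Engine code. A scripted VR moment
    // is gated on the viewer's gaze so it never fires while they are looking
    // the wrong way.
    #include <cmath>

    struct Vec3 { float x, y, z; };

    static float Dot(const Vec3& a, const Vec3& b) {
        return a.x * b.x + a.y * b.y + a.z * b.z;
    }

    static Vec3 Normalize(const Vec3& v) {
        const float len = std::sqrt(Dot(v, v));
        return { v.x / len, v.y / len, v.z / len };
    }

    // True when the event position lies inside a cone around the gaze direction.
    static bool IsLookingAt(const Vec3& headPos, const Vec3& gazeDir,
                            const Vec3& eventPos, float coneHalfAngleDeg) {
        const Vec3 toEvent = Normalize({ eventPos.x - headPos.x,
                                         eventPos.y - headPos.y,
                                         eventPos.z - headPos.z });
        const float cosThreshold = std::cos(coneHalfAngleDeg * 3.14159265f / 180.0f);
        return Dot(Normalize(gazeDir), toEvent) >= cosThreshold;
    }

    // Ticked every frame: instead of firing on a fixed timeline as it would on
    // a flat screen, the moment waits for the gaze test to pass.
    struct GazeGatedEvent {
        Vec3 location;
        bool triggered = false;

        void Tick(const Vec3& headPos, const Vec3& gazeDir) {
            if (!triggered && IsLookingAt(headPos, gazeDir, location, 30.0f)) {
                triggered = true;
                // Fire the moment here: play the animation, line of dialogue, etc.
            }
        }
    };

The same kind of gaze test covers the eye-contact case too: a character controller can re-aim its head and eyes at the tracked head position every frame, so the character keeps looking at you as you move around.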

There has also been an enormous amount of effort put into maximizing frame rate and engine performance, because VR is absolutely intolerant of any inconsistencies in those areas. We need to run at 90 frames per second with super high-end graphics on the VR hardware that people own, which has driven a huge amount of engine optimization.
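
As a rough illustration of what that 90 frames per second target means in practice, every frame has roughly 11 milliseconds to cover gameplay, rendering submission and GPU work for both eyes. The stage timings in the sketch below are assumptions, not Epic’s actual budgets.

    // Illustrative arithmetic only – the stage timings below are assumptions,
    // not Epic's actual budgets.
    #include <cstdio>

    int main() {
        const double targetHz    = 90.0;
        const double frameBudget = 1000.0 / targetHz;  // ~11.1 ms per frame

        // Hypothetical split of that budget across the major stages.
        const double gameThreadMs   = 3.0;  // gameplay, animation, physics
        const double renderThreadMs = 3.0;  // draw-call submission
        const double gpuMs          = 5.0;  // rendering both eye views

        const double spent = gameThreadMs + renderThreadMs + gpuMs;
        std::printf("Budget %.1f ms, spent %.1f ms, headroom %.1f ms\n",
                    frameBudget, spent, frameBudget - spent);
        return 0;
    }

Missing that budget even occasionally shows up as judder in the headset, which is why the optimization work focuses on consistency rather than just peak throughput.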

What are the main barriers VR faces in terms of widespread adoption?

On the mobile front it’s the lack of features and performance. The Samsung Gear VR is a solid product for a limited number of games that can work with just directional controls. However, that market is going to get a whole lot more interesting the moment you have motion tracking, hand tracking and other features, as well as higher GPU performance.

On PC the hardware available right now is awesome. The limitations are on the development side. We’re learning so many lessons so quickly that it’s going to take a few more years of accumulated knowledge before we are building the masterpieces of the VR generation.

In the meantime you’re going to see a lot of developers who will create interesting VR experiences. They’ll also make mistakes along the way and learn from them.

Unlike previous media, VR is starting out a lot bigger: there were more VR hardware units sold in the first year than there were personal computers sold in the first year of the PC revolution.

What are you going to bring along to VRX in San Francisco this year?

We’re going to be talking all about our experiences as VR developers and as engine developers. From our demo experiences we have learned which techniques work. The main thing I want to share and discuss is how all these different technical and design components will fit together in the future of VR entertainment experiences.

I’ve been doing a lot of thinking about the Metaverse, and how games and other experiences will combine social interaction with the feeling that you and others are genuinely present in a gaming environment to create entirely new game experiences. This is really one of the most interesting topics for the whole industry because it affects all of us.

I’m also really looking forward to what everyone else brings along too, because there is so much original thinking going on right now.

Tim Sweeney is speaking at VRX 2016, coming up in San Francisco on December 7-8, alongside a huge range of senior business leaders from across gaming, consumer entertainment, brands and enterprise. For more information, head to http://vr-intelligence.com/usa/