
After my latest blog on GameWorks, readers posed an insightful question: Why has NVIDIA moved away from only giving sample code for game effects to developers and toward distributing middleware we call GameWorks?
To answer that, let’s take a quick look at the history of water in video games. Realistic water is common now, but how did it get that way?
Water was just a blue blob in some of the first games I played. Not compelling. Not believable. Just part of the background. Its interactivity was limited. You were in it or out of it. But if you saw the blue blob in Pitfall—an adventure game released in 1982—you knew to stay out or stand on the gators’ heads.
In 1999, I first saw what developers could do with rendered water. That’s when NVIDIA introduced the world’s first GPU, the GeForce 256. It featured a transform and lighting engine for geometry calculations that let developers create more interactive objects, like water.
NVIDIA released a pair of demos to show what was possible. The Bubble demo showed a reflective bubble, which would ripple when poked. The Crystal Ball demo showed a transparent bubble, another great water effect.
But real breakthroughs didn’t arrive until 2002. That’s when the GeForce4 hit the scene with second-generation programmable vertex and pixel shaders. DirectX 8 had evolved along with these new programmable shaders, and we began to pour resources into our game developer program.
That year, The Elder Scrolls III: Morrowind added environmental reflections to its water. We showed it off in a well. You could even make waves in the water.
In a demo for Novalogic’s Comanche 4, a helicopter combat game, downwash from your helicopter’s rotors would make the water below swirl and churn. You could even see the chopper’s reflection.
Water is important on a golf course, so we worked with EA on Tiger Woods PGA Tour 2002. It was a big deal when you hit a bad shot and your golf ball made a splash and ripples in the water hazard.
Those were all great technology advancements for 2002. But our demos showed that we had the ability to create those effects as early as 1999. Why did realistic water take so long to spread throughout the industry?
These days, YouTube videos show off realistic water in lots of mainstream titles. Water effects are a highlight for titles such as the Crysis series, the BioShock series, Empire: Total War, Assassin’s Creed IV: Black Flag and The Elder Scrolls V: Skyrim.
What changed? The way we introduce cutting-edge new effects to developers.
ILM for Games
We call this effort GameWorks. GameWorks encompasses all the game-related technologies we’ve invented and refined over the years. It’s a robust suite of tools and graphics technologies backed by 300+ visual effects engineers who create libraries, developer tools and samples. They work closely with developers—often onsite—to enhance their games.
Back in 1999, we would write and deliver piles of code to developers to add to their code. But that created more work for developers. As we tackled increasingly challenging visual effects problems, just giving away code samples proved even less effective.
So, we adopted a more production-oriented approach and turned our library of special effects into middleware. Think of us as Industrial Light & Magic for games.
The result: Developers can adopt new techniques simply by dropping that middleware into their games. It’s a proven formula for success. PhysX, one of our key GameWorks technologies, is now one of the most widely used pieces of middleware in the industry.
Four years ago, we introduced middleware we call NVIDIA Turbulence. Less than a year later, it appeared in Dark Void. Then we added support for Unreal Engine 3, Unreal Engine 4 and CryEngine.
Turbulence effects are now common in games. You’ve seen them in Unreal Engine 3 games like Batman: Arkham Origins, Hawken and Warframe. They are in Unreal Engine 4 games such as Daylight. The CryEngine game Warface just added them. Even games based on proprietary engines, like Assassin’s Creed IV: Black Flag, Call of Duty: Ghosts, Metro: Last Light and PlanetSide 2, have added Turbulence.
With the middleware in place and support built into the key game engines, adding these effects to games is significantly less challenging for developers. It took just four weeks to integrate Turbulence into Daylight and only six weeks to add it to Batman: Arkham Origins.
In short, we’ve gone from telling developers what they can do to showing them how to do it. That’s shortened the time it takes for new technologies to reach games and accelerated the pace of innovation. Isn’t that what it’s all about?