Yesterday, Digital Foundry posted its tech review for the PC port of erstwhile PlayStation exclusive God of War, concluding that the work delivered by Sony Santa Monica and Jetpack Interactive was simply superb. In addition to delivering an excellent conversion of the core content, the team has gone several steps further. Yes, the prerequisites of high frame-rate and ultrawide display support are in, but it’s also great to see Sony’s ports embracing worthwhile technologies such as Nvidia DLSS and Reflex, in addition to AMD FidelityFX Super Resolution – while integrating its own temporal upscaler.
After receiving review code pre-Christmas, Sony approached us asking whether we’d like to speak to the developers about the game and we eagerly took up the opportunity! In this tech interview, you’ll find out what the goals of Sony Santa Monica and Jetpack Interactive were, why the choice was made to use the older DirectX 11 API as its foundation (other first-party Sony ports have favoured DX12), and how the team converted a game based on a single platform with a unified memory subsystem to the split pools used on PC.
In this interview, edited for length and clarity, we talk with Matt DeWald, senior manager for technical production at Sony Santa Monica, and Steve Tolin, a lead from Jetpack Interactive.
Digital Foundry: When did the porting process begin? And who is working on it, Sony Santa Monica, as well as Jetpack?
Matt DeWald: So, we’ve actually been working on this for a while, but it started kind of slow to begin with. It just started as, “Hey, can we even do this? Do we have the technical expertise? How do we work together?” and so on. So, there’s a lot of interstitial work that needs to happen just to figure that out. It’s probably been about two years of total work, but with a very small crew – there’s a team of four at Jetpack that have been doing the primary engineering efforts. And they’re almost all engineering efforts; there’s been a little bit of assistance internally, just on where things are: “Hey, how does this work? Where is this thing placed?” And then internally at Santa Monica, there’s myself leading the project from a production standpoint, but then mostly it’s just QA support, and then tapping different individuals to help fix bugs that may have existed on PC. So, it’s a very small team that we’ve tried to produce this with. Keep it lean and mean.
Digital Foundry: What were the specific project goals as the project started?
Matt DeWald: The ultimate project goal was to get a well-performing version of God of War on PC. The key was to make sure that we had a good PC game, right? You could just take it over from console, put it on PC, and just leave it at that without making any updates, but we did want to make sure that we improved the game as much as we could, without, you know, rewriting underlying systems and rebuilding the engine from scratch.
There’s plenty of work just to convert our custom engine over to DirectX, but then there were also all the extra features: so, making sure that we had really good keyboard and mouse controls, making sure that we supported PC-specific features like ultrawide, adding in DLSS and FSR to make sure we supported those technologies. And then what do we have on hand that we could do to really bump it up for those people that spend $3,000, $6,000, $8,000 on a PC? What are the tweaks that we could put into the game that turn on additional shadows, or improve the resolution of things, or increase rays and distances – all those kinds of things to make sure that people can have a great experience at the highest quality settings they can? So, we created what we call the PC+ feature set: like, these are the things that we want to do that are just extra for PC, as well as just porting the game over – and let’s knock those things out so that we can release a high-quality game on the PC platform.
Steve Tolin: One of the goals that we talked about very early on was just highlighting the content – the content team at SMS is fantastic. How can we just highlight that content at the highest fidelity and create that experience on the PC? That was a core goal.
Digital Foundry: What was the starting state here? Was there an internally maintained PC renderer always on hand that the project started with?
Matt DeWald: So, some things were on PC by default, like rendering a cube and stuff like that we could absolutely do. Some systems were not… Steve can go into more detail about trying to port over the particle system, for instance, which used a lot of the underlying tech on PlayStation – that had to be completely ported over for DirectX. It’s a mixed bag. Some stuff worked out of the box, and we were able to get that stuff up and running quickly. Some stuff took a long time to really get working.
Digital Foundry: Did the shift from unified memory on console to split pools on PC cause any problems?
Steve Tolin: You’ve hit on one of the core differences between the architecture of the original PlayStation and PC: that split memory. So, balancing pushing work to the GPU, getting that memory back, and then reusing it within the frame – that continuous cycle just took a lot of tuning and a lot of synchronisation that didn’t need to exist in the PlayStation world. And then it’s just a matter of discovering all of those places where we could optimise – the smallest chunks of memory going back and forth, back and forth, back and forth – to use that bus the most efficiently.
Matt DeWald: And that’s why our min spec requires a 4GB card – it’s really that we just need that GPU memory, even when the chip itself is more powerful than what we need. But without four gigs, you’re going to start swapping into regular memory and you’ll get a poor experience.
Steve Tolin: Even if you are running at [a lower] render scale resolution, the VRAM – that chunk of your RAM – is important for a higher resolution display.
Digital Foundry: Were there any of those specific systems that are actually going back and forth over the PCI Express bus?
Steve Tolin: Probably particles in particular – it’s all a GPGPU particle system… the wind system. These are all small, small buffers that continuously need updating… luminance calculations, anything that’s GPGPU, interacting with compute jobs.
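Why do lots of small, continuously updated buffers hurt on PC but not on a unified-memory console? Each transfer across the bus pays a fixed synchronisation overhead on top of the payload itself, so many tiny per-frame updates are dominated by overhead – which is what makes the tuning Tolin describes worthwhile. The toy cost model below is purely illustrative (the constants and function are our own assumptions, not the game’s code), but it shows the shape of the problem:

```python
# Illustrative cost model (not the game's actual code): several small GPU
# buffers - particles, wind, luminance - each need a per-frame update.
# One bus transfer per buffer pays a fixed round-trip overhead every time;
# packing the small updates into one transfer amortises that overhead.

FIXED_COST_US = 10.0   # assumed per-transfer overhead, microseconds (illustrative)
BYTES_PER_US = 1024.0  # assumed bus throughput (illustrative)

def transfer_time_us(sizes: list[int], batched: bool) -> float:
    """Total time: one fixed cost per transfer plus payload time."""
    transfers = 1 if batched else len(sizes)
    return transfers * FIXED_COST_US + sum(sizes) / BYTES_PER_US

sizes = [256, 512, 256, 1024]  # four small per-frame buffers, 2KB total
print(transfer_time_us(sizes, batched=False))  # → 42.0 (4 transfers)
print(transfer_time_us(sizes, batched=True))   # → 12.0 (1 combined transfer)
```

With four 2KB-total buffers, the fixed costs outweigh the payload by an order of magnitude unless the updates are coalesced – hence the focus on moving "the smallest chunk of memory" as efficiently as possible.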
Digital Foundry: Were there any advantages or differences from that console lower-level API that caused issues, where you had something on GNM (the PS4 rendering API) that you couldn’t necessarily do on PC?
Steve Tolin: A couple of things in the particles, because the particle system was non-existent – there was no framework; it was written specifically for Orbis [PS4’s codename] and targeting specific Orbis hardware, right? So, Kyle here at Jetpack basically focused on that early and we solved the problems there.
Digital Foundry: You’re using DirectX 11 here. Why use this and not a lower-level API such as DX12 or Vulkan?
Steve Tolin: Part of that was the initial testing framework that existed, but a lot of it has to do with the content authoring. The content authoring is right from the original HLSL [high-level shader language] source, in that format, all the way through the pipeline, so maintaining the content and not needing to edit and change the content was one of the core reasons to [use DX11].
Matt DeWald: And just the sheer effort of trying to port the engine over to DirectX 12 is just beyond the scope of the project.
Digital Foundry: Let’s talk a bit about some of the visual upgrades. For example, DLSS, why include it in the first place? And secondly, could you describe the implementation process there? Because the game already had TAA and checkerboarding. Was it that hard to integrate, actually?
Steve Tolin: We wanted to address hitting a wide space of machines, so the image scalers basically allow a lot of scalability for players. Once we put in our own temporal image scaler, then it was, okay, now let’s support DLSS and the AMD scaler, to basically give players a vast variety of options, to be able to tune it to get the experience that they need. You may have a 4K monitor, but you may not have a GPU that’s actually going to run at 4K, so we’re giving those options to the players to be able to scale the image and the rendering, to get the image that they want.
Matt DeWald: And let’s be honest, DLSS is an amazing technology, so we definitely wanted to support that. And then when AMD came out with Super Res (FSR), we just decided to get on that as well, on top of our own scaler… [they all] tap into the same underlying system, so they’re fairly straightforward to implement. When you have one of them implemented, you’re getting the other ones cheaper.
Digital Foundry: So the standard percentage scaler option actually uses a temporal upsample?
Matt DeWald: Yeah, if you don’t have FSR or DLSS turned on, it uses the TAA scaler instead.
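For the unfamiliar, a percentage render scale works out very simply: the game renders internally at the scale fraction of the output resolution, and the temporal upscaler (TAA, DLSS or FSR) then reconstructs a full-resolution image from those smaller frames. A minimal sketch of the arithmetic (the function and the 4K example are illustrative, not taken from the game):

```python
# Illustrative sketch: how a percentage render scale maps an output
# resolution to the internal resolution the GPU actually shades.
# A temporal upscaler then reconstructs the output image from these
# lower-resolution frames plus motion vectors.

def internal_resolution(output_w: int, output_h: int, render_scale: float) -> tuple[int, int]:
    """Internal render target size for a given render-scale fraction."""
    return round(output_w * render_scale), round(output_h * render_scale)

# A 4K display at a 75% render scale: the GPU shades roughly half as
# many pixels per frame (0.75 squared is about 0.56) as native 4K.
print(internal_resolution(3840, 2160, 0.75))  # → (2880, 1620)
```

This is why the scalers matter for the "4K monitor, weaker GPU" case Tolin mentions: shading cost drops with the square of the scale, while the display still receives a 4K image.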
Digital Foundry: There’s a new ambient occlusion method and the game’s advertising actually talked a bit about it, like a mix of GTAO with SSAO. Can you explain a bit about how that works and what it does?
Matt DeWald: It’s some new technology that we decided to port over and say, you know, let’s give an option to bump things up. But the idea is that, yes, we have directional occlusion as well as ground-truth ambient occlusion to add in some additional quality, so that people that have high-end systems that can spare that extra couple of milliseconds can actually see those changes.
Digital Foundry: Let’s talk about the other settings as well. In the PC ports we’ve seen in the past, the game has been designed around the original settings, and you pump it up to a higher level – but the game’s streaming systems, like the pool sizes and things like that, were very much designed around those originals. Here, I’ve been playing at ultra without noticing any real hitching or stuttering as a result of having a much larger shadow pool and a much greater geometry distance. What changes needed to be made in the port to make sure that higher settings weren’t just nominally higher, but actually ran well in their own right?
Steve Tolin: Again, it’s just a lot of work – focusing on that is always the goal, right? A lot of the techniques that you’re talking about were built around the content at fixed-size resolution buffers, or texture samples, or ray casts or samples. What we’ve been able to do is expand some resolutions, expand the texture sampling, expand the number of rays in some reflections – basically just do more work to target a higher-fidelity image in any of those individual effects.
Digital Foundry: Was there something special in this game you’d like to highlight, or some aspect of your work that you really enjoyed in this project?
Steve Tolin: To be able to work on the God of War franchise – you know, we’ve been working with Santa Monica for a while – and to be able to take that game and that content and deliver it to a new audience at a higher fidelity.
Matt DeWald: And stealing a little bit of my thunder, because I joined six weeks after they launched God of War in 2018. I’ve been here for almost four years, but I didn’t get to ship the game and didn’t get to experience that. So, now I’m getting to experience that. But on top of that, we’re all gamers at heart and I play a bunch of games on PC and a bunch of games on console. And even though we’ve sold, you know, what, nearly 20 million copies on console, there’s this whole audience out there that’s never experienced this game and never had a chance to, and at this point, they’re never going to buy the console to play it, right?
To be able to give them a way to play the game and say, you can experience this thing, this thing that won 270+ Game of the Year awards, and like, you can now experience it on your PC… I can’t wait to see what people say about it, what people experience and just getting that conversation going again, because it’s a very emotional story. And just getting people to talk about it again, I think will be great.