There are a lot of reasons to be excited about Marvel’s Spider-Man 2.
It tells an ambitious story featuring two playable Spider-Men and the iconic symbiote saga. It adds web wings that shake up the core swinging mechanics, along with new combat abilities that emphasize each hero’s unique strengths. And it’s been built solely for PS5, unlike PlayStation exclusives from the past few years, including 2020’s Miles Morales, which spanned two console generations.
In terms of PS5 tech, the rapid SSD-powered fast travel and character swapping have garnered much-deserved attention, but Marvel’s Spider-Man 2 has a number of other technical tricks up its sleeve, especially when it comes to the console’s Tempest 3D Audio and the DualSense controller’s haptic feedback and adaptive triggers.
To learn more about how these features have been leveraged in Marvel’s Spider-Man 2, MobileSyrup sat down with Insomniac’s Johannes Hammers, advanced senior audio designer, and Doug Sheahan, senior programming director. Together, they discussed how this technology shapes their approach to creating sequences for Peter and Miles, how they landed on the sounds for the symbiote and Venom himself, and more.
Question: Once you knew this game would feature two playable Spider-Men, how did you approach using 3D Audio and the DualSense to emphasize what makes each of them unique?
Johannes Hammers: We started very early on to give each hero a separate sonic identity, and we basically started from the ground up. After Miles [Morales], we had a really clear vision of how his character sounded with the bioelectricity and those abilities. And Pete was coming over from Spider-Man 2018 and so we thought, “Why don’t we start from scratch and build each character a brand new sound set, sonically?” So we created all new punches, kicks, “thwips,” and traversal sounds for each character, and at that granular level, we were able to really differentiate between the two characters. And having Pete with symbiote abilities and Miles with bio-electric abilities, we can even make that separation more distinct. They’re both equally powerful in their abilities, but their characters lend themselves to just sonically being so different, and we really wanted the players to experience that.
Doug Sheahan: On the tech side, we did plenty of things outside of the audio and the DualSense to separate them. But the really great thing about the way that the DualSense works for us is that it’s so connected to our audio systems and everything that, on the programming side, it’s really just about giving the audio team hooks into things. So it’s like, “Oh, this is when this ability triggers, this is when this ability triggers, this is what character you are.”
And the really cool thing about the way it’s set up, and the way that rumble runs off of audio waveforms and things like that — which is a really new and powerful feature that’s part of the controller now, different than what’s been there before — meant that we were able to really let the audio team and Johannes and everybody else over there just kind of go nuts and really let their creative juices flow. And for us, it was minimal input of just saying, “Here are the things in the game that are happening,” and then they were able to go through and just really add all that personality to it, without having to get too far into bouncing back and forth. It gave them a lot of freedom, a lot of autonomy, from what the technology groups had to do to maintain [it]. They were able to just run with it, which was really fantastic.
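To picture the idea Sheahan describes — rumble driven directly off audio waveforms, with programming mostly exposing hooks — here’s a minimal conceptual sketch. This is not Sony’s DualSense API or Insomniac’s code; the function names and numbers are invented purely for illustration.

```python
# Illustrative sketch only: deriving a haptic intensity envelope from an
# audio waveform, in the spirit of "rumble runs off of audio waveforms".
# This is not the PS5 SDK; names and constants are hypothetical.
import math

SAMPLE_RATE = 48_000   # audio samples per second
WINDOW = 480           # 10 ms analysis window

def haptic_envelope(samples):
    """Map an audio waveform (floats in [-1, 1]) to per-window rumble levels in [0, 1]."""
    levels = []
    for start in range(0, len(samples) - WINDOW + 1, WINDOW):
        window = samples[start:start + WINDOW]
        rms = math.sqrt(sum(s * s for s in window) / WINDOW)  # loudness of this slice
        levels.append(min(1.0, rms * 2.0))                    # scale and clamp to motor range
    return levels

# Example: a decaying low-frequency "thwip" burst produces a fading rumble envelope.
burst = [math.exp(-t / 4_000) * math.sin(2 * math.pi * 60 * t / SAMPLE_RATE)
         for t in range(SAMPLE_RATE // 4)]
print(haptic_envelope(burst)[:5])
```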
Q: One of the coolest new things in the game is the web wings. Part of what enhances that whole experience is the audio design coupled with the haptics. I’m curious — how does that whole process work? Do you start with the audio design and then match it to the haptics or is it more of a haptics-first approach tied to the web wings action itself, and then you sort of layer on the audio?
Hammers: The process is audio first. Sometimes we’ll start with a pre-vis, we’ll get some game capture, and I’ll do the sound design or the team will do the sound design for what we’re seeing. And then comes the implementation process where we actually get those sounds in the game. And the sounds for the web wings are a really interesting combination of using [real] world sounds like wind, rushing wind, rumbles — things like that, but Herschell [Bailey], one of the sound designers here, [also] actually did a lot of foley [the reproduction of everyday sounds] for that.
I remember there were gloves and Super Balls on each finger, and they were manipulating those by hitting different fabrics. They tried, I think, cellophane stretched really tight. And it just gave it that rippling sensation. And then after that was done, [sound designer] Tyler Hoffman came in and did the haptic programming, and I remember him talking about a loop that he made that would randomly go between the left and right grips to give it that further feeling of wind and that turbulence that you get, especially when you go into the wind tunnels.
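The loop Hammers mentions — rumble hopping randomly between the left and right grips to suggest turbulence — might look something like the sketch below. It’s a rough stand-in, not the game’s actual haptic loop; the base and gust values are made up.

```python
# Illustrative sketch only: a turbulence loop that randomly shifts rumble
# between the left and right grips, in the spirit of the web-wings effect
# described above. Values and timing are invented for demonstration.
import random

def turbulence_frames(num_frames, base=0.3, gust=0.5):
    """Yield (left, right) rumble intensities in [0, 1] for each frame."""
    bias = 0.0
    for _ in range(num_frames):
        bias += random.uniform(-0.2, 0.2)       # random walk drifts the gust left or right
        bias = max(-1.0, min(1.0, bias))
        left = base + gust * max(0.0, -bias)
        right = base + gust * max(0.0, bias)
        yield (round(min(1.0, left), 2), round(min(1.0, right), 2))

for frame in turbulence_frames(5):
    print(frame)
```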
Sheahan: One of the really fun back-and-forth elements of empowering the audio team to play with the haptics came with the web wings in particular… I think it was Tyler, when he was working on the haptics, he was like, “Oh, I really want you to feel it when you dive and then pull up,” and so our audio team was really able to communicate with our gameplay programming team and be like, “Hey, we really want to be able to represent this.” And so for us, it was able to be like, “Okay, we can detect when that happens, we can get you values that drive it based on how hard you’re pulling up and how hard you were diving,” and get into a lot of that kind of stuff. And so a lot of times, it’s about them looking to express what they’re trying to represent, express the things they’re looking for in the game, then talking with the rest of the team, saying, “How does the game know that this is happening?”
And then for us to be able to turn that into data, push it over to them, and then let them get those ripples, get those new things, have the controller speaker pick up, have the different feel and rumble kick out when you pull up to feel like you’re not just pulling back on the stick, but there’s this kind of rumble with it of, “Oh, my God, I’m having to air brake as this is happening.” And [it] really give[s] a tactile sense to doing those actions.
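As a rough illustration of the data hand-off Sheahan describes — gameplay detecting how hard the player pulls up or dives, and that value driving the rumble — here’s a small conceptual sketch. The thresholds and names are hypothetical, not Insomniac’s actual gameplay values.

```python
# Illustrative sketch only: turning "how hard you're pulling up / diving"
# into a rumble strength for the web-wings air brake.
def air_brake_rumble(pitch_rate_deg_per_s, speed_m_per_s, max_rate=120.0):
    """Stronger rumble the harder the player pulls up while moving fast."""
    if pitch_rate_deg_per_s <= 0:                     # diving or level flight: no air-brake effect
        return 0.0
    pull = min(1.0, pitch_rate_deg_per_s / max_rate)  # 0..1 for how hard the pull-up is
    speed_factor = min(1.0, speed_m_per_s / 40.0)     # faster gliding feels harsher
    return round(pull * speed_factor, 2)

print(air_brake_rumble(90.0, 35.0))   # hard pull-up at high speed -> strong rumble
print(air_brake_rumble(-30.0, 35.0))  # dive -> no air-brake rumble
```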
Q: This question is more about the controller, but audio of course plays a role. One of the highlights of the game from the adaptive trigger perspective is the particle accelerator minigame, where you’re juggling different beams of energy with different resistances using L2 and R2. For something like that, does the idea come about naturally over the course of discovering the story — in this case, knowing that Peter would be helping Harry with a science experiment — or do you have that idea early on and come up with a story reason to support it?
Sheahan: I think on that one, we knew generally that at that moment, Peter was going to be helping Harry out, and we were going to be looking to solve problems. And our design team is able to really come up and work with [the] gameplay, audio, [and] animation [teams] to say, “What can we do inside this accelerator? What can we do to fix it and manipulate it?” We’re looking for opportunities to do things that fit within what the character is doing when it comes to using the controller, using the haptics, using the adaptive triggers and everything like that. So we don’t want it to feel like there’s this disconnect between what we’re asking the player to do with the controller and what the character is doing in-game, and that one ended up being a really, really great example of having the two mirror each other, and so it gives the player a really good connected feel.
And I know on that one, when we were looking to develop it, the design, gameplay, everybody was kind of saying, “I want to be able to do this with my eyes shut.” And I think that’s something that the controller — the resolution of the vibration, the way those actuators and motors work in the adaptive triggers, the way we can do resistance — gives so much detail to a player through their hands, through their controller, detail that is wholly unique to that controller, that we were able to make something that feels really special. Players can find that sweet spot without necessarily having to look at the UI — it gives that extra layer of feedback to the player and ends up being something you can’t really experience somewhere else. As a gamer, you can connect to what you’re doing in the game in a different way than just watching it happen.
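The “find the sweet spot with your eyes shut” idea can be modeled conceptually as a feedback loop: the further a trigger is from its hidden target, the more resistance and rumble the player feels. The sketch below is only a rough model of that loop, not the game’s implementation or the PS5 trigger API.

```python
# Illustrative sketch only: a "find the sweet spot by feel" loop in the spirit
# of the particle accelerator minigame. Each trigger's analog value is compared
# to a hidden target and the error is fed back as resistance and rumble.
def trigger_feedback(trigger_value, target, tolerance=0.05):
    """Return (resistance, rumble) in [0, 1] for one adaptive trigger."""
    error = abs(trigger_value - target)
    if error <= tolerance:
        return (0.2, 0.05)                   # in the sweet spot: light, steady feel
    resistance = min(1.0, 0.3 + error)       # push back harder the further off you are
    rumble = min(1.0, error * 2.0)           # stronger buzz means further from the sweet spot
    return (round(resistance, 2), round(rumble, 2))

# L2 is close to its hidden target, R2 is far off.
print(trigger_feedback(0.52, 0.50))  # light resistance, barely any buzz
print(trigger_feedback(0.90, 0.50))  # heavy resistance, strong buzz
```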
Q: On the audio side of things, there are so many neat sounds related to the symbiote, especially with Tony Todd’s Venom voice. What’s the process like for those? Do you go in with a plan for “this will be a cool symbiote moment,” or do you discover that as you go along?
Hammers: It’s a lot of both. I do a lot of pre-vis work, so we’ll get game captures and we get these amazing animations from the animators, and then I’ll just do a linear pass on the sound design. And that’ll oftentimes set a blueprint or a launch pad for other audio team members to jump in and take it and run with it. But for the symbiote, that was many, many iterations, because [creative director] Bryan Intihar didn’t want it to sound too wet. And so there was this combination of how wet and how dry, and what sounds to use to get those effects. So for the symbiote, just on a practical level, there was leather creaking and rope stretches.
And I found that if you took plastic CD cases and bent them, you’d get this stress sound that sounded like a creak but kind of otherworldly. And then with things you might expect, like mud, we had our foley team in San Diego — Joanna Fang and Blake Collins — do a pass. And all those elements sort of come together as a ‘sonic lexicon’ of how these Venom sounds are going to go into the game. And then it’s a matter of creating assets and actually implementing them. It’s so much fun — I could talk about this all day. It’s wonderful.
Q: On a broader level, how did having access to these kinds of unique PS5 features change how you approached development on Marvel’s Spider-Man 2 vs. previous games you’ve worked on?
Sheahan: I think a lot of it for us — one of the big ones is just giving players another axis of feedback and understanding. You have the traditional audio and visual in games — those are the two things you can experience through your TV really well. But the more you can layer on to the player, the better their understanding is going to be, intuitively. And so I think that it’s one of those things that when we’re looking to make things accessible, when we’re looking to make things intuitive to pick up and play, having that additional axis for players to experience stuff is really powerful — just in terms of people understanding the game. We have an accessibility feature that lets you dial the rumble back to just some key elements that only show up in that additional mode, and so players that might be low vision or something like that have the ability to say, “Oh, I’m going to get really clear rumbles when my Spider-Sense goes off.” It’s not necessarily something that they’re going to rely on exclusively, but it is that extra thing that they can get trained to learn.
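Conceptually, the accessibility option Sheahan hints at boils down to scaling haptic output by a player setting while keeping a few key cues prominent. The sketch below is a loose interpretation; the setting names and values are hypothetical.

```python
# Illustrative sketch only: scaling haptic output by a player-set accessibility
# option and keeping a key cue (here, Spider-Sense) extra clear.
def apply_haptic_settings(base_intensity, user_scale, is_spider_sense=False):
    """Scale a 0..1 rumble by the player's slider; keep the warning cue distinct."""
    intensity = base_intensity * user_scale
    if is_spider_sense:
        intensity = max(intensity, 0.8 * user_scale)  # the warning cue stays prominent
    return round(min(1.0, intensity), 2)

print(apply_haptic_settings(0.4, 0.5))                        # ordinary rumble, dialed back
print(apply_haptic_settings(0.4, 0.5, is_spider_sense=True))  # warning cue stays strong
```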
Some other places — you look at the Coney Island mission [with Peter, MJ and Harry]. The designer, Christina Curlee, was a huge fan of [pre-installed PS5 title Astro’s Playroom] and everything like that, and she really wanted to give players an experience of going through that kind of carnival feel in all of those games in a way that was more than just ‘button mash’ minigames. And by having the controller there, it provides us the opportunity to give players a little bit more of a one-to-one feel of what they’re doing versus what the characters are doing. Some of the best systems are all about how you present things to the players and what their interaction level is going to be. And so by having those new kinds of interaction methods, it lets us do new things and take things that might have been experienced very differently and maybe in a little bit more of a flat way in the past and do them in a more kind of real, tactile, 3D sort of way over what we had before.
And then it’s just giving us more analogue — to some extent more buttons, but not [actually] with more buttons. When you look at the puzzle in the particle accelerator you mentioned, we’re using that analogue input and having that feedback. If we didn’t have that feedback, that minigame would feel mushy and kind of bad. But having the adaptive triggers in there opens up a new kind of experience — a new mechanic — that wouldn’t have even been on the table before. And so there’s all kinds of new stuff that we can do and look for opportunities to make it so that players get a nice variety of experiences throughout the game and have new things happen to them.
Hammers: I’d say for audio, our biggest initiative to push was the 3D aspect of things. So we’ve really leaned into that Tempest 3D Audio tech that we have, and just the ability to place objects in specific spaces so that you’re more aware in combat of where an enemy might be, or where an object might be, or where you need to go for your next mission. Whether you’re playing with headphones or just on TV speakers, you’ll feel that positionality. So that was just fantastic.
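The positional idea Hammers describes — a sound “object” placed in space relative to the player — can be shown in a very stripped-down form with a simple stereo pan derived from the emitter’s direction. Tempest 3D Audio does far more sophisticated HRTF-based rendering; this sketch only illustrates the concept, and the names are invented.

```python
# Illustrative sketch only: deriving simple left/right gains from a sound
# emitter's direction relative to the listener. Not Tempest 3D Audio.
import math

def positional_gains(listener_pos, listener_facing_deg, emitter_pos):
    """Return (left_gain, right_gain) from an equal-power pan on the emitter's azimuth."""
    dx = emitter_pos[0] - listener_pos[0]
    dz = emitter_pos[1] - listener_pos[1]
    azimuth = math.degrees(math.atan2(dx, dz)) - listener_facing_deg  # 0 = straight ahead
    pan = max(-1.0, min(1.0, azimuth / 90.0))        # -1 = hard left, +1 = hard right
    angle = (pan + 1.0) * math.pi / 4.0              # equal-power panning law
    return (round(math.cos(angle), 2), round(math.sin(angle), 2))

# An enemy off to the player's right comes through louder in the right channel.
print(positional_gains((0.0, 0.0), 0.0, (5.0, 5.0)))
```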
Q: What are your favourite implementations of the DualSense and 3D Audio in Marvel’s Spider-Man 2?
Hammers: The very first mission, Sandman. I worked on that for such a long time with my partner Brooke [Yap], and bringing that to life — the 3D aspect of that — was fantastic. The other thing would be finishers — those three-second gems of sound effects, VFX and animation. It’s like the perfect combination of everything that everybody can do all in one moment.
Sheahan: There’s a point partway through the game where you have to — it’s a smaller thing — use the symbiote tendrils to rip or push open a door. And they did a thing with the adaptive triggers where it’s really high resistance, a lot of rumble, but then all of a sudden it gives way when the door finally opens. I just crack a smile every time I play that part of the game because it’s that experience of you pushing on something and it breaks and then you fall forward. And it’s a physical response to what’s happening in the game that happens through the controller. For whatever reason, it surprises me every time — I always forget how much it impacts you when it’s just your two fingers that suddenly fall that half an inch to finish out the thing, but it’s really impactful and effective.
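That sudden give-way feeling maps naturally onto a resistance curve that builds while you strain and then drops to zero when the door breaks. The sketch below is a conceptual stand-in for that moment, not the game’s code; the numbers are invented.

```python
# Illustrative sketch only: a trigger-resistance curve that stays heavy while
# you "pull" on the door and suddenly goes slack once it breaks open.
def door_resistance(pull_progress, break_point=0.85):
    """Return (resistance, rumble) in [0, 1] given how far the pull has progressed."""
    if pull_progress >= break_point:
        return (0.0, 0.0)                        # the door breaks: triggers go slack
    resistance = 0.6 + 0.4 * pull_progress       # resistance builds as you strain
    rumble = 0.3 + 0.5 * pull_progress           # rumble ramps with the effort
    return (round(resistance, 2), round(rumble, 2))

for progress in (0.2, 0.6, 0.9):
    print(progress, door_resistance(progress))
```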
This interview has been edited for language and clarity.
Marvel’s Spider-Man 2 will launch exclusively on PlayStation 5 on October 20th.
For more on the game, check out our full review and interview with Insomniac’s narrative team.