With Friday’s final night of the Injustice 2 ELEAGUE Championship on TBS, a tech company called The Future Group is bringing DC characters to life, allowing them to interact with the real world in a brand new way.
The new technology, called Interactive Mixed Reality, allows Turner’s broadcast to include, for example, Batman and Superman standing on the stage next to contestants.
The key is the characters’ interaction in real time, from any angle and at any moment. As the director of the tournament’s live broadcast decides which camera and angle will be used next, the technology from The Future Group makes the characters look realistically present on the stage.
TBS has been broadcasting the Injustice 2 ELEAGUE tournament for the last couple of weeks, but tonight’s show will be the competition’s finale. It begins at 10 p.m. Eastern time, live on TBS.
Newsarama talked to Lawrence Jones, the vice president of business and content development for The Future Group. Overseeing the U.S. branch of the Norway-based company, Jones revealed a little bit about the technology’s properties, why the Justice League characters were a unique challenge, and what sorts of things viewers might see from DC characters on tonight’s broadcast.
Newsarama: Lawrence, let’s start with just an overview of exactly what The Future Group’s role was in bringing these realistic characters onto Turner’s live broadcast. Can you describe the company’s role and what you guys were able to do?
Lawrence Jones: The Future Group has created for Turner — and specifically the ELEAGUE Injustice 2 Tournament — the ability for the characters from the game to be projected into the real world of that live event on broadcast.
What’s amazing for fans of the game is that they can actually see these characters in their real world, and it gives them a whole new level of engagement.
What the Future Group does is we take the characters and make sure that, from all angles, they look amazing.
Then we add all sorts of effects and ways for them to interact with the real world and feel like they really are there.
Nrama: It sounds like there would be a lot of challenges to handling this for a live broadcast.
Jones: Yeah, anytime you’re doing real-time visual effects — what we call real-time simulation — there are challenges.
So if there’s a cape — you know, say, Superman’s cape — and that cape has to interact with the real world, that has to be a real-time thing, not something that’s done beforehand with 10 computers.
There are always challenges in making sure that looks good in any scenario.
When you’re doing mixed reality for television, you really don’t know what the director’s going to do, in terms of the angle he’s going to look at it, or what he’s expecting.
So you always have to be prepared for the worst, and be prepared to make sure that, from any angle, everything has to look great.
This is a big differentiator from, say, broadcast television or film, where the director dictates what you’re seeing and you know what that shot is going to look like. Here, it’s a different story. The director’s the last one who’s controlling it for the world to see. So you always have to make sure everything looks great. That’s a big challenge.
With real-time simulation, things like cloth and hair and effects – it’s a big challenge.
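The live cloth behavior Jones describes, capes reacting every frame rather than being pre-rendered offline, is commonly handled with Verlet integration plus distance constraints. The sketch below is illustrative only; The Future Group’s actual solver is not public, and all names here are assumptions.

```python
import math

def verlet_step(curr, prev, gravity=-9.8, dt=1/60):
    """Advance one cloth particle per frame using Verlet integration:
    next = 2*current - previous + acceleration * dt^2."""
    x, y = curr
    px, py = prev
    nx = 2 * x - px
    ny = 2 * y - py + gravity * dt * dt
    return (nx, ny)

def satisfy_constraint(a, b, rest_length):
    """Nudge two linked particles back toward their rest distance,
    splitting the correction evenly between them."""
    dx, dy = b[0] - a[0], b[1] - a[1]
    dist = math.hypot(dx, dy) or 1e-9  # avoid division by zero
    diff = (dist - rest_length) / dist
    a2 = (a[0] + dx * 0.5 * diff, a[1] + dy * 0.5 * diff)
    b2 = (b[0] - dx * 0.5 * diff, b[1] - dy * 0.5 * diff)
    return a2, b2
```

Running both steps over a grid of particles each frame gives cloth that settles under gravity while keeping its shape, cheap enough to run live from any camera angle.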
Nrama: I know you used this technology previously for the Street Fighter tournament in May, but as you started to work with the Justice League characters for this event, were there unique challenges to these characters?
Jones: The Flash was a really big challenge for our guys because of what everyone is used to in terms of his effect — the Speed Force. Everyone’s accustomed to seeing that and looking at it.
And another challenge was Wonder Woman, because of her hair and the different effects we’re doing with her, like her shield hitting the sword.
I hope all these things air, but it’s not up to me. You know?
Nrama: But you have to be ready for anything.
Jones: You do. You really do.
My favorite is Flash. He’s so cool — it’s his personality. He’s my favorite anyway.
But we’re doing this thing where he flies in at super-speed, and he has a cup of coffee in his hand. He drinks it at normal speed, then he runs away at super-speed, but the cup falls at normal speed and bounces off the ground. It’s a really cool effect.
Nrama: There’s probably a lot of application for this technology for live events. I know some of our readers would love to interact live with these characters. Is that something down the road?
Jones: Oh yeah. We’ve done things in our studio where, when the camera gets close to that character, within a certain tolerance or distance, you trigger the character to actually look at you, point at you, threaten you, smile at you — there’s like that live human connection that we can be creating with something that’s not real but can be made to feel very real.
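The camera-proximity trigger Jones describes can be sketched as a simple distance check in world space. The threshold value and reaction names below are assumptions for illustration, not The Future Group’s actual API.

```python
import math

LOOK_AT_RANGE = 3.0  # meters; hypothetical trigger tolerance

def camera_distance(cam_pos, char_pos):
    """Euclidean distance between the camera and a virtual character."""
    return math.dist(cam_pos, char_pos)

def pick_reaction(cam_pos, char_pos, threshold=LOOK_AT_RANGE):
    """Fire a reaction animation once the camera comes within range;
    otherwise the character stays in its idle loop."""
    if camera_distance(cam_pos, char_pos) <= threshold:
        return "look_at_camera"  # could equally be "point", "smile", etc.
    return "idle"
```

Checked once per frame against the tracked camera position, a test this cheap is what makes the reaction feel instantaneous on a live broadcast.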
So yeah, the sky’s the limit. Our company’s agnostic to outside technology. So every time something new comes out, it’s a new tool that we can implement into what we’re doing.
There are so many applications for this type of mixed reality — location-based entertainment, live e-sporting events, and more broadcasts.
Nrama: I’m thinking something for Comic-Con International in San Diego next year.
Jones: That’s a great idea!
Nrama: Are some of the things we’re going to see on tonight’s broadcast particular to the Injustice game? I assume you worked with the game developers to make sure these characters were like their game versions.
Jones: Yeah, we worked very closely and in tandem with Turner and the game’s publisher – in this case it’s NetherRealm. And also with Warner Bros.
So there were all these checks and balances to make sure the integrity of that character is a constant and we don’t stray from that. So there aren’t a lot of creative liberties we take in terms of the character’s aesthetic.
But with what we add to that, we can take creative liberty, like what the debris would look like when Superman flies up and we’re insinuating that he’s crashing through the ceiling of the ELEAGUE Studio. So we can design that.
But in terms of the integrity of the character, to make sure — like, if it’s the Flash, we have to make sure the Flash doesn’t look different than what they’re accustomed to. That’s on us to really adhere to, and I think our partnership with Turner and NetherRealm really helped with that.
Nrama: So can you describe at all the technology you use to do this? Or how you evolved existing technology involved with gaming?
Jones: Yeah, well, what’s truly unique about our technology is that we’re taking a real-time game engine — in this case, Unreal Engine (UE4) — and creating our own version of it.
And that version allows us to be able to broadcast the results to the world.
So there are a lot of technical things that are important there. But this allows us to take a live camera in a studio and insert these characters into that real world, so the camera can see them and they feel like they’re really there. And that’s what’s unique.
And then there are other technologies that allow us to bring in a live video feed and the different elements of live broadcast that you need in a mixed-reality experience.
And our road map is leading toward more of these techniques. At the end of the day, the important part is creating tools that make the integration between the computer-generated world and the real world look seamless.
Nrama: Then to finish up, is there anything else you want to tell fans about tonight’s championship?
Jones: Definitely tune in on TBS from around 10 p.m. to midnight. It’s been a great partnership with Turner allowing us to be creative with them and come up with really cool ways to show fans that ELEAGUE is a really great place to watch their favorite games and their favorite gaming competitions.