
5G Experts discuss the future: Verizon, Mutable and Qwake Technologies


By Elisabeth Kindig


5G is the promising new generation of mobile communications, reducing latency and delivering internet speeds up to 100 times faster than the 4G networks we currently use on our mobile phones. These faster speeds rely on higher-frequency spectrum, which travels shorter distances and therefore requires micro data centers and small cell sites placed closer together than the longer-range towers we use for 4G.

The implications of 5G go beyond downloading your favorite Netflix movie: 5G will enable self-driving cars to brake quickly, hospital equipment to communicate faster for better outcomes, and first responders to know exact GPS locations rather than relying on 911 operators to dispatch them, along with many other applications that today would be hard to fathom.

Tech Lightning Rounds by Intertrust Technologies provides a unique opportunity to hear Verizon, the front-runner in 5G, alongside Mutable, a disruptive startup taking on the cloud as we know it, and Qwake Technologies, which has designed an augmented reality mask to give firefighters vision in smoke-filled buildings.


00:25 Beth Kindig: Welcome to Tech Lightning Rounds. I’m your host, Beth Kindig. This podcast interviews key people with deep expertise on one topic for a 360 degree view. One difference between this podcast and the other podcasts you listen to is that I hold short interviews called ‘Lightning Rounds’, with the goal of giving you a lot of compelling information very quickly so you can get on with your day.


00:53 BK: I went to Mobile World Congress 2019 in Barcelona, which is the world’s largest mobile conference with over 80,000 attendees, and I spoke with three leading experts in 5G. The people I interview come from Verizon, one of the leaders in 5G networks.

01:09 Lani Ingram: When a city is ready to move a metric, they want to be able to say that in the year 2025, I want the traffic congestion to be reduced by half. Or, I wanna be able to reduce the amount of energy usage by 30% in the next 18 months.

01:26 BK: Mutable, who’s delivering a major disruption to the Cloud with micro data centers and the public edge cloud.

01:31 Anthony Pellegrino: Instead of having $1,000 devices, you now can make them just a glass pane with the battery attached streaming video that’s running at the edge of the network, ’cause the latency is so low.

01:44 BK: I also speak with Qwake Technologies, who’s solving a real-world problem for firefighters using 5G and augmented reality. I go into a heated pitch-black room and I use a 5G-enabled AR firefighter’s mask to save a baby from a simulated burning building.

02:01 Sam Cossman: Our first product is really focused, not only on public safety, but on fire fighters, who today in this present day of self-driving cars and all this amazing technology that we’ve seen here at Mobile World Congress, go into a burning building and are utterly blind when they are performing search and rescue scenarios. So our goal was to give them vision back.

02:23 BK: Verizon is one of the first networks to roll out 5G, and they did this recently in four cities, including Houston, Indianapolis, Los Angeles and Sacramento. I speak with Lani Ingram, Vice President of Smart Cities at Verizon who discusses the various iterations smart cities have undergone.

02:41 BK: Going back to real and practical use cases, can you describe to me what will a smart city powered by 5G look like? How will my experience walking through the city change?

02:54 LI: Yeah, one of the things that I think is really interesting about the smart cities and 5G activity is the evolution that’s been happening with the smart cities has been a very long journey. It’s been about 10, 11, 12 years now that we’ve been working on smart cities. In the early years, I used to call it smart cities 1.0. And the smart cities 1.0 was kind of this early evangelistic years where we were trying to figure out how technology even can participate in helping government with their core services that they offer to the citizens. And then the smart city 2.0 started to happen, and people built out the solutions and platforms similar to what Verizon’s been doing.

03:36 LI: But what’s been happening with the customers, with the cities is, they’ve been in pilots. They’re co-creating, being able to kind of test out the solutions in the neighborhood, or perhaps a street. It’s been very good in order to be able to understand and grow in being able to be smart city ready. But where 5G I think is really gonna help the smart cities in the future is with what I call smart cities 3.0. Here’s where when a city is ready to move a metric, they want to be able to say that in the year 2025, I want the traffic congestion to be reduced by half. Or, I wanna be able to reduce the amount of energy usage by 30% in the next 18 months; a real solid metric. And that means they’ve got to scale. They’ve gotta be able to do things much more in partnership with people like Verizon, where we can participate with them, not just on the technology layer but also on the ability to run and operate that.

04:39 BK: And so basically, the network will roll out, people will upgrade their devices, and then at some point, applications will also have to catch up to basically really leverage 5G. Does Verizon have any idea what some of those applications might be? I know we talked about on a city level, public or first responder level, etcetera. But as far as even on the handset, does Verizon have any vision there?

05:05 LI: So, the innovation, I don’t think is even gonna happen afterwards. We’re really working on that innovation even before and during that rollout. We’ve got a lot of innovation hubs that we’ve actually put in different parts of the United States, pulling in entrepreneurs to be able to help create these kinds of solutions, so that when that rollout actually becomes available to citizens on a regular basis, they’ll be able to have those applications right off the bat; a tremendous amount of different things. I think augmented reality and virtual reality are going to be very different in the future than it is today, where it’s more of a gaming experience, etcetera. It can actually now be leveraged in more of our day-to-day activities in workplaces, being able to understand what is happening while we are actually experiencing. And say you’re a construction worker and you need to be able to understand what’s going on up above. Maybe you have drones that are going up there and you can actually see everything from a headset and make virtual decisions.

06:16 LI: The way we think about healthcare, and maybe being able to leverage, again, robotics from a distance, to be able to perform different type of procedures, having full scans of medical information being able to be sent over in milliseconds. These types of things are going to change the way we live and operate in so many different types of industries. And I think that the innovation on that is going to continue to evolve within each individual vertical in a pretty significant way.

06:55 BK: Maybe we can go a little deeper into hospitals actually, with even the four cities that you’ve rolled out to. Are hospitals already utilizing 5G? Or, what would be kind of some of the first ways that hospitals will utilize 5G?

07:10 LI: I think that the medical industry, in general, needs a tremendous amount of data. And the faster that they can get to that data, the better they’re going to be able to serve their patients. I see the 5G capabilities helping all the way from the ambulances, ensuring that we can clear the roadways and get that vehicle from the place where they’re picking up the individual, to the hospital as fast as possible. There’s a lot of improvement that can be made in there that I think 5G is gonna change. When you’re in that vehicle, in that ambulance, the ability for the doctor to be able to see what’s happening. So the mobility aspect of 5G is quite interesting, keeping that connection going as that car is moving at very fast speeds. Now all of a sudden, the doctor may be able to have camera feeds, and AR capabilities to where they can actually help start treating that patient right from the early stages, as opposed to having to wait until they actually get into the hospital.

08:16 LI: And then once again, you’re in the hospital, being able to reach out to specialists who might be in different locations, and sending information real-time to them, and having them being able to, almost virtually, be there while you’re treating the patient. Those are just some areas that we can think about from the journey of what a patient and doctor relationship might look like in the future.

08:44 BK: My second interview is with Anthony Pellegrino, the CEO of Mutable, who discusses the more technical aspects around 5G infrastructure. His company is at the forefront of how the cloud will undergo a massive shift due to micro data centers and edge computing. His descriptions are incredibly coherent on a topic that can often be hard to translate.

09:05 BK: What are micro data centers, and what will they do to rival the cloud as we know it right now?

09:09 AP: For sure. So, when you have the cloud data centers that are these hyperscale multiple football fields long big, they put them in spots that can sustain them. And so those locations are middle of Iowa, middle of Oregon, really on the outskirts of where they can get experts, like really taking advantage of the landscape where power and cooling are very abundant. But with the edge, you’re actually running inside of cities. You’re running it down the street. You’re taking advantage of these small, not football field, but conference room size locations, and taking those resources and dynamically allocating them for developer’s use.

09:55 BK: Can you expand on what it means for developers to push applications to the edge?

10:00 AP: Absolutely. So, whenever you’re pushing something, really what it means is when you write code, you have to deploy, or basically host it somewhere. And so hosting it is running it on these cloud servers, or in this case, when we talk about pushing to the edge where we’re having those applications run literally in these micro data centers, and we’re deploying them when they’re needed on demand. So instead of, as we talked about, they’re like conference-size style locations that only have maybe a couple of hundred servers or less. And inside that room, you can’t have everything running all the time. So think about Ford, and Ford if they wanna do autonomous vehicles, are they going to put redundant compute literally in thousands of locations, or are they going to, when a car comes by going through that neighborhood, you’re connected to 5G, and they’re sending a request across, you can just spin up an instance of these applications on demand, and just use it when it’s needed, instead of having everything all over the place. That’s very cost-effective.
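A rough sketch of the on-demand model Anthony describes: an edge site with only a few hundred servers can't run everything at once, so it spins up an application instance when a request arrives and evicts idle ones to make room. The class names, capacity figure, and eviction policy below are illustrative assumptions, not Mutable's actual API.

```python
import time

class EdgeSite:
    """One conference-room-sized micro data center with limited capacity."""

    def __init__(self, name, capacity=8):
        self.name = name
        self.capacity = capacity
        self.running = {}  # app name -> last-used timestamp

    def handle_request(self, app):
        """Spin up an instance on demand; reuse it while traffic continues.

        Returns True on a cold start, False when a warm instance is reused.
        """
        started = app not in self.running
        if started and len(self.running) >= self.capacity:
            # Evict the least recently used app to stay within capacity.
            idle = min(self.running, key=self.running.get)
            del self.running[idle]
        self.running[app] = time.time()
        return started

site = EdgeSite("downtown-1", capacity=2)
print(site.handle_request("ford-autonomy"))  # True: cold start on first request
print(site.handle_request("ford-autonomy"))  # False: warm instance reused
```

The cost advantage he points to falls out of this model: compute is allocated only while a car (or other client) is actually in the neighborhood, instead of sitting redundantly in thousands of locations.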

11:14 BK: What kind of apps will we see from 5G and these micro data centers? How will they differ from previous generation apps?

11:23 AP: Yeah. You actually can, interestingly enough, think of this as close to what happened when the iPhone and that ecosystem was created with 4G. You had Uber and Lyft and all these new applications that took advantage of the fact that you can track people’s location, send information in real-time, and stream videos, and then create a whole new ecosystem. Well, that’s gonna be happening again. Because, instead of having it on devices, which are now becoming incredibly expensive to compensate for everything they need to run, whether it’s AI with video, gaming, and all these types of things, instead of having $1,000 devices, you now can make them just a glass pane with a battery attached streaming video that’s running at the edge of the network, ’cause the latency is so low. It’s actually faster than your anti-lock brakes that you have in your car. So in that kind of realm, you can offload a lot of compute that would normally be on a device, and shift it just a few miles away, and have it run there.

12:32 BK: Because Anthony has an important perspective on 5G, I wanted to ask him a question that I had also asked Lani of Verizon, which is to break down what 5G will do on an individual level. We also talk about what 5G will do for businesses and app developers.

12:47 BK: So how does the current cloud infrastructure today pose a problem in the situation that you described? Like, what is it exactly that I will benefit? Like, how will I benefit from micro data centers as a user of mobile, as maybe somebody who lives in San Francisco, in a city? Like, what will it do for me?

13:03 AP: A great example of that is autonomous vehicles. So with autonomous vehicles, when you have cars, you can fill it up with batteries, and you can have it go from point A to point B. But the more compute that you have on, or servers that you have on these cars, the less you’ll travel because you’re now using that energy not just to move the car, but to make decisions. And then a lot of the decision-making that’s done is actually predicting what will happen. So you’re constantly predicting the future, and that requires even more and more resources, because less and less is unknown. So when you have more things that are unknown, it’s hard for the actual car to figure out what it needs to do, so it spends a lot of time and energy to make those predictions. Now, if you’re running this on the edge, you can actually offload a lot of decision-making and have that run and take data from multiple sources, from cameras, from other vehicles, and you can communicate to that car in real-time, in less than 5 milliseconds, and that’s faster than anti-lock brakes. And in that realm, you now can have these cars travel further.
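The distance math behind Anthony's "less than 5 milliseconds" claim is worth making concrete. The sketch below computes propagation delay alone (ignoring processing and queuing), assuming light travels roughly 200,000 km/s in fiber; the specific distances are illustrative, not figures from the interview.

```python
# Round-trip propagation delay, assuming ~200,000 km/s in fiber
# (an illustrative figure; real paths add processing and queuing time).
FIBER_KM_PER_MS = 200.0  # kilometers traveled per millisecond

def round_trip_ms(distance_km):
    return 2 * distance_km / FIBER_KM_PER_MS

edge_rtt = round_trip_ms(5)      # a micro data center a few miles away
cloud_rtt = round_trip_ms(2000)  # a hyperscale region in the middle of the country

print(f"edge: {edge_rtt:.3f} ms, cloud: {cloud_rtt:.1f} ms")
```

A cross-country round trip burns roughly 20 ms on propagation before any computation happens, so a sub-5 ms decision loop for a vehicle is only physically possible when the compute sits a few miles away.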

14:17 BK: Do you think there’s gonna be a first mover advantage to companies that go towards the public edge cloud and move away from the traditional infrastructure?

14:26 AP: For sure. So, the big advantage that they’re going to have is actually, especially in the world of IoT, being able to collect more data more frequently, and then be able to process it near real-time, instead of collecting data and sending it all the way to these locations that are in the middle of the country. It’s gonna take longer to process, ’cause all of it gets centralized. While all this can happen as we’re saying, just a couple of milliseconds away, you can do it a lot more frequently.

14:57 AP: Another great example is actually even personalized search. So, if you’re doing YouTube or Spotify, and you’re searching for a song, or all those type of things where the search is actually personalized towards you. And in that realm, essentially, what you can do is you can provide this unique experience of as you’re typing, it literally shows exactly what you’re doing. And that’s actually one of the biggest drop offs that these companies have, is the fact that, as you’re typing, if it takes too long, they actually switch apps. So, Spotify is a big culprit of that where a lot of the users jump off of Spotify and go into YouTube because it’s faster for them. So those type of economics of keeping the user on longer by giving them a better experience is very huge for ad revenue and allowing them to keep engagement.
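The drop-off dynamic Anthony describes can be captured in a toy model: a user abandons a typeahead search if any keystroke's results arrive slower than their patience allows. The 200 ms threshold and the latency figures below are illustrative assumptions, not measured Spotify or YouTube data.

```python
def user_stays(keystroke_latencies_ms, patience_ms=200):
    """Toy engagement model: the user abandons the search session if
    results for any keystroke arrive later than their patience allows."""
    return all(latency <= patience_ms for latency in keystroke_latencies_ms)

# Results served from a distant cloud region vs. a nearby edge site:
print(user_stays([180, 250, 300]))  # False: one slow response, user switches apps
print(user_stays([20, 25, 30]))     # True: instant-feeling typeahead, user stays
```

Under this model, shaving even tens of milliseconds per keystroke flips whole sessions from abandoned to retained, which is the ad-revenue and engagement argument being made here.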

15:57 BK: It’s one thing to talk about 5G and another thing entirely to use 5G-enabled augmented reality. I got this opportunity at MWC with Sam Cossman, the Co-founder of the company Qwake Technologies, who has designed a mask to help firefighters see in a smoke-filled room with zero visibility. I couldn’t help but be impressed at the real-world problem Qwake Technologies was solving.

16:20 BK: Sam, tell us about the device that I’m looking at right now that helps firefighters locate people and heat patterns within a burning building.

16:29 SC: Yeah. So we’re a small company out of Silicon Valley called Qwake Technologies. And the first product we’ve created is called C-THRU. And the whole idea behind our company is helping individuals operating in high stress environments reduce their cognitive load so they can do their mission critical tasks more effectively. Our first product is really focused, not only on public safety, but on firefighters, who today in this present day of self-driving cars and all this amazing technology that we’ve seen here at Mobile World Congress, go into a burning building and are utterly blind when they are performing search and rescue scenarios. So our goal was to give them vision back. And we’ve done that by integrating a couple of different technologies, augmented reality optics, high-speed thermal cameras, and GPU embedded chips, to create an experience that essentially flips the lights on for them in complete darkness and in complete smoke.

17:21 BK: I assume I’m going to put this on my head. It’s basically like a ventilation mask that firefighters typically use.

17:27 SC: Yes, so what you’re looking at right here is a traditional face mask except we’ve modified it by placing some augmented reality optics inside the face-piece. As you can see on the outside, we have a thermal camera, which is a long-standing technology. It’s been around for a long time, but the problem is the way it’s currently used today is that a firefighter will hold this clunky thing up to their face, they will have to do that with their hand, which of course they need to navigate through burning buildings by following the hose line. And they’re looking, oftentimes, at a very complex image, which as most of us know, in stressful environments, cognitive abilities take a nosedive.

18:04 SC: So we really wanted to take all the amazing qualities of thermal imaging and package them in a way that’s just much more usable. So yeah, today we’re gonna actually put this mask on you and put you into a heated environment, and show you what it looks like to go from blind to having vision all of a sudden. And actually on the screen here, I know this isn’t video, but what you’re looking at is actually a mirrored image of yourself in computer vision. So that’s… You can wave at the camera there. That’s you. It’s a real-time feed of a computer vision application that we’ve written to, again, reduce the cognitive load for an individual operating in these high stressed environments. So you can quickly discern, “That’s a door, that’s my path to safety. That’s a crib that has a baby. That’s my victim that I’m saving. Or, perhaps that’s a ladder, an escape path to safety.” So let’s go ahead and get you equipped and put the mask on your face here.

18:56 BK: How hot is this room? Define heated environment.

19:01 SC: [chuckle] So, your shoes should not melt.

19:02 BK: As you’ll soon hear, I put the firefighter mask on and navigated a completely pitch black room using AR optics and a thermal camera. My task was to find a baby in a crib, locate an escape ladder, and identify which door handle was closest to the fire. The remarkable thing is you can immediately see objects and heat patterns to discern where the fire is burning, who should be saved, and how to exit quickly, all while being hands-free.

19:29 SC: Beth, I’m going to walk you this way towards me.

19:32 BK: Okay.


19:33 SC: And then I’m gonna turn you around, just this direction, and on the count of three, I’m gonna have you go ahead and push that button one time, alright? So I’m gonna stand about 10 feet away from you.

19:45 BK: Okay.

19:45 SC: And then I’m gonna tell you when to click the button, and you’re gonna go from what you’re seeing right now, which is complete darkness, to having vision, hopefully. So go ahead on the count of three. One, two, three. There we go.

20:00 BK: Yeah, I can see you.

20:02 SC: So, what…

20:04 BK: There’s the crib. Oh, it has a microphone.

20:04 SC: Yep. Why don’t you describe what you’re seeing?

20:07 BK: Oh, okay, so I see you in front of me, and there’s a crib over to the left. And, let’s see, I’m going to walk over to the crib.

20:17 SC: Do you see anything in there?

20:24 BK: Let’s see here. There’s a baby right here.

20:26 SC: That’s right. You found the baby. Good job.

20:28 BK: I’m saving a baby.

20:29 SC: You just saved a life. So that baby has a heating pad in it, so that it has a little thermal gradient. Why don’t you turn around and tell me what you see in this corner.

20:37 BK: A ladder.

20:37 SC: Yeah, that’s right. This is a kind of an escape ladder. And then if you switch it one more time on the button there.

20:45 BK: Yeah.

20:45 SC: Go ahead and… And there you go. So what color do you see here?

20:48 BK: Red.

20:48 SC: Yeah. And what does this look like down here?

20:51 BK: Red.

20:53 SC: Now, switch it back to the green. And what am I touching here?

20:58 BK: The door handle.

20:58 SC: That’s right. So this is the door, and it’s red because it’s, as it suggests, hot. So the whole idea behind this interface is to take thermal imaging, which is again an amazing technology, but just been in bad form factor, strip it down to the most basic elements of shapes and contours that give you all the information you need to be able to do your job, and present it right where you need it, which is right in front of your eye in real-time.

21:24 BK: I don’t think that I was meant to be a firefighter. They have a hard job.

21:26 SC: [chuckle] They do have a hard job, and you have a good appreciation for it when you go into a place like that, right?

21:34 BK: Yeah, I do actually. That’s a whole different experience.

21:38 BK: Thank you for listening to The 5G episode. Please subscribe to this podcast and leave a review in iTunes.

