Podcast: When your face is your ticket

In part three of this latest series, Jennifer Strong and the team at MIT Technology Review jump on the court to unpack just how much things are changing.

We meet:

  • Donnie Scott, senior vice president of public security, IDEMIA
  • Michael D’Auria, vice president of business development, Second Spectrum
  • Jason Gay, sports columnist, The Wall Street Journal
  • Rachel Goodger, director of business development, Fancam
  • Rich Wang, director of analytics and fan engagement, Minnesota Vikings

Credits

This episode was reported and produced by Jennifer Strong, Anthony Green, Tate Ryan-Mosley, Emma Cillekens and Karen Hao. We’re edited by Michael Reilly and Gideon Lichfield. 

Transcript

 [TR ID]

Strong: I’m in Queens in the neighborhood near a massive stadium complex called Citi Field. It’s home to the New York Mets, though because it’s baseball’s offseason, everything is locked up right now, and all you can really hear is rush hour traffic.

But if you look up, along the edge of the stadium where thousands of fans will, eventually, return, you can see some of the hardware that powers the team’s use of face recognition. These cameras are meant to detect faces that have been banned from the grounds – folks like ticket scalpers, people who’ve run onto the field, even people who’ve committed crimes out in the parking lot. That system is powered by one of the biggest names in face recognition – N-E-C. It’s able to measure features like ears, and it still works when people are wearing masks, hats and sunglasses.

And then once you get over to the turnstiles – there’s another face system from a company that’s known for airport security – called Clear – and that’s for ticketless entry. Basically you can use your face as a ticket. When you get inside there’s a payments system in a concessions area – meaning you can buy a beer with your face, if you wish.

But it’s when you get to your seat that things get really interesting. Even before the pandemic, attendance at baseball games had been on the decline. Actually, this stadium has about 15-thousand fewer seats than the one it replaced. And so, on the one hand, stadiums are trying to make the experience as safe and hassle-free as they possibly can, but they’re also trying to learn as much as they can about who the people in the stands are – and that, too, is being done with face recognition. I’m Jennifer Strong, and in this latest episode of our mini-series, we look at how this and other tracking systems are changing the sports experience in the stands and on the court.

[SHOW ID]

[Sound from Chicago White Sox at Milwaukee Brewers (Anchor): Ok we are back to playing ball. Two out. 1st inning. No score. And the batter will be Harold Baines with a 7-game hitting streak…]

[Sound from Chicago White Sox at Milwaukee Brewers: crowd cheers]

Strong: For decades, crowding around the TV or radio was the go-to way to consume sports.  Oftentimes, that meant tuning in for hours like this 1984 Major League Baseball game between the Chicago White Sox and the Milwaukee Brewers.

[Sound from Chicago White Sox at Milwaukee Brewers (Anchor): That’s deep in the center field. Going back… It could be out of here. Manning looks up. It’s outta here! A home run for Harold Baines. The Sox win 7-6 in the longest game in American League history.]

Strong: The game lasted eight hours and six minutes, and it had to be completed over two days. But sports watching today looks pretty different. Human attention spans are measured in seconds, and they’re shrinking. Millions of people still tune in to watch, but about a third stream games on mobile devices. And of those who still watch on television, 80 percent of them do so while using a second device to search stats, live scores, message other fans, and watch related videos. Fans who attend games in person are now seen as high-value customers. And that’s another place where face ID comes in.

[Sound from CNBC newscast (Anchor): And if you were angered over Facebook invading your privacy, you may not want to attend a major sporting event.]

[Sound from CNBC newscast (Eric Chemi): New high tech cameras can now snap a high-rez photo of every person, in every seat, every minute of the game.]

Strong: Face data collected in stadiums by companies like Fancam is now being used to get insights on fan demographics like age, gender and race. Panoramic cameras capture images in such fine detail that you can zoom in from a bird’s-eye view of a stadium, into the stands, onto an individual person, and still make out nuances like a smile, the writing on their shirt, even the texture of their jacket. And now you can also quickly calculate the percentage of people wearing masks – like in the case of the NFL’s Minnesota Vikings.

Wang: This is new for everybody. We’re still trying to work out exactly how we enforce these mask rules and how to monitor them and track them.

Strong: Rich Wang is their director of analytics and fan engagement. He’s on a Zoom call showcasing how they use computer vision.

Wang: Also, if you look at this graph, the lowest point is that 87% of people have their mask on most of the time and in most of the game. People are, you know, behaving and enforcing the mask rule. So those are really positive storylines that will continue to support our case for increasing fans.

Goodger: Being able to utilize these stats to reopen venues and get fans back into the stadium. And then just as a safeguard as well, once fans are back in the stadium using some of these metrics in addition to the mask usage, also being able to utilize the information of section capacity. 

Strong: And this is Rachel Goodger, the director of business development at Fancam.

Goodger: So, obviously fans have a seat assigned to them when they go back into the stadium and fans are socially distanced. But what happens if fans start to move around the stadium, and one section becomes over capacity? You know, in real time, for us to be able to notify staff, and for them to be able to see that information and say, ok well, we need to go break up this section a little bit. And then for teams being able to look back after every single game and say “wow, we did a great job today.” Or “wow, we really need to work more on mask usage in the lower bowl or upper bowl of this section” and things like that. I think it is data that is going to be very important for not only, as I mentioned, reopening these stadiums but keeping them open in the future.
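
Neither Fancam nor the Vikings have published the code behind these dashboards, but the bookkeeping Wang and Goodger describe – per-person mask detections rolled up into section-level mask rates and occupancy alerts – can be sketched in a few lines. The field names and the 50% capacity threshold below are assumptions for illustration, not anyone’s actual schema.

```python
from collections import defaultdict

# Hypothetical per-person detections from one camera sweep of the stands.
# Field names are illustrative only, not Fancam's actual data model.
detections = [
    {"section": "101", "mask_on": True},
    {"section": "101", "mask_on": False},
    {"section": "238", "mask_on": True},
]

SECTION_CAPACITY = {"101": 150, "238": 90}   # assumed per-section seat limits

def summarize(detections, capacity, max_fill=0.5):
    """Roll per-person detections up into mask usage and occupancy per section."""
    counts = defaultdict(lambda: {"people": 0, "masked": 0})
    for d in detections:
        c = counts[d["section"]]
        c["people"] += 1
        c["masked"] += int(d["mask_on"])

    report = {}
    for section, c in counts.items():
        fill = c["people"] / capacity[section]
        report[section] = {
            "mask_rate": c["masked"] / c["people"],
            "occupancy": fill,
            "alert_staff": fill > max_fill,   # e.g. "go break up this section"
        }
    return report

print(summarize(detections, SECTION_CAPACITY))
```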

Strong: The company sells data back to the sports teams who use it to advance their marketing, affecting everything from what music is played at stadiums to what ads people see during and even after the game has ended.

Scott: You’re gonna start to see the data that you’re willing to share more broadly coupled with the technology used for identification to make things more predictable.

Strong: Donnie Scott is the senior vice president of public security at IDEMIA. It designs AI-driven identity and security solutions for all kinds of businesses.

Scott: And that would be everything from a digital driver’s license on your phone to a physical license, to a credit card, to an electronic payment mechanism.

Strong: They also make biometric technologies that recognize faces, fingerprints or eyes which can be used to verify identity in sports stadiums or other places like airports and theaters.

Scott: So, we would essentially embed the technology in their loyalty program but we’d add to it, the ability to link either their biometrics – face, fingerprint, iris in some countries that prefer it because of face coverings and other things, or their mobile device where you could authoritatively share your biometric information, or the fact that you’re a season ticket holder, with a piece of equipment at the venue. And therefore, you know, when you show up, they know, okay, Jennifer has tickets to this game. They’re valid at this date. She can pass through the gate.

Strong: Their goal? To be invisible. Identity data is captured by cameras concealed in what appears to be a normal turnstile. It’s all about creating what’s known as a frictionless experience.

Scott: So particularly around theme parks, um, but the same with stadiums and other concert venues, the technology is evolving from being a device that kind of stands out to being part of the normal flow and queue of the venue itself.

Strong: We already unlock smartphones with our eyes, fingers and faces, and that got us used to this idea of biometrics in our daily lives. Scott thinks that may be why the response to these services has been mostly positive.

Scott: You know, I’ve watched my kids grow up with  first opening an Apple device with their thumb print, then moving on that they felt they were very mistreated because they couldn’t unlock it with their face. And we’ve all become, you know, the last 15 years, 10 years, desensitized to the weirdness of it. I think most of society is focused on how it makes my life easier.

Strong: And in a world where confirming your identity is as easy as unlocking a phone, your biometric data could become more important than a passport, car keys or any other physical item we carry with us.

Scott: I think people are going to become really accustomed to the technology being there, how to use it, how to interact with it and what to expect from it because I think we’re going to see it in all walks of life. We’re going to see it when we travel. We’re going to see it when we do business with our government. We’re going to see it when we do business in grocery stores in you know sports and concert venues and music parks as well. So it’s going to become such a standard way of life that the access part will become a de facto normal. And then it’s what happens next.

Strong: And what happens next could mean more personalized experiences. 

Scott: I think that the next thing to come is going to be to enable the fan experience. But after that, it becomes, how does the fan experience fit in your life? And, you know, that is a concept that is pretty big and broad, but one that, once the first two pieces are enabled through technology and enabled through an acceptance by the user themselves, are only natural things that come with an improved, mature use of a technology. You could think of an amusement park head or character, where kids could walk up to their favorite character and be recognized for who they are and have a custom experience specific to them.

Strong: Which is likely to happen at scale. 

Scott: You could see a future where as you arrived to the airport or as you arrive to the sporting event, and it directs you to your parking based on recognizing your car or on sharing who you are from your phone with the airport operator or the airline or the TSA themselves.  You would have an, you know, a known time to gate, right. Which is the ideal state where it says I’ve got a five o’clock flight today based on the wait times that are predicted and where we are. I know that it’s going to take me 12 minutes to get from the front of the airport  through the checkpoint to the gate. And you’re going to have directions along the way, the same experience is going to happen for  sports venues and for concert venues, where from parking, you’re going to be directed through the shortest line, you know, that line’s going to move quickly because it’s biometrically enabled, and then it’s going to be able to guide you to where can I get my concessions that I want, how long do I have to, before I have to start walking, so I can be in my seat before it kicks off, I think those types of secondary benefits are going to come pretty quickly as the, as the venues get instrumented, to be able to recognize and identify folks.

D’Auria: I think there’s a huge opportunity to make the kind of sports fan experience more engaging, more potent. And I just think we’re at the early days of that. I’m Mike D’Auria and I’m the vice president of business development at Second Spectrum.

Strong: The company provides tracking data and analytics software for professional sports leagues like the NBA and Major League Soccer. A series of cameras, no bigger than your standard security camera, provides unprecedented machine understanding of every game.

D’Auria: The kind of core of this technology is computer vision that runs on top of these camera feeds. And what this is intended to do is track the movement of every player and the ball 25 times a second. So you can kind of think, over the course of one, umm, typical NBA basketball game, you’re able to capture millions of data points that didn’t exist before and use those to kind of build a suite of products or experiences on top of that, that can really change the way that we see and interact with sports.

Strong: Those data points are rapidly analyzed with AI, which can spit out predictions such as the likelihood a player will sink a three-pointer—while the play is still in progress. It’s also using this data to deliver a more personalized, interactive viewing experience for fans watching remotely. 
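
Second Spectrum hasn’t published its shot-probability model, but the general idea is a statistical model that maps tracking-derived features to a probability while the play unfolds. Here is a minimal, made-up sketch of that shape – the features and coefficients below are illustrative assumptions, not the real model.

```python
import math

def shot_probability(shot_distance_ft, closest_defender_ft, catch_and_shoot):
    """Toy logistic model of make probability from tracking features.
    Coefficients are invented for illustration; this is not Second Spectrum's model."""
    z = (1.0
         - 0.10 * shot_distance_ft        # farther shots are harder
         + 0.15 * closest_defender_ft     # open shots are easier
         + 0.20 * (1 if catch_and_shoot else 0))
    return 1.0 / (1.0 + math.exp(-z))

# Example: a lightly contested 25-foot catch-and-shoot attempt
print(round(shot_probability(25, 4.0, True), 2))   # ~0.33
```

A real system would fit such a model to millions of historical possessions and update the estimate every time a new tracking frame arrives, which is what makes the “bubble over every player’s head” possible in real time.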

D’Auria: In this last NBA finals, we ran what we call video augmentation essentially real time on top of the game. And so what you could do there is, for example, take that shot probability model. And while the game is being played, you could integrate into 3D space in the video a shot probability bubble over every offensive player’s head that updates in real time. We can diagram the play that’s being run as it’s unfolding. So if you’re trying to learn about the game a little bit, you can kind of, you know, have a bit of a tutorial, or what would it feel like to have a coach sitting next to you. You know, or if you just want to have fun or kind of game-ify this a little bit, you know, every time somebody dunks the ball, you can see a lightning strike on the backboard. And so each of those experiences might not be right for everybody, but I think we will move to a world where live sports can be really personalized to the way you want to view it.

Strong: And access to troves of data has transformed how coaches train their players. 

D’Auria: So if you kind of step back and think about the way data has traditionally been captured in sports, you would have people either sitting in the stands or watching the game on TV and kind of manually coding. That was a shot. That was a pass. That was a pick and roll action. And so from this kind of underlying tracking dataset you can apply machine learning to kind of automate that whole process.

Strong: That automation allows for all that data to be matched to game film. Coaches, general managers, and analysts can then sift through it with a software tool that functions like a search engine.  

D’Auria: And so for folks who work on an NBA team, you can ask very complicated questions or make very kind of detailed queries about the game. And with a few keystrokes, a few clicks of your mouse, you can get a very precise answer, a data visualization, and an automatically generated playlist of, you know, for example, if I wanted to look at Anthony Davis, LeBron James, pick and roll from the right wing, where the defense ices and Anthony Davis rolls and somebody tags him from the weak side, and so LeBron James takes a jump shot and makes it. You know, you can get the very precise set of every time that combination has happened in the course of these guys’ NBA careers in a matter of seconds, and then kind of use that for your coaching purposes. And now, uh, someone at a team level can spend their time saying, well, I have this video or this information, how can I help a coach implement that into his game plan? Or how can I help my players kind of learn something new on the court? And so it kind of shifts their workflow to teaching and implementation versus kind of, you know, data gathering and manual labor.
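
The querying D’Auria describes amounts to filtering a database of machine-tagged plays. As a rough sketch – not Second Spectrum’s actual data model or API – here is how a query like “LeBron/Davis pick-and-roll, the defense ices, LeBron makes the jumper” might be expressed over tagged play records.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Play:
    # Illustrative tags only -- not Second Spectrum's actual schema.
    ball_handler: str
    screener: str
    action: str        # e.g. "pick_and_roll"
    coverage: str      # e.g. "ice"
    outcome: str       # e.g. "jump_shot_made"
    video_clip: str

def query(plays: List[Play], **criteria) -> List[Play]:
    """Return every play whose tags match all the given criteria."""
    return [p for p in plays if all(getattr(p, k) == v for k, v in criteria.items())]

plays = [
    Play("LeBron James", "Anthony Davis", "pick_and_roll", "ice", "jump_shot_made", "clip_0381.mp4"),
    Play("LeBron James", "Anthony Davis", "pick_and_roll", "switch", "turnover", "clip_0442.mp4"),
]

# "LeBron/Davis pick-and-roll, defense ices, LeBron makes the jumper"
for p in query(plays, ball_handler="LeBron James", action="pick_and_roll",
               coverage="ice", outcome="jump_shot_made"):
    print(p.video_clip)
```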

Strong: And he says, over the next couple of years, the roles of these machines in the game could shift from assistant coach to assistant referee—adding context and nuance to difficult calls.

D’Auria: I mean, we’ve seen this already in some other places where we work. So we’ll kind of give the soccer example of, you now have technology that will help with the goal, no goal call, right? You see this in tennis, with computer systems being used to kind of judge if a ball is over the line or, you know, inbounds or out of bounds, and be able to do this with precision that’s, quite frankly, better than what a line judge could do, or a referee who might have a really difficult angle to see if, like, literally every millimeter of the ball went over. You’re starting to see this with the offsides line in soccer as well. And so I think generally the first place this happens is to basically, um, you know, augment or assist a referee’s capabilities. So you can kind of think about providing a referee an additional data source or, you know, an additional validation of one of their decisions.

Strong: Because the system can already identify players from their jerseys, Second Spectrum doesn’t need to use facial mapping or recognition. But it is useful for analytics, and that’s not just specific to capturing faces. Right now, players appear in the system as dots on a map, and as the camera systems improve, those dots could transform into full skeletons. Extra detail, like real-time elbow angle, could help with even more accurate shot predictions. Though not everyone is on board.
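
If those skeletons ever arrive, a feature like elbow angle falls out of basic geometry on the pose keypoints. Assuming a pose model that returns shoulder, elbow and wrist coordinates (an assumption for this sketch; today the system tracks dots), the angle at the elbow is just the angle between the two arm segments.

```python
import math

def elbow_angle(shoulder, elbow, wrist):
    """Angle at the elbow, in degrees, from three 2D keypoints (x, y).
    Keypoints are assumed to come from a pose-estimation model; this is only the geometry."""
    ux, uy = shoulder[0] - elbow[0], shoulder[1] - elbow[1]   # elbow -> shoulder
    vx, vy = wrist[0] - elbow[0], wrist[1] - elbow[1]         # elbow -> wrist
    cos_a = (ux * vx + uy * vy) / (math.hypot(ux, uy) * math.hypot(vx, vy))
    return math.degrees(math.acos(max(-1.0, min(1.0, cos_a))))

# Example keypoints in image coordinates
print(round(elbow_angle((310, 120), (330, 160), (360, 140)), 1))   # ~82.9 degrees
```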

Gay: You know, a sport that I follow and find fascinating is bike racing, and bike racing is a sport that is actually in a long conversation about removing technology.

Strong: Jason Gay is a sports columnist for The Wall Street Journal.

Gay: Technology now in cycling can say, okay, if you want to win this race or catch up to this person, you have to put out X amount of effort for X amount of minutes. And you actually have this data right on an onboard computer, on a bicycle in front of you, telling you exactly what to do. Now, that’s like an amazing thing. However, it’s also not terribly human, right? It seems to be somewhat clinical, and it’s created what many people feel is a little bit of a dry style of racing, where people are data-driven and they’re using their heads too much, as opposed to their hearts. The French have an expression, panache. They love to see races won with panache, which basically means on gut instinct. And so there’s been conversations about, well, what if we take away these computers from riders and make them, you know, use their heads and their hearts to cycle. Now, there’s a safety consideration here that’s concurrent with this, right? Having that information actually creates a safer experience for a rider oftentimes. But it is fascinating that the tech has gotten so good in certain instances, in terms of maximizing effort or telling an athlete what effort is required, that they’re starting to draw back from that.

Strong: And for sports embracing this tech, it’s changing how the game is played.

Gay: Here’s an example from baseball, and we see it quite often: a manager will come to the mound and remove a pitcher from a game, even though the pitcher is pitching very, very well that day. The reason they remove him is that the data shows that this pitcher tends to break down at a certain point. It’s almost like a car tire or something. And they’re just saying, well, this pitcher, at this point of the game, historically is going to stop performing at the high level we need him to. So we’re going to make that move. We’re removing sort of the gut of saying, oh well, he’s rolling today, let’s just let him go. They’re relying on the numbers.

Strong: Data driven game strategies are also changing how teams recruit. Like in basketball, where players who can execute a three-point shot (once considered a gimmick by the NBA) are now deemed extremely valuable. 

Gay: The reason is that basketball teams by looking at their numbers discovered that a three-point shot is a more efficient shot. You’d rather take that three-point shot than certainly take a longer two point jump shot. And so you prioritize the three pointer in an offense. The most extreme example of this – the Houston Rockets, where you have a perennial MVP candidate in James Harden who oftentimes is taking three pointer after three pointer in a game, because it’s an efficient way for them to play.

[Sound of Houston Rockets at Los Angeles Clippers (Announcers): Harden, nobody near him, sets all the time and nails the three-pointer! Steps back, open three, got it! James Harden steps back puts up a three, It goes, bounces and drops through!]

Strong: Technology is also playing assistant coach in places like the locker room of The Dallas Mavericks. 

[Sound from video of Mark Cuban at Dallas Mavericks (Cuban): What will happen is when a player walks in, or anybody walks in, we’ll have facial recognition. It’ll take a picture of you and it will say ‘ok here comes Mark or here comes Dirk’]

Strong: Mark Cuban is their owner.

[Sound from video of Mark Cuban at Dallas Mavericks (Cuban): And for any of the players or any of the staff, it’ll put coaches’ notes: here’s what you’re expected to do, and tell you what’s going on. For anybody we don’t know, it’s going to be ehh-ehh-ehh, get the heck out.]

Strong: And it’s not just basketball. Using AI to find the most efficient pattern of play is growing across all sports. And there’s a role for face ID too. That same face-mapping that sees when you’re looking directly at your phone to unlock it could also help coaches see what players are focusing on during the game.

Gay: I mean, that’s an incredibly integral thing for say a football quarterback. If you could somehow be able to render what a football quarterback is looking at or more importantly not looking at, not seeing downfield. Well, you could see, you know, immediate utility for any quarterback, any football team. But it also applies to a point guard or, you know, somebody playing left tackle or somebody catching on a baseball team. There are numerous plays that if you’re able to sort of look at what an athlete is seeing on the court or not seeing again, which is probably the more essential thing, that would have enormous consequences. 

Strong: Next episode, we wrap up our miniseries with a look at how face mapping is transforming the shopping experience. And spoiler alert – it goes way beyond just identifying who’s in the store.

Guive Balooch: In order to really be able to virtually try on makeup with augmented reality, you need to detect where the eye is and where the eyebrow is. And, um, it has to be at a level of accuracy that when the product’s on there, it doesn’t look like it’s not exactly on your lip. And people’s lips can vary in shape; the color between your skin tone and your lip can also be very different. And so you need to have an algorithm that can detect it and make sure it works.

Strong: This episode was reported and produced by me, Anthony Green, Tate Ryan-Mosley, Emma Cillekens and Karen Hao. We’re edited by Michael Reilly and Gideon Lichfield. Thanks for listening, I’m Jennifer Strong. 

[TR ID]