this post was submitted on 13 Sep 2023

Amazon Prime Video took a different approach during the offseason as it looked to improve its coverage of “Thursday Night Football.”

While most networks review film, visit colleges, or seek input from teams on different ways to broadcast games, Prime Video’s production, tech, and engineering teams visited Tel Aviv to meet with Prime Video’s Computer Vision Machine Learning team.

The goal was to find ways to help fans understand and enjoy football better, said Betsy Riley, Prime Video’s senior coordinating producer of live events.

“I can’t say that I’m able to diagram plays like Vince Lombardi but we spent some time on the whiteboards talking about football, diagramming plays, and helping that team understand how we see the game,” Riley said. “It was neat to be in a room mashing on ideas and having these varied perspectives to find solutions.”

Riley told TNF analytics expert Sam Schwartzstein to think big when developing ideas. Schwartzstein worked with the XFL during its 2020 relaunch, overseeing everything from the rule book to how players and coaches were paid. He said this was a big step up in machine-based insight compared to the rules-based insight he was working with three years ago.

The first result of this collaboration is giving viewers a better understanding of who might blitz the quarterback. Defensive alerts will be on Prime Vision with Next Gen Stats, one of “Thursday Night Football's” alternate streams.

“It all starts with these foundational storytelling questions. What if we could show our fans what the quarterback’s looking at? And so that became an answer to that question,” Riley said.

Defensive alerts are an AI-powered feature that identifies, in real time, the players with the best odds of rushing the quarterback. It tracks player movements before the snap; via machine learning and custom logic, a highlighted circle appears under each player considered a potential pass rusher.

Tracking for all players is done by RFID chips in their shoulder pads.

The data comes from the league’s Next Gen Stats, which tracks all players and provides analytics data to the NFL. Amazon Web Services has been the NFL’s official technology provider in developing Next Gen Stats since 2017.
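The pipeline described above — pre-snap tracking data in, highlighted potential rushers out — can be sketched roughly as follows. This is a toy illustration, not Amazon's system: the `TrackedDefender` schema, the fixed weights, and the 0.5 threshold are all invented for the example, whereas the real feature uses trained machine-learning models over the Next Gen Stats tracking feed.

```python
from dataclasses import dataclass

@dataclass
class TrackedDefender:
    """One defender's pre-snap state from RFID tracking (hypothetical schema)."""
    player_id: str
    depth_yds: float      # distance behind the line of scrimmage
    lateral_yds: float    # lateral distance from the ball
    speed_yds_s: float    # instantaneous speed toward the line

def blitz_score(d: TrackedDefender) -> float:
    """Toy heuristic: defenders who are close to the line and creeping
    toward it score higher. A real system learns these weights from film
    and tracking data; the constants here are made up."""
    score = 0.0
    score += max(0.0, 5.0 - d.depth_yds) * 0.15   # near the line
    score += max(0.0, d.speed_yds_s) * 0.10       # moving toward it
    score -= min(d.lateral_yds, 10.0) * 0.02      # wide defenders less likely
    return max(0.0, min(1.0, score))              # clamp to [0, 1]

def potential_rushers(defenders, threshold=0.5):
    """Return the player IDs to highlight, mimicking the on-screen circle."""
    return [d.player_id for d in defenders if blitz_score(d) >= threshold]
```

A creeping linebacker two yards off the ball would be flagged by this sketch, while a deep safety would not — which is the distinction the broadcast feature is drawing.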

In identifying the players for the defensive alerts, Schwartzstein had to ensure the scientists had a three-dimensional view of the data. Film shows whether a player is in a three-point stance, as opposed to an edge rusher lined up and ready to blitz.

“When we went to the film, it’s like, ‘Oh yeah, that’s a big guy. We don’t need to highlight the big guy. He’s in a three-point stance.’ It’s like we lived in this data world for so long. It’s still about football and what you see on the screen,” he said. “That’s how we’re trying to translate what we’re able to do from a huge data set prediction.”

Schwartzstein, who started 23 games at center while playing for Stanford from 2008 to 2012, was surprised by how quickly the machine could identify blitzing players.

“Keeping Andrew Luck upright was my entire job (in college). Part of it was identifying the unique players who are going to rush, which is what the model is doing. And for them to do it nearly instantly and to be better than me was kind of a shot to my pride, but also just so excited that we’re going to reveal in real time something to viewers that is over the level of what NFL-caliber players are doing right now.”

The Prime Vision with Next Gen Stats feed uses an all-22 coach’s camera angle but includes announcers Al Michaels and Kirk Herbstreit and sideline reports from Kaylee Hartung.

Defensive alerts remain in final testing, but viewers will be able to see other AI-driven elements during Thursday night’s game between the Philadelphia Eagles and Minnesota Vikings.

Viewers will be able to see the route trees receivers are running, along with highlights of which receivers are open and their chances of converting a first down.
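A minimal sketch of what "open" could mean in tracking terms is separation from the nearest defender. The 3-yard threshold and the function names here are assumptions for illustration, not the broadcast's actual definition:

```python
import math

def separation_yds(receiver_xy, defender_xys):
    """Distance from a receiver to the nearest defender (toy 'openness')."""
    rx, ry = receiver_xy
    return min(math.hypot(rx - dx, ry - dy) for dx, dy in defender_xys)

def is_open(receiver_xy, defender_xys, threshold_yds=3.0):
    """Hypothetical rule: highlight a receiver with over 3 yds of separation."""
    return separation_yds(receiver_xy, defender_xys) > threshold_yds
```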

On third-down plays, a line will appear on the field showing where the offense needs to advance the ball to receive a firm “go for it” recommendation on fourth down.
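One way to picture how such a line could be computed: find the largest fourth-down distance at which "go for it" would still be firmly recommended for the current field position, then draw the line that far beyond the first-down marker. The threshold table below is a toy stand-in for a real win-probability model; every number is illustrative.

```python
def go_for_it_threshold_yds(yardline_to_endzone: int) -> float:
    """Toy model: max fourth-down distance at which 'go for it' is still
    a firm recommendation, by field position. Real broadcasts derive this
    from win-probability models; these cutoffs are made up."""
    if yardline_to_endzone <= 40:      # scoring range: be aggressive
        return 4.0
    elif yardline_to_endzone <= 60:
        return 2.0
    else:                              # deep in own territory
        return 1.0

def line_position(ball_to_endzone: int, first_down_to_endzone: int) -> float:
    """Spot (in yds from the end zone) where the on-screen line is drawn:
    reaching it on third down leaves a fourth-and-short that is a firm 'go'."""
    return first_down_to_endzone + go_for_it_threshold_yds(ball_to_endzone)
```

For example, with the ball 45 yards out and the marker 37 yards out, this sketch draws the line at the 39: reach it, and the resulting fourth-and-2 rates a firm "go."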

Near the end of each half, the feed will overlay on the field the probability of the kicker making a field goal from various distances.
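Distance-based make probability is commonly modeled with a logistic curve. Here is a hedged sketch with made-up coefficients, not the broadcast's fitted model (which would also account for the individual kicker, weather, and stadium):

```python
import math

def fg_make_probability(distance_yds: float) -> float:
    """Toy logistic curve for field-goal make probability by attempt
    distance. The slope (0.18) and midpoint (55 yds) are illustrative."""
    return 1.0 / (1.0 + math.exp(0.18 * (distance_yds - 55.0)))
```

With these invented coefficients the curve is near-certain at chip-shot range, crosses 50% at 55 yards, and falls off steeply beyond that — the qualitative shape such an overlay would display.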

All the Next Gen features are driven by a neural network that improves as it is trained on more plays.

“I think Prime Vision is an area that we like for us because we know it resonates with a specific subset of fans already. It also serves as an incubator for us to explore new technology, push the envelope on analytics, utilize computer vision, machine learning, and see what we can develop,” said Jared Stacy, Amazon’s director of global live sports production.

[email protected] · 1 point · 1 year ago

Purists didn't like it, but it led to a rise in viewership. It was also a disaster because the pucks and cameras were very expensive, and many stadiums still had live camera operators. I was an intern at the Spectrum and the brand-new CoreStates Center, which had been completed that same year and already required structural renovations to support the new automated camera systems.

It died because Fox owned the technology and then lost the broadcast rights to ABC. Modern pucks use a similar IR system for tracking; they just don't add the blue onscreen effect.