
Computer Vision Meetup Roundup #3 2017

At the end of March, the meetup crowd gathered once again to talk and learn about interesting projects in the area of Computer Vision!

This time, Thomas Willomitzer talked about “Methods and Technologies in Computer Vision for Snapscreen”, which can briefly be described as “Shazam for TV”. After an intro, Thomas handed over to his colleague from Smart Engines, the company that collaborated with Snapscreen on building parts of the technology.

After that, we had a spontaneous product demo from Andreas Daniel Hartl, who presented a mobile Augmented Reality Framework with MRZ reading.


Thomas started his talk “How we built snapscreen” with the initial idea and how he came up with it. The original idea was to take a photo of a TV show with any kind of phone and send the photo to an email address. The reply should then contain contextual information about the TV program.

When they decided to execute the idea, they ran into several problems: screen detection, image matching, and gathering and storing TV data.

How snapscreen generally works: you snap a picture with your phone, and the picture is compared against a huge database of TV shows.
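Snapscreen's actual matching pipeline was not detailed in the talk, but the general idea of comparing a snapped frame against a database of stored frames can be sketched with a simple perceptual hash. The sketch below uses a difference hash (dHash) on a small grayscale grid; the grid size, function names, and matching threshold are illustrative assumptions, not Snapscreen's implementation.

```python
# Illustrative sketch only: dHash-style perceptual matching,
# NOT Snapscreen's actual (unpublished) algorithm.

def dhash(pixels):
    """Compute a 64-bit difference hash from a 9x8 grid of grayscale values.

    Each bit records whether a pixel is brighter than its right neighbour,
    which makes the fingerprint robust to global brightness changes.
    """
    bits = 0
    for row in pixels:                          # 8 rows
        for left, right in zip(row, row[1:]):   # 8 comparisons per row
            bits = (bits << 1) | (1 if left > right else 0)
    return bits

def hamming(a, b):
    """Number of differing bits; a small distance suggests the same frame."""
    return bin(a ^ b).count("1")

# Hypothetical usage: a snapped frame matches a stored frame when the
# Hamming distance between their hashes falls below some threshold.
frame = [[(x * y) % 256 for x in range(9)] for y in range(8)]
print(hamming(dhash(frame), dhash(frame)))  # identical frames → 0
```

In a real system, each broadcast frame would be hashed as it airs, and the snapped photo would first be cropped to the detected screen before hashing.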

Since snapscreen comes as an SDK and can be integrated into any kind of mobile app, the main use cases are TV apps (for getting information about the TV program you are watching), betting apps and sports apps.

Imagine sitting in a sports bar and watching a soccer game. You might want to bet on it. You could open the betting company’s app, snap a picture of the screen, and it takes you right to the correct bet. Pretty cool, right?

After a video and a short live demo, Thomas handed over to Vladimir Arlazarov from Smart Engines.

They are a Computer Vision company from Moscow that built certain parts of the technology for Snapscreen. Vladimir explained the technology behind Snapscreen and also showed other use cases such as vehicle classification or navigation detection.

Demo time: mobile Augmented Reality Framework!

In the second talk/demo, Andreas Daniel Hartl presented his own mobile Augmented Reality Framework with an MRZ demo.

He showed how his app could track documents such as passports. His prototype included MRZ detection, reading, tracking and augmentation. He then pointed out the difference from other Augmented Reality providers: his app doesn’t need a server connection.
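The MRZ (machine-readable zone) on a passport carries its own error checking, which is what makes on-device reading without a server feasible. The talk didn’t show Hartl’s implementation; the sketch below only illustrates one standard step of MRZ reading, the ICAO Doc 9303 check-digit validation applied to an OCR’d field.

```python
# Sketch of the ICAO Doc 9303 check-digit calculation for MRZ fields.
# This is a standard published algorithm, not Hartl's specific code.

def mrz_check_digit(field: str) -> int:
    """Weighted sum modulo 10 with weights 7, 3, 1 repeating.

    Digits count as themselves, letters A-Z as 10-35, and the
    filler character '<' as 0.
    """
    weights = (7, 3, 1)
    total = 0
    for i, ch in enumerate(field):
        if ch.isdigit():
            value = int(ch)
        elif ch.isalpha():
            value = ord(ch.upper()) - ord("A") + 10
        else:  # '<' filler
            value = 0
        total += value * weights[i % 3]
    return total % 10

# The sample document number from ICAO Doc 9303 has check digit 6:
print(mrz_check_digit("L898902C3"))  # → 6
```

An MRZ reader can use this to reject misrecognized characters immediately on the device, retrying OCR on the next camera frame instead of round-tripping to a server.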

Give a talk yourself!

Do you have a project or topic you would like to talk about, or do you know someone who would like to share their experience and knowledge at our Computer Vision Meetup? Please contact us!
It is great to see how our community is growing each month, so if you haven’t already – don’t forget to join our meetup group! ;)


If you have questions, suggestions or feedback on this, please don’t hesitate to reach out to us via Facebook, Twitter or simply via [email protected]! Cheers!