Customized video recognition platform for your business

Our product allows you to create a customized video recognition platform for your business with clear processes and an easy-to-use annotation system.

In this blog, we wrote that AI is for everyone, laying the foundations of what we envision building at Deepomatic. Today, we want to give you an update on the real use cases we work on daily.

Our Human-In-The-Loop (HITL) approach

Everyone should benefit from the huge progress made in AI over the last few years. That's why we want to empower anyone to create their own custom image recognition system. You should not have to struggle with network architectures or worry about GPU management. You should only focus on the problem you want to solve.

Therefore, we have designed a Human-In-The-Loop (HITL) approach. Our platform enables you to transfer, through annotation, the intelligence required to complete a given task. Data remains key throughout the process, and the quality of the annotations directly determines the accuracy of the AI you are building.
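To make the idea concrete, here is a minimal sketch of a human-in-the-loop cycle. The function names (`annotate`, `train`) are illustrative placeholders, not Deepomatic API calls: humans label a batch, the model is retrained on all labels gathered so far, and the cycle repeats.

```python
def human_in_the_loop(unlabeled, annotate, train, batch_size=100, rounds=3):
    """Alternate between human annotation and model training.

    unlabeled: list of image identifiers still waiting for a label
    annotate:  callable(image) -> label, standing in for a human annotator
    train:     callable(labeled dict) -> model, standing in for training
    """
    labeled = {}  # image -> human-provided label
    model = None
    for _ in range(rounds):
        # 1. A human annotates the next batch of images.
        batch, unlabeled = unlabeled[:batch_size], unlabeled[batch_size:]
        for image in batch:
            labeled[image] = annotate(image)
        # 2. The model is retrained on every label gathered so far,
        #    so each round of annotation improves the next model.
        model = train(labeled)
    return model, labeled
```

Each pass through the loop transfers a little more human knowledge into the model, which is the core of the HITL idea.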

The human-in-the-loop (HITL) approach.

Our annotation system: a balance between quality and workload

Through our annotation platform, you can either do the annotations yourself or simply review the ones performed by external agents.

Annotating images or videos is a very time-consuming process, yet you need a lot of annotated data to reach a high level of accuracy. That's why we also built a platform to accelerate this process, enabling you to train models on the fly and to mine images according to the confidence level the neural network assigns to them for the task being automated.

Depending on the accuracy you want to reach, once statistical significance has been achieved, the system starts automatically annotating the images it is most confident about, so you can focus your annotation effort where it creates the most value.
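The routing step above can be sketched as a simple triage: predictions above a confidence threshold are auto-annotated, the rest are queued for human review, least confident first. The threshold value and the prediction format here are assumptions for illustration, not the platform's actual parameters.

```python
def triage(predictions, threshold=0.95):
    """Split model predictions into auto-accepted labels and a review queue.

    predictions: dict mapping image id -> (label, confidence in [0, 1])
    threshold:   assumed confidence cutoff for auto-annotation
    """
    auto_annotated, needs_review = {}, []
    for image_id, (label, confidence) in predictions.items():
        if confidence >= threshold:
            auto_annotated[image_id] = label   # trust the model's label
        else:
            needs_review.append(image_id)      # send to a human annotator
    # Review the least confident images first: that is where a human
    # label adds the most information to the next training round.
    needs_review.sort(key=lambda i: predictions[i][1])
    return auto_annotated, needs_review
```

Raising the threshold trades annotation workload for quality: fewer images are auto-labeled, but the ones that are carry fewer errors.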

The whole process ensures a good balance between the quality of the AI you are continuously building and the workload that you will inevitably have to invest in the process.

Some examples of use cases

A marketplace company specializing in automotive parts is already using our platform to build classifiers that ensure the objects added to the website are correctly categorized, that their photos match their descriptions, and that those photos meet the company's quality standards.

Classified images: photos vs blueprints.

Classified images: left vs right rear-view mirrors.

We are also working with an industrial company that wants to automate the detection of defects in industrial parts during quality control processes. Our platform enables this company to develop its own set of classifiers based on the 9 types of defects they want to detect.

This is just the beginning, but we are already very excited to see what challenges and what problems we can help you solve in your organization.

Write to us to get an invite to the beta release of our platform.

©Deepomatic 2020 – Privacy Policy