Smart monitoring target camera – case study
Abto Software



Publish Date: Nov 25 '24

Improving the usability of a shooting-range target application: the project from A to Z.

1. Project overview

This project focused on delivering a deep-dive analysis for a well-known company in the ammunition industry. The client specializes in creating gear for shooting enthusiasts, particularly target cameras that make long-range shooting more accurate and engaging.

Initially, Abto Software proposed introducing computer vision technology for accurate hole detection. However, after testing the current application thoroughly, our team discovered that the existing UI/UX design was overly complicated, even when using manual markers. Plus, it offered no flexibility for implementing advanced computer vision features.

Following a detailed analysis, we developed a new concept—“smart monitoring”. This approach aimed to fix critical design flaws, rethink inefficient features, and integrate cutting-edge technologies like CV-based solutions.

2. Main goals

The primary objective was to carefully examine the strengths and weaknesses of the legacy solution. To get accurate insights, our team replicated the real-world use of the target camera at a shooting range.

Abto Software’s key tasks included:

  • Conducting both preliminary and detailed investigations

  • Identifying issues and defining strategies

  • Designing a new concept

  • Building a proof of concept (POC)

3. Main challenges

The biggest challenge was understanding what end-users truly need and the problems they encounter. To solve this, we went beyond office-based research and conducted extensive field testing.

Initial roadblocks during the discovery phase

  1. Creating a realistic environment

    The legacy app couldn’t function without specific equipment. To simulate real conditions, we ordered four specialized cameras and set up a proper testing environment.

  2. Going beyond standard research methods

    Office-based simulations didn’t give us the depth of insight we wanted. So, we invested extra resources and time to carry out field tests in real shooting conditions.

This allowed us to pinpoint issues related to:

  • Timestamp placement

  • Hole marking accuracy

  • Target visibility and clarity

  • Shooting history tracking

By replicating actual use conditions, we uncovered several key problems:

  • Shooters had to log events manually, risking data loss if they were distracted.

  • Every new bullet hole needed manual marking, which was confusing and error-prone.

  • Dust clouds from ground shots were often missed because the shooter was focused on manual inputs.

  • No access to historical shooting data made progress tracking difficult.

  • A static display made it hard to tell real-time events from older screenshots.

  • Playback delays made it almost impossible to analyze precise hits.

  • Additional issues arose from audio-visual limitations and user-specific needs like left- or right-handed controls.

Our goal was to develop an interface that displayed every event clearly, eliminating these pain points.

4. Core functionality

The legacy app currently offers:

  • Live images from the target camera with a set FPS

  • Option to save images to the gallery

  • Video recording via device screen capture (limited by device resolution)

  • Manual timestamping

  • Frame switching to compare snapshots with real-time shots (blinker mode)

  • Distance settings for shooter-target calculation

While these features support both short- and long-range shooting, the user experience is far from optimal.

Our smart monitoring concept

Our redesigned concept focuses on automation and ease of use. Here’s what we proposed:

  • Event detection using computer vision and sound recognition:

    • Detects new holes by comparing frames and highlights differences.
    • Identifies ground shots and displays dust clouds on-screen.
    • Filters sound to recognize rifle shots and ignore irrelevant noises.
  • Enhanced recording experience:

    • Timeline-based playback for easy event review.
    • Users can replay key events like rifle shots and dust impacts.
    • Quick navigation to previous or next recorded events.
  • Historical data control:

    • Ability to manually mark holes if the system misses them.
    • Use color-coding for holes in case of detection errors.
    • Add text or voice notes for better record-keeping.
  • Accessibility features:

    • Optimized for both left- and right-handed shooters.
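The sound-recognition idea above can be illustrated with a minimal short-time energy gate: a rifle shot is a brief, high-energy transient that stands out from steady background noise. The sketch below is our own simplified illustration in Python with NumPy; the window size and threshold are placeholder values, not the tuned parameters of the actual product.

```python
import numpy as np

def detect_shot_windows(samples, sample_rate=44100, window_ms=20, rms_thresh=0.3):
    """Return onset times (seconds) of windows whose RMS energy exceeds a threshold.

    Steady background noise (wind, voices) stays below rms_thresh, while a
    rifle shot produces a short, loud spike. All parameter values here are
    illustrative placeholders.
    """
    win = int(sample_rate * window_ms / 1000)
    onsets = []
    for start in range(0, len(samples) - win + 1, win):
        chunk = samples[start:start + win]
        rms = float(np.sqrt(np.mean(chunk ** 2)))  # short-time energy of this window
        if rms > rms_thresh:
            onsets.append(start / sample_rate)
    return onsets
```

A real pipeline would add band-pass filtering and debouncing of adjacent windows, but the core principle of separating transient shot events from ambient noise is the same.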

At the same time, the system is designed to ignore irrelevant activity, such as:

  • Grass or target movements

  • Bugs, birds, or small animals

  • Other environmental distractions

5. Our contribution

After signing a mutual NDA, our team:

  1. Conducted a preliminary review of the legacy app.

  2. Performed hands-on testing at the shooting range.

  3. Developed a hypothesis aligned with project goals.

  4. Built a successful proof of concept.

For future collaboration, we plan to:

  • Redesign the UI/UX completely.

  • Reinterpret main features using computer vision and sound detection. This can either be a complete overhaul or a switchable mode in the app.

Tech stack:

  • Android SDK

  • Flutter

  • OpenCV

Timeline:

August 2022 – February 2023

Team composition:

  • 1 Business Analyst

  • 1 CV Engineer

  • 1 Mobile Developer

  • 1 UI/UX Designer

6. Business value

The discovery phase laid the foundation for future product development and improved business efficiency. By providing an external perspective, we were able to:

  • Highlight existing issues clearly.

  • Offer actionable strategies to fix them.

  • Identify growth opportunities.

Benefits for the client:

  • A fresh, independent viewpoint.

  • Advanced technical expertise.

  • A clear roadmap for problem-solving.

  • Detailed insights for product enhancement.

Impact for end-users:

  • Automatic event tracking saves time and improves precision.

  • Timeline playback makes reviewing progress easy.

  • Flexible event editing allows accurate historical data.

  • Event filtering removes unnecessary distractions like wind or moving targets.

Impact for the business:

  • Increased competitiveness: currently, no solution on the ammunition market combines computer vision and sound detection.

  • Higher sales: enhanced usability and added features attract more customers.

What’s next?

We believe the future lies in AI-powered solutions. By using advanced machine learning and computer vision, we can help shooters detect, record, and analyze every event with unmatched accuracy, eliminating manual effort.

FAQs

  1. What is a smart target camera?

    It’s an advanced system that uses computer vision and sound recognition to track bullet holes and events automatically during shooting sessions.

  2. Why was the legacy app not effective?

    It relied heavily on manual input, which was confusing, time-consuming, and error-prone.

  3. How does computer vision help in shooting practice?

    It detects new bullet holes, ground shots, and other critical events automatically, improving speed and accuracy.

  4. Can the new system filter irrelevant activity?

    Yes. It ignores unnecessary movements like grass, insects, or target sway.

  5. Will this technology benefit both professionals and hobbyists?

    Absolutely. It makes shooting more efficient for everyone, from beginners to experts.
