bIngAr

iOS POC · 2024 · Study Project · Team of 2

AI-powered bingo card recognition using Apple's machine learning frameworks

[Screenshots: six in-app views of bIngAr]

The Goal

bIngAr is a proof of concept built to explore Apple's on-device machine learning ecosystem — specifically CreateML, CoreML, and the Vision Framework.

Beyond the ML stack, we also explored the Speech Framework for voice interaction and worked through the full pipeline of building a dataset, training a model, and integrating it into a live iOS app — all without third-party dependencies.

What We Built

A bingo app that uses the device camera to detect physical bingo cards in real time, extracts numbers with the Vision Framework, and enables voice-controlled gameplay.

Technical Highlights

Card Detection

Custom CoreML model trained with CreateML to detect bingo cards through the device camera.
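A CreateML object-detection model plugs into Vision with only a few lines. The sketch below assumes a generated model class named `BingoCardDetector` and a confidence threshold of 0.8 — both hypothetical stand-ins, since the actual class name comes from the `.mlmodel` file and the real app's threshold may differ:

```swift
import Vision
import CoreML

// Run the custom detector over a single image and hand back the
// observations that look like bingo cards.
func detectCards(in image: CGImage,
                 completion: @escaping ([VNRecognizedObjectObservation]) -> Void) {
    guard
        // "BingoCardDetector" is a placeholder for the class Xcode
        // generates from the trained CreateML model.
        let coreMLModel = try? BingoCardDetector(configuration: MLModelConfiguration()).model,
        let visionModel = try? VNCoreMLModel(for: coreMLModel)
    else {
        completion([])
        return
    }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Object-detection models surface results as recognized objects
        // carrying a label and a normalized bounding box.
        let cards = (request.results as? [VNRecognizedObjectObservation] ?? [])
            .filter { $0.confidence > 0.8 }
        completion(cards)
    }

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```

The returned bounding boxes are in Vision's normalized, lower-left-origin coordinate space, so they need converting before being drawn over the camera preview.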

Number Recognition

Vision Framework reads numbers from physical cards and maps them to the in-app interface.
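Reading the printed numbers maps naturally onto Vision's text recognition. A minimal sketch, assuming `cardImage` is a crop of one detected card:

```swift
import Vision

// Recognize printed text on a cropped card image and keep only the
// strings that parse as integers.
func readNumbers(from cardImage: CGImage,
                 completion: @escaping ([Int]) -> Void) {
    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        let numbers = observations
            .compactMap { $0.topCandidates(1).first?.string }
            .compactMap { Int($0.trimmingCharacters(in: .whitespaces)) }
        completion(numbers)
    }
    request.recognitionLevel = .accurate
    // The card contains only digits, so language correction would
    // only "fix" them into words.
    request.usesLanguageCorrection = false

    let handler = VNImageRequestHandler(cgImage: cardImage, options: [:])
    try? handler.perform([request])
}
```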

Voice Recognition

Speech Framework integration allows players to mark numbers hands-free using the device microphone.
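Hands-free marking follows the standard Speech Framework pattern: tap the microphone through `AVAudioEngine`, stream buffers into a recognition request, and parse numbers out of each partial transcription. A hedged sketch (authorization prompts and error handling trimmed; the real app's wiring may differ):

```swift
import Speech
import AVFoundation

// Stream microphone audio into the speech recognizer and report any
// number the player says. Requires SFSpeechRecognizer.requestAuthorization
// and microphone permission before use.
final class VoiceMarker {
    private let recognizer = SFSpeechRecognizer()
    private let audioEngine = AVAudioEngine()
    private let request = SFSpeechAudioBufferRecognitionRequest()
    private var task: SFSpeechRecognitionTask?

    func start(onNumber: @escaping (Int) -> Void) throws {
        let input = audioEngine.inputNode
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            self.request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        task = recognizer?.recognitionTask(with: request) { result, _ in
            guard let text = result?.bestTranscription.formattedString else { return }
            // Spoken numbers usually come back as digits ("fifteen" -> "15"),
            // so a plain Int parse over the words is enough here.
            for word in text.split(separator: " ") {
                if let n = Int(word) { onNumber(n) }
            }
        }
    }
}
```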

Win Detection

An algorithm checks winning patterns and triggers haptic feedback and visual effects when a player wins.
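The pattern check itself is plain logic over the marked grid. A minimal sketch, assuming a 5×5 card and the classic patterns (rows, columns, both diagonals) — the app's actual pattern set may be richer:

```swift
// `marked` mirrors the 5x5 card layout: true means the number is marked.
func hasWon(_ marked: [[Bool]]) -> Bool {
    let n = 5
    for i in 0..<n {
        if (0..<n).allSatisfy({ marked[i][$0] }) { return true } // full row
        if (0..<n).allSatisfy({ marked[$0][i] }) { return true } // full column
    }
    if (0..<n).allSatisfy({ marked[$0][$0] }) { return true }         // diagonal
    if (0..<n).allSatisfy({ marked[$0][n - 1 - $0] }) { return true } // anti-diagonal
    return false
}
```

On a win, the celebration side is a one-liner with UIKit's `UINotificationFeedbackGenerator().notificationOccurred(.success)` for the haptic pulse, alongside whatever SwiftUI effects the app plays.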

Custom Dataset

Dataset of bingo card images collected, annotated, and augmented manually by the team.
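CreateML's object-detection trainer expects the annotations alongside the images as a JSON file, with each box given by its center point and size in pixels. The filenames and label below are illustrative, not the project's actual dataset:

```json
[
  {
    "image": "card_001.jpg",
    "annotations": [
      {
        "label": "bingo_card",
        "coordinates": { "x": 320, "y": 240, "width": 400, "height": 280 }
      }
    ]
  }
]
```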

Real-Time Detection

Live camera feed with on-device ML inference — no server or network requests required.
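Wiring the live feed to inference means handing each camera frame to Vision from the capture delegate. A sketch under the assumption that `detectionRequest` is the app's `VNCoreMLRequest` and the session is configured elsewhere:

```swift
import AVFoundation
import Vision

// Receives frames from AVCaptureVideoDataOutput and runs the Vision
// request on each one - entirely on-device, no network involved.
final class FrameProcessor: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private let detectionRequest: VNRequest

    init(request: VNRequest) {
        self.detectionRequest = request
    }

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        // Orientation must match how the camera delivers frames;
        // .right is typical for portrait capture on iPhone.
        let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer,
                                            orientation: .right,
                                            options: [:])
        try? handler.perform([detectionRequest])
    }
}
```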

Tools & Technologies

SwiftUI · CoreML · CreateML · Vision Framework · Speech Framework · MVVM Architecture · Xcode

Development Team

Jaide Fernando de Carvalho Zardin · Development

Luana Rafaela Gerber · Development

Open Source

The full source code is available on GitHub.
Explore how CoreML, Vision, and Speech frameworks come together in a single iOS app.