AI-powered bingo card recognition using Apple's machine learning frameworks
bIngAr is a proof of concept built to explore Apple's on-device machine learning ecosystem — specifically CreateML, CoreML, and Vision Framework.
Beyond the ML stack, we also explored the Speech Framework for voice interaction and worked through the full pipeline of building a dataset, training a model, and integrating it into a live iOS app — all without third-party dependencies.
A bingo app that uses the device camera to detect physical bingo cards in real time, extract numbers using the Vision framework, and enable voice-controlled gameplay.
Custom CoreML model trained with CreateML to detect bingo cards through the device camera.
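Wiring a CreateML-trained detector into the live camera pipeline typically goes through Vision's CoreML bridge. A minimal sketch, assuming a hypothetical model class named `BingoCardDetector` and a 0.8 confidence cutoff (the app's actual model name and thresholds may differ):

```swift
import Vision
import CoreML

// Sketch: wrap a custom CreateML object-detection model in a Vision request.
// "BingoCardDetector" is a hypothetical model name, not necessarily the app's.
func makeCardDetectionRequest() throws -> VNCoreMLRequest {
    let coreMLModel = try BingoCardDetector(configuration: MLModelConfiguration()).model
    let visionModel = try VNCoreMLModel(for: coreMLModel)
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Each observation is a candidate bingo card with a bounding box.
        guard let cards = request.results as? [VNRecognizedObjectObservation] else { return }
        for card in cards where card.confidence > 0.8 {
            print("Card detected at \(card.boundingBox)")
        }
    }
    request.imageCropAndScaleOption = .scaleFill
    return request
}
```

The request can then be performed per camera frame with a `VNImageRequestHandler` created from the frame's pixel buffer.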
Vision Framework reads numbers from physical cards and maps them to the in-app interface.
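Number extraction can be done with Vision's built-in text recognizer, no custom model required. A sketch under the assumption that a cropped card image is already available; the filtering to the standard 1–75 bingo range is an illustrative choice, not necessarily the app's exact logic:

```swift
import Vision

// Sketch: read printed numbers from a card image with VNRecognizeTextRequest.
func recognizeNumbers(in cgImage: CGImage, completion: @escaping ([Int]) -> Void) {
    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        let numbers = observations
            .compactMap { $0.topCandidates(1).first?.string } // best guess per region
            .compactMap { Int($0) }                            // keep numeric strings only
            .filter { (1...75).contains($0) }                  // standard bingo range
        completion(numbers)
    }
    request.recognitionLevel = .accurate
    request.usesLanguageCorrection = false // digits only, no dictionary needed

    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}
```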
Speech Framework integration allows players to mark numbers hands-free using the device microphone.
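Hands-free marking maps roughly onto `SFSpeechRecognizer` streaming transcription from the microphone. A minimal sketch, assuming an English locale and a hypothetical `markNumber(_:)` hook into the game state (permission prompts and audio-session setup are omitted):

```swift
import Speech
import AVFoundation

// Sketch: stream microphone audio into the Speech framework and pull
// spoken numbers out of the running transcript.
final class VoiceMarker {
    private let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US"))
    private let audioEngine = AVAudioEngine()
    private var request: SFSpeechAudioBufferRecognitionRequest?

    func start(markNumber: @escaping (Int) -> Void) throws {
        let request = SFSpeechAudioBufferRecognitionRequest()
        request.shouldReportPartialResults = true
        self.request = request

        let input = audioEngine.inputNode
        input.installTap(onBus: 0, bufferSize: 1024,
                         format: input.outputFormat(forBus: 0)) { buffer, _ in
            request.append(buffer)
        }
        audioEngine.prepare()
        try audioEngine.start()

        recognizer?.recognitionTask(with: request) { result, _ in
            guard let spoken = result?.bestTranscription.formattedString else { return }
            // Speech usually transcribes spoken numbers as digits ("42"),
            // so take the last numeric token as the number to mark.
            if let number = spoken.split(separator: " ").compactMap({ Int($0) }).last {
                markNumber(number)
            }
        }
    }
}
```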
An algorithm checks winning patterns and triggers haptic feedback and visual effects when a player wins.
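The pattern check reduces to scanning rows, columns, and diagonals of the marked grid. A sketch assuming the app stores a card as a 5×5 Boolean grid with the centre free space pre-marked; the type and property names are illustrative:

```swift
// Sketch: win detection on a 5x5 card. On a hit the app would trigger
// haptics (e.g. UINotificationFeedbackGenerator) and visual effects.
struct BingoCard {
    var marked: [[Bool]] // 5 rows x 5 columns

    func hasWin() -> Bool {
        let n = 5
        for i in 0..<n {
            if (0..<n).allSatisfy({ marked[i][$0] }) { return true } // row i
            if (0..<n).allSatisfy({ marked[$0][i] }) { return true } // column i
        }
        if (0..<n).allSatisfy({ marked[$0][$0] }) { return true }         // main diagonal
        if (0..<n).allSatisfy({ marked[$0][n - 1 - $0] }) { return true } // anti-diagonal
        return false
    }
}
```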
Dataset of bingo card images collected, annotated, and augmented manually by the team.
Live camera feed with on-device ML inference — no server or network requests required.
Development
The full source code is available on GitHub.
Explore how CoreML, Vision, and Speech frameworks come together in a single iOS app.