cam X is an iOS project written in Swift that I built for the ONVIF Open Source Spotlight Challenge.
It is a proof-of-concept project that demonstrates how Deep Learning, IPFS, and Blockchain can be applied together in practice.
cam X uses the FFmpeg library to decode and stream live video from cameras that use the ONVIF protocol, HTTP, RTSP, or an iOS device's built-in cameras. It is equipped with the Tiny YOLO and YOLO 2 deep learning object detection models as an on-device video analytics engine. Users can pick any object class to detect or to raise an alarm on, and manually choose which alarms to save to IPFS. The iOS device UUID is used as the key to store the alarm hash on the Ethereum Rinkeby test network via a simple lookup smart contract.
I posted an article on Medium that explains the project in detail: Deep Learning + IPFS + Ethereum Blockchain in practice.
- Run FFmpeg-iOS-build-script to build the FFmpeg library for iOS.
- Run 'pod update' in the root directory to install the ONVIFCamera library.
- Install Carthage and run 'carthage update --platform iOS' to install the web3swift and swift-ipfs-api libraries.
- Run the download.sh shell script to download the pre-trained Core ML models: Tiny YOLO and YOLO 2.
- Then you can build the project in Xcode. The app requires iOS 11.0 or later.
See the cam X page for detailed slides and screenshots. Alternatively, I made demo videos using network cameras and the iPad Pro rear-facing camera.