Most real-time object detection applications run in the cloud as Python-based web services. This requires users to send their images or videos over the network to the service, which can compromise the privacy and security of their data. With the advent of Core ML and the Neural Engine, Apple devices can now perform object detection directly on the iPhone. The models run entirely on the phone, without requiring any Wi-Fi or cellular connectivity.
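To make the on-device idea concrete, here is a minimal sketch of running a Core ML model through Apple's Vision framework in Swift. It assumes a model such as MobileNetV2 has been added to the Xcode project (Xcode generates the `MobileNetV2` class from the `.mlmodel` file); the function name and model choice are illustrative, not from the original post.

```swift
import CoreML
import Vision
import UIKit

/// Classifies a UIImage entirely on-device using a bundled Core ML model.
/// Assumes a MobileNetV2.mlmodel has been added to the Xcode project,
/// which auto-generates the `MobileNetV2` Swift class used below.
func classifyOnDevice(image: UIImage) {
    guard let ciImage = CIImage(image: image) else { return }

    // Wrap the compiled Core ML model for use with Vision.
    guard let coreMLModel = try? MobileNetV2(configuration: MLModelConfiguration()).model,
          let visionModel = try? VNCoreMLModel(for: coreMLModel) else { return }

    // The request runs inference and hands back classification observations.
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let results = request.results as? [VNClassificationObservation],
              let top = results.first else { return }
        print("Top prediction: \(top.identifier) (confidence \(top.confidence))")
    }

    // Perform the request locally — no image data leaves the device.
    let handler = VNImageRequestHandler(ciImage: ciImage)
    try? handler.perform([request])
}
```

Because inference happens in `VNImageRequestHandler` on the device itself, no network round-trip is involved, which is the privacy advantage described above.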
Imagine you are on a remote island admiring beautiful coral plants and want to identify their species. What better way to do it than with your iPhone?
I have experimented with Core ML and a few deep learning models, and have prepared easy-to-digest courses for you. These courses teach you machine learning and deep learning from the ground up.
If you would like to learn, step by step, how to perform object detection on an iPhone, please visit the links below:
Inception V3: https://www.udemy.com/course/building-ios-object-detection-app-with-inception-v3-ml-model/?referralCode=D58FE2F8676A8B65EE41
MobileNet V2: https://www.udemy.com/course/building-ios-object-detection-app-with-mobilenet-ml-model/?referralCode=B5A0BA11DEBFE7D1E9CB
SqueezeNet: