A Flutter widget that shows the camera stream and allows ML Vision recognition on it. It lets you detect barcodes, labels, text, faces, and more.

Overview

Flutter Camera Ml Vision

pub package

A Flutter package for iOS and Android to show a preview of the camera and detect things with Firebase ML Vision.

Installation

First, add flutter_camera_ml_vision as a dependency.

dependencies:
  flutter:
    sdk: flutter
  flutter_camera_ml_vision: ^2.2.4
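After adding the dependency, fetch the package with the standard Flutter tooling command:

```shell
# Resolve and download the dependencies declared in pubspec.yaml
flutter pub get
```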

Configure Firebase

You must also configure Firebase for each platform project: Android and iOS (see the example folder or https://codelabs.developers.google.com/codelabs/flutter-firebase/#4 for step by step details).

iOS

Add two rows to the ios/Runner/Info.plist:

  • one with the key Privacy - Camera Usage Description and a usage description;
  • one with the key Privacy - Microphone Usage Description and a usage description.

Or, in text format, add the keys:
<key>NSCameraUsageDescription</key>
<string>Can I use the camera please?</string>
<key>NSMicrophoneUsageDescription</key>
<string>Can I use the mic please?</string>

If you're using one of the on-device APIs, include the corresponding ML Kit library model in your Podfile. Then run pod update in a terminal within the same directory as your Podfile.

pod 'Firebase/MLVisionBarcodeModel'
pod 'Firebase/MLVisionFaceModel'
pod 'Firebase/MLVisionLabelModel'
pod 'Firebase/MLVisionTextModel'

Android

Change the minimum Android sdk version to 21 (or higher) in your android/app/build.gradle file.

minSdkVersion 21

Note: this is required by the dependency on the camera plugin.

If you're using the on-device LabelDetector, include the latest matching ML Kit: Image Labeling dependency in your app-level build.gradle file.

android {
    dependencies {
        // ...

        api 'com.google.firebase:firebase-ml-vision-image-label-model:19.0.0'
    }
}

If you receive compilation errors, try an earlier version of ML Kit: Image Labeling.

Optional but recommended: If you use the on-device API, configure your app to automatically download the ML model to the device after your app is installed from the Play Store. To do so, add the following declaration to your app's AndroidManifest.xml file:

<application ...>
  ...
  <meta-data
    android:name="com.google.firebase.ml.vision.DEPENDENCIES"
    android:value="ocr" />
  <!-- To use multiple models: android:value="ocr,label,barcode,face" -->
</application>

Usage

1. Example with Barcode

CameraMlVision<List<Barcode>>(
  detector: FirebaseVision.instance.barcodeDetector().detectInImage,
  onResult: (List<Barcode> barcodes) {
    if (!mounted || resultSent) {
      return;
    }
    resultSent = true;
    Navigator.of(context).pop<Barcode>(barcodes.first);
  },
)

CameraMlVision is a widget that shows the camera preview. It takes a detector as a parameter; here we pass the detectInImage method of the BarcodeDetector object. The detector parameter accepts any of the FirebaseVision detectors. Here is the list:

FirebaseVision.instance.barcodeDetector().detectInImage
FirebaseVision.instance.cloudLabelDetector().detectInImage
FirebaseVision.instance.faceDetector().processImage
FirebaseVision.instance.labelDetector().detectInImage
FirebaseVision.instance.textRecognizer().processImage

Then, when something is detected, the onResult callback is called with the detected data as its argument.
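The same pattern works with the other detectors. The sketch below uses the face detector; the widget and detector names come from this package's API, but the surrounding state field `_faceCount` is a hypothetical example, not part of the package:

```dart
// Sketch: reacting to face detection results with CameraMlVision.
// _faceCount is a hypothetical field on the enclosing State class.
CameraMlVision<List<Face>>(
  detector: FirebaseVision.instance.faceDetector().processImage,
  onResult: (List<Face> faces) {
    if (!mounted) {
      return;
    }
    setState(() {
      _faceCount = faces.length; // 0 when no face is in view
    });
  },
)
```

An empty list (rather than null) is what you should expect when nothing is detected, so `faces.isEmpty` is the natural "no faces" check.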

Exposed functionality from CameraController

We expose some functionality of the underlying CameraController class. Here is the list:

  • value
  • prepareForVideoRecording
  • startVideoRecording
  • stopVideoRecording
  • takePicture
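These members can be reached through a GlobalKey on the widget's state, as in the sketch below (the key name and the `_capture` helper are illustrative, not part of the package; the `takePicture(path)` signature matches the one used in the issue reports further down):

```dart
// Hypothetical sketch: calling the exposed takePicture via a GlobalKey.
final _scanKey = GlobalKey<CameraMlVisionState>();

// Pass the key to the widget: CameraMlVision(key: _scanKey, ...)

Future<void> _capture(String path) async {
  // takePicture is forwarded to the underlying CameraController.
  await _scanKey.currentState.takePicture(path);
}
```

Note that the camera must be fully initialized before calling takePicture, otherwise the underlying CameraController throws a CameraException (see the "_cameraController intializer takePicture" comment below for a report of exactly this).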

Getting Started

See the example directory for a complete sample app.

Features and bugs

Please file feature requests and bugs at the issue tracker.

Technical Support

For any technical support, don't hesitate to contact us. You can find more information on our website.

Note that issues with label support are currently outside the scope of this project; you can contact us directly for support on those.

Comments
  • takePicture() Method does not work


    Hi there, I want to add an image-capturing feature to the camera, using the CameraMlVisionState via a global key:

    onTap: () async {
      final path = join(
        (await getTemporaryDirectory()).path,
        '${DateTime.now()}.png',
      );
      await _scanKey.currentState.takePicture(path);
      Navigator.push(
        context,
        MaterialPageRoute(
          builder: (context) => DisplayPicture(imagePath: path),
        ),
      );
    },

    The problem is that it doesn't take a picture, and it doesn't even navigate to the DisplayPage. When I removed the takePicture line, it navigated normally to the DisplayPage, but of course it showed a white screen since no photo was captured. Is there a solution, or did I just write wrong code? The code is actually similar to the Flutter cookbook page that gives an example of capturing an image from the camera: https://flutter.dev/docs/cookbook/plugins/picture-using-camera.

    need more info 
    opened by Loopex2019 30
  • Can not install The Camera Package while using the Camera ML Package


    Whenever I install the camera package, an error occurs related to the Camera ML package:

    Because flutter_camera_ml_vision 1.2.0 depends on camera from git and no versions of flutter_camera_ml_vision match >1.2.0 <2.0.0, flutter_camera_ml_vision ^1.2.0 requires camera from git.

    So, because looper_2019 depends on both flutter_camera_ml_vision ^1.2.0 and camera from hosted, version solving failed. pub get failed (1) exit code 1

    opened by Loopex2019 11
  • Face Detection not working (not about the Ml Package)


    Hi there. I tried face detection in my app, but every time I run it, it produces an error: "DynamiteModule(13840): Local module descriptor class for com.google.android.gms.vision.dynamite.face not found." I searched for that error and found that a common cause is low storage, but I checked my storage and I actually had more than 1 GB free. It still shows that error. I know the error is not coming from the Camera ML package, but can someone please help me with this? I need it badly.

    opened by Loopex2019 10
  • _cameraController intializer takePicture


    Fixed: await camera initialization before the takePicture method is called; otherwise the following exception was thrown: Exception has occurred. CameraException (CameraException(error, CaptureRequest contains unconfigured Input/Output Surface!))

    Fixes for issue: https://github.com/rushio-consulting/flutter_camera_ml_vision/issues/62#

    opened by imasif 8
  • Do not work if you turn off the phone screen


    Hello Do not work if you turn off the phone screen, and turn it on. (onStop onResume) The camera is working, but does not recognize barcodes. 2019-05-23 23:34:42.671 30661-31098/com.lan4.lan4 E/libc: Access denied finding property "vendor.camera.aux.packagelist" 2019-05-23 23:34:42.655 30661-30661/com.lan4.lan4 W/Binder:30661_5: type=1400 audit(0.0:13034): avc: denied { read } for name="u:object_r:vendor_camera_prop:s0" dev="tmpfs" ino=20626 scontext=u:r:untrusted_app:s0:c107,c257,c512,c768 tcontext=u:object_r:vendor_camera_prop:s0 tclass=file permissive=0 ppid=783 pcomm="main" pgid=30661 pgcomm="com.lan4.lan4" 2019-05-23 23:34:42.682 30661-31098/com.lan4.lan4 E/libc: Access denied finding property "vendor.camera.aux.packagelist" 2019-05-23 23:34:42.691 30661-30684/com.lan4.lan4 I/flutter: @@@ barcodes.length=0 2019-05-23 23:34:42.830 30661-31098/com.lan4.lan4 E/libc: Access denied finding property "vendor.camera.aux.packagelist" 2019-05-23 23:34:42.882 30661-31098/com.lan4.lan4 E/libc: Access denied finding property "vendor.camera.aux.packagelist" 2019-05-23 23:34:47.045 30661-30661/com.lan4.lan4 W/com.lan4.lan4: type=1400 audit(0.0:13045): avc: denied { read } for name="u:object_r:vendor_camera_prop:s0" dev="tmpfs" ino=20626 scontext=u:r:untrusted_app:s0:c107,c257,c512,c768 tcontext=u:object_r:vendor_camera_prop:s0 tclass=file permissive=0 ppid=783 pcomm="main" pgid=783 pgcomm="main" 2019-05-23 23:34:47.062 30661-30661/com.lan4.lan4 E/libc: Access denied finding property "persist.vendor.camera.privapp.list" 2019-05-23 23:34:47.087 30661-31098/com.lan4.lan4 E/libc: Access denied finding property "vendor.camera.aux.packagelist" 2019-05-23 23:34:47.087 30661-30675/com.lan4.lan4 E/libc: Access denied finding property "vendor.camera.aux.packagelist" 2019-05-23 23:34:47.100 30661-31098/com.lan4.lan4 E/libc: Access denied finding property "vendor.camera.aux.packagelist" 2019-05-23 23:34:47.100 30661-31098/com.lan4.lan4 E/libc: Access denied finding property 
"vendor.camera.aux.packagelist" 2019-05-23 23:34:47.107 30661-30661/com.lan4.lan4 E/libc: Access denied finding property "persist.vendor.camera.privapp.list"

    Sony XZ1, Android 9.0 flutter_camera_ml_vision: 2.1.0

    CameraMlVision scannerWidget() {
      var camera;
      if (camera == null) {
        camera = CameraMlVision<List>(
          key: _scanKey,
          overlayBuilder: (c) {
            return Container(
              decoration: ShapeDecoration(
                shape: _ScannerOverlayShape(
                  borderColor: Theme.of(context).primaryColor,
                  borderWidth: 3.0,
                ),
              ),
            );
          },
          detector: FirebaseVision.instance.barcodeDetector().detectInImage,
          onResult: (List barcodes) {
            print("@@@ barcodes.length=" + barcodes.length.toString());

            if (barcodes == null ||
                barcodes.isEmpty ||
                barcodes.contains(barcodes.first.displayValue) ||
                !mounted) {
              return;
            }

            checkTicket(barcodes.first.displayValue);
            print("@@@ barcode=" + barcodes.first.displayValue);

            //resultSent = true;
            //Navigator.of(context).pop<Barcode>(barcodes.first);
          },
        );
      }
      return camera;
    }

    need more info 
    opened by xalabax 8
  • Access denied finding property

    Access denied finding property "vendor.camera.aux.packagelist"

    Hi guys, I'm getting this error when opening the camera preview

    E/libc (25094): Access denied finding property "vendor.camera.aux.packagelist" W/om.example.hits(25094): type=1400 audit(0.0:8817): avc: denied { read } for name="u:object_r:vendor_camera_prop:s0" dev="tmpfs" ino=3997 scontext=u:r:untrusted_app:s0:c54,c257,c512,c768 tcontext=u:object_r:vendor_camera_prop:s0 tclass=file permissive=0 E/libc (25094): Access denied finding property "persist.vendor.camera.privapp.list" W/om.example.hits(25094): type=1400 audit(0.0:8818): avc: denied { read } for name="u:object_r:vendor_camera_prop:s0" dev="tmpfs" ino=3997 scontext=u:r:untrusted_app:s0:c54,c257,c512,c768 tcontext=u:object_r:vendor_camera_prop:s0 tclass=file permissive=0 W/Binder:25094_6(25094): type=1400 audit(0.0:8819): avc: denied { read } for name="u:object_r:vendor_camera_prop:s0" dev="tmpfs" ino=3997 scontext=u:r:untrusted_app:s0:c54,c257,c512,c768 tcontext=u:object_r:vendor_camera_prop:s0 tclass=file permissive=0 E/libc (25094): Access denied finding property "vendor.camera.aux.packagelist" E/libc (25094): Access denied finding property "vendor.camera.aux.packagelist" W/System (25094): A resource failed to call release. I/flutter (25094): Bad state: No element, #0 List.first (dart:core/runtime/libgrowable_array.dart:216:5) I/flutter (25094): #1 _MLScannerPageState.build.. package:hits/mlScannerTest.dart:40 I/flutter (25094): #2 State.setState package:flutter/…/widgets/framework.dart:1122 I/flutter (25094): #3 _MLScannerPageState.build. package:hits/mlScannerTest.dart:39 I/flutter (25094): #4 CameraMlVisionState._processImage (package:flutter_camera_ml_vision/flutter_camera_ml_vision.dart) I/flutter (25094): I/flutter (25094): #5 CameraController.startImageStream. 
package:camera/camera.dart:367 I/flutter (25094): #6 _rootRunUnary (dart:async/zone.dart:1132:38) I/flutter (25094): #7 _CustomZone.runUnary (dart:async/zone.dart:1029:19) I/flutter (25094): #8 _CustomZone.runUnaryGuarded (dart:async/zone.dart:931:7) I/flutter (25094): #9 _BufferingStreamSubscription._sendData (dart:async/stream_impl.dart:336:11) I/flutter (25094): #10 _DelayedData.perform (dart:async/stream_impl.dart:591:14) I/flutter (25094): #11 _StreamImplEvents.handleNext (dart:async/stream_impl.dart:707:11) I/flutter (25094): #12 _PendingEvents.schedule. (dart:async/stream_impl.dart:667:7) I/flutter (25094): #13 _rootRun (dart:async/zone.dart:1120:38) I/flutter (25094): #14 _CustomZone.run (dart:async/zone.dart:1021:19) I/flutter (25094): #15 _CustomZone.runGuarded (dart:async/zone.dart:923:7) I/flutter (25094): #16 _CustomZone.bindCallbackGuarded. (dart:async/zone.dart:963:23) I/flutter (25094): #17 _rootRun (dart:async/zone.dart:1124:13) I/flutter (25094): #18 _CustomZone.run (dart:async/zone.dart:1021:19) I/flutter (25094): #19 _CustomZone.runGuarded (dart:async/zone.dart:923:7) I/flutter (25094): #20 _CustomZone.bindCallbackGuarded. (dart:async/zone.dart:963:23) I/flutter (25094): #21 _microtaskLoop (dart:async/schedule_microtask.dart:41:21) I/flutter (25094): #22 _startMicrotaskLoop (dart:async/schedule_microtask.dart:50:5) I/flutter (25094): W/System (25094): A resource failed to call release. I/flutter (25094): Bad state: No element, #0 List.first (dart:core/runtime/libgrowable_array.dart:216:5) I/flutter (25094): #1 _MLScannerPageState.build.. package:hits/mlScannerTest.dart:40 I/flutter (25094): #2 State.setState package:flutter/…/widgets/framework.dart:1122 I/flutter (25094): #3 _MLScannerPageState.build. package:hits/mlScannerTest.dart:39 I/flutter (25094): #4 CameraMlVisionState._processImage (package:flutter_camera_ml_vision/flutter_camera_ml_vision.dart) I/flutter (25094): I/flutter (25094): #5 CameraController.startImageStream. 
package:camera/camera.dart:367 I/flutter (25094): #6 _rootRunUnary (dart:async/zone.dart:1132:38) I/flutter (25094): #7 _CustomZone.runUnary (dart:async/zone.dart:1029:19) I/flutter (25094): #8 _CustomZone.runUnaryGuarded (dart:async/zone.dart:931:7) I/flutter (25094): #9 _BufferingStreamSubscription._sendData (dart:async/stream_impl.dart:336:11) I/flutter (25094): #10 _DelayedData.perform (dart:async/stream_impl.dart:591:14) I/flutter (25094): #11 _StreamImplEvents.handleNext (dart:async/stream_impl.dart:707:11) I/flutter (25094): #12 _PendingEvents.schedule. (dart:async/stream_impl.dart:667:7) I/flutter (25094): #13 _rootRun (dart:async/zone.dart:1120:38) I/flutter (25094): #14 _CustomZone.run (dart:async/zone.dart:1021:19) I/flutter (25094): #15 _CustomZone.runGuarded (dart:async/zone.dart:923:7) I/flutter (25094): #16 _CustomZone.bindCallbackGuarded. (dart:async/zone.dart:963:23) I/flutter (25094): #17 _rootRun (dart:async/zone.dart:1124:13) I/flutter (25094): #18 _CustomZone.run (dart:async/zone.dart:1021:19) I/flutter (25094): #19 _CustomZone.runGuarded (dart:async/zone.dart:923:7) I/flutter (25094): #20 _CustomZone.bindCallbackGuarded. 
(dart:async/zone.dart:963:23) I/flutter (25094): #21 _microtaskLoop (dart:async/schedule_microtask.dart:41:21) I/flutter (25094): #22 _startMicrotaskLoop (dart:async/schedule_microtask.dart:50:5) I/flutter (25094): E/libc (25094): Access denied finding property "vendor.camera.aux.packagelist" W/Binder:25094_7(25094): type=1400 audit(0.0:8821): avc: denied { read } for name="u:object_r:vendor_camera_prop:s0" dev="tmpfs" ino=3997 scontext=u:r:untrusted_app:s0:c54,c257,c512,c768 tcontext=u:object_r:vendor_camera_prop:s0 tclass=file permissive=0 E/libc (25094): Access denied finding property "vendor.camera.aux.packagelist" W/Binder:25094_7(25094): type=1400 audit(0.0:8822): avc: denied { read } for name="u:object_r:vendor_camera_prop:s0" dev="tmpfs" ino=3997 scontext=u:r:untrusted_app:s0:c54,c257,c512,c768 tcontext=u:object_r:vendor_camera_prop:s0 tclass=file permissive=0 E/EventChannel#plugins.flutter.io/camera/imageStream(25094): Failed to close event stream E/EventChannel#plugins.flutter.io/camera/imageStream(25094): java.lang.NullPointerException: Attempt to invoke virtual method 'void android.media.ImageReader.setOnImageAvailableListener(android.media.ImageReader$OnImageAvailableListener, android.os.Handler)' on a null object reference E/EventChannel#plugins.flutter.io/camera/imageStream(25094): at io.flutter.plugins.camera.CameraHandler$Camera$9.onCancel(CameraHandler.java:827) E/EventChannel#plugins.flutter.io/camera/imageStream(25094): at io.flutter.plugin.common.EventChannel$IncomingStreamRequestHandler.onCancel(EventChannel.java:194) E/EventChannel#plugins.flutter.io/camera/imageStream(25094): at io.flutter.plugin.common.EventChannel$IncomingStreamRequestHandler.onMessage(EventChannel.java:162) E/EventChannel#plugins.flutter.io/camera/imageStream(25094): at io.flutter.view.FlutterNativeView$PlatformMessageHandlerImpl.handleMessageFromDart(FlutterNativeView.java:188) E/EventChannel#plugins.flutter.io/camera/imageStream(25094): at 
io.flutter.embedding.engine.FlutterJNI.handlePlatformMessage(FlutterJNI.java:202) E/EventChannel#plugins.flutter.io/camera/imageStream(25094): at android.os.MessageQueue.nativePollOnce(Native Method) E/EventChannel#plugins.flutter.io/camera/imageStream(25094): at android.os.MessageQueue.next(MessageQueue.java:326) E/EventChannel#plugins.flutter.io/camera/imageStream(25094): at android.os.Looper.loop(Looper.java:165) E/EventChannel#plugins.flutter.io/camera/imageStream(25094): at android.app.ActivityThread.main(ActivityThread.java:6729) E/EventChannel#plugins.flutter.io/camera/imageStream(25094): at java.lang.reflect.Method.invoke(Native Method) E/EventChannel#plugins.flutter.io/camera/imageStream(25094): at com.android.internal.os.RuntimeInit$MethodAndArgsCaller.run(RuntimeInit.java:493) E/EventChannel#plugins.flutter.io/camera/imageStream(25094): at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:858)


    Doctor summary (to see all details, run flutter doctor -v): [√] Flutter (Channel stable, v1.2.1, on Microsoft Windows [Version 10.0.17134.706], locale en-US) [√] Android toolchain - develop for Android devices (Android SDK version 28.0.3) [!] Android Studio (version 3.2) X Flutter plugin not installed; this adds Flutter specific functionality. X Dart plugin not installed; this adds Dart specific functionality. [√] VS Code, 64-bit edition (version 1.33.1) [√] Connected device (1 available)

    ! Doctor found issues in 1 category.


    Testing with my device: Nokia 6.1 plus.

    Dependencies

    firebase_ml_vision: ^0.7.0
    flutter_camera_ml_vision: ^2.0.0


    Here's my code

    mlScannerTest.dart

    import 'package:flutter/material.dart';
    import 'package:flutter_camera_ml_vision/flutter_camera_ml_vision.dart';
    import 'package:firebase_ml_vision/firebase_ml_vision.dart';
    
    class MLScannerPage extends StatefulWidget {
      MLScannerPage({Key key, this.title}) : super(key: key);
    
      final String title;
    
      @override
      _MLScannerPageState createState() => _MLScannerPageState();
    }
    
    class _MLScannerPageState extends State<MLScannerPage> {
      String _code;
      StatefulWidget _camML;
      bool _startCamera = true;
      bool _resultSent = false;
      final _scanKey = GlobalKey<CameraMlVisionState>();
    
      @override
      void initState() {
        super.initState();
      }
    
      @override
      Widget build(BuildContext context) {
        _camML = _startCamera
            ? CameraMlVision<List<Barcode>>(
                key: _scanKey,
                detector: FirebaseVision.instance.barcodeDetector().detectInImage,
                onResult: (List<Barcode> barcodes) {
                  if (_resultSent || !mounted) {
                    return;
                  }
                  _resultSent = true;
    
                  setState(() {
                    _code = barcodes.first.displayValue;
                  });
                })
            : null;
    
        return Scaffold(
          appBar: AppBar(
            title: Text(_code != null && _code.isNotEmpty ? _code : 'None'),
          ),
          body: SizedBox.expand(
            child: _camML,
          ),
          floatingActionButton: FloatingActionButton(
            onPressed: () {
              setState(() {
                _startCamera = !_startCamera;
                _resultSent = false;
              });
            },
            tooltip: 'Code',
            child: Icon(Icons.add),
          ),
        );
      }
    }
    

    main.dart

    import 'package:flutter/material.dart';
    import 'package:hits/mlScannerTest.dart';
    
    void main() => runApp(MyApp());
    
    class MyApp extends StatelessWidget {
      // This widget is the root of your application.
      @override
      Widget build(BuildContext context) {
        return MaterialApp(
          title: 'Flutter Demo',
          theme: ThemeData(
            primarySwatch: Colors.blue,
          ),
          home: MLScannerPage(title: 'Flutter Demo Home Page'),
        );
      }
    }
    

    So basically, the preview was shown. But no detection whatsoever.

    opened by moseskarunia 8
  • How to check if there are no faces in the Camera ?


    Hi there, I want to check if there are no faces in the camera. I tried many ways to check that, but it didn't work for me. Here is what I tried:

    onResult: (List faces) { // i.e. a List of Faces
      if (faces.isEmpty) {
        print('Face');
      }
    },
    // I tried faces.isEmpty
    // I tried faces.length == 0
    // I tried faces.length == null
    // I tried faces == null
    // I tried faces.first == null
    // I tried faces[0] == null

    The only thing that worked for me is the opposite of what I want: // faces.isNotEmpty
    Can anyone help me with that?

    enhancement good first issue question 
    opened by Loopex2019 8
  • Can not install version 2.2.0 cause of the firebase ml package.


    I tried installing the latest version of the package, but whenever I run it, it always shows me this error:

    [looper_2019] flutter packages get
    Running "flutter packages get" in looper_2019...
    Because flutter_camera_ml_vision >=2.1.0 depends on firebase_ml_vision ^0.8.0 and looper_2019 depends on firebase_ml_vision ^0.7.0, flutter_camera_ml_vision >=2.1.0 is forbidden.

    So, because looper_2019 depends on flutter_camera_ml_vision ^2.2.0, version solving failed. pub get failed (1) exit code 1

    I think it has something to do with the firebase_ml_vision package version 0.7.0.

    opened by Loopex2019 7
  • How to return a Widget if there are no faces (Question)


    Hi. Checking if there is no face works great when printing some text to the debug console, but I want to return a widget. Can I do this in onResult? I tried, but it didn't work for me. So I tried displaying a widget in the overlayBuilder by initializing a boolean value set to false:

    onResult: (List<Face> faces) {
      if (faces.isEmpty) {
        _noface = true;
        print('No Face');
      }
    },
    overlayBuilder: (BuildContext context) {
      if (_noface == true) {
        return _buildNoFaceText();
      }
      return _buildDetectButtonAndTabText();
    },

    I initialized the boolean at the top of the class, and its default is false. My goal is to connect the check in onResult with the overlayBuilder, to render the widget I want. Is there something wrong in my logic, and if so, can someone tell me the solution?

    question support 
    opened by Loopex2019 7
  • CameraMLVision close event stream exception


    Hi, probably I'm doing something wrong. CameraMLVision is set dynamically by a generic widget, calling setState:

    Widget _activeWidget = CameraMlVision<VisionText>(
            detector: FirebaseVision.instance.textRecognizer().processImage,
            onResult: (VisionText visionTextResult) {
              //print("******* visionText => ${visionTextResult.text}");
              if (!_isScanning && visionTextResult.text.length > 0) {
                var priceDetected = _regexCapturePrice(visionTextResult.text);
                if (priceDetected != null) {
                  setState(() {
                    _activeWidget = _priceResultView(priceDetected);
                  });
                }
              }
            });
    

    As you can see in this code, when results are returned, _activeWidget becomes a report widget with details.

    Doing this, I get this exception:

    E/EventChannel#plugins.flutter.io/camera/imageStream(26834): Failed to close event stream E/EventChannel#plugins.flutter.io/camera/imageStream(26834): java.lang.NullPointerException: Attempt to invoke virtual method 'void android.media.ImageReader.setOnImageAvailableListener(android.media.ImageReader$OnImageAvailableListener, android.os.Handler)' on a null object reference E/EventChannel#plugins.flutter.io/camera/imageStream(26834): at io.flutter.plugins.camera.CameraHandler$Camera$9.onCancel(CameraHandler.java:827) E/EventChannel#plugins.flutter.io/camera/imageStream(26834): at io.flutter.plugin.common.EventChannel$IncomingStreamRequestHandler.onCancel(EventChannel.java:194) E/EventChannel#plugins.flutter.io/camera/imageStream(26834): at io.flutter.plugin.common.EventChannel$IncomingStreamRequestHandler.onMessage(EventChannel.java:162) E/EventChannel#plugins.flutter.io/camera/imageStream(26834): at io.flutter.view.FlutterNativeView$PlatformMessageHandlerImpl.handleMessageFromDart(FlutterNativeView.java:188) E/EventChannel#plugins.flutter.io/camera/imageStream(26834): at io.flutter.embedding.engine.FlutterJNI.handlePlatformMessage(FlutterJNI.java:202) E/EventChannel#plugins.flutter.io/camera/imageStream(26834): at android.os.MessageQueue.nativePollOnce(Native Method) E/EventChannel#plugins.flutter.io/camera/imageStream(26834): at android.os.MessageQueue.next(MessageQueue.java:323) E/EventChannel#plugins.flutter.io/camera/imageStream(26834): at android.os.Looper.loop(Looper.java:143) E/EventChannel#plugins.flutter.io/camera/imageStream(26834): at android.app.ActivityThread.main(ActivityThread.java:7225) E/EventChannel#plugins.flutter.io/camera/imageStream(26834): at java.lang.reflect.Method.invoke(Native Method) E/EventChannel#plugins.flutter.io/camera/imageStream(26834): at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:1230) E/EventChannel#plugins.flutter.io/camera/imageStream(26834): at 
com.android.internal.os.ZygoteInit.main(ZygoteInit.java:1120)

    I attached log.txt full log, if you need.

    Is there a graceful way to close the CameraMlVision widget? Or is the valid way to call Navigator.of(context).pop..., like in your sample?

    Thank you again for your help. Lewix

    bug 
    opened by lewixlabs 6
  • MLKit Text Recognition support


    Hi, first of all, congrats! Great ML Kit & camera plugin; it works better (for me) than the original from the Flutter team 👍 My question is: does your plugin support the text detection feature too? I can't find an example.

    Thanks!

    opened by lewixlabs 6
  • Execution failed for task ':app:checkDebugAarMetadata'. > Could not resolve all files for configuration ':app:debugRuntimeClasspath'.    > Could not find com.google.firebase:firebase-ml-vision:.      Required by:          project :app > project :firebase_ml_vision


    Execution failed for task ':app:checkDebugAarMetadata'.
    > Could not resolve all files for configuration ':app:debugRuntimeClasspath'.
       > Could not find com.google.firebase:firebase-ml-vision:.
         Required by:
             project :app > project :firebase_ml_vision

    opened by choudhryr723 1
  • Could not find com.google.firebase:firebase-ml-vision


    When I added flutter_camera_ml_vision to an existing project, I got the following error.

    Execution failed for task ':app:checkDebugAarMetadata'.                 
    > Could not resolve all files for configuration ':app:debugRuntimeClasspath'.
       > Could not find com.google.firebase:firebase-ml-vision:.            
         Required by:                                                       
             project :app > project :firebase_ml_vision
    

    I didn't know what caused it, so I copied the code from the official website. After that, I tried to link Firebase and added only flutter_camera_ml_vision, but I got the same error as above.

    The main.dart is the same as in the URL above; the other files that I changed are as follows. The version of flutter_camera_ml_vision is 3.0.1. As a note, the ./gradlew command succeeds.

    //android/app/src/main/AndroidManifest.xml
    
    <manifest xmlns:android="http://schemas.android.com/apk/res/android"
        package="com.example.association_firebase">
       <application
            android:label="association_firebase"
            android:icon="@mipmap/ic_launcher">
            <activity
                android:name=".MainActivity"
                android:launchMode="singleTop"
                android:theme="@style/LaunchTheme"
                android:configChanges="orientation|keyboardHidden|keyboard|screenSize|smallestScreenSize|locale|layoutDirection|fontScale|screenLayout|density|uiMode"
                android:hardwareAccelerated="true"
                android:windowSoftInputMode="adjustResize">
    
                <meta-data
                  android:name="io.flutter.embedding.android.NormalTheme"
                  android:resource="@style/NormalTheme"
                  />
    
                <meta-data
                  android:name="io.flutter.embedding.android.SplashScreenDrawable"
                  android:resource="@drawable/launch_background"
                  />
                <intent-filter>
                    <action android:name="android.intent.action.MAIN"/>
                    <category android:name="android.intent.category.LAUNCHER"/>
                </intent-filter>
            </activity>
    
            <meta-data
                android:name="flutterEmbedding"
                android:value="2" />
            <!-- add -->
            <meta-data
                android:name="com.google.firebase.ml.vision.DEPENDENCIES"
                android:value="ocr" />
        </application>
    </manifest>
    
    //android/app/build.gradle
    
    def localProperties = new Properties()
    def localPropertiesFile = rootProject.file('local.properties')
    if (localPropertiesFile.exists()) {
        localPropertiesFile.withReader('UTF-8') { reader ->
            localProperties.load(reader)
        }
    }
    
    def flutterRoot = localProperties.getProperty('flutter.sdk')
    if (flutterRoot == null) {
        throw new GradleException("Flutter SDK not found. Define location with flutter.sdk in the local.properties file.")
    }
    
    def flutterVersionCode = localProperties.getProperty('flutter.versionCode')
    if (flutterVersionCode == null) {
        flutterVersionCode = '1'
    }
    
    def flutterVersionName = localProperties.getProperty('flutter.versionName')
    if (flutterVersionName == null) {
        flutterVersionName = '1.0'
    }
    
    apply plugin: 'com.android.application'
    // add
    apply plugin: 'com.google.gms.google-services'
    apply plugin: 'kotlin-android'
    apply from: "$flutterRoot/packages/flutter_tools/gradle/flutter.gradle"
    
    android {
        compileSdkVersion 30
    
        sourceSets {
            main.java.srcDirs += 'src/main/kotlin'
        }
    
        defaultConfig {
            applicationId "com.example.association_firebase"
            minSdkVersion 21   // changed to 21
            targetSdkVersion 30
            versionCode flutterVersionCode.toInteger()
            versionName flutterVersionName
        }
    
        buildTypes {
            release {
                signingConfig signingConfigs.debug
            }
        }
        // add
        dependencies {
            api 'com.google.firebase:firebase-ml-vision-image-label-model:19.0.0'
        }
    }
    
    flutter {
        source '../..'
    }
    
    dependencies {
        implementation "org.jetbrains.kotlin:kotlin-stdlib-jdk7:$kotlin_version"
        //add
        implementation platform('com.google.firebase:firebase-bom:28.0.1')
        //add
        implementation 'com.google.firebase:firebase-analytics'
    }
    
    //android/build.gradle
    
    buildscript {
        ext.kotlin_version = '1.3.50'
        repositories {
            google()
            jcenter()
        }
    
        dependencies {
            classpath 'com.android.tools.build:gradle:4.1.0'
            classpath "org.jetbrains.kotlin:kotlin-gradle-plugin:$kotlin_version"
            //add
            classpath 'com.google.gms:google-services:4.3.8'
        }
    }
    
    allprojects {
        repositories {
            google()
            jcenter()
        }
    }
    
    rootProject.buildDir = '../build'
    subprojects {
        project.buildDir = "${rootProject.buildDir}/${project.name}"
    }
    subprojects {
        project.evaluationDependsOn(':app')
    }
    
    task clean(type: Delete) {
        delete rootProject.buildDir
    }
    
    opened by tnagasak 10
  • Uninitialized CameraController after closing and opening the application


    E/flutter (14891): [ERROR:flutter/lib/ui/ui_dart_state.cc(177)] Unhandled Exception: CameraException(Uninitialized CameraController, startImageStream was called on uninitialized CameraController.)
    E/flutter (14891): #0  CameraController.startImageStream (package:camera/camera.dart:438:7)
    E/flutter (14891): #1  CameraMlVisionState._start (package:flutter_camera_ml_vision/flutter_camera_ml_vision.dart:148:23)
    E/flutter (14891): #2  CameraMlVisionState.build. (package:flutter_camera_ml_vision/flutter_camera_ml_vision.dart:308:11)
    E/flutter (14891): #3  VisibilityDetectorLayer._fireCallback (package:flutter_widgets/src/visibility_detector/src/visibility_detector_layer.dart:268:24)
    E/flutter (14891): #4  VisibilityDetectorLayer._processCallbacks (package:flutter_widgets/src/visibility_detector/src/visibility_detector_layer.dart:239:13)
    E/flutter (14891): #5  _TaskEntry.run. (package:flutter/src/scheduler/binding.dart:80:34)
    E/flutter (14891): #6  Timeline.timeSync (dart:developer/timeline.dart:163:22)
    E/flutter (14891): #7  _TaskEntry.run (package:flutter/src/scheduler/binding.dart:77:16)
    E/flutter (14891): #8  SchedulerBinding.handleEventLoopCallback (package:flutter/src/scheduler/binding.dart:460:15)
    E/flutter (14891): #9  SchedulerBinding._runTasks (package:flutter/src/scheduler/binding.dart:438:9)
    E/flutter (14891): #10 _rootRun (dart:async/zone.dart:1182:47)
    E/flutter (14891): #11 _CustomZone.run (dart:async/zone.dart:1093:19)
    E/flutter (14891): #12 _CustomZone.runGuarded (dart:async/zone.dart:997:7)
    E/flutter (14891): #13 _CustomZone.bindCallbackGuarded. (dart:async/zone.dart:1037:23)
    E/flutter (14891): #14 _rootRun (dart:async/zone.dart:1190:13)
    E/flutter (14891): #15 _CustomZone.run (dart:async/zone.dart:1093:19)
    E/flutter (14891): #16 _CustomZone.bindCallback. (dart:async/zone.dart:1021:23)
    E/flutter (14891): #17 Timer._createTimer. (dart:async-patch/timer_patch.dart:18:15)
    E/flutter (14891): #18 _Timer._runTimers (dart:isolate-patch/timer_impl.dart:397:19)
    E/flutter (14891): #19 _Timer._handleMessage (dart:isolate-patch/timer_impl.dart:428:5)
    E/flutter (14891): #20 _RawReceivePortImpl._handleMessage (dart:isolate-patch/isolate_patch.dart:168:12)
    E/flutter (14891):
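    The trace above is typically thrown when the `CameraController` has been disposed (for example while the app was backgrounded) and `startImageStream` is then called on the stale instance. The following is a sketch of the general guard pattern with the plain `camera` plugin, not the package's internal fix; the names `CameraPage` and `_initCamera` are illustrative, not part of flutter_camera_ml_vision:

    ```dart
    import 'package:camera/camera.dart';
    import 'package:flutter/material.dart';

    class CameraPage extends StatefulWidget {
      @override
      _CameraPageState createState() => _CameraPageState();
    }

    class _CameraPageState extends State<CameraPage> with WidgetsBindingObserver {
      CameraController _controller;

      @override
      void initState() {
        super.initState();
        WidgetsBinding.instance.addObserver(this);
        _initCamera();
      }

      Future<void> _initCamera() async {
        final cameras = await availableCameras();
        _controller = CameraController(cameras.first, ResolutionPreset.medium);
        await _controller.initialize();
        if (!mounted) return;
        // Only start streaming once the controller reports it is initialized.
        if (_controller.value.isInitialized) {
          await _controller.startImageStream((image) {
            // hand frames to the detector here
          });
        }
        setState(() {});
      }

      @override
      void didChangeAppLifecycleState(AppLifecycleState state) {
        // Dispose on pause and re-create on resume, so startImageStream is
        // never called on a disposed, uninitialized controller.
        if (state == AppLifecycleState.paused) {
          _controller?.dispose();
          _controller = null;
        } else if (state == AppLifecycleState.resumed && _controller == null) {
          _initCamera();
        }
      }

      @override
      void dispose() {
        WidgetsBinding.instance.removeObserver(this);
        _controller?.dispose();
        super.dispose();
      }

      @override
      Widget build(BuildContext context) {
        if (_controller == null || !_controller.value.isInitialized) {
          return Center(child: CircularProgressIndicator());
        }
        return CameraPreview(_controller);
      }
    }
    ```

    The same lifecycle-aware re-initialization applies when wrapping `CameraMlVision`: rebuild the widget after resume instead of reusing the old camera state.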

    class FaceVision extends StatefulWidget {
      @override
      _FaceVisionState createState() => _FaceVisionState();
    }
    
    class _FaceVisionState extends State<FaceVision> {
      List<Face> _faces = [];
      final _scanKey = GlobalKey<CameraMlVisionState>();
      CameraLensDirection cameraLensDirection = CameraLensDirection.front;
      FaceDetector detector = FirebaseVision.instance.faceDetector(
        FaceDetectorOptions(
          enableTracking: true,
          mode: FaceDetectorMode.fast,
        ),
      );
    
      @override
      Widget build(BuildContext context) {
        return CameraMlVision<List<Face>>(
          key: _scanKey,
          resolution: ResolutionPreset.max,
          cameraLensDirection: cameraLensDirection,
          detector: detector.processImage,
          overlayBuilder: (c) {
            return CustomPaint(
              painter: FaceDetectorPainter(
                  _scanKey.currentState.cameraValue.previewSize.flipped, _faces,
                  reflection: cameraLensDirection == CameraLensDirection.front),
            );
          },
          onResult: (faces) {
            if (faces == null || !mounted) {
              return;
            }
            setState(() {
              _faces = faces;
            });
        sl<RecognitionBloc>().add(NewFaces(faces: _faces));
          },
          onDispose: () {
            detector.close();
          },
        );
      }
    
      @override
      void dispose() {
        super.dispose();
      }
    }
    
    opened by petodavid 2
  • Camera preview screenshot is not captured


    I want to take a screenshot of the camera preview, but I don't want to store it in the device gallery; I want to write it to a temp folder instead. Here is the scenario of my app: 1. The ML Vision camera package detects a human face. 2. Because ML Vision does not detect the face reliably, my workaround is to send a screenshot of the preview to an API; if the response is good, the app moves on to the next screen. The flow works, but the screenshot ends up in the gallery. A RepaintBoundary would avoid that, but with this camera widget the preview does not render inside it (there is also an open GitHub issue about this). Is there a way to complete this flow without saving anything to the gallery?
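    One way to keep the capture out of the gallery is to write it into the app's temporary directory instead of taking a native screenshot. The sketch below assumes direct access to an initialized `CameraController` (flutter_camera_ml_vision does not expose its controller publicly, so you would manage the camera yourself); `captureToTempBase64` is an illustrative helper name:

    ```dart
    import 'dart:convert';
    import 'dart:io';

    import 'package:camera/camera.dart';
    import 'package:path_provider/path_provider.dart';

    // Captures one frame to the temp directory (never the gallery) and
    // returns it base64-encoded, ready for the detection API request.
    Future<String> captureToTempBase64(CameraController controller) async {
      final tempDir = await getTemporaryDirectory();
      final path =
          '${tempDir.path}/capture_${DateTime.now().millisecondsSinceEpoch}.jpg';
      // camera <0.6 writes to a caller-supplied path; newer versions of the
      // plugin instead return an XFile from takePicture().
      await controller.takePicture(path);
      final bytes = await File(path).readAsBytes();
      return base64Encode(bytes);
    }
    ```

    Files under `getTemporaryDirectory()` are private to the app and may be cleaned up by the OS, so nothing appears in the user's gallery.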

    import 'dart:convert';
    import 'dart:io';
    import 'dart:typed_data';
    
    import 'package:auto_route/auto_route.dart';
    import 'package:esol/screens/home/home_page.dart';
    import 'package:flutter/material.dart';
    
    //Flutter Packages
    import 'package:camera/camera.dart';
    import 'package:firebase_ml_vision/firebase_ml_vision.dart';
    import 'package:flutter_camera_ml_vision/flutter_camera_ml_vision.dart';
    
    import 'package:esol/screens/routes.gr.dart';
    import 'package:http/http.dart' as http;
    import 'package:native_screenshot/native_screenshot.dart';
    import 'package:shared_preferences/shared_preferences.dart';
    
    class VerifyFaceDetect extends StatefulWidget {
      @override
      _VerifyFaceDetectState createState() => _VerifyFaceDetectState();
    }
    
    class _VerifyFaceDetectState extends State<VerifyFaceDetect> {
      // File _imageFile;
    
      List<Face> faces = [];
      final _scanKey = GlobalKey<CameraMlVisionState>();
      CameraLensDirection cameraLensDirection = CameraLensDirection.front;
      FaceDetector detector =
          FirebaseVision.instance.faceDetector(FaceDetectorOptions(
        enableClassification: true,
        enableTracking: true,
        enableLandmarks: true,
        mode: FaceDetectorMode.accurate,
      ));
      String detectString = 'No Face Found';
    
      Future<void> uploadData(
          {String base64Image,
          bool template,
          bool cropImage,
          bool faceAttributes,
          bool facialFeatures,
          bool icaoAttributes}) async {
        setState(() {
          detectString = 'Verifying Face';
        });
    final url = Uri.parse('https://dot.innovatrics.com/core/api/v6/face/detect');
        final response = await http.post(
          url,
          headers: {'Content-Type': 'application/json'},
          body: json.encode(
            {
              'image': {
                "data": base64Image,
                "faceSizeRatio": {
                  "min": 0.01,
                  "max": 0.9,
                }
              },
              'template': template,
              'cropImage': cropImage,
              'faceAttributes': faceAttributes,
              'facialFeatures': facialFeatures,
              'icaoAttributes': icaoAttributes,
            },
          ),
        );
    
        if (response.statusCode == 200) {
          // setState(() {
          print("Here is the code");
          // });
          if (response.body.toString().contains('NO_FACE_DETECTED')) {
            setState(() {
              detectString = 'Move Face Closer';
            });
            print('This is the response ${response.body}');
          }
          if (!response.body.toString().contains('NO_FACE_DETECTED')) {
            print('This is the another response ${response.body}');
            // Navigator.of(context).pushNamed(Routes.homePage);
            SharedPreferences prefs = await SharedPreferences.getInstance();
            prefs.setString('data', "ok");
            Future.delayed(Duration(seconds: 2), () {
              Navigator.of(context).pop();
            });
            // Navigator.of(context).pop();
          }
        }
      }
    
      _takeScreenShot() async {
        final dataValue = await NativeScreenshot.takeScreenshot();
        if (dataValue != null) {
          var imageFile = File(dataValue);
    
          Uint8List byteFile = imageFile.readAsBytesSync();
          String base64Image = base64Encode(byteFile);
          uploadData(
              base64Image: base64Image,
              cropImage: true,
              faceAttributes: true,
              facialFeatures: true,
              icaoAttributes: true,
              template: true);
        }
      }
    
      @override
      Widget build(BuildContext context) {
        String layoutHeight = MediaQuery.of(context).size.height.toString();
        String layoutWidth = MediaQuery.of(context).size.width.toString();
    
        print('This is data String $detectString');
        print('This is the height of the page : $layoutHeight');
        return Scaffold(
          backgroundColor: Color.fromRGBO(0, 85, 255, 1),
          appBar: AppBar(
            elevation: 0.0,
            leading: IconButton(
              icon: Icon(Icons.arrow_back_ios),
              onPressed: () => Navigator.of(context).pop(),
            ),
            actions: [
              IconButton(
                icon: Icon(Icons.close),
                onPressed: () => Navigator.of(context).pop(),
              ),
            ],
          ),
          body: Stack(children: [
            Positioned(
              // bottom: 100,
              // right: 80,
              // top: 100,
              child: Align(
                alignment: Alignment.topCenter,
                child: Text(
                  detectString,
                  style: TextStyle(color: Colors.white, fontSize: 30),
                ),
              ),
            ),
            Center(
              child: Container(
                // height: MediaQuery.of(context).size.height / 3.2,
                // width: MediaQuery.of(context).size.width * 0.7,
                height: 300,
                width: 300,
                child: ClipOval(
                  child: CameraMlVision<List<Face>>(
                    key: _scanKey,
                    cameraLensDirection: cameraLensDirection,
                    detector: (FirebaseVisionImage image) {
                      return detector.processImage(image);
                    },
                    overlayBuilder: (c) {
                      return Text('');
                    },
                    onResult: (faces) {
                      if (faces == null || faces.isEmpty || !mounted) {
                        return;
                      }
    
                      setState(() {
                        // `faces` is the callback parameter here; assign to the
                        // state field explicitly so the widget actually rebuilds.
                        this.faces = List.of(faces);
                        final face = faces[0];
                        if (face.rightEyeOpenProbability >= 0.9 &&
                            face.leftEyeOpenProbability >= 0.9 &&
                            !face.boundingBox.isEmpty) {
                          detectString = 'Face detected';
                          _takeScreenShot();
                        } else if (face.rightEyeOpenProbability <= 0.5 &&
                            face.leftEyeOpenProbability <= 0.5) {
                          detectString = 'Open your Eyes';
                        }
                    },
                    onDispose: () {
                      detector.close();
                    },
                  ),
                ),
              ),
            ),
          ]),
        );
      }
    }
    
    question · opened by shahryar-cmyk 0
Releases(2.2.0)

Owner: Rushio Consulting