Camera and Microphone streaming library via RTMP for Flutter.

Overview

HaishinKit Plugin

  • A Flutter plugin for iOS and Android. Camera and Microphone streaming library via RTMP.
  • Support: Android SDK 21+, iOS 9.0+.

🌏 Dependencies

  • HaishinKit for iOS, macOS and tvOS: Camera and Microphone streaming library via RTMP, HLS for iOS, macOS and tvOS. License: BSD 3-Clause "New" or "Revised" License.
  • HaishinKit for Android: Camera and Microphone streaming library via RTMP for Android. License: BSD 3-Clause "New" or "Revised" License.

🎨 Features

RTMP

  • Authentication
  • Publishing and Recording (H264/AAC)
  • Playback (Beta)
  • Adaptive bitrate streaming
    • Automatic frame dropping
  • Action Message Format
    • AMF0
    • AMF3
  • SharedObject
  • RTMPS
    • Native (RTMP over SSL/TLS); see the connection sketch below
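
For the RTMPS support listed above, the connection API is the same as for plain RTMP; only the URL scheme changes. A minimal sketch using the plugin calls shown in the example below (connectOverRtmps is a hypothetical helper, and the endpoint and stream key are placeholders, not values from this project):

import 'package:haishin_kit/rtmp_connection.dart';
import 'package:haishin_kit/rtmp_stream.dart';

Future<void> connectOverRtmps() async {
  final connection = await RtmpConnection.create();
  final stream = await RtmpStream.create(connection);

  // Publish only after the connection reports success.
  connection.eventChannel.receiveBroadcastStream().listen((event) {
    if (event["data"]["code"] == 'NetConnection.Connect.Success') {
      stream.publish("YOUR_STREAM_KEY"); // placeholder stream key
    }
  });

  // RTMPS endpoint (placeholder host); RTMP over SSL/TLS is handled natively.
  connection.connect("rtmps://example.com:443/app");
}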

🐾 Example

Here is a small example Flutter app displaying a camera preview.

import 'dart:async';

import 'package:audio_session/audio_session.dart';
import 'package:flutter/material.dart';
import 'package:haishin_kit/audio_source.dart';
import 'package:haishin_kit/net_stream_drawable_texture.dart';
import 'package:haishin_kit/rtmp_connection.dart';
import 'package:haishin_kit/rtmp_stream.dart';
import 'package:haishin_kit/video_source.dart';
import 'package:permission_handler/permission_handler.dart';

void main() {
  runApp(const MyApp());
}

class MyApp extends StatefulWidget {
  const MyApp({Key? key}) : super(key: key);

  @override
  State<MyApp> createState() => _MyAppState();
}

class _MyAppState extends State<MyApp> {
  RtmpConnection? _connection;
  RtmpStream? _stream;
  bool _recording = false;
  CameraPosition currentPosition = CameraPosition.back;

  @override
  void initState() {
    super.initState();
    initPlatformState();
  }

  Future<void> initPlatformState() async {
    await Permission.camera.request();
    await Permission.microphone.request();

    // Set up AVAudioSession for iOS.
    final session = await AudioSession.instance;
    await session.configure(const AudioSessionConfiguration(
      avAudioSessionCategory: AVAudioSessionCategory.playAndRecord,
      avAudioSessionCategoryOptions:
      AVAudioSessionCategoryOptions.allowBluetooth,
    ));

    RtmpConnection connection = await RtmpConnection.create();
    connection.eventChannel.receiveBroadcastStream().listen((event) {
      switch (event["data"]["code"]) {
        case 'NetConnection.Connect.Success':
          _stream?.publish("live");
          setState(() {
            _recording = true;
          });
          break;
      }
    });
    RtmpStream stream = await RtmpStream.create(connection);
    stream.attachAudio(AudioSource());
    stream.attachVideo(VideoSource(position: currentPosition));

    if (!mounted) return;

    setState(() {
      _connection = connection;
      _stream = stream;
    });
  }

  @override
  Widget build(BuildContext context) {
    return MaterialApp(
      home: Scaffold(
        appBar: AppBar(title: const Text('HaishinKit example app'), actions: [
          IconButton(
            icon: const Icon(Icons.flip_camera_android),
            onPressed: () {
              if (currentPosition == CameraPosition.front) {
                currentPosition = CameraPosition.back;
              } else {
                currentPosition = CameraPosition.front;
              }
              _stream?.attachVideo(VideoSource(position: currentPosition));
            },
          )
        ]),
        body: Center(
          child: _stream == null
              ? const Text("")
              : NetStreamDrawableTexture(_stream),
        ),
        floatingActionButton: FloatingActionButton(
          child: _recording
              ? const Icon(Icons.fiber_smart_record)
              : const Icon(Icons.not_started),
          onPressed: () {
            if (_recording) {
              _connection?.close();
              setState(() {
                _recording = false;
              });
            } else {
              _connection?.connect("rtmp://192.168.1.9/live");
            }
          },
        ),
      ),
    );
  }
}
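
To stop streaming and release the camera, microphone, and connection when the widget goes away, the reports below use a dispose() override along these lines. A sketch that would slot into _MyAppState above (the teardown order follows the snippets quoted in the issues below; it is not an officially documented sequence):

@override
void dispose() {
  // Stop publishing and release platform-side resources.
  _stream?.close();
  _stream?.dispose();
  _connection?.close();
  _connection?.dispose();
  super.dispose();
}
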
Comments
  • Cannot connect to Mux RTMP secure server URL

    Cannot connect to Mux RTMP secure server URL

    Describe the bug

    Thank you for all of your work on your excellent product.

    I use your example Flutter code in debug mode to try to connect via an iPhone 11 to a Mux RTMP server secure URL e.g. rtmps://global-live.mux.com:443/app/MUX STREAM KEY and the response after attempting to connect is always "NetConnection.Connect.Closed". I test the same URL on the iPhone using Larix and it works fine.
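
    For reference, the Mux example further down this page splits the URL and the key: connect() is given only the application URL, and the stream key is passed to publish() after 'NetConnection.Connect.Success'. A sketch of that pattern (whether it resolves this particular NetConnection.Connect.Closed is not confirmed):

    // Connect with the application URL only (no stream key in the path).
    _connection?.connect("rtmps://global-live.mux.com:443/app");

    // Then, inside the 'NetConnection.Connect.Success' handler:
    _stream?.publish("MUX_STREAM_KEY"); // placeholder for the real key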

    To Reproduce

    1. Use your example Flutter code
    2. Edit pubspec.yml to use haishin_kit 0.9.1
    3. Use a Mux secure URL e.g. rtmps://global-live.mux.com:443/app/MUX STREAM KEY
    4. Observe that it does not connect

    Expected behavior

    To connect and broadcast

    Version

    haishin_kit 0.9.1

    Smartphone info.

    • Device: iPhone 11
    • OS: iOS 15.5

    Additional context

    No response

    Screenshots

    No response

    Relevant log output

    No response

    opened by DaleBeckles 3
  • Missing release of bugfix

    Missing release of bugfix

    Describe the bug

    PR #7 fixed a critical bug, but the release containing it is still not available on pub.dev.

    To Reproduce

    1. Download the latest version from pub.dev
    2. Set video settings
    3. Start streaming

    Expected behavior

    Streaming is done in the settings provided

    Version

    0.9.1

    Smartphone info.

    No response

    Additional context

    No response

    Screenshots

    No response

    Relevant log output

    No response

    opened by TheFe91 2
  • Build failed with an exception.

    Build failed with an exception.

    Describe the bug

    Execution failed for task ':app:checkDebugAarMetadata'.

    Could not resolve all files for configuration ':app:debugRuntimeClasspath'.
    > Could not resolve com.github.shogo4405.HaishinKit~kt:haishinkit:0.10.2.
      Required by:
          project :app > project :haishin_kit
       > Could not resolve com.github.shogo4405.HaishinKit~kt:haishinkit:0.10.2.
          > Could not get resource 'https://jitpack.io/com/github/shogo4405/HaishinKit~kt/haishinkit/0.10.2/haishinkit-0.10.2.pom'.
             > Could not GET 'https://jitpack.io/com/github/shogo4405/HaishinKit~kt/haishinkit/0.10.2/haishinkit-0.10.2.pom'. Received status code 521 from server:

    • Try: Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

    To Reproduce

    1. flutter run

    Expected behavior

    1. flutter run

    Version

    0.9.2

    Smartphone info.

    • Android

    Additional context

    No response

    Screenshots

    No response

    Relevant log output

    No response

    opened by byn17 1
  • SIGSEGV on dispose

    SIGSEGV on dispose

    Describe the bug

    Segfault in OplusCCodec when closing or disposing the connection or stream.

    To Reproduce

    @override
    void dispose() {
      super.dispose();
      _connection?.close();
    }
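
    A variation seen elsewhere on this page detaches the capture sources before closing; whether it avoids this particular AudioRecord crash is not confirmed:

    @override
    void dispose() {
      // Detach the microphone and camera before closing the connection
      // (unconfirmed workaround for the SIGSEGV described here).
      _stream?.attachAudio(null);
      _stream?.attachVideo(null);
      _connection?.close();
      super.dispose();
    }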

    Expected behavior

    It doesn't crash the app.

    Version

    haishin_kit: 0.9.2

    Smartphone info.

    • Device: OnePlus 8 pro
    • OS: OxygenOS 12.1 / Android 12

    Additional context

    any attempt to close or dispose the resource ends in a segfault.

    Screenshots

    No response

    Relevant log output

    D/RtmpStream(16158): current=PUBLISHING, change=CLOSED
    D/AudioRecord(16158): stop(2625): mActive:1
    D/AudioRecord(16158): mAudioRecord->stop()
    D/AudioRecord(16158): AudioRecordThread pause()
    D/AudioRecord(16158): stop() end
    D/OplusCCodec(16158): initiateShutdown [386]: (0xb4000075e0a15340) keepComponentAllocated=1
    D/CCodecBufferChannel(16158): [c2.android.aac.encoder#630] MediaCodec discarded an unknown buffer
    D/CCodecBufferChannel(16158): [c2.android.aac.encoder#630] MediaCodec discarded an unknown buffer
    D/CCodecBufferChannel(16158): [c2.android.aac.encoder#630] MediaCodec discarded an unknown buffer
    D/CCodecBufferChannel(16158): [c2.android.aac.encoder#630] MediaCodec discarded an unknown buffer
    D/CCodecBufferChannel(16158): [c2.android.aac.encoder#630] MediaCodec discarded an unknown buffer
    D/CCodecBufferChannel(16158): [c2.android.aac.encoder#630] MediaCodec discarded an unknown buffer
    D/OplusCCodec(16158): initiateShutdown [386]: (0xb4000075e0a15340) keepComponentAllocated=0
    F/libc    (16158): Fatal signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x7703fde000 in tid 16596 (odec.AudioCodec), pid 16158 (com.partaga.app)
    Process name is com.partaga.app, not key_process
    keyProcess: 0
    *** *** *** *** *** *** *** *** *** *** *** *** *** *** *** ***
    Build fingerprint: 'OnePlus/OnePlus8Pro_EEA/OnePlus8Pro:12/RKQ1.211119.001/Q.GDPR.202210170945:user/release-keys'
    Revision: '0'
    ABI: 'arm64'
    Timestamp: 2022-11-22 04:27:17.263872176+0100
    Process uptime: 0s
    Cmdline: com.partaga.app
    pid: 16158, tid: 16596, name: odec.AudioCodec  >>> com.partaga.app <<<
    uid: 10607
    signal 11 (SIGSEGV), code 1 (SEGV_MAPERR), fault addr 0x7703fde000
        x0  0000007703fde000  x1  00000076ead176f0  x2  00000000000006f8  x3  0000000000000000
        x4  0000000000000008  x5  0000000000000000  x6  0000007703fde000  x7  0004000400040004
        x8  0004000400040004  x9  1e050955096496b5  x10 0000000000000700  x11 0000000000000000
        x12 0000000000033680  x13 00000076ead170e8  x14 0000000000000002  x15 0000007593376000
        x16 00000076e776bae8  x17 00000076ec686940  x18 0000007576e00000  x19 b4000075e0b69400
        x20 0000000000000800  x21 0000000000000000  x22 0000007703fde000  x23 00000076e76ce260
        x24 0000000000000700  x25 0000007593376000  x26 00000076e7769978  x27 0000007593374e18
        x28 0000007593374e50  x29 0000007593374ca0
        lr  00000076e7703730  sp  0000007593374c70  pc  00000076ec6869f4  pst 0000000000001000
    backtrace:
          #00 pc 00000000000759f4  /apex/com.android.runtime/lib64/bionic/libc.so (memcpy_opt+180) (BuildId: bbbdeb7c87c74f1491f92c6e605095b0)
          #01 pc 000000000005d72c  /system/lib64/libaudioclient.so (android::AudioRecord::read(void*, unsigned long, bool)+380) (BuildId: 3cd928556fc187c2febdb332bc052fa9)
          #02 pc 0000000000172864  /system/lib64/libandroid_runtime.so (android_media_AudioRecord_readInDirectBuffer(_JNIEnv*, _jobject*, _jobject*, int, unsigned char)+248) (BuildId: 0a3f50eaf6daea7090ba06720efb1840)
          #03 pc 000000000033b1c0  /data/misc/apexdata/com.android.art/dalvik-cache/arm64/boot.oat (art_jni_trampoline+128)
    
    opened by Wicpar 1
  • fix: setAudioSettings were used instead of setVideoSettings

    fix: setAudioSettings were used instead of setVideoSettings

    Description & motivation

    setAudioSettings was called in both the set videoSettings and set captureSettings setters, so all videoSettings changes were ignored.

      set videoSettings(VideoSettings videoSettings) {
        assert(_memory != null);
        _videoSettings = videoSettings;
        RtmpStreamPlatform.instance.setAudioSettings( // <- this should to be RtmpStreamPlatform.instance.setVideoSettings(
            {"memory": _memory, "settings": videoSettings.toMap()});
      }
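
    With the fix described above applied, the setter calls setVideoSettings instead (reconstructed from the snippet and the PR description):

      set videoSettings(VideoSettings videoSettings) {
        assert(_memory != null);
        _videoSettings = videoSettings;
        RtmpStreamPlatform.instance.setVideoSettings(
            {"memory": _memory, "settings": videoSettings.toMap()});
      }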
    

    Type of change

    • [X] Bug fix (non-breaking change which fixes an issue)
    opened by Kodam-zz 1
  • Youtube RTMP stream bug

    Youtube RTMP stream bug

    Describe the bug

    I am facing a bug (I think it's a native Swift issue): when I pass the YouTube stream URL and the stream key, it gives me an error.

    import 'dart:async';
    
    import 'package:audio_session/audio_session.dart';
    import 'package:flutter/material.dart';
    import 'package:haishin_kit/audio_settings.dart';
    import 'package:haishin_kit/audio_source.dart';
    import 'package:haishin_kit/net_stream_drawable_texture.dart';
    import 'package:haishin_kit/rtmp_connection.dart';
    import 'package:haishin_kit/rtmp_stream.dart';
    import 'package:haishin_kit/video_settings.dart';
    import 'package:haishin_kit/video_source.dart';
    import 'package:permission_handler/permission_handler.dart';
    
    void main() {
      runApp(const MyApp());
    }
    
    class MyApp extends StatefulWidget {
      const MyApp({Key? key}) : super(key: key);
    
      @override
      State<MyApp> createState() => _MyAppState();
    }
    
    class _MyAppState extends State<MyApp> {
      RtmpConnection? _connection;
      RtmpStream? _stream;
      bool _recording = false;
      String _mode = "publish";
      CameraPosition currentPosition = CameraPosition.front;
    
      @override
      void initState() {
        super.initState();
        initPlatformState();
      }
    
      @override
      void dispose() {
        _stream?.dispose();
        _connection?.dispose();
        super.dispose();
      }
    
      Future<void> initPlatformState() async {
        await Permission.camera.request();
        await Permission.microphone.request();
    
        // Set up AVAudioSession for iOS.
        final session = await AudioSession.instance;
        await session.configure(const AudioSessionConfiguration(
          avAudioSessionCategory: AVAudioSessionCategory.playAndRecord,
          avAudioSessionCategoryOptions:
              AVAudioSessionCategoryOptions.allowBluetooth,
        ));
    
        RtmpConnection connection = await RtmpConnection.create();
        connection.eventChannel.receiveBroadcastStream().listen((event) {
          switch (event["data"]["code"]) {
            case 'NetConnection.Connect.Success':
              if (_mode == "publish") {
                _stream?.publish("live");
              } else {
                _stream?.play("live");
              }
              setState(() {
                _recording = true;
              });
              break;
          }
        });
    
        RtmpStream stream = await RtmpStream.create(connection);
        stream.audioSettings = AudioSettings(muted: false, bitrate: 64 * 1000);
        stream.videoSettings = VideoSettings(
          width: 480,
          height: 272,
          bitrate: 512 * 1000,
        );
        stream.attachAudio(AudioSource());
        stream.attachVideo(VideoSource(position: currentPosition));
    
        if (!mounted) return;
    
        setState(() {
          _connection = connection;
          _stream = stream;
        });
      }
    
      @override
      Widget build(BuildContext context) {
        return MaterialApp(
          home: Scaffold(
            appBar: AppBar(title: const Text('HaishinKit'), actions: [
              IconButton(
                icon: const Icon(Icons.play_arrow),
                onPressed: () {
                  if (_mode == "publish") {
                    _mode = "playback";
                    _stream?.attachVideo(null);
                    _stream?.attachAudio(null);
                  } else {
                    _mode = "publish";
                    _stream?.attachAudio(AudioSource());
                    _stream?.attachVideo(VideoSource(position: currentPosition));
                  }
                },
              ),
              IconButton(
                icon: const Icon(Icons.flip_camera_android),
                onPressed: () {
                  if (currentPosition == CameraPosition.front) {
                    currentPosition = CameraPosition.back;
                  } else {
                    currentPosition = CameraPosition.front;
                  }
                  _stream?.attachVideo(VideoSource(position: currentPosition));
                },
              )
            ]),
            body: Center(
              child: _stream == null
                  ? const Text("")
                  : NetStreamDrawableTexture(_stream),
            ),
            floatingActionButton: FloatingActionButton(
              child: _recording
                  ? const Icon(Icons.fiber_smart_record)
                  : const Icon(Icons.not_started),
              onPressed: () {
                if (_recording) {
                  _connection?.close();
                  setState(() {
                    _recording = false;
                  });
                } else {
                  _connection?.connect("rtmp://a.rtmp.youtube.com/live2");
                  _stream?.publish("....-....-....-....-....");
                 
                }
              },
            ),
          ),
        );
      }
    }
    
    

    To Reproduce

    Call the connect function

    Expected behavior

    I expected to live stream on YouTube.

    Version

    I am running haishin_kit: ^0.9.2

    Smartphone info.

    • Device: iPhone 6s
    • OS: iOS 12.5

    Additional context

    No response

    Screenshots

    No response

    Relevant log output

    `Unsupported value: [Optional(undefined)] of type __SwiftValue`
    `Lost connection to device.`
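
    For comparison, the README example at the top of this page calls publish() only inside the 'NetConnection.Connect.Success' handler, whereas the snippet above also calls publish() immediately after connect(). A sketch of the success-handler-only ordering with the YouTube ingest URL (the stream key value is a placeholder):

    // In initPlatformState(): publish only once the connection succeeds.
    connection.eventChannel.receiveBroadcastStream().listen((event) {
      if (event["data"]["code"] == 'NetConnection.Connect.Success') {
        _stream?.publish("xxxx-xxxx-xxxx-xxxx-xxxx"); // placeholder stream key
      }
    });

    // In the button handler: connect only, no publish() call here.
    _connection?.connect("rtmp://a.rtmp.youtube.com/live2");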
    
    opened by MahmoudAshours 0
  • Back Camera quality is very poor and blurred

    Back Camera quality is very poor and blurred

    Describe the bug

    I have tested the live streaming with the example code and it worked on mux.com.

    But the biggest problem is that the video quality is very poor and the playback on Mux looks blurred. Can you please help me understand what I am missing? What is causing this poor video quality?

    import 'dart:async';
    
    import 'package:audio_session/audio_session.dart';
    import 'package:flutter/material.dart';
    import 'package:haishin_kit/audio_settings.dart';
    import 'package:haishin_kit/audio_source.dart';
    import 'package:haishin_kit/capture_settings.dart';
    import 'package:haishin_kit/net_stream_drawable_texture.dart';
    import 'package:haishin_kit/rtmp_connection.dart';
    import 'package:haishin_kit/rtmp_stream.dart';
    import 'package:haishin_kit/video_settings.dart';
    import 'package:haishin_kit/video_source.dart';
    import 'package:permission_handler/permission_handler.dart';
    
    void main() {
      runApp(const MyApp());
    }
    
    const streamKey = "***********************";
    
    class MyApp extends StatefulWidget {
      const MyApp({Key? key}) : super(key: key);
    
      @override
      State<MyApp> createState() => _MyAppState();
    }
    
    class _MyAppState extends State<MyApp> {
      RtmpConnection? _connection;
      RtmpStream? _stream;
      bool _recording = false;
      String _mode = "publish";
      CameraPosition currentPosition = CameraPosition.front;
    
      @override
      void initState() {
        super.initState();
        initPlatformState();
      }
    
      @override
      void dispose() {
        _stream?.dispose();
        _connection?.dispose();
        super.dispose();
      }
    
      Future<void> initPlatformState() async {
        await Permission.camera.request();
        await Permission.microphone.request();
    
        // Set up AVAudioSession for iOS.
        final session = await AudioSession.instance;
        await session.configure(const AudioSessionConfiguration(
          avAudioSessionCategory: AVAudioSessionCategory.playAndRecord,
          avAudioSessionCategoryOptions:
          AVAudioSessionCategoryOptions.allowBluetooth,
        ));
    
        RtmpConnection connection = await RtmpConnection.create();
        connection.eventChannel.receiveBroadcastStream().listen((event) {
          switch (event["data"]["code"]) {
            case 'NetConnection.Connect.Success':
              if (_mode == "publish") {
                _stream?.publish(streamKey);
              } else {
                _stream?.play("live");
              }
              setState(() {
                _recording = true;
              });
              break;
          }
        });
    
        RtmpStream stream = await RtmpStream.create(connection);
        stream.audioSettings = AudioSettings(muted: false, bitrate: 64 * 1000);
        stream.videoSettings = VideoSettings(
          width: 1080,
          height: 1920,
          bitrate: 12 * 1000,
          frameInterval: 2
        );
        stream.attachAudio(AudioSource());
        stream.attachVideo(VideoSource(position: currentPosition));
    
        if (!mounted) return;
    
        setState(() {
          _connection = connection;
          _stream = stream;
        });
      }
    
      @override
      Widget build(BuildContext context) {
        return MaterialApp(
          home: Scaffold(
            appBar: AppBar(title: const Text('HaishinKit'), actions: [
              // IconButton(
              //   icon: const Icon(Icons.play_arrow),
              //   onPressed: () {
              //     if (_mode == "publish") {
              //       _mode = "playback";
              //       _stream?.attachVideo(null);
              //       _stream?.attachAudio(null);
              //     } else {
              //       _mode = "publish";
              //       _stream?.attachAudio(AudioSource());
              //       _stream?.attachVideo(VideoSource(position: currentPosition));
              //     }
              //   },
              // ),
              IconButton(
                icon: const Icon(Icons.flip_camera_android_sharp),
                onPressed: () {
                  currentPosition = currentPosition == CameraPosition.front
                      ? CameraPosition.back
                      : CameraPosition.front;
    
                  _stream?.attachVideo(VideoSource(position: currentPosition));
                },
              )
            ]),
            body: Center(
              child: _stream == null
                  ? const CircularProgressIndicator()
                  : Stack(
                  children: [
                    NetStreamDrawableTexture(_stream),
                    Positioned(
                      bottom: 20,
                      right: 20,
                      left: 20,
                      child: SizedBox(
                        width: double.infinity,
                        height: 100,
                        child: Row(
                          mainAxisAlignment: MainAxisAlignment.spaceEvenly,
                          children: const [
                            Icon(Icons.shopping_bag, color: Colors.white,),
                            Icon(Icons.thumb_up, color: Colors.white,),
                            Icon(Icons.report, color: Colors.white,),
                          ],
                        ),
                      ),
                    )
                  ]
              ),
            ),
            floatingActionButton: FloatingActionButton(
              child: _recording
                  ? const Icon(Icons.fiber_smart_record)
                  : const Icon(Icons.not_started),
              onPressed: () {
                if (_recording) {
                  _connection?.close();
                  setState(() {
                    _recording = false;
                  });
                } else {
                  _connection?.connect("rtmps://global-live.mux.com:443/app");
                }
              },
            ),
          ),
        );
      }
    }
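
    One thing that stands out in the snippet above: bitrate: 12 * 1000 is 12 kbps, which is far too low for 1080x1920 video and would by itself explain a blurry picture (the other example on this page uses 512 * 1000 for 480x272). As a rough illustration only, with numbers that are assumptions rather than project defaults, a 720p configuration might look more like:

    stream.videoSettings = VideoSettings(
      width: 720,
      height: 1280,
      // Bitrate appears to be in bits per second, judging by the values
      // used elsewhere on this page; ~2 Mbps is an illustrative 720p value.
      bitrate: 2 * 1000 * 1000,
    );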
    

    To Reproduce

    Run the above code as mentioned in the bug.

    Expected behavior

    720p should look HD, and so on.

    Version

    0.9.2

    Smartphone info.

    • Xiaomi Redmi Note 8 Pro

    Additional context

    No response

    Screenshots

    No response

    Relevant log output

    Launching lib/main.dart on Redmi Note 8 Pro in debug mode...
    Running Gradle task 'assembleDebug'...
    ✓  Built build/app/outputs/flutter-apk/app-debug.apk.
    Installing build/app/outputs/flutter-apk/app.apk...
    Debug service listening on ws://127.0.0.1:64958/HisNmM4CRNs=/ws
    Syncing files to device Redmi Note 8 Pro...
    E/ion     ( 1410): ioctl c0044901 failed with code -1: Invalid argument
    I/estream_haishi( 1410): ProcessProfilingInfo new_methods=1122 is saved saved_to_disk=1 resolve_classes_delay=8000
    D/libMEOW ( 1410): applied 1 plugins for [com.example.livestream_haishin]:
    D/libMEOW ( 1410):   plugin 1: [libMEOW_gift.so]:
    I/GED     ( 1410): [GT]_get_procNameprocess pid(1410)
    I/GED     ( 1410): [GT]_getprocess name(com.example.livestream_haishin)
    I/estream_haishi( 1410): [GT] ret(1) gt_status(00000000) aniso_debug_level(0) gt_aniso_max_level(16) ani so mask(00000001) tri mask(00000002)
    I/libMEOW_gift( 1410): ctx:0xb4000071be46ff70, ARC not Enabled.
    I/BufferQueueConsumer( 1410): [](id:58200000000,api:0,p:-1,c:1410) connect(): controlledByApp=true
    I/CameraManagerGlobal( 1410): Connecting to camera service
    D/libMEOW ( 1410): applied 1 plugins for [com.example.livestream_haishin]:
    D/libMEOW ( 1410):   plugin 1: [libMEOW_gift.so]:
    I/GED     ( 1410): [GT]_get_procNameprocess pid(1410)
    I/GED     ( 1410): [GT]_getprocess name(com.example.livestream_haishin)
    I/estream_haishi( 1410): [GT] ret(1) gt_status(00000000) aniso_debug_level(0) gt_aniso_max_level(16) ani so mask(00000001) tri mask(00000002)
    I/libMEOW_gift( 1410): ctx:0xb4000071b765bf68, ARC not Enabled.
    W/CameraManagerGlobal( 1410): [soar.cts] ignore the status update of camera: 20
    W/CameraManagerGlobal( 1410): [soar.cts] ignore the status update of camera: 21
    W/CameraManagerGlobal( 1410): [soar.cts] ignore the status update of camera: 22
    W/CameraManagerGlobal( 1410): [soar.cts] ignore the status update of camera: 61
    W/CameraManagerGlobal( 1410): ignore the torch status update of camera: 21
    W/CameraManagerGlobal( 1410): ignore the torch status update of camera: 22
    W/CameraManagerGlobal( 1410): ignore the torch status update of camera: 61
    I/BufferQueueProducer( 1410): [SurfaceTexture-0-1410-0](id:58200000000,api:1,p:1410,c:1410) connect(): api=1 producerControlledByApp=true
    E/libc    ( 1410): Access denied finding property "persist.vendor.camera.privapp.list"
    E/CameraManagerGlobal( 1410): Camera 61 is not available. Ignore physical camera status change
    W/CameraManagerGlobal( 1410): ignore the torch status update of camera: 21
    W/CameraManagerGlobal( 1410): ignore the torch status update of camera: 22
    W/CameraManagerGlobal( 1410): ignore the torch status update of camera: 61
    I/BufferQueueConsumer( 1410): [](id:58200000001,api:0,p:-1,c:1410) connect(): controlledByApp=true
    I/BufferQueueConsumer( 1410): [](id:58200000002,api:0,p:-1,c:1410) connect(): controlledByApp=true
    I/BufferQueueProducer( 1410): [SurfaceTexture-1-1410-1](id:58200000001,api:4,p:825,c:1410) connect(): api=4 producerControlledByApp=true
    I/BufferQueueProducer( 1410): [SurfaceTexture-1-1410-2](id:58200000002,api:4,p:825,c:1410) connect(): api=4 producerControlledByApp=true
    I/mali_config( 1410): @get_buffer_dataspace_setting: update dataspace (0x10010000 -> 0x08020000)
    I/mali_config( 1410): @get_buffer_dataspace_setting: update dataspace (0x10010000 -> 0x08020000)
    I/mali_config( 1410): @get_buffer_dataspace_setting: update dataspace (0x10010000 -> 0x08020000)
    I/mali_config( 1410): @get_buffer_dataspace_setting: update dataspace (0x10010000 -> 0x08020000)
    I/mali_config( 1410): @get_buffer_dataspace_setting: update dataspace (0x10010000 -> 0x08020000)
    I/mali_config( 1410): @get_buffer_dataspace_setting: update dataspace (0x10010000 -> 0x08020000)
    I/mali_config( 1410): @get_buffer_dataspace_setting: update dataspace (0x10010000 -> 0x08020000)
    I/mali_config( 1410): @get_buffer_dataspace_setting: update dataspace (0x10010000 -> 0x08020000)
    V/PhoneWindow( 1410): DecorView setVisiblity: visibility = 4, Parent = android.view.ViewRootImpl@f58e519, this = DecorView@8fd59de[MainActivity]
    I/GED     ( 1410): ged_boost_gpu_freq, level 100, eOrigin 2, final_idx 27, oppidx_max 27, oppidx_min 0
    E/CameraCaptureSession( 1410): Session 0: Exception while stopping repeating: 
    E/CameraCaptureSession( 1410): android.hardware.camera2.CameraAccessException: CAMERA_ERROR (3): The camera device has encountered a serious error
    E/CameraCaptureSession( 1410): 	at android.hardware.camera2.impl.CameraDeviceImpl.checkIfCameraClosedOrInError(CameraDeviceImpl.java:2305)
    E/CameraCaptureSession( 1410): 	at android.hardware.camera2.impl.CameraDeviceImpl.stopRepeating(CameraDeviceImpl.java:1263)
    E/CameraCaptureSession( 1410): 	at android.hardware.camera2.impl.CameraCaptureSessionImpl.close(CameraCaptureSessionImpl.java:578)
    E/CameraCaptureSession( 1410): 	at com.haishinkit.media.Camera2Source.setSession(Camera2Source.kt:79)
    E/CameraCaptureSession( 1410): 	at com.haishinkit.media.Camera2Source.setDevice(Camera2Source.kt:52)
    E/CameraCaptureSession( 1410): 	at com.haishinkit.media.Camera2Source.onError(Camera2Source.kt:199)
    E/CameraCaptureSession( 1410): 	at android.hardware.camera2.impl.CameraDeviceImpl.notifyError(CameraDeviceImpl.java:1703)
    E/CameraCaptureSession( 1410): 	at android.hardware.camera2.impl.CameraDeviceImpl.lambda$oDs27OTfKFfK18rUW2nQxxkPdV0(Unknown Source:0)
    E/CameraCaptureSession( 1410): 	at android.hardware.camera2.impl.-$$Lambda$CameraDeviceImpl$oDs27OTfKFfK18rUW2nQxxkPdV0.accept(Unknown Source:8)
    E/CameraCaptureSession( 1410): 	at com.android.internal.util.function.pooled.PooledLambdaImpl.doInvoke(PooledLambdaImpl.java:278)
    E/CameraCaptureSession( 1410): 	at com.android.internal.util.function.pooled.PooledLambdaImpl.invoke(PooledLambdaImpl.java:201)
    E/CameraCaptureSession( 1410): 	at com.android.internal.util.function.pooled.OmniFunction.run(OmniFunction.java:97)
    E/CameraCaptureSession( 1410): 	at android.os.Handler.handleCallback(Handler.java:938)
    E/CameraCaptureSession( 1410): 	at android.os.Handler.dispatchMessage(Handler.java:99)
    E/CameraCaptureSession( 1410): 	at android.os.Looper.loop(Looper.java:236)
    E/CameraCaptureSession( 1410): 	at android.os.HandlerThread.run(HandlerThread.java:67)
    W/CameraManagerGlobal( 1410): ignore the torch status update of camera: 21
    I/BufferQueueProducer( 1410): [SurfaceTexture-1-1410-1](id:58200000001,api:4,p:825,c:1410) disconnect(): api=4
    W/CameraManagerGlobal( 1410): ignore the torch status update of camera: 22
    D/libMEOW ( 1410): applied 1 plugins for [com.example.livestream_haishin]:
    D/libMEOW ( 1410):   plugin 1: [libMEOW_gift.so]:
    W/CameraManagerGlobal( 1410): ignore the torch status update of camera: 61
    I/BufferQueueProducer( 1410): [SurfaceTexture-1-1410-2](id:58200000002,api:4,p:825,c:1410) disconnect(): api=4
    E/CameraManagerGlobal( 1410): Camera 61 is not available. Ignore physical camera status change
    
    opened by kamal-github 2
  • Camera Not Closed

    Camera Not Closed

    Describe the bug

    The camera is not closed when leaving and disposing the page (calling stream.close() has no effect on the referenced camera).

    The bug is in this file

    https://github.com/shogo4405/HaishinKit.dart/blob/main/android/src/main/kotlin/com/haishinkit/haishin_kit/RtmpStreamHandler.kt

    Ctrl+F for "camera": you can see that camera.open() is called, but close/dispose is never called on the camera.

    As a result, the camera is never released even after leaving/closing the page (the camera privacy indicator icon is still shown in the status bar).
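
    Until the Kotlin side releases the camera, a possible Dart-side mitigation (used in other snippets on this page, but not confirmed to release the camera given the issue described above) is to detach the video source before tearing the stream down:

    // Detach the camera from the stream before closing/disposing it.
    _stream?.attachVideo(null);
    _stream?.close();
    _stream?.dispose();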

    To Reproduce

    please check the source code in file

    https://github.com/shogo4405/HaishinKit.dart/blob/main/android/src/main/kotlin/com/haishinkit/haishin_kit/RtmpStreamHandler.kt

    Expected behavior

    close camera when stream.close() is called

    Version

    latest

    Smartphone info.

    No response

    Additional context

    No response

    Screenshots

    No response

    Relevant log output

    No response

    opened by kairan77 1
  • Can't Dispose camera when leaving the page.

    Can't Dispose camera when leaving the page.

    Describe the bug

    The camera is not disposed when leaving the page, and the camera preview shows black.

    To Reproduce

    import 'dart:async';

    import 'package:audio_session/audio_session.dart';
    import 'package:flutter/material.dart';
    import 'package:haishin_kit/audio_source.dart';
    import 'package:haishin_kit/net_stream_drawable_texture.dart';
    import 'package:haishin_kit/rtmp_connection.dart';
    import 'package:haishin_kit/rtmp_stream.dart';
    import 'package:haishin_kit/video_source.dart';
    import 'package:permission_handler/permission_handler.dart';

    class Stream extends StatefulWidget {
      const Stream({super.key});

      @override
      State createState() => _StreamState();
    }

    class _StreamState extends State {
      RtmpConnection? _connection;
      RtmpStream? _stream;
      bool _recording = false;
      CameraPosition currentPosition = CameraPosition.front;

      @override
      void initState() {
        initPlatformState();
        super.initState();
      }

      Future initPlatformState() async {
        await Permission.camera.request();
        await Permission.microphone.request();

        // Set up AVAudioSession for iOS.
        final session = await AudioSession.instance;
        await session.configure(const AudioSessionConfiguration(
          avAudioSessionCategory: AVAudioSessionCategory.playAndRecord,
          avAudioSessionCategoryOptions:
              AVAudioSessionCategoryOptions.allowBluetooth,
        ));

        RtmpConnection connection = await RtmpConnection.create();
        connection.eventChannel.receiveBroadcastStream().listen((event) {
          switch (event["data"]["code"]) {
            case 'NetConnection.Connect.Success':
              _stream?.publish("new");
              setState(() {
                _recording = true;
              });
              break;
          }
        });
        RtmpStream stream = await RtmpStream.create(connection);

        stream.attachAudio(AudioSource());
        stream.attachVideo(VideoSource(position: currentPosition));

        if (!mounted) return;

        setState(() {
          _connection = connection;
          _stream = stream;
        });
      }

      @override
      Future dispose() async {
        await _stream!.close();
        await _stream!.dispose();
        _connection!.close();
        _connection!.dispose();
        super.dispose();
      }

      @override
      Widget build(BuildContext context) {
        return Scaffold(
          appBar: AppBar(title: const Text('HaishinKit example app'), actions: [
            IconButton(
              icon: const Icon(Icons.flip_camera_android),
              onPressed: () {
                if (currentPosition == CameraPosition.front) {
                  currentPosition = CameraPosition.back;
                } else {
                  currentPosition = CameraPosition.front;
                }
                _stream?.attachVideo(VideoSource(position: currentPosition));
              },
            )
          ]),
          body: Center(
            child: _stream == null
                ? const Text("No Response")
                : NetStreamDrawableTexture(_stream),
          ),
          floatingActionButton: FloatingActionButton(
            child: _recording
                ? const Icon(Icons.fiber_smart_record)
                : const Icon(Icons.not_started),
            onPressed: () {
              if (_recording) {
                _connection?.close();
                setState(() {
                  _recording = false;
                });
              } else {
                _connection?.connect("rtmp://10.0.2.2:1935/app/i");
                // setState(() {
                //   _recording = true;
                // });
              }
            },
          ),
        );
      }
    }

    Expected behavior

    Dispose camera when leaving the page

    Version

    haishin_kit: ^0.9.1

    Smartphone info.

    Android emulator

    Additional context

    No response

    Screenshots

    No response

    Relevant log output

    D/EGL_emulation(18846): eglMakeCurrent: 0xa56f17e0: ver 3 0 (tinfo 0xa76ee670)
    D/EGL_emulation(18846): eglMakeCurrent: 0xa56f17e0: ver 3 0 (tinfo 0xa76ee670)
    D/EGL_emulation(18846): eglMakeCurrent: 0xa56f17e0: ver 3 0 (tinfo 0xa76ee670)
    ... (the same eglMakeCurrent line repeats for the remainder of the log)
    
    opened by Saleque474 0
  • Unsupported code error

    Unsupported code error

    Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Unsupported value for standard codec'
    
    Unsupported value: [] of type __SwiftValue
    *** Assertion failure in -[FlutterStandardWriter writeValue:], FlutterStandardCodec.mm:338
    *** Terminating app due to uncaught exception 'NSInternalInconsistencyException', reason: 'Unsupported value for standard codec'
    *** First throw call stack:
    (0x1b13c1288 0x1ca0bb744 0x1b2c4e360 0x10a673478 0x10a6735ac 0x10a6735ac 0x10a673b00 0x10a6712fc 0x105bc5c50 0x105bc5388 0x105bc5454 0x1b1352834 0x1b13eefd4 0x1b13c21d0 0x1b13688ac 0x1b2b37754 0x1065913bc 0x1065914c8 0x10665ccf0 0x106652038 0x106652644 0x106652644 0x106652788 0x10667efe0 0x10661afd0 0x10661bcac 0x10661c2f8 0x1b142c120 0x1b144917c 0x1b1026e6c 0x1b1028a30 0x1b1030124 0x1b1030c80 0x1b103b500 0x222c800bc 0x222c7fe5c)
    libc++abi: terminating with uncaught exception of type NSException
    * thread #34, queue = 'com.haishinkit.HaishinKit.NetSocket.input', stop reason = signal SIGABRT
        frame #0: 0x00000001e8f56b38 libsystem_kernel.dylib`__pthread_kill + 8
    libsystem_kernel.dylib`__pthread_kill:
    ->  0x1e8f56b38 <+8>:  b.lo   0x1e8f56b58               ; <+40>
        0x1e8f56b3c <+12>: pacibsp 
        0x1e8f56b40 <+16>: stp    x29, x30, [sp, #-0x10]!
        0x1e8f56b44 <+20>: mov    x29, sp
    Target 0: (Runner) stopped.
    
    opened by Hansrider 7
Releases: 0.9.2