AVCaptureVideoDataOutputSampleBufferDelegate not triggering captureOutput

Question · 0 votes · 1 answer

I'm trying to build a card scanner app using iOS's Vision framework. I can open the camera successfully, but the problem is that AVCaptureVideoDataOutputSampleBufferDelegate never triggers the captureOutput function, which is where I plan to run the analysis that recognizes the card.

Here is my .h file:

#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>

@interface Trying : NSObject <AVCaptureVideoDataOutputSampleBufferDelegate>
@property (nonatomic, strong) AVCaptureSession *captureSession;
@property (nonatomic, strong) AVCaptureVideoPreviewLayer *previewLayer;
@property (nonatomic, strong) dispatch_queue_t videoDataOutputQueue;
@property (nonatomic, strong) UIView *view;
- (void)setupCamera;
@end

and the .m file:

#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>
#import "Trying.h"

@interface Trying ()

@end

@implementation Trying

- (void)setupCamera {
    self.captureSession = [[AVCaptureSession alloc] init];
    
    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    
    NSError *error;
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&error];
    
    if (deviceInput) {
        [self.captureSession addInput:deviceInput];
        
        AVCaptureVideoPreviewLayer *previewLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
        previewLayer.frame = self.view.bounds;
        previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        [self.view.layer addSublayer:previewLayer];
        
        self.previewLayer = previewLayer;
        
        // Create a video data output and set the delegate
        self.videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
        AVCaptureVideoDataOutput *videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
        [videoDataOutput setSampleBufferDelegate:self queue:self.videoDataOutputQueue];
        [self.captureSession addOutput:videoDataOutput];
        
        // Start the session
        [self.captureSession startRunning];
    } else {
        NSLog(@"Error setting up camera: %@", error.localizedDescription);
    }
}

- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // Process the live sample buffer here
    NSLog(@"Got a live sample buffer");
    
    // You can use the sampleBuffer for further processing, such as image recognition or analysis
}

- (void)captureOutput:(AVCaptureOutput *)output didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    NSLog(@"Dropped a sample buffer");
    // Handle dropped frames if needed
}
@end
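
Once captureOutput does fire, the plan is to hand each frame to Vision for text recognition. A rough sketch of that step, for context only (the recognizeCardTextInSampleBuffer helper and the use of VNRecognizeTextRequest here are just an illustration of the idea, not code I have verified in this project), could look like:

#import <Vision/Vision.h>

- (void)recognizeCardTextInSampleBuffer:(CMSampleBufferRef)sampleBuffer {
    // Pull the pixel buffer out of the sample buffer delivered by the video data output.
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    if (pixelBuffer == NULL) {
        return;
    }

    // Ask Vision to recognize text in the frame (card number, name, etc.).
    VNRecognizeTextRequest *textRequest = [[VNRecognizeTextRequest alloc] initWithCompletionHandler:^(VNRequest *request, NSError *error) {
        for (VNRecognizedTextObservation *observation in request.results) {
            VNRecognizedText *topCandidate = [[observation topCandidates:1] firstObject];
            NSLog(@"Recognized text: %@", topCandidate.string);
        }
    }];

    VNImageRequestHandler *handler = [[VNImageRequestHandler alloc] initWithCVPixelBuffer:pixelBuffer options:@{}];
    NSError *error = nil;
    if (![handler performRequests:@[textRequest] error:&error]) {
        NSLog(@"Vision request failed: %@", error.localizedDescription);
    }
}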

Can anyone help explain why captureOutput is not being triggered for the AVCaptureVideoDataOutputSampleBufferDelegate? I can see the live camera feed in the view, but captureOutput is never called.

ios objective-c avcapturesession
1 Answer

0 votes

You need to hold on to the output object to keep it from being deallocated:

1. @property (nonatomic, strong) AVCaptureVideoDataOutput *cameraOutput;
2. _cameraOutput = videoDataOutput;

It is also recommended to keep an AVCaptureConnection object so you can tell the connections apart in the AVCaptureVideoDataOutputSampleBufferDelegate callbacks:

3. AVCaptureConnection *_videoConnection;
4. _videoConnection = [_cameraOutput connectionWithMediaType:AVMediaTypeVideo];
5. Then in didOutputSampleBuffer, if (_videoConnection == connection), you can use the sampleBuffer.
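
Putting those points together with the original setupCamera, a rough sketch of the adjusted .m might look like this (preview-layer setup omitted for brevity; cameraOutput and _videoConnection are just the names suggested above, and this is an illustration rather than tested code):

#import <AVFoundation/AVFoundation.h>
#import "Trying.h"

@interface Trying ()
// Strong reference keeps the output alive for the lifetime of the object (points 1 and 2).
@property (nonatomic, strong) AVCaptureVideoDataOutput *cameraOutput;
@end

@implementation Trying {
    // Remember the video connection so the delegate can recognize it (points 3 and 4).
    AVCaptureConnection *_videoConnection;
}

- (void)setupCamera {
    self.captureSession = [[AVCaptureSession alloc] init];

    AVCaptureDevice *captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error;
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:captureDevice error:&error];
    if (!deviceInput) {
        NSLog(@"Error setting up camera: %@", error.localizedDescription);
        return;
    }
    [self.captureSession addInput:deviceInput];

    // ... preview layer setup unchanged from the question ...

    self.videoDataOutputQueue = dispatch_queue_create("VideoDataOutputQueue", DISPATCH_QUEUE_SERIAL);
    self.cameraOutput = [[AVCaptureVideoDataOutput alloc] init];
    [self.cameraOutput setSampleBufferDelegate:self queue:self.videoDataOutputQueue];
    [self.captureSession addOutput:self.cameraOutput];

    // The connection only exists after the output has been added to the session.
    _videoConnection = [self.cameraOutput connectionWithMediaType:AVMediaTypeVideo];

    [self.captureSession startRunning];
}

- (void)captureOutput:(AVCaptureOutput *)output didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // Only process frames that arrive over the video connection (point 5).
    if (connection == _videoConnection) {
        NSLog(@"Got a live sample buffer");
        // Hand sampleBuffer to the card-recognition code here.
    }
}

@end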
