iOS – How do I output a CIFilter to the camera view?


I'm just getting started with Objective-C, and I'm trying to build a simple app that displays the camera view with a blur effect applied. I have the camera output working with the AVFoundation framework. Now I'm trying to hook up the Core Image framework, but I don't know how: Apple's documentation is confusing me, and searching online for guides and tutorials has turned up nothing. Thanks in advance for your help.
#import "ViewController.h"
#import <AVFoundation/AVFoundation.h>

@interface ViewController ()
@property (strong, nonatomic) CIContext *context;
@end

@implementation ViewController

AVCaptureSession *session;
AVCaptureStillImageOutput *stillImageOutput;

- (CIContext *)context
{
    if (!_context) {
        _context = [CIContext contextWithOptions:nil];
    }
    return _context;
}

- (void)viewDidLoad {
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
}

- (void)viewWillAppear:(BOOL)animated
{
    session = [[AVCaptureSession alloc] init];
    [session setSessionPreset:AVCaptureSessionPresetPhoto];

    AVCaptureDevice *inputDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error;
    AVCaptureDeviceInput *deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:inputDevice error:&error];
    if ([session canAddInput:deviceInput]) {
        [session addInput:deviceInput];
    }

    AVCaptureVideoPreviewLayer *previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:session];
    [previewLayer setVideoGravity:AVLayerVideoGravityResizeAspectFill];
    CALayer *rootLayer = [[self view] layer];
    [rootLayer setMasksToBounds:YES];
    CGRect frame = self.imageView.frame;
    [previewLayer setFrame:frame];
    [previewLayer.connection setVideoOrientation:AVCaptureVideoOrientationLandscapeRight];
    [rootLayer insertSublayer:previewLayer atIndex:0];

    stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *outputSettings = [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [stillImageOutput setOutputSettings:outputSettings];
    [session addOutput:stillImageOutput];

    [session startRunning];
}

@end
Solution

Here is something to get you started. It's an updated version of the code from the link below.
https://gist.github.com/eladb/9662102

The trick is to use an AVCaptureVideoDataOutputSampleBufferDelegate.
With this delegate, you can build a CIImage from the camera's pixel buffer using imageWithCVPixelBuffer:.

Right now, though, I'm trying to figure out how to reduce the lag. I'll update as soon as I can.

Update: latency is now minimal, and imperceptible with some effects. Unfortunately, blur seems to be one of the slowest. You may want to look into vImage.
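One common lever on that latency (an assumption on my part, not something from the original answer): the code below creates its CIContext with kCIContextUseSoftwareRenderer set to YES, which renders every frame on the CPU. A GPU-backed context built from an EAGLContext is typically much faster for per-frame work. A minimal sketch:

```objc
// Sketch (assumption, not part of the original answer): a GPU-backed
// Core Image context. contextWithEAGLContext: renders on the GPU instead
// of the CPU software renderer, which usually reduces per-frame lag.
EAGLContext *eaglContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
CIContext *gpuContext = [CIContext contextWithEAGLContext:eaglContext];
```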

#import "ViewController.h"
#import <CoreImage/CoreImage.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController () <AVCaptureVideoDataOutputSampleBufferDelegate>
@property (strong, nonatomic) CIContext *coreImageContext;
@property (strong, nonatomic) AVCaptureSession *cameraSession;
@property (strong, nonatomic) AVCaptureVideoDataOutput *videoOutput;
@property (strong, nonatomic) UIView *blurCameraView;
@property (strong, nonatomic) CIFilter *filter;
@property BOOL cameraOpen;
@end

@implementation ViewController

- (void)viewDidLoad {
    [super viewDidLoad];
    self.blurCameraView = [[UIView alloc] initWithFrame:[[UIScreen mainScreen] bounds]];
    [self.view addSubview:self.blurCameraView];

    // Set up the blur filter.
    self.filter = [CIFilter filterWithName:@"CIGaussianBlur"];
    [self.filter setDefaults];
    [self.filter setValue:@(3.0f) forKey:@"inputRadius"];

    [self setupCamera];
    [self openCamera];
    // Do any additional setup after loading the view, typically from a nib.
}

- (void)didReceiveMemoryWarning {
    [super didReceiveMemoryWarning];
    // Dispose of any resources that can be recreated.
}

- (void)setupCamera
{
    self.coreImageContext = [CIContext contextWithOptions:@{kCIContextUseSoftwareRenderer : @(YES)}];

    // Session
    self.cameraSession = [[AVCaptureSession alloc] init];
    [self.cameraSession setSessionPreset:AVCaptureSessionPresetLow];
    [self.cameraSession commitConfiguration];

    // Input
    AVCaptureDevice *shootingCamera = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *shootingDevice = [AVCaptureDeviceInput deviceInputWithDevice:shootingCamera error:NULL];
    if ([self.cameraSession canAddInput:shootingDevice]) {
        [self.cameraSession addInput:shootingDevice];
    }

    // Video output
    self.videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    self.videoOutput.alwaysDiscardsLateVideoFrames = YES;
    [self.videoOutput setSampleBufferDelegate:self queue:dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_HIGH, 0)];
    if ([self.cameraSession canAddOutput:self.videoOutput]) {
        [self.cameraSession addOutput:self.videoOutput];
    }

    if (self.videoOutput.connections.count > 0) {
        AVCaptureConnection *connection = self.videoOutput.connections[0];
        connection.videoOrientation = AVCaptureVideoOrientationPortrait;
    }

    self.cameraOpen = NO;
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    // Get the sample buffer's Core Video image buffer for the media data.
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Turn the buffer into an image we can manipulate.
    CIImage *result = [CIImage imageWithCVPixelBuffer:imageBuffer];

    // Filter.
    [self.filter setValue:result forKey:@"inputImage"];

    // Render the filtered image, then hand it to the main queue for display.
    CGImageRef blurredImage = [self.coreImageContext createCGImage:self.filter.outputImage fromRect:result.extent];
    dispatch_async(dispatch_get_main_queue(), ^{
        self.blurCameraView.layer.contents = (__bridge id)blurredImage;
        CGImageRelease(blurredImage);
    });
}

- (void)openCamera {
    if (self.cameraOpen) {
        return;
    }
    self.blurCameraView.alpha = 0.0f;
    [self.cameraSession startRunning];
    [self.view layoutIfNeeded];
    [UIView animateWithDuration:3.0f animations:^{
        self.blurCameraView.alpha = 1.0f;
    }];
    self.cameraOpen = YES;
}

@end
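One property of this setup worth noting: because the delegate reads the filter's values on every frame, the blur strength can be changed while the session is running and the next frame will pick it up. A hypothetical usage example (not in the original answer):

```objc
// Hypothetical example: tighten the blur between frames.
// kCIInputRadiusKey is the Core Image constant for the same "inputRadius"
// key the filter was configured with in viewDidLoad.
[self.filter setValue:@(8.0f) forKey:kCIInputRadiusKey];
```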