In the earlier post AVFoundation拍照和录像, the AVCaptureMovieFileOutput class was used to capture QuickTime movies; that class offers only a simple, high-level way to record video data. When lower-level access to the captured video data is needed, AVCaptureVideoDataOutput is the class to use.
AVCaptureVideoDataOutputSampleBufferDelegate
Unlike the delegate callbacks of AVCaptureMovieFileOutput, AVCaptureVideoDataOutput delivers its frames through the AVCaptureVideoDataOutputSampleBufferDelegate protocol.
It defines the following two methods:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection;
- Called each time a new video frame is captured and output.

- (void)captureOutput:(AVCaptureOutput *)captureOutput didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection NS_AVAILABLE(10_7, 6_0);
- Called whenever a late video frame is dropped, typically because time-consuming work was performed inside didOutputSampleBuffer.
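If frames are being dropped, the second callback can report why. Below is a minimal sketch, assuming the delegate is registered as shown later; the drop reason is carried as a Core Media attachment on the sample buffer:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
  didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // The attachment explains the drop,
    // e.g. kCMSampleBufferDroppedFrameReason_FrameWasLate.
    CFStringRef reason = (CFStringRef)
        CMGetAttachment(sampleBuffer,
                        kCMSampleBufferAttachmentKey_DroppedFrameReason,
                        NULL);
    NSLog(@"Dropped frame. Reason: %@", (__bridge NSString *)reason);
}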
 
 
CMSampleBufferRef
CMSampleBufferRef wraps the underlying sample data, supplies its format and timing information, and also attaches all of the metadata used when converting and processing the data.
- Sample data: CVImageBufferRef
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
 
- Format information: CMFormatDescriptionRef
CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer); 
 
- Timing information: CMTime
CMTime timestamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
 
- Attached metadata
Retrieved with CMGetAttachment.
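For example, the Exif dictionary attached to a frame can be read as follows (kCGImagePropertyExifDictionary comes from ImageIO; whether a given frame carries this attachment depends on the device and capture configuration):

#import <ImageIO/ImageIO.h>

// Read the Exif attachment from a captured video frame, if present.
CFDictionaryRef exifAttachments = (CFDictionaryRef)
    CMGetAttachment(sampleBuffer, kCGImagePropertyExifDictionary, NULL);
if (exifAttachments) {
    NSLog(@"Exif metadata: %@", (__bridge NSDictionary *)exifAttachments);
}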
AVCaptureVideoDataOutput
Configuration
- (BOOL)setupSessionOutputs:(NSError **)error {
    self.videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];

    // Drop frames that arrive late instead of queuing them up.
    self.videoDataOutput.alwaysDiscardsLateVideoFrames = YES;

    // Request BGRA pixel buffers, a format that is convenient
    // for both Core Graphics and OpenGL ES.
    self.videoDataOutput.videoSettings =
        @{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};

    // The callback queue must be serial; the main queue is used here for simplicity.
    [self.videoDataOutput setSampleBufferDelegate:self
                                            queue:dispatch_get_main_queue()];

    if ([self.captureSession canAddOutput:self.videoDataOutput]) {
        [self.captureSession addOutput:self.videoDataOutput];
        return YES;
    }
    return NO;
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // Process each frame here.
}
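Delivering frames on the main queue, as above, is only suitable for lightweight work. For heavier per-frame processing, a dedicated serial queue keeps the callbacks off the main thread; a minimal sketch (the queue label is arbitrary):

dispatch_queue_t videoQueue =
    dispatch_queue_create("com.example.video-data", DISPATCH_QUEUE_SERIAL);
[self.videoDataOutput setSampleBufferDelegate:self queue:videoQueue];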
 
Processing the data with OpenGL ES: mapping the video onto a cube
OpenGL ES references
Apple guide - About OpenGL ES
OpenGL Tutorial for iOS: OpenGL ES 2.0
OpenGL ES 2.0 iOS tutorial (in Chinese)
Creating the texture cache with CVOpenGLESTextureCacheCreate
#import "THCameraController.h"
#import <AVFoundation/AVFoundation.h>
#import <OpenGLES/ES2/gl.h>
#import <OpenGLES/ES2/glext.h>

@interface THCameraController () <AVCaptureVideoDataOutputSampleBufferDelegate>

@property (weak, nonatomic) EAGLContext *context;
@property (strong, nonatomic) AVCaptureVideoDataOutput *videoDataOutput;

@property (nonatomic) CVOpenGLESTextureCacheRef textureCache;
@property (nonatomic) CVOpenGLESTextureRef cameraTexture;

@end

@implementation THCameraController

- (instancetype)initWithContext:(EAGLContext *)context {
    self = [super init];
    if (self) {
        _context = context;

        // Create a texture cache tied to the EAGLContext so that camera
        // pixel buffers can be exposed as OpenGL ES textures without copying.
        CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault,
                                                    NULL,
                                                    _context,
                                                    NULL,
                                                    &_textureCache);
        if (err != kCVReturnSuccess) {
            NSLog(@"Error creating texture cache. %d", err);
        }
    }
    return self;
}
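Creating the controller might then look like the following minimal sketch (the GLKView/renderer setup around the context is omitted):

EAGLContext *context =
    [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];
THCameraController *cameraController =
    [[THCameraController alloc] initWithContext:context];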
 
Creating the OpenGL ES texture
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {

    CVReturn err;

    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    CMFormatDescriptionRef formatDescription =
        CMSampleBufferGetFormatDescription(sampleBuffer);
    CMVideoDimensions dimensions =
        CMVideoFormatDescriptionGetDimensions(formatDescription);

    // The height is passed for both width and height, presumably to get a
    // square texture that maps cleanly onto the faces of the cube.
    err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                       _textureCache,
                                                       pixelBuffer,
                                                       NULL,
                                                       GL_TEXTURE_2D,
                                                       GL_RGBA,
                                                       dimensions.height,
                                                       dimensions.height,
                                                       GL_BGRA,
                                                       GL_UNSIGNED_BYTE,
                                                       0,
                                                       &_cameraTexture);

    if (!err) {
        GLenum target = CVOpenGLESTextureGetTarget(_cameraTexture);
        GLuint name = CVOpenGLESTextureGetName(_cameraTexture);
        // Hand the texture to the delegate so the renderer can draw with it.
        [self.textureDelegate textureCreatedWithTarget:target name:name];
    } else {
        NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
    }

    // Release the texture and flush the cache so pixel buffers can be recycled.
    [self cleanupTextures];
}

- (void)cleanupTextures {
    if (_cameraTexture) {
        CFRelease(_cameraTexture);
        _cameraTexture = NULL;
    }
    CVOpenGLESTextureCacheFlush(_textureCache, 0);
}
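On the rendering side, a textureDelegate implementation (the protocol is assumed to be declared elsewhere in the project) would bind the texture before drawing the cube. Because CVOpenGLESTexture textures are generally non-power-of-two, OpenGL ES 2.0 requires clamp-to-edge wrapping; a hedged sketch:

- (void)textureCreatedWithTarget:(GLenum)target name:(GLuint)name {
    // Bind the camera texture produced by the capture controller.
    glBindTexture(target, name);
    glTexParameteri(target, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(target, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    // Non-power-of-two textures must use clamp-to-edge in ES 2.0.
    glTexParameteri(target, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(target, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    // ... issue the draw calls for the cube here ...
}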