AVFoundation - Video Processing

In the earlier post AVFoundation拍照和录像 (photo and video capture), the AVCaptureMovieFileOutput class was used to capture QuickTime movies. That class offers a simple, high-level way to record video data; when you need lower-level access to the captured video frames, you have to use AVCaptureVideoDataOutput.

AVCaptureVideoDataOutputSampleBufferDelegate

Unlike AVCaptureMovieFileOutput's delegate callbacks, AVCaptureVideoDataOutput delivers its callbacks through the AVCaptureVideoDataOutputSampleBufferDelegate protocol.

It defines the following two methods:

  • - (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection;
    • Called every time a new video frame is output.
  • - (void)captureOutput:(AVCaptureOutput *)captureOutput didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection NS_AVAILABLE(10_7, 6_0);
    • Called every time a late video frame is dropped, typically because time-consuming work is being done inside captureOutput:didOutputSampleBuffer:fromConnection:.
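
As a sketch of the second callback, the reason a frame was dropped is attached to the buffer as metadata and can be logged (the attachment key is a Core Media constant):

```objc
- (void)captureOutput:(AVCaptureOutput *)captureOutput
  didDropSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // The dropped buffer carries the reason as an attachment,
    // e.g. "FrameWasLate" when the delegate callback took too long.
    CFStringRef reason =
        CMGetAttachment(sampleBuffer,
                        kCMSampleBufferAttachmentKey_DroppedFrameReason,
                        NULL);
    NSLog(@"Dropped frame, reason: %@", (__bridge NSString *)reason);
}
```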

CMSampleBufferRef

CMSampleBufferRef wraps the raw sample data together with its format and timing information, and also attaches any metadata needed when converting and processing the data.

  • Sample data: CVImageBufferRef
    • CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
  • Format information: CMFormatDescriptionRef
    • CMFormatDescriptionRef formatDescription = CMSampleBufferGetFormatDescription(sampleBuffer);
  • Timing information: CMTime
  • Attached metadata: retrieved with CMGetAttachment
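
A minimal sketch of pulling the timing and metadata pieces out of a sample buffer. The Exif key comes from ImageIO (it requires importing <ImageIO/ImageIO.h>, and whether the camera attaches it may vary by iOS version):

```objc
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    // Timing information: when this frame should be presented.
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    NSLog(@"Frame at %.3fs", CMTimeGetSeconds(pts));

    // Attached metadata, e.g. the Exif dictionary the camera adds.
    CFDictionaryRef exif = (CFDictionaryRef)
        CMGetAttachment(sampleBuffer,
                        (CFStringRef)kCGImagePropertyExifDictionary,
                        NULL);
    NSLog(@"Exif: %@", (__bridge NSDictionary *)exif);
}
```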

AVCaptureVideoDataOutput

Configuration

- (BOOL)setupSessionOutputs:(NSError **)error {

    self.videoDataOutput = [[AVCaptureVideoDataOutput alloc] init];
    self.videoDataOutput.alwaysDiscardsLateVideoFrames = YES;
    // The camera's native format is chroma-subsampled 420v,
    // but OpenGL ES generally works with BGRA.
    self.videoDataOutput.videoSettings =
        @{(id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA)};

    [self.videoDataOutput setSampleBufferDelegate:self
                                            queue:dispatch_get_main_queue()];

    if ([self.captureSession canAddOutput:self.videoDataOutput]) {
        [self.captureSession addOutput:self.videoDataOutput];
        return YES;
    }
    return NO;
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
}
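
To do anything with the frame data on the CPU, the delegate body might lock the pixel buffer and read its BGRA bytes; a sketch, assuming the BGRA video settings configured above:

```objc
CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);

size_t width       = CVPixelBufferGetWidth(pixelBuffer);
size_t height      = CVPixelBufferGetHeight(pixelBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer);
uint8_t *baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer);

// ... read BGRA pixels here. Keep this work fast: if the callback
// runs long, subsequent frames will be dropped.
NSLog(@"%zux%zu frame, %zu bytes per row", width, height, bytesPerRow);

CVPixelBufferUnlockBaseAddress(pixelBuffer, kCVPixelBufferLock_ReadOnly);
```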

Processing the Data with OpenGL ES: Mapping Video onto a Cube

OpenGL ES references:
Apple guide - About OpenGl ES
OpenGL Tutorial for iOS: OpenGL ES 2.0
OpenGL ES 2.0 iOS教程

Creating a Texture Cache with CVOpenGLESTextureCacheCreate

#import "THCameraController.h"
#import <AVFoundation/AVFoundation.h>
#import <OpenGLES/ES2/gl.h>
#import <OpenGLES/ES2/glext.h>

@interface THCameraController () <AVCaptureVideoDataOutputSampleBufferDelegate>

@property (weak, nonatomic) EAGLContext *context;
@property (strong, nonatomic) AVCaptureVideoDataOutput *videoDataOutput;
// Core Video provides CVOpenGLESTextureCacheRef as a bridge between
// Core Video pixel buffers and OpenGL ES textures.
// The cache exists to reduce the cost of moving data between the CPU and GPU.
@property (nonatomic) CVOpenGLESTextureCacheRef textureCache;
@property (nonatomic) CVOpenGLESTextureRef cameraTexture;

@end

@implementation THCameraController

- (instancetype)initWithContext:(EAGLContext *)context {
    self = [super init];
    if (self) {
        _context = context;
        // The key arguments are the backing EAGLContext and the
        // textureCache pointer to fill in.
        CVReturn err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault,
                                                    NULL,
                                                    _context,
                                                    NULL,
                                                    &_textureCache);
        if (err != kCVReturnSuccess) {
            NSLog(@"Error creating texture cache. %d", err);
        }
    }
    return self;
}

Creating the OpenGL ES Texture


- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {

    CVReturn err;
    // Get the pixel data from the sample buffer.
    CVImageBufferRef pixelBuffer =                                          // 1
        CMSampleBufferGetImageBuffer(sampleBuffer);
    // Get the frame dimensions; this returns a CMVideoDimensions
    // struct holding the width and height.
    CMFormatDescriptionRef formatDescription =                              // 2
        CMSampleBufferGetFormatDescription(sampleBuffer);
    CMVideoDimensions dimensions =
        CMVideoFormatDescriptionGetDimensions(formatDescription);

    // Create an OpenGL ES texture directly from the pixel buffer.
    err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault, // 3
                                                       _textureCache,
                                                       pixelBuffer,
                                                       NULL,
                                                       GL_TEXTURE_2D,
                                                       GL_RGBA,
                                                       dimensions.width,
                                                       dimensions.height,
                                                       GL_BGRA,
                                                       GL_UNSIGNED_BYTE,
                                                       0,
                                                       &_cameraTexture);
    // The GLenum target and GLuint name are what the renderer needs
    // to bind this texture to the faces of the spinning cube.
    if (!err) {
        GLenum target = CVOpenGLESTextureGetTarget(_cameraTexture);         // 4
        GLuint name = CVOpenGLESTextureGetName(_cameraTexture);
        [self.textureDelegate textureCreatedWithTarget:target name:name];   // 5
    } else {
        NSLog(@"Error at CVOpenGLESTextureCacheCreateTextureFromImage %d", err);
    }

    [self cleanupTextures];
}

// Release the texture and flush the texture cache.
- (void)cleanupTextures {                                                   // 6
    if (_cameraTexture) {
        CFRelease(_cameraTexture);
        _cameraTexture = NULL;
    }
    CVOpenGLESTextureCacheFlush(_textureCache, 0);
}