Introduction to AVAssetWriter
AVAssetWriter can be used to re-encode media samples.
Only one AVAssetWriter may write to a given video file, so every output file needs its own new AVAssetWriter instance.
Initializing AVAssetWriter
Initialize the AVAssetWriter with the output file's URL and specify the file type.
```objc
NSError *error;
_mAssetWriter = [[AVAssetWriter alloc] initWithURL:videoUrl
                                          fileType:AVFileTypeAppleM4V
                                             error:&error];
```
Configuring the AVAssetWriter input
Before writing, an input must be configured. As with AVAssetReader's outputs, an AVAssetWriterInput can be created with a media type of AVMediaTypeAudio or AVMediaTypeVideo; the examples below use AVMediaTypeVideo.
When creating the input you can supply output settings, which mainly contain video parameters.
The value of AVVideoCompressionPropertiesKey holds encoder-related properties, for example:
- AVVideoAverageBitRateKey: average bit rate, computed as video dimensions × rate; a rate of 10.1 corresponds to AVCaptureSessionPresetHigh. The larger the value, the finer the picture (H.264 only).
- AVVideoMaxKeyFrameIntervalKey: maximum key-frame interval. With a value of 1 every frame is a key frame; the larger the value, the higher the compression ratio (H.264 only).
- AVVideoProfileLevelKey: profile (quality) level; which levels are available depends on the device.
  a. BP (Baseline Profile): basic quality. Supports I/P frames; progressive scan and CAVLC only;
  b. EP (Extended Profile): extended quality. Supports I/P/B/SP/SI frames; progressive scan and CAVLC only;
  c. MP (Main Profile): mainstream quality. Provides I/P/B frames; supports both progressive and interlaced scan, and both CAVLC and CABAC;
  d. HP (High Profile): high quality. Adds 8×8 intra prediction, custom quantization, lossless video coding, and more YUV formats on top of Main Profile.
- AVVideoCodecKey: the video codec, set here to H.264.
- AVVideoWidthKey, AVVideoHeightKey: the video's width and height.
For more settings, see the documentation: Video Settings | Apple Developer Documentation

```objc
NSDictionary *codec_settings = @{AVVideoAverageBitRateKey: @(_bitRate)};
NSDictionary *video_settings = @{AVVideoCodecKey: AVVideoCodecH264,
                                 AVVideoCompressionPropertiesKey: codec_settings,
                                 AVVideoWidthKey: @(1920),
                                 AVVideoHeightKey: @(1080)};
_mAssetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                        outputSettings:video_settings];
```
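As a worked example of the dimensions-times-rate rule described above (the 10.1 factor is the one quoted for AVCaptureSessionPresetHigh; the `bitRate` variable name is just for illustration):

```objc
// 1920 * 1080 * 10.1 ≈ 20.9 Mbps average bit rate for 1080p at "high" quality
double bitRate = 1920 * 1080 * 10.1;
NSDictionary *codec_settings = @{AVVideoAverageBitRateKey: @(bitRate)};
```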
An AVAssetWriterInputPixelBufferAdaptor can also be attached to the AVAssetWriterInput so that it accepts CVPixelBuffers.
AVAssetWriterInputPixelBufferAdaptor provides a CVPixelBufferPoolRef, which you can use to allocate the pixel buffers that are written to the output file. As the documentation notes, allocating buffers from this provided pool is usually more efficient than appending pixel buffers allocated with a separate pool.
When initializing the adaptor you can pass related attributes, such as the CVPixelBuffer's pixel format and how memory is shared between CPU and GPU.
A CVPixelBuffer can then be created from the buffer pool provided by the AVAssetWriterInputPixelBufferAdaptor.
CVOpenGLESTextureCacheRef creates a cache dedicated to textures, so that each time texture pixel data is handed to the GPU, the memory in this cache is reused directly; this avoids repeated allocation and improves efficiency.

```objc
NSMutableDictionary *attributes = [NSMutableDictionary dictionary];
attributes[(NSString *)kCVPixelBufferPixelFormatTypeKey] = @(kCVPixelFormatType_32BGRA);
NSDictionary *IOSurface_properties = @{@"IOSurfaceOpenGLESFBOCompatibility": @YES,
                                       @"IOSurfaceOpenGLESTextureCompatibility": @YES};
attributes[(NSString *)kCVPixelBufferIOSurfacePropertiesKey] = IOSurface_properties;
_mAssetWriterPixelBufferInput =
    [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:_mAssetWriterInput
                                                                     sourcePixelBufferAttributes:attributes];

CVPixelBufferRef renderTarget;
CVOpenGLESTextureCacheRef videoTextureCache = NULL;
CVReturn err;
if (videoTextureCache == NULL) {
    err = CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, NULL,
                                       [EAGLContext currentContext], NULL,
                                       &videoTextureCache);
    if (err) {
        // handle the error
    }
}
err = CVPixelBufferPoolCreatePixelBuffer(NULL,
                                         [_mAssetWriterPixelBufferInput pixelBufferPool],
                                         &renderTarget);
if (err) {
    // handle the error
}
// Attach color-format metadata to the CVPixelBuffer for color conversion
CVBufferSetAttachment(renderTarget, kCVImageBufferColorPrimariesKey,
                      kCVImageBufferColorPrimaries_ITU_R_709_2, kCVAttachmentMode_ShouldPropagate);
CVBufferSetAttachment(renderTarget, kCVImageBufferYCbCrMatrixKey,
                      kCVImageBufferYCbCrMatrix_ITU_R_601_4, kCVAttachmentMode_ShouldPropagate);
CVBufferSetAttachment(renderTarget, kCVImageBufferTransferFunctionKey,
                      kCVImageBufferTransferFunction_ITU_R_709_2, kCVAttachmentMode_ShouldPropagate);
```
Creating an OpenGL texture from the CVPixelBuffer hands the pixel data in renderTarget to OpenGL; whatever is drawn onto that texture can then be encoded into the file.
```objc
CVOpenGLESTextureRef renderTexture;
err = CVOpenGLESTextureCacheCreateTextureFromImage(kCFAllocatorDefault,
                                                   videoTextureCache,
                                                   renderTarget,
                                                   NULL,
                                                   GL_TEXTURE_2D,
                                                   GL_RGBA,
                                                   1920, 1080,
                                                   GL_BGRA,
                                                   GL_UNSIGNED_BYTE,
                                                   0,
                                                   &renderTexture);
```
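To actually draw into renderTexture, it is typically attached to a framebuffer object as the color attachment; a minimal sketch (the `framebuffer` variable is an assumption, and error checks and draw calls are omitted):

```objc
GLuint framebuffer;
glGenFramebuffers(1, &framebuffer);
glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);
// Bind the texture backed by renderTarget and attach it to the FBO
glBindTexture(CVOpenGLESTextureGetTarget(renderTexture),
              CVOpenGLESTextureGetName(renderTexture));
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_TEXTURE_2D,
                       CVOpenGLESTextureGetName(renderTexture), 0);
// ... issue draw calls here; the rendered pixels land in renderTarget ...
```

Because the texture and renderTarget share the same IOSurface-backed memory, no glReadPixels copy is needed before appending the buffer to the writer.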
Configure the inputs before writing, then call startWriting.
```objc
if ([_mAssetWriter canAddInput:_mAssetWriterInput]) {
    [_mAssetWriter addInput:_mAssetWriterInput];
}
[_mAssetWriter startWriting];
[_mAssetWriter startSessionAtSourceTime:kCMTimeZero];
```
Writing the data
Here the sampleBuffers read by an AVAssetReader serve as the data source for writing. There are quite a few error cases to handle; pay close attention to the writer's status.
Code example:

```objc
// Check whether the input is ready to accept more data
while (_mAssetWriterInput.isReadyForMoreMediaData) {
    CMSampleBufferRef sampleBuffer = [output copyNextSampleBuffer];
    if (sampleBuffer) {
        BOOL error = NO;
        if (_reader.status != AVAssetReaderStatusReading ||
            _writer.status != AVAssetWriterStatusWriting) {
            error = YES;
        }
        if (_videoOutput == output) {
            // Update the video progress
            _lastSamplePresentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
            CVPixelBufferRef pixelBuffer = (CVPixelBufferRef)CMSampleBufferGetImageBuffer(sampleBuffer);
            if (![_mAssetWriterPixelBufferInput appendPixelBuffer:pixelBuffer
                                             withPresentationTime:_lastSamplePresentationTime]) {
                error = YES;
            }
            dispatch_async(dispatch_get_main_queue(), ^{
                _progress(CMTimeGetSeconds(_lastSamplePresentationTime) / _duration * 0.8);
            });
        }
        CFRelease(sampleBuffer); // copyNextSampleBuffer returns a retained buffer
        if (error) {
            return NO;
        }
    } else {
        // No more data: mark the input as finished
        [_mAssetWriterInput markAsFinished];
        return NO;
    }
}
```
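Marking the input as finished does not by itself produce a playable file; the writer session must also be closed with finishWritingWithCompletionHandler:. A minimal sketch of the finishing step, assuming the same _mAssetWriter and _mAssetWriterInput ivars as above (how the result is reported back is an assumption):

```objc
[_mAssetWriterInput markAsFinished];
[_mAssetWriter finishWritingWithCompletionHandler:^{
    switch (_mAssetWriter.status) {
        case AVAssetWriterStatusCompleted:
            // The output file is fully written and playable.
            break;
        case AVAssetWriterStatusFailed:
            NSLog(@"writing failed: %@", _mAssetWriter.error);
            break;
        default:
            break;
    }
}];
```

In practice the copy loop above is usually driven by the input itself via requestMediaDataWhenReadyOnQueue:usingBlock:, which re-invokes its block on a serial queue whenever isReadyForMoreMediaData becomes YES again.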