I have a routine in my app that captures MTLTextures into a movie via AVAssetWriter. When I run the app on a device from Xcode, memory usage stays stable and the movie is created and written to disk. However, when I release the app on the App Store and download it from there, the app crashes shortly after creating the movie, regardless of the movie's size. Running the app through Instruments shows a huge memory allocation occurring in the routine that extracts a pixel buffer from each MTLTexture; this is where the crash happens. The routine runs in a loop over every MTLTexture I want to record:
func AVAssetWriterEncodeFrame(forTexture texture: MTLTexture) {
    while !assetWriterVideoInput!.isReadyForMoreMediaData {
    } // hang out here until isReadyForMoreMediaData == true

    autoreleasepool {
        let fps: Int32 = Int32(Constants.movieFPS)
        guard let pixelBufferPool = assetWriterPixelBufferInput!.pixelBufferPool else {
            print("[MovieMakerVC]: Pixel buffer asset writer input did not have a pixel buffer pool available; cannot retrieve frame")
            return
        }

        var maybePixelBuffer: CVPixelBuffer? = nil
        let status = CVPixelBufferPoolCreatePixelBuffer(nil, pixelBufferPool, &maybePixelBuffer)
        if status != kCVReturnSuccess {
            print("[MovieMakerVC]: Could not get pixel buffer from asset writer input; dropping frame...")
            return
        }
        guard let pixelBuffer = maybePixelBuffer else { return }

        CVPixelBufferLockBaseAddress(pixelBuffer, [])
        let pixelBufferBytes = CVPixelBufferGetBaseAddress(pixelBuffer)!
        // Use the bytes-per-row value from the pixel buffer, since its stride may be rounded up to a 16-byte alignment
        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
        let region = MTLRegionMake2D(0, 0, texture.width, texture.height)
        texture.getBytes(pixelBufferBytes, bytesPerRow: bytesPerRow, from: region, mipmapLevel: 0)

        let presentationTime = CMTimeMake(value: Int64(frameIndexForPresentationTime), timescale: fps)
        assetWriterPixelBufferInput!.append(pixelBuffer, withPresentationTime: presentationTime)
        CVPixelBufferUnlockBaseAddress(pixelBuffer, [])
        frameIndexForPresentationTime += frameHoldLength // set up for next frame
    } // end of autoreleasepool
} // end of func AVAssetWriterEncodeFrame(forTexture:)
The spike seen in the graph grows uncontrollably after the movie has been written to disk. The allocation list shows thousands of entries that look like this:
autoreleasepool content ... 4 KiB  AVFoundation  -[AVAssetWriterInputHelper] (as pictured above)
Can someone point out what I am doing wrong and, hopefully, how to fix it?
The culprit turned out to be this code:

while !assetWriterVideoInput!.isReadyForMoreMediaData {} // hang out here until isReadyForMoreMediaData == true

It turns out that busy-waiting on the falseness of this condition on the AVAssetWriterInput object is a really bad idea. Instead, wrap the whole block of logic like this:
var frameIsCaptured: Bool = false
while assetWriterVideoInput!.isReadyForMoreMediaData && frameIsCaptured == false {
    ... // insert all the texture processing logic
    texture.getBytes(pixelBufferBytes, bytesPerRow: bytesPerRow, from: region, mipmapLevel: 0)
    ... // insert pixelBuffer ingestion logic
    frameIsCaptured = true // needed so we only process this texture once
}
What I was doing before did work, in the sense that the routine would loop until isReadyForMoreMediaData became true (at which point I would do all the texture work needed to append the pixelBuffer to the in-progress movie), but the ridiculous overhead created by that empty loop did not look like a design flaw that needed fixing. Of course, Instruments pointed at a problem in this loop, but it took me a while to see the light.
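Putting the pieces together, the corrected routine might look like the sketch below. This is not runnable on its own: it assumes the same surrounding class members as the original code (`assetWriterVideoInput`, `assetWriterPixelBufferInput`, `frameIndexForPresentationTime`, `frameHoldLength`, `Constants.movieFPS`), and the error-logging branches are abbreviated:

```swift
import AVFoundation
import Metal

func AVAssetWriterEncodeFrame(forTexture texture: MTLTexture) {
    var frameIsCaptured = false
    // Do the work inside the readiness test instead of spinning on it.
    // If the input never becomes ready, the frame is simply dropped.
    while assetWriterVideoInput!.isReadyForMoreMediaData && !frameIsCaptured {
        autoreleasepool {
            let fps = Int32(Constants.movieFPS)
            guard let pixelBufferPool = assetWriterPixelBufferInput!.pixelBufferPool else { return }

            var maybePixelBuffer: CVPixelBuffer? = nil
            guard CVPixelBufferPoolCreatePixelBuffer(nil, pixelBufferPool, &maybePixelBuffer) == kCVReturnSuccess,
                  let pixelBuffer = maybePixelBuffer else { return }

            CVPixelBufferLockBaseAddress(pixelBuffer, [])
            let pixelBufferBytes = CVPixelBufferGetBaseAddress(pixelBuffer)!
            // Use the pixel buffer's own stride; it may be rounded up for alignment
            let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
            let region = MTLRegionMake2D(0, 0, texture.width, texture.height)
            texture.getBytes(pixelBufferBytes, bytesPerRow: bytesPerRow, from: region, mipmapLevel: 0)

            let presentationTime = CMTimeMake(value: Int64(frameIndexForPresentationTime), timescale: fps)
            assetWriterPixelBufferInput!.append(pixelBuffer, withPresentationTime: presentationTime)
            CVPixelBufferUnlockBaseAddress(pixelBuffer, [])
            frameIndexForPresentationTime += frameHoldLength // set up for next frame

            frameIsCaptured = true // process this texture exactly once
        }
    }
}
```

For what it's worth, AVFoundation also offers a push-free alternative that avoids polling altogether: `AVAssetWriterInput.requestMediaDataWhenReady(on:using:)` invokes your block on a serial queue whenever the input can accept more media data, which is Apple's recommended pattern for feeding an asset writer.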