iOS 9 – copyPixelBufferForItemTime:itemTimeForDisplay: returns NULL

My problem is that when my app, compiled with the iOS 9 SDK, tries to get a CVPixelBufferRef from an AVPlayerItemVideoOutput using -copyPixelBufferForItemTime:itemTimeForDisplay:, I get a NULL value from time to time, even after the video has loaded and all the instances have been created.

With iOS 8 my app works fine, but iOS 9 gives me this problem. Even the App Store version of my app, compiled with the iOS 8 SDK, shows the same problem when installed on iOS 9.

When the problem occurs and I get a NULL CVPixelBufferRef, if I press the home button so the app goes to the background, and then open the app again so it becomes active, the AVPlayerItemVideoOutput instance that was returning a NULL CVPixelBufferRef starts working normally and the problem goes away.

Here is a YouTube video in which I reproduce the problem:

https://www.youtube.com/watch?v=997zG08_DMM&feature=youtu.be

Here is the sample code that creates all the instances:

NSURL *url = [[NSURL alloc] initFileURLWithPath:[_mainVideo objectForKey:@"file"]];

NSDictionary *pixBuffAttributes = @{(id)kCVPixelBufferPixelFormatTypeKey: @(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)};
_videoOutput = [[AVPlayerItemVideoOutput alloc] initWithPixelBufferAttributes:pixBuffAttributes];
_myVideoOutputQueue = dispatch_queue_create("myVideoOutputQueue", DISPATCH_QUEUE_SERIAL);
[_videoOutput setDelegate:self queue:_myVideoOutputQueue];

_player = [[AVPlayer alloc] init];


// Do not take mute button into account
NSError *error = nil;
BOOL success = [[AVAudioSession sharedInstance]
                setCategory:AVAudioSessionCategoryPlayback
                error:&error];
if (!success) {
   // NSLog(@"Could not use AVAudioSessionCategoryPlayback", nil);
}

asset = [AVURLAsset URLAssetWithURL:url options:nil];


if(![[NSFileManager defaultManager] fileExistsAtPath:[[asset URL] path]]) {
   // NSLog(@"file does not exist");
}

NSArray *requestedKeys = [NSArray arrayWithObjects:kTracksKey, kPlayableKey, nil];

[asset loadValuesAsynchronouslyForKeys:requestedKeys completionHandler:^{

    dispatch_async( dispatch_get_main_queue(),
                   ^{
                       /* Make sure that the value of each key has loaded successfully. */
                       for (NSString *thisKey in requestedKeys)
                       {
                           NSError *error = nil;
                           AVKeyValueStatus keyStatus = [asset statusOfValueForKey:thisKey error:&error];
                           if (keyStatus == AVKeyValueStatusFailed)
                           {
                               [self assetFailedToPrepareForPlayback:error];
                               return;
                           }
                       }

                       NSError* error = nil;
                       AVKeyValueStatus status = [asset statusOfValueForKey:kTracksKey error:&error];
                       if (status == AVKeyValueStatusLoaded)
                       {
                           _playerItem = [AVPlayerItem playerItemWithAsset:asset];


                           [_playerItem addOutput:_videoOutput];
                           [_player replaceCurrentItemWithPlayerItem:_playerItem];
                           [_videoOutput requestNotificationOfMediaDataChangeWithAdvanceInterval:ONE_FRAME_DURATION];

                           /* When the player item has played to its end time we'll toggle
                            the movie controller Pause button to be the Play button */
                           [[NSNotificationCenter defaultCenter] addObserver:self
                                                                    selector:@selector(playerItemDidReachEnd:)
                                                                        name:AVPlayerItemDidPlayToEndTimeNotification
                                                                      object:_playerItem];

                           seekToZeroBeforePlay = NO;

                           [_playerItem addObserver:self
                                         forKeyPath:kStatusKey
                                            options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                                            context:AVPlayerDemoPlaybackViewControllerStatusObservationContext];

                           [_player addObserver:self
                                     forKeyPath:kCurrentItemKey
                                        options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                                        context:AVPlayerDemoPlaybackViewControllerCurrentItemObservationContext];

                           [_player addObserver:self
                                     forKeyPath:kRateKey
                                        options:NSKeyValueObservingOptionInitial | NSKeyValueObservingOptionNew
                                        context:AVPlayerDemoPlaybackViewControllerRateObservationContext];


                           [self initScrubberTimer];

                           [self syncScrubber];


                       }
                       else
                       {
                         //  NSLog(@"%@ Failed to load the tracks.", self);
                       }
                   });
}];
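
For reference, the setDelegate:queue: and requestNotificationOfMediaDataChangeWithAdvanceInterval: calls above are meant to be paired with the output's pull delegate: outputMediaDataWillChange: fires when media data is about to become available, and that is typically where a display link that pulls frames gets started. Below is a minimal Swift sketch of that pattern (the class, property, and method names are placeholders, not my actual code):

import AVFoundation
import UIKit

// Minimal sketch (hypothetical names) of how the delegate / advance-interval setup
// is usually completed: the pull delegate starts a CADisplayLink when media data
// becomes available, and the link callback is where frames are pulled.
final class VideoOutputDriver: NSObject, AVPlayerItemOutputPullDelegate {
  let videoOutput: AVPlayerItemVideoOutput
  private var displayLink: CADisplayLink?

  init(videoOutput: AVPlayerItemVideoOutput) {
    self.videoOutput = videoOutput
    super.init()
    videoOutput.setDelegate(self, queue: DispatchQueue.main)
    // Ask AVFoundation to call outputMediaDataWillChange(_:) roughly one frame ahead.
    videoOutput.requestNotificationOfMediaDataChange(withAdvanceInterval: 1.0 / 30.0)
  }

  // Called by AVFoundation after requestNotificationOfMediaDataChange(withAdvanceInterval:).
  func outputMediaDataWillChange(_ sender: AVPlayerItemOutput) {
    displayLink = CADisplayLink(target: self, selector: #selector(pullFrame))
    displayLink?.add(to: .main, forMode: .common) // .commonModes on older Swift
  }

  @objc private func pullFrame() {
    let itemTime = videoOutput.itemTime(forHostTime: CACurrentMediaTime())
    guard videoOutput.hasNewPixelBuffer(forItemTime: itemTime),
          let pixelBuffer = videoOutput.copyPixelBuffer(forItemTime: itemTime,
                                                        itemTimeForDisplay: nil) else { return }
    _ = pixelBuffer // hand the frame to the renderer here
  }
}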

And here is the sample code that returns the NULL pixel buffer:

CVPixelBufferRef pixelBuffer =
    [_videoOutput copyPixelBufferForItemTime:[_playerItem currentTime]
                          itemTimeForDisplay:nil];

NSLog(@"the pixel buffer is %@", pixelBuffer);
NSLog(@"the _videoOutput is %@", _videoOutput.description);
CMTime dataTime = [_playerItem currentTime];
//NSLog(@"the current time is %f", CMTimeGetSeconds(dataTime));
return pixelBuffer;

Best answer: I ran into the same problem and found the answer in this thread:
https://forums.developer.apple.com/thread/27589#128476

You have to wait until the video item is ready to play before adding the output; otherwise it fails and returns nil. My Swift code is below:

func retrievePixelBufferToDraw() -> CVPixelBuffer? {
  guard let videoItem = player.currentItem else { return nil }
  if videoOutput == nil || self.videoItem !== videoItem {
    videoItem.outputs.flatMap({ return $0 as? AVPlayerItemVideoOutput }).forEach {
      videoItem.remove($0)
    }
    if videoItem.status != AVPlayerItemStatus.readyToPlay {
      // see https://forums.developer.apple.com/thread/27589#128476
      return nil
    }

    let pixelBuffAttributes = [
      kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange,
      ] as [String: Any]

    let videoOutput = AVPlayerItemVideoOutput.init(pixelBufferAttributes: pixelBuffAttributes)
    videoItem.add(videoOutput)
    self.videoOutput = videoOutput
    self.videoItem = videoItem
  }
  guard let videoOutput = videoOutput else { return nil }

  let time = videoItem.currentTime()
  if !videoOutput.hasNewPixelBuffer(forItemTime: time) { return nil }
  return videoOutput.copyPixelBuffer(forItemTime: time, itemTimeForDisplay: nil)
}
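
A minimal usage sketch (the caller name is a placeholder, and CoreImage is only one possible rendering path): call the function from a per-frame callback and wrap the result for drawing, for example in a CIImage:

// Hypothetical per-frame callback, e.g. driven by a CADisplayLink or an MTKView draw loop.
// Requires `import CoreImage` (or UIKit) for CIImage.
@objc func renderNextFrame() {
  guard let pixelBuffer = retrievePixelBufferToDraw() else { return }
  let image = CIImage(cvPixelBuffer: pixelBuffer)
  // Draw `image` with Core Image, Metal, or OpenGL as appropriate.
  _ = image
}

Because retrievePixelBufferToDraw() re-creates the output whenever player.currentItem changes and bails out until the item is readyToPlay, the caller can simply keep polling every frame without tracking item replacement itself.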