
I have a scenario where a phone call comes in while my application is recording audio with AVAudioRecorder and playing it back with AVAudioPlayer. If recording is in progress when the call arrives, I want the audio recorded after the call to continue from the audio recorded before the call. However, AVAudioRecorder keeps only the audio recorded after the interruption.

I track interruptions to the audio recorder using the AVAudioRecorderDelegate methods:

  • - (void)audioRecorderBeginInterruption:(AVAudioRecorder *)avRecorder
  • - (void)audioRecorderEndInterruption:(AVAudioRecorder *)avRecorder

In my end-interruption method I reactivate the audio session, roughly as sketched below.
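
The following is only a rough sketch of that delegate-based handling; `wasInterrupted` is a hypothetical flag and `recorder` is assumed to be the active AVAudioRecorder instance (these delegate methods were later deprecated in favour of AVAudioSessionInterruptionNotification, which the second answer below uses):

// Rough sketch of the delegate-based interruption handling described above.
// Assumption: `wasInterrupted` is a hypothetical BOOL ivar used to remember
// that an interruption occurred.
- (void)audioRecorderBeginInterruption:(AVAudioRecorder *)avRecorder
{
    // The system has already interrupted recording at this point.
    wasInterrupted = YES;
}

- (void)audioRecorderEndInterruption:(AVAudioRecorder *)avRecorder
{
    // Reactivate the audio session and resume recording.
    NSError *error = nil;
    [[AVAudioSession sharedInstance] setActive:YES error:&error];
    if (!error && wasInterrupted) {
        wasInterrupted = NO;
        [avRecorder record]; // per the problem described above, only audio from this point on ends up in the file
    }
}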

While looking for a solution to this problem, I came across other posts describing the same issue: how to resume recording after interruption occured in iphone? and http://www.iphonedevsdk.com/forum/iphone-sdk-development/31268-avaudiorecorderdelegate-interruption.html. Here is the recording code I use:

- (void)startRecordingProcess 
{ 
    AVAudioSession *audioSession = [AVAudioSession sharedInstance]; 
    NSError *err = nil; 
    [audioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&err]; 
    if(err) 
    { 
     DEBUG_LOG(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]); 
     return; 
    } 
    err = nil; 
    [audioSession setActive:YES error:&err]; 
    if(err) 
    { 
     DEBUG_LOG(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]); 
     return; 
    } 
    // Record settings for recording the audio 
    recordSetting = [[NSDictionary alloc] initWithObjectsAndKeys: 
        [NSNumber numberWithInt:kAudioFormatMPEG4AAC],AVFormatIDKey, 
        [NSNumber numberWithInt:44100],AVSampleRateKey, 
        [NSNumber numberWithInt: 2],AVNumberOfChannelsKey, 
        [NSNumber numberWithInt:16],AVLinearPCMBitDepthKey, 
        [NSNumber numberWithBool:NO],AVLinearPCMIsBigEndianKey, 
        [NSNumber numberWithBool:NO],AVLinearPCMIsFloatKey, 
        nil]; 
    BOOL fileExists = [[NSFileManager defaultManager] fileExistsAtPath:recorderFilePath]; 
    if (fileExists) 
    {   
     BOOL appendingFileExists = 
      [[NSFileManager defaultManager] fileExistsAtPath:appendingFilePath]; 
     if (appendingFileExists) 
     { 
      [[NSFileManager defaultManager]removeItemAtPath:appendingFilePath error:nil]; 
     } 
     if (appendingFilePath) 
     { 
      [appendingFilePath release]; 
      appendingFilePath = nil; 
     } 
     appendingFilePath = [[NSString alloc]initWithFormat:@"%@/AppendedAudio.m4a", DOCUMENTS_FOLDER]; 
     fileUrl = [NSURL fileURLWithPath:appendingFilePath]; 
    } 
    else 
    { 
     isFirstTime = YES; 
     if (recorderFilePath) 
     { 
      DEBUG_LOG(@"Testing 2"); 
      [recorderFilePath release]; 
      recorderFilePath = nil; 
     } 
     DEBUG_LOG(@"Testing 3"); 
     recorderFilePath = [[NSString alloc]initWithFormat:@"%@/RecordedAudio.m4a", DOCUMENTS_FOLDER]; 
     fileUrl = [NSURL fileURLWithPath:recorderFilePath]; 
    } 
    err = nil; 
    recorder = [[AVAudioRecorder alloc] initWithURL:fileUrl settings:recordSetting error:&err]; 
    if(!recorder) 
    { 
     DEBUG_LOG(@"recorder: %@ %d %@", [err domain], [err code], [[err userInfo] description]); 
     [[AlertFunctions sharedInstance] showMessageWithTitle:kAppName 
                 message:[err localizedDescription] 
                delegate:nil 
              cancelButtonTitle:@"Ok"]; 
     return; 
    } 
    //prepare to record 
    [recorder setDelegate:self]; 
    [recorder prepareToRecord]; 
    recorder.meteringEnabled = YES; 
    [recorder record]; 

} 

I tried the suggestions given in those links, but they were not successful. I would like to make this work with AVAudioRecorder itself. Is there any way to solve this problem? Any helpful suggestions are appreciated.

Answers


After a lot of investigation, I was informed by Apple that this is a problem with the current API. So I worked around it by saving the previously recorded audio file right after the interruption and combining it with the audio file recorded after resuming. I hope this helps anyone who may face the same problem.


Would you mind posting it? How do you combine the two files? –


You can create an AVMutableComposition using the two files. The following link has sample code showing how to do it: http://stackoverflow.com/questions/7775040/play-avmutablecomposition-with-avplayer. Then you can use AVAssetExportSession to export the composition as a single audio file. Go through this link: http://stackoverflow.com/questions/8019033/avassetexportsession-not-working-in-ios5/9767681#9767681. Hope this solves your problem. – Siddharth
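
A condensed sketch of that approach could look like the following; firstPartURL, secondPartURL and outputURL are hypothetical NSURLs for the two recorded .m4a files and the merged output, and <AVFoundation/AVFoundation.h> is assumed to be imported:

// Sketch of the merge-and-export approach from the comment above.
// Assumptions: firstPartURL/secondPartURL point to the two recorded .m4a files,
// outputURL is where the combined file should be written.
- (void)mergeAudioAtURL:(NSURL *)firstPartURL
         withAudioAtURL:(NSURL *)secondPartURL
                  toURL:(NSURL *)outputURL
{
    AVMutableComposition *composition = [AVMutableComposition composition];
    AVMutableCompositionTrack *track =
        [composition addMutableTrackWithMediaType:AVMediaTypeAudio
                                 preferredTrackID:kCMPersistentTrackID_Invalid];

    CMTime cursor = kCMTimeZero; // where the next file should be inserted
    for (NSURL *url in @[firstPartURL, secondPartURL]) {
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:url options:nil];
        AVAssetTrack *sourceTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
        NSError *error = nil;
        [track insertTimeRange:CMTimeRangeMake(kCMTimeZero, asset.duration)
                       ofTrack:sourceTrack
                        atTime:cursor
                         error:&error];
        if (error) {
            NSLog(@"insert failed: %@", error);
            return;
        }
        cursor = CMTimeAdd(cursor, asset.duration);
    }

    AVAssetExportSession *exportSession =
        [AVAssetExportSession exportSessionWithAsset:composition
                                          presetName:AVAssetExportPresetAppleM4A];
    exportSession.outputURL = outputURL;
    exportSession.outputFileType = AVFileTypeAppleM4A;
    [exportSession exportAsynchronouslyWithCompletionHandler:^{
        if (exportSession.status == AVAssetExportSessionStatusCompleted) {
            // the combined audio is now at outputURL
        } else {
            NSLog(@"export failed: %@", exportSession.error);
        }
    }];
}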


I faced a similar problem where AVAudioRecorder kept only the audio recorded after the interruption.
I solved it by maintaining an array of recordings, storing them in NSTemporaryDirectory, and merging them at the end.

Here are the key steps:

  1. Make the class listen for AVAudioSessionInterruptionNotification.
  2. When the interruption begins (AVAudioSessionInterruptionTypeBegan), save the current recording.
  3. When the interruption ends (AVAudioSessionInterruptionTypeEnded) with the AVAudioSessionInterruptionOptionShouldResume option, start a new recording.
  4. When the save button is tapped, append all the recordings.

Code snippets for the steps mentioned above:
// 1. Make this class listen to the AVAudioSessionInterruptionNotification in viewDidLoad 
- (void)viewDidLoad 
{ 
    [super viewDidLoad]; 

    [[NSNotificationCenter defaultCenter] addObserver:self 
              selector:@selector(handleAudioSessionInterruption:) 
               name:AVAudioSessionInterruptionNotification 
               object:[AVAudioSession sharedInstance]]; 

    // other coding stuff 
} 

// observe the interruption begin/end 
- (void)handleAudioSessionInterruption:(NSNotification*)notification 
{ 
    AVAudioSessionInterruptionType interruptionType = [notification.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue]; 
    AVAudioSessionInterruptionOptions interruptionOption = [notification.userInfo[AVAudioSessionInterruptionOptionKey] unsignedIntegerValue]; 

    switch (interruptionType) { 
     // 2. save recording on interruption begin 
     case AVAudioSessionInterruptionTypeBegan:{ 
      // stop recording 
      // Update the UI accordingly 
      break; 
     } 
     case AVAudioSessionInterruptionTypeEnded:{ 
      if (interruptionOption == AVAudioSessionInterruptionOptionShouldResume) { 
       // create a new recording 
       // Update the UI accordingly 
      } 
      break; 
     } 

     default: 
      break; 
    } 
} 

// 4. append all recordings 
- (void) audioRecorderDidFinishRecording:(AVAudioRecorder *)avrecorder successfully:(BOOL)flag 
{ 
    // append all recordings one after other 
} 

Here is a working example:

// 
// XDRecordViewController.m 
// 
// Created by S1LENT WARRIOR 
// 

#import "XDRecordViewController.h" 

@interface XDRecordViewController() 
{ 
    AVAudioRecorder *recorder; 

    __weak IBOutlet UIButton* btnRecord; 
    __weak IBOutlet UIButton* btnSave; 
    __weak IBOutlet UIButton* btnDiscard; 
    __weak IBOutlet UILabel* lblTimer; // a UILabel to display the recording time 

    // some variables used to display the elapsed time on lblTimer 
    NSTimer* timer; 
    NSTimeInterval intervalTimeElapsed; 
    NSDate* pauseStart; 
    NSDate* previousFireDate; 
    NSDate* recordingStartDate; 

    // interruption handling variables 
    BOOL isInterrupted; 
    NSInteger preInterruptionDuration; 

    NSMutableArray* recordings; // an array of recordings to be merged in the end 
} 
@end 

@implementation XDRecordViewController 

- (void)viewDidLoad 
{ 
    [super viewDidLoad]; 

    // Make this class listen to the AVAudioSessionInterruptionNotification 
    [[NSNotificationCenter defaultCenter] addObserver:self 
              selector:@selector(handleAudioSessionInterruption:) 
               name:AVAudioSessionInterruptionNotification 
               object:[AVAudioSession sharedInstance]]; 

    [self clearContentsOfDirectory:NSTemporaryDirectory()]; // clear contents of NSTemporaryDirectory() 

    recordings = [NSMutableArray new]; // initialize recordings 

    [self setupAudioSession]; // setup the audio session. you may customize it according to your requirements 
} 

- (void)viewDidAppear:(BOOL)animated 
{ 
    [super viewDidAppear:animated]; 

    [self initRecording]; // start recording as soon as the view appears 
} 

- (void)dealloc 
{ 
    [self clearContentsOfDirectory:NSTemporaryDirectory()]; // remove all files from NSTemporaryDirectory 

    [[NSNotificationCenter defaultCenter] removeObserver:self]; // remove this class from NSNotificationCenter 
} 

#pragma mark - Event Listeners 

// called when recording button is tapped 
- (IBAction) btnRecordingTapped:(UIButton*)sender 
{ 
    sender.selected = !sender.selected; // toggle the button 

    if (sender.selected) { // resume recording 
     [recorder record]; 
     [self resumeTimer]; 
    } else { // pause recording 
     [recorder pause]; 
     [self pauseTimer]; 
    } 
} 

// called when save button is tapped 
- (IBAction) btnSaveTapped:(UIButton*)sender 
{ 
    [self pauseTimer]; // pause the timer 

    // disable the UI while the recording is saving so that user may not press the save, record or discard button again 
    btnSave.enabled = NO; 
    btnRecord.enabled = NO; 
    btnDiscard.enabled = NO; 

    [recorder stop]; // stop the AVAudioRecorder so that the audioRecorderDidFinishRecording delegate function may get called 

    // Deactivate the AVAudioSession 
    NSError* error; 
    [[AVAudioSession sharedInstance] setActive:NO error:&error]; 
    if (error) { 
     NSLog(@"%@", error); 
    } 
} 

// called when discard button is tapped 
- (IBAction) btnDiscardTapped:(id)sender 
{ 
    [self stopTimer]; // stop the timer 

    recorder.delegate = Nil; // set delegate to Nil so that audioRecorderDidFinishRecording delegate function may not get called 
    [recorder stop]; // stop the recorder 

    // Deactivate the AVAudioSession 
    NSError* error; 
    [[AVAudioSession sharedInstance] setActive:NO error:&error]; 
    if (error) { 
     NSLog(@"%@", error); 
    } 

    [self.navigationController popViewControllerAnimated:YES]; 
} 

#pragma mark - Notification Listeners 
// called when an AVAudioSessionInterruption occurs 
- (void)handleAudioSessionInterruption:(NSNotification*)notification 
{ 
    AVAudioSessionInterruptionType interruptionType = [notification.userInfo[AVAudioSessionInterruptionTypeKey] unsignedIntegerValue]; 
    AVAudioSessionInterruptionOptions interruptionOption = [notification.userInfo[AVAudioSessionInterruptionOptionKey] unsignedIntegerValue]; 

    switch (interruptionType) { 
     case AVAudioSessionInterruptionTypeBegan:{ 
      // • Recording has stopped, already inactive 
      // • Change state of UI, etc., to reflect non-recording state 
      preInterruptionDuration += recorder.currentTime; // time elapsed 
      if(btnRecord.selected) { // timer is already running 
       [self btnRecordingTapped:btnRecord]; // pause the recording and pause the timer 
      } 

      recorder.delegate = Nil; // Set delegate to nil so that audioRecorderDidFinishRecording may not get called 
      [recorder stop]; // stop recording 
      isInterrupted = YES; 
      break; 
     } 
     case AVAudioSessionInterruptionTypeEnded:{ 
      // • Make session active 
      // • Update user interface 
      // • AVAudioSessionInterruptionOptionShouldResume option 
      if (interruptionOption == AVAudioSessionInterruptionOptionShouldResume) { 
       // Here you should create a new recording 
       [self initRecording]; // create a new recording 
       [self btnRecordingTapped:btnRecord]; 
      } 
      break; 
     } 

     default: 
      break; 
    } 
} 

#pragma mark - AVAudioRecorderDelegate 
- (void) audioRecorderDidFinishRecording:(AVAudioRecorder *)avrecorder successfully:(BOOL)flag 
{ 
    [self appendAudiosAtURLs:recordings completion:^(BOOL success, NSURL *outputUrl) { 
     // do whatever you want with the new audio file :) 
    }]; 
} 

#pragma mark - Timer 
- (void)timerFired:(NSTimer*)timer 
{ 
    intervalTimeElapsed++; 
    [self updateDisplay]; 
} 

// converts a time interval into an mm:ss string 
- (NSString*) timerStringSinceTimeInterval:(NSTimeInterval)timeInterval 
{ 
    NSDate *timerDate = [NSDate dateWithTimeIntervalSince1970:timeInterval]; 
    NSDateFormatter *dateFormatter = [[NSDateFormatter alloc] init]; 
    [dateFormatter setDateFormat:@"mm:ss"]; 
    [dateFormatter setTimeZone:[NSTimeZone timeZoneForSecondsFromGMT:0.0]]; 
    return [dateFormatter stringFromDate:timerDate]; 
} 

// called when recording pauses 
- (void) pauseTimer 
{ 
    pauseStart = [NSDate dateWithTimeIntervalSinceNow:0]; 

    previousFireDate = [timer fireDate]; 

    [timer setFireDate:[NSDate distantFuture]]; 
} 

- (void) resumeTimer 
{ 
    if (!timer) { 
     timer = [NSTimer scheduledTimerWithTimeInterval:1.0 
               target:self 
               selector:@selector(timerFired:) 
               userInfo:Nil 
               repeats:YES]; 
     return; 
    } 

    float pauseTime = - 1 * [pauseStart timeIntervalSinceNow]; 

    [timer setFireDate:[previousFireDate dateByAddingTimeInterval:pauseTime]]; 
} 

- (void)stopTimer 
{ 
    [self updateDisplay]; 
    [timer invalidate]; 
    timer = nil; 
} 

- (void)updateDisplay 
{ 
    lblTimer.text = [self timerStringSinceTimeInterval:intervalTimeElapsed]; 
} 

#pragma mark - Helper Functions 
- (void) initRecording 
{ 

    // Set the audio file 
    NSString* name = [NSString stringWithFormat:@"recording_%@.m4a", @(recordings.count)]; // creating a unique name for each audio file 
    NSURL *outputFileURL = [NSURL fileURLWithPathComponents:@[NSTemporaryDirectory(), name]]; 

    [recordings addObject:outputFileURL]; 

    // Define the recorder settings 
    NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init]; 

    [recordSetting setValue:@(kAudioFormatMPEG4AAC) forKey:AVFormatIDKey]; 
    [recordSetting setValue:@(44100.0) forKey:AVSampleRateKey]; 
    [recordSetting setValue:@(1) forKey:AVNumberOfChannelsKey]; 

    NSError* error; 
    // Initiate and prepare the recorder 
    recorder = [[AVAudioRecorder alloc] initWithURL:outputFileURL settings:recordSetting error:&error]; 
    recorder.delegate = self; 
    recorder.meteringEnabled = YES; 
    [recorder prepareToRecord]; 

    if (![AVAudioSession sharedInstance].inputAvailable) { // can not record audio if mic is unavailable 
     NSLog(@"Error: Audio input device not available!"); 
     return; 
    } 

    intervalTimeElapsed = 0; 
    recordingStartDate = [NSDate date]; 

    if (isInterrupted) { 
     intervalTimeElapsed = preInterruptionDuration; 
     isInterrupted = NO; 
    } 

    // Activate the AVAudioSession 
    [[AVAudioSession sharedInstance] setActive:YES error:&error]; 
    if (error) { 
     NSLog(@"%@", error); 
    } 

    recordingStartDate = [NSDate date]; // Set the recording start date 
    [self btnRecordingTapped:btnRecord]; 
} 

- (void)setupAudioSession 
{ 

    static BOOL audioSessionSetup = NO; 
    if (audioSessionSetup) { 
     return; 
    } 

    AVAudioSession* session = [AVAudioSession sharedInstance]; 

    [session setCategory:AVAudioSessionCategoryPlayAndRecord 
      withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker 
        error:Nil]; 

    [session setMode:AVAudioSessionModeSpokenAudio error:nil]; 

    audioSessionSetup = YES; 
} 

// takes an array of audio file URLs and appends them to one another 
// the basic logic was derived from here: http://stackoverflow.com/a/16040992/634958 
// I modified that logic to append multiple files 
- (void) appendAudiosAtURLs:(NSMutableArray*)urls completion:(void(^)(BOOL success, NSURL* outputUrl))handler 
{ 
    // Create a new audio track we can append to 
    AVMutableComposition* composition = [AVMutableComposition composition]; 
    AVMutableCompositionTrack* appendedAudioTrack = 
    [composition addMutableTrackWithMediaType:AVMediaTypeAudio 
          preferredTrackID:kCMPersistentTrackID_Invalid]; 

    // Grab the first audio asset that needs to be appended 
    AVURLAsset* originalAsset = [[AVURLAsset alloc] 
           initWithURL:urls.firstObject options:nil]; 
    [urls removeObjectAtIndex:0]; 

    NSError* error = nil; 

    // Grab the first audio track and insert it into our appendedAudioTrack 
    AVAssetTrack *originalTrack = [[originalAsset tracksWithMediaType:AVMediaTypeAudio] firstObject]; 
    CMTimeRange timeRange = CMTimeRangeMake(kCMTimeZero, originalAsset.duration); 
    [appendedAudioTrack insertTimeRange:timeRange 
           ofTrack:originalTrack 
           atTime:kCMTimeZero 
            error:&error]; 
    CMTime duration = originalAsset.duration; 

    if (error) { 
     if (handler) { 
      dispatch_async(dispatch_get_main_queue(), ^{ 
       handler(NO, Nil); 
      }); 
     } 
     return; // abort if the first track could not be inserted 
    } 

    for (NSURL* audioUrl in urls) { 
     AVURLAsset* newAsset = [[AVURLAsset alloc] 
           initWithURL:audioUrl options:nil]; 

     // Grab the rest of the audio tracks and insert them at the end of each other 
     AVAssetTrack *newTrack = [[newAsset tracksWithMediaType:AVMediaTypeAudio] firstObject]; 
     timeRange = CMTimeRangeMake(kCMTimeZero, newAsset.duration); 
     [appendedAudioTrack insertTimeRange:timeRange 
            ofTrack:newTrack 
            atTime:duration 
             error:&error]; 

     duration = appendedAudioTrack.timeRange.duration; 

     if (error) { 
      if (handler) { 
       dispatch_async(dispatch_get_main_queue(), ^{ 
        handler(NO, Nil); 
       }); 
      } 
      return; // abort if a track could not be appended 
     } 
    } 

    // Create a new audio file using the appendedAudioTrack 
    AVAssetExportSession* exportSession = [AVAssetExportSession 
              exportSessionWithAsset:composition 
              presetName:AVAssetExportPresetAppleM4A]; 
    if (!exportSession) { 
     if (handler) { 
      dispatch_async(dispatch_get_main_queue(), ^{ 
       handler(NO, Nil); 
      }); 
     } 
     return; // cannot export without an export session 
    } 

    NSArray* appendedAudioPath = @[NSTemporaryDirectory(), @"temp.m4a"]; // name of the final audio file 
    exportSession.outputURL = [NSURL fileURLWithPathComponents:appendedAudioPath]; 
    exportSession.outputFileType = AVFileTypeAppleM4A; 
    [exportSession exportAsynchronouslyWithCompletionHandler:^{ 

     BOOL success = NO; 
     // exported successfully? 
     switch (exportSession.status) { 
      case AVAssetExportSessionStatusFailed: 
       break; 
      case AVAssetExportSessionStatusCompleted: { 
       success = YES; 

       break; 
      } 
      case AVAssetExportSessionStatusWaiting: 
       break; 
      default: 
       break; 
     } 

     if (handler) { 
      dispatch_async(dispatch_get_main_queue(), ^{ 
       handler(success, exportSession.outputURL); 
      }); 
     } 
    }]; 
} 

- (void) clearContentsOfDirectory:(NSString*)directory 
{ 
    NSFileManager *fm = [NSFileManager defaultManager]; 
    NSError *error = nil; 
    for (NSString *file in [fm contentsOfDirectoryAtPath:directory error:&error]) { 
     [fm removeItemAtURL:[NSURL fileURLWithPathComponents:@[directory, file]] error:&error]; 
    } 
} 

@end 

I know I am late in answering the question, but I hope this helps someone else!