iOS: Audio Unit RemoteIO not working on iPhone


I'm trying to create my own custom sound-effect audio unit driven by the microphone input. The app plays audio from the microphone to the speaker simultaneously (input and output at the same time). In the simulator I can apply the effect and everything works, but when I test on an actual iPhone I don't hear any sound. I'll paste my code in case anyone can help me:

- (id)init {
    self = [super init];
    OSStatus status;

    // Describe audio component
    AudioComponentDescription desc;
    desc.componentType = kAudioUnitType_Output;
    desc.componentSubType = kAudioUnitSubType_RemoteIO;
    desc.componentFlags = 0;
    desc.componentFlagsMask = 0;
    desc.componentManufacturer = kAudioUnitManufacturer_Apple;

    // Get component
    AudioComponent inputComponent = AudioComponentFindNext(NULL, &desc);

    // Get audio unit
    status = AudioComponentInstanceNew(inputComponent, &audioUnit);
    checkStatus(status);

    // Enable IO for recording
    UInt32 flag = 1;
    status = AudioUnitSetProperty(audioUnit, kAudioOutputUnitProperty_EnableIO,
                                  kAudioUnitScope_Input, kInputBus, &flag, sizeof(flag));
    checkStatus(status);

    // Enable IO for playback
    status = AudioUnitSetProperty(audioUnit, kAudioOutputUnitProperty_EnableIO,
                                  kAudioUnitScope_Output, kOutputBus, &flag, sizeof(flag));
    checkStatus(status);

    // Describe format: 44.1 kHz, mono, 16-bit signed integer, packed linear PCM
    AudioStreamBasicDescription audioFormat;
    audioFormat.mSampleRate       = 44100.00;
    audioFormat.mFormatID         = kAudioFormatLinearPCM;
    audioFormat.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
    audioFormat.mFramesPerPacket  = 1;
    audioFormat.mChannelsPerFrame = 1;
    audioFormat.mBitsPerChannel   = 16;
    audioFormat.mBytesPerPacket   = 2;
    audioFormat.mBytesPerFrame    = 2;

    // Apply format to the output scope of the input bus and the input scope of the output bus
    status = AudioUnitSetProperty(audioUnit, kAudioUnitProperty_StreamFormat,
                                  kAudioUnitScope_Output, kInputBus, &audioFormat, sizeof(audioFormat));
    checkStatus(status);
    status = AudioUnitSetProperty(audioUnit, kAudioUnitProperty_StreamFormat,
                                  kAudioUnitScope_Input, kOutputBus, &audioFormat, sizeof(audioFormat));
    checkStatus(status);

    // Set input callback
    AURenderCallbackStruct callbackStruct;
    callbackStruct.inputProc = recordingCallback;
    callbackStruct.inputProcRefCon = self;
    status = AudioUnitSetProperty(audioUnit, kAudioOutputUnitProperty_SetInputCallback,
                                  kAudioUnitScope_Global, kInputBus, &callbackStruct, sizeof(callbackStruct));
    checkStatus(status);

    // Set output callback
    callbackStruct.inputProc = playbackCallback;
    callbackStruct.inputProcRefCon = self;
    status = AudioUnitSetProperty(audioUnit, kAudioUnitProperty_SetRenderCallback,
                                  kAudioUnitScope_Global, kOutputBus, &callbackStruct, sizeof(callbackStruct));
    checkStatus(status);

    // Allocate our own buffer (1 channel, 16 bits per sample, thus 2 bytes per frame).
    // In practice the buffers used contain 512 frames; if this changes it is handled in processAudio.
    tempBuffer.mNumberChannels = 1;
    tempBuffer.mDataByteSize = 512 * 2;
    tempBuffer.mData = malloc(512 * 2);

    // Initialise
    status = AudioUnitInitialize(audioUnit);
    checkStatus(status);

    return self;
}
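The init above uses kInputBus, kOutputBus and checkStatus(), which are not shown in the post. A minimal sketch of what they are presumably defined as, based on the usual RemoteIO convention (bus 1 is the microphone element, bus 0 the speaker element); these definitions are an assumption, not code from the original question:

// Assumed definitions (not part of the original post).
// RemoteIO bus numbers: bus 1 is the input (microphone) element,
// bus 0 is the output (speaker) element.
#define kInputBus  1
#define kOutputBus 0

// Simple OSStatus check used after every Audio Unit call.
static void checkStatus(OSStatus status) {
    if (status != noErr) {
        NSLog(@"Audio Unit call failed with status %d", (int)status);
    }
}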

This callback is called when new audio data from the microphone is available. But when I test on the iPhone, it never gets here:

static OSStatus recordingCallback(void *inRefCon,
                                  AudioUnitRenderActionFlags *ioActionFlags,
                                  const AudioTimeStamp *inTimeStamp,
                                  UInt32 inBusNumber,
                                  UInt32 inNumberFrames,
                                  AudioBufferList *ioData) {
    AudioBuffer buffer;
    buffer.mNumberChannels = 1;
    buffer.mDataByteSize = inNumberFrames * 2;
    buffer.mData = malloc(inNumberFrames * 2);

    // Put buffer in an AudioBufferList
    AudioBufferList bufferList;
    bufferList.mNumberBuffers = 1;
    bufferList.mBuffers[0] = buffer;

    // Obtain recorded samples
    OSStatus status;
    status = AudioUnitRender([iosAudio audioUnit], ioActionFlags, inTimeStamp,
                             inBusNumber, inNumberFrames, &bufferList);
    checkStatus(status);

    // Now the samples we just read are sitting in the buffers in bufferList.
    // Process the new data
    [iosAudio processAudio:&bufferList];

    // Release the malloc'ed data in the buffer we created earlier
    free(bufferList.mBuffers[0].mData);

    return noErr;
}
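The playbackCallback registered in init is not shown in the question. In the RemoteIO pattern this code follows, it typically copies the processed samples into the buffers the system hands it; a hedged sketch, assuming processAudio: leaves its result in tempBuffer and that iosAudio exposes a tempBuffer accessor (both assumptions):

// Sketch of the output callback (not shown in the original question).
// Assumes processAudio: stores its result in tempBuffer and that
// [iosAudio tempBuffer] exists as an accessor.
static OSStatus playbackCallback(void *inRefCon,
                                 AudioUnitRenderActionFlags *ioActionFlags,
                                 const AudioTimeStamp *inTimeStamp,
                                 UInt32 inBusNumber,
                                 UInt32 inNumberFrames,
                                 AudioBufferList *ioData) {
    for (UInt32 i = 0; i < ioData->mNumberBuffers; i++) {
        // Copy at most as many bytes as the destination buffer can hold.
        UInt32 bytesToCopy = MIN(ioData->mBuffers[i].mDataByteSize,
                                 [iosAudio tempBuffer].mDataByteSize);
        memcpy(ioData->mBuffers[i].mData, [iosAudio tempBuffer].mData, bytesToCopy);
        ioData->mBuffers[i].mDataByteSize = bytesToCopy;
    }
    return noErr;
}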
Solution: I solved my problem. I just needed to initialize the AudioSession before playing/recording. I did that with the following code:

OSStatus status;

// AudioSessionInitialize takes a run loop, a run loop mode, an interruption
// listener and client data; the first three may be NULL.
AudioSessionInitialize(NULL, NULL, NULL, self);

UInt32 sessionCategory = kAudioSessionCategory_PlayAndRecord;
status = AudioSessionSetProperty(kAudioSessionProperty_AudioCategory,
                                 sizeof(sessionCategory),
                                 &sessionCategory);
if (status != kAudioSessionNoError)
{
    if (status == kAudioServicesUnsupportedPropertyError) {
        NSLog(@"AudioSessionInitialize failed: unsupportedPropertyError");
    } else if (status == kAudioServicesBadPropertySizeError) {
        NSLog(@"AudioSessionInitialize failed: badPropertySizeError");
    } else if (status == kAudioServicesBadSpecifierSizeError) {
        NSLog(@"AudioSessionInitialize failed: badSpecifierSizeError");
    } else if (status == kAudioServicesSystemSoundUnspecifiedError) {
        NSLog(@"AudioSessionInitialize failed: systemSoundUnspecifiedError");
    } else if (status == kAudioServicesSystemSoundClientTimedOutError) {
        NSLog(@"AudioSessionInitialize failed: systemSoundClientTimedOutError");
    } else {
        NSLog(@"AudioSessionInitialize failed! %ld", (long)status);
    }
}
AudioSessionSetActive(TRUE);
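Note that the C-based AudioSession API used above has since been deprecated. On current iOS the same fix (activating a play-and-record session before starting the audio unit) would go through AVAudioSession; a rough equivalent, assuming iOS 7 or later:

// Rough AVAudioSession equivalent of the fix above (assumes iOS 7+).
#import <AVFoundation/AVFoundation.h>

NSError *error = nil;
AVAudioSession *session = [AVAudioSession sharedInstance];

// Same intent as kAudioSessionCategory_PlayAndRecord.
if (![session setCategory:AVAudioSessionCategoryPlayAndRecord error:&error]) {
    NSLog(@"Failed to set session category: %@", error);
}

// Same intent as AudioSessionSetActive(TRUE).
if (![session setActive:YES error:&error]) {
    NSLog(@"Failed to activate audio session: %@", error);
}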
