AudioKit: AKNodeOutputPlot and AKMicrophone are not working together, possibly due to lifecycle or MVVM architecture decisions

Problem description · 2 votes · 1 answer

Early in my learning with AudioKit, and as I expanded it into a larger app, I followed the standard advice that AudioKit should effectively be a global singleton. I managed to build a very complicated prototype and all was well in the world.

Once I started to scale up and get closer to an actual release, we decided to use MVVM for our architecture and to try to no longer have a massive, scary AudioKit singleton handling every aspect of the app's audio needs. In short, MVVM has been very elegant and has demonstrably cleaned up our code base.

Directly related to our AudioKit structure, it goes like this:

AudioKit and an AKMixer reside in a singleton instance, which exposes public functions that allow the various view models and our other audio models to attach and detach various nodes (AKPlayer, AKSampler, etc...). In the minimal testing I have done, I can confirm this works, as I tried it with my AKPlayer module and it works great.

I am running into a problem where I cannot, for the life of me, get AKNodeOutputPlot and AKMicrophone to cooperate with each other, even though the actual code implementation is identical to my working prototype.

My worry is that I am wrong in thinking I can modularize AudioKit and the various nodes and components that need to connect to it, or that AKNodeOutputPlot has special requirements I am not aware of.

Here are the shortest snippets of code I can provide without overwhelming the question:

The AudioKit singleton (initialized in the AppDelegate):

import Foundation
import AudioKit

class AudioKitConfigurator
{
    static let shared: AudioKitConfigurator = AudioKitConfigurator()

    private let mainMixer: AKMixer = AKMixer()

    private init()
    {
        makeMainMixer()
        configureAudioKitSettings()
        startAudioEngine()
    }

    deinit
    {
        stopAudioEngine()
    }

    private func makeMainMixer()
    {
        AudioKit.output = mainMixer
    }

    func mainMixer(add node: AKNode)
    {
        mainMixer.connect(input: node)
    }

    func mainMixer(remove node: AKNode)
    {
        node.detach()
    }

    private func configureAudioKitSettings()
    {
        AKAudioFile.cleanTempDirectory()
        AKSettings.defaultToSpeaker = true
        AKSettings.playbackWhileMuted = true
        AKSettings.bufferLength = .medium

        do
        {
            try AKSettings.setSession(category: .playAndRecord, with: .allowBluetoothA2DP)
        }

        catch
        {
            AKLog("Could not set session category.")
        }

    }

    private func startAudioEngine()
    {
        do
        {
            try AudioKit.start()
        }
        catch
        {
            AKLog("Fatal Error: AudioKit did not start!")
        }
    }

    private func stopAudioEngine()
    {
        do
        {
            try AudioKit.stop()
        }
        catch
        {
            AKLog("Fatal Error: AudioKit did not stop!")
        }
    }
}

The Microphone component:

import Foundation
import AudioKit
import AudioKitUI

enum MicErrorsToThrow: String, Error
{
    case recordingTooShort          = "The recording was too short, just silently failing"
    case audioFileFailedToUnwrap    = "The Audio File failed to Unwrap from the recorder"
    case recorderError              = "The Recorder was unable to start recording."
    case recorderCantReset          = "In attempt to reset the recorder, it was unable to"
}

class Microphone
{
    private var mic:            AKMicrophone    = AKMicrophone()
    private var micMixer:       AKMixer         = AKMixer()
    private var micBooster:     AKBooster       = AKBooster()
    private var recorder:       AKNodeRecorder!
    private var recordingTimer: Timer

    init()
    {
        micMixer = AKMixer(mic)
        micBooster = AKBooster(micMixer)
        micBooster.gain = 0
        recorder = try? AKNodeRecorder(node: micMixer)

        //TODO: Need to finish the recording timer implementation, leaving blank for now
        recordingTimer = Timer(timeInterval: 120, repeats: false, block: { (timer) in

        })

        AudioKitConfigurator.shared.mainMixer(add: micBooster)
    }

    deinit {
//      removeComponent()
    }

    public func removeComponent()
    {
        AudioKitConfigurator.shared.mainMixer(remove: micBooster)
    }

    public func reset() throws
    {
        if recorder.isRecording
        {
            recorder.stop()
        }
        do
        {
            try recorder.reset()
        }
        catch
        {
            AKLog("Recorder can't reset!")
            throw MicErrorsToThrow.recorderCantReset
        }
    }

    public func setHeadphoneMonitoring()
    {
        // microphone will be monitored while recording
        // only if headphones are plugged
        if AKSettings.headPhonesPlugged {
            micBooster.gain = 1
        }
    }

    /// Start recording from mic, call this function when using in conjunction with a AKNodeOutputPlot so that it can display the waveform in realtime while recording
    ///
    /// - Parameter waveformPlot: AKNodeOutputPlot view object which displays waveform from recording
    /// - Throws: Only error to throw is from recorder property can't start recording, something wrong with microphone. Enum is MicErrorsToThrow.recorderError
    public func record(waveformPlot: AKNodeOutputPlot) throws
    {
        waveformPlot.node = mic
        do
        {
            try recorder.record()
//          self.recordingTimer.fire()
        }
        catch
        {
            print("Error recording!")
            throw MicErrorsToThrow.recorderError
        }
    }

    /// Stop the recorder, and get the recording as an AKAudioFile, necessary to call if you are using AKNodeOutputPlot
    ///
    /// - Parameter waveformPlot: AKNodeOutputPlot view object which displays waveform from recording
    /// - Returns: AKAudioFile
    /// - Throws: Two possible errors, recording was too short (right now is 0.0, but should probably be like 0.5 secs), or could not retrieve audio file from recorder, MicErrorsToThrow.audioFileFailedToUnwrap, MicErrorsToThrow.recordingTooShort
    public func stopRecording(waveformPlot: AKNodeOutputPlot) throws -> AKAudioFile
    {
        waveformPlot.pause()
        waveformPlot.node = nil

        recordingTimer.invalidate()
        if let tape = recorder.audioFile
        {
            if tape.duration > 0.0
            {
                recorder.stop()
                AKLog("Printing tape: CountOfFloatChannelData:\(tape.floatChannelData?.first?.count) | maxLevel:\(tape.maxLevel)")
                return tape
            }
            else
            {
                //TODO: This should be more gentle than an NSError; it's just that they managed to tap the button and tap again to record nothing. Honestly, the minimum duration should probably be like 0.5, or even 1.0. But let's return some sort of "safe" error that doesn't require UI
                throw MicErrorsToThrow.recordingTooShort
            }
        }
        else
        {
            //TODO: need to return error here, could not recover audioFile from recorder
            AKLog("Can't retrieve or unwrap audioFile from recorder!")
            throw MicErrorsToThrow.audioFileFailedToUnwrap
        }
    }
}

Now, in my VC, the AKNodeOutputPlot is a view on the storyboard and is hooked up via an IBOutlet. It renders on screen, it is styled to my liking, and it is definitely connected and working. There is also an instance property of my Microphone component in the VC / VM. My thinking was that when recording, we would pass the nodeOutput object to the view model, which would then call the record(waveformPlot: AKNodeOutputPlot) function of Microphone, and then waveformPlot.node = mic would be enough to hook them up. Sadly, this was not the case.

The View:

class ComposerVC: UIViewController, Storyboarded
{
    var coordinator: MainCoordinator?
    let viewModel: ComposerViewModel = ComposerViewModel()

    @IBOutlet weak var recordButton: RecordButton!
    @IBOutlet weak var waveformPlot: AKNodeOutputPlot! // Here is our waveformPlot object, again confirmed rendering and styled

    // MARK:- VC Lifecycle Methods
    override func viewDidLoad()
    {
        super.viewDidLoad()

        setupNavigationBar()
        setupConductorButton()
        setupRecordButton()
    }

    func setupWaveformPlot() {
        waveformPlot.plotType = .rolling
        waveformPlot.gain = 1.0
        waveformPlot.shouldFill = true
    }

    override func viewDidAppear(_ animated: Bool)
    {
        super.viewDidAppear(animated)

        setupWaveformPlot()

        self.didDismissComposerDetailToRootController()
    }

    // Upon tapping the Record Button, it in turn talks to the ViewModel, which then calls the Microphone module to record and hook up waveformPlot.node = mic
    @IBAction func tappedRecordView(_ sender: Any)
    {
        self.recordButton.recording.toggle()
        self.recordButton.animateToggle()
        self.viewModel.tappedRecord(waveformPlot: waveformPlot)
        { (waveformViewModel, error) in
            if let waveformViewModel = waveformViewModel
            {
                self.segueToEditWaveForm()
                self.performSegue(withIdentifier: "composerToEditWaveForm", sender: waveformViewModel)
                //self.performSegue(withIdentifier: "composerToDetailSegue", sender: self)
            }
        }
    }
}

The View Model:

import Foundation
import AudioKit
import AudioKitUI

class ComposerViewModel: ViewModelProtocol
{

//MARK:- Instance Variables
var recordingState: RecordingState

var mic:            Microphone                      = Microphone()

init()
{
    self.recordingState = .readyToRecord
}


func resetViewModel()
{
    self.resetRecorder()
}

func resetRecorder()
{
    do
    {
        try mic.reset()
    }
    catch let error as MicErrorsToThrow
    {
        switch error {
        case .audioFileFailedToUnwrap:
            print(error)
        case .recorderCantReset:
            print(error)
        case .recorderError:
            print(error)
        case .recordingTooShort:
            print(error)
        }
    }
    catch {
        print("Secondary catch in start recording?!")
    }
    recordingState = .readyToRecord
}

func tappedRecord(waveformPlot: AKNodeOutputPlot, completion: ((EditWaveFormViewModel?, Error?) -> ())? = nil)
{
    switch recordingState
    {
    case .readyToRecord:
        self.startRecording(waveformPlot: waveformPlot)

    case .recording:
        self.stopRecording(waveformPlot: waveformPlot, completion: completion)

    case .finishedRecording: break
    }
}

func startRecording(waveformPlot: AKNodeOutputPlot)
{

    recordingState = .recording
    mic.setHeadphoneMonitoring()
    do
    {
        try mic.record(waveformPlot: waveformPlot)
    }

    catch let error as MicErrorsToThrow
    {
        switch error {
        case .audioFileFailedToUnwrap:
            print(error)
        case .recorderCantReset:
            print(error)
        case .recorderError:
            print(error)
        case .recordingTooShort:
            print(error)
        }
    }
    catch {
        print("Secondary catch in start recording?!")
    }
}
}
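The question never shows the definition of the `RecordingState` type the view model switches on. Reconstructed from the cases used in `tappedRecord`, it presumably looks like the sketch below; the `nextState(after:)` helper is my own hypothetical addition, purely to make the record/stop toggle explicit:

```swift
// Sketch of the RecordingState enum implied by the switch in
// tappedRecord(waveformPlot:). The question never shows its
// definition, so this is an assumption based on the cases used.
enum RecordingState {
    case readyToRecord
    case recording
    case finishedRecording
}

// Hypothetical helper: the tap handler toggles between the first
// two states (startRecording -> .recording, stopRecording/reset
// -> .readyToRecord); .finishedRecording is a no-op in tappedRecord.
func nextState(after state: RecordingState) -> RecordingState {
    switch state {
    case .readyToRecord:     return .recording
    case .recording:         return .readyToRecord
    case .finishedRecording: return .finishedRecording
    }
}
```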

I am happy to provide more code, but I just do not want to overwhelm everyone with it. The logic seems sound to me; I just feel like I am missing something obvious, or have a complete misunderstanding of AudioKit + AKNodeOutputPlot + AKMicrophone.

Any ideas are most welcome, thank you!

ios audiokit
1 Answer
1 vote

EDIT: AudioKit 4.6 fixed all of my issues! Highly encourage MVVM / modularization of AudioKit for your projects!

====

After a lot of experimentation, I came to a few conclusions:

  1. In a separate project, I brought over my AudioKitConfigurator and Microphone classes, initialized them, hooked them up to an AKNodeOutputPlot, and it worked flawlessly.
  2. In my very large project, no matter what I did, I could not get the same classes to work.
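For reference, conclusion 1 boils down to a very small amount of wiring. This is a sketch of the minimal AKMicrophone to AKNodeOutputPlot hookup against the AudioKit 4.x API used in the question; it needs a real device or simulator audio session and the AudioKit framework, so treat it as illustrative rather than a drop-in file:

```swift
import AudioKit
import AudioKitUI

// Minimal AKMicrophone -> AKNodeOutputPlot wiring (AudioKit 4.x),
// with no singleton or MVVM layers in between.
let mic = AKMicrophone()
let micMixer = AKMixer(mic)
let silentOutput = AKBooster(micMixer, gain: 0) // keep the mic out of the speakers

AudioKit.output = silentOutput
do {
    try AudioKit.start()
} catch {
    AKLog("AudioKit did not start: \(error)")
}

// In the view controller, point the plot at the microphone node:
// waveformPlot.node = mic
```

If this works in isolation but not in the large project, the difference is almost certainly in lifecycle ordering (when the singleton starts the engine relative to when the plot's node is assigned) or in another library touching the audio session.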

For now, I am reverting to an old build, slowly adding components back until it breaks again, and will update the architecture piece by piece, since this problem is too complex and may be an interaction with some other library. I have also downgraded from AudioKit 4.5.6 to AudioKit 4.5.3.

This is not a solution, but it is the only viable one for now. The good news is that it is entirely possible to structure AudioKit to work with an MVVM architecture.
