Reputation: 189
I have set up a MIDI input port in my code and attached a callback for reading incoming MIDI data. That part is working fine. I am reading MIDI Timecode (MTC) and parsing it in my callback. What I have noticed is that, depending on when I start my application, I can be as much as a second behind the device that is transmitting the MTC; other times I am only a frame behind. Either way, it is inconsistent and frustrating. I am not doing any blocking or Obj-C calls in my readProc. I have even gone to the trouble of disconnecting my USB MIDI device after running my application to see whether any weird IOKit behavior is involved. I could really use some help, even wild-eyed theories. I feel as if MIDI timestamps are useless, since there is no objective reference to compare them to.
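For reference, the readProc is structured roughly like this (a stripped-down sketch; the function name is just a placeholder and the actual MTC quarter-frame accumulation is elided):

```c
#include <CoreMIDI/CoreMIDI.h>

// Simplified readProc: walk the packet list and pick out MTC quarter-frame
// messages (status byte 0xF1). No blocking calls, no Obj-C.
static void mtcReadProc(const MIDIPacketList *pktList,
                        void *readProcRefCon,
                        void *srcConnRefCon)
{
    const MIDIPacket *packet = &pktList->packet[0];
    for (UInt32 i = 0; i < pktList->numPackets; ++i) {
        for (UInt16 b = 0; b + 1 < packet->length; ++b) {
            if (packet->data[b] == 0xF1) {
                UInt8 piece = packet->data[b + 1] >> 4;   // which of the 8 quarter-frame pieces
                UInt8 value = packet->data[b + 1] & 0x0F; // 4 bits of timecode data
                // ...accumulate the 8 pieces into hh:mm:ss:ff here...
                (void)piece; (void)value;
            }
        }
        packet = MIDIPacketNext(packet);
    }
}
```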
Upvotes: 1
Views: 720
Reputation: 40390
I'm going to assume that you know what you are doing here and mean actual MIDI Timecode rather than MIDI clock, which is the more common of the two synchronization methods. Regardless, MIDI is slow, and you will need to provide an offset (probably in milliseconds) to the client so that it can react accordingly. For example, look at how Ableton Live does it:

[screenshot: Ableton Live's MIDI sync delay setting]
I realize that the above screenshot is for MIDI clock, but the same approach should apply to MTC as well. You may need to provide some type of UI for setting the offset since, as you have discovered, the latency changes depending on runtime conditions.
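If you want a starting value for that offset rather than relying purely on the user dialing it in, note that CoreMIDI input timestamps are host-time values (the same clock as mach_absolute_time()), so you do have an objective reference: you can ask how old each packet already is by the time your readProc sees it. A rough sketch, with a hypothetical helper name, of the kind of measurement I mean:

```c
#include <stdint.h>
#include <CoreMIDI/CoreMIDI.h>
#include <mach/mach_time.h>

static mach_timebase_info_data_t sTimebase; // filled lazily below

// How old is this packet, in milliseconds, at the moment we handle it?
// CoreMIDI input timestamps and mach_absolute_time() use the same clock.
static double packetAgeMillis(const MIDIPacket *packet)
{
    if (sTimebase.denom == 0) {
        mach_timebase_info(&sTimebase);
    }
    uint64_t now = mach_absolute_time();
    if (packet->timeStamp == 0 || packet->timeStamp > now) {
        return 0.0; // no timestamp, or stamped in the future
    }
    uint64_t ticks = now - packet->timeStamp;
    uint64_t nanos = ticks * sTimebase.numer / sTimebase.denom;
    return (double)nanos / 1.0e6;
}
```

Averaging that over a second or so of incoming quarter frames gives you a measured in-machine delivery delay you could pre-fill into the offset control; whatever lag the transmitting device adds on its own side would still have to be tuned by ear.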
Upvotes: 0