In yesterday’s post, I posed a question: do USB chipsets matter in the 2.0 environment? I had reason to suspect they might.
The answer is: holy crap, yes, they matter, they matter so much it is unbelievable.
First, let me talk about what prompted this research, so you’ll know why this matters.
On my old sound interfaces I had live monitoring in hardware, so I didn’t have much need to care about latency. Since that won’t mean much to most people, I’ll explain: when recording, it’s good if you can hear yourself in headphones. If you’re multitracking, it’s critical.
My old audio interfaces did this with direct connections in the hardware. Whatever came in the microphones also went out the headset. There are advantages to this method, but also disadvantages, in that you aren’t actually hearing what’s being recorded, just what’s coming into the microphone jack.
But now, I have this shiny new 1818vsl, which doesn’t do hardware monitoring under Linux. Higher-end kit generally doesn’t provide that; the assumption is that you have enough computer to send back what’s actually being recorded, effects and all, and that you’ll monitor that instead.
This means I now have to care about latency in my system. Latency is basically delay: between mic and computer, and between computer and headset. And if the computer is feeding my monitor headphones, that delay matters. You want to hear yourself live, or close to it, not with, oh, a quarter second of delay or something horrible like that.
Now, the good news was that straight out of the box on Ubuntu 16.04 (the latest long-term support version), I had better, lower latency numbers on my new 1818vsl than on my old hardware, when I was using that on 12.04. I could get down to a buffer size of 256 samples, and three frames, which gave me about 30ms basic latency – roughly half what I had with my old hardware and old install. I could use it as-was.
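For the curious, here’s roughly what that configuration looks like handed to JACK on the command line. This is a sketch, not my literal invocation: I’m assuming a 48kHz sample rate and the ALSA backend, and hw:VSL is a placeholder for whatever name your interface gets.

# Start JACK with a 256-frame buffer and 3 periods per buffer.
# hw:VSL is a placeholder; find your device’s name with: aplay -l
jackd -d alsa -d hw:VSL -r 48000 -p 256 -n 3

The rough math: frames × periods / rate per pass through the buffer, so 256 × 3 / 48000 ≈ 16ms each way, which puts a full mic-to-headphones round trip right around that 30ms figure.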
But I couldn’t go any lower on those buffers. One more setting down, and even playback would lag. It’d be okay until the system had to do anything else; then you’d get a playback pause, or a skip, or – if recording – lost sound (presumably; I didn’t bother trying). That’s unacceptable, so 30ms was the lower limit, and I wasn’t sure it was a safe lower limit.
And that’s what got me doing all that chipset research I talked about yesterday, and I ordered a new USB card (the kind that plugs into a PCI slot) based on that research. I was hoping for a couple fewer milliseconds of latency that I wouldn’t actually even use; I just wanted a safety margin.
So that new card arrived on Sunday, with its OHCI-compliant chipset made by NEC, and I popped it into the machine and started things up with normal settings.
At first, I was disappointed, because I only saw about half a millisecond less lag, instead of the 1-2ms drop I’d hoped to see. But across tests, it was more consistent – it was always at that same number, which meant I could rely on that 30ms latency in ways I wasn’t sure I could before.
Then I decided to see what would happen moving the sample buffer setting one level lower, into what had been failure mode. And the result was 1) it actually worked just fine, where it hadn’t before, and 2) when running analysis, tests showed much lower latency at that setting than with the previous USB ports.
That was an ‘oh ho’ moment, because it implied that the 256-sample setting was basically the spot at which the on-motherboard USB could just keep up, and trying to run faster wouldn’t produce any actual processing improvement. It’d try, but fail, and time out.
So I did a couple of recordings on that, and they all worked. Then I dropped it another level, until finally I just said hell with it: let’s set it as far down as the software will allow and see how hilariously we explode.
I just successfully recorded test tracks four times with those bottomed-out settings, on the new card.
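For a sense of what “as far down as it will allow” means in those terms, here’s an illustrative invocation. The exact values are my guesses, chosen because they work out to the number below, not a record of what the software actually chose:

# Hypothetical bottom-of-the-dial settings, assuming 48kHz;
# hw:VSL is still a placeholder for your interface.
jackd -d alsa -d hw:VSL -r 48000 -p 16 -n 2

That’s 16 × 2 / 48000 ≈ 0.00067 seconds, or about 0.7ms.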
0.7 milliseconds isn’t even something you think about on USB 2.0. 2.8ms, maybe, okay. I’ve seen that managed a few times before, and that’s genuinely indistinguishable from realtime/hardware monitoring. But 0.7ms?
Seriously, this is well into “…is that actually possible?” territory. I’ve never even heard of someone running over USB 2.0 at latencies this low.
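If you want to sanity-check numbers like these instead of trusting the software’s estimate, JACK ships a loopback tester called jack_iodelay. A sketch, assuming a patch cable physically looped from output 1 back into input 1; the client and port names below are typical but can vary by version, so check yours with jack_lsp:

# Start the round-trip latency tester (it registers as a JACK
# client named jack_delay).
jack_iodelay &

# Patch the tester through the hardware; the cable loop closes the path.
jack_connect jack_delay:out system:playback_1
jack_connect system:capture_1 jack_delay:in

# jack_iodelay then prints the measured round-trip latency in
# frames and milliseconds.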
So, I guess it looks like the chipset matters a whole lot. Maybe not for most applications, and maybe not in the same way as in USB 3.0 or in FireWire, where there are serious compatibility issues. But in the 2.0 world, in realtime audio, it appears that the chipset makes all the difference in the world.
And yet, I can find this nowhere online. I’m beginning to think nobody bothered until now. Certainly when I’ve asked about it, the response has been “why are you on USB get firewire” or “why are you on USB get PCI” because sure I want to throw out all this hardware and start over THANKS NO.
I think USB users have been trained just to accept it and deal. But surprise! You don’t have to! You can actually get a better USB card, if your system allows it, and it’s $30 instead of $1300!
So, HELLO, OTHER SMALL-STUDIO MUSICIANS! You want a chipset that uses OHCI on the USB 1.1 level, even if it’s a USB 2.0 card or later, because the 1.1 layer still matters, and still gets invoked by the higher-level drivers for card management. See previous post for why that’s important.
This means avoid Intel and VIA chipsets, and look for NEC or SiS – or anything else that loads OHCI drivers and not UHCI. If you’re on Linux, you want to:
grep usb /proc/interrupts
If you see “uhci_hcd” in there, you have a UHCI chipset running your USB ports, and getting a new USB card with an OHCI-compatible chipset (and disabling whatever’s already installed) might help you with your latency issues.
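The interrupt list tells you which driver is in play, but not whose silicon it is. To identify the vendor (and to confirm the new card is the one handling your interface afterward), lspci is more direct; output formats vary a bit, so treat these as approximate:

# List USB host controllers with their vendors (NEC, Intel, VIA, etc.):
lspci | grep -i usb

# Show which host-controller drivers are loaded as modules. On kernels
# that build them in, they won’t appear here; /proc/interrupts still will.
lsmod | grep -E 'uhci|ohci|ehci'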