Audio Measurements – Beyond RightMark

The go-to program for audio measurement in the Internet age is RightMark Audio Analyzer (RMAA). It’s not an easy program to use in isolation, and it is best used with some old-skool analogue technology. In particular, it doesn’t do absolute level in any way – everything is referenced to 0dBFS.

RMAA testing is deconstructed by NwAvGuy here. His thesis is that it is impossible to use RMAA right, particularly if you have no experience of analogue electronics and no other test gear. And I’m guilty as charged of publishing RMAA test results on the internet 🙂

It saddens me a little that measurement has now become ‘go out and buy £x,000 worth of test gear, plug it in, attach to the D.U.T., press the button and report the result’. And if you can’t do that – well, no Audio Precision test kit, no comment. I’m not dissing NwAvGuy’s observation – it’s the loss of other ways of testing audio gear that I regret. I don’t test for distortion – I scan for it. That’s because I’m usually testing finished gear for how noisy it is with mics at low levels. If distortion and frequency response look okay with RMAA, great; if they don’t, I look for what I have done wrong in the setup. Most manufacturers get the distortion and frequency response basics right, but mic preamp noise does vary: most audio recording is music and therefore has plenty of signal, so preamp noise is not usually a key parameter in a field recorder.

BBC Designs EP14/1 audio test set – a tone source and a meter

Way back when I was working at BBC Designs, using their EP14/1 test set, things were a little more first-principles than ‘press the button on this expensive gear and report back’. The EP14/1 was basically a tone source and a meter with a precision attenuator in front of it. The meter was used comparatively – you would adjust the attenuators to make it read the same as a reference reading, and the wanted information was in the difference between the attenuator settings. This way any nonlinearity of the meter scale was largely cancelled out.

I use a vintage audio generator and an RMS meter here to mimic that functionality. However, to line up a 1kHz sine wave you can use a regular DVM on its AC range, because these are designed to read correctly with a sine wave, provided the meter is specified at 1kHz (the Fluke 73 is). In general you don’t want to be measuring the signal at the -67dBu level – by using a 100:1 attenuator I can measure at the input of the attenuator much more easily.

Noise, level and frequency response can be measured to a satisfactory degree with analogue equipment using the methods of yesteryear. Frequency response is tedious to measure by hand because you have to take a lot of points – you’d have to be a little bit mad to do it the old way, and that is exactly what RMAA is designed for: most digital gear looks flat bar the usual rolloff at the frequency extremes. Just make sure the test signal is within the linear range of the medium.

So I will use the old ways. It’s more gonzo than pressing the button on an Audio Precision test set, but it is more accessible, provided you keep your wits about you. It’s how engineers used to test basic audio parameters, and the need to qualify signal levels, sensitivity and noise hasn’t gone away. Analysing distortion probably does need the finest AP test kit, but not everything does.

Mic preamp noise testing

The thing I usually test is the noise performance of mic preamps, and RMAA knows nothing of absolute levels. However, not everybody who can’t afford an Audio Precision audio test set is automatically disenfranchised from making measurements. Your lack of money means you will have to be careful, and your results will of course lack the precision and traceability to national standards of an AP test set.

Here is how I test preamp noise. In general the units I test are recorders, in which case one of the objections to the use of sound cards automatically disappears – the recorder is the ADC; that is its job!

I use a battery-powered 1kHz tone source at -67dBu with a source impedance of 150Ω[ref]150Ω is the typical source impedance for measuring a microphone input, corresponding to -131dBu of noise in a 20kHz bandwidth[/ref]. The tone source is a home-made Wien Bridge oscillator based on the Bubba oscillator[ref]TI SLOA078 Sine Wave Oscillators[/ref]; the nice thing about this is that it uses the opamps limiting on the PSU rails to set the amplitude. It’s easier to control the supply voltage over the long term than to stabilise the classic PTC lamp feedback that started Dave Packard and Bill Hewlett off in their garage. The 9V battery is regulated down to 5V to remove the first source of variability – battery voltage – which can fall to 6.7V and still deliver a stable output.
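For a feel of what -67dBu means in volts: 0dBu is 0.7746Vrms, so the tone source delivers a few hundred microvolts. A quick Python sketch of the conversion (the helper name is mine):

```python
import math

DBU_REF_VRMS = 0.7746  # 0 dBu = 0.7746 Vrms (1 mW into 600 ohms)

def dbu_to_vrms(level_dbu: float) -> float:
    """Convert a level in dBu to volts RMS."""
    return DBU_REF_VRMS * 10 ** (level_dbu / 20)

# the tone source's -67 dBu output, in microvolts RMS
print(round(dbu_to_vrms(-67.0) * 1e6))  # about 346 uV
```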

How do I know the output is -67dBu? More to the point how do I know it is still -67dBu some time later?

the tone source, presenting -67dBu at a source impedance of 150Ω

I have a Tek analogue ‘scope, a Fluke RMS meter and a Fluke 73 multimeter. All of this is very secondhand – at a guess it may all be over 20 years old, and it’s been a long time since any of this gear has seen a calibration facility; ISO9001 calibration traceable to NAMAS[ref]National Measurement Accreditation Service, which is now UKAS[/ref] is not on the cards. I also have a Farnell audio oscillator. The plan is to measure the oscillator output with the scope and the Fluke 73, and after a 100:1 attenuator (made with a 15k series resistor and 150Ω to ground) with the Fluke RMS meter set to a 600Ω reference. Then to use the 10dB steps on the oscillator to back the signal off to -67dBu and compare the reference with this, using the recorder to amplify the signal to make it visible.[ref]I visually confirmed the 10dB steps on the Fluke and went one 10dB stage further to confirm I hadn’t dropped into the noise floor of the Fluke RMS meter. I concluded that the Farnell 10dB attenuators were satisfactory despite the 35-year-old vintage of this kit[/ref]
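As a sanity check on that divider: strictly, 15k over 150Ω gives 101:1 rather than 100:1, about 0.09dB from the nominal -40dB – well inside the tolerances claimed here. A sketch (variable names are mine):

```python
import math

r_series = 15_000.0  # 15k series resistor
r_shunt = 150.0      # 150 ohm to ground

# unloaded voltage-divider ratio; the RMS meter's 10M ohm input
# loads the 150 ohm shunt negligibly
ratio = (r_series + r_shunt) / r_shunt
loss_db = 20 * math.log10(ratio)

print(ratio, round(loss_db, 2))  # 101.0 and 40.09 dB
```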

Since I am comparing one reference with another at the same frequency the gain of the recorder is not relevant. Sources of uncertainty are:

  1. the attenuator
  2. the absolute calibration of ‘scope, Fluke 73, Fluke RMS meter
  3. the 10dB steps of the oscillator

For the attenuator I connected input and output to the two channels of the ‘scope and adjusted the gain on the output side to be 100x that of the input.

scope showing attenuator output x100 inverted against input

I think I can justifiably say I am within 100:1 ±2.5% – within half of one of the small graticule divisions. I swapped the channels and gains and got the same result. Yes, there could be some systematic error specific to the lower range of the input attenuators that would be cancelled out, but I’m after due diligence here rather than incontrovertible proof – I’m not a court of law sending someone down for five years, and money is an object. I’ll make the assumption that Tek didn’t screw up their resistor selection.

So I feel good about point 1.

Howsabout absolute calibration? Tricky one, I have no Weston cell, no access to something traceable to national standards. Nevertheless, let’s compare these three measurement systems.

The Tek Oscilloscope

Scope showing +13dBu (note the readout is correct for a 1x probe, as I am not using a Tek 10x probe)

I lined up the input to the RMS meter via the attenuator to read -27dBu after the 100x loss, so this reading from the scope before the attenuator should be +13dBu (i.e. 20 log10(100) = 40dB higher).

It is 9.8V p-p, so Vp = 9.8÷2 = 4.9V, and for a sine wave Vrms = Vp÷√2 = 4.9÷√2 ≈ 3.46Vrms

At this stage I got lazy and entered 3.46Vrms into Sengpiel audio’s calculator to get 13.0dBu. To do it the hard way:

dBu = 20×log(3.46/0.7746[ref]The reference voltage for 0dBu is 0.7746V; log is log base 10[/ref])
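The peak-to-peak-to-dBu conversion chain as a short script:

```python
import math

v_pp = 9.8                     # peak-to-peak read off the scope graticule
v_peak = v_pp / 2              # 4.9 V peak
v_rms = v_peak / math.sqrt(2)  # sine wave: Vrms = Vp / sqrt(2)
level_dbu = 20 * math.log10(v_rms / 0.7746)

print(round(v_rms, 2), round(level_dbu, 1))  # 3.46 Vrms, 13.0 dBu
```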

The Fluke 73 DVM

Fluke 73 says 3.4Vrms – within a few percent of the scope reading

Looking at the RMS meter it shows

The Fluke RMS meter reading of a ~13dBu signal after the 100x attenuator (-40dB)

I feel okay now about my attenuator and my three pieces of test gear – the scope and Fluke 73 agree at the input of the attenuator, and the Fluke RMS meter squares with the input value less 40dB (20×log10{1/100}). However, it always pays to ask the obvious question, so I moved the scope input to the Fluke RMS meter (it can show dBu[ref]actually the scale says dBm, but the input impedance is 10MΩ, so with the 600 setting the display shows the same as the dBu value[/ref] directly, using the knob to set the reference impedance to 600Ω)

Fluke RMS meter showing input of +13dBu

It finally remains to use a recorder, e.g. the Olympus LS10, to compare the attenuated output of the Farnell oscillator with the home-built -67dBu tone source.

Olympus showing -14dBFS for the test input from the Farnell, at max gain

Now the tone source

LS10 shows the same level of -14dBFS for the battery powered tone source

I conclude that despite the test kit being over 20 years old there is reasonably good confidence that this really does show a level of -67dBu ±1dB. That’s not fantastic – it isn’t as good as an Audio Precision test set – but it is eminently usable. It is enough to show, once the recording is analysed first with the tone and then with the tone source switched off but still terminating the input, roughly what the noise level of the mic preamplifiers is and what the sensitivity is. It isn’t incontrovertible – it is possible that all three measures of absolute value have drifted identically – but they are unlikely to be 10% out (corresponding roughly to 1dB in voltage).
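The 10% ≈ 1dB rule of thumb is easy to check: 20log10(1.1) ≈ 0.83dB, and 1dB works out to about a 12% voltage ratio.

```python
import math

# voltage ratio to dB and back
ten_percent_db = 20 * math.log10(1.10)  # a 10% voltage error, in dB
one_db_ratio = 10 ** (1 / 20)           # the voltage ratio for 1 dB

print(round(ten_percent_db, 2), round((one_db_ratio - 1) * 100, 1))  # 0.83, 12.2
```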

I’d notice that sort of error in a multimeter over time – all 5V TTL rails would show as 4.5V or 5.5V, which would make me uncomfortable; I usually want to see 4.8 to 5.2V. Some time back I compared this multimeter to another of the same model, and at ~12V they agreed to within one LSB in four digits. That’s kind of stupendous for analogue gear over 20 years old. Analogue has a terrible rap for stability, but by the end of the analogue era in the 1990s they seemed to have sorted things pretty well.

Finally, to make things easier in future I compared what the Fluke RMS meter said for the Farnell at -67dBu

Fluke RMS meter showing the Farnell via attenuator at -67dBu

with what it showed for the tone source

Fluke RMS meter showing the tone source at -67dBu

and then I pulled the power to the tone source, so the Fluke was terminated in 150Ω, to confirm I was above the noise floor

The noise floor with the input terminated in 150Ω

I was 9dB away, so I certainly don’t want to be any lower, but it is good enough.

With care and with attention to detail, using comparative measurements you can get useful audio test results with boatanchor equipment at modest costs.

Interpreting the output

I capture the output from a recorder as an uncompressed WAV. Running Adobe Audition’s Stats over a selection of tone and a selection of silence, taking the average RMS power[ref]Audition knows only about 0dBFS, not about system levels or impedances[/ref], gives me

-32dBFS for tone (22kHz BW)

then the tone source was powered off but left connected (150Ω still terminating the iPod) and the stats were run on a section of the recorded silence

Silence is -71dBFS (22kHz BW)

The signal present in the silence is the ein (equivalent input noise) of the mic preamplifier (AGC was off in the Spectrumview app used). So ein = -67 + (-71 + 32) = -106dBu in a 22kHz BW, which is about 25dB worse than the 150Ω thermal noise floor. A typical SD card recorder of 2010 vintage comes in about 8 to 10dB better. Which is fair enough – the iPod touch is not billed as a field recorder, and the performance is fine for a close-up mic.
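The ein arithmetic, spelled out (values from the measurements above; the thermal floor figure is the -131dBu from the earlier footnote):

```python
tone_dbu = -67.0    # known level at the input
tone_dbfs = -32.0   # recorder's reading of the tone
noise_dbfs = -71.0  # recorder's reading of "silence" (input terminated in 150R)

# at a fixed gain, dBFS and dBu differ by a constant offset, so the
# tone-to-noise margin in dBFS is also the margin in dBu
ein_dbu = tone_dbu + (noise_dbfs - tone_dbfs)
print(ein_dbu)  # -106.0

thermal_floor_dbu = -131.0  # 150 ohm source, ~20 kHz bandwidth
print(ein_dbu - thermal_floor_dbu)  # 25.0 dB above the theoretical floor
```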

It’s possible to use the RightMark spectrum analyser report instead

RightMark spectrum, measurement bandwidth 2.7Hz, according to RMAA

but you have to increase the noise floor to account for the narrow 2.7Hz measurement bandwidth. The wanted signal measures -34dBFS and the noise about -110dBFS, but the noise has to be increased by 10log10(22,000/2.7) = 39dB[ref]10 not 20 log, because noise power adds with bandwidth[/ref], so the noise sits -34 - (-110 + 39) = 37dB below the tone, i.e. about -67 - 37 = -104dBu. The discrepancy with the Audition result is probably because I had to estimate by eye from the RMAA chart. It is possible the tone source itself is noisy, so the spectrum with it unpowered is shown below – the noise floor is not significantly different.

spectrum with tone source powered off
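The bandwidth correction above, scripted (the chart readings are eyeball estimates, as noted):

```python
import math

meas_bw_hz = 2.7        # RMAA's narrow measurement bandwidth
audio_bw_hz = 22_000.0  # bandwidth the noise should be referred to

# noise *power* scales with bandwidth, hence 10*log10 rather than 20*log10
correction_db = 10 * math.log10(audio_bw_hz / meas_bw_hz)

signal_dbfs = -34.0   # tone, read off the RMAA chart
noise_dbfs = -110.0   # noise floor, read off the chart
margin_db = signal_dbfs - (noise_dbfs + correction_db)  # tone-to-noise margin

print(round(correction_db, 1), round(margin_db, 1))  # 39.1 and 36.9
# with the tone at -67 dBu, the noise is therefore about -104 dBu
```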
