[ale] [OT] Help! burned cd playback wave clipped, original wave perfect
Alex Carver
agcarver+ale at acarver.net
Mon Sep 9 21:31:24 EDT 2013
On 9/9/2013 17:16, Ron Frazier (ALE) wrote:
> Hi Neal,
>
> Thanks for the tip. I was able to solve the problem I had using the mic input on the laptop, once I figured out how to set it. I'll keep the usb dongle in mind. I have some really cheap ones of those from amazon (no name brand) for a few dollars each. Those had mono inputs, and I needed stereo. The one you mentioned might be better.
>
> In terms of the equipment, I had a dual cassette deck line out hooked into an amp, then the line output of the amp hooked to the laptop. The main problem was that, when I set the controls of the mic input on the laptop one way, it clipped even though the overall level was not too high, and when I set it the other way, it worked. It was quite confusing.
There's not much to the issue. Microphone-level inputs expect
microvolts to single-digit millivolts (maybe 10-20mV at the absolute
maximum for some types of microphones). Line-level signals are
typically around one volt peak (707mV RMS for a pure sinewave). The
microphone preamp in the laptop uses gain to give the sound card's
ADC a measurable signal[1]. Apply that much gain to a line-level
signal and a microphone preamp that expects no more than a few
millivolts will clip the ADC.
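As a rough illustration, here's that arithmetic in a few lines of
Python. All of the voltages and the gain below are assumed ballpark
figures, not measurements of any particular sound card:

    # Sketch of a mic preamp sized for millivolt signals receiving
    # a line-level signal. All values are illustrative assumptions.
    MIC_PEAK_V  = 0.007   # ~7mV peak, a typical microphone signal
    LINE_PEAK_V = 0.7     # ~700mV peak, a typical line-level signal
    PREAMP_GAIN = 100     # gain that brings 7mV up to a usable 0.7V
    SUPPLY_V    = 3.3     # the preamp can't swing past its supply rail
    ADC_MAX_V   = 1.0     # the ADC expects roughly +/-1V

    for name, v_in in (("microphone", MIC_PEAK_V), ("line out", LINE_PEAK_V)):
        v_out = min(v_in * PREAMP_GAIN, SUPPLY_V)  # amp clips at the rail
        status = "CLIPS" if v_out > ADC_MAX_V else "ok"
        print(f"{name}: {v_in * 1000:.0f}mV in -> {v_out:.1f}V out ({status})")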
[1] A cheap ADC for a sound card is usually around 10 bits of
conversion resolution and the input swing is expected to be about one
volt peak (line-level signals, since it's a sound card). A signal
swinging from -1 V to +1 V spans 2 volts full scale (Vfs), so one bit
of resolution on a 10-bit ADC is approximately 2mV:

2Vfs/2^10 = 2Vfs/1024 = 0.00195 V, or about 2mV.
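The same arithmetic as a quick Python check (the 7mV microphone swing
is an assumed, typical figure):

    # Worked version of the footnote's arithmetic.
    import math

    BITS         = 10     # ADC resolution
    V_FULL_SCALE = 2.0    # -1V to +1V input range

    lsb = V_FULL_SCALE / 2**BITS
    print(f"1 LSB = {lsb * 1000:.2f}mV")                        # ~1.95mV

    mic_swing = 0.007                                           # ~7mV assumed
    print(f"ADC codes spanned: {mic_swing / lsb:.1f}")          # ~3.6 codes
    print(f"effective bits: {math.log2(mic_swing / lsb):.1f}")  # ~1.8 of 10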
A microphone has only a few millivolts of total swing, so the ADC
can't see a microphone directly with any decent resolution. Only a
few bits' worth of output would show up; the upper bits are "wasted",
in a manner of speaking. The sound card thus uses an amplifier on the
microphone input to give the ADC an opportunity to improve fidelity
by using more of its range (to stop "wasting" bits). The gain factor
from a few mV to a few hundred mV is going to be on the order of 100.
However, the amplifier is just a simple amplifier (granted, one that
can be controlled via software, but otherwise fairly simple). If you
drive that amplifier with a signal of about 70mV instead of 7mV, its
output will try to reach a huge 7 volts, seven times higher than the
ADC ever expected. It won't make it, though, because the amp probably
runs on the 5V or 3.3V power supply, so its output will clip. And
even if the amp could output 7V, you've already exceeded the ADC's
limits, and the ADC itself clips (digitally).
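To see how that flattens a clean waveform into what you recorded,
here's a rough simulation of the whole chain (preamp gain, supply
rail, ADC saturation) using the same assumed values as above:

    # Rough simulation: preamp gain, supply-rail clamp, then ADC
    # saturation. All values are illustrative assumptions.
    import math

    GAIN      = 100
    SUPPLY_V  = 3.3   # amp output clips at its supply rail
    ADC_MAX_V = 1.0   # ADC saturates beyond roughly +/-1V

    def preamp_then_adc(v_in):
        v = max(-SUPPLY_V, min(SUPPLY_V, v_in * GAIN))  # analog clipping
        return max(-ADC_MAX_V, min(ADC_MAX_V, v))       # digital clipping

    # One cycle of a 70mV-peak sine wave, 64 samples:
    samples = [preamp_then_adc(0.070 * math.sin(2 * math.pi * n / 64))
               for n in range(64)]
    pinned = sum(abs(v) >= ADC_MAX_V for v in samples)
    print(f"{pinned} of 64 samples pinned at full scale")  # 58: nearly a square wave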
So it comes as no surprise that driving your microphone input with a
line-level output clipped, even with the line level turned down (and
turning up the microphone gain on top of that made things worse). The
only way around the problem is unity gain on the microphone input
which, in a very loose manner, turns it into a line-level input.
Even that isn't a great solution, because the circuit on a microphone
input is very different and you can end up with all sorts of other
artifacts that aren't normally present on a line input (compression,
for example).