In Reply to: RE: No, the ASRC sample input buffer posted by Slider on July 08, 2007 at 17:28:21
Howdy
Todd doesn't like, in principle, the fact that ASRC transforms any portion of the jitter into noise. His point is that the kind of jitter filtering you are talking about is also available to SSRC, so it's arguably not a feature of ASRC but of having a buffer of any kind.
My point is this: after accounting for the similar jitter filtering of the buffer in a good SSRC implementation and the buffer in a good ASRC implementation, how does the effect of the remaining jitter on the SSRC DAC's sound compare with the effect of the much more jitter-free (but noisier) data on the ASRC DAC's sound?
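To make the jitter-to-noise point concrete, here is a minimal sketch (in Python, with an illustrative full-scale 1 kHz tone and an assumed 1 ns RMS of random jitter, neither figure taken from this thread) of how a sampling-time error turns into an amplitude error in the data:

# Minimal sketch: sampling-time jitter becomes amplitude noise.
# The tone, sample rate, and jitter figure are assumed for illustration.
import numpy as np

fs = 44_100          # sample rate, Hz
f0 = 1_000           # test tone, Hz
jitter_rms = 1e-9    # assumed RMS timing error, seconds

n = np.arange(fs)                                   # one second of samples
t_ideal = n / fs
t_jittered = t_ideal + np.random.normal(0.0, jitter_rms, n.size)

ideal = np.sin(2 * np.pi * f0 * t_ideal)
jittered = np.sin(2 * np.pi * f0 * t_jittered)      # what gets resampled into the data

# The error is roughly dV/dt * dt, so it is largest near zero crossings
# and scales with both signal frequency and jitter amplitude.
noise = jittered - ideal
print("noise RMS: %.1f dBFS" % (20 * np.log10(np.std(noise))))

Once the ASRC has resampled using those imperfect timing estimates, that error is part of the output samples; no downstream clock can remove it.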
Though I agree with Todd in principle that I don't like adding any noise to the data, I have to admit that my old modified Perp Tech P-3A, which proudly used ASRC, sounded pretty darn good.
-Ted
Follow Ups:
I think the root of our disagreement is over the effects of the noise transformation. I think the effects are worse than had the jitter been left alone. I get the impression you don't think it's that awful. It's a disagreement I can live with.
The other thing we must realize is that the output clock in an ASRC has never been demonstrated to be more precise than the clock controlling the input signal, provided the input side is already a competent design. We are adding noise, yet we are also at the mercy of an output clock whose jitter performance has never been truly evaluated. It's another presumption that has never been investigated, and unlike the conversion itself, this one would require actual jitter measurements to determine its validity.
Howdy
Let me take issue with your second paragraph. In general, a local clock is more precise than the same clock separated from the DAC by transceivers and cables. The transceivers, separate power supplies, and even the cables themselves add jitter, which indeed makes a remote clock more jitter-prone than a local clock.
-Ted
You can't do much jitter attenuation with a synchronous buffer, at least not very easily. The jitter reduction I talked about in the ASRC is there because the input and output clocks aren't synchronous; the buffer size really just controls the lower cutoff frequency. But I understand that the remaining jitter is encoded in the data, and that's why I chose the synchronous path, but with a very low cutoff point and extreme isolation between the clock domains.
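Here is a minimal sketch of the "buffer size sets the lower cutoff" idea, treating the ASRC's rate-ratio estimator as a simple first-order low-pass on the incoming timing error (a generic model, not any particular chip's algorithm; the update rate and averaging lengths are assumed):

# Minimal sketch: longer averaging (a bigger effective buffer / loop time
# constant) pushes the jitter cutoff of the ratio estimate lower.
import math

f_update = 44_100.0                      # assumed ratio updates per second
for n_avg in (1_000, 10_000, 100_000):   # samples averaged in the estimate
    alpha = 1.0 / n_avg                  # equivalent first-order coefficient
    f_corner = alpha * f_update / (2 * math.pi)
    print(f"averaging {n_avg:>7} samples -> jitter cutoff ~ {f_corner:6.2f} Hz")

Input jitter below that corner still modulates the output timing (or, in an ASRC, the resampled data); jitter above it is attenuated.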
Thanks for the info
Howdy
I almost lost my post replying to the post you deleted. Now I get to edit my post a little :) I'm not assuming you don't know this, but perhaps it will help some other people following this thread:
People get hung up on generalities when the devil is in the details of any particular implementation.
A synchronous buffer can low-pass filter jitter, but only down to a cutoff set by how long you are willing to wait for sync-up and how much delay you can add to the input signal. The detail some people overlook is that a DAC isn't only used with a CD transport. Often it is used alongside some other parallel signal, e.g. DVD video, and you can't, in general, wait all that long. Also, customers get upset when you take too long to sync up after they interrupt the data stream (e.g. changing channels or skipping tracks...)
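A minimal sketch of that tradeoff, assuming a worst-case clock mismatch of +/-100 ppm (an illustrative number, not one quoted in this thread): the lower you push the jitter cutoff, the longer the lock time and the more samples the buffer has to absorb.

# Minimal sketch: jitter cutoff vs. sync-up time vs. required buffer depth.
fs = 44_100.0        # sample rate, Hz
ppm = 100e-6         # assumed worst-case clock mismatch / wander, +/-100 ppm

for f_cutoff in (100.0, 10.0, 1.0, 0.1):   # desired jitter cutoff, Hz
    settle = 1.0 / f_cutoff                # rough loop settling / lock time, s
    margin = fs * ppm * settle             # samples the buffer must absorb
    print(f"cutoff {f_cutoff:6.1f} Hz -> ~{settle:5.1f} s to lock, "
          f"buffer margin ~{margin:7.1f} samples")

A 0.1 Hz cutoff implies on the order of ten seconds to settle, which is exactly the kind of wait that lip-sync and track-skipping users won't tolerate.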
-Ted