Pro Audio Asylum

Re: Technical question for the Pros using yesterday's gear with today's (long)


The disc mastering delay is indeed a digital output of the encoded signal, taken prior to the video encoding stage... I'm not sure of the format, however. -You may well know the idea behind mastering delays, but just in case you're not familiar, here's a quick rundown:

Back in the later all-Analog days, disc-cutting rooms had progressed to computerised disc-cutting lathes that could maximise the amount of time you could wring out of a vinyl pressing. -Louder signals make the groove deviate from side to side more, so in the days before computers, louder signals meant shorter recordings: you had to space the spirals out to give more room between adjacent grooves and stop them banging into each other. (...yes, I know... there's only one long groove on each side, but most folks still think of them as 'adjacent grooves'!) Louder cuts meant shorter recordings, and longer recordings meant that the signal-to-noise ratio went down.

The advent of decent computerised lathes (the Neumann VMS-70 series was a great favourite!) meant that the computer could slow down the lead screw in quieter passages -tightening up the groove spacing where the groove deviated less- and speed it up to open the spacing out when the signal got louder. -The only thing that complicated the process was that it had to be able to predict the future... it had to know how loud a signal was going to get and when, so that it could give the groove enough space to get loud before the signal actually got loud...
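To make the idea concrete, here's a little sketch of the pitch computer's basic job: map the loudest level expected in the coming revolution to a groove spacing. The function name and the pitch figures are my own illustrative inventions, not Neumann's actual control law.

```python
def groove_pitch_lpi(preview_peak, min_pitch=100.0, max_pitch=400.0):
    """Map the peak level expected in the next revolution (0.0 to 1.0)
    to a cutting pitch in lines per inch.  Loud passages deviate more,
    so they get a coarser pitch (fewer lines per inch = wider spacing);
    quiet passages can be packed tighter.  Numbers are illustrative only.
    """
    preview_peak = max(0.0, min(1.0, preview_peak))
    # Linear interpolation: silence -> max_pitch (tight grooves),
    # full level -> min_pitch (wide spacing).
    return max_pitch - preview_peak * (max_pitch - min_pitch)

# A quiet passage allows much tighter grooves than a loud one:
print(groove_pitch_lpi(0.1))  # 370.0 lines per inch
print(groove_pitch_lpi(0.9))  # 130.0 lines per inch
```

A real pitch computer also has to look at how far the *previous* groove deviated, but the one-variable version above captures the louder-means-wider trade-off.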

The analog tape machine solution was to provide two playback heads and two sets of playback electronics, with a fairly long tape path between them. -The first head (the 'preview' output) would let the lathe see what was going to happen one revolution ahead, so the computer could figure out how much space to give the next groove for the signal level. -The second playback output -one full revolution later- would be the signal actually sent to the cutting head, now that the computer had spaced the groove as wide as it needed to be, but as tight as it could be.

The length of the tape path between the two playback heads needed to be different for different turntable speeds -45 RPM and 33+1/3, for example- and this was normally accomplished by keeping the heads fixed but running the tape through different 'pulley-paths', routing it over different guide rollers for each cutting speed.
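To put numbers on that 'one revolution' delay (my arithmetic, not figures from the post): a revolution at 33+1/3 RPM lasts 1.8 seconds, at 45 RPM about 1.33 seconds, and at a typical mastering tape speed of 15 or 30 inches per second that works out to a couple of feet of tape between the heads.

```python
def revolution_seconds(rpm):
    """Duration of one turntable revolution: the preview-to-cut delay."""
    return 60.0 / rpm

def head_spacing_inches(rpm, tape_ips):
    """Tape path length between preview and program heads needed to
    realise that delay at a given tape speed (inches per second)."""
    return revolution_seconds(rpm) * tape_ips

# One revolution at 33 1/3 RPM is 1.8 s; at 15 ips that is 27 inches of tape.
print(revolution_seconds(100.0 / 3.0))            # 1.8
print(head_spacing_inches(100.0 / 3.0, 15.0))     # 27.0
print(round(head_spacing_inches(45.0, 15.0), 1))  # 20.0
```

This also shows why the pulley-paths had to change per speed: the same pair of fixed heads needs 27 inches of tape between them for a 33+1/3 cut but only 20 inches for a 45, at the same tape speed.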

Some mastering facilities used a single-head playback machine and a stereo digital delay to generate the delayed signal. The advantages included having only one stereo playback alignment instead of two, and the delay line could have setting memories to switch quickly between the delay times for different cutting speeds. -The big disadvantage was that the signal was degraded by the A/D & D/A conversion and filtering, especially given the state of the art of converter design at that time!

Now.... back to the DBX units... Using the boxes you have, the two signals ('preview' and 'cut') should be identical, although you'd need two decode devices. -The first playback converter would decode the video and convert it to raw digital information, which would be sent both to the D/A converter (feeding the lathe its 'preview' signal) and to the mastering delay output. This would be delayed -still in its digital form- by the required amount (one lathe revolution), and its output would then be fed to a second playback decoder, where the identical raw digital data (just later in time) would be re-constructed into an analog waveform by a converter identical to the first (preview) one.
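The digital delay stage described above boils down to a fixed-length buffer clocked at the sample rate: samples go in, and come back out exactly one revolution's worth of samples later. A minimal single-channel sketch (the class name and interface are my own; the DBX hardware's internals are not documented here):

```python
from collections import deque

class PreviewDelay:
    """Fixed delay line: the output trails the input by exactly
    `delay_samples` samples, as a one-revolution mastering delay
    would per channel."""

    def __init__(self, delay_samples):
        # Pre-fill with zeros so the first outputs are silence.
        self.buf = deque([0.0] * delay_samples)

    def process(self, sample):
        self.buf.append(sample)
        return self.buf.popleft()

# At a hypothetical 44.1 kHz rate, one 33 1/3 RPM revolution would be
# 1.8 s = 79,380 samples; a 3-sample delay is enough to show the idea:
d = PreviewDelay(3)
print([d.process(x) for x in [1.0, 2.0, 3.0, 4.0, 5.0]])
# [0.0, 0.0, 0.0, 1.0, 2.0]
```

Both branches read the same data, just shifted in time, which is why the two analog outputs should be identical apart from the one-revolution offset.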

The delay line and second converter would all have to run from a common sample clock, so either the 25-pin port would carry the timing clock along with the data, or there would be an additional clock source -similar to the BNC wordclock connector used with SDIF, MADI or SDIF-2 nowadays... -Hmmm.... I don't think it could actually be SDIF-2, though, because of the high (non-standard) sampling rate... -It might use similar line-driver hardware, though...

Unfortunately, the great bulk of my expertise is in Analog (long may it reign!) -If you can get a computer data-storage geek to figure out a way to lock onto the clock, strip the data and send it to a common storage system, you might have a nice little setup there, but I'm afraid I don't know anything about that end... As soon as music stops being a waveform and starts being a stream of numbers I just lose interest... (-It seems more like accounting to me than signal treatment!!!)

I suppose Roland's shorter answer is a better one... Simply put, there's no way unless you know -or are- someone who's really good at data storage!

I'd love to know if you can get it to work, though...

Keith A.

