[Clam-devel] TokenDelay patch

Xavier Amatriain xavier at create.ucsb.edu
Sat May 5 21:26:54 PDT 2007


David García Garzón wrote:
> That's what we did originally, 
Uhm? That is not what we had originally (and still have, by the way, when
CLAM_OPTIMIZE is not defined).
What we had was a std::deque that we never reserved any capacity for; all we
used were pushes and pops.

In any case, the current optimized implementation does use a vector, and that
vector is resized according to the max delay. The potentially dangerous
behavior is formally correct; the only reason it did not show up in the
previous implementation is a "bug" in that one. See the explanation:
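
For reference, the optimized storage boils down to something like the sketch
below (illustrative names only, not the actual TokenDelay code): the vector is
resized once to the maximum delay and then used as a circular buffer, so Do
never allocates.

#include <cstddef>
#include <vector>

// Sketch: a fixed-capacity circular buffer sized to the maximum delay.
// Valid delays are 1..maxDelay; names are illustrative, not CLAM's.
class RingDelay
{
public:
    explicit RingDelay(std::size_t maxDelay)
        : mBuffer(maxDelay, 0.0f), mWritePos(0) {}   // one-time resize, zero-filled

    float Do(float in, std::size_t delay)
    {
        // Read the sample written 'delay' calls ago (0.0 until enough samples arrive).
        std::size_t readPos = (mWritePos + mBuffer.size() - delay) % mBuffer.size();
        float out = mBuffer[readPos];
        mBuffer[mWritePos] = in;                      // then store the new sample
        mWritePos = (mWritePos + 1) % mBuffer.size();
        return out;
    }

private:
    std::vector<float> mBuffer;
    std::size_t mWritePos;
};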

If you set a delay of T ms, what should happen when you call Do before T ms
have actually gone by? If you are delaying samples the answer is more or less
obvious: you should return 0! In the deque implementation that is not
happening... you are actually returning the last token written in (which turns
out to be the input).
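
Just to make that difference concrete (a toy, not the real TokenDelay code):
with a delay of 4 tokens, a correct delay outputs 0 for the first 4 calls,
while falling back to the newest token hands the caller its own input back.

#include <cstdio>
#include <deque>

int main()
{
    const unsigned delay = 4;
    std::deque<float> fifo;
    for (unsigned call = 1; call <= 6; ++call)
    {
        float in = call * 10.0f;                       // 10, 20, 30, ...
        fifo.push_back(in);
        bool enoughHistory = fifo.size() > delay;
        float correct  = enoughHistory ? fifo.front() : 0.0f;        // zero until T ms have gone by
        float observed = enoughHistory ? fifo.front() : fifo.back(); // the last token written: the input
        if (enoughHistory) fifo.pop_front();
        std::printf("call %u: in=%g correct=%g observed=%g\n", call, in, correct, observed);
    }
    return 0;
}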

Now, although this is wrong, it has the side effect of being safe when you are
delaying other complex data such as Spectrum...

If you are delaying a Spectrum and you are in the case described above
(current time < delay time), you should be returning a "zero spectrum", which
in practice is a default-constructed Spectrum. The problem is that such a
Spectrum has zero-sized buffers (and a default configuration, for that
matter). If, when calling Do, you expect to get back a Spectrum shaped like
the one you put in, you are dead in the water!

So, my take on the problem is that my implementation is correct... but unsafe.
It demands that the user do something like:

myDelay.Do(in, out);
if (out is default constructed)
    Initialize out to the right size and configuration
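
In concrete terms, and assuming the usual Spectrum accessors (take this as a
sketch of the caller-side workaround, not code that is in the repository):

CLAM::Spectrum out;
myDelay.Do(in, out);
if (out.GetSize() == 0)   // default-constructed "zero spectrum": the delay is not filled yet
{
    // Reshape the output so it matches the spectrum we fed in.
    out.SetSize(in.GetSize());
    out.SetSpectralRange(in.GetSpectralRange());
}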

Possible solutions:

- Use of prototypes (I do not favor this interface much; see the sketch below)
- Do the previous check inside the TokenDelay function
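
To make the first option concrete, here is a hypothetical sketch of what a
prototype-style interface could look like (none of this exists in CLAM): the
caller supplies a properly shaped "zero" token once, and Do returns a copy of
it whenever nothing old enough is available.

#include <deque>

// Hypothetical prototype-style delay; not CLAM code.
template <class TokenType>
class PrototypedDelay
{
public:
    PrototypedDelay(unsigned delay, const TokenType & prototype)
        : mDelay(delay), mPrototype(prototype) {}

    void Do(const TokenType & in, TokenType & out)
    {
        mFifo.push_back(in);
        if (mFifo.size() <= mDelay)
        {
            out = mPrototype;       // not enough history yet: return the user's zero token
            return;
        }
        out = mFifo.front();        // the token written mDelay calls ago
        mFifo.pop_front();
    }

private:
    unsigned mDelay;
    TokenType mPrototype;
    std::deque<TokenType> mFifo;
};

The second option would do essentially the check shown earlier, but inside
TokenDelay's Do, which needs some type-specific way of shaping the output from
the current input.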
