
[Octave-bug-tracker] [bug #42742] polygcd fails valid test


From: Dan Sebald
Subject: [Octave-bug-tracker] [bug #42742] polygcd fails valid test
Date: Sun, 05 Oct 2014 01:17:31 +0000
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:18.0) Gecko/20100101 Firefox/18.0 SeaMonkey/2.15

Follow-up Comment #6, bug #42742 (project octave):

I've looked into this issue a little more.  It doesn't seem to be a tolerance
issue, but a case of deconvolution going unstable when the random polynomial
coefficients are ill conditioned.

What I was noticing is that on multiple passes through the loop the residues
(the 'r') exhibit an error for no apparent reason, and that error then gets
amplified on successive passes, eventually pushing r above the tolerance.  The
error can sometimes be rather large.  By error, I mean that when printing out


## diagnostic inserted in polygcd's loop, after [d, r] = deconv (b, a)
if (length (a) <= length (b))
  cv = conv (d, a) + r;
  cv - b     # should be all zeros, per the deconv documentation
endif


I was seeing a non-zero cv - b on occasion, and the deconv() help says that
difference should be zero, by definition.
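
For contrast, here is a quick sanity check of that identity with a
well-conditioned divisor (the coefficient values are just illustrative):


a = [1, 0.5];           # divisor whose root is inside the unit circle
b = [2, -1, 3];
y = conv (b, a);
[d, r] = deconv (y, a);
conv (d, a) + r - y     # all zeros (up to rounding), as documented
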

So, try this routine:


for i = 1:100
  a = rand (3, 1);                # random degree-2 divisor
  b = rand (15, 1);               # random degree-14 quotient
  y = conv (b, a);                # exact product, so deconv should recover b
  [b_est, r] = deconv (y, a);
  z = conv (b_est, a) + r - y;    # should be all zeros, by definition
  a
  roots (a)                       # poles of the filter used by deconv
  [b b_est]                       # compare true and estimated quotients
  if (any (z))
    z
    break;
  endif
endfor


and notice how the cases that fail have a(1) near zero and how the
deconvolution results can quickly go out of bounds.  Otherwise, the results
look quite accurate.
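
To reproduce the failure deterministically rather than waiting on rand(),
one can force the leading coefficient to be tiny (the 1e-4 below is just an
illustrative value):


a = [1e-4; rand(2,1)];       # leading coefficient near zero
b = rand (15, 1);
y = conv (b, a);
[b_est, r] = deconv (y, a);
max (abs (b_est - b))        # huge: each filter step divides by a(1)
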

Well, this is partly unsurprising: the reason is that 'a' supplies the
feedback coefficients of the filter call used in deconvolution:


    b = filter (y, a, x);    # from deconv.m


Above, 'a' determines the poles of the system, and if the poles are outside
the unit circle, filtering will diverge.  That is why I've included the
roots() call in the hunk of code above: one can see how the roots (poles)
can have magnitudes far outside the unit circle.
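
Here is a hedged sketch of the mechanism (the values are illustrative):
deconv() reduces to that filter() call with an impulse input, and the
recursive part of the difference equation divides by a(1) at every sample,
so a small a(1) puts a pole far outside the unit circle:


a = [0.01, 1, 1];
abs (roots (a))          # roughly 99 and 1.01: one pole well outside |z| = 1
x = [1, zeros(1, 12)];   # impulse input, as deconv() constructs it
h = filter (1, a, x);    # impulse response of the 1/a(z) feedback system
max (abs (h))            # grows roughly like (1/a(1))^n
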

Now, there may be better algorithms for doing this sort of thing
(deconvolution is often numerically dodgy, and I did start looking at the
roots-based approach earlier, which I'll think about some more), but as far
as the algorithm as given goes, I think this is a case of the random choice
of data finding the algorithm's weakness.  There should be restrictions on
p1 (and possibly p2) to keep the deconvolution numerically stable.  For
example, rather than being completely random, the polynomials could be
C_p + n_p, where C_p holds nice polynomial coefficients and n_p is added
noise.
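
A minimal sketch of that idea (the base coefficients and noise level below
are hypothetical choices):


C_p = [1, -0.5, 0.25];            # "nice" coefficients, roots well inside |z| = 1
n_p = 0.05 * randn (size (C_p)); # small additive perturbation
p1 = C_p + n_p;
abs (roots (p1))                  # stays near the well-behaved base roots
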


    _______________________________________________________

Reply to this item at:

  <http://savannah.gnu.org/bugs/?42742>

_______________________________________________
  Message sent via/by Savannah
  http://savannah.gnu.org/



