Re: [Discuss-gnuradio] GPU accelerated Viterbi decoder?


From: Marcus Müller
Subject: Re: [Discuss-gnuradio] GPU accelerated Viterbi decoder?
Date: Tue, 21 Mar 2017 13:44:19 +0100
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:45.0) Gecko/20100101 Thunderbird/45.3.0

Hi Mehmeto,

On 21.03.2017 10:24, Mehmeto wrote:
> Dear All,
>  In almost all cases the Viterbi decoder block eats a lot of CPU time. 
True, but mostly at higher rates. Still, yes, channel coding is always a
major CPU load.
> The
> best alternative is a GPU solution. 
That *is* a bold claim. Note that doing things on a GPU not only incurs
quite some overhead from copying data into and out of GPU memory; a
Viterbi decoder also needs to find the maximum across a lot of different
branches of the computation, and maximum finding is usually something
that can only partly be done in parallel. However, the fact that you can
compute a lot of probabilities (branch metrics) in parallel is very
appealing, indeed.
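
To make that a bit more concrete, here is a rough CUDA sketch of a single
add-compare-select stage for a binary convolutional code (two branches
entering each state). All names, the memory layout and the kernel
interface are made up for this example; they are not taken from gr-fec or
any existing decoder:

__global__ void acs_stage(const float *old_metrics,    /* [num_states]     */
                          const float *branch_metrics, /* [num_states * 2] */
                          const int *prev_state,       /* [num_states * 2] */
                          float *new_metrics,          /* [num_states]     */
                          unsigned char *decisions,    /* [num_states]     */
                          int num_states)
{
    int s = blockIdx.x * blockDim.x + threadIdx.x;
    if (s >= num_states)
        return;

    /* The candidate path metrics of all states can be computed
       independently of each other; that's the nicely parallel part. */
    float m0 = old_metrics[prev_state[2 * s + 0]] + branch_metrics[2 * s + 0];
    float m1 = old_metrics[prev_state[2 * s + 1]] + branch_metrics[2 * s + 1];

    /* The compare-select (the maximum finding) picks the survivor. */
    decisions[s]   = (m1 > m0) ? 1 : 0;
    new_metrics[s] = fmaxf(m0, m1);
}

The catch: the outer loop over trellis stages stays sequential, because
each stage depends on the previous metrics, so you end up launching one
such kernel per stage (or batching many independent frames), and the
traceback plus the host/device copies eat into whatever you gained.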

> I have searched for an open source GPU implementation but could not find one.
>  There is a MATLAB implementation but
> that one is far from open source.
> Any ideas? Is it easy to port gr-fec to a CUDA/OpenCL-based platform?
I don't know whether porting makes much sense; these things would be
implemented quite differently on a GPU, so you'd more likely write it
from the ground up and use gr-fec to check the results.

Best regards,
Marcus


