From: Gregory Ratcliff
Subject: Re: [Discuss-gnuradio] Continuously Write FFT Samples to a File
Date: Sat, 21 Jan 2017 11:33:40 -0500
I spend my working hours on big data and Hadoop, and it occurs to me that you really need to be thinking about something outside of a normal file system. HDFS lets you write out data in chunks that you later combine when you have time, and there are some really (really) fast projects that write to HDFS. Most of the new work is in Java, but I think you are asking for something pretty light.

I can visualize a "gatherer" for RF and a "filer" in HDFS that writes out xx MB chunks every period. Now, as others have said, you don't just slap some stuff together: you will need to optimize the integration points and think about the best caching and write speeds of the "filer" system and the persistent storage. Likewise, there are plenty of Apache tools that will recombine the HDFS chunks back into files of arbitrary size, which you can then analyze later with GNU Radio, when time doesn't matter as much. You might not need much of Hadoop beyond the file system and some tools.

I have always thought HDFS + GNU Radio are destined for each other. It may be a bit early for this with today's hardware, but Mr. Moore is helping us along just fine, and so is AWS.

Greg
Nz8r
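To make the gatherer/filer split concrete, here is a minimal Python sketch of the "filer" half: it accumulates raw sample bytes and rolls them into fixed-size chunk files that a downstream tool could later recombine (or ship to HDFS). All names here (`ChunkFiler`, `chunk_bytes`, the file naming scheme) are illustrative assumptions, not part of GNU Radio or Hadoop; a real deployment would replace the local `open()` with an HDFS client write.

```python
import os
import tempfile

class ChunkFiler:
    """Accumulate a byte stream and write it out in fixed-size chunks.

    Hypothetical sketch: in an HDFS setup, _flush() would write to
    hdfs://... instead of the local filesystem.
    """

    def __init__(self, out_dir, chunk_bytes):
        self.out_dir = out_dir
        self.chunk_bytes = chunk_bytes
        self.buf = bytearray()
        self.chunk_index = 0

    def write(self, samples: bytes):
        # The "gatherer" calls this with each batch of RF samples.
        self.buf.extend(samples)
        while len(self.buf) >= self.chunk_bytes:
            self._flush(bytes(self.buf[:self.chunk_bytes]))
            del self.buf[:self.chunk_bytes]

    def close(self):
        # Flush any partial chunk left at end of capture.
        if self.buf:
            self._flush(bytes(self.buf))
            self.buf.clear()

    def _flush(self, chunk):
        path = os.path.join(self.out_dir, f"chunk_{self.chunk_index:06d}.dat")
        with open(path, "wb") as f:
            f.write(chunk)
        self.chunk_index += 1

# Tiny demo: ten 1 kB "captures" rolled into 4 kB chunks
# (two full chunks plus a 2 kB tail).
out_dir = tempfile.mkdtemp()
filer = ChunkFiler(out_dir, chunk_bytes=4096)
for _ in range(10):
    filer.write(b"\x00" * 1024)
filer.close()
print(sorted(os.listdir(out_dir)))
```

The chunk size is the main tuning knob: it should be matched to the write speed of the persistent storage (and, on HDFS, to the block size) rather than picked arbitrarily.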