discuss-gnuradio

[Discuss-gnuradio] Multiple simultaneous top blocks?


From: Kshitij Kumar Singh
Subject: [Discuss-gnuradio] Multiple simultaneous top blocks?
Date: Fri, 15 Feb 2008 12:24:41 -0500

Hi,

       For an application I needed to import some data into my module and then
packetize it. The data, being quite large, comes from a file. My problem is that
multiple instances of top_block aren't allowed, so once I have run the flow graph
that imports my data, I can't run the rest of my program. Even if I nest this
inside a gr.hier_block2 class, I don't know how to create a nestable flow graph
such that I can run this module/class first and then proceed with the execution
of the program once I have the data.
I went through the tx_voice.py example and noticed that all transmit_path.py and
pick_bitrate.py do is provide some necessary functions (none of them needs to
'run' a flow graph beforehand).
Is there some way I could get this done using nestable flow graphs, or do I end
up putting the data in a ten-page tuple like random_mask?
Kindly help. Thanks in advance.
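
In other words, what I would like to be able to write is something along the
following lines (just a sketch of the idea, not working code; load_file_data()
is a name I've made up, and the second top_block is exactly what doesn't seem
to be allowed):

from gnuradio import gr

def load_file_data(filename):
    # Run a throwaway flow graph to completion and hand back the file
    # contents as one big string.
    src = gr.file_source(gr.sizeof_char, filename, False)
    sink = gr.vector_sink_b()
    tb = gr.top_block()
    tb.connect(src, sink)
    tb.run()                      # returns once the file source hits EOF
    return ''.join(chr(b) for b in sink.data())

data = load_file_data('/home/kshitij/work/transmit.dat')
# ...then build and run the real transmit flow graph with its own top_block...

What I actually have so far is below.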
---------------------------------------------------------------------------------------------------------------------------------
#testbot.py
#This module just takes some data from a file and stores it in a variable
#(just one huge string), which is imported into another module that then
#transmits it in the form of packets. Note that the data to be imported is
#quite large; otherwise this whole operation could simply have been reduced
#to a payload = 'abcd' string. Thus we import the data as:
#   import testbot
#   testbot.data
#in the other module.

from gnuradio import gr, packet_utils   # packet_utils isn't used in this snippet

# Run a small flow graph to completion (file_source -> vector_sink_b)
# so that the whole file ends up in memory.
src = gr.file_source(gr.sizeof_char, '/home/kshitij/work/transmit.dat', False)
sink = gr.vector_sink_b()
tb = gr.top_block()
tb.connect(src, sink)

try:
    tb.run()                      # blocks until the file source reaches EOF
    a = sink.data()               # tuple of ints, one per byte read
    y = []
    for b in a:
        y.append(chr(b))          # convert each byte value back to a character
    data = ''.join(y)             # one huge string holding the file contents
except KeyboardInterrupt:
    pass
------------------------------------------------------------------------------------------------------------------------------------------
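
The transmitting module would then use it roughly like this (again only a
sketch: send_pkt() and pkt_size are placeholders for the packetizing code I
haven't written yet):

import testbot                    # importing testbot runs the flow graph above

payload = testbot.data            # the big string built by testbot.py
pkt_size = 512                    # arbitrary payload size per packet

for i in range(0, len(payload), pkt_size):
    send_pkt(payload[i:i + pkt_size])   # placeholder: send one packet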

Regards,
Kshitij




