


From: ThomasK
Subject: Re: [Simulavr-devel] [patch #7030] do not eat 100% CPU if GDB disconnects
Date: Wed, 06 Jan 2010 12:39:34 +0100
User-agent: Thunderbird 2.0.0.23 (X11/20090817)

Hi Petr,

> A free-running simulation after exiting gdb - is that the desired behaviour?

Yes, that's exactly the question! This feature is old, as far as I can see in the code, so I wouldn't remove it without asking first: why is it there, and who needs it?

I assume (!!!) that it was implemented this way to get behaviour like an ICE adapter attached to real hardware (I don't have one, so I can't check): you simply have a virtual processor that keeps running, and you can connect to it at any time without restarting it.
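
To illustrate what I mean, here is a minimal, self-contained sketch of that ICE-like behaviour; it is not simulavr's actual code, and all names in it are made up. The core keeps free-running, and the server only polls its listen socket with a short timeout, so you can attach gdb whenever you like without the wait loop spinning at 100% CPU:

// Hypothetical sketch, NOT simulavr's real implementation: the simulated
// core free-runs like hardware behind an ICE, and a GDB connection can be
// accepted at any time without restarting the simulation.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/select.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdio>

struct SimulatedCore {            // stand-in for the simulated AVR
    void Step() { /* execute one instruction */ }
};

int main() {
    SimulatedCore core;

    int listenFd = socket(AF_INET, SOCK_STREAM, 0);
    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    addr.sin_port = htons(1212);              // port chosen only for the sketch
    bind(listenFd, reinterpret_cast<sockaddr*>(&addr), sizeof(addr));
    listen(listenFd, 1);

    bool connected = false;
    while (!connected) {
        // Free-run a slice of the simulation, ICE-style: the virtual CPU
        // keeps going even though no debugger is attached.
        for (int i = 0; i < 1000; ++i)
            core.Step();

        // Then check the listen socket with a timeout instead of busy
        // polling; if nothing is pending we go straight back to simulating.
        fd_set rfds;
        FD_ZERO(&rfds);
        FD_SET(listenFd, &rfds);
        timeval tv{0, 1000};                  // 1 ms
        if (select(listenFd + 1, &rfds, nullptr, nullptr, &tv) > 0) {
            int clientFd = accept(listenFd, nullptr, nullptr);
            std::printf("debugger connected, fd=%d\n", clientFd);
            connected = true;   // a real server would hand off to the RSP handler
            close(clientFd);
        }
    }
    close(listenFd);
    return 0;
}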

And, of course, using simulavr together with gdb was the original motivation for developing simulavr; driving it from the Tcl and, later, Python interfaces was added afterwards. So the primary goal should be (in my opinion) to keep the gdb support operational.

But there is another question: if you simulate a multi-AVR environment (possible via the Tcl/Python scripting interface) and then connect gdb to one of the processors, what should the behaviour be? Should the processors that are not connected keep running while the connected one is halted by gdb? That is only one question of many. Answering it needs some good ideas and a rewrite of the gdb interface. (Maybe someone also wants to debug more than one processor at a time, which isn't possible now.)
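
To make that question concrete, here is a tiny sketch of one possible policy - invented names only, nothing taken from simulavr's real classes - where just the core halted by gdb stops and the others keep following the shared clock:

// Hypothetical sketch of ONE possible multi-core policy, for discussion:
// every simulated core advances with the shared clock unless its own GDB
// stub has halted it, so the cores not under the debugger keep running.
#include <cstdio>
#include <vector>

struct Core {
    const char* name;
    bool haltedByGdb;             // would be set by this core's GDB stub
    unsigned long cycles;
    void Step() { ++cycles; }     // stand-in for executing one clock cycle
};

int main() {
    // Three simulated AVRs; pretend gdb is attached to "avr1" and has
    // stopped it at a breakpoint.
    std::vector<Core> cores = { {"avr0", false, 0},
                                {"avr1", true,  0},
                                {"avr2", false, 0} };

    // Shared system clock: every tick is offered to every core. A core
    // halted by its debugger simply skips its slot; the rest keep running.
    for (unsigned long tick = 0; tick < 1000000; ++tick)
        for (auto& c : cores)
            if (!c.haltedByGdb)
                c.Step();

    for (const auto& c : cores)
        std::printf("%-5s %8lu cycles (%s)\n", c.name, c.cycles,
                    c.haltedByGdb ? "halted by gdb" : "free-running");
    return 0;
}

The opposite policy - stopping the shared clock as soon as any core halts, so all cores stay cycle-synchronous - is just as defensible; deciding between the two is exactly the kind of thing a rewrite of the gdb interface would have to settle.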

cu, Thomas



