swarm-support

RE: simulating large models


From: Darold Higa
Subject: RE: simulating large models
Date: Tue, 18 Jun 2002 17:12:10 -0700

I tend to agree.  It depends on what you mean by a "large model".  Agents in the
thousands are easily handled by the Objective-C version of Swarm, as long as you
are not expecting real-time observation.  Tens of thousands or hundreds of
thousands of agents can create problems if each agent consumes a lot of
memory, but unless each agent needs more than about 1 MB of RAM, a machine with
2-3 GB of RAM should be able to handle simulations into the low thousands of
agents (2-3 GB divided by roughly 1 MB per agent is about 2,000-3,000 agents).
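
Just as a back-of-the-envelope sketch of that arithmetic (the numbers below are
the assumptions above, not measurements from any particular model), in plain C:

    #include <stdio.h>

    int main(void) {
        double ram_bytes       = 2.5e9;   /* assume roughly 2-3 GB of usable RAM */
        double bytes_per_agent = 1.0e6;   /* assume roughly 1 MB per agent       */

        /* crude capacity estimate: how many agents fit in the RAM budget */
        printf("roughly %.0f agents\n", ram_bytes / bytes_per_agent);
        return 0;
    }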

Another technique is to store common data in lookup tables, or to use bit flags
to pack a lot of state into very little memory.  I found that binary flags let
me store many boolean attributes in just a few integer variables (see the sketch
below).  You can also trade computing speed for memory by recalculating data
instead of storing it in each object.  Techniques like that seem to work well.
I had massive instability problems at first once my populations got large, but
most of them turned out to come from mismanaging memory.
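
Here is a minimal sketch of the bit-flag idea in plain C (which also compiles as
Objective-C); the flag names are made up for illustration, not taken from my
model or from the Swarm libraries:

    #include <stdio.h>

    /* each flag occupies one bit of a single integer */
    enum {
        AGENT_ALIVE    = 1 << 0,
        AGENT_MOBILE   = 1 << 1,
        AGENT_INFECTED = 1 << 2
    };

    int main(void) {
        unsigned int flags = 0;

        flags |= AGENT_ALIVE | AGENT_MOBILE;   /* set two flags  */
        flags &= ~AGENT_MOBILE;                /* clear one flag */

        if (flags & AGENT_ALIVE)               /* test a flag    */
            printf("agent is alive\n");
        if (!(flags & AGENT_INFECTED))
            printf("agent is not infected\n");

        return 0;
    }

One unsigned int can hold 32 such boolean attributes, which is where the memory
savings come from when you have many thousands of agents.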

Darold Higa

University of Southern California
School of International Relations




