RE: large-n simulations
From: Darold Higa
Subject: RE: large-n simulations
Date: Mon, 10 Jun 2002 22:45:10 -0700
Oops, big correction there... I meant to say 400-500 iterations, with an end
population of around 1500-2000 agents. I can run it in a few hours if I turn
off saving the displays to the hard drive, and I can speed it up even more by
turning off the nonessential graphical elements.
Swarm is great for doing large-n agent-based models. In the next stage of my
simulation I hope to be able to run 5000+ thinking agents interacting in a
market and in a physical environment. That is probably at least a year or two
off, so I hope I will be running it on a 4GHz machine by then.
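For readers less familiar with this style of model, here is a minimal sketch of what such a large-n market run looks like, in plain Python rather than Swarm's Objective-C. The `Trader` class, its threshold rule, and the exogenous price are all illustrative assumptions, not anything from my actual model:

```python
import random

class Trader:
    """One agent: a little state plus a rule applied every tick."""
    def __init__(self):
        self.cash = 100.0
        self.goods = 10

    def step(self, price):
        # Buy when the price looks cheap, sell when it looks dear.
        if price < 10 and self.cash >= price:
            self.cash -= price
            self.goods += 1
        elif price > 10 and self.goods > 0:
            self.cash += price
            self.goods -= 1

def run(n_agents=5000, n_ticks=400, seed=42):
    rng = random.Random(seed)
    agents = [Trader() for _ in range(n_agents)]
    for _ in range(n_ticks):
        price = 10 + rng.uniform(-2, 2)  # exogenous noisy price signal
        for agent in agents:
            agent.step(price)
    return agents

agents = run()
```

Even this toy loop touches every agent on every tick, which is why per-agent cost (and per-agent memory, below) dominates how large n can get.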
My biggest difficulty with large-n simulations was memory. I wanted to give
every agent a larger chunk of memory, but I began to have problems managing
some of it (more a limit of my programming ability than Swarm's fault).
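On the memory point, one general trick (sketched here in Python, not Swarm's Objective-C) is to declare each agent's fields up front so the runtime does not allocate a per-instance attribute dictionary; with thousands of agents the savings add up. The field names here are hypothetical:

```python
import sys

class FatAgent:
    """Default object: each instance carries its own __dict__."""
    def __init__(self):
        self.x = 0.0
        self.y = 0.0
        self.energy = 100

class LeanAgent:
    """__slots__ fixes the attribute set and drops the per-instance dict."""
    __slots__ = ("x", "y", "energy")
    def __init__(self):
        self.x = 0.0
        self.y = 0.0
        self.energy = 100

fat = FatAgent()
lean = LeanAgent()
fat_size = sys.getsizeof(fat) + sys.getsizeof(fat.__dict__)
lean_size = sys.getsizeof(lean)
# On CPython the slotted agent is noticeably smaller per instance.
```

The trade-off is that a slotted agent cannot grow new attributes at runtime, which is exactly the kind of bookkeeping discipline that helps when every agent holds a fixed chunk of state.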
I wouldn't be surprised to see social science simulations in the 10,000+
agent range in the near future. Is anyone already doing it?
Darold Higa
University of Southern California
School of International Relations
==================================
Swarm-Support is for discussion of the technical details of the day
to day usage of Swarm. For list administration needs (esp.
[un]subscribing), please send a message to <address@hidden>
with "help" in the body of the message.