Re: [Bug-gnubg] User training of the Neural Nets
From: Joseph Heled
Subject: Re: [Bug-gnubg] User training of the Neural Nets
Date: Thu, 24 Aug 2006 11:24:08 -0600
<personal opinion>
Using the current setup, as is, you may get better net(s), but not
anything significantly better. To go to the next level requires the
"programmatic" changes suggested by Øystein: changes to the inputs,
the number of nets, the way the nets interact, a new way to train,
and so on. One worthwhile addition that we know to work but was
never implemented is the 1+0 ply evaluation, which has the same
computational cost as 1 ply but is better. (Some like to call it 0.5
ply, some 1.5 ply, but neither name makes sense to me.) Basically it
simply averages the 0-ply and 1-ply equities (I think the best
weights were 0.6 something). I don't remember whether I implemented
or tested this for cubeful evaluations; perhaps I only did it for
cubeless. This could be a fun project for someone who wants to get
proficient in the setup, Python, and the whole toolchain for
developing nets for gnubg.
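A minimal sketch of that averaging, in Python. The evaluator
arguments are placeholders so the sketch stays self-contained, and
putting the 0.6 on the 1-ply term is an assumption, not something
taken from the gnubg-nn code:

    def blended_equity(position, eval_0ply, eval_1ply, w=0.6):
        # Blend the 0-ply and 1-ply cubeless equities of a position.
        # eval_0ply / eval_1ply are caller-supplied evaluation functions
        # (stand-ins for whatever gnubg-nn exposes); the ~0.6 weight is
        # only the value recalled above, not a verified constant.
        return w * eval_1ply(position) + (1.0 - w) * eval_0ply(position)

Sweeping w over a handful of values against the reference database
would be the natural way to recover the best weight.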
-Joseph
On 8/23/06, Øystein Johansen <address@hidden> wrote:
Christian Anthon wrote:
> Okay, then what level of programming skills would be needed, and
> what should a programmer do?
1. Get the gnubg-nn code
cvs -d:something:blah co gnubg-nn
2. Make it compile
If I remember correctly it was:
cd py
gmake safe
I believe there is some documentation on how to compile it.
(Maybe a Linux system makes it easier?)
3. Before you start training anything:
Steal the neural net evaluation code from gnubg, the code
that uses SSE, and apply it to the code in gnubg-nn.
This step will save you a lot of time in the training.
(Commit the changes back to the CVS.)
4. Get the reference databases and the neural nets. They're on
an FTP server somewhere. I'll find the address when you need it.
5. Use the training scripts provided.
train.py - trains a net
buildnet.py - builds a net from scratch... really slow.
getth.py - finds positions whose n-ply and 0-ply evaluations mismatch
referr.py - finds the error of a newly bred net against the
reference database (a rough sketch of such a measure
follows step 6 below).
Here's where I got stranded... It worked, it worked! I could breed
new nets, but none of the nets I trained was significantly
better than the original one, no matter how long I trained.
6. A programmer can now try out different things, like further
splitting of the neural nets, or altering the inputs, or guessing
at other algorithms that might work.
Look at the different hand-crafted inputs: can any of them be
removed? Can anything be added? I believe there is code to
dynamically add and remove nn inputs. If you add an input,
make sure you add a new 'concept' and not just something
that's linearly dependent on some other inputs (see the
dependence-check sketch below).
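As mentioned in step 5, here is a rough sketch of a referr.py-style
error measure. The data layout, the function names, and the
mean-squared metric are guesses, not the actual script's interface:

    def reference_error(net_eval, reference_db):
        # Mean squared equity error of a net over a reference database.
        # reference_db is assumed to be an iterable of (position, equity)
        # pairs and net_eval the net's equity function; both names are
        # placeholders, not the real gnubg-nn interfaces.
        total = 0.0
        count = 0
        for position, ref_equity in reference_db:
            diff = net_eval(position) - ref_equity
            total += diff * diff
            count += 1
        return total / count if count else 0.0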
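And a sketch of the dependence check mentioned in step 6: regress
the candidate input on the existing inputs over a sample of
positions and look at the relative residual. The numpy usage and
the tolerance are my own choices, not code from gnubg-nn:

    import numpy as np

    def adds_new_concept(existing, candidate, tol=1e-3):
        # existing: (n_positions, n_inputs) array of current net inputs.
        # candidate: (n_positions,) array of the proposed new input.
        # Fit the candidate as a linear combination of the existing
        # inputs; a large residual means it is not (nearly) linearly
        # dependent on them, i.e. it plausibly encodes a new concept.
        coeffs, _, _, _ = np.linalg.lstsq(existing, candidate, rcond=None)
        residual = candidate - existing @ coeffs
        rel_err = np.linalg.norm(residual) / (np.linalg.norm(candidate) + 1e-12)
        return rel_err > tol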
I would love to see someone taking the training further.
-Øystein