Re: [Gneuralnetwork] [PATCH] Compile gneuralnetwork as library
From: Aljosha Papsch
Subject: Re: [Gneuralnetwork] [PATCH] Compile gneuralnetwork as library
Date: Mon, 2 May 2016 21:19:25 +0200
User-agent: Mozilla/5.0 (X11; Linux x86_64; rv:38.0) Gecko/20100101 Icedove/38.7.2
On 02.05.2016 21:03, Aljosha Papsch wrote:
> On 30.04.2016 06:40, Nala Ginrut wrote:
>> Hi Aljosha!
>> On Sun, 2016-04-24 at 17:01 +0200, Aljosha Papsch wrote:
>>> steps:
>>> * Write Guile bindings using FFI
>>> * Write Scheme programs equivalent to the .script files
>> Agreed.
This patch still needs some love. When I tried loading the library via the FFI with

(define libgneural (dynamic-link "libgneural.so"))

I got the error "file not found". After some thorough investigation with strace and LD_DEBUG=all, I got to the root of the error:
writev(2, [{" 16551:\t", 12}, {"/opt/gneural/lib/libgneural.so", 30}, {": error: ", 9}, {"symbol lookup error", 19}, {": ", 2}, {"undefined symbol: fp", 20}, {" (", 2}, {"fatal", 5}, {")\n", 2}], 9

16551: /opt/gneural/lib/libgneural.so: error: symbol lookup error: undefined symbol: fp (fatal)
Global variables! fp is defined in gneural_network.c, and some library code depends on it. I will have to track down those dependencies and make the library code independent of the global variables (is "reentrant" the right term here?). Only then will step 0 (librarifying) be done.
It's worse than I thought. Take for example NEURON, defined in
gneural_network.c:
find src -name '*.c' -print0 | xargs -0 grep -ne NEURON
./gradient_descent.c:59: for(j=0;j<NEURON[i].nw;j++){
./gradient_descent.c:60: wbackup[k]=NEURON[i].w[j];
./gradient_descent.c:70: for(j=0;j<NEURON[i].nw;j++){
./gradient_descent.c:71: NEURON[i].w[j]=wbackup[k]-delta;
./gradient_descent.c:73: NEURON[i].w[j]=wbackup[k]+delta;
./gradient_descent.c:82: for(j=0;j<NEURON[i].nw;j++){
./gradient_descent.c:83: NEURON[i].w[j]=wbackup[k]-gamma*diff[k];
./load.c:44: NEURON[i].nw=(int)(tmp);
./load.c:45: for(j=0;j<NEURON[i].nw;j++){
./load.c:47: NEURON[i].w[j]=tmp; // weights
./load.c:49: for(j=0;j<NEURON[i].nw;j++){
./load.c:51: NEURON[i].connection[j]=(int)(tmp);
./load.c:54: NEURON[i].activation=(int)(tmp);
./load.c:56: NEURON[i].discriminant=(int)(tmp);
./load.c:77: printf("\nNEURONS\n=======\n");
./load.c:81: printf("NEURON[%d].nw = %d\n",i,NEURON[i].nw); // number of input connections (weights)
./load.c:82: for(j=0;j<NEURON[i].nw;j++) printf("NEURON[%d].w[%d] = %g\n",i,j,NEURON[i].w[j]); // weights
./load.c:83: for(j=0;j<NEURON[i].nw;j++) printf("NEURON[%d].connection[%d] = %d\n",i,j,NEURON[i].connection[j]); // connections to other neurons
./load.c:84: printf("NEURON[%d].activation = %d\n",i,NEURON[i].activation); // activation function
./load.c:85: printf("NEURON[%d].discriminant = %d\n",i,NEURON[i].discriminant); // discriminant function
./msmco.c:49: for(j=0;j<NEURON[i].nw;j++){
./msmco.c:50: NEURON[i].w[j]=0.5*(WMAX-WMIN)+(0.5-rnd())*0.5*(WMAX-WMIN);
./msmco.c:56: for(j=0;j<NEURON[i].nw;j++){
./msmco.c:57: NEURON[i].w[j]=wbest[k]+(0.5-rnd())*0.5*(WMAX-WMIN)*pow(gamma,m);
./msmco.c:69: for(j=0;j<NEURON[i].nw;j++){
./msmco.c:70: wbest[k]=NEURON[i].w[j];
./msmco.c:82: for(j=0;j<NEURON[i].nw;j++){
./msmco.c:83: NEURON[i].w[j]=wbest[k];
./parser.c:44: // NUMBER OF NEURONS
./parser.c:45: else if(strcmp(s,"NUMBER_OF_NEURONS")==0){
./parser.c:49: printf("NUMBER_OF_NEURONS must be an integer number!\n");
./parser.c:53: printf("NUMBER_OF_NEURONS must be a positive number!\n");
./parser.c:56: printf("TOTAL NUMBER OF NEURONS = %d [OK]\n",nnum);
./parser.c:58: printf("allocating memory for NEURONS[]\n");
./parser.c:60: // DEFINITION OF NEURONS
./parser.c:61: else if(strcmp(s,"NEURON")==0){
./parser.c:78: // syntax: NEURON ind NUMBER_OF_CONNECTIONS num
./parser.c:96: printf("NEURON %d NUMBER_OF_CONNECTIONS = %d [OK]\n",index,num_of_connections);
./parser.c:97: NEURON[index].nw=num_of_connections;
./parser.c:100: // syntax: NEURON ind ACTIVATION function
./parser.c:104: NEURON[index].activation=TANH;
./parser.c:105: printf("NEURON %d ACTIVATION = TANH\n",index);
./parser.c:108: NEURON[index].activation=EXP;
./parser.c:109: printf("NEURON %d ACTIVATION = EXP\n",index);
./parser.c:112: NEURON[index].activation=ID;
./parser.c:113: printf("NEURON %d ACTIVATION = ID\n",index);
./parser.c:116: NEURON[index].activation=POL1;
./parser.c:117: printf("NEURON %d ACTIVATION = POL1\n",index);
./parser.c:120: NEURON[index].activation=POL2;
./parser.c:121: printf("NEURON %d ACTIVATION = POL2\n",index);
./parser.c:128: // syntax: NEURON ind DISCRIMINANT function
./parser.c:132: NEURON[index].discriminant=LINEAR;
./parser.c:133: printf("NEURON %d DISCRIMINANT = LINEAR\n",index);
./parser.c:136: NEURON[index].discriminant=LEGENDRE;
./parser.c:137: printf("NEURON %d DISCRIMINANT = LEGENDRE\n",index);
./parser.c:140: NEURON[index].discriminant=LAGUERRE;
./parser.c:141: printf("NEURON %d DISCRIMINANT = LAGUERRE\n",index);
./parser.c:144: NEURON[index].discriminant=FOURIER;
./parser.c:145: printf("NEURON %d DISCRIMINANT = FOURIER\n",index);
./parser.c:153: // syntax: NEURON ind CONNECTION connection_id global_neuron_id_2
./parser.c:165: if(connection_id>(NEURON[index].nw-1)){
./parser.c:183: printf("NEURON %d CONNECTION %d %d [OK]\n",index,connection_id,global_neuron_id_2);
./parser.c:184: NEURON[index].connection[connection_id]=global_neuron_id_2;
./parser.c:214: // syntax: NETWORK LAYER ind NUMBER_OF_NEURONS num
./parser.c:231: if(strcmp(s,"NUMBER_OF_NEURONS")!=0){
./parser.c:233: printf("NUMBER_OF_NEURONS expected!\n");
./parser.c:246: if(num>MAX_NUM_NEURONS){
./parser.c:248: printf("please increase MAX_NUM_NEURONS and recompile!\n");
./parser.c:252: printf("NETWORK LAYER %d NUMBER_OF_NEURONS %d [OK]\n",ind,num);
./parser.c:255: // syntax: NETWORK ASSIGN_NEURON_TO_LAYER layer_id local_neuron_id global_neuron_id
./parser.c:256: else if(strcmp(s,"ASSIGN_NEURON_TO_LAYER")==0){
./parser.c:299: printf("NETWORK ASSIGN_NEURON_TO_LAYER %d %d %d [OK]\n",layer_id,local_neuron_id,global_neuron_id);
./parser.c:371: if(conn>(NEURON[neu].nw-1)){
./parser.c:797: if(conn>(NEURON[neu].nw-1)){
./random_search.c:49: for(j=0;j<NEURON[i].nw;j++){
./random_search.c:50: wbackup[k]=NEURON[i].w[j];
./random_search.c:65: for(j=0;j<NEURON[i].nw;j++){
./random_search.c:66: NEURON[i].w[j]=wbackup[k];
./randomize.c:31: for(i=0;i<NEURON[n].nw;i++){
./randomize.c:32: NEURON[n].w[i]=WMIN+rnd()*(WMAX-WMIN);
./save.c:40: fprintf(fp,"%d\n",NEURON[i].nw); // number of input connections (weights)
./save.c:41: for(j=0;j<NEURON[i].nw;j++) fprintf(fp,"%g\n",NEURON[i].w[j]); // weights
./save.c:42: for(j=0;j<NEURON[i].nw;j++) fprintf(fp,"%d\n",NEURON[i].connection[j]); // connections to other neurons
./save.c:43: fprintf(fp,"%d\n",NEURON[i].activation); // activation function
./save.c:44: fprintf(fp,"%d\n",NEURON[i].discriminant); // discriminant function
./save.c:60: printf("\nNEURONS\n=======\n");
./save.c:64: printf("NEURON[%d].nw = %d\n",i,NEURON[i].nw); // number of input connections (weights)
./save.c:65: for(j=0;j<NEURON[i].nw;j++) printf("NEURON[%d].w[%d] = %g\n",i,j,NEURON[i].w[j]); // weights
./save.c:66: for(j=0;j<NEURON[i].nw;j++) printf("NEURON[%d].connection[%d] = %d\n",i,j,NEURON[i].connection[j]); // connections to other neurons
./save.c:67: printf("NEURON[%d].activation = %d\n",i,NEURON[i].activation); // activation function
./save.c:68: printf("NEURON[%d].discriminant = %d\n",i,NEURON[i].discriminant); // discriminant function
./simulated_annealing.c:66: for(j=0;j<NEURON[i].nw;j++){
./simulated_annealing.c:67: wbackup[k]=NEURON[i].w[j];
./simulated_annealing.c:90: for(j=0;j<NEURON[i].nw;j++){
./simulated_annealing.c:91: NEURON[i].w[j]=wbackup[k];
./simulated_annealing.c:103: for(j=0;j<NEURON[i].nw;j++){
./simulated_annealing.c:104: wbest[k]=NEURON[i].w[j];
./simulated_annealing.c:116: for(j=0;j<NEURON[i].nw;j++){
./simulated_annealing.c:117: NEURON[i].w[j]=wbest[k];
./error.c:44: for(j=0;j<NEURON[neuron_id].nw;j++){
./error.c:45: NEURON[neuron_id].x[j]=NEURON[neuron_id].output=X[n][neuron_id][j];
./error.c:48:// NEURON[0].x[0]=NEURON[0].output=X[n];
./error.c:54: y=NEURON[NETWORK.neuron_id[NETWORK.num_of_layers-1][j]].output;
./error.c:69: for(j=0;j<NEURON[neuron_id].nw;j++){
./error.c:70: NEURON[neuron_id].x[j]=NEURON[neuron_id].output=X[n][neuron_id][j];
./error.c:73:// NEURON[0].x[0]=NEURON[0].output=X[n];
./error.c:79: y=NEURON[NETWORK.neuron_id[NETWORK.num_of_layers-1][j]].output;
./error.c:83:// err+=pow(Y[n]-NEURON[5].output,2.);
./feedforward.c:38: for(i=0;i<NEURON[id].nw;i++) NEURON[id].x[i]=NEURON[NEURON[id].connection[i]].output;
./feedforward.c:39: switch(NEURON[id].discriminant){
./feedforward.c:41: for(i=0;i<NEURON[id].nw;i++)
./feedforward.c:42: x+=NEURON[id].x[i]*NEURON[id].w[i]; // linear product between w[] and x[]
./feedforward.c:45: for(i=0;i<NEURON[id].nw;i++){
./feedforward.c:49: tmp+=pow(NEURON[id].x[i],j)*binom(i,j)*binom((i+j-1)/2,j);
./feedforward.c:51: tmp*=a*NEURON[id].w[i];
./feedforward.c:56: for(i=0;i<NEURON[id].nw;i++){
./feedforward.c:58: for(j=0;j<=i;j++) tmp+=binom(i,j)*pow(NEURON[id].x[i],j)*pow(-1,j)/fact(j);
./feedforward.c:59: tmp*=NEURON[id].w[i];
./feedforward.c:64: for(i=0;i<NEURON[id].nw;i++){
./feedforward.c:66: for(j=0;j<=i;j++) tmp+=sin(2.*j*M_PI*NEURON[id].x[i]);
./feedforward.c:67: tmp*=NEURON[id].w[i];
./feedforward.c:74: NEURON[id].output=activation(NEURON[id].activation,x);
./genetic_algorithm.c:39: for(j=0;j<NEURON[i].nw;++j){
./genetic_algorithm.c:95: for(j=0;j<NEURON[i].nw;j++){
./genetic_algorithm.c:127: for(j=0;j<NEURON[i].nw;j++){
./genetic_algorithm.c:128: NEURON[i].w[j]=individuals[n]->weights[k];
./genetic_algorithm.c:179: for(j=0;j<NEURON[i].nw;j++){
./genetic_algorithm.c:180: NEURON[i].w[j]=individuals[0]->weights[k];
./genetic_algorithm.c:238:// for (j = 0; j < NEURON[i].nw; j++) {
./genetic_algorithm.c:239:// if (m == 0) wbest[m][k] = NEURON[i].w[j];
./genetic_algorithm.c:240:// else wbest[m][k] = NEURON[i].w[j] + rate * rnd() * (WMAX - WMIN);
./genetic_algorithm.c:251:// for (j = 0; j < NEURON[i].nw; j++) {
./genetic_algorithm.c:252:// NEURON[i].w[j] = wbest[m][k];
./genetic_algorithm.c:270:// for (j = 0; j < NEURON[i].nw; j++) {
./genetic_algorithm.c:271:// NEURON[i].w[j] = w[k] = wbest[m_best][k];
./genetic_algorithm.c:281:// for (j = 0; j < NEURON[i].nw; j++) {
./genetic_algorithm.c:282:// wbest[m][k] = NEURON[i].w[j] + rate * rnd() * (WMAX - WMIN);
./genetic_algorithm.c:292:// for (j = 0; j < NEURON[i].nw; j++) {
./genetic_algorithm.c:293:// NEURON[i].w[j] = w[k];
./gneural_network.c:58:neuron NEURON[MAX_NUM_NEURONS];
./gneural_network.c:81:double X[MAX_TRAINING_POINTS][MAX_NUM_NEURONS][MAX_IN];
./gneural_network.c:82:double Y[MAX_TRAINING_POINTS][MAX_NUM_NEURONS];
./gneural_network.c:87:double OUTPUT_X[MAX_NUM_POINTS][MAX_NUM_NEURONS][MAX_IN];
./gneural_network.c:239: for(j=0;j<NEURON[neuron_id].nw;j++){
./gneural_network.c:241: NEURON[neuron_id].x[j]=NEURON[neuron_id].output=x;
./gneural_network.c:247: y=NEURON[NETWORK.neuron_id[NETWORK.num_of_layers-1][j]].output;
All these occurrences, in addition to the other global variables in gneural_network.c, need to be replaced with local state passed in by the caller (never mind parser.c). It's going to take some time. But Rome wasn't built in a day, right? ;-)
Best regards
Aljosha