
[Getfem-users] parallel Getfem


From: Andriy Andreykiv
Subject: [Getfem-users] parallel Getfem
Date: Fri, 11 Nov 2011 19:11:27 +0100

Dear Getfem users,

I'm struggling to run Getfem in parallel (the serial build runs fine).

My system is Ubuntu 11.10, 64-bit.
Getfem release version 4.1.1
Installed MPICH2 (including development files), MUMPS 4.9.2 (with dev files) and ATLAS 3.8.4 (with dev).
Compiling with gcc 4.6.1

I configured Getfem as:  ./configure --with-pic --enable-paralevel=2
(configure recognized MPICH2, MUMPS and ATLAS and said that Getfem would be built in parallel)

Building Getfem succeeds, but "make check" ran into linking problems on the very first test.
Searching the web I found a known MPICH2 issue: libmpl has to be linked after the MPICH2 libraries. I did that
by editing the LIBS variable in the Makefiles of the main and tests directories (not neat, but I don't know how to force this linkage through configure; see my guess below).
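(If configure is a standard autoconf script, I suppose the extra library could instead be passed on the configure command line, something like
    ./configure --with-pic --enable-paralevel=2 LIBS=-lmpl
but I have not verified that this gets propagated to the tests directory.)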

This resolved the linking problems and most of the tests were built; however, I got stuck building schwarz_additive.cc.
This test does #include <mpi++.h>, which I don't have. Somewhere I read that I could use mpi.h instead (or mpicxx.h), which I tried
(by replacing #include <mpi++.h> with #include <mpich2/mpi.h> and then with #include <mpich2/mpicxx.h>). Both produce many template and other errors.
So, do I have to install separate MPI C++ bindings? That would be a bit strange, as I thought MPICH2 provided them.
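(As far as I understand, the plain C API should be usable from C++ without any bindings at all; a minimal standalone sketch, nothing to do with Getfem itself:

    // sketch: MPI C API from C++, no C++ bindings required
    #include <mpi.h>
    #include <iostream>

    int main(int argc, char **argv) {
        MPI_Init(&argc, &argv);               // must precede any other MPI call
        int rank = 0, size = 0;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);
        std::cout << "rank " << rank << " of " << size << std::endl;
        MPI_Finalize();
        return 0;
    }

so I assume only code that uses the MPI:: C++ interface, like schwarz_additive.cc, actually needs the bindings.)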

But OK, even though I still cannot build schwarz_additive.cc, I thought I could run the tests that were built.
So I tried to run
    ./laplacian laplacian.param
and
    ./nonlinear_elastostatic nonlinear_elastostatic.param
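(I launched them directly, without the MPI launcher; should they perhaps be started through it instead, e.g.
    mpiexec -np 2 ./laplacian laplacian.param
? I am not sure whether that matters for the errors below.)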

Both tests did some initialization, but both ended with an MPI error. For instance, nonlinear_elastostatic produced:
MESH_TYPE=GT_PK(3,1)
FEM_TYPE=FEM_PK(3,2)
INTEGRATION=IM_TETRAHEDRON(6)
Selecting Neumann and Dirichlet boundaries
Attempting to use an MPI routine before initializing MPICH
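
(That message suggests some MPI routine is called before MPI_Init. I would have expected the test's main to begin roughly like this; just my assumption of what is needed, not what the test actually contains:

    int main(int argc, char *argv[]) {
        MPI_Init(&argc, &argv);   // before any MPI-dependent Getfem/MUMPS call
        // ... the actual test body ...
        MPI_Finalize();
        return 0;
    }

so perhaps the paralevel=2 build is supposed to arrange this and something went wrong in my setup.)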


Running ./elastostatic elastostatic.param gave:

Trace 1 in elastostatic.cc, line 497: Running parallelized Getfem level 2
MESH_FILE=structured:GT="GT_PK(2,1)";SIZES=[1,1];NOISED=0
FEM_TYPE=FEM_PK(2,1)
INTEGRATION=IM_TRIANGLE(6)
temps creation maillage 0.00829101
Selecting Neumann and Dirichlet boundaries
enumerate dof time 0.00613904
temps init 0.0162442
begining resol
Number of dof for P: 2400
Number of dof for u: 882
terminate called after throwing an instance of 'gmm::gmm_error'
  what():  Error in getfem_mesh_region.cc, line 170 : 
the 'all_convexes' regions are not supported for set operations
Aborted

I would really appreciate your advice.

Regards,
Andriy



