
Re: [fem-fenics] MPI parallelisation


From: Eugenio Gianniti
Subject: Re: [fem-fenics] MPI parallelisation
Date: Thu, 31 Jul 2014 20:37:26 +0000


On 31 Jul 2014, at 21:31, Marco Vassallo <address@hidden> wrote:


On 31 Jul 2014 20:54, "Eugenio Gianniti" <address@hidden> wrote:
>
> > Hi Eugenio,
> >
> > we mark the subdomains in the mesh.oct files in order to be consistent with the mesh representation in the msh pkg. In fact, the (p, e, t) representation contains this information, so we keep it in fem-fenics as well. I do agree with you that it is not widely used, but, for example, the msh_refine function needs it to give back to Octave a refined mesh with all the subdomains available (if they were present in the non-refined mesh).
>
> I noticed that they are also used to apply DirichletBCs. Indeed, I currently have parallel assembly working, and running a full Neumann problem yields the same solution in both serial and parallel execution. On the other hand, DirichletBCs do not work in parallel due to the missing markers, and DOLFIN 1.4.0 still does not support them, so problems with Dirichlet boundary conditions cannot be solved in parallel (more precisely, the code runs fine to the end, but the solution is garbage).
>
> After going through the DOLFIN code, I figured out that dolfin::DirichletBC can also be instantiated with a MeshFunction identifying the subdomains as an argument. I would then move the information from the Mesh itself to two MeshFunctions, one for the boundary facets and one for the region identifiers. I wonder where it is better to store such objects. Should I just add them as members of the mesh class, or implement a new class to wrap MeshFunction? Probably with the first approach the only change visible to the user would be a new mesh argument needed by DirichletBC.
>
> Eugenio
>

Hi Eugenio,
it seems that you are making really good progress towards a good solution.

I have two points:
1) Why do you think that the code should work with a separate MeshFunction?

A couple of examples among those distributed with DOLFIN use this approach. In version 1.3.0 you can find “subdomains”, which just shows how to use a MeshFunction to store markers. In version 1.4.0 there is also “subdomains-poisson”, which further uses such MeshFunctions to define DirichletBCs, although this one only features the Python sample. Both examples also run without problems in parallel.
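
For reference, this is the pattern those demos follow, sketched against the DOLFIN 1.4 C++ API (the subdomains-poisson demo itself is Python; the mesh, the marker value and the Left subdomain below are made up for illustration):

  #include <dolfin.h>
  using namespace dolfin;

  // Facets on the left edge of the unit square, just for illustration.
  class Left : public SubDomain
  {
    bool inside (const Array<double>& x, bool on_boundary) const
    { return on_boundary && near (x[0], 0.0); }
  };

  int main ()
  {
    UnitSquareMesh mesh (32, 32);

    // The markers live in a MeshFunction over facets, not in the Mesh
    // itself: 0 everywhere, 1 on the marked facets.
    MeshFunction<std::size_t> boundaries (mesh, mesh.topology ().dim () - 1, 0);
    Left left;
    left.mark (boundaries, 1);

    // Given a FunctionSpace V from a compiled form, the boundary condition
    // is then built with the constructor taking subdomain markers:
    //   DirichletBC bc (V, Constant (0.0), boundaries, 1);
    return 0;
  }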

2) Provided that the proposed solution works, I don't think that we should change the user interface. We should look for a convenient way of translating the info from a mesh to a MeshFunction when building it. Probably there is some FEniCS method which could do something like this, or we can ask on the FEniCS mailing list.

I don’t think I have understood what you are saying. If your point is that the information already present in the (p, e, t) mesh produced by the msh package needs to be extracted and added to the fem-fenics representation, that’s what I want to do. Basically, instead of setting markers in the Mesh object, I would set the corresponding values of a MeshFunction. Anyway, this object needs to be available when constructing DirichletBC, so I must find a way of passing it to that oct-file. If you meant something else, I need you to explain it once more :).
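
To make that concrete, the construction step could look roughly like the sketch below. label_for_facet is a hypothetical helper standing in for the lookup into the e matrix of the (p, e, t) data, and I still have to check how Facet::exterior () behaves on a distributed mesh:

  #include <dolfin.h>
  using namespace dolfin;

  // Hypothetical lookup of the boundary label of a facet in the e matrix.
  std::size_t label_for_facet (std::size_t facet_index);

  void build_facet_markers (const Mesh& mesh)
  {
    // Instead of marking the Mesh itself, record the boundary labels in a
    // MeshFunction over facets, initialised to 0.
    MeshFunction<std::size_t> facet_markers (mesh, mesh.topology ().dim () - 1, 0);
    for (FacetIterator f (mesh); !f.end (); ++f)
      if (f->exterior ())
        facet_markers[*f] = label_for_facet (f->index ());
    // facet_markers would then be stored alongside the mesh and handed
    // over to the DirichletBC oct-file.
  }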

As a side note, I’ve seen that it is possible to attach a MeshFunction to a Mesh using MeshData, so maybe this could be an interesting alternative. However, I still need to go through the code and figure out if it works in parallel, since in the FEniCS book there’s just a small paragraph about it.

Eugenio

HTH

marco


