[Getdp] large problems with getdp
Helmut Müller
helmut.mueller at ingenieurbuero-mbi.de
Tue Dec 14 15:31:51 CET 2010
Hello Guillaume,
thank you for your reply.
Yes, problems with 200,000 DOF can be solved on this machine without any trouble. But larger problems with more than 250,000 DOF always lead to malloc failure messages. I assume it is not the machine that limits the size of solvable problems, but GetDP, at least in the precompiled downloadable version.
Unfortunately I didn't manage to recompile GetDP myself. I have made several attempts in the past, but without success, so I use the downloadable version.
Perhaps I'll have to dig into recompilation again. Can you confirm that a recompiled version would solve larger problems than the precompiled one? Next question :-) why? And why is the precompiled version limited?
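In case it helps to pinpoint where my attempts went wrong, this is roughly the recipe I have been following (a sketch only: the paths and the PETSc arch name are placeholders taken from the log below, and I may well be missing steps):

export PETSC_DIR=$HOME/src/petsc-3.0.0-p7
export PETSC_ARCH=umfpack-cxx-opt
cd getdp-2.1.0-source
./configure
make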
Please do not misunderstand me: I don't want to complain about this software, or about the service of providing precompiled versions, but I'd like to understand (typical engineer).
Thanks in advance, best regards,
Helmut Müller
PS: I just tried your suggested command with GetDP version 2.1.0:
Info : InitSolution[T]
Info : Generate[T]
Info : (CPU = 13.1382s, Mem = 537Mb)
Info : Solve[T]
Info : N: 225215
getdp(19601) malloc: *** mmap(size=2147483648) failed (error code=12)
*** error: can't allocate region
*** set a breakpoint in malloc_error_break to debug
(this message was repeated several times)
getdp(19601,0xa052c500) malloc: *** mmap(size=1948217344) failed (error code=12)
*** error: can't allocate region
*** set a breakpoint in malloc_error_break to debug
UMFPACK V5.2.0 (Nov 1, 2007): ERROR: out of memory
[0]PETSC ERROR: --------------------- Error Message ------------------------------------
[0]PETSC ERROR: Error in external library!
[0]PETSC ERROR: umfpack_UMF_numeric failed!
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: Petsc Release Version 3.0.0, Patch 7, Mon Jul 6 11:33:34 CDT 2009
[0]PETSC ERROR: See docs/changes/index.html for recent updates.
[0]PETSC ERROR: See docs/faq.html for hints about trouble shooting.
[0]PETSC ERROR: See docs/index.html for manual pages.
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: problem.pro on a umfpack-c named MacBookPro.local by helmutmuller Tue Dec 14 15:26:14 2010
[0]PETSC ERROR: Libraries linked from /Users/geuzaine/src/petsc-3.0.0-p7/umfpack-cxx-opt/lib
[0]PETSC ERROR: Configure run at Tue Aug 18 10:37:03 2009
[0]PETSC ERROR: Configure options --with-debugging=0 --with-scalar-type=complex --with-clanguage=cxx --with-shared=0 --with-x=0 --with-fortran=0 --with-mpi=0 --with-umfpack=1 --download-umfpack=ifneeded
[0]PETSC ERROR: ------------------------------------------------------------------------
[0]PETSC ERROR: MatLUFactorNumeric_UMFPACK() line 176 in src/mat/impls/aij/seq/umfpack/umfpack.c
[0]PETSC ERROR: MatLUFactorNumeric() line 2390 in src/mat/interface/matrix.c
[0]PETSC ERROR: PCSetUp_LU() line 222 in src/ksp/pc/impls/factor/lu/lu.c
[0]PETSC ERROR: PCSetUp() line 794 in src/ksp/pc/interface/precon.c
[0]PETSC ERROR: KSPSetUp() line 237 in src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: KSPSolve() line 353 in src/ksp/ksp/interface/itfunc.c
[0]PETSC ERROR: User provided function() line 62 in unknowndirectory/LinAlg_PETSC.cpp
Abort trap
So I was unable to solve a problem with 225,215 DOF. So, how the hell have you (or Christophe) been able to solve about 10 million DOF?
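In the meantime I will try an iterative solver instead of the direct LU factorization; as far as I understand, that should need far less memory. A sketch along the lines of your command (the tolerance value is just a guess on my part; -ksp_type, -pc_type, -ksp_rtol and -ksp_monitor are standard PETSc options):

getdp problem.pro -solve myresolution -ksp_type gmres -pc_type sor -ksp_rtol 1e-8 -ksp_monitor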
regards, Helmut
On 14.12.2010, at 14:31, Guillaume Demesy wrote:
> Hello Helmut,
>
> 8 GB should be more than enough to solve this 200,000 DOF problem.
> GetDP binaries are compiled with the PETSc solver UMFPACK. Have you tried it? Or are you using GMRES?
> getdp myfile.pro -solve myresolution -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package umfpack
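> To double-check which solver and preconditioner are actually used, you can also append the standard PETSc option -ksp_view, e.g.:
> getdp myfile.pro -solve myresolution -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package umfpack -ksp_view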
>
> But the best solution when tackling big problems is probably to
> 1- recompile your own PETSc with OpenMPI support and MUMPS
> 2- compile GetDP with your PETSc
> ...which will provide you with a parallel 'solve'. The preprocessing will remain serial.
>
> see:
> https://geuz.org/trac/getdp/wiki/PETScCompile
> ./configure --CC=/opt/local/bin/openmpicc --CXX=/opt/local/bin/openmpicxx --FC=/opt/local/bin/openmpif90 --with-debugging=0 --with-clanguage=cxx --with-shared=0 --with-x=0 --download-mumps=1 --download-parmetis=1 --download-scalapack=1 --download-blacs=1 --with-scalar-type=complex
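> Once PETSc and GetDP are rebuilt this way, the parallel solve could look something like the following (a sketch; adjust the number of processes and the file names to your case):
> mpirun -np 4 getdp myfile.pro -solve myresolution -ksp_type preonly -pc_type lu -pc_factor_mat_solver_package mumps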
>
> Good luck!
>
> Guillaume
>
>
>
> On 14/12/2010 06:28, Helmut Müller wrote:
>>
>> Hi all,
>> First of all, I'd like to thank you for this impressive software!
>> I use it for (quite simple) simulations in building physics; I just solve heat equations. As a result I have quite large models (2 m by 2 m by 2 m) with some small parts or thin layers (10 mm).
>> Unfortunately I have to spend a lot of time adjusting the mesh and/or simplifying the geometry, because I have not managed to solve problems with more than approx. 220,000 DOF on my Mac (8 GB RAM, quad-core). Those problems are solved within seconds or a few minutes.
>> From working with other FEM simulations I know that it is really important to have a "good" mesh, but I'd like to spend less time optimizing the geometry and/or the mesh, at the price of longer computation times on larger models. A longer calculation time would cost me far less than the optimization effort.
>> In this context I have read a mail on the list:
>> > This has been successfully tested with both iterative (GMRES+SOR) and
>> > direct (MUMPS) parallel solvers on up to 128 CPUs, for test-cases up to
>> > 10 millions dofs.
>> With which trick or procedure has this been done? On which platform? How can I at least use the available memory to perform such calculations? (My GetDP 2.1 on Mac OS (binary download) uses only a small part, ca. 1 GB, of the available memory; PETSc fails with a malloc error message; the new release of GetDP uses all cores, but with no benefit for the maximum possible model size in terms of DOF.) So I assume that with 8 GB it should be possible to run calculations with at least 500,000 DOF.
>> So, what am I missing? Could partitioning the mesh and doing separate, iterative calculations be a solution?
>> Thanks in advance for any suggestion. I assume that other people are interested in this too.
>> Helmut Müller
>>
>>
>> _______________________________________________
>> getdp mailing list
>> getdp at geuz.org
>> http://www.geuz.org/mailman/listinfo/getdp
>