HyPar - Hyperbolic-Parabolic PDE Solver
This code is a work in progress. The Git commit messages in the repository record the latest changes, such as bug fixes and new features.
Note: Some of the features listed below may not yet be available. They will be soon! :-)
With a Bitbucket account and SSH keys:
git clone git@bitbucket.org:dog5011/hypar.git
With HTTPS (no SSH keys needed):
git clone https://bitbucket.org/dog5011/hypar.git
A unified finite-difference algorithm to solve n-dimensional hyperbolic-parabolic partial differential equations, with or without a source term, on an n-dimensional Cartesian grid. The dimensionality and vector sizes are arbitrary. The parallel version implements an n-dimensional domain decomposition (the number of processes along any one or more dimensions can be set to one, yielding a reduced-dimension decomposition).
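To illustrate what such a decomposition means (a minimal sketch of the idea, not HyPar's actual implementation), each MPI rank can be mapped to coordinates on an n-dimensional process grid and assigned a hyper-rectangular piece of the global grid:

```python
# Minimal sketch of an n-dimensional Cartesian domain decomposition.
# Not HyPar's code -- just the idea: each rank owns a hyper-rectangular
# block of the global grid, with leftover points spread over leading ranks.

def rank_to_coords(rank, iproc):
    """Map a linear MPI rank to its coordinates on the process grid."""
    coords = []
    for p in iproc:
        coords.append(rank % p)
        rank //= p
    return coords

def local_extent(coord, nproc_d, npoints_d):
    """Index range [start, end) owned along one dimension."""
    base, rem = divmod(npoints_d, nproc_d)
    start = coord * base + min(coord, rem)
    size = base + (1 if coord < rem else 0)
    return start, start + size

# Example: a 12x8 grid on 4 processes, decomposed 2x2.
iproc, npoints = [2, 2], [12, 8]
for rank in range(4):
    coords = rank_to_coords(rank, iproc)
    extents = [local_extent(c, p, n) for c, p, n in zip(coords, iproc, npoints)]
    print(rank, coords, extents)
```

Setting a process-grid entry to one along a dimension simply gives every rank the full extent of that dimension, which is the reduced-dimension decomposition mentioned above.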
Spatial discretization schemes:
- First order upwind
- Third order MUSCL with Koren's limiter
- Fifth order WENO
- Fifth order CRWENO
- Second order central
- Fourth order central
- Fourth order compact
- Sixth order compact
- Sixth order WENO
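As an illustration of the limiting step in the MUSCL scheme listed above, Koren's limiter has a simple closed form, phi(r) = max(0, min(2r, (1+2r)/3, 2)), where r is the ratio of consecutive solution slopes (a generic sketch, not HyPar's code):

```python
def koren_limiter(r):
    """Koren's third-order limiter: phi(r) = max(0, min(2r, (1+2r)/3, 2))."""
    return max(0.0, min(2.0 * r, (1.0 + 2.0 * r) / 3.0, 2.0))

# On smooth data (r close to 1) the limiter returns values near phi(1) = 1,
# preserving third-order accuracy; at extrema (r <= 0) it returns 0, which
# drops the scheme to first order and avoids spurious oscillations.
print(koren_limiter(1.0))   # -> 1.0
print(koren_limiter(-0.5))  # -> 0.0
```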
Time integration:
- With PETSc (with-petsc branch)
- Native time integrators (without PETSc): multi-stage explicit Runge-Kutta schemes (first, second, third, and fourth order, as well as SSP RK3)
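For reference, one step of the classical fourth-order explicit Runge-Kutta scheme named above looks like the following (a generic textbook sketch, not HyPar's implementation):

```python
import math

def rk4_step(f, t, u, dt):
    """One step of classical RK4 for du/dt = f(t, u)."""
    k1 = f(t, u)
    k2 = f(t + 0.5 * dt, u + 0.5 * dt * k1)
    k3 = f(t + 0.5 * dt, u + 0.5 * dt * k2)
    k4 = f(t + dt, u + dt * k3)
    return u + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)

# Example: du/dt = -u, u(0) = 1, integrated to t = 1; exact answer exp(-1).
u, t, dt = 1.0, 0.0, 0.1
for _ in range(10):
    u = rk4_step(lambda t, u: -u, t, u, dt)
    t += dt
print(abs(u - math.exp(-1.0)))  # small error (fourth-order accurate)
```

In a method-of-lines PDE solver, f(t, u) would be the spatially discretized right-hand side evaluated on the grid rather than a scalar function.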
Physical models:
- Linear advection-diffusion-reaction
- Fokker-Planck double well
- Fokker-Planck model for Power Systems
- Euler equations (1D, 2D, 3D)
- Porous Medium Equations
Output formats:
- Plain text
- Tecplot (2D and 3D problems only)
The following steps should compile the code (the default prefix is the source folder itself, so the binary will be installed in hypar/bin):
autoreconf -i

This will generate the files required for the next steps:

[CFLAGS="..."] ./configure [--with-mpidir=/path/to/mpi] [--prefix=/install/dir]
make
make install
** A serial version can be compiled using the "-Dserial" compile flag.
** If MPI is unavailable, or the configure script is unable to detect it, a serial version of the code is compiled.
Compiling with PETSc - for the with-petsc branch:
** It is preferable to use the same MPI compiler for building PETSc and this code.
** If the environment variables PETSC_DIR and PETSC_ARCH are not defined (e.g., exported in the shell before running configure), the code is compiled without PETSc time integration.
[/path/to/mpi/bin/]mpiexec -n $NPROC /path/to/HYPAR

Note: $NPROC must equal the product of the "iproc" entries in the input file "solver.inp".
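The required process count is simply the product of the decomposition entries; a tiny Python check (the iproc values 2, 2, and 1 here are hypothetical):

```python
from math import prod

# Hypothetical "iproc" entries from solver.inp: a 2 x 2 x 1 decomposition
# of a 3D domain (one process along the third dimension).
iproc = [2, 2, 1]

# The value to pass to mpiexec -n must be exactly this product.
nproc = prod(iproc)
print(f"mpiexec -n {nproc} /path/to/HYPAR")  # -> mpiexec -n 4 /path/to/HYPAR
```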
See the README file in the source for a list of required input files and their explanations.
The folder Examples contains several examples with all the input files required to run a case. Each example has a README file describing that case.