petsc-3.3-p7 2013-05-11


MatCreateBlockMat - Creates a new matrix based on block sparse Mat storage, where each block is a bs-by-bs submatrix


#include "petscmat.h" 
PetscErrorCode  MatCreateBlockMat(MPI_Comm comm,PetscInt m,PetscInt n,PetscInt bs,PetscInt nz,PetscInt *nnz, Mat *A)
Collective on MPI_Comm

Input Parameters

comm - MPI communicator
m - number of rows
n - number of columns
bs - size of each submatrix
nz - expected maximum number of nonzero blocks per row (use PETSC_DEFAULT if not known)
nnz - expected number of nonzero blocks per row if known (use PETSC_NULL otherwise)

Output Parameter

A - the matrix

PETSc requires that matrices and vectors being used for certain operations are partitioned accordingly. For example, when creating a bmat matrix, A, that supports parallel matrix-vector products using MatMult(A,x,y), the user should set the number of local matrix rows to be the number of local elements of the corresponding result vector, y. Note that this information is required for use of the matrix interface routines, even though the bmat matrix may not actually be physically partitioned.
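A minimal usage sketch, assuming a standard PETSc 3.3 installation; the sizes m, n, and bs below are illustrative, not prescribed by this routine:

```c
/* Sketch: create a block matrix with MatCreateBlockMat and apply it
   with MatMult. Error codes are checked with CHKERRQ, as is
   conventional in PETSc programs. */
#include <petscmat.h>

int main(int argc, char **argv)
{
  Mat            A;
  Vec            x, y;
  PetscErrorCode ierr;
  PetscInt       m = 8, n = 8, bs = 2;   /* illustrative sizes */

  ierr = PetscInitialize(&argc, &argv, PETSC_NULL, PETSC_NULL);CHKERRQ(ierr);

  /* m-by-n matrix of bs-by-bs blocks; PETSC_DEFAULT lets PETSc pick a
     default number of nonzero blocks per row, PETSC_NULL skips the
     per-row preallocation hint. */
  ierr = MatCreateBlockMat(PETSC_COMM_WORLD, m, n, bs,
                           PETSC_DEFAULT, PETSC_NULL, &A);CHKERRQ(ierr);

  /* ... insert values here, then assemble before use ... */
  ierr = MatAssemblyBegin(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);
  ierr = MatAssemblyEnd(A, MAT_FINAL_ASSEMBLY);CHKERRQ(ierr);

  /* Vectors conforming to the matrix layout, for y = A x */
  ierr = MatGetVecs(A, &x, &y);CHKERRQ(ierr);
  ierr = VecSet(x, 1.0);CHKERRQ(ierr);
  ierr = MatMult(A, x, y);CHKERRQ(ierr);

  ierr = VecDestroy(&x);CHKERRQ(ierr);
  ierr = VecDestroy(&y);CHKERRQ(ierr);
  ierr = MatDestroy(&A);CHKERRQ(ierr);
  ierr = PetscFinalize();
  return 0;
}
```

Creating the vectors with MatGetVecs() ensures their parallel layout matches the matrix, which is the partitioning requirement described above.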


Keywords

matrix, bmat, create

See Also


Index of all Mat routines
Table of Contents for all manual pages
Index of all manual pages