```c
#include "petscmat.h"
PetscErrorCode MatCreateMPIAIJWithArrays(MPI_Comm comm, PetscInt m, PetscInt n, PetscInt M, PetscInt N, const PetscInt i[], const PetscInt j[], const PetscScalar a[], Mat *mat)
```

Collective on MPI_Comm

- **comm** - MPI communicator
- **m** - number of local rows (cannot be PETSC_DECIDE)
- **n** - number of local columns; this should be the same as the local size used in creating the x vector for the matrix-vector product y = Ax (or PETSC_DECIDE to have it calculated if N is given). For square matrices, n is almost always m.
- **M** - number of global rows (or PETSC_DETERMINE to have it calculated if m is given)
- **N** - number of global columns (or PETSC_DETERMINE to have it calculated if n is given)
- **i** - row indices
- **j** - column indices
- **a** - matrix values

The i and j indices are 0-based, and the entries of i are offsets into the local j (and a) arrays; that is, i holds standard CSR row pointers, restarting at 0 on each process.

The format used for the sparse matrix input is compressed sparse row (CSR), which is equivalent to a row-major ordering of the nonzeros; i.e., for the following matrix, the expected input data is

```
1 0 0
2 0 3     P0
-------
4 5 6     P1
```

```
Process0 [P0]: rows_owned=[0,1]
    i = {0,1,3}  [size = nrow+1 = 2+1]
    j = {0,0,2}  [size = nz = 3]
    v = {1,2,3}  [size = nz = 3]

Process1 [P1]: rows_owned=[2]
    i = {0,3}    [size = nrow+1 = 1+1]
    j = {0,1,2}  [size = nz = 3]
    v = {4,5,6}  [size = nz = 3]
```

See Also: MPIAIJ, MatCreateAIJ(), MatCreateMPIAIJWithSplitArrays()

Level: intermediate
Location: src/mat/impls/aij/mpi/mpiaij.c
