MPI error in code on Fujitsu VPP


Gentle Users...

Found and fixed the problem (below). It is a one-line change to the file
MPP/RSL/RSL/rsl_initial.c. The following line must be added around line 103,
in the routine RSL_INITIALIZE:

RSL_INITIALIZE ()
{
  rslMPIInit() ;                             /* <<<< add this line */
  rsl_mpi_communicator = MPI_COMM_WORLD ;
  rsl_initialize_internal() ;
}

The problem was that the code assigned rsl_mpi_communicator the value of
MPI_COMM_WORLD before MPI_Init had been called. On most systems,
MPI_COMM_WORLD is a compile-time constant, so the ordering doesn't matter. On
the VPP, however, it appears that MPI_COMM_WORLD is a variable that is only
given a valid value inside MPI_Init. The call to rslMPIInit performs that
initialization (after first checking that MPI hasn't already been
initialized).
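For reference, here is roughly what such a guard looks like in terms of the
standard MPI C calls (a minimal sketch, not the actual RSL source; the
argument handling in rsl_initial.c may differ):

#include <mpi.h>

/* Initialize MPI only if no one has done so already, which makes
   the call harmless if MPI_Init has already run. */
void rslMPIInit()
{
  int mpi_is_up = 0 ;
  MPI_Initialized( &mpi_is_up ) ;      /* has MPI_Init already been called? */
  if ( ! mpi_is_up )
    MPI_Init( NULL, NULL ) ;  /* NULL args need MPI-2; older MPIs want the real argc/argv */
}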

Once the change is made to rsl_initial.c, the user must 'make uninstall' and
then 'make mpp' for the change to take effect.
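In other words, after editing rsl_initial.c, rebuild from the directory where
you normally run the MM5 MPP make targets (the exact directory depends on
your installation):

make uninstall
make mpp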

Happy New Year from the MM5 MPP answer beast.

Rotang; Jan. 2002


> > > 
> > > 	I'm sorry to bother you, but I'm experiencing some problems 
> > > with the latest MM5 release (3-4, December 2001). MM5 has been run
> > > operationally for more than a year on a Fujitsu VPP 700 machine. With the
> > > new MM5 release I receive the following message whenever I submit the job:
> > > 
> > > 0 - Error in MPI_COMM_SIZE : Invalid communicator
> > > 1 - Error in MPI_COMM_SIZE : Invalid communicator
> > > 2 - Error in MPI_COMM_SIZE : Invalid communicator
> > > 3 - Error in MPI_COMM_SIZE : Invalid communicator
> > > UXP/V MPI  1: Aborting with error code=5 class=5: Invalid communicator
> > > UXP/V MPI  2: Aborting with error code=5 class=5: Invalid communicator
> > > UXP/V MPI  3: Aborting with error code=5 class=5: Invalid communicator
> > > PEID = 0004  PPID = 0004  TYPE = IMPE
> > > 
> > > The configuration and the deck file are attached. The programme
> > > aborts whatever the minimum number of processors allowed in the E/W
> > > dimension (PROCMIN_EW... 1, 4, 8, 16). The model configuration is exactly
> > > the same as the one we were using with model release 2-3.
> > > 
> > > I would very much appreciate it if you could give me some hints on this problem.
> > > 
> > > Thank you very much,
> > > 
> > > 		xxxxxxxxxxxxxxxxx
> > > 
> > > 