Note that this item has been fixed.
See 1999 09 22.



Date: Tue, 14 Sep 1999 10:42:04 +0800
From: David YEUNG 
Organization: CCAR HKUST
To: "michalak@ucar.edu" 
CC: "'Alexis Kai-Hon Lau'" , Wei Wang 
Subject: Re: Running MM5 on multiple-CPU PC under linux

John

I tried some debugger commands but was not able to find where the
seg fault occurred. I sent the same output to the Portland Group; here is
their reply:

-----------------------------
Subject: Re: pgdbg-internal-err[1323]: Unrecognized stab entry
Date: Sun, 12 Sep 1999 23:00:30 -0700 (PDT)
From: Dave Borer 
To:   dyeung@ust.hk
CC:   trs@pgroup.com

David,

   It appears you are encountering a libpthread.a problem
with the RH6 libpthreads.a. We should have this fixed in the
3.1 release. Until then, we do not have a workaround for you.

regards,
dave
------------------------------

They also said that the newer version will be released in about 10 days.
I'll try it again when I get the new compiler and update you on our progress.

Thanks

david
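
Note: one way to check whether the RH6 pthread library is what the OpenMP
executable is actually built against, and which PGI release is installed, is
something like the following (a sketch only; output depends on the local
installation):

  # shared libraries mm5.exe is bound to; libpthread shows up here only if
  # it is linked dynamically (a static link against libpthread.a will not)
  ldd ./mm5.exe

  # glibc / linuxthreads package shipped with the RedHat 6 system
  rpm -q glibc

  # installed PGI compiler release (the reply above expects a fix in 3.1)
  pgf90 -V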


"John G. Michalakes" wrote:
> 
> David,
> 
> Can you run under a debugger to find out where the seg fault is occurring?
>  Does it occur in the same place every time you run? What does the size command
> say about your executable?  Can you watch the program as it runs and tell how
> much memory it's using when it dies?  Does it run single-domain if you compile
> with MAXNES=1 instead of 2 to reduce the size of the program in memory?
> 
> John
> 
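Note: a concrete way to work through the questions above, assuming a bash
shell and using GNU gdb since pgdbg trips over the stab entries (a sketch,
not the procedure actually followed here):

  # text/data/bss sizes of the executable
  size ./mm5.exe

  # allow a core file, and raise the stack limit -- OpenMP private arrays
  # live on thread stacks, so a small stack limit is a common cause of
  # segfaults that show up only when compiling with -mp
  ulimit -c unlimited
  ulimit -s unlimited

  # run under gdb and get a traceback at the fault; set the thread count
  # with OMP_NUM_THREADS (PGI has also used NCPUS for this)
  OMP_NUM_THREADS=2 gdb ./mm5.exe
  (gdb) run
  (gdb) backtrace

  # in another window, watch how much memory the process uses as it runs
  top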
> On Wednesday, September 08, 1999 9:28 PM, David YEUNG [SMTP:dyeung@ust.hk]
> wrote:
> > John
> >
> > I tried to compile MM5V2 (the SMP version) on a 2-CPU PC. If I compile
> > without the OpenMP option, i.e., in single-CPU mode, mm5 runs OK. However,
> > if I compile with OpenMP, mm5.exe dumps core. Attached is the output from
> > pgdbg when I compile with the -g option, and the variables in the
> > 'configure.user'.
> >
> > Any suggestion?
> >
> > Thanks
> >
> > david
> >
> > -----------------------------------------------------------------------------
> > pgdbg 3.0-4, Copyright 1989-1999, The Portland Group, Incorporated.
> >         *** Reading dwarf
> > 100% complete
> >         *** Reading Symbol Table
> >         *** Reading stabs
> > pgdbg-internal-err[1323]: Unrecognized stab entry
> >                 Stab Type: N_FUN
> >                 Stab:
> >                 Source: fstat.c
> > Symbol table built: ./mm5.exe
> > Loaded: ./mm5.exe
> >
> > pgdbg> run
> > reloading ./mm5.exe
> >
> > ***************  MULTI LEVEL RUN!!!  ***************
> > ***************    2 DOMAIN TOTAL    ***************
> >  STARTING DATE FOR THIS MM5V2 INPUT FILE IS 1999042912
> >  INPUT DOMAIN ID IS            1
> >  IX, JX, DX =           64          82   54000.00     (METERS)
> >  INPUT LANDUSE = OLD
> >  LANDUSE TYPE = OLD  FOUND          13 CATEGORIES           2 SEASONS
> >   WATER CATEGORY =            7
> >  STARTING DATE FOR THIS MM5V2 FORECAST IS 1999042912
> >
> >
> >
> >  MIX =   85   MJX =  100   MKX =  27,   MAXNES =   2
> >  INHYD  =  1  IKFFC  =  1  IARASC =  0  IEXMS  =  1  IICE   =  1  IICEG  =  0
> >  IRDDIM =  1  ISLDIM =  1  IFDDAG =  0  IFDDAO =  0
> >
> >
> >
> >  IMPHYS =   5  MIXED PHASE SCHEME (REISNER).
> >             IMOIST =   2
> >
> >  ICUPA  =   3  GRELL CONVECTIVE PARAMETERIZATION IS USED
> >
> >  IBLTYP =   5  MRF PBL IS USED.
> >             VERTICAL MIXING MOIST ADIABATIC IN CLOUDS.
> >
> >  ITGFLG =   1  THE SURFACE ENERGY BUDGET IS USED TO CALCULATE THE GROUND
> >  TEMPERATURE.
> >             JULDAY = 119  GMT = 12.0
> >  ISOIL  =   1  MULTI-LAYER SOIL THERMAL DIFFUSION .
> >  ISFFLX =   1  HEAT AND MOISTURE FLUXES FROM THE GROUND ARE CONSIDERED.
> >             SURFACE PARAMETERS ARE VARIABLE.
> >
> >  IFRAD  =   2  LONGWAVE AND SHORTWAVE SCHEMES (DUDHIA, 1989).
> >             THE RADIATION EFFECTS DUE TO CLOUDS ARE CONSIDERED.
> >             THE RADIATION IS COMPUTED EVERY   12 STEPS.
> >
> >  ITPDIF =   1  HORIZONTAL DIFFUSION OF PERTURBATION TEMPERATURE.
> >  IVQADV =   1  VERTICAL MOISTURE ADVECTION USES LINEAR INTERPOLATION.
> >  IVTADV =   1  VERTICAL TEMPERATURE ADVECTION USES LINEAR INTERPOLATION.
> >  ITHADV =   1  POTENTIAL TEMPERATURE ADVECTION USED
> >  ICOR3D =   1  FULL 3D CORIOLIS FORCE.
> >  IFUPR  =   1  UPPER RADIATIVE BOUNDARY CONDITION.
> >
> >  IBOUDY =   3  RELAXATION I/O BOUNDARY CONDITIONS ARE USED.
> >
> >
> >
> > 0 K    SIGMA(K)     A(K)     DSIGMA(K)    TWT(K,1)     TWT(K,2)     QCON(K)
> >
> >   1     0.0000     0.0250     0.0500       0.0000       0.0000       0.0000
> >   2     0.0500     0.0700     0.0400       0.5556       0.4444       0.4444
> >   3     0.0900     0.1100     0.0400       0.5000       0.5000       0.5000
> >   4     0.1300     0.1500     0.0400       0.5000       0.5000       0.5000
> >   5     0.1700     0.1900     0.0400       0.5000       0.5000       0.5000
> >   6     0.2100     0.2300     0.0400       0.5000       0.5000       0.5000
> >   7     0.2500     0.2700     0.0400       0.5000       0.5000       0.5000
> >   8     0.2900     0.3100     0.0400       0.5000       0.5000       0.5000
> >   9     0.3300     0.3500     0.0400       0.5000       0.5000       0.5000
> >  10     0.3700     0.3900     0.0400       0.5000       0.5000       0.5000
> >  11     0.4100     0.4300     0.0400       0.5000       0.5000       0.5000
> >  12     0.4500     0.4700     0.0400       0.5000       0.5000       0.5000
> >  13     0.4900     0.5100     0.0400       0.5000       0.5000       0.5000
> >  14     0.5300     0.5500     0.0400       0.5000       0.5000       0.5000
> >  15     0.5700     0.5900     0.0400       0.5000       0.5000       0.5000
> >  16     0.6100     0.6300     0.0400       0.5000       0.5000       0.5000
> >  17     0.6500     0.6700     0.0400       0.5000       0.5000       0.5000
> >  18     0.6900     0.7100     0.0400       0.5000       0.5000       0.5000
> >  19     0.7300     0.7500     0.0400       0.5000       0.5000       0.5000
> >  20     0.7700     0.7900     0.0400       0.5000       0.5000       0.5000
> >  21     0.8100     0.8300     0.0400       0.5000       0.5000       0.5000
> >  22     0.8500     0.8700     0.0400       0.5000       0.5000       0.5000
> >  23     0.8900     0.9100     0.0400       0.5000       0.5000       0.5000
> >  24     0.9300     0.9450     0.0300       0.5714       0.4286       0.4286
> >  25     0.9600     0.9700     0.0200       0.6000       0.4000       0.4000
> >  26     0.9800     0.9850     0.0100       0.6667       0.3333       0.3333
> >  27     0.9900     0.9950     0.0100       0.5000       0.5000       0.5000
> >  28     1.0000
> >
> >
> >  MAXIMUM TIME      =  120. MINUTES
> >  TIME STEP         =  150.00 SECONDS
> >  COARSE MESH DX    =  54000. METERS
> >  GRID POINTS (X,Y) = ( 82, 64)
> >  NUMBER OF LEVELS  = 27
> >
> >  CONSTANT HOR. DIFF. COEF. =  0.58320E+05 M*M/S
> >  MAXIMUM  HOR. DIFF. COEF. =  0.60750E+06 M*M/S
> >  *** INITIAL TOTAL AIR =  0.12889E+18 KG, TOTAL WATER =  0.57123E+15 KG IN
> >  LARGE DOMAIN.
> >
> >  *** BOUNDARY CONDITIONS VALID BETWEEN       0.00 -     360.00 MINUTES ARE
> >  READ IN AT TIME =       0.00 MINUTES.
> >  *** SOLAR DECLINATION ANGLE =  14.20 DEGREES. SOLAR CONSTANT =  1349.85 W/M**2 ***
> >  !!!!!!!!!! NON-HYDROSTATIC RUN !!!!!!!!!!
> > Program stopped
> >    Signal segv
> > Program has exited
> >
> > Program terminated, exit code is 0
> >
> > pgdbg>
> >
> > -----------------------------------------------------------------------------
> >
> >
> > The variables defined in the configure.user:
> >
> > -----------------------------------------------------------------------------
> > FC = pgf90
> > #FCFLAGS = -I$(LIBINCLUDE) -O2 -Mcray=pointer -tp p6 -pc 32 -Mnoframe -byteswapio -Mnosgimp -mp
> > FCFLAGS = -g -I$(LIBINCLUDE) -Mcray=pointer -tp p6 -pc 32 -byteswapio -Mnosgimp -mp
> > #FCFLAGS = -I$(LIBINCLUDE) -O2 -Mcray=pointer -tp p6 -pc 32 -Mnoframe -byteswapio
> > CFLAGS = -O
> > CPP = /lib/cpp -C
> > CPPFLAGS = -I$(LIBINCLUDE)
> > #LDOPTIONS = -O2 -Mcray=pointer -tp p6 -pc 32 -Mnoframe -byteswapio -Mnosgimp -mp
> > LDOPTIONS = -g -Mcray=pointer -tp p6 -pc 32 -byteswapio -Mnosgimp -mp
> > #LDOPTIONS = -O2 -Mcray=pointer -tp p6 -pc 32 -Mnoframe -byteswapio
> > LOCAL_LIBRARIES =
> > MAKE = make
> >
> > -----------------------------------------------------------------------------
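Note: one way to tell whether the failure is in MM5 itself or in the pgf90
OpenMP runtime / RH6 pthreads combination is to build a trivial OpenMP
program with the same -mp switch. The file name and commands below are a
sketch for illustration, assuming the compiler accepts free-form OpenMP
sentinels:

cat > omptest.f90 <<'EOF'
! trivial OpenMP test: each thread bumps a shared counter in a critical section
program omptest
  implicit none
  integer :: n
  n = 0
!$omp parallel
!$omp critical
  n = n + 1
!$omp end critical
!$omp end parallel
  print *, 'threads that ran: ', n
end program omptest
EOF

pgf90 -mp -o omptest omptest.f90
OMP_NUM_THREADS=2 ./omptest    # should print 2 if the runtime is healthy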
> >
> >
> >
> > "John G. Michalakes" wrote:
> > >
> > > Alexis, I'm not aware of anyone having done this yet, but I looked at the
> > > Portland Group web site ( http://www.pgroup.com ) and version 3.0 of PGF77
> > > does support OpenMP.  That's all you need, I would guess.  Please let me
> > > know how this progresses.
> > >
> > > Thanks,
> > >
> > > John
> > >
> > > On Wednesday, September 08, 1999 4:04 AM, Alexis Kai-Hon Lau
> > > [SMTP:alau@ust.hk]
> > > wrote:
> > > > Wei and John,
> > > >
> > > > We want to try running MM5 on a cluster of PCs, each having a number of
> > > > CPUs. We have already been able to run MM5 on a cluster of PCs, each with
> > > > one processor. Any idea whether there is any group running on a cluster of
> > > > multiple-CPU PCs? Any makefile to test out?
> > > >
> > > > If not, any idea whether there is any group running on a PC with multiple
> > > > CPUs (George Lai from NASA??), and whether there is a makefile that we can
> > > > try out?
> > > >
> > > > Alexis
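
Note: nothing in this thread confirms a working setup for a cluster of
multiple-CPU PCs, but one approach is to keep the existing MPI build across
the nodes and add the shared-memory (-mp) build within each node, i.e. one
MPI process per machine and one OpenMP thread per CPU. A hypothetical MPICH
launch is sketched below; the host names, counts, and paths are made up for
illustration:

# one entry per dual-CPU machine (hypothetical names)
cat > machines <<'EOF'
node1
node2
node3
node4
EOF

# one MPI process per node, two OpenMP threads per process; depending on
# the launcher, OMP_NUM_THREADS may have to be set in the shell startup
# files on each node rather than exported here
export OMP_NUM_THREADS=2
mpirun -np 4 -machinefile machines ./mm5.exe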