Project: 2676 (Run 1, Clone 151, Gen 70)

Moderators: Site Moderators, FAHC Science Team

road-runner
Posts: 227
Joined: Sun Dec 02, 2007 4:01 am
Location: Willis, Texas

Project: 2676 (Run 1, Clone 151, Gen 70)

Post by road-runner »

Not sure what is going on with this WU — it errored out twice (an MPI segfault, then a domain decomposition error) and then failed to resume from its checkpoint. Log below:

Code:

[16:19:57] Project: 2676 (Run 1, Clone 151, Gen 70)
[16:19:57] 
[16:19:57] Assembly optimizations on if available.
[16:19:57] Entering M.D.
[16:20:07] Run 1, Clone 151, Gen 70)
[16:20:07] 
[16:20:07] Entering M.D.
NNODES=8, MYRANK=0, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=2, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=1, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=3, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=5, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=6, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=7, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=4, HOSTNAME=i7-Upstairs
NODEID=2 argc=20
NODEID=0 argc=20
                         :-)  G  R  O  M  A  C  S  (-:

                   Groningen Machine for Chemical Simulation

                          :-)  VERSION 4.0.3_pre  (-:


      Written by David van der Spoel, Erik Lindahl, Berk Hess, and others.
       Copyright (c) 1991-2000, University of Groningen, The Netherlands.
             Copyright (c) 2001-2008, The GROMACS development team,
            check out http://www.gromacs.org for more information.


                                :-)  mdrun  (-:

Reading file work/wudata_04.tpr, VERSION 3.3.99_development_20070618 (single precision)
NODEID=1 argc=20
NODEID=3 argc=20
NODEID=6 argc=20
NODEID=7 argc=20
NODEID=4 argc=20
NODEID=5 argc=20
Note: tpx file_version 48, software version 58

NOTE: The tpr file used for this simulation is in an old format, for less memory usage and possibly more performance create a new tpr file with an up to date version of grompp

Making 3D domain decomposition 2 x 2 x 2
starting mdrun '23130 system in water'
17750002 steps,  35500.0 ps (continuing from step 17500002,  35000.0 ps).
[16:32:33] Completed 5008 out of 250000 steps  (2%)
[16:38:40] Completed 7508 out of 250000 steps  (3%)
[16:44:48] Completed 10008 out of 250000 steps  (4%)
[16:50:55] Completed 12508 out of 250000 steps  (5%)
[16:57:04] Completed 15008 out of 250000 steps  (6%)
[17:03:12] Completed 17508 out of 250000 steps  (7%)
[17:09:20] Completed 20008 out of 250000 steps  (8%)
[17:15:29] Completed 22508 out of 250000 steps  (9%)
[17:21:37] Completed 25008 out of 250000 steps  (10%)
[17:27:44] Completed 27508 out of 250000 steps  (11%)
[17:33:51] Completed 30008 out of 250000 steps  (12%)
[17:39:58] Completed 32508 out of 250000 steps  (13%)
[17:46:06] Completed 35008 out of 250000 steps  (14%)
[17:52:15] Completed 37508 out of 250000 steps  (15%)
[17:58:23] Completed 40008 out of 250000 steps  (16%)
[18:04:33] Completed 42508 out of 250000 steps  (17%)
[18:10:42] Completed 45008 out of 250000 steps  (18%)
[18:16:48] Completed 47508 out of 250000 steps  (19%)
[18:22:57] Completed 50008 out of 250000 steps  (20%)
[18:29:06] Completed 52508 out of 250000 steps  (21%)
[18:35:14] Completed 55008 out of 250000 steps  (22%)
[18:41:22] Completed 57508 out of 250000 steps  (23%)
[18:47:32] Completed 60008 out of 250000 steps  (24%)
[18:53:40] Completed 62508 out of 250000 steps  (25%)
[18:59:50] Completed 65008 out of 250000 steps  (26%)
[19:05:58] Completed 67508 out of 250000 steps  (27%)
[19:12:05] Completed 70008 out of 250000 steps  (28%)
[19:18:14] Completed 72508 out of 250000 steps  (29%)
[19:24:21] Completed 75008 out of 250000 steps  (30%)
[19:30:30] Completed 77508 out of 250000 steps  (31%)
[19:36:36] Completed 80008 out of 250000 steps  (32%)
[19:42:45] Completed 82508 out of 250000 steps  (33%)
[19:48:54] Completed 85008 out of 250000 steps  (34%)
[19:55:02] Completed 87508 out of 250000 steps  (35%)
[20:01:10] Completed 90008 out of 250000 steps  (36%)
[20:07:18] Completed 92508 out of 250000 steps  (37%)
[20:13:25] Completed 95008 out of 250000 steps  (38%)
[20:19:34] Completed 97508 out of 250000 steps  (39%)
[20:25:43] Completed 100008 out of 250000 steps  (40%)
[20:31:51] Completed 102508 out of 250000 steps  (41%)
[cli_6]: aborting job:
Fatal error in MPI_Sendrecv: Error message texts are not available
[cli_7]: aborting job:
Fatal error in MPI_Sendrecv: Error message texts are not available
[0]0:Return code = 0, signaled with Segmentation fault
[0]1:Return code = 0, signaled with Segmentation fault
[0]2:Return code = 0, signaled with Segmentation fault
[0]3:Return code = 0, signaled with Segmentation fault
[0]4:Return code = 0, signaled with Segmentation fault
[0]5:Return code = 0, signaled with Segmentation fault
[0]6:Return code = 1
[0]7:Return code = 1
[20:36:22] CoreStatus = 1 (1)
[20:36:22] Sending work to server
[20:36:22] Project: 2676 (Run 1, Clone 151, Gen 70)
[20:36:22] - Error: Could not get length of results file work/wuresults_04.dat
[20:36:22] - Error: Could not read unit 04 file. Removing from queue.
[20:36:22] - Preparing to get new work unit...
[20:36:22] + Attempting to get work packet
[20:36:22] - Connecting to assignment server
[20:36:23] - Successful: assigned to (171.67.108.24).
[20:36:23] + News From Folding@Home: Welcome to Folding@Home
[20:36:23] Loaded queue successfully.
[20:36:34] + Closed connections
[20:36:39] 
[20:36:39] + Processing work unit
[20:36:40] Core required: FahCore_a2.exe
[20:36:40] Core found.
[20:36:40] Working on queue slot 05 [April 3 20:36:40 UTC]
[20:36:40] + Working ...
[20:36:40] 
[20:36:40] *------------------------------*
[20:36:40] Folding@Home Gromacs SMP Core
[20:36:40] Version 2.04 (Thu Jan 29 16:43:57 PST 2009)
[20:36:40] 
[20:36:40] Preparing to commence simulation
[20:36:40] - Ensuring status. Please wait.
[20:36:49] - Looking at optimizations...
[20:36:49] - Working with standard loops on this execution.
[20:36:49] - Files status OK
[20:36:50] - Expanded 4864663 -> 24067137 (decompressed 494.7 percent)
[20:36:50] Called DecompressByteArray: compressed_data_size=4864663 data_size=24067137, decompressed_data_size=24067137 diff=0
[20:36:50] - Digital signature verified
[20:36:50] 
[20:36:50] Project: 2676 (Run 1, Clone 151, Gen 70)
[20:36:50] 
[20:36:51] Entering M.D.
NNODES=8, MYRANK=1, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=2, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=3, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=0, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=4, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=5, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=6, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=7, HOSTNAME=i7-Upstairs
NODEID=2 argc=20
NODEID=3 argc=20
NODEID=4 argc=20
NODEID=5 argc=20
NODEID=0 argc=20
                         :-)  G  R  O  M  A  C  S  (-:

                   Groningen Machine for Chemical Simulation

                          :-)  VERSION 4.0.3_pre  (-:


      Written by David van der Spoel, Erik Lindahl, Berk Hess, and others.
       Copyright (c) 1991-2000, University of Groningen, The Netherlands.
             Copyright (c) 2001-2008, The GROMACS development team,
            check out http://www.gromacs.org for more information.


                                :-)  mdrun  (-:

Reading file work/wudata_05.tpr, VERSION 3.3.99_development_20070618 (single precision)
NODEID=1 argc=20
NODEID=6 argc=20
NODEID=7 argc=20
Note: tpx file_version 48, software version 58

NOTE: The tpr file used for this simulation is in an old format, for less memory usage and possibly more performance create a new tpr file with an up to date version of grompp

Making 3D domain decomposition 2 x 2 x 2
starting mdrun '23130 system in water'
17750002 steps,  35500.0 ps (continuing from step 17500002,  35000.0 ps).
[20:43:08] Completed 2508 out of 250000 steps  (1%)
[20:49:15] Completed 5008 out of 250000 steps  (2%)
[20:55:23] Completed 7508 out of 250000 steps  (3%)
[21:01:30] Completed 10008 out of 250000 steps  (4%)
[21:07:39] Completed 12508 out of 250000 steps  (5%)
[21:13:46] Completed 15008 out of 250000 steps  (6%)
[21:19:55] Completed 17508 out of 250000 steps  (7%)
[21:26:03] Completed 20008 out of 250000 steps  (8%)
[21:32:10] Completed 22508 out of 250000 steps  (9%)
[21:38:17] Completed 25008 out of 250000 steps  (10%)
[21:44:24] Completed 27508 out of 250000 steps  (11%)
[21:50:33] Completed 30008 out of 250000 steps  (12%)
[21:56:38] Completed 32508 out of 250000 steps  (13%)
[22:02:46] Completed 35008 out of 250000 steps  (14%)
[22:08:53] Completed 37508 out of 250000 steps  (15%)
[22:15:01] Completed 40008 out of 250000 steps  (16%)
[22:21:10] Completed 42508 out of 250000 steps  (17%)
[22:27:18] Completed 45008 out of 250000 steps  (18%)
[22:33:23] Completed 47508 out of 250000 steps  (19%)
[22:39:31] Completed 50008 out of 250000 steps  (20%)
[22:45:39] Completed 52508 out of 250000 steps  (21%)
[22:51:46] Completed 55008 out of 250000 steps  (22%)
[22:57:52] Completed 57508 out of 250000 steps  (23%)
[23:04:01] Completed 60008 out of 250000 steps  (24%)
[23:10:09] Completed 62508 out of 250000 steps  (25%)
[23:16:15] Completed 65008 out of 250000 steps  (26%)
[23:22:23] Completed 67508 out of 250000 steps  (27%)
[23:28:30] Completed 70008 out of 250000 steps  (28%)
[23:34:37] Completed 72508 out of 250000 steps  (29%)
[23:40:44] Completed 75008 out of 250000 steps  (30%)
[23:46:52] Completed 77508 out of 250000 steps  (31%)

-------------------------------------------------------
Program mdrun, VERSION 4.0.3_pre
Source code file: domdec_top.c, line: 172

Software inconsistency error:
Some interactions seem to be assigned multiple times
-------------------------------------------------------

Thanx for Using GROMACS - Have a Nice Day

Error on node 7, will try to stop all the nodes
Halting parallel program mdrun on CPU 7 out of 8

gcq#0: Thanx for Using GROMACS - Have a Nice Day

[cli_7]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 7
[23:51:29] 
[23:51:29] Folding@home Core Shutdown: INTERRUPTED

A list of missing interactions:
               Angle of  42183 missing     -1
          exclusions of 250365 missing     -4
[cli_0]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, 102) - process 0
[cli_1]: aborting job:
Fatal error in MPI_Allreduce: Error message texts are not available
[cli_2]: aborting job:
Fatal error in MPI_Allreduce: Error message texts are not available
[cli_3]: aborting job:
Fatal error in MPI_Allreduce: Error message texts are not available
[cli_5]: aborting job:
Fatal error in MPI_Allreduce: Error message texts are not available

Folding@Home Client Shutdown.
david@i7-Upstairs:~/folding1/FAH1$ ./fah6 -smp 8 -local'
> 
> 
david@i7-Upstairs:~/folding1/FAH1$ ./fah6 -smp 8 -local

Note: Please read the license agreement (fah6 -license). Further 
use of this software requires that you have read and accepted this agreement.

Using local directory for work files
8 cores detected


--- Opening Log file [April 4 01:10:03 UTC] 


# Linux SMP Console Edition ###################################################
###############################################################################

                       Folding@Home Client Version 6.24beta

                          http://folding.stanford.edu

###############################################################################
###############################################################################

Launch directory: /home/david/folding1/FAH1
Executable: ./fah6
Arguments: -smp 8 -local 

[01:10:03] - Ask before connecting: No
[01:10:03] - User name: road-runner (Team 12772)
[01:10:03] - User ID: 96D91915346BDE2
[01:10:03] - Machine ID: 1
[01:10:03] 
[01:10:03] Loaded queue successfully.
[01:10:03] 
[01:10:03] + Processing work unit
[01:10:03] Core required: FahCore_a2.exe
[01:10:03] Core found.
[01:10:03] Working on queue slot 05 [April 4 01:10:03 UTC]
[01:10:03] + Working ...
[01:10:03] 
[01:10:03] *------------------------------*
[01:10:03] Folding@Home Gromacs SMP Core
[01:10:03] Version 2.04 (Thu Jan 29 16:43:57 PST 2009)
[01:10:03] 
[01:10:03] Preparing to commence simulation
[01:10:03] - Ensuring status. Please wait.
[01:10:04] Called DecompressByteArray: compressed_data_size=4864663 data_size=24067137, decompressed_data_size=24067137 diff=0
[01:10:04] - Digital signature verified
[01:10:04] 
[01:10:04] Project: 2676 (Run 1, Clone 151, Gen 70)
[01:10:04] 
[01:10:04] Assembly optimizations on if available.
[01:10:04] Entering M.D.
[01:10:10] Will resume from checkpoint file
[01:10:14] ng M.D.
[01:10:20] Will resume from checkpoint file
NNODES=8, MYRANK=0, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=1, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=2, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=3, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=5, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=6, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=7, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=4, HOSTNAME=i7-Upstairs
NODEID=2 argc=20
NODEID=3 argc=20
NODEID=0 argc=20
                         :-)  G  R  O  M  A  C  S  (-:

                   Groningen Machine for Chemical Simulation

                          :-)  VERSION 4.0.3_pre  (-:


      Written by David van der Spoel, Erik Lindahl, Berk Hess, and others.
       Copyright (c) 1991-2000, University of Groningen, The Netherlands.
             Copyright (c) 2001-2008, The GROMACS development team,
            check out http://www.gromacs.org for more information.


                                :-)  mdrun  (-:

NODEID=1 argc=20
Reading file work/wudata_05.tpr, VERSION 3.3.99_development_20070618 (single precision)
NODEID=6 argc=20
NODEID=4 argc=20
NODEID=7 argc=20
NODEID=5 argc=20
Note: tpx file_version 48, software version 58

NOTE: The tpr file used for this simulation is in an old format, for less memory usage and possibly more performance create a new tpr file with an up to date version of grompp

Making 3D domain decomposition 2 x 2 x 2
starting mdrun '23130 system in water'
17750002 steps,  35500.0 ps (continuing from step 17500002,  35000.0 ps).

-------------------------------------------------------
Program mdrun, VERSION 4.0.3_pre
Source code file: md.c, line: 1107

Fatal error:
Checkpoint error on step 17577520

-------------------------------------------------------

Thanx for Using GROMACS - Have a Nice Day

Error on node 0, will try to stop all the nodes
Halting parallel program mdrun on CPU 0 out of 8

gcq#0: Thanx for Using GROMACS - Have a Nice Day

[cli_0]: aborting job:
application called MPI_Abort(MPI_COMM_WORLD, -1) - process 0
[01:10:22] te: I/O failed dir=0, var=0000000001EE8150, varsize=394668
[01:10:22] fcCheckPointResume: failure in call to fcSaveRestoreState() to restore state.
[0]0:Return code = 255
[0]1:Return code = 0, signaled with Quit
[0]2:Return code = 0, signaled with Quit
[0]3:Return code = 0, signaled with Quit
[0]4:Return code = 0, signaled with Quit
[0]5:Return code = 0, signaled with Quit
[0]6:Return code = 0, signaled with Quit
[0]7:Return code = 0, signaled with Quit
[01:10:30] CoreStatus = FF (255)
[01:10:30] Sending work to server
[01:10:30] Project: 2676 (Run 1, Clone 151, Gen 70)
[01:10:30] - Error: Could not get length of results file work/wuresults_05.dat
[01:10:30] - Error: Could not read unit 05 file. Removing from queue.
[01:10:30] - Preparing to get new work unit...
[01:10:30] + Attempting to get work packet
[01:10:30] - Connecting to assignment server
[01:10:31] - Successful: assigned to (171.67.108.24).
[01:10:31] + News From Folding@Home: Welcome to Folding@Home
[01:10:31] Loaded queue successfully.
[01:10:44] + Closed connections
[01:10:49] 
[01:10:49] + Processing work unit
[01:10:49] Core required: FahCore_a2.exe
[01:10:49] Core found.
[01:10:49] Working on queue slot 06 [April 4 01:10:49 UTC]
[01:10:49] + Working ...
[01:10:49] 
[01:10:49] *------------------------------*
[01:10:49] Folding@Home Gromacs SMP Core
[01:10:49] Version 2.04 (Thu Jan 29 16:43:57 PST 2009)
[01:10:49] 
[01:10:49] Preparing to commence simulation
[01:10:49] - Ensuring status. Please wait.
[01:10:59] - Looking at optimizations...
[01:10:59] - Working with standard loops on this execution.
[01:10:59] - Files status OK
[01:11:00] - Expanded 4864663 -> 24067137 (decompressed 494.7 percent)
[01:11:00] Called DecompressByteArray: compressed_data_size=4864663 data_size=24067137, decompressed_data_size=24067137 diff=0
[01:11:00] - Digital signature verified
[01:11:00] 
[01:11:00] Project: 2676 (Run 1, Clone 151, Gen 70)
[01:11:00] 
[01:11:00] Entering M.D.
NNODES=8, MYRANK=0, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=1, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=2, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=4, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=5, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=7, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=3, HOSTNAME=i7-Upstairs
NNODES=8, MYRANK=6, HOSTNAME=i7-Upstairs
NODEID=4 argc=20
NODEID=0 argc=20
                         :-)  G  R  O  M  A  C  S  (-:

                   Groningen Machine for Chemical Simulation

                          :-)  VERSION 4.0.3_pre  (-:


      Written by David van der Spoel, Erik Lindahl, Berk Hess, and others.
       Copyright (c) 1991-2000, University of Groningen, The Netherlands.
             Copyright (c) 2001-2008, The GROMACS development team,
            check out http://www.gromacs.org for more information.


                                :-)  mdrun  (-:

Reading file work/wudata_06.tpr, VERSION 3.3.99_development_20070618 (single precision)
NODEID=1 argc=20
NODEID=5 argc=20
NODEID=6 argc=20
NODEID=7 argc=20
NODEID=2 argc=20
NODEID=3 argc=20
Note: tpx file_version 48, software version 58

NOTE: The tpr file used for this simulation is in an old format, for less memory usage and possibly more performance create a new tpr file with an up to date version of grompp

Making 3D domain decomposition 2 x 2 x 2
starting mdrun '23130 system in water'
17750002 steps,  35500.0 ps (continuing from step 17500002,  35000.0 ps).


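As a side note for anyone reading the log: the numbers the client prints are internally consistent, which suggests the download and decompression were fine and the failures happened during the run itself. A quick sanity check, using only figures that appear in the log above (nothing here is from outside the log):

```python
# Sanity-check the arithmetic printed in the log above.

# Decompression line: "Expanded 4864663 -> 24067137 (decompressed 494.7 percent)"
compressed = 4864663           # compressed_data_size
expanded = 24067137            # data_size after DecompressByteArray
ratio_pct = expanded / compressed * 100
print(f"decompressed {ratio_pct:.1f} percent")   # matches the logged 494.7

# mdrun line: "17750002 steps, 35500.0 ps (continuing from step 17500002, 35000.0 ps)"
start_step, end_step = 17500002, 17750002
start_ps, end_ps = 35000.0, 35500.0
steps = end_step - start_step                    # 250000, the step count the core reports
dt_fs = (end_ps - start_ps) / steps * 1000       # implied timestep in femtoseconds
print(steps, dt_fs)                              # 250000 steps at a 2 fs timestep
```

So each attempt is the same 250000-step generation at a 2 fs timestep; the WU data itself verifies its digital signature every time, which points at the WU contents (or the A2 core) rather than a transfer problem.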
toTOW
Site Moderator
Posts: 6453
Joined: Sun Dec 02, 2007 10:38 am
Location: Bordeaux, France
Contact:

Re: Project: 2676 (Run 1, Clone 151, Gen 70)

Post by toTOW »

There's no data for this WU in the DB ... this looks like a bad one :S

Folding@Home beta tester since 2002. Folding Forum moderator since July 2008.