Project: 5751 (Run 6, Clone 154, Gen 8)
Posted: Thu Jan 29, 2009 4:53 am
Since most of my 57xx WUs complete successfully, I thought this one might be a bad WU.
Code:
[16:06:12] + Closed connections
[16:06:12]
[16:06:12] + Processing work unit
[16:06:12] Core required: FahCore_11.exe
[16:06:12] Core found.
[16:06:12] Working on queue slot 09 [January 18 16:06:12 UTC]
[16:06:12] + Working ...
[16:06:12] - Calling '.\FahCore_11.exe -dir work/ -suffix 09 -priority 96 -checkpoint 15 -verbose -lifeline 1940 -version 620'
[16:06:12]
[16:06:12] *------------------------------*
[16:06:12] Folding@Home GPU Core - Beta
[16:06:12] Version 1.19 (Mon Nov 3 09:34:13 PST 2008)
[16:06:12]
[16:06:12] Compiler : Microsoft (R) 32-bit C/C++ Optimizing Compiler Version 14.00.50727.762 for 80x86
[16:06:12] Build host: amoeba
[16:06:12] Board Type: Nvidia
[16:06:12] Core :
[16:06:12] Preparing to commence simulation
[16:06:12] - Looking at optimizations...
[16:06:12] - Created dyn
[16:06:12] - Files status OK
[16:06:12] - Expanded 98610 -> 492276 (decompressed 499.2 percent)
[16:06:12] Called DecompressByteArray: compressed_data_size=98610 data_size=492276, decompressed_data_size=492276 diff=0
[16:06:12] - Digital signature verified
[16:06:12]
[16:06:12] Project: 5751 (Run 6, Clone 154, Gen 8)
[16:06:12]
[16:06:12] Assembly optimizations on if available.
[16:06:12] Entering M.D.
[16:06:19] Working on Protein
[16:06:21] Client config found, loading data.
[16:06:21] Starting GUI Server
[16:06:21] mdrun_gpu returned
[16:06:21] NANs detected on GPU
[16:06:21]
[16:06:21] Folding@home Core Shutdown: UNSTABLE_MACHINE
[16:06:24] CoreStatus = 7A (122)
[16:06:24] Sending work to server
[16:06:24] Project: 5751 (Run 6, Clone 154, Gen 8)
[16:06:24] - Read packet limit of 540015616... Set to 524286976.
[16:06:24] - Error: Could not get length of results file work/wuresults_09.dat
[16:06:24] - Error: Could not read unit 09 file. Removing from queue.
[16:06:24] Trying to send all finished work units
[16:06:24] + No unsent completed units remaining.
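
One quick way to check whether it really is the WU rather than the card is to tally the core shutdown statuses per project in the client's FAHlog.txt: if the failures all land on 5751 while other projects finish cleanly, that points at a bad WU; failures spread across every project point at the GPU or host instead. Below is a rough Python sketch of such a tally (my own, not part of the client). The FAHlog.txt path and the FINISHED_UNIT success string are assumptions based on the classic-client log format shown above, so adjust as needed.

Code:
#!/usr/bin/env python3
# Unofficial sketch: tally Folding@home core shutdown statuses per project
# by scanning the classic client's FAHlog.txt. Helps show whether failures
# cluster on one project (bad WU) or hit every project (unstable GPU/host).
import re
from collections import defaultdict

LOG_PATH = "FAHlog.txt"   # assumption: default log name; point at your own copy

project_re  = re.compile(r"Project: (\d+) \(Run \d+, Clone \d+, Gen \d+\)")
shutdown_re = re.compile(r"Folding@home Core Shutdown: (\w+)")

counts = defaultdict(lambda: {"ok": 0, "failed": 0})
current = None   # project number of the WU the core is currently reporting on

with open(LOG_PATH, errors="replace") as log:
    for line in log:
        m = project_re.search(line)
        if m:
            current = m.group(1)
            continue
        m = shutdown_re.search(line)
        if m and current:
            # assumption: successful units log FINISHED_UNIT here; anything
            # else (e.g. UNSTABLE_MACHINE as above) is counted as a failure
            if m.group(1) == "FINISHED_UNIT":
                counts[current]["ok"] += 1
            else:
                counts[current]["failed"] += 1

for project, c in sorted(counts.items(), key=lambda kv: int(kv[0])):
    print(f"Project {project}: {c['ok']} finished, {c['failed']} failed shutdowns")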