Batch Status

Summary

Last updated: 24.04.2019 06:52:01

81 active nodes (41 used, 40 free)

4920 cores (2354 used, 2566 free)

35 running jobs, 165424:00:00 remaining core hours

0 waiting jobs, - waiting core hours
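
The remaining-core-hours figure above is consistent with summing #proc x t_req over the 35 running jobs listed further down (2288 cores requesting 72 h, plus the two shorter gpu jobs, add up to exactly 165424 core hours). Below is a minimal sketch of that arithmetic in Python, assuming the [days:]HH:MM:SS walltime format used in the job table; the function names are illustrative and not part of the batch system.

    from datetime import timedelta

    def parse_walltime(s: str) -> timedelta:
        """Parse a '[D:]HH:MM:SS' walltime string into a timedelta."""
        parts = [int(p) for p in s.split(":")]
        parts = [0] * (4 - len(parts)) + parts      # left-pad to days:hours:minutes:seconds
        return timedelta(days=parts[0], hours=parts[1],
                         minutes=parts[2], seconds=parts[3])

    def core_hours(nproc: int, walltime: str) -> float:
        """Core hours a job accounts for over the given walltime."""
        return nproc * parse_walltime(walltime).total_seconds() / 3600

    # Two sample rows from the job table:
    print(core_hours(48, "3:00:00:00"))   # job 94422 -> 3456.0
    print(core_hours(64, "10:00:00"))     # job 94662 -> 640.0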

Nodes

Running Jobs (35)

job queue user #proc #nodes ppn vmem t_remain t_req t_used started jobname hosts
94422 hpc2 rberre2m 48 1 48 60GB 4:09:02 3:00:00:00 2:19:50:58 21.04.2019 11:01:03 job.sh wr29
94423 hpc2 rberre2m 48 1 48 60GB 4:09:02 3:00:00:00 2:19:50:58 21.04.2019 11:01:03 job.sh wr30
94421 hpc2 rberre2m 48 1 48 60GB 4:09:02 3:00:00:00 2:19:50:58 21.04.2019 11:01:03 job.sh wr28
94424 hpc2 rberre2m 48 1 48 60GB 4:09:05 3:00:00:00 2:19:50:55 21.04.2019 11:01:06 job.sh wr31
94425 hpc2 rberre2m 48 1 48 60GB 4:09:05 3:00:00:00 2:19:50:55 21.04.2019 11:01:06 job.sh wr32
94426 hpc2 rberre2m 48 1 48 60GB 4:09:05 3:00:00:00 2:19:50:55 21.04.2019 11:01:06 job.sh wr33
94427 hpc2 rberre2m 48 1 48 60GB 4:09:05 3:00:00:00 2:19:50:55 21.04.2019 11:01:06 job.sh wr34
94428 hpc2 rberre2m 48 1 48 60GB 4:09:05 3:00:00:00 2:19:50:55 21.04.2019 11:01:06 job.sh wr35
94429 hpc2 rberre2m 48 1 48 60GB 4:09:05 3:00:00:00 2:19:50:55 21.04.2019 11:01:06 job.sh wr36
94431 hpc2 rberre2m 48 1 48 60GB 4:09:08 3:00:00:00 2:19:50:52 21.04.2019 11:01:09 job.sh wr38
94430 hpc2 rberre2m 48 1 48 60GB 4:09:08 3:00:00:00 2:19:50:52 21.04.2019 11:01:09 job.sh wr37
94432 hpc2 rberre2m 48 1 48 60GB 4:09:08 3:00:00:00 2:19:50:52 21.04.2019 11:01:09 job.sh wr39
94433 hpc2 rberre2m 48 1 48 60GB 4:09:08 3:00:00:00 2:19:50:52 21.04.2019 11:01:09 job.sh wr40
94434 hpc2 rberre2m 48 1 48 60GB 4:09:08 3:00:00:00 2:19:50:52 21.04.2019 11:01:09 job.sh wr41
94435 hpc2 rberre2m 48 1 48 60GB 4:09:11 3:00:00:00 2:19:50:49 21.04.2019 11:01:12 job.sh wr42
94080 hpc3 vbrief3s 64 1 64 180GB 4:28:44 3:00:00:00 2:19:31:16 21.04.2019 11:20:45 Netzstudie23 wr50
94662 gpu bmahes2s 64 1 64 16GB 4:44:41 10:00:00 5:15:19 1:36:42 job_main_model.sh wr19
94660 gpu aprabh2s 2 1 2 170GB 17:54:25 1:00:00:00 6:05:35 0:46:26 job_3_fashion.sh wr18
94482 gpu mwasil2s 16 1 16 24GB 1:03:13:00 3:00:00:00 1:20:47:00 22.04.2019 10:05:01 job_tflab.sh wr16
94486 gpu4 mwasil2s 32 1 32 24GB 1:05:43:31 3:00:00:00 1:18:16:29 22.04.2019 12:35:32 job_tflab.sh wr15
94495 wr13 rberre2m 272 1 272 60GB 2:01:25:09 3:00:00:00 22:34:51 23.04.2019 8:17:10 job13.sh wr13
94522 hpc dgromm3m 48 1 48 96GB 2:05:25:57 3:00:00:00 18:34:03 23.04.2019 12:17:58 start_mpi.sh wr52
94527 hpc dgromm3m 48 1 48 96GB 2:05:41:34 3:00:00:00 18:18:26 23.04.2019 12:33:35 start_mpi.sh wr53
94532 hpc dgromm3m 48 1 48 96GB 2:06:02:10 3:00:00:00 17:57:50 23.04.2019 12:54:11 start_mpi.sh wr54
94533 hpc dgromm3m 64 1 64 96GB 2:06:14:33 3:00:00:00 17:45:27 23.04.2019 13:06:34 start_mpi.sh wr55
94534 hpc dgromm3m 64 1 64 96GB 2:06:14:41 3:00:00:00 17:45:19 23.04.2019 13:06:42 start_mpi.sh wr56
94535 hpc3 pputin3s 256 4 64 120GB 2:07:27:33 3:00:00:00 16:32:27 23.04.2019 14:19:34 30mm_510deg_128proc wr57,wr58,wr59,wr60
94540 hpc dgromm3m 48 1 48 96GB 2:08:26:42 3:00:00:00 15:33:18 23.04.2019 15:18:43 start_mpi.sh wr61
94543 hpc dgromm3m 48 1 48 96GB 2:08:34:05 3:00:00:00 15:25:55 23.04.2019 15:26:06 start_mpi.sh wr62
94545 hpc dgromm3m 64 1 64 96GB 2:08:36:19 3:00:00:00 15:23:41 23.04.2019 15:28:20 start_mpi.sh wr63
94546 hpc dgromm3m 64 1 64 96GB 2:08:36:29 3:00:00:00 15:23:31 23.04.2019 15:28:30 start_mpi.sh wr64
94548 hpc dgromm3m 64 1 64 96GB 2:08:41:18 3:00:00:00 15:18:42 23.04.2019 15:33:19 start_mpi.sh wr65
94549 hpc dgromm3m 64 1 64 96GB 2:08:41:26 3:00:00:00 15:18:34 23.04.2019 15:33:27 start_mpi.sh wr66
94550 hpc dgromm3m 48 1 48 96GB 2:08:47:14 3:00:00:00 15:12:46 23.04.2019 15:39:15 start_mpi.sh wr67
94559 hpc3 pputin3s 256 4 64 120GB 2:10:04:26 3:00:00:00 13:55:34 23.04.2019 16:56:27 30mm_519deg_128proc wr76,wr77,wr78,wr79
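
Two reading aids for the table, inferred from the numbers rather than documented on the page: every row satisfies t_req = t_remain + t_used, and the started column omits the date for jobs that began on the current day (e.g. job 94660: a 0:46:26 start plus 6:05:35 used matches the 06:52:01 update time above). A quick, self-contained check of the first invariant against job 94482:

    from datetime import timedelta

    def td(s: str) -> timedelta:
        """'[D:]HH:MM:SS' -> timedelta."""
        p = [int(x) for x in s.split(":")]
        p = [0] * (4 - len(p)) + p
        return timedelta(days=p[0], hours=p[1], minutes=p[2], seconds=p[3])

    # Job 94482: t_remain + t_used should equal the requested walltime t_req.
    assert td("1:03:13:00") + td("1:20:47:00") == td("3:00:00:00")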