Batch Status

Summary

Last updated: 23:56:02 07.12.2025

71 active nodes (26 used, 45 free)

6752 hw threads (3136 used, 3616 free)

13 running jobs, 51712:00:00 remaining core hours

15 waiting jobs, 11520:00:00 waiting core hours
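
The core-hour totals above appear to be the sum of #proc × t_req over the listed jobs: 5·512·4 h + 128·72 h + 7·64·72 h = 51712 for the running set, and 5·512·4 h + 10·256·0.5 h = 11520 for the waiting set. A minimal sketch of that arithmetic (an assumption inferred from the numbers, not a documented formula; duration strings are taken to be colon-separated with seconds as the smallest unit, left-padded up to D:H:MM:SS):

```python
def to_hours(t: str) -> float:
    """Convert a duration string like "30:00", "4:00:00", or
    "3:00:00:00" (days:hours:minutes:seconds, left-padded) to hours."""
    parts = [int(p) for p in t.split(":")]
    d, h, m, s = [0] * (4 - len(parts)) + parts  # pad missing leading fields
    return d * 24 + h + m / 60 + s / 3600

def core_hours(jobs):
    """Sum of #proc * t_req over (procs, t_req) pairs."""
    return sum(procs * to_hours(t_req) for procs, t_req in jobs)

# Running jobs as listed below: 5x 512-proc 4h, 1x 128-proc 3d, 7x 64-proc 3d.
running = [(512, "4:00:00")] * 5 + [(128, "3:00:00:00")] + [(64, "3:00:00:00")] * 7
print(core_hours(running))  # → 51712.0
```

The same formula reproduces the waiting figure: `core_hours([(512, "4:00:00")] * 5 + [(256, "30:00")] * 10)` gives 11520.0.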

Nodes

Running Jobs (13)

| job | queue | user | #proc | #nodes | ppn | gpn | vmem_req | vmem_used | t_remain | t_req | t_used | started | jobname | hosts |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 881030 | hpc1 | ebueck3s | 512 | 4 | 128 | 0 | 360 GB | 146 GB | 3:48:16 | 4:00:00 | 11:44 | 23:44:18 | BRSM_1_OF_(yaw=0_vel=50) | wr50,wr51,wr52,wr53 |
| 881032 | hpc1 | ebueck3s | 512 | 4 | 128 | 0 | 360 GB | 147 GB | 3:48:25 | 4:00:00 | 11:35 | 23:44:27 | BRSM_1_OF_(yaw=0_vel=50) | wr54,wr55,wr56,wr59 |
| 881034 | hpc1 | ebueck3s | 512 | 4 | 128 | 0 | 360 GB | 145 GB | 3:48:32 | 4:00:00 | 11:28 | 23:44:34 | BRSM_1_OF_(yaw=0_vel=50) | wr60,wr61,wr62,wr63 |
| 881036 | hpc1 | ebueck3s | 512 | 4 | 128 | 0 | 360 GB | 146 GB | 3:48:38 | 4:00:00 | 11:22 | 23:44:40 | BRSM_1_OF_(yaw=0_vel=50) | wr64,wr65,wr66,wr67 |
| 881038 | hpc1 | ebueck3s | 512 | 4 | 128 | 0 | 360 GB | 144 GB | 3:48:47 | 4:00:00 | 11:13 | 23:44:49 | BRSM_1_OF_(yaw=0_vel=50) | wr68,wr69,wr70,wr71 |
| 880688 | any | dgromm3m | 128 | 1 | 128 | 0 | 128 GB | 69 GB | 12:07:53 | 3:00:00:00 | 2:11:52:07 | 05.12.2025 12:03:55 | start_mpi.sh | wr44 |
| 880737 | any | dgromm3m | 64 | 1 | 64 | 0 | 128 GB | 70 GB | 15:59:55 | 3:00:00:00 | 2:08:00:05 | 05.12.2025 15:55:57 | start_mpi.sh | wr57 |
| 880738 | any | dgromm3m | 64 | 1 | 64 | 0 | 128 GB | 71 GB | 15:59:55 | 3:00:00:00 | 2:08:00:05 | 05.12.2025 15:55:57 | start_mpi.sh | wr58 |
| 880739 | any | dgromm3m | 64 | 1 | 64 | 0 | 128 GB | 70 GB | 15:59:55 | 3:00:00:00 | 2:08:00:05 | 05.12.2025 15:55:57 | start_mpi.sh | wr58 |
| 880992 | gpu4 | mmensi2s | 64 | 1 | 64 | 1 | 0 B | 141 GB | 2:04:43:06 | 3:00:00:00 | 19:16:54 | 4:39:08 | cgan-unet-10x10_multiclass_cgan_unet_csv | wr22 |
| 880993 | gpu4 | mmensi2s | 64 | 1 | 64 | 1 | 0 B | 141 GB | 2:04:56:24 | 3:00:00:00 | 19:03:36 | 4:52:26 | cgan-unet-10x10_multiclass_cgan_unet_csv | wr22 |
| 880995 | gpu | mmensi2s | 64 | 1 | 64 | 1 | 0 B | 177 GB | 2:04:56:32 | 3:00:00:00 | 19:03:28 | 4:52:34 | cgan-10x10_multiclass_cgan_csv | wr17 |
| 880996 | gpu | mmensi2s | 64 | 1 | 64 | 1 | 0 B | 151 GB | 2:04:56:33 | 3:00:00:00 | 19:03:27 | 4:52:35 | cgan-unet-10x10_multiclass_cgan_unet_csv | wr18 |
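
In the running-job table, t_remain is evidently t_req minus t_used (e.g. 4:00:00 − 11:44 = 3:48:16 for job 881030, where "11:44" reads as 11 min 44 s since seconds are the smallest unit). A quick check of that relationship, assuming the same left-padded duration format:

```python
from datetime import timedelta

def to_td(t: str) -> timedelta:
    """Parse "11:44", "3:48:16", or "2:11:52:07" (left-padded
    days:hours:minutes:seconds) into a timedelta."""
    parts = [int(p) for p in t.split(":")]
    d, h, m, s = [0] * (4 - len(parts)) + parts
    return timedelta(days=d, hours=h, minutes=m, seconds=s)

# Job 881030: requested walltime minus elapsed time gives the remaining time.
print(to_td("4:00:00") - to_td("11:44"))  # → 3:48:16
```

The same holds for the multi-day jobs, e.g. job 880688: 3:00:00:00 − 2:11:52:07 = 12:07:53.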

Waiting/Blocked Jobs (15)

| job | queue | user | state | #proc | #nodes | ppn | gpn | vmem | t_req | prio | enqueued | waiting | jobname | wait reason |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 881039 | hpc1 | ebueck3s | PD | 512 | 4 | 128 | 0 | 360 GB | 4:00:00 | 80794 | 23:44:49 | 11:13 | BRSM_2_OF_(yaw=0_vel=50)_dependent | (Dependency) |
| 881037 | hpc1 | ebueck3s | PD | 512 | 4 | 128 | 0 | 360 GB | 4:00:00 | 80794 | 23:44:40 | 11:22 | BRSM_2_OF_(yaw=0_vel=50)_dependent | (Dependency) |
| 881035 | hpc1 | ebueck3s | PD | 512 | 4 | 128 | 0 | 360 GB | 4:00:00 | 80794 | 23:44:34 | 11:28 | BRSM_2_OF_(yaw=0_vel=50)_dependent | (Dependency) |
| 881033 | hpc1 | ebueck3s | PD | 512 | 4 | 128 | 0 | 360 GB | 4:00:00 | 80794 | 23:44:27 | 11:35 | BRSM_2_OF_(yaw=0_vel=50)_dependent | (Dependency) |
| 881031 | hpc1 | ebueck3s | PD | 512 | 4 | 128 | 0 | 360 GB | 4:00:00 | 80794 | 23:44:18 | 11:44 | BRSM_2_OF_(yaw=0_vel=50)_dependent | (Dependency) |
| 880855 | wr44 | pgroeg2s | PD | 256 | 1 | 256 | 0 | 16 GB | 30:00 | 53011 | 06.12.2025 14:19:53 | 1:09:36:09 | job_loopsched.sh | (Resources) |
| 880845 | wr44 | lauer2s | PD | 256 | 1 | 256 | 0 | 16 GB | 30:00 | 51796 | 06.12.2025 13:32:33 | 1:10:23:29 | job_loopsched.sh | (Priority) |
| 880843 | wr44 | mstemp2s | PD | 256 | 1 | 256 | 0 | 16 GB | 30:00 | 50984 | 06.12.2025 0:18:44 | 1:23:37:18 | job_loopsched.sh | (Priority) |
| 880859 | wr44 | fhoepf2s | PD | 256 | 1 | 256 | 0 | 16 GB | 30:00 | 50175 | 06.12.2025 20:26:38 | 1:03:29:24 | job_loopsched.sh | (Priority) |
| 880860 | wr44 | jitter2s | PD | 256 | 1 | 256 | 0 | 16 GB | 30:00 | 49324 | 06.12.2025 21:47:53 | 1:02:08:09 | job_loopsched.sh | (Priority) |
| 880798 | wr44 | aschal2s | PD | 256 | 1 | 256 | 0 | 16 GB | 30:00 | 47334 | 05.12.2025 17:13:00 | 2:06:43:02 | job_loopsched.sh | (Priority) |
| 881007 | wr44 | lkraus2s | PD | 256 | 1 | 256 | 0 | 16 GB | 30:00 | 46280 | 14:57:07 | 8:58:55 | job_loopsched.sh | (Priority) |
| 881016 | wr44 | njabra2s | PD | 256 | 1 | 256 | 0 | 16 GB | 30:00 | 42302 | 19:45:12 | 4:10:50 | job_loopsched.sh | (Priority) |
| 881008 | wr44 | jmeyer2s | PD | 256 | 1 | 256 | 0 | 16 GB | 30:00 | 42294 | 15:02:27 | 8:53:35 | job_loopsched.sh | (Priority) |
| 881009 | wr44 | jmeyer2s | PD | 256 | 1 | 256 | 0 | 16 GB | 30:00 | 42276 | 15:10:07 | 8:45:55 | job_loopsched.sh | (Priority) |