Batch Status

Summary

Last updated: 13:35:04 20.05.2022

65 active nodes (53 used, 12 free)

4760 hardware threads (3412 used, 1348 free)

40 running jobs, 144320:00:00 remaining core hours

2 waiting jobs, - waiting core hours
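
The remaining core-hours figure appears to be the sum of #proc × t_req over all running jobs (rather than #proc × t_remain): summing the requested walltimes from the running-jobs table below gives exactly 144320 hours. A minimal sketch of that calculation, for illustration only; the helper name and the abridged job list are not part of the page generator:

    def duration_to_hours(s):
        """Parse the tables' '[[d:]h:]m:s' duration format into hours."""
        parts = [int(p) for p in s.split(":")]
        while len(parts) < 4:            # left-pad missing fields with zeros
            parts.insert(0, 0)
        d, h, m, sec = parts
        return d * 24 + h + m / 60 + sec / 3600

    # (#proc, t_req) pairs from the running-jobs table, grouped for brevity
    running  = [(256, "1:00:00"), (16, "6:00:00")]   # 477022, 477019
    running += [(256, "8:00:00")] * 5                # 477010-477014 (BRSM_OF)
    running += [(64, "3:00:00:00")] * 28             # dgromm3m start_mpi.sh jobs
    running += [(16, "2:12:00:00")]                  # 476954
    running += [(16, "3:00:00:00")] * 3              # 476941, 476976, 476977
    running += [(4, "3:00:00:00")]                   # 476995

    print(sum(n * duration_to_hours(t) for n, t in running))   # 144320.0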

Nodes


Running Jobs (40)

job queue user #proc #nodes ppn vmem_req vmem_used t_remain t_req t_used started jobname hosts
477022 wr44 aschal2s 256 1 256 30GB 6GB 59:38 1:00:00 0:22 13:34:39 job_vector.sh wr44
477019 any schaar3m 16 1 16 16GB 7GB 5:12:58 6:00:00 47:02 12:47:59 EffNetv2_job_labels_partition.sh wr14
477010 hpc3 sbobad3s 256 4 64 180GB 36GB 5:59:08 8:00:00 2:00:52 11:34:09 BRSM_OF_(yaw=0_vel=50) wr76,wr79,wr80,wr81
477011 hpc3 sbobad3s 256 4 64 180GB 35GB 5:59:11 8:00:00 2:00:49 11:34:12 BRSM_OF_(yaw=0_vel=50) wr82,wr83,wr84,wr85
477012 hpc3 sbobad3s 256 4 64 180GB 37GB 5:59:15 8:00:00 2:00:45 11:34:16 BRSM_OF_(yaw=0_vel=50) wr86,wr87,wr88,wr89
477013 hpc3 sbobad3s 256 4 64 180GB 36GB 5:59:15 8:00:00 2:00:45 11:34:16 BRSM_OF_(yaw=0_vel=50) wr90,wr91,wr92,wr93
477014 hpc3 sbobad3s 256 4 64 180GB 38GB 5:59:18 8:00:00 2:00:42 11:34:19 BRSM_OF_(yaw=0_vel=50) wr94,wr95,wr96,wr97
476954 gpu sthodu2m 16 1 16 40GB 32GB 14:26:00 2:12:00:00 1:21:34:00 18.05.2022 16:01:01 resnet_vt1.sh wr16
476913 hpc dgromm3m 64 1 64 96GB 251GB 1:01:31:40 3:00:00:00 1:22:28:20 18.05.2022 15:06:41 start_mpi.sh wr51
476914 hpc dgromm3m 64 1 64 96GB 251GB 1:01:32:13 3:00:00:00 1:22:27:47 18.05.2022 15:07:14 start_mpi.sh wr52
476916 hpc dgromm3m 64 1 64 96GB 251GB 1:01:33:40 3:00:00:00 1:22:26:20 18.05.2022 15:08:41 start_mpi.sh wr61
476917 hpc dgromm3m 64 1 64 96GB 251GB 1:01:34:35 3:00:00:00 1:22:25:25 18.05.2022 15:09:36 start_mpi.sh wr62
476919 hpc dgromm3m 64 1 64 96GB 251GB 1:01:37:11 3:00:00:00 1:22:22:49 18.05.2022 15:12:12 start_mpi.sh wr56
476920 hpc dgromm3m 64 1 64 96GB 251GB 1:01:37:56 3:00:00:00 1:22:22:04 18.05.2022 15:12:57 start_mpi.sh wr57
476923 hpc dgromm3m 64 1 64 96GB 251GB 1:01:42:44 3:00:00:00 1:22:17:16 18.05.2022 15:17:45 start_mpi.sh wr50
476924 hpc dgromm3m 64 1 64 96GB 251GB 1:01:43:22 3:00:00:00 1:22:16:38 18.05.2022 15:18:23 start_mpi.sh wr53
476925 hpc dgromm3m 64 1 64 96GB 251GB 1:01:43:43 3:00:00:00 1:22:16:17 18.05.2022 15:18:44 start_mpi.sh wr54
476926 hpc dgromm3m 64 1 64 96GB 253GB 1:01:45:08 3:00:00:00 1:22:14:52 18.05.2022 15:20:09 start_mpi.sh wr55
476927 hpc dgromm3m 64 1 64 96GB 126GB 1:01:46:06 3:00:00:00 1:22:13:54 18.05.2022 15:21:07 start_mpi.sh wr58
476928 hpc dgromm3m 64 1 64 96GB 126GB 1:01:46:55 3:00:00:00 1:22:13:05 18.05.2022 15:21:56 start_mpi.sh wr59
476929 hpc dgromm3m 64 1 64 96GB 126GB 1:01:47:29 3:00:00:00 1:22:12:31 18.05.2022 15:22:30 start_mpi.sh wr60
476930 hpc dgromm3m 64 1 64 96GB 251GB 1:01:48:17 3:00:00:00 1:22:11:43 18.05.2022 15:23:18 start_mpi.sh wr63
476931 hpc dgromm3m 64 1 64 96GB 251GB 1:01:48:46 3:00:00:00 1:22:11:14 18.05.2022 15:23:47 start_mpi.sh wr64
476932 hpc dgromm3m 64 1 64 96GB 251GB 1:01:49:29 3:00:00:00 1:22:10:31 18.05.2022 15:24:30 start_mpi.sh wr65
476933 hpc dgromm3m 64 1 64 96GB 251GB 1:01:55:23 3:00:00:00 1:22:04:37 18.05.2022 15:30:24 start_mpi.sh wr66
476934 hpc dgromm3m 64 1 64 96GB 251GB 1:01:56:42 3:00:00:00 1:22:03:18 18.05.2022 15:31:43 start_mpi.sh wr67
476935 hpc dgromm3m 64 1 64 96GB 251GB 1:01:57:34 3:00:00:00 1:22:02:26 18.05.2022 15:32:35 start_mpi.sh wr68
476938 hpc dgromm3m 64 1 64 96GB 251GB 1:02:02:47 3:00:00:00 1:21:57:13 18.05.2022 15:37:48 start_mpi.sh wr69
476939 hpc dgromm3m 64 1 64 96GB 251GB 1:02:03:31 3:00:00:00 1:21:56:29 18.05.2022 15:38:32 start_mpi.sh wr70
476940 hpc dgromm3m 64 1 64 96GB 251GB 1:02:04:09 3:00:00:00 1:21:55:51 18.05.2022 15:39:10 start_mpi.sh wr71
476941 gpu sthodu2m 16 1 16 40GB 32GB 1:02:06:57 3:00:00:00 1:21:53:03 18.05.2022 15:41:58 resnet_vt7.sh wr12
476945 hpc dgromm3m 64 1 64 96GB 251GB 1:02:14:08 3:00:00:00 1:21:45:52 18.05.2022 15:49:09 start_mpi.sh wr72
476948 hpc dgromm3m 64 1 64 96GB 251GB 1:02:15:27 3:00:00:00 1:21:44:33 18.05.2022 15:50:28 start_mpi.sh wr77
476949 hpc dgromm3m 64 1 64 96GB 251GB 1:02:16:15 3:00:00:00 1:21:43:45 18.05.2022 15:51:16 start_mpi.sh wr78
476957 hpc dgromm3m 64 1 64 96GB 251GB 1:02:26:19 3:00:00:00 1:21:33:41 18.05.2022 16:01:20 start_mpi.sh wr73
476960 hpc dgromm3m 64 1 64 96GB 251GB 1:02:30:38 3:00:00:00 1:21:29:22 18.05.2022 16:05:39 start_mpi.sh wr74
476961 hpc dgromm3m 64 1 64 96GB 251GB 1:02:31:09 3:00:00:00 1:21:28:51 18.05.2022 16:06:10 start_mpi.sh wr75
476976 gpu sthodu2m 16 1 16 40GB 32GB 1:20:34:52 3:00:00:00 1:03:25:08 19.05.2022 10:09:53 resnet_vtp1.sh wr12
476977 gpu sthodu2m 16 1 16 40GB 48GB 1:20:38:05 3:00:00:00 1:03:21:55 19.05.2022 10:13:06 resnetv1.sh wr12
476995 gpu4test mbedru3s 4 1 4 490GB 513GB 2:08:05:30 3:00:00:00 15:54:30 19.05.2022 21:40:31 job.sh wr20
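
In every row the three time columns satisfy t_used + t_remain = t_req, i.e. t_remain is the unused part of the requested walltime, not a runtime prediction. A small self-contained check on two sample rows (helper name and sample selection are illustrative only):

    def to_seconds(s):
        """Convert '[[d:]h:]m:s' durations into seconds."""
        parts = [int(p) for p in s.split(":")]
        while len(parts) < 4:
            parts.insert(0, 0)
        d, h, m, sec = parts
        return ((d * 24 + h) * 60 + m) * 60 + sec

    # (t_remain, t_req, t_used) taken from jobs 477022 and 476954
    for t_remain, t_req, t_used in [("59:38", "1:00:00", "0:22"),
                                    ("14:26:00", "2:12:00:00", "1:21:34:00")]:
        assert to_seconds(t_used) + to_seconds(t_remain) == to_seconds(t_req)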

Waiting/Blocked Jobs (2)

job queue user state #proc #nodes ppn vmem t_req prio enqueued waiting jobname wait_reason
477015 hpc3 sbobad3s PD 128 4 32 180GB 8:00:00 7295 11:34:19 2:00:42 BRSM_OF_(yaw=0_vel=50) (Resources)
477016 hpc3 sbobad3s PD 128 4 32 180GB 8:00:00 7294 11:34:20 2:00:41 BRSM_OF_(yaw=0_vel=50) (Priority)
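
The state PD and the reasons (Resources) and (Priority) follow SLURM conventions: the highest-priority pending job, 477015 (prio 7295), is waiting for free resources, while 477016 (prio 7294) is queued behind it and therefore reports (Priority). A deliberately simplified toy model of how those labels arise, assuming strict priority order and ignoring backfill:

    # Toy model only: label pending jobs with the reason a priority scheduler
    # would report, assuming strict priority order and no backfill.
    pending = [(477015, 7295), (477016, 7294)]          # (job id, prio)
    pending.sort(key=lambda job: job[1], reverse=True)

    reasons = {pending[0][0]: "Resources"}              # front of the queue: waits for nodes
    reasons.update({jid: "Priority" for jid, _ in pending[1:]})  # everything behind it
    print(reasons)   # {477015: 'Resources', 477016: 'Priority'}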