Batch Status

Summary

last updated: 14.03.2026 07:29:05

71 active nodes (35 used, 36 free)

6752 hw threads (3288 used, 3464 free)

38 running jobs, 171840:00:00 remaining core hours

1 waiting job, 6144:00:00 waiting core hours
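The core-hour figures in this summary follow directly from the job tables below: each is a sum of #proc × duration over the relevant jobs. A minimal sketch of that computation, assuming the [[d:]hh:]mm:ss duration format used throughout the tables (the helper names are illustrative, not part of the batch system):

```python
# Sketch: reproduce the summary's core-hour figures from job rows.
# Assumed duration format: [[d:]hh:]mm:ss, as in the tables below.

def to_hours(dur: str) -> float:
    """Convert a d:hh:mm:ss style duration to hours."""
    parts = [int(p) for p in dur.split(":")]
    while len(parts) < 4:        # pad missing leading fields (days, hours)
        parts.insert(0, 0)
    d, h, m, s = parts
    return d * 24 + h + m / 60 + s / 3600

def core_hours(jobs: list[tuple[int, str]]) -> float:
    """Sum #proc * duration over (nproc, duration) pairs."""
    return sum(n * to_hours(t) for n, t in jobs)

# The single waiting job requests 128 procs for 2 days:
print(core_hours([(128, "2:00:00:00")]))  # 6144.0 -> matches the summary
```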

Nodes


Running Jobs (38)

job queue user #proc #nodes ppn gpn vmem_req vmem_used t_remain t_req t_used started jobname hosts
133811 hpc1 ipolat2s 128 1 128 1 256 GB 93 GB 3:32:34 12:00:00 8:27:26 13.03.2026 23:01:36 cml_parallel wr64
133815 any vvicto2s 8 1 8 1 32 GB 124 GB 17:47:25 1:00:00:00 6:12:35 1:16:27 pointcnn_modelnet wr15
133687 any dgromm3m 64 1 64 1 64 GB 55 GB 1:09:39:24 3:00:00:00 1:14:20:36 12.03.2026 17:08:26 start_mpi.sh wr50
133688 any dgromm3m 64 1 64 1 64 GB 55 GB 1:09:40:43 3:00:00:00 1:14:19:17 12.03.2026 17:09:45 start_mpi.sh wr50
133689 any dgromm3m 64 1 64 1 64 GB 56 GB 1:09:41:31 3:00:00:00 1:14:18:29 12.03.2026 17:10:33 start_mpi.sh wr51
133690 any dgromm3m 64 1 64 1 128 GB 55 GB 1:09:42:25 3:00:00:00 1:14:17:35 12.03.2026 17:11:27 start_mpi.sh wr51
133691 any dgromm3m 64 1 64 1 64 GB 56 GB 1:09:43:00 3:00:00:00 1:14:17:00 12.03.2026 17:12:02 start_mpi.sh wr52
133694 any dgromm3m 64 1 64 1 128 GB 56 GB 1:09:47:27 3:00:00:00 1:14:12:33 12.03.2026 17:16:29 start_mpi.sh wr52
133695 any dgromm3m 64 1 64 1 64 GB 56 GB 1:09:47:51 3:00:00:00 1:14:12:09 12.03.2026 17:16:53 start_mpi.sh wr53
133696 any dgromm3m 64 1 64 1 64 GB 56 GB 1:09:49:22 3:00:00:00 1:14:10:38 12.03.2026 17:18:24 start_mpi.sh wr53
133697 any dgromm3m 64 1 64 1 64 GB 54 GB 1:09:50:40 3:00:00:00 1:14:09:20 12.03.2026 17:19:42 start_mpi.sh wr54
133698 any dgromm3m 64 1 64 1 64 GB 55 GB 1:09:53:17 3:00:00:00 1:14:06:43 12.03.2026 17:22:19 start_mpi.sh wr54
133699 any dgromm3m 64 1 64 1 64 GB 55 GB 1:09:54:26 3:00:00:00 1:14:05:34 12.03.2026 17:23:28 start_mpi.sh wr55
133700 any dgromm3m 64 1 64 1 64 GB 56 GB 1:09:55:03 3:00:00:00 1:14:04:57 12.03.2026 17:24:05 start_mpi.sh wr55
133701 any dgromm3m 64 1 64 1 64 GB 56 GB 1:09:55:17 3:00:00:00 1:14:04:43 12.03.2026 17:24:19 start_mpi.sh wr56
133702 any dgromm3m 64 1 64 1 64 GB 55 GB 1:09:55:36 3:00:00:00 1:14:04:24 12.03.2026 17:24:38 start_mpi.sh wr56
133703 any dgromm3m 64 1 64 1 64 GB 55 GB 1:09:55:59 3:00:00:00 1:14:04:01 12.03.2026 17:25:01 start_mpi.sh wr57
133704 any dgromm3m 64 1 64 1 64 GB 54 GB 1:09:57:00 3:00:00:00 1:14:03:00 12.03.2026 17:26:02 start_mpi.sh wr57
133705 any dgromm3m 64 1 64 1 64 GB 55 GB 1:09:57:41 3:00:00:00 1:14:02:19 12.03.2026 17:26:43 start_mpi.sh wr58
133706 any dgromm3m 128 1 128 1 128 GB 137 GB 1:10:02:10 3:00:00:00 1:13:57:50 12.03.2026 17:31:12 start_mpi.sh wr59
133707 any dgromm3m 128 1 128 1 128 GB 135 GB 1:10:07:03 3:00:00:00 1:13:52:57 12.03.2026 17:36:05 start_mpi.sh wr60
133708 any dgromm3m 128 1 128 1 128 GB 138 GB 1:10:08:35 3:00:00:00 1:13:51:25 12.03.2026 17:37:37 start_mpi.sh wr61
133709 any dgromm3m 128 1 128 1 128 GB 139 GB 1:10:10:04 3:00:00:00 1:13:49:56 12.03.2026 17:39:06 start_mpi.sh wr62
133805 gpu4 ipolat2s 128 1 128 4 400 GB 437 GB 1:21:09:49 2:00:00:00 2:50:11 4:38:51 bai_opt_13 wr24
133806 gpu4 ipolat2s 128 1 128 4 400 GB 436 GB 1:21:10:49 2:00:00:00 2:49:11 4:39:51 bai_opt_14 wr25
133807 gpu4 ipolat2s 128 1 128 4 400 GB 436 GB 1:21:35:53 2:00:00:00 2:24:07 5:04:55 bai_opt_15 wr23
133808 gpu4 ipolat2s 128 1 128 4 400 GB 437 GB 1:22:05:07 2:00:00:00 1:54:53 5:34:09 bai_opt_16 wr22
133809 gpu4 ipolat2s 128 1 128 4 400 GB 436 GB 1:22:07:22 2:00:00:00 1:52:38 5:36:24 bai_opt_17 wr21
133749 any dgromm3m 64 1 64 1 64 GB 49 GB 2:05:16:05 3:00:00:00 18:43:55 13.03.2026 12:45:07 start_mpi.sh wr63
133753 any dgromm3m 64 1 64 1 64 GB 49 GB 2:05:20:12 3:00:00:00 18:39:48 13.03.2026 12:49:14 start_mpi.sh wr65
133754 hpc3 hfataf3m 32 1 32 1 40 GB 4 GB 2:06:50:03 3:00:00:00 17:09:57 13.03.2026 14:19:05 S1_chunk wr75
133758 hpc3 hfataf3m 32 1 32 1 40 GB 4 GB 2:06:50:51 3:00:00:00 17:09:09 13.03.2026 14:19:53 S2_chunk wr78
133762 hpc3 hfataf3m 32 1 32 1 40 GB 4 GB 2:06:51:24 3:00:00:00 17:08:36 13.03.2026 14:20:26 S3_chunk wr80
133766 hpc3 hfataf3m 32 1 32 1 40 GB 4 GB 2:06:51:59 3:00:00:00 17:08:01 13.03.2026 14:21:01 S4_chunk wr82
133770 hpc3 hfataf3m 32 1 32 1 40 GB 4 GB 2:06:52:43 3:00:00:00 17:07:17 13.03.2026 14:21:45 S5_chunk wr84
133774 hpc3 hfataf3m 32 1 32 1 40 GB 4 GB 2:06:53:18 3:00:00:00 17:06:42 13.03.2026 14:22:20 S6_chunk wr85
133799 gpu4 smoses2s 8 1 8 1 32 GB 278 GB 2:12:17:41 3:00:00:00 11:42:19 13.03.2026 19:46:43 ulr2ss_training5_joint_off_bs16_gpu4 wr20
133817 gpu4 smoses2s 8 1 8 1 32 GB 208 GB 2:21:10:14 3:00:00:00 2:49:46 4:39:16 ulr2ss_training4_joint_off_bs16_gpu4 wr20
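The three time columns of each running job are consistent with one another: t_used plus t_remain equals t_req. A quick check on the first row (job 133811), again assuming the [[d:]hh:]mm:ss duration format:

```python
# Sketch: verify the per-job invariant t_used + t_remain == t_req
# for a row of the running-jobs table (format [[d:]hh:]mm:ss).

def to_seconds(dur: str) -> int:
    parts = [int(p) for p in dur.split(":")]
    while len(parts) < 4:        # pad missing leading fields (days, hours)
        parts.insert(0, 0)
    d, h, m, s = parts
    return ((d * 24 + h) * 60 + m) * 60 + s

# job 133811: t_remain, t_req, t_used from the first table row
t_remain, t_req, t_used = "3:32:34", "12:00:00", "8:27:26"
assert to_seconds(t_used) + to_seconds(t_remain) == to_seconds(t_req)
```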

Waiting/Blocked Jobs (1)

job queue user state #proc #nodes ppn gpn vmem t_req prio enqueued waiting jobname wait reason
133810 gpu4 ipolat2s PD 128 1 128 4 400 GB 2:00:00:00 83116 13.03.2026 21:00:03 10:28:59 bai_opt_18 (Resources)
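The waiting column is simply the elapsed time between the job's enqueue timestamp and the moment the page was generated. A sketch using the dd.mm.yyyy hh:mm:ss timestamp format from the tables; the result differs from the listed 10:28:59 by a few seconds, presumably because the page timestamp is taken slightly after the per-job values:

```python
from datetime import datetime

# Sketch: waiting time = page timestamp - enqueue timestamp
# for waiting job 133810, using the page's dd.mm.yyyy hh:mm:ss format.
fmt = "%d.%m.%Y %H:%M:%S"
updated  = datetime.strptime("14.03.2026 07:29:05", fmt)
enqueued = datetime.strptime("13.03.2026 21:00:03", fmt)
print(updated - enqueued)  # 10:29:02, within seconds of the listed 10:28:59
```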