Table 1 Performance and computational complexity of SSBlazer and SSBlazer-LM. Each model was trained on 1 \(\times\) NVIDIA A100 GPU with 80 GB memory and a batch size of 2048. Training time refers to the backpropagation time over the training set of the S1 END-seq dataset (imbalance ratio = 1; number of samples = 231,500), and inference time refers to the forward-propagation time during testing (number of samples = 22,044)