Table 1 Performance and computational complexity of SSBlazer and SSBlazer-LM. Each model was trained on 1 \(\times\) NVIDIA A100 GPU (80 GB memory) with a batch size of 2048. Training time is the backpropagation time over the training set of the S1 END-seq dataset (imbalance ratio = 1; 231,500 samples); inference time is the forward-propagation time during testing (22,044 samples)

From: SSBlazer: a genome-wide nucleotide-resolution model for predicting single-strand break sites

| Model | AUROC | AUPRC | MACs | Params | Training (s) | Inference (s) |
| --- | --- | --- | --- | --- | --- | --- |
| SSBlazer-LM | 0.9661 | 0.9634 | 43661.70G | 110M | 422 | 29 |
| SSBlazer | 0.9626 | 0.9621 | 332.77G | 1.9M | 30 | 8 |
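As an illustration of the AUROC metric reported above, the sketch below (not the paper's code; toy labels and scores are assumptions) computes AUROC as the probability that a positive example is scored above a negative one:

```python
# Hypothetical sketch of AUROC, not SSBlazer's evaluation code.
def auroc(y_true, y_score):
    """AUROC = P(score of a positive > score of a negative), ties count 0.5."""
    pos = [s for y, s in zip(y_true, y_score) if y == 1]
    neg = [s for y, s in zip(y_true, y_score) if y == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy labels/scores (assumption, for illustration only)
print(auroc([0, 0, 1, 1], [0.1, 0.4, 0.35, 0.8]))  # → 0.75
```

A perfect ranker scores 1.0; a random one about 0.5, so the ~0.96 values in Table 1 indicate near-perfect separation of break and non-break sites.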