
Fig. 2

From: Meta-analysis of (single-cell method) benchmarks reveals the need for extensibility and interoperability

Code/data availability, reproducibility and technical aspects of 62 single-cell method benchmarks. A Each column of the heatmap represents a benchmark study and each row represents a factual question; responses are colour-coded (Yes: blue; Partially: orange; Not Applicable: white; No: red). Not Applicable corresponds to benchmarks that did not use simulated data ("synthetic data is available" row) and to a benchmark that evaluated secondary measures only ("performance results available" row). Here, "results available" refers to computational methods run on the datasets, whereas "performance results" refers to the results that are compared to a ground truth. B Type of workflow system used (benchmarks with no workflow system or no code available are shown in red, otherwise grey). C Reviewers' opinions on the availability and extensibility of benchmarking code. Jitter is added to the X and Y axes of the scores. D Licence specification across benchmarking studies (benchmarks without a licence or with no code available are shown in red, otherwise grey).
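As a purely illustrative sketch (not the authors' code), the categorical colour encoding described for panel A could be reproduced with a small heatmap; the response matrix, study labels, and integer codes below are assumptions made up for the example.

```python
import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import ListedColormap

# Hypothetical response matrix: rows = questions, columns = benchmark studies.
# Integer codes (assumed): 0 = No, 1 = Partially, 2 = Not Applicable, 3 = Yes.
responses = np.array([
    [3, 3, 1, 0, 2],   # "code available"
    [3, 0, 3, 1, 3],   # "synthetic data is available"
    [0, 1, 3, 3, 2],   # "performance results available"
])
questions = ["code available", "synthetic data is available", "performance results available"]

# Colours follow the legend in the caption: No = red, Partially = orange,
# Not Applicable = white, Yes = blue.
cmap = ListedColormap(["red", "orange", "white", "blue"])

fig, ax = plt.subplots(figsize=(5, 2))
# vmin/vmax chosen so each integer code falls in the middle of its colour bin.
ax.imshow(responses, cmap=cmap, vmin=-0.5, vmax=3.5, aspect="auto")
ax.set_yticks(range(len(questions)))
ax.set_yticklabels(questions)
ax.set_xticks(range(responses.shape[1]))
ax.set_xticklabels([f"study {i + 1}" for i in range(responses.shape[1])])
ax.set_xlabel("benchmark study")
plt.tight_layout()
plt.show()
```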
