Table 4 Methods for performing ICA that we compared

From: Application of independent component analysis to microarrays

| Algorithm | Variations | Abbreviation | Description | Reference | Software |
|---|---|---|---|---|---|
| Natural Gradient Maximum Likelihood Estimation | - | NMLE | Natural gradient is applied to MLE for efficient learning | [28, 29] | [72] |
| Extended Information Maximization | - | ExtIM | NMLE for separating a mix of super- and sub-Gaussian sources | [32] | [73] |
| Fast Fixed-Point | Kurtosis with deflation | FP | Maximizing non-Gaussianity | [31] | [74] |
| Fast Fixed-Point | Symmetric orthogonalization | Fpsym | | | |
| Fast Fixed-Point | Tanh nonlinearity with symmetric orthogonalization | Fpsymth | | | |
| Joint Approximate Diagonalization of Eigenmatrices | - | JADE | Using higher-order cumulant tensor | [30] | [75] |
| Nonlinear ICA | Gaussian RBF kernel | NICAgauss | Kernel-based approach | [34, 37, 50] | [50] |
| Nonlinear ICA | Polynomial kernel | NICApoly | | | |
  1. The eight methods are based on five algorithms. Each method's name, variation, abbreviation, short description, references, and the software we used are listed.
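As a hedged illustration of the three fixed-point (FP) variants in the table, the contrast-function and orthogonalization choices map onto options of scikit-learn's `FastICA`; this is an assumption for demonstration only, not the software actually used in the study ([74]), and the toy data below are synthetic.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
# Toy data: 3 independent super-Gaussian sources mixed into 5 observed channels
S = rng.laplace(size=(1000, 3))   # latent sources
A = rng.normal(size=(3, 5))       # mixing matrix
X = S @ A                         # observed mixtures

# Assumed mapping of the table's FP variants onto FastICA options:
#   FP      -> kurtosis-like contrast ('cube') with deflation
#   Fpsym   -> same contrast with symmetric orthogonalization ('parallel')
#   Fpsymth -> tanh contrast ('logcosh', G'(u) = tanh u) with symmetric orthogonalization
variants = {
    "FP":      FastICA(n_components=3, fun="cube",    algorithm="deflation", random_state=0),
    "Fpsym":   FastICA(n_components=3, fun="cube",    algorithm="parallel",  random_state=0),
    "Fpsymth": FastICA(n_components=3, fun="logcosh", algorithm="parallel",  random_state=0),
}
for name, ica in variants.items():
    S_est = ica.fit_transform(X)  # estimated independent components
    print(name, S_est.shape)
```

The deflation scheme extracts one component at a time, while symmetric orthogonalization estimates all components in parallel and re-orthogonalizes the whole unmixing matrix at each iteration, which is why the table lists them as separate variants of the same algorithm.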