scikit-learn: 4.6. Kernel Approximation


The reason for using approximate explicit feature maps instead of the kernel trick is that explicit maps are better suited to online learning and can make kernel methods practical on very large datasets. Still, it is recommended, when possible, to compare approximate and exact kernel methods.

1. Nystroem Method for Kernel Approximation

The Nystroem method, as implemented in the Nystroem class, is a general method for low-rank approximations of kernels. It achieves this by essentially subsampling the data on which the kernel is evaluated. By default it uses the rbf kernel, but a custom kernel can be specified. The number of samples used – which is also the dimensionality of the features computed – is given by the parameter n_components.
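As a minimal sketch of how this looks in practice (the gamma and n_components values below are illustrative choices, not taken from the text):

```python
import numpy as np
from sklearn.kernel_approximation import Nystroem

rng = np.random.RandomState(0)
X = rng.rand(100, 4)  # 100 samples, 4 features

# Subsample 20 points as landmarks; the output therefore has 20 dimensions.
feature_map = Nystroem(kernel="rbf", gamma=0.5, n_components=20, random_state=0)
X_transformed = feature_map.fit_transform(X)
print(X_transformed.shape)  # (100, 20)
```

The transformed features can then be fed to any linear model, for example SGDClassifier.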

2. Radial Basis Function Kernel

The RBFSampler constructs an approximate mapping for the radial basis function kernel:

k(x, y) = exp(-γ ||x − y||²)

The mapping relies on a Monte Carlo approximation to the kernel values. The fit function performs the Monte Carlo sampling, whereas the transform method performs the mapping of the data. Because of the inherent randomness of the process, results may vary between different calls to fit.
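A short sketch of the fit/transform split described above (gamma and n_components are illustrative values); fixing random_state removes the run-to-run variation:

```python
import numpy as np
from sklearn.kernel_approximation import RBFSampler

rng = np.random.RandomState(0)
X = rng.rand(50, 3)

# fit draws the random weights of the Monte Carlo approximation;
# transform applies the resulting feature map to the data.
sampler = RBFSampler(gamma=1.0, n_components=100, random_state=0)
X_features = sampler.fit_transform(X)
print(X_features.shape)  # (50, 100)

# With the same random_state, refitting reproduces the same mapping.
X_features_again = RBFSampler(gamma=1.0, n_components=100,
                              random_state=0).fit_transform(X)
```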

3. Additive Chi Squared Kernel

The additive chi squared kernel as used here is given by

k(x, y) = Σ_i 2 x_i y_i / (x_i + y_i)

The AdditiveChi2Sampler class implements this component-wise deterministic sampling. Each component is sampled n times, yielding 2n+1 dimensions per input dimension (the multiple of two stems from the real and complex part of the Fourier transform). In the literature, n is usually chosen to be 1 or 2, transforming the dataset to size n_samples * 5 * n_features (in the case of n = 2).
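A small sketch of the sampler (the data here is a made-up non-negative matrix, since chi squared kernels assume histogram-like inputs; in scikit-learn the number of sampling steps is controlled by the sample_steps parameter):

```python
import numpy as np
from sklearn.kernel_approximation import AdditiveChi2Sampler

rng = np.random.RandomState(0)
X = rng.rand(10, 4)  # chi squared kernels assume non-negative input

sampler = AdditiveChi2Sampler(sample_steps=2)
X_transformed = sampler.fit_transform(X)

# The output width is a fixed multiple of the input dimensionality,
# determined by sample_steps.
print(X_transformed.shape)
```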

4. Skewed Chi Squared Kernel

The skewed chi squared kernel is given by:

k(x, y) = Π_i 2 √(x_i + c) √(y_i + c) / (x_i + y_i + 2c)
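A sketch of the corresponding scikit-learn sampler, SkewedChi2Sampler (its skewedness parameter is the kernel's skew constant, often written c; the values below are illustrative):

```python
import numpy as np
from sklearn.kernel_approximation import SkewedChi2Sampler

rng = np.random.RandomState(0)
X = rng.rand(30, 5)  # inputs must be greater than -skewedness

sampler = SkewedChi2Sampler(skewedness=1.0, n_components=50, random_state=0)
X_features = sampler.fit_transform(X)
print(X_features.shape)  # (30, 50)
```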

Okay, I admit my translation of these sections isn't great, but that's just what the content is...

Copyright notice: this is the blogger's original article and may not be reproduced without the blogger's permission.
