Learning rates for SVM classifiers with polynomial kernels

Author(s):  
Dan Wu ◽  
Feilong Cao
Author(s):  
Luoqing Li

This article considers regularized least squares regression on the sphere. It develops a theoretical analysis of the generalization performance of the regularized least squares regression algorithm with spherical polynomial kernels. Explicit bounds are derived for the excess risk. The learning rates depend on the eigenvalues of spherical polynomial integral operators and on the dimension of the spherical polynomial spaces.
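The algorithm analyzed above can be illustrated with a minimal sketch of regularized least squares in kernel form. Note the assumptions: the article studies spherical polynomial kernels, which are not reproduced here; this sketch substitutes a standard polynomial kernel, and all function names and hyperparameter values are illustrative, not taken from the article.

```python
import numpy as np

def poly_kernel(X, Z, degree=2):
    """Polynomial kernel k(x, z) = (x . z + 1)^degree (stand-in for the
    spherical polynomial kernels analyzed in the article)."""
    return (X @ Z.T + 1.0) ** degree

def fit_rls(X, y, lam=1e-3, degree=2):
    """Regularized least squares: solve (K + n*lam*I) alpha = y."""
    n = X.shape[0]
    K = poly_kernel(X, X, degree)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def predict(X_train, alpha, X_new, degree=2):
    """Evaluate the kernel expansion sum_i alpha_i k(x, x_i)."""
    return poly_kernel(X_new, X_train, degree) @ alpha

# Toy usage: recover a noisy quadratic target
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(50, 1))
y = X[:, 0] ** 2 + 0.05 * rng.normal(size=50)
alpha = fit_rls(X, y)
pred = predict(X, alpha, np.array([[0.5]]))
```

The regularization parameter `lam` plays the role of the regularizer whose choice, together with the kernel's spectral properties, drives the learning rates discussed in the abstract.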


2008 ◽  
Vol 24 (5-6) ◽  
pp. 619-631 ◽  
Author(s):  
Hongzhi Tong ◽  
Di-Rong Chen ◽  
Lizhong Peng

2016 ◽  
Vol 28 (1) ◽  
pp. 71-88 ◽  
Author(s):  
Hongzhi Tong

We present a better theoretical foundation for support vector machines with polynomial kernels. The sample error is estimated under Tsybakov's noise assumption. In bounding the approximation error, we take advantage of a geometric noise assumption that was introduced to analyze Gaussian kernels. Compared with the previous literature, the error analysis in this note does not require any regularity of the marginal distribution or smoothness of the Bayes rule. We thus establish learning rates for polynomial kernels for a wide class of distributions.
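The classifier family this note analyzes, an SVM with a polynomial kernel, can be sketched with a standard library call. This is a hedged illustration only: the dataset, the `scikit-learn` estimator, and the hyperparameter values (`degree`, `coef0`, `C`) are assumptions of this sketch, not choices made in the note.

```python
from sklearn.datasets import make_moons
from sklearn.svm import SVC

# Illustrative nonlinearly separable data (not from the note).
X, y = make_moons(n_samples=200, noise=0.1, random_state=0)

# SVM with a polynomial kernel k(x, z) = (gamma <x, z> + coef0)^degree.
clf = SVC(kernel="poly", degree=3, coef0=1.0, C=1.0)
clf.fit(X, y)
acc = clf.score(X, y)
```

The noise conditions in the abstract (Tsybakov's and the geometric noise assumption) constrain the data-generating distribution, not the training procedure, so the fitting step itself is the usual regularized hinge-loss minimization shown here.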


2019 ◽  
Author(s):  
Krisztina Sára Lukics ◽  
Ágnes Lukács

First language acquisition is facilitated by several characteristics of infant-directed speech, but we know little about their relative contribution to learning different aspects of language. We investigated the effects of infant-directed speech on the acquisition of a linear artificial grammar in two experiments. We examined the effect of incremental presentation of strings (starting small) and of prosody (comparing monotonous, arbitrary, and phrase prosody). Presenting shorter strings before longer ones led to higher learning rates than random presentation. Prosody marking phrases had a similar effect, whereas prosody that did not mark syntactic units did not facilitate learning. These studies were the first to test the starting-small effect with a linear artificial grammar, and also the first to investigate the combined effect of starting small and prosody. Our results suggest that starting small and prosody facilitate the extraction of regularities from artificial linguistic stimuli, indicating that they may play an important role in natural language acquisition.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Natasza D. Orlov ◽  
Jessica Sanderson ◽  
Syed Ali Muqtadir ◽  
Anastasia K. Kalpakidou ◽  
Panayiota G. Michalopoulou ◽  
...  
