Your selections:

- 30801 Artificial Intelligence and Image Processing
- 20906 Electrical and Electronic Engineering
- 21702 Cognitive Science

Keywords:

- Rate of convergence (2)
- Adaptation (1)
- Dimension reduction (1)
- Greedy algorithm (1)
- Gröbner basis (1)
- L error (1)
- L-2-boosting (1)
- L2 error (1)
- List decoding (1)
- Monomial (1)
- Nonparametric regression (1)
- Predictable leading (1)
- Random variables (1)
- Rational interpolation (1)
- Reed-Solomon code (1)
- Regression (1)
- Regression analysis (1)


Estimation of a regression function by maxima of minima of linear functions

- Bagirov, Adil, Clausen, Conny, Kohler, Michael

**Authors:** Bagirov, Adil; Clausen, Conny; Kohler, Michael
**Date:** 2009
**Type:** Text, Journal article
**Relation:** IEEE Transactions on Information Theory Vol. 55, no. 2 (2009), p. 833-845
**Description:** In this paper, estimation of a regression function from independent and identically distributed random variables is considered. Estimates are defined by minimization of the empirical L2 risk over a class of functions defined as maxima of minima of linear functions. Results concerning the rate of convergence of the estimates are derived. In particular, it is shown that for smooth regression functions satisfying the assumption of single index models, the estimate achieves (up to a logarithmic factor) the corresponding optimal one-dimensional rate of convergence. Hence, under these assumptions, the estimate circumvents the so-called curse of dimensionality. The small-sample behavior of the estimates is illustrated by applying them to simulated data. © 2009 IEEE.

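The function class in the abstract above, maxima of minima of linear functions fitted by empirical L2-risk minimization, can be made concrete with a minimal sketch. This is not the authors' implementation: the grouping of linear pieces and all function names here are illustrative.

```python
import numpy as np

def maxmin_eval(x, groups):
    """Evaluate f(x) = max_j min_{k in group j} (a_{jk}.x + b_{jk}).

    `groups` is a list of lists of (a, b) pairs defining linear functions.
    """
    return max(min(float(np.dot(a, x)) + b for (a, b) in grp) for grp in groups)

def empirical_l2_risk(groups, X, y):
    """Empirical L2 risk (mean squared error) of the max-min estimate."""
    preds = np.array([maxmin_eval(x, groups) for x in X])
    return float(np.mean((preds - y) ** 2))

# Two groups of linear pieces in one dimension: f(x) = max(min(x, 2-x), 0),
# a "tent" capped below at zero, a typical shape this class can represent.
groups = [[(np.array([1.0]), 0.0), (np.array([-1.0]), 2.0)],
          [(np.array([0.0]), 0.0)]]

X = np.array([[-1.0], [0.5], [1.0], [1.5], [3.0]])
y = np.array([0.0, 0.5, 1.0, 0.5, 0.0])
print(empirical_l2_risk(groups, X, y))   # 0.0: the tent fits these points exactly
```

The paper's estimator searches over such groups of linear pieces to minimize this risk; the sketch only shows evaluation of a fixed candidate.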

A Root-finding algorithm for list decoding of Reed-Muller codes

- Wu, Xinwen, Kuijper, Margreta, Udaya, Parampalli

**Authors:** Wu, Xinwen; Kuijper, Margreta; Udaya, Parampalli
**Date:** 2006
**Type:** Text, Journal article
**Relation:** IEEE Transactions on Information Theory Vol. 51, no. 3 (2006), p. 1190-1196
**Full Text:** false
**Description:** Let Fq[X1,...,Xm] denote the set of polynomials over Fq in m variables, and Fq[X1,...,Xm]≤u denote the subset consisting of the polynomials of total degree at most u. Let H(T) be a nontrivial polynomial in T with coefficients in Fq[X1,...,Xm]. A crucial step in interpolation-based list decoding of q-ary Reed-Muller (RM) codes is finding the roots of H(T) in Fq[X1,...,Xm]≤u. In this correspondence, we present an efficient root-finding algorithm that finds all the roots of H(T) in Fq[X1,...,Xm]≤u. The algorithm can be used to speed up the list decoding of RM codes.
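For intuition only, root finding over a small finite field can be done by exhaustive evaluation. The sketch below is a drastic simplification of the setting in the abstract: it takes H(T) with constant coefficients in a prime field GF(p) rather than multivariate polynomial coefficients, and brute-forces the roots. This is feasible only for tiny fields; making realistic parameters practical is exactly what the correspondence's algorithm addresses.

```python
def roots_over_gf(coeffs, p):
    """Find all roots in GF(p) of H(T) = sum_i coeffs[i] * T**i
    by exhaustive evaluation (feasible only for small fields).
    `coeffs` are integers taken modulo the prime p."""
    def h(t):
        acc = 0
        for c in reversed(coeffs):       # Horner evaluation mod p
            acc = (acc * t + c) % p
        return acc
    return [t for t in range(p) if h(t) == 0]

# H(T) = T^2 - 3T + 2 = (T - 1)(T - 2) over GF(7): roots 1 and 2.
print(roots_over_gf([2, -3, 1], 7))      # [1, 2]
```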

An L-2-Boosting Algorithm for Estimation of a Regression Function

- Bagirov, Adil, Clausen, Conny, Kohler, Michael

**Authors:** Bagirov, Adil; Clausen, Conny; Kohler, Michael
**Date:** 2010
**Type:** Text, Journal article
**Relation:** IEEE Transactions on Information Theory Vol. 56, no. 3 (2010), p. 1417-1429
**Description:** An L-2-boosting algorithm for estimation of a regression function from random design is presented. It consists of repeatedly fitting a function from a fixed nonlinear function space to the residuals of the data by least squares, and defining the estimate as a linear combination of the resulting least squares estimates. Sample splitting is used to decide after how many iterations of smoothing the residuals the algorithm terminates. The rate of convergence of the algorithm is analyzed in the case of an unbounded response variable. The method is used to fit a sum of maxima of minima of linear functions to a given data set, and is compared with other nonparametric regression estimates using simulated data.

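The generic L2-boosting loop described in the abstract, fit a base learner to the current residuals, shrink, and accumulate, can be sketched as follows. Here the base learner is a least-squares decision stump rather than the paper's class of maxima of minima of linear functions, and the shrinkage factor and iteration count are illustrative choices, not the paper's data-driven stopping rule.

```python
import numpy as np

def fit_stump(X, r):
    """Least-squares decision stump on 1-D inputs: best single threshold split."""
    best = None
    for s in np.unique(X):
        left, right = r[X <= s], r[X > s]
        if len(left) == 0 or len(right) == 0:
            continue
        pred = np.where(X <= s, left.mean(), right.mean())
        err = np.mean((r - pred) ** 2)
        if best is None or err < best[0]:
            best = (err, s, left.mean(), right.mean())
    _, s, lo, hi = best
    return lambda x: np.where(x <= s, lo, hi)

def l2_boost(X, y, n_iter=50, shrink=0.5):
    """L2-boosting: repeatedly fit the base learner to the residuals and
    return the accumulated estimate as a function."""
    learners = []
    resid = y.astype(float).copy()
    for _ in range(n_iter):
        f = fit_stump(X, resid)
        learners.append(f)
        resid -= shrink * f(X)
    return lambda x: shrink * sum(f(x) for f in learners)

rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, 200)
y = np.abs(X) + 0.05 * rng.normal(size=200)
est = l2_boost(X, y)
print(np.mean((est(X) - y) ** 2))   # small training error, well below np.var(y)
```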

A parametric approach to list decoding of Reed-Solomon codes using interpolation

- Ali, Mortuza, Kuijper, Margreta

**Authors:** Ali, Mortuza; Kuijper, Margreta
**Date:** 2011
**Type:** Text, Journal article
**Relation:** IEEE Transactions on Information Theory Vol. 57, no. 10 (2011), p. 6718-6728
**Full Text:** false
**Description:** In this paper, we present a minimal list decoding algorithm for Reed-Solomon (RS) codes. Minimal list decoding for a code refers to list decoding with radius equal to the minimum of the distances between the received word and any codeword in the code. We consider the problem of determining this minimal radius as well as determining all the codewords at that distance. Our approach involves a parametrization of the interpolating polynomials of a minimal Gröbner basis, and we present two efficient ways to compute this basis. We also show that so-called re-encoding can be used to further reduce the complexity. We then demonstrate how our parametric approach can be combined with a computationally feasible rational curve-fitting solution from a recent paper by Wu. In addition, we present an algorithm to compute the minimum multiplicity, as well as the optimal values of the parameters associated with this multiplicity, which results in overall savings in both memory and computation.
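The quantity the abstract targets, the minimal radius together with all codewords attaining it, can be illustrated by brute force on a toy RS code; the paper's Gröbner-basis parametrization is what makes this tractable at realistic code parameters. The field size, dimension, and names below are illustrative, not from the paper.

```python
from itertools import product

p, k = 5, 2                       # GF(5), messages = polynomials of degree < 2
points = [0, 1, 2, 3, 4]          # evaluation points -> a length-5 RS code

def encode(msg):
    """Evaluate the message polynomial at every point (mod p)."""
    return tuple(sum(c * pow(x, i, p) for i, c in enumerate(msg)) % p
                 for x in points)

codewords = [encode(m) for m in product(range(p), repeat=k)]

def minimal_list_decode(r):
    """Brute force: the minimal radius d(r, C) and all codewords attaining it."""
    dist = lambda c: sum(a != b for a, b in zip(r, c))
    dmin = min(dist(c) for c in codewords)
    return dmin, [c for c in codewords if dist(c) == dmin]

r = (0, 1, 2, 3, 0)               # the codeword of f(x) = x, with one error
dmin, closest = minimal_list_decode(r)
print(dmin, closest)              # 1 [(0, 1, 2, 3, 4)]
```

Since distinct codewords of this code differ in at least n - k + 1 = 4 positions, the list at radius 1 here is necessarily a single codeword.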

