# Discussion of the Paper "Additive Logistic Regression: A Statistical View of Boosting"

@inproceedings{Friedman2000DiscussionOT, title={Discussion of the Paper "Additive Logistic Regression: A Statistical View of Boosting"}, author={Jerome H. Friedman and Trevor J. Hastie and Robert Tibshirani and Yoav Freund and Robert E. Schapire}, year={2000} }

The main and important contribution of this paper is in establishing a connection between boosting, a newcomer to the statistics scene, and additive models. One of the main properties of boosting that has made it interesting to statisticians and others is its relative (but not complete) immunity to overfitting. As pointed out by the authors, the current paper does not address this issue. Leo Breiman [1] tried to explain this behaviour in terms of bias and variance. In our paper with Bartlett and…
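The boosting procedure under discussion is AdaBoost, which the paper reinterprets as stagewise fitting of an additive model. A minimal sketch of the classic algorithm (decision stumps as base learners; the exhaustive stump search and round count here are illustrative choices, not the paper's implementation):

```python
import numpy as np

def adaboost(X, y, n_rounds=20):
    """Minimal AdaBoost sketch with decision stumps; labels y in {-1, +1}."""
    n = len(y)
    w = np.full(n, 1.0 / n)                     # example weights
    stumps, alphas = [], []
    for _ in range(n_rounds):
        # exhaustive search for the best weighted threshold stump
        best = None
        for j in range(X.shape[1]):
            for thr in np.unique(X[:, j]):
                for sign in (1, -1):
                    pred = sign * np.where(X[:, j] <= thr, 1, -1)
                    err = w[pred != y].sum()
                    if best is None or err < best[0]:
                        best = (err, j, thr, sign)
        err, j, thr, sign = best
        err = max(err, 1e-12)                   # guard against log(0)
        alpha = 0.5 * np.log((1 - err) / err)   # stage weight
        pred = sign * np.where(X[:, j] <= thr, 1, -1)
        w *= np.exp(-alpha * y * pred)          # upweight misclassified examples
        w /= w.sum()
        stumps.append((j, thr, sign))
        alphas.append(alpha)

    def predict(Xq):
        F = sum(a * s * np.where(Xq[:, j] <= t, 1, -1)
                for a, (j, t, s) in zip(alphas, stumps))
        return np.sign(F)
    return predict
```

The additive-model view reads the final classifier as sign of F(x) = Σ αₘ hₘ(x), a sum of weighted base functions fit one stage at a time.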


#### 1,566 Citations

Boosting as a Regularized Path to a Maximum Margin Classifier

- Mathematics, Computer Science
- J. Mach. Learn. Res.
- 2004

It is built on recent work by Efron et al. to show that boosting approximately (and in some cases exactly) minimizes its loss criterion with an l1 constraint on the coefficient vector, and shows that as the constraint is relaxed the solution converges (in the separable case) to an "l1-optimal" separating hyper-plane.

Different Paradigms for Choosing Sequential Reweighting Algorithms

- Mathematics, Computer Science
- Neural Computation
- 2004

A very simple family of iterative reweighting algorithms that can be understood as different trade-offs between the two paradigms are derived and argued that this can allow for a suitable adaptivity to different classification problems, particularly in the presence of noise or excessive complexity of the base classifiers.

Improving Boosting by Exploiting Former Assumptions

- Computer Science
- MCD
- 2007

This study proposes a new approach and modifications carried out on the algorithm of AdaBoost, called hybrid approach, and demonstrates that it is possible to improve the performance of the Boosting, by exploiting assumptions generated with the former iterations to correct the weights of the examples.

Response to Mease and Wyner, Evidence Contrary to the Statistical View of Boosting, JMLR 9:131-156, 2008

- Mathematics
- 2008

For such a simple algorithm, it is fascinating and remarkable what a rich diversity of interpretations, views, perspectives and explanations have emerged of AdaBoost. Originally, AdaBoost was…

The Fast Convergence of Boosting

- Computer Science, Mathematics
- NIPS
- 2011

This manuscript considers the convergence rate of boosting under a large class of losses, including the exponential and logistic losses, where the best previous rate of convergence was O(exp(1/ε²)); the principal technical hurdle throughout this work is the potential unattainability of the infimal empirical risk.

Supervised projection approach for boosting classifiers

- Mathematics, Computer Science
- Pattern Recognit.
- 2009

A new approach to boosting for the construction of ensembles of classifiers, based on using the distribution given by the weighting scheme of boosting to construct a non-linear supervised projection of the original variables, instead of using the weights of the instances to train the next classifier.

Boosting with the L2-Loss: Regression and Classification

- 2002

This paper investigates a computationally simple variant of boosting, L2Boost, which is constructed from a functional gradient descent algorithm with the L2-loss function. As other boosting…
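Functional gradient descent with the L2 loss reduces to repeatedly fitting the base learner to the current residuals. A minimal sketch, assuming componentwise linear base learners on centered, nonzero columns (the learner choice, step count, and shrinkage factor are illustrative, not taken from the paper):

```python
import numpy as np

def l2boost(X, y, n_steps=200, nu=0.1):
    """Sketch of L2Boost with componentwise linear base learners.

    With the L2 loss, the negative functional gradient at the data
    is simply the residual vector, so each step fits the residuals
    by least squares on the single best coordinate and takes a
    shrunken step of size nu.
    Assumes columns of X are centered and not identically zero.
    """
    n, p = X.shape
    coef = np.zeros(p)
    intercept = y.mean()
    resid = y - intercept                       # residual = negative gradient
    for _ in range(n_steps):
        # least-squares slope of the residuals on each column
        slopes = X.T @ resid / (X ** 2).sum(axis=0)
        sse = ((resid[:, None] - X * slopes) ** 2).sum(axis=0)
        j = sse.argmin()                        # best single predictor
        coef[j] += nu * slopes[j]               # shrunken coefficient update
        resid -= nu * slopes[j] * X[:, j]       # update residuals
    return intercept, coef
```

Each iteration shrinks the residual slope on the chosen coordinate by a factor (1 − ν), so the fit approaches the least-squares solution as the number of steps grows.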

Boosting and Support Vector Machines as Optimal Separators

- Computer Science, Engineering
- IS&T/SPIE Electronic Imaging
- 2003

It is shown that boosting approximately (and in some cases exactly) minimizes its loss criterion with an L1 constraint and that as the constraint diminishes, or equivalently as the boosting iterations proceed, the solution converges in the separable case to an “L1-optimal” separating hyper-plane. Expand

Further results on the margin explanation of boosting: new algorithm and experiments

- Mathematics, Computer Science
- Science China Information Sciences
- 2012

An efficient algorithm is developed that, given a boosting classifier, learns a new voting classifier which usually has a smaller Emargin bound, and finds that the new classifier often has smaller test errors, which agrees with what the Emargin theory predicts.

A Refined Margin Analysis for Boosting Algorithms via Equilibrium Margin

- Mathematics, Computer Science
- J. Mach. Learn. Res.
- 2011

A refined analysis of the margin theory is made, which proves a bound in terms of a new margin measure called the Equilibrium margin (Emargin) which is uniformly sharper than Breiman's minimum margin bound.

#### References


Boosting the margin: A new explanation for the effectiveness of voting methods

- Mathematics, Computer Science
- ICML
- 1997

It is shown that techniques used in the analysis of Vapnik's support vector classifiers and of neural networks with small weights can be applied to voting methods to relate the margin distribution to the test error.

The Alternating Decision Tree Learning Algorithm

- Computer Science
- ICML
- 1999

A new type of classification rule, the alternating decision tree, which is a generalization of decision trees, voted decision trees and voted decision stumps and generates rules that are usually smaller in size and thus easier to interpret.

An Adaptive Version of the Boost by Majority Algorithm

- Mathematics, Computer Science
- COLT '99
- 1999

The paper describes two methods for finding approximate solutions to the differential equations: one that results in a provably polynomial time algorithm, and one based on the Newton-Raphson minimization procedure, which is much more efficient in practice but is not known to be polynomial.

Arcing classifiers

- The Annals of Statistics
- 1998

