References of "Statistics & Probability Letters"
Further comments on the representation problem for stationary processes
Laurent, Stéphane (ULg)

in Statistics & Probability Letters (2010), 80

We comment on some points about the coding of stochastic processes by sequences of independent random variables. The most interesting question concerns the standardness property of the filtration generated by the process, in the framework of Vershik's theory of filtrations. Non-standardness indicates the presence of long memory in a purely probabilistic sense. We aim to provide a short, non-technical presentation of Vershik's theory of filtrations.

Improved rank-based dependence measures for categorical data
Vandenhende, François; Lambert, Philippe (ULg)

in Statistics & Probability Letters (2003), 63

We extend rank-based dependence measures like Spearman's rho to categorical data so that the same ±1 limits are always reached under complete dependence. A goodness-of-fit procedure is derived for dependence models using copulas.
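
The motivating problem can be seen in a short sketch (an illustration only, not the authors' corrected measure; the three-category example and the scipy-based computation are assumptions made for this note): with categorical data, the classical mid-rank version of Spearman's rho stays strictly below 1 even when one variable is a deterministic monotone function of the other.

    # Sketch of the motivating issue only, not the paper's adjusted measure:
    # with mid-ranks, classical Spearman's rho does not reach 1 for
    # categorical data even under complete dependence.
    import numpy as np
    from scipy.stats import spearmanr

    rng = np.random.default_rng(0)

    # y is ordinal with three categories; x = 1{y >= 1} is completely
    # determined by y but has only two categories.
    y = rng.integers(0, 3, size=1000)
    x = (y >= 1).astype(int)

    rho, _ = spearmanr(x, y)  # ties are handled with mid-ranks
    print(f"Spearman's rho under complete dependence: {rho:.3f}")
    # prints roughly 0.87, well below the nominal limit of 1

The measures proposed in the paper adjust for exactly this, so that the ±1 limits are attained under complete dependence.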

The breakdown behavior of the maximum likelihood estimator in the logistic regression model
Croux, C.; Flandre, C.; Haesbroeck, Gentiane (ULg)

in Statistics & Probability Letters (2002), 60(4), 377-386

In this note we discuss the breakdown behavior of the maximum likelihood (ML) estimator in the logistic regression model. We formally prove that the ML estimator never explodes to infinity, but rather breaks down to zero when severe outliers are added to a data set. An example confirms this behavior.
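
The breakdown behavior is easy to reproduce numerically. The sketch below is an illustration under assumed settings (a logistic model with slope 2, ten bad leverage points at x = 20 with label 0, and a statsmodels fit), not the authors' own example.

    # Illustration of the breakdown described above: the ML slope shrinks
    # towards zero, rather than exploding, when severe outliers are added.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)

    def ml_slope(x, y):
        """ML slope of a logistic regression of y on an intercept and x."""
        fit = sm.Logit(y, sm.add_constant(x)).fit(disp=0)
        return fit.params[1]

    # Clean data from a logistic model with intercept 0.5 and slope 2.
    n = 200
    x = rng.normal(size=n)
    y = rng.binomial(1, 1.0 / (1.0 + np.exp(-(0.5 + 2.0 * x))))

    # Severe outliers: high-leverage points (x = 20) with the "wrong" label 0.
    x_out = np.concatenate([x, np.full(10, 20.0)])
    y_out = np.concatenate([y, np.zeros(10, dtype=int)])

    print(f"slope on clean data:        {ml_slope(x, y):.2f}")          # roughly 2
    print(f"slope with severe outliers: {ml_slope(x_out, y_out):.2f}")  # collapses towards 0

Placing the contaminating points at an extreme x-value with the wrong label makes them maximally damaging; the fitted slope collapses towards zero instead of diverging, which matches the behavior described in the abstract.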
