Abstract
Using Monte Carlo simulations, several methods for detecting a trend in the magnitude of extreme values are compared. Ordinary least squares regression is found to be the least reliable method. A Kendall's-tau-based method provides some improvement, although its advantage over least squares diminishes for moderate to small sample sizes. Explicitly accounting for the extreme value distribution when estimating the trend always outperforms both of the above methods. Using the r largest values as extremes enhances the power of detection for moderate values of r; larger values of r may bias the magnitude of the estimated trend.
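The comparison described above can be illustrated with a minimal Monte Carlo sketch (an assumption for illustration, not the authors' exact setup): annual maxima are simulated from a Gumbel distribution with a linear trend in the location parameter, and the trend is estimated both by ordinary least squares and by the Theil-Sen slope, the pairwise-slope estimator associated with Kendall's-tau trend tests.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_maxima(n, trend, scale=1.0):
    # Gumbel-distributed annual maxima with a linear trend in location
    loc = trend * np.arange(n)
    return loc + rng.gumbel(0.0, scale, size=n)

def ols_slope(y):
    # Ordinary least squares slope against time
    t = np.arange(len(y))
    return np.polyfit(t, y, 1)[0]

def theil_sen_slope(y):
    # Median of pairwise slopes (estimator behind Kendall's-tau trend tests)
    t = np.arange(len(y))
    i, j = np.triu_indices(len(y), k=1)
    return np.median((y[j] - y[i]) / (t[j] - t[i]))

# Hypothetical simulation settings, chosen only for the sketch
n_sim, n, true_trend = 2000, 50, 0.05
ols = np.array([ols_slope(simulate_maxima(n, true_trend)) for _ in range(n_sim)])
ts = np.array([theil_sen_slope(simulate_maxima(n, true_trend)) for _ in range(n_sim)])

print(f"OLS:       mean={ols.mean():.4f}  sd={ols.std():.4f}")
print(f"Theil-Sen: mean={ts.mean():.4f}  sd={ts.std():.4f}")
```

Comparing the sampling standard deviations of the two slope estimates over many replications is one way to quantify the relative reliability the abstract refers to; a full study would add a likelihood-based estimator under the extreme value distribution and vary r.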