bestsetNoise {DAAG}                                        R Documentation
Description

Best subset selection applied to completely random noise. These functions demonstrate how variable selection techniques in regression can err, suggesting that more variables be included in a regression model than the data justify.
Usage

bestsetNoise(m=100, n=40, method="exhaustive", nvmax=3, X=NULL,
             print.summary = TRUE, really.big=FALSE)
bestset.noise(m=100, n=40, method="exhaustive", nvmax=3, X=NULL,
              print.summary = TRUE, really.big=FALSE)
bsnCV(m = 100, n = 40, method = "exhaustive", nvmax = 3, X=NULL,
      nfolds = 2, print.summary = TRUE, really.big=FALSE)
bsnVaryNvar(m = 100, nvar = nvmax:50, nvmax = 3, method = "exhaustive",
            plotit = TRUE, xlab = "# of variables from which to select",
            ylab = "p-values for t-statistics",
            main = paste("Select 'best'", nvmax, "variables"),
            details = FALSE, really.big = TRUE, smooth = TRUE)
Arguments

m: the number of observations to be simulated.

n: the number of predictor variables in the simulated model.

method: Use "exhaustive" search, "backward" selection, "forward" selection, or sequential replacement ("seqrep"), as in the regsubsets() function from the leaps package.

nvmax: maximum number of explanatory variables in the model.

X: Use columns from the matrix that is supplied. If NULL, a matrix of predictor variables is simulated as described under Details.

nvar: range of the number of candidate variables (bsnVaryNvar only).

nfolds: For splitting the data into training and test sets, the number of folds (bsnCV only).

print.summary: Should summary information be printed?

plotit: Plot a graph? (bsnVaryNvar only)

xlab: x-axis label for the graph (bsnVaryNvar only).

ylab: y-axis label for the graph (bsnVaryNvar only).

main: main title for the graph (bsnVaryNvar only).

details: Return a detailed output list? (bsnVaryNvar only)

really.big: Set to TRUE to allow an exhaustive search over a large number of candidate variables; passed to regsubsets().

smooth: Fit a smooth curve to the graph? (bsnVaryNvar only)
Details

If X is not supplied (and in any case for bsnVaryNvar), a set of n predictor variables is simulated as independent standard normal, i.e. N(0,1), variates. Additionally, an N(0,1) response variable is simulated, independent of the predictors. The 'best' model with nvmax explanatory variables is selected using the regsubsets() function from the leaps package. (The leaps package must be installed for these functions to work.)
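The following is a minimal sketch of that mechanism, not the function's actual internals; the seed, dimensions, and variable names are arbitrary choices for illustration.

    ## Sketch only: simulate pure noise, then let regsubsets() pick a 'best'
    ## 3-variable model and refit it with lm() to obtain t-statistics.
    library(leaps)
    set.seed(17)                                  # arbitrary seed
    m <- 100; n <- 40; nvmax <- 3
    X <- matrix(rnorm(m * n), nrow = m,
                dimnames = list(NULL, paste0("V", 1:n)))
    y <- rnorm(m)                                 # response: independent N(0,1) noise
    srch <- summary(regsubsets(X, y, nvmax = nvmax, method = "exhaustive"))
    keep <- srch$which[nvmax, -1]                 # selected predictors (drop intercept column)
    fit <- lm(y ~ X[, keep])
    summary(fit)                                  # some terms will often look 'significant'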
The function bsnCV splits the data at random into nfolds (2 or more) parts. Each part is set aside in turn for use in fitting the model (effectively as test data), with the remaining data used to select the variables that will appear in the fit. One model fit is returned for each of the nfolds parts.
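As a sketch of the fold mechanism (not bsnCV's actual code), reusing X and y from the sketch above:

    set.seed(29)
    nfolds <- 2
    fold <- sample(rep(1:nfolds, length.out = nrow(X)))   # random fold labels
    fits <- lapply(1:nfolds, function(k) {
      sel <- fold != k                                     # data used to choose the variables
      b <- summary(regsubsets(X[sel, ], y[sel], nvmax = 3, method = "exhaustive"))
      keep <- b$which[3, -1]
      lm(y[!sel] ~ X[!sel, keep])                          # fit to the part that was set aside
    })
    lapply(fits, summary)                                  # one 'best' fit per fold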
The function bsnVaryNvar makes repeated calls to bestsetNoise, varying the number of candidate variables over the range given by nvar, as sketched below.
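In outline (a sketch only; the actual function also records coefficients and standard errors, and uses its own defaults), the repetition might look like:

    ## Requires the DAAG and leaps packages.
    pvals <- sapply(3:8, function(nv) {
      fit <- bestsetNoise(m = 50, n = nv, nvmax = 3, print.summary = FALSE)
      coef(summary(fit))[-1, "Pr(>|t|)"]   # p-values for the 3 selected variables
    })
    colnames(pvals) <- paste0("nvar=", 3:8)
    pvals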
Value

bestsetNoise returns the lm model object for the "best" model. bsnCV returns as many models as there are folds. bsnVaryNvar silently returns either (details=FALSE) a matrix holding the p-values of the coefficients for the 'best' choice of model at each different number of candidate variables, or (details=TRUE) a list with elements:
coef: a matrix of sets of regression coefficients.

SE: a matrix of standard errors.

pval: a matrix of p-values.

Each matrix has one row for each choice of nvar. The statistics returned are for the 'best' model with nvmax explanatory variables.
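For example (a hypothetical usage; it assumes the leaps package is installed), the detailed list can be examined directly:

    out <- bsnVaryNvar(m = 50, nvar = 3:6, nvmax = 3,
                       plotit = FALSE, details = TRUE)
    str(out)          # list with elements coef, SE and pval
    dim(out$pval)     # one row for each choice of nvar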
Author(s)

J.H. Maindonald
Examples

leaps.out <- try(require(leaps, quietly=TRUE))
leaps.out.log <- is.logical(leaps.out)
if ((leaps.out.log==TRUE) & (leaps.out==TRUE)) {
  bestsetNoise(20, 6)   # `best' 3-variable regression for 20 simulated observations
                        # on 7 unrelated variables (including the response)
  bsnCV(20, 6)          # `best' 3-variable regressions (one for each fold) for 20
                        # simulated observations on 7 unrelated variables
                        # (including the response)
  bsnVaryNvar(m = 50, nvar = 3:6, nvmax = 3, method = "exhaustive",
              plotit = FALSE, details = TRUE)
}