a. How does the computational time change when we decrease k in k-fold cross-validation? Why? Explain.
b. In which procedures can we apply k-fold cross-validation? Consider all the procedures that we have learned.
a) The computational time decreases when we decrease k in k-fold cross-validation. Because the procedure fits and evaluates the model once per fold, a smaller k means fewer training runs, so the total computation drops.
Explanation: In k-fold cross-validation, the original sample is randomly partitioned into k equal-sized subsamples. Of the k subsamples, a single subsample is retained as the validation data for testing the model, and the remaining k − 1 subsamples are used as training data. The cross-validation process is then repeated k times, with each of the k subsamples used exactly once as the validation data. The k results can then be averaged to produce a single estimate. The advantage of this method over repeated random sub-sampling is that all observations are used for both training and validation, and each observation is used for validation exactly once. 10-fold cross-validation is commonly used, but in general k remains an unfixed parameter.
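To illustrate part (a), here is a minimal timing sketch. It assumes scikit-learn is installed; the synthetic dataset, the logistic-regression model, and the particular values of k are illustrative choices, not part of the original question. Since cross_val_score fits the model once per fold, the measured time shrinks as k decreases.

import time

from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

# Synthetic stand-in data (assumption: any dataset would do here).
X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
model = LogisticRegression(max_iter=1000)

for k in (10, 5, 2):  # decreasing k
    start = time.perf_counter()
    scores = cross_val_score(model, X, y, cv=k)  # fits the model k times
    elapsed = time.perf_counter() - start
    print(f"k={k:2d}: mean accuracy={scores.mean():.3f}, time={elapsed:.2f}s")

The exact times depend on the machine and model, but the trend is the point: fewer folds, fewer fits, less time.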
b) Cross-validation is a resampling procedure used to evaluate machine learning models on a limited data sample; k-fold cross-validation is a procedure used to estimate the skill of a model on new data. It can therefore be applied in any procedure where a model is fit to training data and its predictive performance must be estimated.
The following are the general steps of k-fold cross-validation (a short code sketch of these steps follows the list):
1) Shuffle the dataset randomly.
2) Split the dataset into k groups.
3) For each unique group:
a) Take the group as a hold out or test data set.
b) Take the remaining groups as a training data set.
c) Fit a model on the training set and evaluate it on the test set.
d) Retain the evaluation score and discard the model.
4) Summarize the skill of the model using the sample of model evaluation scores.
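The sketch below walks through the four steps above by hand. It assumes NumPy and scikit-learn are available; the iris dataset, the k-nearest-neighbours classifier, and k = 5 are placeholder choices used only for illustration.

import numpy as np
from sklearn.datasets import load_iris
from sklearn.metrics import accuracy_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
k = 5

# 1) Shuffle the dataset randomly.
rng = np.random.default_rng(0)
indices = rng.permutation(len(X))

# 2) Split the dataset into k groups.
folds = np.array_split(indices, k)

scores = []
# 3) For each unique group:
for i, test_idx in enumerate(folds):
    # a) Take the group as a hold-out (test) data set.
    X_test, y_test = X[test_idx], y[test_idx]
    # b) Take the remaining groups as a training data set.
    train_idx = np.concatenate([f for j, f in enumerate(folds) if j != i])
    X_train, y_train = X[train_idx], y[train_idx]
    # c) Fit a model on the training set and evaluate it on the test set.
    model = KNeighborsClassifier().fit(X_train, y_train)
    score = accuracy_score(y_test, model.predict(X_test))
    # d) Retain the evaluation score and discard the model.
    scores.append(score)

# 4) Summarize the skill of the model using the evaluation scores.
print(f"Accuracy per fold: {np.round(scores, 3)}")
print(f"Mean accuracy: {np.mean(scores):.3f}")

In practice the same procedure is available ready-made (for example via scikit-learn's KFold or cross_val_score), but the manual version makes each numbered step explicit.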
I hope this answer helps you. Please give it a thumbs-up.