
I am running hyperparameter optimization (random search) with mlr3, but it gives me the following error. I tried other models too (kknn), but the same error occurs.

Error: Resampling 'cv' may not be instantiated

My code:

library(farff)              # readARFF()
library(mlr3)
library(mlr3tuning)         # AutoTuner, trm(), tnr()
library(mlr3extralearners)  # regr.randomForest

data = readARFF("xerces.arff")
index = sample(1:nrow(data), 0.7 * nrow(data))
train = data[index, ]
test = data[-index, ]
task = TaskRegr$new("data", backend = train, target = "bug")

learner5 = lrn("regr.randomForest")
resampling_cv = rsmp("cv", folds = 10L)
resampling_cv$instantiate(task)
measure = msr("regr.mae")

search_space = paradox::ParamSet$new(
  params = list(paradox::ParamInt$new("ntree", lower = 100, upper = 500))
)

terminator = trm("evals", n_evals = 30)
tuner = tnr("random_search")

at = AutoTuner$new(
        learner = learner5,
        resampling = resampling_cv,
        measure = measure,
        search_space = search_space,
        terminator = terminator,
        tuner = tuner,  store_tuning_instance = TRUE,
        store_benchmark_result = TRUE,
        store_models = TRUE
)

1 Answer


You instantiate the resampling with resampling_cv$instantiate(task). Remove this line and it should work. The AutoTuner requires an uninstantiated resampling: it instantiates the resampling internally on the training data of each tuning evaluation, so passing a pre-instantiated one raises this error.
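A corrected version of the tuning setup (a sketch, reusing the learner, search space, and task objects from the question) simply passes the resampling to the AutoTuner without calling $instantiate() first:

```r
library(mlr3)
library(mlr3tuning)

# left uninstantiated -- the AutoTuner handles instantiation itself
resampling_cv = rsmp("cv", folds = 10L)

at = AutoTuner$new(
  learner = learner5,
  resampling = resampling_cv,
  measure = msr("regr.mae"),
  search_space = search_space,
  terminator = trm("evals", n_evals = 30),
  tuner = tnr("random_search"),
  store_tuning_instance = TRUE,
  store_benchmark_result = TRUE,
  store_models = TRUE
)

# trains the tuned learner; the inner CV splits are created here
at$train(task)
```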


5 Comments

The resampling should not be instantiated? The mlr3 book says it should be.
Usually, the resampling is automatically instantiated. Where did you read that in the book?
I read it in the "Resamplings" section (3rd chapter, Performance Evaluation and Comparison).
Yes it says: Note that if you want to compare multiple Learners in a fair manner, using the same instantiated resampling for each learner is mandatory, such that each learner gets exactly the same training data and the performance of the trained model is evaluated in exactly the same test set.
If you want to compare multiple learners, use nested resampling. You don't have to instantiate the resampling beforehand.
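The passage quoted from the book refers to benchmarking, not tuning. There, the shared splits come for free: benchmark_grid() instantiates the resampling once per task, so all learners are evaluated on identical train/test splits. A minimal sketch (using a built-in example task instead of the question's data):

```r
library(mlr3)

task = tsk("mtcars")  # built-in regression task for illustration
learners = lrns(c("regr.rpart", "regr.featureless"))

# Uninstantiated here; benchmark_grid() instantiates it once per task,
# so both learners share exactly the same CV splits
resampling = rsmp("cv", folds = 5L)

design = benchmark_grid(task, learners, resampling)
bmr = benchmark(design)
bmr$aggregate(msr("regr.mae"))
```

This is the "fair comparison" the book describes; manual instantiation is only needed when you build the design by hand rather than via benchmark_grid().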
