CART in R: where are the decision tree's cross-validation error costs?

I am building a decision tree. I read in my data with the following code:

CPUE <- read.csv("CPUE_C1_11.csv", header = TRUE, sep = ";", dec = ".")

All variables in my data are numeric; they look like this:

(screenshot of the data)

When I fit a decision tree to see how the variables relate to CPUEt_m3.h, only the following is shown:

library(rpart)
mCPUE1 <- rpart(CPUEt_m3.h ~ ., data = CPUE, method = "anova")

mCPUE1

n= 25

node), split, n, deviance, yval
      * denotes terminal node

1) root 25 0.0011435000 0.02680000
  2) C9u< 622.4586 18 0.0002807778 0.02338889 *
  3) C9u>=622.4586  7 0.0001147143 0.03557143 *

Plotting:

library(rpart.plot)
rpart.plot(mCPUE1, type = 3, digits = 3, fallen.leaves = TRUE)

Only C9u appears in the tree, even though CPUEt_m3.h depends on 22 predictor variables.
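For context, rpart often stops after one split on small samples because of its default control parameters (minsplit = 20, cp = 0.01), and n = 25 here leaves little room to split further. A minimal sketch of how to grow a larger tree and inspect the other predictors; the object name mCPUE_full and the specific control values are illustrative choices, not recommendations:

```r
library(rpart)

# Relax rpart's defaults: cp = 0 disables the complexity penalty while
# growing, and minsplit/minbucket lower the node-size thresholds.
mCPUE_full <- rpart(CPUEt_m3.h ~ ., data = CPUE, method = "anova",
                    control = rpart.control(cp = 0, minsplit = 5,
                                            minbucket = 2, xval = 10))

# summary() lists competing and surrogate splits at each node, which
# shows how the other 21 predictors compared against C9u.
summary(mCPUE_full)

# Overall importance of each predictor across the whole tree:
mCPUE_full$variable.importance
```

The larger tree will almost certainly overfit; the point is to grow it first and then prune back using the cross-validation results.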

Is it possible to see the cross-validation error cost (CV cost) and the error on the training sample (resubstitution cost), so I can decide which decision tree best represents my dependent variable?
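A sketch of how both costs can be read from a fitted rpart object, assuming a model like mCPUE1 above. rpart runs cross-validation during fitting and stores the results in the cptable: the "rel error" column is the resubstitution error and "xerror" the cross-validated error, both relative to the root-node deviance:

```r
library(rpart)

mCPUE1 <- rpart(CPUEt_m3.h ~ ., data = CPUE, method = "anova",
                control = rpart.control(xval = 10))  # 10-fold CV is the default

# CP table: "rel error" = resubstitution cost, "xerror" = CV cost,
# "xstd" = standard error of xerror; all relative to the root node.
printcp(mCPUE1)

# Cross-validated error vs. tree size, to pick a pruning level visually.
plotcp(mCPUE1)

# Absolute costs: multiply the relative errors by the root-node deviance.
root_dev <- mCPUE1$frame$dev[1]
cbind(mCPUE1$cptable,
      resub_cost = mCPUE1$cptable[, "rel error"] * root_dev,
      cv_cost    = mCPUE1$cptable[, "xerror"]   * root_dev)
```

A common rule of thumb is to prune at the smallest tree whose xerror is within one xstd of the minimum, then call prune(mCPUE1, cp = chosen_cp) with the corresponding CP value.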

Here is my data: https://www.dropbox.com/scl/fi/7sgyqsa06zv2n9nvw9ath/CPUE_C1_11.csv?rlkey=qphkxc6ogu2lnjbicj27lr3sp&st=cj92aewj&dl=0
