Channel: Data Science, Analytics and Big Data discussions - Topics tagged data_science

Top Decile Lift in Repeated 10-fold Cross Validation


@davide16 wrote:

Scope: Compute the top-decile lift on the test sets within a repeated (×10) 10-fold cross-validation in R.

Should I compute the lift in each fold and then average the ten fold-level lifts?

Then, as a second step, should I average the per-repetition lifts obtained in step 1 across the ten repetitions to get the final lift for the model?
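
Concretely, I am thinking of something like the sketch below (base R; `top_decile_lift` is my own helper rather than a package function, and `one_rep` / `all_reps` are placeholder names for one repetition's fold results and the list of all ten repetitions):

```
# Top-decile lift: response rate among the top 10% of scores
# divided by the overall response rate
top_decile_lift <- function(prob, label) {
  n_top <- ceiling(0.1 * length(prob))
  top   <- order(prob, decreasing = TRUE)[seq_len(n_top)]
  mean(label[top]) / mean(label)
}

# Step 1: one lift per fold, averaged within a single repetition
fold_lifts <- mapply(top_decile_lift, one_rep$prob, one_rep$label)
rep_lift   <- mean(fold_lifts)

# Step 2: repeat step 1 for each of the 10 repetitions, then average
# final_lift <- mean(sapply(all_reps, function(r)
#   mean(mapply(top_decile_lift, r$prob, r$label))))
```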

In R, the lift package seems to be the one most commonly used to compute the top-decile lift. However, it appears to work only on plain vectors, while the probabilities and labels from a 10×10-fold cross-validation are stored in a nested list, as in the example below for one of the repetitions:

```
> f.list$label
[[1]]
 [1] 1 0 1 1 1 1 0 1 1 0 0 1 1 1 1 0 0 0

[[2]]
 [1] 1 1 0 1 0 0 1 1 0 1 0 1 1 0 1 0 0 1 1

[[3]]
 [1] 0 0 0 1 1 1 1 1 1 0 0 0 1 1 1 0 1 1 0

[[4]]
 [1] 1 1 1 1 1 0 1 1 0 0 1 1 0 0 1 0 0 1 0 1

[[5]]
 [1] 1 0 0 0 1 1 0 1 0 1 1 0 1 0 1 1 1 1

[[6]]
 [1] 1 1 1 1 0 1 1 1 1 1 1 0 0 0 0 0 0 1 0

[[7]]
 [1] 0 1 0 1 1 1 0 1 1 0 1 1 1 1 0 0 0 1

[[8]]
 [1] 1 1 1 1 1 0 1 0 1 0 0 1 0 1 0 1 0 0 1

[[9]]
 [1] 1 1 1 1 1 0 0 1 0 1 0 0 1 1 1 0 0 1

[[10]]
 [1] 1 0 1 0 0 1 1 1 0 0 0 1 1 1 1 1 0 1 0

> f.list$prob
[[1]]
 [1] 0.6754964 0.7003950 0.6754964 0.6754964 0.8065828 0.7003950 0.6754964 0.7003950 0.6754964 0.7003950 0.6754964
[12] 0.8065828 0.7003950 0.8065828 0.8065828 0.6294260 0.6754964 0.6754964

[[2]]
 [1] 0.7179469 0.6748065 0.7179469 0.7179469 0.6748065 0.6366714 0.7179469 0.6748065 0.6748065 0.7179469 0.7179469
[12] 0.9244904 0.7179469 0.6748065 0.7154514 0.7179469 0.7154514 0.6748065 0.5581462

[[3]]
 [1] 0.7254756 1.0000000 0.7254756 0.7254756 0.6269753 0.7254756 0.7254756 0.8053198 0.8144527 1.0000000 0.7254756
[12] 0.6316320 0.8053198 0.6859796 0.7254756 0.6859796 0.7254756 0.7254756 0.6859796

[[4]]
 [1] 0.7013487 0.6642448 0.7013487 0.7013487 0.7013487 0.6642448 0.6642448 0.7013487 0.6130222 0.7013487 0.5819991
[12] 0.6130222 0.6642448 0.7013487 0.6642448 0.6642448 0.7013487 0.6642448 0.6642448 0.8052724

[[5]]
 [1] 0.7732975 0.7019505 0.6582106 0.6582106 0.7019505 0.5451431 0.7019505 0.7732975 0.6582106 0.7019505 0.6582106
[12] 0.6582106 0.6582106 0.7019505 0.7903264 0.7019505 0.6582106 0.8891492

[[6]]
 [1] 0.6723612 0.9196713 0.7963977 0.6723612 0.6451440 0.6500513 0.8009641 0.8009641 0.7963977 0.6723612 0.6164599
[12] 0.6451440 0.6164599 0.6164599 0.0000000 0.7143067 0.6723612 0.6723612 0.6164599

[[7]]
 [1] 0.5808796 0.7064913 0.7064913 0.0000000 0.7064913 0.6615101 0.5808796 0.7064913 0.7064913 0.7064913 0.7064913
[12] 0.6306560 0.7064913 0.6615101 0.5062597 0.6615101 0.6615101 0.5062597

[[8]]
 [1] 1.0000000 0.6604876 0.6604876 0.6998193 0.6604876 1.0000000 0.6998193 0.6998193 0.6604876 0.6604876 0.6998193
[12] 0.6604876 0.8010080 0.6998193 0.6998193 0.6998193 0.6998193 0.6604876 0.8010080

[[9]]
 [1] 0.6931611 0.6931611 0.6931611 0.6527919 0.0000000 0.5490894 0.6074932 0.7858720 0.6527919 0.6931611 0.5910396
[12] 0.6527919 0.6527919 0.8885704 0.6527919 0.6527919 0.0000000 0.6931611

[[10]]
 [1] 0.6679090 0.7148292 0.6679090 0.6679090 0.6679090 0.7148292 0.6679090 0.7148292 0.6679090 0.6679090 0.7726609
[12] 0.6679090 0.8129300 0.6002113 0.4593323 0.8129300 0.6349487 0.7883672 0.6679090
```
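
Given that structure, one workaround would be to map the vector-based package function over the paired fold lists, fold by fold. This is only a sketch, assuming the lift package exposes `TopDecileLift(predicted, labels)` as its vector interface:

```
library(lift)  # CRAN package "lift"; assumed signature TopDecileLift(predicted, labels)

# Apply the vector-based function to each (prob, label) pair of folds,
# then average the ten fold-level lifts for this repetition
fold_lifts <- mapply(TopDecileLift, f.list$prob, f.list$label)
rep_lift   <- mean(fold_lifts)
```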

Posts: 1

Participants: 1


