In machine learning, support-vector machines (SVMs, also support-vector networks) are supervised learning models with associated learning algorithms that analyze data for classification and regression analysis. Developed at AT&T Bell Laboratories by Vapnik and colleagues (Boser et al., 1992; Guyon et al., 1993; Vapnik et al., 1997), SVMs are among the most robust prediction methods, grounded in the statistical learning framework, or VC theory, proposed by Vapnik and Chervonenkis (1974) and Vapnik (1982, 1995). Given a set of training examples, each marked as belonging to one of two categories, an SVM training algorithm builds a model that assigns new examples to one category or the other, making it a non-probabilistic binary linear classifier (although methods such as Platt scaling exist to use SVMs in a probabilistic classification setting). An SVM model represents the examples as points in space, mapped so that the examples of the separate categories are divided by a clear gap that is as wide as possible. New examples are then mapped into that same space and predicted to belong to a category based on the side of the gap on which they fall.
In addition to performing linear classification, SVMs can efficiently perform a non-linear classification using what is called the kernel trick, implicitly mapping their inputs into high-dimensional feature spaces.
When data are unlabelled, supervised learning is not possible, and an unsupervised learning approach is required, one that attempts to find natural clustering of the data into groups and then maps new data to these groups. The support-vector clustering algorithm, created by Hava Siegelmann and Vladimir Vapnik, applies the statistics of support vectors, developed in the support-vector machine algorithm, to categorize unlabeled data, and is one of the most widely used clustering algorithms in industrial applications.
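Before turning to the churn data, the kernel trick can be illustrated with a minimal sketch on R's built-in iris data, using kernlab (the package caret wraps for its svm* methods); the two-class subset and parameter values here are illustrative choices, not part of the tutorial's pipeline:

```r
library(kernlab)

# Two-class subset of iris: setosa vs versicolor
df <- droplevels(iris[iris$Species != "virginica", ])

# Linear kernel: a maximum-margin separating hyperplane in input space
fit_linear <- ksvm(Species ~ ., data = df, kernel = "vanilladot", C = 1)

# Radial (RBF) kernel: the kernel trick implicitly maps inputs into a
# high-dimensional feature space without ever computing the map explicitly
fit_rbf <- ksvm(Species ~ ., data = df, kernel = "rbfdot", C = 1)

# Resubstitution confusion matrix for the radial-kernel model
table(predicted = predict(fit_rbf, df), actual = df$Species)
```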
Here we are going to implement an SVM classifier on the Telecom Churn dataset.
library(DBI)
library(corrgram)
library(caret) # contains SVM function
library(gridExtra)
library(ggpubr)
This is also a good opportunity to start parallelizing your code. The common motivation behind parallel computing is that some computation is taking too long. For some people that means any computation that takes more than 3 minutes; parallelization is incredibly simple to set up, and most tasks that take time are embarrassingly parallel. Training models with repeated cross-validation, as we do below, is exactly such a task:
# process in parallel on Windows
library(doParallel)
cl <- makeCluster(detectCores(), type='PSOCK')
registerDoParallel(cl)
# process in parallel on Mac OSX and UNIX like systems
library(doMC)
registerDoMC(cores = 4)
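Whichever backend you register, it is good practice to release the workers once training is done; a short sketch for the PSOCK cluster created above (the doMC backend needs no explicit shutdown):

```r
# After model training completes, shut down the PSOCK workers
# and fall back to sequential execution
stopCluster(cl)
registerDoSEQ()
```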
#Set working directory where CSV is located
#getwd()
#setwd("...YOUR WORKING DIRECTORY WITH A DATASET...")
#getwd()
# Load the DataSets:
dataSet <- read.csv("TelcoCustomerChurnDataset.csv", header = TRUE, sep = ',')
colnames(dataSet) #Check the dataframe column names
# Print top 10 rows in the dataSet
head(dataSet, 10)
Account_Length | Vmail_Message | Day_Mins | Eve_Mins | Night_Mins | Intl_Mins | CustServ_Calls | Churn | Intl_Plan | Vmail_Plan | ⋯ | Day_Charge | Eve_Calls | Eve_Charge | Night_Calls | Night_Charge | Intl_Calls | Intl_Charge | State | Area_Code | Phone | |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
<int> | <int> | <dbl> | <dbl> | <dbl> | <dbl> | <int> | <fct> | <fct> | <fct> | ⋯ | <dbl> | <int> | <dbl> | <int> | <dbl> | <int> | <dbl> | <fct> | <int> | <fct> | |
1 | 128 | 25 | 265.1 | 197.4 | 244.7 | 10.0 | 1 | no | no | yes | ⋯ | 45.07 | 99 | 16.78 | 91 | 11.01 | 3 | 2.70 | KS | 415 | 382-4657 |
2 | 107 | 26 | 161.6 | 195.5 | 254.4 | 13.7 | 1 | no | no | yes | ⋯ | 27.47 | 103 | 16.62 | 103 | 11.45 | 3 | 3.70 | OH | 415 | 371-7191 |
3 | 137 | 0 | 243.4 | 121.2 | 162.6 | 12.2 | 0 | no | no | no | ⋯ | 41.38 | 110 | 10.30 | 104 | 7.32 | 5 | 3.29 | NJ | 415 | 358-1921 |
4 | 84 | 0 | 299.4 | 61.9 | 196.9 | 6.6 | 2 | no | yes | no | ⋯ | 50.90 | 88 | 5.26 | 89 | 8.86 | 7 | 1.78 | OH | 408 | 375-9999 |
5 | 75 | 0 | 166.7 | 148.3 | 186.9 | 10.1 | 3 | no | yes | no | ⋯ | 28.34 | 122 | 12.61 | 121 | 8.41 | 3 | 2.73 | OK | 415 | 330-6626 |
6 | 118 | 0 | 223.4 | 220.6 | 203.9 | 6.3 | 0 | no | yes | no | ⋯ | 37.98 | 101 | 18.75 | 118 | 9.18 | 6 | 1.70 | AL | 510 | 391-8027 |
7 | 121 | 24 | 218.2 | 348.5 | 212.6 | 7.5 | 3 | no | no | yes | ⋯ | 37.09 | 108 | 29.62 | 118 | 9.57 | 7 | 2.03 | MA | 510 | 355-9993 |
8 | 147 | 0 | 157.0 | 103.1 | 211.8 | 7.1 | 0 | no | yes | no | ⋯ | 26.69 | 94 | 8.76 | 96 | 9.53 | 6 | 1.92 | MO | 415 | 329-9001 |
9 | 117 | 0 | 184.5 | 351.6 | 215.8 | 8.7 | 1 | no | no | no | ⋯ | 31.37 | 80 | 29.89 | 90 | 9.71 | 4 | 2.35 | LA | 408 | 335-4719 |
10 | 141 | 37 | 258.6 | 222.0 | 326.4 | 11.2 | 0 | no | yes | yes | ⋯ | 43.96 | 111 | 18.87 | 97 | 14.69 | 5 | 3.02 | WV | 415 | 330-8173 |
# Print last 10 rows in the dataSet
tail(dataSet, 10)
Account_Length | Vmail_Message | Day_Mins | Eve_Mins | Night_Mins | Intl_Mins | CustServ_Calls | Churn | Intl_Plan | Vmail_Plan | ⋯ | Day_Charge | Eve_Calls | Eve_Charge | Night_Calls | Night_Charge | Intl_Calls | Intl_Charge | State | Area_Code | Phone | |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
<int> | <int> | <dbl> | <dbl> | <dbl> | <dbl> | <int> | <fct> | <fct> | <fct> | ⋯ | <dbl> | <int> | <dbl> | <int> | <dbl> | <int> | <dbl> | <fct> | <int> | <fct> | |
3324 | 117 | 0 | 118.4 | 249.3 | 227.0 | 13.6 | 5 | yes | no | no | ⋯ | 20.13 | 97 | 21.19 | 56 | 10.22 | 3 | 3.67 | IN | 415 | 362-5899 |
3325 | 159 | 0 | 169.8 | 197.7 | 193.7 | 11.6 | 1 | no | no | no | ⋯ | 28.87 | 105 | 16.80 | 82 | 8.72 | 4 | 3.13 | WV | 415 | 377-1164 |
3326 | 78 | 0 | 193.4 | 116.9 | 243.3 | 9.3 | 2 | no | no | no | ⋯ | 32.88 | 88 | 9.94 | 109 | 10.95 | 4 | 2.51 | OH | 408 | 368-8555 |
3327 | 96 | 0 | 106.6 | 284.8 | 178.9 | 14.9 | 1 | no | no | no | ⋯ | 18.12 | 87 | 24.21 | 92 | 8.05 | 7 | 4.02 | OH | 415 | 347-6812 |
3328 | 79 | 0 | 134.7 | 189.7 | 221.4 | 11.8 | 2 | no | no | no | ⋯ | 22.90 | 68 | 16.12 | 128 | 9.96 | 5 | 3.19 | SC | 415 | 348-3830 |
3329 | 192 | 36 | 156.2 | 215.5 | 279.1 | 9.9 | 2 | no | no | yes | ⋯ | 26.55 | 126 | 18.32 | 83 | 12.56 | 6 | 2.67 | AZ | 415 | 414-4276 |
3330 | 68 | 0 | 231.1 | 153.4 | 191.3 | 9.6 | 3 | no | no | no | ⋯ | 39.29 | 55 | 13.04 | 123 | 8.61 | 4 | 2.59 | WV | 415 | 370-3271 |
3331 | 28 | 0 | 180.8 | 288.8 | 191.9 | 14.1 | 2 | no | no | no | ⋯ | 30.74 | 58 | 24.55 | 91 | 8.64 | 6 | 3.81 | RI | 510 | 328-8230 |
3332 | 184 | 0 | 213.8 | 159.6 | 139.2 | 5.0 | 2 | no | yes | no | ⋯ | 36.35 | 84 | 13.57 | 137 | 6.26 | 10 | 1.35 | CT | 510 | 364-6381 |
3333 | 74 | 25 | 234.4 | 265.9 | 241.4 | 13.7 | 0 | no | no | yes | ⋯ | 39.85 | 82 | 22.60 | 77 | 10.86 | 4 | 3.70 | TN | 415 | 400-4344 |
# Dimension of the dataSet
dim(dataSet)
# Check Data types of each column
table(unlist(lapply(dataSet, class)))
 factor integer numeric
      5       8       8
# Check Data types of individual column
data.class(dataSet$Account_Length)
data.class(dataSet$Vmail_Message)
data.class(dataSet$Day_Mins)
data.class(dataSet$Eve_Mins)
data.class(dataSet$Night_Mins)
data.class(dataSet$Intl_Mins)
data.class(dataSet$CustServ_Calls)
data.class(dataSet$Intl_Plan)
data.class(dataSet$Vmail_Plan)
data.class(dataSet$Day_Calls)
data.class(dataSet$Day_Charge)
data.class(dataSet$Eve_Calls)
data.class(dataSet$Eve_Charge)
data.class(dataSet$Night_Calls)
data.class(dataSet$Night_Charge)
data.class(dataSet$Intl_Calls)
data.class(dataSet$Intl_Charge)
data.class(dataSet$State)
data.class(dataSet$Phone)
data.class(dataSet$Churn)
# Convert factor columns to their numeric level codes (1, 2, ...)
# so they can be treated as numeric predictors
dataSet$Intl_Plan <- as.numeric(dataSet$Intl_Plan)
dataSet$Vmail_Plan <- as.numeric(dataSet$Vmail_Plan)
dataSet$State <- as.numeric(dataSet$State)
# Check Data types of each column
table(unlist(lapply(dataSet, class)))
 factor integer numeric
      2       8      11
# Find out if there is missing value in rows
rowSums(is.na(dataSet))
# Find out if there is missing value in columns
colSums(is.na(dataSet))
#Checking missing value with the mice package
library(mice)
md.pattern(dataSet)
No need for mice. This data set is completely observed.
Account_Length | Vmail_Message | Day_Mins | Eve_Mins | Night_Mins | Intl_Mins | CustServ_Calls | Churn | Intl_Plan | Vmail_Plan | ⋯ | Eve_Calls | Eve_Charge | Night_Calls | Night_Charge | Intl_Calls | Intl_Charge | State | Area_Code | Phone | ||
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
3333 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | ⋯ | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 1 | 0 |
0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | ⋯ | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
#Checking missing value with the VIM package
library(VIM)
mice_plot <- aggr(dataSet, col=c('navyblue','yellow'),
numbers=TRUE, sortVars=TRUE,
labels=names(dataSet[1:21]), cex.axis=.9,
gap=3, ylab=c("Missing data","Pattern"))
Variables sorted by number of missings: all 21 variables (Account_Length, Vmail_Message, Day_Mins, Eve_Mins, Night_Mins, Intl_Mins, CustServ_Calls, Churn, Intl_Plan, Vmail_Plan, Day_Calls, Day_Charge, Eve_Calls, Eve_Charge, Night_Calls, Night_Charge, Intl_Calls, Intl_Charge, State, Area_Code, Phone) have a missing count of 0.
After this inspection, we can conclude that the dataset contains no missing values.
# Selecting just columns with numeric data type
numericalCols <- colnames(dataSet[c(1:7,9:20)])
Difference between the lapply and sapply functions (we will use both in the next two cells):
We use lapply when we want to apply a function to each element of a list in turn and get a list back.
We use sapply when we want to apply a function to each element of a list in turn but want a simplified result (a vector, where possible) back, rather than a list.
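The difference is easiest to see on a tiny toy list (not part of the churn data):

```r
x <- list(a = 1:5, b = 6:10)

lapply(x, mean)  # returns a list: $a is 3, $b is 8
sapply(x, mean)  # returns a named numeric vector: a = 3, b = 8
```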
#Sum
lapply(dataSet[numericalCols], FUN = sum)
#Mean
lapply(dataSet[numericalCols], FUN = mean)
#median
lapply(dataSet[numericalCols], FUN = median)
#Min
lapply(dataSet[numericalCols], FUN = min)
#Max
lapply(dataSet[numericalCols], FUN = max)
#Length
lapply(dataSet[numericalCols], FUN = length)
# Sum
sapply(dataSet[numericalCols], FUN = sum)
# Mean
sapply(dataSet[numericalCols], FUN = mean)
# Median
sapply(dataSet[numericalCols], FUN = median)
# Min
sapply(dataSet[numericalCols], FUN = min)
# Max
sapply(dataSet[numericalCols], FUN = max)
# Length
sapply(dataSet[numericalCols], FUN = length)
In the next few cells, you will find three different options on how to aggregate data.
# OPTION 1: (Using Aggregate FUNCTION - all variables together)
aggregate(dataSet[numericalCols], list(dataSet$Churn), summary)
Group.1 | Account_Length | Vmail_Message | Day_Mins | Eve_Mins | Night_Mins | Intl_Mins | CustServ_Calls | Intl_Plan | Vmail_Plan | Day_Calls | Day_Charge | Eve_Calls | Eve_Charge | Night_Calls | Night_Charge | Intl_Calls | Intl_Charge | State | Area_Code |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
<fct> | <dbl[,6]> | <dbl[,6]> | <dbl[,6]> | <dbl[,6]> | <dbl[,6]> | <dbl[,6]> | <dbl[,6]> | <dbl[,6]> | <dbl[,6]> | <dbl[,6]> | <dbl[,6]> | <dbl[,6]> | <dbl[,6]> | <dbl[,6]> | <dbl[,6]> | <dbl[,6]> | <dbl[,6]> | <dbl[,6]> | <dbl[,6]> |
no | 1, 73, 100, 100.7937, 127, 243 | 0, 0, 0, 8.604561, 22, 51 | 0, 142.825, 177.2, 175.1758, 210.30, 315.6 | 0.0, 164.5, 199.6, 199.0433, 233.20, 361.8 | 23.2, 165.90, 200.25, 200.1332, 234.90, 395.0 | 0, 8.4, 10.2, 10.15888, 12.0, 18.9 | 0, 1, 1, 1.449825, 2, 8 | 1, 1, 1, 1.065263, 1, 2 | 1, 1, 1, 1.295439, 2, 2 | 0, 87.0, 100, 100.2832, 114.0, 163 | 0, 24.2825, 30.12, 29.78042, 35.75, 53.65 | 0, 87, 100, 100.0386, 114, 170 | 0.00, 13.980, 16.97, 16.91891, 19.820, 30.75 | 33, 87, 100, 100.0582, 113, 175 | 1.04, 7.470, 9.01, 9.006074, 10.570, 17.77 | 0, 3, 4, 4.532982, 6, 19 | 0.00, 2.27, 2.75, 2.743404, 3.24, 5.1 | 1, 14, 27, 27.01193, 40, 51 | 408, 408, 415, 437.0747, 510, 510 |
yes | 1, 76, 103, 102.6646, 127, 225 | 0, 0, 0, 5.115942, 0, 48 | 0, 153.250, 217.6, 206.9141, 265.95, 350.8 | 70.9, 177.1, 211.3, 212.4101, 249.45, 363.7 | 47.4, 171.25, 204.80, 205.2317, 239.85, 354.9 | 2, 8.8, 10.6, 10.70000, 12.8, 20.0 | 0, 1, 2, 2.229814, 4, 9 | 1, 1, 1, 1.283644, 2, 2 | 1, 1, 1, 1.165631, 1, 2 | 0, 87.5, 103, 101.3354, 116.5, 165 | 0, 26.0550, 36.99, 35.17592, 45.21, 59.64 | 48, 87, 101, 100.5611, 114, 168 | 6.03, 15.055, 17.96, 18.05497, 21.205, 30.91 | 49, 85, 100, 100.3996, 115, 158 | 2.13, 7.705, 9.22, 9.235528, 10.795, 15.97 | 1, 2, 4, 4.163561, 5, 20 | 0.54, 2.38, 2.86, 2.889545, 3.46, 5.4 | 1, 17, 27, 27.33954, 39, 51 | 408, 408, 415, 437.8178, 510, 510 |
# OPTION 2: (Using Aggregate FUNCTION - variables separately)
aggregate(dataSet$Intl_Mins, list(dataSet$Churn), summary)
aggregate(dataSet$Day_Mins, list(dataSet$Churn), summary)
aggregate(dataSet$Night_Mins, list(dataSet$Churn), summary)
Group.1 | x |
---|---|
<fct> | <dbl[,6]> |
no | 0, 8.4, 10.2, 10.15888, 12.0, 18.9 |
yes | 2, 8.8, 10.6, 10.70000, 12.8, 20.0 |
Group.1 | x |
---|---|
<fct> | <dbl[,6]> |
no | 0, 142.825, 177.2, 175.1758, 210.30, 315.6 |
yes | 0, 153.250, 217.6, 206.9141, 265.95, 350.8 |
Group.1 | x |
---|---|
<fct> | <dbl[,6]> |
no | 23.2, 165.90, 200.25, 200.1332, 234.90, 395.0 |
yes | 47.4, 171.25, 204.80, 205.2317, 239.85, 354.9 |
# OPTION 3: (Using "by" FUNCTION instead of "Aggregate" FUNCTION)
by(dataSet$Intl_Mins, dataSet[8], FUN = summary)
by(dataSet$Day_Mins, dataSet[8], FUN = summary)
by(dataSet$Night_Mins, dataSet[8], FUN = summary)
Churn: no
   Min. 1st Qu.  Median    Mean 3rd Qu.    Max.
   0.00    8.40   10.20   10.16   12.00   18.90
------------------------------------------------------------
Churn: yes
   Min. 1st Qu.  Median    Mean 3rd Qu.    Max.
    2.0     8.8    10.6    10.7    12.8    20.0
Churn: no
   Min. 1st Qu.  Median    Mean 3rd Qu.    Max.
    0.0   142.8   177.2   175.2   210.3   315.6
------------------------------------------------------------
Churn: yes
   Min. 1st Qu.  Median    Mean 3rd Qu.    Max.
    0.0   153.2   217.6   206.9   265.9   350.8
Churn: no
   Min. 1st Qu.  Median    Mean 3rd Qu.    Max.
   23.2   165.9   200.2   200.1   234.9   395.0
------------------------------------------------------------
Churn: yes
   Min. 1st Qu.  Median    Mean 3rd Qu.    Max.
   47.4   171.2   204.8   205.2   239.8   354.9
# Correlations/covariances among numeric variables
library(Hmisc)
cor(dataSet[c(2,5,11,13,16,18)], use="complete.obs", method="kendall")
cov(dataSet[c(2,5,11,13,16,18)], use="complete.obs")
Vmail_Message | Night_Mins | Day_Calls | Eve_Calls | Night_Charge | Intl_Charge | |
---|---|---|---|---|---|---|
Vmail_Message | 1.000000000 | 0.003718463 | -0.009573189 | -5.382921e-03 | 0.003710434 | -1.263503e-03 |
Night_Mins | 0.003718463 | 1.000000000 | 0.012550159 | 3.291091e-03 | 0.999625309 | -7.103399e-03 |
Day_Calls | -0.009573189 | 0.012550159 | 1.000000000 | 9.253492e-03 | 0.012531632 | 1.038631e-02 |
Eve_Calls | -0.005382921 | 0.003291091 | 0.009253492 | 1.000000e+00 | 0.003310838 | -9.536135e-05 |
Night_Charge | 0.003710434 | 0.999625309 | 0.012531632 | 3.310838e-03 | 1.000000000 | -7.097366e-03 |
Intl_Charge | -0.001263503 | -0.007103399 | 0.010386309 | -9.536135e-05 | -0.007097366 | 1.000000e+00 |
Vmail_Message | Night_Mins | Day_Calls | Eve_Calls | Night_Charge | Intl_Charge | |
---|---|---|---|---|---|---|
Vmail_Message | 187.37134656 | 5.3174453 | -2.6229779 | -1.59925653 | 0.23873433 | 0.02975334 |
Night_Mins | 5.31744529 | 2557.7140018 | 23.2812431 | -2.10859729 | 115.09955435 | -0.57867377 |
Day_Calls | -2.62297790 | 23.2812431 | 402.7681409 | 2.58373944 | 1.04716693 | 0.32775442 |
Eve_Calls | -1.59925653 | -2.1085973 | 2.5837394 | 396.91099860 | -0.09322113 | 0.13025644 |
Night_Charge | 0.23873433 | 115.0995543 | 1.0471669 | -0.09322113 | 5.17959717 | -0.02605168 |
Intl_Charge | 0.02975334 | -0.5786738 | 0.3277544 | 0.13025644 | -0.02605168 | 0.56817315 |
# Correlations with significance levels
rcorr(as.matrix(dataSet[c(2,5,11,13,16,18)]), type="pearson")
              Vmail_Message Night_Mins Day_Calls Eve_Calls Night_Charge Intl_Charge
Vmail_Message          1.00       0.01     -0.01     -0.01         0.01        0.00
Night_Mins             0.01       1.00      0.02      0.00         1.00       -0.02
Day_Calls             -0.01       0.02      1.00      0.01         0.02        0.02
Eve_Calls             -0.01       0.00      0.01      1.00         0.00        0.01
Night_Charge           0.01       1.00      0.02      0.00         1.00       -0.02
Intl_Charge            0.00      -0.02      0.02      0.01        -0.02        1.00

n= 3333

P
              Vmail_Message Night_Mins Day_Calls Eve_Calls Night_Charge Intl_Charge
Vmail_Message                   0.6576    0.5816    0.7350       0.6583      0.8678
Night_Mins           0.6576               0.1855    0.9039       0.0000      0.3810
Day_Calls            0.5816     0.1855              0.7092       0.1857      0.2111
Eve_Calls            0.7350     0.9039    0.7092                 0.9056      0.6167
Night_Charge         0.6583     0.0000    0.1857    0.9056                   0.3808
Intl_Charge          0.8678     0.3810    0.2111    0.6167       0.3808
# Pie Chart from data
mytable <- table(dataSet$Churn)
lbls <- paste(names(mytable), "\n", mytable, sep="")
pie(mytable, labels = lbls, col=rainbow(length(lbls)),
main="Pie Chart of Classes\n (with sample sizes)")
# Barplot of categorical data
par(mfrow=c(1,1))
barplot(table(dataSet$Churn), ylab = "Count",
col=c("darkblue","red"))
barplot(prop.table(table(dataSet$Churn)), ylab = "Proportion",
col=c("darkblue","red"))
barplot(table(dataSet$Churn), xlab = "Count", horiz = TRUE,
col=c("darkblue","red"))
barplot(prop.table(table(dataSet$Churn)), xlab = "Proportion", horiz = TRUE,
col=c("darkblue","red"))
# Scatterplot Matrices from the gclus Package
library(gclus)
dta <- dataSet[c(2,5,11,13,16,18)] # get data
dta.r <- abs(cor(dta)) # get correlations
dta.col <- dmat.color(dta.r) # get colors
# reorder variables so those with highest correlation are closest to the diagonal
dta.o <- order.single(dta.r)
cpairs(dta, dta.o, panel.colors=dta.col, gap=.5,
main="Variables Ordered and Colored by Correlation" )
corrgram(dataSet[c(2,5,11,13,16,18)], order=TRUE, lower.panel=panel.shade,
upper.panel=panel.pie, text.panel=panel.txt, main=" ")
# More graphs on correlations among data
# Using "Hmisc"
res2 <- rcorr(as.matrix(dataSet[,c(2,5,11,13,16,18)]))
# Extract the correlation coefficients
res2$r
# Extract p-values
res2$P
Vmail_Message | Night_Mins | Day_Calls | Eve_Calls | Night_Charge | Intl_Charge | |
---|---|---|---|---|---|---|
Vmail_Message | 1.000000000 | 0.007681136 | -0.009548068 | -0.005864351 | 0.007663290 | 0.002883658 |
Night_Mins | 0.007681136 | 1.000000000 | 0.022937845 | -0.002092768 | 0.999999215 | -0.015179849 |
Day_Calls | -0.009548068 | 0.022937845 | 1.000000000 | 0.006462114 | 0.022926638 | 0.021666095 |
Eve_Calls | -0.005864351 | -0.002092768 | 0.006462114 | 1.000000000 | -0.002055984 | 0.008673858 |
Night_Charge | 0.007663290 | 0.999999215 | 0.022926638 | -0.002055984 | 1.000000000 | -0.015186139 |
Intl_Charge | 0.002883658 | -0.015179849 | 0.021666095 | 0.008673858 | -0.015186139 | 1.000000000 |
Vmail_Message | Night_Mins | Day_Calls | Eve_Calls | Night_Charge | Intl_Charge | |
---|---|---|---|---|---|---|
Vmail_Message | NA | 0.6575570 | 0.5816089 | 0.7350335 | 0.6583020 | 0.8678283 |
Night_Mins | 0.6575570 | NA | 0.1855268 | 0.9038694 | 0.0000000 | 0.3809828 |
Day_Calls | 0.5816089 | 0.1855268 | NA | 0.7091964 | 0.1857418 | 0.2111142 |
Eve_Calls | 0.7350335 | 0.9038694 | 0.7091964 | NA | 0.9055511 | 0.6166654 |
Night_Charge | 0.6583020 | 0.0000000 | 0.1857418 | 0.9055511 | NA | 0.3807855 |
Intl_Charge | 0.8678283 | 0.3809828 | 0.2111142 | 0.6166654 | 0.3807855 | NA |
# Using "corrplot"
library(corrplot)
library(RColorBrewer)
corrplot(res2$r, type = "upper", order = "hclust", col=brewer.pal(n=8, name="RdYlBu"),
tl.col = "black", tl.srt = 45)
corrplot(res2$r, type = "lower", order = "hclust", col=brewer.pal(n=8, name="RdYlBu"),
tl.col = "black", tl.srt = 45)
# Using PerformanceAnalytics
library(PerformanceAnalytics)
data <- dataSet[, c(2,5,11,13,16,18)]
chart.Correlation(data, histogram=TRUE, pch=19)
# Using a colored heatmap
col <- colorRampPalette(c("blue", "white", "red"))(20)
heatmap(x = res2$r, col = col, symm = TRUE)
We should notice that Night_Mins and Night_Charge have a strong, linear, positive relationship.
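Since each *_Charge column is essentially a linear rescaling of its *_Mins column, one common response is to drop one member of each near-perfectly correlated pair before training; a sketch using caret's findCorrelation (the 0.95 cutoff is an illustrative choice):

```r
# Flag numeric columns whose pairwise |correlation| exceeds 0.95;
# numericalCols was defined when we selected the numeric columns above
corMatrix <- cor(dataSet[numericalCols])
highCor <- findCorrelation(corMatrix, cutoff = 0.95, names = TRUE)
highCor  # column names that are candidates to drop
```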
# Fix the random seed so the train/test split is reproducible
set.seed(7)
train_test_index <- createDataPartition(dataSet$Churn, p=0.75, list=FALSE)
training_dataset <- dataSet[, c(1:20)][train_test_index,]
testing_dataset <- dataSet[, c(1:20)][-train_test_index,]
dim(training_dataset)
dim(testing_dataset)
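createDataPartition performs a stratified split, which we can verify with a quick sanity check on the churn proportions:

```r
# The churn proportions should be nearly identical in both partitions,
# since createDataPartition samples within each class
prop.table(table(training_dataset$Churn))
prop.table(table(testing_dataset$Churn))
```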
control <- trainControl(method="repeatedcv", # repeatedcv / adaptive_cv
number=2, repeats = 2,
verbose = TRUE, search = "grid",
allowParallel = TRUE)
metric <- "Accuracy"
tuneLength <- 2
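With the control object, metric, and tuneLength defined, the kernels can be trained and compared via caret::train; a sketch for the linear and radial kernels (the fit.* names are ours, and the method keys are caret's standard ones):

```r
# Train linear- and radial-kernel SVMs with identical resampling settings
fit.svmLinear <- train(Churn ~ ., data = training_dataset,
                       method = "svmLinear", metric = metric,
                       trControl = control, tuneLength = tuneLength)
fit.svmRadial <- train(Churn ~ ., data = training_dataset,
                       method = "svmRadial", metric = metric,
                       trControl = control, tuneLength = tuneLength)

# Compare the resampled accuracy distributions of the two models
results <- resamples(list(Linear = fit.svmLinear, Radial = fit.svmRadial))
summary(results)
```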
getModelInfo("svmLinear"); getModelInfo("svmRadial"); getModelInfo("svmPoly");
parameter | class | label |
---|---|---|
<chr> | <chr> | <chr> |
tau | numeric | Regularization Parameter |
function (x, y, len = NULL, search = "grid")
{
if (search == "grid") {
out <- expand.grid(tau = 2^((1:len) - 5))
}
else {
out <- data.frame(tau = 2^runif(len, min = -5, max = 10))
}
out
}
function (x, y, wts, param, lev, last, classProbs, ...)
{
kernlab::lssvm(x = as.matrix(x), y = y, tau = param$tau,
kernel = kernlab::polydot(degree = 1, scale = 1, offset = 1),
...)
}
function (modelFit, newdata, submodels = NULL)
{
out <- kernlab::predict(modelFit, as.matrix(newdata))
if (is.matrix(out))
out <- out[, 1]
out
}
function (x, ...)
{
if (hasTerms(x) & !is.null(x@terms)) {
out <- predictors.terms(x@terms)
}
else {
out <- colnames(attr(x, "xmatrix"))
}
if (is.null(out))
out <- names(attr(x, "scaling")$x.scale$`scaled:center`)
if (is.null(out))
out <- NA
out
}
function (x)
lev(x)
function (x)
x
parameter | class | label |
---|---|---|
<chr> | <chr> | <chr> |
C | numeric | Cost |
function (x, y, len = NULL, search = "grid")
{
if (search == "grid") {
out <- data.frame(C = 1)
}
else {
out <- data.frame(C = 2^runif(len, min = -5, max = 10))
}
out
}
function (x, y, wts, param, lev, last, classProbs, ...)
{
if (any(names(list(...)) == "prob.model") | is.numeric(y)) {
out <- kernlab::ksvm(x = as.matrix(x), y = y, kernel = kernlab::vanilladot(),
C = param$C, ...)
}
else {
out <- kernlab::ksvm(x = as.matrix(x), y = y, kernel = kernlab::vanilladot(),
C = param$C, prob.model = classProbs, ...)
}
out
}
function (modelFit, newdata, submodels = NULL)
{
svmPred <- function(obj, x) {
hasPM <- !is.null(unlist(obj@prob.model))
if (hasPM) {
pred <- kernlab::lev(obj)[apply(kernlab::predict(obj,
x, type = "probabilities"), 1, which.max)]
}
else pred <- kernlab::predict(obj, x)
pred
}
out <- try(svmPred(modelFit, newdata), silent = TRUE)
if (is.character(kernlab::lev(modelFit))) {
if (class(out)[1] == "try-error") {
warning("kernlab class prediction calculations failed; returning NAs")
out <- rep("", nrow(newdata))
out[seq(along = out)] <- NA
}
}
else {
if (class(out)[1] == "try-error") {
warning("kernlab prediction calculations failed; returning NAs")
out <- rep(NA, nrow(newdata))
}
}
if (is.matrix(out))
out <- out[, 1]
out
}
function (modelFit, newdata, submodels = NULL)
{
out <- try(kernlab::predict(modelFit, newdata, type = "probabilities"),
silent = TRUE)
if (class(out)[1] != "try-error") {
if (any(out < 0)) {
out[out < 0] <- 0
out <- t(apply(out, 1, function(x) x/sum(x)))
}
out <- out[, kernlab::lev(modelFit), drop = FALSE]
}
else {
warning("kernlab class probability calculations failed; returning NAs")
out <- matrix(NA, nrow(newdata) * length(kernlab::lev(modelFit)),
ncol = length(kernlab::lev(modelFit)))
colnames(out) <- kernlab::lev(modelFit)
}
out
}
function (x, ...)
{
if (hasTerms(x) & !is.null(x@terms)) {
out <- predictors.terms(x@terms)
}
else {
out <- colnames(attr(x, "xmatrix"))
}
if (is.null(out))
out <- names(attr(x, "scaling")$x.scale$`scaled:center`)
if (is.null(out))
out <- NA
out
}
function (x)
kernlab::lev(x)
function (x)
{
x[order(x$C), ]
}
parameter | class | label |
---|---|---|
<chr> | <chr> | <chr> |
cost | numeric | Cost |
function (x, y, len = NULL, search = "grid")
{
if (search == "grid") {
out <- expand.grid(cost = 2^((1:len) - 3))
}
else {
out <- data.frame(cost = 2^runif(len, min = -5, max = 10))
}
out
}
function (x, y, wts, param, lev, last, classProbs, ...)
{
if (any(names(list(...)) == "probability") | is.numeric(y)) {
out <- e1071::svm(x = as.matrix(x), y = y, kernel = "linear",
cost = param$cost, ...)
}
else {
out <- e1071::svm(x = as.matrix(x), y = y, kernel = "linear",
cost = param$cost, probability = classProbs, ...)
}
out
}
function (modelFit, newdata, submodels = NULL)
{
predict(modelFit, newdata)
}
function (modelFit, newdata, submodels = NULL)
{
out <- predict(modelFit, newdata, probability = TRUE)
attr(out, "probabilities")
}
function (x, ...)
{
out <- if (!is.null(x$terms))
predictors.terms(x$terms)
else x$xNames
if (is.null(out))
out <- names(attr(x, "scaling")$x.scale$`scaled:center`)
if (is.null(out))
out <- NA
out
}
function (x)
x$levels
function (x)
{
x[order(x$cost), ]
}
parameter | class | label |
---|---|---|
<chr> | <chr> | <chr> |
cost | numeric | Cost |
Loss | character | Loss Function |
function (x, y, len = NULL, search = "grid")
{
if (search == "grid") {
out <- expand.grid(cost = 2^((1:len) - 3), Loss = c("L1",
"L2"))
}
else {
out <- data.frame(cost = 2^runif(len, min = -10, max = 10),
Loss = sample(c("L1", "L2"), size = len, replace = TRUE))
}
out
}
function (x, y, wts, param, lev, last, classProbs, ...)
{
if (param$Loss == "L2") {
model_type <- if (is.factor(y))
2
else 12
}
else model_type <- if (is.factor(y))
3
else 13
out <- LiblineaR::LiblineaR(data = as.matrix(x), target = y,
cost = param$cost, type = model_type, ...)
out
}
function (modelFit, newdata, submodels = NULL)
{
predict(modelFit, newdata)$predictions
}
function (x, ...)
{
out <- colnames(x$W)
out[out != "Bias"]
}
function (x)
x$levels
function (x)
{
x[order(x$cost), ]
}
parameter | class | label |
---|---|---|
<chr> | <chr> | <chr> |
cost | numeric | Cost |
weight | numeric | Class Weight |
function (x, y, len = NULL, search = "grid")
{
if (search == "grid") {
out <- expand.grid(cost = 2^((1:len) - 3), weight = 1:len)
}
else {
out <- data.frame(cost = 2^runif(len, min = -5, max = 10),
weight = runif(len, min = 1, max = 25))
}
out
}
function (x, y, wts, param, lev, last, classProbs, ...)
{
if (length(levels(y)) != 2)
stop("Currently implemented for 2-class problems")
cwts <- c(1, param$weight)
names(cwts) <- levels(y)
out <- e1071::svm(x = as.matrix(x), y = y, kernel = "linear",
cost = param$cost, probability = classProbs, class.weights = cwts,
...)
out
}
function (modelFit, newdata, submodels = NULL)
{
predict(modelFit, newdata)
}
function (modelFit, newdata, submodels = NULL)
{
out <- predict(modelFit, newdata, probability = TRUE)
attr(out, "probabilities")
}
function (x, ...)
{
out <- if (!is.null(x$terms))
predictors.terms(x$terms)
else x$xNames
if (is.null(out))
out <- names(attr(x, "scaling")$x.scale$`scaled:center`)
if (is.null(out))
out <- NA
out
}
function (x)
x$levels
function (x)
{
x[order(x$cost, x$weight), ]
}
parameter | class | label |
---|---|---|
<chr> | <chr> | <chr> |
cost | numeric | Cost |
Loss | character | Loss Function |
weight | numeric | Class Weight |
function (x, y, len = NULL, search = "grid")
{
if (search == "grid") {
out <- expand.grid(cost = 2^((1:len) - 3), Loss = c("L1",
"L2"), weight = 1:len)
}
else {
out <- data.frame(cost = 2^runif(len, min = -10, max = 10),
Loss = sample(c("L1", "L2"), size = len, replace = TRUE),
weight = runif(len, min = 1, max = 25))
}
out
}
function (x, y, wts, param, lev, last, classProbs, ...)
{
model_type <- if (param$Loss == "L2")
2
else 3
if (length(levels(y)) != 2)
stop("Currently implemented for 2-class problems")
cwts <- c(1, param$weight)
names(cwts) <- levels(y)
out <- LiblineaR::LiblineaR(data = as.matrix(x), target = y,
cost = param$cost, type = model_type, wi = cwts, ...)
out
}
function (modelFit, newdata, submodels = NULL)
{
predict(modelFit, newdata)$predictions
}
function (x, ...)
{
out <- colnames(x$W)
out[out != "Bias"]
}
function (x)
x$levels
function (x)
{
x[order(x$cost), ]
}
parameter | class | label |
---|---|---|
<chr> | <chr> | <chr> |
sigma | numeric | Sigma |
tau | numeric | Regularization Parameter |
function (x, y, len = NULL, search = "grid")
{
sigmas <- kernlab::sigest(as.matrix(x), na.action = na.omit,
scaled = TRUE)
if (search == "grid") {
out <- expand.grid(sigma = seq(min(sigmas), max(sigmas),
length = min(6, len)), tau = 2^((1:len) - 5))
}
else {
rng <- extendrange(log(sigmas), f = 0.75)
out <- data.frame(sigma = exp(runif(len, min = rng[1],
max = rng[2])), tau = 2^runif(len, min = -5, max = 10))
}
out
}
function (x, y, wts, param, lev, last, classProbs, ...)
{
kernlab::lssvm(x = as.matrix(x), y = y, tau = param$tau,
kernel = "rbfdot", kpar = list(sigma = param$sigma),
...)
}
function (modelFit, newdata, submodels = NULL)
{
out <- kernlab::predict(modelFit, as.matrix(newdata))
if (is.matrix(out))
out <- out[, 1]
out
}
function (x, ...)
{
if (hasTerms(x) & !is.null(x@terms)) {
out <- predictors.terms(x@terms)
}
else {
out <- colnames(attr(x, "xmatrix"))
}
if (is.null(out))
out <- names(attr(x, "scaling")$x.scale$`scaled:center`)
if (is.null(out))
out <- NA
out
}
function (x)
lev(x)
function (x)
x
parameter | class | label |
---|---|---|
<chr> | <chr> | <chr> |
sigma | numeric | Sigma |
C | numeric | Cost |
function (x, y, len = NULL, search = "grid")
{
sigmas <- kernlab::sigest(as.matrix(x), na.action = na.omit,
scaled = TRUE)
if (search == "grid") {
out <- expand.grid(sigma = mean(as.vector(sigmas[-2])),
C = 2^((1:len) - 3))
}
else {
rng <- extendrange(log(sigmas), f = 0.75)
out <- data.frame(sigma = exp(runif(len, min = rng[1],
max = rng[2])), C = 2^runif(len, min = -5, max = 10))
}
out
}
function (x, y, wts, param, lev, last, classProbs, ...)
{
if (any(names(list(...)) == "prob.model") | is.numeric(y)) {
out <- kernlab::ksvm(x = as.matrix(x), y = y, kernel = "rbfdot",
kpar = list(sigma = param$sigma), C = param$C, ...)
}
else {
out <- kernlab::ksvm(x = as.matrix(x), y = y, kernel = "rbfdot",
kpar = list(sigma = param$sigma), C = param$C, prob.model = classProbs,
...)
}
out
}
function (modelFit, newdata, submodels = NULL)
{
svmPred <- function(obj, x) {
hasPM <- !is.null(unlist(obj@prob.model))
if (hasPM) {
pred <- kernlab::lev(obj)[apply(kernlab::predict(obj,
x, type = "probabilities"), 1, which.max)]
}
else pred <- kernlab::predict(obj, x)
pred
}
out <- try(svmPred(modelFit, newdata), silent = TRUE)
if (is.character(kernlab::lev(modelFit))) {
if (class(out)[1] == "try-error") {
warning("kernlab class prediction calculations failed; returning NAs")
out <- rep("", nrow(newdata))
out[seq(along = out)] <- NA
}
}
else {
if (class(out)[1] == "try-error") {
warning("kernlab prediction calculations failed; returning NAs")
out <- rep(NA, nrow(newdata))
}
}
if (is.matrix(out))
out <- out[, 1]
out
}
function (modelFit, newdata, submodels = NULL)
{
out <- try(kernlab::predict(modelFit, newdata, type = "probabilities"),
silent = TRUE)
if (class(out)[1] != "try-error") {
if (any(out < 0)) {
out[out < 0] <- 0
out <- t(apply(out, 1, function(x) x/sum(x)))
}
out <- out[, kernlab::lev(modelFit), drop = FALSE]
}
else {
warning("kernlab class probability calculations failed; returning NAs")
out <- matrix(NA, nrow(newdata) * length(kernlab::lev(modelFit)),
ncol = length(kernlab::lev(modelFit)))
colnames(out) <- kernlab::lev(modelFit)
}
out
}
function (x, ...)
{
if (hasTerms(x) & !is.null(x@terms)) {
out <- predictors.terms(x@terms)
}
else {
out <- colnames(attr(x, "xmatrix"))
}
if (is.null(out))
out <- names(attr(x, "scaling")$x.scale$`scaled:center`)
if (is.null(out))
out <- NA
out
}
function (x)
kernlab::lev(x)
function (x)
{
x[order(x$C, -x$sigma), ]
}
parameter | class | label |
---|---|---|
C | numeric | Cost |
function (x, y, len = NULL, search = "grid")
{
if (search == "grid") {
out <- data.frame(C = 2^((1:len) - 3))
}
else {
out <- data.frame(C = 2^runif(len, min = -5, max = 10))
}
out
}
function (x, y, wts, param, lev, last, classProbs, ...)
{
if (any(names(list(...)) == "prob.model") | is.numeric(y)) {
out <- kernlab::ksvm(x = as.matrix(x), y = y, kernel = "rbfdot",
C = param$C, ...)
}
else {
out <- kernlab::ksvm(x = as.matrix(x), y = y, kernel = "rbfdot",
C = param$C, prob.model = classProbs, ...)
}
out
}
function (modelFit, newdata, submodels = NULL)
{
svmPred <- function(obj, x) {
hasPM <- !is.null(unlist(obj@prob.model))
if (hasPM) {
pred <- kernlab::lev(obj)[apply(kernlab::predict(obj,
x, type = "probabilities"), 1, which.max)]
}
else pred <- kernlab::predict(obj, x)
pred
}
out <- try(svmPred(modelFit, newdata), silent = TRUE)
if (is.character(kernlab::lev(modelFit))) {
if (class(out)[1] == "try-error") {
warning("kernlab class prediction calculations failed; returning NAs")
out <- rep("", nrow(newdata))
out[seq(along = out)] <- NA
}
}
else {
if (class(out)[1] == "try-error") {
warning("kernlab prediction calculations failed; returning NAs")
out <- rep(NA, nrow(newdata))
}
}
if (is.matrix(out))
out <- out[, 1]
out
}
function (modelFit, newdata, submodels = NULL)
{
out <- try(kernlab::predict(modelFit, newdata, type = "probabilities"),
silent = TRUE)
if (class(out)[1] != "try-error") {
if (any(out < 0)) {
out[out < 0] <- 0
out <- t(apply(out, 1, function(x) x/sum(x)))
}
out <- out[, kernlab::lev(modelFit), drop = FALSE]
}
else {
warning("kernlab class probability calculations failed; returning NAs")
out <- matrix(NA, nrow(newdata) * length(kernlab::lev(modelFit)),
ncol = length(kernlab::lev(modelFit)))
colnames(out) <- kernlab::lev(modelFit)
}
out
}
function (x, ...)
{
if (hasTerms(x) & !is.null(x@terms)) {
out <- predictors.terms(x@terms)
}
else {
out <- colnames(attr(x, "xmatrix"))
}
if (is.null(out))
out <- names(attr(x, "scaling")$x.scale$`scaled:center`)
if (is.null(out))
out <- NA
out
}
function (x)
kernlab::lev(x)
function (x)
{
x[order(x$C), ]
}
parameter | class | label |
---|---|---|
sigma | numeric | Sigma |
C | numeric | Cost |
function (x, y, len = NULL, search = "grid")
{
sigmas <- kernlab::sigest(as.matrix(x), na.action = na.omit,
scaled = TRUE)
if (search == "grid") {
out <- expand.grid(sigma = seq(min(sigmas), max(sigmas),
length = min(6, len)), C = 2^((1:len) - 3))
}
else {
rng <- extendrange(log(sigmas), f = 0.75)
out <- data.frame(sigma = exp(runif(len, min = rng[1],
max = rng[2])), C = 2^runif(len, min = -5, max = 10))
}
out
}
function (x, y, wts, param, lev, last, classProbs, ...)
{
if (any(names(list(...)) == "prob.model") | is.numeric(y)) {
out <- kernlab::ksvm(x = as.matrix(x), y = y, kernel = "rbfdot",
kpar = list(sigma = param$sigma), C = param$C, ...)
}
else {
out <- kernlab::ksvm(x = as.matrix(x), y = y, kernel = "rbfdot",
kpar = list(sigma = param$sigma), C = param$C, prob.model = classProbs,
...)
}
out
}
function (modelFit, newdata, submodels = NULL)
{
svmPred <- function(obj, x) {
hasPM <- !is.null(unlist(obj@prob.model))
if (hasPM) {
pred <- kernlab::lev(obj)[apply(kernlab::predict(obj,
x, type = "probabilities"), 1, which.max)]
}
else pred <- kernlab::predict(obj, x)
pred
}
out <- try(svmPred(modelFit, newdata), silent = TRUE)
if (is.character(kernlab::lev(modelFit))) {
if (class(out)[1] == "try-error") {
warning("kernlab class prediction calculations failed; returning NAs")
out <- rep("", nrow(newdata))
out[seq(along = out)] <- NA
}
}
else {
if (class(out)[1] == "try-error") {
warning("kernlab prediction calculations failed; returning NAs")
out <- rep(NA, nrow(newdata))
}
}
if (is.matrix(out))
out <- out[, 1]
out
}
function (modelFit, newdata, submodels = NULL)
{
out <- try(kernlab::predict(modelFit, newdata, type = "probabilities"),
silent = TRUE)
if (class(out)[1] != "try-error") {
if (any(out < 0)) {
out[out < 0] <- 0
out <- t(apply(out, 1, function(x) x/sum(x)))
}
out <- out[, kernlab::lev(modelFit), drop = FALSE]
}
else {
warning("kernlab class probability calculations failed; returning NAs")
out <- matrix(NA, nrow(newdata) * length(kernlab::lev(modelFit)),
ncol = length(kernlab::lev(modelFit)))
colnames(out) <- kernlab::lev(modelFit)
}
out
}
function (x, ...)
{
if (hasTerms(x) & !is.null(x@terms)) {
out <- predictors.terms(x@terms)
}
else {
out <- colnames(attr(x, "xmatrix"))
}
if (is.null(out))
out <- names(attr(x, "scaling")$x.scale$`scaled:center`)
if (is.null(out))
out <- NA
out
}
function (x)
kernlab::lev(x)
function (x)
{
x[order(x$C, -x$sigma), ]
}
parameter | class | label |
---|---|---|
sigma | numeric | Sigma |
C | numeric | Cost |
Weight | numeric | Weight |
function (x, y, len = NULL, search = "grid")
{
sigmas <- kernlab::sigest(as.matrix(x), na.action = na.omit,
scaled = TRUE)
if (search == "grid") {
out <- expand.grid(sigma = mean(as.vector(sigmas[-2])),
C = 2^((1:len) - 3), Weight = 1:len)
}
else {
rng <- extendrange(log(sigmas), f = 0.75)
out <- data.frame(sigma = exp(runif(len, min = rng[1],
max = rng[2])), C = 2^runif(len, min = -5, max = 10),
Weight = runif(len, min = 1, max = 25))
}
out
}
function (x, y, wts, param, lev, last, classProbs, ...)
{
if (param$Weight != 1) {
wts <- c(param$Weight, 1)
names(wts) <- levels(y)
}
else wts <- NULL
if (any(names(list(...)) == "prob.model") | is.numeric(y)) {
out <- kernlab::ksvm(x = as.matrix(x), y = y, kernel = "rbfdot",
kpar = list(sigma = param$sigma), class.weights = wts,
C = param$C, ...)
}
else {
out <- kernlab::ksvm(x = as.matrix(x), y = y, kernel = "rbfdot",
kpar = list(sigma = param$sigma), class.weights = wts,
C = param$C, prob.model = classProbs, ...)
}
out
}
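The fit function above maps caret's `Weight` tuning parameter onto kernlab's `class.weights` argument. The same idea can be used with `ksvm()` directly; a sketch with an illustrative weight of 5 on the churn class (assumes `training_dataset` with a two-level `Churn` factor):

```r
library(kernlab)

# Penalise errors on the 'yes' (churn) class five times as heavily as
# on 'no'; the weight of 5 is illustrative, and the names must match
# the levels of the response factor.
wts <- c(no = 1, yes = 5)

fit_weighted <- ksvm(Churn ~ ., data = training_dataset,
                     kernel = "rbfdot", C = 1,
                     class.weights = wts)
```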
function (modelFit, newdata, submodels = NULL)
{
out <- kernlab::predict(modelFit, newdata)
if (is.matrix(out))
out <- out[, 1]
out
}
function (modelFit, newdata, submodels = NULL)
{
out <- try(kernlab::predict(modelFit, newdata, type = "probabilities"),
silent = TRUE)
if (class(out)[1] != "try-error") {
if (any(out < 0)) {
out[out < 0] <- 0
out <- t(apply(out, 1, function(x) x/sum(x)))
}
out <- out[, kernlab::lev(modelFit), drop = FALSE]
}
else {
warning("kernlab class probability calculations failed; returning NAs")
out <- matrix(NA, nrow(newdata) * length(kernlab::lev(modelFit)),
ncol = length(kernlab::lev(modelFit)))
colnames(out) <- kernlab::lev(modelFit)
}
out
}
function (x, ...)
{
if (hasTerms(x) & !is.null(x@terms)) {
out <- predictors.terms(x@terms)
}
else {
out <- colnames(attr(x, "xmatrix"))
}
if (is.null(out))
out <- names(attr(x, "scaling")$x.scale$`scaled:center`)
if (is.null(out))
out <- NA
out
}
function (x)
kernlab::lev(x)
function (x)
x[order(x$C, -x$sigma, x$Weight), ]
parameter | class | label |
---|---|---|
degree | numeric | Polynomial Degree |
scale | numeric | Scale |
tau | numeric | Regularization Parameter |
function (x, y, len = NULL, search = "grid")
{
if (search == "grid") {
out <- expand.grid(degree = seq(1, min(len, 3)), scale = 10^((1:len) -
4), tau = 2^((1:len) - 5))
}
else {
out <- data.frame(degree = sample(1:3, size = len, replace = TRUE),
scale = 10^runif(len, min = -5, log10(2)), tau = 2^runif(len,
min = -5, max = 10))
}
out
}
function (x, y, wts, param, lev, last, classProbs, ...)
{
kernlab::lssvm(x = as.matrix(x), y = y, tau = param$tau,
kernel = kernlab::polydot(degree = param$degree, scale = param$scale,
offset = 1), ...)
}
function (modelFit, newdata, submodels = NULL)
{
out <- kernlab::predict(modelFit, as.matrix(newdata))
if (is.matrix(out))
out <- out[, 1]
out
}
function (x, ...)
{
if (hasTerms(x) & !is.null(x@terms)) {
out <- predictors.terms(x@terms)
}
else {
out <- colnames(attr(x, "xmatrix"))
}
if (is.null(out))
out <- names(attr(x, "scaling")$xscale$`scaled:center`)
if (is.null(out))
out <- NA
out
}
function (x)
lev(x)
function (x)
x
parameter | class | label |
---|---|---|
degree | numeric | Polynomial Degree |
scale | numeric | Scale |
C | numeric | Cost |
function (x, y, len = NULL, search = "grid")
{
if (search == "grid") {
out <- expand.grid(degree = seq(1, min(len, 3)), scale = 10^((1:len) -
4), C = 2^((1:len) - 3))
}
else {
out <- data.frame(degree = sample(1:3, size = len, replace = TRUE),
scale = 10^runif(len, min = -5, log10(2)), C = 2^runif(len,
min = -5, max = 10))
}
out
}
function (x, y, wts, param, lev, last, classProbs, ...)
{
if (any(names(list(...)) == "prob.model") | is.numeric(y)) {
out <- kernlab::ksvm(x = as.matrix(x), y = y, kernel = kernlab::polydot(degree = param$degree,
scale = param$scale, offset = 1), C = param$C, ...)
}
else {
out <- kernlab::ksvm(x = as.matrix(x), y = y, kernel = kernlab::polydot(degree = param$degree,
scale = param$scale, offset = 1), C = param$C, prob.model = classProbs,
...)
}
out
}
function (modelFit, newdata, submodels = NULL)
{
svmPred <- function(obj, x) {
hasPM <- !is.null(unlist(obj@prob.model))
if (hasPM) {
pred <- kernlab::lev(obj)[apply(kernlab::predict(obj,
x, type = "probabilities"), 1, which.max)]
}
else pred <- kernlab::predict(obj, x)
pred
}
out <- try(svmPred(modelFit, newdata), silent = TRUE)
if (is.character(kernlab::lev(modelFit))) {
if (class(out)[1] == "try-error") {
warning("kernlab class prediction calculations failed; returning NAs")
out <- rep("", nrow(newdata))
out[seq(along = out)] <- NA
}
}
else {
if (class(out)[1] == "try-error") {
warning("kernlab prediction calculations failed; returning NAs")
out <- rep(NA, nrow(newdata))
}
}
if (is.matrix(out))
out <- out[, 1]
out
}
function (modelFit, newdata, submodels = NULL)
{
out <- try(kernlab::predict(modelFit, newdata, type = "probabilities"),
silent = TRUE)
if (class(out)[1] != "try-error") {
if (any(out < 0)) {
out[out < 0] <- 0
out <- t(apply(out, 1, function(x) x/sum(x)))
}
out <- out[, kernlab::lev(modelFit), drop = FALSE]
}
else {
warning("kernlab class probability calculations failed; returning NAs")
out <- matrix(NA, nrow(newdata) * length(kernlab::lev(modelFit)),
ncol = length(kernlab::lev(modelFit)))
colnames(out) <- kernlab::lev(modelFit)
}
out
}
function (x, ...)
{
if (hasTerms(x) & !is.null(x@terms)) {
out <- predictors.terms(x@terms)
}
else {
out <- colnames(attr(x, "xmatrix"))
}
if (is.null(out))
out <- names(attr(x, "scaling")$xscale$`scaled:center`)
if (is.null(out))
out <- NA
out
}
function (x)
kernlab::lev(x)
function (x)
x[order(x$degree, x$C, x$scale), ]
names(getModelInfo("svm"))
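Each element returned by `getModelInfo()` is a list holding the components dumped above (tuning parameters plus the grid, fit, predict, prob and sort functions). A short sketch of how to inspect one of them:

```r
library(caret)

# All caret methods whose name matches "svm"
svm_models <- getModelInfo("svm")
names(svm_models)

# The pieces of a single method: tunable parameters and the
# grid/fit/predict/prob functions shown earlier.
names(svm_models$svmRadial)
svm_models$svmRadial$parameters
```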
# svmLinear
fit.svmLinear <- caret::train(Churn~., data=training_dataset, method="svmLinear",
metric=metric,
trControl=control,
verbose = TRUE
)
print(fit.svmLinear)
Aggregating results
Fitting final model on full training set

Support Vector Machines with Linear Kernel 

2501 samples
  19 predictor
   2 classes: 'no', 'yes' 

No pre-processing
Resampling: Cross-Validated (2 fold, repeated 2 times) 
Summary of sample sizes: 1250, 1251, 1251, 1250 
Resampling results:

  Accuracy   Kappa
  0.8548582  0    

Tuning parameter 'C' was held constant at a value of 1
# svmRadial
fit.svmRadial <- caret::train(Churn~., data=training_dataset, method="svmRadial",
metric=metric,
trControl=control,
verbose = TRUE
)
print(fit.svmRadial)
Aggregating results
Selecting tuning parameters
Fitting sigma = 0.0331, C = 1 on full training set

Support Vector Machines with Radial Basis Function Kernel 

2501 samples
  19 predictor
   2 classes: 'no', 'yes' 

No pre-processing
Resampling: Cross-Validated (2 fold, repeated 2 times) 
Summary of sample sizes: 1251, 1250, 1250, 1251 
Resampling results across tuning parameters:

  C     Accuracy   Kappa    
  0.25  0.8558574  0.0116202
  0.50  0.8756486  0.2422584
  1.00  0.8956422  0.4350147

Tuning parameter 'sigma' was held constant at a value of 0.03310822
Accuracy was used to select the optimal model using the largest value.
The final values used for the model were sigma = 0.03310822 and C = 1.
# svmPoly
fit.svmPoly <- caret::train(Churn~., data=training_dataset, method="svmPoly",
metric=metric,
trControl=control,
verbose = TRUE
)
print(fit.svmPoly)
Aggregating results
Selecting tuning parameters
Fitting degree = 3, scale = 0.1, C = 0.25 on full training set

Support Vector Machines with Polynomial Kernel 

2501 samples
  19 predictor
   2 classes: 'no', 'yes' 

No pre-processing
Resampling: Cross-Validated (2 fold, repeated 2 times) 
Summary of sample sizes: 1251, 1250, 1250, 1251 
Resampling results across tuning parameters:

  degree  scale  C     Accuracy   Kappa      
  1       0.001  0.25  0.8548582  0.000000000
  1       0.001  0.50  0.8548582  0.000000000
  1       0.001  1.00  0.8548582  0.000000000
  1       0.010  0.25  0.8548582  0.000000000
  1       0.010  0.50  0.8548582  0.000000000
  1       0.010  1.00  0.8548582  0.000000000
  1       0.100  0.25  0.8548582  0.000000000
  1       0.100  0.50  0.8548582  0.000000000
  1       0.100  1.00  0.8548582  0.000000000
  2       0.001  0.25  0.8548582  0.000000000
  2       0.001  0.50  0.8548582  0.000000000
  2       0.001  1.00  0.8548582  0.000000000
  2       0.010  0.25  0.8548582  0.000000000
  2       0.010  0.50  0.8548582  0.000000000
  2       0.010  1.00  0.8574571  0.030040180
  2       0.100  0.25  0.9058395  0.548116063
  2       0.100  0.50  0.9054406  0.565423514
  2       0.100  1.00  0.9024419  0.559400967
  3       0.001  0.25  0.8548582  0.000000000
  3       0.001  0.50  0.8548582  0.000000000
  3       0.001  1.00  0.8548582  0.000000000
  3       0.010  0.25  0.8554580  0.007026524
  3       0.010  0.50  0.8636552  0.105467971
  3       0.010  1.00  0.8794483  0.283427083
  3       0.100  0.25  0.9090373  0.598891399
  3       0.100  0.50  0.9032388  0.587678103
  3       0.100  1.00  0.8926424  0.555832982

Accuracy was used to select the optimal model using the largest value.
The final values used for the model were degree = 3, scale = 0.1 and C = 0.25.
# svmLinear
fit.svmLinear_preProc <- caret::train(Churn~., data=training_dataset, method="svmLinear",
metric=metric,
trControl=control,
preProc=c("center", "scale", "pca"),
verbose = TRUE
)
print(fit.svmLinear_preProc)
Aggregating results
Fitting final model on full training set

Support Vector Machines with Linear Kernel 

2501 samples
  19 predictor
   2 classes: 'no', 'yes' 

Pre-processing: centered (19), scaled (19), principal component signal extraction (19) 
Resampling: Cross-Validated (2 fold, repeated 2 times) 
Summary of sample sizes: 1251, 1250, 1250, 1251 
Resampling results:

  Accuracy   Kappa
  0.8548582  0    

Tuning parameter 'C' was held constant at a value of 1
# svmRadial
fit.svmRadial_preProc <- caret::train(Churn~., data=training_dataset, method="svmRadial",
metric=metric,
trControl=control,
preProc=c("center", "scale", "pca"),
verbose = TRUE
)
print(fit.svmRadial_preProc)
Aggregating results
Selecting tuning parameters
Fitting sigma = 0.0491, C = 1 on full training set

Support Vector Machines with Radial Basis Function Kernel 

2501 samples
  19 predictor
   2 classes: 'no', 'yes' 

Pre-processing: centered (19), scaled (19), principal component signal extraction (19) 
Resampling: Cross-Validated (2 fold, repeated 2 times) 
Summary of sample sizes: 1251, 1250, 1250, 1251 
Resampling results across tuning parameters:

  C     Accuracy   Kappa    
  0.25  0.8548582  0.0000000
  0.50  0.8642526  0.1058326
  1.00  0.8900433  0.3778059

Tuning parameter 'sigma' was held constant at a value of 0.0490987
Accuracy was used to select the optimal model using the largest value.
The final values used for the model were sigma = 0.0490987 and C = 1.
# svmPoly
fit.svmPoly_preProc <- caret::train(Churn~., data=training_dataset, method="svmPoly",
metric=metric,
trControl=control,
preProc=c("center", "scale", "pca"),
verbose = TRUE
)
print(fit.svmPoly_preProc)
Aggregating results
Selecting tuning parameters
Fitting degree = 3, scale = 0.1, C = 0.25 on full training set

Support Vector Machines with Polynomial Kernel 

2501 samples
  19 predictor
   2 classes: 'no', 'yes' 

Pre-processing: centered (19), scaled (19), principal component signal extraction (19) 
Resampling: Cross-Validated (2 fold, repeated 2 times) 
Summary of sample sizes: 1251, 1250, 1251, 1250 
Resampling results across tuning parameters:

  degree  scale  C     Accuracy   Kappa     
  1       0.001  0.25  0.8548582  0.00000000
  1       0.001  0.50  0.8548582  0.00000000
  1       0.001  1.00  0.8548582  0.00000000
  1       0.010  0.25  0.8548582  0.00000000
  1       0.010  0.50  0.8548582  0.00000000
  1       0.010  1.00  0.8548582  0.00000000
  1       0.100  0.25  0.8548582  0.00000000
  1       0.100  0.50  0.8548582  0.00000000
  1       0.100  1.00  0.8548582  0.00000000
  2       0.001  0.25  0.8548582  0.00000000
  2       0.001  0.50  0.8548582  0.00000000
  2       0.001  1.00  0.8548582  0.00000000
  2       0.010  0.25  0.8548582  0.00000000
  2       0.010  0.50  0.8548582  0.00000000
  2       0.010  1.00  0.8548582  0.00000000
  2       0.100  0.25  0.8926459  0.42965743
  2       0.100  0.50  0.8984433  0.50078337
  2       0.100  1.00  0.8992428  0.52311505
  3       0.001  0.25  0.8548582  0.00000000
  3       0.001  0.50  0.8548582  0.00000000
  3       0.001  1.00  0.8548582  0.00000000
  3       0.010  0.25  0.8548582  0.00000000
  3       0.010  0.50  0.8548582  0.00000000
  3       0.010  1.00  0.8574576  0.03020616
  3       0.100  0.25  0.9032409  0.53728468
  3       0.100  0.50  0.9016409  0.55131624
  3       0.100  1.00  0.8960430  0.54436065

Accuracy was used to select the optimal model using the largest value.
The final values used for the model were degree = 3, scale = 0.1 and C = 0.25.
# svmLinear
fit.svmLinear_automaticGrid <- caret::train(Churn~., data=training_dataset, method="svmLinear",
metric=metric,
trControl=control,
preProc=c("center", "scale", "pca"),
tuneLength = tuneLength,
verbose = TRUE
)
print(fit.svmLinear_automaticGrid)
Aggregating results
Fitting final model on full training set

Support Vector Machines with Linear Kernel 

2501 samples
  19 predictor
   2 classes: 'no', 'yes' 

Pre-processing: centered (19), scaled (19), principal component signal extraction (19) 
Resampling: Cross-Validated (2 fold, repeated 2 times) 
Summary of sample sizes: 1250, 1251, 1250, 1251 
Resampling results:

  Accuracy   Kappa
  0.8548582  0    

Tuning parameter 'C' was held constant at a value of 1
# svmRadial
fit.svmRadial_automaticGrid <- caret::train(Churn~., data=training_dataset, method="svmRadial",
metric=metric,
trControl=control,
preProc=c("center", "scale", "pca"),
tuneLength = tuneLength,
verbose = TRUE
)
print(fit.svmRadial_automaticGrid)
Aggregating results
Selecting tuning parameters
Fitting sigma = 0.0499, C = 0.5 on full training set

Support Vector Machines with Radial Basis Function Kernel 

2501 samples
  19 predictor
   2 classes: 'no', 'yes' 

Pre-processing: centered (19), scaled (19), principal component signal extraction (19) 
Resampling: Cross-Validated (2 fold, repeated 2 times) 
Summary of sample sizes: 1251, 1250, 1251, 1250 
Resampling results across tuning parameters:

  C     Accuracy   Kappa    
  0.25  0.8548582  0.0000000
  0.50  0.8636534  0.1035911

Tuning parameter 'sigma' was held constant at a value of 0.04993854
Accuracy was used to select the optimal model using the largest value.
The final values used for the model were sigma = 0.04993854 and C = 0.5.
# svmPoly
fit.svmPoly_automaticGrid <- caret::train(Churn~., data=training_dataset, method="svmPoly",
metric=metric,
trControl=control,
preProc=c("center", "scale", "pca"),
tuneLength = tuneLength,
verbose = TRUE
)
print(fit.svmPoly_automaticGrid)
Aggregating results
Selecting tuning parameters
Fitting degree = 1, scale = 0.001, C = 0.25 on full training set

Support Vector Machines with Polynomial Kernel 

2501 samples
  19 predictor
   2 classes: 'no', 'yes' 

Pre-processing: centered (19), scaled (19), principal component signal extraction (19) 
Resampling: Cross-Validated (2 fold, repeated 2 times) 
Summary of sample sizes: 1250, 1251, 1250, 1251 
Resampling results across tuning parameters:

  degree  scale  C     Accuracy   Kappa
  1       0.001  0.25  0.8548582  0    
  1       0.001  0.50  0.8548582  0    
  1       0.010  0.25  0.8548582  0    
  1       0.010  0.50  0.8548582  0    
  2       0.001  0.25  0.8548582  0    
  2       0.001  0.50  0.8548582  0    
  2       0.010  0.25  0.8548582  0    
  2       0.010  0.50  0.8548582  0    

Accuracy was used to select the optimal model using the largest value.
The final values used for the model were degree = 1, scale = 0.001 and C = 0.25.
The tuning grid needs to be parameterised manually for each algorithm, since each method tunes a different set of hyperparameters.
# svmLinear
grid <- expand.grid(C=c(seq(from = 1, to = 5, by = 0.5)))
fit.svmLinear_manualGrid <- caret::train(Churn~., data=training_dataset, method="svmLinear",
metric=metric,
trControl=control,
preProc=c("center", "scale", "pca"),
tuneGrid = grid,
verbose = TRUE
)
print(fit.svmLinear_manualGrid)
plot(fit.svmLinear_manualGrid)
Aggregating results
Selecting tuning parameters
Fitting C = 1 on full training set

Support Vector Machines with Linear Kernel 

2501 samples
  19 predictor
   2 classes: 'no', 'yes' 

Pre-processing: centered (19), scaled (19), principal component signal extraction (19) 
Resampling: Cross-Validated (2 fold, repeated 2 times) 
Summary of sample sizes: 1250, 1251, 1250, 1251 
Resampling results across tuning parameters:

  C    Accuracy   Kappa
  1.0  0.8548582  0    
  1.5  0.8548582  0    
  2.0  0.8548582  0    
  2.5  0.8548582  0    
  3.0  0.8548582  0    
  3.5  0.8548582  0    
  4.0  0.8548582  0    
  4.5  0.8548582  0    
  5.0  0.8548582  0    

Accuracy was used to select the optimal model using the largest value.
The final value used for the model was C = 1.
# svmRadial
grid <- expand.grid(C = c(seq(from = 1, to = 5, by = 0.5)),
sigma = c(seq(from = 0.1, to = 1, by = 0.1))
)
fit.svmRadial_manualGrid <- caret::train(Churn~., data=training_dataset, method="svmRadial",
metric=metric,
trControl=control,
preProc=c("center", "scale", "pca"),
tuneGrid = grid,
verbose = TRUE
)
print(fit.svmRadial_manualGrid)
plot(fit.svmRadial_manualGrid)
Aggregating results
Selecting tuning parameters
Fitting sigma = 0.1, C = 3 on full training set

Support Vector Machines with Radial Basis Function Kernel 

2501 samples
  19 predictor
   2 classes: 'no', 'yes' 

Pre-processing: centered (19), scaled (19), principal component signal extraction (19) 
Resampling: Cross-Validated (2 fold, repeated 2 times) 
Summary of sample sizes: 1251, 1250, 1251, 1250 
Resampling results across tuning parameters:

  C    sigma  Accuracy   Kappa      
  1.0  0.1    0.8910436  0.395101052
  1.0  0.2    0.8770491  0.263986504
  1.0  0.3    0.8616558  0.082270494
  1.0  0.4    0.8560579  0.014053048
  1.0  0.5    0.8550582  0.002353195
  1.0  0.6    0.8548582  0.000000000
  1.0  0.7    0.8548582  0.000000000
  1.0  0.8    0.8548582  0.000000000
  1.0  0.9    0.8548582  0.000000000
  1.0  1.0    0.8548582  0.000000000
  1.5  0.1    0.8994398  0.479467316
  1.5  0.2    0.8900443  0.398313543
  1.5  0.3    0.8710515  0.207242730
  1.5  0.4    0.8614560  0.085221306
  1.5  0.5    0.8558579  0.017438650
  1.5  0.6    0.8550582  0.002353195
  1.5  0.7    0.8548582  0.000000000
  1.5  0.8    0.8548582  0.000000000
  1.5  0.9    0.8548582  0.000000000
  1.5  1.0    0.8548582  0.000000000
  2.0  0.1    0.9000395  0.504361056
  2.0  0.2    0.8922428  0.427059231
  2.0  0.3    0.8718508  0.224707540
  2.0  0.4    0.8614560  0.095038285
  2.0  0.5    0.8556580  0.018936195
  2.0  0.6    0.8544584  0.001146119
  2.0  0.7    0.8548584  0.001955066
  2.0  0.8    0.8548582  0.000000000
  2.0  0.9    0.8548582  0.000000000
  2.0  1.0    0.8548582  0.000000000
  2.5  0.1    0.9020390  0.522895363
  2.5  0.2    0.8914432  0.424427823
  2.5  0.3    0.8720508  0.228016585
  2.5  0.4    0.8614561  0.100069704
  2.5  0.5    0.8556580  0.018936195
  2.5  0.6    0.8544584  0.001146119
  2.5  0.7    0.8548584  0.001955066
  2.5  0.8    0.8548582  0.000000000
  2.5  0.9    0.8548582  0.000000000
  2.5  1.0    0.8548582  0.000000000
  3.0  0.1    0.9034380  0.536488374
  3.0  0.2    0.8902432  0.420058273
  3.0  0.3    0.8716510  0.228112624
  3.0  0.4    0.8612563  0.099602705
  3.0  0.5    0.8556580  0.018936195
  3.0  0.6    0.8544584  0.001146119
  3.0  0.7    0.8548584  0.001955066
  3.0  0.8    0.8548582  0.000000000
  3.0  0.9    0.8548582  0.000000000
  3.0  1.0    0.8548582  0.000000000
  3.5  0.1    0.9014385  0.532802322
  3.5  0.2    0.8898435  0.420461089
  3.5  0.3    0.8710513  0.227597045
  3.5  0.4    0.8612563  0.099602705
  3.5  0.5    0.8556580  0.018936195
  3.5  0.6    0.8544584  0.001146119
  3.5  0.7    0.8548584  0.001955066
  3.5  0.8    0.8548582  0.000000000
  3.5  0.9    0.8548582  0.000000000
  3.5  1.0    0.8548582  0.000000000
  4.0  0.1    0.9002382  0.534163603
  4.0  0.2    0.8878440  0.414978448
  4.0  0.3    0.8710510  0.230432537
  4.0  0.4    0.8610563  0.099096612
  4.0  0.5    0.8556580  0.018936195
  4.0  0.6    0.8544584  0.001146119
  4.0  0.7    0.8548584  0.001955066
  4.0  0.8    0.8548582  0.000000000
  4.0  0.9    0.8548582  0.000000000
  4.0  1.0    0.8548582  0.000000000
  4.5  0.1    0.9004384  0.539666102
  4.5  0.2    0.8872443  0.413919157
  4.5  0.3    0.8712508  0.232350709
  4.5  0.4    0.8610563  0.099096612
  4.5  0.5    0.8556580  0.018936195
  4.5  0.6    0.8544584  0.001146119
  4.5  0.7    0.8548584  0.001955066
  4.5  0.8    0.8548582  0.000000000
  4.5  0.9    0.8548582  0.000000000
  4.5  1.0    0.8548582  0.000000000
  5.0  0.1    0.9000390  0.539396659
  5.0  0.2    0.8878440  0.418393524
  5.0  0.3    0.8710508  0.230455586
  5.0  0.4    0.8610563  0.099096612
  5.0  0.5    0.8556580  0.018936195
  5.0  0.6    0.8544584  0.001146119
  5.0  0.7    0.8548584  0.001955066
  5.0  0.8    0.8548582  0.000000000
  5.0  0.9    0.8548582  0.000000000
  5.0  1.0    0.8548582  0.000000000

Accuracy was used to select the optimal model using the largest value.
The final values used for the model were sigma = 0.1 and C = 3.
# svmPoly
grid <- expand.grid(C = c(seq(from = 1, to = 5, by = 0.25)),
scale = c(seq(from = 0.001, to = 0.010, by = 0.001)),
degree = c(seq(from = 1, to = 10, by = 1))
)
fit.svmPoly_manualGrid <- caret::train(Churn~., data=training_dataset, method="svmPoly",
metric=metric,
trControl=control,
preProc=c("center", "scale", "pca"),
tuneGrid = grid,
verbose = TRUE
)
print(fit.svmPoly_manualGrid)
plot(fit.svmPoly_manualGrid)
Aggregating results
Selecting tuning parameters
Fitting degree = 10, scale = 0.009, C = 4.25 on full training set

Support Vector Machines with Polynomial Kernel 

2501 samples
  19 predictor
   2 classes: 'no', 'yes' 

Pre-processing: centered (19), scaled (19), principal component signal extraction (19) 
Resampling: Cross-Validated (2 fold, repeated 2 times) 
Summary of sample sizes: 1250, 1251, 1250, 1251 
Resampling results across tuning parameters:

  C     scale  degree  Accuracy   Kappa      
  1.00  0.001  1       0.8548582  0.000000000
  1.00  0.001  2       0.8548582  0.000000000
  1.00  0.001  3       0.8548582  0.000000000
[tuning table for the remaining 1,700 C/scale/degree combinations truncated in the source output; the selected model used degree = 10, scale = 0.009 and C = 4.25]
1.75 0.003 9 0.8654548 0.142718213 1.75 0.003 10 0.8704531 0.204910015 1.75 0.004 1 0.8548582 0.000000000 1.75 0.004 2 0.8548582 0.000000000 1.75 0.004 3 0.8548582 0.000000000 1.75 0.004 4 0.8548582 0.000000000 1.75 0.004 5 0.8562577 0.018226050 1.75 0.004 6 0.8610560 0.074263174 1.75 0.004 7 0.8664545 0.153910147 1.75 0.004 8 0.8730518 0.233888504 1.75 0.004 9 0.8764497 0.276948155 1.75 0.004 10 0.8832476 0.343624405 1.75 0.005 1 0.8548582 0.000000000 1.75 0.005 2 0.8548582 0.000000000 1.75 0.005 3 0.8548582 0.000000000 1.75 0.005 4 0.8562579 0.016315702 1.75 0.005 5 0.8620555 0.086915881 1.75 0.005 6 0.8686539 0.181387810 1.75 0.005 7 0.8752504 0.261055749 1.75 0.005 8 0.8824476 0.336116544 1.75 0.005 9 0.8860464 0.371566492 1.75 0.005 10 0.8900449 0.411523465 1.75 0.006 1 0.8548582 0.000000000 1.75 0.006 2 0.8548582 0.000000000 1.75 0.006 3 0.8548582 0.000000000 1.75 0.006 4 0.8594568 0.056479554 1.75 0.006 5 0.8676539 0.170523847 1.75 0.006 6 0.8758499 0.266577755 1.75 0.006 7 0.8840470 0.352541063 1.75 0.006 8 0.8886456 0.394563349 1.75 0.006 9 0.8898443 0.419171445 1.75 0.006 10 0.8904440 0.435542552 1.75 0.007 1 0.8548582 0.000000000 1.75 0.007 2 0.8548582 0.000000000 1.75 0.007 3 0.8560580 0.013995319 1.75 0.007 4 0.8642547 0.122912153 1.75 0.007 5 0.8740510 0.248559119 1.75 0.007 6 0.8840470 0.350341789 1.75 0.007 7 0.8884456 0.393949621 1.75 0.007 8 0.8898440 0.422695661 1.75 0.007 9 0.8922430 0.448455529 1.75 0.007 10 0.8950428 0.470429687 1.75 0.008 1 0.8548582 0.000000000 1.75 0.008 2 0.8548582 0.000000000 1.75 0.008 3 0.8570574 0.027457856 1.75 0.008 4 0.8692536 0.187322186 1.75 0.008 5 0.8798486 0.309362914 1.75 0.008 6 0.8866460 0.378356644 1.75 0.008 7 0.8892446 0.417310483 1.75 0.008 8 0.8924430 0.450664626 1.75 0.008 9 0.8952427 0.472559203 1.75 0.008 10 0.8960417 0.481290594 1.75 0.009 1 0.8548582 0.000000000 1.75 0.009 2 0.8548582 0.000000000 1.75 0.009 3 0.8614558 0.078618899 1.75 0.009 4 0.8740512 0.247168947 1.75 0.009 5 0.8854465 
0.363848247 1.75 0.009 6 0.8900444 0.415391683 1.75 0.009 7 0.8908436 0.440892605 1.75 0.009 8 0.8954425 0.474007870 1.75 0.009 9 0.8960417 0.481290594 1.75 0.009 10 0.8974417 0.501362405 1.75 0.010 1 0.8548582 0.000000000 1.75 0.010 2 0.8548582 0.000000000 1.75 0.010 3 0.8644548 0.127801698 1.75 0.010 4 0.8772499 0.286957623 1.75 0.010 5 0.8882457 0.391677584 1.75 0.010 6 0.8910432 0.435057492 1.75 0.010 7 0.8936430 0.462215697 1.75 0.010 8 0.8958420 0.477684607 1.75 0.010 9 0.8972416 0.498791786 1.75 0.010 10 0.8970412 0.507106090 2.00 0.001 1 0.8548582 0.000000000 2.00 0.001 2 0.8548582 0.000000000 2.00 0.001 3 0.8548582 0.000000000 2.00 0.001 4 0.8548582 0.000000000 2.00 0.001 5 0.8548582 0.000000000 2.00 0.001 6 0.8548582 0.000000000 2.00 0.001 7 0.8548582 0.000000000 2.00 0.001 8 0.8548582 0.000000000 2.00 0.001 9 0.8548582 0.000000000 2.00 0.001 10 0.8548582 0.000000000 2.00 0.002 1 0.8548582 0.000000000 2.00 0.002 2 0.8548582 0.000000000 2.00 0.002 3 0.8548582 0.000000000 2.00 0.002 4 0.8548582 0.000000000 2.00 0.002 5 0.8548582 0.000000000 2.00 0.002 6 0.8548582 0.000000000 2.00 0.002 7 0.8548582 0.000000000 2.00 0.002 8 0.8556582 0.009358009 2.00 0.002 9 0.8564576 0.020546433 2.00 0.002 10 0.8586569 0.047544429 2.00 0.003 1 0.8548582 0.000000000 2.00 0.003 2 0.8548582 0.000000000 2.00 0.003 3 0.8548582 0.000000000 2.00 0.003 4 0.8548582 0.000000000 2.00 0.003 5 0.8548582 0.000000000 2.00 0.003 6 0.8562579 0.016315702 2.00 0.003 7 0.8596566 0.058746939 2.00 0.003 8 0.8640547 0.119386808 2.00 0.003 9 0.8686539 0.181387810 2.00 0.003 10 0.8730515 0.236452537 2.00 0.004 1 0.8548582 0.000000000 2.00 0.004 2 0.8548582 0.000000000 2.00 0.004 3 0.8548582 0.000000000 2.00 0.004 4 0.8548582 0.000000000 2.00 0.004 5 0.8568574 0.025122989 2.00 0.004 6 0.8628550 0.102203763 2.00 0.004 7 0.8690539 0.188268020 2.00 0.004 8 0.8752504 0.259676508 2.00 0.004 9 0.8810478 0.322972472 2.00 0.004 10 0.8850468 0.360532350 2.00 0.005 1 0.8548582 0.000000000 2.00 0.005 2 
0.8548582 0.000000000 2.00 0.005 3 0.8548582 0.000000000 2.00 0.005 4 0.8566576 0.022820477 2.00 0.005 5 0.8638547 0.117370164 2.00 0.005 6 0.8716526 0.218547300 2.00 0.005 7 0.8780494 0.292663322 2.00 0.005 8 0.8852467 0.362195584 2.00 0.005 9 0.8884456 0.393949621 2.00 0.005 10 0.8896443 0.418573769 2.00 0.006 1 0.8548582 0.000000000 2.00 0.006 2 0.8548582 0.000000000 2.00 0.006 3 0.8552582 0.004688062 2.00 0.006 4 0.8616556 0.082483114 2.00 0.006 5 0.8700532 0.202493337 2.00 0.006 6 0.8788491 0.300741810 2.00 0.006 7 0.8858462 0.369138903 2.00 0.006 8 0.8896449 0.410439251 2.00 0.006 9 0.8908433 0.433465785 2.00 0.006 10 0.8930428 0.455758110 2.00 0.007 1 0.8548582 0.000000000 2.00 0.007 2 0.8548582 0.000000000 2.00 0.007 3 0.8564576 0.020546433 2.00 0.007 4 0.8656548 0.147652455 2.00 0.007 5 0.8760500 0.269791923 2.00 0.007 6 0.8852465 0.363219835 2.00 0.007 7 0.8898448 0.412903320 2.00 0.007 8 0.8910432 0.437427516 2.00 0.007 9 0.8944427 0.463959953 2.00 0.007 10 0.8954419 0.474273241 2.00 0.008 1 0.8548582 0.000000000 2.00 0.008 2 0.8548582 0.000000000 2.00 0.008 3 0.8596566 0.058729676 2.00 0.008 4 0.8718526 0.221752586 2.00 0.008 5 0.8842468 0.351953162 2.00 0.008 6 0.8888454 0.404129447 2.00 0.008 7 0.8912432 0.436434994 2.00 0.008 8 0.8936430 0.462932966 2.00 0.008 9 0.8952420 0.474364080 2.00 0.008 10 0.8972419 0.492383884 2.00 0.009 1 0.8548582 0.000000000 2.00 0.009 2 0.8548582 0.000000000 2.00 0.009 3 0.8630550 0.104291276 2.00 0.009 4 0.8760499 0.268482326 2.00 0.009 5 0.8860460 0.373725670 2.00 0.009 6 0.8900440 0.424976829 2.00 0.009 7 0.8936427 0.460687998 2.00 0.009 8 0.8956419 0.477074971 2.00 0.009 9 0.8972417 0.493765718 2.00 0.009 10 0.8978412 0.506750801 2.00 0.010 1 0.8548582 0.000000000 2.00 0.010 2 0.8548582 0.000000000 2.00 0.010 3 0.8656550 0.150366044 2.00 0.010 4 0.8822475 0.334246639 2.00 0.010 5 0.8890452 0.406518678 2.00 0.010 6 0.8918435 0.447102834 2.00 0.010 7 0.8956424 0.476135314 2.00 0.010 8 0.8966416 0.488330477 2.00 0.010 9 
0.8980416 0.507389838 2.00 0.010 10 0.8978401 0.520073893 2.25 0.001 1 0.8548582 0.000000000 2.25 0.001 2 0.8548582 0.000000000 2.25 0.001 3 0.8548582 0.000000000 2.25 0.001 4 0.8548582 0.000000000 2.25 0.001 5 0.8548582 0.000000000 2.25 0.001 6 0.8548582 0.000000000 2.25 0.001 7 0.8548582 0.000000000 2.25 0.001 8 0.8548582 0.000000000 2.25 0.001 9 0.8548582 0.000000000 2.25 0.001 10 0.8548582 0.000000000 2.25 0.002 1 0.8548582 0.000000000 2.25 0.002 2 0.8548582 0.000000000 2.25 0.002 3 0.8548582 0.000000000 2.25 0.002 4 0.8548582 0.000000000 2.25 0.002 5 0.8548582 0.000000000 2.25 0.002 6 0.8548582 0.000000000 2.25 0.002 7 0.8548582 0.000000000 2.25 0.002 8 0.8560580 0.013995319 2.25 0.002 9 0.8570574 0.027457856 2.25 0.002 10 0.8604563 0.067641763 2.25 0.003 1 0.8548582 0.000000000 2.25 0.003 2 0.8548582 0.000000000 2.25 0.003 3 0.8548582 0.000000000 2.25 0.003 4 0.8548582 0.000000000 2.25 0.003 5 0.8550582 0.002353195 2.25 0.003 6 0.8566576 0.022820477 2.25 0.003 7 0.8616558 0.080775391 2.25 0.003 8 0.8652548 0.142168648 2.25 0.003 9 0.8706528 0.209532587 2.25 0.003 10 0.8752504 0.261055749 2.25 0.004 1 0.8548582 0.000000000 2.25 0.004 2 0.8548582 0.000000000 2.25 0.004 3 0.8548582 0.000000000 2.25 0.004 4 0.8552582 0.004688062 2.25 0.004 5 0.8586568 0.047555143 2.25 0.004 6 0.8650548 0.136860144 2.25 0.004 7 0.8720524 0.222109435 2.25 0.004 8 0.8766497 0.278826572 2.25 0.004 9 0.8844468 0.353696748 2.25 0.004 10 0.8860462 0.373705597 2.25 0.005 1 0.8548582 0.000000000 2.25 0.005 2 0.8548582 0.000000000 2.25 0.005 3 0.8548582 0.000000000 2.25 0.005 4 0.8570574 0.027457856 2.25 0.005 5 0.8654548 0.142718213 2.25 0.005 6 0.8738512 0.246577583 2.25 0.005 7 0.8820475 0.331466873 2.25 0.005 8 0.8858462 0.371083913 2.25 0.005 9 0.8898446 0.411973851 2.25 0.005 10 0.8912432 0.433830236 2.25 0.006 1 0.8548582 0.000000000 2.25 0.006 2 0.8548582 0.000000000 2.25 0.006 3 0.8558580 0.011696471 2.25 0.006 4 0.8630550 0.105859350 2.25 0.006 5 0.8734515 0.238535420 2.25 0.006 
6 0.8828470 0.340383545 2.25 0.006 7 0.8870459 0.382464098 2.25 0.006 8 0.8900444 0.419635873 2.25 0.006 9 0.8910436 0.441488920 2.25 0.006 10 0.8936430 0.462932966 2.25 0.007 1 0.8548582 0.000000000 2.25 0.007 2 0.8548582 0.000000000 2.25 0.007 3 0.8568574 0.025122989 2.25 0.007 4 0.8694534 0.187797882 2.25 0.007 5 0.8792489 0.304177985 2.25 0.007 6 0.8862462 0.376223202 2.25 0.007 7 0.8908441 0.426436284 2.25 0.007 8 0.8928427 0.454177468 2.25 0.007 9 0.8952425 0.474115259 2.25 0.007 10 0.8956419 0.477773635 2.25 0.008 1 0.8548582 0.000000000 2.25 0.008 2 0.8548582 0.000000000 2.25 0.008 3 0.8618558 0.084608543 2.25 0.008 4 0.8740510 0.248559119 2.25 0.008 5 0.8854462 0.365897556 2.25 0.008 6 0.8908443 0.421375977 2.25 0.008 7 0.8918433 0.447905567 2.25 0.008 8 0.8952425 0.474024743 2.25 0.008 9 0.8958420 0.479185465 2.25 0.008 10 0.8978409 0.502054565 2.25 0.009 1 0.8548582 0.000000000 2.25 0.009 2 0.8548582 0.000000000 2.25 0.009 3 0.8646550 0.134456950 2.25 0.009 4 0.8786492 0.300125654 2.25 0.009 5 0.8888456 0.397217010 2.25 0.009 6 0.8912435 0.437220549 2.25 0.009 7 0.8952424 0.473862451 2.25 0.009 8 0.8958419 0.479155439 2.25 0.009 9 0.8974411 0.500823782 2.25 0.009 10 0.8968411 0.505626472 2.25 0.010 1 0.8548582 0.000000000 2.25 0.010 2 0.8552582 0.004688062 2.25 0.010 3 0.8690537 0.188290097 2.25 0.010 4 0.8850464 0.362553730 2.25 0.010 5 0.8904443 0.420999894 2.25 0.010 6 0.8932427 0.460131403 2.25 0.010 7 0.8958419 0.479201824 2.25 0.010 8 0.8980414 0.500472794 2.25 0.010 9 0.8968412 0.504976253 2.25 0.010 10 0.8982395 0.526476554 2.50 0.001 1 0.8548582 0.000000000 2.50 0.001 2 0.8548582 0.000000000 2.50 0.001 3 0.8548582 0.000000000 2.50 0.001 4 0.8548582 0.000000000 2.50 0.001 5 0.8548582 0.000000000 2.50 0.001 6 0.8548582 0.000000000 2.50 0.001 7 0.8548582 0.000000000 2.50 0.001 8 0.8548582 0.000000000 2.50 0.001 9 0.8548582 0.000000000 2.50 0.001 10 0.8548582 0.000000000 2.50 0.002 1 0.8548582 0.000000000 2.50 0.002 2 0.8548582 0.000000000 2.50 
0.002 3 0.8548582 0.000000000 2.50 0.002 4 0.8548582 0.000000000 2.50 0.002 5 0.8548582 0.000000000 2.50 0.002 6 0.8548582 0.000000000 2.50 0.002 7 0.8550582 0.002353195 2.50 0.002 8 0.8562577 0.018226050 2.50 0.002 9 0.8586568 0.047555143 2.50 0.002 10 0.8624555 0.092939972 2.50 0.003 1 0.8548582 0.000000000 2.50 0.003 2 0.8548582 0.000000000 2.50 0.003 3 0.8548582 0.000000000 2.50 0.003 4 0.8548582 0.000000000 2.50 0.003 5 0.8556582 0.009358009 2.50 0.003 6 0.8570574 0.027457856 2.50 0.003 7 0.8626550 0.100100831 2.50 0.003 8 0.8678539 0.172339189 2.50 0.003 9 0.8730516 0.236359615 2.50 0.003 10 0.8768497 0.280692113 2.50 0.004 1 0.8548582 0.000000000 2.50 0.004 2 0.8548582 0.000000000 2.50 0.004 3 0.8548582 0.000000000 2.50 0.004 4 0.8558580 0.011696471 2.50 0.004 5 0.8604563 0.067641763 2.50 0.004 6 0.8662548 0.154790574 2.50 0.004 7 0.8738512 0.246577583 2.50 0.004 8 0.8804483 0.314567633 2.50 0.004 9 0.8850467 0.361567172 2.50 0.004 10 0.8886456 0.395533795 2.50 0.005 1 0.8548582 0.000000000 2.50 0.005 2 0.8548582 0.000000000 2.50 0.005 3 0.8548582 0.000000000 2.50 0.005 4 0.8594568 0.056479554 2.50 0.005 5 0.8672542 0.166244006 2.50 0.005 6 0.8752504 0.261055749 2.50 0.005 7 0.8846467 0.355308121 2.50 0.005 8 0.8882457 0.391677584 2.50 0.005 9 0.8900444 0.419635873 2.50 0.005 10 0.8908433 0.436806330 2.50 0.006 1 0.8548582 0.000000000 2.50 0.006 2 0.8548582 0.000000000 2.50 0.006 3 0.8562579 0.016315702 2.50 0.006 4 0.8648548 0.134959347 2.50 0.006 5 0.8750504 0.258017346 2.50 0.006 6 0.8850465 0.361701825 2.50 0.006 7 0.8888456 0.402363701 2.50 0.006 8 0.8904438 0.428847051 2.50 0.006 9 0.8930428 0.457966441 2.50 0.006 10 0.8956422 0.476913574 2.50 0.007 1 0.8548582 0.000000000 2.50 0.007 2 0.8548582 0.000000000 2.50 0.007 3 0.8594568 0.056479554 2.50 0.007 4 0.8706531 0.208103557 2.50 0.007 5 0.8828472 0.339294205 2.50 0.007 6 0.8886456 0.396582856 2.50 0.007 7 0.8908435 0.431831146 2.50 0.007 8 0.8936427 0.461406781 2.50 0.007 9 0.8956424 0.476928044 2.50 
0.007 10 0.8962419 0.484070121 2.50 0.008 1 0.8548582 0.000000000 2.50 0.008 2 0.8548582 0.000000000 2.50 0.008 3 0.8630550 0.104291276 2.50 0.008 4 0.8756502 0.264704393 2.50 0.008 5 0.8866460 0.378538202 2.50 0.008 6 0.8904441 0.425293415 2.50 0.008 7 0.8932427 0.460131403 2.50 0.008 8 0.8962422 0.480976784 2.50 0.008 9 0.8968416 0.488952925 2.50 0.008 10 0.8976412 0.503419791 2.50 0.009 1 0.8548582 0.000000000 2.50 0.009 2 0.8550582 0.002353195 2.50 0.009 3 0.8660548 0.154436196 2.50 0.009 4 0.8820475 0.333816386 2.50 0.009 5 0.8898448 0.413628173 2.50 0.009 6 0.8918432 0.450324489 2.50 0.009 7 0.8954422 0.476887721 2.50 0.009 8 0.8962420 0.485540017 2.50 0.009 9 0.8976412 0.503419791 2.50 0.009 10 0.8976404 0.516171707 2.50 0.010 1 0.8548582 0.000000000 2.50 0.010 2 0.8560580 0.013995319 2.50 0.010 3 0.8710529 0.211730589 2.50 0.010 4 0.8868460 0.380006734 2.50 0.010 5 0.8908441 0.430076272 2.50 0.010 6 0.8942424 0.469434195 2.50 0.010 7 0.8956420 0.480000451 2.50 0.010 8 0.8980409 0.505290998 2.50 0.010 9 0.8972408 0.512320464 2.50 0.010 10 0.9004388 0.541011083 2.75 0.001 1 0.8548582 0.000000000 2.75 0.001 2 0.8548582 0.000000000 2.75 0.001 3 0.8548582 0.000000000 2.75 0.001 4 0.8548582 0.000000000 2.75 0.001 5 0.8548582 0.000000000 2.75 0.001 6 0.8548582 0.000000000 2.75 0.001 7 0.8548582 0.000000000 2.75 0.001 8 0.8548582 0.000000000 2.75 0.001 9 0.8548582 0.000000000 2.75 0.001 10 0.8548582 0.000000000 2.75 0.002 1 0.8548582 0.000000000 2.75 0.002 2 0.8548582 0.000000000 2.75 0.002 3 0.8548582 0.000000000 2.75 0.002 4 0.8548582 0.000000000 2.75 0.002 5 0.8548582 0.000000000 2.75 0.002 6 0.8548582 0.000000000 2.75 0.002 7 0.8556582 0.009358009 2.75 0.002 8 0.8568574 0.025122989 2.75 0.002 9 0.8604563 0.067641763 2.75 0.002 10 0.8636548 0.115308991 2.75 0.003 1 0.8548582 0.000000000 2.75 0.003 2 0.8548582 0.000000000 2.75 0.003 3 0.8548582 0.000000000 2.75 0.003 4 0.8548582 0.000000000 2.75 0.003 5 0.8558580 0.011696471 2.75 0.003 6 0.8594568 0.056479554 
2.75 0.003 7 0.8642548 0.127340281 2.75 0.003 8 0.8696534 0.195799002 2.75 0.003 9 0.8748504 0.256347507 2.75 0.003 10 0.8802483 0.312764524 2.75 0.004 1 0.8548582 0.000000000 2.75 0.004 2 0.8548582 0.000000000 2.75 0.004 3 0.8548582 0.000000000 2.75 0.004 4 0.8562579 0.016315702 2.75 0.004 5 0.8620558 0.088456795 2.75 0.004 6 0.8692536 0.187322186 2.75 0.004 7 0.8752504 0.261055749 2.75 0.004 8 0.8830470 0.342119684 2.75 0.004 9 0.8858460 0.372101173 2.75 0.004 10 0.8892452 0.406240475 2.75 0.005 1 0.8548582 0.000000000 2.75 0.005 2 0.8548582 0.000000000 2.75 0.005 3 0.8550582 0.002353195 2.75 0.005 4 0.8608560 0.072057445 2.75 0.005 5 0.8688539 0.189216416 2.75 0.005 6 0.8768500 0.283298746 2.75 0.005 7 0.8854462 0.365794304 2.75 0.005 8 0.8890454 0.404743568 2.75 0.005 9 0.8904438 0.427991074 2.75 0.005 10 0.8924428 0.452153707 2.75 0.006 1 0.8548582 0.000000000 2.75 0.006 2 0.8548582 0.000000000 2.75 0.006 3 0.8566576 0.022820477 2.75 0.006 4 0.8658548 0.150820424 2.75 0.006 5 0.8762499 0.271520784 2.75 0.006 6 0.8862459 0.372377210 2.75 0.006 7 0.8902448 0.415846218 2.75 0.006 8 0.8908435 0.435990552 2.75 0.006 9 0.8942424 0.466403235 2.75 0.006 10 0.8960424 0.479628261 2.75 0.007 1 0.8548582 0.000000000 2.75 0.007 2 0.8548582 0.000000000 2.75 0.007 3 0.8606560 0.069834930 2.75 0.007 4 0.8732516 0.236764138 2.75 0.007 5 0.8848464 0.360883891 2.75 0.007 6 0.8896449 0.411941731 2.75 0.007 7 0.8906438 0.437793155 2.75 0.007 8 0.8948425 0.472015441 2.75 0.007 9 0.8958419 0.479201824 2.75 0.007 10 0.8970417 0.492438575 2.75 0.008 1 0.8548582 0.000000000 2.75 0.008 2 0.8548582 0.000000000 2.75 0.008 3 0.8650548 0.133781129 2.75 0.008 4 0.8776497 0.290294443 2.75 0.008 5 0.8884459 0.393208272 2.75 0.008 6 0.8910438 0.435738132 2.75 0.008 7 0.8936425 0.465258789 2.75 0.008 8 0.8960417 0.480542848 2.75 0.008 9 0.8976416 0.497775403 2.75 0.008 10 0.8972414 0.505514228 2.75 0.009 1 0.8548582 0.000000000 2.75 0.009 2 0.8552582 0.004688062 2.75 0.009 3 0.8690536 
0.186765249 2.75 0.009 4 0.8854464 0.362888087 2.75 0.009 5 0.8902446 0.419455431 2.75 0.009 6 0.8926432 0.457524382 2.75 0.009 7 0.8960420 0.481781824 2.75 0.009 8 0.8970416 0.492414438 2.75 0.009 9 0.8970414 0.504862349 2.75 0.009 10 0.8968403 0.516913773 2.75 0.010 1 0.8548582 0.000000000 2.75 0.010 2 0.8562577 0.018226050 2.75 0.010 3 0.8736515 0.240467123 2.75 0.010 4 0.8878460 0.387817401 2.75 0.010 5 0.8918436 0.443071678 2.75 0.010 6 0.8954425 0.477640648 2.75 0.010 7 0.8964416 0.486146514 2.75 0.010 8 0.8974411 0.503450875 2.75 0.010 9 0.8970404 0.517512376 2.75 0.010 10 0.8986393 0.535990938 3.00 0.001 1 0.8548582 0.000000000 3.00 0.001 2 0.8548582 0.000000000 3.00 0.001 3 0.8548582 0.000000000 3.00 0.001 4 0.8548582 0.000000000 3.00 0.001 5 0.8548582 0.000000000 3.00 0.001 6 0.8548582 0.000000000 3.00 0.001 7 0.8548582 0.000000000 3.00 0.001 8 0.8548582 0.000000000 3.00 0.001 9 0.8548582 0.000000000 3.00 0.001 10 0.8548582 0.000000000 3.00 0.002 1 0.8548582 0.000000000 3.00 0.002 2 0.8548582 0.000000000 3.00 0.002 3 0.8548582 0.000000000 3.00 0.002 4 0.8548582 0.000000000 3.00 0.002 5 0.8548582 0.000000000 3.00 0.002 6 0.8548582 0.000000000 3.00 0.002 7 0.8560580 0.013995319 3.00 0.002 8 0.8570574 0.027457856 3.00 0.002 9 0.8620556 0.086806206 3.00 0.002 10 0.8648550 0.136357747 3.00 0.003 1 0.8548582 0.000000000 3.00 0.003 2 0.8548582 0.000000000 3.00 0.003 3 0.8548582 0.000000000 3.00 0.003 4 0.8548582 0.000000000 3.00 0.003 5 0.8562579 0.016315702 3.00 0.003 6 0.8608560 0.072057445 3.00 0.003 7 0.8652548 0.142168648 3.00 0.003 8 0.8716528 0.219708292 3.00 0.003 9 0.8756502 0.267322474 3.00 0.003 10 0.8828470 0.340383545 3.00 0.004 1 0.8548582 0.000000000 3.00 0.004 2 0.8548582 0.000000000 3.00 0.004 3 0.8548582 0.000000000 3.00 0.004 4 0.8564576 0.020546433 3.00 0.004 5 0.8628550 0.103792814 3.00 0.004 6 0.8700531 0.202630304 3.00 0.004 7 0.8764500 0.276974618 3.00 0.004 8 0.8848465 0.360042663 3.00 0.004 9 0.8884457 0.393251912 3.00 0.004 10 
0.8904443 0.419213366 3.00 0.005 1 0.8548582 0.000000000 3.00 0.005 2 0.8548582 0.000000000 3.00 0.005 3 0.8554582 0.007041257 3.00 0.005 4 0.8624553 0.094489453 3.00 0.005 5 0.8718526 0.220260766 3.00 0.005 6 0.8802483 0.312764524 3.00 0.005 7 0.8858462 0.372107333 3.00 0.005 8 0.8904446 0.417368467 3.00 0.005 9 0.8914433 0.437835606 3.00 0.005 10 0.8932427 0.460131403 3.00 0.006 1 0.8548582 0.000000000 3.00 0.006 2 0.8548582 0.000000000 3.00 0.006 3 0.8568574 0.025122989 3.00 0.006 4 0.8686537 0.181368184 3.00 0.006 5 0.8788492 0.301753392 3.00 0.006 6 0.8868460 0.380122376 3.00 0.006 7 0.8908441 0.423909268 3.00 0.006 8 0.8916433 0.449698406 3.00 0.006 9 0.8952424 0.474692383 3.00 0.006 10 0.8958419 0.479201824 3.00 0.007 1 0.8548582 0.000000000 3.00 0.007 2 0.8548582 0.000000000 3.00 0.007 3 0.8618558 0.087966475 3.00 0.007 4 0.8742508 0.250526734 3.00 0.007 5 0.8866459 0.375459711 3.00 0.007 6 0.8904444 0.420068415 3.00 0.007 7 0.8914433 0.449077417 3.00 0.007 8 0.8956422 0.479067616 3.00 0.007 9 0.8954422 0.479382361 3.00 0.007 10 0.8980412 0.502462154 3.00 0.008 1 0.8548582 0.000000000 3.00 0.008 2 0.8548582 0.000000000 3.00 0.008 3 0.8656548 0.147458132 3.00 0.008 4 0.8804483 0.315615750 3.00 0.008 5 0.8882456 0.398889627 3.00 0.008 6 0.8914436 0.444385643 3.00 0.008 7 0.8950427 0.474822561 3.00 0.008 8 0.8954420 0.478659250 3.00 0.008 9 0.8980411 0.503816053 3.00 0.008 10 0.8966417 0.505672820 3.00 0.009 1 0.8548582 0.000000000 3.00 0.009 2 0.8558580 0.011696471 3.00 0.009 3 0.8706531 0.208103557 3.00 0.009 4 0.8866460 0.377430851 3.00 0.009 5 0.8900443 0.424962581 3.00 0.009 6 0.8930432 0.461720525 3.00 0.009 7 0.8966411 0.485305156 3.00 0.009 8 0.8974412 0.501250400 3.00 0.009 9 0.8966417 0.505672820 3.00 0.009 10 0.8980393 0.527078831 3.00 0.010 1 0.8548582 0.000000000 3.00 0.010 2 0.8566576 0.022820477 3.00 0.010 3 0.8746505 0.254420785 3.00 0.010 4 0.8884459 0.394957942 3.00 0.010 5 0.8914435 0.446777980 3.00 0.010 6 0.8954422 0.479160306 3.00 0.010 7 
0.8966416 0.491034019 3.00 0.010 8 0.8976411 0.508021458 3.00 0.010 9 0.8976398 0.522442728 3.00 0.010 10 0.8996390 0.543606058 3.25 0.001 1 0.8548582 0.000000000 3.25 0.001 2 0.8548582 0.000000000 3.25 0.001 3 0.8548582 0.000000000 3.25 0.001 4 0.8548582 0.000000000 3.25 0.001 5 0.8548582 0.000000000 3.25 0.001 6 0.8548582 0.000000000 3.25 0.001 7 0.8548582 0.000000000 3.25 0.001 8 0.8548582 0.000000000 3.25 0.001 9 0.8548582 0.000000000 3.25 0.001 10 0.8548582 0.000000000 3.25 0.002 1 0.8548582 0.000000000 3.25 0.002 2 0.8548582 0.000000000 3.25 0.002 3 0.8548582 0.000000000 3.25 0.002 4 0.8548582 0.000000000 3.25 0.002 5 0.8548582 0.000000000 3.25 0.002 6 0.8548582 0.000000000 3.25 0.002 7 0.8560579 0.015887588 3.25 0.002 8 0.8594568 0.056479554 3.25 0.002 9 0.8630548 0.102692479 3.25 0.002 10 0.8658548 0.149522519 3.25 0.003 1 0.8548582 0.000000000 3.25 0.003 2 0.8548582 0.000000000 3.25 0.003 3 0.8548582 0.000000000 3.25 0.003 4 0.8548582 0.000000000 3.25 0.003 5 0.8566576 0.022820477 3.25 0.003 6 0.8620558 0.088456795 3.25 0.003 7 0.8666547 0.160110312 3.25 0.003 8 0.8732516 0.239221468 3.25 0.003 9 0.8778497 0.292063239 3.25 0.003 10 0.8852464 0.361150686 3.25 0.004 1 0.8548582 0.000000000 3.25 0.004 2 0.8548582 0.000000000 3.25 0.004 3 0.8548582 0.000000000 3.25 0.004 4 0.8568574 0.025122989 3.25 0.004 5 0.8648548 0.131643339 3.25 0.004 6 0.8724524 0.227230427 3.25 0.004 7 0.8788491 0.301778317 3.25 0.004 8 0.8858459 0.369111359 3.25 0.004 9 0.8886454 0.399193423 3.25 0.004 10 0.8906440 0.425085120 3.25 0.005 1 0.8548582 0.000000000 3.25 0.005 2 0.8548582 0.000000000 3.25 0.005 3 0.8558580 0.011696471 3.25 0.005 4 0.8638547 0.114115325 3.25 0.005 5 0.8734515 0.238535420 3.25 0.005 6 0.8828472 0.339294205 3.25 0.005 7 0.8878460 0.388586614 3.25 0.005 8 0.8906441 0.423301663 3.25 0.005 9 0.8904440 0.439643832 3.25 0.005 10 0.8940425 0.467921894 3.25 0.006 1 0.8548582 0.000000000 3.25 0.006 2 0.8548582 0.000000000 3.25 0.006 3 0.8580572 0.040666536 3.25 0.006 
4 0.8696534 0.198671069 3.25 0.006 5 0.8816476 0.325750768 3.25 0.006 6 0.8886459 0.394763181 3.25 0.006 7 0.8910440 0.431588262 3.25 0.006 8 0.8920432 0.453289463 3.25 0.006 9 0.8956422 0.479067616 3.25 0.006 10 0.8958417 0.479917504 3.25 0.007 1 0.8548582 0.000000000 3.25 0.007 2 0.8548582 0.000000000 3.25 0.007 3 0.8630550 0.104291276 3.25 0.007 4 0.8756502 0.264704393 3.25 0.007 5 0.8872459 0.383278115 3.25 0.007 6 0.8906441 0.425968814 3.25 0.007 7 0.8926432 0.457524382 3.25 0.007 8 0.8954422 0.479160306 3.25 0.007 9 0.8966417 0.487412518 3.25 0.007 10 0.8972412 0.500683557 3.25 0.008 1 0.8548582 0.000000000 3.25 0.008 2 0.8552582 0.004688062 3.25 0.008 3 0.8670545 0.163917641 3.25 0.008 4 0.8828473 0.340559152 3.25 0.008 5 0.8896449 0.413797943 3.25 0.008 6 0.8912433 0.447770229 3.25 0.008 7 0.8952424 0.477819281 3.25 0.008 8 0.8972414 0.491477981 3.25 0.008 9 0.8978408 0.506005015 3.25 0.008 10 0.8962411 0.507926554 3.25 0.009 1 0.8548582 0.000000000 3.25 0.009 2 0.8560580 0.013995319 3.25 0.009 3 0.8722526 0.225386156 3.25 0.009 4 0.8872459 0.384233025 3.25 0.009 5 0.8912438 0.438073915 3.25 0.009 6 0.8942427 0.470750458 3.25 0.009 7 0.8962414 0.483404281 3.25 0.009 8 0.8972409 0.501402602 3.25 0.009 9 0.8960411 0.506622220 3.25 0.009 10 0.8992390 0.536220011 3.25 0.010 1 0.8548582 0.000000000 3.25 0.010 2 0.8568576 0.025077147 3.25 0.010 3 0.8760499 0.269696740 3.25 0.010 4 0.8884456 0.399513758 3.25 0.010 5 0.8926428 0.456772436 3.25 0.010 6 0.8958419 0.483355760 3.25 0.010 7 0.8980411 0.502392711 3.25 0.010 8 0.8958417 0.501201236 3.25 0.010 9 0.8984390 0.532061687 3.25 0.010 10 0.8998387 0.546660589 3.50 0.001 1 0.8548582 0.000000000 3.50 0.001 2 0.8548582 0.000000000 3.50 0.001 3 0.8548582 0.000000000 3.50 0.001 4 0.8548582 0.000000000 3.50 0.001 5 0.8548582 0.000000000 3.50 0.001 6 0.8548582 0.000000000 3.50 0.001 7 0.8548582 0.000000000 3.50 0.001 8 0.8548582 0.000000000 3.50 0.001 9 0.8548582 0.000000000 3.50 0.001 10 0.8548582 0.000000000 3.50 
0.002 1 0.8548582 0.000000000 3.50 0.002 2 0.8548582 0.000000000 3.50 0.002 3 0.8548582 0.000000000 3.50 0.002 4 0.8548582 0.000000000 3.50 0.002 5 0.8548582 0.000000000 3.50 0.002 6 0.8552582 0.004688062 3.50 0.002 7 0.8566576 0.022820477 3.50 0.002 8 0.8600564 0.063186192 3.50 0.002 9 0.8646547 0.125350861 3.50 0.002 10 0.8682539 0.175933297 3.50 0.003 1 0.8548582 0.000000000 3.50 0.003 2 0.8548582 0.000000000 3.50 0.003 3 0.8548582 0.000000000 3.50 0.003 4 0.8548582 0.000000000 3.50 0.003 5 0.8568574 0.025122989 3.50 0.003 6 0.8630550 0.104291276 3.50 0.003 7 0.8690537 0.186790305 3.50 0.003 8 0.8744507 0.252480575 3.50 0.003 9 0.8808480 0.317947323 3.50 0.003 10 0.8852464 0.364316145 3.50 0.004 1 0.8548582 0.000000000 3.50 0.004 2 0.8548582 0.000000000 3.50 0.004 3 0.8548582 0.000000000 3.50 0.004 4 0.8572574 0.029714526 3.50 0.004 5 0.8652548 0.142168648 3.50 0.004 6 0.8736512 0.243319819 3.50 0.004 7 0.8816475 0.327993600 3.50 0.004 8 0.8860462 0.373721606 3.50 0.004 9 0.8896451 0.411024014 3.50 0.004 10 0.8908438 0.430976731 3.50 0.005 1 0.8548582 0.000000000 3.50 0.005 2 0.8548582 0.000000000 3.50 0.005 3 0.8560580 0.013995319 3.50 0.005 4 0.8646548 0.131181922 3.50 0.005 5 0.8742508 0.250526734 3.50 0.005 6 0.8852464 0.361090633 3.50 0.005 7 0.8882456 0.394493017 3.50 0.005 8 0.8910440 0.431588262 3.50 0.005 9 0.8918432 0.452679667 3.50 0.005 10 0.8948427 0.473442909 3.50 0.006 1 0.8548582 0.000000000 3.50 0.006 2 0.8548582 0.000000000 3.50 0.006 3 0.8596566 0.058729676 3.50 0.006 4 0.8716528 0.218399421 3.50 0.006 5 0.8836468 0.348216434 3.50 0.006 6 0.8880457 0.396481160 3.50 0.006 7 0.8912438 0.437180214 3.50 0.006 8 0.8926433 0.459827973 3.50 0.006 9 0.8954422 0.479160306 3.50 0.006 10 0.8966417 0.487412518 3.50 0.007 1 0.8548582 0.000000000 3.50 0.007 2 0.8548582 0.000000000 3.50 0.007 3 0.8650547 0.132094621 3.50 0.007 4 0.8770500 0.283640690 3.50 0.007 5 0.8878460 0.389591137 3.50 0.007 6 0.8914438 0.437043123 3.50 0.007 7 0.8924435 0.458355867 3.50 
  C     scale  degree  Accuracy   Kappa
  ...   ...    ...     ...        ...
  4.25  0.009  10      0.9010382  0.551562222
  ...   ...    ...     ...        ...

  (tuning grid truncated; each row reports the resampled Accuracy and Kappa
   for one combination of C, scale and degree)

Accuracy was used to select the optimal model using the largest value.
The final values used for the model were degree = 10, scale = 0.009 and C = 4.25.
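Under the hood, `svmPoly` uses kernlab's polynomial kernel, K(x, y) = (scale * &lt;x, y&gt; + offset)^degree, with the offset left at its default of 1 while caret tunes degree, scale and C. A minimal base-R sketch with the selected tuning values (the example vectors are made up):

```r
# kernlab-style polynomial kernel: K(x, y) = (scale * <x, y> + offset)^degree
# degree and scale are the values caret selected above; offset defaults to 1
poly_kernel <- function(x, y, degree = 10, scale = 0.009, offset = 1) {
  (scale * sum(x * y) + offset)^degree
}

x <- c(1, 0, 2)
y <- c(0.5, 1, -1)
poly_kernel(x, y)                       # (0.009 * (-1.5) + 1)^10 = 0.9865^10
poly_kernel(x, y) == poly_kernel(y, x)  # kernel functions are symmetric
```

Note how the small selected scale keeps the degree-10 polynomial numerically tame on centered-and-scaled inputs; a scale near 1 would blow the kernel values up.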
# collect the resampling results from all trained models for a like-for-like comparison
results <- resamples(list(
  trained_Model_1  = fit.svmLinear,
  trained_Model_2  = fit.svmRadial,
  trained_Model_3  = fit.svmPoly,
  trained_Model_4  = fit.svmLinear_preProc,
  trained_Model_5  = fit.svmRadial_preProc,
  trained_Model_6  = fit.svmRadial_preProc,  # duplicates model 5 (likely meant to be the pre-processed polynomial fit)
  trained_Model_7  = fit.svmLinear_automaticGrid,
  trained_Model_8  = fit.svmRadial_automaticGrid,
  trained_Model_9  = fit.svmPoly_automaticGrid,
  trained_Model_10 = fit.svmLinear_manualGrid,
  trained_Model_11 = fit.svmRadial_manualGrid,
  trained_Model_12 = fit.svmPoly_manualGrid
))
summary(results)
Call:
summary.resamples(object = results)

Models: trained_Model_1, trained_Model_2, trained_Model_3, trained_Model_4, trained_Model_5, trained_Model_6, trained_Model_7, trained_Model_8, trained_Model_9, trained_Model_10, trained_Model_11, trained_Model_12
Number of resamples: 4

Accuracy
                       Min.   1st Qu.    Median      Mean   3rd Qu.      Max. NA's
trained_Model_1   0.8545164 0.8545164 0.8548582 0.8548582 0.8552000 0.8552000    0
trained_Model_2   0.8904876 0.8928219 0.8960406 0.8956422 0.8988609 0.9000000    0
trained_Model_3   0.9032774 0.9038193 0.9072358 0.9090373 0.9124537 0.9184000    0
trained_Model_4   0.8545164 0.8545164 0.8548582 0.8548582 0.8552000 0.8552000    0
trained_Model_5   0.8816000 0.8882657 0.8916867 0.8900433 0.8934643 0.8952000    0
trained_Model_6   0.8816000 0.8882657 0.8916867 0.8900433 0.8934643 0.8952000    0
trained_Model_7   0.8545164 0.8545164 0.8548582 0.8548582 0.8552000 0.8552000    0
trained_Model_8   0.8592000 0.8616000 0.8632544 0.8636534 0.8653078 0.8689049    0
trained_Model_9   0.8545164 0.8545164 0.8548582 0.8548582 0.8552000 0.8552000    0
trained_Model_10  0.8545164 0.8545164 0.8548582 0.8548582 0.8552000 0.8552000    0
trained_Model_11  0.8912000 0.8990590 0.9048761 0.9034380 0.9092552 0.9128000    0
trained_Model_12  0.8952000 0.8988000 0.9004396 0.9010382 0.9026779 0.9080735    0

Kappa
                        Min.    1st Qu.    Median      Mean   3rd Qu.      Max. NA's
trained_Model_1   0.00000000 0.00000000 0.0000000 0.0000000 0.0000000 0.0000000    0
trained_Model_2   0.40525167 0.40907562 0.4326150 0.4350147 0.4585541 0.4695772    0
trained_Model_3   0.55522418 0.56871090 0.5979642 0.5988914 0.6281447 0.6444130    0
trained_Model_4   0.00000000 0.00000000 0.0000000 0.0000000 0.0000000 0.0000000    0
trained_Model_5   0.29086714 0.36655898 0.4012314 0.3778059 0.4124783 0.4178937    0
trained_Model_6   0.29086714 0.36655898 0.4012314 0.3778059 0.4124783 0.4178937    0
trained_Model_7   0.00000000 0.00000000 0.0000000 0.0000000 0.0000000 0.0000000    0
trained_Model_8   0.05362505 0.08019566 0.0983517 0.1035911 0.1217471 0.1640358    0
trained_Model_9   0.00000000 0.00000000 0.0000000 0.0000000 0.0000000 0.0000000    0
trained_Model_10  0.00000000 0.00000000 0.0000000 0.0000000 0.0000000 0.0000000    0
trained_Model_11  0.49325432 0.50752411 0.5343763 0.5364884 0.5633406 0.5839466    0
trained_Model_12  0.53572647 0.53917530 0.5412529 0.5515622 0.5536398 0.5880166    0
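Notice the cluster of models with accuracy 0.8549 and Kappa exactly zero: these classifiers never predict "yes" and simply match the majority-class baseline. A small base-R illustration, using hypothetical counts chosen to mimic a roughly 85.5% "no" share:

```r
# Cohen's kappa for a classifier that always predicts "no"
# (counts are hypothetical, chosen to mimic the ~85.5% majority share)
tab <- matrix(c(855, 0, 145, 0), nrow = 2,
              dimnames = list(predicted = c("no", "yes"), actual = c("no", "yes")))
po <- sum(diag(tab)) / sum(tab)                       # observed accuracy: 0.855
pe <- sum(rowSums(tab) * colSums(tab)) / sum(tab)^2   # chance agreement: also 0.855
(po - pe) / (1 - pe)                                  # kappa = 0: no skill beyond the base rate
```

This is why Kappa is the more informative column here: on an imbalanced churn dataset, raw accuracy flatters degenerate models.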
dotplot(results)
bwplot(results)
# the polynomial model (trained_Model_3) had the highest mean resampling accuracy,
# but the radial manual-grid model is selected here as the best trained model
best_trained_model <- fit.svmRadial_manualGrid
predictions <- predict(best_trained_model, newdata = testing_dataset)
res_ <- caret::confusionMatrix(table(predictions, testing_dataset$Churn))
print("Results from the BEST trained model ... ...\n")
print(round(res_$overall, digits = 3))
[1] "Results from the BEST trained model ... ...\n"
      Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull AccuracyPValue  McnemarPValue
         0.907          0.559          0.886          0.926          0.855          0.000          0.000
#getwd()
saveRDS(best_trained_model, "./best_trained_model.rds")
# load the model
#getwd()
saved_model <- readRDS("./best_trained_model.rds")
print(saved_model)
Support Vector Machines with Radial Basis Function Kernel

2501 samples
  19 predictor
   2 classes: 'no', 'yes'

Pre-processing: centered (19), scaled (19), principal component signal extraction (19)
Resampling: Cross-Validated (2 fold, repeated 2 times)
Summary of sample sizes: 1251, 1250, 1251, 1250

Resampling results across tuning parameters:

  C    sigma  Accuracy   Kappa
  1.0  0.1    0.8910436  0.395101052
  ...  ...    ...        ...
  3.0  0.1    0.9034380  0.536488374
  ...  ...    ...        ...

  (tuning grid truncated; C ran from 1.0 to 5.0 and sigma from 0.1 to 1.0)

Accuracy was used to select the optimal model using the largest value.
The final values used for the model were sigma = 0.1 and C = 3.
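The saved model's kernel is kernlab's Gaussian RBF, K(x, y) = exp(-sigma * ||x - y||^2), with the tuned sigma = 0.1. A minimal base-R sketch (the example vectors are made up):

```r
# kernlab-style RBF kernel: K(x, y) = exp(-sigma * ||x - y||^2)
# sigma = 0.1 is the value the final model was tuned to
rbf_kernel <- function(x, y, sigma = 0.1) {
  exp(-sigma * sum((x - y)^2))
}

x <- c(1, 2, 3)
rbf_kernel(x, x)           # identical points: exactly 1
rbf_kernel(x, c(0, 0, 0))  # exp(-0.1 * 14), decaying with squared distance
```

Larger sigma makes the kernel more local (similarity decays faster with distance), which is why the grid's big-sigma fits collapsed to the majority-class baseline.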
# make predictions on "new data" using the final model
# (note: dataSet[1:20] is just the first 20 columns of the full original dataset,
# which includes the rows the model was trained on, so the accuracy below is optimistic)
final_predictions <- predict(saved_model, dataSet[1:20])
res_ <- confusionMatrix(table(final_predictions, dataSet$Churn))
print(res_)
print("Results from the BEST trained model ... ...\n")
print(round(res_$overall, digits = 3))
Confusion Matrix and Statistics

final_predictions   no  yes
              no  2831  110
              yes   19  373

               Accuracy : 0.9613
                 95% CI : (0.9542, 0.9676)
    No Information Rate : 0.8551
    P-Value [Acc > NIR] : < 2.2e-16

                  Kappa : 0.8306
 Mcnemar's Test P-Value : 2.299e-15

            Sensitivity : 0.9933
            Specificity : 0.7723
         Pos Pred Value : 0.9626
         Neg Pred Value : 0.9515
             Prevalence : 0.8551
         Detection Rate : 0.8494
   Detection Prevalence : 0.8824
      Balanced Accuracy : 0.8828

       'Positive' Class : no
[1] "Results from the BEST trained model ... ...\n"
      Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull AccuracyPValue  McnemarPValue
         0.961          0.831          0.954          0.968          0.855          0.000          0.000
print(res_$table)
fourfoldplot(res_$table, color = c("#CC6666", "#99CC99"),
conf.level = 0, margin = 1, main = "Confusion Matrix")
final_predictions   no  yes
              no  2831  110
              yes   19  373
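The headline numbers above can be reproduced directly from this table. A base-R recomputation of Accuracy, the No Information Rate, and Kappa:

```r
# rebuild the final confusion matrix (rows = predictions, columns = truth)
tab <- matrix(c(2831, 19, 110, 373), nrow = 2,
              dimnames = list(predicted = c("no", "yes"), actual = c("no", "yes")))
n <- sum(tab)                                        # 3333 cases

accuracy <- sum(diag(tab)) / n                       # observed agreement
nir      <- max(colSums(tab)) / n                    # accuracy of always guessing the majority class
expected <- sum(rowSums(tab) * colSums(tab)) / n^2   # agreement expected by chance
kappa    <- (accuracy - expected) / (1 - expected)   # Cohen's kappa

round(c(Accuracy = accuracy, NIR = nir, Kappa = kappa), 4)
# Accuracy 0.9613, NIR 0.8551, Kappa 0.8306 -- matching the confusionMatrix output
```

The gap between Accuracy (0.9613) and NIR (0.8551) is what the "P-Value [Acc > NIR]" line tests, and Kappa summarizes that same gap on a 0-to-1 scale.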