Support Vector Machine (SVM algorithm)

In machine learning, support-vector machines (SVMs, also support-vector networks) are supervised learning models with associated learning algorithms that analyze data used for classification and regression analysis. Developed at AT&T Bell Laboratories by Vapnik with colleagues (Boser et al., 1992, Guyon et al., 1993, Vapnik et al., 1997), it presents one of the most robust prediction methods, based on the statistical learning framework or VC theory proposed by Vapnik and Chervonekis (1974) and Vapnik (1982, 1995). Given a set of training examples, each marked as belonging to one or the other of two categories, an SVM training algorithm builds a model that assigns new examples to one category or the other, making it a non-probabilistic binary linear classifier (although methods such as Platt scaling exist to use SVM in a probabilistic classification setting). An SVM model is a representation of the examples as points in space, mapped so that the examples of the separate categories are divided by a clear gap that is as wide as possible. New examples are then mapped into that same space and predicted to belong to a category based on the side of the gap on which they fall.
In addition to performing linear classification, SVMs can efficiently perform a non-linear classification using what is called the kernel trick, implicitly mapping their inputs into high-dimensional feature spaces.
When data are unlabelled, supervised learning is not possible, and an unsupervised learning approach is required, which attempts to find natural clustering of the data to groups, and then map new data to these formed groups. The support-vector clustering[2] algorithm, created by Hava Siegelmann and Vladimir Vapnik, applies the statistics of support vectors, developed in the support vector machines algorithm, to categorize unlabeled data, and is one of the most widely used clustering algorithms in industrial applications.

Here we are going to implement SVM using Telecom Churn Dataset.

0. Loading required libraries

In [3]:
library(DBI)
library(corrgram)
library(caret) # contains SVM function 
library(gridExtra)
library(ggpubr)

1. Setting up the code parallelizing

Today is a good practice to start parallelizing your code. The common motivation behind parallel computing is that something is taking too long time. For somebody that means any computation that takes more than 3 minutes – this because parallelization is incredibly simple and most tasks that take time are embarrassingly parallel. Here are a few common tasks that fit the description:

  • Bootstrapping
  • Cross-validation
  • Multivariate Imputation by Chained Equations (MICE)
  • Fitting multiple regression models
You can find out more about parallelizing your computations in R - here.

For Windows users

In [ ]:
# process in parallel on Windows
library(doParallel) 
cl <- makeCluster(detectCores(), type='PSOCK')
registerDoParallel(cl)

For Mac OSX and Unix like systems users

In [6]:
# process in parallel on Mac OSX and UNIX like systems
library(doMC)
registerDoMC(cores = 4)

2. Importing Data

In [8]:
#Set working directory where CSV is located

#getwd()
#setwd("...YOUR WORKING DIRECTORY WITH A DATASET...")
#getwd()
In [7]:
# Load the DataSets: 
dataSet <- read.csv("TelcoCustomerChurnDataset.csv", header = TRUE, sep = ',')
colnames(dataSet) #Check the dataframe column names
  1. 'Account_Length'
  2. 'Vmail_Message'
  3. 'Day_Mins'
  4. 'Eve_Mins'
  5. 'Night_Mins'
  6. 'Intl_Mins'
  7. 'CustServ_Calls'
  8. 'Churn'
  9. 'Intl_Plan'
  10. 'Vmail_Plan'
  11. 'Day_Calls'
  12. 'Day_Charge'
  13. 'Eve_Calls'
  14. 'Eve_Charge'
  15. 'Night_Calls'
  16. 'Night_Charge'
  17. 'Intl_Calls'
  18. 'Intl_Charge'
  19. 'State'
  20. 'Area_Code'
  21. 'Phone'

3. Exploring the dataset

In [8]:
# Print top 10 rows in the dataSet
head(dataSet, 10)
A data.frame: 10 × 21
Account_LengthVmail_MessageDay_MinsEve_MinsNight_MinsIntl_MinsCustServ_CallsChurnIntl_PlanVmail_Plan⋯Day_ChargeEve_CallsEve_ChargeNight_CallsNight_ChargeIntl_CallsIntl_ChargeStateArea_CodePhone
<int><int><dbl><dbl><dbl><dbl><int><fct><fct><fct>⋯<dbl><int><dbl><int><dbl><int><dbl><fct><int><fct>
112825265.1197.4244.710.01nono yes⋯45.07 9916.78 9111.0132.70KS415382-4657
210726161.6195.5254.413.71nono yes⋯27.4710316.6210311.4533.70OH415371-7191
3137 0243.4121.2162.612.20nono no ⋯41.3811010.30104 7.3253.29NJ415358-1921
4 84 0299.4 61.9196.9 6.62noyesno ⋯50.90 88 5.26 89 8.8671.78OH408375-9999
5 75 0166.7148.3186.910.13noyesno ⋯28.3412212.61121 8.4132.73OK415330-6626
6118 0223.4220.6203.9 6.30noyesno ⋯37.9810118.75118 9.1861.70AL510391-8027
712124218.2348.5212.6 7.53nono yes⋯37.0910829.62118 9.5772.03MA510355-9993
8147 0157.0103.1211.8 7.10noyesno ⋯26.69 94 8.76 96 9.5361.92MO415329-9001
9117 0184.5351.6215.8 8.71nono no ⋯31.37 8029.89 90 9.7142.35LA408335-4719
1014137258.6222.0326.411.20noyesyes⋯43.9611118.87 9714.6953.02WV415330-8173
In [9]:
# Print last 10 rows in the dataSet
tail(dataSet, 10)
A data.frame: 10 × 21
Account_LengthVmail_MessageDay_MinsEve_MinsNight_MinsIntl_MinsCustServ_CallsChurnIntl_PlanVmail_Plan⋯Day_ChargeEve_CallsEve_ChargeNight_CallsNight_ChargeIntl_CallsIntl_ChargeStateArea_CodePhone
<int><int><dbl><dbl><dbl><dbl><int><fct><fct><fct>⋯<dbl><int><dbl><int><dbl><int><dbl><fct><int><fct>
3324117 0118.4249.3227.013.65yesno no ⋯20.13 9721.19 5610.22 33.67IN415362-5899
3325159 0169.8197.7193.711.61no no no ⋯28.8710516.80 82 8.72 43.13WV415377-1164
3326 78 0193.4116.9243.3 9.32no no no ⋯32.88 88 9.9410910.95 42.51OH408368-8555
3327 96 0106.6284.8178.914.91no no no ⋯18.12 8724.21 92 8.05 74.02OH415347-6812
3328 79 0134.7189.7221.411.82no no no ⋯22.90 6816.12128 9.96 53.19SC415348-3830
332919236156.2215.5279.1 9.92no no yes⋯26.5512618.32 8312.56 62.67AZ415414-4276
3330 68 0231.1153.4191.3 9.63no no no ⋯39.29 5513.04123 8.61 42.59WV415370-3271
3331 28 0180.8288.8191.914.12no no no ⋯30.74 5824.55 91 8.64 63.81RI510328-8230
3332184 0213.8159.6139.2 5.02no yesno ⋯36.35 8413.57137 6.26101.35CT510364-6381
3333 7425234.4265.9241.413.70no no yes⋯39.85 8222.60 7710.86 43.70TN415400-4344
In [10]:
# Dimention of Dataset
dim(dataSet)
  1. 3333
  2. 21
In [11]:
# Check Data types of each column
table(unlist(lapply(dataSet, class)))
 factor integer numeric 
      5       8       8 
In [12]:
# Check Data types of individual column
data.class(dataSet$Account_Length) 
data.class(dataSet$Vmail_Message) 
data.class(dataSet$Day_Mins)
data.class(dataSet$Eve_Mins)
data.class(dataSet$Night_Mins) 
data.class(dataSet$Intl_Mins)
data.class(dataSet$CustServ_Calls)
data.class(dataSet$Intl_Plan) 
data.class(dataSet$Vmail_Plan)
data.class(dataSet$Day_Calls)
data.class(dataSet$Day_Charge) 
data.class(dataSet$Eve_Calls)
data.class(dataSet$Eve_Charge) 
data.class(dataSet$Night_Calls)
data.class(dataSet$Night_Charge)
data.class(dataSet$Intl_Calls) 
data.class(dataSet$Intl_Charge)
data.class(dataSet$State) 
data.class(dataSet$Phone)
data.class(dataSet$Churn)
'numeric'
'numeric'
'numeric'
'numeric'
'numeric'
'numeric'
'numeric'
'factor'
'factor'
'numeric'
'numeric'
'numeric'
'numeric'
'numeric'
'numeric'
'numeric'
'numeric'
'factor'
'factor'
'factor'

Converting variables Intl_Plan, Vmail_Plan, State to numeric data type.

In [13]:
dataSet$Intl_Plan <- as.numeric(dataSet$Intl_Plan)
dataSet$Vmail_Plan <- as.numeric(dataSet$Vmail_Plan)
dataSet$State <- as.numeric(dataSet$State)
In [14]:
# Check Data types of each column
table(unlist(lapply(dataSet, class)))
 factor integer numeric 
      2       8      11 

4. Exploring or Summarising dataset with descriptive statistics

In [15]:
# Find out if there is missing value in rows
rowSums(is.na(dataSet))
  1. 0
  2. 0
  3. 0
  4. 0
  5. 0
  6. 0
  7. 0
  8. 0
  9. 0
  10. 0
  11. 0
  12. 0
  13. 0
  14. 0
  15. 0
  16. 0
  17. 0
  18. 0
  19. 0
  20. 0
  21. 0
  22. 0
  23. 0
  24. 0
  25. 0
  26. 0
  27. 0
  28. 0
  29. 0
  30. 0
  31. 0
  32. 0
  33. 0
  34. 0
  35. 0
  36. 0
  37. 0
  38. 0
  39. 0
  40. 0
  41. 0
  42. 0
  43. 0
  44. 0
  45. 0
  46. 0
  47. 0
  48. 0
  49. 0
  50. 0
  51. 0
  52. 0
  53. 0
  54. 0
  55. 0
  56. 0
  57. 0
  58. 0
  59. 0
  60. 0
  61. 0
  62. 0
  63. 0
  64. 0
  65. 0
  66. 0
  67. 0
  68. 0
  69. 0
  70. 0
  71. 0
  72. 0
  73. 0
  74. 0
  75. 0
  76. 0
  77. 0
  78. 0
  79. 0
  80. 0
  81. 0
  82. 0
  83. 0
  84. 0
  85. 0
  86. 0
  87. 0
  88. 0
  89. 0
  90. 0
  91. 0
  92. 0
  93. 0
  94. 0
  95. 0
  96. 0
  97. 0
  98. 0
  99. 0
  100. 0
  101. 0
  102. 0
  103. 0
  104. 0
  105. 0
  106. 0
  107. 0
  108. 0
  109. 0
  110. 0
  111. 0
  112. 0
  113. 0
  114. 0
  115. 0
  116. 0
  117. 0
  118. 0
  119. 0
  120. 0
  121. 0
  122. 0
  123. 0
  124. 0
  125. 0
  126. 0
  127. 0
  128. 0
  129. 0
  130. 0
  131. 0
  132. 0
  133. 0
  134. 0
  135. 0
  136. 0
  137. 0
  138. 0
  139. 0
  140. 0
  141. 0
  142. 0
  143. 0
  144. 0
  145. 0
  146. 0
  147. 0
  148. 0
  149. 0
  150. 0
  151. 0
  152. 0
  153. 0
  154. 0
  155. 0
  156. 0
  157. 0
  158. 0
  159. 0
  160. 0
  161. 0
  162. 0
  163. 0
  164. 0
  165. 0
  166. 0
  167. 0
  168. 0
  169. 0
  170. 0
  171. 0
  172. 0
  173. 0
  174. 0
  175. 0
  176. 0
  177. 0
  178. 0
  179. 0
  180. 0
  181. 0
  182. 0
  183. 0
  184. 0
  185. 0
  186. 0
  187. 0
  188. 0
  189. 0
  190. 0
  191. 0
  192. 0
  193. 0
  194. 0
  195. 0
  196. 0
  197. 0
  198. 0
  199. 0
  200. 0
  201. ⋯
  202. 0
  203. 0
  204. 0
  205. 0
  206. 0
  207. 0
  208. 0
  209. 0
  210. 0
  211. 0
  212. 0
  213. 0
  214. 0
  215. 0
  216. 0
  217. 0
  218. 0
  219. 0
  220. 0
  221. 0
  222. 0
  223. 0
  224. 0
  225. 0
  226. 0
  227. 0
  228. 0
  229. 0
  230. 0
  231. 0
  232. 0
  233. 0
  234. 0
  235. 0
  236. 0
  237. 0
  238. 0
  239. 0
  240. 0
  241. 0
  242. 0
  243. 0
  244. 0
  245. 0
  246. 0
  247. 0
  248. 0
  249. 0
  250. 0
  251. 0
  252. 0
  253. 0
  254. 0
  255. 0
  256. 0
  257. 0
  258. 0
  259. 0
  260. 0
  261. 0
  262. 0
  263. 0
  264. 0
  265. 0
  266. 0
  267. 0
  268. 0
  269. 0
  270. 0
  271. 0
  272. 0
  273. 0
  274. 0
  275. 0
  276. 0
  277. 0
  278. 0
  279. 0
  280. 0
  281. 0
  282. 0
  283. 0
  284. 0
  285. 0
  286. 0
  287. 0
  288. 0
  289. 0
  290. 0
  291. 0
  292. 0
  293. 0
  294. 0
  295. 0
  296. 0
  297. 0
  298. 0
  299. 0
  300. 0
  301. 0
  302. 0
  303. 0
  304. 0
  305. 0
  306. 0
  307. 0
  308. 0
  309. 0
  310. 0
  311. 0
  312. 0
  313. 0
  314. 0
  315. 0
  316. 0
  317. 0
  318. 0
  319. 0
  320. 0
  321. 0
  322. 0
  323. 0
  324. 0
  325. 0
  326. 0
  327. 0
  328. 0
  329. 0
  330. 0
  331. 0
  332. 0
  333. 0
  334. 0
  335. 0
  336. 0
  337. 0
  338. 0
  339. 0
  340. 0
  341. 0
  342. 0
  343. 0
  344. 0
  345. 0
  346. 0
  347. 0
  348. 0
  349. 0
  350. 0
  351. 0
  352. 0
  353. 0
  354. 0
  355. 0
  356. 0
  357. 0
  358. 0
  359. 0
  360. 0
  361. 0
  362. 0
  363. 0
  364. 0
  365. 0
  366. 0
  367. 0
  368. 0
  369. 0
  370. 0
  371. 0
  372. 0
  373. 0
  374. 0
  375. 0
  376. 0
  377. 0
  378. 0
  379. 0
  380. 0
  381. 0
  382. 0
  383. 0
  384. 0
  385. 0
  386. 0
  387. 0
  388. 0
  389. 0
  390. 0
  391. 0
  392. 0
  393. 0
  394. 0
  395. 0
  396. 0
  397. 0
  398. 0
  399. 0
  400. 0
  401. 0
In [16]:
# Find out if there is missing value in columns
colSums(is.na(dataSet))
Account_Length
0
Vmail_Message
0
Day_Mins
0
Eve_Mins
0
Night_Mins
0
Intl_Mins
0
CustServ_Calls
0
Churn
0
Intl_Plan
0
Vmail_Plan
0
Day_Calls
0
Day_Charge
0
Eve_Calls
0
Eve_Charge
0
Night_Calls
0
Night_Charge
0
Intl_Calls
0
Intl_Charge
0
State
0
Area_Code
0
Phone
0

Missing value checking using different packages (mice and VIM)

In [17]:
#Checking missing value with the mice package
library(mice)
md.pattern(dataSet)
Attaching package: ‘mice’


The following objects are masked from ‘package:base’:

    cbind, rbind


 /\     /\
{  `---'  }
{  O   O  }
==>  V <==  No need for mice. This data set is completely observed.
 \  \|/  /
  `-----'

A matrix: 2 × 22 of type dbl
Account_LengthVmail_MessageDay_MinsEve_MinsNight_MinsIntl_MinsCustServ_CallsChurnIntl_PlanVmail_Plan⋯Eve_CallsEve_ChargeNight_CallsNight_ChargeIntl_CallsIntl_ChargeStateArea_CodePhone
33331111111111⋯1111111110
0000000000⋯0000000000
In [18]:
#Checking missing value with the VIM package
library(VIM)
mice_plot <- aggr(dataSet, col=c('navyblue','yellow'),
                  numbers=TRUE, sortVars=TRUE,
                  labels=names(dataSet[1:21]), cex.axis=.9,
                  gap=3, ylab=c("Missing data","Pattern"))
Loading required package: colorspace

Loading required package: grid

VIM is ready to use.


Suggestions and bug-reports can be submitted at: https://github.com/statistikat/VIM/issues


Attaching package: ‘VIM’


The following object is masked from ‘package:datasets’:

    sleep


 Variables sorted by number of missings: 
       Variable Count
 Account_Length     0
  Vmail_Message     0
       Day_Mins     0
       Eve_Mins     0
     Night_Mins     0
      Intl_Mins     0
 CustServ_Calls     0
          Churn     0
      Intl_Plan     0
     Vmail_Plan     0
      Day_Calls     0
     Day_Charge     0
      Eve_Calls     0
     Eve_Charge     0
    Night_Calls     0
   Night_Charge     0
     Intl_Calls     0
    Intl_Charge     0
          State     0
      Area_Code     0
          Phone     0

After the observation, we can claim that dataset contains no missing values.

Summary of dataset

In [19]:
# Selecting just columns with numeric data type
numericalCols <- colnames(dataSet[c(1:7,9:20)])

Difference between the lapply and sapply functions (we will use them in the next 2 cells):
We use lapply - when we want to apply a function to each element of a list in turn and get a list back.
We use sapply - when we want to apply a function to each element of a list in turn, but we want a vector back, rather than a list.

Finding statistics metrics with lapply function

In [20]:
#Sum
lapply(dataSet[numericalCols], FUN = sum)
$Account_Length
336849
$Vmail_Message
26994
$Day_Mins
599190.4
$Eve_Mins
669867.5
$Night_Mins
669506.5
$Intl_Mins
34120.9
$CustServ_Calls
5209
$Intl_Plan
3656
$Vmail_Plan
4255
$Day_Calls
334752
$Day_Charge
101864.17
$Eve_Calls
333681
$Eve_Charge
56939.44
$Night_Calls
333659
$Night_Charge
30128.07
$Intl_Calls
14930
$Intl_Charge
9214.35
$State
90189
$Area_Code
1457129
In [21]:
#Mean
lapply(dataSet[numericalCols], FUN = mean)
$Account_Length
101.064806480648
$Vmail_Message
8.0990099009901
$Day_Mins
179.775097509751
$Eve_Mins
200.980348034803
$Night_Mins
200.87203720372
$Intl_Mins
10.2372937293729
$CustServ_Calls
1.56285628562856
$Intl_Plan
1.0969096909691
$Vmail_Plan
1.27662766276628
$Day_Calls
100.435643564356
$Day_Charge
30.5623072307231
$Eve_Calls
100.114311431143
$Eve_Charge
17.0835403540354
$Night_Calls
100.107710771077
$Night_Charge
9.03932493249325
$Intl_Calls
4.47944794479448
$Intl_Charge
2.76458145814581
$State
27.0594059405941
$Area_Code
437.182418241824
In [22]:
#median
lapply(dataSet[numericalCols], FUN = median)
$Account_Length
101
$Vmail_Message
0
$Day_Mins
179.4
$Eve_Mins
201.4
$Night_Mins
201.2
$Intl_Mins
10.3
$CustServ_Calls
1
$Intl_Plan
1
$Vmail_Plan
1
$Day_Calls
101
$Day_Charge
30.5
$Eve_Calls
100
$Eve_Charge
17.12
$Night_Calls
100
$Night_Charge
9.05
$Intl_Calls
4
$Intl_Charge
2.78
$State
27
$Area_Code
415
In [23]:
#Min
lapply(dataSet[numericalCols], FUN = min)
$Account_Length
1
$Vmail_Message
0
$Day_Mins
0
$Eve_Mins
0
$Night_Mins
23.2
$Intl_Mins
0
$CustServ_Calls
0
$Intl_Plan
1
$Vmail_Plan
1
$Day_Calls
0
$Day_Charge
0
$Eve_Calls
0
$Eve_Charge
0
$Night_Calls
33
$Night_Charge
1.04
$Intl_Calls
0
$Intl_Charge
0
$State
1
$Area_Code
408
In [24]:
#Max
lapply(dataSet[numericalCols], FUN = max)
$Account_Length
243
$Vmail_Message
51
$Day_Mins
350.8
$Eve_Mins
363.7
$Night_Mins
395
$Intl_Mins
20
$CustServ_Calls
9
$Intl_Plan
2
$Vmail_Plan
2
$Day_Calls
165
$Day_Charge
59.64
$Eve_Calls
170
$Eve_Charge
30.91
$Night_Calls
175
$Night_Charge
17.77
$Intl_Calls
20
$Intl_Charge
5.4
$State
51
$Area_Code
510
In [25]:
#Length
lapply(dataSet[numericalCols], FUN = length)
$Account_Length
3333
$Vmail_Message
3333
$Day_Mins
3333
$Eve_Mins
3333
$Night_Mins
3333
$Intl_Mins
3333
$CustServ_Calls
3333
$Intl_Plan
3333
$Vmail_Plan
3333
$Day_Calls
3333
$Day_Charge
3333
$Eve_Calls
3333
$Eve_Charge
3333
$Night_Calls
3333
$Night_Charge
3333
$Intl_Calls
3333
$Intl_Charge
3333
$State
3333
$Area_Code
3333

Finding statistics metrics with sapply function

In [26]:
# Sum
sapply(dataSet[numericalCols], FUN = sum)
Account_Length
336849
Vmail_Message
26994
Day_Mins
599190.4
Eve_Mins
669867.5
Night_Mins
669506.5
Intl_Mins
34120.9
CustServ_Calls
5209
Intl_Plan
3656
Vmail_Plan
4255
Day_Calls
334752
Day_Charge
101864.17
Eve_Calls
333681
Eve_Charge
56939.44
Night_Calls
333659
Night_Charge
30128.07
Intl_Calls
14930
Intl_Charge
9214.35
State
90189
Area_Code
1457129
In [27]:
# Mean
sapply(dataSet[numericalCols], FUN = mean)
Account_Length
101.064806480648
Vmail_Message
8.0990099009901
Day_Mins
179.775097509751
Eve_Mins
200.980348034803
Night_Mins
200.87203720372
Intl_Mins
10.2372937293729
CustServ_Calls
1.56285628562856
Intl_Plan
1.0969096909691
Vmail_Plan
1.27662766276628
Day_Calls
100.435643564356
Day_Charge
30.5623072307231
Eve_Calls
100.114311431143
Eve_Charge
17.0835403540354
Night_Calls
100.107710771077
Night_Charge
9.03932493249325
Intl_Calls
4.47944794479448
Intl_Charge
2.76458145814581
State
27.0594059405941
Area_Code
437.182418241824
In [28]:
# Median
sapply(dataSet[numericalCols], FUN = median)
Account_Length
101
Vmail_Message
0
Day_Mins
179.4
Eve_Mins
201.4
Night_Mins
201.2
Intl_Mins
10.3
CustServ_Calls
1
Intl_Plan
1
Vmail_Plan
1
Day_Calls
101
Day_Charge
30.5
Eve_Calls
100
Eve_Charge
17.12
Night_Calls
100
Night_Charge
9.05
Intl_Calls
4
Intl_Charge
2.78
State
27
Area_Code
415
In [29]:
# Min
sapply(dataSet[numericalCols], FUN = min)
Account_Length
1
Vmail_Message
0
Day_Mins
0
Eve_Mins
0
Night_Mins
23.2
Intl_Mins
0
CustServ_Calls
0
Intl_Plan
1
Vmail_Plan
1
Day_Calls
0
Day_Charge
0
Eve_Calls
0
Eve_Charge
0
Night_Calls
33
Night_Charge
1.04
Intl_Calls
0
Intl_Charge
0
State
1
Area_Code
408
In [30]:
# Max
sapply(dataSet[numericalCols], FUN = max)
Account_Length
243
Vmail_Message
51
Day_Mins
350.8
Eve_Mins
363.7
Night_Mins
395
Intl_Mins
20
CustServ_Calls
9
Intl_Plan
2
Vmail_Plan
2
Day_Calls
165
Day_Charge
59.64
Eve_Calls
170
Eve_Charge
30.91
Night_Calls
175
Night_Charge
17.77
Intl_Calls
20
Intl_Charge
5.4
State
51
Area_Code
510
In [31]:
# Length
sapply(dataSet[numericalCols], FUN = length)
Account_Length
3333
Vmail_Message
3333
Day_Mins
3333
Eve_Mins
3333
Night_Mins
3333
Intl_Mins
3333
CustServ_Calls
3333
Intl_Plan
3333
Vmail_Plan
3333
Day_Calls
3333
Day_Charge
3333
Eve_Calls
3333
Eve_Charge
3333
Night_Calls
3333
Night_Charge
3333
Intl_Calls
3333
Intl_Charge
3333
State
3333
Area_Code
3333

In the next few cells, you will find three different options on how to aggregate data.

In [32]:
# OPTION 1: (Using Aggregate FUNCTION - all variables together)
aggregate(dataSet[numericalCols], list(dataSet$Churn), summary)
A data.frame: 2 × 20
Group.1Account_LengthVmail_MessageDay_MinsEve_MinsNight_MinsIntl_MinsCustServ_CallsIntl_PlanVmail_PlanDay_CallsDay_ChargeEve_CallsEve_ChargeNight_CallsNight_ChargeIntl_CallsIntl_ChargeStateArea_Code
<fct><dbl[,6]><dbl[,6]><dbl[,6]><dbl[,6]><dbl[,6]><dbl[,6]><dbl[,6]><dbl[,6]><dbl[,6]><dbl[,6]><dbl[,6]><dbl[,6]><dbl[,6]><dbl[,6]><dbl[,6]><dbl[,6]><dbl[,6]><dbl[,6]><dbl[,6]>
no 1, 73, 100, 100.7937, 127, 2430, 0, 0, 8.604561, 22, 510, 142.825, 177.2, 175.1758, 210.30, 315.6 0.0, 164.5, 199.6, 199.0433, 233.20, 361.823.2, 165.90, 200.25, 200.1332, 234.90, 395.00, 8.4, 10.2, 10.15888, 12.0, 18.90, 1, 1, 1.449825, 2, 81, 1, 1, 1.065263, 1, 21, 1, 1, 1.295439, 2, 20, 87.0, 100, 100.2832, 114.0, 1630, 24.2825, 30.12, 29.78042, 35.75, 53.65 0, 87, 100, 100.0386, 114, 1700.00, 13.980, 16.97, 16.91891, 19.820, 30.7533, 87, 100, 100.0582, 113, 1751.04, 7.470, 9.01, 9.006074, 10.570, 17.770, 3, 4, 4.532982, 6, 190.00, 2.27, 2.75, 2.743404, 3.24, 5.11, 14, 27, 27.01193, 40, 51408, 408, 415, 437.0747, 510, 510
yes1, 76, 103, 102.6646, 127, 2250, 0, 0, 5.115942, 0, 480, 153.250, 217.6, 206.9141, 265.95, 350.870.9, 177.1, 211.3, 212.4101, 249.45, 363.747.4, 171.25, 204.80, 205.2317, 239.85, 354.92, 8.8, 10.6, 10.70000, 12.8, 20.00, 1, 2, 2.229814, 4, 91, 1, 1, 1.283644, 2, 21, 1, 1, 1.165631, 1, 20, 87.5, 103, 101.3354, 116.5, 1650, 26.0550, 36.99, 35.17592, 45.21, 59.6448, 87, 101, 100.5611, 114, 1686.03, 15.055, 17.96, 18.05497, 21.205, 30.9149, 85, 100, 100.3996, 115, 1582.13, 7.705, 9.22, 9.235528, 10.795, 15.971, 2, 4, 4.163561, 5, 200.54, 2.38, 2.86, 2.889545, 3.46, 5.41, 17, 27, 27.33954, 39, 51408, 408, 415, 437.8178, 510, 510
In [33]:
# OPTION 2: (Using Aggregate FUNCTION - variables separately)
aggregate(dataSet$Intl_Mins, list(dataSet$Churn), summary)
aggregate(dataSet$Day_Mins, list(dataSet$Churn), summary)
aggregate(dataSet$Night_Mins, list(dataSet$Churn), summary)
A data.frame: 2 × 2
Group.1x
<fct><dbl[,6]>
no 0, 8.4, 10.2, 10.15888, 12.0, 18.9
yes2, 8.8, 10.6, 10.70000, 12.8, 20.0
A data.frame: 2 × 2
Group.1x
<fct><dbl[,6]>
no 0, 142.825, 177.2, 175.1758, 210.30, 315.6
yes0, 153.250, 217.6, 206.9141, 265.95, 350.8
A data.frame: 2 × 2
Group.1x
<fct><dbl[,6]>
no 23.2, 165.90, 200.25, 200.1332, 234.90, 395.0
yes47.4, 171.25, 204.80, 205.2317, 239.85, 354.9
In [34]:
# OPTION 3: (Using "by" FUNCTION instead of "Aggregate" FUNCTION)
by(dataSet$Intl_Mins, dataSet[8], FUN = summary)
by(dataSet$Day_Mins, dataSet[8], FUN = summary)
by(dataSet$Night_Mins, dataSet[8], FUN = summary)
Churn: no
   Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
   0.00    8.40   10.20   10.16   12.00   18.90 
------------------------------------------------------------ 
Churn: yes
   Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
    2.0     8.8    10.6    10.7    12.8    20.0 
Churn: no
   Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
    0.0   142.8   177.2   175.2   210.3   315.6 
------------------------------------------------------------ 
Churn: yes
   Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
    0.0   153.2   217.6   206.9   265.9   350.8 
Churn: no
   Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
   23.2   165.9   200.2   200.1   234.9   395.0 
------------------------------------------------------------ 
Churn: yes
   Min. 1st Qu.  Median    Mean 3rd Qu.    Max. 
   47.4   171.2   204.8   205.2   239.8   354.9 

Find out correlation

In [35]:
# Correlations/covariances among numeric variables 
library(Hmisc)
cor(dataSet[c(2,5,11,13,16,18)], use="complete.obs", method="kendall") 
cov(dataSet[c(2,5,11,13,16,18)], use="complete.obs")
Loading required package: survival


Attaching package: ‘survival’


The following object is masked from ‘package:caret’:

    cluster


Loading required package: Formula


Attaching package: ‘Hmisc’


The following objects are masked from ‘package:base’:

    format.pval, units


A matrix: 6 × 6 of type dbl
Vmail_MessageNight_MinsDay_CallsEve_CallsNight_ChargeIntl_Charge
Vmail_Message 1.000000000 0.003718463-0.009573189-5.382921e-03 0.003710434-1.263503e-03
Night_Mins 0.003718463 1.000000000 0.012550159 3.291091e-03 0.999625309-7.103399e-03
Day_Calls-0.009573189 0.012550159 1.000000000 9.253492e-03 0.012531632 1.038631e-02
Eve_Calls-0.005382921 0.003291091 0.009253492 1.000000e+00 0.003310838-9.536135e-05
Night_Charge 0.003710434 0.999625309 0.012531632 3.310838e-03 1.000000000-7.097366e-03
Intl_Charge-0.001263503-0.007103399 0.010386309-9.536135e-05-0.007097366 1.000000e+00
A matrix: 6 × 6 of type dbl
Vmail_MessageNight_MinsDay_CallsEve_CallsNight_ChargeIntl_Charge
Vmail_Message187.37134656 5.3174453 -2.6229779 -1.59925653 0.23873433 0.02975334
Night_Mins 5.317445292557.7140018 23.2812431 -2.10859729115.09955435-0.57867377
Day_Calls -2.62297790 23.2812431402.7681409 2.58373944 1.04716693 0.32775442
Eve_Calls -1.59925653 -2.1085973 2.5837394396.91099860 -0.09322113 0.13025644
Night_Charge 0.23873433 115.0995543 1.0471669 -0.09322113 5.17959717-0.02605168
Intl_Charge 0.02975334 -0.5786738 0.3277544 0.13025644 -0.02605168 0.56817315
In [36]:
# Correlations with significance levels
rcorr(as.matrix(dataSet[c(2,5,11,13,16,18)]), type="pearson")
              Vmail_Message Night_Mins Day_Calls Eve_Calls Night_Charge
Vmail_Message          1.00       0.01     -0.01     -0.01         0.01
Night_Mins             0.01       1.00      0.02      0.00         1.00
Day_Calls             -0.01       0.02      1.00      0.01         0.02
Eve_Calls             -0.01       0.00      0.01      1.00         0.00
Night_Charge           0.01       1.00      0.02      0.00         1.00
Intl_Charge            0.00      -0.02      0.02      0.01        -0.02
              Intl_Charge
Vmail_Message        0.00
Night_Mins          -0.02
Day_Calls            0.02
Eve_Calls            0.01
Night_Charge        -0.02
Intl_Charge          1.00

n= 3333 


P
              Vmail_Message Night_Mins Day_Calls Eve_Calls Night_Charge
Vmail_Message               0.6576     0.5816    0.7350    0.6583      
Night_Mins    0.6576                   0.1855    0.9039    0.0000      
Day_Calls     0.5816        0.1855               0.7092    0.1857      
Eve_Calls     0.7350        0.9039     0.7092              0.9056      
Night_Charge  0.6583        0.0000     0.1857    0.9056                
Intl_Charge   0.8678        0.3810     0.2111    0.6167    0.3808      
              Intl_Charge
Vmail_Message 0.8678     
Night_Mins    0.3810     
Day_Calls     0.2111     
Eve_Calls     0.6167     
Night_Charge  0.3808     
Intl_Charge              

5. Visualising DataSet

In [37]:
# Pie Chart from data 
mytable <- table(dataSet$Churn)
lbls <- paste(names(mytable), "\n", mytable, sep="")
pie(mytable, labels = lbls, col=rainbow(length(lbls)), 
    main="Pie Chart of Classes\n (with sample sizes)")
In [38]:
# Barplot of categorical data
par(mfrow=c(1,1))
barplot(table(dataSet$Churn), ylab = "Count", 
        col=c("darkblue","red"))
barplot(prop.table(table(dataSet$Churn)), ylab = "Proportion", 
        col=c("darkblue","red"))
barplot(table(dataSet$Churn), xlab = "Count", horiz = TRUE, 
        col=c("darkblue","red"))
barplot(prop.table(table(dataSet$Churn)), xlab = "Proportion", horiz = TRUE, 
        col=c("darkblue","red"))
In [39]:
# Scatterplot Matrices from the glus Package 
library(gclus)
dta <- dataSet[c(2,5,11,13,16,18)] # get data 
dta.r <- abs(cor(dta)) # get correlations
dta.col <- dmat.color(dta.r) # get colors
# reorder variables so those with highest correlation are closest to the diagonal
dta.o <- order.single(dta.r) 
cpairs(dta, dta.o, panel.colors=dta.col, gap=.5, 
       main="Variables Ordered and Colored by Correlation" )
Loading required package: cluster

Visualise correlations

In [40]:
corrgram(dataSet[c(2,5,11,13,16,18)], order=TRUE, lower.panel=panel.shade,
         upper.panel=panel.pie, text.panel=panel.txt, main=" ")
In [41]:
# More graphs on correlatios amaong data
# Using "Hmisc"
res2 <- rcorr(as.matrix(dataSet[,c(2,5,11,13,16,18)]))
# Extract the correlation coefficients
res2$r
# Extract p-values
res2$P
A matrix: 6 × 6 of type dbl
Vmail_MessageNight_MinsDay_CallsEve_CallsNight_ChargeIntl_Charge
Vmail_Message 1.000000000 0.007681136-0.009548068-0.005864351 0.007663290 0.002883658
Night_Mins 0.007681136 1.000000000 0.022937845-0.002092768 0.999999215-0.015179849
Day_Calls-0.009548068 0.022937845 1.000000000 0.006462114 0.022926638 0.021666095
Eve_Calls-0.005864351-0.002092768 0.006462114 1.000000000-0.002055984 0.008673858
Night_Charge 0.007663290 0.999999215 0.022926638-0.002055984 1.000000000-0.015186139
Intl_Charge 0.002883658-0.015179849 0.021666095 0.008673858-0.015186139 1.000000000
A matrix: 6 × 6 of type dbl
Vmail_MessageNight_MinsDay_CallsEve_CallsNight_ChargeIntl_Charge
Vmail_Message NA0.65755700.58160890.73503350.65830200.8678283
Night_Mins0.6575570 NA0.18552680.90386940.00000000.3809828
Day_Calls0.58160890.1855268 NA0.70919640.18574180.2111142
Eve_Calls0.73503350.90386940.7091964 NA0.90555110.6166654
Night_Charge0.65830200.00000000.18574180.9055511 NA0.3807855
Intl_Charge0.86782830.38098280.21111420.61666540.3807855 NA
In [42]:
# Using "corrplot"
library(corrplot)
library(RColorBrewer)
corrplot(res2$r, type = "upper", order = "hclust", col=brewer.pal(n=8, name="RdYlBu"),
         tl.col = "black", tl.srt = 45)
corrplot(res2$r, type = "lower", order = "hclust", col=brewer.pal(n=8, name="RdYlBu"),
         tl.col = "black", tl.srt = 45)
corrplot 0.84 loaded

In [43]:
# Using PerformanceAnalytics
library(PerformanceAnalytics)
data <- dataSet[, c(2,5,11,13,16,18)]
chart.Correlation(data, histogram=TRUE, pch=19)
Loading required package: xts

Loading required package: zoo


Attaching package: ‘zoo’


The following objects are masked from ‘package:base’:

    as.Date, as.Date.numeric



Attaching package: ‘PerformanceAnalytics’


The following object is masked from ‘package:graphics’:

    legend


In [44]:
# Using Colored Headmap 
col <- colorRampPalette(c("blue", "white", "red"))(20)
heatmap(x = res2$r, col = col, symm = TRUE)

We should notice that Night_Mins and Night_Charge have a strong, linear, positive relationship.

6. Pre-Processing of DataSet i.e. train (75%) : test (25%) split

In [45]:
train_test_index <- createDataPartition(dataSet$Churn, p=0.75, list=FALSE)
training_dataset <- dataSet[, c(1:20)][train_test_index,]
testing_dataset  <- dataSet[, c(1:20)][-train_test_index,]
In [46]:
dim(training_dataset)
dim(testing_dataset)
  1. 2501
  2. 20
  1. 832
  2. 20

7. Cross Validation and control parameter setup

In [47]:
control <- trainControl(method="repeatedcv", # repeatedcv / adaptive_cv
                        number=2, repeats = 2, 
                        verbose = TRUE, search = "grid",
                        allowParallel = TRUE)
metric <- "Accuracy"
tuneLength = 2

8. Algorithm : SVM

In [49]:
getModelInfo("svmLinear"); getModelInfo("svmRadial"); getModelInfo("svmPoly");
$lssvmLinear
$label
'Least Squares Support Vector Machine'
$library
'kernlab'
$type
'Classification'
$parameters
A data.frame: 1 × 3
parameterclasslabel
<chr><chr><chr>
taunumericRegularization Parameter
$grid
function (x, y, len = NULL, search = "grid") 
{
    if (search == "grid") {
        out <- expand.grid(tau = 2^((1:len) - 5))
    }
    else {
        out <- data.frame(tau = 2^runif(len, min = -5, max = 10))
    }
    out
}
$loop
NULL
$fit
function (x, y, wts, param, lev, last, classProbs, ...) 
{
    kernlab::lssvm(x = as.matrix(x), y = y, tau = param$tau, 
        kernel = kernlab::polydot(degree = 1, scale = 1, offset = 1), 
        ...)
}
$predict
function (modelFit, newdata, submodels = NULL) 
{
    out <- kernlab::predict(modelFit, as.matrix(newdata))
    if (is.matrix(out)) 
        out <- out[, 1]
    out
}
$prob
NULL
$predictors
function (x, ...) 
{
    if (hasTerms(x) & !is.null(x@terms)) {
        out <- predictors.terms(x@terms)
    }
    else {
        out <- colnames(attr(x, "xmatrix"))
    }
    if (is.null(out)) 
        out <- names(attr(x, "scaling")$x.scale$`scaled:center`)
    if (is.null(out)) 
        out <- NA
    out
}
$tags
  1. 'Kernel Method'
  2. 'Support Vector Machines'
  3. 'Linear Classifier'
$levels
function (x) 
lev(x)
$sort
function (x) 
x
$svmLinear
$label
'Support Vector Machines with Linear Kernel'
$library
'kernlab'
$type
  1. 'Regression'
  2. 'Classification'
$parameters
A data.frame: 1 × 3
parameterclasslabel
<chr><chr><chr>
CnumericCost
$grid
function (x, y, len = NULL, search = "grid") 
{
    if (search == "grid") {
        out <- data.frame(C = 1)
    }
    else {
        out <- data.frame(C = 2^runif(len, min = -5, max = 10))
    }
    out
}
$loop
NULL
$fit
function (x, y, wts, param, lev, last, classProbs, ...) 
{
    if (any(names(list(...)) == "prob.model") | is.numeric(y)) {
        out <- kernlab::ksvm(x = as.matrix(x), y = y, kernel = kernlab::vanilladot(), 
            C = param$C, ...)
    }
    else {
        out <- kernlab::ksvm(x = as.matrix(x), y = y, kernel = kernlab::vanilladot(), 
            C = param$C, prob.model = classProbs, ...)
    }
    out
}
$predict
function (modelFit, newdata, submodels = NULL) 
{
    svmPred <- function(obj, x) {
        hasPM <- !is.null(unlist(obj@prob.model))
        if (hasPM) {
            pred <- kernlab::lev(obj)[apply(kernlab::predict(obj, 
                x, type = "probabilities"), 1, which.max)]
        }
        else pred <- kernlab::predict(obj, x)
        pred
    }
    out <- try(svmPred(modelFit, newdata), silent = TRUE)
    if (is.character(kernlab::lev(modelFit))) {
        if (class(out)[1] == "try-error") {
            warning("kernlab class prediction calculations failed; returning NAs")
            out <- rep("", nrow(newdata))
            out[seq(along = out)] <- NA
        }
    }
    else {
        if (class(out)[1] == "try-error") {
            warning("kernlab prediction calculations failed; returning NAs")
            out <- rep(NA, nrow(newdata))
        }
    }
    if (is.matrix(out)) 
        out <- out[, 1]
    out
}
$prob
function (modelFit, newdata, submodels = NULL) 
{
    out <- try(kernlab::predict(modelFit, newdata, type = "probabilities"), 
        silent = TRUE)
    if (class(out)[1] != "try-error") {
        if (any(out < 0)) {
            out[out < 0] <- 0
            out <- t(apply(out, 1, function(x) x/sum(x)))
        }
        out <- out[, kernlab::lev(modelFit), drop = FALSE]
    }
    else {
        warning("kernlab class probability calculations failed; returning NAs")
        out <- matrix(NA, nrow(newdata) * length(kernlab::lev(modelFit)), 
            ncol = length(kernlab::lev(modelFit)))
        colnames(out) <- kernlab::lev(modelFit)
    }
    out
}
$predictors
function (x, ...) 
{
    if (hasTerms(x) & !is.null(x@terms)) {
        out <- predictors.terms(x@terms)
    }
    else {
        out <- colnames(attr(x, "xmatrix"))
    }
    if (is.null(out)) 
        out <- names(attr(x, "scaling")$x.scale$`scaled:center`)
    if (is.null(out)) 
        out <- NA
    out
}
$tags
  1. 'Kernel Method'
  2. 'Support Vector Machines'
  3. 'Linear Regression'
  4. 'Linear Classifier'
  5. 'Robust Methods'
$levels
function (x) 
kernlab::lev(x)
$sort
function (x) 
{
    x[order(x$C), ]
}
$svmLinear2
$label
'Support Vector Machines with Linear Kernel'
$library
'e1071'
$type
  1. 'Regression'
  2. 'Classification'
$parameters
A data.frame: 1 × 3
parameterclasslabel
<chr><chr><chr>
costnumericCost
$grid
function (x, y, len = NULL, search = "grid") 
{
    if (search == "grid") {
        out <- expand.grid(cost = 2^((1:len) - 3))
    }
    else {
        out <- data.frame(cost = 2^runif(len, min = -5, max = 10))
    }
    out
}
$loop
NULL
$fit
function (x, y, wts, param, lev, last, classProbs, ...) 
{
    if (any(names(list(...)) == "probability") | is.numeric(y)) {
        out <- e1071::svm(x = as.matrix(x), y = y, kernel = "linear", 
            cost = param$cost, ...)
    }
    else {
        out <- e1071::svm(x = as.matrix(x), y = y, kernel = "linear", 
            cost = param$cost, probability = classProbs, ...)
    }
    out
}
$predict
function (modelFit, newdata, submodels = NULL) 
{
    predict(modelFit, newdata)
}
$prob
function (modelFit, newdata, submodels = NULL) 
{
    out <- predict(modelFit, newdata, probability = TRUE)
    attr(out, "probabilities")
}
$predictors
function (x, ...) 
{
    out <- if (!is.null(x$terms)) 
        predictors.terms(x$terms)
    else x$xNames
    if (is.null(out)) 
        out <- names(attr(x, "scaling")$x.scale$`scaled:center`)
    if (is.null(out)) 
        out <- NA
    out
}
$tags
  1. 'Kernel Method'
  2. 'Support Vector Machines'
  3. 'Linear Regression'
  4. 'Linear Classifier'
  5. 'Robust Methods'
$levels
function (x) 
x$levels
$sort
function (x) 
{
    x[order(x$cost), ]
}
$svmLinear3
$label
'L2 Regularized Support Vector Machine (dual) with Linear Kernel'
$library
'LiblineaR'
$type
  1. 'Regression'
  2. 'Classification'
$parameters
A data.frame: 2 × 3
parameterclasslabel
<chr><chr><chr>
costnumeric Cost
LosscharacterLoss Function
$grid
function (x, y, len = NULL, search = "grid") 
{
    if (search == "grid") {
        out <- expand.grid(cost = 2^((1:len) - 3), Loss = c("L1", 
            "L2"))
    }
    else {
        out <- data.frame(cost = 2^runif(len, min = -10, max = 10), 
            Loss = sample(c("L1", "L2"), size = len, replace = TRUE))
    }
    out
}
$loop
NULL
$fit
function (x, y, wts, param, lev, last, classProbs, ...) 
{
    if (param$Loss == "L2") {
        model_type <- if (is.factor(y)) 
            2
        else 12
    }
    else model_type <- if (is.factor(y)) 
        3
    else 13
    out <- LiblineaR::LiblineaR(data = as.matrix(x), target = y, 
        cost = param$cost, type = model_type, ...)
    out
}
$predict
function (modelFit, newdata, submodels = NULL) 
{
    predict(modelFit, newdata)$predictions
}
$prob
NULL
$predictors
function (x, ...) 
{
    out <- colnames(x$W)
    out[out != "Bias"]
}
$tags
  1. 'Kernel Method'
  2. 'Support Vector Machines'
  3. 'Linear Regression'
  4. 'Linear Classifier'
  5. 'Robust Methods'
$levels
function (x) 
x$levels
$sort
function (x) 
{
    x[order(x$cost), ]
}
$svmLinearWeights
$label
'Linear Support Vector Machines with Class Weights'
$library
'e1071'
$type
'Classification'
$parameters
A data.frame: 2 × 3
parameterclasslabel
<chr><chr><chr>
cost numericCost
weightnumericClass Weight
$grid
function (x, y, len = NULL, search = "grid") 
{
    if (search == "grid") {
        out <- expand.grid(cost = 2^((1:len) - 3), weight = 1:len)
    }
    else {
        out <- data.frame(cost = 2^runif(len, min = -5, max = 10), 
            weight = runif(len, min = 1, max = 25))
    }
    out
}
$loop
NULL
$fit
function (x, y, wts, param, lev, last, classProbs, ...) 
{
    if (length(levels(y)) != 2) 
        stop("Currently implemented for 2-class problems")
    cwts <- c(1, param$weight)
    names(cwts) <- levels(y)
    out <- e1071::svm(x = as.matrix(x), y = y, kernel = "linear", 
        cost = param$cost, probability = classProbs, class.weights = cwts, 
        ...)
    out
}
$predict
function (modelFit, newdata, submodels = NULL) 
{
    predict(modelFit, newdata)
}
$prob
function (modelFit, newdata, submodels = NULL) 
{
    out <- predict(modelFit, newdata, probability = TRUE)
    attr(out, "probabilities")
}
$predictors
function (x, ...) 
{
    out <- if (!is.null(x$terms)) 
        predictors.terms(x$terms)
    else x$xNames
    if (is.null(out)) 
        out <- names(attr(x, "scaling")$x.scale$`scaled:center`)
    if (is.null(out)) 
        out <- NA
    out
}
$tags
  1. 'Kernel Method'
  2. 'Support Vector Machines'
  3. 'Linear Classifier'
  4. 'Robust Methods'
  5. 'Cost Sensitive Learning'
  6. 'Two Class Only'
$levels
function (x) 
x$levels
$sort
function (x) 
{
    x[order(x$cost, x$weight), ]
}
$svmLinearWeights2
$label
'L2 Regularized Linear Support Vector Machines with Class Weights'
$library
'LiblineaR'
$type
'Classification'
$parameters
A data.frame: 3 × 3
parameterclasslabel
<chr><chr><chr>
cost numeric Cost
Loss characterLoss Function
weightnumeric Class Weight
$grid
function (x, y, len = NULL, search = "grid") 
{
    if (search == "grid") {
        out <- expand.grid(cost = 2^((1:len) - 3), Loss = c("L1", 
            "L2"), weight = 1:len)
    }
    else {
        out <- data.frame(cost = 2^runif(len, min = -10, max = 10), 
            Loss = sample(c("L1", "L2"), size = len, replace = TRUE), 
            weight = runif(len, min = 1, max = 25))
    }
    out
}
$loop
NULL
$fit
function (x, y, wts, param, lev, last, classProbs, ...) 
{
    model_type <- if (param$Loss == "L2") 
        2
    else 3
    if (length(levels(y)) != 2) 
        stop("Currently implemented for 2-class problems")
    cwts <- c(1, param$weight)
    names(cwts) <- levels(y)
    out <- LiblineaR::LiblineaR(data = as.matrix(x), target = y, 
        cost = param$cost, type = model_type, wi = cwts, ...)
    out
}
$predict
function (modelFit, newdata, submodels = NULL) 
{
    predict(modelFit, newdata)$predictions
}
$prob
NULL
$predictors
function (x, ...) 
{
    out <- colnames(x$W)
    out[out != "Bias"]
}
$tags
  1. 'Kernel Method'
  2. 'Support Vector Machines'
  3. 'Linear Classifier'
  4. 'Robust Methods'
  5. 'Cost Sensitive Learning'
  6. 'Two Class Only'
$levels
function (x) 
x$levels
$sort
function (x) 
{
    x[order(x$cost), ]
}
$lssvmRadial
$label
'Least Squares Support Vector Machine with Radial Basis Function Kernel'
$library
'kernlab'
$type
'Classification'
$parameters
A data.frame: 2 × 3
parameterclasslabel
<chr><chr><chr>
sigmanumericSigma
tau numericRegularization Parameter
$grid
function (x, y, len = NULL, search = "grid") 
{
    sigmas <- kernlab::sigest(as.matrix(x), na.action = na.omit, 
        scaled = TRUE)
    if (search == "grid") {
        out <- expand.grid(sigma = seq(min(sigmas), max(sigmas), 
            length = min(6, len)), tau = 2^((1:len) - 5))
    }
    else {
        rng <- extendrange(log(sigmas), f = 0.75)
        out <- data.frame(sigma = exp(runif(len, min = rng[1], 
            max = rng[2])), tau = 2^runif(len, min = -5, max = 10))
    }
    out
}
$loop
NULL
$fit
function (x, y, wts, param, lev, last, classProbs, ...) 
{
    kernlab::lssvm(x = as.matrix(x), y = y, tau = param$tau, 
        kernel = "rbfdot", kpar = list(sigma = param$sigma), 
        ...)
}
$predict
function (modelFit, newdata, submodels = NULL) 
{
    out <- kernlab::predict(modelFit, as.matrix(newdata))
    if (is.matrix(out)) 
        out <- out[, 1]
    out
}
$prob
NULL
$predictors
function (x, ...) 
{
    if (hasTerms(x) & !is.null(x@terms)) {
        out <- predictors.terms(x@terms)
    }
    else {
        out <- colnames(attr(x, "xmatrix"))
    }
    if (is.null(out)) 
        out <- names(attr(x, "scaling")$x.scale$`scaled:center`)
    if (is.null(out)) 
        out <- NA
    out
}
$tags
  1. 'Kernel Method'
  2. 'Support Vector Machines'
  3. 'Radial Basis Function'
$levels
function (x) 
lev(x)
$sort
function (x) 
x
$svmRadial
$label
'Support Vector Machines with Radial Basis Function Kernel'
$library
'kernlab'
$type
  1. 'Regression'
  2. 'Classification'
$parameters
A data.frame: 2 × 3
parameterclasslabel
<chr><chr><chr>
sigmanumericSigma
C numericCost
$grid
function (x, y, len = NULL, search = "grid") 
{
    sigmas <- kernlab::sigest(as.matrix(x), na.action = na.omit, 
        scaled = TRUE)
    if (search == "grid") {
        out <- expand.grid(sigma = mean(as.vector(sigmas[-2])), 
            C = 2^((1:len) - 3))
    }
    else {
        rng <- extendrange(log(sigmas), f = 0.75)
        out <- data.frame(sigma = exp(runif(len, min = rng[1], 
            max = rng[2])), C = 2^runif(len, min = -5, max = 10))
    }
    out
}
$loop
NULL
$fit
function (x, y, wts, param, lev, last, classProbs, ...) 
{
    if (any(names(list(...)) == "prob.model") | is.numeric(y)) {
        out <- kernlab::ksvm(x = as.matrix(x), y = y, kernel = "rbfdot", 
            kpar = list(sigma = param$sigma), C = param$C, ...)
    }
    else {
        out <- kernlab::ksvm(x = as.matrix(x), y = y, kernel = "rbfdot", 
            kpar = list(sigma = param$sigma), C = param$C, prob.model = classProbs, 
            ...)
    }
    out
}
$predict
function (modelFit, newdata, submodels = NULL) 
{
    svmPred <- function(obj, x) {
        hasPM <- !is.null(unlist(obj@prob.model))
        if (hasPM) {
            pred <- kernlab::lev(obj)[apply(kernlab::predict(obj, 
                x, type = "probabilities"), 1, which.max)]
        }
        else pred <- kernlab::predict(obj, x)
        pred
    }
    out <- try(svmPred(modelFit, newdata), silent = TRUE)
    if (is.character(kernlab::lev(modelFit))) {
        if (class(out)[1] == "try-error") {
            warning("kernlab class prediction calculations failed; returning NAs")
            out <- rep("", nrow(newdata))
            out[seq(along = out)] <- NA
        }
    }
    else {
        if (class(out)[1] == "try-error") {
            warning("kernlab prediction calculations failed; returning NAs")
            out <- rep(NA, nrow(newdata))
        }
    }
    if (is.matrix(out)) 
        out <- out[, 1]
    out
}
$prob
function (modelFit, newdata, submodels = NULL) 
{
    out <- try(kernlab::predict(modelFit, newdata, type = "probabilities"), 
        silent = TRUE)
    if (class(out)[1] != "try-error") {
        if (any(out < 0)) {
            out[out < 0] <- 0
            out <- t(apply(out, 1, function(x) x/sum(x)))
        }
        out <- out[, kernlab::lev(modelFit), drop = FALSE]
    }
    else {
        warning("kernlab class probability calculations failed; returning NAs")
        out <- matrix(NA, nrow(newdata) * length(kernlab::lev(modelFit)), 
            ncol = length(kernlab::lev(modelFit)))
        colnames(out) <- kernlab::lev(modelFit)
    }
    out
}
$predictors
function (x, ...) 
{
    if (hasTerms(x) & !is.null(x@terms)) {
        out <- predictors.terms(x@terms)
    }
    else {
        out <- colnames(attr(x, "xmatrix"))
    }
    if (is.null(out)) 
        out <- names(attr(x, "scaling")$x.scale$`scaled:center`)
    if (is.null(out)) 
        out <- NA
    out
}
$tags
  1. 'Kernel Method'
  2. 'Support Vector Machines'
  3. 'Radial Basis Function'
  4. 'Robust Methods'
$levels
function (x) 
kernlab::lev(x)
$sort
function (x) 
{
    x[order(x$C, -x$sigma), ]
}
$svmRadialCost
$label
'Support Vector Machines with Radial Basis Function Kernel'
$library
'kernlab'
$type
  1. 'Regression'
  2. 'Classification'
$parameters
A data.frame: 1 × 3
parameterclasslabel
<chr><chr><chr>
CnumericCost
$grid
function (x, y, len = NULL, search = "grid") 
{
    if (search == "grid") {
        out <- data.frame(C = 2^((1:len) - 3))
    }
    else {
        out <- data.frame(C = 2^runif(len, min = -5, max = 10))
    }
    out
}
$loop
NULL
$fit
function (x, y, wts, param, lev, last, classProbs, ...) 
{
    if (any(names(list(...)) == "prob.model") | is.numeric(y)) {
        out <- kernlab::ksvm(x = as.matrix(x), y = y, kernel = "rbfdot", 
            C = param$C, ...)
    }
    else {
        out <- kernlab::ksvm(x = as.matrix(x), y = y, kernel = "rbfdot", 
            C = param$C, prob.model = classProbs, ...)
    }
    out
}
$predict
function (modelFit, newdata, submodels = NULL) 
{
    svmPred <- function(obj, x) {
        hasPM <- !is.null(unlist(obj@prob.model))
        if (hasPM) {
            pred <- kernlab::lev(obj)[apply(kernlab::predict(obj, 
                x, type = "probabilities"), 1, which.max)]
        }
        else pred <- kernlab::predict(obj, x)
        pred
    }
    out <- try(svmPred(modelFit, newdata), silent = TRUE)
    if (is.character(kernlab::lev(modelFit))) {
        if (class(out)[1] == "try-error") {
            warning("kernlab class prediction calculations failed; returning NAs")
            out <- rep("", nrow(newdata))
            out[seq(along = out)] <- NA
        }
    }
    else {
        if (class(out)[1] == "try-error") {
            warning("kernlab prediction calculations failed; returning NAs")
            out <- rep(NA, nrow(newdata))
        }
    }
    if (is.matrix(out)) 
        out <- out[, 1]
    out
}
$prob
function (modelFit, newdata, submodels = NULL) 
{
    out <- try(kernlab::predict(modelFit, newdata, type = "probabilities"), 
        silent = TRUE)
    if (class(out)[1] != "try-error") {
        if (any(out < 0)) {
            out[out < 0] <- 0
            out <- t(apply(out, 1, function(x) x/sum(x)))
        }
        out <- out[, kernlab::lev(modelFit), drop = FALSE]
    }
    else {
        warning("kernlab class probability calculations failed; returning NAs")
        out <- matrix(NA, nrow(newdata) * length(kernlab::lev(modelFit)), 
            ncol = length(kernlab::lev(modelFit)))
        colnames(out) <- kernlab::lev(modelFit)
    }
    out
}
$predictors
function (x, ...) 
{
    if (hasTerms(x) & !is.null(x@terms)) {
        out <- predictors.terms(x@terms)
    }
    else {
        out <- colnames(attr(x, "xmatrix"))
    }
    if (is.null(out)) 
        out <- names(attr(x, "scaling")$x.scale$`scaled:center`)
    if (is.null(out)) 
        out <- NA
    out
}
$tags
  1. 'Kernel Method'
  2. 'Support Vector Machines'
  3. 'Radial Basis Function'
$levels
function (x) 
kernlab::lev(x)
$sort
function (x) 
{
    x[order(x$C), ]
}
$svmRadialSigma
$label
'Support Vector Machines with Radial Basis Function Kernel'
$library
'kernlab'
$type
  1. 'Regression'
  2. 'Classification'
$parameters
A data.frame: 2 × 3
parameterclasslabel
<chr><chr><chr>
sigmanumericSigma
C numericCost
$grid
function (x, y, len = NULL, search = "grid") 
{
    sigmas <- kernlab::sigest(as.matrix(x), na.action = na.omit, 
        scaled = TRUE)
    if (search == "grid") {
        out <- expand.grid(sigma = seq(min(sigmas), max(sigmas), 
            length = min(6, len)), C = 2^((1:len) - 3))
    }
    else {
        rng <- extendrange(log(sigmas), f = 0.75)
        out <- data.frame(sigma = exp(runif(len, min = rng[1], 
            max = rng[2])), C = 2^runif(len, min = -5, max = 10))
    }
    out
}
$loop
NULL
$fit
function (x, y, wts, param, lev, last, classProbs, ...) 
{
    if (any(names(list(...)) == "prob.model") | is.numeric(y)) {
        out <- kernlab::ksvm(x = as.matrix(x), y = y, kernel = "rbfdot", 
            kpar = list(sigma = param$sigma), C = param$C, ...)
    }
    else {
        out <- kernlab::ksvm(x = as.matrix(x), y = y, kernel = "rbfdot", 
            kpar = list(sigma = param$sigma), C = param$C, prob.model = classProbs, 
            ...)
    }
    out
}
$predict
function (modelFit, newdata, submodels = NULL) 
{
    svmPred <- function(obj, x) {
        hasPM <- !is.null(unlist(obj@prob.model))
        if (hasPM) {
            pred <- kernlab::lev(obj)[apply(kernlab::predict(obj, 
                x, type = "probabilities"), 1, which.max)]
        }
        else pred <- kernlab::predict(obj, x)
        pred
    }
    out <- try(svmPred(modelFit, newdata), silent = TRUE)
    if (is.character(kernlab::lev(modelFit))) {
        if (class(out)[1] == "try-error") {
            warning("kernlab class prediction calculations failed; returning NAs")
            out <- rep("", nrow(newdata))
            out[seq(along = out)] <- NA
        }
    }
    else {
        if (class(out)[1] == "try-error") {
            warning("kernlab prediction calculations failed; returning NAs")
            out <- rep(NA, nrow(newdata))
        }
    }
    if (is.matrix(out)) 
        out <- out[, 1]
    out
}
$prob
function (modelFit, newdata, submodels = NULL) 
{
    out <- try(kernlab::predict(modelFit, newdata, type = "probabilities"), 
        silent = TRUE)
    if (class(out)[1] != "try-error") {
        if (any(out < 0)) {
            out[out < 0] <- 0
            out <- t(apply(out, 1, function(x) x/sum(x)))
        }
        out <- out[, kernlab::lev(modelFit), drop = FALSE]
    }
    else {
        warning("kernlab class probability calculations failed; returning NAs")
        out <- matrix(NA, nrow(newdata) * length(kernlab::lev(modelFit)), 
            ncol = length(kernlab::lev(modelFit)))
        colnames(out) <- kernlab::lev(modelFit)
    }
    out
}
$predictors
function (x, ...) 
{
    if (hasTerms(x) & !is.null(x@terms)) {
        out <- predictors.terms(x@terms)
    }
    else {
        out <- colnames(attr(x, "xmatrix"))
    }
    if (is.null(out)) 
        out <- names(attr(x, "scaling")$x.scale$`scaled:center`)
    if (is.null(out)) 
        out <- NA
    out
}
$notes
'This SVM model tunes over the cost parameter and the RBF kernel parameter sigma. In the latter case, using `tuneLength` will, at most, evaluate six values of the kernel parameter. This enables a broad search over the cost parameter and a relatively narrow search over `sigma`'
$tags
  1. 'Kernel Method'
  2. 'Support Vector Machines'
  3. 'Radial Basis Function'
  4. 'Robust Methods'
$levels
function (x) 
kernlab::lev(x)
$sort
function (x) 
{
    x[order(x$C, -x$sigma), ]
}
$svmRadialWeights
$label
'Support Vector Machines with Class Weights'
$library
'kernlab'
$type
'Classification'
$parameters
A data.frame: 3 × 3
parameterclasslabel
<chr><chr><chr>
sigma numericSigma
C numericCost
WeightnumericWeight
$grid
function (x, y, len = NULL, search = "grid") 
{
    sigmas <- kernlab::sigest(as.matrix(x), na.action = na.omit, 
        scaled = TRUE)
    if (search == "grid") {
        out <- expand.grid(sigma = mean(as.vector(sigmas[-2])), 
            C = 2^((1:len) - 3), Weight = 1:len)
    }
    else {
        rng <- extendrange(log(sigmas), f = 0.75)
        out <- data.frame(sigma = exp(runif(len, min = rng[1], 
            max = rng[2])), C = 2^runif(len, min = -5, max = 10), 
            Weight = runif(len, min = 1, max = 25))
    }
    out
}
$loop
NULL
$fit
function (x, y, wts, param, lev, last, classProbs, ...) 
{
    if (param$Weight != 1) {
        wts <- c(param$Weight, 1)
        names(wts) <- levels(y)
    }
    else wts <- NULL
    if (any(names(list(...)) == "prob.model") | is.numeric(y)) {
        out <- kernlab::ksvm(x = as.matrix(x), y = y, kernel = "rbfdot", 
            kpar = list(sigma = param$sigma), class.weights = wts, 
            C = param$C, ...)
    }
    else {
        out <- kernlab::ksvm(x = as.matrix(x), y = y, kernel = "rbfdot", 
            kpar = list(sigma = param$sigma), class.weights = wts, 
            C = param$C, prob.model = classProbs, ...)
    }
    out
}
$predict
function (modelFit, newdata, submodels = NULL) 
{
    out <- kernlab::predict(modelFit, newdata)
    if (is.matrix(out)) 
        out <- out[, 1]
    out
}
$prob
function (modelFit, newdata, submodels = NULL) 
{
    out <- try(kernlab::predict(modelFit, newdata, type = "probabilities"), 
        silent = TRUE)
    if (class(out)[1] != "try-error") {
        if (any(out < 0)) {
            out[out < 0] <- 0
            out <- t(apply(out, 1, function(x) x/sum(x)))
        }
        out <- out[, kernlab::lev(modelFit), drop = FALSE]
    }
    else {
        warning("kernlab class probability calculations failed; returning NAs")
        out <- matrix(NA, nrow(newdata) * length(kernlab::lev(modelFit)), 
            ncol = length(kernlab::lev(modelFit)))
        colnames(out) <- kernlab::lev(modelFit)
    }
    out
}
$predictors
function (x, ...) 
{
    if (hasTerms(x) & !is.null(x@terms)) {
        out <- predictors.terms(x@terms)
    }
    else {
        out <- colnames(attr(x, "xmatrix"))
    }
    if (is.null(out)) 
        out <- names(attr(x, "scaling")$x.scale$`scaled:center`)
    if (is.null(out)) 
        out <- NA
    out
}
$tags
  1. 'Kernel Method'
  2. 'Support Vector Machines'
  3. 'Radial Basis Function'
  4. 'Cost Sensitive Learning'
  5. 'Two Class Only'
$levels
function (x) 
kernlab::lev(x)
$sort
function (x) 
x[order(x$C, -x$sigma, x$Weight), ]
$lssvmPoly
$label
'Least Squares Support Vector Machine with Polynomial Kernel'
$library
'kernlab'
$type
'Classification'
$parameters
A data.frame: 3 × 3
parameterclasslabel
<chr><chr><chr>
degreenumericPolynomial Degree
scale numericScale
tau numericRegularization Parameter
$grid
function (x, y, len = NULL, search = "grid") 
{
    if (search == "grid") {
        out <- expand.grid(degree = seq(1, min(len, 3)), scale = 10^((1:len) - 
            4), tau = 2^((1:len) - 5))
    }
    else {
        out <- data.frame(degree = sample(1:3, size = len, replace = TRUE), 
            scale = 10^runif(len, min = -5, log10(2)), tau = 2^runif(len, 
                min = -5, max = 10))
    }
    out
}
$loop
NULL
$fit
function (x, y, wts, param, lev, last, classProbs, ...) 
{
    kernlab::lssvm(x = as.matrix(x), y = y, tau = param$tau, 
        kernel = kernlab::polydot(degree = param$degree, scale = param$scale, 
            offset = 1), ...)
}
$predict
function (modelFit, newdata, submodels = NULL) 
{
    out <- kernlab::predict(modelFit, as.matrix(newdata))
    if (is.matrix(out)) 
        out <- out[, 1]
    out
}
$prob
NULL
$predictors
function (x, ...) 
{
    if (hasTerms(x) & !is.null(x@terms)) {
        out <- predictors.terms(x@terms)
    }
    else {
        out <- colnames(attr(x, "xmatrix"))
    }
    if (is.null(out)) 
        out <- names(attr(x, "scaling")$xscale$`scaled:center`)
    if (is.null(out)) 
        out <- NA
    out
}
$tags
  1. 'Kernel Method'
  2. 'Support Vector Machines'
  3. 'Polynomial Model'
$levels
function (x) 
lev(x)
$sort
function (x) 
x
$svmPoly
$label
'Support Vector Machines with Polynomial Kernel'
$library
'kernlab'
$type
  1. 'Regression'
  2. 'Classification'
$parameters
A data.frame: 3 × 3
parameterclasslabel
<chr><chr><chr>
degreenumericPolynomial Degree
scale numericScale
C numericCost
$grid
function (x, y, len = NULL, search = "grid") 
{
    if (search == "grid") {
        out <- expand.grid(degree = seq(1, min(len, 3)), scale = 10^((1:len) - 
            4), C = 2^((1:len) - 3))
    }
    else {
        out <- data.frame(degree = sample(1:3, size = len, replace = TRUE), 
            scale = 10^runif(len, min = -5, log10(2)), C = 2^runif(len, 
                min = -5, max = 10))
    }
    out
}
$loop
NULL
$fit
function (x, y, wts, param, lev, last, classProbs, ...) 
{
    if (any(names(list(...)) == "prob.model") | is.numeric(y)) {
        out <- kernlab::ksvm(x = as.matrix(x), y = y, kernel = kernlab::polydot(degree = param$degree, 
            scale = param$scale, offset = 1), C = param$C, ...)
    }
    else {
        out <- kernlab::ksvm(x = as.matrix(x), y = y, kernel = kernlab::polydot(degree = param$degree, 
            scale = param$scale, offset = 1), C = param$C, prob.model = classProbs, 
            ...)
    }
    out
}
$predict
function (modelFit, newdata, submodels = NULL) 
{
    svmPred <- function(obj, x) {
        hasPM <- !is.null(unlist(obj@prob.model))
        if (hasPM) {
            pred <- kernlab::lev(obj)[apply(kernlab::predict(obj, 
                x, type = "probabilities"), 1, which.max)]
        }
        else pred <- kernlab::predict(obj, x)
        pred
    }
    out <- try(svmPred(modelFit, newdata), silent = TRUE)
    if (is.character(kernlab::lev(modelFit))) {
        if (class(out)[1] == "try-error") {
            warning("kernlab class prediction calculations failed; returning NAs")
            out <- rep("", nrow(newdata))
            out[seq(along = out)] <- NA
        }
    }
    else {
        if (class(out)[1] == "try-error") {
            warning("kernlab prediction calculations failed; returning NAs")
            out <- rep(NA, nrow(newdata))
        }
    }
    if (is.matrix(out)) 
        out <- out[, 1]
    out
}
$prob
function (modelFit, newdata, submodels = NULL) 
{
    out <- try(kernlab::predict(modelFit, newdata, type = "probabilities"), 
        silent = TRUE)
    if (class(out)[1] != "try-error") {
        if (any(out < 0)) {
            out[out < 0] <- 0
            out <- t(apply(out, 1, function(x) x/sum(x)))
        }
        out <- out[, kernlab::lev(modelFit), drop = FALSE]
    }
    else {
        warning("kernlab class probability calculations failed; returning NAs")
        out <- matrix(NA, nrow(newdata) * length(kernlab::lev(modelFit)), 
            ncol = length(kernlab::lev(modelFit)))
        colnames(out) <- kernlab::lev(modelFit)
    }
    out
}
$predictors
function (x, ...) 
{
    if (hasTerms(x) & !is.null(x@terms)) {
        out <- predictors.terms(x@terms)
    }
    else {
        out <- colnames(attr(x, "xmatrix"))
    }
    if (is.null(out)) 
        out <- names(attr(x, "scaling")$xscale$`scaled:center`)
    if (is.null(out)) 
        out <- NA
    out
}
$tags
  1. 'Kernel Method'
  2. 'Support Vector Machines'
  3. 'Polynomial Model'
  4. 'Robust Methods'
$levels
function (x) 
kernlab::lev(x)
$sort
function (x) 
x[order(x$degree, x$C, x$scale), ]
In [50]:
names(getModelInfo("svm"))
  1. 'lssvmLinear'
  2. 'lssvmPoly'
  3. 'lssvmRadial'
  4. 'ORFsvm'
  5. 'svmBoundrangeString'
  6. 'svmExpoString'
  7. 'svmLinear'
  8. 'svmLinear2'
  9. 'svmLinear3'
  10. 'svmLinearWeights'
  11. 'svmLinearWeights2'
  12. 'svmPoly'
  13. 'svmRadial'
  14. 'svmRadialCost'
  15. 'svmRadialSigma'
  16. 'svmRadialWeights'
  17. 'svmSpectrumString'

1) Training - without explicit parameter tuning / using default

In [73]:
# svmLinear
fit.svmLinear <- caret::train(Churn~., data=training_dataset, method="svmLinear", 
                              metric=metric, 
                              trControl=control,
                              verbose = TRUE
)
print(fit.svmLinear)
Aggregating results
Fitting final model on full training set
Support Vector Machines with Linear Kernel 

2501 samples
  19 predictor
   2 classes: 'no', 'yes' 

No pre-processing
Resampling: Cross-Validated (2 fold, repeated 2 times) 
Summary of sample sizes: 1250, 1251, 1251, 1250 
Resampling results:

  Accuracy   Kappa
  0.8548582  0    

Tuning parameter 'C' was held constant at a value of 1
In [74]:
# svmRadial
fit.svmRadial <- caret::train(Churn~., data=training_dataset, method="svmRadial", 
                              metric=metric, 
                              trControl=control,
                              verbose = TRUE
)
print(fit.svmRadial)
Aggregating results
Selecting tuning parameters
Fitting sigma = 0.0331, C = 1 on full training set
Support Vector Machines with Radial Basis Function Kernel 

2501 samples
  19 predictor
   2 classes: 'no', 'yes' 

No pre-processing
Resampling: Cross-Validated (2 fold, repeated 2 times) 
Summary of sample sizes: 1251, 1250, 1250, 1251 
Resampling results across tuning parameters:

  C     Accuracy   Kappa    
  0.25  0.8558574  0.0116202
  0.50  0.8756486  0.2422584
  1.00  0.8956422  0.4350147

Tuning parameter 'sigma' was held constant at a value of 0.03310822
Accuracy was used to select the optimal model using the largest value.
The final values used for the model were sigma = 0.03310822 and C = 1.
In [75]:
# svmPoly
fit.svmPoly <- caret::train(Churn~., data=training_dataset, method="svmPoly", 
                            metric=metric, 
                            trControl=control,
                            verbose = TRUE
)
print(fit.svmPoly)
Aggregating results
Selecting tuning parameters
Fitting degree = 3, scale = 0.1, C = 0.25 on full training set
Support Vector Machines with Polynomial Kernel 

2501 samples
  19 predictor
   2 classes: 'no', 'yes' 

No pre-processing
Resampling: Cross-Validated (2 fold, repeated 2 times) 
Summary of sample sizes: 1251, 1250, 1250, 1251 
Resampling results across tuning parameters:

  degree  scale  C     Accuracy   Kappa      
  1       0.001  0.25  0.8548582  0.000000000
  1       0.001  0.50  0.8548582  0.000000000
  1       0.001  1.00  0.8548582  0.000000000
  1       0.010  0.25  0.8548582  0.000000000
  1       0.010  0.50  0.8548582  0.000000000
  1       0.010  1.00  0.8548582  0.000000000
  1       0.100  0.25  0.8548582  0.000000000
  1       0.100  0.50  0.8548582  0.000000000
  1       0.100  1.00  0.8548582  0.000000000
  2       0.001  0.25  0.8548582  0.000000000
  2       0.001  0.50  0.8548582  0.000000000
  2       0.001  1.00  0.8548582  0.000000000
  2       0.010  0.25  0.8548582  0.000000000
  2       0.010  0.50  0.8548582  0.000000000
  2       0.010  1.00  0.8574571  0.030040180
  2       0.100  0.25  0.9058395  0.548116063
  2       0.100  0.50  0.9054406  0.565423514
  2       0.100  1.00  0.9024419  0.559400967
  3       0.001  0.25  0.8548582  0.000000000
  3       0.001  0.50  0.8548582  0.000000000
  3       0.001  1.00  0.8548582  0.000000000
  3       0.010  0.25  0.8554580  0.007026524
  3       0.010  0.50  0.8636552  0.105467971
  3       0.010  1.00  0.8794483  0.283427083
  3       0.100  0.25  0.9090373  0.598891399
  3       0.100  0.50  0.9032388  0.587678103
  3       0.100  1.00  0.8926424  0.555832982

Accuracy was used to select the optimal model using the largest value.
The final values used for the model were degree = 3, scale = 0.1 and C = 0.25.

2) Training - with explicit parameter tuning using preProcess method

In [76]:
# svmLinear
fit.svmLinear_preProc <- caret::train(Churn~., data=training_dataset, method="svmLinear", 
                                      metric=metric, 
                                      trControl=control,
                                      preProc=c("center", "scale", "pca"), 
                                      verbose = TRUE
)
print(fit.svmLinear_preProc)
Aggregating results
Fitting final model on full training set
Support Vector Machines with Linear Kernel 

2501 samples
  19 predictor
   2 classes: 'no', 'yes' 

Pre-processing: centered (19), scaled (19), principal component
 signal extraction (19) 
Resampling: Cross-Validated (2 fold, repeated 2 times) 
Summary of sample sizes: 1251, 1250, 1250, 1251 
Resampling results:

  Accuracy   Kappa
  0.8548582  0    

Tuning parameter 'C' was held constant at a value of 1
In [77]:
# svmRadial
fit.svmRadial_preProc <- caret::train(Churn~., data=training_dataset, method="svmRadial", 
                                      metric=metric, 
                                      trControl=control,
                                      preProc=c("center", "scale", "pca"), 
                                      verbose = TRUE
)
print(fit.svmRadial_preProc)
Aggregating results
Selecting tuning parameters
Fitting sigma = 0.0491, C = 1 on full training set
Support Vector Machines with Radial Basis Function Kernel 

2501 samples
  19 predictor
   2 classes: 'no', 'yes' 

Pre-processing: centered (19), scaled (19), principal component
 signal extraction (19) 
Resampling: Cross-Validated (2 fold, repeated 2 times) 
Summary of sample sizes: 1251, 1250, 1250, 1251 
Resampling results across tuning parameters:

  C     Accuracy   Kappa    
  0.25  0.8548582  0.0000000
  0.50  0.8642526  0.1058326
  1.00  0.8900433  0.3778059

Tuning parameter 'sigma' was held constant at a value of 0.0490987
Accuracy was used to select the optimal model using the largest value.
The final values used for the model were sigma = 0.0490987 and C = 1.
In [78]:
# svmPoly
fit.svmPoly_preProc <- caret::train(Churn~., data=training_dataset, method="svmPoly", 
                                    metric=metric, 
                                    trControl=control,
                                    preProc=c("center", "scale", "pca"), 
                                    verbose = TRUE
)
print(fit.svmPoly_preProc)
Aggregating results
Selecting tuning parameters
Fitting degree = 3, scale = 0.1, C = 0.25 on full training set
Support Vector Machines with Polynomial Kernel 

2501 samples
  19 predictor
   2 classes: 'no', 'yes' 

Pre-processing: centered (19), scaled (19), principal component
 signal extraction (19) 
Resampling: Cross-Validated (2 fold, repeated 2 times) 
Summary of sample sizes: 1251, 1250, 1251, 1250 
Resampling results across tuning parameters:

  degree  scale  C     Accuracy   Kappa     
  1       0.001  0.25  0.8548582  0.00000000
  1       0.001  0.50  0.8548582  0.00000000
  1       0.001  1.00  0.8548582  0.00000000
  1       0.010  0.25  0.8548582  0.00000000
  1       0.010  0.50  0.8548582  0.00000000
  1       0.010  1.00  0.8548582  0.00000000
  1       0.100  0.25  0.8548582  0.00000000
  1       0.100  0.50  0.8548582  0.00000000
  1       0.100  1.00  0.8548582  0.00000000
  2       0.001  0.25  0.8548582  0.00000000
  2       0.001  0.50  0.8548582  0.00000000
  2       0.001  1.00  0.8548582  0.00000000
  2       0.010  0.25  0.8548582  0.00000000
  2       0.010  0.50  0.8548582  0.00000000
  2       0.010  1.00  0.8548582  0.00000000
  2       0.100  0.25  0.8926459  0.42965743
  2       0.100  0.50  0.8984433  0.50078337
  2       0.100  1.00  0.8992428  0.52311505
  3       0.001  0.25  0.8548582  0.00000000
  3       0.001  0.50  0.8548582  0.00000000
  3       0.001  1.00  0.8548582  0.00000000
  3       0.010  0.25  0.8548582  0.00000000
  3       0.010  0.50  0.8548582  0.00000000
  3       0.010  1.00  0.8574576  0.03020616
  3       0.100  0.25  0.9032409  0.53728468
  3       0.100  0.50  0.9016409  0.55131624
  3       0.100  1.00  0.8960430  0.54436065

Accuracy was used to select the optimal model using the largest value.
The final values used for the model were degree = 3, scale = 0.1 and C = 0.25.

3) Training - with explicit parameter tuning using preProcess method & Automatic Grid i.e. tuneLength

In [79]:
# svmLinear
fit.svmLinear_automaticGrid <- caret::train(Churn~., data=training_dataset, method="svmLinear", 
                                            metric=metric, 
                                            trControl=control,
                                            preProc=c("center", "scale", "pca"), 
                                            tuneLength = tuneLength,
                                            verbose = TRUE
)
print(fit.svmLinear_automaticGrid)
Aggregating results
Fitting final model on full training set
Support Vector Machines with Linear Kernel 

2501 samples
  19 predictor
   2 classes: 'no', 'yes' 

Pre-processing: centered (19), scaled (19), principal component
 signal extraction (19) 
Resampling: Cross-Validated (2 fold, repeated 2 times) 
Summary of sample sizes: 1250, 1251, 1250, 1251 
Resampling results:

  Accuracy   Kappa
  0.8548582  0    

Tuning parameter 'C' was held constant at a value of 1
In [80]:
# svmRadial
fit.svmRadial_automaticGrid <- caret::train(Churn~., data=training_dataset, method="svmRadial", 
                                            metric=metric, 
                                            trControl=control,
                                            preProc=c("center", "scale", "pca"), 
                                            tuneLength = tuneLength,
                                            verbose = TRUE
)
print(fit.svmRadial_automaticGrid)
Aggregating results
Selecting tuning parameters
Fitting sigma = 0.0499, C = 0.5 on full training set
Support Vector Machines with Radial Basis Function Kernel 

2501 samples
  19 predictor
   2 classes: 'no', 'yes' 

Pre-processing: centered (19), scaled (19), principal component
 signal extraction (19) 
Resampling: Cross-Validated (2 fold, repeated 2 times) 
Summary of sample sizes: 1251, 1250, 1251, 1250 
Resampling results across tuning parameters:

  C     Accuracy   Kappa    
  0.25  0.8548582  0.0000000
  0.50  0.8636534  0.1035911

Tuning parameter 'sigma' was held constant at a value of 0.04993854
Accuracy was used to select the optimal model using the largest value.
The final values used for the model were sigma = 0.04993854 and C = 0.5.
In [81]:
# svmPoly
fit.svmPoly_automaticGrid <- caret::train(Churn~., data=training_dataset, method="svmPoly", 
                                          metric=metric, 
                                          trControl=control,
                                          preProc=c("center", "scale", "pca"), 
                                          tuneLength = tuneLength,
                                          verbose = TRUE
)
print(fit.svmPoly_automaticGrid)
Aggregating results
Selecting tuning parameters
Fitting degree = 1, scale = 0.001, C = 0.25 on full training set
Support Vector Machines with Polynomial Kernel 

2501 samples
  19 predictor
   2 classes: 'no', 'yes' 

Pre-processing: centered (19), scaled (19), principal component
 signal extraction (19) 
Resampling: Cross-Validated (2 fold, repeated 2 times) 
Summary of sample sizes: 1250, 1251, 1250, 1251 
Resampling results across tuning parameters:

  degree  scale  C     Accuracy   Kappa
  1       0.001  0.25  0.8548582  0    
  1       0.001  0.50  0.8548582  0    
  1       0.010  0.25  0.8548582  0    
  1       0.010  0.50  0.8548582  0    
  2       0.001  0.25  0.8548582  0    
  2       0.001  0.50  0.8548582  0    
  2       0.010  0.25  0.8548582  0    
  2       0.010  0.50  0.8548582  0    

Accuracy was used to select the optimal model using the largest value.
The final values used for the model were degree = 1, scale = 0.001 and C = 0.25.

4) Training - with explicit parameter tuning using preProcess method & Manual Grid i.e. tuneGrid

Grid needs to parameterise manually for each particular algorithm

In [82]:
# svmLinear
grid <- expand.grid(C=c(seq(from = 1, to = 5, by = 0.5)))
fit.svmLinear_manualGrid <- caret::train(Churn~., data=training_dataset, method="svmLinear", 
                                         metric=metric, 
                                         trControl=control,
                                         preProc=c("center", "scale", "pca"), 
                                         tuneGrid = grid,
                                         verbose = TRUE
)
print(fit.svmLinear_manualGrid)
plot(fit.svmLinear_manualGrid)
Aggregating results
Selecting tuning parameters
Fitting C = 1 on full training set
Support Vector Machines with Linear Kernel 

2501 samples
  19 predictor
   2 classes: 'no', 'yes' 

Pre-processing: centered (19), scaled (19), principal component
 signal extraction (19) 
Resampling: Cross-Validated (2 fold, repeated 2 times) 
Summary of sample sizes: 1250, 1251, 1250, 1251 
Resampling results across tuning parameters:

  C    Accuracy   Kappa
  1.0  0.8548582  0    
  1.5  0.8548582  0    
  2.0  0.8548582  0    
  2.5  0.8548582  0    
  3.0  0.8548582  0    
  3.5  0.8548582  0    
  4.0  0.8548582  0    
  4.5  0.8548582  0    
  5.0  0.8548582  0    

Accuracy was used to select the optimal model using the largest value.
The final value used for the model was C = 1.
In [83]:
# svmRadial
grid <- expand.grid(C     = c(seq(from = 1, to = 5, by = 0.5)),
                    sigma = c(seq(from = 0.1, to = 1, by = 0.1))
)
fit.svmRadial_manualGrid <- caret::train(Churn~., data=training_dataset, method="svmRadial", 
                                         metric=metric, 
                                         trControl=control,
                                         preProc=c("center", "scale", "pca"), 
                                         tuneGrid = grid,
                                         verbose = TRUE
)
print(fit.svmRadial_manualGrid)
plot(fit.svmRadial_manualGrid)
Aggregating results
Selecting tuning parameters
Fitting sigma = 0.1, C = 3 on full training set
Support Vector Machines with Radial Basis Function Kernel 

2501 samples
  19 predictor
   2 classes: 'no', 'yes' 

Pre-processing: centered (19), scaled (19), principal component
 signal extraction (19) 
Resampling: Cross-Validated (2 fold, repeated 2 times) 
Summary of sample sizes: 1251, 1250, 1251, 1250 
Resampling results across tuning parameters:

  C    sigma  Accuracy   Kappa      
  1.0  0.1    0.8910436  0.395101052
  1.0  0.2    0.8770491  0.263986504
  1.0  0.3    0.8616558  0.082270494
  1.0  0.4    0.8560579  0.014053048
  1.0  0.5    0.8550582  0.002353195
  1.0  0.6    0.8548582  0.000000000
  1.0  0.7    0.8548582  0.000000000
  1.0  0.8    0.8548582  0.000000000
  1.0  0.9    0.8548582  0.000000000
  1.0  1.0    0.8548582  0.000000000
  1.5  0.1    0.8994398  0.479467316
  1.5  0.2    0.8900443  0.398313543
  1.5  0.3    0.8710515  0.207242730
  1.5  0.4    0.8614560  0.085221306
  1.5  0.5    0.8558579  0.017438650
  1.5  0.6    0.8550582  0.002353195
  1.5  0.7    0.8548582  0.000000000
  1.5  0.8    0.8548582  0.000000000
  1.5  0.9    0.8548582  0.000000000
  1.5  1.0    0.8548582  0.000000000
  2.0  0.1    0.9000395  0.504361056
  2.0  0.2    0.8922428  0.427059231
  2.0  0.3    0.8718508  0.224707540
  2.0  0.4    0.8614560  0.095038285
  2.0  0.5    0.8556580  0.018936195
  2.0  0.6    0.8544584  0.001146119
  2.0  0.7    0.8548584  0.001955066
  2.0  0.8    0.8548582  0.000000000
  2.0  0.9    0.8548582  0.000000000
  2.0  1.0    0.8548582  0.000000000
  2.5  0.1    0.9020390  0.522895363
  2.5  0.2    0.8914432  0.424427823
  2.5  0.3    0.8720508  0.228016585
  2.5  0.4    0.8614561  0.100069704
  2.5  0.5    0.8556580  0.018936195
  2.5  0.6    0.8544584  0.001146119
  2.5  0.7    0.8548584  0.001955066
  2.5  0.8    0.8548582  0.000000000
  2.5  0.9    0.8548582  0.000000000
  2.5  1.0    0.8548582  0.000000000
  3.0  0.1    0.9034380  0.536488374
  3.0  0.2    0.8902432  0.420058273
  3.0  0.3    0.8716510  0.228112624
  3.0  0.4    0.8612563  0.099602705
  3.0  0.5    0.8556580  0.018936195
  3.0  0.6    0.8544584  0.001146119
  3.0  0.7    0.8548584  0.001955066
  3.0  0.8    0.8548582  0.000000000
  3.0  0.9    0.8548582  0.000000000
  3.0  1.0    0.8548582  0.000000000
  3.5  0.1    0.9014385  0.532802322
  3.5  0.2    0.8898435  0.420461089
  3.5  0.3    0.8710513  0.227597045
  3.5  0.4    0.8612563  0.099602705
  3.5  0.5    0.8556580  0.018936195
  3.5  0.6    0.8544584  0.001146119
  3.5  0.7    0.8548584  0.001955066
  3.5  0.8    0.8548582  0.000000000
  3.5  0.9    0.8548582  0.000000000
  3.5  1.0    0.8548582  0.000000000
  4.0  0.1    0.9002382  0.534163603
  4.0  0.2    0.8878440  0.414978448
  4.0  0.3    0.8710510  0.230432537
  4.0  0.4    0.8610563  0.099096612
  4.0  0.5    0.8556580  0.018936195
  4.0  0.6    0.8544584  0.001146119
  4.0  0.7    0.8548584  0.001955066
  4.0  0.8    0.8548582  0.000000000
  4.0  0.9    0.8548582  0.000000000
  4.0  1.0    0.8548582  0.000000000
  4.5  0.1    0.9004384  0.539666102
  4.5  0.2    0.8872443  0.413919157
  4.5  0.3    0.8712508  0.232350709
  4.5  0.4    0.8610563  0.099096612
  4.5  0.5    0.8556580  0.018936195
  4.5  0.6    0.8544584  0.001146119
  4.5  0.7    0.8548584  0.001955066
  4.5  0.8    0.8548582  0.000000000
  4.5  0.9    0.8548582  0.000000000
  4.5  1.0    0.8548582  0.000000000
  5.0  0.1    0.9000390  0.539396659
  5.0  0.2    0.8878440  0.418393524
  5.0  0.3    0.8710508  0.230455586
  5.0  0.4    0.8610563  0.099096612
  5.0  0.5    0.8556580  0.018936195
  5.0  0.6    0.8544584  0.001146119
  5.0  0.7    0.8548584  0.001955066
  5.0  0.8    0.8548582  0.000000000
  5.0  0.9    0.8548582  0.000000000
  5.0  1.0    0.8548582  0.000000000

Accuracy was used to select the optimal model using the largest value.
The final values used for the model were sigma = 0.1 and C = 3.
In [84]:
# svmPoly
grid <- expand.grid(C     = c(seq(from = 1, to = 5, by = 0.25)),
                    scale = c(seq(from = 0.001, to = 0.010, by = 0.001)),
                    degree = c(seq(from = 1, to = 10, by = 1))
)
fit.svmPoly_manualGrid <- caret::train(Churn~., data=training_dataset, method="svmPoly", 
                                       metric=metric, 
                                       trControl=control,
                                       preProc=c("center", "scale", "pca"), 
                                       tuneGrid = grid,
                                       verbose = TRUE
)
print(fit.svmPoly_manualGrid)
plot(fit.svmPoly_manualGrid)
Aggregating results
Selecting tuning parameters
Fitting degree = 10, scale = 0.009, C = 4.25 on full training set
Support Vector Machines with Polynomial Kernel 

2501 samples
  19 predictor
   2 classes: 'no', 'yes' 

Pre-processing: centered (19), scaled (19), principal component
 signal extraction (19) 
Resampling: Cross-Validated (2 fold, repeated 2 times) 
Summary of sample sizes: 1250, 1251, 1250, 1251 
Resampling results across tuning parameters:

  C     scale  degree  Accuracy   Kappa      
  1.00  0.001   1      0.8548582  0.000000000
  1.00  0.001   2      0.8548582  0.000000000
  1.00  0.001   3      0.8548582  0.000000000
  1.00  0.001   4      0.8548582  0.000000000
  1.00  0.001   5      0.8548582  0.000000000
  1.00  0.001   6      0.8548582  0.000000000
  1.00  0.001   7      0.8548582  0.000000000
  1.00  0.001   8      0.8548582  0.000000000
  1.00  0.001   9      0.8548582  0.000000000
  1.00  0.001  10      0.8548582  0.000000000
  1.00  0.002   1      0.8548582  0.000000000
  1.00  0.002   2      0.8548582  0.000000000
  1.00  0.002   3      0.8548582  0.000000000
  1.00  0.002   4      0.8548582  0.000000000
  1.00  0.002   5      0.8548582  0.000000000
  1.00  0.002   6      0.8548582  0.000000000
  1.00  0.002   7      0.8548582  0.000000000
  1.00  0.002   8      0.8548582  0.000000000
  1.00  0.002   9      0.8548582  0.000000000
  1.00  0.002  10      0.8548582  0.000000000
  1.00  0.003   1      0.8548582  0.000000000
  1.00  0.003   2      0.8548582  0.000000000
  1.00  0.003   3      0.8548582  0.000000000
  1.00  0.003   4      0.8548582  0.000000000
  1.00  0.003   5      0.8548582  0.000000000
  1.00  0.003   6      0.8548582  0.000000000
  1.00  0.003   7      0.8548582  0.000000000
  1.00  0.003   8      0.8560579  0.014016854
  1.00  0.003   9      0.8570574  0.027457856
  1.00  0.003  10      0.8600563  0.063246904
  1.00  0.004   1      0.8548582  0.000000000
  1.00  0.004   2      0.8548582  0.000000000
  1.00  0.004   3      0.8548582  0.000000000
  1.00  0.004   4      0.8548582  0.000000000
  1.00  0.004   5      0.8548582  0.000000000
  1.00  0.004   6      0.8558580  0.011696471
  1.00  0.004   7      0.8572574  0.031645341
  1.00  0.004   8      0.8620553  0.086992444
  1.00  0.004   9      0.8658547  0.146646561
  1.00  0.004  10      0.8712523  0.213636693
  1.00  0.005   1      0.8548582  0.000000000
  1.00  0.005   2      0.8548582  0.000000000
  1.00  0.005   3      0.8548582  0.000000000
  1.00  0.005   4      0.8548582  0.000000000
  1.00  0.005   5      0.8558580  0.011696471
  1.00  0.005   6      0.8592566  0.054318694
  1.00  0.005   7      0.8636548  0.119865678
  1.00  0.005   8      0.8702532  0.203009851
  1.00  0.005   9      0.8750502  0.258235867
  1.00  0.005  10      0.8816481  0.327107215
  1.00  0.006   1      0.8548582  0.000000000
  1.00  0.006   2      0.8548582  0.000000000
  1.00  0.006   3      0.8548582  0.000000000
  1.00  0.006   4      0.8550582  0.002353195
  1.00  0.006   5      0.8580571  0.040763415
  1.00  0.006   6      0.8638548  0.125111764
  1.00  0.006   7      0.8720523  0.222717083
  1.00  0.006   8      0.8762497  0.272764890
  1.00  0.006   9      0.8838470  0.348722418
  1.00  0.006  10      0.8864459  0.377695450
  1.00  0.007   1      0.8548582  0.000000000
  1.00  0.007   2      0.8548582  0.000000000
  1.00  0.007   3      0.8548582  0.000000000
  1.00  0.007   4      0.8564576  0.020546433
  1.00  0.007   5      0.8624552  0.099687622
  1.00  0.007   6      0.8712524  0.213764196
  1.00  0.007   7      0.8776491  0.285345622
  1.00  0.007   8      0.8842470  0.352115948
  1.00  0.007   9      0.8880451  0.393810533
  1.00  0.007  10      0.8908438  0.423250231
  1.00  0.008   1      0.8548582  0.000000000
  1.00  0.008   2      0.8548582  0.000000000
  1.00  0.008   3      0.8548582  0.000000000
  1.00  0.008   4      0.8592566  0.054318694
  1.00  0.008   5      0.8680537  0.174675046
  1.00  0.008   6      0.8756500  0.264873106
  1.00  0.008   7      0.8842470  0.353084396
  1.00  0.008   8      0.8878454  0.394043677
  1.00  0.008   9      0.8908440  0.426580259
  1.00  0.008  10      0.8922432  0.443012573
  1.00  0.009   1      0.8548582  0.000000000
  1.00  0.009   2      0.8548582  0.000000000
  1.00  0.009   3      0.8558580  0.011696471
  1.00  0.009   4      0.8628550  0.102260835
  1.00  0.009   5      0.8732515  0.237044193
  1.00  0.009   6      0.8832476  0.342652685
  1.00  0.009   7      0.8868457  0.382810897
  1.00  0.009   8      0.8906440  0.424347031
  1.00  0.009   9      0.8922432  0.443012573
  1.00  0.009  10      0.8960425  0.471488371
  1.00  0.010   1      0.8548582  0.000000000
  1.00  0.010   2      0.8548582  0.000000000
  1.00  0.010   3      0.8564576  0.020546433
  1.00  0.010   4      0.8662545  0.152052900
  1.00  0.010   5      0.8756500  0.267345526
  1.00  0.010   6      0.8848468  0.359151466
  1.00  0.010   7      0.8904441  0.416700875
  1.00  0.010   8      0.8910436  0.435063437
  1.00  0.010   9      0.8950427  0.465229136
  1.00  0.010  10      0.8966417  0.480257219
  1.25  0.001   1      0.8548582  0.000000000
  1.25  0.001   2      0.8548582  0.000000000
  1.25  0.001   3      0.8548582  0.000000000
  1.25  0.001   4      0.8548582  0.000000000
  1.25  0.001   5      0.8548582  0.000000000
  1.25  0.001   6      0.8548582  0.000000000
  1.25  0.001   7      0.8548582  0.000000000
  1.25  0.001   8      0.8548582  0.000000000
  1.25  0.001   9      0.8548582  0.000000000
  1.25  0.001  10      0.8548582  0.000000000
  1.25  0.002   1      0.8548582  0.000000000
  1.25  0.002   2      0.8548582  0.000000000
  1.25  0.002   3      0.8548582  0.000000000
  1.25  0.002   4      0.8548582  0.000000000
  1.25  0.002   5      0.8548582  0.000000000
  1.25  0.002   6      0.8548582  0.000000000
  1.25  0.002   7      0.8548582  0.000000000
  1.25  0.002   8      0.8548582  0.000000000
  1.25  0.002   9      0.8548582  0.000000000
  1.25  0.002  10      0.8556582  0.009358009
  1.25  0.003   1      0.8548582  0.000000000
  1.25  0.003   2      0.8548582  0.000000000
  1.25  0.003   3      0.8548582  0.000000000
  1.25  0.003   4      0.8548582  0.000000000
  1.25  0.003   5      0.8548582  0.000000000
  1.25  0.003   6      0.8548582  0.000000000
  1.25  0.003   7      0.8558580  0.011696471
  1.25  0.003   8      0.8568574  0.025122989
  1.25  0.003   9      0.8602563  0.065452632
  1.25  0.003  10      0.8636548  0.115308991
  1.25  0.004   1      0.8548582  0.000000000
  1.25  0.004   2      0.8548582  0.000000000
  1.25  0.004   3      0.8548582  0.000000000
  1.25  0.004   4      0.8548582  0.000000000
  1.25  0.004   5      0.8548582  0.000000000
  1.25  0.004   6      0.8566574  0.022848945
  1.25  0.004   7      0.8610560  0.074263174
  1.25  0.004   8      0.8650548  0.138903317
  1.25  0.004   9      0.8708526  0.209951901
  1.25  0.004  10      0.8750504  0.258010241
  1.25  0.005   1      0.8548582  0.000000000
  1.25  0.005   2      0.8548582  0.000000000
  1.25  0.005   3      0.8548582  0.000000000
  1.25  0.005   4      0.8548582  0.000000000
  1.25  0.005   5      0.8568574  0.025122989
  1.25  0.005   6      0.8624552  0.094713945
  1.25  0.005   7      0.8684540  0.181107241
  1.25  0.005   8      0.8742508  0.249140389
  1.25  0.005   9      0.8814478  0.324242527
  1.25  0.005  10      0.8844470  0.354761185
  1.25  0.006   1      0.8548582  0.000000000
  1.25  0.006   2      0.8548582  0.000000000
  1.25  0.006   3      0.8548582  0.000000000
  1.25  0.006   4      0.8562579  0.016315702
  1.25  0.006   5      0.8620555  0.088674947
  1.25  0.006   6      0.8692537  0.188796190
  1.25  0.006   7      0.8758499  0.265454935
  1.25  0.006   8      0.8830476  0.342056896
  1.25  0.006   9      0.8868462  0.377843640
  1.25  0.006  10      0.8896446  0.411310325
  1.25  0.007   1      0.8548582  0.000000000
  1.25  0.007   2      0.8548582  0.000000000
  1.25  0.007   3      0.8548582  0.000000000
  1.25  0.007   4      0.8590568  0.052051309
  1.25  0.007   5      0.8664548  0.156840098
  1.25  0.007   6      0.8750505  0.257760807
  1.25  0.007   7      0.8832476  0.343624405
  1.25  0.007   8      0.8870457  0.383304519
  1.25  0.007   9      0.8908438  0.422331175
  1.25  0.007  10      0.8906438  0.432956969
  1.25  0.008   1      0.8548582  0.000000000
  1.25  0.008   2      0.8548582  0.000000000
  1.25  0.008   3      0.8558580  0.011696471
  1.25  0.008   4      0.8626550  0.100100831
  1.25  0.008   5      0.8730518  0.235179989
  1.25  0.008   6      0.8822478  0.334420982
  1.25  0.008   7      0.8876457  0.385153770
  1.25  0.008   8      0.8910438  0.424665473
  1.25  0.008   9      0.8908436  0.436873122
  1.25  0.008  10      0.8946428  0.465288800
  1.25  0.009   1      0.8548582  0.000000000
  1.25  0.009   2      0.8548582  0.000000000
  1.25  0.009   3      0.8564576  0.020546433
  1.25  0.009   4      0.8664548  0.156840098
  1.25  0.009   5      0.8760500  0.273538768
  1.25  0.009   6      0.8852467  0.363267901
  1.25  0.009   7      0.8896443  0.413320114
  1.25  0.009   8      0.8904440  0.433969689
  1.25  0.009   9      0.8946428  0.465288800
  1.25  0.009  10      0.8964424  0.480303444
  1.25  0.010   1      0.8548582  0.000000000
  1.25  0.010   2      0.8548582  0.000000000
  1.25  0.010   3      0.8590568  0.052051309
  1.25  0.010   4      0.8708528  0.211477424
  1.25  0.010   5      0.8832473  0.343723674
  1.25  0.010   6      0.8888454  0.402270412
  1.25  0.010   7      0.8902435  0.430762332
  1.25  0.010   8      0.8946428  0.465440017
  1.25  0.010   9      0.8962424  0.478972727
  1.25  0.010  10      0.8984412  0.500521189
  1.50  0.001   1      0.8548582  0.000000000
  1.50  0.001   2      0.8548582  0.000000000
  1.50  0.001   3      0.8548582  0.000000000
  1.50  0.001   4      0.8548582  0.000000000
  1.50  0.001   5      0.8548582  0.000000000
  1.50  0.001   6      0.8548582  0.000000000
  1.50  0.001   7      0.8548582  0.000000000
  1.50  0.001   8      0.8548582  0.000000000
  1.50  0.001   9      0.8548582  0.000000000
  1.50  0.001  10      0.8548582  0.000000000
  1.50  0.002   1      0.8548582  0.000000000
  1.50  0.002   2      0.8548582  0.000000000
  1.50  0.002   3      0.8548582  0.000000000
  1.50  0.002   4      0.8548582  0.000000000
  1.50  0.002   5      0.8548582  0.000000000
  1.50  0.002   6      0.8548582  0.000000000
  1.50  0.002   7      0.8548582  0.000000000
  1.50  0.002   8      0.8548582  0.000000000
  1.50  0.002   9      0.8552582  0.004688062
  1.50  0.002  10      0.8562579  0.016315702
  1.50  0.003   1      0.8548582  0.000000000
  1.50  0.003   2      0.8548582  0.000000000
  1.50  0.003   3      0.8548582  0.000000000
  1.50  0.003   4      0.8548582  0.000000000
  1.50  0.003   5      0.8548582  0.000000000
  1.50  0.003   6      0.8548582  0.000000000
  1.50  0.003   7      0.8562577  0.018226050
  1.50  0.003   8      0.8594566  0.056541210
  1.50  0.003   9      0.8628552  0.103909074
  1.50  0.003  10      0.8666547  0.157294478
  1.50  0.004   1      0.8548582  0.000000000
  1.50  0.004   2      0.8548582  0.000000000
  1.50  0.004   3      0.8548582  0.000000000
  1.50  0.004   4      0.8548582  0.000000000
  1.50  0.004   5      0.8558580  0.011696471
  1.50  0.004   6      0.8584569  0.045287758
  1.50  0.004   7      0.8636548  0.115308991
  1.50  0.004   8      0.8692537  0.190194742
  1.50  0.004   9      0.8738510  0.246641266
  1.50  0.004  10      0.8790488  0.301577205
  1.50  0.005   1      0.8548582  0.000000000
  1.50  0.005   2      0.8548582  0.000000000
  1.50  0.005   3      0.8548582  0.000000000
  1.50  0.005   4      0.8554582  0.007041257
  1.50  0.005   5      0.8592566  0.054318694
  1.50  0.005   6      0.8652548  0.140817416
  1.50  0.005   7      0.8730518  0.233888504
  1.50  0.005   8      0.8784491  0.295101367
  1.50  0.005   9      0.8842470  0.353084396
  1.50  0.005  10      0.8876457  0.385153770
  1.50  0.006   1      0.8548582  0.000000000
  1.50  0.006   2      0.8548582  0.000000000
  1.50  0.006   3      0.8548582  0.000000000
  1.50  0.006   4      0.8568574  0.025122989
  1.50  0.006   5      0.8648548  0.134959347
  1.50  0.006   6      0.8728516  0.234428340
  1.50  0.006   7      0.8812478  0.321304413
  1.50  0.006   8      0.8852467  0.363267901
  1.50  0.006   9      0.8900451  0.410585307
  1.50  0.006  10      0.8906436  0.428694798
  1.50  0.007   1      0.8548582  0.000000000
  1.50  0.007   2      0.8548582  0.000000000
  1.50  0.007   3      0.8552582  0.004688062
  1.50  0.007   4      0.8620555  0.086915881
  1.50  0.007   5      0.8706528  0.209532587
  1.50  0.007   6      0.8794486  0.305961624
  1.50  0.007   7      0.8860464  0.370548018
  1.50  0.007   8      0.8898448  0.411988181
  1.50  0.007   9      0.8900436  0.430143632
  1.50  0.007  10      0.8940427  0.460460125
  1.50  0.008   1      0.8548582  0.000000000
  1.50  0.008   2      0.8548582  0.000000000
  1.50  0.008   3      0.8562577  0.018226050
  1.50  0.008   4      0.8652548  0.142168648
  1.50  0.008   5      0.8760499  0.268482326
  1.50  0.008   6      0.8850468  0.360594293
  1.50  0.008   7      0.8896449  0.410447321
  1.50  0.008   8      0.8904433  0.432265176
  1.50  0.008   9      0.8944427  0.464791730
  1.50  0.008  10      0.8964422  0.476562554
  1.50  0.009   1      0.8548582  0.000000000
  1.50  0.009   2      0.8548582  0.000000000
  1.50  0.009   3      0.8588568  0.049811814
  1.50  0.009   4      0.8702532  0.204451881
  1.50  0.009   5      0.8828473  0.340473972
  1.50  0.009   6      0.8890456  0.398265268
  1.50  0.009   7      0.8902435  0.428349669
  1.50  0.009   8      0.8942425  0.463385948
  1.50  0.009   9      0.8962422  0.475929281
  1.50  0.009  10      0.8980416  0.494936543
  1.50  0.010   1      0.8548582  0.000000000
  1.50  0.010   2      0.8548582  0.000000000
  1.50  0.010   3      0.8622556  0.090706917
  1.50  0.010   4      0.8750504  0.258017346
  1.50  0.010   5      0.8858460  0.368191766
  1.50  0.010   6      0.8894444  0.417073007
  1.50  0.010   7      0.8928428  0.452723698
  1.50  0.010   8      0.8956420  0.473941654
  1.50  0.010   9      0.8974419  0.490907517
  1.50  0.010  10      0.8992409  0.510499484
  1.75  0.001   1      0.8548582  0.000000000
  1.75  0.001   2      0.8548582  0.000000000
  1.75  0.001   3      0.8548582  0.000000000
  1.75  0.001   4      0.8548582  0.000000000
  1.75  0.001   5      0.8548582  0.000000000
  1.75  0.001   6      0.8548582  0.000000000
  1.75  0.001   7      0.8548582  0.000000000
  1.75  0.001   8      0.8548582  0.000000000
  1.75  0.001   9      0.8548582  0.000000000
  1.75  0.001  10      0.8548582  0.000000000
  1.75  0.002   1      0.8548582  0.000000000
  1.75  0.002   2      0.8548582  0.000000000
  1.75  0.002   3      0.8548582  0.000000000
  1.75  0.002   4      0.8548582  0.000000000
  1.75  0.002   5      0.8548582  0.000000000
  1.75  0.002   6      0.8548582  0.000000000
  1.75  0.002   7      0.8548582  0.000000000
  1.75  0.002   8      0.8548582  0.000000000
  1.75  0.002   9      0.8560580  0.013995319
  1.75  0.002  10      0.8568574  0.025122989
  1.75  0.003   1      0.8548582  0.000000000
  1.75  0.003   2      0.8548582  0.000000000
  1.75  0.003   3      0.8548582  0.000000000
  1.75  0.003   4      0.8548582  0.000000000
  1.75  0.003   5      0.8548582  0.000000000
  1.75  0.003   6      0.8558580  0.011696471
  1.75  0.003   7      0.8570574  0.027457856
  1.75  0.003   8      0.8620555  0.086915881
  1.75  0.003   9      0.8654548  0.142718213
  1.75  0.003  10      0.8704531  0.204910015
  1.75  0.004   1      0.8548582  0.000000000
  1.75  0.004   2      0.8548582  0.000000000
  1.75  0.004   3      0.8548582  0.000000000
  1.75  0.004   4      0.8548582  0.000000000
  1.75  0.004   5      0.8562577  0.018226050
  1.75  0.004   6      0.8610560  0.074263174
  1.75  0.004   7      0.8664545  0.153910147
  1.75  0.004   8      0.8730518  0.233888504
  1.75  0.004   9      0.8764497  0.276948155
  1.75  0.004  10      0.8832476  0.343624405
  1.75  0.005   1      0.8548582  0.000000000
  1.75  0.005   2      0.8548582  0.000000000
  1.75  0.005   3      0.8548582  0.000000000
  1.75  0.005   4      0.8562579  0.016315702
  1.75  0.005   5      0.8620555  0.086915881
  1.75  0.005   6      0.8686539  0.181387810
  1.75  0.005   7      0.8752504  0.261055749
  1.75  0.005   8      0.8824476  0.336116544
  1.75  0.005   9      0.8860464  0.371566492
  1.75  0.005  10      0.8900449  0.411523465
  1.75  0.006   1      0.8548582  0.000000000
  1.75  0.006   2      0.8548582  0.000000000
  1.75  0.006   3      0.8548582  0.000000000
  1.75  0.006   4      0.8594568  0.056479554
  1.75  0.006   5      0.8676539  0.170523847
  1.75  0.006   6      0.8758499  0.266577755
  1.75  0.006   7      0.8840470  0.352541063
  1.75  0.006   8      0.8886456  0.394563349
  1.75  0.006   9      0.8898443  0.419171445
  1.75  0.006  10      0.8904440  0.435542552
  1.75  0.007   1      0.8548582  0.000000000
  1.75  0.007   2      0.8548582  0.000000000
  1.75  0.007   3      0.8560580  0.013995319
  1.75  0.007   4      0.8642547  0.122912153
  1.75  0.007   5      0.8740510  0.248559119
  1.75  0.007   6      0.8840470  0.350341789
  1.75  0.007   7      0.8884456  0.393949621
  1.75  0.007   8      0.8898440  0.422695661
  1.75  0.007   9      0.8922430  0.448455529
  1.75  0.007  10      0.8950428  0.470429687
  1.75  0.008   1      0.8548582  0.000000000
  1.75  0.008   2      0.8548582  0.000000000
  1.75  0.008   3      0.8570574  0.027457856
  1.75  0.008   4      0.8692536  0.187322186
  1.75  0.008   5      0.8798486  0.309362914
  1.75  0.008   6      0.8866460  0.378356644
  1.75  0.008   7      0.8892446  0.417310483
  1.75  0.008   8      0.8924430  0.450664626
  1.75  0.008   9      0.8952427  0.472559203
  1.75  0.008  10      0.8960417  0.481290594
  1.75  0.009   1      0.8548582  0.000000000
  1.75  0.009   2      0.8548582  0.000000000
  1.75  0.009   3      0.8614558  0.078618899
  1.75  0.009   4      0.8740512  0.247168947
  1.75  0.009   5      0.8854465  0.363848247
  1.75  0.009   6      0.8900444  0.415391683
  1.75  0.009   7      0.8908436  0.440892605
  1.75  0.009   8      0.8954425  0.474007870
  1.75  0.009   9      0.8960417  0.481290594
  1.75  0.009  10      0.8974417  0.501362405
  1.75  0.010   1      0.8548582  0.000000000
  1.75  0.010   2      0.8548582  0.000000000
  1.75  0.010   3      0.8644548  0.127801698
  1.75  0.010   4      0.8772499  0.286957623
  1.75  0.010   5      0.8882457  0.391677584
  1.75  0.010   6      0.8910432  0.435057492
  1.75  0.010   7      0.8936430  0.462215697
  1.75  0.010   8      0.8958420  0.477684607
  1.75  0.010   9      0.8972416  0.498791786
  1.75  0.010  10      0.8970412  0.507106090
  2.00  0.001   1      0.8548582  0.000000000
  2.00  0.001   2      0.8548582  0.000000000
  2.00  0.001   3      0.8548582  0.000000000
  2.00  0.001   4      0.8548582  0.000000000
  2.00  0.001   5      0.8548582  0.000000000
  2.00  0.001   6      0.8548582  0.000000000
  2.00  0.001   7      0.8548582  0.000000000
  2.00  0.001   8      0.8548582  0.000000000
  2.00  0.001   9      0.8548582  0.000000000
  2.00  0.001  10      0.8548582  0.000000000
  2.00  0.002   1      0.8548582  0.000000000
  2.00  0.002   2      0.8548582  0.000000000
  2.00  0.002   3      0.8548582  0.000000000
  2.00  0.002   4      0.8548582  0.000000000
  2.00  0.002   5      0.8548582  0.000000000
  2.00  0.002   6      0.8548582  0.000000000
  2.00  0.002   7      0.8548582  0.000000000
  2.00  0.002   8      0.8556582  0.009358009
  2.00  0.002   9      0.8564576  0.020546433
  2.00  0.002  10      0.8586569  0.047544429
  2.00  0.003   1      0.8548582  0.000000000
  2.00  0.003   2      0.8548582  0.000000000
  2.00  0.003   3      0.8548582  0.000000000
  2.00  0.003   4      0.8548582  0.000000000
  2.00  0.003   5      0.8548582  0.000000000
  2.00  0.003   6      0.8562579  0.016315702
  2.00  0.003   7      0.8596566  0.058746939
  2.00  0.003   8      0.8640547  0.119386808
  2.00  0.003   9      0.8686539  0.181387810
  2.00  0.003  10      0.8730515  0.236452537
  2.00  0.004   1      0.8548582  0.000000000
  2.00  0.004   2      0.8548582  0.000000000
  2.00  0.004   3      0.8548582  0.000000000
  2.00  0.004   4      0.8548582  0.000000000
  2.00  0.004   5      0.8568574  0.025122989
  2.00  0.004   6      0.8628550  0.102203763
  2.00  0.004   7      0.8690539  0.188268020
  2.00  0.004   8      0.8752504  0.259676508
  2.00  0.004   9      0.8810478  0.322972472
  2.00  0.004  10      0.8850468  0.360532350
  2.00  0.005   1      0.8548582  0.000000000
  2.00  0.005   2      0.8548582  0.000000000
  2.00  0.005   3      0.8548582  0.000000000
  2.00  0.005   4      0.8566576  0.022820477
  2.00  0.005   5      0.8638547  0.117370164
  2.00  0.005   6      0.8716526  0.218547300
  2.00  0.005   7      0.8780494  0.292663322
  2.00  0.005   8      0.8852467  0.362195584
  2.00  0.005   9      0.8884456  0.393949621
  2.00  0.005  10      0.8896443  0.418573769
  2.00  0.006   1      0.8548582  0.000000000
  2.00  0.006   2      0.8548582  0.000000000
  2.00  0.006   3      0.8552582  0.004688062
  2.00  0.006   4      0.8616556  0.082483114
  2.00  0.006   5      0.8700532  0.202493337
  2.00  0.006   6      0.8788491  0.300741810
  2.00  0.006   7      0.8858462  0.369138903
  2.00  0.006   8      0.8896449  0.410439251
  2.00  0.006   9      0.8908433  0.433465785
  2.00  0.006  10      0.8930428  0.455758110
  2.00  0.007   1      0.8548582  0.000000000
  2.00  0.007   2      0.8548582  0.000000000
  2.00  0.007   3      0.8564576  0.020546433
  2.00  0.007   4      0.8656548  0.147652455
  2.00  0.007   5      0.8760500  0.269791923
  2.00  0.007   6      0.8852465  0.363219835
  2.00  0.007   7      0.8898448  0.412903320
  2.00  0.007   8      0.8910432  0.437427516
  2.00  0.007   9      0.8944427  0.463959953
  2.00  0.007  10      0.8954419  0.474273241
  2.00  0.008   1      0.8548582  0.000000000
  2.00  0.008   2      0.8548582  0.000000000
  2.00  0.008   3      0.8596566  0.058729676
  2.00  0.008   4      0.8718526  0.221752586
  2.00  0.008   5      0.8842468  0.351953162
  2.00  0.008   6      0.8888454  0.404129447
  2.00  0.008   7      0.8912432  0.436434994
  2.00  0.008   8      0.8936430  0.462932966
  2.00  0.008   9      0.8952420  0.474364080
  2.00  0.008  10      0.8972419  0.492383884
  2.00  0.009   1      0.8548582  0.000000000
  2.00  0.009   2      0.8548582  0.000000000
  2.00  0.009   3      0.8630550  0.104291276
  2.00  0.009   4      0.8760499  0.268482326
  2.00  0.009   5      0.8860460  0.373725670
  2.00  0.009   6      0.8900440  0.424976829
  2.00  0.009   7      0.8936427  0.460687998
  2.00  0.009   8      0.8956419  0.477074971
  2.00  0.009   9      0.8972417  0.493765718
  2.00  0.009  10      0.8978412  0.506750801
  2.00  0.010   1      0.8548582  0.000000000
  2.00  0.010   2      0.8548582  0.000000000
  2.00  0.010   3      0.8656550  0.150366044
  2.00  0.010   4      0.8822475  0.334246639
  2.00  0.010   5      0.8890452  0.406518678
  2.00  0.010   6      0.8918435  0.447102834
  2.00  0.010   7      0.8956424  0.476135314
  2.00  0.010   8      0.8966416  0.488330477
  2.00  0.010   9      0.8980416  0.507389838
  2.00  0.010  10      0.8978401  0.520073893
  2.25  0.001   1      0.8548582  0.000000000
  2.25  0.001   2      0.8548582  0.000000000
  2.25  0.001   3      0.8548582  0.000000000
  2.25  0.001   4      0.8548582  0.000000000
  2.25  0.001   5      0.8548582  0.000000000
  2.25  0.001   6      0.8548582  0.000000000
  2.25  0.001   7      0.8548582  0.000000000
  2.25  0.001   8      0.8548582  0.000000000
  2.25  0.001   9      0.8548582  0.000000000
  2.25  0.001  10      0.8548582  0.000000000
  2.25  0.002   1      0.8548582  0.000000000
  2.25  0.002   2      0.8548582  0.000000000
  2.25  0.002   3      0.8548582  0.000000000
  2.25  0.002   4      0.8548582  0.000000000
  2.25  0.002   5      0.8548582  0.000000000
  2.25  0.002   6      0.8548582  0.000000000
  2.25  0.002   7      0.8548582  0.000000000
  2.25  0.002   8      0.8560580  0.013995319
  2.25  0.002   9      0.8570574  0.027457856
  2.25  0.002  10      0.8604563  0.067641763
  2.25  0.003   1      0.8548582  0.000000000
  2.25  0.003   2      0.8548582  0.000000000
  2.25  0.003   3      0.8548582  0.000000000
  2.25  0.003   4      0.8548582  0.000000000
  2.25  0.003   5      0.8550582  0.002353195
  2.25  0.003   6      0.8566576  0.022820477
  2.25  0.003   7      0.8616558  0.080775391
  2.25  0.003   8      0.8652548  0.142168648
  2.25  0.003   9      0.8706528  0.209532587
  2.25  0.003  10      0.8752504  0.261055749
  2.25  0.004   1      0.8548582  0.000000000
  2.25  0.004   2      0.8548582  0.000000000
  2.25  0.004   3      0.8548582  0.000000000
  2.25  0.004   4      0.8552582  0.004688062
  2.25  0.004   5      0.8586568  0.047555143
  2.25  0.004   6      0.8650548  0.136860144
  2.25  0.004   7      0.8720524  0.222109435
  2.25  0.004   8      0.8766497  0.278826572
  2.25  0.004   9      0.8844468  0.353696748
  2.25  0.004  10      0.8860462  0.373705597
  2.25  0.005   1      0.8548582  0.000000000
  2.25  0.005   2      0.8548582  0.000000000
  2.25  0.005   3      0.8548582  0.000000000
  2.25  0.005   4      0.8570574  0.027457856
  2.25  0.005   5      0.8654548  0.142718213
  2.25  0.005   6      0.8738512  0.246577583
  2.25  0.005   7      0.8820475  0.331466873
  2.25  0.005   8      0.8858462  0.371083913
  2.25  0.005   9      0.8898446  0.411973851
  2.25  0.005  10      0.8912432  0.433830236
  2.25  0.006   1      0.8548582  0.000000000
  2.25  0.006   2      0.8548582  0.000000000
  2.25  0.006   3      0.8558580  0.011696471
  2.25  0.006   4      0.8630550  0.105859350
  2.25  0.006   5      0.8734515  0.238535420
  2.25  0.006   6      0.8828470  0.340383545
  2.25  0.006   7      0.8870459  0.382464098
  2.25  0.006   8      0.8900444  0.419635873
  2.25  0.006   9      0.8910436  0.441488920
  2.25  0.006  10      0.8936430  0.462932966
  2.25  0.007   1      0.8548582  0.000000000
  2.25  0.007   2      0.8548582  0.000000000
  2.25  0.007   3      0.8568574  0.025122989
  2.25  0.007   4      0.8694534  0.187797882
  2.25  0.007   5      0.8792489  0.304177985
  2.25  0.007   6      0.8862462  0.376223202
  2.25  0.007   7      0.8908441  0.426436284
  2.25  0.007   8      0.8928427  0.454177468
  2.25  0.007   9      0.8952425  0.474115259
  2.25  0.007  10      0.8956419  0.477773635
  2.25  0.008   1      0.8548582  0.000000000
  2.25  0.008   2      0.8548582  0.000000000
  2.25  0.008   3      0.8618558  0.084608543
  2.25  0.008   4      0.8740510  0.248559119
  2.25  0.008   5      0.8854462  0.365897556
  2.25  0.008   6      0.8908443  0.421375977
  2.25  0.008   7      0.8918433  0.447905567
  2.25  0.008   8      0.8952425  0.474024743
  2.25  0.008   9      0.8958420  0.479185465
  2.25  0.008  10      0.8978409  0.502054565
  2.25  0.009   1      0.8548582  0.000000000
  2.25  0.009   2      0.8548582  0.000000000
  2.25  0.009   3      0.8646550  0.134456950
  2.25  0.009   4      0.8786492  0.300125654
  2.25  0.009   5      0.8888456  0.397217010
  2.25  0.009   6      0.8912435  0.437220549
  2.25  0.009   7      0.8952424  0.473862451
  2.25  0.009   8      0.8958419  0.479155439
  2.25  0.009   9      0.8974411  0.500823782
  2.25  0.009  10      0.8968411  0.505626472
  2.25  0.010   1      0.8548582  0.000000000
  2.25  0.010   2      0.8552582  0.004688062
  2.25  0.010   3      0.8690537  0.188290097
  2.25  0.010   4      0.8850464  0.362553730
  2.25  0.010   5      0.8904443  0.420999894
  2.25  0.010   6      0.8932427  0.460131403
  2.25  0.010   7      0.8958419  0.479201824
  2.25  0.010   8      0.8980414  0.500472794
  2.25  0.010   9      0.8968412  0.504976253
  2.25  0.010  10      0.8982395  0.526476554
  2.50  0.001   1      0.8548582  0.000000000
  2.50  0.001   2      0.8548582  0.000000000
  2.50  0.001   3      0.8548582  0.000000000
  2.50  0.001   4      0.8548582  0.000000000
  2.50  0.001   5      0.8548582  0.000000000
  2.50  0.001   6      0.8548582  0.000000000
  2.50  0.001   7      0.8548582  0.000000000
  2.50  0.001   8      0.8548582  0.000000000
  2.50  0.001   9      0.8548582  0.000000000
  2.50  0.001  10      0.8548582  0.000000000
  2.50  0.002   1      0.8548582  0.000000000
  2.50  0.002   2      0.8548582  0.000000000
  2.50  0.002   3      0.8548582  0.000000000
  2.50  0.002   4      0.8548582  0.000000000
  2.50  0.002   5      0.8548582  0.000000000
  2.50  0.002   6      0.8548582  0.000000000
  2.50  0.002   7      0.8550582  0.002353195
  2.50  0.002   8      0.8562577  0.018226050
  2.50  0.002   9      0.8586568  0.047555143
  2.50  0.002  10      0.8624555  0.092939972
  2.50  0.003   1      0.8548582  0.000000000
  2.50  0.003   2      0.8548582  0.000000000
  2.50  0.003   3      0.8548582  0.000000000
  2.50  0.003   4      0.8548582  0.000000000
  2.50  0.003   5      0.8556582  0.009358009
  2.50  0.003   6      0.8570574  0.027457856
  2.50  0.003   7      0.8626550  0.100100831
  2.50  0.003   8      0.8678539  0.172339189
  2.50  0.003   9      0.8730516  0.236359615
  2.50  0.003  10      0.8768497  0.280692113
  2.50  0.004   1      0.8548582  0.000000000
  2.50  0.004   2      0.8548582  0.000000000
  2.50  0.004   3      0.8548582  0.000000000
  2.50  0.004   4      0.8558580  0.011696471
  2.50  0.004   5      0.8604563  0.067641763
  2.50  0.004   6      0.8662548  0.154790574
  2.50  0.004   7      0.8738512  0.246577583
  2.50  0.004   8      0.8804483  0.314567633
  2.50  0.004   9      0.8850467  0.361567172
  2.50  0.004  10      0.8886456  0.395533795
  2.50  0.005   1      0.8548582  0.000000000
  2.50  0.005   2      0.8548582  0.000000000
  2.50  0.005   3      0.8548582  0.000000000
  2.50  0.005   4      0.8594568  0.056479554
  2.50  0.005   5      0.8672542  0.166244006
  2.50  0.005   6      0.8752504  0.261055749
  2.50  0.005   7      0.8846467  0.355308121
  2.50  0.005   8      0.8882457  0.391677584
  2.50  0.005   9      0.8900444  0.419635873
  2.50  0.005  10      0.8908433  0.436806330
  2.50  0.006   1      0.8548582  0.000000000
  2.50  0.006   2      0.8548582  0.000000000
  2.50  0.006   3      0.8562579  0.016315702
  2.50  0.006   4      0.8648548  0.134959347
  2.50  0.006   5      0.8750504  0.258017346
  2.50  0.006   6      0.8850465  0.361701825
  2.50  0.006   7      0.8888456  0.402363701
  2.50  0.006   8      0.8904438  0.428847051
  2.50  0.006   9      0.8930428  0.457966441
  2.50  0.006  10      0.8956422  0.476913574
  2.50  0.007   1      0.8548582  0.000000000
  2.50  0.007   2      0.8548582  0.000000000
  2.50  0.007   3      0.8594568  0.056479554
  2.50  0.007   4      0.8706531  0.208103557
  2.50  0.007   5      0.8828472  0.339294205
  2.50  0.007   6      0.8886456  0.396582856
  2.50  0.007   7      0.8908435  0.431831146
  2.50  0.007   8      0.8936427  0.461406781
  2.50  0.007   9      0.8956424  0.476928044
  2.50  0.007  10      0.8962419  0.484070121
  2.50  0.008   1      0.8548582  0.000000000
  2.50  0.008   2      0.8548582  0.000000000
  2.50  0.008   3      0.8630550  0.104291276
  2.50  0.008   4      0.8756502  0.264704393
  2.50  0.008   5      0.8866460  0.378538202
  2.50  0.008   6      0.8904441  0.425293415
  2.50  0.008   7      0.8932427  0.460131403
  2.50  0.008   8      0.8962422  0.480976784
  2.50  0.008   9      0.8968416  0.488952925
  2.50  0.008  10      0.8976412  0.503419791
  2.50  0.009   1      0.8548582  0.000000000
  2.50  0.009   2      0.8550582  0.002353195
  2.50  0.009   3      0.8660548  0.154436196
  2.50  0.009   4      0.8820475  0.333816386
  2.50  0.009   5      0.8898448  0.413628173
  2.50  0.009   6      0.8918432  0.450324489
  2.50  0.009   7      0.8954422  0.476887721
  2.50  0.009   8      0.8962420  0.485540017
  2.50  0.009   9      0.8976412  0.503419791
  2.50  0.009  10      0.8976404  0.516171707
  2.50  0.010   1      0.8548582  0.000000000
  2.50  0.010   2      0.8560580  0.013995319
  2.50  0.010   3      0.8710529  0.211730589
  2.50  0.010   4      0.8868460  0.380006734
  2.50  0.010   5      0.8908441  0.430076272
  2.50  0.010   6      0.8942424  0.469434195
  2.50  0.010   7      0.8956420  0.480000451
  2.50  0.010   8      0.8980409  0.505290998
  2.50  0.010   9      0.8972408  0.512320464
  2.50  0.010  10      0.9004388  0.541011083
  2.75  0.001   1      0.8548582  0.000000000
  2.75  0.001   2      0.8548582  0.000000000
  2.75  0.001   3      0.8548582  0.000000000
  2.75  0.001   4      0.8548582  0.000000000
  2.75  0.001   5      0.8548582  0.000000000
  2.75  0.001   6      0.8548582  0.000000000
  2.75  0.001   7      0.8548582  0.000000000
  2.75  0.001   8      0.8548582  0.000000000
  2.75  0.001   9      0.8548582  0.000000000
  2.75  0.001  10      0.8548582  0.000000000
  2.75  0.002   1      0.8548582  0.000000000
  2.75  0.002   2      0.8548582  0.000000000
  2.75  0.002   3      0.8548582  0.000000000
  2.75  0.002   4      0.8548582  0.000000000
  2.75  0.002   5      0.8548582  0.000000000
  2.75  0.002   6      0.8548582  0.000000000
  2.75  0.002   7      0.8556582  0.009358009
  2.75  0.002   8      0.8568574  0.025122989
  2.75  0.002   9      0.8604563  0.067641763
  2.75  0.002  10      0.8636548  0.115308991
  2.75  0.003   1      0.8548582  0.000000000
  2.75  0.003   2      0.8548582  0.000000000
  2.75  0.003   3      0.8548582  0.000000000
  2.75  0.003   4      0.8548582  0.000000000
  2.75  0.003   5      0.8558580  0.011696471
  2.75  0.003   6      0.8594568  0.056479554
  2.75  0.003   7      0.8642548  0.127340281
  2.75  0.003   8      0.8696534  0.195799002
  2.75  0.003   9      0.8748504  0.256347507
  2.75  0.003  10      0.8802483  0.312764524
  2.75  0.004   1      0.8548582  0.000000000
  2.75  0.004   2      0.8548582  0.000000000
  2.75  0.004   3      0.8548582  0.000000000
  2.75  0.004   4      0.8562579  0.016315702
  2.75  0.004   5      0.8620558  0.088456795
  2.75  0.004   6      0.8692536  0.187322186
  2.75  0.004   7      0.8752504  0.261055749
  2.75  0.004   8      0.8830470  0.342119684
  2.75  0.004   9      0.8858460  0.372101173
  2.75  0.004  10      0.8892452  0.406240475
  2.75  0.005   1      0.8548582  0.000000000
  2.75  0.005   2      0.8548582  0.000000000
  2.75  0.005   3      0.8550582  0.002353195
  2.75  0.005   4      0.8608560  0.072057445
  2.75  0.005   5      0.8688539  0.189216416
  2.75  0.005   6      0.8768500  0.283298746
  2.75  0.005   7      0.8854462  0.365794304
  2.75  0.005   8      0.8890454  0.404743568
  2.75  0.005   9      0.8904438  0.427991074
  2.75  0.005  10      0.8924428  0.452153707
  2.75  0.006   1      0.8548582  0.000000000
  2.75  0.006   2      0.8548582  0.000000000
  2.75  0.006   3      0.8566576  0.022820477
  2.75  0.006   4      0.8658548  0.150820424
  2.75  0.006   5      0.8762499  0.271520784
  2.75  0.006   6      0.8862459  0.372377210
  2.75  0.006   7      0.8902448  0.415846218
  2.75  0.006   8      0.8908435  0.435990552
  2.75  0.006   9      0.8942424  0.466403235
  2.75  0.006  10      0.8960424  0.479628261
  2.75  0.007   1      0.8548582  0.000000000
  2.75  0.007   2      0.8548582  0.000000000
  2.75  0.007   3      0.8606560  0.069834930
  2.75  0.007   4      0.8732516  0.236764138
  2.75  0.007   5      0.8848464  0.360883891
  2.75  0.007   6      0.8896449  0.411941731
  2.75  0.007   7      0.8906438  0.437793155
  2.75  0.007   8      0.8948425  0.472015441
  2.75  0.007   9      0.8958419  0.479201824
  2.75  0.007  10      0.8970417  0.492438575
  2.75  0.008   1      0.8548582  0.000000000
  2.75  0.008   2      0.8548582  0.000000000
  2.75  0.008   3      0.8650548  0.133781129
  2.75  0.008   4      0.8776497  0.290294443
  2.75  0.008   5      0.8884459  0.393208272
  2.75  0.008   6      0.8910438  0.435738132
  2.75  0.008   7      0.8936425  0.465258789
  2.75  0.008   8      0.8960417  0.480542848
  2.75  0.008   9      0.8976416  0.497775403
  2.75  0.008  10      0.8972414  0.505514228
  2.75  0.009   1      0.8548582  0.000000000
  2.75  0.009   2      0.8552582  0.004688062
  2.75  0.009   3      0.8690536  0.186765249
  2.75  0.009   4      0.8854464  0.362888087
  2.75  0.009   5      0.8902446  0.419455431
  2.75  0.009   6      0.8926432  0.457524382
  2.75  0.009   7      0.8960420  0.481781824
  2.75  0.009   8      0.8970416  0.492414438
  2.75  0.009   9      0.8970414  0.504862349
  2.75  0.009  10      0.8968403  0.516913773
  2.75  0.010   1      0.8548582  0.000000000
  2.75  0.010   2      0.8562577  0.018226050
  2.75  0.010   3      0.8736515  0.240467123
  2.75  0.010   4      0.8878460  0.387817401
  2.75  0.010   5      0.8918436  0.443071678
  2.75  0.010   6      0.8954425  0.477640648
  2.75  0.010   7      0.8964416  0.486146514
  2.75  0.010   8      0.8974411  0.503450875
  2.75  0.010   9      0.8970404  0.517512376
  2.75  0.010  10      0.8986393  0.535990938
  3.00  0.001   1      0.8548582  0.000000000
  3.00  0.001   2      0.8548582  0.000000000
  3.00  0.001   3      0.8548582  0.000000000
  3.00  0.001   4      0.8548582  0.000000000
  3.00  0.001   5      0.8548582  0.000000000
  3.00  0.001   6      0.8548582  0.000000000
  3.00  0.001   7      0.8548582  0.000000000
  3.00  0.001   8      0.8548582  0.000000000
  3.00  0.001   9      0.8548582  0.000000000
  3.00  0.001  10      0.8548582  0.000000000
  3.00  0.002   1      0.8548582  0.000000000
  3.00  0.002   2      0.8548582  0.000000000
  3.00  0.002   3      0.8548582  0.000000000
  3.00  0.002   4      0.8548582  0.000000000
  3.00  0.002   5      0.8548582  0.000000000
  3.00  0.002   6      0.8548582  0.000000000
  3.00  0.002   7      0.8560580  0.013995319
  3.00  0.002   8      0.8570574  0.027457856
  3.00  0.002   9      0.8620556  0.086806206
  3.00  0.002  10      0.8648550  0.136357747
  3.00  0.003   1      0.8548582  0.000000000
  3.00  0.003   2      0.8548582  0.000000000
  3.00  0.003   3      0.8548582  0.000000000
  3.00  0.003   4      0.8548582  0.000000000
  3.00  0.003   5      0.8562579  0.016315702
  3.00  0.003   6      0.8608560  0.072057445
  3.00  0.003   7      0.8652548  0.142168648
  3.00  0.003   8      0.8716528  0.219708292
  3.00  0.003   9      0.8756502  0.267322474
  3.00  0.003  10      0.8828470  0.340383545
  3.00  0.004   1      0.8548582  0.000000000
  3.00  0.004   2      0.8548582  0.000000000
  3.00  0.004   3      0.8548582  0.000000000
  3.00  0.004   4      0.8564576  0.020546433
  3.00  0.004   5      0.8628550  0.103792814
  3.00  0.004   6      0.8700531  0.202630304
  3.00  0.004   7      0.8764500  0.276974618
  3.00  0.004   8      0.8848465  0.360042663
  3.00  0.004   9      0.8884457  0.393251912
  3.00  0.004  10      0.8904443  0.419213366
  3.00  0.005   1      0.8548582  0.000000000
  3.00  0.005   2      0.8548582  0.000000000
  3.00  0.005   3      0.8554582  0.007041257
  3.00  0.005   4      0.8624553  0.094489453
  3.00  0.005   5      0.8718526  0.220260766
  3.00  0.005   6      0.8802483  0.312764524
  3.00  0.005   7      0.8858462  0.372107333
  3.00  0.005   8      0.8904446  0.417368467
  3.00  0.005   9      0.8914433  0.437835606
  3.00  0.005  10      0.8932427  0.460131403
  3.00  0.006   1      0.8548582  0.000000000
  3.00  0.006   2      0.8548582  0.000000000
  3.00  0.006   3      0.8568574  0.025122989
  3.00  0.006   4      0.8686537  0.181368184
  3.00  0.006   5      0.8788492  0.301753392
  3.00  0.006   6      0.8868460  0.380122376
  3.00  0.006   7      0.8908441  0.423909268
  3.00  0.006   8      0.8916433  0.449698406
  3.00  0.006   9      0.8952424  0.474692383
  3.00  0.006  10      0.8958419  0.479201824
  3.00  0.007   1      0.8548582  0.000000000
  3.00  0.007   2      0.8548582  0.000000000
  3.00  0.007   3      0.8618558  0.087966475
  3.00  0.007   4      0.8742508  0.250526734
  3.00  0.007   5      0.8866459  0.375459711
  3.00  0.007   6      0.8904444  0.420068415
  3.00  0.007   7      0.8914433  0.449077417
  3.00  0.007   8      0.8956422  0.479067616
  3.00  0.007   9      0.8954422  0.479382361
  3.00  0.007  10      0.8980412  0.502462154
  3.00  0.008   1      0.8548582  0.000000000
  3.00  0.008   2      0.8548582  0.000000000
  3.00  0.008   3      0.8656548  0.147458132
  3.00  0.008   4      0.8804483  0.315615750
  3.00  0.008   5      0.8882456  0.398889627
  3.00  0.008   6      0.8914436  0.444385643
  3.00  0.008   7      0.8950427  0.474822561
  3.00  0.008   8      0.8954420  0.478659250
  3.00  0.008   9      0.8980411  0.503816053
  3.00  0.008  10      0.8966417  0.505672820
  3.00  0.009   1      0.8548582  0.000000000
  3.00  0.009   2      0.8558580  0.011696471
  3.00  0.009   3      0.8706531  0.208103557
  3.00  0.009   4      0.8866460  0.377430851
  3.00  0.009   5      0.8900443  0.424962581
  3.00  0.009   6      0.8930432  0.461720525
  3.00  0.009   7      0.8966411  0.485305156
  3.00  0.009   8      0.8974412  0.501250400
  3.00  0.009   9      0.8966417  0.505672820
  3.00  0.009  10      0.8980393  0.527078831
  3.00  0.010   1      0.8548582  0.000000000
  3.00  0.010   2      0.8566576  0.022820477
  3.00  0.010   3      0.8746505  0.254420785
  3.00  0.010   4      0.8884459  0.394957942
  3.00  0.010   5      0.8914435  0.446777980
  3.00  0.010   6      0.8954422  0.479160306
  3.00  0.010   7      0.8966416  0.491034019
  3.00  0.010   8      0.8976411  0.508021458
  3.00  0.010   9      0.8976398  0.522442728
  3.00  0.010  10      0.8996390  0.543606058
  3.25  0.001   1      0.8548582  0.000000000
  3.25  0.001   2      0.8548582  0.000000000
  3.25  0.001   3      0.8548582  0.000000000
  3.25  0.001   4      0.8548582  0.000000000
  3.25  0.001   5      0.8548582  0.000000000
  3.25  0.001   6      0.8548582  0.000000000
  3.25  0.001   7      0.8548582  0.000000000
  3.25  0.001   8      0.8548582  0.000000000
  3.25  0.001   9      0.8548582  0.000000000
  3.25  0.001  10      0.8548582  0.000000000
  3.25  0.002   1      0.8548582  0.000000000
  3.25  0.002   2      0.8548582  0.000000000
  3.25  0.002   3      0.8548582  0.000000000
  3.25  0.002   4      0.8548582  0.000000000
  3.25  0.002   5      0.8548582  0.000000000
  3.25  0.002   6      0.8548582  0.000000000
  3.25  0.002   7      0.8560579  0.015887588
  3.25  0.002   8      0.8594568  0.056479554
  3.25  0.002   9      0.8630548  0.102692479
  3.25  0.002  10      0.8658548  0.149522519
  3.25  0.003   1      0.8548582  0.000000000
  3.25  0.003   2      0.8548582  0.000000000
  3.25  0.003   3      0.8548582  0.000000000
  3.25  0.003   4      0.8548582  0.000000000
  3.25  0.003   5      0.8566576  0.022820477
  3.25  0.003   6      0.8620558  0.088456795
  3.25  0.003   7      0.8666547  0.160110312
  3.25  0.003   8      0.8732516  0.239221468
  3.25  0.003   9      0.8778497  0.292063239
  3.25  0.003  10      0.8852464  0.361150686
  3.25  0.004   1      0.8548582  0.000000000
  3.25  0.004   2      0.8548582  0.000000000
  3.25  0.004   3      0.8548582  0.000000000
  3.25  0.004   4      0.8568574  0.025122989
  3.25  0.004   5      0.8648548  0.131643339
  3.25  0.004   6      0.8724524  0.227230427
  3.25  0.004   7      0.8788491  0.301778317
  3.25  0.004   8      0.8858459  0.369111359
  3.25  0.004   9      0.8886454  0.399193423
  3.25  0.004  10      0.8906440  0.425085120
  3.25  0.005   1      0.8548582  0.000000000
  3.25  0.005   2      0.8548582  0.000000000
  3.25  0.005   3      0.8558580  0.011696471
  3.25  0.005   4      0.8638547  0.114115325
  3.25  0.005   5      0.8734515  0.238535420
  3.25  0.005   6      0.8828472  0.339294205
  3.25  0.005   7      0.8878460  0.388586614
  3.25  0.005   8      0.8906441  0.423301663
  3.25  0.005   9      0.8904440  0.439643832
  3.25  0.005  10      0.8940425  0.467921894
  3.25  0.006   1      0.8548582  0.000000000
  3.25  0.006   2      0.8548582  0.000000000
  3.25  0.006   3      0.8580572  0.040666536
  3.25  0.006   4      0.8696534  0.198671069
  3.25  0.006   5      0.8816476  0.325750768
  3.25  0.006   6      0.8886459  0.394763181
  3.25  0.006   7      0.8910440  0.431588262
  3.25  0.006   8      0.8920432  0.453289463
  3.25  0.006   9      0.8956422  0.479067616
  3.25  0.006  10      0.8958417  0.479917504
  3.25  0.007   1      0.8548582  0.000000000
  3.25  0.007   2      0.8548582  0.000000000
  3.25  0.007   3      0.8630550  0.104291276
  3.25  0.007   4      0.8756502  0.264704393
  3.25  0.007   5      0.8872459  0.383278115
  3.25  0.007   6      0.8906441  0.425968814
  3.25  0.007   7      0.8926432  0.457524382
  3.25  0.007   8      0.8954422  0.479160306
  3.25  0.007   9      0.8966417  0.487412518
  3.25  0.007  10      0.8972412  0.500683557
  3.25  0.008   1      0.8548582  0.000000000
  3.25  0.008   2      0.8552582  0.004688062
  3.25  0.008   3      0.8670545  0.163917641
  3.25  0.008   4      0.8828473  0.340559152
  3.25  0.008   5      0.8896449  0.413797943
  3.25  0.008   6      0.8912433  0.447770229
  3.25  0.008   7      0.8952424  0.477819281
  3.25  0.008   8      0.8972414  0.491477981
  3.25  0.008   9      0.8978408  0.506005015
  3.25  0.008  10      0.8962411  0.507926554
  3.25  0.009   1      0.8548582  0.000000000
  3.25  0.009   2      0.8560580  0.013995319
  3.25  0.009   3      0.8722526  0.225386156
  3.25  0.009   4      0.8872459  0.384233025
  3.25  0.009   5      0.8912438  0.438073915
  3.25  0.009   6      0.8942427  0.470750458
  3.25  0.009   7      0.8962414  0.483404281
  3.25  0.009   8      0.8972409  0.501402602
  3.25  0.009   9      0.8960411  0.506622220
  3.25  0.009  10      0.8992390  0.536220011
  3.25  0.010   1      0.8548582  0.000000000
  3.25  0.010   2      0.8568576  0.025077147
  3.25  0.010   3      0.8760499  0.269696740
  3.25  0.010   4      0.8884456  0.399513758
  3.25  0.010   5      0.8926428  0.456772436
  3.25  0.010   6      0.8958419  0.483355760
  3.25  0.010   7      0.8980411  0.502392711
  3.25  0.010   8      0.8958417  0.501201236
  3.25  0.010   9      0.8984390  0.532061687
  3.25  0.010  10      0.8998387  0.546660589
  3.50  0.001   1      0.8548582  0.000000000
  3.50  0.001   2      0.8548582  0.000000000
  3.50  0.001   3      0.8548582  0.000000000
  3.50  0.001   4      0.8548582  0.000000000
  3.50  0.001   5      0.8548582  0.000000000
  3.50  0.001   6      0.8548582  0.000000000
  3.50  0.001   7      0.8548582  0.000000000
  3.50  0.001   8      0.8548582  0.000000000
  3.50  0.001   9      0.8548582  0.000000000
  3.50  0.001  10      0.8548582  0.000000000
  3.50  0.002   1      0.8548582  0.000000000
  3.50  0.002   2      0.8548582  0.000000000
  3.50  0.002   3      0.8548582  0.000000000
  3.50  0.002   4      0.8548582  0.000000000
  3.50  0.002   5      0.8548582  0.000000000
  3.50  0.002   6      0.8552582  0.004688062
  3.50  0.002   7      0.8566576  0.022820477
  3.50  0.002   8      0.8600564  0.063186192
  3.50  0.002   9      0.8646547  0.125350861
  3.50  0.002  10      0.8682539  0.175933297
  3.50  0.003   1      0.8548582  0.000000000
  3.50  0.003   2      0.8548582  0.000000000
  3.50  0.003   3      0.8548582  0.000000000
  3.50  0.003   4      0.8548582  0.000000000
  3.50  0.003   5      0.8568574  0.025122989
  3.50  0.003   6      0.8630550  0.104291276
  3.50  0.003   7      0.8690537  0.186790305
  3.50  0.003   8      0.8744507  0.252480575
  3.50  0.003   9      0.8808480  0.317947323
  3.50  0.003  10      0.8852464  0.364316145
  3.50  0.004   1      0.8548582  0.000000000
  3.50  0.004   2      0.8548582  0.000000000
  3.50  0.004   3      0.8548582  0.000000000
  3.50  0.004   4      0.8572574  0.029714526
  3.50  0.004   5      0.8652548  0.142168648
  3.50  0.004   6      0.8736512  0.243319819
  3.50  0.004   7      0.8816475  0.327993600
  3.50  0.004   8      0.8860462  0.373721606
  3.50  0.004   9      0.8896451  0.411024014
  3.50  0.004  10      0.8908438  0.430976731
  3.50  0.005   1      0.8548582  0.000000000
  3.50  0.005   2      0.8548582  0.000000000
  3.50  0.005   3      0.8560580  0.013995319
  3.50  0.005   4      0.8646548  0.131181922
  3.50  0.005   5      0.8742508  0.250526734
  3.50  0.005   6      0.8852464  0.361090633
  3.50  0.005   7      0.8882456  0.394493017
  3.50  0.005   8      0.8910440  0.431588262
  3.50  0.005   9      0.8918432  0.452679667
  3.50  0.005  10      0.8948427  0.473442909
  3.50  0.006   1      0.8548582  0.000000000
  3.50  0.006   2      0.8548582  0.000000000
  3.50  0.006   3      0.8596566  0.058729676
  3.50  0.006   4      0.8716528  0.218399421
  3.50  0.006   5      0.8836468  0.348216434
  3.50  0.006   6      0.8880457  0.396481160
  3.50  0.006   7      0.8912438  0.437180214
  3.50  0.006   8      0.8926433  0.459827973
  3.50  0.006   9      0.8954422  0.479160306
  3.50  0.006  10      0.8966417  0.487412518
  3.50  0.007   1      0.8548582  0.000000000
  3.50  0.007   2      0.8548582  0.000000000
  3.50  0.007   3      0.8650547  0.132094621
  3.50  0.007   4      0.8770500  0.283640690
  3.50  0.007   5      0.8878460  0.389591137
  3.50  0.007   6      0.8914438  0.437043123
  3.50  0.007   7      0.8924435  0.458355867
  3.50  0.007   8      0.8962417  0.483916836
  3.50  0.007   9      0.8960417  0.486286777
  3.50  0.007  10      0.8972409  0.504078982
  3.50  0.008   1      0.8548582  0.000000000
  3.50  0.008   2      0.8552582  0.004688062
  3.50  0.008   3      0.8690536  0.186765249
  3.50  0.008   4      0.8858464  0.366088734
  3.50  0.008   5      0.8902446  0.419437120
  3.50  0.008   6      0.8926433  0.459070289
  3.50  0.008   7      0.8958419  0.481923295
  3.50  0.008   8      0.8960419  0.487678528
  3.50  0.008   9      0.8974408  0.505387562
  3.50  0.008  10      0.8970409  0.514147676
  3.50  0.009   1      0.8548582  0.000000000
  3.50  0.009   2      0.8562577  0.018226050
  3.50  0.009   3      0.8736513  0.242997524
  3.50  0.009   4      0.8888456  0.396554669
  3.50  0.009   5      0.8912438  0.441958748
  3.50  0.009   6      0.8944422  0.475189960
  3.50  0.009   7      0.8972412  0.491464767
  3.50  0.009   8      0.8972409  0.504078982
  3.50  0.009   9      0.8970408  0.514727626
  3.50  0.009  10      0.9006382  0.545185503
  3.50  0.010   1      0.8548582  0.000000000
  3.50  0.010   2      0.8580572  0.038778137
  3.50  0.010   3      0.8772499  0.286814679
  3.50  0.010   4      0.8884451  0.407737241
  3.50  0.010   5      0.8928432  0.459602691
  3.50  0.010   6      0.8958412  0.482725729
  3.50  0.010   7      0.8980409  0.505797062
  3.50  0.010   8      0.8958417  0.505971857
  3.50  0.010   9      0.8988390  0.535505341
  3.50  0.010  10      0.8992390  0.545895927
  3.75  0.001   1      0.8548582  0.000000000
  3.75  0.001   2      0.8548582  0.000000000
  3.75  0.001   3      0.8548582  0.000000000
  3.75  0.001   4      0.8548582  0.000000000
  3.75  0.001   5      0.8548582  0.000000000
  3.75  0.001   6      0.8548582  0.000000000
  3.75  0.001   7      0.8548582  0.000000000
  3.75  0.001   8      0.8548582  0.000000000
  3.75  0.001   9      0.8548582  0.000000000
  3.75  0.001  10      0.8548582  0.000000000
  3.75  0.002   1      0.8548582  0.000000000
  3.75  0.002   2      0.8548582  0.000000000
  3.75  0.002   3      0.8548582  0.000000000
  3.75  0.002   4      0.8548582  0.000000000
  3.75  0.002   5      0.8548582  0.000000000
  3.75  0.002   6      0.8556582  0.009358009
  3.75  0.002   7      0.8568574  0.025122989
  3.75  0.002   8      0.8618558  0.082915837
  3.75  0.002   9      0.8646550  0.134456950
  3.75  0.002  10      0.8692537  0.190255558
  3.75  0.003   1      0.8548582  0.000000000
  3.75  0.003   2      0.8548582  0.000000000
  3.75  0.003   3      0.8548582  0.000000000
  3.75  0.003   4      0.8550582  0.002353195
  3.75  0.003   5      0.8572574  0.029714526
  3.75  0.003   6      0.8646547  0.126845907
  3.75  0.003   7      0.8698532  0.200730141
  3.75  0.003   8      0.8756500  0.264695287
  3.75  0.003   9      0.8828472  0.339294205
  3.75  0.003  10      0.8862459  0.375260354
  3.75  0.004   1      0.8548582  0.000000000
  3.75  0.004   2      0.8548582  0.000000000
  3.75  0.004   3      0.8548582  0.000000000
  3.75  0.004   4      0.8588568  0.049777659
  3.75  0.004   5      0.8658548  0.152386672
  3.75  0.004   6      0.8748504  0.256347507
  3.75  0.004   7      0.8836468  0.348083949
  3.75  0.004   8      0.8874462  0.385483707
  3.75  0.004   9      0.8902446  0.417487171
  3.75  0.004  10      0.8908440  0.436776064
  3.75  0.005   1      0.8548582  0.000000000
  3.75  0.005   2      0.8548582  0.000000000
  3.75  0.005   3      0.8562577  0.018226050
  3.75  0.005   4      0.8656547  0.146092379
  3.75  0.005   5      0.8752504  0.261055749
  3.75  0.005   6      0.8856460  0.367552370
  3.75  0.005   7      0.8892452  0.407261537
  3.75  0.005   8      0.8914438  0.437799547
  3.75  0.005   9      0.8924432  0.457671716
  3.75  0.005  10      0.8954424  0.478444626
  3.75  0.006   1      0.8548582  0.000000000
  3.75  0.006   2      0.8548582  0.000000000
  3.75  0.006   3      0.8608560  0.072057445
  3.75  0.006   4      0.8732516  0.236764138
  3.75  0.006   5      0.8852465  0.362276598
  3.75  0.006   6      0.8898448  0.411921178
  3.75  0.006   7      0.8916436  0.445753829
  3.75  0.006   8      0.8936428  0.466694143
  3.75  0.006   9      0.8960417  0.482596326
  3.75  0.006  10      0.8966417  0.488180131
  3.75  0.007   1      0.8548582  0.000000000
  3.75  0.007   2      0.8548582  0.000000000
  3.75  0.007   3      0.8650550  0.138630965
  3.75  0.007   4      0.8786491  0.301352019
  3.75  0.007   5      0.8882456  0.396257699
  3.75  0.007   6      0.8914438  0.441834902
  3.75  0.007   7      0.8934430  0.465230730
  3.75  0.007   8      0.8958414  0.482028647
  3.75  0.007   9      0.8974414  0.498382260
  3.75  0.007  10      0.8974409  0.506727063
  3.75  0.008   1      0.8548582  0.000000000
  3.75  0.008   2      0.8556582  0.009358009
  3.75  0.008   3      0.8702531  0.204393501
  3.75  0.008   4      0.8868459  0.378922000
  3.75  0.008   5      0.8906440  0.425933708
  3.75  0.008   6      0.8926435  0.458976945
  3.75  0.008   7      0.8956417  0.480535529
  3.75  0.008   8      0.8976412  0.499668929
  3.75  0.008   9      0.8972409  0.506761003
  3.75  0.008  10      0.8966408  0.514863323
  3.75  0.009   1      0.8548582  0.000000000
  3.75  0.009   2      0.8566576  0.022820477
  3.75  0.009   3      0.8746505  0.254420785
  3.75  0.009   4      0.8880457  0.394696133
  3.75  0.009   5      0.8914433  0.446792101
  3.75  0.009   6      0.8954419  0.481254526
  3.75  0.009   7      0.8970411  0.493002827
  3.75  0.009   8      0.8974409  0.507954664
  3.75  0.009   9      0.8968404  0.516122914
  3.75  0.009  10      0.9010380  0.548104166
  3.75  0.010   1      0.8548582  0.000000000
  3.75  0.010   2      0.8594568  0.056479554
  3.75  0.010   3      0.8790488  0.304905961
  3.75  0.010   4      0.8892448  0.414534221
  3.75  0.010   5      0.8930433  0.461685887
  3.75  0.010   6      0.8962411  0.485397290
  3.75  0.010   7      0.8976408  0.505304007
  3.75  0.010   8      0.8966409  0.512807832
  3.75  0.010   9      0.9006382  0.545827009
  3.75  0.010  10      0.8986395  0.545783921
  4.00  0.001   1      0.8548582  0.000000000
  4.00  0.001   2      0.8548582  0.000000000
  4.00  0.001   3      0.8548582  0.000000000
  4.00  0.001   4      0.8548582  0.000000000
  4.00  0.001   5      0.8548582  0.000000000
  4.00  0.001   6      0.8548582  0.000000000
  4.00  0.001   7      0.8548582  0.000000000
  4.00  0.001   8      0.8548582  0.000000000
  4.00  0.001   9      0.8548582  0.000000000
  4.00  0.001  10      0.8548582  0.000000000
  4.00  0.002   1      0.8548582  0.000000000
  4.00  0.002   2      0.8548582  0.000000000
  4.00  0.002   3      0.8548582  0.000000000
  4.00  0.002   4      0.8548582  0.000000000
  4.00  0.002   5      0.8548582  0.000000000
  4.00  0.002   6      0.8558580  0.011696471
  4.00  0.002   7      0.8572574  0.029714526
  4.00  0.002   8      0.8622555  0.092360947
  4.00  0.002   9      0.8658547  0.147962443
  4.00  0.002  10      0.8704531  0.206145013
  4.00  0.003   1      0.8548582  0.000000000
  4.00  0.003   2      0.8548582  0.000000000
  4.00  0.003   3      0.8548582  0.000000000
  4.00  0.003   4      0.8552582  0.004688062
  4.00  0.003   5      0.8588568  0.049777659
  4.00  0.003   6      0.8646550  0.134456950
  4.00  0.003   7      0.8716528  0.219708292
  4.00  0.003   8      0.8768497  0.278149010
  4.00  0.003   9      0.8852465  0.360114638
  4.00  0.003  10      0.8864462  0.377880522
  4.00  0.004   1      0.8548582  0.000000000
  4.00  0.004   2      0.8548582  0.000000000
  4.00  0.004   3      0.8548582  0.000000000
  4.00  0.004   4      0.8598566  0.060918807
  4.00  0.004   5      0.8686537  0.179828685
  4.00  0.004   6      0.8756502  0.266018028
  4.00  0.004   7      0.8852464  0.363148564
  4.00  0.004   8      0.8886456  0.396582856
  4.00  0.004   9      0.8910441  0.426327898
  4.00  0.004  10      0.8912438  0.442940611
  4.00  0.005   1      0.8548582  0.000000000
  4.00  0.005   2      0.8548582  0.000000000
  4.00  0.005   3      0.8566576  0.022820477
  4.00  0.005   4      0.8656550  0.150366044
  4.00  0.005   5      0.8764497  0.273238370
  4.00  0.005   6      0.8872457  0.382185334
  4.00  0.005   7      0.8896449  0.413797943
  4.00  0.005   8      0.8918435  0.445570555
  4.00  0.005   9      0.8930432  0.462536915
  4.00  0.005  10      0.8954422  0.479160306
  4.00  0.006   1      0.8548582  0.000000000
  4.00  0.006   2      0.8548582  0.000000000
  4.00  0.006   3      0.8620558  0.086733121
  4.00  0.006   4      0.8744510  0.250823792
  4.00  0.006   5      0.8860460  0.371677304
  4.00  0.006   6      0.8896449  0.414849836
  4.00  0.006   7      0.8912435  0.446161045
  4.00  0.006   8      0.8944425  0.472931364
  4.00  0.006   9      0.8958414  0.482028647
  4.00  0.006  10      0.8966416  0.492437429
  4.00  0.007   1      0.8548582  0.000000000
  4.00  0.007   2      0.8550582  0.002353195
  4.00  0.007   3      0.8658548  0.150820424
  4.00  0.007   4      0.8816480  0.325768076
  4.00  0.007   5      0.8882456  0.399978989
  4.00  0.007   6      0.8916433  0.447409035
  4.00  0.007   7      0.8942424  0.472334325
  4.00  0.007   8      0.8960411  0.483430032
  4.00  0.007   9      0.8976412  0.502460930
  4.00  0.007  10      0.8970412  0.506844566
  4.00  0.008   1      0.8548582  0.000000000
  4.00  0.008   2      0.8560580  0.013995319
  4.00  0.008   3      0.8716528  0.219895714
  4.00  0.008   4      0.8872460  0.384138391
  4.00  0.008   5      0.8912440  0.437235507
  4.00  0.008   6      0.8930435  0.462395618
  4.00  0.008   7      0.8958412  0.482725729
  4.00  0.008   8      0.8978412  0.502997521
  4.00  0.008   9      0.8960422  0.503781415
  4.00  0.008  10      0.8974395  0.522493712
  4.00  0.009   1      0.8548582  0.000000000
  4.00  0.009   2      0.8568576  0.025077147
  4.00  0.009   3      0.8760499  0.269696740
  4.00  0.009   4      0.8878457  0.397671569
  4.00  0.009   5      0.8922430  0.454076012
  4.00  0.009   6      0.8952419  0.480745113
  4.00  0.009   7      0.8970412  0.495723686
  4.00  0.009   8      0.8964414  0.504293284
  4.00  0.009   9      0.8974395  0.521932493
  4.00  0.009  10      0.8998388  0.544994260
  4.00  0.010   1      0.8548582  0.000000000
  4.00  0.010   2      0.8604561  0.067567545
  4.00  0.010   3      0.8818480  0.328665194
  4.00  0.010   4      0.8896444  0.420274926
  4.00  0.010   5      0.8936430  0.465751718
  4.00  0.010   6      0.8972406  0.491392892
  4.00  0.010   7      0.8976406  0.507865037
  4.00  0.010   8      0.8970404  0.517339013
  4.00  0.010   9      0.9006384  0.548622434
  4.00  0.010  10      0.8962403  0.538003294
  4.25  0.001   1      0.8548582  0.000000000
  4.25  0.001   2      0.8548582  0.000000000
  4.25  0.001   3      0.8548582  0.000000000
  4.25  0.001   4      0.8548582  0.000000000
  4.25  0.001   5      0.8548582  0.000000000
  4.25  0.001   6      0.8548582  0.000000000
  4.25  0.001   7      0.8548582  0.000000000
  4.25  0.001   8      0.8548582  0.000000000
  4.25  0.001   9      0.8548582  0.000000000
  4.25  0.001  10      0.8548582  0.000000000
  4.25  0.002   1      0.8548582  0.000000000
  4.25  0.002   2      0.8548582  0.000000000
  4.25  0.002   3      0.8548582  0.000000000
  4.25  0.002   4      0.8548582  0.000000000
  4.25  0.002   5      0.8548582  0.000000000
  4.25  0.002   6      0.8560580  0.013995319
  4.25  0.002   7      0.8588568  0.049777659
  4.25  0.002   8      0.8632550  0.107910900
  4.25  0.002   9      0.8668547  0.161925654
  4.25  0.002  10      0.8720526  0.223454882
  4.25  0.003   1      0.8548582  0.000000000
  4.25  0.003   2      0.8548582  0.000000000
  4.25  0.003   3      0.8548582  0.000000000
  4.25  0.003   4      0.8556582  0.009358009
  4.25  0.003   5      0.8598566  0.060918807
  4.25  0.003   6      0.8656548  0.147458132
  4.25  0.003   7      0.8730518  0.234980995
  4.25  0.003   8      0.8786494  0.299946615
  4.25  0.003   9      0.8854460  0.365820579
  4.25  0.003  10      0.8886460  0.395616297
  4.25  0.004   1      0.8548582  0.000000000
  4.25  0.004   2      0.8548582  0.000000000
  4.25  0.004   3      0.8550582  0.002353195
  4.25  0.004   4      0.8610560  0.074230165
  4.25  0.004   5      0.8688539  0.187817864
  4.25  0.004   6      0.8770499  0.282385798
  4.25  0.004   7      0.8860459  0.370669178
  4.25  0.004   8      0.8882456  0.398889627
  4.25  0.004   9      0.8910438  0.431593520
  4.25  0.004  10      0.8912433  0.447034593
  4.25  0.005   1      0.8548582  0.000000000
  4.25  0.005   2      0.8548582  0.000000000
  4.25  0.005   3      0.8568574  0.025122989
  4.25  0.005   4      0.8684537  0.178049675
  4.25  0.005   5      0.8782496  0.295237693
  4.25  0.005   6      0.8872459  0.383278115
  4.25  0.005   7      0.8900446  0.418822158
  4.25  0.005   8      0.8916433  0.447409035
  4.25  0.005   9      0.8938428  0.468012736
  4.25  0.005  10      0.8954422  0.479870400
  4.25  0.006   1      0.8548582  0.000000000
  4.25  0.006   2      0.8548582  0.000000000
  4.25  0.006   3      0.8628552  0.098705222
  4.25  0.006   4      0.8750504  0.258017346
  4.25  0.006   5      0.8872460  0.384138391
  4.25  0.006   6      0.8900443  0.420634370
  4.25  0.006   7      0.8922430  0.455521622
  4.25  0.006   8      0.8946422  0.476559675
  4.25  0.006   9      0.8960411  0.483430032
  4.25  0.006  10      0.8978412  0.500973263
  4.25  0.007   1      0.8548582  0.000000000
  4.25  0.007   2      0.8552582  0.004688062
  4.25  0.007   3      0.8676544  0.169489642
  4.25  0.007   4      0.8826475  0.339974208
  4.25  0.007   5      0.8888451  0.409756213
  4.25  0.007   6      0.8918430  0.450493869
  4.25  0.007   7      0.8950420  0.479193346
  4.25  0.007   8      0.8972408  0.490721518
  4.25  0.007   9      0.8986406  0.508312552
  4.25  0.007  10      0.8958419  0.503278954
  4.25  0.008   1      0.8548582  0.000000000
  4.25  0.008   2      0.8560579  0.015905667
  4.25  0.008   3      0.8726521  0.231378649
  4.25  0.008   4      0.8876460  0.387207766
  4.25  0.008   5      0.8912438  0.441216579
  4.25  0.008   6      0.8942428  0.471360709
  4.25  0.008   7      0.8964409  0.486812385
  4.25  0.008   8      0.8986406  0.508312552
  4.25  0.008   9      0.8956417  0.503294492
  4.25  0.008  10      0.8982393  0.528920735
  4.25  0.009   1      0.8548582  0.000000000
  4.25  0.009   2      0.8576574  0.034176538
  4.25  0.009   3      0.8768499  0.281892086
  4.25  0.009   4      0.8890449  0.410363653
  4.25  0.009   5      0.8930428  0.460318347
  4.25  0.009   6      0.8952420  0.480749955
  4.25  0.009   7      0.8976411  0.501675594
  4.25  0.009   8      0.8964417  0.505707902
  4.25  0.009   9      0.8984393  0.529492322
  4.25  0.009  10      0.9010382  0.551562222
  4.25  0.010   1      0.8548582  0.000000000
  4.25  0.010   2      0.8620558  0.086636279
  4.25  0.010   3      0.8832472  0.345836859
  4.25  0.010   4      0.8918436  0.439109305
  4.25  0.010   5      0.8936427  0.468809018
  4.25  0.010   6      0.8970409  0.492168688
  4.25  0.010   7      0.8976409  0.509938728
  4.25  0.010   8      0.8964406  0.516270809
  4.25  0.010   9      0.8996388  0.544365021
  4.25  0.010  10      0.8954408  0.536769853
  4.50  0.001   1      0.8548582  0.000000000
  4.50  0.001   2      0.8548582  0.000000000
  4.50  0.001   3      0.8548582  0.000000000
  4.50  0.001   4      0.8548582  0.000000000
  4.50  0.001   5      0.8548582  0.000000000
  4.50  0.001   6      0.8548582  0.000000000
  4.50  0.001   7      0.8548582  0.000000000
  4.50  0.001   8      0.8548582  0.000000000
  4.50  0.001   9      0.8548582  0.000000000
  4.50  0.001  10      0.8550582  0.002353195
  4.50  0.002   1      0.8548582  0.000000000
  4.50  0.002   2      0.8548582  0.000000000
  4.50  0.002   3      0.8548582  0.000000000
  4.50  0.002   4      0.8548582  0.000000000
  4.50  0.002   5      0.8548582  0.000000000
  4.50  0.002   6      0.8560579  0.015887588
  4.50  0.002   7      0.8596566  0.058729676
  4.50  0.002   8      0.8646547  0.126845907
  4.50  0.002   9      0.8692534  0.185777607
  4.50  0.002  10      0.8734515  0.238535420
  4.50  0.003   1      0.8548582  0.000000000
  4.50  0.003   2      0.8548582  0.000000000
  4.50  0.003   3      0.8548582  0.000000000
  4.50  0.003   4      0.8558580  0.011696471
  4.50  0.003   5      0.8608560  0.072057445
  4.50  0.003   6      0.8666547  0.160110312
  4.50  0.003   7      0.8736512  0.244659730
  4.50  0.003   8      0.8802481  0.313943999
  4.50  0.003   9      0.8866457  0.375484443
  4.50  0.003  10      0.8882456  0.395325779
  4.50  0.004   1      0.8548582  0.000000000
  4.50  0.004   2      0.8548582  0.000000000
  4.50  0.004   3      0.8552582  0.004688062
  4.50  0.004   4      0.8620558  0.086733121
  4.50  0.004   5      0.8704531  0.206145013
  4.50  0.004   6      0.8788492  0.301753392
  4.50  0.004   7      0.8870459  0.381590401
  4.50  0.004   8      0.8898448  0.411921178
  4.50  0.004   9      0.8912438  0.436422045
  4.50  0.004  10      0.8916433  0.452869020
  4.50  0.005   1      0.8548582  0.000000000
  4.50  0.005   2      0.8548582  0.000000000
  4.50  0.005   3      0.8574574  0.031954022
  4.50  0.005   4      0.8690537  0.188290097
  4.50  0.005   5      0.8796484  0.309828555
  4.50  0.005   6      0.8882460  0.390818803
  4.50  0.005   7      0.8906440  0.425933708
  4.50  0.005   8      0.8914432  0.449260334
  4.50  0.005   9      0.8944425  0.473649399
  4.50  0.005  10      0.8958416  0.481967262
  4.50  0.006   1      0.8548582  0.000000000
  4.50  0.006   2      0.8548582  0.000000000
  4.50  0.006   3      0.8636547  0.113603679
  4.50  0.006   4      0.8762499  0.271355902
  4.50  0.006   5      0.8878459  0.387821507
  4.50  0.006   6      0.8904440  0.427089339
  4.50  0.006   7      0.8924432  0.456922140
  4.50  0.006   8      0.8954419  0.481254526
  4.50  0.006   9      0.8972408  0.490721518
  4.50  0.006  10      0.8980411  0.505028882
  4.50  0.007   1      0.8548582  0.000000000
  4.50  0.007   2      0.8552582  0.004688062
  4.50  0.007   3      0.8690536  0.186765249
  4.50  0.007   4      0.8858464  0.367019116
  4.50  0.007   5      0.8890449  0.412981943
  4.50  0.007   6      0.8926428  0.457550459
  4.50  0.007   7      0.8956417  0.482683926
  4.50  0.007   8      0.8970412  0.490840306
  4.50  0.007   9      0.8974409  0.504639798
  4.50  0.007  10      0.8962417  0.505851129
  4.50  0.008   1      0.8548582  0.000000000
  4.50  0.008   2      0.8566576  0.022820477
  4.50  0.008   3      0.8734513  0.241324214
  4.50  0.008   4      0.8888456  0.398254890
  4.50  0.008   5      0.8912438  0.442753430
  4.50  0.008   6      0.8944424  0.475063638
  4.50  0.008   7      0.8972408  0.491425372
  4.50  0.008   8      0.8976408  0.505900645
  4.50  0.008   9      0.8964414  0.509689120
  4.50  0.008  10      0.8984393  0.533162673
  4.50  0.009   1      0.8548582  0.000000000
  4.50  0.009   2      0.8588569  0.049698540
  4.50  0.009   3      0.8784491  0.298338140
  4.50  0.009   4      0.8886451  0.410853430
  4.50  0.009   5      0.8928430  0.461072811
  4.50  0.009   6      0.8948417  0.480307516
  4.50  0.009   7      0.8976411  0.503659126
  4.50  0.009   8      0.8956419  0.505223812
  4.50  0.009   9      0.8988390  0.534357042
  4.50  0.009  10      0.8996388  0.547265556
  4.50  0.010   1      0.8548582  0.000000000
  4.50  0.010   2      0.8626553  0.096455100
  4.50  0.010   3      0.8854467  0.364776118
  4.50  0.010   4      0.8916438  0.442458654
  4.50  0.010   5      0.8944424  0.474368731
  4.50  0.010   6      0.8966409  0.492356948
  4.50  0.010   7      0.8970411  0.508799773
  4.50  0.010   8      0.8980393  0.526918484
  4.50  0.010   9      0.9006385  0.550778247
  4.50  0.010  10      0.8956406  0.539564747
  4.75  0.001   1      0.8548582  0.000000000
  4.75  0.001   2      0.8548582  0.000000000
  4.75  0.001   3      0.8548582  0.000000000
  4.75  0.001   4      0.8548582  0.000000000
  4.75  0.001   5      0.8548582  0.000000000
  4.75  0.001   6      0.8548582  0.000000000
  4.75  0.001   7      0.8548582  0.000000000
  4.75  0.001   8      0.8548582  0.000000000
  4.75  0.001   9      0.8548582  0.000000000
  4.75  0.001  10      0.8552582  0.004688062
  4.75  0.002   1      0.8548582  0.000000000
  4.75  0.002   2      0.8548582  0.000000000
  4.75  0.002   3      0.8548582  0.000000000
  4.75  0.002   4      0.8548582  0.000000000
  4.75  0.002   5      0.8548582  0.000000000
  4.75  0.002   6      0.8564576  0.020546433
  4.75  0.002   7      0.8606560  0.069834930
  4.75  0.002   8      0.8650548  0.135429537
  4.75  0.002   9      0.8690537  0.191129789
  4.75  0.002  10      0.8740510  0.248559119
  4.75  0.003   1      0.8548582  0.000000000
  4.75  0.003   2      0.8548582  0.000000000
  4.75  0.003   3      0.8548582  0.000000000
  4.75  0.003   4      0.8560580  0.013995319
  4.75  0.003   5      0.8618558  0.084608543
  4.75  0.003   6      0.8690536  0.183827455
  4.75  0.003   7      0.8748504  0.256347507
  4.75  0.003   8      0.8820476  0.332705391
  4.75  0.003   9      0.8870459  0.381657060
  4.75  0.003  10      0.8882456  0.398889627
  4.75  0.004   1      0.8548582  0.000000000
  4.75  0.004   2      0.8548582  0.000000000
  4.75  0.004   3      0.8556582  0.009358009
  4.75  0.004   4      0.8628552  0.098705222
  4.75  0.004   5      0.8714529  0.217851419
  4.75  0.004   6      0.8804481  0.315448610
  4.75  0.004   7      0.8874457  0.384886328
  4.75  0.004   8      0.8896449  0.414849836
  4.75  0.004   9      0.8916436  0.442452442
  4.75  0.004  10      0.8928432  0.460453781
  4.75  0.005   1      0.8548582  0.000000000
  4.75  0.005   2      0.8548582  0.000000000
  4.75  0.005   3      0.8584569  0.045218768
  4.75  0.005   4      0.8702531  0.204393501
  4.75  0.005   5      0.8814478  0.326337015
  4.75  0.005   6      0.8884457  0.396009930
  4.75  0.005   7      0.8906438  0.431202070
  4.75  0.005   8      0.8922432  0.455523127
  4.75  0.005   9      0.8946422  0.476506637
  4.75  0.005  10      0.8958412  0.482725729
  4.75  0.006   1      0.8548582  0.000000000
  4.75  0.006   2      0.8548582  0.000000000
  4.75  0.006   3      0.8650547  0.132094621
  4.75  0.006   4      0.8770499  0.283680335
  4.75  0.006   5      0.8888456  0.397406289
  4.75  0.006   6      0.8914438  0.437848041
  4.75  0.006   7      0.8926433  0.459827973
  4.75  0.006   8      0.8954419  0.482056457
  4.75  0.006   9      0.8968412  0.490204879
  4.75  0.006  10      0.8982408  0.506377447
  4.75  0.007   1      0.8548582  0.000000000
  4.75  0.007   2      0.8556582  0.009358009
  4.75  0.007   3      0.8694534  0.195162512
  4.75  0.007   4      0.8852464  0.365192444
  4.75  0.007   5      0.8894444  0.418773907
  4.75  0.007   6      0.8924430  0.458458154
  4.75  0.007   7      0.8952419  0.481438208
  4.75  0.007   8      0.8966412  0.491739780
  4.75  0.007   9      0.8978406  0.508468806
  4.75  0.007  10      0.8970411  0.513485499
  4.75  0.008   1      0.8548582  0.000000000
  4.75  0.008   2      0.8566576  0.022820477
  4.75  0.008   3      0.8746505  0.254420785
  4.75  0.008   4      0.8884456  0.398655708
  4.75  0.008   5      0.8916433  0.448191114
  4.75  0.008   6      0.8956417  0.481883821
  4.75  0.008   7      0.8968409  0.491547698
  4.75  0.008   8      0.8978406  0.508468806
  4.75  0.008   9      0.8974406  0.517698999
  4.75  0.008  10      0.8990392  0.537322287
  4.75  0.009   1      0.8548582  0.000000000
  4.75  0.009   2      0.8600564  0.063151862
  4.75  0.009   3      0.8804486  0.316771987
  4.75  0.009   4      0.8894446  0.417574878
  4.75  0.009   5      0.8934428  0.464385915
  4.75  0.009   6      0.8962411  0.488225186
  4.75  0.009   7      0.8982408  0.507704127
  4.75  0.009   8      0.8974411  0.516444736
  4.75  0.009   9      0.8988392  0.536136741
  4.75  0.009  10      0.8984393  0.542238779
  4.75  0.010   1      0.8548582  0.000000000
  4.75  0.010   2      0.8628552  0.103643168
  4.75  0.010   3      0.8860462  0.372627199
  4.75  0.010   4      0.8914438  0.442630490
  4.75  0.010   5      0.8950419  0.479314460
  4.75  0.010   6      0.8964414  0.493124600
  4.75  0.010   7      0.8966414  0.507519443
  4.75  0.010   8      0.8988392  0.533130103
  4.75  0.010   9      0.8996388  0.547128817
  4.75  0.010  10      0.8950409  0.538252839
  5.00  0.001   1      0.8548582  0.000000000
  5.00  0.001   2      0.8548582  0.000000000
  5.00  0.001   3      0.8548582  0.000000000
  5.00  0.001   4      0.8548582  0.000000000
  5.00  0.001   5      0.8548582  0.000000000
  5.00  0.001   6      0.8548582  0.000000000
  5.00  0.001   7      0.8548582  0.000000000
  5.00  0.001   8      0.8548582  0.000000000
  5.00  0.001   9      0.8548582  0.000000000
  5.00  0.001  10      0.8556582  0.009358009
  5.00  0.002   1      0.8548582  0.000000000
  5.00  0.002   2      0.8548582  0.000000000
  5.00  0.002   3      0.8548582  0.000000000
  5.00  0.002   4      0.8548582  0.000000000
  5.00  0.002   5      0.8550582  0.002353195
  5.00  0.002   6      0.8566576  0.022820477
  5.00  0.002   7      0.8618558  0.082915837
  5.00  0.002   8      0.8656547  0.146092379
  5.00  0.002   9      0.8704531  0.206145013
  5.00  0.002  10      0.8748504  0.256347507
  5.00  0.003   1      0.8548582  0.000000000
  5.00  0.003   2      0.8548582  0.000000000
  5.00  0.003   3      0.8548582  0.000000000
  5.00  0.003   4      0.8560579  0.015887588
  5.00  0.003   5      0.8626553  0.096592384
  5.00  0.003   6      0.8688539  0.187817864
  5.00  0.003   7      0.8754502  0.262799823
  5.00  0.003   8      0.8840467  0.350523495
  5.00  0.003   9      0.8872460  0.383123590
  5.00  0.003  10      0.8898448  0.412842945
  5.00  0.004   1      0.8548582  0.000000000
  5.00  0.004   2      0.8548582  0.000000000
  5.00  0.004   3      0.8558580  0.011696471
  5.00  0.004   4      0.8634550  0.109947626
  5.00  0.004   5      0.8726521  0.231378649
  5.00  0.004   6      0.8822475  0.333283308
  5.00  0.004   7      0.8880460  0.389263894
  5.00  0.004   8      0.8898446  0.418192526
  5.00  0.004   9      0.8914435  0.445969991
  5.00  0.004  10      0.8926433  0.459731536
  5.00  0.005   1      0.8548582  0.000000000
  5.00  0.005   2      0.8548582  0.000000000
  5.00  0.005   3      0.8598566  0.060918807
  5.00  0.005   4      0.8712529  0.213459077
  5.00  0.005   5      0.8826475  0.339974208
  5.00  0.005   6      0.8882456  0.397081957
  5.00  0.005   7      0.8916438  0.440812611
  5.00  0.005   8      0.8926432  0.459832791
  5.00  0.005   9      0.8952419  0.480631137
  5.00  0.005  10      0.8964409  0.486812385
  5.00  0.006   1      0.8548582  0.000000000
  5.00  0.006   2      0.8548582  0.000000000
  5.00  0.006   3      0.8646550  0.134456950
  5.00  0.006   4      0.8786491  0.298902319
  5.00  0.006   5      0.8882457  0.395314036
  5.00  0.006   6      0.8914438  0.441835815
  5.00  0.006   7      0.8932433  0.463011790
  5.00  0.006   8      0.8952420  0.480695585
  5.00  0.006   9      0.8968412  0.492375207
  5.00  0.006  10      0.8980406  0.508454825
  5.00  0.007   1      0.8548582  0.000000000
  5.00  0.007   2      0.8560580  0.013995319
  5.00  0.007   3      0.8712529  0.213459077
  5.00  0.007   4      0.8874456  0.384686489
  5.00  0.007   5      0.8902441  0.427343953
  5.00  0.007   6      0.8928432  0.461070095
  5.00  0.007   7      0.8950420  0.480066863
  5.00  0.007   8      0.8972412  0.496251382
  5.00  0.007   9      0.8980404  0.510550661
  5.00  0.007  10      0.8972406  0.516493627
  5.00  0.008   1      0.8548582  0.000000000
  5.00  0.008   2      0.8568576  0.025077147
  5.00  0.008   3      0.8758500  0.266422036
  5.00  0.008   4      0.8876457  0.396236433
  5.00  0.008   5      0.8916432  0.448222796
  5.00  0.008   6      0.8952419  0.480631137
  5.00  0.008   7      0.8964411  0.491070278
  5.00  0.008   8      0.8976408  0.509292898
  5.00  0.008   9      0.8974406  0.518434260
  5.00  0.008  10      0.8996387  0.542729938
  5.00  0.009   1      0.8548582  0.000000000
  5.00  0.009   2      0.8608561  0.071896757
  5.00  0.009   3      0.8826475  0.337756825
  5.00  0.009   4      0.8902446  0.425408992
  5.00  0.009   5      0.8936428  0.465746417
  5.00  0.009   6      0.8968409  0.492926736
  5.00  0.009   7      0.8982406  0.510362305
  5.00  0.009   8      0.8974408  0.517720395
  5.00  0.009   9      0.8996387  0.542786938
  5.00  0.009  10      0.8982388  0.544138687
  5.00  0.010   1      0.8548582  0.000000000
  5.00  0.010   2      0.8648545  0.127092987
  5.00  0.010   3      0.8872459  0.383089427
  5.00  0.010   4      0.8916435  0.445571043
  5.00  0.010   5      0.8948419  0.477944746
  5.00  0.010   6      0.8964412  0.495865625
  5.00  0.010   7      0.8974411  0.512544710
  5.00  0.010   8      0.8982392  0.533078032
  5.00  0.010   9      0.8980393  0.541614769
  5.00  0.010  10      0.8950408  0.538912055

Accuracy was used to select the optimal model using the largest value.
The final values used for the model were degree = 10, scale = 0.009 and C
 = 4.25.

Collect the results of trained models

In [85]:
results <- resamples(list(  trained_Model_1  = fit.svmLinear
                            , trained_Model_2  = fit.svmRadial
                            , trained_Model_3  = fit.svmPoly
                            
                            , trained_Model_4  = fit.svmLinear_preProc
                            , trained_Model_5  = fit.svmRadial_preProc
                            , trained_Model_6  = fit.svmRadial_preProc
                            
                            , trained_Model_7  = fit.svmLinear_automaticGrid
                            , trained_Model_8  = fit.svmRadial_automaticGrid
                            , trained_Model_9  = fit.svmPoly_automaticGrid
                            
                            , trained_Model_10 = fit.svmLinear_manualGrid
                            , trained_Model_11 = fit.svmRadial_manualGrid
                            , trained_Model_12 = fit.svmPoly_manualGrid
))

Summarize the fitted models

In [86]:
summary(results)
Call:
summary.resamples(object = results)

Models: trained_Model_1, trained_Model_2, trained_Model_3, trained_Model_4, trained_Model_5, trained_Model_6, trained_Model_7, trained_Model_8, trained_Model_9, trained_Model_10, trained_Model_11, trained_Model_12 
Number of resamples: 4 

Accuracy 
                      Min.   1st Qu.    Median      Mean   3rd Qu.      Max.
trained_Model_1  0.8545164 0.8545164 0.8548582 0.8548582 0.8552000 0.8552000
trained_Model_2  0.8904876 0.8928219 0.8960406 0.8956422 0.8988609 0.9000000
trained_Model_3  0.9032774 0.9038193 0.9072358 0.9090373 0.9124537 0.9184000
trained_Model_4  0.8545164 0.8545164 0.8548582 0.8548582 0.8552000 0.8552000
trained_Model_5  0.8816000 0.8882657 0.8916867 0.8900433 0.8934643 0.8952000
trained_Model_6  0.8816000 0.8882657 0.8916867 0.8900433 0.8934643 0.8952000
trained_Model_7  0.8545164 0.8545164 0.8548582 0.8548582 0.8552000 0.8552000
trained_Model_8  0.8592000 0.8616000 0.8632544 0.8636534 0.8653078 0.8689049
trained_Model_9  0.8545164 0.8545164 0.8548582 0.8548582 0.8552000 0.8552000
trained_Model_10 0.8545164 0.8545164 0.8548582 0.8548582 0.8552000 0.8552000
trained_Model_11 0.8912000 0.8990590 0.9048761 0.9034380 0.9092552 0.9128000
trained_Model_12 0.8952000 0.8988000 0.9004396 0.9010382 0.9026779 0.9080735
                 NA's
trained_Model_1     0
trained_Model_2     0
trained_Model_3     0
trained_Model_4     0
trained_Model_5     0
trained_Model_6     0
trained_Model_7     0
trained_Model_8     0
trained_Model_9     0
trained_Model_10    0
trained_Model_11    0
trained_Model_12    0

Kappa 
                       Min.    1st Qu.    Median      Mean   3rd Qu.      Max.
trained_Model_1  0.00000000 0.00000000 0.0000000 0.0000000 0.0000000 0.0000000
trained_Model_2  0.40525167 0.40907562 0.4326150 0.4350147 0.4585541 0.4695772
trained_Model_3  0.55522418 0.56871090 0.5979642 0.5988914 0.6281447 0.6444130
trained_Model_4  0.00000000 0.00000000 0.0000000 0.0000000 0.0000000 0.0000000
trained_Model_5  0.29086714 0.36655898 0.4012314 0.3778059 0.4124783 0.4178937
trained_Model_6  0.29086714 0.36655898 0.4012314 0.3778059 0.4124783 0.4178937
trained_Model_7  0.00000000 0.00000000 0.0000000 0.0000000 0.0000000 0.0000000
trained_Model_8  0.05362505 0.08019566 0.0983517 0.1035911 0.1217471 0.1640358
trained_Model_9  0.00000000 0.00000000 0.0000000 0.0000000 0.0000000 0.0000000
trained_Model_10 0.00000000 0.00000000 0.0000000 0.0000000 0.0000000 0.0000000
trained_Model_11 0.49325432 0.50752411 0.5343763 0.5364884 0.5633406 0.5839466
trained_Model_12 0.53572647 0.53917530 0.5412529 0.5515622 0.5536398 0.5880166
                 NA's
trained_Model_1     0
trained_Model_2     0
trained_Model_3     0
trained_Model_4     0
trained_Model_5     0
trained_Model_6     0
trained_Model_7     0
trained_Model_8     0
trained_Model_9     0
trained_Model_10    0
trained_Model_11    0
trained_Model_12    0

Plot and rank the fitted models

In [87]:
dotplot(results)
In [88]:
bwplot(results)

Assign the best trained model based on Accuracy

In [89]:
best_trained_model <- fit.svmRadial_manualGrid

9. Test skill of the BEST trained model on validation/testing dataset

In [90]:
predictions <- predict(best_trained_model, newdata=testing_dataset)

Evaluate the BEST trained model and print results

In [91]:
res_  <- caret::confusionMatrix(table(predictions, testing_dataset$Churn))
print("Results from the BEST trained model ... ...\n"); 
print(round(res_$overall, digits = 3))
[1] "Results from the BEST trained model ... ...\n"
      Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
         0.907          0.559          0.886          0.926          0.856 
AccuracyPValue  McnemarPValue 
         0.000          0.000 

10. Save the model to disk

In [92]:
#getwd()
saveRDS(best_trained_model, "./best_trained_model.rds")
In [93]:
# load the model
#getwd()
saved_model <- readRDS("./best_trained_model.rds")
print(saved_model)
Support Vector Machines with Radial Basis Function Kernel 

2501 samples
  19 predictor
   2 classes: 'no', 'yes' 

Pre-processing: centered (19), scaled (19), principal component
 signal extraction (19) 
Resampling: Cross-Validated (2 fold, repeated 2 times) 
Summary of sample sizes: 1251, 1250, 1251, 1250 
Resampling results across tuning parameters:

  C    sigma  Accuracy   Kappa      
  1.0  0.1    0.8910436  0.395101052
  1.0  0.2    0.8770491  0.263986504
  1.0  0.3    0.8616558  0.082270494
  1.0  0.4    0.8560579  0.014053048
  1.0  0.5    0.8550582  0.002353195
  1.0  0.6    0.8548582  0.000000000
  1.0  0.7    0.8548582  0.000000000
  1.0  0.8    0.8548582  0.000000000
  1.0  0.9    0.8548582  0.000000000
  1.0  1.0    0.8548582  0.000000000
  1.5  0.1    0.8994398  0.479467316
  1.5  0.2    0.8900443  0.398313543
  1.5  0.3    0.8710515  0.207242730
  1.5  0.4    0.8614560  0.085221306
  1.5  0.5    0.8558579  0.017438650
  1.5  0.6    0.8550582  0.002353195
  1.5  0.7    0.8548582  0.000000000
  1.5  0.8    0.8548582  0.000000000
  1.5  0.9    0.8548582  0.000000000
  1.5  1.0    0.8548582  0.000000000
  2.0  0.1    0.9000395  0.504361056
  2.0  0.2    0.8922428  0.427059231
  2.0  0.3    0.8718508  0.224707540
  2.0  0.4    0.8614560  0.095038285
  2.0  0.5    0.8556580  0.018936195
  2.0  0.6    0.8544584  0.001146119
  2.0  0.7    0.8548584  0.001955066
  2.0  0.8    0.8548582  0.000000000
  2.0  0.9    0.8548582  0.000000000
  2.0  1.0    0.8548582  0.000000000
  2.5  0.1    0.9020390  0.522895363
  2.5  0.2    0.8914432  0.424427823
  2.5  0.3    0.8720508  0.228016585
  2.5  0.4    0.8614561  0.100069704
  2.5  0.5    0.8556580  0.018936195
  2.5  0.6    0.8544584  0.001146119
  2.5  0.7    0.8548584  0.001955066
  2.5  0.8    0.8548582  0.000000000
  2.5  0.9    0.8548582  0.000000000
  2.5  1.0    0.8548582  0.000000000
  3.0  0.1    0.9034380  0.536488374
  3.0  0.2    0.8902432  0.420058273
  3.0  0.3    0.8716510  0.228112624
  3.0  0.4    0.8612563  0.099602705
  3.0  0.5    0.8556580  0.018936195
  3.0  0.6    0.8544584  0.001146119
  3.0  0.7    0.8548584  0.001955066
  3.0  0.8    0.8548582  0.000000000
  3.0  0.9    0.8548582  0.000000000
  3.0  1.0    0.8548582  0.000000000
  3.5  0.1    0.9014385  0.532802322
  3.5  0.2    0.8898435  0.420461089
  3.5  0.3    0.8710513  0.227597045
  3.5  0.4    0.8612563  0.099602705
  3.5  0.5    0.8556580  0.018936195
  3.5  0.6    0.8544584  0.001146119
  3.5  0.7    0.8548584  0.001955066
  3.5  0.8    0.8548582  0.000000000
  3.5  0.9    0.8548582  0.000000000
  3.5  1.0    0.8548582  0.000000000
  4.0  0.1    0.9002382  0.534163603
  4.0  0.2    0.8878440  0.414978448
  4.0  0.3    0.8710510  0.230432537
  4.0  0.4    0.8610563  0.099096612
  4.0  0.5    0.8556580  0.018936195
  4.0  0.6    0.8544584  0.001146119
  4.0  0.7    0.8548584  0.001955066
  4.0  0.8    0.8548582  0.000000000
  4.0  0.9    0.8548582  0.000000000
  4.0  1.0    0.8548582  0.000000000
  4.5  0.1    0.9004384  0.539666102
  4.5  0.2    0.8872443  0.413919157
  4.5  0.3    0.8712508  0.232350709
  4.5  0.4    0.8610563  0.099096612
  4.5  0.5    0.8556580  0.018936195
  4.5  0.6    0.8544584  0.001146119
  4.5  0.7    0.8548584  0.001955066
  4.5  0.8    0.8548582  0.000000000
  4.5  0.9    0.8548582  0.000000000
  4.5  1.0    0.8548582  0.000000000
  5.0  0.1    0.9000390  0.539396659
  5.0  0.2    0.8878440  0.418393524
  5.0  0.3    0.8710508  0.230455586
  5.0  0.4    0.8610563  0.099096612
  5.0  0.5    0.8556580  0.018936195
  5.0  0.6    0.8544584  0.001146119
  5.0  0.7    0.8548584  0.001955066
  5.0  0.8    0.8548582  0.000000000
  5.0  0.9    0.8548582  0.000000000
  5.0  1.0    0.8548582  0.000000000

Accuracy was used to select the optimal model using the largest value.
The final values used for the model were sigma = 0.1 and C = 3.
In [94]:
# make a predictions on "new data" using the final model
final_predictions <- predict(saved_model, dataSet[1:20])
confusionMatrix(table(final_predictions, dataSet$Churn))
res_ <- confusionMatrix(table(final_predictions, dataSet$Churn))
print("Results from the BEST trained model ... ...\n"); 
print(round(res_$overall, digits = 3))
Confusion Matrix and Statistics

                 
final_predictions   no  yes
              no  2831  110
              yes   19  373
                                          
               Accuracy : 0.9613          
                 95% CI : (0.9542, 0.9676)
    No Information Rate : 0.8551          
    P-Value [Acc > NIR] : < 2.2e-16       
                                          
                  Kappa : 0.8306          
                                          
 Mcnemar's Test P-Value : 2.299e-15       
                                          
            Sensitivity : 0.9933          
            Specificity : 0.7723          
         Pos Pred Value : 0.9626          
         Neg Pred Value : 0.9515          
             Prevalence : 0.8551          
         Detection Rate : 0.8494          
   Detection Prevalence : 0.8824          
      Balanced Accuracy : 0.8828          
                                          
       'Positive' Class : no              
                                          
[1] "Results from the BEST trained model ... ...\n"
      Accuracy          Kappa  AccuracyLower  AccuracyUpper   AccuracyNull 
         0.961          0.831          0.954          0.968          0.855 
AccuracyPValue  McnemarPValue 
         0.000          0.000 
In [95]:
print(res_$table)
fourfoldplot(res_$table, color = c("#CC6666", "#99CC99"),
             conf.level = 0, margin = 1, main = "Confusion Matrix")
                 
final_predictions   no  yes
              no  2831  110
              yes   19  373

REFERENCES