This vignette covers the capped mean functions, exposure curves and increased limit factor curves of the NetSimR package. It reviews the theory behind these functions, proposes possible uses and illustrates them with examples. The claim severity distributions covered are LogNormal, Gamma, Pareto, Sliced LogNormal-Pareto and Sliced Gamma-Pareto. An application of the capped mean is suggested in the article “Taken to Excess”, accessible here.
Let claim severity \(x\) follow a probability density function \(f(x)\) and a cumulative distribution function \(F(x)\). Let claims be capped at amount \(c\). The capped claim severity is defined as \(y=\min(x,c)\).
The expected claim severity is: \[E(x)=\int_0^{\infty}x\,f(x)~dx\] The expected capped claim severity is: \[E(y)=\int_0^{\infty}\min(x,c)\,f(x)~dx\] Since, for a non-negative severity, \(E(y)=\int_0^{c}\big(1-F(x)\big)~dx\), the above simplifies to: \[E(y)=c-\int_0^{c}F(x)~dx\] This equation needs to be solved for each distribution.
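The identity above can be checked numerically. A minimal sketch, assuming a LogNormal severity with illustrative parameters mu = 6 and sigma = 1.7 and a cap of 10,000, comparing the right-hand side evaluated by numerical integration with the package's closed-form LNormCappedMean function:

#Load the package providing closed-form capped means
library(NetSimR)
mu<-6
sigma<-1.7
cap<-10000
#Capped mean via the identity E(y) = c - integral of F(x) from 0 to c
cap-integrate(function(x) plnorm(x, mu, sigma), lower=0, upper=cap)$value
#Closed-form capped mean from the package; the two values should agree
LNormCappedMean(cap, mu, sigma)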
This function provides the percentage of the total claims cost that lies below a certain claims level. The exposure curve at claims cap \(c\) is calculated by: \[EC(c)=\frac{E(y)}{E(x)}\]
This function provides the ratio of the expected claims capped at a higher cap \(c_{h}\) to the expected claims capped at a lower cap \(c_{l}\). Let the claim capped at \(c_{h}\) be \(y_{h}\) and the claim capped at \(c_{l}\) be \(y_{l}\). The increased limit factor curve between claim caps \(c_{h}\) and \(c_{l}\) is calculated by: \[ILF(c_{h},c_{l})=\frac{E(y_{h})}{E(y_{l})}\]
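Both curves reduce to ratios of capped means. A minimal sketch, assuming a LogNormal severity with illustrative parameters mu = 6 and sigma = 1.7, using the package's LNormCappedMean function:

#Load the package providing closed-form capped means
library(NetSimR)
mu<-6
sigma<-1.7
#Exposure curve at a cap of 10,000: capped mean divided by the uncapped LogNormal mean
LNormCappedMean(10000, mu, sigma)/exp(mu+0.5*sigma^2)
#ILF between a lower cap of 10,000 and a higher cap of 20,000: ratio of the two capped means
LNormCappedMean(20000, mu, sigma)/LNormCappedMean(10000, mu, sigma)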
In this section we will suggest possible uses of the specified functions.
For property lines/sections:
Note: for property lines, instead of modelling severity directly, we may opt to model severity as a percentage of the sum insured. In GLMs this is equivalent to using the sum insured as the exposure for the severity distribution.
For casualty lines/sections:
The code below illustrates an alternative and more accurate approach to applying capping in regressions when modelling the attritional part of claims. The steps are: simulate claims from a LogNormal distribution, cap the claims, fit an ordinary linear regression to the logged capped claims (the market approach), fit a right-censored regression to the logged capped claims treating observations at the cap as censored (the proposed approach), and compare each model's implied capped mean with the empirical capped mean.
The example suggests that the proposed method outperforms the market approach for a LogNormal distribution. For a Gamma distribution the benefit in a one-way analysis would not be significant, but when moving to a GLM where the model relativities are larger, the capped-mean benefit may also become significant.
#Load the required packages
library(NetSimR)
library(crch)
#Set parameters
n<-10000
mu<-6
sigma<-1.7
Cap <- 10000
#Set seed to keep simulations constant
set.seed(10)
#Simulate data
x<-round(rlnorm(n,mu,sigma),0)
head(x)
## [1] 416 295 39 146 666 783
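#Summarise the uncapped simulated claims (assumed call producing the output below)
summary(x)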
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max.
##       1     128     400    1769    1279  263389
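#Cap the simulated claims at the cap level (assumed step); z denotes the capped claims used below
z<-pmin(x,Cap)
head(z)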
## [1] 416 295 39 146 666 783
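#Summarise the capped claims (assumed call producing the output below)
summary(z)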
##    Min. 1st Qu.  Median    Mean 3rd Qu.    Max.
##       1     128     400    1282    1279   10000
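#Fit an ordinary linear regression to the logged capped claims (the market approach)
lmLinear<-lm(log(z)~1)
summary(lmLinear)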
##
## Call:
## lm(formula = log(z) ~ 1)
##
## Residuals:
## Min 1Q Median 3Q Max
## -5.9815 -1.1295 0.0099 1.1723 3.2288
##
## Coefficients:
## Estimate Std. Error t value Pr(>|t|)
## (Intercept) 5.98153 0.01662 359.8 <2e-16 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Residual standard error: 1.662 on 9999 degrees of freedom
#fit right censored regression
lmCensored<-crch(log(z)~1, right = log(Cap), link.scale="identity", dist = "gaussian")
summary(lmCensored)
##
## Call:
## crch(formula = log(z) ~ 1, link.scale = "identity", dist = "gaussian",
## right = log(Cap))
##
## Standardized residuals:
## Min 1Q Median 3Q Max
## -3.5188 -0.6735 -0.0053 0.6763 1.8823
##
## Coefficients (location model):
## Estimate Std. Error z value Pr(>|z|)
## (Intercept) 6.00054 0.01708 351.2 <2e-16 ***
##
## Coefficients (scale model with identity link):
## Estimate Std. Error z value Pr(>|z|)
## (Intercept) 1.70526 0.01234 138.2 <2e-16 ***
## ---
## Signif. codes: 0 '***' 0.001 '**' 0.01 '*' 0.05 '.' 0.1 ' ' 1
##
## Distribution: gaussian
## Log-likelihood: -1.936e+04 on 2 Df
## Number of iterations in BFGS optimization: 6
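#Compare the fitted location and scale parameters from the two models (assumed calls producing the output below)
cbind(coefficients(lmLinear),summary(lmLinear)$sigma)
coefficients(lmCensored)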
##                 [,1]     [,2]
## (Intercept) 5.981527 1.662243
## (Intercept) (scale)_(Intercept)
##    6.000537            1.705257
#Compare regressions' attritional capped cost to empirical
round(cbind(
Empirical=mean(z)
,LinearModel=exp(coefficients(lmLinear)+0.5*summary(lmLinear)$sigma*summary(lmLinear)$sigma)
,CensoredModel=LNormCappedMean(Cap,coefficients(lmCensored)[1],coefficients(lmCensored)[2])
),0)
##             Empirical LinearModel CensoredModel
## (Intercept)      1282        1577          1284
Modelling excess levels would be similar to the above approach. In that case the expected mean above the excess level would be the overall mean less the expected mean below the excess level. This is elaborated in the article cited in the introduction section.
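As a sketch of this identity, using the parameters fitted by the censored regression above and a hypothetical excess level of 1,000, the expected claim cost above the excess could be computed as:

#Hypothetical excess level
Excess<-1000
#Fitted LogNormal parameters from the censored regression above
muHat<-coefficients(lmCensored)[1]
sigmaHat<-coefficients(lmCensored)[2]
#Expected cost above the excess = uncapped mean less the mean capped at the excess
exp(muHat+0.5*sigmaHat^2)-LNormCappedMean(Excess,muHat,sigmaHat)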
For a LogNormal severity, the exposure curve value at a particular point can be obtained by running the code below:
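A minimal sketch, assuming a LogNormal severity with mu = 5 and sigma = 1.6 and a cap of 1,000 currency units, and computing the curve as the ratio of the capped mean to the uncapped mean via LNormCappedMean:

#Exposure curve value at a cap of 1,000 for an assumed LogNormal(mu = 5, sigma = 1.6) severity
LNormCappedMean(1000, 5, 1.6)/exp(5+0.5*1.6^2)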
## [1] 0.5601327
This is interpreted as: 56% of the expected claims cost lies below 1,000 currency units. Other distributions follow the same logic. The challenge in this exercise is fitting the correct severity distribution, rather than calculating the exposure curve’s value.
For a LogNormal severity, the increased limit factor value between two particular points can be obtained by running the code below:
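A minimal sketch, again assuming a LogNormal severity with mu = 5 and sigma = 1.6, taking the ratio of the capped means at a higher cap of 1,500 and a lower cap of 1,000:

#ILF between caps of 1,000 and 1,500 for an assumed LogNormal(mu = 5, sigma = 1.6) severity
LNormCappedMean(1500, 5, 1.6)/LNormCappedMean(1000, 5, 1.6)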
## [1] 1.155086
This is interpreted as: the expected cost of claims capped at 1,500 currency units is 1.155 times the expected cost of claims capped at 1,000 currency units. Other distributions follow the same logic. The challenge in this exercise is fitting the severity distribution, rather than calculating the increased limit factor curve’s value.