Merged
2 changes: 2 additions & 0 deletions .github/pull_request_template.md
@@ -6,3 +6,5 @@

## Additional Context for Reviewers


- [ ] I ran the tests locally and they passed, for both code (`pytest`) and documentation changes (`uv run jb build docs --builder=custom --custom-builder=doctest`)
121 changes: 85 additions & 36 deletions chainladder/methods/benktander.py
@@ -16,6 +16,8 @@ class Benktander(MethodBase):
then use 1.0
n_iters: int, optional (default=1)
Number of iterations to use in the Benktander model.
When n_iters=1, the result is equivalent to the BornhuetterFerguson method.
When n_iters>>1, the result converges to the traditional Chainladder model.
apriori_sigma: float, optional (default=0.0)
Standard deviation of the apriori. When used in conjunction with the
bootstrap model, the model samples aprioris from a lognormal distribution
@@ -49,53 +51,83 @@ class Benktander(MethodBase):

.. testcode::

    xyz = cl.load_sample("xyz")

    ibnr = cl.Benktander().fit(X=xyz["Paid"], sample_weight=xyz["Premium"].latest_diagonal).ibnr_
    print(ibnr)

.. testoutput::

    1998             NaN
    1999      115.472127
    2000      914.033812
    2001     2432.394513
    2002     6037.026677
    2003    13928.934651
    2004    33925.451475
    2005    69724.761575
    2006    73410.593920
    2007    52977.560411
    2008    45873.769490
When ``n_iters=1``, the model is exactly the same as the ``BornhuetterFerguson`` model.

.. testcode::

    xyz = cl.load_sample("xyz")

    bk_ibnr = (
        cl.Benktander(n_iters=1)
        .fit(X=xyz["Paid"], sample_weight=xyz["Premium"].latest_diagonal)
        .ibnr_
    )
    bf_ibnr = (
        cl.BornhuetterFerguson()
        .fit(X=xyz["Paid"], sample_weight=xyz["Premium"].latest_diagonal)
        .ibnr_
    )
    print(bk_ibnr - bf_ibnr)

.. testoutput::

    1998   NaN
    1999   NaN
    2000   NaN
    2001   NaN
    2002   NaN
    2003   NaN
    2004   NaN
    2005   NaN
    2006   NaN
    2007   NaN
    2008   NaN

When ``n_iters>>1``, the model converges to the traditional ``Chainladder`` model.

.. testcode::

    xyz = cl.load_sample("xyz")

    bk_ibnr = (
        cl.Benktander(n_iters=1000)
        .fit(X=xyz["Paid"], sample_weight=xyz["Premium"].latest_diagonal)
        .ibnr_
    )
    cl_ibnr = cl.Chainladder().fit(xyz["Paid"]).ibnr_
    print(bk_ibnr - cl_ibnr)

.. testoutput::

    1998             NaN
    1999             NaN
    2000             NaN
    2001    1.455192e-11
    2002   -7.275958e-12
    2003    7.275958e-12
    2004    1.455192e-11
    2005   -1.455192e-11
    2006    2.910383e-11
    2007   -5.820766e-11
    2008   -7.275958e-11
"""

def __init__(self, apriori=1.0, n_iters=1, apriori_sigma=0, random_state=None):
@@ -132,13 +164,30 @@ def fit(self, X, y=None, sample_weight=None):

.. testcode::

    xyz = cl.load_sample("xyz")

    ultimate = (
        cl.Benktander(apriori=1, n_iters=2)
        .fit(X=xyz["Paid"], sample_weight=xyz["Premium"].latest_diagonal)
        .ultimate_
    )
    print(ultimate)

.. testoutput::

    1998    15822.000000
    1999    24908.397003
    2000    37547.676656
    2001    40511.198946
    2002    49417.354765
    2003    50042.095135
    2004    82437.601111
    2005    95417.171135
    2006    88485.508416
    2007    66882.788227
    2008    50708.755370

"""
if sample_weight is None:
raise ValueError("sample_weight is required.")
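The ``n_iters`` behavior documented in the Benktander docstring above can be sketched independently of the library. The following is a hypothetical one-origin illustration, not chainladder's implementation: the function name `benktander_ultimate` and the constants `latest`, `cdf`, and `apriori` are my own, and a single known cumulative development factor is assumed.

```python
# Hedged sketch of the Benktander iteration for a single origin period.
# Starting from an apriori ultimate, each iteration replaces the ultimate
# with: latest + (1 - 1/CDF) * previous ultimate.

def benktander_ultimate(latest, cdf, apriori, n_iters=1):
    """Return the Benktander ultimate after n_iters iterations."""
    pct_unreported = 1.0 - 1.0 / cdf  # share of ultimate still to emerge
    ultimate = apriori
    for _ in range(n_iters):
        ultimate = latest + pct_unreported * ultimate
    return ultimate

latest, cdf, apriori = 8_000.0, 2.0, 14_000.0

bf = benktander_ultimate(latest, cdf, apriori, n_iters=1)       # Bornhuetter-Ferguson
converged = benktander_ultimate(latest, cdf, apriori, n_iters=60)
cl_ultimate = latest * cdf                                      # chainladder ultimate

print(bf)          # 15000.0 -- latest + 0.5 * apriori
print(converged)   # 16000.0 -- the chainladder fixed point latest / (1/cdf)
```

With ``n_iters=1`` the result is the Bornhuetter-Ferguson ultimate; as the iteration count grows, the apriori's influence decays geometrically and the ultimate converges to the chainladder answer, matching the near-zero differences shown in the docstring.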
74 changes: 42 additions & 32 deletions chainladder/methods/bornferg.py
@@ -10,9 +10,14 @@ class BornhuetterFerguson(Benktander):
Parameters
----------
apriori: float, optional (default=1.0)
Multiplier for the ``sample_weight`` used in the Bornhuetter-Ferguson
method. If ``sample_weight`` is already an apriori measure of ultimate,
then use 1.0. The recommended practice is to keep the model parameter
assumption separate from the data. For example, if the apriori is 80%
of premium, set ``apriori=0.8`` and leave the premium data in the
``sample_weight`` argument unmodified.
apriori_sigma: float, optional (default=0.0)
Standard deviation of the apriori. When used in conjunction with the
bootstrap model, the model samples aprioris from a lognormal distribution
@@ -35,55 +40,60 @@
Examples
--------
Bornhuetter-Ferguson requires an apriori expected ultimate per origin,
supplied through ``sample_weight``.

A common idiom for building a flat per-origin apriori is to take any
same-shape Triangle, zero it out, and add the desired value. Here is an
example.

.. testsetup::

    import chainladder as cl

.. testcode::

    raa = cl.load_sample("raa")
    premium = raa.latest_diagonal * 0 + 40_000  # zero out and add 40,000 to each origin

    ibnr = cl.BornhuetterFerguson(apriori=0.7).fit(X=raa, sample_weight=premium).ibnr_
    print(ibnr)

.. testoutput::

    1981             NaN
    1982      255.707763
    1983      717.772687
    1984     1596.061515
    1985     2658.738155
    1986     5239.441491
    1987     8574.335344
    1988    12714.889984
    1989    18585.219714
    1990    24861.068855

One might be tempted to never set the apriori and instead modify the ``sample_weight`` directly. This produces the same answer, but it is not the recommended practice: it not only adds confusion, it also mixes the model parameter assumption in with the data.

.. testcode::

    raa = cl.load_sample("raa")
    premium = raa.latest_diagonal * 0 + 40_000 * 0.7  # premium is modified by 70%

    ibnr = cl.BornhuetterFerguson().fit(X=raa, sample_weight=premium).ibnr_
    print(ibnr)

.. testoutput::

    1981             NaN
    1982      255.707763
    1983      717.772687
    1984     1596.061515
    1985     2658.738155
    1986     5239.441491
    1987     8574.335344
    1988    12714.889984
    1989    18585.219714
    1990    24861.068855
"""

def __init__(self, apriori=1.0, apriori_sigma=0.0, random_state=None):