Weekly excess deaths have become a popular yardstick for assessing the impact of COVID, government policies and COVID therapies, but the method is brimming with issues.
"By way of example, if there were 10% less 70 – 79 year-olds back in 2015 compared to 2019 then the multiplication factor applied to 70 – 79 year-old deaths in 2015 would be x0.9. "
Isn't this backwards? Shouldn't the multiplier be (1/0.9)=1.11?
Example: In 2015 there were 10 deaths in a population size of 10. Everyone died. In 2019 there were 10 deaths in a population size of 1 million. 2015 was a very deadly year, but would have a multiple of basically zero. That can't be right. 2015 should be weighted more, not less.
Would the following be an equivalent way of phrasing the population size standardization?:
1) For each year in the baseline, calculate the event rate (occurrences per person).
2) A = the average of the event rates of the baseline years
3) B= the event rate in the year in question.
4) C = (B-A) = "excess event rate" in the year in question
5) D = the population size for the year in question
6) E = (C X D) = standardized excess count of events in the year in question.
"If the sub-population has been dwindling in size over time then the curve for standardised excess will be tucked beneath the unadjusted series."
Similarly, two references to populations being in "decline" don't seem right to me. Adjusted curve going below unadjusted should indicate increasing population size I think.
"There’s a difference between the two series indicating this sub-population has been in decline since 2015. "
"The patterns are near-identical and this arises from the near-linear decline (though we must note this is a rather large assumption). Herewith that summary table"
I've figured why I tripped - whilst typing I glanced at my calcs for the under 1yr subpop not realising this series has been in decline, so the logic got inverted. Bummer!
When are you going to look at the method used by Our World in Data to calculate excess deaths? It would be helpful, as my government, NZ, is using their method to pat themselves on the back, since at the close of 2022, they still have negative excess deaths. Admittedly, their December 25 weekly figure is total crap, as it was based on an initial estimate of 602 deaths, but that has grown to 674 deaths as more death registrations have been processed. My colleague in Nz, who has developed his own regression analysis, says the OWID method lacks transparency.
Hi John. The biggest problem I see with the Projected Baseline method used by OWID is that I believe their regression only evaluates 5 years of data, 2015-2019. In the case of NZ, this creates a steeper line than if you analyse 2011 to 2019 data, because we had bad seasonal deaths in the winters of 2017 and 2019, which biases the 5-year trend. This is neutralized in the longer range data.
If you look at the data for all countries, the 5-year regression line produces the impression that there has not been as many excess deaths. I was hoping you might find reasons that also contribute to this effect.
Thank you. It will help in my legal case against the NZ MOH. I hope to have my evidence collated in a week's time. There is also the issue of 52 and 53 week years, but I don't believe it makes a lot of difference when using the longer term data, that is 9-10 years vis a vis 5 years.
Do you believe it's better to use death rates, deaths per million, rather than absolute deaths? I suppose it depends on the accuracy of population data, the Demographics etc.
Mortality is preferred but getting decent estimate of banded age sub-populations is nigh on impossible! There are other ways to model excess death using sophisticated time series techniques and I shall be taking folk through these at some point.
In print very few people are capable of commenting honestly and openly right now. Behind closed doors it's a different matter - they know exactly what is going on.
I'm confused:
"By way of example, if there were 10% less 70 – 79 year-olds back in 2015 compared to 2019 then the multiplication factor applied to 70 – 79 year-old deaths in 2015 would be x0.9. "
Isn't this backwards? Shouldn't the multiplier be (1/0.9)=1.11?
Example: In 2015 there were 10 deaths in a population size of 10. Everyone died. In 2019 there were 10 deaths in a population size of 1 million. 2015 was a very deadly year, but would have a multiple of basically zero. That can't be right. 2015 should be weighted more, not less.
Would the following be an equivalent way of phrasing the population size standardization? (Rough code sketch below the list.)
1) For each year in the baseline, calculate the event rate (occurrences per person).
2) A = the average of the event rates of the baseline years.
3) B = the event rate in the year in question.
4) C = (B - A) = "excess event rate" in the year in question.
5) D = the population size for the year in question.
6) E = (C x D) = standardized excess count of events in the year in question.
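If it helps, here is that recipe as a throwaway Python sketch; the figures and the function name are mine, invented purely to show the arithmetic (they are not from the article's spreadsheet):

```python
# Rate-based excess count: hypothetical illustration only (invented figures).
def excess_events(baseline_deaths, baseline_pops, year_deaths, year_pop):
    # 1) event rate for each baseline year (occurrences per person)
    rates = [d / p for d, p in zip(baseline_deaths, baseline_pops)]
    # 2) A = average baseline event rate
    A = sum(rates) / len(rates)
    # 3) B = event rate in the year in question
    B = year_deaths / year_pop
    # 4) C = excess event rate
    C = B - A
    # 5-6) E = C x D, the standardized excess count for the year in question
    return C * year_pop

# e.g. five baseline years with a growing sub-population
print(excess_events(
    baseline_deaths=[900, 910, 930, 950, 970],
    baseline_pops=[100_000, 103_000, 106_000, 109_000, 112_000],
    year_deaths=1_200,
    year_pop=115_000,
))
```

The last step just converts the excess rate back into a headcount using the current year's population, which is how I read the standardization.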
I'm not surprised... the factor should be 1.11 as it is in the spreadsheet! Good job one of us is awake.
Is this not also backwards?:
"If the sub-population has been dwindling in size over time then the curve for standardised excess will be tucked beneath the unadjusted series."
Similarly, two references to populations being in "decline" don't seem right to me. The adjusted curve going below the unadjusted one should indicate an increasing population size, I think.
LOL - I spotted and fixed that blooper a moment ago!
These two seem suspect:
"There’s a difference between the two series indicating this sub-population has been in decline since 2015. "
"The patterns are near-identical and this arises from the near-linear decline (though we must note this is a rather large assumption). Herewith that summary table"
The same virus has struck. About to correct.
I've figured out why I tripped - whilst typing I glanced at my calcs for the under-1yr sub-pop, not realising this series has been in decline, so the logic got inverted. Bummer!
One more possible bugger:
"There’s quite a difference here and that is because this age group has been declining the most since 2015. "
I failed a chemistry test due to early AM dyslexia once. Can relate.
The good news is that spreadsheet calcs are correct. I'll put this down to lack of a decent breakfast. Meanwhile I'll tinker with different methods.
Typo: "The CHEC death spike of spring 2022" (in Coffee & Cogitation) should be 2020.
It should indeed - now fixed.
Hi John,
When are you going to look at the method used by Our World in Data to calculate excess deaths? It would be helpful, as my government here in NZ is using their method to pat themselves on the back, since at the close of 2022 they still have negative excess deaths. Admittedly, their December 25 weekly figure is total crap, as it was based on an initial estimate of 602 deaths, but that has grown to 674 deaths as more death registrations have been processed. My colleague in NZ, who has developed his own regression analysis, says the OWID method lacks transparency.
Cheers
Terry
Morning! Right now seems a good time to do this while I'm mulling over methods. I'll go take a look at how they are calculating this...
It's nearly midnight here, so I look forward to your updates. I have lots of NZ data, if you need it.
Regards
Hi John. The biggest problem I see with the Projected Baseline method used by OWID is that I believe their regression only evaluates 5 years of data, 2015-2019. In the case of NZ, this creates a steeper line than if you analyse 2011 to 2019 data, because we had bad seasonal deaths in the winters of 2017 and 2019, which biases the 5-year trend. This is neutralized in the longer-range data.
If you look at the data for all countries, the 5-year regression line produces the impression that there have not been as many excess deaths. I was hoping you might find reasons that also contribute to this effect.
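To make the leverage of those two winters concrete, here is a rough numpy sketch with synthetic annual totals (not the actual NZ registrations), fitting 2015-2019 versus 2011-2019 and projecting to 2020:

```python
import numpy as np

# Synthetic annual deaths: a gentle upward drift, with the 2017 and
# 2019 "winters" bumped up to mimic bad seasonal mortality.
years = np.arange(2011, 2020)
deaths = np.array([29000, 29200, 29400, 29600, 29800,
                   30000, 30900, 30400, 31300], dtype=float)

for start in (2011, 2015):
    mask = years >= start
    slope, intercept = np.polyfit(years[mask], deaths[mask], 1)
    projected_2020 = slope * 2020 + intercept
    print(f"{start}-2019 fit: slope {slope:.0f}/yr, "
          f"projected 2020 baseline {projected_2020:.0f}")
```

The shorter window gives the high 2017 and 2019 values far more leverage, so the fitted slope steepens, the projected baseline rises, and the measured excess shrinks accordingly.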
Cheers
Terry
Spot on. In the following article I compare 5-year and 10-year baselines for the UK ONS data...
https://jdee.substack.com/p/vaccines-and-death-part-5
In this one I reveal sleight of hand by the ONS...
https://jdee.substack.com/p/ons-baseline-derivation-for-excess
There's more I can do on this, the problem is time (lack thereof) but I hope to trundle something out in a few days.
Thank you. It will help in my legal case against the NZ MOH. I hope to have my evidence collated in a week's time. There is also the issue of 52- and 53-week years, but I don't believe it makes a lot of difference when using the longer-term data, that is 9-10 years vis-à-vis 5 years.
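For anyone wanting to check which years carry the extra week, the ISO calendar makes it a one-liner (a throwaway Python snippet, nothing to do with the MOH data itself):

```python
from datetime import date

# December 28th always falls in the last ISO week of its year, so its
# week number tells you whether the year has 52 or 53 ISO weeks.
for year in range(2011, 2023):
    if date(year, 12, 28).isocalendar()[1] == 53:
        print(year, "has 53 ISO weeks")
```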
Do you believe it's better to use death rates, deaths per million, rather than absolute deaths? I suppose it depends on the accuracy of the population data, the demographics, etc.
Cheers
Mortality is preferred, but getting a decent estimate of banded age sub-populations is nigh on impossible! There are other ways to model excess death using sophisticated time series techniques and I shall be taking folk through these at some point.
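The rate arithmetic itself is trivial once you have denominators you trust; a minimal sketch with entirely invented figures (the bands and counts below are hypothetical):

```python
# Deaths per million by age band: hypothetical figures for illustration only.
bands = {
    "70-79": {"deaths": 950, "population": 110_000},
    "80+":   {"deaths": 1_400, "population": 60_000},
}
for band, x in bands.items():
    rate = x["deaths"] / x["population"] * 1_000_000
    print(f"{band}: {rate:,.0f} deaths per million")
```

The hard part is not this division but trusting the population estimates that go into it.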
Excellent. Prof John Gibson at Waikato University in New Zealand made a fist of that in his 2022 paper looking at the impact of booster doses.
I see the CMI is only one-quarter expecting this higher mortality to be the new normal. They don't seem very curious, do they?
https://www.theactuary.com/2023/02/07/falling-life-expectancy-new-mortality-model?utm_term=&utm_medium=email&utm_source=Adestra
In print very few people are capable of commenting honestly and openly right now. Behind closed doors it's a different matter - they know exactly what is going on.
Did you read the response to the petition about excess deaths? More heart attacks, less cancer.
OHID and ONS use different methodologies. Just to compound the confusion I guess.
Incredible that they base their comments on one week in October. Especially given what we know about coroners' delays.
https://petition.parliament.uk/petitions/628188
Quite. Obfuscation is set to maximum!