Almost all of the severance programs that I have looked at work on the basis of a displaced worker receiving a certain number of days or weeks of severance for each year that they have worked for the company. For instance, if a company gives one week of severance for each year of tenure, the arithmetic is pretty easy...6 years of tenure gets 6 weeks of severance, where each week of severance equates to one week of pre-displacement salary.
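That per-year formula is simple enough to sketch in a few lines of code. This is just an illustration of the arithmetic described above; the one-week-per-year rate and the weekly salary figure are examples, not any particular company's plan.

```python
# Sketch of a weeks-per-year-of-tenure severance plan.
# weeks_per_year = 1 matches the "one week per year" example above.

def severance_weeks(years_of_tenure, weeks_per_year=1):
    """Weeks of severance earned under a weeks-per-year plan."""
    return years_of_tenure * weeks_per_year

def severance_pay(years_of_tenure, weekly_salary, weeks_per_year=1):
    """Total severance: weeks earned times one week of pre-displacement pay."""
    return severance_weeks(years_of_tenure, weeks_per_year) * weekly_salary

# 6 years of tenure at a hypothetical $1,000/week -> 6 weeks -> $6,000
print(severance_pay(6, 1000))
```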
The question is, are there any potential savings here? I went to the Bureau of Labor Statistics website and looked up the average duration of unemployment. Interestingly, the BLS not only publishes average duration data monthly (and has since 1948), it also publishes monthly median data.
So, my conclusion is pretty clear here: if the severance duration awarded to an individual displaced worker exceeds the national average or median unemployment duration, there just might be sufficiently significant savings on which an insurance company might build a risk absorption model. But that's too simple, because I'll bet that when you drill down in the data, you'll find significant variables like age, sex, ethnicity, academic achievement, job description, job tenure, salary range, etc.
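The savings idea above can be put in rough terms: if benefits stopped at re-employment, whatever severance weeks outlast the unemployment spell would be the potential savings. This is a back-of-the-envelope sketch, and the week counts in the example are hypothetical, not BLS figures.

```python
# Rough model of the duration-savings argument: an insurer's savings are
# the severance weeks left over after the (average or median) spell of
# unemployment ends -- zero if unemployment outlasts the award.

def expected_savings_weeks(severance_weeks, expected_unemployment_weeks):
    """Weeks of severance potentially saved if payments stop at re-employment."""
    return max(0, severance_weeks - expected_unemployment_weeks)

# A 15-week award against a hypothetical 9-week average spell: 6 weeks saved.
print(expected_savings_weeks(15, 9))
# An 8-week award against the same 9-week spell: nothing to save.
print(expected_savings_weeks(8, 9))
```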
Does severance insurance really take all of this into account and still allow an insurance company to quantify the duration risk, such that it can make an informed decision as to whether or not to offer coverage to a company? There's another issue. What's a company but a group of workers (I've referred to them quite often as currency)? Could an underwriter really underwrite a company on an individual-by-individual basis and come up with a premium that accurately reflects the sum of the individual risks?
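To make the "sum of the individual risks" question concrete, here is a toy pricing sketch. Everything in it is hypothetical: the displacement probabilities, the expected unemployment weeks, the pay figures, and the loading factor are made-up inputs, and a real underwriter would need far richer data on the variables listed above.

```python
# Toy illustration of underwriting a company as the sum of its workers.
# The insurer's expected payout per worker: probability of displacement
# times the weeks it would actually pay (unemployment spell capped at
# the severance award) times weekly pay.

def individual_expected_payout(p_displacement, sev_weeks, exp_unemp_weeks, weekly_pay):
    covered_weeks = min(sev_weeks, exp_unemp_weeks)
    return p_displacement * covered_weeks * weekly_pay

def company_premium(workers, load=1.2):
    """Premium as the sum of individual expected payouts plus a loading."""
    return load * sum(individual_expected_payout(**w) for w in workers)

workers = [
    {"p_displacement": 0.05, "sev_weeks": 10, "exp_unemp_weeks": 6, "weekly_pay": 1000},
    {"p_displacement": 0.10, "sev_weeks": 4,  "exp_unemp_weeks": 9, "weekly_pay": 800},
]
print(round(company_premium(workers), 2))
```

Whether such a bottom-up sum actually captures the company's risk is exactly the question posed above.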
This whole discussion brings up another hurdle. In order to capture potential duration savings, the severance benefit would have to be paid on a pay-period basis and stop (like disability) when the displaced worker went back to work. How would that change go over with the human resources crowd?
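The pay-period mechanics above can be sketched as a simple loop: disburse one period at a time and stop as soon as the worker is re-employed. The `back_to_work` check is a hypothetical stand-in for whatever verification an insurer would actually run each period.

```python
# Period-by-period severance disbursement that stops at re-employment,
# the disability-style mechanic described above.

def run_payout(severance_periods, pay_per_period, back_to_work):
    """Pay each period until the award is exhausted or the worker is back at work.
    back_to_work(period) is a hypothetical per-period employment check."""
    paid = 0
    for period in range(1, severance_periods + 1):
        if back_to_work(period):
            break
        paid += pay_per_period
    return paid

# Re-employed at period 8 of a 12-period award: only 7 periods get paid.
print(run_payout(12, 1000, lambda p: p >= 8))
```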