Danish researchers develop an algorithm to predict life… and death

An algorithm that predicts the stages of a life through to its end: researchers at a Danish university have developed a model nicknamed the “death calculator”, which also helps raise awareness of the risks of commercial misuse of personal data.

“It is a very general framework for making predictions about human life. It can predict anything, provided it has training data,” Sune Lehmann, professor at the Technical University of Denmark (DTU) and one of the authors of the study published in the journal Nature Computational Science, tells AFP.

According to him, the possibilities are endless.

“It could predict health outcomes, so it could predict fertility or obesity, or maybe who’s going to get cancer or not. But it could also predict whether you’ll make a lot of money,” he adds.

Concretely, life2vec uses a model similar to the one behind ChatGPT, but instead of processing text, it analyzes life events such as birth, education, social benefits, or even working hours.

“From a certain point of view, life is just a sequence of events: people are born, see a pediatrician, go to school, move, get married, and so on,” the study notes.

“Here we exploit this similarity to adapt innovations in natural language processing to examine the evolution and predictability of human lives based on detailed event sequences,” it explains.
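The analogy can be made concrete with a toy sketch: just as a language model turns words into integer token ids, a life can be encoded as a chronological sequence of event tokens. The vocabulary and function names below are purely illustrative and do not come from the life2vec model itself.

```python
# Toy vocabulary of life events mapped to integer token ids,
# the way a tokenizer maps words to ids for a language model.
# (Hypothetical names; not the study's actual event taxonomy.)
EVENT_VOCAB = {
    "BIRTH": 0,
    "PEDIATRICIAN_VISIT": 1,
    "SCHOOL_START": 2,
    "MOVE": 3,
    "MARRIAGE": 4,
    "NEW_JOB": 5,
}

def encode_life(events):
    """Turn a chronological list of life events into token ids,
    analogous to tokenizing a sentence before feeding it to a model."""
    return [EVENT_VOCAB[e] for e in events]

life = ["BIRTH", "PEDIATRICIAN_VISIT", "SCHOOL_START", "MOVE", "MARRIAGE"]
tokens = encode_life(life)
print(tokens)  # [0, 1, 2, 3, 4]
```

Once events are tokens, the same sequence-modeling machinery used for text (predicting the next token from the ones before it) can, in principle, be trained on lives instead of sentences.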

Data on six million Danes

The model is based on anonymized data on around six million Danes, collected by Statistics Denmark, the national statistics institute.

Analyzing the early part of a sequence makes it possible to predict the rest, through to the end. On death, the algorithm is right in 78% of cases; on migration, in 73%.

“With a cohort of relatively young people, aged 35 to 65, we try to predict, based on an eight-year period (2008 to 2016), whether a person will die in the following four years, by 2020. The model does this very well, better than any other algorithm,” explains Mr. Lehmann, who is careful not to apply his formula to individual cases.

This age group, in which deaths are usually rare, allows the researchers, they say, to verify the reliability of the program.

But the tool is not ready to be used by the general public, because it still has biases. “For the moment, it is a research project that explores the field of possibilities (…); we do not know whether it treats everyone equally.”

The role of long-term dynamics and of social connections, and their impact on the predictability of lives, also remains to be explored.

For the academic, the project offers a scientific counterweight to the algorithms developed by the GAFAM tech giants (Google, Apple, Facebook, Amazon, Microsoft).

“They can also build models like this, but they don’t make them public and don’t talk about them,” he says. “We can only hope that they are developing them merely to make us buy more products,” the researcher adds.

For him, it is “important to have a public and open counterweight to begin to understand what can be done with data like this.”

Especially since algorithms of this type are certainly already used in the insurance field, says data ethics expert Pernille Tranberg.

“We are certainly put into groups (…), and this can be used against us: it can force us to pay a higher insurance premium, or deny us a bank loan or access to public healthcare because we are going to die anyway,” she lists.

Such biases are absent from the research project, which is not intended for individual use, thanks to the anonymization of its sources.

“There are no examples of personal data leaks from the national statistics institute, and the data is not individualized,” she emphasizes. However, with the development of artificial intelligence, “everything is speeding up.”

The project “just shows that we have a lot of data in Denmark, and that it can be used, because we humans are all going in the same direction,” adds Ms. Tranberg.

And some developers have decided to exploit the idea for commercial purposes.

“On the web, we already see prediction clocks that show the age we will reach, and some are not at all reliable,” she warns.