But what does it mean to speak of the probability of a 'one-off' event?
Some quotes from the very much pro-evolutionist book A Beautiful Math by Tom Siegfried (pp. 206-7). Emphasis mine.
So here's a clue about what to do when you know nothing about the probabilities in the system you want to study. Choose a probability distribution that maximizes the entropy! Maximum entropy means maximum ignorance, and if you know nothing, ignorance is by definition at a maximum. Assuming maximum entropy/ignorance then is not just an assumption; it's a factual statement about your situation.

And there's the rub. When we talk about the 'probability' of life evolving by chance, it must be a kind of 'blunt instrument' probability, because it has yet to be established that life evolving by known processes is possible at all. Life is a very low-entropy phenomenon: it needs a mechanism to overcome the comparatively high-entropy equilibrium of its environment, and then to stay there as it 'evolves' the components, inter-dependent sub-systems and dependent machinery to do the job.
...
But what, exactly, does it mean to 'maximize the entropy'? It simply means choosing the probability distribution that would result from adding up all the possibilities permitted by the laws of nature (since you know nothing, you cannot leave out anything that's possible).
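Siegfried's point can be illustrated numerically. The sketch below (my own illustration, not from the book) computes the Shannon entropy of two distributions over the same six outcomes: a uniform one, which assumes nothing, and a biased one, which smuggles in knowledge. The uniform distribution always comes out on top, which is exactly why 'maximum entropy' and 'maximum ignorance' coincide.

```python
import math

def shannon_entropy(p):
    """Shannon entropy, in bits, of a discrete probability distribution."""
    return -sum(x * math.log2(x) for x in p if x > 0)

# Six equally likely outcomes (think of a fair die): total ignorance.
uniform = [1/6] * 6

# A biased distribution over the same outcomes: it encodes a claim
# to know that some outcomes are likelier than others.
biased = [0.5, 0.2, 0.1, 0.1, 0.05, 0.05]

print(shannon_entropy(uniform))  # log2(6) ≈ 2.585 bits, the maximum
print(shannon_entropy(biased))   # strictly less than the uniform case
```

Any departure from uniformity lowers the entropy, so choosing anything other than the maximum-entropy distribution amounts to asserting knowledge you do not have.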