This got me thinking: just how good is Cassandra Brown?
What a great chance to use some real data in a toy example.
Now: $$P(H|H, F)= P(H|F)=0.5$$ This is immediate: conditional on the coin being fair (or biased), we model the tosses as i.i.d., so a previous head carries no information about the next toss.
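A quick simulation makes this concrete (a toy sketch of my own, not part of the original argument): if we condition on the coin being fair, the frequency of heads on a toss is about 0.5 whether or not the previous toss came up heads.

```python
import random

# Simulate pairs of tosses from a fair coin. Conditional on fairness,
# tosses are i.i.d., so P(H) should not change when we condition on
# the previous toss being heads.
random.seed(0)
tosses = [(random.random() < 0.5, random.random() < 0.5) for _ in range(100_000)]

# P(H) on the second toss, unconditionally
p_h = sum(second for _, second in tosses) / len(tosses)

# P(H on second toss | H on first toss)
given_h = [second for first, second in tosses if first]
p_h_given_h = sum(given_h) / len(given_h)

print(round(p_h, 2), round(p_h_given_h, 2))  # both come out near 0.5
```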
I outlined the basic idea behind likelihoods and likelihood ratios.
Likelihoods are relatively straightforward to understand because they are based on tangible data.
Collect your data, and then the likelihood curve shows the relative support that your data lend to various simple hypotheses.
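As a sketch of that idea (with made-up numbers, not the post's actual data): suppose we observe 7 heads in 10 tosses. The binomial likelihood $L(p) = \binom{10}{7} p^7 (1-p)^3$ scores each simple hypothesis $p$ by the relative support the data lend it, and it peaks at the observed proportion.

```python
from math import comb

# Illustrative data: 7 heads in 10 tosses (assumed numbers).
n, k = 10, 7

def likelihood(p):
    """Binomial likelihood of the data under heads-probability p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Evaluate the likelihood on a grid of simple hypotheses.
grid = [i / 100 for i in range(1, 100)]
best = max(grid, key=likelihood)

# The curve peaks at the sample proportion k/n = 0.7.
print(best)
```

A likelihood ratio is then just `likelihood(p1) / likelihood(p0)` for two competing simple hypotheses.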
Conjugate priors are not required for Bayesian updating, but they make the calculations much easier, so they are worth using when you can.
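For the coin-toss setting, the relevant conjugate pair is Beta-Binomial: with a Beta(a, b) prior on the heads probability and k heads observed in n tosses, the posterior is Beta(a + k, b + n - k), with no integration required. A minimal sketch (the prior and data here are illustrative, not from the post):

```python
def beta_binomial_update(a, b, k, n):
    """Posterior Beta parameters after observing k heads in n tosses,
    starting from a Beta(a, b) prior on the heads probability."""
    return a + k, b + (n - k)

a, b = 1, 1                                    # uniform prior, Beta(1, 1)
a2, b2 = beta_binomial_update(a, b, k=7, n=10) # observe 7 heads in 10 tosses
posterior_mean = a2 / (a2 + b2)

print((a2, b2), posterior_mean)  # Beta(8, 4), posterior mean 8/12
```

The update is just addition, which is exactly why conjugacy makes the bookkeeping so pleasant.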