UoL CS Notes

Home

This website houses notes from my studies at the University of Liverpool. If you see any errors or issues, please do open an issue on this site's GitHub.

Bayesian Networks - 2

COMP111 Lectures

Joint Probability Distribution We want to compute the joint probability distribution from the conditional distributions: \[\mathbf{P}(F\vert \text{parents}(F))\] Given a belief network, we can always assume an ordering: \[F_1,\ldots,F_n\] of its random variables such that for all $i,j$: \[F_i\rightarrow F_j \text{ implies } i<j\] Using the examples we can order the random variables as...

Read More
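The ordering described in the excerpt above is what makes the factorisation work: because every parent precedes its children, the chain rule lets the joint distribution be written as a product of the per-variable conditionals. As a sketch (the full post presumably derives this in detail): \[\mathbf{P}(F_1,\ldots,F_n)=\prod_{i=1}^{n}\mathbf{P}(F_i\vert \text{parents}(F_i))\] For example, for a simple chain $F_1\rightarrow F_2\rightarrow F_3$ this reads $\mathbf{P}(F_1,F_2,F_3)=\mathbf{P}(F_1)\times\mathbf{P}(F_2\vert F_1)\times\mathbf{P}(F_3\vert F_2)$.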

Bayesian Networks - 1

COMP111 Lectures

Conditional Independence Random variables $G,F$ are conditionally independent given $H_1,\ldots,H_n$ if: \[\begin{aligned} \mathbf{P} (G,F\vert H_1,\ldots,H_n)=&\mathbf{P}(G\vert H_1,\ldots,H_n)\\ \times& \mathbf{P}(F\vert H_1,\ldots,H_n) \end{aligned}\] or, equivalently: \[\mathbf{P} (G\vert F, H_1,\ldots,H_n)=\mathbf{P}(G\vert H_1,\ldots,H_n)\] The equivalence follows from the multiplication rule. Example - Dentistry In the dentist domain it seems reasonable to assert conditional independence of the variables...

Read More
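To see why the two formulations in the excerpt above are equivalent (writing $H$ as shorthand for $H_1,\ldots,H_n$, purely for this sketch): the multiplication rule gives \[\mathbf{P}(G,F\vert H)=\mathbf{P}(G\vert F,H)\times\mathbf{P}(F\vert H)\] so if $\mathbf{P}(G,F\vert H)=\mathbf{P}(G\vert H)\times\mathbf{P}(F\vert H)$, dividing both sides by $\mathbf{P}(F\vert H)$ (assuming it is non-zero) leaves $\mathbf{P}(G\vert F,H)=\mathbf{P}(G\vert H)$.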

Logic - 3

COMP109 Lectures

Semantic Consequence Suppose $\Gamma$ is a finite set of formulas and $P$ is a formula. Then $P$ follows from $\Gamma$ (is a semantic consequence of $\Gamma$) if the following implication holds for every interpretation $I$: \[\text{If } I(Q)=1\text{ for all } Q\in \Gamma,\text{then } I(P)=1\] This is denoted by: \[\Gamma...

Read More
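A small example in the spirit of the definition above: take $\Gamma=\{P, P\rightarrow Q\}$. Any interpretation $I$ with $I(P)=1$ and $I(P\rightarrow Q)=1$ must also have $I(Q)=1$, so $Q$ is a semantic consequence of $\Gamma$ (written $\Gamma\models Q$ in the standard notation).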

Assessment 2 - Meeting 1

COMP107 Meetings

Roll Call Liam, Ravi and Ben Weston were present. Identifying Submission Requirements We had a look at the marking criteria to identify the requirements for the report. Identify at least three users or views which provide three different perspectives on the model? For the user perspectives create 4 - 5...

Read More

Independent Random Variables

COMP111 Lectures

Random variables $F$ and $G$ are independent if: \[\mathbf{P}(F,G)=\mathbf{P}(F)\times\mathbf{P}(G)\] That is, for all values $r$ and $s$: \[P(F=r,G=s)=P(F=r)\times P(G=s)\] As one’s dental problems do not influence the weather, each of the following pairs of random variables is independent: $\text{Toothache},\text{Weather}$ $\text{Catch},\text{Weather}$ $\text{Cavity},\text{Weather}$ Example - Weather and Dental Problems The full joint probability distribution:...

Read More
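As a quick numerical check of the definition above (the probabilities here are made up purely for illustration, not taken from the post's joint distribution table): if $P(\text{Cavity}=\text{true})=0.2$ and $P(\text{Weather}=\text{sunny})=0.6$, then independence gives \[P(\text{Cavity}=\text{true},\text{Weather}=\text{sunny})=0.2\times 0.6=0.12\]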

Lecture 23-2

COMP105 Lectures

Writing IO Code We can write our own IO actions: print_two :: String -> String -> IO () print_two s1 s2 = putStrLn (s1 ++ s2) > print_two "abc" "def" abcdef Note that the return type is IO (). Combining Multiple IO Calls The do syntax allows us to combine...

Read More
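The excerpt above defines print_two and mentions do syntax; below is a minimal, self-contained sketch of how the two combine. The greet action and its prompts are my own illustration, not necessarily the lecture's example.

```haskell
-- print_two as in the excerpt above.
print_two :: String -> String -> IO ()
print_two s1 s2 = putStrLn (s1 ++ s2)

-- greet is an illustrative action: do sequences several IO calls,
-- and <- binds the result of getLine to a name.
greet :: IO ()
greet = do
  putStrLn "What is your name?"
  name <- getLine
  print_two "Hello, " name

main :: IO ()
main = greet
```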

Lecture 23-1

COMP105 Lectures

So far, we have studied pure functional programming. Pure functions: Have no side effects. Always return a value. Are deterministic. All computation can be done in pure functional programming. IO Sometimes programs need to do non-pure things: Print something to the screen. Read or write a file. … IO vs....

Read More
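A minimal sketch of the pure/IO distinction described above (the function names are illustrative, not taken from the lecture): double has no side effects and always returns the same result for the same input, whereas readAndDouble performs input and so lives in the IO type.

```haskell
-- Pure: no side effects, deterministic, always returns a value.
double :: Int -> Int
double x = 2 * x

-- Impure: reads from standard input, so its result has type IO Int.
readAndDouble :: IO Int
readAndDouble = do
  line <- getLine
  return (double (read line))

main :: IO ()
main = readAndDouble >>= print
```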

Logic - 2

COMP109 Lectures

This lecture is very similar to COMP111’s truth values lecture. View that lecture for all truth tables. Truth Values Interpretations are a way of assigning truth values to propositions; these assignments may vary depending on the situation or the person answering. An interpretation $I$ is a function which assigns to any...

Read More
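A small example of the idea above: take an interpretation $I$ with $I(p)=1$ and $I(q)=0$. Then $I(p\wedge q)=0$ while $I(p\vee q)=1$; a different situation or person might instead give $I(q)=1$, under which both formulas evaluate to $1$.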

Logic - 1

COMP109 Lectures

This topic is very similar to the subjects covered in COMP111’s propositional logic. As a result, I will only be noting down significant differences. Logic is concerned with the truth or falsity of statements. The central question is when a statement follows from a set of statements. Propositional Logic Propositional logic...

Read More

Lecture 22-2

COMP105 Lectures

Trees [mermaid diagram: an unlabelled binary tree, each branch node with two children] A tree is composed of: Leaf nodes. Leaves have no children. Branch nodes. Has...

Read More
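A minimal Haskell sketch matching the tree description above (the constructor names Leaf and Branch and the countLeaves function are my own choices, not necessarily the lecture's definitions).

```haskell
-- A binary tree: a leaf holds a value and has no children,
-- a branch has exactly two subtrees.
data Tree a = Leaf a
            | Branch (Tree a) (Tree a)
            deriving Show

-- Count the leaves by recursing over the structure.
countLeaves :: Tree a -> Int
countLeaves (Leaf _)     = 1
countLeaves (Branch l r) = countLeaves l + countLeaves r

example :: Tree Int
example = Branch (Branch (Leaf 1) (Leaf 2)) (Leaf 3)
-- countLeaves example == 3
```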