UoL CS Notes


This website houses notes from my studies at the University of Liverpool. If you see any errors or issues, please do open an issue on this site's GitHub.

Models of Complex Networks

COMP324 Lectures

Binomial Model A random binomial model on $n$ nodes, denoted by the expression $\mathcal G_{n,\frac12}$, is a network on $n$ nodes in which each line exists, independently of all other lines, with probability $\frac12$. To generate one of these networks, we randomly fill an adjacency matrix with 1’s and...
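The generation process above can be sketched in Python: fill the upper triangle of an adjacency matrix at random, mirroring entries so the network stays undirected (the `seed` parameter is just for reproducibility):

```python
import random

def binomial_graph(n, p=0.5, seed=None):
    """Generate an adjacency matrix for G(n, p): each edge is
    present independently of all others with probability p."""
    rng = random.Random(seed)
    adj = [[0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1, n):          # undirected: fill upper triangle
            if rng.random() < p:
                adj[i][j] = adj[j][i] = 1  # mirror for symmetry
    return adj

adj = binomial_graph(5, seed=1)
for row in adj:
    print(row)
```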

Read More

Cryptography (Not Encryption)

COMP315 Lectures

Key Exchange We can use public key encryption to share a secret, such as a symmetric key. We can also use a pre-agreed key if we have a previous secure connection. Diffie-Hellman (DH) Key Exchange This is based on the assumption that the discrete logarithm modulo $n$ is not feasible to compute...
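A toy sketch of the DH exchange in Python; the prime and generator here are tiny illustrative values (real deployments use primes of 2048+ bits):

```python
import random

p = 2089          # public prime modulus (toy size, illustrative only)
g = 7             # public generator

a = random.randrange(2, p - 1)   # Alice's secret exponent
b = random.randrange(2, p - 1)   # Bob's secret exponent

A = pow(g, a, p)  # Alice transmits g^a mod p
B = pow(g, b, p)  # Bob transmits g^b mod p

# Both sides derive the same shared secret g^(ab) mod p without
# ever sending a or b over the wire.
shared_alice = pow(B, a, p)
shared_bob = pow(A, b, p)
assert shared_alice == shared_bob
```

An eavesdropper sees only $p$, $g$, $A$, and $B$; recovering $a$ or $b$ from those is the discrete logarithm problem the excerpt refers to.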

Read More

Symmetric Key Encryption & Quantum Computers

COMP315 Lectures

AES AES has the following pros: Very fast to compute. Key size is very small for equivalent security (256 bits vs 4096 for RSA). Due to its speed, AES is used under TLS for HTTPS. Public key encryption is often used for key sharing in symmetric encryption systems (DH for...

Read More

RSA Example

COMP315 Lectures

Computing Keys Take $p=7, q=11$. Therefore $n=pq=77$ and $m=(p-1)(q-1)=60$. Choose $e$ where $\text{gcd}(e, 60)=1$. $e=7$ works in this case. To compute $k=e^{-1}\mod 60$: $9\times7=63=3\mod60$. \[\begin{aligned} 1&=7-2\times3\\ &=7-2\times(9\times7)\\ &=(1-18)\times7\\ &=-17\times7\mod60 \end{aligned}\] Therefore $7^{-1}=-17=43\mod60$. Encryption Given that the public keys are $n=77,e=7$ and the private key is $k=43$, we wish to encrypt...
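The worked numbers above can be checked in a few lines of Python (the plaintext value here is an arbitrary choice; any $0 \le \text{msg} < n$ works):

```python
# Reproduces the worked example: p=7, q=11, n=77, m=60, e=7, k=43.
p, q = 7, 11
n = p * q                 # 77, part of the public key
m = (p - 1) * (q - 1)     # 60, used to find the private key
e = 7                     # public exponent, gcd(e, 60) = 1
k = pow(e, -1, m)         # modular inverse (Python 3.8+)
assert k == 43 and (e * k) % m == 1

msg = 9                   # arbitrary plaintext for illustration
cipher = pow(msg, e, n)   # encrypt with the public key (n, e)
assert pow(cipher, k, n) == msg   # decrypt with the private key k
```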

Read More

Social Network Case Studies

COMP324 Lectures

Acquaintances It is possible to derive acquaintances by analysis of the relation between people and events. If two people attend the same event it is likely that they know each other. We can use this information to create a new graph where individuals have connections when they attend a shared event....
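This projection from a people-event relation to a people-people graph can be sketched in Python; the attendance data below is made up for illustration:

```python
from itertools import combinations
from collections import defaultdict

# Hypothetical attendance data: event -> people who attended it.
attendance = {
    "party": ["alice", "bob", "carol"],
    "conference": ["bob", "dave"],
    "dinner": ["alice", "dave"],
}

# Project onto a graph of people: two people are linked whenever
# they attended at least one shared event; the count of shared
# events serves as an edge weight.
edges = defaultdict(int)
for people in attendance.values():
    for a, b in combinations(sorted(people), 2):
        edges[(a, b)] += 1

print(dict(edges))
```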

Read More

Multi-Layer Perceptron Training

ELEC320 Lectures

Backpropagation Activation Functions In multi-layer perceptrons we require activation functions with continuous gradients so that their derivatives are significant. Using a threshold would not be acceptable as it isn’t continuous. Both of the following functions create small derivatives when used in large networks. Leaky ReLU may be a better option...
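A small numeric comparison illustrates why gradients matter here: the sigmoid's derivative shrinks rapidly away from zero, while leaky ReLU's gradient never vanishes (the `alpha` slope below is a common but arbitrary choice):

```python
import math

def sigmoid(x):
    return 1 / (1 + math.exp(-x))

def sigmoid_grad(x):
    s = sigmoid(x)
    return s * (1 - s)        # peaks at 0.25, vanishes for large |x|

def leaky_relu_grad(x, alpha=0.01):
    return 1.0 if x > 0 else alpha   # gradient never reaches zero

for x in (0.0, 5.0):
    print(f"x={x}: sigmoid'={sigmoid_grad(x):.4f}, "
          f"leaky_relu'={leaky_relu_grad(x)}")
```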

Read More

Multi-Layer Perceptrons

ELEC320 Lectures

Backpropagation To update the weights for a multi-layer perceptron we need to complete the following steps: Feed Forward Backwards Pass The general formula to update the weights on a multi-layer perceptron is: \[\mathbf w_{(n+1)}=\mathbf w_n-\eta\nabla_\mathbf wE\] where: $\nabla_\mathbf wE$ is the gradient of $E$ with respect to $\mathbf w$. $\eta$ is the learning...
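The update rule above can be demonstrated on a toy one-dimensional error function, $E(w)=(w-3)^2$ with gradient $2(w-3)$ (both chosen here purely for illustration):

```python
# One-dimensional gradient descent, w_{n+1} = w_n - eta * grad_w(E),
# applied to the toy error E(w) = (w - 3)^2 with minimum at w = 3.
def grad_E(w):
    return 2 * (w - 3)

w = 0.0      # initial weight
eta = 0.1    # learning rate
for _ in range(50):
    w = w - eta * grad_E(w)   # the update rule from the formula

print(round(w, 4))  # converges towards the minimum at w = 3
```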

Read More

Message Passing Interface (MPI) Introduction

COMP328 Lectures

Types of Locks Deadlock Occurs when two or more tasks wait for each other and each will not resume until some action is taken. Livelock Occurs when the tasks involved in a deadlock take action to resolve the original deadlock but in such a way that there is no progress....
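A classic deadlock arises when two tasks acquire the same pair of locks in opposite orders and each then waits on the lock the other holds. One standard fix, sketched below in Python threads (the `sorted`-by-`id` ordering is an illustrative convention, not part of any MPI API), is a global lock order that rules out circular waiting:

```python
import threading

lock_a, lock_b = threading.Lock(), threading.Lock()

def transfer(first, second, label):
    # Acquire locks in a fixed global order so no circular wait,
    # and hence no deadlock, can occur.
    first, second = sorted((first, second), key=id)
    with first:
        with second:
            print(f"{label}: both locks held, doing work")

# Without the ordering, these two threads could deadlock, since
# they request the locks in opposite orders.
t1 = threading.Thread(target=transfer, args=(lock_a, lock_b, "task 1"))
t2 = threading.Thread(target=transfer, args=(lock_b, lock_a, "task 2"))
t1.start(); t2.start()
t1.join(); t2.join()
```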

Read More

Public Key Encryption (One Time Pad & RSA)

COMP315 Lectures

One-time Pads This method of encryption provides perfect secrecy provided that the following are met: The key should never be reused. The key needs to be at least as long as the message: If a longer key is used then the message should be padded. You have a perfect source...
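A minimal one-time pad is just XOR against a random key of the same length as the message; the sketch below uses Python's `secrets` module as the randomness source:

```python
import secrets

# One-time pad: XOR the message with a random key of equal length.
# The key must never be reused and must come from a good source
# of randomness.
message = b"attack at dawn"
key = secrets.token_bytes(len(message))   # key as long as the message

cipher = bytes(m ^ k for m, k in zip(message, key))
plain = bytes(c ^ k for c, k in zip(cipher, key))  # XOR again to decrypt
assert plain == message
```

Because XOR is its own inverse, the same operation both encrypts and decrypts; reusing the key for a second message breaks the perfect secrecy guarantee.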

Read More