Prove that if $a ~|~ c$ and $b ~|~ c$, with $\gcd(a, b) = 1$, then $a \cdot b ~|~ c$.
Corollary: If $a ~|~ c$ and $b ~|~ c$, with $\gcd(a, b) = 1$, then $a \cdot b ~|~ c$.
Proof: Suppose that $a ~|~ c$ and $b ~|~ c$. Then there exist integers $r$ and $s$ such that $c = a \cdot r = b \cdot s$. Since $\gcd(a, b) = 1$, by Bézout's identity there exist integers $x$ and $y$ such that $a \cdot x + b \cdot y = 1$. Multiplying this equation by $c$, we get $c = c \cdot 1 = c~(a \cdot x + b \cdot y) = a \cdot c \cdot x + b \cdot c \cdot y$. Substituting $c = b \cdot s$ into the first term and $c = a \cdot r$ into the second, we get $c = a~(b \cdot s)~x + b~(a \cdot r)~y = a \cdot b~(s \cdot x + r \cdot y)$. Since $s \cdot x + r \cdot y$ is an integer, we conclude that $a \cdot b ~|~ c$.
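The argument can also be checked mechanically. Below is a minimal Lean 4 sketch (assuming Mathlib is available for the `ring` tactic) that mirrors the proof step by step: the hypotheses supply the divisibility witnesses $r$, $s$ and a Bézout pair $x$, $y$, and the witness exhibited for $a \cdot b ~|~ c$ is exactly $s \cdot x + r \cdot y$. The hypothesis names `hr`, `hs`, `hxy` are illustrative choices, not taken from the text.

```lean
import Mathlib.Tactic.Ring

-- Mirrors the proof above: given witnesses r, s with c = a*r = b*s
-- and a Bézout pair x, y with a*x + b*y = 1, the integer s*x + r*y
-- witnesses a*b ∣ c.
example (a b c r s x y : ℤ)
    (hr : c = a * r) (hs : c = b * s)
    (hxy : a * x + b * y = 1) : a * b ∣ c := by
  refine ⟨s * x + r * y, ?_⟩
  calc c = c * 1 := (mul_one c).symm
    _ = c * (a * x + b * y) := by rw [hxy]            -- multiply by the Bézout identity
    _ = a * c * x + b * c * y := by ring              -- distribute
    _ = a * (b * s) * x + b * (a * r) * y := by rw [← hs, ← hr]  -- substitute the two values of c
    _ = a * b * (s * x + r * y) := by ring            -- factor out a*b
```

Mathlib also provides this result directly (for instance via `IsCoprime.mul_dvd`, where `IsCoprime a b` is precisely the Bézout condition used here), but the manual calculation above follows the written proof more closely.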