Define the greatest common divisor, and prove that if $a$ and $b$ are any integers, not both zero, then there exist integers $x$ and $y$ such that $gcd(a, b) = a \cdot x + b \cdot y$.
Definition: Let $a$ and $b$ be any integers, at least one of which is nonzero. The greatest common divisor of $a$ and $b$, denoted by $gcd(a, b)$, is the positive integer $d$ satisfying the following:
$1.$ $d ~|~ a$ and $d ~|~ b$.
$2.$ If $c ~|~ a$ and $c ~|~ b$, then $c \leq d$.
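As a sketch of the definition (not part of the original text), the following Python snippet computes $gcd(a, b)$ directly from the two conditions above, by taking the largest positive integer that divides both $a$ and $b$; the function name `gcd_by_definition` is mine, chosen for illustration.

```python
def gcd_by_definition(a, b):
    """Return the greatest common divisor of a and b per the definition:
    the largest positive d with d | a and d | b (a, b not both zero)."""
    if a == 0 and b == 0:
        raise ValueError("gcd(0, 0) is undefined")
    # No common divisor can exceed max(|a|, |b|) when at least one is nonzero.
    bound = max(abs(a), abs(b))
    # Condition 1: d | a and d | b.  Condition 2: take the largest such d.
    return max(d for d in range(1, bound + 1) if a % d == 0 and b % d == 0)

print(gcd_by_definition(12, 18))  # 6
```

This brute-force search is inefficient but mirrors the definition exactly; the Euclidean algorithm is the practical method.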
Theorem: If $a$ and $b$ are any integers, not both zero, then there exist integers $x$ and $y$ such that $gcd(a, b) = a \cdot x + b \cdot y$.
Proof: Let $a$ and $b$ be any integers, not both zero. Consider the set $S = \{a \cdot u + b \cdot v \;|\; a \cdot u + b \cdot v > 0$ and $ u, v \in \mathbb{Z}\}$; that is, $S$ is the set of all positive linear combinations of $a$ and $b$. First, we show that $S$ is nonempty. Since not both of $a$ and $b$ are zero, we may assume without loss of generality that $a \not = 0$. Then the integer $|a| = a \cdot u + b \cdot 0 \in S$, where $u = 1$ when $a$ is positive and $u = -1$ when $a$ is negative. Since $S$ is a nonempty set of positive integers, by the Well-Ordering Principle, $S$ must contain a least element $d$. As $d \in S$, there exist integers $x$ and $y$ such that $d = a \cdot x + b \cdot y$. We claim that $d = gcd(a, b)$. Applying the Division Algorithm to the integers $a$ and $d$, there exist integers $q$ and $r$ such that $a = d \cdot q + r$, where $0 \leq r < d$. We can write $r$ in the form $r = a - q \cdot d = a - q~(a \cdot x + b \cdot y) = a~(1 - q \cdot x) + b~(-(q \cdot y))$.
If $r > 0$, then this representation implies that $r \in S$, contradicting the fact that $d$ is the least element of $S$ (recall that $r < d$). Therefore $r = 0$, and so $a = q \cdot d$; that is, $d ~|~ a$. In a similar way, we can show that $d ~|~ b$. Hence $d$ is a common divisor of $a$ and $b$. Now let $c$ be an arbitrary positive common divisor of the integers $a$ and $b$. By part $7.$ of Theorem , $c ~|~ (a \cdot x + b \cdot y)$; that is, $c ~|~ d$. By part $6.$ of Theorem, we have $c = |c| \leq |d| = d$. Therefore $d$ is greater than or equal to every positive common divisor of $a$ and $b$. By the definition of the greatest common divisor, we conclude that $d = gcd(a, b)$.
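The proof above is existential: the Well-Ordering Principle guarantees that the coefficients $x$ and $y$ exist without producing them. As a constructive counterpart (an illustration, not part of the original text), the extended Euclidean algorithm computes such a pair explicitly; the function name `extended_gcd` is mine.

```python
def extended_gcd(a, b):
    """Return (d, x, y) with d = gcd(a, b) and d == a*x + b*y."""
    # Maintain the invariants: old_r == a*old_x + b*old_y and r == a*x + b*y.
    old_r, r = a, b
    old_x, x = 1, 0
    old_y, y = 0, 1
    while r != 0:
        q = old_r // r
        old_r, r = r, old_r - q * r
        old_x, x = x, old_x - q * x
        old_y, y = y, old_y - q * y
    # The gcd is defined to be positive; flip signs if needed.
    if old_r < 0:
        old_r, old_x, old_y = -old_r, -old_x, -old_y
    return old_r, old_x, old_y

d, x, y = extended_gcd(252, 198)
print(d, x, y)  # d = 18, and 252*x + 198*y == 18
```

Each loop iteration performs one step of the Division Algorithm, exactly the tool used in the proof, while carrying the linear-combination coefficients along.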