The Division Algorithm

State and prove the Division Algorithm.

Theorem: Given integers $a$ and $b$, with $b > 0$, there exist unique integers $q$ and $r$ such that $a = b \cdot q + r$, where $0 \leq r < b$. The integer $q$ is called the $\textbf{quotient}$ and the integer $r$ is called the $\textbf{remainder}$.

Proof: First, we show that the set $S = \{a - x \cdot b \;|\; x \in \mathbb{Z}$ and $a - x \cdot b \geq 0\}$ is nonempty. For this, we must find a value of $x$ such that $a - x \cdot b \geq 0$. Since $b$ is an integer with $b > 0$, we have $b \geq 1$. Multiplying $b \geq 1$ by $|a|$ gives $|a| \cdot b \geq |a|$, and so $a - (-|a|) \cdot b = a + |a| \cdot b \geq a + |a| \geq 0$.

If we take $x = -|a|$, then $a - x \cdot b \in S$. This shows that $S$ is a nonempty subset of the nonnegative integers. By the Well-Ordering Principle (for more details, see the post on the Well-Ordering Principle), the set $S$ contains a least element; call it $r$. Since $r \in S$, there exists an integer $q$ such that

$r = a - q \cdot b$ with $0 \leq r$. We claim that

$r < b$. Suppose, on the contrary, that $r \not< b$, i.e., $r \geq b$. Then

$a - (q + 1) \cdot  b = (a - q \cdot b) - b = r - b \geq 0$.

This implies that the integer $a - (q + 1) \cdot b \in S$. But $a - (q + 1) \cdot b = r - b < r$, contradicting the choice of $r$ as the least element of $S$. Hence, $r < b$.
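The existence argument above is constructive: starting from the element $a - (-|a|) \cdot b$ of $S$ and repeatedly subtracting $b$ descends to the least element $r$. A minimal Python sketch of this process (the function name `division_algorithm` is my own, chosen for illustration):

```python
def division_algorithm(a, b):
    """Return (q, r) with a == b*q + r and 0 <= r < b, mirroring the proof:
    start from the element a - (-|a|)*b of S, then subtract b until the
    remainder is the least nonnegative value of the form a - x*b."""
    assert b > 0
    q = -abs(a)        # taking x = -|a| guarantees a - x*b >= 0
    r = a - q * b      # an element of S
    while r >= b:      # if r >= b, then r - b is a smaller element of S
        r -= b
        q += 1
    return q, r

print(division_algorithm(-7, 3))  # (-3, 2), since -7 = 3*(-3) + 2
```

Note that for a negative dividend such as $a = -7$, the remainder is still nonnegative, exactly as the theorem requires.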

Now we show the uniqueness of $q$ and $r$. Suppose that $a$ has

two representations of the desired form, say $a = b \cdot q + r$ with $0 \leq r < b$, and $a = b \cdot q' + r'$ with $0 \leq r' < b$. Subtracting the first representation from the second gives $b \cdot q' + r' - b \cdot q - r = 0$, that is, $b \cdot (q' - q) + (r' - r) = 0$. This gives $r' - r = b \cdot (q - q')$. Taking absolute values on both sides (and using $b > 0$), we get $|r' - r| = b \cdot |q - q'|$.

Multiplying the inequality $0 \leq r < b$ by $-1$ gives $-b < -r \leq 0$. Adding this to the inequality $0 \leq r' < b$ yields $-b < r' - r < b$, that is, $|r' - r| < b$. Thus $b \cdot |q - q'| < b$, which gives $0 \leq |q - q'| < 1$.

Since $|q - q'|$ is a nonnegative integer, the only possibility is $|q - q'| = 0$. Hence $q = q'$, and then $r' - r = b \cdot (q - q') = 0$ gives $r = r'$.
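The $(q, r)$ of the theorem can be checked numerically: in Python, `divmod(a, b)` with $b > 0$ returns exactly this pair, because floor division rounds toward negative infinity, which keeps the remainder in $[0, b)$ even when $a < 0$. A quick sanity check:

```python
# For b > 0, Python's divmod(a, b) returns the (q, r) of the theorem:
# a == b*q + r with 0 <= r < b, for positive, negative, and zero a.
for a in (17, -17, 0, 3):
    q, r = divmod(a, 5)
    assert a == 5 * q + r and 0 <= r < 5

print(divmod(-17, 5))  # (-4, 3), since -17 = 5*(-4) + 3
```

(Caution: languages whose `%` truncates toward zero, such as C, can return a negative remainder for negative $a$, which does not match the theorem's normalization.)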




