\chapter{Preliminaries}
\label{cha:preliminaries}
We assume the reader to be familiar with the content of the course \emph{MATH-215, Rings and Fields}. In the following, we briefly review some of the most important notions from this course.
Throughout this course, we assume every ring $(R,+,⋅)$ to be commutative with \emph{unity} $1_R \neq 0_R$. A \emph{zero divisor} is an element $a≠0$ such that there exists an element $b≠0$ with $a⋅b = 0$. A ring without zero divisors is an \emph{integral domain}. The \emph{characteristic} of $R$ is the order of $1$ in the additive group $(R,+)$, if this order is finite. If this order is infinite, then the characteristic of $R$ is zero. The characteristic of an integral domain is either zero or a prime number.
\begin{quote}
\small Indeed, if the order of $1$ is $m ≥1$ and if $m = p⋅q$ with natural numbers $p,q >1$, then $ p ⋅1≠ 0$ and $q ⋅1≠0$ but $ (p ⋅1 ) (q ⋅1) = (p⋅q) 1 = 0$ contradicting the assumption that $R$ does not contain zero divisors.
\end{quote}
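These notions can be illustrated in the rings $ℤ/n$. The following small Python sketch (helper names are ours, for illustration only) computes the characteristic of $ℤ/n$ as the additive order of $1$ and lists the zero divisors:

```python
def characteristic(n):
    """Additive order of 1 in the ring Z/n (n >= 2), i.e. its characteristic."""
    k, s = 1, 1 % n
    while s != 0:
        s = (s + 1) % n
        k += 1
    return k

def zero_divisors(n):
    """All nonzero a in Z/n such that a*b = 0 for some nonzero b."""
    return [a for a in range(1, n) if any(a * b % n == 0 for b in range(1, n))]
```

For $n = 12 = 3⋅4$ one finds characteristic $12$ and zero divisors such as $3$ and $4$; for the prime $7$ the list is empty, in line with the argument in the quote above.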
An element $u ∈ R$ is a \emph{unit}, if there exists an element $u^{-1} ∈R$ with $u ⋅ u^{-1} = 1$.
A \emph{field} is a ring $R$ in which $(R \setminus \{0\}, ⋅)$ is a group, or in other words, in which each element of $R$ apart from $0$ is a unit. The \emph{field of quotients} of an integral domain is the set of equivalence classes of $X = \{ (r,u) : r,u ∈R, u ≠0\}$ under the equivalence relation
$ (r,u) ≡ (r',u') \quad \text{ if and only if } r ⋅ u' = r' ⋅ u$, together with the obvious well defined operations $+, ⋅$.
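For $R = ℤ$ this construction yields $ℚ$, and Python's `fractions.Fraction` realizes it directly: a pair $(r,u)$ is stored as the reduced representative of its equivalence class. A quick illustration:

```python
from fractions import Fraction

# A pair (r, u) with u != 0 represents the class of r/u; Fraction stores the
# reduced canonical representative, so equality of classes is a plain ==.
assert Fraction(2, 4) == Fraction(1, 2)   # (2,4) ≡ (1,2) since 2*2 == 4*1
assert Fraction(1, 2) + Fraction(1, 3) == Fraction(5, 6)
assert Fraction(1, 2) * Fraction(2, 3) == Fraction(1, 3)
```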
\section{Homomorphisms and ideals}
\label{sec:homomorphisms-ideals}
If $R_1$ and $R_2$ are rings, a map $θ: R_1 → R_2$ is a \emph{ring homomorphism} if for all $r,s ∈R_1$:
\begin{enumerate}
\item $θ(r+s) = θ(r) + θ(s)$
\item $θ(rs) = θ(r) θ(s)$
\item $θ(1_{R_1}) = 1_{R_2} $
\end{enumerate}
If $θ$ is a bijection, then it is a \emph{(ring) isomorphism} and we write $R_1 ≅ R_2$. If in addition $R_1 = R_2$, then $θ$ is called an \emph{automorphism}.
If $A$ is an additive subgroup of $R$, then $R$ is partitioned into classes
\begin{displaymath}
r + A = \{ r+a : a ∈A\}, \quad \text{ where } r ∈R.
\end{displaymath}
Since $(R,+)$ is abelian, $A$ is a normal subgroup which makes the addition operation on classes
\begin{equation}\label{eq:1}
(r + A ) + (s + A) = (r+s) + A
\end{equation}
well defined. We would like to have a well defined multiplication as well.
An additive subgroup $A$ of $R$ is an \emph{ideal} if $ra ∈ A$ for each $r∈R$ and $a ∈A$.
\begin{quote}
\small We repeat the argument that shows why the multiplication $(r+A)(s+A) = rs + A$ on classes is well defined if and only if the additive subgroup $A$ is an ideal of $R$. If $A$ is an ideal, $r+A = r'+A$ and $s +A = s'+A$, then we need to show that $rs - r's' ∈ A$. But
\begin{eqnarray*}
rs - r's' & = & rs - rs' + rs' - r's' \\
& = & r(s-s') + (r - r') s'
\end{eqnarray*}
and this is an element of $A$ since both $s-s' ∈A$ and $r-r' ∈A$, as the corresponding cosets are equal.
Suppose next that the multiplication is well defined and let $r ∈R$ and $a ∈A$. Then $a+A = 0+A$ and thus the coset $ra +A$ is the coset $r⋅0 +A$ which is equivalent to $ra ∈A$. \qed
\end{quote}
We have the following theorem.
\begin{theorem}
\label{thr:1}
Let $A$ be an ideal of the ring $R$. The additive factor group $R / A$ is, together with the multiplication $(r+A ) (s+A) = rs+A$, a ring with unity $1+A$. This ring $R/A$ is called the \emph{factor ring} of $R$ by $A$.
\end{theorem}
\begin{proof}
Since $(R,+)$ is abelian, $R/A$ is an abelian group. The multiplication is well defined and it is easy to see that $1 +A$ is the unity. The verification that the multiplication is associative and that the distributive laws hold is left as an exercise.
\end{proof}
The kernel of a homomorphism $θ: R_1 → R_2$, $\ker(θ) = \{ r ∈ R_1 :θ(r) = 0\}$, is an ideal of $R_1$. If we denote the image of $θ$ by $θ(R_1) = \{ θ(r) :r ∈R_1\} ⊆ R_2$, then we can state the isomorphism theorem.
\begin{theorem}[Isomorphism theorem]
\label{thr:2}
Let $θ:R_1 →R_2$ be a ring-homomorphism and $A = \ker(θ)$. Then $θ$ induces the ring isomorphism $φ:R_1 / A → θ(R_1)$ with $φ(r +A) = θ(r)$.
\end{theorem}
An ideal $P$ of a ring $R$ is called a \emph{prime ideal} if $P ≠R$ and $P$ has the following property: if $rs ∈P$, then $r ∈P$ or $s ∈P$. One immediately has the following theorem.
\begin{theorem}
\label{thr:3}
An ideal $P ≠R$ of a ring $R$ is a prime ideal if and only if $R/P$ is an integral domain.
\end{theorem}
An ideal $M$ of $R$ is called a \emph{maximal ideal} if $M ≠R$ and the following property holds: an ideal $A$ of $R$ with $M ⊆ A ⊆R$ is either equal to $M$ or equal to $R$. One immediately has the following theorem.
\begin{theorem}
\label{thr:4}
Let $M$ be an ideal of $R$. Then $M$ is maximal if and only if $R/M$ is a field.
\end{theorem}
\begin{proof}
\small Indeed, if $M$ is maximal and $r ∉ M$, then $Rr + M$ is also an ideal that contains $M$ but is not equal to $M$ and thus is equal to $R$ and thus contains $1$. This means that there exists $x ∈R$ and $m ∈M$ with $xr + m =1$ which shows that $x+M$ is the multiplicative inverse of $r+M$. If on the other hand $R/M$ is not a field, then there exists a non-invertible element $r + M ≠ 0+M$ of $R/M$. The ideal $Rr + M$ contains $M$ strictly, but is not equal to $R$, which contradicts the fact that $M$ is maximal.
\end{proof}
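For $R = ℤ$ and $M = 〈p〉$ with $p$ prime, the theorem says that $ℤ/〈p〉$ is a field. A quick numerical illustration (Python; the three-argument `pow` with exponent $-1$, available from Python 3.8, computes the modular inverse):

```python
# M = <p> with p prime is a maximal ideal of Z, so Z/<p> is a field:
# every nonzero class r + M has a multiplicative inverse.
p = 7
assert all(r * pow(r, -1, p) % p == 1 for r in range(1, p))

# For a composite modulus, <12> is not maximal and Z/12 is not a field:
# the class of 2 has no inverse modulo 12.
assert not any(2 * b % 12 == 1 for b in range(12))
```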
\begin{theorem}[Chinese Remainder Theorem]
\label{thr:8}
Let $A$ and $B$ be ideals of a ring $R$ such that $A+B = R$ holds. Then
\begin{displaymath}
R/(A ∩B) ≅ R/A × R/B.
\end{displaymath}
\end{theorem}
{ \small \noindent Clearly, $A∩B$ is an ideal of $R$.
The proof is by the isomorphism theorem~\ref{thr:2}: the kernel of $φ: R → R/A × R/B$ with $φ(r) = (r+A, r+B)$ is $A ∩B$, and $φ$ is onto. In fact, let $1 = a+b$ with $a ∈A$ and $b∈B$; then $(x+A,y+B)$ is the image of $x⋅b + y⋅a$.
}
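For $R = ℤ$, $A = 〈m〉$ and $B = 〈n〉$ with $\gcd(m,n) = 1$, one has $A + B = R$ and $A ∩ B = 〈mn〉$, and the recipe $x⋅b + y⋅a$ from the proof of Theorem~\ref{thr:8} becomes an algorithm. An illustrative sketch (function names are ours):

```python
from math import gcd

def bezout(m, n):
    """Extended Euclidean algorithm: return (s, t) with s*m + t*n == gcd(m, n)."""
    old_r, r = m, n
    old_s, s = 1, 0
    old_t, t = 0, 1
    while r != 0:
        q = old_r // r
        old_r, r = r, old_r - q * r
        old_s, s = s, old_s - q * s
        old_t, t = t, old_t - q * t
    return old_s, old_t

def crt(x, m, y, n):
    """For gcd(m, n) == 1, the unique z mod m*n with z = x (mod m), z = y (mod n).
    Following the proof: write 1 = a + b with a in <m>, b in <n>, take x*b + y*a."""
    assert gcd(m, n) == 1
    s, t = bezout(m, n)
    a, b = s * m, t * n      # a in <m>, b in <n>, and a + b == 1
    return (x * b + y * a) % (m * n)

z = crt(2, 3, 3, 5)          # z = 2 (mod 3) and z = 3 (mod 5)
assert z == 8
```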
\subsection*{Exercises}
\begin{enumerate}
\item Let $θ: R_1 → R_2$ be a ring homomorphism. Show that $\ker(θ) = \{ r ∈ R_1 :θ(r) = 0 \}$ is an ideal of $R_1$.
\item Let $R$ be an integral domain and consider the construction of the field of quotients as described above. We denote the equivalence class of $(r,u)$ with $u ≠0$ by $\frac{r}{u}$. Show that addition and multiplication
\begin{displaymath}
\frac{r}{u} + \frac{x}{y} = \frac{ry + ux}{uy} \quad \text{ and } \quad \frac{r}{u} ⋅ \frac{x}{y} = \frac{rx}{uy}
\end{displaymath}
where $r,u,x,y ∈R$ and $uy ≠0$, are well defined.
\item Complete the proof of Theorem~\ref{thr:1} by showing that the multiplication defined on $R/A$ is associative and that the distributive laws hold.
\end{enumerate}
\section{Polynomials}
\label{sec:polynomials}
The ring of polynomials with coefficients in $R$ is denoted by $R[x]$. If $f(x) ∈ R[x]$ is nonzero, then the highest exponent of $x$ that has nonzero coefficient is the \emph{degree} of $f(x)$. This coefficient is the \emph{leading coefficient} of $f(x)$. The polynomial $f(x)$ is called \emph{monic} if this leading coefficient is $1$. The degree of $0$ is not defined. We recall division with remainder.
\begin{theorem}
\label{thr:5}
Let $R$ be a ring and $f(x),g(x)∈R[x]$, $g(x) ≠0$ be polynomials such that the leading coefficient of $g$ is a unit in $R$. Then there exist uniquely determined polynomials $q(x)$ and $r(x)$ such that
\begin{enumerate}[i)]
\item $f(x) = q(x) g(x) + r(x)$, and
\item either $r(x) = 0$ or $\deg r(x) < \deg g(x)$.
\end{enumerate}
\end{theorem}
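The division of Theorem~\ref{thr:5} can be carried out by repeatedly cancelling the leading term. A sketch over $ℚ$ (coefficient lists with the constant term first; the helper name `polydiv` is ours):

```python
from fractions import Fraction

def polydiv(f, g):
    """Divide f by g over Q; coefficient lists, constant term first.
    Returns (q, r) with f = q*g + r and (r == [] or deg r < deg g)."""
    f = [Fraction(c) for c in f]
    g = [Fraction(c) for c in g]
    while g and g[-1] == 0:
        g.pop()
    q = [Fraction(0)] * max(len(f) - len(g) + 1, 0)
    while True:
        while f and f[-1] == 0:
            f.pop()
        if len(f) < len(g):
            break
        d = len(f) - len(g)
        c = f[-1] / g[-1]          # leading coefficient of g must be a unit
        q[d] = c
        for i, gi in enumerate(g):
            f[i + d] -= c * gi     # cancel the current leading term
    return q, f

q, r = polydiv([1, 0, 0, 1], [1, 1])   # divide x^3 + 1 by x + 1
assert (q, r) == ([1, -1, 1], [])      # quotient x^2 - x + 1, remainder 0
```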
%
In particular, division with remainder implies the so-called \emph{factor theorem}: for $f(x) ∈ R[x]$ and $a ∈R$, if $f(a) = 0$, then there exists $q(x) ∈ R[x]$ with
\begin{displaymath}
f(x) = (x - a) q(x).
\end{displaymath}
In this case, $a$ is a \emph{root} of $f(x)$. The \emph{multiplicity} of the root $a$ is the largest $m≥1$ such that $f(x) = (x-a)^m \, g(x)$ for some $g(x) ∈ R[x]$.
In the following, let $F$ denote a field. A nonzero polynomial $p(x) ∈ F[x]$ is called an \emph{irreducible polynomial}, if
\begin{enumerate}[(1)]
\item $\deg p(x) ≥1$, and
\item if $p(x) = f(x) g(x)$ with $f(x),g(x) ∈ F[x]$, then either $f(x) ∈ F$ or $g(x) ∈F$.
\end{enumerate}
In particular, irreducible polynomials of degree at least two have no root in $F$.
The \emph{content} of a polynomial $f(x) = a_0 + a_1 x + \cdots + a_n x^n ∈ ℤ[x]$ with integer coefficients is defined as $\cont(f) = \gcd(a_0,\dots,a_n)$.
\begin{theorem}[Gauss's Lemma]
Let $f(x),g(x), h(x) ∈ℤ[x]$ be nonzero polynomials with $f(x) = g(x) h(x)$, then
\begin{displaymath}
\cont (f) = \cont (g) \cont (h).
\end{displaymath}
\end{theorem}
\begin{proof}[Sketch of proof]
\small It suffices to show that $\cont(f) = 1$ in the case $\cont(g) = \cont(h) =1$. Suppose this is not so and let $p$ be a prime dividing each coefficient of $f$. Write $g(x) = a_0 + a_1 x + \dots$ and $h(x) = b_0 + b_1 x + \dots$ and let $i,j$ be the smallest indices such that $p$ does not divide $a_i$ and $p$ does not divide $b_j$ respectively. The coefficient of $x^{i+j}$ in $f(x)$ is $a_ib_j + a_{i-1}b_{j+1} + a_{i-2}b_{j+2}+ \dots + a_{i+1}b_{j-1} + a_{i+2} b_{j-2}+ \dots$. Since $p$ divides this coefficient as well as every summand except possibly $a_ib_j$, it follows that $p$ divides $a_ib_j$, hence $p$ divides $a_i$ or $p$ divides $b_j$, a contradiction.
\end{proof}
The consequence is that a non-constant polynomial $f(x) ∈ ℤ[x]$ is irreducible in $ℚ[x]$ if and only if it cannot be written as $f(x) = g(x) h(x)$ where $g(x),h(x) ∈ ℤ[x]$ are non-constant polynomials.
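Gauss's Lemma is easy to check experimentally; the following sketch (with hypothetical helpers `content` and `polymul`, coefficient lists with the constant term first) verifies it for one product of integer polynomials:

```python
from math import gcd
from functools import reduce

def content(f):
    """gcd of the integer coefficients of a nonzero polynomial f."""
    return reduce(gcd, (abs(c) for c in f))

def polymul(g, h):
    """Product of two polynomials given as coefficient lists."""
    f = [0] * (len(g) + len(h) - 1)
    for i, gi in enumerate(g):
        for j, hj in enumerate(h):
            f[i + j] += gi * hj
    return f

g = [2, 4, 6]    # 2 + 4x + 6x^2, content 2
h = [3, 9]       # 3 + 9x,        content 3
f = polymul(g, h)
assert content(f) == content(g) * content(h)   # Gauss: cont(f) = 2 * 3 = 6
```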
\begin{theorem}[Eisenstein criterion]
Let $f(x) = a_0 + a_1 x + \dots + a_n x^n ∈ ℤ[x]$ be a polynomial with integral coefficients, where $n≥1$. If there exists a prime number $p$ such that
\begin{enumerate}[(1)]
\item $p$ divides $a_0,\dots,a_{n-1}$,
\item $p$ does not divide $a_n$, and
\item $p^2$ does not divide $a_0$,
\end{enumerate}
then $f(x)$ is irreducible in $ℚ[x]$.
\end{theorem}
\begin{proof}[Sketch of proof]
Suppose $f(x)$ is not irreducible. Then, by the consequence of Gauss's Lemma above, there exist non-constant polynomials $g(x) = b_0+ b_1x + \dots + b_rx^r ∈ℤ[x]$ and $h(x) = c_0+ c_1x+ \dots + c_sx^s ∈ℤ[x]$ such that $f(x) = g(x) h(x)$. Since $p$ divides $a_0 = b_0 c_0$ but $p^2$ does not, the prime $p$ divides exactly one of $b_0$ and $c_0$; assume it divides $c_0$. Since $p$ does not divide $a_n = b_r c_s$, there is a smallest index $0<i≤ s$ such that $p$ does not divide $c_i$. The coefficient of $x^i$ in $f(x)$ is $b_0c_i + b_1c_{i-1}+ \dots$ and, as $i ≤ s < n$, it is divisible by $p$. Every summand except $b_0c_i$ is divisible by $p$, hence $p$ divides $b_0c_i$. Since $p$ does not divide $b_0$, this implies that $p$ divides $c_i$, a contradiction.
\end{proof}
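The three Eisenstein conditions translate directly into code; a small checker (the function name is ours, coefficient lists with the constant term first):

```python
def eisenstein_applies(coeffs, p):
    """Check the Eisenstein conditions for f = a_0 + a_1 x + ... + a_n x^n
    and the prime p."""
    a0, an, lower = coeffs[0], coeffs[-1], coeffs[:-1]
    return (all(a % p == 0 for a in lower)     # p | a_0, ..., a_{n-1}
            and an % p != 0                    # p does not divide a_n
            and a0 % (p * p) != 0)             # p^2 does not divide a_0

# x^4 + 10x + 5 is irreducible over Q by Eisenstein with p = 5:
assert eisenstein_applies([5, 10, 0, 0, 1], 5)
# x^2 + 1 is irreducible as well, but no prime satisfies the conditions:
assert not any(eisenstein_applies([1, 0, 1], p) for p in (2, 3, 5, 7))
```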
The Eisenstein criterion is sufficient but not necessary for irreducibility. How can we rigorously test whether $f(x) = a_0 + a_1 x + \dots + a_n x^n ∈ ℤ[x]$ is irreducible? One way to do so is to derive bounds on the coefficients of a possible factor, see~\cite{mignotte1974inequality}.
We define the \emph{norm} of a complex polynomial $f(x) = a_0+a_1x+ \cdots+a_nx^n ∈ ℂ[x]$ by $\|f\|_2 = \sqrt{∑_{i=0}^n |a_i|^2}$ where $|⋅|$ denotes the usual norm in $ℂ$. By the fundamental theorem of algebra, we can write
\begin{displaymath}
f(x) = a_n ∏_{i=1}^n (x-z_i),
\end{displaymath}
where the $z_i$ are the roots of $f(x)$. We define the \emph{measure} of $f(x)≠0$ as
\begin{displaymath}
M(f) = |a_n| ∏_{i=1}^n \max\{1,|z_i|\},
\end{displaymath}
and $M(0) = 0$.
Clearly, the measure is multiplicative, i.e., if $f(x),g(x) ∈ℂ[x]$ are nonzero, then
\begin{displaymath}
M(f(x)g(x)) = M(f(x)) \, M (g(x)).
\end{displaymath}
The next assertion bounds the norm of $f(x)$ by the measure $M(f)$ from above. To this end, we also introduce the $1$-norm of a polynomial $f(x) = a_0+ \cdots+a_nx^n∈ ℂ[x]$ as
\begin{displaymath}
\|f\|_1 = ∑_{i=0}^n |a_i|
\end{displaymath}
and recall $\|f(x)\|_2 ≤ \|f(x)\|_1$.
\begin{theorem}
\label{thr:6}
Let $f(x) =a_0 + \cdots + a_n x^n∈ ℂ[x]$, then $\|f(x)\|_1 ≤ 2^n M(f(x))$.
\end{theorem}
\begin{proof}
Let us denote the roots of $f(x)$ by $z_1,\dots, z_n$. Then
\begin{displaymath}
f(x) = a_n ∏_{i=1}^n (x-z_i).
\end{displaymath}
The $i$-th coefficient of $f(x)$ is
\begin{displaymath}
a_i = (-1)^{n-i} a_n ∑_{\substack{S ⊆ \{1,\dots,n\} \\ |S| = n-i}} ∏_{j ∈S} z_j.
\end{displaymath}
This implies
\begin{displaymath}
|a_i| ≤ \binom{n}{n-i} M(f(x))
\end{displaymath}
and thus
\begin{displaymath}
∑_{i=0}^n |a_i| ≤ 2^n M(f(x))
\end{displaymath}
follows.
\end{proof}
%
\begin{lemma}
\label{lem:1}
For $f(x) ∈ℂ[x]$ and $z ∈ℂ$, one has
\begin{displaymath}
\|(x-z) f(x)\|_2 = \|(\overline{z}x-1) f(x)\|_2.
\end{displaymath}
\end{lemma}
\begin{proof}
If we define $a_{-1}=0$ and $a_{n+1} = 0$, then
\begin{eqnarray*}
\|(x-z) f(x)\|_2^2 & = & ∑_{i=0}^{n+1} | a_{i-1} - z a_i|^2 \\
& = & ∑_{i=0}^{n+1} (a_{i-1} - z a_i) (\con{a_{i-1}} - \con{z} \con{ a_i}) \\
& = & (1+ |z|^2 ) \|f(x)\|_2^2 - ∑_{i=0}^{n+1} \left( \con{z} a_{i-1} \con{a_{i}} + z \con{a_{i-1}} a_i \right).
\end{eqnarray*}
%
Similarly, we develop
\begin{eqnarray*}
\|(\con{z}x-1) f(x)\|_2^2 & = & ∑_{i=0}^{n+1} | \con{z} a_{i-1} - a_i|^2 \\
& = & ∑_{i=0}^{n+1} (\con{z} a_{i-1} - a_i) (z \con{a_{i-1}} - \con{ a_i}) \\
& = & (1+ |z|^2 ) \|f(x)\|_2^2 - ∑_{i=0}^{n+1} \left( \con{z} a_{i-1}\con{a_{i}} + {z} \con{a_{i-1}} a_i \right).
\end{eqnarray*}
The two right-hand sides coincide, which proves the claim.
\end{proof}
%
We next want to bound the norm by the measure from below.
%
\begin{theorem}[Landau's inequality]
Let $f(x) ∈ℂ[x]$, then $M(f) ≤ \|f(x)\|_2$.
\end{theorem}
\begin{proof}
This clearly holds for the zero polynomial. Thus, let $f(x)≠0$ and suppose the roots $z_i$ are ordered so that $|z_1|,\dots,|z_k| >1$ and $|z_{k+1}|,\dots,|z_n| ≤1$; then $M(f) = |a_n ⋅ z_1 \cdots z_k|$. We define the polynomial
\begin{displaymath}
g(x) = a_n ∏_{j=1}^k (\con{z_j}x-1) ∏_{i=k+1}^n(x-z_i) = g_n x^n + \cdots + g_0 ∈ ℂ[x].
\end{displaymath}
One has
\begin{eqnarray*}
M(f)^2 &=& |g_n|^2 \\
& ≤& \|g(x)\|_2^2 \\
& = & \| \frac{g(x)}{\con{z_1}x-1} (x-z_1) \|_2^2\\
& = & \| \frac{g(x)}{(\con{z_1}x-1) \cdots (\con{z_k}x-1)} (x-z_1) \cdots (x-z_k) \|_2^2 \\
& = & \|f(x)\|_2^2,
\end{eqnarray*}
where we applied Lemma~\ref{lem:1} repeatedly.
\end{proof}
\begin{theorem}
\label{thr:7}
Let $f(x) ∈ ℤ[x]$ be a nonzero polynomial of degree $n$ and $g(x),h(x) ∈ℤ[x]$ with
$f(x) = g(x) h(x)$, then
\begin{displaymath}
\|g\|_1 ≤ 2^n \|f\|_2.
\end{displaymath}
\end{theorem}
\begin{proof}
Theorem~\ref{thr:6} implies $\|g\|_1 ≤ 2^{\deg g} M(g) ≤ 2^n M(g)$. Since the measure is multiplicative, $M(f) = M(g)\,M(h)$, and since $h$ has integer coefficients, $M(h)$ is at least the absolute value of the leading coefficient of $h$, which is at least $1$. Therefore $M(g) ≤ M(f)$. Finally, Landau's inequality yields $M(f)≤ \|f\|_2$. This yields
\begin{displaymath}
\|g\|_1 ≤ 2^n \|f\|_2.
\end{displaymath}
\end{proof}
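The inequalities of Theorem~\ref{thr:6} and of Landau can be sanity-checked on the polynomial $f(x) = 2x^2 - 7x + 3 = (2x-1)(x-3)$, whose roots $\frac{1}{2}$ and $3$ are known, so that $M(f) = 2 ⋅ 1 ⋅ 3 = 6$. A numerical sketch:

```python
import math

coeffs = [3, -7, 2]          # f(x) = 2x^2 - 7x + 3, constant term first
roots = [0.5, 3.0]           # f = 2 (x - 1/2)(x - 3)
n = len(coeffs) - 1

measure = abs(coeffs[-1]) * math.prod(max(1.0, abs(z)) for z in roots)
norm1 = sum(abs(c) for c in coeffs)                 # ||f||_1 = 12
norm2 = math.sqrt(sum(c * c for c in coeffs))       # ||f||_2 = sqrt(62)

assert abs(measure - 6.0) < 1e-12
assert norm1 <= 2 ** n * measure      # Theorem: ||f||_1 <= 2^n M(f)
assert measure <= norm2               # Landau:  M(f)  <= ||f||_2
```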
Theorem~\ref{thr:7} enables us to decide with a finite algorithm whether a given polynomial $f(x) ∈ℤ[x]$ with content $1$ is irreducible. The assertion states that a factor $g(x) ∈ℤ[x]$ satisfies $\|g\|_1 ≤ 2^n \|f\|_2$, where $n$ is the degree of $f$. We can simply list all polynomials of degree at most $n$ whose coefficients satisfy this bound and test each of them. This algorithm is finite but not efficient: it is exponential in the degree and in the binary encoding length of the largest absolute value of a coefficient of $f$. The problem does have an efficient algorithmic solution, a celebrated result of Lenstra, Lenstra and Lovász~\cite{lenstra1982factoring}.
\subsection*{Exercises}
\begin{enumerate}
\item Let $f(x) ∈ R[x]\setminus\{0\}$ and $a ∈R$ be a root of $f(x)$ with multiplicity $m$. Let $g(x) ∈ R[x] $ with $f(x) = (x-a)^m g(x)$. Show that $a$ is not a root of $g(x)$.
\item Let $R$ be an integral domain and $f(x) ∈ R[x]$ be a nonzero polynomial. Show that $f(x)$ has at most $\deg f(x)$ many different roots.
\item Describe a polynomial $p(x) ∈ℤ[x]$ of degree at least two that is irreducible in $ℚ[x]$ but does not satisfy the Eisenstein criterion.
\item Let $f(x),g(x) ∈ℂ[x]$ be nonzero. Show that $M(f(x)g(x)) = M(f(x)) \, M (g(x))$.
\item Let $f(x),g(x) ∈ℂ[x]$ be nonzero polynomials with leading coefficients $a,b ≠0$ where $\deg (f) = n$ and where $g$ divides $f$. Show the following generalization of Theorem~\ref{thr:7}: $\|g\|_1 ≤ 2^n |b| / |a| \|f\|_2$.
\end{enumerate}
% \subsection{Polynomials over a field}
%\label{sec:polyn-with-coeff}
In the following, let $K$ be a field. The polynomial ring $K[x]$ is a \emph{euclidean domain} and therefore a \emph{principal ideal domain} and consequently a \emph{unique factorization domain}. This means that a factorization of a nonzero $f(x)$ into \emph{monic} (leading coefficient equal to one) irreducible polynomials $p_i(x)$ and a constant $α ∈K$
\begin{equation}
\label{eq:3}
f(x) = α p_1(x) \cdots p_k(x),
\end{equation}
is unique up to the ordering of the $p_i(x)$. Recall that an ideal $A$ of a ring $R$ is called \emph{principal} if it is of the form $A = \{ r a : r ∈R\}$ for some $a ∈A$. For $a_1,\dots,a_k ∈R$, the set $\{ r_1 a_1+ \cdots + r_k a_k : r_i ∈R, \, i=1,\dots,k\}$ is called the \emph{ideal generated by} $a_1,\dots,a_k$ and it is denoted by $〈a_1,\dots,a_k〉$.
It is easy to see that a nonzero ideal $A$ of $K[x]$ is principal. In fact, one simply chooses an element $g(x) ∈A \setminus \{0\}$ of minimal degree. If $f(x) ∈A$, then we perform division with remainder
\begin{displaymath}
f(x) = q(x) g(x) + r(x)
\end{displaymath}
from which $r(x) ∈A$ follows, and since $g(x)$ is a nonzero element of smallest degree in $A$, one has $r(x) = 0$, meaning that $g(x)$ divides $f(x)$ and thus $A = 〈g(x)〉$.
From this one can conclude the following important fact from which the uniqueness of the factorization~\eqref{eq:3} follows. (Existence is in fact almost trivial.)
\begin{theorem}
\label{thr:9}
Let $p(x) ∈K[x]$ be irreducible and suppose that $p(x)$ divides $f(x) ⋅ g(x) ∈ K[x]$. Then $p(x)$ divides $f(x)$ or $p(x)$ divides $g(x)$.
\end{theorem}
{\small \noindent
In fact, if $p(x)$ does not divide $f(x)$, then $〈p(x),f(x)〉 = 〈1〉$ which means that there are $h_1(x), h_2(x) ∈ K[x]$ with $1 = h_1(x) p(x) + h_2(x) f(x)$. But then
\begin{equation}
\label{eq:2}
g(x) = h_1(x) p(x) g(x)+ h_2(x) f(x) g(x),
\end{equation}
and since $p(x)$ divides $f(x) ⋅g(x)$, both terms on the right of~\eqref{eq:2} are divisible by $p(x)$ and thus $g(x)$ is divisible by $p(x)$. \qed
}
Given $f(x),g(x) ∈K[x]$, not both zero, we now recall how to compute $h(x) ∈ K[x]$ with
\begin{displaymath}
〈h(x) 〉 = 〈 f(x),g(x) 〉.
\end{displaymath}
A polynomial $d(x) ∈K[x]$ that divides both $f(x)$ and $g(x)$ is a \emph{common divisor} of $f(x)$ and $g(x)$. A \emph{greatest common divisor} of $f(x)$ and $g(x)$ is a common divisor of $f(x)$ and $g(x)$ that is divisible by every common divisor of $f(x)$ and $g(x)$. If one requires the greatest common divisor to be monic, then it is also unique. It is the product of the common monic irreducible factors of $f(x)$ and $g(x)$ in~\eqref{eq:3}.
We can efficiently compute the greatest common divisor of two polynomials $f_0(x), f_1(x) ∈ K[x]$, where $f_0(x) ≠0$, with the Euclidean algorithm, which computes the remainder sequence
\begin{displaymath}
f_0(x),f_1(x),\dots,f_k(x),
\end{displaymath}
where
\begin{equation}
\label{eq:4}
f_{i-1}(x) = q_i(x) f_i(x) + f_{i+1}(x)
\end{equation}
is the result of dividing $f_{i-1}(x)$ by $f_i(x)$ with remainder $f_{i+1}(x)$ for $i=1,\dots,k-1$. The polynomial $f_k(x) = 0$ is the first remainder that is zero, i.e., $f_i(x) ≠0$ for $i=0,\dots,k-1$. Clearly, the set of common divisors of $f_{i-1}(x)$ and $f_i(x)$ is the set of common divisors of $f_{i}(x)$ and $f_{i+1}(x)$ in~\eqref{eq:4}. This means that the greatest common divisor of $f_0(x)$ and $f_1(x)$ is $f_{k-1}(x)$. Inspecting the remainder sequence, one observes further that
\begin{equation}
\label{eq:5}
\begin{pmatrix}
f_{i} \\
f_{i+1}
\end{pmatrix}
=
\begin{pmatrix}
0 & 1 \\
1 & -q_i
\end{pmatrix}
\begin{pmatrix}
f_{i-1} \\
f_{i}
\end{pmatrix}
\end{equation}
and thus
\begin{equation}
\label{eq:6}
\begin{pmatrix}
f_{k-1} \\
f_{k}
\end{pmatrix}
=
U
\begin{pmatrix}
f_0 \\
f_1
\end{pmatrix}
\end{equation}
where $U$ is the matrix
\begin{equation}
\label{eq:7} U =
\begin{pmatrix}
0 & 1 \\
1 & -q_{k-1}
\end{pmatrix} \cdots
\begin{pmatrix}
0 & 1 \\
1 & -q_{1}
\end{pmatrix}.
\end{equation}
\begin{example}
\label{exe:1}
We compute the greatest common divisor of
\begin{displaymath}
f_0(x)= x^5 - 2x^4 + 3x^3 - 3x^2 + 1
\end{displaymath}
and
\begin{displaymath}
f_1(x) = x^4 - x^3 - x + 1.
\end{displaymath}
Successive division with remainder yields the sequence
\begin{eqnarray*}
f_0(x) &= & x^5 - 2x^4 + 3x^3 - 3x^2 + 1 \\
f_1(x) &=& x^4 - x^3 - x + 1 \\
f_2(x) & = & 2x^3 - 2x^2 - 2x + 2 \\
f_3(x) & = & x^2 - 2x + 1 \\
f_4(x) & = & 0
\end{eqnarray*}
with the corresponding quotients
\begin{eqnarray*}
\label{eq:9}
q_1(x) & = & x - 1 \\
q_2(x) & = & \frac{1}{2}x \\
q_3(x) & = & 2x + 2.
\end{eqnarray*}
The greatest common divisor is $x^2 - 2x + 1$. The matrix $U$ is
\begin{eqnarray*}
U & = & \left(\begin{array}{rr}
0 & 1 \\
1 & -2 x - 2
\end{array}\right) ⋅
\left(\begin{array}{rr}
0 & 1 \\
1 & -\frac{1}{2} x
\end{array}\right) ⋅
\left(\begin{array}{rr}
0 & 1 \\
1 & -x + 1
\end{array}\right) \\
& = & \left(\begin{array}{rr}
-\frac{1}{2} x & \frac{1}{2} x^{2} - \frac{1}{2} x + 1 \\
x^{2} + x + 1 & -x^{3} - 2 x - 1
\end{array}\right).
\end{eqnarray*}
And indeed
\begin{displaymath}
-\frac{1}{2} x ⋅ f_0(x) + \left(\frac{1}{2} x^{2} - \frac{1}{2} x + 1\right) ⋅ f_1(x) = x^2 - 2x + 1.
\end{displaymath}
\end{example}
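The remainder sequence of Example~\ref{exe:1} can be reproduced mechanically; a self-contained sketch over $ℚ$ (helper names are ours, coefficient lists with the constant term first):

```python
from fractions import Fraction

def polyrem(f, g):
    """Remainder of dividing f by g over Q; the zero polynomial is []."""
    f = [Fraction(c) for c in f]
    g = [Fraction(c) for c in g]
    while len(f) >= len(g):
        c, d = f[-1] / g[-1], len(f) - len(g)
        for i, gi in enumerate(g):
            f[i + d] -= c * gi          # cancel the leading term of f
        while f and f[-1] == 0:
            f.pop()
    return f

def remainder_sequence(f0, f1):
    """f_0, f_1, ..., f_k with f_{i+1} the remainder of f_{i-1} divided by f_i."""
    seq = [f0, f1]
    while seq[-1]:
        seq.append(polyrem(seq[-2], seq[-1]))
    return seq

f0 = [1, 0, -3, 3, -2, 1]    # x^5 - 2x^4 + 3x^3 - 3x^2 + 1
f1 = [1, -1, 0, -1, 1]       # x^4 - x^3 - x + 1
seq = remainder_sequence(f0, f1)
assert seq[2] == [2, -2, -2, 2]   # f_2 = 2x^3 - 2x^2 - 2x + 2
assert seq[-1] == []              # f_4 = 0
assert seq[-2] == [1, -2, 1]      # gcd = x^2 - 2x + 1, as in the example
```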
Let $f(x) = a_0 + a_1x + \cdots + a_n x^n ∈ F[x]$ be a polynomial. The \emph{derivative} of $f$ is the polynomial $f'(x) = a_1 + 2 a_2x + \cdots + n a_n x^{n-1}$. The following is easily verified by comparing the coefficients of the resulting polynomials.
\begin{theorem}
\label{thr:10}
Let $f$ and $g$ be polynomials in $F[x]$, where $F$ is a field, then
\begin{enumerate}[i)]
\item $(f(x) + g(x))' = f'(x) + g'(x)$,
\item $(f(x) g(x))' = f'(x)g(x) + f(x) g'(x)$, and
\item $(f(g(x)))' = f'(g(x))g'(x)$.
\end{enumerate}
\end{theorem}
\section{Field extensions}
\label{sec:field-extensions}
Let $E$ and $F$ be fields. If $E ⊇ F$ holds, then $E$ is a \emph{field extension} of $F$. It is an important observation that $E$ is an $F$-vector space and thus the concepts of linear algebra can be used. The dimension of $E$ as an $F$-vector space is denoted by $[E:F]$. It is also called the \emph{degree of the extension}. A field extension of finite degree is also called a \emph{finite field extension}.
\begin{theorem}[Tower Law]
\label{thr:tower}
Let $E ⊇F$ be a finite field extension and $K$ an intermediate field, i.e., $E ⊇K ⊇F$, then
\begin{displaymath}
[E:F] = [E:K] ⋅ [K:F]
\end{displaymath}
\end{theorem}
The proof of the tower law is a simple exercise in linear algebra.
An element $α∈ E$ is \emph{algebraic} over $F$, if there exists a nonzero polynomial $f(x) ∈ F[x]$ such that $f(α)= 0 $ holds. The \emph{minimal polynomial} of an algebraic element $α$ is the unique monic polynomial $p(x) ∈ F[x] $ of minimal degree with $p(α)=0$. For fields $E ⊇ F$ and $u_1,\dots,u_k ∈E$, $F(u_1,\dots,u_k)$ denotes the inclusion-wise minimal intermediate field of $E⊇F$ that contains $u_1,\dots,u_k$. We have the following important fact. If $u ∈E$ is algebraic over $F$ with minimal polynomial $p(x) = a_0+ \cdots + a_n x^n ∈F[x]$, then
\begin{equation}
\label{eq:8}
F(u) ≅ F[x] / 〈p(x)〉.
\end{equation}
This follows from the isomorphism theorem~\ref{thr:2} by inspecting the ring homomorphism
\begin{equation}
\label{eq:10}
\begin{array}{rccc}
φ: & F[x] &→& E \\
& f(x) & ↦ & f(u).
\end{array}
\end{equation}
The kernel of $φ$ is $〈p(x)〉$, and hence the image $φ(F[x])$ of $φ$ is isomorphic to $ F[x] / 〈p(x)〉$. Since $p(x)$ is irreducible, $〈p(x)〉$ is maximal and thus $ F[x] / 〈p(x)〉$ is a field. The image $φ(F[x])$ is clearly contained in $F(u)$ and since the image is a field containing $F$ and $u$, it is equal to the smallest extension field of $F$ containing $u$, which is $F(u)$. A further consequence of this fact is Kronecker's Theorem.
\begin{theorem}[Kronecker's Theorem]
\label{thr:11}
If $F$ is any field and if $f(x) ∈ F[x]$ is a non-constant polynomial, then there exists a field extension of $F$ in which $f(x)$ has a root.
\end{theorem}
The proof is by inspecting an irreducible factor $p(x)$ of $f(x)$ and by embedding $F$ into $F[x] / 〈p(x)〉$ via the injective homomorphism $π: F →F[x] / 〈p(x)〉$ defined by $π(r) = r + 〈p(x)〉$.
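The isomorphism~\eqref{eq:8} also tells us how to compute in $F(u)$: one calculates with polynomials modulo $p(x)$. A sketch for $F = ℚ$ and $p(x) = x^2 - 2$, so that $F(u) = ℚ(\sqrt{2})$ and classes are represented as $a + bx$ (the class name `QSqrt2` is ours):

```python
from fractions import Fraction

class QSqrt2:
    """Elements a + b*x of Q[x]/<x^2 - 2>, i.e. the field Q(sqrt 2)."""
    def __init__(self, a, b):
        self.a, self.b = Fraction(a), Fraction(b)
    def __mul__(self, other):
        # (a + b x)(c + d x) = ac + (ad + bc) x + bd x^2, reduced via x^2 = 2
        a, b, c, d = self.a, self.b, other.a, other.b
        return QSqrt2(a * c + 2 * b * d, a * d + b * c)
    def __eq__(self, other):
        return (self.a, self.b) == (other.a, other.b)

u = QSqrt2(1, 1)              # 1 + sqrt 2
assert u * u == QSqrt2(3, 2)  # (1 + sqrt 2)^2 = 3 + 2 sqrt 2
# inverses exist since x^2 - 2 is irreducible over Q, so <x^2 - 2> is maximal:
assert u * QSqrt2(-1, 1) == QSqrt2(1, 0)   # (1 + sqrt 2)(-1 + sqrt 2) = 1
```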
\begin{definition}
\label{def:1}
Let $f(x)$ be a polynomial in $F[x]$ with $\deg(f) ≥1$, where $F$ is a field. An extension $E ⊇F$ is called a \emph{splitting field} of $f$ over $F$ if
\begin{enumerate}[i)]
\item $f(x) = a (x-u_1) \cdots (x - u_n)$, where $a ∈F$ and $u_i ∈E$ for each $i$ and
\item $E = F(u_1,\dots,u_n)$.
\end{enumerate}
\end{definition}
\begin{theorem}
\label{thr:12}
Let $f$ be a polynomial of degree $n≥1$ over a field $F$. Then a splitting field $E ⊇F$ of $f$ over $F$ exists and $[E:F] ≤ n!$.
\end{theorem}
{\small \noindent
The proof is by induction on $n$. If $n=1$, then $E = F$ is a splitting field. This is also the case if $f(x)$ splits into linear factors over $F$ already. Otherwise, let $p(x) ∈ F[x]$ be a monic irreducible factor of $f(x)$ with $\deg(p) ≥2$. By Kronecker's Theorem, there is an extension $K ⊇ F$ that contains a root $u_1$ of $p$; let $E_1 = F(u_1)$. Clearly $[E_1:F] ≤ n$ and $f(x) = (x - u_1) g(x)$ with $g(x) ∈ E_1[x]$ of degree $n-1$. By induction, there exists a splitting field $E_2 ⊇ E_1$ of $g$ and this is of the form $E_2 = E_1(u_2,\dots,u_n)$ and one has $[E_2:E_1] ≤ (n-1)!$. Clearly $F(u_1,\dots,u_n)$ is a splitting field of $f$ over $F$ of degree at most $n ⋅(n-1)! = n!$. \qed
}
The next theorem shows that splitting fields are unique up to isomorphism.
\begin{theorem}
\label{thr:13}
Let $π: F → \overline{F}$ be an isomorphism of fields and let $f(x) = a_0+ \dots+a_nx^n ∈ F[x]$ be a non-constant polynomial and let $f^π(x) = π(a_0)+\dots+π(a_n) x^n ∈ \overline{F}[x]$. If $E⊇F$ is a splitting field of $f(x)$ over $F$ and $\overline{E}⊇\overline{F}$ is a splitting field of $f^π(x)$ over $\overline{F}$, then there exists an isomorphism $φ: E → \overline{E}$ that extends $π$, i.e. for which $π(r) = φ(r)$ for each $ r ∈ F$ holds.
\end{theorem}
The proof is by induction; see the diagram below.
\begin{displaymath}
\begin{CD}
E @>φ>> \overline{E} \\
@AAA @AAA \\
F(u) @>τ >>\overline{F}(v) \\
@AAA @AAA \\
F @>π >>\overline{F}
\end{CD}
\end{displaymath}
\begin{proof}
First we observe that the mapping $ F[x] → \overline{F}[x]$ with $f ↦ f^π$ is a ring-isomorphism that extends $π$ (Exercise).
Let $f(x) = p_1(x) \cdots p_k(x) ⋅ (x-c_1) \cdots (x-c_l)$, where the $p_i∈ F[x]$ are irreducible of degree at least two and $c_i ∈F$.
The proof is then by induction on $\deg p_1 + \cdots + \deg p_k$. If this sum is equal to zero, then $E = F$ and $\overline{E} = \overline{F}$ and the assertion obviously holds. Let $p(x)∈ F[x]$ be an irreducible factor of $f(x)$ of degree at least two and let $u ∈ E$ be a root of $p(x)$. Similarly, let $v ∈ \overline{E}$ be a root of the irreducible factor $p^π(x)$ of $f^π(x)$.
We have argued, see~\eqref{eq:8} that $F(u) ≅ F[x] / 〈 p(x) 〉$ and $\overline{F}[x] / 〈p^π(x)〉 ≅ \overline{F}(v)$. Let $Ψ_1: F(u) → F[x] / 〈 p(x) 〉$ and $Ψ_2: \overline{F}[x] / 〈p^π(x)〉 → \overline{F}(v)$ be the corresponding isomorphisms. The mapping $Ψ_3: F[x] / 〈p(x)〉 → \overline{F}[x] / 〈p^π(x) 〉$ with
\begin{equation}
\label{eq:11}
Ψ_3 (h(x) + 〈p(x) 〉 ) = h^π(x) + 〈p^π(x)〉
\end{equation}
is well defined (Exercise)
and an isomorphism that extends $π$. Clearly $ τ = Ψ_2 \circ Ψ_3 \circ Ψ_1: F(u) → \overline{F}(v)$ is an isomorphism extending $π$. Since $f$ has at least one linear factor more in $F(u)$ than in $F$, we can apply induction now. There exists an isomorphism $φ: E → \overline{E}$ that extends $τ$ and thus also extends $π$.
\end{proof}
\begin{corollary}
\label{co:1}
Let $f(x) ∈ F[x]$ be a non-constant polynomial and $E ⊇F$ as well as $\overline{E} ⊇F$ be splitting fields of $f$ over $F$, then $E ≅ \overline{E}$.
\end{corollary}
A polynomial $f(x) ∈ F[x] \setminus\{0\}$ is called \emph{separable}, if it does not have any multiple roots in any field extension $E ⊇F$. We can easily check whether a polynomial $f(x) ∈F[x]$ is separable.
\begin{theorem}
\label{thr:14}
A non-zero polynomial $f(x) ∈ F[x]$ is separable if and only if $\gcd(f,f') = 1$.
\end{theorem}
\begin{proof}
Suppose that $f$ is separable. We show $\gcd(f,f') = 1$. This is the case if $\deg(f)≤1$ because then, either $f$ is a constant or $f'$ is a nonzero constant. Thus suppose that $\deg(f)>1$ and that $x-u$ divides both $f$ and $f'$, where $u$ is an element of a splitting field $E ⊇ F$. Then $f = (x-u) g$ and thus $f'= g + (x-u) g'$. This means that $(x-u)$ divides $g$ and thus that $u$ is a double root of $f$. This is a contradiction.
If $\gcd(f,f') = 1$, then there exist $g,h∈ F[x]$ such that $g ⋅ f + h ⋅f'=1$. This means that no polynomial $x-u$ with $u ∈E$ can divide both $f$ and $f'$ and thus $(x-u)^2$ does not divide $f$. This means that $f$ is separable.
\end{proof}
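The criterion of Theorem~\ref{thr:14} can be watched in action on $f(x) = (x-1)^2(x+2) = x^3 - 3x + 2$: both $f$ and $f'$ vanish at $1$, so $x-1$ divides $\gcd(f,f')$ and $f$ is not separable. A sketch (helper names are ours, coefficient lists with the constant term first):

```python
def polyeval(f, x):
    """Evaluate a coefficient list at x."""
    return sum(c * x ** i for i, c in enumerate(f))

def derivative(f):
    """Coefficient list of the derivative f'."""
    return [i * c for i, c in enumerate(f)][1:]

f = [2, -3, 0, 1]        # f(x) = x^3 - 3x + 2 = (x - 1)^2 (x + 2)
df = derivative(f)       # f'(x) = 3x^2 - 3

# 1 is a common root of f and f', so gcd(f, f') != 1: f is not separable.
assert polyeval(f, 1) == 0 and polyeval(df, 1) == 0

g = [-2, 0, 1]           # g(x) = x^2 - 2
# g'(x) = 2x has the single root 0, which is not a root of g,
# so gcd(g, g') = 1 and g is separable.
assert polyeval(g, 0) != 0
```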
\begin{theorem}
\label{thr:15}
Let $p(x) ∈ F[x]$ be irreducible. If $\car(F) = 0$, then $p(x)$ is separable. If $\car(F) = q > 0$, then $p(x)$ is not separable if and only if $p(x) = h(x^q)$ for some $h(x) ∈ F[x]$.
\end{theorem}
\begin{proof}
If $p' ≠0$, then $\gcd(p,p')=1$, since $p$ is irreducible and $\deg p' < \deg p$. With Theorem~\ref{thr:14} it follows that $p$ is separable if and only if $p'≠0$. This is always the case if $\car(F) = 0$.
Let $p(x) = a_0 + a_1x + \cdots + a_n x^n$ with $n≥1$ and $a_n ≠0$; then $p'(x) = a_1 + 2a_2x + \cdots + n ⋅ a_n x^{n-1}$. This shows that, if $\car(F) =q$, then $p'=0$ if and only if $a_i = 0$ whenever $i \not\equiv 0 \pmod{q}$. This means that $p(x) = h(x^q)$ with $h(x) = a_0 + a_qx + a_{2q}x^{2} + \cdots$.
\end{proof}
\begin{definition}
Let $E ⊇F$ be a finite field extension. An element $u ∈E$ is called \emph{separable} if its minimal polynomial $p(x) ∈F[x]$ is separable. If each element of $E$ is separable, then the extension $E⊇F$ is a \emph{separable extension}.
\end{definition}
\begin{theorem}[Primitive element theorem]
\label{thr:16}
Let $E ⊇ F$ be a finite field extension, $u_1,\dots,u_k ∈E$ and suppose that $u_2,\dots,u_k$ are separable. Then there exists an element $u ∈ F(u_1,\dots,u_k)$ with
\begin{displaymath}
F(u) = F(u_1,\dots,u_k).
\end{displaymath}
\end{theorem}
\begin{proof}
If $F$ is a finite field, then so is $F(u_1,\dots,u_k)$. The group of units of a finite field is cyclic, and clearly $F(u_1,\dots,u_k) = F(α)$, where $α$ is a generator of $(F(u_1,\dots,u_k) \setminus \{0\}, ⋅)$. Thus, assume from now on that $F$ is infinite. We show the theorem for $k=2$, i.e., we show that there exists an element $u ∈ F(u_1,u_2)$ with $F(u) = F(u_1,u_2)$. The assertion then follows by induction on $k$.
To simplify notation, we denote $α = u_1$ and $β = u_2$. We claim that there exists an element $y ∈F \setminus \{0\}$ such that $α + yβ$ is a generator of the field extension. Let $p(x)$ and $q(x)$ be the minimal polynomials of $α$ and $β$ respectively. We can factor
\begin{eqnarray*}
p(x) & = & (x- α_1) \cdots (x-α_n) \\
q(x) & = & (x- β_1) \cdots (x-β_m)
\end{eqnarray*}
in a splitting field $E ⊇ F$ of $p(x)⋅q(x)$, where $α=α_1$ and $β = β_1$. The $β_i$ are pairwise different, since $β$ is separable.
For $y ∈ F$ we define the polynomial
\begin{equation}
\label{eq:12}
h_y(x) = p( (α + y β) - y ⋅ x) ∈ F(α + y β)[x].
\end{equation}
The idea is to choose $y ∈F \setminus \{0\}$ in such a way that
\begin{equation}
\label{eq:13}
(x-β) = \gcd(q,h_y)
\end{equation}
holds. Note that $h_y(β) = p(α + yβ - yβ) = p(α) = 0$, so $β$ is a common root of $q$ and $h_y$. Since $\gcd(q,h_y) ∈ F(α + y β)[x]$, this shows that
$β ∈ F(α + y β)$ and therefore also $α = (α + yβ) - yβ ∈ F(α + y β)$. Consequently
$F(α,β) = F(α + y β)$.
We have~\eqref{eq:13} if and only if none of the elements $β_2,\dots,β_m$ is a root of $h_y(x)$, and this holds if and only if $y ∈ F$ was chosen such that
\begin{equation}
\label{eq:14}
α + y β - y β_i ≠ α_j, \, \text{ holds for } 2≤ i ≤ m, \, 1 ≤ j ≤ n.
\end{equation}
For $i,j$ in this range fixed, one has
\begin{displaymath}
α + y β - y β_i = α_j
\end{displaymath}
if and only if
\begin{displaymath}
y (β - β_i) = α_j - α
\end{displaymath}
and since $β≠ β_i$, this holds if and only if
\begin{displaymath}
y = (α_j - α) / (β - β_i).
\end{displaymath}
This shows that any $y$ from the set
\begin{displaymath}
F \setminus \{ (α_j - α) / (β - β_i) : 2≤ i ≤ m, \, 1 ≤ j ≤ n\}
\end{displaymath}
satisfies
\begin{displaymath}
F(α,β) = F(α + y β).
\end{displaymath}
\end{proof}
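For $F = ℚ$, $α = \sqrt{2}$ and $β = \sqrt{3}$ (both separable), already $y = 1$ avoids all the bad values, so $u = \sqrt{2} + \sqrt{3}$ is a primitive element: from $u^2 = 5 + 2\sqrt{6}$ one gets $u^4 - 10u^2 + 1 = 0$, and both $\sqrt{2}$ and $\sqrt{3}$ are polynomial expressions in $u$. A numerical check:

```python
import math

u = math.sqrt(2) + math.sqrt(3)
# u is a root of x^4 - 10x^2 + 1: from u^2 = 5 + 2*sqrt(6), (u^2 - 5)^2 = 24.
assert abs(u ** 4 - 10 * u ** 2 + 1) < 1e-9

# u^3 = 11*sqrt(2) + 9*sqrt(3), hence sqrt(2), sqrt(3) lie in Q(u):
assert abs((u ** 3 - 9 * u) / 2 - math.sqrt(2)) < 1e-9
assert abs((11 * u - u ** 3) / 2 - math.sqrt(3)) < 1e-9
```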
%%% Local Variables:
%%% mode: latex
%%% TeX-master: "notes"
%%% End: