Suppose that $f : [a, b] \to \R$ is integrable. Prove that $t \mapsto \max(f(t), 0)$ is integrable. Use this to show that the absolute value of an integrable function is integrable.
(This was the one step we were missing in our proof that \[\abs{\int_a^b f(t)\,dt} \leq \int_a^b \abs{f(t)}\,dt.\] Note as well that since $\max(f(t), g(t)) = \max(f(t)-g(t), 0) + g(t)$ this shows that the maximum of two integrable functions is integrable.)
In this problem, we will make the assumption that sequences are indexed beginning at $0$ rather than at $1$.
Suppose $(h_n)_n$ is a sequence of functions from a set $S$ to a metric space $(X, d)$. The sequence is said to be uniformly Cauchy if for every $\epsilon \gt 0$ there is some $N$ so that for any $n, m \gt N$ and any $s \in S$, $d(h_n(s), h_m(s)) \lt \epsilon$. Next week we will see that uniformly convergent sequences of functions are uniformly Cauchy, and that if $X$ is complete, all uniformly Cauchy sequences of functions from $S$ to $X$ converge uniformly.
A function of the form above is called a power series. Since, on compact subsets of its interval of convergence, a power series is the uniform limit of its partial sums—polynomials, which are continuous—it is continuous on that interval.
Let $C : \R \to [-1, 1]$ be a continuous function with the properties that $C(2k) = 1$ and $C(2k+1) = -1$ for every $k \in \Z$, and that $\abs{C(x)-C(y)} \leq 4\abs{x-y}$ for all $x, y \in \R$.
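Such a function does exist: one concrete choice (an assumption for illustration, not dictated by the problem) is $C(x) = \cos(\pi x)$, since $\cos(2\pi k) = 1$, $\cos(\pi(2k+1)) = -1$, and $\abs{C'} = \pi\abs{\sin(\pi x)} \leq \pi \lt 4$. A quick numeric spot-check:

```python
import math
import random

# Candidate function: C(x) = cos(pi x).  (One choice among many.)
C = lambda x: math.cos(math.pi * x)

# C(2k) = 1 and C(2k+1) = -1 for integer k:
for k in range(-5, 6):
    assert abs(C(2 * k) - 1) < 1e-9
    assert abs(C(2 * k + 1) + 1) < 1e-9

# |C'(x)| = pi |sin(pi x)| <= pi < 4, so C is 4-Lipschitz; spot-check:
rng = random.Random(0)
for _ in range(1000):
    x, y = rng.uniform(-10, 10), rng.uniform(-10, 10)
    assert abs(C(x) - C(y)) <= 4 * abs(x - y) + 1e-9
```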
Here are some further problems to think about out of interest. You do not need to attempt them, nor should you submit them with the assignment. They are coloured by approximate difficulty: easy/medium/hard.
Suppose that $X$ is a set and $(f_n)_n$ is a sequence of functions $X \to \R$, so that each $f_n$ is bounded. Suppose further that $(f_n)_n$ converges uniformly to some $f : X \to \R$. Show that $(f_n)_n$ is uniformly bounded: that is, there is some $T \in \R$ so that for all $x \in X$ and all $n \in \N$, \[\abs{f_n(x)} \leq T.\]
For each $n \in \N$, let \begin{align*} f_n : \R &\longrightarrow \R \\ x &\longmapsto \frac{x}{1+nx^2}. \end{align*} Show that the sequence $(f_n)_n$ converges uniformly (on $\R$) to some function $f$, and that $(f_n')_n$ converges to $f'$ pointwise on $\R\setminus\set0$, but not at $0$.
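As a numerical sanity check (not a substitute for the proof), a short computation suggests $\sup_x \abs{f_n(x)} = \frac1{2\sqrt n}$, attained at $x = 1/\sqrt n$, while $f_n'(0) = 1$ for every $n$:

```python
import math

# f_n(x) = x / (1 + n x^2)
def f(n, x):
    return x / (1 + n * x * x)

def fprime(n, x):
    # quotient rule: f_n'(x) = (1 - n x^2) / (1 + n x^2)^2
    return (1 - n * x * x) / (1 + n * x * x) ** 2

# sup |f_n| = 1/(2 sqrt(n)) -> 0, so f_n -> 0 uniformly; grid spot-check:
for n in [1, 10, 100, 10000]:
    sup = max(abs(f(n, i / 100 - 50)) for i in range(10001))
    assert sup <= 1 / (2 * math.sqrt(n)) + 1e-12

# But f_n'(0) = 1 for every n, while the limit function is identically 0,
# so the derivatives do not converge to f' at the origin.
assert all(fprime(n, 0.0) == 1.0 for n in [1, 5, 500])
```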
Suppose $S$ is a set, and consider the set of functions from $S$ to $\R$, $\R^S = \set{f : S \to \R}$. Let us say that a metric $d$ on $\R^S$ gives rise to the topology of pointwise convergence if whenever $(f_n)_n$ is a sequence of functions in $\R^S$ we have that $(f_n)_n$ converges to a function $f$ with respect to $d$ if and only if $(f_n)_n$ converges pointwise to $f$.
Show that there is a metric on $\R^S$ giving rise to the topology of pointwise convergence if and only if there is a one-to-one function $\varphi : S \to \N$.
Let $R$ be the set of integrable functions on the interval $[a, b]$. Given $f \in R$, define \[\norm{f}_2 = \paren{\int_a^b\abs{f(t)}^2\,dt}^{\frac12}.\]
The function $d_2(f, g) = \norm{f-g}_2$ on $R\times R$ is a pseudo-metric on $R$: it satisfies the triangle inequality, $d_2(f,f) = 0$, and $d_2(f, g) = d_2(g, f)$, but $d_2(f, g)=0$ does not imply $f = g$. If we define $\sim$ by $f\sim g$ whenever $d_2(f, g) = 0$, then $R/\sim$ becomes a metric space; what we have done is show that (equivalence classes of) polynomials are dense in $R/\sim$.
State precisely what is meant by:
Provide examples to show that these three concepts are different.
Recall that for $x \in \R$, $\floor{x}$ (the floor of $x$) is the greatest integer $k \in \Z$ with $k \leq x$. For $n \in \N$, define \begin{align*} \varphi_n: \R &\longrightarrow \R\\ t & \longmapsto 2^{-n}\floor{2^nt}. \end{align*} Finally, let $\mathscr{A}$ be the algebra of functions generated by $(\varphi_n)_n$. That is, $\mathscr{A}$ consists of all functions which are finite sums of scalar multiples of finite products of $\varphi_n$'s.
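For intuition (a numeric illustration, not part of the problem): $\varphi_n$ rounds $t$ down to the nearest multiple of $2^{-n}$, so $0 \leq t - \varphi_n(t) \lt 2^{-n}$, and the $\varphi_n$ converge uniformly to the identity.

```python
import math

def phi(n, t):
    # phi_n(t) = 2^{-n} floor(2^n t): t rounded down to a multiple of 2^{-n}
    return math.floor(2 ** n * t) / 2 ** n

# From floor(2^n t) <= 2^n t < floor(2^n t) + 1 we get
# 0 <= t - phi_n(t) < 2^{-n}, uniformly in t.
for n in range(1, 12):
    for t in [-2.7, -1.0, 0.0, 1 / 3, 1.5, math.pi]:
        assert 0 <= t - phi(n, t) < 2 ** -n
```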
Let $\overline{\mathscr{A}}$ be the uniform closure of $\mathscr{A}$. Show that $\overline{\mathscr{A}}$ is not an algebra. (Hint: $x\mapsto x$ is in $\overline{\mathscr{A}}$, but $x \mapsto x^2$ is not.)
Reflect on what this example says about trying to plot quadratic or linear functions on a pixelated display.
Prove that a power series is differentiable, and its derivative is given by differentiating the sum term-by-term. (This is not immediate and does take work: in general, uniform limits of differentiable functions need not be differentiable, or may have derivatives unrelated to the derivatives of their approximants.) Conclude that a power series is infinitely differentiable.
Let $f : [0, 1] \to \R$ be continuous.
Show that there exists a function $f : \R \to \R$ which takes on every real value on every non-empty open set. That is, for any open $U \subseteq \R$ and any $y \in \R$, there is some $x \in U$ with $f(x) = y$. (This isn't really related to this week's material in particular; it's just an interesting problem.)
Suppose you're interested in generating integers uniformly at random from $0$ to $N-1$ by flipping coins. If $N = 2$ and you have a fair coin, this is easy: flip it once, and answer $0$ if it's tails and $1$ if it's heads. If $N = 2^k$ you can similarly do this using $k$ flips of a fair coin, writing down a $k$-bit binary number. If you allow yourself to take an arbitrarily large number of flips, you can make this work for any $N$: for example, to generate something from $0$ to $9$, uniformly generate numbers with $N=16$ using four flips each, and take the first that is less than $10$. This can take a rather long time.
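The reject-and-retry scheme described above can be sketched as follows (illustrative Python; the function name and simulation parameters are my own):

```python
import random

def uniform_mod_10(rng):
    """Rejection sampling: draw 4-bit numbers (0..15) from fair flips and
    keep the first one below 10.  Each round succeeds with probability
    10/16, so the expected number of rounds is 16/10 -- but there is no
    fixed bound on how many flips might be needed."""
    while True:
        x = sum(rng.getrandbits(1) << i for i in range(4))  # four fair flips
        if x < 10:
            return x

# Empirically the ten outcomes occur with roughly equal frequency:
rng = random.Random(0)
counts = [0] * 10
for _ in range(100_000):
    counts[uniform_mod_10(rng)] += 1
assert all(9000 < c < 11000 for c in counts)
```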
What happens if you are making flips with unfair coins? If you have a coin which lands on heads with probability $1/3$ and a fair coin, you can get a uniform random number from $\set{0,1,2}$ as follows: flip the biased coin, and if you see heads, answer $2$; otherwise, flip the fair coin and return $0$ or $1$ based on its result. If you have $N-1$ biased coins with probabilities $\frac1{N}, \frac1{N-1}, \ldots, \frac12$ of landing heads, you can do a similar trick to generate a number from $0$ to $N-1$ uniformly at random: flip the coins starting from the least fair, answer the index of the first coin to land heads, and answer $N-1$ if every coin lands tails.
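The cascade above can be verified with exact rational arithmetic; this sketch (with a function name of my own choosing) confirms the resulting distribution is uniform:

```python
from fractions import Fraction

def cascade_probs(N):
    """Exact outcome distribution of the biased-coin cascade: coin i
    (for i = 0, ..., N-2) lands heads with probability 1/(N-i); answer i
    at the first heads, and N-1 if every coin lands tails."""
    probs, stay = [], Fraction(1)
    for i in range(N - 1):
        p = Fraction(1, N - i)
        probs.append(stay * p)  # first heads occurs at coin i
        stay *= 1 - p           # coins 0..i all landed tails
    probs.append(stay)          # all tails: answer N-1
    return probs

# Exact arithmetic confirms uniformity for each tested N:
for N in [2, 3, 6, 10]:
    assert cascade_probs(N) == [Fraction(1, N)] * N
```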
Unfortunately, biased coins are somewhat difficult and expensive to manufacture, so we would like to use as few as possible. This leads to the following question: given a natural number $N$, what is the minimum number of biased coins required to generate a number in $\set{0,1,\ldots,N-1}$ uniformly at random, using a number of flips you must fix ahead of time? Does the answer change if you are only allowed to use coins with rational bias?