The Dominance Assumption in Lebesgue’s Dominated Convergence Theorem


This theorem is like the cool kid at the math party who everyone wants to hang out with because they have all the answers. But before we dive into this bad boy, let’s first talk about what it means for a function to be dominated.

A function f(x) is said to be dominated by another function g(x) if |f(x)| ≤ g(x) for all x in the domain of interest (you will sometimes see a constant C thrown in, as in |f(x)| ≤ C·|g(x)|, but the constant can always be absorbed into g). This basically means that f(x) can’t get too crazy and out-of-control, because it has a leash (the dominating function g(x)) to keep it from running wild.
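
To make the leash concrete, here is a tiny Python sketch (the sequence sin(n*x)/n on [0, 2π] is just a toy example chosen for illustration, nothing canonical) that checks numerically that the constant function g(x) = 1 dominates every member of the sequence:

    import numpy as np

    # Toy example: f_n(x) = sin(n*x) / n on [0, 2*pi].
    # Claim: |f_n(x)| <= 1 = g(x) for every n >= 1 and every x,
    # so the constant function g(x) = 1 is a dominating (and integrable) function.
    x = np.linspace(0.0, 2.0 * np.pi, 1001)
    g = np.ones_like(x)

    for n in range(1, 6):
        f_n = np.sin(n * x) / n
        assert np.all(np.abs(f_n) <= g), "domination violated"
        print(f"n={n}: max |f_n| = {np.abs(f_n).max():.4f} <= 1")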

Now for the Dominated Convergence Theorem (DCT) itself. The theorem states that if we have a sequence of functions {f_n} which converges pointwise to f on some interval [a,b], and there exists an integrable dominating function g such that |f_n(x)| ≤ g(x) for all x in [a,b] and all n = 1, 2, 3, …, then the integral of f is equal to the limit of the integrals of the f_n; in symbols, lim ∫ f_n = ∫ f.
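
To watch the conclusion play out, here is a rough numerical sketch under an assumed toy setup: f_n(x) = x^n on [0, 1], dominated by the integrable function g(x) = 1, converging pointwise to 0 on [0, 1) (and to 1 at the single point x = 1, which contributes nothing to the integral):

    import numpy as np

    # Toy check of the DCT conclusion with f_n(x) = x**n on [0, 1], dominated by g(x) = 1.
    # Pointwise limit: f(x) = 0 for x in [0, 1) and f(1) = 1, so the integral of f is 0.
    x = np.linspace(0.0, 1.0, 100_001)

    for n in [1, 10, 100, 1000]:
        f_n = x ** n
        approx = f_n.mean()   # crude Riemann-style approximation of the integral on [0, 1]
        print(f"n={n:5d}: integral of f_n ~ {approx:.5f} (exact value 1/(n+1) = {1 / (n + 1):.5f})")

    # The integrals shrink to 0, which is exactly the integral of the pointwise limit,
    # as the theorem promises once an integrable dominating g exists.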

In other words, if we have a bunch of functions that are getting closer and closer to some final function (the limit), then as long as they’re all leashed by a single dominating function g(x) which is itself integrable (so the leash doesn’t get too crazy either), we can swap the limit and the integral: integrating the final function f gives the same answer as taking the limit of the integrals of the f_n.
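
And the leash is not optional. A standard counterexample is the "moving spike" f_n(x) = n on (0, 1/n] and 0 elsewhere: it converges pointwise to 0, yet every f_n integrates to 1, and no integrable g can sit above all of the spikes at once. A quick sketch of the arithmetic:

    # Classic counterexample when the leash is missing:
    # f_n(x) = n for 0 < x <= 1/n, and 0 otherwise, on [0, 1].
    # The pointwise limit is the zero function, but every f_n has integral exactly 1.
    for n in [1, 10, 100, 1000]:
        integral_f_n = n * (1.0 / n)   # height n times width 1/n
        print(f"n={n:5d}: integral of f_n = {integral_f_n}")

    # Limit of the integrals = 1, but the integral of the pointwise limit = 0,
    # so the swap fails: no integrable g(x) dominates every one of these spikes.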

This theorem is super useful in analysis because it allows us to swap limits and integrals without having to worry about convergence issues: pointwise convergence plus an integrable leash is enough, no uniform convergence required. It also has applications in probability theory and statistics, where it helps us show that expected values converge for random variables that converge almost surely (meaning they converge with probability 1) while staying under a common integrable bound.
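
As a hedged illustration of that probabilistic use (the particular random variables here are an invented example, not a standard one): take X uniform on (0, 1) and X_n = (1 + 1/n)·X, so the X_n converge almost surely to X and are all dominated by the constant 2; the expectation form of DCT then gives E[X_n] → E[X], which a quick Monte Carlo estimate agrees with:

    import numpy as np

    # Hypothetical Monte Carlo illustration (setup chosen for this sketch):
    # X ~ Uniform(0, 1) and X_n = (1 + 1/n) * X, so X_n -> X almost surely
    # and |X_n| <= 2 for all n, i.e. the constant 2 is an integrable dominating bound.
    rng = np.random.default_rng(0)
    samples = rng.uniform(0.0, 1.0, size=1_000_000)   # draws of X

    for n in [1, 10, 100, 1000]:
        x_n = (1.0 + 1.0 / n) * samples
        print(f"n={n:5d}: estimated E[X_n] = {x_n.mean():.4f}")

    print(f"estimated E[X]      = {samples.mean():.4f}   # DCT: E[X_n] -> E[X]")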

Remember to always keep your functions on a leash!
