Distributions are a relatively new idea in mathematics (they were only invented in the 1940s and 1950s). However, they have become extremely useful in analysis, especially for understanding partial differential equations.
The basic idea behind distributions is a little like the trick for getting the weak form of a differential equation: multiply something nasty by something nice and integrate. Suppose f(x) is an unpleasant function; then often we can pick a nice function g(x) so that the integral
$$ \int_{-\infty}^{\infty} f(x)\,g(x)\,dx $$
makes sense.
For distributions we want f(x) to be something that is not even really a function, but ordinary functions are the place to start.
The function g(x) is called a test function; it acts much like our v(x) in the weak form of a differential equation. A test function is not one particular function, but rather one of many functions that we use to ``test'' the behavior of f(x). To do this we make g(x) as nice as possible while still being able to probe f(x) thoroughly.
The way to do this is to allow test functions to be any function g(x) that is infinitely smooth -- meaning that all derivatives exist and are continuous -- and that is eventually zero (either taking x to $+\infty$ or to $-\infty$). (I don't mean that the limit is zero; rather, g(x) = 0 for x ``large enough''.)
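For example, the classic ``bump function''
$$ g(x) = \begin{cases} e^{-1/(1-x^2)}, & |x| < 1, \\ 0, & |x| \ge 1, \end{cases} $$
is a test function: it is infinitely smooth (even at $x = \pm 1$, where all of its derivatives vanish), and it is exactly zero outside $[-1, 1]$.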
If f(x) is an ordinary integrable function, we can define a functional F on test functions:
$$ F(g) = \int_{-\infty}^{\infty} f(x)\,g(x)\,dx. $$
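Notice that f(x) does not even need to be continuous. For instance, taking f(x) to be the indicator function of the interval [0,1] (equal to 1 there and 0 elsewhere) gives
$$ F(g) = \int_0^1 g(x)\,dx, $$
which is a perfectly sensible functional even though f(x) has jumps.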
The most famous example of a distribution is the $\delta$-function, given by the rule $F(g) = g(0)$. Actually, the $\delta$-function isn't a function at all; no function has the property that
$$ \int_{-\infty}^{\infty} \delta(x)\,g(x)\,dx = g(0) $$
for all smooth functions g. Rather, $\delta$ is a limit of functions like
$$ d_\epsilon(x) = \begin{cases} 1/(2\epsilon), & |x| \le \epsilon, \\ 0, & |x| > \epsilon, \end{cases} $$
as $\epsilon \to 0^+$.
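Here the box functions $d_\epsilon$ are just one convenient choice; a Gaussian of shrinking width would work equally well. To see why the limit behaves like $\delta$, note that integrating against $d_\epsilon$ simply averages g over $[-\epsilon, \epsilon]$:
$$ \int_{-\infty}^{\infty} d_\epsilon(x)\,g(x)\,dx = \frac{1}{2\epsilon} \int_{-\epsilon}^{\epsilon} g(x)\,dx \to g(0) \quad \text{as } \epsilon \to 0^+, $$
by the continuity of g.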
Normally, we would not consider f(x) = 1/x to be an integrable function, so ordinarily $\int_{-\infty}^{\infty} g(x)/x\,dx$ would not make sense. But we can consider it as a distributional derivative: $\ln|x|$ is integrable near zero, so it defines a distribution
$$ F_*(g) = \int_{-\infty}^{\infty} \ln|x|\,g(x)\,dx, $$
and since 1/x is the derivative of $\ln|x|$, we can find a suitable distribution for ``1/x'' by differentiating $F_*$ in the distributional sense:
$$ F(g) = -F_*(g'). $$
The only problem is for x around zero (the test function g takes care of large |x|). So we can approximate the integral $F_*(g')$ with a small interval around zero removed:
$$ F_*(g') = \lim_{\epsilon \to 0^+} \int_{|x| \ge \epsilon} \ln|x|\,g'(x)\,dx. $$
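Integrating by parts on each piece (a quick sketch; the boundary terms at $\pm\infty$ vanish because g does) shows where this is headed:
$$ \int_{|x| \ge \epsilon} \ln|x|\,g'(x)\,dx = \ln(\epsilon)\bigl(g(-\epsilon) - g(\epsilon)\bigr) - \int_{|x| \ge \epsilon} \frac{g(x)}{x}\,dx. $$
Since g is smooth, $g(-\epsilon) - g(\epsilon) = O(\epsilon)$, and $\epsilon \ln \epsilon \to 0$, so the boundary term disappears in the limit, leaving
$$ F(g) = -F_*(g') = \lim_{\epsilon \to 0^+} \int_{|x| \ge \epsilon} \frac{g(x)}{x}\,dx, $$
which is exactly the Cauchy principal value of $\int g(x)/x\,dx$.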