Gauge Institute Journal,
H. Vic Dannon
Infinitesimal Calculus of
Random Walk and Poisson
Processes
H. Vic Dannon
vic0@comcast.net
March, 2013
Abstract We set up the Infinitesimal Calculus of Random
Processes X (ζ, t ) , and apply it to the Random Walk B(ζ, t ) ,
and to the Poisson Process P(ζ, t ) .
Both Processes are Continuous, and have Derivative
Processes with Delta Function Variance.
The integral $\int_{t=a}^{t=b} f(t)\,dB(\zeta,t)$ of integrable $f(t)$, with respect to the Random Walk $B(\zeta,t)$, and the integral $\int_{t=a}^{t=b} f(t)\,dP(\zeta,t)$ of integrable $f(t)$, with respect to the Poisson Process $P(\zeta,t)$, are well-defined Random Variables.
Keywords: Infinitesimal, Infinite-Hyper-real, Hyper-real, Calculus, Limit, Continuity, Derivative, Integral, Delta Function, Random Variable, Random Process, Random Signal, Stochastic Process, Stochastic Calculus, Probability Distribution, Bernoulli Random Variables, Binomial Distribution, Gaussian, Normal, Expectation, Variance, Random Walk, Poisson Process
2000 Mathematics Subject Classification 26E35; 26E30; 26E15; 26E20; 26A06; 26A12; 03E10; 03E55; 03E17; 03H15; 46S20; 97I40; 97I30.
Contents
Introduction
1. Hyper-real Line
2. Hyper-real Function
3. Integral of a Hyper-real Function
4. Delta Function
5. Hyper-real Random Variable
6. Normal Distribution, and Delta Function
7. Hyper-real Random Signal X (ζ , t )
8. Continuity of X (ζ , t )
9. Derivative of X (ζ , t )
10. Random Walk B(ζ, t )
11. Random Walk is Continuous, has a Derivative with Delta
Function Variance, and E [B(ζ, t )] has unbounded
Variation.
12. $\int_{t=a}^{t=b} f(t)\,dB(\zeta,t)$
13. Poisson Process P(ζ, t )
14. Poisson Process is Continuous and has a Derivative with
Delta Function Variance
15. $\int_{t=a}^{t=b} f(t)\,dP(\zeta,t)$
References
Introduction
0.1 Infinitesimal Calculus
Recently we have shown that when the Real Line is represented as the infinite dimensional space of all the Cauchy sequences of rational numbers, the hyper-reals are spanned by the constant hyper-reals, a family of infinitesimal hyper-reals, and the associated family of infinite hyper-reals.
The infinitesimal hyper-reals are smaller than any real number, yet bigger than zero.
The reciprocals of the infinitesimal hyper-reals are the
infinite hyper-reals. They are greater than any real number,
yet strictly smaller than infinity.
A neighborhood of infinitesimals separates the zero hyper-real from the reals, and each real number is the center of an interval of hyper-reals, that includes no other real number.
The Hyper-reals are totally ordered, and are lined up on a
line, the hyper-real line.
A hyper-real function is a mapping from the hyper-real line
into the hyper-real line.
Infinitesimal Calculus is the Calculus of hyper-real functions.
Infinitesimal Calculus is far more effective than the ε, δ
Calculus, because being based on almost zero numbers, it
allows us to deal with their reciprocals, the almost infinite
numbers.
We have no use for infinity by itself,
but to
comprehend the effects of singularities, we have use for the
almost infinite.
Infinitesimals are a precise tool compared to the vague limit
concept, and the awkward ε, δ statements.
Random walks are made clearer with infinitesimals.
Poisson Process can be derived only in Infinitesimal
Calculus.
0.2 Random Processes
Probability Distributions are defined on Random Variables.
Random Variables assign numerical values to outcomes.
Thus, a Random Variable maps outcomes into the real line.
Random Variables that evolve in time are called Random
Processes, in Mechanics, or Random Signals, in Electricity.
Random Walk is the Random drift of a particle in fluid due
to collisions with fluid molecules.
Poisson Process models the Random arrival of radioactive
particles at a counter.
1. Hyper-real Line
The minimal domain and range, needed for the definition
and analysis of a hyper-real function, is the hyper-real line.
Each real number α can be represented by a Cauchy
sequence of rational numbers, (r1, r2 , r3 ,...) so that rn → α .
The constant sequence (α, α, α,...) is a constant hyper-real.
In [Dan2] we established that,
1. Any
totally ordered set of positive, monotonically
decreasing to zero sequences (ι1, ι2 , ι3 ,...) constitutes a
family of infinitesimal hyper-reals.
2. The infinitesimals are smaller than any real number,
yet strictly greater than zero.
3. Their reciprocals $\left( \frac{1}{\iota_1}, \frac{1}{\iota_2}, \frac{1}{\iota_3}, \ldots \right)$ are the infinite hyper-reals.
4. The infinite hyper-reals are greater than any real
number, yet strictly smaller than infinity.
5. The infinite hyper-reals with negative signs are
smaller than any real number, yet strictly greater than
−∞ .
6. The sum of a real number with an infinitesimal is a
non-constant hyper-real.
7. The Hyper-reals are the totality of constant hyper-reals, a family of infinitesimals, a family of infinitesimals with negative sign, a family of infinite hyper-reals, a family of infinite hyper-reals with negative sign, and non-constant hyper-reals.
8. The hyper-reals are totally ordered, and aligned along
a line: the Hyper-real Line.
9. That line includes the real numbers separated by the
non-constant hyper-reals. Each real number is the
center of an interval of hyper-reals, that includes no
other real number.
10. In particular, zero is separated from any positive real by the infinitesimals, and from any negative real by the infinitesimals with negative signs, $-dx$.
11. Zero is not an infinitesimal, because zero is not strictly greater than zero.
12. We do not add infinity to the hyper-real line.
13. The infinitesimals, the infinitesimals with negative signs, the infinite hyper-reals, and the infinite hyper-reals with negative signs are semi-groups with respect to addition. Neither set includes zero.
14. The hyper-real line is embedded in $\mathbb{R}^{\infty}$, and is not homeomorphic to the real line. There is no bi-continuous one-one mapping from the hyper-real line onto the real line.
15. In particular, there are no points on the real line that can be assigned uniquely to the infinitesimal hyper-reals, or to the infinite hyper-reals, or to the non-constant hyper-reals.
16. No neighbourhood of a hyper-real is homeomorphic to an $\mathbb{R}^{n}$ ball. Therefore, the hyper-real line is not a manifold.
17. The hyper-real line is totally ordered like a line, but it is not spanned by one element, and it is not one-dimensional.
2. Hyper-real Function
2.1 Definition of a hyper-real function
f (x ) is a hyper-real function, iff it is from the hyper-reals
into the hyper-reals.
This means that any number in the domain, or in the range
of a hyper-real f (x ) is either one of the following
real
real + infinitesimal
real – infinitesimal
infinitesimal
infinitesimal with negative sign
infinite hyper-real
infinite hyper-real with negative sign
Clearly,
2.2 Every function from the reals into the reals is a hyper-real function.
3. Integral of a Hyper-real Function
In [Dan3], we defined the integral of a Hyper-real Function.
Let f (x ) be a hyper-real function on the interval [a, b ] .
The interval may not be bounded.
f (x ) may take infinite hyper-real values, and need not be
bounded.
At each $a \le x \le b$, there is a rectangle with base $\left[ x - \tfrac{dx}{2}, x + \tfrac{dx}{2} \right]$, height $f(x)$, and area $f(x)dx$.
We form the Integration Sum of all the areas for the $x$'s that start at $x = a$, and end at $x = b$,
$$\sum_{x \in [a,b]} f(x)dx .$$
If for any infinitesimal dx , the Integration Sum has the
same hyper-real value, then f (x ) is integrable over the
interval [a, b ] .
Then, we call the Integration Sum the integral of $f(x)$ from $x = a$ to $x = b$, and denote it by
$$\int_{x=a}^{x=b} f(x)dx .$$
If the hyper-real is infinite, then it is the integral over $[a,b]$.
If the hyper-real is finite,
$$\int_{x=a}^{x=b} f(x)dx = \text{real part of the hyper-real} .$$
3.1 The countability of the Integration Sum
In [Dan1], we established the equality of all positive
infinities:
We proved that the number of the Natural Numbers, $\text{Card}\,\mathbb{N}$, equals the number of Real Numbers, $\text{Card}\,\mathbb{R} = 2^{\text{Card}\,\mathbb{N}}$, and we have
$$\text{Card}\,\mathbb{N} = (\text{Card}\,\mathbb{N})^2 = \ldots = 2^{\text{Card}\,\mathbb{N}} = 2^{2^{\text{Card}\,\mathbb{N}}} = \ldots \equiv \infty .$$
In particular, we demonstrated that the real numbers may
be well-ordered.
Consequently, there are countably many real numbers in the
interval [a, b ] , and the Integration Sum has countably many
terms.
While we do not sequence the real numbers in the interval,
the summation takes place over countably many f (x )dx .
The Lower Integral is the Integration Sum where $f(x)$ is replaced by its lowest value on each interval $\left[ x - \tfrac{dx}{2}, x + \tfrac{dx}{2} \right]$
3.2
$$\sum_{x \in [a,b]} \Big( \inf_{x - \frac{dx}{2} \le t \le x + \frac{dx}{2}} f(t) \Big)\, dx$$
The Upper Integral is the Integration Sum where $f(x)$ is replaced by its largest value on each interval $\left[ x - \tfrac{dx}{2}, x + \tfrac{dx}{2} \right]$
3.3
$$\sum_{x \in [a,b]} \Big( \sup_{x - \frac{dx}{2} \le t \le x + \frac{dx}{2}} f(t) \Big)\, dx$$
If the integral is a finite hyper-real, we have
3.4 A hyper-real function has a finite integral if and only if
its upper integral and its lower integral are finite, and differ
by an infinitesimal.
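For readers who want to experiment numerically, the following Python sketch imitates the Integration Sum with a small but finite dx in place of an infinitesimal. The function f, the interval, and the dx values are assumptions chosen for this illustration, not part of the paper.

```python
# Minimal sketch: an Integration Sum with a small finite dx standing in for
# the infinitesimal dx of Section 3.  f, [a,b], and the dx values are
# illustrative assumptions.

def integration_sum(f, a, b, dx):
    """Sum f(x)*dx over the partition x = a, a+dx, a+2dx, ... below b."""
    total = 0.0
    x = a
    while x < b:
        total += f(x) * dx
        x += dx
    return total

if __name__ == "__main__":
    f = lambda x: x ** 2
    for dx in (1e-2, 1e-3, 1e-4):
        # The exact integral of x^2 over [0,1] is 1/3; the sums approach it.
        print(dx, integration_sum(f, 0.0, 1.0, dx))
```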
4. Delta Function
In [Dan4], we defined the Delta Function, and established its
properties
1. The Delta Function is a hyper-real function defined from the hyper-real line into the set of two hyper-reals $\left\{ 0, \frac{1}{dx} \right\}$. The hyper-real $0$ is the sequence $0, 0, 0, \ldots$ The infinite hyper-real $\frac{1}{dx}$ depends on our choice of $dx$.
2. We will usually choose the family of infinitesimals that is spanned by the sequences $\frac{1}{n}, \frac{1}{n^2}, \frac{1}{n^3}, \ldots$ It is a semigroup with respect to vector addition, and includes all the scalar multiples of the generating sequences that are non-zero. That is, the family includes infinitesimals with negative sign. Therefore, $\frac{1}{dx}$ will mean the sequence $n$. Alternatively, we may choose the family spanned by the sequences $\frac{1}{2n}, \frac{1}{3n}, \frac{1}{4n}, \ldots$ Then, $\frac{1}{dx}$ will mean the sequence $2n$. Once we determined the basic infinitesimal $dx$, we will use it in the Infinite Riemann Sum that defines an Integral in Infinitesimal Calculus.
3. The Delta Function is strictly smaller than $\infty$.
4. We define, $\delta(x) \equiv \dfrac{1}{dx}\, \chi_{\left[ -\frac{dx}{2}, \frac{dx}{2} \right]}(x)$, where $\chi_{\left[ -\frac{dx}{2}, \frac{dx}{2} \right]}(x) = \begin{cases} 1, & x \in \left[ -\frac{dx}{2}, \frac{dx}{2} \right] \\ 0, & \text{otherwise} \end{cases}$.
5. Hence,
• for $x < 0$, $\delta(x) = 0$,
• at $x = -\frac{dx}{2}$, $\delta(x)$ jumps from $0$ to $\frac{1}{dx}$,
• for $x \in \left[ -\frac{dx}{2}, \frac{dx}{2} \right]$, $\delta(x) = \frac{1}{dx}$,
• at $x = 0$, $\delta(0) = \frac{1}{dx}$,
• at $x = \frac{dx}{2}$, $\delta(x)$ drops from $\frac{1}{dx}$ to $0$,
• for $x > 0$, $\delta(x) = 0$,
• $x\,\delta(x) = 0$.
6. If $dx = \frac{1}{n}$, $\delta(x) = \chi_{[-\frac{1}{2},\frac{1}{2}]}(x),\ 2\chi_{[-\frac{1}{4},\frac{1}{4}]}(x),\ 3\chi_{[-\frac{1}{6},\frac{1}{6}]}(x), \ldots$
7. If $dx = \frac{2}{n}$, $\delta(x) = \frac{1}{2\cosh^2 x},\ \frac{2}{2\cosh^2 2x},\ \frac{3}{2\cosh^2 3x}, \ldots$
8. If $dx = \frac{1}{n}$, $\delta(x) = e^{-x}\chi_{[0,\infty)},\ 2e^{-2x}\chi_{[0,\infty)},\ 3e^{-3x}\chi_{[0,\infty)}, \ldots$
9. $\int_{x=-\infty}^{x=\infty} \delta(x)dx = 1$.
10. $\delta(\xi - x) = \frac{1}{2\pi}\int_{k=-\infty}^{k=\infty} e^{-ik(\xi - x)}dk$.
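The rectangle representative of property 4 can be checked numerically. The sketch below is an illustration added here, with finite dx values standing in for infinitesimals: the peak 1/dx grows while the total area stays 1, as in property 9.

```python
# Rectangle representative of the Delta Function (property 4) with a finite dx.
# The dx values are illustrative assumptions.

def delta(x, dx):
    """delta(x) = 1/dx on [-dx/2, dx/2], and 0 elsewhere."""
    return 1.0 / dx if -dx / 2 <= x <= dx / 2 else 0.0

if __name__ == "__main__":
    for dx in (1e-1, 1e-2, 1e-3):
        # Riemann sum of delta over [-1,1] with step dx: stays near 1 (property 9),
        # while the peak delta(0) = 1/dx grows without bound.
        n = int(2.0 / dx)
        area = sum(delta(-1.0 + k * dx, dx) * dx for k in range(n))
        print(dx, delta(0.0, dx), round(area, 6))
```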
5. Hyper-real Random Variable
A Random Variable $X(\zeta)$ is a real-valued function that maps any event (=outcome) $\zeta$, in the Sample space $S$, into a real number $x$, in $\mathbb{R}$.
S includes the non-event φ , and X (φ) = 0 .
Example
A ball is drawn from a container that has 5 Red balls, and 4
Black balls.
The 2 possible outcomes,
ζ1 = B , ζ2 = R ,
constitute the sample space,
S = {ζ1, ζ2 } .
The number of Red balls is a Random Variable, X (ζ ) with
the values
X (ζ1 ) = X (B ) = 0 ,
X (ζ2 ) = X (R) = 1 . ,
5.1 Hyper-real $X(\zeta)$
$X(\zeta)$ is a Hyper-real Random Variable iff its values may include infinitesimals, and infinite hyper-reals.
5.2 Hyper-real Probability Distribution of X (ζ )
Let X (ζ ) be Hyper-real, and define,
$dF(x) = \Pr\left( x - \tfrac{1}{2}dx \le X(\zeta) \le x + \tfrac{1}{2}dx \right)$.
Then,
$$F(x) = \sum_{x = X(\zeta),\ \zeta \in S} dF(x)$$
is a Hyper-real Probability Distribution of $X(\zeta)$.
Example
If a ball is drawn from a container that has 5 Red balls, and
4 Black balls, and X (ζ ) is the number of Red balls,
$dF(0) = \Pr(X(\zeta) = 0) = \tfrac{4}{9}$,
$dF(1) = \Pr(X(\zeta) = 1) = \tfrac{5}{9}$.
5.3 Hyper-real Probability Density of X (ζ )
Let X (ζ ) be Hyper-real. If there is Hyper-real f (x ) so that
dF (x ) = f (x )dx ,
Then
$$f(x) = \frac{dF(x)}{dx}$$
is the Hyper-real Probability Density of $X(\zeta)$.
5.4 Expectation of Hyper-real X (ζ )
$$E[X(\zeta)] \equiv \sum_{x = X(\zeta),\ \zeta \in S} x\,dF(x)$$
is a Hyper-real number.
If $dF(x) = f(x)dx$,
$$E[X(\zeta)] = \sum_{x = X(\zeta),\ \zeta \in S} x\,f(x)dx .$$
Example
If a ball is drawn from a container that has 5 Red balls, and
4 Black balls, and X (ζ ) is the number of Red balls,
$$E[X(\zeta)] = \sum_{x = X(\zeta),\ \zeta \in S} x\,dF(x) = 0 \cdot \underbrace{dF(0)}_{4/9} + 1 \cdot \underbrace{dF(1)}_{5/9} = \tfrac{5}{9} .$$
5.5 2nd Moment of Hyper-real $X(\zeta)$
$$E[X^2(\zeta)] \equiv \sum_{x = X(\zeta),\ \zeta \in S} x^2\,dF(x)$$
is a Hyper-real number.
Example
If a ball is drawn from a container that has 5 Red balls, and
4 Black balls, and X (ζ ) is the number of Red balls,
$$E[X^2(\zeta)] = \sum_{x = X(\zeta),\ \zeta \in S} x^2\,dF(x) = 0^2 \cdot \underbrace{dF(0)}_{4/9} + 1^2 \cdot \underbrace{dF(1)}_{5/9} = \tfrac{5}{9} .$$
5.6 Variance of Hyper-real Random Variable $X(\zeta)$
Var[X (ζ )] ≡ E [X 2 (ζ )] − (E [X (ζ )])2
is a Hyper-real number.
Example
If a ball is drawn from a container that has 5 Red balls, and
4 Black balls, and X (ζ ) is the number of Red balls,
$$\mathrm{Var}[X(\zeta)] = E[X^2(\zeta)] - (E[X(\zeta)])^2 = \tfrac{5}{9} - \left( \tfrac{5}{9} \right)^2 = \tfrac{20}{81} .$$
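A quick numerical check of the worked examples in 5.4, 5.5, and 5.6 (the probabilities 4/9 and 5/9 come from the text; the Python code is only an added illustration):

```python
# Check of the examples in 5.4-5.6: X = number of Red balls in one draw,
# with Pr(X=0) = 4/9 and Pr(X=1) = 5/9 as in the text.
from fractions import Fraction

dF = {0: Fraction(4, 9), 1: Fraction(5, 9)}

E   = sum(x * p for x, p in dF.items())       # expectation: 5/9
E2  = sum(x * x * p for x, p in dF.items())   # second moment: 5/9
Var = E2 - E ** 2                             # variance: 20/81

print(E, E2, Var)
```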
6. Normal Distribution and Delta Function
A Normal Random Variable $N(\zeta)$, with $E[N(\zeta)] = \mu$, and $\mathrm{Var}[N(\zeta)] = \sigma^2$, has a probability density function
$$f(x) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{(x-\mu)^2}{2\sigma^2}} .$$
The Variance of a Hyper-real $N(\zeta)$ may be an infinitesimal, or an infinite hyper-real.
6.1 Infinite Hyper-real Variance
$$\sigma = \frac{1}{dx} \ \Rightarrow\ f(x) = \text{infinitesimal}$$
Proof:
$$f(x) = \frac{1}{\sqrt{2\pi}}(dx)\, e^{-\frac{1}{2}(x-\mu)^2 (dx)^2} .$$
$x$ = finite hyper-real. Then,
$(x - \mu)$ is a finite hyper-real,
$(x - \mu)dx$ is at most infinitesimal (it vanishes at $x = \mu$),
$\frac{1}{2}(x - \mu)^2 (dx)^2$ is at most infinitesimal,
$$e^{-\frac{1}{2}(x-\mu)^2(dx)^2} \approx 1 - \underbrace{\tfrac{1}{2}(x-\mu)^2(dx)^2}_{\text{infinitesimal}} \approx 1 ,$$
$$f(x) \approx \frac{1}{\sqrt{2\pi}}(dx) = \text{infinitesimal} .$$
$x$ = infinite hyper-real. Then, $x = \alpha \frac{1}{dx}$, where $\alpha$ is a finite hyper-real,
$$\tfrac{1}{2}(x - \mu)^2 (dx)^2 = \tfrac{1}{2}\left( \alpha \tfrac{1}{dx} - \mu \right)^2 (dx)^2 = \tfrac{1}{2}\alpha^2 - \alpha\mu(dx) + \tfrac{1}{2}\mu^2(dx)^2 \approx \tfrac{1}{2}\alpha^2 ,$$
$$e^{-\frac{1}{2}(x-\mu)^2(dx)^2} \approx e^{-\frac{1}{2}\alpha^2} ,$$
$$f(x) \approx \frac{1}{\sqrt{2\pi}}(dx)\, e^{-\frac{1}{2}\alpha^2} = \text{infinitesimal} .$$
6.2 Infinitesimal Variance
σ = dx ⇒ f (x ) = Delta Function
Proof:
We’ll show that
$$f(x) = \frac{1}{\sqrt{2\pi}\,dx}\, e^{-\frac{1}{2}\left( \frac{x-\mu}{dx} \right)^2}$$
is a Delta Function.
$x = \mu$. Then,
$$e^{-\frac{1}{2}\left( \frac{x-\mu}{dx} \right)^2} = e^0 = 1 , \qquad f(\mu) = \frac{1}{\sqrt{2\pi}\,dx} .$$
That is, at $x = \mu$, the density function peaks to $\frac{1}{\sqrt{2\pi}\,dx}$.
$x \ne \mu$. Substituting
$$e^{-\frac{1}{2}\left( \frac{x-\mu}{dx} \right)^2} = \frac{1}{1 + \frac{1}{2}\left( \frac{x-\mu}{dx} \right)^2 + \frac{1}{2!}\frac{1}{2^2}\left( \frac{x-\mu}{dx} \right)^4 + \ldots} ,$$
$$f(x) = \frac{1}{\sqrt{2\pi}\,dx}\, \frac{1}{1 + \frac{1}{2}\left( \frac{x-\mu}{dx} \right)^2 + \frac{1}{2!}\frac{1}{2^2}\left( \frac{x-\mu}{dx} \right)^4 + \ldots}$$
$$= \frac{1}{\sqrt{2\pi}}\, \frac{1}{dx + \frac{1}{2}(x-\mu)^2 \frac{1}{dx} + \frac{1}{2!}\frac{1}{2^2}(x-\mu)^4 \frac{1}{(dx)^3} + \ldots}$$
$$\approx \frac{1}{\sqrt{2\pi}}\, \frac{1}{\frac{1}{2}(x-\mu)^2 \frac{1}{dx} + \frac{1}{2!}\frac{1}{2^2}(x-\mu)^4 \frac{1}{(dx)^3} + \ldots}$$
$$= \frac{1}{\sqrt{2\pi}}\, \frac{1}{\text{infinite hyper-real}} = \text{infinitesimal} .$$
Finally, for a normal density function,
$$\int_{x=-\infty}^{x=\infty} f(x)dx = \int_{x=-\infty}^{x=\infty} \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\frac{1}{2}\left( \frac{x-\mu}{\sigma} \right)^2} dx = 1 .$$
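Numerically, the delta-like behaviour of 6.2 can be seen with a small finite σ standing in for dx: the peak grows like 1/(√(2π)σ) while the integral stays 1. The σ values and the integration grid below are assumptions for this illustration.

```python
# A normal density with a small sigma standing in for dx: the peak grows like
# 1/(sqrt(2*pi)*sigma) while the total integral stays 1.  The sigma values and
# the integration grid are illustrative assumptions.
import math

def normal_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (math.sqrt(2 * math.pi) * sigma)

mu = 0.0
for sigma in (1e-1, 1e-2, 1e-3):
    step = sigma / 100.0
    xs = [mu - 10 * sigma + k * step for k in range(2001)]   # covers mu +/- 10 sigma
    area = sum(normal_pdf(x, mu, sigma) * step for x in xs)
    print(sigma, normal_pdf(mu, mu, sigma), round(area, 4))
```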
7. Hyper-real Random Signal
A Random Signal (=Random Process) is a Random
Variable that depends also on the time t :
X (ζ, t ) .
Then, the outcome of a Black ball,
ζ =B
is identified with the outcome of drawing one Black ball, and
one Red ball successively,
BR , and RB ,
and with the drawing of one Black ball, and two Red balls
successively,
BRR , RBR , RRB ,
etc.
For a given outcome ζ0 ,
$X(\zeta_0, t) = x_{\zeta_0}(t)$,
is a function of t , a Sample Function, or Process Realization.
Example
At time t = 1 , a ball is drawn from a container that has 5
Red balls, and 4 Black balls, and X (ζ,1) is the number of
Red balls at t = 1 .
At time t = 2 , another ball is drawn from the container that
now has 8 Red and Black balls, and X (ζ, 2) is the number of
Red balls at t = 2 .
At time t = 3 , another ball is drawn from the container that
now has 7 Red and Black balls, and X (ζ, 3) is the number of
Red balls at t = 3 .
The outcome of no Red balls, appears once at t = 1 , once at
t = 2 , and once at t = 3 :
X (noR,1) = X (noR, 2) = X (noR, 3) = 1 .
The outcome of one Red ball, appears once at t = 1 , twice at
t = 2 , and 3 times at t = 3 ,
X (1R,1) = 1 ,
X (1R, 2) = 2 ,
X (1R, 3) = 3 .
The outcome of two Red balls, appears once at t = 2 , and
four times at t = 3 ,
X (2R,1) = 0 ,
X (2R, 2) = 1 ,
X (2R, 3) = 4 .
The outcome of three Red balls, appears once at t = 3 ,
X (3R,1) = 0 ,
X (3R, 2) = 0 ,
X (3R, 3) = 1 .
The sample space of the process is
{0R,1R, 2R, 3R} . ,
7.1 Hyper-real X (ζ , t )
A Random Signal is Hyper-real iff the time variable t , and
the values of X (ζ, t ) may include infinitesimals, and infinite
hyper-reals.
7.2 Hyper-real Probability Distribution of X (ζ , t )
Let X (ζ , t ) be Hyper-real, fix t = t0 , and define,
$dF(x, t_0) = \Pr\left( x - \tfrac{1}{2}dx \le X(\zeta, t_0) < x + \tfrac{1}{2}dx \right)$.
Then,
$$F(x, t_0) = \sum_{x = X(\zeta, t_0),\ \zeta \in S} dF(x, t_0)$$
is a Hyper-real Probability Distribution of $X(\zeta, t_0)$.
Example
At time t = 1 , a ball is drawn from a container that has 5
Red balls, and 4 Black balls, and X (ζ,1) is the number of Red
balls at t = 1 .
At time t = 2 , another ball is drawn from the container that
now has 8 Red and Black balls, and X (ζ, 2) is the number of
Red balls at t = 2 .
$$dF(0,2) = \Pr(X(\zeta,2) = 0) = \tfrac{4}{9} \cdot \tfrac{3}{8} = \tfrac{1}{6} ,$$
$$dF(1,2) = \Pr(X(\zeta,2) = 1) = \tfrac{4}{9} \cdot \tfrac{5}{8} + \tfrac{5}{9} \cdot \tfrac{4}{8} = \tfrac{5}{9} ,$$
$$dF(2,2) = \Pr(X(\zeta,2) = 2) = \tfrac{5}{9} \cdot \tfrac{4}{8} = \tfrac{5}{18} .$$
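The three probabilities can be confirmed by enumerating the ordered two-ball draws (an added illustration; the 5 Red and 4 Black balls are from the text):

```python
# Enumerate the ordered draws of two balls (without replacement) from
# 5 Red and 4 Black, and recover dF(k,2) for k = 0, 1, 2.
from fractions import Fraction
from itertools import permutations

balls = ["R"] * 5 + ["B"] * 4
counts = {0: 0, 1: 0, 2: 0}
total = 0
for pair in permutations(range(9), 2):            # ordered pairs of distinct balls
    reds = sum(1 for i in pair if balls[i] == "R")
    counts[reds] += 1
    total += 1

for k in (0, 1, 2):
    print(k, Fraction(counts[k], total))          # 1/6, 5/9, 5/18 as in the text
```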
7.3 Hyper-real Probability Density of X (ζ, t )
Let X (ζ, t ) be Hyper-real, and fix t = t0 . If there is Hyper-real f (x , t0 ) so that
dF (x , t0 ) = f (x , t0 )dx ,
Then
$$f(x, t_0) = \frac{dF(x, t_0)}{dx}$$
is the Hyper-real Probability Density of X (ζ, t0 ) .
7.4 Expectation of Hyper-real X (ζ , t )
Let X (ζ , t ) be Hyper-real, fix t = t0 , and define
$$E[X(\zeta, t_0)] \equiv \sum_{x = X(\zeta, t_0),\ \zeta \in S} x\,dF(x, t_0) .$$
If $dF(x, t_0) = f(x, t_0)dx$,
$$E[X(\zeta, t_0)] = \sum_{x = X(\zeta, t_0),\ \zeta \in S} x\,f(x, t_0)dx .$$
Example
$$E[X(\zeta, 2)] = \sum_{x = X(\zeta,2),\ \zeta \in S} x\,dF(x, 2) = 0 \cdot \underbrace{dF(0,2)}_{1/6} + 1 \cdot \underbrace{dF(1,2)}_{5/9} + 2 \cdot \underbrace{dF(2,2)}_{5/18} = \tfrac{10}{9} .$$
7.5 2nd Moment of Hyper-real $X(\zeta, t)$
$$E[X^2(\zeta, t)] \equiv \sum_{x = X(\zeta,t),\ \zeta \in S} x^2\,dF(x, t) .$$
Example
$$E[X^2(\zeta, 2)] = \sum_{x = X(\zeta,2),\ \zeta \in S} x^2\,dF(x, 2) = 0^2 \cdot \underbrace{dF(0,2)}_{1/6} + 1^2 \cdot \underbrace{dF(1,2)}_{5/9} + 2^2 \cdot \underbrace{dF(2,2)}_{5/18} = \tfrac{5}{3} .$$
7.6 Variance of Hyper-real Random Variable
Var[X (ζ, t )] ≡ E [X 2 (ζ, t )] − (E [X (ζ, t )])2 .
Example
$$\mathrm{Var}[X(\zeta, 2)] = E[X^2(\zeta, 2)] - (E[X(\zeta, 2)])^2 = \tfrac{5}{3} - \left( \tfrac{10}{9} \right)^2 = \tfrac{35}{81} .$$
8. Continuity of $X(\zeta, t)$
8.1 Hyper-real X (ζ , t ) is continuous at t = t0 iff for any dt ,
$$E\{[X(\zeta, t_0 + dt) - X(\zeta, t_0)]^2\} = \text{infinitesimal} ,$$
$$\Leftrightarrow \sum_{x = X(\zeta, t_0),\ \zeta \in S} [X(\zeta, t_0 + dt) - X(\zeta, t_0)]^2\,dF(x, t_0) = \text{infinitesimal} .$$
If $dF(x, t_0) = f(x, t_0)dx$,
$$\Leftrightarrow \sum_{x = X(\zeta, t_0),\ \zeta \in S} [X(\zeta, t_0 + dt) - X(\zeta, t_0)]^2\,f(x, t_0)dx = \text{infinitesimal} .$$
8.2 X (ζ, t ) is continuous at t = t0 ⇒ E [X (ζ , t0 )] is continuous
Proof:
0 ≤ E [{[X (ζ, t0 + dt ) − X (ζ, t0 )] − E [X (ζ , t0 + dt ) − X (ζ, t0 )]}2 ]
= E {[X (ζ, t0 + dt ) − X (ζ, t0 )]2 }
−2E {[X (ζ, t0 + dt ) − X (ζ , t0 )]E [X (ζ, t0 + dt ) − X (ζ , t0 )]}
+{E [X (ζ , t0 + dt ) − X (ζ , t0 )]}2
= E {[X (ζ , t0 + dt ) − X (ζ, t0 )]2 } − {E [X (ζ , t0 + dt ) − X (ζ, t0 )]}2
Therefore,
$$\underbrace{\{E[X(\zeta, t_0 + dt) - X(\zeta, t_0)]\}^2}_{\ge 0} \le \underbrace{E\{[X(\zeta, t_0 + dt) - X(\zeta, t_0)]^2\}}_{\text{infinitesimal}}$$
Hence,
{E [X (ζ , t0 + dt ) − X (ζ , t0 )]}2 = infinitesimal ,
E [X (ζ , t0 + dt ) − X (ζ , t0 )] = infinitesimal .
9. Derivative of $X(\zeta, t)$
9.1 Hyper-real X (ζ , t ) has derivative with respect to t at
t = t0 iff there is a Random Signal X '(ζ, t ) = ∂t X (ζ, t ) , so
that for any dt ,
$$E\left[ \left[ \frac{X(\zeta, t_0 + dt) - X(\zeta, t_0)}{dt} - X'(\zeta, t_0) \right]^2 \right] = \text{infinitesimal} ,$$
$$\Leftrightarrow \sum_{x = X(\zeta, t_0),\ \zeta \in S} \left[ \frac{x(\zeta, t_0 + dt) - x(\zeta, t_0)}{dt} - x'(\zeta, t_0) \right]^2 dF(x, t_0) = \text{infinitesimal} .$$
If $dF(x, t_0) = f(x, t_0)dx$,
$$\Leftrightarrow \sum_{x = X(\zeta, t_0),\ \zeta \in S} \left[ \frac{x(\zeta, t_0 + dt) - x(\zeta, t_0)}{dt} - x'(\zeta, t_0) \right]^2 f(x, t_0)dx = \text{infinitesimal} .$$
10. Random Walk
The Random Walk of small particles in fluid is named Brownian Motion after Brown, who first observed it. It models other processes, such as the fluctuations of a stock price.
In a volume of fluid, the path of a particle is in any direction in the volume, and of variable size.
10.1 The Bernoulli Random Variables of the Walk
We restrict the Walk here to the line, in uniform
infinitesimal size steps dx :
To the left, with probability $p = \tfrac{1}{2}$, or to the right, with probability $q = \tfrac{1}{2}$.
At fixed time $t$, after $N$ infinitesimal time intervals $dt$, where $N = \frac{t}{dt}$ is a fixed infinite hyper-real, the particle has made
$K$ infinitesimal steps of size $dx$ to the right,
and
$L$ infinitesimal steps of size $dx$ to the left,
and is at the point
$$x = \underbrace{(K - L)}_{M}\, dx = M\,dx .$$
$K, L, M$ are infinite hyper-reals.
At the $i$th step we define the Bernoulli Random Variable,
$B_i(\text{right step}) = dx$, $\quad \zeta_1 = $ right step,
$B_i(\text{left step}) = -dx$, $\quad \zeta_2 = $ left step,
where $i = 1, 2, \ldots, N$.
$$\Pr(B_i = dx) = p = \tfrac{1}{2} , \qquad \Pr(B_i = -dx) = q = \tfrac{1}{2} ,$$
$$E[B_i] = dx \cdot \tfrac{1}{2} + (-dx) \cdot \tfrac{1}{2} = 0 ,$$
$$E[B_i^2] = (dx)^2 \cdot \tfrac{1}{2} + (-dx)^2 \cdot \tfrac{1}{2} = (dx)^2 ,$$
$$\mathrm{Var}[B_i] = \underbrace{E[B_i^2]}_{(dx)^2} - \underbrace{(E[B_i])^2}_{0} = (dx)^2 .$$
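As an added finite illustration of 10.1 (with a small finite dt, and dx fixed by the relation (dx)² = 2D(dt) introduced in 10.3 below; D, t, dt, and the sample count are assumed values), the sample mean and variance of the simulated walk match E[B(ζ,t)] = 0 and N(dx)² = 2Dt:

```python
# Finite simulation of the walk B(zeta,t) = B_1 + ... + B_N with steps +/- dx,
# p = q = 1/2, and (dx)^2 = 2*D*dt.  D, t, dt, and the sample count are assumed.
import random

D, t, dt = 0.5, 1.0, 1e-3
dx = (2 * D * dt) ** 0.5
N = int(t / dt)

def walk_at_t():
    return sum(random.choice((dx, -dx)) for _ in range(N))

samples = [walk_at_t() for _ in range(2000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(mean, var)    # mean near 0, variance near N*(dx)^2 = 2*D*t = 1.0
```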
10.2 The Binomial Distribution of the Walk
B(ζ, t ) = B1 + B2 + ... + BN
is a Random Process with
E [B(ζ, t )] = 0 ,
Var[B(ζ, t )] = N (dx )2 ,
distributed Binomially
$$\Pr\left( x - \tfrac{1}{2}dx \le B(\zeta, t) \le x + \tfrac{1}{2}dx \right) = \binom{N}{\frac{N+M}{2}} \frac{1}{2^N} .$$
Proof: Since the $B_i$ are independent,
$$E[B(\zeta, t)] = \underbrace{E[B_1]}_{0} + \ldots + \underbrace{E[B_N]}_{0} = 0 ,$$
$$\mathrm{Var}[B(\zeta, t)] = \underbrace{\mathrm{Var}[B_1]}_{(dx)^2} + \ldots + \underbrace{\mathrm{Var}[B_N]}_{(dx)^2} = N(dx)^2 .$$
$B(\zeta, t)$ has a Binomial distribution,
$$\Pr\left( x - \tfrac{1}{2}dx \le B(\zeta, t) \le x + \tfrac{1}{2}dx \right) = \binom{N}{K} p^K q^{N-K} = \binom{N}{K} \left( \tfrac{1}{2} \right)^K \left( \tfrac{1}{2} \right)^{N-K} = \binom{N}{K} \frac{1}{2^N} .$$
From $N = K + L$, $M = K - L$, we have
$$K = \frac{N+M}{2} , \qquad L = \frac{N-M}{2} .$$
Thus,
$$\Pr\left( x - \tfrac{1}{2}dx \le B(\zeta, t) \le x + \tfrac{1}{2}dx \right) = \binom{N}{\frac{N+M}{2}} \frac{1}{2^N} .$$
10.3 The Gaussian Distribution of the Walk
If (dx )2 = 2D(dt ) , where the Drift Coefficient D is a constant
Then, the Binomial distribution of B(ζ, t ) is infinitesimally
close to a Gaussian distribution of a Random Signal with
$$\mu = 0 , \qquad \sigma = \sqrt{t\,2D} = \sqrt{N}\,dx ,$$
$$f(x, t) \approx \frac{1}{\sqrt{2\pi}\,\sqrt{t\,2D}}\, e^{-\frac{1}{2}\frac{x^2}{t\,2D}} .$$
Proof:
$$\underbrace{\Pr\left( x - \tfrac{1}{2}dx \le B(\zeta, t) \le x + \tfrac{1}{2}dx \right)}_{dF(x,t)} = \frac{N!}{\left( \frac{N+M}{2} \right)!\,\left( \frac{N-M}{2} \right)!}\, \frac{1}{2^N} .$$
Substituting $N! \approx \sqrt{2\pi N}\, N^N e^{-N}$ from Stirling's Formula for infinite hyper-real $N$,
$$dF(x,t) \approx \frac{\sqrt{2\pi N}\, N^N e^{-N}}{\sqrt{2\pi \frac{N+M}{2}}\left( \frac{N+M}{2} \right)^{\frac{N+M}{2}} e^{-\frac{N+M}{2}}\ \sqrt{2\pi \frac{N-M}{2}}\left( \frac{N-M}{2} \right)^{\frac{N-M}{2}} e^{-\frac{N-M}{2}}}\ \frac{1}{2^N}$$
$$= \sqrt{\frac{2}{\pi N}}\ \frac{1}{\left( 1 + \frac{M}{N} \right)^{\frac{N+M+1}{2}} \left( 1 - \frac{M}{N} \right)^{\frac{N-M+1}{2}}} .$$
Then, up to an infinitesimal,
$$\log\left[ dF(x,t) \right] \approx \log\sqrt{\frac{2}{\pi N}} - \frac{N+M+1}{2}\log\left( 1 + \frac{M}{N} \right) - \frac{N-M+1}{2}\log\left( 1 - \frac{M}{N} \right) .$$
Since $0 < \frac{M}{N} < 1$,
$$\log\left( 1 + \frac{M}{N} \right) \approx \frac{M}{N} - \frac{1}{2}\frac{M^2}{N^2} , \qquad \log\left( 1 - \frac{M}{N} \right) \approx -\frac{M}{N} - \frac{1}{2}\frac{M^2}{N^2} ,$$
$$\log\left[ dF(x,t) \right] \approx \log\sqrt{\frac{2}{\pi N}} - \frac{N+M+1}{2}\left( \frac{M}{N} - \frac{1}{2}\frac{M^2}{N^2} \right) + \frac{N-M+1}{2}\left( \frac{M}{N} + \frac{1}{2}\frac{M^2}{N^2} \right)$$
$$= \log\sqrt{\frac{2}{\pi N}} - \frac{M^2}{2N}\underbrace{\left( 1 - \frac{1}{N} \right)}_{\approx 1} = \log\left[ \sqrt{\frac{2}{\pi N}}\, e^{-\frac{1}{2}\frac{M^2}{N}} \right] .$$
This would give
$$dF(x,t) \approx 2\, \frac{1}{\sqrt{2\pi}}\frac{1}{\sqrt{N}}\, e^{-\frac{1}{2}\frac{M^2}{N}} ,$$
but accounting for negative $M$, and $x$, we have
$$dF(x,t) \approx \frac{1}{\sqrt{2\pi}\,\sqrt{N}}\, e^{-\frac{1}{2}\frac{M^2}{N}} = \frac{1}{\sqrt{2\pi}}\sqrt{\frac{dt}{t}}\, e^{-\frac{1}{2}\frac{dt}{(dx)^2}\frac{x^2}{t}} .$$
Thus, we need to assume that (dx )2 , and dt are proportional,
(dx )2 = 2D(dt ) ,
where the Drift Coefficient D is a constant of the Walk.
Then,
$$dF(x,t) \approx \frac{1}{\sqrt{2\pi}\,\sqrt{t\,2D}}\, e^{-\frac{1}{2}\frac{x^2}{t\,2D}}\, dx .$$
Hence, the probability density of the Walk is
$$f(x,t) = \frac{dF(x,t)}{dx} \approx \frac{1}{\sqrt{2\pi}\,\sqrt{t\,2D}}\, e^{-\frac{1}{2}\frac{x^2}{t\,2D}} ,$$
with
$$\mu = 0 , \qquad \sigma = \sqrt{2tD} = \sqrt{N}\,dx .$$
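A finite-N comparison of the Binomial probabilities of 10.2 with the Gaussian density of 10.3 (N, D, t, and the M values below are assumptions for this illustration):

```python
# Compare the Binomial probabilities of the walk position M*dx (10.2) with the
# Gaussian density of 10.3, for a finite N.  N, D, t, and the M values are assumed.
import math

D, t = 0.5, 1.0
N = 400
dt = t / N
dx = math.sqrt(2 * D * dt)
sigma = math.sqrt(2 * D * t)

for M in (0, 10, 20, 40):                     # M must have the parity of N
    K = (N + M) // 2
    x = M * dx
    binom = math.comb(N, K) / 2 ** N
    gauss = math.exp(-0.5 * (x / sigma) ** 2) / (math.sqrt(2 * math.pi) * sigma)
    # reachable positions are 2*dx apart, so compare with density * 2*dx
    print(M, binom, gauss * 2 * dx)
```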
10.4 $f(x,t)$ solves the parabolic wave equation $\partial_t f = D\,\partial_x^2 f$.
Proof: By substitution.
10.5 Increments of Random Walk
If (dx )2 = 2D(dt ) ,
Then
1) For any τ > 0 , the distribution of B(ζ, t + τ ) − B(ζ, t ) is
infinitesimally close to a Gaussian distribution that has
$$\mu = 0 , \qquad \sigma^2 = \tau\,2D ,$$
and depends only on τ (Stationary Process).
2) For fixed t , and any dt , the Random Variables
B(ζ, t ) − B(ζ, t − dt ) ,
B(ζ, t − dt ) − B(ζ, t − 2dt ) ,
…………………………….,
B(ζ, dt ) − B(ζ, 0) ,
are independent, random variables.
Proof:
1) Let $T = \frac{\tau}{dt}$. Then, as in 10.2, the Binomial distribution of
B(ζ, t + τ ) − B(ζ, t ) = BN +1 + BN +2 + ... + BN +T ,
is infinitesimally close to a Gaussian distribution with
μ = 0 , and σ 2 = τ 2D , that depends only on τ .
2)
B(ζ, t ) − B(ζ, t − dt )
is precisely one Bernoulli Random Variable that is
statistically independent of the precisely one Bernoulli
Random Variable that equals
B(ζ, t − dt ) − B(ζ, t − 2dt )
11. Random Walk is Continuous, has a Derivative with Delta Function Variance, and $E[B(\zeta, t)]$ has unbounded Variation
11.1 $(dx)^2 = (2D)dt \ \Rightarrow\ $ Random Walk is Continuous
Proof:
$$E[\{B(\zeta, t+dt) - B(\zeta, t)\}^2] = \mathrm{Var}[\underbrace{B(\zeta, t+dt) - B(\zeta, t)}_{B_i}] + (E[\underbrace{B(\zeta, t+dt) - B(\zeta, t)}_{B_i}])^2 ,$$
where $B_i$ is a Bernoulli Random Variable,
$$= \underbrace{\mathrm{Var}[B_i]}_{(dx)^2 = (2D)dt} + \underbrace{(E[B_i])^2}_{0} = (2D)dt .$$
11.2 If $(dx)^2 = (2D)dt$,
then the Derivative of the Random Walk is
$$\dot{B} = \frac{1}{dt} B_i ,$$
where (1) $B_i = B(\zeta, t_0 + dt) - B(\zeta, t_0)$ is a Bernoulli Random Variable,
(2) $E[\dot{B}] = 0$,
(3) $\mathrm{Var}[\dot{B}] = 2D\,\delta(t_0)$.
Proof:
(1) For each $t = t_0$, we need to find a Random Signal $\dot{B}(\zeta, t_0)$, so that for any $dt$,
$$E\left[ \left[ \frac{B(\zeta, t_0 + dt) - B(\zeta, t_0)}{dt} - \dot{B}(\zeta, t_0) \right]^2 \right] = \text{infinitesimal} .$$
Since $B(\zeta, t_0 + dt) - B(\zeta, t_0)$ is a Bernoulli Random Variable $B_i$,
$$E\left[ \left\{ \frac{B(\zeta, t_0 + dt) - B(\zeta, t_0)}{dt} - \dot{B}(\zeta, t_0) \right\}^2 \right] = E\left[ \left\{ \frac{B_i}{dt} - \dot{B} \right\}^2 \right] .$$
Therefore, at time $t = t_0$, the Random Variable
$$\frac{1}{dt} B_i$$
is the derivative of the Random Walk $B(\zeta, t_0)$.
(2) $E[\dot{B}] = \frac{1}{dt}\, \underbrace{E[B_i]}_{0} = 0$.
(3) $\mathrm{Var}[\dot{B}] = E[\dot{B}^2] - \underbrace{(E[\dot{B}])^2}_{0} = \frac{1}{(dt)^2}\, \underbrace{E[B_i^2]}_{(dx)^2} = \underbrace{\frac{(dx)^2}{dt}}_{2D}\, \frac{1}{dt} = (2D)\,\delta(t_0)$.
11.3 $E[B(\zeta, t)]$ has unbounded Variation in $[a, b]$
Proof:
$$2D(b-a) = \underbrace{(2D)dt}_{(dx)^2} + \underbrace{(2D)dt}_{(dx)^2} + \ldots + \underbrace{(2D)dt}_{(dx)^2}$$
$$= E\left[ \{B(\zeta, b) - B(\zeta, b - dt)\}^2 \right] + \ldots + E\left[ \{B(\zeta, a + dt) - B(\zeta, a)\}^2 \right]$$
$$\le \underbrace{\max_{a \le t \le b} |B(\zeta, t + dt) - B(\zeta, t)|}_{\text{infinitesimal}}\, E\left[ |B(\zeta, b) - B(\zeta, b - dt)| \right] + \ldots + \underbrace{\max_{a \le t \le b} |B(\zeta, t + dt) - B(\zeta, t)|}_{\text{infinitesimal}}\, E\left[ |B(\zeta, a + dt) - B(\zeta, a)| \right]$$
$$= \text{infinitesimal}\left\{ E|B(\zeta, b) - B(\zeta, b - dt)| + \ldots + E|B(\zeta, a + dt) - B(\zeta, a)| \right\}$$
$$= \text{infinitesimal}\left\{ E\left[ |B(\zeta, b) - B(\zeta, b - dt)| + \ldots + |B(\zeta, a + dt) - B(\zeta, a)| \right] \right\} ,$$
since the Bernoulli Random Variables are independent.
Therefore,
$$E\left[ |B(\zeta, b) - B(\zeta, b - dt)| + \ldots + |B(\zeta, a + dt) - B(\zeta, a)| \right] = \frac{(2D)(b-a)}{\text{infinitesimal}}$$
is an infinite hyper-real, and $E[B(\zeta, t)]$ has unbounded variation in $[a, b]$.
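A finite illustration of 11.3: since every increment of the walk has magnitude dx, the summed absolute increments over [0,1] equal (dx/dt)(b−a) = √(2D/dt)(b−a), which grows without bound as dt shrinks. The D and dt values below are assumptions.

```python
# Summed absolute increments of the walk over [0,1]: every increment has
# magnitude dx, so the sum is (1/dt)*dx = sqrt(2*D/dt), unbounded as dt -> 0.
# D and the dt values are assumed.
import math
import random

D = 0.5
for dt in (1e-2, 1e-3, 1e-4):
    dx = math.sqrt(2 * D * dt)
    steps = [random.choice((dx, -dx)) for _ in range(int(1.0 / dt))]
    total_variation = sum(abs(s) for s in steps)
    print(dt, total_variation, math.sqrt(2 * D / dt))
```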
12. $\int_{t=a}^{t=b} f(t)\,dB(\zeta, t)$
While E [B(ζ, t )] has unbounded Variation in [a, b ] , integration
with respect to B(ζ, t ) is possible.
Let f (t ) be a hyper-real function on the bounded time
interval [a, b ] . f (t ) need not be bounded.
At each a ≤ t ≤ b , there is a Bernoulli Random Variable
$dB(\zeta, t) = B(\zeta, t + dt) - B(\zeta, t) = B_i(\zeta, t) = \dot{B}(\zeta, t)\,dt .$
We form the Integration Sum
$$\sum_{t=a}^{t=b} f(t)\,dB(\zeta, t) = \sum_{t=a}^{t=b} f(t)\,B_i(\zeta, t) = \sum_{t=a}^{t=b} f(t)\,\dot{B}(\zeta, t)\,dt .$$
For any dt ,
(1) the First Moment of the Integration Sum is
$$E\left[ \sum_{t=a}^{t=b} f(t)\,\dot{B}(\zeta, t)\,dt \right] = \sum_{t=a}^{t=b} f(t)\, \underbrace{E[\dot{B}(\zeta, t)]}_{0}\,dt = 0 .$$
(2) the Second Moment of the Integration sum is
$$E\left[ \left( \sum_{t=a}^{t=b} f(t)\,B_i(\zeta, t) \right)^2 \right] = E\left[ \left( \sum_{t=a}^{t=b} f(t)\,B_i(\zeta, t) \right)\left( \sum_{\tau=a}^{\tau=b} f(\tau)\,B_j(\zeta, \tau) \right) \right] = \sum_{t=a}^{t=b} \sum_{\tau=a}^{\tau=b} f(t)f(\tau)\, E[B_j(\zeta, \tau)B_i(\zeta, t)] .$$
Since the Bernoulli Random Variables are independent,
$$E[B_j(\zeta, \tau)B_i(\zeta, t)] = E[B_i^2(\zeta, t)] = (dx)^2$$
only for $t = \tau$. Then,
$$E\left[ \left( \sum_{t=a}^{t=b} f(t)\,B_i(\zeta, t) \right)^2 \right] = \sum_{t=a}^{t=b} f^2(t)\underbrace{(dx)^2}_{(2D)dt} = 2D \sum_{t=a}^{t=b} f^2(t)\,dt = 2D \int_{t=a}^{t=b} f^2(t)\,dt ,$$
assuming $(dx)^2 = (2D)dt$, and $f(t)$ integrable.
Thus, for any $dt$, the Integration Sum is a unique well-defined hyper-real Random Variable $I(\zeta)$.
We call $I(\zeta)$ the integral of $f(t)$ with respect to $B(\zeta, t)$ from $t = a$ to $t = b$, and denote it by $\int_{t=a}^{t=b} f(t)\,dB(\zeta, t)$.
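A Monte Carlo sketch of the integral just defined (an added illustration; f, D, dt, and the sample count are assumptions): the sample mean is near 0 and the sample second moment is near 2D∫f², as computed above.

```python
# Monte Carlo sketch of sum f(t)*dB(zeta,t) over [a,b] with Bernoulli increments
# +/- dx and (dx)^2 = 2*D*dt.  f, D, dt, and the sample count are assumed.
import math
import random

def integral_wrt_B(f, a, b, dt, D):
    dx = math.sqrt(2 * D * dt)
    n = int((b - a) / dt)
    return sum(f(a + k * dt) * random.choice((dx, -dx)) for k in range(n))

f = math.cos
a, b, dt, D = 0.0, 1.0, 1e-3, 0.5
samples = [integral_wrt_B(f, a, b, dt, D) for _ in range(4000)]
mean = sum(samples) / len(samples)
second = sum(s * s for s in samples) / len(samples)
# 2*D*integral_0^1 cos^2(t) dt = 0.5 + sin(2)/4 ~ 0.727
print(mean, second)
```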
13. Poisson Process
The arrival at rate λ , of radioactive particles at a counter is
modeled by the Poisson Process. It models other processes,
such as the arrival of phone calls at rate λ , to an operator.
13.1 The Bernoulli Random Variables of the Process
We assume that the probability of an arrival in time $dt$ is
$$p = \lambda\,dt ,$$
and the probability of no arrival in time $dt$ is
$$q = 1 - \lambda\,dt .$$
At fixed time $t$, after $N$ infinitesimal time intervals $dt$, where $N = \frac{t}{dt}$ is an infinite hyper-real, there are
$k$ arrivals, where $k$ is a finite hyper-real,
and
$N - k$ no-arrivals, where $N - k$ is an infinite hyper-real.
At the i th step we define the Bernoulli Random Variable,
Pi (arrival) = 1 ,
ζ1 = arrival
Pi (no-arrival) = 0 ,
ζ2 = no-arrival
where i = 1, 2,..., N .
Pr(Pi = 1) = p = λdt ,
Pr(Pi = 0) = q = 1 − λdt ,
E [Pi ] = 1 ⋅ λdt + 0 ⋅ (1 − λdt ) = λdt ,
$$E[P_i^2] = 1^2 \cdot \lambda dt + 0^2 \cdot (1 - \lambda dt) = \lambda dt ,$$
$$\mathrm{Var}[P_i] = \underbrace{E[P_i^2]}_{\lambda dt} - (\underbrace{E[P_i]}_{\lambda dt})^2 = \lambda dt\,\underbrace{(1 - \lambda dt)}_{\approx 1} \approx \lambda dt .$$
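An added finite-dt simulation of 13.1: summing the Bernoulli arrivals gives a process whose sample mean and variance are both close to λt (λ, t, dt, and the sample count are assumed values).

```python
# Finite-dt simulation of P(zeta,t) = P_1 + ... + P_N with Pr(P_i = 1) = lam*dt.
# lam, t, dt, and the sample count are assumed.
import random

lam, t, dt = 3.0, 2.0, 1e-3
N = int(t / dt)

def poisson_at_t():
    return sum(1 for _ in range(N) if random.random() < lam * dt)

samples = [poisson_at_t() for _ in range(2000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(mean, var)    # both near lam*t = 6.0
```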
13.2 The Binomial Distribution of the Process
P(ζ, t ) = P1 + P2 + ... + PN
is a Random Process with
E [P(ζ , t )] = λt ,
Var[P(ζ, t )] = λt ,
distributed Binomially
$$\Pr\left( P(\zeta, t) = k \right) = \binom{N}{k} (\lambda dt)^k (1 - \lambda dt)^{N-k} .$$
Proof: Since the $P_i$ are independent,
$$E[P(\zeta, t)] = \underbrace{E[P_1]}_{\lambda dt} + \ldots + \underbrace{E[P_N]}_{\lambda dt} = \lambda \underbrace{N dt}_{t} = \lambda t ,$$
$$\mathrm{Var}[P(\zeta, t)] = \underbrace{\mathrm{Var}[P_1]}_{\approx \lambda dt} + \ldots + \underbrace{\mathrm{Var}[P_N]}_{\approx \lambda dt} \approx \lambda \underbrace{N dt}_{t} = \lambda t .$$
$P(\zeta, t)$ has a Binomial distribution,
$$\Pr\left( P(\zeta, t) = k \right) = \binom{N}{k} p^k q^{N-k} = \binom{N}{k} (\lambda dt)^k (1 - \lambda dt)^{N-k} .$$
13.3 The Poisson Distribution of the Process
The Binomial distribution of P(ζ, t ) is infinitesimally
close to a Poisson distribution of a Random Signal with
$$\mu = \lambda t , \qquad \sigma^2 = \lambda t ,$$
$$\Pr[P(\zeta, t) = k] \approx \frac{1}{k!}(\lambda t)^k e^{-\lambda t} .$$
Proof:
$$\Pr\left( P(\zeta, t) = k \right) = \frac{N!}{k!\,(N-k)!} (\lambda dt)^k (1 - \lambda dt)^{N-k} .$$
Substituting $N! \approx \sqrt{2\pi}\, N^{N + \frac{1}{2}} e^{-N}$ from Stirling's Formula for infinite hyper-real $N$,
$$\approx \frac{\sqrt{2\pi}\, N^{N+\frac{1}{2}} e^{-N}}{k!\,\sqrt{2\pi}\,(N-k)^{N-k+\frac{1}{2}} e^{-N+k}} \left( \lambda \tfrac{t}{N} \right)^k \left( 1 - \lambda \tfrac{t}{N} \right)^{N-k}$$
$$= \frac{N^{N+\frac{1}{2}} \left( \lambda t \right)^k \left( 1 - \frac{\lambda t}{N} \right)^{N-k}}{k!\, N^{N+\frac{1}{2}-k}\left( 1 - \frac{k}{N} \right)^{N-k+\frac{1}{2}} e^{k}\, N^k}$$
$$= \frac{1}{k!} (\lambda t)^k\, \frac{\overbrace{\left( 1 - \frac{\lambda t}{N} \right)^{N}}^{\approx\, e^{-\lambda t}}}{\underbrace{\left( 1 - \frac{k}{N} \right)^{N}}_{\approx\, e^{-k}}}\, e^{-k}\, \underbrace{\left( 1 - \frac{k}{N} \right)^{k - \frac{1}{2}}}_{\approx 1}\, \underbrace{\left( 1 - \frac{\lambda t}{N} \right)^{-k}}_{\approx 1} \approx \frac{1}{k!}(\lambda t)^k e^{-\lambda t} ,$$
since $N$ is an infinite Hyper-real.
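A finite-N check of 13.3 (λ, t, N, and the k values below are assumptions): the Binomial probabilities with p = λt/N agree with the Poisson probabilities to several digits.

```python
# Compare Binomial(N, p = lam*t/N) with Poisson(lam*t) for a large finite N.
# lam, t, N, and the k values are assumed.
import math

lam, t, N = 3.0, 2.0, 10_000
p = lam * t / N

for k in range(0, 13, 3):
    binom = math.comb(N, k) * p ** k * (1 - p) ** (N - k)
    poisson = (lam * t) ** k * math.exp(-lam * t) / math.factorial(k)
    print(k, binom, poisson)     # the two columns agree to several digits
```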
13.4 Increment of Poisson Process
1) For any τ > 0 , the distribution of P(ζ, t + τ ) − P(ζ , t ) is
infinitesimally close to a Poisson distribution of a Random
Signal with
$$\mu = \lambda\tau , \qquad \sigma^2 = \lambda\tau ,$$
$$\Pr[P(\zeta, t + \tau) - P(\zeta, t) = k] \approx \frac{1}{k!}(\lambda\tau)^k e^{-\lambda\tau} ,$$
that depends only on τ (Stationary Process).
2) For fixed t , and any dt , the Random Variables
P(ζ, t ) − P(ζ, t − dt ) ,
P(ζ, t − dt ) − P(ζ, t − 2dt ) ,
…………………………….,
P(ζ, dt ) − P(ζ, 0) ,
are independent, random variables.
Proof:
1) Let $T = \frac{\tau}{dt}$. Then, as in 13.2, the Binomial distribution of
P(ζ, t + τ ) − P(ζ, t ) = PN +1 + PN +2 + ... + PN +T ,
is infinitesimally close to a Poisson distribution with μ = λτ ,
and σ 2 = λτ , that depends only on τ . ,
2)
P(ζ, t ) − P(ζ, t − dt )
is precisely one Bernoulli Random Variable that is
statistically independent of the precisely one Bernoulli
Random Variable that equals P(ζ, t − dt ) − P(ζ, t − 2dt ) . ,
14. Poisson Process is Continuous and has a Derivative with Delta Function Variance
14.1 Poisson Process is Continuous
Proof:
$$E[\{P(\zeta, t+dt) - P(\zeta, t)\}^2] = \mathrm{Var}[\underbrace{P(\zeta, t+dt) - P(\zeta, t)}_{P_i}] + (E[\underbrace{P(\zeta, t+dt) - P(\zeta, t)}_{P_i}])^2 ,$$
where $P_i$ is a Bernoulli Random Variable,
$$= \underbrace{\mathrm{Var}[P_i]}_{\approx \lambda dt} + (\underbrace{E[P_i]}_{\lambda dt})^2 = \text{infinitesimal} .$$
14.2 The Derivative of the Poisson Process is
$$\dot{P} = \frac{1}{dt} P_i ,$$
where (1) $P_i = P(\zeta, t_0 + dt) - P(\zeta, t_0)$ is a Bernoulli Random Variable,
(2) $E[\dot{P}] = \lambda$,
(3) $\mathrm{Var}[\dot{P}] = \lambda\,\delta(t_0)$.
Proof:
(1) For each $t = t_0$, we need to find a Random Signal $\dot{P}(\zeta, t_0)$, so that for any $dt$,
$$E\left[ \left[ \frac{P(\zeta, t_0 + dt) - P(\zeta, t_0)}{dt} - \dot{P}(\zeta, t_0) \right]^2 \right] = \text{infinitesimal} .$$
Since $P(\zeta, t_0 + dt) - P(\zeta, t_0)$ is a Bernoulli Random Variable $P_i$,
$$E\left[ \left\{ \frac{P(\zeta, t_0 + dt) - P(\zeta, t_0)}{dt} - \dot{P}(\zeta, t_0) \right\}^2 \right] = E\left[ \left\{ \frac{P_i}{dt} - \dot{P} \right\}^2 \right] .$$
Therefore, at time $t = t_0$, the Random Variable
$$\frac{1}{dt} P_i$$
is the derivative of the Poisson Process $P(\zeta, t_0)$.
(2) $E[\dot{P}] = \frac{1}{dt}\, \underbrace{E[P_i]}_{\lambda dt} = \lambda$.
(3) $\mathrm{Var}[\dot{P}] = E[\dot{P}^2] - (\underbrace{E[\dot{P}]}_{\lambda})^2 = \frac{1}{(dt)^2}\, \underbrace{E[P_i^2]}_{\lambda dt + \lambda^2 (dt)^2} - \lambda^2 = \lambda\, \frac{1}{dt} = \lambda\,\delta(t_0)$,
by [Dan4].
15. $\int_{t=a}^{t=b} f(t)\,dP(\zeta, t)$
Let f (t ) be a hyper-real function on the bounded time
interval [a, b ] . f (t ) need not be bounded.
At each a ≤ t ≤ b , there is a Bernoulli Random Variable
$dP(\zeta, t) = P(\zeta, t + dt) - P(\zeta, t) = P_i(\zeta, t) = \dot{P}(\zeta, t)\,dt .$
We form the Integration Sum
$$\sum_{t=a}^{t=b} f(t)\,dP(\zeta, t) = \sum_{t=a}^{t=b} f(t)\,P_i(\zeta, t) = \sum_{t=a}^{t=b} f(t)\,\dot{P}(\zeta, t)\,dt .$$
For any dt ,
(1) the First Moment of the Integration Sum is
$$E\left[ \sum_{t=a}^{t=b} f(t)\,\dot{P}(\zeta, t)\,dt \right] = \sum_{t=a}^{t=b} f(t)\, \underbrace{E[\dot{P}(\zeta, t)]}_{\lambda}\,dt = \lambda \int_{t=a}^{t=b} f(t)\,dt ,$$
assuming f (t ) integrable.
(2) the Second Moment of the Integration sum is
$$E\left[ \left( \sum_{t=a}^{t=b} f(t)\,P_i(\zeta, t) \right)^2 \right] = E\left[ \left( \sum_{t=a}^{t=b} f(t)\,P_i(\zeta, t) \right)\left( \sum_{\tau=a}^{\tau=b} f(\tau)\,P_j(\zeta, \tau) \right) \right] = \sum_{t=a}^{t=b} \sum_{\tau=a}^{\tau=b} f(t)f(\tau)\, E[P_j(\zeta, \tau)P_i(\zeta, t)] .$$
Since the Bernoulli Random Variables are independent,
$$E[P_j(\zeta, \tau)P_i(\zeta, t)] = E[P_i^2(\zeta, t)] = \lambda dt\,\underbrace{(1 + \lambda dt)}_{\approx 1}$$
only for $t = \tau$. Then,
$$E\left[ \left( \sum_{t=a}^{t=b} f(t)\,P_i(\zeta, t) \right)^2 \right] = \lambda \sum_{t=a}^{t=b} f^2(t)\,dt = \lambda \int_{t=a}^{t=b} f^2(t)\,dt ,$$
assuming $f(t)$ integrable.
Thus, assuming f (t ) integrable, for any dt , the Integration
Sum is a unique well-defined hyper-real Random Variable
I (ζ ) . We call I (ζ ) the integral of f (t ) , with respect to P(ζ, t )
from $t = a$ to $t = b$, and denote it by $\int_{t=a}^{t=b} f(t)\,dP(\zeta, t)$.
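A Monte Carlo sketch of this integral (an added illustration; f, λ, dt, and the sample count are assumptions): the sample mean is near λ∫f dt, and the sample variance, the centered form of the second-moment computation above, is near λ∫f² dt.

```python
# Monte Carlo sketch of sum f(t)*dP(zeta,t) over [a,b] with Bernoulli increments
# 0/1 and Pr(increment = 1) = lam*dt.  f, lam, dt, and the sample count are assumed.
import math
import random

def integral_wrt_P(f, a, b, dt, lam):
    n = int((b - a) / dt)
    return sum(f(a + k * dt) for k in range(n) if random.random() < lam * dt)

f = math.cos
a, b, dt, lam = 0.0, 1.0, 1e-3, 3.0
samples = [integral_wrt_P(f, a, b, dt, lam) for _ in range(4000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
# lam*integral_0^1 cos = 3*sin(1) ~ 2.52 ; lam*integral_0^1 cos^2 ~ 2.18
print(mean, var)
```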
References
[Benoit] Eric Benoit “Random Walks and Stochastic Differential
Equations” in “Nonstandard Analysis in Practice” edited by Francine
Diener, and Marc Diener, Springer, 1995.
[Chandrasekhar] S. Chandrasekhar, “Stochastic Problems in Physics and Astronomy”, Reviews of Modern Physics, Volume 15, Number 1, January 1943.
Reprinted in “Selected Papers on Noise and Stochastic Processes” edited
by Nelson Wax, Dover, 1954
[Dan1] Dannon, H. Vic, “Well-Ordering of the Reals, Equality of all
Infinities, and the Continuum Hypothesis” in Gauge Institute Journal
Vol.6 No 2, May 2010;
[Dan2] Dannon, H. Vic, “Infinitesimals” in Gauge Institute Journal
Vol.6 No 4, November 2010;
[Dan3] Dannon, H. Vic, “Infinitesimal Calculus” in Gauge Institute
Journal Vol.7 No 4, November 2011;
[Dan4] Dannon, H. Vic, “The Delta Function” in Gauge Institute Journal Vol.8 No 1, February 2012;
[Hoel/Port/Stone] Paul Hoel, Sidney Port, Charles Stone, “Introduction
to Stochastic Processes” Houghton Mifflin, 1972.
[Hsu] Hwei Hsu, “Probability, Random Variables, & Random Processes”, Schaum’s Outlines, McGraw-Hill, 1997.
[Karlin/Taylor] Howard Taylor, Samuel Karlin, “An Introduction to
Stochastic Modeling”, Academic Press, 1984.
[Larson/Shubert] Harold Larson, Bruno Shubert, “Probabilistic Models
in Engineering Sciences, Volume II, Random Noise, Signals, and
Dynamic Systems”, Wiley, 1979.