
Markov random fields (MRFs) have been well studied during the past 50 years. Their success is mainly due to their flexibility and to the fact that they give rise to stochastic image models. In this work, we will consider a stochastic differential equation (SDE) driven by Lévy noise. We will show that the solution $X_v$ of the SDE is an MRF satisfying the Markov property. We will prove that the Gibbs distribution of the process $X_v$ can be represented graphically through Feynman graphs, which are defined as a set of cliques; we will then provide applications of MRFs in image processing, where the image intensity at a particular location depends only on a neighborhood of pixels.
Citation: Boubaker Smii. Markov random fields model and applications to image processing[J]. AIMS Mathematics, 2022, 7(3): 4459-4471. doi: 10.3934/math.2022248
Let $G=(V,E)$ be a Feynman graph, where $V$ and $E$ are the sets of vertices and edges, respectively. The vertices are represented by points and classified into two categories: full and empty. The edges, joining pairs of vertices, will be denoted by $e=(v,w)$ and are represented by lines joining the vertices. With the graph $G$ we associate a network evolving according to the stochastic differential equation:
$$\begin{cases} dX_v(t)=\sum_{w\in W}\alpha_{vw}X_w(t)\,dt+dL_v(t),\\ X_v(0)=x_0\in\mathbb{R}^d, \end{cases}\tag{1.1}$$
where $W=\{w\in V:(w,v)=e\in E\}$ is the set of all vertices connected to $v$, called the parents of $v$, $\{X_v(t),\,t\ge 0\}$ is a stochastic process taking values in $\mathbb{R}^d$, and $\alpha=(\alpha_{vw})_{v,w\in V}\in\mathbb{R}^{d\times d}$ is a matrix whose sparsity pattern is given by the graph $G$.
The process $\{L_v(t),\,t>0\}$ is a Lévy process taking values in $\mathbb{R}^d$, and $X_v(0)=x_0\in\mathbb{R}^d$ is a random variable independent of $L_v(t)$, distributed according to the invariant measure, see [1,2,3,4,5,21].
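As a concrete illustration of Eq (1.1), the following is a minimal simulation sketch, assuming a ring graph, a stable (negative-diagonal) coupling matrix $\alpha$, and Lévy increments built from a Brownian part plus compound-Poisson jumps with standard normal jump law; the graph, jump rate, and jump law are illustrative choices, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative network: a ring of n vertices; alpha is sparse, with a
# negative diagonal (stability) and coupling to the two ring neighbours.
n, dt, T = 8, 1e-3, 5.0
alpha = -1.0 * np.eye(n)
for v in range(n):
    alpha[v, (v - 1) % n] = alpha[v, (v + 1) % n] = 0.25

# Lévy increments: Brownian part plus compound-Poisson jumps with
# rate lam and standard normal jump sizes (our assumed jump law r).
lam, steps = 2.0, int(T / dt)
X = np.zeros((steps + 1, n))                       # X_v(0) = 0
for i in range(steps):
    dB = rng.normal(0.0, np.sqrt(dt), n)           # Gaussian component
    dN = rng.poisson(lam * dt, n)                  # number of jumps per step
    dJ = np.array([rng.normal(0, 1, k).sum() for k in dN])
    X[i + 1] = X[i] + alpha @ X[i] * dt + dB + dJ  # Euler step for (1.1)

print("terminal state X_v(T):", np.round(X[-1], 3))
```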
We are mainly interested in characterizing the scaling of the learning time for large networks. The Gaussian case, i.e., when the driving noise in Eq (1.1) is a Brownian motion $\{B_t,\,t>0\}$, has been discussed by several authors, see, e.g., [10] and [23]. In this work we will treat the general case, i.e., when the noise in Eq (1.1) is of Lévy type, and we will prove that the scaling limit of a Poisson noise gives a Gaussian noise. This result is of great interest; for instance, it can be used to obtain the scaling of the learning time for large networks. Moreover, the Itô formula applied to Eq (1.1) to find the solution $X_v$ can also be related, by a scaling limit, to the classical Itô formula for Gaussian noise.
In the paper [21], the author proved an asymptotic expansion of the transition probability density $p_t$, $t>0$, of the semigroup associated to an SDE driven by Lévy noise, and outlined some remarks on applications to neural networks. In this work we will focus on further applications of Markov random fields, particularly to images and cliques. This notion is captured by means of a local, conditional probability distribution and a Gibbs distribution.
Let us also mention that such applications arise in complex biological networks, where extracting stochastic models from data is a challenging task, see, e.g., [18,19,22]. The model given by Eq (1.1) can be used to trace fluctuations of the process with respect to equilibrium values; for instance, we will show that the stochastic process $\{X_v(t),\,t\ge 0\}$ given by Eq (1.1) with fluctuating force $\{L_v(t),\,t>0\}$ is an MRF satisfying a Markov property. The probabilistic approach to image analysis rests on the fact that a probability distribution (a Gibbs distribution) can be associated with the different sets of image attributes. The MRF image models will be defined in this work using Feynman graphs, where the solution $X_v$ of the SDE (1.1) depends only on the neighboring variables of the vertex $v$. The Feynman graphs defined in this work are structured in such a way that the graph is connected, i.e., any two vertices in the graph can be joined through a path. This graphical representation is a powerful tool for studying the structural properties of the model. Let us mention that, to the best of our knowledge, Feynman graph representations of the distribution function $P(X_v=x)$ of the solution $X_v$, $v\in V$, of the SDE (1.1) have not been studied before; it is the aim of this paper to give such a recipe by first calculating the Gibbs distribution of the solution $X_v$, after which the Hammersley–Clifford theorem ensures that the stochastic process $\{X_v(t),\,t\ge 0\}$ is an MRF. Hence, using the Feynman graphs and rules approach, we will prove that the distribution function of the solution $X_v$, $v\in V$, is given by a sum over all such graphs. The graphical representation of the MRF is defined in such a way that it can be used to model and analyze images, see, e.g., [7]. For computations and numerical analysis we may restrict attention to neighboring variables, thanks to the Markov property.
The structure of this paper is as follows: Section 2 is devoted to definitions and assumptions that are useful for the current work. In Section 3 we show how one can scale a Poisson process to obtain a Gaussian one; thus previous results in which a Gaussian noise is considered can be recovered from the present general setting. Section 4 is devoted to the main results on MRFs and their graphical representations. In Section 5 some remarks and applications are given. Section 6 concludes the work.
In this section we introduce some useful notation and basic results from stochastic processes, Lévy noise and probability theory.
Denote by $\tilde M$ the compensated jump measure, defined as:
$$\tilde M(dt,dz):=M(dt,dz)-\pi(dz)\,dt,\tag{2.1}$$
where $\pi$ is the Lévy measure, satisfying $\int_{\mathbb{R}^d\setminus\{0\}}(|x|^2\wedge 1)\,\pi(dx)<\infty$. More details about the Lévy measure can be found in [9,20].
The Lévy process $L_t=L(t)$, $t\ge 0$, is represented, see [20], by
$$L(t)=mt+kB_t+\int_0^t\int_{|z|<1}z\,\tilde M(ds,dz)+\int_0^t\int_{|z|\ge 1}z\,M(ds,dz),\tag{2.2}$$
where $m$ and $k$ are constants and $B_t=B(t)$, $t\ge 0$, is a standard Brownian motion on $\mathbb{R}^d$.
For the particular case $m=k=0$, the Lévy process $L(t)$, $t\ge 0$, is called a pure jump Lévy process, denoted by $\tilde L_t$, $t\ge 0$, and defined on $\mathbb{R}^d$. $\tilde L$ is characterized by its characteristic function:
$$\mathbb{E}\big(e^{i\langle x,\tilde L(t)\rangle}\big)=e^{t\int_{\mathbb{R}^d\setminus\{0\}}(e^{i\langle x,y\rangle}-1)\,\pi(dy)},\quad x\in\mathbb{R}^d.\tag{2.3}$$
Moreover, the process $\tilde L_t$ can be represented as follows:
$$\tilde L_t=\int_B y\,\tilde M(t,dy)+\int_{\mathbb{R}^d\setminus B}y\,M(t,dy),\quad t\ge 0,\tag{2.4}$$
where $B$ is a Borel set, i.e., $B\in\mathcal{B}(\mathbb{R}^d)$.
Definition 2.1. A transition kernel on $(\mathbb{R}^d,\mathcal{B}(\mathbb{R}^d))$ is a family of mappings $p_{s,t}(x,B)$, $s,t\ge 0$, $x\in\mathbb{R}^d$, $B\in\mathcal{B}(\mathbb{R}^d)$, with values in $[0,1]$, satisfying:
(1) $p_{s,t}(x,B)$ is a probability measure as a function of $B$ for any fixed $x$;
(2) $p_{s,t}(x,B)$ is measurable in $x$ for any fixed $B$;
(3) $p_{0,0}(x,B)=\delta_x(B)$;
(4)
$$\int_{\mathbb{R}^d}p_{s,t}(x,dy)\,p_{t,r}(y,B)=p_{s,r}(x,B),\quad\text{for }0\le s\le t\le r,\tag{2.5}$$
called the semigroup, or Chapman–Kolmogorov, property.
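As a quick sanity check of the Chapman–Kolmogorov property (2.5), the following sketch composes two Gaussian (Brownian) transition kernels numerically; the kernels, the set $B=[0,2]$, and the quadrature grid are toy choices made for illustration.

```python
import numpy as np
from scipy.stats import norm

# Check (2.5) for Brownian motion in d = 1: p_{s,t}(x, dy) has density
# N(x, t - s) in y, and composing s -> t -> r must reproduce s -> r.
s, t, r, x = 0.0, 1.0, 3.0, 0.5
y = np.linspace(-15.0, 15.0, 4001)        # quadrature grid for the y-integral
dy = y[1] - y[0]

p_st = norm.pdf(y, loc=x, scale=np.sqrt(t - s))
B = (0.0, 2.0)                             # the Borel set B = [0, 2]
p_tr_B = (norm.cdf(B[1], loc=y, scale=np.sqrt(r - t))
          - norm.cdf(B[0], loc=y, scale=np.sqrt(r - t)))

lhs = np.sum(p_st * p_tr_B) * dy           # integral of p_{s,t}(x,dy) p_{t,r}(y,B)
rhs = (norm.cdf(B[1], loc=x, scale=np.sqrt(r - s))
       - norm.cdf(B[0], loc=x, scale=np.sqrt(r - s)))
print(lhs, rhs)                            # agree to quadrature accuracy
```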
Definition 2.2. Let $(\Omega,\mathcal{F},P)$ be a probability space with a filtration $(\mathcal{F}_t,\,t\ge 0)$. A stochastic process $X=(X_t,\,t\ge 0)$ adapted to $\mathcal{F}_t$ is said to possess the Markov property if for each $s,t\ge 0$, $s<t$,
$$P(X_t\in B\mid\mathcal{F}_s)=P(X_t\in B\mid X_s).\tag{2.6}$$
A Markov process is a stochastic process which satisfies the Markov property with respect to its filtration.
Remark 2.3. (1) Markov processes are characterized by transition probability functions; moreover, if $u(t)$, $t\ge 0$, is a Markov process, then for $0<s<t$,
$$p_{s,t}(u_0,x)=P\big(u(t)\le x\mid u(s)=u_0\big),\quad u_0,x\in\mathbb{R}^d.\tag{2.7}$$
(2) If $P_t(u_0,\cdot)$, $t>0$, $u_0\in\mathbb{R}^d$, is the transition semigroup of a homogeneous Markov process $u_t$, $t\ge 0$, then by a result in [20] we have:
$$\int_{\mathbb{R}^d}e^{i\langle x,y\rangle}P_t(u_0,dy)=\exp\Big[ie^{-\lambda t}\langle x,u_0\rangle+\int_0^t\Psi(e^{-\lambda s}x)\,ds\Big],\quad u_0,x\in\mathbb{R}^d,\ \lambda>0,\tag{2.8}$$
where $\Psi$ is the Lévy characteristic given by:
$$\Psi(\xi)=i\langle l,\xi\rangle-\langle\xi,D\xi\rangle+\int_{\mathbb{R}^d\setminus\{0\}}\big(e^{i\langle x,\xi\rangle}-1\big)\,\nu(dx),\quad\xi\in\mathbb{R}^d.\tag{2.9}$$
Here $l\in\mathbb{R}^d$, $D\in\mathbb{R}^{d\times d}$, and $\nu$ is a positive Lévy measure satisfying:
$$\int_{|x|\le 1}|x|\,\nu(dx)<\infty,\qquad\int_{|x|\ge 1}|x|^2\,\nu(dx)<\infty.\tag{2.10}$$
(3) The transition probability density $q_t$, $t>0$, of the solution $u_t$, $t\ge 0$, can be characterized by the Fourier transform:
$$q_t(x)=\mathcal{F}\big(e^{-t\Psi(\xi)}\big)(x)=(2\pi)^{-d}\int_{\mathbb{R}^d}e^{-ix\cdot\xi}e^{-t\Psi(\xi)}\,d\xi.\tag{2.11}$$
Following Hartman and Wintner [14], existence of the transition probability density $q_t$, $t>0$, associated to $u_t$, $t\ge 0$, is guaranteed in terms of the characteristic exponent $\Psi$. To guarantee the smoothness of $q_t$, $t>0$, we assume, for $a\in(0,2)$:
$$\mathrm{Re}\,\Psi(\xi)\ge c\big(|\xi|^a\wedge|\xi|^b\big),\quad b\in[a,2],\ c>0,\ \xi\in\mathbb{R}^d.\tag{2.12}$$
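The Fourier inversion (2.11) is easy to carry out numerically. The sketch below does so in $d=1$ for the symmetric stable exponent $\Psi(\xi)=|\xi|^a$, which satisfies (2.12) with $b=a$; the exponent, the time $t$, and the grids are illustrative assumptions.

```python
import numpy as np

# Numerical inversion of (2.11) in d = 1 for Psi(xi) = |xi|**a,
# a symmetric stable exponent satisfying the bound (2.12) with b = a.
a, t = 1.5, 1.0
xi = np.linspace(-200.0, 200.0, 200001)   # frequency grid
dxi = xi[1] - xi[0]
x_vals = np.linspace(-5.0, 5.0, 11)

# q_t(x) = (2*pi)^(-1) * integral of cos(x*xi) * exp(-t*|xi|**a) d(xi);
# the cosine keeps the real part, since the density is symmetric.
qt = np.array([np.sum(np.cos(xv * xi) * np.exp(-t * np.abs(xi) ** a)) * dxi
               for xv in x_vals]) / (2.0 * np.pi)
print(np.round(qt, 4))                    # smooth, positive, peaked at x = 0
```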
Definition 2.4. Let $X$ and $Y$ be two random variables. $X$ and $Y$ are said to be conditionally independent given the random variable $Z$ if and only if:
$$P(X,Y\mid Z)=P(X\mid Z)\,P(Y\mid Z).\tag{2.13}$$
Recall that a random field is essentially a stochastic process defined on a set of spatial nodes (also called sites). For instance, let $S=\{1,\dots,m\}$ be a finite set and let $\{X(s),\,s\in S\}$ be a collection of random variables on the sample space $\Omega$ with $X(s_1)=x_1,\dots,X(s_m)=x_m$, where $x_i\in\Omega$. Then the joint event $x=(x_1,\dots,x_m)$ is called a configuration of $X(s)$, which corresponds to a realization of the random field (RF); more details can be found in [8,12].
In a given random field, the sites in $S$ are related to one another through a neighborhood system denoted by $N=\{N(i),\,i\in S\}$, where $N(i)$ is the set of sites that are neighbors of $i$, with $i\notin N(i)$. The neighborhood relation is symmetric, which means that $i\in N(j)\Leftrightarrow j\in N(i)$. Thus, for a finite set of sites $S=\{1,\dots,N\}$, a Markov random field is a family of random variables $X_i$, $i\in S$, such that $P(X_i=x_i)>0$, with probability functions satisfying the Markov property:
$$P(X_i=x_i\mid X_j=x_j,\ j\ne i)=P(X_i=x_i\mid X_j=x_j,\ j\in N(i)).\tag{2.14}$$
Unfortunately, using the Markov property (2.14), one cannot deduce the joint probability distribution $P(X_1,\dots,X_N)$ from the conditional probabilities $P(X_i\mid X_j,\ j\in N(i))$. The Hammersley–Clifford theorem solves this problem:
Theorem 2.5. The random field X is a Markov random field (MRF) if and only if X is a Gibbs random field.
Proof. For the proof we refer to [6].
Definition 2.6. For a graph $G=(V,E)$, we define a Green function $\tilde G$ on $]0,\infty[$ as follows:
$$\begin{cases}\dfrac{d\tilde G(t)}{dt}=\alpha_{vw}\tilde G(t)+\delta(t), & t\in\,]0,\infty[,\\ \tilde G(t)=0, & t<0,\end{cases}\tag{2.15}$$
where $\delta(t)$ is the Dirac distribution on $\mathbb{R}^+$ and the $\alpha_{vw}$ are the matrix coefficients given in Eq (1.1).
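For later use, note that (2.15) can be solved explicitly: treating $\alpha_{vw}$ as a scalar coefficient, a standard variation-of-constants argument gives the causal exponential kernel
$$\tilde G(t)=\Theta(t)\,e^{\alpha_{vw}t},$$
where $\Theta$ denotes the Heaviside step function; when $\alpha_{vw}<0$ this kernel is integrable on $]0,\infty[$.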
Following [20] and from Eq (2.9), one can see that the Lévy process $L_v(t)$, $t>0$, can be decomposed into three independent components, i.e., $L_v=L^d_v+L^g_v+L^p_v$, where $L^d_v$, $L^g_v$ and $L^p_v$ are, respectively, the deterministic, Gaussian and Poisson noises, given by their characteristic functions.
In the following, and without loss of generality, $L^d_v$, $L^g_v$ and $L^p_v$ will be denoted by $L^d$, $L^g$ and $L^p$, respectively.
The deterministic component $L^d$ does not pose any difficulty in computing the moments of the noise, and we refer to several works from the literature. In this work we will restrict ourselves to the more general case: it is our aim to show how one can recover the Gaussian noise $L^g$ starting from a Poisson noise $L^p$, and then to model a Markov chain by a neural network. To this end we consider a representation of the Poisson noise in terms of the Poisson distribution. Let $\Lambda_n\subset\subset\mathbb{R}^+$ be a monotone sequence of compact sets, i.e., intervals, such that $\Lambda_n\uparrow\mathbb{R}^+$ as $n\to\infty$ and $\Lambda_0=\emptyset$. For $n\in\mathbb{N}$, let $L_n=\Lambda_n\setminus\Lambda_{n-1}$, and denote the Lebesgue volume of $L_n$ by $|L_n|$. Let
$$L^p_n=\sum_{j=1}^{N^z_T}S_{nj}\,\delta_{T_{nj}},\qquad T_{nj}\sim\frac{dx}{T}\Big|_{[0,T]},\ T>0,\tag{3.1}$$
where $\delta_x$ is the Dirac measure of mass one at $x$, $\{S_{nj}\}_{j\in\mathbb{N}}$ is a family of real-valued, independent random variables with law given by $r$, and $N^z_T$ is a Poisson random variable with intensity $z|L_n|$, i.e.,
$$P(N^z_T=k)=e^{-z|L_n|}\frac{(z|L_n|)^k}{k!},\quad k\in\mathbb{N}_0.\tag{3.2}$$
Let now $\mathcal{S}(\mathbb{R})$ be the Schwartz space of all rapidly decreasing functions on $\mathbb{R}$, endowed with the Schwartz topology; its topological dual is the space of tempered distributions, denoted by $\mathcal{S}'(\mathbb{R})$. We denote by $\langle\cdot,\cdot\rangle$ the dual pairing between $\mathcal{S}(\mathbb{R})$ and $\mathcal{S}'(\mathbb{R})$.
The characteristic functional of the noise $L^p_n$, for any function $h\in\mathcal{S}(\mathbb{R})$ such that $\mathrm{supp}\,h\subseteq L_n$, is given by:
$$\begin{aligned}C_{L^p}(h):=\big\langle e^{iL^p_n(h)}\big\rangle&=\Big\langle e^{i\sum_{j=1}^{N^z_T}S_{nj}h(X_{nj})}\Big\rangle\\&=e^{-z|L_n|}\sum_{l=0}^{\infty}\frac{(z|L_n|)^l}{l!}\Big(\int_{L_n}\int_{\mathbb{R}\setminus\{0\}}e^{ish(x)}\,dr(s)\,\frac{dx}{|L_n|}\Big)^l\\&=\exp\Big\{z\int_{L_n}\int_{\mathbb{R}\setminus\{0\}}\big(e^{ish(x)}-1\big)\,dr(s)\,dx\Big\},\quad\forall h\in\mathcal{S}(\mathbb{R}).\end{aligned}\tag{3.3}$$
We now denote by $C_{L^p}$ the generating functional of a function $f$ for a Poisson noise $L^p$ on $\mathbb{R}$:
$$C_{L^p}(f)=\exp\Big(z\int_0^T\int_{\mathbb{R}\setminus\{0\}}\big(e^{isf(t)}-1\big)\,dr(s)\,dt\Big),\quad z>0.\tag{3.4}$$
Definition 3.1. Let $x_1,\dots,x_m\in\mathbb{R}^d$ and let $I=\{I_1,\dots,I_k\}\in\mathcal{P}(m)$ be a partition of the set $\{1,\dots,m\}$. The truncated moment functions $\langle L(x_1)\cdots L(x_m)\rangle^T$ are recursively defined by:
$$\Big\langle\prod_{i=1}^{m}L(x_i)\Big\rangle=\sum_{\substack{I\in\mathcal{P}(m)\\ I=\{I_1,\dots,I_k\}}}\prod_{l=1}^{k}\langle I_l\rangle^T,\tag{3.5}$$
where $\langle I_l\rangle^T=\big\langle\prod_{j\in I_l}L(x_j)\big\rangle^T$.
Theorem 3.2. The truncated moment functions of the noise $L$ are given by the following formula:
$$\langle L(t_1)\cdots L(t_n)\rangle^T=c_n\int_{\mathbb{R}}\delta(t-t_1)\cdots\delta(t-t_n)\,dt,\tag{3.6}$$
where
$$c_n=(-i)^n\frac{d^n\psi(t)}{dt^n}\Big|_{t=0}=\delta_{n,1}a+\delta_{n,2}\sigma^2+z\int_{\mathbb{R}\setminus\{0\}}s^n\,dr(s),\tag{3.7}$$
$\delta_{n,n'}$ being the Kronecker symbol and $\psi$ the characteristic function given by Eq (2.9).
Proof. For the proof we refer to [13].
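As a quick numerical illustration of the jump contribution in (3.7) (a sketch, assuming $a=\sigma^2=0$, jump law $r=\mathcal{N}(0,1)$, and rate $z$ over a unit time window, all toy choices): the first two cumulants of the resulting compound-Poisson variable should match $z\int s\,dr(s)=0$ and $z\int s^2\,dr(s)=z$.

```python
import numpy as np

rng = np.random.default_rng(1)

# Empirical check of (3.7) for a pure-jump noise: with a = sigma^2 = 0,
# jump law r = N(0, 1), and rate z on a unit window, the cumulants are
# c1 = z * E[S] = 0 and c2 = z * E[S**2] = z.
z, n_samples = 5.0, 100_000
N = rng.poisson(z, n_samples)                  # number of jumps per sample
L = np.array([rng.normal(0.0, 1.0, k).sum() for k in N])

print("c1: empirical", L.mean(), "vs z*E[S]   =", 0.0)
print("c2: empirical", L.var(), " vs z*E[S^2] =", z)
```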
The following first result shows that one can scale the Poisson noise to obtain a Gaussian noise.
Theorem 3.3. Let $C_{L^p}$ be the generating functional of a function $f$ given by Eq (3.4) and let $L^g$ be the Gaussian noise. Assume that $\int_{\mathbb{R}\setminus\{0\}}s\,dr(s)=0$; then:
$$\lim_{z\to\infty}C_{L^p}(f)=C_{L^g}(f)=\begin{cases}\exp\Big(-\dfrac{c_2}{2}\displaystyle\int_0^T f^2(t)\,dt\Big)&\text{if }n=2,\\[1ex]0&\text{otherwise}.\end{cases}\tag{3.8}$$
Here $C_{L^g}(f)$ is the generating functional of the function $f$ for a Gaussian noise $L^g$ on $\mathbb{R}$, and $c_2$ is the moment of order 2 of $L^g$ given by Eq (3.7).
Proof. Consider the rescaling of the jump sizes $s\mapsto s/\sqrt z$; then:
$$\begin{aligned}\lim_{z\to\infty}C_{L^p}(f)&=\lim_{z\to\infty}\exp\Big(z\int_0^T\int_{\mathbb{R}\setminus\{0\}}\big(e^{i\frac{s}{\sqrt z}f(t)}-1\big)\,dr(s)\,dt\Big)\\&=\lim_{z\to\infty}\exp\Big(z\int_0^T\int_{\mathbb{R}\setminus\{0\}}\sum_{n=1}^{\infty}\frac{i^n}{n!}\frac{s^n}{z^{n/2}}f^n(t)\,dr(s)\,dt\Big)\\&=\exp\Big(\int_0^T\int_{\mathbb{R}\setminus\{0\}}-\frac{1}{2}s^2f^2(t)\,dr(s)\,dt\Big)\\&=\exp\Big(-\frac{c_2}{2}\int_0^T f^2(t)\,dt\Big)=C_{L^g}(f),\end{aligned}\tag{3.9}$$
where the $n=1$ term vanishes by the assumption $\int_{\mathbb{R}\setminus\{0\}}s\,dr(s)=0$, and all terms with $n\ge 3$ are of order $z^{1-n/2}\to 0$.
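The convergence in Theorem 3.3 can be observed numerically. The following sketch compares the empirical characteristic function of the rescaled compound-Poisson sum with its Gaussian limit, assuming the test function $f\equiv 1$ on $[0,T]$ and the centered jump law $r=\mathrm{uniform}\{-1,+1\}$ (so $\int s\,dr(s)=0$); these are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(2)

# Rescaled compound Poisson vs its Gaussian limit (Theorem 3.3), with
# f = 1 on [0, T] and centered jump law r = uniform{-1, +1}.
T, theta, n_mc = 1.0, 1.3, 200_000

def ecf_scaled_poisson(z):
    """Empirical E[exp(i*theta*Y_z)], Y_z = (sum of N fair +-1 jumps) / sqrt(z)."""
    N = rng.poisson(z * T, n_mc)                       # N ~ Poisson(zT)
    Y = (2.0 * rng.binomial(N, 0.5) - N) / np.sqrt(z)  # sum of N signs, rescaled
    return np.exp(1j * theta * Y).mean().real

for z in (1, 10, 100, 1000):
    print(z, round(ecf_scaled_poisson(z), 4))
print("Gaussian limit:", round(np.exp(-theta**2 * T / 2), 4))
```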
Remark 3.4. In the next section we will prove that the MRF can be represented through Feynman graphs according to some fixed rules. Since the representation involves the Lévy process $L_v(t)$, $t\ge 0$, Theorem 3.3 allows one to scale the Poisson noise to obtain a Gaussian noise; this particular case corresponds to the Gauss–Markov random fields, see, e.g., [6].
Markov random fields are used in different areas; they were originally used in statistical mechanics to model systems of particles interacting on a two- or three-dimensional lattice. MRFs have recently been widely used in statistics and image analysis, where pixels and voxels represent the images, see, e.g., [11].
Assume now that a process sits at a particular location; it will then be influenced by the events happening in neighboring locations. The relation is a one-to-one correspondence through a neighborhood system denoted by $N=\{N(v),\,v\in V\}$, where $N(v)$ is the set of sites that are neighbors of $v$; graphically, this set is a subset of the set $V$ of all vertices of a graph $G$.
Notice that the neighborhood relation is symmetric, i.e.,
$$v\in N(w)\Leftrightarrow w\in N(v).\tag{4.1}$$
Consequently, in this work an MRF is a family of stochastic processes $\{X_v,\,v\in V\}$ satisfying the Markov property:
$$P(X_v=x_v\mid X_w=x_w,\ w\ne v)=P(X_v=x_v\mid X_w=x_w,\ w\in N(v)),\qquad P(X_v=x)>0,\ \forall v\in V.\tag{4.2}$$
Equation (4.2) establishes the local characteristics of the process $\{X_v,\,v\in V\}$; this means that only neighboring sites have direct interactions with each other. For this reason it is important to discuss carefully the neighborhood structure of an MRF, since it gives an idea about the characteristics of the image data.
The neighborhood structure can be modeled by the graph $G=(V,E)$, where the sites are represented by the vertices and any two neighboring sites are connected by an edge $e\in E$, see Figure 1.
Definition 4.1. We define a clique of the graph $G=(V,E)$ as a subgraph of $G$ in which every site is a neighbor of all other sites. An example of a three-vertex clique is given in Figure 2.
One can understand that an MRF consists of a set of cliques, since any vertex in a clique is a neighbor of all other vertices in the same clique; therefore the conditional probability $P(X_v\mid X_w,\ w\in N(v))$ will be represented by cliques.
The process $X_v$ is defined to have the Gibbs distribution if its distribution function is given by:
$$P(X_v=x)=\frac{1}{Z}\exp\big(-\beta U(x)\big),\tag{4.3}$$
where $U(x)$ is the energy function and $\beta>0$ characterizes the label scale variability in an image. Here $Z$ is the partition function, given by
$$Z=\sum_x\exp\big(-\beta U(x)\big),\tag{4.4}$$
which is nothing else than a normalizing constant and involves a summation over all possible configurations of $X_v$.
Remark 4.2. The Gibbs distribution given by Eq (4.3) shows that high energies correspond to low probabilities, whereas lower energies are more likely.
One can express $U(x)$ in terms of cliques as
$$U(x)=\sum_{c\in C}V_c(x)=\sum_{v\in N_1}V_{N_1}(x_v)+\sum_{(v,w)\in N_2}V_{N_2}(v,w)+\cdots,\tag{4.5}$$
where $V_c(x)$ is the potential function associated with the clique $c\in C$, known in image analysis as a function of the values of the pixels in the clique $c\in C$, and the $V_{N_i}$, $i=1,2,\dots$, are potentials defined on the cliques of the neighborhood system $N_i$. Here $v$ and $w$ are neighbors if $x_v$ and $x_w$ appear simultaneously within the same factor $V_{N_i}$, $i=1,2,\dots$
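To make (4.3)–(4.5) concrete, here is a minimal sketch for a binary MRF on a 4-cycle with illustrative singleton and pairwise clique potentials; the potentials, the graph, and $\beta$ are toy choices, and the partition function is computed by brute-force enumeration.

```python
import itertools
import numpy as np

# Toy Gibbs distribution (4.3)-(4.5): binary labels on a 4-cycle, with
# singleton potentials V1 and pairwise potentials V2 on neighboring sites.
beta = 1.0
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]        # the cliques of size two
V1 = lambda xv: 0.5 * xv                         # illustrative singleton potential
V2 = lambda xv, xw: -1.0 if xv == xw else 1.0    # favors equal neighbors

def U(x):
    """Energy (4.5): sum of clique potentials."""
    return sum(V1(x[v]) for v in range(4)) + sum(V2(x[v], x[w]) for v, w in edges)

configs = list(itertools.product([0, 1], repeat=4))
weights = np.array([np.exp(-beta * U(x)) for x in configs])
Z = weights.sum()                                # partition function (4.4)
P = weights / Z                                  # Gibbs distribution (4.3)
print("Z =", Z)
print("most probable configuration:", configs[int(P.argmax())])
```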
Following the Hammersley–Clifford theorem, the process $\{X_v,\,v\in V\}$ is an MRF if and only if it has a Gibbs distribution with potentials defined on the cliques of the neighborhood system $N(v)$. The conditional probability $P(X_v\mid X_w,\ w\in N(v))$ is then given by:
$$P(X_v\mid X_w,\ w\in N(v))=\frac{1}{Z_v}\exp\Big(-\beta\sum_{c\in C}V_c(x)\Big),\tag{4.6}$$
where
$$Z_v=\sum_{x,y\in\mathbb{R}^d}\exp\Big(-\beta\sum_{c\in C}V_c(x\mid X_v=y)\Big),\tag{4.7}$$
and here $\{x\mid X_v=y\}=\{x_1,\dots,x_k,y,\dots,x_d\}$, $k=1,\dots,d$.
In image processing one can understand Eq (4.6) as follows. Associate an image with the stochastic process $X_{v_0}$, where $v_0$ here refers to a site in the image. Then the conditional probability given by Eq (4.6) can be written in this case as:
$$P(X_{v_0}=x_{v_0}\mid X_v=x_v,\ v_0\ne v).\tag{4.8}$$
Moreover, if $v_0$ is the site $(v,w)$, then the neighbors of $v_0$ are $(v,w+1)$, $(v,w-1)$, $(v+1,w)$ and $(v-1,w)$; notice that any two sites are connected through a path, which is expressed in terms of vertices and edges. Consequently, the conditional probability given by Eq (4.8) is nothing else than the first-order Gauss–Markov model, see, e.g., [6]:
$$P(X_{v_0}=x_{v_0}\mid X_v)=\frac{1}{\sqrt{2\pi}}\exp\Big[-\frac{1}{2}\Big(x_{v,w}-\frac{1}{4}\big(x_{v,w+1}+x_{v,w-1}+x_{v+1,w}+x_{v-1,w}\big)\Big)^2\Big].\tag{4.9}$$
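In practice, (4.9) can drive simple image denoising. The sketch below performs coordinate-wise updates in which each interior pixel is set to the mean of its conditional law; note that the Gaussian data-fidelity term is our added assumption (it is not part of Eq (4.9)), the toy image and noise level are illustrative, and the update is the iterated-conditional-modes variant of Gibbs sampling.

```python
import numpy as np

rng = np.random.default_rng(3)

# Denoising with the first-order Gauss-Markov model (4.9) plus a Gaussian
# observation term: each interior pixel is set to its posterior conditional
# mean, given its four neighbors and the noisy observation y.
img = np.zeros((32, 32))
img[8:24, 8:24] = 1.0                          # toy image: a bright square
s2 = 0.09                                      # observation noise variance
y = img + rng.normal(0.0, np.sqrt(s2), img.shape)

x = y.copy()
for sweep in range(10):
    for v in range(1, 31):
        for w in range(1, 31):
            m = 0.25 * (x[v, w+1] + x[v, w-1] + x[v+1, w] + x[v-1, w])
            # prior (4.9) has unit precision; the data term has precision 1/s2
            x[v, w] = (y[v, w] / s2 + m) / (1.0 / s2 + 1.0)

print("mean abs. error before:", np.abs(y - img).mean())
print("mean abs. error after :", np.abs(x - img).mean())
```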
Remark 4.3. From the above example and the expression of the Gibbs distribution, one can obtain the density function by assembling the different clique energies from the conditional probability and then compute the energy function by adding up the clique energies, see, e.g., [6,11,16].
We would now like to establish a graphical representation of the solution $\{X_v,\,v\in V\}$ of the SDE (1.1). Since the solution is characterized by the Gibbs distribution given by Eq (4.3), it suffices to provide a graphical representation of $P(X_v=x)$; this amounts to expressing graphically the sites in the image as well as the cliques.
Proposition 4.4. In the sense of formal power series, the distribution function of the solution of the SDE (1.1) is given by:
$$P(X_v=x)=\frac{1}{Z_v}\sum_{n=0}^{\infty}\frac{(-\beta)^n}{n!}\sum_{\substack{n_0,n_1,\dots\ge 0\\ n_0+n_1+\cdots=n}}\frac{n!}{n_0!\,n_1!\cdots}\prod_{i\ge 0}V_{N_i}^{n_i}(x(v)),\tag{4.10}$$
where the $x(a_i)$ are the neighbors of $a_i\in N_i$, $a_i=v,(u,v),\dots$
Proof.
$$\begin{aligned}P(X_v=x)&=\frac{1}{Z_v}\exp\big(-\beta U(x)\big)=\frac{1}{Z_v}\sum_{n=0}^{\infty}\frac{(-\beta)^n}{n!}U^n(x)\\&=\frac{1}{Z_v}\sum_{n=0}^{\infty}\frac{(-\beta)^n}{n!}\Big(\sum_{v\in N_1}V_{N_1}(x_v)+\sum_{(v,w)\in N_2}V_{N_2}(v,w)+\cdots\Big)^n\\&=\frac{1}{Z_v}\sum_{n=0}^{\infty}\frac{(-\beta)^n}{n!}\sum_{\substack{n_0,n_1,\dots\ge 0\\ n_0+n_1+\cdots=n}}\frac{n!}{n_0!\,n_1!\cdots}\prod_{i\ge 0}V_{N_i}^{n_i}(x(v)),\end{aligned}\tag{4.11}$$
by the multinomial theorem, where the $x(a_i)$ are the neighbors of $a_i\in N_i$, $a_i=v,(u,v),\dots$
Definition 4.5. Let $G=(V,E)$ be a Feynman graph; the random variable $L(G,x)$ is defined as follows:
1) Assign $x\in\mathbb{R}^d$ to the root (first inner vertex) of the graph $G$. Assign values $x_1,\dots,x_d\in\mathbb{R}$ to the other inner vertices, where $x(v)=(x_1,\dots,x_d)\in\mathbb{R}^d$.
2) For every edge with two end points, $e=\{v,w\}$, assign the value $\tilde G(e)=\tilde G(v-w)$, $(v\le w)$, to this edge, where $\tilde G$ is the Green function defined by Eq (2.15).
3) For the $i$-th inner vertex, multiply by $V_{N_i}$.
4) For the $i$-th inner vertex, multiply by the coefficient $\frac{n!}{n_0!\,n_1!\cdots n_i!\cdots}$.
5) Integrate with respect to the Lebesgue measure $dx_1\cdots dx_d$.
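To make these rules concrete, here is a sketch that evaluates $L(G,x)$ by quadrature for a hypothetical chain graph with root $x$ and two inner vertices (one edge factor $\tilde G(x-x_1)$ and one $\tilde G(x_1-x_2)$), using the causal kernel $\tilde G(t)=\Theta(t)e^{\alpha t}$ with $\alpha<0$ and an illustrative Gaussian potential; the graph, potential, and combinatorial coefficient are toy choices, not taken from the paper.

```python
import numpy as np

# Toy evaluation of L(G, x) following Definition 4.5 for a chain graph
# root -> x1 -> x2: Green function on each edge (rule 2), a potential at
# each inner vertex (rule 3), then integration over x1, x2 (rule 5).
alpha = -1.0
G = lambda t: np.where(t >= 0, np.exp(alpha * t), 0.0)  # causal kernel from (2.15)
V = lambda u: np.exp(-u ** 2)                            # illustrative potential

x = 0.5
grid = np.linspace(-10.0, 10.0, 801)                     # inner-vertex grid
dx = grid[1] - grid[0]
x1, x2 = np.meshgrid(grid, grid, indexing="ij")

integrand = G(x - x1) * G(x1 - x2) * V(x1) * V(x2)
coeff = 1.0                                              # rule 4), toy choice
L_Gx = coeff * integrand.sum() * dx * dx                 # rule 5), quadrature
print("L(G, x) ~", L_Gx)
```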
Theorem 4.6. The distribution function of the solution of the stochastic differential equation (1.1) is given by a sum over all Feynman graphs $G$, evaluated according to the rules fixed in Definition 4.5, i.e.,
$$P(X_v=x)=\frac{1}{Z_v}\sum_{n=0}^{\infty}\frac{(-\beta)^n}{n!}\sum_{G\in F(n)}L(G,x).\tag{4.12}$$
Proof. From Proposition 4.4 and Definition 4.5 we have
$$P(X_v=x)=\frac{1}{Z_v}\sum_{n=0}^{\infty}\frac{(-\beta)^n}{n!}\sum_{\substack{n_0,n_1,\dots\ge 0\\ n_0+n_1+\cdots=n}}\frac{n!}{n_0!\,n_1!\cdots}\prod_{i\ge 0}V_{N_i}^{n_i}(x(v))=\frac{1}{Z_v}\sum_{n=0}^{\infty}\frac{(-\beta)^n}{n!}\sum_{G\in F(n)}L(G,x).\tag{4.13}$$
Markov random fields have enjoyed much success on structured data, particularly on images and cliques where pixels are well arranged, see, e.g., [17].
A traditional MRF has one graph in its model; later, in [15], the authors used three graphs to characterize hidden communities in a given network. In our current work the MRF has $d\in\mathbb{N}$ graphs in its model. Moreover, if different models are provided, they can be used in competition to analyze and identify the content of a given image. A particular case arises when the SDE (1.1) is driven by a Gaussian noise. In this case the Gauss–Markov random field is considered and identified by the distribution given in Eq (4.9). Moreover, the variables in the Feynman graph $G$ interact with each other through the quadratic energy, since the matrix $\alpha$ is sparse; therefore the sites $v$ and $w$ are neighbors in $G$ if the corresponding entry $\alpha_{vw}=\alpha_{wv}$ is nonzero.
In this work the following results were achieved: we proved that the solution of the Lévy-type SDE is an MRF satisfying the Markov property; we proved that by scaling the Poisson process one obtains the Gaussian process; and we showed that the Gibbs distribution of the solution process is represented graphically through Feynman graphs according to fixed rules. At the end, we outlined some applications in image processing, mainly where the image intensity depends only on a neighborhood of pixels.
This work is supported by King Fahd University of Petroleum and Minerals. The author gratefully acknowledges this support.
The author declares that there is no conflict of interest regarding the publication of this paper.
[1] S. Albeverio, L. Dipersio, E. Mastrogiacomo, B. Smii, Invariant measures for SDEs driven by Lévy noise: A case study for dissipative nonlinear drift in infinite dimension, Commun. Math. Sci., 15 (2017), 957–983. https://doi.org/10.4310/CMS.2017.v15.n4.a3
[2] S. Albeverio, L. Dipersio, E. Mastrogiacomo, B. Smii, A class of Lévy driven SDEs and their explicit invariant measures, Potential Anal., 45 (2016), 229–259. https://doi.org/10.1007/s11118-016-9544-3
[3] S. Albeverio, E. Mastrogiacomo, B. Smii, Small noise asymptotic expansions for stochastic PDE's driven by dissipative nonlinearity and Lévy noise, Stoch. Proc. Appl., 123 (2013), 2084–2109. https://doi.org/10.1016/j.spa.2013.01.013
[4] S. Albeverio, B. Smii, Borel summation of the small time expansion of some SDE's driven by Gaussian white noise, Asymptotic Anal., 114 (2019), 211–223. https://doi.org/10.3233/ASY-191525
[5] S. Albeverio, B. Smii, Asymptotic expansions for SDE's with small multiplicative noise, Stoch. Proc. Appl., 125 (2015), 1009–1031. https://doi.org/10.1016/j.spa.2014.09.009
[6] J. Besag, Spatial interaction and the statistical analysis of lattice systems, J. Royal Stat. Soc. Ser. B, 36 (1974), 192–236.
[7] A. Blake, P. Kohli, C. Rother, Markov random fields for vision and image processing, The MIT Press, 2011.
[8] P. Brémaud, Markov chains, Gibbs fields, Monte Carlo simulation and queues, Springer-Verlag, 1999.
[9] Z. Brzeźniak, E. Hausenblas, Uniqueness in law of the Itô integral with respect to Lévy noise, In: R. Dalang, M. Dozzi, F. Russo, Seminar on stochastic analysis, random fields and applications VI, Vol. 63, Basel: Springer, 2011. https://doi.org/10.1007/978-3-0348-0021-1_3
[10] Y. Fu, Y. Kang, G. Chen, Stochastic resonance based visual perception using spiking neural networks, Front. Comput. Neurosci., 14 (2020), 24. https://doi.org/10.3389/fncom.2020.00024
[11] S. Geman, D. Geman, Stochastic relaxation, Gibbs distributions and the Bayesian restoration of images, IEEE T. Pattern Anal., PAMI-6 (1984), 721–741. https://doi.org/10.1109/TPAMI.1984.4767596
[12] D. Griffeath, Introduction to random fields, In: Denumerable Markov chains, Graduate Texts in Mathematics, New York: Springer, 1976. https://doi.org/10.1007/978-1-4684-9455-6_12
[13] H. Gottschalk, B. Smii, How to determine the law of the solution to a SPDE driven by a Lévy space-time noise, J. Math. Phys., 43 (2007), 1–22.
[14] P. Hartman, A. Wintner, On the infinitesimal generators of integral convolutions, Am. J. Math., 64 (1942), 273–298. https://doi.org/10.2307/2371683
[15] K. He, Y. Li, S. Soundarajan, J. E. Hopcroft, Hidden community detection in social networks, Inform. Sciences, 425 (2018), 92–106. https://doi.org/10.1016/j.ins.2017.10.019
[16] R. Kindermann, J. L. Snell, Markov random fields and their applications, Contemporary Mathematics, 1980. http://dx.doi.org/10.1090/conm/001
[17] P. Krähenbühl, V. Koltun, Efficient inference in fully connected CRFs with Gaussian edge potentials, Adv. Neural Inf. Process. Syst., 24 (2011), 109–117.
[18] D. Mugnolo, Gaussian estimates for a heat equation on a network, Netw. Heterog. Media, 2 (2007), 55–79. https://doi.org/10.3934/nhm.2007.2.55
[19] D. Mugnolo, S. Romanelli, Dynamic and generalized Wentzell node conditions for network equations, Math. Methods Appl. Sci., 30 (2007), 681–706. https://doi.org/10.1002/mma.805
[20] K. I. Sato, Lévy processes and infinitely divisible distributions, Cambridge University Press, 1999.
[21] B. Smii, Asymptotic expansion of the transition density of the semigroup associated to a SDE driven by Lévy noise, Asymptotic Anal., 124 (2021), 51–68. https://doi.org/10.3233/ASY-201640
[22] C. Turchetti, Stochastic models of neural networks, IOS Press, 2004.
[23] V. K. Pandey, H. Agarwal, A. K. Aggarwal, Image solution of stochastic differential equation of diffusion type driven by Brownian motion, Singapore: Springer, 2021, 542–553. https://doi.org/10.1007/978-981-16-1092-9_46