
Wave interference network with a wave function for traffic sign recognition


  • Received: 04 August 2023 Revised: 19 September 2023 Accepted: 26 September 2023 Published: 16 October 2023
  • In this paper, we combine convolution with a wave function to build an effective and efficient classifier for traffic signs, named the wave interference network (WiNet). In WiNet, the feature map extracted by the convolutional filters from an input image is refined into many entities, and each entity is represented as a wave. We use Euler's formula to unfold the wave function. Based on this wave-like information representation, the model adaptively modulates the relationship between the entities and the fixed convolution weights. Experimental results on the Chinese Traffic Sign Recognition Database (CTSRD) and the German Traffic Sign Recognition Benchmark (GTSRB) demonstrate that the presented model outperforms other models, such as ResMLP, ResNet50, PVT and ViT, in the following respects: 1) WiNet obtains the best accuracy rate, 99.80%, on the CTSRD and recognizes all images exactly on the GTSRB; 2) WiNet is more robust than the other models on data with different noises; 3) WiNet generalizes well across different datasets.

    Citation: Qiang Weng, Dewang Chen, Yuandong Chen, Wendi Zhao, Lin Jiao. Wave interference network with a wave function for traffic sign recognition[J]. Mathematical Biosciences and Engineering, 2023, 20(11): 19254-19269. doi: 10.3934/mbe.2023851




    The equation:

    $$\begin{cases}\partial_t u+\partial_x f(u)-\beta^2\partial_x^2u+\delta\partial_x^3u+\kappa u+\gamma^2|u|u=0, & 0<t<T,\ x\in\mathbb{R},\\ u(0,x)=u_0(x), & x\in\mathbb{R},\end{cases}\tag{1.1}$$

    was originally derived in [14,17] with $f(u)=au^2$, focusing on microbubbles coated by viscoelastic shells. These structures are crucial in ultrasound diagnosis using contrast agents, and the dynamics of individual coated bubbles are explored there, taking into account nonlinear competition and dissipation factors such as dispersion, thermal effects and drag force.

    The coefficients $\beta^2$, $\delta$, $\kappa$ and $\gamma^2$ are related to the dissipation, the dispersion, the thermal conduction dissipation and the drag force, respectively.

    If κ=γ=0, we obtain the Kudryashov-Sinelshchikov [18] Korteweg-de Vries-Burgers [3,20] equation

    $$\partial_t u+a\partial_x u^2-\beta^2\partial_x^2u+\delta\partial_x^3u=0,\tag{1.2}$$

    that models pressure waves in liquids with gas bubbles, taking into account heat transfer and viscosity. The mathematical results on Eq (1.2) are the following:

    ● analysis of exact solutions in [13],

    ● existence of the traveling waves in [2],

    ● well-posedness and asymptotic behavior in [7,11].

    If β=0, we derive the Korteweg-de Vries equation:

    $$\partial_t u+a\partial_x u^2+\delta\partial_x^3u=0,\tag{1.3}$$

    which describes surface waves of small amplitude and long wavelength in shallow water. Here, $u(t,x)$ represents the wave height above a flat bottom, $x$ corresponds to the distance in the propagation direction, and $t$ denotes the elapsed time. In [4,6,10,12,15,16], the complete integrability of Eq (1.3) and the existence of solitary wave solutions are proved.
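
    As a concrete illustration (a standard traveling-wave computation, included here only as an example and not taken from the cited works), Eq (1.3) admits the solitary wave solutions

    $$u(t,x)=\frac{3c}{2a}\operatorname{sech}^2\!\left(\sqrt{\frac{c}{4\delta}}\,(x-ct)\right),\qquad \frac{c}{\delta}>0,\ a\ne0:$$

    inserting the ansatz $u=\varphi(x-ct)$ into Eq (1.3) and integrating once (with decay at infinity) gives $-c\varphi+a\varphi^2+\delta\varphi''=0$, and matching the $\operatorname{sech}^2$ and $\operatorname{sech}^4$ coefficients for $\varphi(\xi)=A\operatorname{sech}^2(k\xi)$ forces $k^2=c/(4\delta)$ and $A=3c/(2a)$.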

    Throughout the manuscript, we will assume

    ● on the coefficients

    $$\beta,\,\delta,\,\kappa,\,\gamma\in\mathbb{R},\qquad\beta,\,\delta,\,\gamma\ne0;\tag{1.4}$$

    ● on the flux f, one of the following conditions:

    $$f(u)=au^2+bu^3,\tag{1.5}$$

    $$f\in C^1(\mathbb{R}),\qquad|f'(u)|\le C_0(1+|u|),\quad u\in\mathbb{R},\tag{1.6}$$

    for some positive constant $C_0$;

    ● on the initial value

    $$u_0\in H^1(\mathbb{R}).\tag{1.7}$$

    The main result of this paper is the following theorem.

    Theorem 1.1. Assume Eqs (1.5)–(1.7). For fixed T>0, there exists a unique distributional solution u of Eq (1.1), such that

    $$u\in L^\infty(0,T;H^1(\mathbb{R}))\cap L^4(0,T;W^{1,4}(\mathbb{R}))\cap L^6(0,T;W^{1,6}(\mathbb{R})),\qquad\partial_x^2u\in L^2((0,T)\times\mathbb{R}).\tag{1.8}$$

    Moreover, if $u_1$ and $u_2$ are solutions of Eq (1.1) corresponding to the initial conditions $u_{1,0}$ and $u_{2,0}$, respectively, it holds that

    $$\|u_1(t,\cdot)-u_2(t,\cdot)\|_{L^2(\mathbb{R})}\le e^{C(T)t}\|u_{1,0}-u_{2,0}\|_{L^2(\mathbb{R})},\tag{1.9}$$

    for some suitable $C(T)>0$ and every $0\le t\le T$.

    Observe that Theorem 1.1 gives the well-posedness of Eq (1.1) without any condition on the constants. Moreover, the proof of Theorem 1.1 is based on the Aubin-Lions Lemma [5,21]. The analysis of Eq (1.1) is more delicate than that of Eq (1.2) due to the presence of the nonlinear source terms and the very general assumptions on the coefficients.

    The structure of the paper is outlined as follows. Section 2 is dedicated to establishing several a priori estimates for a vanishing viscosity approximation of Eq (1.1). These estimates are crucial for proving our main result, which is presented in Section 3.

    To establish existence, we utilize a vanishing viscosity approximation of Eq (1.1), as discussed in [19]. Let $0<\varepsilon<1$ be a small parameter, and denote by $u_\varepsilon\in C^{\infty}([0,T)\times\mathbb{R})$ the unique classical solution of the following problem [1,9]:

    $$\begin{cases}\partial_t u_\varepsilon+\partial_x f(u_\varepsilon)-\beta^2\partial_x^2u_\varepsilon+\delta\partial_x^3u_\varepsilon+\kappa u_\varepsilon+\gamma^2|u_\varepsilon|u_\varepsilon=-\varepsilon\partial_x^4u_\varepsilon, & 0<t<T,\ x\in\mathbb{R},\\ u_\varepsilon(0,x)=u_{\varepsilon,0}(x), & x\in\mathbb{R},\end{cases}\tag{2.1}$$

    where $u_{\varepsilon,0}$ is a $C^{\infty}$ approximation of $u_0$ such that

    $$\|u_{\varepsilon,0}\|_{H^1(\mathbb{R})}\le\|u_0\|_{H^1(\mathbb{R})}.\tag{2.2}$$
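
    For instance (one admissible choice, given only as an example and not prescribed by [1,9]), one can take $u_{\varepsilon,0}=u_0*\rho_\varepsilon$, where $\rho_\varepsilon(x)=\varepsilon^{-1}\rho(x/\varepsilon)$ and $\rho\in C_c^{\infty}(\mathbb{R})$ is a nonnegative mollifier with unit mass. Then $u_{\varepsilon,0}\in C^{\infty}(\mathbb{R})$ and, by Young's inequality for convolutions,

    $$\|u_{\varepsilon,0}\|_{H^1(\mathbb{R})}=\left(\|u_0*\rho_\varepsilon\|_{L^2(\mathbb{R})}^2+\|(\partial_xu_0)*\rho_\varepsilon\|_{L^2(\mathbb{R})}^2\right)^{1/2}\le\|\rho\|_{L^1(\mathbb{R})}\|u_0\|_{H^1(\mathbb{R})}=\|u_0\|_{H^1(\mathbb{R})},$$

    so Eq (2.2) is satisfied.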

    Let us prove some a priori estimates on uε, denoting with C0 constants which depend only on the initial data, and with C(T) the constants which depend also on T.

    We begin by proving the following lemma:

    Lemma 2.1. Let T>0 be fixed. There exists a constant C(T)>0, which does not depend on ε, such that

    $$\|u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+2\gamma^2e^{|\kappa|t}\int_0^t\!\!\int_{\mathbb{R}}e^{-|\kappa|s}|u_\varepsilon|u_\varepsilon^2\,ds\,dx+2\beta^2e^{|\kappa|t}\int_0^te^{-|\kappa|s}\|\partial_xu_\varepsilon(s,\cdot)\|_{L^2(\mathbb{R})}^2\,ds+2\varepsilon e^{|\kappa|t}\int_0^te^{-|\kappa|s}\|\partial_x^2u_\varepsilon(s,\cdot)\|_{L^2(\mathbb{R})}^2\,ds\le C(T),\tag{2.3}$$

    for every $0\le t\le T$.

    Proof. Let $0\le t\le T$. Multiplying Eq (2.1) by $2u_\varepsilon$ and integrating over $\mathbb{R}$ yields

    $$\begin{aligned}
    \frac{d}{dt}\|u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2&=2\int_{\mathbb{R}}u_\varepsilon\,\partial_tu_\varepsilon\,dx\\
    &=\underbrace{-2\int_{\mathbb{R}}u_\varepsilon f'(u_\varepsilon)\partial_xu_\varepsilon\,dx}_{=0}+2\beta^2\int_{\mathbb{R}}u_\varepsilon\partial_x^2u_\varepsilon\,dx-2\delta\int_{\mathbb{R}}u_\varepsilon\partial_x^3u_\varepsilon\,dx-\kappa\|u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2-2\gamma^2\int_{\mathbb{R}}|u_\varepsilon|u_\varepsilon^2\,dx-2\varepsilon\int_{\mathbb{R}}u_\varepsilon\partial_x^4u_\varepsilon\,dx\\
    &=-2\beta^2\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+2\delta\int_{\mathbb{R}}\partial_xu_\varepsilon\partial_x^2u_\varepsilon\,dx-\kappa\|u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2-2\gamma^2\int_{\mathbb{R}}|u_\varepsilon|u_\varepsilon^2\,dx+2\varepsilon\int_{\mathbb{R}}\partial_xu_\varepsilon\partial_x^3u_\varepsilon\,dx\\
    &=-2\beta^2\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2-\kappa\|u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2-2\gamma^2\int_{\mathbb{R}}|u_\varepsilon|u_\varepsilon^2\,dx-2\varepsilon\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2.
    \end{aligned}$$

    Thus, it follows that

    $$\frac{d}{dt}\|u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+2\beta^2\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+2\gamma^2\int_{\mathbb{R}}|u_\varepsilon|u_\varepsilon^2\,dx+2\varepsilon\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2=-\kappa\|u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2\le|\kappa|\|u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2.$$

    Therefore, applying the Gronwall Lemma and using Eq (2.2), we obtain

    $$\|u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+2\beta^2e^{|\kappa|t}\int_0^te^{-|\kappa|s}\|\partial_xu_\varepsilon(s,\cdot)\|_{L^2(\mathbb{R})}^2\,ds+2\gamma^2e^{|\kappa|t}\int_0^t\!\!\int_{\mathbb{R}}e^{-|\kappa|s}|u_\varepsilon|u_\varepsilon^2\,ds\,dx+2\varepsilon e^{|\kappa|t}\int_0^te^{-|\kappa|s}\|\partial_x^2u_\varepsilon(s,\cdot)\|_{L^2(\mathbb{R})}^2\,ds\le C_0e^{|\kappa|t}\le C(T),$$

    which gives Eq (2.3).
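
    For the reader's convenience (a routine step; the symbols $G$ and $F$ are shorthand introduced only here), the Gronwall argument runs as follows: writing $G(t)=\|u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2$ and letting $F(t)\ge0$ denote the sum of the three dissipative terms on the left-hand side of the previous inequality, we have $G'(t)+F(t)\le|\kappa|G(t)$, hence

    $$\frac{d}{dt}\left(e^{-|\kappa|t}G(t)\right)+e^{-|\kappa|t}F(t)\le0,\qquad G(t)+e^{|\kappa|t}\int_0^te^{-|\kappa|s}F(s)\,ds\le e^{|\kappa|t}G(0)\le e^{|\kappa|t}\|u_0\|_{H^1(\mathbb{R})}^2\le C(T),$$

    where the last two inequalities use Eq (2.2) and $0\le t\le T$.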

    Lemma 2.2. Fix T>0 and assume (1.5). There exists a constant C(T)>0, independent of ε, such that

    $$\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}\le C(T),\tag{2.4}$$
    $$\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+\beta^2\int_0^t\|\partial_x^2u_\varepsilon(s,\cdot)\|_{L^2(\mathbb{R})}^2\,ds+2\varepsilon\int_0^t\|\partial_x^3u_\varepsilon(s,\cdot)\|_{L^2(\mathbb{R})}^2\,ds\le C(T),\tag{2.5}$$
    $$\int_0^t\|\partial_xu_\varepsilon(s,\cdot)\|_{L^4(\mathbb{R})}^4\,ds\le C(T),\tag{2.6}$$

    hold for every $0\le t\le T$.

    Proof. Let $0\le t\le T$, and let $A,B$ be two real constants to be specified later. Thanks to Eq (1.5), multiplying Eq (2.1) by

    $$-2\partial_x^2u_\varepsilon+Au_\varepsilon^2+Bu_\varepsilon^3,$$

    we have that

    $$\begin{aligned}
    &\left(-2\partial_x^2u_\varepsilon+Au_\varepsilon^2+Bu_\varepsilon^3\right)\partial_tu_\varepsilon+2a\left(-2\partial_x^2u_\varepsilon+Au_\varepsilon^2+Bu_\varepsilon^3\right)u_\varepsilon\partial_xu_\varepsilon+3b\left(-2\partial_x^2u_\varepsilon+Au_\varepsilon^2+Bu_\varepsilon^3\right)u_\varepsilon^2\partial_xu_\varepsilon\\
    &\qquad-\beta^2\left(-2\partial_x^2u_\varepsilon+Au_\varepsilon^2+Bu_\varepsilon^3\right)\partial_x^2u_\varepsilon+\delta\left(-2\partial_x^2u_\varepsilon+Au_\varepsilon^2+Bu_\varepsilon^3\right)\partial_x^3u_\varepsilon+\kappa\left(-2\partial_x^2u_\varepsilon+Au_\varepsilon^2+Bu_\varepsilon^3\right)u_\varepsilon\\
    &\qquad+\gamma^2\left(-2\partial_x^2u_\varepsilon+Au_\varepsilon^2+Bu_\varepsilon^3\right)|u_\varepsilon|u_\varepsilon=-\varepsilon\left(-2\partial_x^2u_\varepsilon+Au_\varepsilon^2+Bu_\varepsilon^3\right)\partial_x^4u_\varepsilon.
    \end{aligned}\tag{2.7}$$

    Observe that

    $$\begin{aligned}
    \int_{\mathbb{R}}\left(-2\partial_x^2u_\varepsilon+Au_\varepsilon^2+Bu_\varepsilon^3\right)\partial_tu_\varepsilon\,dx&=\frac{d}{dt}\left(\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+\frac{A}{3}\int_{\mathbb{R}}u_\varepsilon^3\,dx+\frac{B}{4}\int_{\mathbb{R}}u_\varepsilon^4\,dx\right),\\
    2a\int_{\mathbb{R}}\left(-2\partial_x^2u_\varepsilon+Au_\varepsilon^2+Bu_\varepsilon^3\right)u_\varepsilon\partial_xu_\varepsilon\,dx&=-4a\int_{\mathbb{R}}u_\varepsilon\partial_xu_\varepsilon\partial_x^2u_\varepsilon\,dx,\\
    3b\int_{\mathbb{R}}\left(-2\partial_x^2u_\varepsilon+Au_\varepsilon^2+Bu_\varepsilon^3\right)u_\varepsilon^2\partial_xu_\varepsilon\,dx&=-6b\int_{\mathbb{R}}u_\varepsilon^2\partial_xu_\varepsilon\partial_x^2u_\varepsilon\,dx,\\
    -\beta^2\int_{\mathbb{R}}\left(-2\partial_x^2u_\varepsilon+Au_\varepsilon^2+Bu_\varepsilon^3\right)\partial_x^2u_\varepsilon\,dx&=2\beta^2\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+2A\beta^2\int_{\mathbb{R}}u_\varepsilon(\partial_xu_\varepsilon)^2\,dx+3B\beta^2\int_{\mathbb{R}}u_\varepsilon^2(\partial_xu_\varepsilon)^2\,dx,\\
    \delta\int_{\mathbb{R}}\left(-2\partial_x^2u_\varepsilon+Au_\varepsilon^2+Bu_\varepsilon^3\right)\partial_x^3u_\varepsilon\,dx&=-2A\delta\int_{\mathbb{R}}u_\varepsilon\partial_xu_\varepsilon\partial_x^2u_\varepsilon\,dx-3B\delta\int_{\mathbb{R}}u_\varepsilon^2\partial_xu_\varepsilon\partial_x^2u_\varepsilon\,dx,\\
    \kappa\int_{\mathbb{R}}\left(-2\partial_x^2u_\varepsilon+Au_\varepsilon^2+Bu_\varepsilon^3\right)u_\varepsilon\,dx&=2\kappa\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+A\kappa\int_{\mathbb{R}}u_\varepsilon^3\,dx+B\kappa\int_{\mathbb{R}}u_\varepsilon^4\,dx,\\
    \gamma^2\int_{\mathbb{R}}\left(-2\partial_x^2u_\varepsilon+Au_\varepsilon^2+Bu_\varepsilon^3\right)|u_\varepsilon|u_\varepsilon\,dx&=-2\gamma^2\int_{\mathbb{R}}|u_\varepsilon|u_\varepsilon\partial_x^2u_\varepsilon\,dx+A\gamma^2\int_{\mathbb{R}}|u_\varepsilon|u_\varepsilon^3\,dx+B\gamma^2\int_{\mathbb{R}}|u_\varepsilon|u_\varepsilon^4\,dx,\\
    -\varepsilon\int_{\mathbb{R}}\left(-2\partial_x^2u_\varepsilon+Au_\varepsilon^2+Bu_\varepsilon^3\right)\partial_x^4u_\varepsilon\,dx&=-2\varepsilon\|\partial_x^3u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+2A\varepsilon\int_{\mathbb{R}}u_\varepsilon\partial_xu_\varepsilon\partial_x^3u_\varepsilon\,dx+3B\varepsilon\int_{\mathbb{R}}u_\varepsilon^2\partial_xu_\varepsilon\partial_x^3u_\varepsilon\,dx\\
    &=-2\varepsilon\|\partial_x^3u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2-A\varepsilon\int_{\mathbb{R}}(\partial_xu_\varepsilon)^3\,dx-6B\varepsilon\int_{\mathbb{R}}u_\varepsilon(\partial_xu_\varepsilon)^2\partial_x^2u_\varepsilon\,dx-3B\varepsilon\|u_\varepsilon(t,\cdot)\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2\\
    &=-2\varepsilon\|\partial_x^3u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2-A\varepsilon\int_{\mathbb{R}}(\partial_xu_\varepsilon)^3\,dx+2B\varepsilon\int_{\mathbb{R}}(\partial_xu_\varepsilon)^4\,dx-3B\varepsilon\|u_\varepsilon(t,\cdot)\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2.
    \end{aligned}$$

    Therefore, an integration on R gives

    $$\begin{aligned}
    &\frac{d}{dt}\left(\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+\frac{A}{3}\int_{\mathbb{R}}u_\varepsilon^3\,dx+\frac{B}{4}\int_{\mathbb{R}}u_\varepsilon^4\,dx\right)+2\beta^2\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+2\varepsilon\|\partial_x^3u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2\\
    &\quad=(4a+A\delta)\int_{\mathbb{R}}u_\varepsilon\partial_xu_\varepsilon\partial_x^2u_\varepsilon\,dx+3(2b+B\delta)\int_{\mathbb{R}}u_\varepsilon^2\partial_xu_\varepsilon\partial_x^2u_\varepsilon\,dx-2A\beta^2\int_{\mathbb{R}}u_\varepsilon(\partial_xu_\varepsilon)^2\,dx-3B\beta^2\int_{\mathbb{R}}u_\varepsilon^2(\partial_xu_\varepsilon)^2\,dx\\
    &\qquad-\kappa\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2-\frac{A\kappa}{3}\int_{\mathbb{R}}u_\varepsilon^3\,dx-\frac{B\kappa}{4}\int_{\mathbb{R}}u_\varepsilon^4\,dx+2\gamma^2\int_{\mathbb{R}}|u_\varepsilon|u_\varepsilon\partial_x^2u_\varepsilon\,dx-A\gamma^2\int_{\mathbb{R}}|u_\varepsilon|u_\varepsilon^3\,dx-B\gamma^2\int_{\mathbb{R}}|u_\varepsilon|u_\varepsilon^4\,dx\\
    &\qquad-A\varepsilon\int_{\mathbb{R}}(\partial_xu_\varepsilon)^3\,dx+2B\varepsilon\int_{\mathbb{R}}(\partial_xu_\varepsilon)^4\,dx-3B\varepsilon\|u_\varepsilon(t,\cdot)\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2.
    \end{aligned}$$

    Taking

    $$(A,B)=\left(-\frac{4a}{\delta},\,-\frac{2b}{\delta}\right),$$

    we get

    $$\begin{aligned}
    &\frac{d}{dt}\left(\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2-\frac{4a}{3\delta}\int_{\mathbb{R}}u_\varepsilon^3\,dx-\frac{b}{\delta}\int_{\mathbb{R}}u_\varepsilon^4\,dx\right)+2\beta^2\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+2\varepsilon\|\partial_x^3u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2\\
    &\quad=\frac{8a\beta^2}{\delta}\int_{\mathbb{R}}u_\varepsilon(\partial_xu_\varepsilon)^2\,dx+\frac{6b\beta^2}{\delta}\int_{\mathbb{R}}u_\varepsilon^2(\partial_xu_\varepsilon)^2\,dx-\kappa\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+\frac{4a\kappa}{3\delta}\int_{\mathbb{R}}u_\varepsilon^3\,dx+\frac{b\kappa}{2\delta}\int_{\mathbb{R}}u_\varepsilon^4\,dx\\
    &\qquad+2\gamma^2\int_{\mathbb{R}}|u_\varepsilon|u_\varepsilon\partial_x^2u_\varepsilon\,dx+\frac{4a\gamma^2}{\delta}\int_{\mathbb{R}}|u_\varepsilon|u_\varepsilon^3\,dx+\frac{2b\gamma^2}{\delta}\int_{\mathbb{R}}|u_\varepsilon|u_\varepsilon^4\,dx\\
    &\qquad+\frac{4a\varepsilon}{\delta}\int_{\mathbb{R}}(\partial_xu_\varepsilon)^3\,dx-\frac{4b\varepsilon}{\delta}\int_{\mathbb{R}}(\partial_xu_\varepsilon)^4\,dx+\frac{6b\varepsilon}{\delta}\|u_\varepsilon(t,\cdot)\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2.
    \end{aligned}\tag{2.8}$$

    Since 0<ε<1, due to the Young inequality and (2.3),

    $$\begin{aligned}
    \frac{8a\beta^2}{\delta}\int_{\mathbb{R}}|u_\varepsilon|(\partial_xu_\varepsilon)^2\,dx&\le 4\int_{\mathbb{R}}u_\varepsilon^2(\partial_xu_\varepsilon)^2\,dx+\frac{4a^2\beta^4}{\delta^2}\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2\\
    &\le 4\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+\frac{4a^2\beta^4}{\delta^2}\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2\\
    &\le C_0\left(1+\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2\right)\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2,\\[4pt]
    \left|\frac{6b\beta^2}{\delta}\right|\int_{\mathbb{R}}u_\varepsilon^2(\partial_xu_\varepsilon)^2\,dx&\le\left|\frac{6b\beta^2}{\delta}\right|\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2,\\[4pt]
    \left|\frac{4a\kappa}{3\delta}\right|\int_{\mathbb{R}}|u_\varepsilon|^3\,dx&\le\left|\frac{4a\kappa}{3\delta}\right|\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}\|u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2\le C(T)\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})},\\[4pt]
    \left|\frac{b\kappa}{2\delta}\right|\int_{\mathbb{R}}u_\varepsilon^4\,dx&\le\left|\frac{b\kappa}{2\delta}\right|\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2\|u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2\le C(T)\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2,\\[4pt]
    2\gamma^2\int_{\mathbb{R}}|u_\varepsilon|u_\varepsilon\partial_x^2u_\varepsilon\,dx&\le 2\int_{\mathbb{R}}\left|\frac{\gamma^2|u_\varepsilon|u_\varepsilon}{\beta}\right|\left|\beta\partial_x^2u_\varepsilon\right|dx\le\frac{\gamma^4}{\beta^2}\int_{\mathbb{R}}u_\varepsilon^4\,dx+\beta^2\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2\\
    &\le\frac{\gamma^4}{\beta^2}\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2\|u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+\beta^2\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2\\
    &\le C(T)\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2+\beta^2\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2,\\[4pt]
    \left|\frac{4a\gamma^2}{\delta}\right|\int_{\mathbb{R}}|u_\varepsilon||u_\varepsilon|^3\,dx&=\left|\frac{4a\gamma^2}{\delta}\right|\int_{\mathbb{R}}u_\varepsilon^4\,dx\le\left|\frac{4a\gamma^2}{\delta}\right|\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2\|u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2\le C(T)\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2,\\[4pt]
    \left|\frac{2b\gamma^2}{\delta}\right|\int_{\mathbb{R}}|u_\varepsilon|u_\varepsilon^4\,dx&\le\left|\frac{2b\gamma^2}{\delta}\right|\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^3\|u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2\le C(T)\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^3,\\[4pt]
    \left|\frac{4a\varepsilon}{\delta}\right|\int_{\mathbb{R}}|\partial_xu_\varepsilon|^3\,dx&\le\left|\frac{4a\varepsilon}{\delta}\right|\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+\left|\frac{4a\varepsilon}{\delta}\right|\int_{\mathbb{R}}(\partial_xu_\varepsilon)^4\,dx\le\left|\frac{4a}{\delta}\right|\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+\left|\frac{4a\varepsilon}{\delta}\right|\int_{\mathbb{R}}(\partial_xu_\varepsilon)^4\,dx.
    \end{aligned}$$

    It follows from Eq (2.8) that

    $$\begin{aligned}
    &\frac{d}{dt}\left(\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2-\frac{4a}{3\delta}\int_{\mathbb{R}}u_\varepsilon^3\,dx-\frac{b}{\delta}\int_{\mathbb{R}}u_\varepsilon^4\,dx\right)+\beta^2\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+2\varepsilon\|\partial_x^3u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2\\
    &\quad\le C_0\left(1+\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2\right)\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+C(T)\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}+C(T)\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2+C(T)\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^3\\
    &\qquad+C_0\varepsilon\int_{\mathbb{R}}(\partial_xu_\varepsilon)^4\,dx+C_0\varepsilon\|u_\varepsilon(t,\cdot)\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+C_0\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2.
    \end{aligned}\tag{2.9}$$

    [8, Lemma 2.3] says that

    $$\int_{\mathbb{R}}(\partial_xu_\varepsilon)^4\,dx\le 9\int_{\mathbb{R}}u_\varepsilon^2(\partial_x^2u_\varepsilon)^2\,dx\le 9\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2.\tag{2.10}$$

    Moreover, we have that

    $$\|u_\varepsilon(t,\cdot)\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2=\int_{\mathbb{R}}u_\varepsilon^2(\partial_x^2u_\varepsilon)^2\,dx\le\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2.\tag{2.11}$$

    Consequently, by Eqs (2.9)–(2.11), we have that

    $$\begin{aligned}
    &\frac{d}{dt}\left(\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2-\frac{4a}{3\delta}\int_{\mathbb{R}}u_\varepsilon^3\,dx-\frac{b}{\delta}\int_{\mathbb{R}}u_\varepsilon^4\,dx\right)+\beta^2\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+2\varepsilon\|\partial_x^3u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2\\
    &\quad\le C_0\left(1+\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2\right)\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+C(T)\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}+C(T)\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2+C(T)\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^3\\
    &\qquad+C_0\varepsilon\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+C_0\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2.
    \end{aligned}$$

    An integration on (0,t) and Eqs (2.2) and (2.3) give

    $$\begin{aligned}
    &\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2-\frac{4a}{3\delta}\int_{\mathbb{R}}u_\varepsilon^3\,dx-\frac{b}{\delta}\int_{\mathbb{R}}u_\varepsilon^4\,dx+\beta^2\int_0^t\|\partial_x^2u_\varepsilon(s,\cdot)\|_{L^2(\mathbb{R})}^2\,ds+2\varepsilon\int_0^t\|\partial_x^3u_\varepsilon(s,\cdot)\|_{L^2(\mathbb{R})}^2\,ds\\
    &\quad\le C_0\left(1+\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2\right)\int_0^t\|\partial_xu_\varepsilon(s,\cdot)\|_{L^2(\mathbb{R})}^2\,ds+C(T)\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}t+C(T)\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2t+C(T)\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^3t\\
    &\qquad+C_0\varepsilon\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2\int_0^t\|\partial_x^2u_\varepsilon(s,\cdot)\|_{L^2(\mathbb{R})}^2\,ds+C_0\int_0^t\|\partial_xu_\varepsilon(s,\cdot)\|_{L^2(\mathbb{R})}^2\,ds\\
    &\quad\le C(T)\left(1+\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}+\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2+\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^3\right).
    \end{aligned}$$

    Therefore, by Eq (2.3),

    $$\begin{aligned}
    &\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+\beta^2\int_0^t\|\partial_x^2u_\varepsilon(s,\cdot)\|_{L^2(\mathbb{R})}^2\,ds+2\varepsilon\int_0^t\|\partial_x^3u_\varepsilon(s,\cdot)\|_{L^2(\mathbb{R})}^2\,ds\\
    &\quad\le C(T)\left(1+\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}+\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2+\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^3\right)+\frac{4a}{3\delta}\int_{\mathbb{R}}u_\varepsilon^3\,dx+\frac{b}{\delta}\int_{\mathbb{R}}u_\varepsilon^4\,dx\\
    &\quad\le C(T)\left(1+\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}+\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2+\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^3\right)+\left|\frac{4a}{3\delta}\right|\int_{\mathbb{R}}|u_\varepsilon|^3\,dx+\left|\frac{b}{\delta}\right|\int_{\mathbb{R}}u_\varepsilon^4\,dx\\
    &\quad\le C(T)\left(1+\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}+\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2+\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^3\right)+\left|\frac{4a}{3\delta}\right|\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}\|u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+\left|\frac{b}{\delta}\right|\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2\|u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2\\
    &\quad\le C(T)\left(1+\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}+\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2+\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^3\right).
    \end{aligned}\tag{2.12}$$

    We prove Eq (2.4). Thanks to the Hölder inequality,

    $$u_\varepsilon^2(t,x)=2\int_{-\infty}^{x}u_\varepsilon\partial_xu_\varepsilon\,dy\le 2\int_{\mathbb{R}}|u_\varepsilon||\partial_xu_\varepsilon|\,dx\le 2\|u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}.$$

    Hence, we have that

    $$\|u_\varepsilon(t,\cdot)\|_{L^\infty(\mathbb{R})}^4\le 4\|u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2.\tag{2.13}$$

    Thanks to Eqs (2.3) and (2.12), we have that

    $$\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^4\le C(T)\left(1+\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}+\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2+\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^3\right).\tag{2.14}$$

    Due to the Young inequality,

    $$C(T)\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^3\le\frac{1}{2}\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^4+C(T)\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2,\qquad C(T)\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}\le C(T)\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2+C(T).$$

    By Eq (2.14), we have that

    $$\frac{1}{2}\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^4-C(T)\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2-C(T)\le 0,$$

    which gives Eq (2.4).
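
    In detail (an elementary algebraic step; $X$ is shorthand introduced only here), set $X=\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}$. The previous inequality reads $X^4-2C(T)X^2-2C(T)\le0$, and solving this quadratic inequality in $X^2$ gives

    $$X^2\le C(T)+\sqrt{C(T)^2+2C(T)},$$

    that is, $\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}\le C(T)$ for a possibly larger constant $C(T)$, which is Eq (2.4).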

    Equation (2.5) follows from Eqs (2.4) and (2.12).

    Finally, we prove Eq (2.6). We begin by observing that, from Eqs (2.4) and (2.10), we have

    $$\|\partial_xu_\varepsilon(t,\cdot)\|_{L^4(\mathbb{R})}^4\le C(T)\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2.$$

    An integration on $(0,t)$ and Eq (2.5) give Eq (2.6).

    Lemma 2.3. Fix T>0 and assume (1.6). There exists a constant C(T)>0, independent of ε, such that Eq (2.4) holds. Moreover, we have Eqs (2.5) and (2.6).

    Proof. Let $0\le t\le T$. Multiplying Eq (2.1) by $-2\partial_x^2u_\varepsilon$ and integrating on $\mathbb{R}$ gives

    $$\begin{aligned}
    \frac{d}{dt}\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2&=-2\int_{\mathbb{R}}\partial_x^2u_\varepsilon\,\partial_tu_\varepsilon\,dx\\
    &=2\int_{\mathbb{R}}f'(u_\varepsilon)\partial_xu_\varepsilon\,\partial_x^2u_\varepsilon\,dx-2\beta^2\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+2\delta\int_{\mathbb{R}}\partial_x^2u_\varepsilon\,\partial_x^3u_\varepsilon\,dx-2\kappa\int_{\mathbb{R}}u_\varepsilon\,\partial_x^2u_\varepsilon\,dx+2\gamma^2\int_{\mathbb{R}}|u_\varepsilon|u_\varepsilon\,\partial_x^2u_\varepsilon\,dx+2\varepsilon\int_{\mathbb{R}}\partial_x^2u_\varepsilon\,\partial_x^4u_\varepsilon\,dx\\
    &=2\int_{\mathbb{R}}f'(u_\varepsilon)\partial_xu_\varepsilon\,\partial_x^2u_\varepsilon\,dx-2\beta^2\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+2\kappa\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+2\gamma^2\int_{\mathbb{R}}|u_\varepsilon|u_\varepsilon\,\partial_x^2u_\varepsilon\,dx-2\varepsilon\|\partial_x^3u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2.
    \end{aligned}$$

    Therefore, we have that

    $$\frac{d}{dt}\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+2\beta^2\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+2\varepsilon\|\partial_x^3u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2=2\int_{\mathbb{R}}f'(u_\varepsilon)\partial_xu_\varepsilon\,\partial_x^2u_\varepsilon\,dx+2\kappa\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+2\gamma^2\int_{\mathbb{R}}|u_\varepsilon|u_\varepsilon\,\partial_x^2u_\varepsilon\,dx.\tag{2.15}$$

    Due to Eqs (1.6) and (2.3) and the Young inequality,

    $$\begin{aligned}
    2\int_{\mathbb{R}}|f'(u_\varepsilon)||\partial_xu_\varepsilon||\partial_x^2u_\varepsilon|\,dx&\le C_0\int_{\mathbb{R}}|\partial_xu_\varepsilon\,\partial_x^2u_\varepsilon|\,dx+C_0\int_{\mathbb{R}}|u_\varepsilon\partial_xu_\varepsilon||\partial_x^2u_\varepsilon|\,dx\\
    &=2\int_{\mathbb{R}}\left|\frac{\sqrt{3}\,C_0\,\partial_xu_\varepsilon}{2\beta}\right|\left|\frac{\beta\,\partial_x^2u_\varepsilon}{\sqrt{3}}\right|dx+2\int_{\mathbb{R}}\left|\frac{\sqrt{3}\,C_0\,u_\varepsilon\partial_xu_\varepsilon}{2\beta}\right|\left|\frac{\beta\,\partial_x^2u_\varepsilon}{\sqrt{3}}\right|dx\\
    &\le C_0\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+C_0\int_{\mathbb{R}}u_\varepsilon^2(\partial_xu_\varepsilon)^2\,dx+\frac{2\beta^2}{3}\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2\\
    &\le C_0\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+C_0\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+\frac{2\beta^2}{3}\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2\\
    &\le C_0\left(1+\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2\right)\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+\frac{2\beta^2}{3}\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2,\\[4pt]
    2\gamma^2\int_{\mathbb{R}}|u_\varepsilon|u_\varepsilon\,\partial_x^2u_\varepsilon\,dx&\le 2\gamma^2\int_{\mathbb{R}}u_\varepsilon^2|\partial_x^2u_\varepsilon|\,dx=2\int_{\mathbb{R}}\left|\frac{\sqrt{3}\,\gamma^2u_\varepsilon^2}{\beta}\right|\left|\frac{\beta\,\partial_x^2u_\varepsilon}{\sqrt{3}}\right|dx\\
    &\le\frac{3\gamma^4}{\beta^2}\int_{\mathbb{R}}u_\varepsilon^4\,dx+\frac{\beta^2}{3}\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2\\
    &\le\frac{3\gamma^4}{\beta^2}\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2\|u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+\frac{\beta^2}{3}\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2\\
    &\le C(T)\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2+\frac{\beta^2}{3}\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2.
    \end{aligned}$$

    It follows from Eq (2.15) that

    $$\frac{d}{dt}\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+\beta^2\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+2\varepsilon\|\partial_x^3u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2\le C_0\left(1+\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2\right)\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+C(T)\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2.$$

    Integrating on (0,t), by Eq (2.3), we have that

    $$\begin{aligned}
    &\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+\beta^2\int_0^t\|\partial_x^2u_\varepsilon(s,\cdot)\|_{L^2(\mathbb{R})}^2\,ds+2\varepsilon\int_0^t\|\partial_x^3u_\varepsilon(s,\cdot)\|_{L^2(\mathbb{R})}^2\,ds\\
    &\quad\le C_0+C_0\left(1+\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2\right)\int_0^t\|\partial_xu_\varepsilon(s,\cdot)\|_{L^2(\mathbb{R})}^2\,ds+C(T)\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2\,t\\
    &\quad\le C(T)\left(1+\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2\right).
    \end{aligned}\tag{2.16}$$

    Thanks to Eqs (2.3), (2.13), and (2.16), we have that

    $$\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^4\le C(T)\left(1+\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2\right).$$

    Therefore,

    $$\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^4-C(T)\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2-C(T)\le 0,$$

    which gives (2.4).

    Equation (2.5) follows from (2.4) and (2.16), while, arguing as in Lemma 2.2, we have Eq (2.6).

    Lemma 2.4. Fix T>0. There exists a constant C(T)>0, independent of ε, such that

    $$\int_0^t\|\partial_xu_\varepsilon(s,\cdot)\|_{L^6(\mathbb{R})}^6\,ds\le C(T),\tag{2.17}$$

    for every $0\le t\le T$.

    Proof. Let $0\le t\le T$. We begin by observing that

    $$\int_{\mathbb{R}}(\partial_xu_\varepsilon)^6\,dx\le\|\partial_xu_\varepsilon(t,\cdot)\|_{L^\infty(\mathbb{R})}^4\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2.\tag{2.18}$$

    Thanks to the Hölder inequality,

    $$(\partial_xu_\varepsilon(t,x))^2=2\int_{-\infty}^{x}\partial_xu_\varepsilon\,\partial_x^2u_\varepsilon\,dy\le 2\int_{\mathbb{R}}|\partial_xu_\varepsilon||\partial_x^2u_\varepsilon|\,dx\le 2\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}.$$

    Hence,

    $$\|\partial_xu_\varepsilon(t,\cdot)\|_{L^\infty(\mathbb{R})}^4\le 4\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2.$$

    It follows from Eq (2.18) that

    $$\int_{\mathbb{R}}(\partial_xu_\varepsilon)^6\,dx\le 4\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^4\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2.$$

    Therefore, by Eq (2.5),

    $$\int_{\mathbb{R}}(\partial_xu_\varepsilon)^6\,dx\le C(T)\|\partial_x^2u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2.$$

    An integration on (0,t) and Eq (2.5) gives (2.17).

    This section is devoted to the proof of Theorem 1.1.

    We begin by proving the following result.

    Lemma 3.1. Fix T>0. Then,

    $$\text{the family }\{u_\varepsilon\}_{\varepsilon>0}\text{ is compact in }L^2_{loc}((0,T)\times\mathbb{R}).\tag{3.1}$$

    Consequently, there exist a subsequence $\{u_{\varepsilon_k}\}_{k\in\mathbb{N}}$ and $u\in L^2_{loc}((0,T)\times\mathbb{R})$ such that

    $$u_{\varepsilon_k}\to u\ \text{ in }L^2_{loc}((0,T)\times\mathbb{R})\ \text{ and a.e. in }(0,T)\times\mathbb{R}.\tag{3.2}$$

    Moreover, u is a solution of Eq (1.1), satisfying Eq (1.8).

    Proof. We begin by proving Eq (3.1), relying on the Aubin-Lions Lemma (see [5,21]). We recall that

    $$H^1_{loc}(\mathbb{R})\hookrightarrow\hookrightarrow L^2_{loc}(\mathbb{R})\hookrightarrow H^{-1}_{loc}(\mathbb{R}),$$

    where the first inclusion is compact and the second one is continuous. Owing to the Aubin-Lions Lemma [21], to prove Eq (3.1), it suffices to show that

    $$\{u_\varepsilon\}_{\varepsilon>0}\text{ is uniformly bounded in }L^2(0,T;H^1_{loc}(\mathbb{R})),\tag{3.3}$$
    $$\{\partial_tu_\varepsilon\}_{\varepsilon>0}\text{ is uniformly bounded in }L^2(0,T;H^{-1}_{loc}(\mathbb{R})).\tag{3.4}$$
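
    For the reader's convenience, we recall one standard form of the Aubin-Lions Lemma, stated here in the version we use (see [5,21] for the precise formulations): if $X\hookrightarrow\hookrightarrow B\hookrightarrow Y$ are Banach spaces, with the first embedding compact and the second continuous, then

    $$\left\{v\in L^2(0,T;X):\ \partial_tv\in L^2(0,T;Y)\right\}\hookrightarrow\hookrightarrow L^2(0,T;B),$$

    i.e., bounded sequences in the space on the left are relatively compact in $L^2(0,T;B)$. Applied on each compact subset of $\mathbb{R}$ with $X=H^1$, $B=L^2$ and $Y=H^{-1}$, this reduces Eq (3.1) to Eqs (3.3) and (3.4).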

    We prove Eq (3.3). Thanks to Lemmas 2.1–2.3,

    $$\|u_\varepsilon(t,\cdot)\|_{H^1(\mathbb{R})}^2=\|u_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2+\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2\le C(T).$$

    Therefore,

    $$\{u_\varepsilon\}_{\varepsilon>0}\text{ is uniformly bounded in }L^\infty(0,T;H^1(\mathbb{R})),$$

    which gives Eq (3.3).

    We prove Eq (3.4). Observe that, by Eq (2.1),

    $$\partial_tu_\varepsilon=\partial_x\bigl(G(u_\varepsilon)\bigr)-f'(u_\varepsilon)\partial_xu_\varepsilon-\kappa u_\varepsilon-\gamma^2|u_\varepsilon|u_\varepsilon,$$

    where

    $$G(u_\varepsilon)=\beta^2\partial_xu_\varepsilon-\delta\partial_x^2u_\varepsilon-\varepsilon\partial_x^3u_\varepsilon.\tag{3.5}$$
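
    Indeed (a direct check), differentiating gives

    $$\partial_x\bigl(G(u_\varepsilon)\bigr)=\beta^2\partial_x^2u_\varepsilon-\delta\partial_x^3u_\varepsilon-\varepsilon\partial_x^4u_\varepsilon,$$

    which collects exactly the linear higher-order terms of Eq (2.1), so the decomposition above is simply Eq (2.1) solved for $\partial_tu_\varepsilon$.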

    Since 0<ε<1, thanks to Eq (2.5), we have that

    $$\|\beta^2\partial_xu_\varepsilon\|_{L^2((0,T)\times\mathbb{R})}^2,\ \|\delta\partial_x^2u_\varepsilon\|_{L^2((0,T)\times\mathbb{R})}^2\le C(T),\qquad\|\varepsilon\partial_x^3u_\varepsilon\|_{L^2((0,T)\times\mathbb{R})}^2\le C(T).\tag{3.6}$$

    Therefore, by Eqs (3.5) and (3.6), we have that

    $$\{\partial_x(G(u_\varepsilon))\}_{\varepsilon>0}\text{ is bounded in }L^2(0,T;H^{-1}(\mathbb{R})).\tag{3.7}$$

    We claim that

    $$\int_0^T\!\!\int_{\mathbb{R}}\bigl(f'(u_\varepsilon)\bigr)^2(\partial_xu_\varepsilon)^2\,dt\,dx\le C(T).\tag{3.8}$$

    Thanks to Eqs (2.4) and (2.5),

    $$\int_0^T\!\!\int_{\mathbb{R}}\bigl(f'(u_\varepsilon)\bigr)^2(\partial_xu_\varepsilon)^2\,dt\,dx\le\|f'\|_{L^\infty(-C(T),C(T))}^2\int_0^T\|\partial_xu_\varepsilon(t,\cdot)\|_{L^2(\mathbb{R})}^2\,dt\le C(T).$$

    Moreover, thanks to Eq (2.3),

    $$|\kappa|\int_0^T\!\!\int_{\mathbb{R}}u_\varepsilon^2\,dx\,dt\le C(T).\tag{3.9}$$

    We have that

    $$\gamma^2\int_0^T\!\!\int_{\mathbb{R}}\bigl(|u_\varepsilon|u_\varepsilon\bigr)^2\,ds\,dx\le C(T).\tag{3.10}$$

    In fact, thanks to Eqs (2.3) and (2.4),

    $$\gamma^2\int_0^T\!\!\int_{\mathbb{R}}\bigl(|u_\varepsilon|u_\varepsilon\bigr)^2\,ds\,dx\le\gamma^2\|u_\varepsilon\|_{L^\infty((0,T)\times\mathbb{R})}^2\int_0^T\!\!\int_{\mathbb{R}}u_\varepsilon^2\,ds\,dx\le C(T)\int_0^T\!\!\int_{\mathbb{R}}u_\varepsilon^2\,ds\,dx\le C(T).$$

    Therefore, Eq (3.4) follows from Eqs (3.7)–(3.10).

    Thanks to the Aubin-Lions Lemma, Eqs (3.1) and (3.2) hold.

    Consequently, arguing as in [5, Theorem 1.1], $u$ is a solution of Eq (1.1) and, thanks to Lemmas 2.1–2.4, Eq (1.8) holds.

    Proof of Theorem 1.1. Lemma 3.1 gives the existence of a solution of Eq (1.1).

    We prove Eq (1.9). Let u1 and u2 be two solutions of Eq (1.1), which verify Eq (1.8), that is,

    $$\begin{cases}\partial_tu_i+\partial_xf(u_i)-\beta^2\partial_x^2u_i+\delta\partial_x^3u_i+\kappa u_i+\gamma^2|u_i|u_i=0, & 0<t<T,\ x\in\mathbb{R},\\ u_i(0,x)=u_{i,0}(x), & x\in\mathbb{R},\end{cases}\qquad i=1,2.$$

    Then, the function

    $$\omega(t,x)=u_1(t,x)-u_2(t,x),\tag{3.11}$$

    is the solution of the following Cauchy problem:

    $$\begin{cases}\partial_t\omega+\partial_x\bigl(f(u_1)-f(u_2)\bigr)-\beta^2\partial_x^2\omega+\delta\partial_x^3\omega+\kappa\omega+\gamma^2\bigl(|u_1|u_1-|u_2|u_2\bigr)=0, & 0<t<T,\ x\in\mathbb{R},\\ \omega(0,x)=u_{1,0}(x)-u_{2,0}(x), & x\in\mathbb{R}.\end{cases}\tag{3.12}$$

    Let $T>0$ be fixed. Since $u_1,u_2\in L^\infty(0,T;H^1(\mathbb{R}))$ by Eq (1.8), for every $0\le t\le T$ we have that

    $$\|u_1\|_{L^\infty((0,T)\times\mathbb{R})},\ \|u_2\|_{L^\infty((0,T)\times\mathbb{R})}\le C(T).\tag{3.13}$$

    We define

    $$g=\frac{f(u_1)-f(u_2)}{\omega}\tag{3.14}$$

    and observe that, by Eq (3.13), we have that

    $$|g|\le\|f'\|_{L^\infty(-C(T),C(T))}\le C(T).\tag{3.15}$$

    Moreover, by Eq (3.11) we have that

    $$\bigl||u_1|-|u_2|\bigr|\le|u_1-u_2|=|\omega|.\tag{3.16}$$

    Observe that thanks to Eq (3.11),

    $$|u_1|u_1-|u_2|u_2=|u_1|u_1-|u_1|u_2+|u_1|u_2-|u_2|u_2=|u_1|\omega+u_2\bigl(|u_1|-|u_2|\bigr).\tag{3.17}$$

    Thanks to Eqs (3.14) and (3.17), Equation (3.12) is equivalent to the following one:

    $$\partial_t\omega+\partial_x(g\omega)-\beta^2\partial_x^2\omega+\delta\partial_x^3\omega+\kappa\omega+\gamma^2|u_1|\omega+\gamma^2u_2\bigl(|u_1|-|u_2|\bigr)=0.\tag{3.18}$$

    Multiplying Eq (3.18) by 2ω, an integration on R gives

    $$\begin{aligned}
    \frac{d}{dt}\|\omega(t,\cdot)\|_{L^2(\mathbb{R})}^2&=2\int_{\mathbb{R}}\omega\,\partial_t\omega\,dx\\
    &=-2\int_{\mathbb{R}}\omega\,\partial_x(g\omega)\,dx+2\beta^2\int_{\mathbb{R}}\omega\,\partial_x^2\omega\,dx-2\delta\int_{\mathbb{R}}\omega\,\partial_x^3\omega\,dx-2\kappa\|\omega(t,\cdot)\|_{L^2(\mathbb{R})}^2-2\gamma^2\int_{\mathbb{R}}|u_1|\omega^2\,dx-2\gamma^2\int_{\mathbb{R}}u_2\bigl(|u_1|-|u_2|\bigr)\omega\,dx\\
    &=2\int_{\mathbb{R}}g\omega\,\partial_x\omega\,dx-2\beta^2\|\partial_x\omega(t,\cdot)\|_{L^2(\mathbb{R})}^2+2\delta\int_{\mathbb{R}}\partial_x\omega\,\partial_x^2\omega\,dx-2\kappa\|\omega(t,\cdot)\|_{L^2(\mathbb{R})}^2-2\gamma^2\int_{\mathbb{R}}|u_1|\omega^2\,dx-2\gamma^2\int_{\mathbb{R}}u_2\bigl(|u_1|-|u_2|\bigr)\omega\,dx\\
    &=2\int_{\mathbb{R}}g\omega\,\partial_x\omega\,dx-2\beta^2\|\partial_x\omega(t,\cdot)\|_{L^2(\mathbb{R})}^2-2\kappa\|\omega(t,\cdot)\|_{L^2(\mathbb{R})}^2-2\gamma^2\int_{\mathbb{R}}|u_1|\omega^2\,dx-2\gamma^2\int_{\mathbb{R}}u_2\bigl(|u_1|-|u_2|\bigr)\omega\,dx.
    \end{aligned}$$

    Therefore, we have that

    $$\frac{d}{dt}\|\omega(t,\cdot)\|_{L^2(\mathbb{R})}^2+2\beta^2\|\partial_x\omega(t,\cdot)\|_{L^2(\mathbb{R})}^2+2\gamma^2\int_{\mathbb{R}}|u_1|\omega^2\,dx=2\int_{\mathbb{R}}g\omega\,\partial_x\omega\,dx-2\kappa\|\omega(t,\cdot)\|_{L^2(\mathbb{R})}^2-2\gamma^2\int_{\mathbb{R}}u_2\bigl(|u_1|-|u_2|\bigr)\omega\,dx.\tag{3.19}$$

    Due to Eqs (3.13), (3.15) and (3.16) and the Young inequality,

    $$\begin{aligned}
    2\int_{\mathbb{R}}|g||\omega||\partial_x\omega|\,dx&\le 2C(T)\int_{\mathbb{R}}|\omega||\partial_x\omega|\,dx=2\int_{\mathbb{R}}\left|\frac{C(T)\omega}{\beta}\right||\beta\partial_x\omega|\,dx\le C(T)\|\omega(t,\cdot)\|_{L^2(\mathbb{R})}^2+\beta^2\|\partial_x\omega(t,\cdot)\|_{L^2(\mathbb{R})}^2,\\[4pt]
    2\gamma^2\int_{\mathbb{R}}|u_2|\bigl||u_1|-|u_2|\bigr||\omega|\,dx&\le 2\gamma^2\|u_2\|_{L^\infty((0,T)\times\mathbb{R})}\int_{\mathbb{R}}\bigl||u_1|-|u_2|\bigr||\omega|\,dx\le C(T)\|\omega(t,\cdot)\|_{L^2(\mathbb{R})}^2.
    \end{aligned}$$

    It follows from Eq (3.19) that

    $$\frac{d}{dt}\|\omega(t,\cdot)\|_{L^2(\mathbb{R})}^2+\beta^2\|\partial_x\omega(t,\cdot)\|_{L^2(\mathbb{R})}^2+2\gamma^2\int_{\mathbb{R}}|u_1|\omega^2\,dx\le C(T)\|\omega(t,\cdot)\|_{L^2(\mathbb{R})}^2.$$

    The Gronwall Lemma and Eq (3.12) give

    $$\|\omega(t,\cdot)\|_{L^2(\mathbb{R})}^2+\beta^2e^{C(T)t}\int_0^te^{-C(T)s}\|\partial_x\omega(s,\cdot)\|_{L^2(\mathbb{R})}^2\,ds+2\gamma^2e^{C(T)t}\int_0^t\!\!\int_{\mathbb{R}}e^{-C(T)s}|u_1|\omega^2\,ds\,dx\le e^{C(T)t}\|\omega_0\|_{L^2(\mathbb{R})}^2.\tag{3.20}$$

    Equation (1.9) follows from Eqs (3.11) and (3.20).

    Giuseppe Maria Coclite and Lorenzo Di Ruvo contributed equally to the methodology, typesetting, and development of the paper.

    The authors declare they have not used Artificial Intelligence (AI) tools in the creation of this article.

    Giuseppe Maria Coclite is an editorial board member for Networks and Heterogeneous Media and was not involved in the editorial review or the decision to publish this article.

    GMC is a member of the Gruppo Nazionale per l'Analisi Matematica, la Probabilità e le loro Applicazioni (GNAMPA) of the Istituto Nazionale di Alta Matematica (INdAM). GMC has been partially supported by the project funded under the National Recovery and Resilience Plan (NRRP), Mission 4 Component 2 Investment 1.4, Call for tender No. 3138 of 16/12/2021 of the Italian Ministry of University and Research, funded by the European Union, NextGenerationEU (Award Number: CN000023, Concession Decree No. 1033 of 17/06/2022 adopted by the Italian Ministry of University and Research, CUP: D93C22000410001, Centro Nazionale per la Mobilità Sostenibile); by the Italian Ministry of Education, University and Research under the Programme Department of Excellence Legge 232/2016 (Grant No. CUP D93C23000100001); and by the Research Project of National Relevance "Evolution problems involving interacting scales" granted by the Italian Ministry of Education, University and Research (MIUR Prin 2022, project code 2022M9BKBC, Grant No. CUP D53D23005880006). GMC expresses his gratitude to the HIAS - Hamburg Institute for Advanced Study for their warm hospitality.

    The authors declare there is no conflict of interest.



    [1] A. Gudigar, S. Chokkadi, U. Raghavendra, U. Rajendra Acharya, An efficient traffic sign recognition based on graph embedding features, Neural Comput. Appl., 31 (2019), 395–407. https://doi.org/10.1007/s00521-017-3063-z doi: 10.1007/s00521-017-3063-z
    [2] Z. Liang, J. Shao, D. Zhang, L. Gao, Traffic sign detection and recognition based on pyramidal convolutional networks, Neural Comput. Appl., 32 (2020), 6533–6543. https://doi.org/10.1007/s00521-019-04086-z doi: 10.1007/s00521-019-04086-z
    [3] R. Abdel-Salam, R. Mostafa, A. H. Abdel-Gawad, RIECNN: real-time image enhanced CNN for traffic sign recognition, Neural Comput. Appl., 34 (2022), 6085–6096. https://doi.org/10.1007/s00521-021-06762-5 doi: 10.1007/s00521-021-06762-5
    [4] M. Lu, K. Wevers, R. V. D. Heijden, Technical feasibility of advanced driver assistance systems (ADAS) for road traffic safety, Transp. Plann. Technol., 28 (2005), 167–187. https://doi.org/10.1080/03081060500120282 doi: 10.1080/03081060500120282
    [5] J. C. McCall, M. M. Trivedi, Video-based lane estimation and tracking for driver assistance: survey, system, and evaluation, IEEE Trans. Intell. Transp. Syst., 7 (2006), 20–37. https://doi.org/10.1109/TITS.2006.869595
    [6] M. Haloi, D. B. Jayagopi, A robust lane detection and departure warning system, in 2015 IEEE Intelligent Vehicles Symposium (IV), (2015), 126–131. https://doi.org/10.1109/IVS.2015.7225674
    [7] R. Ayachi, Y. Said, A. B. Abdelaali, Pedestrian detection based on light-weighted separable convolution for advanced driver assistance systems, Neural Process. Lett., 52 (2020), 2655–2668. https://doi.org/10.1007/s11063-020-10367-9 doi: 10.1007/s11063-020-10367-9
    [8] Y. Gu, B. Si, A novel lightweight real-time traffic sign detection integration framework based on YOLOv4, Entropy, 24 (2022), 487. https://doi.org/10.3390/e24040487 doi: 10.3390/e24040487
    [9] T. Liang, H. Bao, W. Pan, F. Pan, Traffic sign detection via improved sparse R-CNN for autonomous vehicles, J. Adv. Transp., (2022), 1–16. https://doi.org/10.1155/2022/3825532
    [10] J. Wang, Y. Chen, Z. Dong, M. Gao, Improved YOLOv5 network for real-time multi-scale traffic sign detection, Neural Comput. Appl., 35 (2023), 7853–7865. https://doi.org/10.1007/s00521-022-08077-5 doi: 10.1007/s00521-022-08077-5
    [11] M Swathi, K. V. Suresh, Automatic traffic sign detection and recognition: A review, in 2017 International Conference on Algorithms, Methodology, Models and Applications in Emerging Technologies (ICAMMAET), (2017), 1–6. https://doi.org/10.1109/ICAMMAET.2017.8186650
    [12] C. Liu, S. Li, F. Chang, Y. Wang, Machine vision based traffic sign detection methods: Review, analyses and perspectives, IEEE Access, 7 (2019), 86578–86596. https://doi.org/10.1109/ACCESS.2019.2924947
    [13] S. Maldonado-Bascon, S. Lafuente-Arroyo, P. Gil-Jimenez, H. Gomez-Moreno, F. Lopez-Ferreras, Road-sign detection and recognition based on support vector machines, IEEE Trans. Intell. Transp. Syst., 8 (2007), 264–278. https://doi.org/10.1109/TITS.2007.895311 doi: 10.1109/TITS.2007.895311
    [14] V. Cherkassky, Y. Ma, Practical selection of SVM parameters and noise estimation for SVM regression, Neural Networks, 17 (2004), 113–126. https://doi.org/10.1016/S0893-6080(03)00169-2 doi: 10.1016/S0893-6080(03)00169-2
    [15] A. Ellahyani, M. E. Ansari, I. E. Jaafari, S. Charfi, Traffic sign detection and recognition using features combination and random forests, Int. J. Adv. Comput. Sci. Appl., 7 (2016), 686–693. https://doi.org/10.14569/IJACSA.2016.070193 doi: 10.14569/IJACSA.2016.070193
    [16] K. Lu, Z. Ding, S. Ge, Sparse-representation-based graph embedding for traffic sign recognition, IEEE Trans. Intell. Transp. Syst., 13 (2021), 1515–1524. https://doi.org/10.1109/TITS.2012.2220965 doi: 10.1109/TITS.2012.2220965
    [17] Y. Tang, K. Han, J. Guo, C. Xu, Y. Li, C. Xu, et al., An image patch is a wave: Phase-aware vision MLP, in Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), (2022), 10935–10944. https://doi.org/10.48550/arXiv.2111.12294
    [18] E. J. Heller, M. F. Crommie, C. P. Lutz, D. M. Eigler, Scattering and absorption of surface electron waves in quantum corrals, Nature, 369 (1994), 464–466. https://doi.org/10.1038/369464a0 doi: 10.1038/369464a0
    [19] H. Touvron, P. Bojanowski, M. Caron, M. Cord, A. El-Nouby, E. Grave, et al., ResMLP: Feedforward networks for image classification with data-efficient training, IEEE Trans. Pattern Anal. Mach. Intell., 45 (2023), 5314–5321. https://doi.org/10.1109/TPAMI.2022.3206148 doi: 10.1109/TPAMI.2022.3206148
    [20] W. Wang, E. Xie, X. Li, D. Fan, K. Song, D. Liang, et al., Pyramid vision transformer: A versatile backbone for dense prediction without convolutions, in Proceedings of the IEEE/CVF International Conference on Computer Vision (ICCV), (2021), 568–578. https://doi.org/10.48550/arXiv.2102.12122
    [21] A. Dosovitskiy, L. Beyer, A. Kolesnikov, D. Weissenborn, X. Zhai, T. Unterthiner, An image is worth 16x16 Words: Transformers for image recognition at scale, preprint, arXiv: 2010.11929.
    [22] O. N. Manzari, A. Boudesh, S. B. Shokouhi, Pyramid transformer for traffic sign detection, in 2022 12th International Conference on Computer and Knowledge Engineering (ICCKE), (2022), 112–116. https://doi.org/10.1109/ICCKE57176.2022.9960090
    [23] Y. Zheng, W. Jiang, Evaluation of vision transformers for traffic sign classification, Wireless Commun. Mobile Comput., 2022 (2022), 14. https://doi.org/10.1155/2022/3041117 doi: 10.1155/2022/3041117
    [24] D. Pei, F. Sun, H. Liu, Supervised low-rank matrix recovery for traffic sign recognition in image sequences, IEEE Signal Process. Lett., 20 (2013), 241–244. https://doi.org/10.1109/LSP.2013.2241760 doi: 10.1109/LSP.2013.2241760
    [25] S. Ardianto, C. Chen, H. Hang, Real-time traffic sign recognition using color segmentation and SVM, in 2017 International Conference on Systems, Signals and Image Processing (IWSSIP), (2017), 1–5. https://doi.org/10.1109/IWSSIP.2017.7965570
    [26] D. Cireşan, U. Meier, J. Masci, J. Schmidhuber, A committee of neural networks for traffic sign classification, in the 2011 International Joint Conference on Neural Networks, (2011), 1918–1921. https://doi.org/10.1109/IJCNN.2011.6033458.
    [27] P. Sermanet, Y. LeCun, Traffic sign recognition with multi-scale convolutional networks, in the 2011 International Joint Conference on Neural Networks, (2011), 2809–2813. https://doi.org/10.1109/IJCNN.2011.6033589
    [28] M. Haloi, Traffic sign classification using deep inception based convolutional networks, preprint, arXiv: 1511.02992.
    [29] M. Jaderberg, K. Simonyan, A. Zisserman, K. kavukcuoglu, Spatial transformer networks, in Advances in Neural Information Processing Systems 28 (NIPS 2015), 28 (2015).
    [30] C. Szegedy, W. Liu, Y. Jia, P. Sermanet, S. Reed, D. Anguelov, et al., Going deeper with convolutions, in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), (2015), 1–9. https://doi.org/10.1109/CVPR.2015.7298594
    [31] J. Hu, L. Shen, G. Sun, Squeeze-and-excitation networks, in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), (2018), 7132–7141. https://doi.org/10.48550/arXiv.1709.01507
    [32] Y. Yuan, Z. Xiong, Q. Wang, VSSA-NET: Vertical spatial sequence attention network for traffic sign detection, IEEE Trans. Image Process., 28 (2019), 3423–3434. https://doi.org/10.1109/TIP.2019.2896952 doi: 10.1109/TIP.2019.2896952
    [33] Y. Liu, Z. Shao, N. Hoffmann, Global attention mechanism: Retain information to enhance channel-spatial interactions, preprint, arXiv: 2112.05561.
    [34] S. Liu, J. Li, C. Hu, W. Wang, Traffic sign recognition based on convolutional neural network and ensemble learning, Comput. Mod., 12 (2019), 67. https://doi.org/10.3969/j.issn.1006-2475.2019.12.013 doi: 10.3969/j.issn.1006-2475.2019.12.013
    [35] K. Zhou, Y. Zhan, D. Fu, Learning region-based attention network for traffic sign recognition, Sensors, 21 (2021), 686. https://doi.org/10.3390/s21030686 doi: 10.3390/s21030686
    [36] M. Guo, C. Lu, Z. Liu, M. Cheng, S. Hu, Visual attention network, Visual Media, 9 (2023), 733–752. https://doi.org/10.1007/s41095-023-0364-2 doi: 10.1007/s41095-023-0364-2
    [37] S. Gao, M. Cheng, K. Zhao, X. Zhang, M. Yang, P. Torr, Res2Net: A new multi-scale backbone architecture, IEEE Trans. Pattern Anal. Mach. Intell., 43 (2019), 652–662. https://doi.org/10.1109/TPAMI.2019.2938758 doi: 10.1109/TPAMI.2019.2938758
    [38] K. He, X. Zhang, S. Ren, J. Sun, Deep Residual learning for image recognition, in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), (2016), 770–778. https://doi.org/10.1109/CVPR.2016.90
    [39] L. Chen, H. Zhang, J. Xiao, L. Nie, J. Shao, W. Liu, SCA-CNN: Spatial and channel-wise attention in convolutional networks for image captioning, in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (CVPR), (2017), 5659–5667.
    [40] Z. Zhu, J. Lu, R. R. Martin, S. Hu, An optimization approach for localization refinement of candidate traffic signs, IEEE Trans. Intell. Transp. Syst., 18 (2017), 3006–016. https://doi.org/10.1109/TITS.2017.2665647 doi: 10.1109/TITS.2017.2665647
    [41] S. Mehta, C. Paunwala, B. Vaidya, CNN based traffic sign classification using Adam optimizer, in 2019 International Conference on Intelligent Computing and Control Systems (ICCS), (2019), 1293–1298. https://doi.org/10.1109/ICCS45141.2019.9065537
    [42] A. Staravoitau, Traffic sign classification with a convolutional network, Pattern Recognit. Image Anal., 28 (2018), 155–162. https://doi.org/10.1134/S1054661818010182 doi: 10.1134/S1054661818010182
    [43] J. Greenhalgh, M. Mirmehdi, Real-time detection and recognition of road traffic signs, IEEE Trans. Intell. Transp. Syst., 13 (2012), 1498–1506. https://doi.org/10.1109/TITS.2012.2208909 doi: 10.1109/TITS.2012.2208909
    [44] G. Overett, L. Petersson, Large scale sign detection using HOG feature variants, in 2011 IEEE Intelligent Vehicles Symposium (IV), (2011), 326–331. https://doi.org/10.1109/IVS.2011.5940549
    [45] A. Mogelmose, M. M. Trivedi, T. B. Moeslund, Vision-based traffic sign detection and analysis for intelligent driver assistance systems: Perspectives and survey, IEEE Trans. Intell. Transp. Syst., 13 (2012), 1484–1497. https://doi.org/10.1109/TITS.2012.2209421 doi: 10.1109/TITS.2012.2209421
    [46] F. Larsson, M. Felsberg, Using fourier descriptors and spatial models for traffic sign recognition, in SCIA 2011: Image Analysis, 6688 (2011), 238–249. https://doi.org/10.1007/978-3-642-21227-7_23
    [47] J. Stallkamp, M. Schlipsing, J. Salmen, C. Igel, Man vs. computer: Benchmarking machine learning algorithms for traffic sign recognition, Neural Networks, 32 (2012), 323–332. https://doi.org/10.1016/j.neunet.2012.02.016 doi: 10.1016/j.neunet.2012.02.016
    [48] X. Mao, S. Hijazi, R. Casas, P. Kaul, R. Kumar, C. Rowen, Hierarchical CNN for traffic sign recognition, in 2016 IEEE Intelligent Vehicles Symposium (IV), (2016), 130–135. https://doi.org/10.1109/IVS.2016.7535376
    [49] X. Peng, Y. Li, X. Wei, J. Luo, Y. Murphey, Traffic sign recognition with transfer learning, in 2017 IEEE Symposium Series on Computational Intelligence (SSCI), (2017), 1–7. https://doi.org/10.1109/SSCI.2017.8285332
    [50] Y. Jin, Y. Fu, W. Wang, J. Guo. C. Ren, X. Xiang, Multi-feature fusion and enhancement single shot detector for traffic sign recognition, IEEE Access, 8 (2020), 38931–38940. https://doi.org/10.1109/ACCESS.2020.2975828 doi: 10.1109/ACCESS.2020.2975828
    [51] A. Bouti, M. A. Mahraz, J. Riffi, H. Tairi, A robust system for road sign detection and classification using LeNet architecture based on convolutional neural network, Soft Comput., 24 (2020), 6721–6733. https://doi.org/10.1007/s00500-019-04307-6 doi: 10.1007/s00500-019-04307-6
    [52] F. J. Moreno-Barea, F. Strazzera, J. M. Jerez, D. Urda, L. Franco, Forward noise adjustment scheme for data augmentation, in 2018 IEEE Symposium Series on Computational Intelligence (SSCI), (2018), 728–734. https://doi.org/10.1109/SSCI.2018.8628917
    [53] A. Asuncion, D. Newman, UCI Machine Learning Repository, 2007. Available from: http://archive.ics.uci.edu/ml.
  • © 2023 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)
