A semiring is an algebra with two associative binary operations + and ⋅, in which + is commutative and ⋅ distributes over + from both the left and the right. Such an algebra is a common generalization of both rings and distributive lattices and has broad applications in information science and theoretical computer science (see [5,6]). In this paper, we first introduce some semirings of small order which will play a crucial role in what follows.
The semiring A is given by the following addition and multiplication tables (see [12]):
+ | 0 a 1        ⋅ | 0 a 1
0 | 0 0 0        0 | 0 0 0
a | 0 a 0        a | 0 1 a
1 | 0 0 1        1 | 0 a 1
The semiring B is given by the following addition and multiplication tables (see [4]):
+ | a b c        ⋅ | a b c
a | a b c        a | a a a
b | b b b        b | b b b
c | c b c        c | a b c
The eight 2-element semirings on {0, 1} listed below are given by their addition and multiplication tables (see [2]); each 2×2 Cayley table is written row by row, i.e., as the entries x∘0, x∘1 for x = 0 and then for x = 1.
Semiring | + (row 0; row 1) | ⋅ (row 0; row 1)
L2 | 0 1; 1 1 | 0 0; 1 1
R2 | 0 1; 1 1 | 0 1; 0 1
M2 | 0 1; 1 1 | 0 1; 1 1
D2 | 0 1; 1 1 | 0 0; 0 1
N2 | 0 1; 1 1 | 0 0; 0 0
T2 | 0 1; 1 1 | 1 1; 1 1
Z2 | 0 0; 0 0 | 0 0; 0 0
W2 | 0 0; 0 0 | 0 0; 0 1
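To make these Cayley tables concrete, the following minimal sketch (our own illustration, not part of the paper) encodes the eight 2-element semirings in Python and brute-forces the semiring axioms; the names SEMIRINGS and is_semiring are invented for this example.

```python
# Each semiring on {0, 1} is stored as (addition table, multiplication table),
# with table[x][y] giving x ∘ y, matching the row-by-row listing above.
from itertools import product

SEMIRINGS = {
    "L2": ([[0, 1], [1, 1]], [[0, 0], [1, 1]]),
    "R2": ([[0, 1], [1, 1]], [[0, 1], [0, 1]]),
    "M2": ([[0, 1], [1, 1]], [[0, 1], [1, 1]]),
    "D2": ([[0, 1], [1, 1]], [[0, 0], [0, 1]]),
    "N2": ([[0, 1], [1, 1]], [[0, 0], [0, 0]]),
    "T2": ([[0, 1], [1, 1]], [[1, 1], [1, 1]]),
    "Z2": ([[0, 0], [0, 0]], [[0, 0], [0, 0]]),
    "W2": ([[0, 0], [0, 0]], [[0, 0], [0, 1]]),
}

def is_semiring(add, mul, elems=(0, 1)):
    """Check that + and . are associative, + is commutative, and . distributes over + on both sides."""
    for x, y, z in product(elems, repeat=3):
        if add[add[x][y]][z] != add[x][add[y][z]]:          # (x + y) + z = x + (y + z)
            return False
        if mul[mul[x][y]][z] != mul[x][mul[y][z]]:          # (xy)z = x(yz)
            return False
        if mul[x][add[y][z]] != add[mul[x][y]][mul[x][z]]:  # x(y + z) = xy + xz
            return False
        if mul[add[x][y]][z] != add[mul[x][z]][mul[y][z]]:  # (x + y)z = xz + yz
            return False
    return all(add[x][y] == add[y][x] for x, y in product(elems, repeat=2))

assert all(is_semiring(add, mul) for add, mul in SEMIRINGS.values())
```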
For any semiring S, we denote by S0 the semiring obtained from S by adding an extra element 0 such that a = 0 + a = a + 0 and 0 = 0a = a0 for every a ∈ S. For any semiring S, S∗ denotes the (multiplicative) left-right dual of S. In 2005, Pastijn et al. [4,9,10] studied the semiring variety generated by B0 and (B0)∗ (denoted by Sr(2,1)). They showed that the lattice of subvarieties of this variety is distributive and contains precisely 78 varieties, each of which is finitely based. In 2016, Ren et al. [12,13] studied the variety generated by B0, (B0)∗ and A0 (denoted by Sr(3,1)). They showed that the lattice of subvarieties of this variety is distributive and contains precisely 179 varieties, each of which is finitely based. From [4,10], we have HSP(L2,R2,M2,D2)⫋HSP(B0,(B0)∗). So
HSP(L2,R2,M2,D2)⫋HSP(L2,R2,M2,D2,Z2,W2)⫋HSP(B0,(B0)∗,Z2,W2). |
In 2016, Shao and Ren [15] studied the variety HSP(L2,R2,M2,D2,Z2,W2) (denoted by S6). They showed that the lattice of subvarieties of this variety is distributive and contains precisely 64 varieties, each of which is finitely based. Recently, Ren and Zeng [14] studied the variety generated by B0,(B0)∗,N2,T2. They proved that the lattice of subvarieties of this variety is a distributive lattice of order 312 and that each of its subvarieties is finitely based. In [16], Wang, Wang and Li studied the variety generated by B0,(B0)∗,A0,N2,T2. They proved that the lattice of subvarieties of this variety is a distributive lattice of order 716 and that each of its subvarieties is finitely based. It is easy to check that
HSP(B0,(B0)∗,A0,N2,T2)⫋HSP(B0,(B0)∗,A0,N2,T2,Z2,W2). |
So the semiring variety HSP(B0,(B0)∗,A0,N2,T2) is a proper subvariety of the semiring variety HSP(B0,(B0)∗,A0,N2,T2,Z2,W2). The main purpose of this paper is to study the variety HSP(B0,(B0)∗,A0,N2,T2,Z2,W2). We show that the lattice of subvarieties of this variety is a distributive lattice of order 2327. Moreover, we show that this variety is hereditarily finitely based.
By a variety we mean a class of algebras of the same type that is closed under subalgebras, homomorphic images and direct products (see [11]). Let W be a variety, let L(W) denote the lattice of subvarieties of W and let IdW(X) denote the set of all identities defining W. If W can be defined by finitely many identities, then we say that W is finitely based (see [14]). In other words, W is said to be finitely based if there exists a finite subset Σ of IdW(X) such that for any p≈q∈IdW(X), p≈q can be derived from Σ, i.e., Σ⊢p≈q. Otherwise, we say that W is nonfinitely based. Recall that W is said to be hereditarily finitely based if all members of L(W) are finitely based. If a variety W is finitely based and L(W) is a finite lattice, then W is hereditarily finitely based (see [14]).
A semiring is called an additively idempotent semiring (ai-semiring for short) if its additive reduct is a semilattice, i.e., a commutative idempotent semigroup. It is also called a semilattice-ordered semigroup (see [3,8,12]). The variety of all semirings (resp. all ai-semirings) is denoted by SR (resp. AI). Let X denote a fixed countably infinite set of variables and X+ the free semigroup on X (see [8]). A semiring identity (SR-identity for short) is an expression of the form u≈v, where u and v are terms with u=u1+⋯+uk, v=v1+⋯+vℓ, where ui,vj∈X+. Let k_ denote the set {1,2,…,k} for a positive integer k, Σ be a set of identities which include the identities determining AI (Each identity in Σ is called an AI-identity) and u≈v be an AI-identity. It is easy to check that the ai-semiring variety defined by u≈v coincides with the ai-semiring variety defined by the identities u≈u+vj,v≈v+ui,i∈k_,j∈ℓ_. Thus, in order to show that u≈v is derivable from Σ, we only need to show that u≈u+vj,v≈v+ui,i∈k_,j∈ℓ_ can be derived from Σ (see [9]).
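As a quick verification of the equivalence just cited (our own filling-in of this routine step), assume the semilattice laws for +. From u ≈ u + vj for all j ∈ ℓ_ and v ≈ v + ui for all i ∈ k_ one obtains u ≈ u + v1 + ⋯ + vℓ ≈ u + v and v ≈ v + u1 + ⋯ + uk ≈ v + u, hence u ≈ u + v ≈ v + u ≈ v. Conversely, if u ≈ v then u + vj ≈ v + vj ≈ v ≈ u and v + ui ≈ u + ui ≈ u ≈ v.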
To solve the word problem for the variety HSP(B0,(B0)∗,A0,N2,T2,Z2,W2), the following notions and notations are needed. Let q be an element of X+. Then
● the head of q, denoted by h(q), is the first variable occurring in q;
● the tail of q, denoted by t(q), is the last variable occurring in q;
● the content of q, denoted by c(q), is the set of variables occurring in q;
● the length of q, denoted by |q|, is the number of variables occurring in q counting multiplicities;
● the initial part of q, denoted by i(q), is the word obtained from q by retaining only the first occurrence of each variable;
● the final part of q, denoted by f(q), is the word obtained from q by retaining only the last occurrence of each variable;
● r(q) denotes the set {x ∈ X | the number of occurrences of x in q is odd}.
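To fix these notions, the following small sketch (our own illustration, not from the paper) computes h(q), t(q), c(q), |q|, i(q), f(q) and r(q) for a word represented as a Python string; all helper names are invented for this example.

```python
# Word statistics for a word q over the variable alphabet X (one character per variable).
from collections import Counter

def head(q):    return q[0]        # h(q): first variable occurring in q
def tail(q):    return q[-1]       # t(q): last variable occurring in q
def content(q): return set(q)      # c(q): set of variables occurring in q
def length(q):  return len(q)      # |q|: number of occurrences, counted with multiplicity

def initial_part(q):
    # i(q): retain only the first occurrence of each variable
    seen, out = set(), []
    for x in q:
        if x not in seen:
            seen.add(x)
            out.append(x)
    return "".join(out)

def final_part(q):
    # f(q): retain only the last occurrence of each variable
    return initial_part(q[::-1])[::-1]

def r(q):
    # r(q): variables occurring an odd number of times in q
    return {x for x, n in Counter(q).items() if n % 2 == 1}

q = "xyxzzy"  # a throwaway example word
assert (head(q), tail(q), length(q)) == ("x", "y", 6)
assert initial_part(q) == "xyz" and final_part(q) == "xzy" and r(q) == set()
```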
By [13, Lemma 1.2], Sr(3,1) satisfies the identity p ≈ q if and only if (i(p), f(p), r(p)) = (i(q), f(q), r(q)). This result will be used later without further mention. The equational basis for each of N2, T2, Z2, W2 can be found in [2] (see Table 1).
Semiring | Equational basis | Semiring | Equational basis
N2 | xy ≈ zt, x + x² ≈ x | T2 | xy ≈ zt, x + x² ≈ x²
Z2 | x + y ≈ z + u, xy ≈ x + y | W2 | x + y ≈ z + u, x² ≈ x, xy ≈ yx
By [15, Lemma 1.1] and Table 1, we have
Lemma 2.1. Let u≈v be a nontrivial SR-identity, where u=u1+u2+⋯+um, v=v1+v2+⋯+vn, ui,vj∈X+, i∈m_,j∈n_. Then
(i) N2 ⊨ u ≈ v if and only if {ui ∈ u : |ui| = 1} = {vj ∈ v : |vj| = 1};
(ii) T2 ⊨ u ≈ v if and only if {ui ∈ u : |ui| ≥ 2} ≠ ∅ and {vj ∈ v : |vj| ≥ 2} ≠ ∅;
(iii) Z2 ⊨ u ≈ v if and only if u ≠ x and v ≠ x for every x ∈ X;
(iv) W2 ⊨ u ≈ v if and only if either m = n = 1 and c(u1) = c(v1), or m, n ≥ 2.
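Lemma 2.1 can also be checked mechanically: a 2-element semiring satisfies an identity u ≈ v exactly when every assignment of 0 and 1 to the variables gives the two sides the same value. The brute-force sketch below (our own illustration; the helpers evaluate and satisfies are invented) verifies the Table 1 basis identity x + x² ≈ x for N2 and its failure in T2.

```python
# Brute-force satisfaction of an ai-semiring identity in a 2-element semiring.
from itertools import product

def evaluate(summands, add, mul, env):
    """summands: list of words (strings of variables); env: variable -> {0, 1}."""
    values = []
    for word in summands:
        v = env[word[0]]
        for x in word[1:]:
            v = mul[v][env[x]]
        values.append(v)
    s = values[0]
    for v in values[1:]:
        s = add[s][v]
    return s

def satisfies(add, mul, u, v):
    variables = sorted({x for word in u + v for x in word})
    return all(evaluate(u, add, mul, dict(zip(variables, vals))) ==
               evaluate(v, add, mul, dict(zip(variables, vals)))
               for vals in product((0, 1), repeat=len(variables)))

ADD = [[0, 1], [1, 1]]                               # semilattice addition shared by N2 and T2
N2_MUL, T2_MUL = [[0, 0], [0, 0]], [[1, 1], [1, 1]]  # null and constant-1 multiplication

assert satisfies(ADD, N2_MUL, ["x", "xx"], ["x"])        # N2 satisfies x + x^2 = x
assert not satisfies(ADD, T2_MUL, ["x", "xx"], ["x"])    # T2 does not
assert satisfies(ADD, T2_MUL, ["x", "xx"], ["xx"])       # T2 satisfies x + x^2 = x^2 instead
```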
Suppose that u = u1 + ⋯ + um, where ui ∈ X+, i ∈ m_. Let 1 be a symbol which is not in X and Y an arbitrary subset of c(u1) ∪ ⋯ ∪ c(um). For any ui in u, if c(ui) ⊆ Y, put hY(ui) = 1. Otherwise, we shall denote by hY(ui) the first variable occurring in the word obtained from ui by deleting all variables in Y. The set {hY(ui) | ui ∈ u} is written HY(u). Dually, we have the notations tY(ui) and TY(u). In particular, if Y = ∅, then hY(ui) = h(ui) and tY(ui) = t(ui). Moreover, if c(ui) ∩ Y ≠ ∅ for every ui in u, then we write DY(u) = ∅. Otherwise, DY(u) is the sum of all terms ui in u such that c(ui) ∩ Y = ∅. By [13, Lemmas 2.3 and 2.11] and [4, Lemma 2.4 and its dual, Lemmas 2.5 and 2.6], we have
Lemma 2.2. Let u≈u+q be an AI-identity, where u=u1+⋯+um,ui,q∈X+,i∈m_. If u≈u+q holds in Sr(3,1), then
(i) for every Z ⊆ (c(u1) ∪ ⋯ ∪ c(um)) ∖ c(q), there exists p1 in X+ with r(p1) = r(q) and c(q) ⊆ c(p1) ⊆ c(u1) ∪ ⋯ ∪ c(uk) such that DZ(u) ≈ DZ(u) + p1 holds in Sr(3,1), where DZ(u) = u1 + ⋯ + uk.
(ii) for every Y ⊆ Z = (c(u1) ∪ ⋯ ∪ c(um)) ∖ c(q), HY(DZ(u)) = HY(DZ(u) + p1) and TY(DZ(u)) = TY(DZ(u) + p1).
Throughout this paper, the annotation "(by (3.1), (3.2), ⋯)" attached to a step u ≈ v indicates that this identity can be derived from the identities (3.1), (3.2), ⋯ and the identities determining SR. For other notations and terminology used in this paper, the reader is referred to [1,4,7,13,15].
In this section, we shall show that the variety HSP(B0,(B0)∗,A0,N2,T2,Z2,W2) is finitely based. Indeed, we have
Theorem 3.1. The semiring variety HSP(B0,(B0)∗,A0,N2,T2,Z2,W2) is determined by (3.1)–(3.12),
x³y ≈ xy; | (3.1) |
xy³ ≈ xy; | (3.2) |
(xy)² ≈ x²y²; | (3.3) |
(xy)³ ≈ xy; | (3.4) |
x²yx ≈ xyx²; | (3.5) |
xyzx ≈ xyx²zx; | (3.6) |
xy + z ≈ xy + z + xyz²; | (3.7) |
xy + z ≈ xy + z + z²xy; | (3.8) |
xy + z ≈ xy + z + xz²y; | (3.9) |
xy + z ≈ xy + z + z³; | (3.10) |
x + y + zt ≈ x + y + zt + xzty; | (3.11) |
x + y ≈ x + y + y. | (3.12) |
Proof. From [13] and Lemma 2.1, we know that both Sr(3,1) and HSP(N2,T2,Z2,W2) satisfy identities (3.1)–(3.12) and so does HSP(B0,(B0)∗,A0,N2,T2,Z2,W2).
Next, we shall show that every identity that holds in HSP(B0,(B0)∗,A0,N2,T2,Z2,W2) can be derived from (3.1)–(3.12) and the identities determining SR. Let u≈v be such an identity, where u=u1+u2+⋯+um, v=v1+v2+⋯+vn, ui,vj∈X+, 1≤i≤m,1≤j≤n. By Lemma 2.1 (ⅳ), we only need to consider the following two cases:
Case 1. m = n = 1 and c(u1) = c(v1). Since Sr(3,1), T2 and Z2 satisfy u1 ≈ v1, it follows that (i(u1), f(u1), r(u1)) = (i(v1), f(v1), r(v1)), |u1| ≥ 2 and |v1| ≥ 2. Hence u1 ≈ v1 can be derived from (3.1)–(3.6).
Case 2. m, n ≥ 2. It is easy to verify that u ≈ v and the identity (3.12) imply the identities u ≈ u + vj, v ≈ v + ui for all i, j such that 1 ≤ i ≤ m, 1 ≤ j ≤ n. Conversely, the latter m + n identities imply u ≈ u + v ≈ v. Thus, to show that u ≈ v is derivable from (3.1)–(3.12) and the identities determining SR, we need only show that the simpler identities u ≈ u + vj, v ≈ v + ui (1 ≤ i ≤ m, 1 ≤ j ≤ n) are so derivable. Hence we need to consider the following two cases:
Case 2.1. u ≈ u + q, where |q| = 1. Since N2 ⊨ u ≈ u + q, there exists us in u with us = q; write u = u′ + us. Thus u + q = u′ + us + us ≈ u′ + us = u, by (3.12).
Case 2.2. u ≈ u + q, where |q| ≥ 2. Since u ≈ u + q holds in T2, it follows from Lemma 2.1 (ⅱ) that there exists ui in u such that |ui| ≥ 2. Put Z = (c(u1) ∪ ⋯ ∪ c(um)) ∖ c(q). Assume that DZ(u) = u1 + ⋯ + uk. Then c(u1) ∪ ⋯ ∪ c(uk) = c(q). By Lemma 2.2 (ⅰ), there exists p1 ∈ X+ such that r(p1) = r(q) and c(q) ⊆ c(p1) ⊆ c(u1) ∪ ⋯ ∪ c(uk). Moreover,
u ≈ u + ui + DZ(u)
  ≈ u + ui + p1 + DZ(u)
  ≈ u + ui + p1 + DZ(u) + p1³   (by (3.10))
  ≈ u + ui + p1 + DZ(u) + p1³ + p1³u1²u2²⋯uk².   (by (3.7))
Write p = p1³u1²u2²⋯uk². Thus c(p) = c(q), r(p) = r(q), and we have derived the identity
u≈u+p. | (3.13) |
Since |p| > 1, it follows that (3.4) implies the identity
p³ ≈ p. | (3.14) |
Suppose that i(q) = x1x2⋯xℓ. We shall show by induction on j that for every 1 ≤ j ≤ ℓ, u ≈ u + x1²x2²⋯xj²p is derivable from (3.1)–(3.11) and the identities defining SR.
From Lemma 2.1 (ⅱ), there exists ui1 in DZ(u) with c(ui1)⊆c(q) such that h(ui1)=h(q)=x1. Furthermore,
u ≈ u + ui1 + p   (by (3.13))
  ≈ u + ui1 + p + ui1²p   (by (3.8))
  ≈ u + ui1 + p + x1²ui1²p   (by (3.1))
  ≈ u + ui1 + p + x1²ui1²p + x1²p²ui1²p   (by (3.9))
  ≈ u + ui1 + p + x1²ui1²p + x1²p.   (by (3.6), (3.14))
Therefore
u ≈ u + x1²p. | (3.15) |
Assume that for some 1<j≤ℓ,
u ≈ u + x1²x2²⋯xj−1²p | (3.16) |
is derivable from (3.1)–(3.12) and the identities defining SR. By Lemma 2.1 (ⅱ), there exists ui in DZ(u) with c(ui) ⊆ c(q) such that ui = ui1xjui2 and c(ui1) ⊆ {x1, x2, …, xj−1}. It follows that
u ≈ u + ui + p
  ≈ u + ui + p + ui²p   (by (3.8))
  ≈ u + ui + p + ui1²xj²ui2²p   (by (3.3))
  ≈ u + ui + p + ui1²xj²ui2²p + ui1²xj²p²ui2²p   (by (3.9))
  ≈ u + ui + p + ui1²xj²ui2²p + ui1²xj²p.   (by (3.6), (3.14))
Consequently
u ≈ u + ui1²xj²p. | (3.17) |
Moreover, we have
u ≈ u + x1²x2²⋯xj−1²p + ui1²xj²p   (by (3.16), (3.17))
  ≈ u + x1²x2²⋯xj−1²p + ui1²xj²p + x1²x2²⋯xj−1²(ui1²xj²p)²p   (by (3.9))
  ≈ u + x1²x2²⋯xj−1²p + ui1²xj²p + x1²x2²⋯xj−1²xj²p.   (by (3.3), (3.6), (3.14))
Hence u ≈ u + x1²x2²⋯xj−1²xj²p. By induction, we have
u ≈ u + i(q)²p. | (3.18) |
Dually,
u ≈ u + pf(q)². | (3.19) |
Thus
u ≈ u + p + i(q)²p + pf(q)²   (by (3.13), (3.18), (3.19))
  ≈ u + p + i(q)²p + pf(q)² + i(q)²pppf(q)²   (by (3.11))
  ≈ u + p + i(q)²p + pf(q)² + i(q)²pf(q)²   (by (3.14))
  ≈ u + p + i(q)²p + pf(q)² + q.   (by (3.1)–(3.6))
It follows that u≈u+q.
In this section we characterize the lattice L(HSP(B0,(B0)∗,A0,N2,T2,Z2,W2)). Throughout this section, t(x1,…,xn) denotes the term t which contains no other variables than x1,…,xn (but not necessarily all of them). Let S∈HSP(B0,(B0)∗,A0,N2,T2,Z2,W2) and let E+(S) denote the set {a∈S|a+a=a}, where any element of E+(S) is said to be an additive idempotent of (S,+). Notice that HSP(B0,(B0)∗,A0,N2,T2,Z2,W2) satisfies the identities
(x+y)+(x+y)≈(x+x)+(y+y), | (4.1) |
xy+xy≈(x+x)(y+y). | (4.2) |
By (4.1) and (4.2), it is easy to verify that E+(S) = {a + a | a ∈ S} forms a subsemiring of S; indeed, (a + a) + (b + b) ≈ (a + b) + (a + b) by (4.1) and (a + a)(b + b) ≈ ab + ab by (4.2), so E+(S) is closed under both operations. To characterize the lattice L(HSP(B0,(B0)∗,A0,N2,T2,Z2,W2)), we need to consider the following mapping
φ:L(HSP(B0,(B0)∗,A0,N2,T2,Z2,W2))→L(HSP(B0,(B0)∗,A0,N2,T2)),W↦W∩HSP(B0,(B0)∗,A0,N2,T2). | (4.3) |
It is easy to prove that φ(W)={E+(S)|S∈W} for each member W of L(HSP(B0,(B0)∗,A0,N2,T2, Z2,W2)). If W is the subvariety of HSP(B0,(B0)∗,A0,N2,T2) determined by the identities
ui(xi1,…,xin)≈vi(xi1,…,xin),i∈k_, |
then ˆW denotes the subvariety of HSP(B0,(B0)∗,A0,N2,T2,Z2,W2) determined by the identities
ui(xi1+xi1,…,xin+xin)≈vi(xi1+xi1,…,xin+xin),i∈k_. | (4.4) |
Lemma 4.1. [16] The ai-semiring variety HSP(B0,(B0)∗,A0,N2,T2) is determined by the identities (3.1)–(3.11) and L(HSP(B0,(B0)∗,A0,N2,T2)) is a distributive lattice of order 716.
Lemma 4.2. Let W be a member of L(HSP(B0,(B0)∗,A0,N2,T2)). Then, ˆW=W∨HSP(Z2,W2).
Proof. Since W satisfies the identities (4.4), it follows that W is a subvariety of ˆW. Both Z2 and W2 are members of ˆW and so W∨HSP(Z2,W2) ⊆ ˆW. To show the converse inclusion, it suffices to show that every identity that is satisfied by W∨HSP(Z2,W2) can be derived from the identities holding in HSP(B0,(B0)∗,A0,N2,T2,Z2,W2) and
ui(xi1+xi1,…,xin+xin)≈vi(xi1+xi1,…,xin+xin),i∈k_, |
if W is the subvariety of HSP(B0,(B0)∗,A0,N2,T2) determined by ui(xi1,…,xin) ≈ vi(xi1,…,xin), i ∈ k_. Let u ≈ v be such an identity, where u = u1 + u2 + ⋯ + um, v = v1 + v2 + ⋯ + vn, ui, vj ∈ X+, 1 ≤ i ≤ m, 1 ≤ j ≤ n. By Lemma 2.1 (ⅳ), we only need to consider the following two cases.
Case 1. m,n≥2. By identity (3.12), HSP(B0,(B0)∗,A0,N2,T2,Z2,W2) satisfies the identities
u+u≈u, | (4.5) |
v+v≈v. | (4.6) |
Since u≈v holds in HSP(B0,(B0)∗,A0,N2,T2), we have that it is derivable from the collection Σ of ui≈vi,i∈k_ and the identities determining HSP(B0,(B0)∗,A0,N2,T2). From [1,Exercise Ⅱ.14.11], it follows that there exist t1,t2,…,tℓ∈Pf(X+) such that
● t1=u,tℓ=v;
● For any i=1,2,…,ℓ−1, there exist pi,qi,ri∈Pf(X+) (where pi, qi and ri may be empty words), a semiring substitution φi and an identity u′i≈v′i∈Σ such that
ti = piφi(wi)qi + ri,  ti+1 = piφi(si)qi + ri,  where either wi = u′i, si = v′i or wi = v′i, si = u′i.
Let Σ′ denote the set {u+u≈v+v|u≈v∈Σ}. For any i=1,2,…,ℓ−1, we shall show that ti+ti≈ti+1+ti+1 is derivable from Σ′ and the identities holding in HSP(B0,(B0)∗,A0,N2,T2,Z2,W2). Indeed, we have
ti + ti = piφi(wi)qi + ri + piφi(wi)qi + ri
  ≈ piφi(wi)qi + piφi(wi)qi + ri + ri
  ≈ pi(φi(wi + wi))qi + ri + ri
  ≈ pi(φi(si + si))qi + ri + ri   (since wi + wi ≈ si + si ∈ Σ′ or si + si ≈ wi + wi ∈ Σ′)
  ≈ piφi(si)qi + piφi(si)qi + ri + ri
  ≈ piφi(si)qi + ri + piφi(si)qi + ri = ti+1 + ti+1.
Further,
u+u=t1+t1≈t2+t2≈⋯≈tℓ+tℓ=v+v. |
This implies the identity
u+u≈v+v. | (4.7) |
We now have
u ≈ u + u (by (4.5)) ≈ v + v (by (4.7)) ≈ v (by (4.6)). | (4.8) |
Case 2. m=n=1 and c(u)=c(v). Since Z2⊨u1≈v1, u1≠x,v1≠x, for every x∈X. Since u1≈v1 holds in HSP(B0,(B0)∗,A0,N2,T2), we have that it is derivable from the collection Σ of ui≈vi,i∈k_ and the identities defining HSP(B0,(B0)∗,A0,N2,T2). From [1,Exercise Ⅱ.14.11], it follows that there exist t1,t2,…,tℓ∈Pf(X+) such that
● t1=u1,tℓ=v1;
● For any i=1,2,…,ℓ−1, there exist pi,qi∈Pf(X+) (where pi and qi may be empty words), a semiring substitution φi and an identity u′i≈v′i∈Σ (where u′i and v′i are words) such that
ti = piφi(wi)qi,  ti+1 = piφi(si)qi,  where either wi = u′i, si = v′i or wi = v′i, si = u′i.
By Lemma 4.1, we have that u1≈v1 can be derived from (3.1)–(3.6), so, by Theorem 3.1, it can be derived from monomial identities holding in HSP(B0,(B0)∗,A0,N2,T2,Z2,W2). This completes the proof.
Lemma 4.3. The following equality holds
L(HSP(B0,(B0)∗,A0,N2,T2,Z2,W2)) = ⋃_{W∈L(HSP(B0,(B0)∗,A0,N2,T2))} [W, ˆW]. | (4.9) |
There are 716 intervals in L(HSP(B0,(B0)∗,A0,N2,T2,Z2,W2)), and each interval is a congruence class of the kernel of the complete epimorphism φ in (4.3).
Proof. Firstly, we shall show that equality (4.9) holds. It is easy to see that
L(HSP(B0,(B0)∗,A0,N2,T2,Z2,W2)) = ⋃_{W∈L(HSP(B0,(B0)∗,A0,N2,T2))} φ−1(W).
So it suffices to show that
φ−1(W)=[W,ˆW], | (4.10) |
for each member W of L(HSP(B0,(B0)∗,A0,N2,T2)). If W1 is a member of [W,ˆW], then it is routine to verify that W⊆{E+(S)|S∈W1}⊆W. This implies that {E+(S)|S∈W1}=W and so φ(W1)=W. Hence, W1 is a member of φ−1(W) and so [W,ˆW]⊆φ−1(W). Conversely, if W1 is a member of φ−1(W), then W=φ(W1)={E+(S)|S∈W1} and so φ−1(W)⊆[W,ˆW]. This shows that (4.9) holds.
From Lemma 4.1, we know that L(HSP(B0,(B0)∗,A0,N2,T2)) is a lattice of order 716. So there are 716 intervals in L(HSP(B0,(B0)∗,A0,N2,T2,Z2,W2)). Next, we show that φ is a complete epimorphism. On the one hand, it is easy to see that φ is a complete ∧-epimorphism. On the other hand, let (Wi)i∈I be a family of members of L(HSP(B0,(B0)∗,A0,N2,T2,Z2,W2)). Then, by (4.3), we have that φ(Wi) ⊆ Wi ⊆ ˆφ(Wi) for each i ∈ I. Further,
⋁_{i∈I} φ(Wi) ⊆ ⋁_{i∈I} Wi ⊆ ⋁_{i∈I} ˆφ(Wi) ⊆ ˆ(⋁_{i∈I} φ(Wi)).
This implies that φ(⋁_{i∈I} Wi) = ⋁_{i∈I} φ(Wi). Thus, φ is a complete ∨-homomorphism and so φ is a complete ∨-epimorphism. By (4.10), we deduce that each interval in (4.9) is a congruence class of the kernel of the complete epimorphism φ.
In order to characterize the lattice L(HSP(B0,(B0)∗,A0,N2,T2,Z2,W2)), by Lemma 4.3, we only need to describe the interval [W,ˆW] for each member W of L(HSP(B0, (B0)∗,A0,N2,T2)). Next, we have
Lemma 4.4. Let W be a member of L(HSP(B0,(B0)∗,A0,N2,T2)). Then, W∨HSP(Z2) is the subvariety of ˆW determined by the identity
x³ ≈ x³ + x³. | (4.11) |
Proof. It is easy to see that both W and HSP(Z2) satisfy the identity (4.11), and so does W∨HSP(Z2). In the following we prove that every identity that is satisfied by W∨HSP(Z2) is derivable from (4.11) and the identities holding in ˆW. Let u ≈ v be such an identity, where u = u1 + u2 + ⋯ + um, v = v1 + v2 + ⋯ + vn, ui, vj ∈ X+, 1 ≤ i ≤ m, 1 ≤ j ≤ n. We only need to consider the following cases.
Case 1. m = n = 1. Since Z2 satisfies u1 ≈ v1, it follows that |u1| ≠ 1 and |v1| ≠ 1. By Lemma 4.2, ˆW satisfies the identity u1³ + u1³ ≈ v1³ + v1³. Hence u1 ≈ u1³ (by (3.4)) ≈ u1³ + u1³ (by (4.11)) ≈ v1³ + v1³ ≈ v1³ (by (4.11)) ≈ v1 (by (3.4)).
Case 2. m = 1, n ≥ 2. Since Z2 satisfies u1 ≈ v, it follows that |u1| ≠ 1. By Lemma 4.2, ˆW satisfies the identity u1³ + u1³ ≈ v + v. Hence u1 ≈ u1³ (by (3.4)) ≈ u1³ + u1³ (by (4.11)) ≈ v + v ≈ v (by (3.12)).
Case 3. m≥2, n=1. Similar to Case 2.
Case 4. m, n ≥ 2. By Lemma 4.2, ˆW satisfies the identity u + u ≈ v + v. Hence u ≈ u + u (by (3.12)) ≈ v + v ≈ v (by (3.12)).
Lemma 4.5. Let W be a member of L(Sr(3,1)). Then W∨HSP(W2) is the subvariety of ˆW determined by the identity
x³ ≈ x. | (4.12) |
Proof. It is easy to see that both W and HSP(W2) satisfy the identity (4.12), and so does W∨HSP(W2). So it suffices to show that every identity that is satisfied by W∨HSP(W2) is derivable from (4.12) and the identities holding in ˆW. Let u ≈ v be such an identity, where u = u1 + u2 + ⋯ + um, v = v1 + v2 + ⋯ + vn, ui, vj ∈ X+, 1 ≤ i ≤ m, 1 ≤ j ≤ n. By Lemma 4.2, ˆW satisfies the identity u³ ≈ v³. Hence u ≈ u³ (by (4.12)) ≈ v³ ≈ v (by (4.12)).
Lemma 4.6. Let W be a member of L(HSP(B0,(B0)∗,A0,N2,T2)). Then the interval [W,ˆW] of L(HSP(B0,(B0)∗,A0,N2,T2,Z2,W2)) is given in Figure 1.
Proof. Suppose that W1 is a member of [W,ˆW] such that W1≠ˆW and W1≠W. Then, there exists a nontrivial identity u≈v holding in W1 such that it is not satisfied by ˆW. Also, we have that W1 does not satisfy the identity x+x≈x. By Lemma 4.2, we only need to consider the following two cases.
Case 1. HSP(Z2)⊨u≈v,HSP(W2)⊭u≈v. Then, u≈v satisfies one of the following three cases:
● m=n=1, c(u1)≠c(v1), |u1|≠1 and |v1|≠1;
● m=1,n>1 and |u1|≠1;
● m>1,n=1 and |v1|≠1.
It is easy to see that, in each of the above cases, u ≈ v implies the identity x³ ≈ x³ + x³. By Lemma 4.4, we have that W1 is a subvariety of W∨HSP(Z2). On the other hand, since W1 ⊨ x³ ≈ x³ + x³ and W1 ⊭ x + x ≈ x, it follows that Z2 is a member of W1 and so W∨HSP(Z2) is a subvariety of W1. Thus, W1 = W∨HSP(Z2).
Case 2. HSP(Z2)⊭u≈v,HSP(W2)⊨u≈v. Then, u≈v satisfies one of the following two cases:
● m=n=1, c(u1)=c(v1) and |u1|=1;
● m=n=1, c(u1)=c(v1) and |v1|=1.
If N2, T2 ∉ W, then, in each of the above cases, u ≈ v implies the identity x ≈ x³. By Lemma 4.5, W1 is a subvariety of W∨HSP(W2). On the other hand, since W1 ⊨ x ≈ x³ and W1 ⊭ x ≈ x + x, it follows that W2 is a member of W1 and so W∨HSP(W2) is a subvariety of W1. Thus, W1 = W∨HSP(W2).
If N2 ∈ W, then, by Lemma 2.1 (ⅰ), |u1| = |v1| = 1, a contradiction. Thus, W1 = ˆW.
If T2 ∈ W, then, by Lemma 2.1 (ⅱ), |u1| ≥ 2, |v1| ≥ 2, a contradiction. Thus, W1 = ˆW.
By Lemmas 4.3 and 4.6, we can show that the lattice L(HSP(B0,(B0)∗,A0,N2,T2,Z2,W2)) of subvarieties of the variety HSP(B0,(B0)∗,A0,N2,T2,Z2,W2) contains 2327 elements. In fact, we have
Theorem 4.7. L(HSP(B0,(B0)∗,A0,N2,T2,Z2,W2)) is a distributive lattice of order 2327.
Proof. We recall from [16] that Sr(3,1)∨T2 (resp. Sr(3,1)∨N2) contains 358 subvarieties, since Sr(3,1) contains 179 subvarieties. By Lemmas 4.3 and 4.6, we can show that L(HSP(B0,(B0)∗,A0,N2,T2,Z2,W2)) has exactly 2327 elements (where 2327 = 179×4 + 358×3×2 − 179×3). Suppose that W1, W2 and W3 are members of L(HSP(B0,(B0)∗,A0,N2,T2,Z2,W2)) such that W1∨W2 = W1∨W3 and W1∧W2 = W1∧W3. Then, by Lemma 4.3,
φ(W1)∨φ(W2)=φ(W1)∨φ(W3) |
and
φ(W1)∧φ(W2)=φ(W1)∧φ(W3). |
Since L(HSP(B0,(B0)∗,A0,N2,T2)) is distributive, it follows that φ(W2) = φ(W3). Write W for φ(W2). Then both W2 and W3 are members of [W, ˆW]. Suppose that W2 ≠ W3. Then, by Lemma 4.6, W1∨W2 = W1∨W3 and W1∧W2 = W1∧W3 cannot hold at the same time. This implies that W2 = W3, and hence L(HSP(B0,(B0)∗,A0,N2,T2,Z2,W2)) is distributive.
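For the reader's convenience, the arithmetic behind the count quoted in the proof can be regrouped (our own gloss) as

2327 = 179×4 + 358×3×2 − 179×3 = 179×4 + (716 − 179)×3 = 716 + 1611,

which is consistent with Lemma 4.6: of the 716 intervals [W, ˆW], 179 contribute four subvarieties each and the remaining 537 contribute three each.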
By Theorems 3.1 and 4.7 and [14, Corollary 1.2], we now immediately deduce
Corollary 4.8. HSP(B0,(B0)∗,A0,N2,T2,Z2,W2) is hereditarily finitely based.
This article considers the semiring variety generated by B0,(B0)∗,A0,N2,T2,Z2,W2. The finite basis problem for semirings is an interesting and developing topic, with plenty of evidence of a level of complexity comparable to the more well-developed area of semigroup varieties. This article is primarily a contribution toward the property of being hereditarily finitely based, meaning that all subvarieties are finitely based. This property is useful because it guarantees the finite basis property for a large number of examples.
This work was supported by the Natural Science Foundation of Chongqing (cstc2019jcyj-msxmX0156, cstc2020jcyj-msxmX0272, cstc2021jcyj-msxmX0436), the Scientific and Technological Research Program of Chongqing Municipal Education Commission (KJQN202001107, KJQN202101130) and the Scientific Research Starting Foundation of Chongqing University of Technology (2019ZD68).
The authors declare that they do not have any conflict of interests regarding this paper.