Research article

Robust strong duality for nonconvex optimization problem under data uncertainty in constraint

  • Received: 08 February 2021 Accepted: 09 August 2021 Published: 26 August 2021
  • MSC : 90C46, 90C48

  • This paper deals with robust strong duality for nonconvex optimization problems with data uncertainty in the constraint. A new weak conjugate function, which is abstract convex, is introduced, and three kinds of robust dual problems are constructed for the primal optimization problem by employing this weak conjugate function: the robust augmented Lagrange dual, the robust weak Fenchel dual, and the robust weak Fenchel-Lagrange dual problem. Characterizations of inequality (1.1) in terms of robust abstract perturbational weak conjugate duality are established by using abstract convexity. These results are used to obtain robust strong duality between the nonconvex uncertain optimization problem and the robust dual problems mentioned above; optimality conditions for this nonconvex uncertain optimization problem are also investigated.

    Citation: Yanfei Chai. Robust strong duality for nonconvex optimization problem under data uncertainty in constraint[J]. AIMS Mathematics, 2021, 6(11): 12321-12338. doi: 10.3934/math.2021713




    Robust optimization problems [4,5,7,8,22,23,29,30,31] and robust duality theory [3,6,10,11,12,13,15,16,18,19,28] have attracted much attention from mathematical researchers. Many works in this area considered convex robust optimization problems: in [6,15] robust Lagrangian strong duality was established in convex optimization, and in [16] a robust Lagrangian strong duality theorem was given whenever the Lagrangian function is convex. Moreover, duality theory based on conjugate functions plays an important role in optimization. In convex analysis, the dual problem is constructed in terms of conjugate functions by using the well-known Legendre-Fenchel transform. Robust classical conjugate duality was presented for convex optimization problems in [18]. Furthermore, in [13], characterizations of the inequality

    $$ p(x):=\sup_{v\in V}F_v(x,0_Y)\ge l(x) \qquad (1.1) $$

    in terms of robust abstract perturbational duality were established, where $X$, $Y$ are locally convex Hausdorff topological vector spaces, $V$ is an uncertainty set, $F_v:X\times Y\to\bar{\mathbb{R}}=\mathbb{R}\cup\{\pm\infty\}$ for each $v\in V$, and $l:X\to\bar{\mathbb{R}}$ is a lower semicontinuous proper convex function. The results were then applied to robust DC and robust convex optimization problems, and strong Fenchel duality and strong Lagrangian duality for these classes of robust problems were also obtained.

    It is well known that dual problems constructed by using general augmented Lagrangian functions or weak conjugate functions, together with strong duality conditions for nonconvex optimization problems, have been comprehensively studied [1,2,13,14,17,20,21,25,27,33]. In particular, the conjugate function theory developed by Azimov and Gasimov in [1] used superlinear functions of the form $\langle x^*,x\rangle-c\|x\|$ instead of the linear functions $\langle x^*,x\rangle$ used in convex analysis. They extended the usual definition of the subdifferential using this class of functions and established duality relations in terms of the so-called weak subdifferentiability of the perturbation function associated with the problem under consideration. By using the weak conjugate function and weak subdifferential given in [1], Küçük et al. in [17] constructed the weak Fenchel conjugate dual problem and the weak Fenchel-Lagrange conjugate dual problem, and presented necessary and sufficient conditions for strong duality between these dual problems and the nonconvex scalar optimization problem. In [33], a duality scheme and strong duality theorems for nonconvex optimization problems were presented, based on the weak conjugate function and the weak subdifferential concept given in [1].

    Nevertheless, there are few duality results on nonconvex robust optimization problems in the literature, since it is hard not only to verify zero duality gap conditions formulated in terms of perturbation and/or dualizing parameterization functions, but also to derive conditions formulated in terms of the objective and constraint functions. Motivated by [13,17,33], the aim of this paper is to formulate robust dual problems by using the weak conjugate function we introduce (see Definition 2.1) and to establish robust strong duality results for the nonconvex uncertain optimization problem. A characterization of the general inequality (1.1) with uncertainty is established in terms of robust perturbational weak conjugate duality, where we only assume that the right-hand function $l$ in (1.1) is abstract convex [9,24], which covers very broad classes of nonconvex functions. These results are then used as key tools to obtain strong duality for the robust augmented Lagrange dual (RDL), the robust weak Fenchel dual (RDwF) and the robust weak Fenchel-Lagrange dual (RDwFL) problems, which are all defined by using the weak conjugate function, and are also applied to investigate optimality conditions for the nonconvex robust optimization problem.

    The paper is organized as follows. In Section 2, we recall some notation and introduce preliminary results used in the rest of the paper. In Section 3, we construct three types of robust dual problems for the primal optimization problem by using the weak conjugate function and obtain strong duality for each by establishing inequality (1.1) via robust perturbational weak conjugate duality. In Section 4, we investigate the relations among the optimal objective values of (RDL), (RDwF), (RDwFL) and the robust counterpart (RP) of (UP). Finally, in Section 5, we present necessary and sufficient optimality conditions for (RDL), (RDwF), (RDwFL) and (RP).

    In this section, we introduce the definitions of the weak conjugate function, the weak biconjugate function and the weak subdifferential, together with some basic theorems and lemmas about these notions.

    Throughout this paper, let $X$, $Y$ be two locally convex vector spaces with topological dual spaces $X^*$ and $Y^*$, endowed with the weak* topologies $w(X^*,X)$ and $w(Y^*,Y)$, respectively. Let $D\subseteq Y$ be a nonempty closed convex cone; the dual cone of $D$ is defined by

    $$ D^+=\{y^*\in Y^*:\ \langle y^*,y\rangle\ge 0,\ \forall y\in D\}, $$

    where we use the notation $\langle\cdot,\cdot\rangle$ for the value $\langle y^*,y\rangle$ of the continuous linear functional $y^*\in Y^*$ at $y\in Y$. We write $\mathbb{R}_+=\{x\in\mathbb{R}:\ x\ge 0\}$. We also recall the corresponding concepts and results on (extended) real-valued functions. Let $f:G\to\bar{\mathbb{R}}$, $g:G\to\bar{\mathbb{R}}$ be functions defined on a set $G\subseteq X$; the inequality $f\le g$ means that $f(x)\le g(x)$ for all $x\in G$. The domain and the epigraph of $f$ are

    $$ \mathrm{dom}\,f=\{x\in X:\ f(x)<+\infty\} $$

    and

    $$ \mathrm{epi}\,f=\{(x,r)\in X\times\mathbb{R}:\ x\in\mathrm{dom}\,f,\ f(x)\le r\}, $$

    respectively. The strict epigraph of $f:X\to\bar{\mathbb{R}}$ is the set

    $$ \mathrm{epi}_s f=\{(x,r)\in X\times\mathbb{R}:\ x\in\mathrm{dom}\,f,\ f(x)<r\}. $$

    The function $f$ is said to be proper if $\mathrm{dom}\,f\ne\emptyset$ and $f(x)>-\infty$ for all $x\in G$. Let $H$ be a set of functions $h:G\to\bar{\mathbb{R}}$. The set $\mathrm{supp}(f,H)=\{h\in H:\ h\le f\}$ is called the support set of $f$ with respect to $H$. The function $\mathrm{co}_H f:G\to\bar{\mathbb{R}}$ defined by $\mathrm{co}_H f(x)=\sup\{h(x):\ h\in\mathrm{supp}(f,H)\}$ is called the $H$-convex hull of $f$. A function $f:G\to\bar{\mathbb{R}}$ is called abstract convex with respect to $H$ (or $H$-convex) at a point $x\in G$ if there exists a set $U\subseteq\mathrm{supp}(f,H)$ such that $f(x)=\sup\{h(x):\ h\in U\}$. It is clear that $f$ is $H$-convex at $x$ if and only if $f(x)=\mathrm{co}_H f(x)$. If $f$ is $H$-convex at each point $x\in G$, then $f$ is called $H$-convex on $G$.

    Let $L$ be a set of functions defined on a set $G$. Functions $h_{l,r}$ of the form $h_{l,r}(x)=l(x)-r$, $x\in G$, with $l\in L$ and $r\in\mathbb{R}$, are called $L$-affine. Denote by $H_L$ the set of all $L$-affine functions, and by $\Gamma(X)$ the union of the set of all functions $f:G\to\mathbb{R}\cup\{+\infty\}$ and the function $-\infty$, where $(-\infty)(x)=-\infty$ for all $x\in G$.

    We now introduce the definitions of the new weak conjugate and weak biconjugate functions. First we need a function $\sigma$ for these definitions. It is assumed that $\sigma:Y\to\mathbb{R}_+$ (and analogously on $X$) is a continuous function with the following properties:

    $$ \sigma(0)=0,\qquad \sigma(y)>0\ \text{if}\ y\ne 0. \qquad (2.1) $$

    Definition 2.1. (a) The function $f^w:X^*\times\mathbb{R}_+\to\bar{\mathbb{R}}$ defined by

    $$ f^w(x^*,c)=\sup_{x\in X}\{\langle x^*,x\rangle-c\sigma(x)-f(x)\} $$

    is called the weak conjugate function of $f$. This function is $H_X$-convex;

    (b) The function $f^{ww}:X\to\bar{\mathbb{R}}$ defined by

    $$ f^{ww}(x)=\sup_{(x^*,c)\in X^*\times\mathbb{R}_+}\{\langle x^*,x\rangle-c\sigma(x)-f^w(x^*,c)\} $$

    is called the weak biconjugate function of $f$. This function is $H_{X^*\times\mathbb{R}_+}$-convex. The classical result of abstract convex analysis states that $f$ is abstract convex with respect to $H_{X^*\times\mathbb{R}_+}$ at a point $x$ if and only if $f(x)=f^{ww}(x)$.

    Remark 2.1. In the definition of the weak conjugate function, if $c=0$, then $f^w(x^*,0)=f^*(x^*)$, where $f^*$ is the classical conjugate function; if $\sigma(x)=\|x\|$, then $f^w(x^*,c)$ reduces to $f^w(0,x^*,c)$ in [1].
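    On $X=\mathbb{R}$, the weak conjugate above can be approximated on a finite grid. The following is a minimal numerical sketch (illustrative only, not from the paper), assuming $\sigma(x)=|x|$ by default:

```python
import numpy as np

def weak_conjugate(f, x_star, c, sigma=abs, grid=None):
    """Grid approximation of f^w(x*, c) = sup_x { x* * x - c*sigma(x) - f(x) } on X = R."""
    if grid is None:
        grid = np.linspace(-10.0, 10.0, 2001)  # step 0.01, contains a point near 0
    vals = (x_star * grid
            - c * np.array([sigma(t) for t in grid])
            - np.array([f(t) for t in grid]))
    return float(vals.max())
```

    With $c=0$ this recovers the classical conjugate $f^*(x^*)$, in line with Remark 2.1; e.g., for $f=|\cdot|$ and $|x^*|\le 1$ both values are $0$.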

    Definition 2.2. Let $X$ be a locally convex vector space, let $f:X\to\mathbb{R}$ be a single-valued function, and let $x_0\in X$ be a point at which $f(x_0)$ is finite. A pair $(x^*,c)\in X^*\times\mathbb{R}_+$ is called a weak subgradient of $f$ at $x_0$ if

    $$ f(x)-f(x_0)\ge\langle x^*,x-x_0\rangle-c\sigma(x-x_0)\quad \text{for all } x\in X. $$

    The set $\partial^w f(x_0)=\{(x^*,c)\in X^*\times\mathbb{R}_+:\ f(x)-f(x_0)\ge\langle x^*,x-x_0\rangle-c\sigma(x-x_0),\ \forall x\in X\}$ of all weak subgradients of $f$ at $x_0$ is called the weak subdifferential of $f$ at $x_0$. If $\partial^w f(x_0)\ne\emptyset$, then $f$ is called weakly subdifferentiable at $x_0$.

    Remark 2.2. If $\sigma(x)=\|x\|$, then Definition 2.2 reduces to the corresponding definition in [1].
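    The weak subgradient inequality of Definition 2.2 can be checked numerically on a grid. A small sketch (illustrative only) with $X=\mathbb{R}$ and $\sigma(x)=|x|$:

```python
import numpy as np

def is_weak_subgradient(f, x0, x_star, c, sigma=abs, tol=1e-9):
    """Check f(x) - f(x0) >= x* * (x - x0) - c*sigma(x - x0) on a grid of X = R."""
    grid = np.linspace(-10.0, 10.0, 2001)
    lhs = np.array([f(t) for t in grid]) - f(x0)
    rhs = x_star * (grid - x0) - c * np.array([sigma(t - x0) for t in grid])
    return bool(np.all(lhs >= rhs - tol))
```

    For instance, $f(x)=-|x|$ has no classical subgradient at $0$, but the pair $(0,1)$ is a weak subgradient there, since $-|x|\ge -\sigma(x)$ for all $x$.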

    Consider the following optimization problem with an uncertain parameter in the constraint:

    $$ \mathrm{(UP)}\qquad \inf_{x\in Q}\{f(x):\ g(x,v)\in -D\}, $$

    where $f:X\to\bar{\mathbb{R}}$ and $g:X\times Z\to Y$ are given functions, $Z$ is another locally convex vector space, $Q\subseteq X$ is a nonempty closed set, and $v$ is an uncertain parameter belonging to $V\subseteq Z$.

    For each $v\in V$, we denote

    $$ S_v=\{x\in Q:\ g(x,v)\in -D\}. $$

    In this paper, the robust optimization approach is applied to (UP). We associate with (UP) its robust counterpart

    $$ \mathrm{(RP)}\qquad \inf_{x\in Q}\{f(x):\ g(x,v)\in -D,\ \forall v\in V\}. $$

    We denote the feasible set of (RP) by

    $$ S=\{x\in Q:\ g(x,v)\in -D,\ \forall v\in V\}=\bigcap_{v\in V}S_v. $$

    The problem (RP) is called the robust primal problem of (UP). The infimum of problem (RP) is denoted by inf(RP), and every element $x\in S$ such that $f(x)=\inf\mathrm{(RP)}$ is called a robust solution of (UP) (or a solution of (RP)).

    The Lagrange perturbation function of (UP) is $F:V\times X\times Y\to\bar{\mathbb{R}}$, defined for each $v\in V$ as

    $$ F_v(x,y)=\begin{cases} f(x), & g(x,v)+y\in -D,\ x\in Q,\\ +\infty, & \text{otherwise}. \end{cases} \qquad (2.2) $$

    The weak conjugate function of $F_v$ is $F_v^w:X^*\times\mathbb{R}_+\times Y^*\times\mathbb{R}_+\to\bar{\mathbb{R}}$, given by

    $$ F_v^w(x^*,c,y^*,d)=\sup_{(x,y)\in X\times Y}\{\langle x^*,x\rangle-c\sigma(x)+\langle y^*,y\rangle-d\sigma(y)-F_v(x,y)\}=\sup_{x\in Q}\ \sup_{y\in -D-g(x,v)}\{\langle x^*,x\rangle-c\sigma(x)+\langle y^*,y\rangle-d\sigma(y)-f(x)\} \qquad (2.3) $$

    for all $(x^*,c,y^*,d)\in X^*\times\mathbb{R}_+\times Y^*\times\mathbb{R}_+$.

    Remark 2.3. It follows immediately from the definition of the weak biconjugate function that

    $$ \begin{aligned} F_v^{ww}(x,0)&=\sup_{(x^*,c,y^*,d)}\{\langle x^*,x\rangle-c\sigma(x)+\langle y^*,0\rangle-d\sigma(0)-F_v^w(x^*,c,y^*,d)\}\\ &=\sup_{(x^*,c,y^*,d)}\Big\{\langle x^*,x\rangle-c\sigma(x)-\sup_{(x',y)}\{\langle x^*,x'\rangle-c\sigma(x')+\langle y^*,y\rangle-d\sigma(y)-F_v(x',y)\}\Big\}\\ &\le\sup_{(x^*,c,y^*,d)}\{\langle x^*,x\rangle-c\sigma(x)-\langle x^*,x\rangle+c\sigma(x)+F_v(x,0)\}=F_v(x,0). \end{aligned} $$

    Remark 2.4. Considering (1.1) and the definition of $F_v(x,y)$, we can conclude that

    $$ p(x)=\begin{cases} f(x), & x\in S,\\ +\infty, & \text{otherwise}. \end{cases} $$

    Let $q:X^*\times\mathbb{R}_+\to\bar{\mathbb{R}}$ be the function defined by

    $$ q(x^*,c)=\inf_{v\in V}\ \inf_{(y^*,d)\in Y^*\times\mathbb{R}_+}F_v^w(x^*,c,y^*,d),\qquad (x^*,c)\in X^*\times\mathbb{R}_+. $$

    Let $\Pi:(x^*,c,y^*,d,r)\in X^*\times\mathbb{R}_+\times Y^*\times\mathbb{R}_+\times\mathbb{R}\mapsto(x^*,c,r)\in X^*\times\mathbb{R}_+\times\mathbb{R}$ be the projection, and let

    $$ \Lambda=\bigcup_{v\in V}\Pi(\mathrm{epi}\,F_v^w). $$

    Lemma 2.1. Let $p^w:X^*\times\mathbb{R}_+\to\bar{\mathbb{R}}$ be the weak conjugate function of $p$; then $p^w$ is lower semicontinuous and convex on $X^*\times\mathbb{R}_+$.

    Proof. By the definition of the weak conjugate function, we have

    $$ p^w(x^*,c)=\sup_{x\in X}\{\langle x^*,x\rangle-c\sigma(x)-p(x)\}=\sup_{x\in X}\{\langle x^*,x\rangle-(p+c\sigma)(x)\}=(p+c\sigma)^*(x^*), $$

    where $(p+c\sigma)(x)=p(x)+c\sigma(x)$. For each fixed $x\in X$, the function $(x^*,c)\mapsto\langle x^*,x\rangle-c\sigma(x)-p(x)$ is affine, hence continuous and convex; as a pointwise supremum of such functions, $p^w$ is lower semicontinuous and convex on $X^*\times\mathbb{R}_+$. The proof is complete.
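    The identity used in this proof, $p^w(x^*,c)=(p+c\sigma)^*(x^*)$, can be sanity-checked numerically. A sketch (not from the paper) with $p(x)=|x|$ and $\sigma(x)=|x|$ on a grid:

```python
import numpy as np

grid = np.linspace(-5.0, 5.0, 1001)
p = np.abs(grid)        # sample function p(x) = |x|
sigma = np.abs(grid)    # augmenting function sigma(x) = |x|
x_star, c = 0.7, 0.3

weak_conj = (x_star * grid - c * sigma - p).max()     # p^w(x*, c)
classical = (x_star * grid - (p + c * sigma)).max()   # (p + c*sigma)*(x*), Legendre-Fenchel
```

    Both expressions maximize the same quantity, so the two grid values agree (here both are $0$, since $0.7x-1.3|x|\le 0$ with equality at $x=0$).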

    The following lemmas generalize [13, Lemmas 2.1 and 2.2].

    Lemma 2.2. One has

    (i) $q^w=\sup_{v\in V}F_v^{ww}(\cdot,0_Y)\le p$;

    (ii) $p^w\le q^{ww}\le q$;

    (iii) $\mathrm{epi}_s\,q=\bigcup_{v\in V}\Pi(\mathrm{epi}_s\,F_v^w)$;

    (iv) $\Lambda\subseteq\mathrm{epi}\,q\subseteq\mathrm{epi}\,p^w$ and $\overline{\mathrm{co}}\,\Lambda\subseteq\mathrm{epi}\,p^w$.

    Proof. For any $x\in X$, from the definition of $q^w$ one has

    $$ \begin{aligned} q^w(x)&=\sup_{(x^*,c)\in X^*\times\mathbb{R}_+}\{\langle x^*,x\rangle-c\sigma(x)-q(x^*,c)\}\\ &=\sup_{(x^*,c)\in X^*\times\mathbb{R}_+}\Big\{\langle x^*,x\rangle-c\sigma(x)-\inf_{v\in V}\inf_{(y^*,d)\in Y^*\times\mathbb{R}_+}F_v^w(x^*,c,y^*,d)\Big\}\\ &=\sup_{v\in V}\ \sup_{\substack{(x^*,c)\in X^*\times\mathbb{R}_+\\ (y^*,d)\in Y^*\times\mathbb{R}_+}}\{\langle x^*,x\rangle-c\sigma(x)+\langle y^*,0\rangle-d\sigma(0)-F_v^w(x^*,c,y^*,d)\}\\ &=\sup_{v\in V}F_v^{ww}(x,0_Y). \end{aligned} \qquad (2.4) $$

    Since $F_v^{ww}(x,0_Y)\le F_v(x,0_Y)\le p(x)$ for all $v\in V$ and $x\in X$, (2.4) yields $q^w(x)\le p(x)$, so (i) holds, while (ii) follows from (i).

    Proof of (iii). Take $(x^*,c,r)\in\mathrm{epi}_s\,q$. Then

    $$ \inf_{v\in V}\ \inf_{(y^*,d)\in Y^*\times\mathbb{R}_+}F_v^w(x^*,c,y^*,d)<r, $$

    which implies that there exist $\bar v\in V$ and $(\bar y^*,\bar d)\in Y^*\times\mathbb{R}_+$ such that

    $$ F_{\bar v}^w(x^*,c,\bar y^*,\bar d)<r, $$

    so $(x^*,c,\bar y^*,\bar d,r)\in\mathrm{epi}_s\,F_{\bar v}^w$ and $(x^*,c,r)=\Pi(x^*,c,\bar y^*,\bar d,r)\in\bigcup_{v\in V}\Pi(\mathrm{epi}_s\,F_v^w)$, which means $\mathrm{epi}_s\,q\subseteq\bigcup_{v\in V}\Pi(\mathrm{epi}_s\,F_v^w)$.

    On the other hand, take $(x^*,c,r)\in\bigcup_{v\in V}\Pi(\mathrm{epi}_s\,F_v^w)$; then there exists $\bar v\in V$ such that $(x^*,c,r)\in\Pi(\mathrm{epi}_s\,F_{\bar v}^w)$. By the definition of $\Pi$, there is $(\bar y^*,\bar d)\in Y^*\times\mathbb{R}_+$ such that $(x^*,c,\bar y^*,\bar d,r)\in\mathrm{epi}_s\,F_{\bar v}^w$, and so $(x^*,c,r)\in\mathrm{epi}_s\,q$, as

    $$ q(x^*,c)=\inf_{v\in V}\ \inf_{(y^*,d)\in Y^*\times\mathbb{R}_+}F_v^w(x^*,c,y^*,d)\le F_{\bar v}^w(x^*,c,\bar y^*,\bar d)<r. $$

    Thus $\mathrm{epi}_s\,q\supseteq\bigcup_{v\in V}\Pi(\mathrm{epi}_s\,F_v^w)$, which, together with the inclusion above, proves that (iii) holds.

    Proof of (iv). Since

    $$ q(x^*,c)=\inf_{v\in V}\ \inf_{(y^*,d)\in Y^*\times\mathbb{R}_+}F_v^w(x^*,c,y^*,d)\le F_v^w(x^*,c,\bar y^*,\bar d) $$

    for all $v\in V$, $(\bar y^*,\bar d)\in Y^*\times\mathbb{R}_+$ and $(x^*,c)\in X^*\times\mathbb{R}_+$, it follows that $\Lambda\subseteq\mathrm{epi}\,q$. By (ii), $\mathrm{epi}\,q\subseteq\mathrm{epi}\,p^w$; since $\mathrm{epi}\,p^w$ is closed and convex by Lemma 2.1, we also get $\overline{\mathrm{co}}\,\Lambda\subseteq\mathrm{epi}\,p^w$.

    Lemma 2.3. Assume that there exists $\bar x\in X$ such that $\sup_{v\in V}F_v^{ww}(\bar x,0_Y)<+\infty$. Then one has $\mathrm{epi}\,q^{ww}=\overline{\mathrm{co}}\,\Lambda$. Moreover, the following statements are equivalent:

    (i) $p^{ww}=\sup_{v\in V}F_v^{ww}(\cdot,0_Y)$;

    (ii) $\mathrm{epi}\,p^w=\overline{\mathrm{co}}\,\Lambda$.

    Proof. Observe that $q^w=\sup_{v\in V}F_v^{ww}(\cdot,0_Y)$, and so by assumption one obtains $\mathrm{dom}\,q^w\ne\emptyset$. According to [34], $\mathrm{epi}\,q^{ww}=\overline{\mathrm{co}}(\mathrm{epi}\,q)$, which, together with Lemma 2.2 (iii), implies

    $$ \overline{\mathrm{co}}(\mathrm{epi}\,q)=\overline{\mathrm{co}}(\mathrm{epi}_s\,q)=\overline{\mathrm{co}}\Big(\bigcup_{v\in V}\Pi\,\mathrm{epi}_s\,F_v^w\Big)=\overline{\mathrm{co}}\Big(\bigcup_{v\in V}\Pi\,\mathrm{epi}\,F_v^w\Big)=\overline{\mathrm{co}}\,\Lambda. $$

    For the equivalence of (i) and (ii), note that, in light of (2.4), (i) is equivalent to $p^{ww}=q^w$, which in turn means that $q^{ww}=p^w$. This last equality and $\mathrm{epi}\,q^{ww}=\overline{\mathrm{co}}\,\Lambda$ show that $\mathrm{epi}\,p^w=\mathrm{epi}\,q^{ww}=\overline{\mathrm{co}}\,\Lambda$, which is (ii). The proof is complete.

    The aim of this section is threefold: to construct three types of robust dual problems for (UP) by using the weak conjugate function, namely the robust augmented Lagrange dual, the robust weak Fenchel dual and the robust weak Fenchel-Lagrange dual problem; to establish a characterization of inequality (1.1) in terms of robust abstract perturbational weak conjugate duality; and finally, by employing these results, to obtain robust strong duality results for (UP).

    To define an augmented Lagrange function for (UP), we need the augmenting function $\sigma$ to be a continuous function with properties (2.1). For each fixed $v\in V$, the uncertain augmented Lagrange function associated with (UP) is given by

    $$ L_v(x,y^*,d)=\inf_{y\in Y}\{F_v(x,y)-\langle y^*,y\rangle+d\sigma(y)\}=\begin{cases} \inf\{f(x)-\langle y^*,y\rangle+d\sigma(y):\ g(x,v)+y\in -D\}, & x\in Q,\\ +\infty, & \text{otherwise}, \end{cases} $$

    for $x\in X$, $y\in Y$, $y^*\in Y^*$ and $d\in\mathbb{R}_+$, where the function $F_v(x,y)$ is defined in (2.2). By using the definition of $F_v(x,y)$, we can write the augmented Lagrangian associated with (UP) concretely as

    $$ L_v(x,y^*,d)=\inf_{y\in -D-g(x,v)}\{f(x)-\langle y^*,y\rangle+d\sigma(y)\}, $$

    for $x\in Q$, $y^*\in Y^*$ and $d\in\mathbb{R}_+$.

    The uncertain dual function of (UP) is

    $$ \phi_v(y^*,d)=\inf_{x\in Q}L_v(x,y^*,d),\qquad (y^*,d)\in Y^*\times\mathbb{R}_+. $$

    Then the uncertain augmented Lagrange dual problem of (UP) is defined as

    $$ \mathrm{(UDL)}\qquad \sup_{(y^*,d)\in Y^*\times\mathbb{R}_+}\phi_v(y^*,d). $$

    The optimistic counterpart of the uncertain augmented Lagrange dual (UDL) is the deterministic maximization problem

    $$ \mathrm{(RDL)}\qquad \sup_{(v,y^*,d)\in V\times Y^*\times\mathbb{R}_+}\phi_v(y^*,d)=\sup_{(v,y^*,d)\in V\times Y^*\times\mathbb{R}_+}\ \inf_{x\in Q}L_v(x,y^*,d)=\sup_{(v,y^*,d)\in V\times Y^*\times\mathbb{R}_+}\ \inf_{x\in Q}\ \inf_{y\in -D-g(x,v)}\{f(x)-\langle y^*,y\rangle+d\sigma(y)\}. $$

    Now, when $x^*=0_{X^*}$ and $c=0$ in (2.3), the value of the function $F_v^w(0,0,y^*,d)$ is simply

    $$ F_v^w(0,0,y^*,d)=\sup_{x\in Q}\ \sup_{y\in -D-g(x,v)}\{\langle y^*,y\rangle-d\sigma(y)-f(x)\}. $$

    Hence,

    $$ -F_v^w(0,0,y^*,d)=\inf_{x\in Q}\ \inf_{y\in -D-g(x,v)}\{f(x)-\langle y^*,y\rangle+d\sigma(y)\}. $$

    As a result, the robust augmented Lagrange dual problem for (UP) with respect to $F_v$ can be given by

    $$ \mathrm{(RDL)}\qquad \sup_{(y^*,d)\in Y^*\times\mathbb{R}_+}\ \sup_{v\in V}\{-F_v^w(0,0,y^*,d)\}. $$

    The supremum of problem (RDL) is denoted by sup(RDL), and any element $(v,y^*,d)\in V\times Y^*\times\mathbb{R}_+$ such that $-F_v^w(0,0,y^*,d)=\sup\mathrm{(RDL)}$ is called a solution of (RDL).

    Theorem 3.1. (Weak duality) $\sup\mathrm{(RDL)}\le\inf\mathrm{(RP)}$.

    Proof. For arbitrary $(v,y^*,d)\in V\times Y^*\times\mathbb{R}_+$,

    $$ -F_v^w(0,0,y^*,d)=\inf_{x\in Q}\ \inf_{y\in -D-g(x,v)}\{f(x)-\langle y^*,y\rangle+d\sigma(y)\}\le\inf_{x\in S_v}f(x)\le\inf_{x\in S}f(x)=\inf\mathrm{(RP)}, $$

    so we conclude that $\sup\mathrm{(RDL)}\le\inf\mathrm{(RP)}$.
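    Theorem 3.1 can be illustrated numerically. The following toy instance is not from the paper: $X=Y=\mathbb{R}$, $Q=\mathbb{R}$, $D=\mathbb{R}_+$, $f(x)=x^2$, $g(x,v)=v-x$, $V=\{0.5,1\}$, $\sigma(y)=|y|$; every grid-approximated dual value $-F_v^w(0,0,y^*,d)$ never exceeds $\inf\mathrm{(RP)}$:

```python
import numpy as np

xs = np.linspace(-3.0, 3.0, 601)
ys = np.append(np.linspace(-6.0, 6.0, 1201), 0.0)  # include y = 0 exactly
V = [0.5, 1.0]
f = lambda x: x ** 2

# Robust primal value on the grid: minimize f over {x : v - x <= 0 for all v in V}.
primal = min(f(x) for x in xs if all(v - x <= 0 for v in V))

def dual_value(v, y_star, d):
    """-F_v^w(0,0,y*,d) = inf_x inf_{y <= x - v} { f(x) - y* * y + d*|y| } (grid sketch)."""
    best = np.inf
    for x in xs:
        feas = ys[ys <= x - v]  # y with g(x,v) + y = v - x + y in -D
        if feas.size:
            best = min(best, float((f(x) - y_star * feas + d * np.abs(feas)).min()))
    return best

duals = [dual_value(v, ystar, d) for v in V for ystar in (-1.0, 0.0, 1.0) for d in (1.0, 2.0)]
```

    Here weak duality manifests as `max(duals) <= primal`; the choice of sample multipliers $(y^*,d)$ is arbitrary.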

    In the following sections we always assume that $\Gamma$ is a set of functions defined on $X$ and that $H$ is a nonempty set of functions, and we define $\Gamma_H(X)=\{l\in\Gamma(X):\ l\ \text{is}\ H\text{-convex on}\ X\}$.

    Theorem 3.2. (Robust abstract perturbational weak conjugate duality) Consider the following statements:

    (a1) $\mathrm{epi}\,p^w=\Lambda$;

    (b1) For any $l\in\Gamma_H(X)$, the following assertions are equivalent:

    (b11) $p(x)\ge l(x)$ for all $x\in X$;

    (b12) for all $(x^*,c)\in X^*\times\mathbb{R}_+$,

    there exists $(v,y^*,d)\in V\times Y^*\times\mathbb{R}_+$ such that $F_v^w(x^*,c,y^*,d)\le l^w(x^*,c)$.

    One has (a1) $\Leftrightarrow$ (b1).

    Proof. (a1) $\Rightarrow$ (b1). Assume that (a1) holds and let $l\in\Gamma_H(X)$. If (b11) holds, i.e., $p\ge l$, then $p^w\le l^w$, and hence for all $(x^*,c)\in\mathrm{dom}\,l^w$, $(x^*,c,l^w(x^*,c))\in\mathrm{epi}\,p^w$. Since (a1) holds, there exists $v\in V$ such that $(x^*,c,l^w(x^*,c))\in\Pi(\mathrm{epi}\,F_v^w)$, and so there exists $(y^*,d)\in Y^*\times\mathbb{R}_+$ such that $F_v^w(x^*,c,y^*,d)\le l^w(x^*,c)$, which implies that (b12) holds.

    Conversely, if (b12) holds, then for any $(x^*,c)\in X^*\times\mathbb{R}_+$ there exists $(v,y^*,d)\in V\times Y^*\times\mathbb{R}_+$ such that $F_v^w(x^*,c,y^*,d)\le l^w(x^*,c)$. Taking Lemma 2.2 (ii) into account, we get $p^w(x^*,c)\le q(x^*,c)\le F_v^w(x^*,c,y^*,d)\le l^w(x^*,c)$, and hence $l=l^{ww}\le p^{ww}\le p$, where the equality follows from $l$ being $H$-convex.

    (b1) $\Rightarrow$ (a1). Assume that (b1) holds; we show that (a1) holds. To this aim, taking Lemma 2.2 (iv) into account, it is sufficient to prove that

    $$ \mathrm{epi}\,p^w\subseteq\Lambda. $$

    Take any $(x^*,c,r)\in\mathrm{epi}\,p^w$. Then $\langle x^*,x\rangle-c\sigma(x)-p(x)\le p^w(x^*,c)\le r$ for all $x\in X$. Now set $l(x)=\langle x^*,x\rangle-c\sigma(x)-r$; then $l\in\Gamma_H(X)$ and $l\le p$, so by (b1) there exists $(v,y^*,d)\in V\times Y^*\times\mathbb{R}_+$ such that $F_v^w(x^*,c,y^*,d)\le l^w(x^*,c)\le r$. This shows that $(x^*,c,r)\in\Pi(\mathrm{epi}\,F_v^w)$ and hence $(x^*,c,r)\in\Lambda$.

    Remark 3.1. Theorem 3.2 generalizes [13, Theorem 3.1]. In [13] the authors used the classical conjugate function and assumed that the right-hand side function $l(x)$ of inequality (1.1) is convex and lower semicontinuous, whereas in Theorem 3.2 of this paper we employ the weak conjugate function and only assume that $l(x)$ is abstract convex, which covers very broad classes of nonconvex functions.

    Theorem 3.3. (Strong duality) Let $\mathrm{epi}\,p^w=\Lambda$. Then there exists $(v_0,y_0^*,d_0)\in V\times Y^*\times\mathbb{R}_+$ such that $(v_0,y_0^*,d_0)$ is a solution of (RDL) and $\sup\mathrm{(RDL)}=\inf\mathrm{(RP)}$.

    Proof. Let $l$ be the constant function $l(x)=\inf_{x'\in X}p(x')=\inf\mathrm{(RP)}$ for all $x\in X$. From the assumption $\mathrm{epi}\,p^w=\Lambda$, applying Theorem 3.2 with $x^*=0_{X^*}$ and $c=0$, there exists $(v_0,y_0^*,d_0)\in V\times Y^*\times\mathbb{R}_+$ such that

    $$ F_{v_0}^w(0,0,y_0^*,d_0)\le l^w(0,0)=-\inf\mathrm{(RP)}, $$

    which is equivalent to

    $$ -F_{v_0}^w(0,0,y_0^*,d_0)\ge\inf\mathrm{(RP)}. $$

    Considering the weak duality of Theorem 3.1, we have

    $$ \inf\mathrm{(RP)}\ge\sup\mathrm{(RDL)}\ge -F_{v_0}^w(0,0,y_0^*,d_0)\ge\inf\mathrm{(RP)}; $$

    this yields that $(v_0,y_0^*,d_0)$ is a solution of (RDL) and $\sup\mathrm{(RDL)}=\inf\mathrm{(RP)}$.

    Theorem 3.4. Assume that $\mathrm{dom}\,p\ne\emptyset$ and $p^{ww}=\sup_{v\in V}F_v^{ww}(\cdot,0_Y)$. Then the following statements are equivalent:

    (f1) $\Lambda=\overline{\mathrm{co}}\,\Lambda$;

    (b1) For any $l\in\Gamma_H(X)$, the following assertions are equivalent:

    (b11) $p(x)\ge l(x)$ for all $x\in X$;

    (b12) for all $(x^*,c)\in X^*\times\mathbb{R}_+$, there exists $(v,y^*,d)\in V\times Y^*\times\mathbb{R}_+$ such that $F_v^w(x^*,c,y^*,d)\le l^w(x^*,c)$.

    Proof. Since $p^{ww}=\sup_{v\in V}F_v^{ww}(\cdot,0_Y)$, Lemma 2.3 and (f1) give $\mathrm{epi}\,p^w=\overline{\mathrm{co}}\,\Lambda=\Lambda$. The conclusion follows from Theorem 3.2.

    Remark 3.2. Theorem 3.4 gives an equivalent condition for the set $\Lambda$ to be a closed convex set.

    Next we give a sufficient condition for $p^{ww}=\sup_{v\in V}F_v^{ww}(\cdot,0_Y)$.

    Proposition 3.1. Assume that for any $(x^*,c)\in X^*\times\mathbb{R}_+$ there exists $\bar v\in V$ such that $\partial^w F_{\bar v}(\cdot,0_Y)\ne\emptyset$ on $S_{\bar v}$ and $\sup_{x\in S}\{\langle x^*,x\rangle-c\sigma(x)-f(x)\}=\sup_{x\in S_{\bar v}}\{\langle x^*,x\rangle-c\sigma(x)-f(x)\}$; then $p^{ww}=\sup_{v\in V}F_v^{ww}(\cdot,0_Y)$.

    Proof. Since for any $(x^*,c)\in X^*\times\mathbb{R}_+$ there exists $\bar v\in V$ such that $\partial^w F_{\bar v}(\cdot,0_Y)\ne\emptyset$, there exists $(y^*,d)\in Y^*\times\mathbb{R}_+$ such that $(y^*,d)\in\partial^w F_{\bar v}(\cdot,0_Y)$, which implies

    $$ F_{\bar v}(x,0)\le F_{\bar v}(x,y)-\langle y^*,y\rangle+d\sigma(y),\qquad \forall x\in S_{\bar v},\ \forall y\in Y, $$

    that is,

    $$ \langle y^*,y\rangle-d\sigma(y)-F_{\bar v}(x,y)\le -F_{\bar v}(x,0),\qquad \forall x\in S_{\bar v},\ \forall y\in Y. \qquad (3.1) $$

    From (3.1), we obtain

    $$ \begin{aligned} F_{\bar v}^w(x^*,c,y^*,d)&=\sup_{(x,y)\in X\times Y}\{\langle x^*,x\rangle-c\sigma(x)+\langle y^*,y\rangle-d\sigma(y)-F_{\bar v}(x,y)\}\\ &\le\sup_{x\in X}\{\langle x^*,x\rangle-c\sigma(x)-F_{\bar v}(x,0)\}=\sup_{x\in S_{\bar v}}\{\langle x^*,x\rangle-c\sigma(x)-f(x)\}. \end{aligned} \qquad (3.2) $$

    Since $\sup_{x\in S}\{\langle x^*,x\rangle-c\sigma(x)-f(x)\}=\sup_{x\in S_{\bar v}}\{\langle x^*,x\rangle-c\sigma(x)-f(x)\}$, together with (3.2) we get

    $$ F_{\bar v}^w(x^*,c,y^*,d)\le\sup_{x\in S}\{\langle x^*,x\rangle-c\sigma(x)-f(x)\}=\sup_{x\in X}\{\langle x^*,x\rangle-c\sigma(x)-p(x)\}=p^w(x^*,c). \qquad (3.3) $$

    The first equality above follows from Remark 2.4. Taking (3.3) and the definition of $p^{ww}(x)$ into account, one has

    $$ \begin{aligned} p^{ww}(x)&=\sup_{(x^*,c)\in X^*\times\mathbb{R}_+}\{\langle x^*,x\rangle-c\sigma(x)-p^w(x^*,c)\}\\ &\le\sup_{(x^*,c)\in X^*\times\mathbb{R}_+}\{\langle x^*,x\rangle-c\sigma(x)-F_{\bar v}^w(x^*,c,y^*,d)\}\\ &\le\sup_{v\in V}\ \sup_{(x^*,c,y^*,d)\in X^*\times\mathbb{R}_+\times Y^*\times\mathbb{R}_+}\{\langle x^*,x\rangle-c\sigma(x)-F_v^w(x^*,c,y^*,d)\}\\ &=\sup_{v\in V}F_v^{ww}(x,0_Y)\le p^{ww}(x), \end{aligned} $$

    where the last inequality holds since $\sup_{v\in V}F_v^{ww}(\cdot,0_Y)=q^w$ and $p^w\le q$ by Lemma 2.2. This implies $p^{ww}=\sup_{v\in V}F_v^{ww}(\cdot,0_Y)$. The proof is complete.

    Remark 3.3. We note that $\partial^w F_{\bar v}(\cdot,0_Y)\ne\emptyset$ on $S_{\bar v}$ is a mild assumption. In fact, we only need the augmenting function $\sigma(y)$ to satisfy $\sigma(y)\ge\|y\|$ for all $y\in Y$ and to take $(y^*,d)\in Y^*\times\mathbb{R}_+$ such that $\|y^*\|\le d$; then (3.1) holds by the definition of $F_{\bar v}(x,0)$ and $F_{\bar v}(x,y)$ (see (2.2)), so we can conclude that $(y^*,d)\in\partial^w F_{\bar v}(\cdot,0_Y)$, which implies $\partial^w F_{\bar v}(\cdot,0_Y)\ne\emptyset$.

    Remark 3.4. In the proof of the strong duality Theorem 3.3, our sufficient condition $\mathrm{epi}\,p^w=\Lambda$ is different from existing conditions. In Duality Theorem 11.59 of [26], the dualizing parameterization $\phi(x,y)$ is assumed to be level-bounded in $x$ locally uniformly in $y$; in the duality theorems of [1,32], the perturbation function $h(y)=\inf_x\phi(x,y)$ is assumed to be proper and weakly subdifferentiable at the origin $0_Y$. All these conditions are formulated in terms of the dualizing parameterization or perturbation function associated with the given problem.

    Recall the assumption $\mathrm{epi}\,p^w=\Lambda$ in Theorem 3.3, which employs the epigraph of the function $F_v^w$ defined by (2.3); it is also related to the dualizing parameterization $F_v(x,y)$, and the condition $\mathrm{epi}\,p^w=\overline{\mathrm{co}}\,\Lambda$ is easy to satisfy in view of Lemma 2.3, Proposition 3.1 and Remark 3.3. Moreover, Theorem 3.4 gives an equivalent condition for the set $\Lambda$ to be a closed convex set, so it is worth further exploration to find sufficient conditions ensuring that $\Lambda$ is not only a closed convex set but also related only to the objective function and the constraint function.

    The Fenchel perturbation function of (UP) is $F:V\times X\times X\to\bar{\mathbb{R}}$, defined for each $v\in V$ as

    $$ F_v(x,u)=\begin{cases} f(x+u), & g(x,v)\in -D,\ x\in Q,\\ +\infty, & \text{otherwise}. \end{cases} $$

    The weak conjugate function of $F_v$ is $F_v^w:X^*\times\mathbb{R}_+\times X^*\times\mathbb{R}_+\to\bar{\mathbb{R}}$, defined as

    $$ \begin{aligned} F_v^w(x^*,c,u^*,d)&=\sup_{(x,u)\in X\times X}\{\langle x^*,x\rangle-c\sigma(x)+\langle u^*,u\rangle-d\sigma(u)-F_v(x,u)\}\\ &=\sup_{x\in S_v}\ \sup_{u\in X}\{\langle x^*,x\rangle-c\sigma(x)+\langle u^*,u\rangle-d\sigma(u)-f(x+u)\}\\ &=\sup_{x\in S_v}\ \sup_{\gamma\in X}\{\langle x^*,x\rangle-c\sigma(x)+\langle u^*,\gamma-x\rangle-d\sigma(\gamma-x)-f(\gamma)\}, \end{aligned} $$

    for $x^*,u^*\in X^*$ and $c,d\in\mathbb{R}_+$, where $\gamma=x+u$. By choosing $x^*=0_{X^*}$, $c=0$, we have

    $$ F_v^w(0,0,u^*,d)=\sup_{x\in S_v}\ \sup_{\gamma\in X}\{\langle u^*,\gamma-x\rangle-d\sigma(\gamma-x)-f(\gamma)\}. $$

    Hence, the robust weak Fenchel dual problem (RDwF) with respect to $F_v$ is defined as

    $$ \mathrm{(RDwF)}\qquad \sup_{(u^*,d)\in X^*\times\mathbb{R}_+}\ \sup_{v\in V}\{-F_v^w(0,0,u^*,d)\}=\sup_{(u^*,d)\in X^*\times\mathbb{R}_+}\ \sup_{v\in V}\ \inf_{x\in S_v}\ \inf_{\gamma\in X}\{f(\gamma)-\langle u^*,\gamma-x\rangle+d\sigma(\gamma-x)\}. $$

    The supremum of problem (RDwF) is denoted by sup(RDwF), and any element $(v,u^*,d)\in V\times X^*\times\mathbb{R}_+$ such that $-F_v^w(0,0,u^*,d)=\sup\mathrm{(RDwF)}$ is called a solution of (RDwF).

    Remark 3.5. $\sup\mathrm{(RDwF)}\le\inf\mathrm{(RP)}$ follows immediately from the definition of $F_v^w(0,0,u^*,d)$.

    Let $\Pi_1:(x^*,c,u^*,d,r)\in X^*\times\mathbb{R}_+\times X^*\times\mathbb{R}_+\times\mathbb{R}\mapsto(x^*,c,r)\in X^*\times\mathbb{R}_+\times\mathbb{R}$ be the projection, and let

    $$ \Lambda_1=\bigcup_{v\in V}\Pi_1(\mathrm{epi}\,F_v^w). $$

    Theorem 3.5. (Strong duality) Let $\mathrm{epi}\,p^w=\Lambda_1$. Then there exists $(v_0,u_0^*,d_0)\in V\times X^*\times\mathbb{R}_+$ such that $(v_0,u_0^*,d_0)$ is a solution of (RDwF) and $\sup\mathrm{(RDwF)}=\inf\mathrm{(RP)}$.

    Proof. The proof is similar to that of Theorem 3.3.

    The Fenchel-Lagrange perturbation function of (UP) is $F:V\times X\times X\times Y\to\bar{\mathbb{R}}$, defined for each $v\in V$ as

    $$ F_v(x,u,y)=\begin{cases} f(x+u), & g(x,v)+y\in -D,\ x\in Q,\\ +\infty, & \text{otherwise}. \end{cases} $$

    The weak conjugate function of $F_v(x,u,y)$ is $F_v^w:X^*\times\mathbb{R}_+\times X^*\times\mathbb{R}_+\times Y^*\times\mathbb{R}_+\to\bar{\mathbb{R}}$:

    $$ \begin{aligned} F_v^w(x^*,c,u^*,d,y^*,e)&=\sup_{(x,u,y)\in X\times X\times Y}\{\langle x^*,x\rangle-c\sigma(x)+\langle u^*,u\rangle-d\sigma(u)+\langle y^*,y\rangle-e\sigma(y)-F_v(x,u,y)\}\\ &=\sup_{x\in Q,\,u\in X}\ \sup_{y\in -D-g(x,v)}\{\langle x^*,x\rangle-c\sigma(x)+\langle u^*,u\rangle-d\sigma(u)+\langle y^*,y\rangle-e\sigma(y)-f(x+u)\}\\ &=\sup_{x\in Q,\,\gamma\in X}\ \sup_{y\in -D-g(x,v)}\{\langle x^*,x\rangle-c\sigma(x)+\langle u^*,\gamma-x\rangle-d\sigma(\gamma-x)+\langle y^*,y\rangle-e\sigma(y)-f(\gamma)\}, \end{aligned} $$

    where $\gamma=x+u$. By choosing $x^*=0_{X^*}$, $c=0$ and $e=d$, we have

    $$ F_v^w(0,0,u^*,d,y^*,d)=\sup_{x\in Q,\,\gamma\in X}\ \sup_{y\in -D-g(x,v)}\{\langle u^*,\gamma-x\rangle-d\sigma(\gamma-x)+\langle y^*,y\rangle-d\sigma(y)-f(\gamma)\}. $$

    Hence, the robust weak Fenchel-Lagrange dual problem (RDwFL) with respect to $F_v$ is defined as

    $$ \mathrm{(RDwFL)}\qquad \sup_{\substack{(u^*,d)\in X^*\times\mathbb{R}_+\\ y^*\in Y^*}}\ \sup_{v\in V}\{-F_v^w(0,0,u^*,d,y^*,d)\}=\sup_{\substack{(u^*,d)\in X^*\times\mathbb{R}_+\\ y^*\in Y^*}}\ \sup_{v\in V}\ \inf_{x\in Q,\,\gamma\in X}\ \inf_{y\in -D-g(x,v)}\{f(\gamma)-\langle u^*,\gamma-x\rangle-\langle y^*,y\rangle+d\sigma(\gamma-x)+d\sigma(y)\}. $$

    The supremum of problem (RDwFL) is denoted by sup(RDwFL), and any element $(v,u^*,d,y^*,d)\in V\times X^*\times\mathbb{R}_+\times Y^*\times\mathbb{R}_+$ such that $-F_v^w(0,0,u^*,d,y^*,d)=\sup\mathrm{(RDwFL)}$ is called a solution of (RDwFL).

    Theorem 3.6. (Weak duality) $\sup\mathrm{(RDwFL)}\le\inf\mathrm{(RP)}$.

    Proof. For any $(v,u^*,d,y^*,d)\in V\times X^*\times\mathbb{R}_+\times Y^*\times\mathbb{R}_+$,

    $$ \begin{aligned} -F_v^w(0,0,u^*,d,y^*,d)&=\inf_{x\in Q,\,\gamma\in X}\ \inf_{y\in -D-g(x,v)}\{f(\gamma)-\langle u^*,\gamma-x\rangle-\langle y^*,y\rangle+d\sigma(\gamma-x)+d\sigma(y)\}\\ &\le\inf_{x\in Q}\ \inf_{y\in -D-g(x,v)}\{f(x)-\langle y^*,y\rangle+d\sigma(y)\}\le\inf_{x\in S_v}f(x)\le\inf_{x\in S}f(x)=\inf\mathrm{(RP)}, \end{aligned} $$

    so we conclude that $\sup\mathrm{(RDwFL)}\le\inf\mathrm{(RP)}$.

    Let $\Pi_2:(x^*,c,u^*,d,y^*,e,r)\in X^*\times\mathbb{R}_+\times X^*\times\mathbb{R}_+\times Y^*\times\mathbb{R}_+\times\mathbb{R}\mapsto(x^*,c,r)\in X^*\times\mathbb{R}_+\times\mathbb{R}$ be the projection, and let

    $$ \Lambda_2=\bigcup_{v\in V}\Pi_2(\mathrm{epi}\,F_v^w). $$

    Theorem 3.7. (Strong duality) Let $\mathrm{epi}\,p^w=\Lambda_2$. Then there exists $(v_0,u_0^*,d_0,y_0^*,d_0)\in V\times X^*\times\mathbb{R}_+\times Y^*\times\mathbb{R}_+$ that is a solution of (RDwFL), and $\inf\mathrm{(RP)}=\sup\mathrm{(RDwFL)}$.

    Proof. The proof is similar to that of Theorem 3.3.

    In this section, we examine the relations among the optimal objective values of the dual problems (RDL), (RDwF) and (RDwFL).

    Proposition 4.1. The inequality $\sup\mathrm{(RDwFL)}\le\sup\mathrm{(RDL)}$ holds.

    Proof. Let $(v,u^*,y^*,d)$ be an arbitrary element of $V\times X^*\times Y^*\times\mathbb{R}_+$. It holds that

    $$ \begin{aligned} \inf_{x\in Q}\ \inf_{\gamma\in X}\ \inf_{y\in -D-g(x,v)}\{f(\gamma)-\langle u^*,\gamma-x\rangle-\langle y^*,y\rangle+d\sigma(\gamma-x)+d\sigma(y)\}&\le\inf_{x\in Q}\ \inf_{y\in -D-g(x,v)}\{f(x)-\langle y^*,y\rangle+d\sigma(y)\}\\ &\le\sup_{(v,y^*,d)\in V\times Y^*\times\mathbb{R}_+}\ \inf_{x\in Q}\ \inf_{y\in -D-g(x,v)}\{f(x)-\langle y^*,y\rangle+d\sigma(y)\}=\sup\mathrm{(RDL)}. \end{aligned} $$

    As $(v,u^*,y^*,d)\in V\times X^*\times Y^*\times\mathbb{R}_+$ is an arbitrary element, we get

    $$ \sup\mathrm{(RDwFL)}=\sup_{(v,u^*,y^*,d)}\ \inf_{x\in Q}\ \inf_{\gamma\in X}\ \inf_{y\in -D-g(x,v)}\{f(\gamma)-\langle u^*,\gamma-x\rangle-\langle y^*,y\rangle+d\sigma(\gamma-x)+d\sigma(y)\}\le\sup\mathrm{(RDL)}. $$

    Proposition 4.2. The inequality $\sup\mathrm{(RDwFL)}\le\sup\mathrm{(RDwF)}$ holds.

    Proof. Let $(v,u^*,y^*,d)$ be an arbitrary element of $V\times X^*\times Y^*\times\mathbb{R}_+$. It is known that $0_Y\in -D-g(x,v)$ for all $x\in S_v$, so we have

    $$ \begin{aligned} \inf_{x\in Q,\,\gamma\in X}\ \inf_{y\in -D-g(x,v)}\{f(\gamma)-\langle u^*,\gamma-x\rangle-\langle y^*,y\rangle+d\sigma(\gamma-x)+d\sigma(y)\}&\le\inf_{x\in S_v,\,\gamma\in X}\ \inf_{y\in -D-g(x,v)}\{f(\gamma)-\langle u^*,\gamma-x\rangle-\langle y^*,y\rangle+d\sigma(\gamma-x)+d\sigma(y)\}\\ &\le\inf_{x\in S_v,\,\gamma\in X}\{f(\gamma)-\langle u^*,\gamma-x\rangle+d\sigma(\gamma-x)\}\\ &\le\sup_{(v,u^*,d)\in V\times X^*\times\mathbb{R}_+}\ \inf_{x\in S_v,\,\gamma\in X}\{f(\gamma)-\langle u^*,\gamma-x\rangle+d\sigma(\gamma-x)\}=\sup\mathrm{(RDwF)}. \end{aligned} $$

    Hence, taking the supremum of the left-hand side over $(v,u^*,y^*,d)\in V\times X^*\times Y^*\times\mathbb{R}_+$, we get

    $$ \sup\mathrm{(RDwFL)}\le\sup\mathrm{(RDwF)}. $$

    This completes the proof.

    Proposition 4.3. Let $\Lambda_2\cap(\{0_{X^*}\}\times\{0\}\times\mathbb{R}_+)=\mathrm{epi}\,p^w\cap(\{0_{X^*}\}\times\{0\}\times\mathbb{R}_+)$; then $\sup\mathrm{(RDwFL)}=\sup\mathrm{(RDwF)}=\sup\mathrm{(RDL)}=\inf\mathrm{(RP)}$.

    Proof. Under this assumption and considering Theorem 3.7, it is known that $\sup\mathrm{(RDwFL)}=\inf\mathrm{(RP)}$. By Propositions 4.1 and 4.2 together with weak duality, we obtain

    $$ \sup\mathrm{(RDwFL)}=\sup\mathrm{(RDwF)}=\sup\mathrm{(RDL)}=\inf\mathrm{(RP)}. $$

    Now, we present an example of a robust optimization problem which illustrates the relationships between the optimal values of the three proposed dual problems.

    Example 1. Consider the following one-dimensional optimization problem with data uncertainty in the constraint:

    $$ \mathrm{(UP)}\qquad \inf_{x\in Q}\{f(x):\ g(x,v)=vx\le 0\}, $$

    where $f:\mathbb{R}\to\mathbb{R}$ is defined as

    $$ f(x)=\begin{cases} |x|, & x\le 1,\\ -1, & x>1, \end{cases} $$

    for all $x\in\mathbb{R}$ and $Q=\mathbb{R}$; the datum $v\in[-1,1]$ is uncertain.
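    The robust feasible set of this instance can be checked numerically. A sketch (assuming, as in the rest of the paper, $D=\mathbb{R}_+$, so the constraint reads $vx\le 0$ for every $v\in[-1,1]$):

```python
import numpy as np

xs = np.linspace(-2.0, 2.0, 401)
vs = np.linspace(-1.0, 1.0, 21)

# x is robustly feasible iff v*x <= 0 for every v in [-1, 1]; this forces x = 0.
S = [x for x in xs if all(v * x <= 1e-12 for v in vs)]
```

    On the grid only $x=0$ survives, so $S=\{0\}$ and, since $f(0)=0$, the robust primal value is $\inf\mathrm{(RP)}=0$.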

    In this example, we always take the function in (2.1) to be $\sigma(x)=|x|$. Let us calculate the weak conjugate function of the Lagrange perturbation $F_v(x,y)$ at $x^*=0$, $c=0$. If $0<v\le 1$, then $F_v^w(0,0,y^*,d)=+\infty$; if $-1\le v\le 0$,

    $$ F_v^w(0,0,y^*,d)=\begin{cases} +\infty, & |y^*|>d,\\ 1, & |y^*|\le d; \end{cases} $$

    then we have $\sup\mathrm{(RDL)}=-1$.

    Let us calculate the weak conjugate function of the Fenchel perturbation $F_v(x,u)$ at $x^*=0$, $c=0$. If $0<v\le 1$, then $F_v^w(0,0,u^*,d)=+\infty$; if $-1\le v\le 0$,

    $$ F_v^w(0,0,u^*,d)=\begin{cases} +\infty, & |u^*|>d,\\ 1, & |u^*|\le d. \end{cases} $$

    Then we obtain $\sup\mathrm{(RDwF)}=-1$.

    We also get $\sup\mathrm{(RDwFL)}=-\infty$. So we obtain $-\infty=\sup\mathrm{(RDwFL)}<\sup\mathrm{(RDwF)}=\sup\mathrm{(RDL)}=-1$.

    Remark 4.1. Consider Example 1 and let $v\le 0$. From [17, Example 1], we know that the classical conjugate function of the Fenchel perturbation $F_v(x,u)$ at $x^*=0$ is $F_v^*(0,u^*)=+\infty$, so we can conclude that the optimal value of the robust Fenchel dual problem in the classical sense (denoted by sup(RDF)) is

    $$ \sup\mathrm{(RDF)}=\sup_{v\in[-1,0]}\ \sup_{u^*\in X^*}\{-F_v^*(0,u^*)\}=-\infty<\sup\mathrm{(RDwF)}=-1, $$

    which shows that the weak conjugate function is more likely to guarantee a zero duality gap than the classical conjugate function.

    In this section, we give optimality conditions for (RDwF), (RDwFL) and (RP).

    Theorem 5.1. Let $\Lambda_1\cap(\{0_{X^*}\}\times\{0\}\times\mathbb{R}_+)=\mathrm{epi}\,p^w\cap(\{0_{X^*}\}\times\{0\}\times\mathbb{R}_+)$.

    (a) If $\bar x$ is a solution of (RP), then there exists a solution $(v_0,u_0^*,d_0)\in V\times X^*\times\mathbb{R}_+$ of (RDwF) such that

    (i) $f(\bar x)=\inf_{x\in S_{v_0}}\ \inf_{\gamma\in X}\{f(\gamma)-\langle u_0^*,\gamma-x\rangle+d_0\sigma(\gamma-x)\}$;

    (ii) $(u_0^*,d_0)\in\partial^w f(\bar x)$.

    (b) Conversely, if $\bar x\in S$ and $(v_0,u_0^*,d_0)\in V\times X^*\times\mathbb{R}_+$ satisfy conditions (i) and (ii), then $\bar x$ is a solution of (RP) and $(v_0,u_0^*,d_0)$ is a solution of (RDwF).

    Proof. (a) Let $\Lambda_1\cap(\{0_{X^*}\}\times\{0\}\times\mathbb{R}_+)=\mathrm{epi}\,p^w\cap(\{0_{X^*}\}\times\{0\}\times\mathbb{R}_+)$. Theorem 3.5 ensures the existence of an optimal solution $(v_0,u_0^*,d_0)\in V\times X^*\times\mathbb{R}_+$ of (RDwF) with $\sup\mathrm{(RDwF)}=\inf\mathrm{(RP)}$. Since $\bar x$ is a solution of (RP), we get

    $$ f(\bar x)=\inf_{x\in S_{v_0}}\ \inf_{\gamma\in X}\{f(\gamma)-\langle u_0^*,\gamma-x\rangle+d_0\sigma(\gamma-x)\}, $$

    which means condition (i) is satisfied.

    From (i) and $\bar x\in S\subseteq S_{v_0}$ we have

    $$ f(\bar x)=\inf_{x\in S_{v_0}}\ \inf_{\gamma\in X}\{f(\gamma)-\langle u_0^*,\gamma-x\rangle+d_0\sigma(\gamma-x)\}\le f(\gamma)-\langle u_0^*,\gamma-\bar x\rangle+d_0\sigma(\gamma-\bar x),\qquad \forall\gamma\in X, $$

    which implies $(u_0^*,d_0)\in\partial^w f(\bar x)$. Therefore, (ii) is satisfied.

    (b) Let (i) be satisfied. Then

    $$ \inf\mathrm{(RP)}\le f(\bar x)=\inf_{x\in S_{v_0}}\ \inf_{\gamma\in X}\{f(\gamma)-\langle u_0^*,\gamma-x\rangle+d_0\sigma(\gamma-x)\}\le\sup_{(v,u^*,d)\in V\times X^*\times\mathbb{R}_+}\ \inf_{x\in S_v}\ \inf_{\gamma\in X}\{f(\gamma)-\langle u^*,\gamma-x\rangle+d\sigma(\gamma-x)\}=\sup\mathrm{(RDwF)}. $$

    Considering the above formula and weak duality, it is easy to get

    $$ \inf\mathrm{(RP)}=\sup\mathrm{(RDwF)}=f(\bar x)=\inf_{x\in S_{v_0}}\ \inf_{\gamma\in X}\{f(\gamma)-\langle u_0^*,\gamma-x\rangle+d_0\sigma(\gamma-x)\}, $$

    that is, $\bar x$ and $(v_0,u_0^*,d_0)$ are solutions of (RP) and (RDwF), respectively.

    Theorem 5.2. Let $\Lambda_2\cap(\{0_{X^*}\}\times\{0\}\times\mathbb{R}_+)=\mathrm{epi}\,p^w\cap(\{0_{X^*}\}\times\{0\}\times\mathbb{R}_+)$.

    (a) If $\bar x$ is a solution of (RP), then there exists a solution $(v_0,u_0^*,d_0,y_0^*,d_0)\in V\times X^*\times\mathbb{R}_+\times Y^*\times\mathbb{R}_+$ of (RDwFL) such that

    (i) $f(\bar x)=\inf_{x\in Q,\,\gamma\in X}\ \inf_{y\in -D-g(x,v_0)}\{f(\gamma)-\langle u_0^*,\gamma-x\rangle-\langle y_0^*,y\rangle+d_0\sigma(\gamma-x)+d_0\sigma(y)\}$;

    (ii) $(u_0^*,d_0)\in\partial^w f(\bar x)$;

    (iii) $\inf_{y\in -D-g(\bar x,v_0)}\{d_0\sigma(y)-\langle y_0^*,y\rangle\}=0$.

    (b) Conversely, if $\bar x\in S$ and $(v_0,u_0^*,d_0,y_0^*,d_0)\in V\times X^*\times\mathbb{R}_+\times Y^*\times\mathbb{R}_+$ satisfy conditions (i)-(iii), then $\bar x$ is a solution of (RP) and $(v_0,u_0^*,d_0,y_0^*,d_0)$ is a solution of (RDwFL).

    Proof. (a) Let $\Lambda_2\cap(\{0_{X^*}\}\times\{0\}\times\mathbb{R}_+)=\mathrm{epi}\,p^w\cap(\{0_{X^*}\}\times\{0\}\times\mathbb{R}_+)$. Theorem 3.7 guarantees the existence of an optimal solution $(v_0,u_0^*,d_0,y_0^*,d_0)\in V\times X^*\times\mathbb{R}_+\times Y^*\times\mathbb{R}_+$ of (RDwFL) with $\sup\mathrm{(RDwFL)}=\inf\mathrm{(RP)}$. Since $\bar x$ is a solution of (RP), then

    $$ f(\bar x)=\inf_{x\in Q,\,\gamma\in X}\ \inf_{y\in -D-g(x,v_0)}\{f(\gamma)-\langle u_0^*,\gamma-x\rangle-\langle y_0^*,y\rangle+d_0\sigma(\gamma-x)+d_0\sigma(y)\}. $$

    Therefore, (i) is satisfied.

    As $x\in S_{v_0}$ implies $0_Y\in -D-g(x,v_0)$, we have $\inf_{y\in -D-g(x,v_0)}\{d_0\sigma(y)-\langle y_0^*,y\rangle\}\le 0$ for all $x\in S_{v_0}$, which means

    $$ f(\bar x)=\inf_{x\in Q,\,\gamma\in X}\ \inf_{y\in -D-g(x,v_0)}\{f(\gamma)-\langle u_0^*,\gamma-x\rangle-\langle y_0^*,y\rangle+d_0\sigma(\gamma-x)+d_0\sigma(y)\}\le f(\gamma)-\langle u_0^*,\gamma-\bar x\rangle+d_0\sigma(\gamma-\bar x),\qquad \forall\gamma\in X; $$

    this implies $(u_0^*,d_0)\in\partial^w f(\bar x)$, that is, (ii) holds.

    Considering (i), we also have

    $$ f(\bar x)=\inf_{x\in Q,\,\gamma\in X}\ \inf_{y\in -D-g(x,v_0)}\{f(\gamma)-\langle u_0^*,\gamma-x\rangle-\langle y_0^*,y\rangle+d_0\sigma(\gamma-x)+d_0\sigma(y)\}\le f(\bar x)+\inf_{y\in -D-g(\bar x,v_0)}\{d_0\sigma(y)-\langle y_0^*,y\rangle\}, $$

    which implies $\inf_{y\in -D-g(\bar x,v_0)}\{d_0\sigma(y)-\langle y_0^*,y\rangle\}\ge 0$. It is obvious that $\inf_{y\in -D-g(\bar x,v_0)}\{d_0\sigma(y)-\langle y_0^*,y\rangle\}\le 0$ for $\bar x\in S_{v_0}$, so we get $\inf_{y\in -D-g(\bar x,v_0)}\{d_0\sigma(y)-\langle y_0^*,y\rangle\}=0$, and relation (iii) is proved.

    The proof of (b) is similar to that of Theorem 5.1 (b).

    This paper deals with the robust strong duality for nonconvex optimization problem with the data uncertainty in constraint. We introduce a weak conjugate function and construct three kinds of robust dual problems for primal problem by employing this weak conjugate function, then we establish the robust strong duality between noncovex uncertain optimization problem and its robust dual problems. In particular, we note that the optimal value of robust conjugate dual problem in classical sense is less than that in weak conjugate sense (see Remark 4.1), which shows that weak conjugate function is more likely to guarantee zero dual gaps than classical conjugate function.

This research was supported by the Education Department of Shaanxi Province (17JK0330). The author is deeply grateful to the referees for their valuable comments and helpful suggestions.

    The author declares no conflict of interest.



© 2021 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)