Review

Involvement of CXCL12 Pathway in HPV-related Diseases

  • Received: 17 October 2016 Accepted: 22 December 2016 Published: 23 December 2016
  • Human Papillomavirus (HPV) is a necessary cause of cervical cancer in women worldwide. However, HPV infection alone is not sufficient to cause neoplasia, and immune mediators, such as chemokines, are important in this context, since they are involved in the regulation of leukocyte trafficking in many essential biological processes, including inflammation. Prolonged inflammation is thought to facilitate carcinogenesis by providing a microenvironment that is ideal for tumor cell development and growth. Chemokines also contribute to tumor development by promoting angiogenesis and metastasis. Among these molecules we highlight the chemokine CXCL12, also called stromal-derived factor 1 alpha (SDF1-α), a pleiotropic chemokine capable of eliciting multiple signal transduction cascades and functions via interaction with either CXCR4 or CXCR7, which have been implicated in malignant cell survival, proliferation and migration. This review focuses on our current knowledge of the pathogenesis of HPV infection, the main aspects of CXCL12 signaling, its participation in tumor development, and immunodeficiencies that may enable HPV infection. We also discuss how CXCL12 gene expression and polymorphisms may influence tumor development, especially cervical cancer. Finally, we highlight how inhibition of the CXCL12 pathway may be an attractive alternative for cancer therapeutics.

    Citation: Nádia C. M. Okuyama, Fernando Cezar dos Santos, Kleber Paiva Trugilo, Karen Brajão de Oliveira. Involvement of CXCL12 Pathway in HPV-related Diseases[J]. AIMS Medical Science, 2016, 3(4): 417-440. doi: 10.3934/medsci.2016.4.417



    This manuscript has been authored by UT-Battelle, LLC under Contract No. DE-AC05-00OR22725 with the U.S. Department of Energy. The United States Government retains and the publisher, by accepting the article for publication, acknowledges that the United States Government retains a non-exclusive, paid-up, irrevocable, world-wide license to publish or reproduce the published form of this manuscript, or allow others to do so, for United States Government purposes. The Department of Energy will provide public access to these results of federally sponsored research in accordance with the DOE Public Access Plan (http://energy.gov/downloads/doe-public-access-plan).


    1. Setup

    Assume we have a set of labelled data points, or training data pairs, $D = \{x^{(i)}, y^{(i)}\}_{i=1}^{N}$, where $x^{(i)} \in \mathbb{R}^K$ and $y^{(i)} \in \mathbb{R}_+$.

    We postulate, amongst a family of ansatz $f(\cdot;\theta)$ parameterized by $\theta \in \mathbb{R}^P$, that if we can find a $\theta$ such that for all $i = 1, \dots, N$,

    $$y^{(i)} \approx f(x^{(i)};\theta),$$

    then for all $x$ within the convex hull of the set $\{x^{(i)}\}_{i=1}^{N}$, and ideally in a high-probability region of the training data distribution, we will have

    $$y \approx f(x;\theta),$$

    where $y$ is the true output corresponding to input $x$. In particular, the hope is that if we derive $f$ based on physical theory, then this will in fact hold even for $x$ different from the set $\{x^{(i)}\}_{i=1}^{N}$.

    Here we consider parametric approximations, given by generalized linear models (GLM), in particular polynomial models, as well as nonlinear models (NM) given by neural networks (NN).

    We will consider the problem of finding the $\theta$ which minimizes the data misfit

    $$\Phi(\theta) = \sum_{i=1}^{N} \ell\!\left(y^{(i)}, f(x^{(i)};\theta)\right),$$

    for some distance function $\ell$. Here we use the standard choice $\ell(y, y') := |y - y'|^2 = \|y - y'\|_2^2$. Up to an additive constant, it makes sense to interpret this as the negative log-likelihood of the data given the parameter, viewed as a function of the parameter.


    2. Physical setup and theory-based model

    In recent work [9], a theory-based scaling of the thermal energy confinement time has been derived based on a comprehensive turbulent transport model, the trapped gyro-Landau fluid (TGLF) model [13], in the core, coupled to the edge pedestal (EPED) model [12], especially in burning plasma conditions with dominant fusion alpha particle heating for future reactor design. The simulation dataset consists of a massive number of predictive Integrated Plasma Simulator (IPS) FASTRAN [8] simulations, self-consistent with core transport, edge pedestal, fusion alpha particle heating, and MHD equilibrium, built upon a modern integrated modeling framework, IPS.

    The $K = 9$ input features are $x_1 = R$ = major radius, $x_2 = a$ = minor radius, $x_3 = \kappa = V/(2\pi^2 a^2 R)$, where $V$ = plasma volume, $x_4 = \delta$ = triangularity, $x_5 = I_p$ = plasma current, $x_6 = B_0$ = toroidal magnetic field, $x_7 = \bar{n}_e$ = line average density, $x_8 = Z_{\mathrm{eff}}$ = effective charge, $x_9 = P_{\mathrm{loss}}$ = loss power. The output $y = \tau$ is the thermal energy confinement time.

    For input features $x^{(i)} \in \mathbb{R}^K$, $i = 1, \dots, N$, the theoretical relationship with the output thermal energy confinement time $y^{(i)} \in \mathbb{R}_+$ is given by the following ansatz

    $$f_{\mathrm{th}}(x;\theta) = \theta_0 \prod_{k=1}^{K} x_k^{\theta_k} \qquad (2.1)$$

    or, on a logarithmic scale (redefining $\log\theta_0 \to \theta_0$),

    $$\log f_{\mathrm{th}}(x;\theta) = \theta_0 + \sum_{k=1}^{K} \theta_k \log(x_k) \qquad (2.2)$$
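    As a concrete illustration of the ansatz, the following NumPy sketch evaluates (2.1) and verifies its equivalence with the log-scale form (2.2). The exponent values and inputs are made up for the example, and the function names are ours, not from the paper.

```python
import numpy as np

def f_th(x, theta):
    """Theory-based power-law ansatz (2.1): theta[0] * prod_k x_k ** theta[k]."""
    x = np.atleast_2d(x)                                   # shape (n_samples, K)
    return theta[0] * np.prod(x ** theta[1:], axis=1)

def log_f_th(x, theta_log):
    """Log-scale form (2.2): theta_log[0] + sum_k theta_log[k] * log(x_k)."""
    x = np.atleast_2d(x)
    return theta_log[0] + np.log(x) @ theta_log[1:]

rng = np.random.default_rng(0)
theta = np.concatenate(([0.05], rng.uniform(-1, 1, size=9)))   # [theta_0, theta_1, ..., theta_9]
x = rng.uniform(0.5, 2.0, size=(4, 9))                          # 4 synthetic input points
tau = f_th(x, theta)

# Consistency check between (2.1) and (2.2), with theta_0 redefined as log(theta_0):
theta_log = np.concatenate(([np.log(theta[0])], theta[1:]))
assert np.allclose(np.log(tau), log_f_th(x, theta_log))
```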

    3. Parametric regression

    Parametric models afford the luxury of discarding the data set after training, and reduce subsequent evaluations of the model to a cost relating to the number of parameters only [1,7,10]. However, these models are much more rigid than the non-parametric ones, by virtue of restricting to a particular parametric family.


    3.1. General linear models

    The simplest type of parametrization is in terms of coefficients of expansion in some basis $\{\psi_i\}_{i=0}^{P-1}$. Here we choose appropriately ordered monomials with total degree $p$; then the number of parameters is $P = (K+p)!/(K!\,p!)$. The polynomials $\{\psi_i\}$ are products of monomials in the individual variables, i.e. $\psi_i(x) = \prod_{k=1}^{K} \psi_{\alpha_k(i)}(x_k)$, where $\psi_0(x) = 1$, each index $i \in \mathbb{Z}_+$ is associated to a unique multi-index $\alpha(i) = (\alpha_1(i), \dots, \alpha_K(i)) \in \mathbb{Z}_+^K$, and the single variable monomials are defined as $\psi_{\alpha_k(i)}(z) = z^{\alpha_k(i)}$ for $z \in \mathbb{R}$. In particular, for $i = 1, \dots, K$, $\alpha_i(i) = 1$ and $\alpha_j(i) = 0$ for $j \neq i$, i.e. $\psi_i(x) = x_i$. The polynomials are constructed with sglib (https://github.com/ezander/sglib) and the computations are done with Matlab.
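    As an illustration of the total-degree construction, the sketch below enumerates the multi-indices with $|\alpha(i)| \le p$ and assembles the design matrix $\Psi(X)$. It is a brute-force Python stand-in for the sglib/Matlab machinery actually used; the enumeration over $(p+1)^K$ tuples is exponential in $K$, so it is only practical here for small $p$.

```python
import numpy as np
from itertools import product
from math import comb

def total_degree_multi_indices(K, p):
    """All multi-indices alpha in Z_+^K with |alpha| <= p (constant term first).

    Brute-force enumeration, for illustration only; the count equals
    P = (K + p)! / (K! p!)."""
    alphas = [a for a in product(range(p + 1), repeat=K) if sum(a) <= p]
    alphas.sort(key=lambda a: (sum(a), a))      # order by total degree, then lexicographically
    assert len(alphas) == comb(K + p, p)
    return np.array(alphas)

def design_matrix(X, alphas):
    """Psi(X)_{ij} = psi_j(x^(i)) = prod_k (x^(i)_k) ** alpha_k(j)."""
    # X: (N, K), alphas: (P, K)  ->  Psi: (N, P)
    return np.prod(X[:, None, :] ** alphas[None, :, :], axis=2)

# e.g. K = 9 inputs and total degree p = 2 give P = 11! / (9! 2!) = 55 basis functions
alphas = total_degree_multi_indices(K=9, p=2)
rng = np.random.default_rng(0)
Psi = design_matrix(rng.uniform(0.5, 2.0, size=(100, 9)), alphas)
print(alphas.shape, Psi.shape)                  # (55, 9) (100, 55)
```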

    Assume that $f(\cdot;\theta) = \sum_{i=0}^{P-1} \theta_i \psi_i(\cdot)$. Under the assumption that the data is corrupted by additive Gaussian white noise, the negative log-likelihood (i.e. data misfit) takes the form

    $$\Phi(\theta) = \sum_{i=1}^{N} \left| f(x^{(i)};\theta) - y^{(i)} \right|^2 \qquad (3.1)$$

    Let $X = [x^{(1)}, \dots, x^{(N)}]^T$ and $Y = [y^{(1)}, \dots, y^{(N)}]^T$, and let $\Psi(X)_{ij} = \psi_j(x^{(i)})$. Then

    $$\Phi(\theta) = \left| \Psi(X)\theta - Y \right|^2$$

    and the maximum likelihood solution $\theta \in \mathbb{R}^P$ is given by

    $$\theta = \left( \Psi(X)^T \Psi(X) \right)^{-1} \Psi(X)^T Y.$$

    Predictions of outputs $Y$ for some inputs $X$ are later given by

    $$Y = \Psi(X)\theta.$$
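    The following is a minimal, self-contained sketch of this least-squares fit and the subsequent prediction, on synthetic data with the simplest basis ($p = 1$: a constant plus the $K$ linear monomials). It is illustrative only (the original computations were done in Matlab with sglib); `np.linalg.lstsq` solves the same problem as the explicit normal-equations formula above, just more stably.

```python
import numpy as np

rng = np.random.default_rng(1)
N, K = 200, 9
X = rng.uniform(0.5, 2.0, size=(N, K))
theta_true = rng.normal(size=K + 1)
Y = theta_true[0] + X @ theta_true[1:] + 0.01 * rng.normal(size=N)   # noisy linear data

# Design matrix for total degree p = 1: a constant column plus the K linear monomials
Psi = np.hstack([np.ones((N, 1)), X])                                 # shape (N, P), P = K + 1

# Least-squares (maximum likelihood) solution; equivalent to (Psi^T Psi)^{-1} Psi^T Y
theta_hat, *_ = np.linalg.lstsq(Psi, Y, rcond=None)

# Predictions for new inputs
X_new = rng.uniform(0.5, 2.0, size=(5, K))
Y_pred = np.hstack([np.ones((5, 1)), X_new]) @ theta_hat
print(np.max(np.abs(theta_hat - theta_true)))                         # small
```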

    We will also consider log-polynomial regression, where the monomials are composed with $\log$, i.e. the basis functions become $\psi_i \circ \log$, where $\log$ acts component-wise. We then define $\log f(\cdot;\theta) = \sum_{i=0}^{P-1} \theta_i \psi_i(\log(\cdot))$ and the misfit is

    $$\Phi(\theta) = \sum_{i=1}^{N} \left| \log f(x^{(i)};\theta) - \log y^{(i)} \right|^2. \qquad (3.2)$$

    This is the negative log-likelihood under a multiplicative lognormal noise assumption, as opposed to the additive Gaussian noise assumption of (3.1). To understand the motivation for this, observe that (2.2) is a log-linear model corresponding to the logarithm of (2.1). Therefore, this setup includes the theoretical model prediction when $p = 1$, and provides a way to systematically improve upon the theoretical model by increasing $p$.
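    As an illustration of the log-polynomial fit, the sketch below applies the same least-squares machinery to log-transformed inputs and outputs with $p = 1$, which is exactly the theory-based model (2.2). The data are synthetic, generated from a power law with multiplicative lognormal noise; all numerical values are made up for the example.

```python
import numpy as np

rng = np.random.default_rng(2)
N, K = 500, 9
X = rng.uniform(0.5, 2.0, size=(N, K))
theta_true = np.concatenate(([0.1], rng.uniform(-1, 1, size=K)))
# Synthetic confinement-time-like data: a power law with multiplicative lognormal noise
Y = theta_true[0] * np.prod(X ** theta_true[1:], axis=1) * np.exp(0.02 * rng.normal(size=N))

# Log-polynomial regression with p = 1: ordinary least squares in (log x, log y),
# i.e. exactly the theory-based log-linear model (2.2)
Psi_log = np.hstack([np.ones((N, 1)), np.log(X)])
theta_hat, *_ = np.linalg.lstsq(Psi_log, np.log(Y), rcond=None)

print(np.exp(theta_hat[0]), theta_true[0])               # recovered prefactor vs. the true theta_0
print(np.max(np.abs(theta_hat[1:] - theta_true[1:])))    # exponent error (small)
```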

    As a final note, as $p$ increases, there is a chance of including more irrelevant parameters, in addition to ones which may be relevant. Therefore, we must regularize somehow. For the GLM, it is natural to adopt a standard $L^2$ Tikhonov regularization term $\lambda|\theta|^2$, resulting in the following modified objective function, for $\lambda > 0$,

    $$\bar{\Phi}(\theta) = \Phi(\theta) + \lambda|\theta|^2 \qquad (3.3)$$

    which again can be solved exactly:

    $$\theta = \left( \Psi(X)^T \Psi(X) + \lambda I \right)^{-1} \Psi(X)^T Y,$$

    where $I$ is the $P \times P$ identity matrix. One can readily observe that this provides an upper bound on the norm of the pseudo-inverse operator mapping the observations to the optimal parameter (in other words, $\Psi(X)^T \Psi(X) + \lambda I$ has a spectrum lower bounded by $\lambda$).
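    A short sketch of the regularized solve follows; the design matrix here is random and purely illustrative, and $\lambda = 50$ is simply one of the values quoted later for the GLM fits.

```python
import numpy as np

def ridge_solution(Psi, Y, lam):
    """Tikhonov-regularized solution of (3.3): (Psi^T Psi + lam I)^{-1} Psi^T Y."""
    P = Psi.shape[1]
    A = Psi.T @ Psi + lam * np.eye(P)
    # The spectrum of A is bounded below by lam, so this solve stays well conditioned
    # even when Psi^T Psi is (nearly) singular.
    return np.linalg.solve(A, Psi.T @ Y)

# Usage sketch with an arbitrary design matrix
rng = np.random.default_rng(3)
Psi = rng.normal(size=(100, 20))
Y = rng.normal(size=100)
theta_ridge = ridge_solution(Psi, Y, lam=50.0)
print(theta_ridge.shape)   # (20,)
```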


    3.2. Nonlinear models

    In some very particular cases, it may make sense to invoke nonlinear models, at the significant additional expense of solving a nonlinear least squares problem (in the easiest instance) as opposed to the linear least squares solution presented above. The most notable example is deep neural networks. It has been shown that these models perform very well for tasks such as handwritten digit classification, voice and object recognition [3], and there has recently even been mathematical justification that substantiates this [5]. However, [6] show that one can sometimes find very small perturbations on input values which lead to misclassification. The methods can also be used for regression problems, and will be considered below.


    3.2.1. Deep neural networks

    Let $\theta_i \in \mathbb{R}^{P_i}$ (with $P = \sum_{i=1}^{L} P_i$, where $L$ is the number of layers) be parameters and $g_i$ be some possibly nonlinear function, corresponding to layer $i$, and let

    $$f(\cdot;\theta) = g_L(\cdots g_2(g_1(\cdot;\theta_1);\theta_2);\cdots;\theta_L).$$

    As an example, $\theta_i = (A_i, b_i)$ is typically split into parameters $A_i \in \mathbb{R}^{n_i \times n_{i-1}}$ and $b_i \in \mathbb{R}^{n_i}$, defining an affine transformation $A_i x + b_i$, and $g_i(x;\theta_i) = \sigma_i(A_i x + b_i)$, where $\sigma_i$ is a simple nonlinear "activation function" which is applied element-wise. In this case $n_i$ is the number of auxiliary variables, or "neurons", on level $i$, $n_0 = K$ is the input dimension, $n_L = 1$ is the output dimension, and there are $L$ levels. For regression one typically takes $\sigma_L = \mathrm{Id}$, the identity map, as will be done here.

    For the neural networks used here, $\sigma_i = \sigma: \mathbb{R} \to \mathbb{R}_+$, for $i < L$, is the rectified linear unit (ReLU) defined by

    $$\sigma(z) = \max\{0, z\} \qquad (3.4)$$

    This activation function contains no trainable parameters, and we reiterate that it is applied element-wise to yield a function $\mathbb{R}^{n_i} \to \mathbb{R}_+^{n_i}$.
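    To make the layered composition concrete, here is a minimal NumPy sketch of the forward map $f(\cdot;\theta)$ with affine layers, ReLU on the hidden layers, the identity on the output layer, and the widths of Table 1 ($9 \to 20 \to 20 \to 1$). This is only an assumed illustration of the architecture described here, not the Mathematica/MXNet implementation used for the actual results.

```python
import numpy as np

def relu(z):
    return np.maximum(0.0, z)                  # sigma(z) = max{0, z}, element-wise

def init_layers(widths, rng):
    """Random affine parameters theta_i = (A_i, b_i), with A_i of shape (n_i, n_{i-1})."""
    return [(rng.normal(scale=0.1, size=(n_out, n_in)), np.zeros(n_out))
            for n_in, n_out in zip(widths[:-1], widths[1:])]

def forward(x, layers):
    """f(x; theta) = g_L( ... g_2(g_1(x; theta_1); theta_2) ... ; theta_L)."""
    h = x
    for i, (A, b) in enumerate(layers):
        h = A @ h + b                          # affine map A_i h + b_i
        if i < len(layers) - 1:                # sigma_L = Id: no activation on the output layer
            h = relu(h)
    return h

rng = np.random.default_rng(0)
layers = init_layers([9, 20, 20, 1], rng)      # widths as in Table 1: 9 -> 20 -> 20 -> 1
y_hat = forward(np.ones(9), layers)            # prediction for a single input, shape (1,)
print(y_hat)
```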

    The misfit again takes the form

    $$\Phi(\theta) = \sum_{i=1}^{N} \left| f(x^{(i)};\theta) - y^{(i)} \right|^2 \qquad (3.5)$$

    except now we have a nonlinear and typically non-convex optimization problem.

    There are many methods that can be employed to try to find the minimum of this objective function, and they will generally be successful in finding some minimum (though perhaps not a global one). Due to the mathematically simple functions that make up the neural network, derivatives with respect to the unknown parameters can be determined analytically. A popular method to solve non-convex optimization problems is stochastic gradient descent, or the Robbins-Monro stochastic approximation algorithm [4,11], which is defined as follows. Let $\widehat{\nabla\Phi}$ be an unbiased estimate of $\nabla\Phi$, i.e. a random variable such that $\mathbb{E}\,\widehat{\nabla\Phi}(\theta) = \nabla\Phi(\theta)$. Then let

    $$\theta^{i+1} = \theta^{i} - \gamma_i \, \widehat{\nabla\Phi}(\theta^{i}) \qquad (3.6)$$

    If $\{\gamma_i\}$ are chosen such that $\sum_i \gamma_i = \infty$ and $\sum_i \gamma_i^2 < \infty$, then the algorithm is guaranteed to converge to a minimizer [4,11], which is global if the function is convex [2]. Noting that we can compute

    $$\nabla\Phi(\theta) = \sum_{i=1}^{N} \nabla_\theta \, \ell\!\left(f(x^{(i)};\theta), y^{(i)}\right)$$

    easily in this case, one could use for example $\widehat{\nabla\Phi}(\theta) = \nabla\Phi(\theta) + \xi$, where $\xi \in \mathbb{R}^P$ is some random variable, for example a Gaussian, with $\mathbb{E}\,\xi = 0$. Something more clever and natural can be done in the particular case of (3.5). Recall that $N$ can be very large. Observe that (3.5) is, up to a multiple of $N$, an equally weighted average of the summands $\ell(f(x^{(i)};\theta), y^{(i)})$. Let $\{n_i\}_{i=1}^{N_{\mathrm{sub}}}$ be a random (or deterministic) subset of $N_{\mathrm{sub}} < N$ indices, where $N_{\mathrm{sub}}$ could be 1. Then the following provides an unbiased estimator which is also much cheaper than the original gradient (since it uses a subset of the data):

    $$\widehat{\nabla\Phi}(\theta) := \frac{N}{N_{\mathrm{sub}}} \sum_{i=1}^{N_{\mathrm{sub}}} \nabla_\theta \, \ell\!\left(f(x^{(n_i)};\theta), y^{(n_i)}\right).$$

    Naturally the variance of this estimator will affect the convergence time of the algorithm and we refer the interested reader to the monograph [4] for discussion of the finer points of this family of algorithms. Neural Network models were implemented and trained using the Mathematica Neural Network functions built on top of the MXNet framework.

    Increasing the number of free parameters, by increasing the depth or width of the network, allows more flexibility and can provide a better approximation to complex input-output relationships. However, as with the GLM, increasing the number of degrees of freedom in the optimization problem also increases the susceptibility of the algorithm to instability from fitting noise in the data too accurately. A Tikhonov regularization could be employed again, but here we appeal to another strategy which is applicable only to DNN. In particular, the training data is randomly divided into $N$ training samples and $N_{\mathrm{val}}$ validation samples. Algorithm (3.6) is run with the training data; however, the loss on the validation data is computed along the way,

    $$L(\theta) = \sum_{i=N+1}^{N+N_{\mathrm{val}}} \left| f(x^{(i)};\theta) - y^{(i)} \right|^2,$$

    and the network with the smallest loss on the validation data is retained at the end. This serves as a regularization, i.e. avoids fitting noise, since the training and validation sets are corrupted by different realizations of noise.
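    The sketch below puts the update (3.6), the minibatch gradient estimator, and the validation-based retention together. To keep it self-contained it uses the log-linear model, for which the per-sample gradient of the squared loss can be written down explicitly; for the deep network the per-sample gradients would instead come from backpropagation (handled internally by MXNet in the present work). The batch size, step-size schedule, iteration count, and synthetic data are illustrative choices, not values taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)
N, N_val, K = 2100, 700, 9
X = rng.uniform(0.5, 2.0, size=(N + N_val, K))
theta_true = np.concatenate(([-2.0], rng.uniform(-1, 1, size=K)))
Psi = np.hstack([np.ones((N + N_val, 1)), np.log(X)])
logY = Psi @ theta_true + 0.05 * rng.normal(size=N + N_val)     # synthetic log outputs

train, val = np.arange(N), np.arange(N, N + N_val)

def grad_estimate(theta, idx):
    """Minibatch gradient estimate: (N / N_sub) * sum_i 2 (psi_i . theta - log y_i) psi_i."""
    r = Psi[idx] @ theta - logY[idx]
    return (N / len(idx)) * 2.0 * (Psi[idx].T @ r)

theta = np.zeros(K + 1)
best_theta, best_val = theta.copy(), np.inf
for step in range(1, 20001):
    batch = rng.choice(train, size=32, replace=False)   # N_sub = 32
    gamma = 1e-4 / step ** 0.6                          # sum gamma_i = inf, sum gamma_i^2 < inf
    theta -= gamma * grad_estimate(theta, batch)
    if step % 200 == 0:                                 # validation loss L(theta) from above
        val_loss = np.sum((Psi[val] @ theta - logY[val]) ** 2)
        if val_loss < best_val:
            best_val, best_theta = val_loss, theta.copy()

print(best_val, np.round(best_theta - theta_true, 3))   # retained parameters vs. truth
```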


    4. Numerical results

    We consider a data set of 2822 data points with $K = 9$ input features and a single output. A subset of $N = 2100$ data points will be used for training, with the remaining $N_{\mathrm{out}}$ data points set aside for testing. General linear and log-linear models with order up to 6 are considered, as well as deep neural networks.


    4.1. GLM

    An empirically chosen regularization of $\lambda = 1$ for $p = 2$, $\lambda = 50$ for $p = 3$, and $\lambda = 100$ for $p > 3$ is used in (3.3). Figure 1 presents the results for the theoretical log-linear model, as well as the best fit based on an expansion in log polynomials of total order 6, for the out-of-sample set of testing data. It is clear already that the enriched model set brings negligible improvement in the reconstruction fidelity. The left subpanels of Figure 4 show some slices over the various covariates, where the other covariates are fixed at a value from the center of the data set. This gives some idea of how the reconstruction behaves, although one should bear in mind that the data set forms a manifold, and not a hypercube, in $\mathbb{R}^9$, and so these slices are bound to include unobserved subsets of the parameter space. Furthermore, there may be sparse coverage even where there is data. This can be observed also in Figure 4, which illustrates the single-component marginal histograms over the data.

    Figure 1. Comparison of true values with GLM prediction, for log linear (left), and log sixth order (right). The right subplots show the error distribution histogram (difference between values of the truth and the prediction). Out-of-sample, test data.
    Figure 2. Relative L2 error and coefficient of determination R2 for out-of-sample test data as a function of model complexity, for log polynomial models (left) and direct polynomial models (right).
    Figure 3. Histograms of the 3 DNN instances and the log-linear model. The right subplots show the error distribution histogram (difference between values of the truth and the prediction).
    Figure 4. Scaling and input data distribution for the log-linear, sixth order, and NN models. The left subplots show a slice through the parameter space of the various models, giving an indication of their behaviour relative to one another. The right subplots show the marginal histograms for the given parameter. One can observe a large deviation between the curves where the data is quite sparse, particularly for the higher order polynomial models.

    Figure 2 shows the relative $L^2$ fidelity and the coefficient of determination $R^2$, defined below, for log-polynomial and polynomial models of increasing order.

    The relative error $L^2_{\mathrm{rel}}(f)$ of the surrogate $f$ is given by

    $$L^2_{\mathrm{rel}}(f) := \left( \frac{\sum_{i=N+1}^{N+N_{\mathrm{out}}} \left| f(x^{(i)};\theta) - y^{(i)} \right|^2}{\sum_{i=N+1}^{N+N_{\mathrm{out}}} \left| y^{(i)} \right|^2} \right)^{1/2}.$$

    The coefficient of determination $R^2(f)$ is given by

    $$R^2(f) := 1 - \frac{\sum_{i=N+1}^{N+N_{\mathrm{out}}} \left| f(x^{(i)};\theta) - y^{(i)} \right|^2}{\sum_{i=N+1}^{N+N_{\mathrm{out}}} \left| y^{(i)} - \bar{y} \right|^2},$$

    where

    $$\bar{y} = \frac{1}{N_{\mathrm{out}}} \sum_{i=N+1}^{N+N_{\mathrm{out}}} y^{(i)}.$$
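    A small sketch of the two fidelity measures follows; here $L^2_{\mathrm{rel}}$ is computed as a ratio of norms (i.e. including the square root), which is the convention consistent with the values reported in Table 2, and the test-set size $722 = 2822 - 2100$ follows from the split described at the start of Section 4.

```python
import numpy as np

def rel_l2_error(y_pred, y_true):
    """Relative L2 error over the test set: ||y_pred - y_true||_2 / ||y_true||_2."""
    return np.sqrt(np.sum((y_pred - y_true) ** 2) / np.sum(y_true ** 2))

def r_squared(y_pred, y_true):
    """Coefficient of determination: 1 - SSE / SST over the test set."""
    sse = np.sum((y_pred - y_true) ** 2)
    sst = np.sum((y_true - np.mean(y_true)) ** 2)
    return 1.0 - sse / sst

# Usage on held-out data: y_true are the N_out = 722 test outputs, y_pred the model predictions
rng = np.random.default_rng(5)
y_true = rng.uniform(0.1, 2.0, size=722)
y_pred = y_true * (1.0 + 0.05 * rng.normal(size=722))   # stand-in predictions
print(rel_l2_error(y_pred, y_true), r_squared(y_pred, y_true))
```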

    Both models begin to saturate at the same level of fidelity, indicating that this is a fundamental limitation of the information in the data. Furthermore, one observes from Figure 2 that the saturation level of the fidelity occurs with only a few percentage points of improvement over the theoretically derived log-linear model.


    4.2. Neural networks

    The neural network models were set up and trained using the built-in neural network functions of Mathematica, which are built around the MXNet machine learning framework. The $N_{\mathrm{val}} = N_{\mathrm{out}}$ samples are used in this case for validation, as described at the end of subsection 3.2.1.

    By connecting various combinations of layers, the neural network model gains the flexibility to model the behavior of the training data. As the size of the network increases, so does the number of unknown parameters in the model. If there is too much flexibility, then the network model begins to conform to the noise in the data. In addition, due to the nature of gradient descent methods, the minimum reached is a local minimum dependent on the starting position. As the number of unknown parameters increases, in particular if it exceeds the amount of training data, one expects the inversion to become unstable, with an increased risk of converging to an undesirable local minimum.

    To reduce the degeneracy of the objective function, the total number of tunable weights and biases in the network will be kept below $N$, although this can be relaxed if a Tikhonov or other type of regularization is employed. Table 1 shows the makeup of the neural network model used here. Initial weights and biases are chosen at random. The networks will be trained multiple times with random initial parameters to explore various local modes. Three examples are shown in Figures 3 and 4. Figure 3 shows four pairs of images, with the left of each pair being the output in comparison to the true value, and the right being a histogram of the difference between the two. The left two pairs and the top right pair all correspond to different instances of the same network optimization, but with different random selections of training data and different initial guesses. The bottom right is the best fit log-linear model (2.2).

    Table 1. Description of the 3 layer DNN.

    Layer Type | Inputs | Width | Weights | Biases | Total Parameters
    Linear     |      9 |    20 |     180 |     20 |              200
    ReLU       |     20 |    20 |       0 |      0 |                0
    Linear     |     20 |    20 |     400 |     20 |              420
    ReLU       |     20 |    20 |       0 |      0 |                0
    Linear     |     20 |     1 |      20 |      1 |               21
    Total      |        |       |         |        |              641
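    As a quick sanity check on Table 1: an affine (Linear) layer mapping $n_{\mathrm{in}}$ inputs to $n_{\mathrm{out}}$ outputs contributes $n_{\mathrm{out}} \cdot n_{\mathrm{in}}$ weights and $n_{\mathrm{out}}$ biases, while the ReLU layers contribute none, so the total of 641 parameters can be reproduced directly (a small sketch):

```python
# Parameter count of the network in Table 1: each Linear layer mapping n_in -> n_out
# has n_out * n_in weights and n_out biases; ReLU layers have no trainable parameters.
widths = [9, 20, 20, 1]
per_layer = [(n_out * n_in, n_out) for n_in, n_out in zip(widths[:-1], widths[1:])]
print(per_layer)                            # [(180, 20), (400, 20), (20, 1)]
print(sum(w + b for w, b in per_layer))     # 641, matching the total in Table 1
```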

    The $L^2_{\mathrm{rel}}$ and $R^2$ fidelity measures are shown in Table 2. What one observes here is that one finds almost no improvement at all over the theoretical log-linear model with this $P = 641$ parameter neural network, while in fact we do manage to improve the accuracy by a few percent with the higher order GLM using $P = 5005$ parameters (see Figure 2). Figure 4 shows 9 pairs of panels, one for each parameter. This indicates that the models perform better where the data is more dense. Also, the higher order polynomial models can behave wildly outside the support of the data; see for example the bottom left 3 panels.

    Table 2. Total loss of all available data.

    Network   | Validation loss | $1 - L^2_{\mathrm{rel}}$ | $R^2$
    3 Layer 1 | 0.0148          | 0.915                    | 0.963
    3 Layer 2 | 0.0146          | 0.917                    | 0.964
    3 Layer 3 | 0.0140          | 0.918                    | 0.965
    Linlog    | 0.0161          | 0.914                    | 0.960

    Finally, Figure 5 shows the pairwise marginal histograms, which gives a slightly better indication of the shape of the data distribution.

    Figure 5. Pairwise marginal histograms of the input data X. This gives some idea of the pairwise correlations between the different features over the input data.

    5. Summary

    In this work we considered fitting a theory-based log-linear ansatz, as well as higher order approximations, for the thermal energy confinement of a Tokamak. It is a supervised machine learning regression problem to identify the map $\mathbb{R}^K \to \mathbb{R}_+$ with $K = 9$ inputs. We parametrized the map using general linear models based on total order polynomials up to degree 6 ($P = 5005$ parameters), as well as deep neural networks with up to $P = 641$ parameters. The results indicate that the theory-based model fits the data almost as well as the more sophisticated machines, within the support of the data set. The conclusion we arrive at is that only negligible improvements can be made to the theoretical model, for input data of this type. Further work includes (i) Bayesian formulation of the inversion for uncertainty quantification (UQ), or perhaps derivative-based UQ (sensitivity), (ii) inclusion of further hyper-parameters in the GLM for improved regularization within an empirical Bayesian framework, (iii) using other types of regularization for the deep neural network, in order to allow for more parameters, and (iv) exploring alternative machine learning methods, such as Gaussian processes, which naturally include UQ.


    Acknowledgments

    This work has been supported by the U.S. Department of Energy, Office of Science, Office of Fusion Energy Sciences, using the DIII-D National Fusion Facility, a DOE Office of Science user facility under awards, DE-FG02-04ER54761 and DE-FC02-04ER54698.


    Conflict of interest

    All authors declare no additional conflicts of interest in this paper.


  • © 2016 the Author(s), licensee AIMS Press. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0)