1. Introduction
For a simple, undirected, connected graph $ H = (V(H), E(H)) $, let $ \mathit{\boldsymbol{A}} $ represent the adjacency matrix of $ H $, and let $ d_u $ denote the degree of vertex $ u $ in $ H $. Assume that the eigenvalues of $ \mathit{\boldsymbol{A}} $ are $ \mu_1 > \mu_2 \geq \cdots \geq \mu_n $.
The study of node centrality is a significant research direction in network science and graph theory. The centrality of a node describes how important the node is in a network [1,2]. The concept of node centrality originates from Leavitt's 1951 analysis of the influence of communication patterns on the behavior of small groups [3]. Soon after, many scholars defined and studied centrality indices from different perspectives, such as degree centrality [4], closeness centrality [5], energy centrality [6], and so on [7]. Research on node centrality is of great significance for practical problems such as the robustness of real networks [8], propagation efficiency in network control [9], and the understanding of the structural characteristics of networks [10].
Matrix functions have become an essential mathematical tool for studying network science [11]. Several node centrality indices are based on matrix functions of the adjacency matrix, including subgraph centrality [12,13], total communicability centrality [14], Katz centrality [15], and so on [16]. These definitions of centrality give the most weight to the extreme eigenvalues of the adjacency matrix (the largest and the smallest). Such a weighting scheme tends to hide the structural information carried by the other eigenvalues [17]. For example, consider a connected graph $ H $. If the spectral gap $ \mu_1 - \mu_2 $ is large, then the matrix exponential $ \exp (\mathit{\boldsymbol{A}}) $ is dominated by $ \mu_1 $ and its corresponding eigenvector, and the information provided by the rest of the spectrum is largely overlooked [18]. An exception is the Gaussian function $ \exp (- \mathit{\boldsymbol{A}} ^2) $, which places the greatest emphasis on the zero eigenvalues (if they exist) and on the eigenvalues close to zero [19]. This matrix function explores the central part of the spectrum, revealing crucial structural information about the graphs and networks studied [17]. The zero eigenvalue and the eigenvalues of $ \mathit{\boldsymbol{A}} $ near zero play a critical role in determining the magnetic and stability properties of molecules when $ \mathit{\boldsymbol{A}} $ represents the tight-binding Hamiltonian in HMO (Hückel molecular orbital) theory [20,21]. Many chemical reactivities are closely linked to the HOMO-LUMO gap, which corresponds to the smallest positive and the largest negative eigenvalues of $ \mathit{\boldsymbol{A}} $ in frontier molecular orbital theory [22,23,24,25]. For example, the transfer of electrons from the HOMO of one molecule to the LUMO of another is crucial in various organic chemical reactions [26]. Estrada et al. defined the Gaussian subgraph centrality, exploring the influence of the near-zero eigenvalues (i.e., the "middle eigenvalues") on graph structure via the Gaussian function [17,18].
For a vertex $ u \in V(H) $, the generalized Gaussian subgraph centrality of $ u $ [18] is defined as
$$ GSC (u, \beta) = \left( e^{-\beta \mathit{\boldsymbol{A}}^2} \right)_{uu}, $$
where $ \beta > 0 $. Furthermore, the generalized Gaussian Estrada index of a graph [18] is defined as
$$ GEE (H, \beta) = \sum_{u \in V(H)} GSC (u, \beta) = \sum_{i = 1}^{n} e^{-\beta \mu_i^2}, $$
where $ \beta > 0 $. Notice that $ GSC (u, 1) $ and $ GEE (H, 1) $ are called the Gaussian subgraph centrality and the Gaussian Estrada index, respectively. According to the rules of quantum mechanics, in a network of particles, the generalized Gaussian Estrada index can be interpreted as the partition function of the system, utilizing a Hamiltonian derived from the $ \mathit{\boldsymbol{A}}^2 $ folded spectrum method [27]. The $ GEE (H, \beta) $ relates to the time-dependent Schrödinger equation involving the squared Hamiltonian, which uncovers information in the eigenvalues close to zero [28,29]. In recent years, with deepening research in network science [30,31,32], scholars have discovered that Gaussian functions have important applications in capturing the structural signals of graphs [33], in graph classification [34], and in graph representation learning [35] for random and dynamic networks.
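As a minimal numerical sketch of these definitions (the function names `gsc` and `gee` are ours, and numpy is assumed), both quantities can be computed directly from the spectral decomposition of $ \mathit{\boldsymbol{A}} $:

```python
import numpy as np

def gsc(A, u, beta=1.0):
    # GSC(u, beta) = [exp(-beta * A^2)]_{uu}, via the eigendecomposition of A
    mu, U = np.linalg.eigh(A)
    return float(U[u] @ (np.exp(-beta * mu**2) * U[u]))

def gee(A, beta=1.0):
    # GEE(H, beta) = sum_i exp(-beta * mu_i^2) = Tr exp(-beta * A^2)
    mu = np.linalg.eigvalsh(A)
    return float(np.exp(-beta * mu**2).sum())

# Path graph P_3, whose eigenvalues are -sqrt(2), 0, sqrt(2)
A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)
print(gee(A))        # 1 + 2*exp(-2) ≈ 1.2707
print(gsc(A, 1))     # centre vertex: exp(-2) ≈ 0.1353
```

Note how the zero eigenvalue of $ P_3 $ contributes the maximal weight $ e^{0} = 1 $ to the index, in line with the emphasis on near-zero eigenvalues discussed above.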
Equitable partition (EP) was initially introduced in [36,37] and is defined as follows. Let $ H $ be a graph with $ n $ vertices, and let $ \tau $ be a partition of $ V(H) $ into cells $ V_1 \cup V_2 \cup \cdots \cup V_t $. If constants $ b_{i j} $ exist so that every vertex in the cell $ V_i $ has exactly $ b_{i j} $ neighbors in the cell $ V_j $ for all $ i, j \in\{1, 2, \ldots, t\} $, then $ \tau $ is an EP. The matrix $ \left(b_{i j}\right)_{t \times t} $ is the divisor matrix of $ \tau $. Every graph possesses an EP, as the orbits of any group of automorphisms of $ H $ form an EP [38]. EP has many applications in fields such as control theory, chemical analysis, and data clustering [39,40,41]. The spectrum of $ (b_{ij})_{t\times t} $ is contained in that of $ H $ [37]. The matrix $ \mathit{\boldsymbol{C}} $ serves as the characteristic matrix of $ \tau $, with its columns being the characteristic vectors of the subsets $ V_1, \ldots, V_t $.
The concepts of the star set and the star complement were introduced in [38,45]. For an eigenvalue $ \mu $ of the graph $ H $ with multiplicity $ k $, a star set for $ \mu $ in $ H $ is a subset of vertices $ \mathbb{X} \subseteq V(H) $ such that $ |\mathbb{X}| = k $ and the induced subgraph $ H-\mathbb{X} $ does not possess $ \mu $ as an eigenvalue. In this scenario, $ H-\mathbb{X} $ is referred to as the star complement for $ \mu $ in $ H $. As is well known, a star set exists for any eigenvalue of any graph [46,47]. Further investigations into star sets and star complements can be found in [48,49].
Since an EP and a star set exist for any eigenvalue of any graph, this study employs the methods of EP and star sets to derive new equations for the $ GSC (u, \beta) $ of graphs. These equations can assist in calculating the $ GSC (u, \beta) $ of a large graph by utilizing the structure of a smaller graph. Moreover, some bounds for the $ GSC (u, \beta) $ of $ H $ are established based on the graph parameters of $ H $. The rest of this paper is structured as follows. In Section 2, we give some lemmas used later. Section 3 details the generalized Gaussian subgraph centrality calculation formulas obtained from equitable partitions of graphs and the star complement technique; the influence of the parameter $ \beta $ on the robustness of the formulas is explored through experiments. Section 4 introduces some bounds for the generalized Gaussian Estrada index using the graph parameters of $ H $. Conclusions are given in Section 5.
2. Preliminaries
In this paper, let $ \mathit{\boldsymbol{I}} $ denote the identity matrix and $ \mathit{\boldsymbol{A}} $ represent the adjacency matrix of the graph $ H $. For a graph $ H $ with order $ n $ that has an EP with $ V(H) = V_1 \cup V_2 \cup\cdots\cup V_t $, the corresponding characteristic matrix $ \mathit{\boldsymbol{C}} $ is an $ n \times t $ matrix whose columns consist of the characteristic vectors of $ V_1, \ldots, V_t $.
Next, we present the relationship between the characteristic, adjacency, and divisor matrices.
Lemma 2.1. [36] Let $ \tau $ be an EP of graph $ H $ with characteristic matrix $ \mathit{\boldsymbol{C}} $ and divisor matrix $ \mathit{\boldsymbol{B}} $. Then
$$ \mathit{\boldsymbol{A}} \mathit{\boldsymbol{C}} = \mathit{\boldsymbol{C}} \mathit{\boldsymbol{B}}. $$
The following is the criterion for determining whether a vertex subset qualifies as a star set.
Lemma 2.2. [46,47] Let $ \mathbb{X} \subseteq V(H) $, let the adjacency matrix of the subgraph induced by $ \mathbb{X} $ be $ \mathit{\boldsymbol{A}}_{\mathbb{X}} $, and write $ \mathit{\boldsymbol{A}} = \begin{pmatrix} \mathit{\boldsymbol{A}}_{\mathbb{X}} & \mathit{\boldsymbol{T}}^{\top} \\ \mathit{\boldsymbol{T}} & \mathit{\boldsymbol{P}} \end{pmatrix} $. Then $ \mathbb{X} $ is a star set for an eigenvalue $ \mu $ of $ H $ if and only if $ \mu $ is not an eigenvalue of $ \mathit{\boldsymbol{P}} $ and
$$ \mu \mathit{\boldsymbol{I}} - \mathit{\boldsymbol{A}}_{\mathbb{X}} = \mathit{\boldsymbol{T}}^{\top} (\mu \mathit{\boldsymbol{I}} - \mathit{\boldsymbol{P}})^{-1} \mathit{\boldsymbol{T}}. $$
Lemma 2.3. [54] Let $ H $ be a graph with $ n $ vertices and $ m $ edges. Then
3. Calculation of generalized Gaussian subgraph centrality for graphs
We formulate the generalized Gaussian subgraph centrality using the EP for graph $ H $ and compute its values with a smaller order matrix as follows.
Theorem 3.1. Suppose that $ H $ has an EP with $ V(H) = V_1 \cup V_2 \cup \cdots \cup V_t $ and $ V_1 = \{u\} $. Then
$$ GSC (u, \beta) = \left( e^{-\beta \mathit{\boldsymbol{B}}^2} \right)_{11}, $$
where $ \mathit{\boldsymbol{B}} $ is the divisor matrix of the EP.
Proof. Consider $ \mathit{\boldsymbol{C}} $ and $ \mathit{\boldsymbol{B}} $ as the characteristic and divisor matrices of the EP, respectively. According to Lemma 2.1, $ \mathit{\boldsymbol{A}} \mathit{\boldsymbol{C}} = \mathit{\boldsymbol{C}} \mathit{\boldsymbol{B}} $, and hence $ \mathit{\boldsymbol{A}}^k \mathit{\boldsymbol{C}} = \mathit{\boldsymbol{C}} \mathit{\boldsymbol{B}}^k $ for every $ k \geq 0 $, which gives
$$ e^{-\beta \mathit{\boldsymbol{A}}^2} \mathit{\boldsymbol{C}} = \mathit{\boldsymbol{C}} e^{-\beta \mathit{\boldsymbol{B}}^2}. $$
Since $ V_1 = \{u\} $, the first column of $ \mathit{\boldsymbol{C}} $ is the indicator vector of $ u $; comparing the $ (u, 1) $ entries of both sides, we get
$$ \left( e^{-\beta \mathit{\boldsymbol{A}}^2} \right)_{uu} = \left( e^{-\beta \mathit{\boldsymbol{B}}^2} \right)_{11}. $$
Hence $ GSC (u, \beta) = \left( e^{-\beta \mathit{\boldsymbol{B}}^2} \right)_{11} $.
□
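Theorem 3.1 can be checked numerically. The sketch below (our code, assuming numpy and scipy) builds the windmill graph $ F_s = K_1 \otimes (sK_2) $ and compares the centrality of the apex computed from the full adjacency matrix with the value obtained from the $ 2 \times 2 $ divisor matrix of the EP $ \{u\} \cup V(sK_2) $:

```python
import numpy as np
from scipy.linalg import expm

def gsc_full(A, u, beta):
    # GSC(u, beta) computed directly from the full adjacency matrix
    return expm(-beta * (A @ A))[u, u]

def gsc_divisor(B, beta):
    # Theorem 3.1 with V_1 = {u}: GSC(u, beta) = [exp(-beta * B^2)]_{11}
    B = np.asarray(B, dtype=float)
    return expm(-beta * (B @ B))[0, 0]

# Windmill graph F_s: apex vertex 0 joined to s disjoint edges
s, beta = 3, 0.5
A = np.zeros((2 * s + 1, 2 * s + 1))
A[0, 1:] = A[1:, 0] = 1          # apex adjacent to every other vertex
for i in range(s):               # the s disjoint edges of sK_2
    a, b = 1 + 2 * i, 2 + 2 * i
    A[a, b] = A[b, a] = 1
B = [[0, 2 * s], [1, 1]]         # divisor matrix of the EP {apex} ∪ V(sK_2)
print(abs(gsc_full(A, 0, beta) - gsc_divisor(B, beta)))  # ≈ 0
```

The $ 2 \times 2 $ computation agrees with the full $ (2s+1) \times (2s+1) $ one to machine precision, illustrating the dimension reduction the theorem provides.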
Let $ H_1 $ and $ H_2 $ be two graphs; the join of $ H_1 $ and $ H_2 $ is the graph $ H_1 \otimes H_2 $ such that $ V\left(H_1 \otimes H_2\right) = V\left(H_1\right) \cup V\left(H_2\right) $ and $ E\left(H_1 \otimes H_2\right) = E\left(H_1\right) \cup E\left(H_2\right) \cup\left\{x y: x \in V\left(H_1\right)\right. $ and $ \left.y \in V\left(H_2\right)\right\} $. Furthermore, if $ H_2 $ is a complete graph $ K_1 $, then $ H_1\otimes K_1 $ is also called a cone over $ H_1 $ [38].
Theorem 3.2. Let $ H_1 $ be an $ r $-regular graph on $ n $ vertices and $ H = H_1 \otimes K_1 $. If $ u \in V(K_1) $, then
$$ GSC (u, \beta) = \frac{\mu_1(\mathit{\boldsymbol{B}})\, e^{-\beta \mu_2(\mathit{\boldsymbol{B}})^2} - \mu_2(\mathit{\boldsymbol{B}})\, e^{-\beta \mu_1(\mathit{\boldsymbol{B}})^2}}{\mu_1(\mathit{\boldsymbol{B}}) - \mu_2(\mathit{\boldsymbol{B}})}, $$
where $ \mu_1(\mathit{\boldsymbol{B}}) = \frac{r+\sqrt{r^2+4 n}}{2} $ and $ \mu_2(\mathit{\boldsymbol{B}}) = \frac{r-\sqrt{r^2+4 n}}{2} $ are the eigenvalues of the divisor matrix $ \mathit{\boldsymbol{B}} $.
Proof. Since $ H $ has an EP with $ V(H) = \{u\} \cup V(H_1) $, it follows that the divisor matrix of the EP is $ \mathit{\boldsymbol{B}} = \begin{pmatrix} 0 & n \\ 1 & r \end{pmatrix} $.
$. By matrix diagonalization, we have
where the eigenvalues of divisor matrix $ \mathit{\boldsymbol{B}} $ are $ \mu_1(\mathit{\boldsymbol{B}}) = \frac{r+\sqrt{r^2+4 n}}{2} $ and $ \mu_2(\mathit{\boldsymbol{B}}) = \frac{r-\sqrt{r^2+4 n}}{2} $, respectively.
Let
and
So, we can obtain
Therefore,
According to Theorem 3.1, we have
□
Next, we calculate the generalized Gaussian subgraph centrality for a windmill graph by Theorem 3.2.
Example 3.3. The graph $ F_s = K_1 \otimes (sK_2) $ is called the windmill graph of order $ 2s+1 $. The spectrum of $ F_s $ is $ \left \{ \frac{1 \pm \sqrt{8s+1}}{2}, -1^{[s]}, 1^{[s-1]} \right \} $ [38]. From Theorem 3.2, if $ u \in V(K_1) $, whose degree is $ 2s $, then
The eigenvalues of $ F_s $ are known; we have
If $ v \in V(sK_2) $, then $ v $ has degree $ 2 $. Obviously, the $ GSC(v, \beta) $ values of these vertices are the same by symmetry; then
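Since the divisor matrix of $ F_s $ has zero $ (1,1) $ entry, applying $ f(x) = e^{-\beta x^2} $ by Lagrange interpolation to its two eigenvalues $ \frac{1 \pm \sqrt{8s+1}}{2} $ gives a closed form for the apex value. The sketch below (our code, assuming numpy and scipy) cross-checks this closed form against the full adjacency matrix:

```python
import numpy as np
from scipy.linalg import expm

def gsc_apex_windmill(s, beta):
    # Apex of F_s: apply f(x) = exp(-beta x^2) to the divisor-matrix
    # eigenvalues mu_{1,2} = (1 ± sqrt(8s+1))/2 by Lagrange interpolation;
    # the (1,1) entry of f(B) is (mu1*f(mu2) - mu2*f(mu1)) / (mu1 - mu2)
    m1 = (1 + np.sqrt(8 * s + 1)) / 2
    m2 = (1 - np.sqrt(8 * s + 1)) / 2
    f = lambda x: np.exp(-beta * x**2)
    return (m1 * f(m2) - m2 * f(m1)) / (m1 - m2)

# Cross-check against the full adjacency matrix of F_s
s, beta = 4, 0.5
A = np.zeros((2 * s + 1, 2 * s + 1))
A[0, 1:] = A[1:, 0] = 1
for i in range(s):
    A[1 + 2 * i, 2 + 2 * i] = A[2 + 2 * i, 1 + 2 * i] = 1
print(abs(gsc_apex_windmill(s, beta) - expm(-beta * A @ A)[0, 0]))  # ≈ 0
```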
We further give the generalized Gaussian subgraph centrality formula of a class multicone graph by Theorem 3.2.
Example 3.4. The graph $ R_{s, n} = K_1 \otimes (sC_n) $ is called the multicone graph of order $ sn+1 $, where $ n\geq 3 $. If $ s > 1 $, the spectrum of $ R_{s, n} $ is $ \{ 1 \pm \sqrt{sn+1}, 2\cos\frac{2k \pi}{n}^{[s]}, 2^{[s-1]} \} $, where $ k = 1, 2, \cdots, n-1 $. If $ s = 1 $, the spectrum of $ R_{s, n} $ is $ \{ 1 \pm \sqrt{n+1}, 2\cos\frac{2k \pi}{n} \} $, where $ k = 1, 2, \cdots, n-1 $ [50]. From Theorem 3.2, if $ u \in V(K_1) $, whose degree is $ sn $, then
The eigenvalues of $ R_{s, n} $ are known; we have
Case 1. If $ s = 1 $, then
If $ v \in V(s C_n) $, then $ v $ has degree $ 3 $. Obviously, the $ GSC(v, \beta) $ values of these vertices are the same by symmetry; then
Case 2. If $ s > 1 $, then
If $ v \in V(sC_n) $, then $ v $ has degree $ 3 $. Obviously, the $ GSC(v, \beta) $ values of these vertices are the same by symmetry; then
Remark 1. More studies of the eigenvalues of multicone graphs (e.g., $ K_1 \otimes (sK_n) $, $ K_1 \otimes (s\overline{C_n}) $) can be found in [38]. Similar to the proofs of Examples 3.3 and 3.4, the generalized Gaussian subgraph centrality of these graphs can be obtained immediately from Theorem 3.2. In addition, for some classes of graphs, such as vertex-transitive graphs (e.g., $ K_n $, the Petersen graph) and highly symmetric graphs (e.g., the Dandelion graph, the Cayley tree), the divisor (quotient) matrix used in Theorem 3.1 is much smaller than the adjacency matrix, so computing the generalized Gaussian subgraph centrality via Theorems 3.1 and 3.2 may converge faster than working with the adjacency matrix directly.
Furthermore, we discuss the influence of the parameter $ \beta $ on the robustness of the generalized Gaussian subgraph centrality by varying $ \beta $ in Examples 3.3 and 3.4. First, we give the change in the value of the generalized Gaussian subgraph centrality for the windmill and wheel graphs, respectively, by the formulas in Examples 3.3 and 3.4, as shown in Tables 1 and 2 (results are rounded to four decimal places).
Sandwich coordination compounds, also known as metallocenes, are a fascinating class of organometallic compounds characterized by a metal atom sandwiched between two cyclopentadienyl anions [51]. The molecular graph of a metallocene can be represented as $ R_{2, 5} = K_1 \otimes (2C_5) $, where $ K_1 $ stands for a transition metal atom (e.g., an iron or chromium atom) and $ C_5 $ represents the cyclopentadienyl ring [52]. Like ferrocene and chromocene, many molecular structures can be represented by the molecular graph $ R_{2, n} $ of the sandwich structure. Therefore, our discussion of the influence of the parameter $ \beta $ on the generalized Gaussian subgraph centrality may further explain the physical and chemical properties of the molecular graphs of sandwich structures, as shown in Table 3.
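Values of the kind reported in Table 3 can be reproduced numerically. The sketch below (our code, assuming numpy and scipy; the choice of $ \beta $ values is illustrative) computes the generalized Gaussian subgraph centrality of the metal atom and of a ring carbon in the metallocene graph $ R_{2,5} $:

```python
import numpy as np
from scipy.linalg import expm

def multicone_adjacency(s, n):
    # R_{s,n} = K_1 ⊗ (s C_n); vertex 0 is the apex (metal atom),
    # followed by s cycles of length n (the cyclopentadienyl rings)
    N = s * n + 1
    A = np.zeros((N, N))
    A[0, 1:] = A[1:, 0] = 1                  # metal joined to every ring atom
    for c in range(s):
        base = 1 + c * n
        for k in range(n):
            i, j = base + k, base + (k + 1) % n
            A[i, j] = A[j, i] = 1            # cycle edges
    return A

A = multicone_adjacency(2, 5)                # ferrocene-like graph R_{2,5}
for beta in (0.3, 0.5, 0.7, 1.0):
    G = expm(-beta * A @ A)
    # metal atom (index 0) vs. a symmetry-equivalent ring carbon (index 1)
    print(f"beta={beta}: metal {G[0, 0]:.4f}, ring {G[1, 1]:.4f}")
```

Consistent with the discussion below, the high-degree metal atom has a centrality value much closer to $ 0 $ than the ring carbons.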
As seen from the results in Tables 1, 2, and 3, when $ \beta = 1 $ is used as the basis for calculating the generalized Gaussian subgraph centrality, $ F_7 $ already requires more digits to compute. As the number of nodes of $ F_n $ increases, higher precision is required to obtain the results. Therefore, when $ \beta \leq 0.7 $, the results are more amenable to numerical and structural analysis of the graph (network). From a chemical analysis perspective, taking the molecular structure of ferrocene as an example, nodes with a higher degree are prone to substitution reactions, and these correspond to nodes with generalized Gaussian subgraph centrality values close to 0. Therefore, appropriately increasing $ \beta $ may be beneficial in identifying the positions of atoms that are prone to substitution. Thus, the parameter $ \beta $ plays a significant role in understanding how variations in the molecular configuration can impact the reactivity and stability of metallocenes.
Below we give a generalized Gaussian subgraph centrality formula that utilizes a star set of the graph.
Theorem 3.5. Let $ \mathbb{X} \subseteq V(H) $ and let the adjacency matrix of the subgraph induced by $ \mathbb{X} $ be $ \mathit{\boldsymbol{A}}_{\mathbb{X}} $. Consider $ \mathbb{X} $ as a star set corresponding to an eigenvalue $ \mu $ of graph $ H $, and write $ \mathit{\boldsymbol{A}} = \begin{pmatrix} \mathit{\boldsymbol{A}}_{\mathbb{X}} & \mathit{\boldsymbol{T}}^{\top} \\ \mathit{\boldsymbol{T}} & \mathit{\boldsymbol{P}} \end{pmatrix} $. Suppose that $ \mathit{\boldsymbol{Q}} = \mathit{\boldsymbol{P}} -\mu \mathit{\boldsymbol{I}} $ and $ \mathit{\boldsymbol{C}} = \mathit{\boldsymbol{Q}}^{-1} \mathit{\boldsymbol{T}} \mathit{\boldsymbol{T}}^{\top}+ \mathit{\boldsymbol{Q}} $. Then,
$ (1) $ For any vertex $ u \in \mathbb{X} $, we have
where $ N_u(\widetilde{\mathbb{X}}) $ denotes the set of all adjacent vertices of $ u $ in $ \widetilde{\mathbb{X}} = V(H) \backslash \mathbb{X}. $
$ (2) $ For any $ v \in \widetilde{\mathbb{X}} = V(H) \backslash \mathbb{X} $, we can obtain
Proof. According to Lemma 2.2 and
$$ \mathit{\boldsymbol{C}} = \mathit{\boldsymbol{Q}}^{-1} \mathit{\boldsymbol{T}} \mathit{\boldsymbol{T}}^{\top} + \mathit{\boldsymbol{Q}} = \mathit{\boldsymbol{Q}}^{-1} \left( \mathit{\boldsymbol{T}} \mathit{\boldsymbol{T}}^{\top} + \mathit{\boldsymbol{Q}}^2 \right), $$
we can obtain
Since $ \mathit{\boldsymbol{T}} \mathit{\boldsymbol{T}}^{\top}+\mathit{\boldsymbol{Q}}^2 $ is positive definite and $ \mathit{\boldsymbol{C}} $ is nonsingular, it follows that
where
Then
Therefore, we can obtain the generalized Gaussian subgraph centrality of $ H $ as follows:
$ (1) $ For any vertex $ u \in \mathbb{X} $, we can obtain
where $ N_u(\widetilde{\mathbb{X}}) $ denotes the set of all adjacent vertices of $ u $ in $ \widetilde{\mathbb{X}} = V(H) \backslash \mathbb{X} $.
$ (2) $ For any vertex $ v \in \widetilde{\mathbb{X}} $, we can obtain
□
Remark 2. The star complement technique emerged as a method for studying graph spectra and the graph isomorphism problem. However, finding the maximal graphs corresponding to nice star complements is still challenging; the literature [53] gives some graphs of small $ \mu $-rank (the value $ t = n-s $ is as small as possible, where $ s $ is the multiplicity of the eigenvalue $ \mu $) with good structural characteristics. The order of the matrices of these small $ \mu $-rank graphs is much smaller than that of the adjacency matrices. Therefore, Theorem 3.5 can, in theory, significantly improve the convergence rate of calculating the generalized Gaussian subgraph centrality. However, a good general method for obtaining nice star complements of graphs is still lacking, which is a direction for our future research.
4. Bounds of generalized Gaussian Estrada index
In this section, we establish some bounds on the generalized Gaussian Estrada index in terms of the numbers of vertices and edges of graph $ H $.
Theorem 4.1. Let $ H $ be a graph with $ n $ vertices and $ m $ edges, then
where $ \omega = \sqrt{2 m \beta^2+m \beta^2\left(\frac{2 m}{n-1}+n-2\right)} $ and the equality mentioned above is valid if and only if $ H \cong \overline{K_n} $.
Proof. According to the generalized Gaussian Estrada index definition, we have
By the arithmetic-geometric mean inequality $ \frac{\sum _{i = 1}^n x_i}{n} \geq \sqrt[n]{\prod_{i = 1}^n x_i} $ for positive numbers $ x_i $, in which equality holds if and only if $ x_1 = x_2 = \cdots = x_n $, together with $ \sum_{i = 1}^n \mu_i^2 = 2m $, we can obtain
From the expansion of Taylor series $ e^{-x} = \sum^{\infty}_{k = 0} \frac{(-x)^k}{k!} $ and $ e^{-x} \geq 1- x $ for positive number $ x $, it follows that
So we can obtain
By substituting the formula above and solving for $ GEE(H, \beta) $, we obtain
then
where $ \beta > 0 $.
We also give an upper bound by Lemma 2.3.
where $ \omega = \sqrt{2 m \beta^2+m \beta^2\left(\frac{2 m}{n-1}+n-2\right)} $.
Based on the previous derivation, it is evident that equality holds if and only if graph $ H $ has all eigenvalues equal to zero. This condition is only satisfied by the empty graph $ \overline{K_n} $.
□
Next, we give another simple lower bound for $ GEE(H, \beta) $.
Theorem 4.2. Let $ H $ be a graph with $ n $ vertices and $ m $ edges, then
the equality holds if and only if $ H \cong \overline{K_n} $.
Proof. Similar to the analysis of Theorem 4.1, we have
From the expansion of Taylor series $ e^{-x} = \sum^{\infty}_{k = 0} \frac{(-x)^k}{k!} $ and $ e^{-x} \geq 1- x $ for positive number $ x $, it follows that
Consequently, for any $ \epsilon \in[0, 1] $, we have
For $ \epsilon < 1 $, it follows that
Based on the previous derivation, it is evident that equality holds if and only if graph $ H $ has all eigenvalues equal to zero. This condition is only satisfied by the empty graph $ \overline{K_n} $. □
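A quick numerical sanity check (our code, assuming numpy) is consistent with the equality condition $ H \cong \overline{K_n} $: since $ e^{-\beta \mu_i^2} \leq 1 $ for every eigenvalue, $ GEE(H, \beta) \leq n $ always holds, with equality exactly when every eigenvalue is zero, i.e., for the empty graph:

```python
import numpy as np

def gee(A, beta):
    # GEE(H, beta) = sum_i exp(-beta * mu_i^2)
    mu = np.linalg.eigvalsh(np.asarray(A, dtype=float))
    return float(np.exp(-beta * mu**2).sum())

n, beta = 4, 0.5
# Cycle C_4, eigenvalues 2, 0, 0, -2: GEE = 2 + 2*exp(-2) < 4
C4 = np.array([[0, 1, 0, 1],
               [1, 0, 1, 0],
               [0, 1, 0, 1],
               [1, 0, 1, 0]], dtype=float)
print(gee(C4, beta))                    # strictly less than n = 4
print(gee(np.zeros((n, n)), beta))      # exactly n = 4: empty graph attains the bound
```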
5. Conclusions
The well-known Estrada index and subgraph centrality, as tools for exploring network structure and properties, focus mainly on the influence of the extreme eigenvalues. In contrast, $ GSC(u, \beta) $ and $ GEE(H, \beta) $, defined through the Gaussian function, assign more weight to the zero eigenvalues (if they exist) and to the near-zero eigenvalues of the network. At the same time, $ GSC(u, \beta) $ and $ GEE(H, \beta) $ are closely related to frontier molecular orbital theory in quantum chemistry, so studying them is valuable. In this paper, since every graph has an EP and a star set, we give some new formulas to calculate $ GSC(u, \beta) $; these formulas use a matrix smaller than the adjacency matrix. The influence of the parameter $ \beta $ on the robustness of the formulas is explored through experiments. In addition, we also give some bounds for $ GEE(H, \beta) $. Based on the above research and recent results, our future work will study $ GSC(u, \beta) $ and $ GEE(H, \beta) $ of random and dynamic graphs.
Author contributions
Yang Yang: Writing-review & editing, Writing-original draft, Visualization, Validation, Supervision, Software, Project administration, Methodology, Investigation, Formal analysis, Data curation, Conceptualization. Yanyan Song: Writing-review & editing. Haifeng Fan: Writing-review & editing, Writing-original draft, Methodology, Investigation, Funding acquisition. Haiyan Qiao: Writing-review & editing, Funding acquisition. Yang Yang and Yanyan Song contributed equally to this article.
Use of Generative-AI tools declaration
The authors declare they have not used Artificial Intelligence (AI) tools in the creation of this article.
Acknowledgments
The authors would like to thank the editor and the kind anonymous referees for their insightful comments, which helped to improve the final version of the paper. This work was supported by the Hebei Province high-level talent funding project (B20221014, C20221079).
Conflict of interest
The authors declare no conflict of interest.