
    Drivers interact with a rich and highly dynamic external environment as integral components of a closed-loop control system with multiple input sources and feedback pathways. According to statistics from the American Automobile Association Foundation for Traffic Safety in 2019 [1], nearly 80% of American drivers reported having driven in anger, exhibited aggressive driving behaviors, or experienced road rage at least once in the past 30 days. These emotions are primarily triggered by traffic congestion, road construction, time pressure, and the high cognitive demands of driving. Such negative emotional states can significantly impair drivers' perception, judgment, decision-making, and operational performance, posing serious safety risks to road traffic [2]. With the rapid development of wearable devices, high-precision environmental sensing technologies, and artificial intelligence algorithms, automobiles have evolved from simple transportation into sophisticated human-machine interaction (HMI) systems and personalized spaces underpinned by intelligent cockpits. Unlike traditional cockpits that prioritize functionality and comfort, intelligent cockpits integrate advanced technologies such as software development, virtualization, artificial intelligence, personalized infotainment systems, and intelligent driving features. These systems act as intelligent environments, combining information processing, entertainment, relaxation, and safety assurance, enhancing the driving experience and road safety [3].

As a product of the deep integration of information technology and the automotive sector, intelligent cockpits are characterized by intelligence, multimodality, natural interaction, and personalized customization [4]. Their technological framework has also extended to various areas, including smart manufacturing, agricultural equipment, and intelligent medical systems [5], all of which face critical issues regarding the impact of operators' emotional states on operational safety and efficiency. Furthermore, as driving modes gradually evolve toward human-machine collaborative driving and autonomous driving, particularly in the transition from single-vehicle to multi-vehicle cooperative autonomous driving, the intelligent cockpit emotional perception system is crucial for enhancing drivers' trust in autonomous driving technology. These systems employ in-cabin sensors, such as cameras, microphones, and physiological monitors, to continuously monitor and analyze drivers' emotional states and behavioral patterns. They dynamically adjust the in-cabin environment and system settings using real-time feedback [6]. In complex traffic scenarios where manual and autonomous driving coexist, drivers' emotional states directly influence their decision-making, behavioral responses, and acceptance of autonomous driving technologies. Emotional fluctuations become more pronounced under high cognitive load conditions, such as dense traffic, adverse weather, or nighttime driving. In these situations, drivers are prone to stress, anxiety, and fear, which can impair their judgment and reaction times. Emotion recognition systems in intelligent cockpits can quickly identify these emotional changes and, through adaptive interaction modes, provide timely support or services to drivers. The in-depth exploration of emotion recognition systems in intelligent cockpits holds significant potential for improving driving safety and facilitating the widespread adoption of human-machine collaborative driving models. By mitigating the risks associated with emotional disruptions, these systems serve as critical enablers of safer and more efficient human-machine synergy in the evolving landscape of modern transportation.

    Emotion recognition systems in intelligent cockpits hold immense potential for enhancing autonomous vehicles' acceptability, safety, and comfort. Scholars have reviewed the current research landscape of such systems from various perspectives. From a technical perspective, Shukur Alfaras et al. [7] conducted a comprehensive review of emotion recognition algorithms in intelligent cockpits based on deep learning and machine learning, multimodal data fusion technologies, and real-time processing and feedback mechanisms. Zepf et al. [8] summarized the multimodal expression of human emotions, experimental methods for inducing driving-related emotions, and pathways for emotion recognition and regulation. Regarding driving user experience and usability, Tan et al. [9] explored the application value of intelligent cockpit emotion recognition systems in improving driving safety, optimizing user experiences, and facilitating personalized driving assistance. Their work emphasized the critical role of emotion recognition systems in sub-fields such as cognitive load management, situational awareness, and takeover behavior. Regarding HMIs, Li et al. [10] analyzed the mechanisms, manifestations, and behavioral impacts of driving-related emotions and their implications for traffic safety. Additionally, their study examined how emotion recognition systems can provide personalized support and interaction strategies based on individual characteristics and emotional responses, fostering more natural emotional interaction and higher satisfaction with driving experiences. Lastly, Lu et al. [11] investigated sensory-based HMI technologies, including visual, auditory, tactile, and olfactory modalities. Their systematic review highlighted the bidirectional perception and feedback mechanisms between drivers and the vehicular environment, underscoring the importance of multisensory integration in intelligent cockpits.

Current research primarily focuses on single-function capabilities of emotion recognition and regulation technologies within emotion perception systems, yet future studies face several challenges. First, existing reviews often overlook the multidimensional impacts of potential HMIs and lack comprehensive evaluations of intelligent cockpit emotion perception technologies and predictions for their future applications. Second, developing an emotional interaction framework for intelligent cockpits in autonomous vehicles requires interdisciplinary collaboration, encompassing HMI, automotive engineering, cognitive psychology, affective computing, and intelligent transportation systems. Close cooperation among experts from these domains is essential for constructing effective emotional interaction systems. Third, despite the rapid progress in intelligent cockpit technologies, significant challenges remain, including the naturalness of HMIs, system stability, multimodal information integration, and interactions between humans and intelligent agents. The gap between current research and industrial implementation persists, with limited studies addressing the complex emotional dynamics arising from the coupling of humans, vehicles, and roads in the context of autonomous intelligent cockpits. Existing literature primarily categorizes research themes such as emotion recognition technologies and multimodal fusion in emotion perception systems. However, a comprehensive analysis of publication timelines, geographical distributions, authorship, institutions, and collaborative networks is still needed.

Based on the challenges above, this study selected 698 publications on "emotion perception in intelligent cockpits" from 2010 to 2024 from the Web of Science Core Collection database. Using bibliometric methods and scientific knowledge mapping visualization tools, we comprehensively analyzed the temporal and spatial distribution of publications, collaboration networks among authors and institutions, co-occurrence clustering, and emerging trends. The goal was to reveal the research status, hotspots, and trends in emotion perception in intelligent cockpits. Specifically, this study:

    Examined the themes and disciplinary knowledge structure of emotion perception research in intelligent cockpits over the past 15 years;

    Identified leading authors in the theoretical domain of intelligent cockpit emotion perception and analyzed their contributions;

    Summarized the evolutionary trends of research hotspots in the field, providing references for the development of emotion-centered intelligent hardware and software systems capable of self-perception, self-learning, and self-evolution, as well as for enhancing natural interaction and proactive decision-making capabilities.

    The objectives of this research are primarily reflected in the following questions:

    Q1: What are the current stages of development in emotion perception technologies within intelligent cockpits? (See Section 3.1.)

Q2: What does current research on intelligent cockpit emotion perception focus on? (See Section 4.)

    Q3: What will be the future research hotspots in this domain? (See Section 5.)

    The remainder of this paper is structured as follows. Section 2 presents the methodology and data sources for bibliometric analysis. Section 3 provides descriptive statistics and qualitative analysis of publication quantity and keywords based on bibliometric methods. Section 4 systematically reviews the development trajectory and research hotspots in the field of emotional perception in intelligent cockpits through keyword timelines, clustering, and emergent analysis. In Section 5, this study discusses future research directions in the field of emotional perception in intelligent cockpits from four perspectives: human-computer interaction, application development, implementation, and assistive technology. Section 6 summarizes the research findings of the entire paper.

    Bibliometric analysis is a quantitative method used to review and describe published research, serving as a critical tool for navigating complex academic landscapes, summarizing existing findings, and formulating research strategies. Its advantages are widely acknowledged and validated across numerous studies.

Bibliometric analysis employs quantitative approaches to objectively evaluate literature data, introducing transparent and reproducible review mechanisms that enhance the scientific rigor and reliability of academic output assessment.

    Techniques such as co-citation analysis [12], co-occurrence analysis [13], cluster analysis [14], and thematic evolution analysis [15] are utilized to reveal the structure of academic networks, identify core publications and research hotspots, and track the dynamic evolution of topics.

    By identifying highly cited publications and co-citation networks, the analysis elucidates the critical contributions of academic clusters, providing a robust foundation for knowledge evaluation and accumulation.

    The Web of Science Core Collection (WoSCC) was selected as the dataset for bibliometric analysis due to the following reasons: WoSCC's comprehensive multidisciplinary scope is well-suited to studying emotion perception in intelligent cockpits, which intersects fields such as neurocognition, psychology, automotive engineering, automation and control engineering, and computer science. WoSCC's robust dataset is highly compatible with bibliometric tools like VOSviewer and CiteSpace, providing detailed citation references and citation reports that enhance the accuracy and reliability of the analysis [16]. The dataset can be directly imported into mainstream bibliometric software, minimizing data conversion losses and ensuring analytical completeness. Its advanced search and filtering functions enable precise topic, journal, author, institution, and country-based analyses, yielding targeted insights. This study utilized the Science Citation Index Expanded (SCIE) and Social Science Citation Index (SSCI) to ensure comprehensiveness and accuracy. A search strategy was designed to balance breadth and precision, ensuring the inclusion of relevant literature while excluding unrelated content.

    The literature retrieval strategy for CiteSpace and VOSviewer analysis comprised four main steps:

Literature Search: Keywords such as "Driving Emotion Recognition," "Driving Affective Computing," "Affective Computing Automotive," and "Affective Computing Car" were used. Additional keywords included "Emotion Recognition Automotive," "Mood Detection Driver," "Emotion Recognition Co-Driver," and "Car Occupants Emotion Recognition." Studies focusing on the driver's mental and physical states, such as drowsiness/fatigue [17], inattention/distraction [18], and mental workload [19], were excluded. The search was restricted to 2010–2024, journal and review articles, and English-language publications. An initial search yielded N = 835 papers.

Literature Screening: To eliminate "noise" in the database, the following criteria were applied: removal of duplicate articles; exclusion of articles with missing metadata (e.g., author, year, journal); exclusion of non-academic materials such as newspaper articles, conference announcements, book reviews, and dissertations; and retention of only one version of articles with overlapping content, prioritizing the most comprehensive version. After filtering, 740 articles were retained.

    Standardized Review: Two researchers independently reviewed the titles and abstracts of the remaining articles based on predefined inclusion and exclusion criteria. Full-text reviews were conducted for further screening. Disagreements were resolved through discussion with a third researcher. After this stage, 683 articles were retained.

    Database Expansion: To ensure comprehensive coverage, the database was expanded by incorporating cited references. Following Chen's [20] recommendation, a co-citation network was constructed based on the references cited in the selected articles, which provided valuable information about the relationships between different concepts and theories. Related citations were included using the "Create Citation Report" function in Web of Science to capture the field's evolution. This approach expanded the original database and improved its representativeness. After integrating the citation data, the final database comprised 698 articles.

    The selected publications were exported from the WoSCC database in "Plain Text - Full Record and Cited References" format. The exported data included authors, titles, source journals, publication dates, languages, document types, keywords, abstracts, author affiliations, volume and page numbers, citation counts, and cited references.

    Descriptive statistical analyses were performed using Excel 2019, focusing on annual publication volume, authors, journals, keywords, institutions, countries, and citation frequencies. The bibliometric analysis was conducted using CiteSpace (6.1.6) and VOSviewer (1.6.19), both operating in a Java environment. The metadata from the filtered publications was imported into VOSviewer and CiteSpace for co-occurrence analysis of keywords, authors, countries, and institutions. Furthermore, clustering and temporal evolution analyses were conducted for keywords, institutions, and authors to uncover patterns and trends in the field.
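For readers who prefer to reproduce the descriptive step programmatically, the short sketch below (a minimal illustration, not part of the original workflow) tallies annual publication counts directly from a WoS plain-text export; the file name savedrecs.txt is a placeholder, and only the standard two-letter WoS field tag PY (publication year) is assumed:

```python
from collections import Counter

# Tally annual publication counts from a WoS "Plain Text - Full Record and
# Cited References" export; "savedrecs.txt" is a hypothetical file name.
# WoS plain-text records use two-letter field tags; "PY" marks the year.
years = Counter()
with open("savedrecs.txt", encoding="utf-8-sig") as f:
    for line in f:
        if line.startswith("PY "):
            years[int(line[3:].strip())] += 1

for year in sorted(years):
    print(year, years[year])
```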

    Figure 1 illustrates the bibliometric visualization workflow adopted in this study. It comprises two primary components: retrieval strategy (Section 2.1.2) and bibliometric visualization methods (Section 2.2).

    Figure 1.  Bibliometric visualization analysis workflow.

This study employs CiteSpace and VOSviewer for bibliometric visualization of the emotion perception domain in intelligent cockpits, covering co-citation, co-occurrence, clustering, and thematic evolution analyses, as shown in the literature visualization methods section of Figure 1. CiteSpace and VOSviewer are advanced software tools widely used for bibliometric network visualization. CiteSpace, developed by Dr. Chaomei Chen at Drexel University, employs co-citation analysis and pathfinding algorithms to conduct bibliometric literature analysis within specific fields. It aims to reveal the structure, evolution, and collaboration networks of knowledge domains through visual mapping, thereby identifying the underlying driving forces of disciplinary development and uncovering emerging technologies [21,22,23]. VOSviewer, developed by Nees Jan van Eck and Ludo Waltman from the Centre for Science and Technology Studies at Leiden University, generates compelling visualizations of graphical networks by calculating the number and cumulative strength of connections. It is particularly effective in co-occurrence analyses of authors, institutions, countries, and keywords, with outstanding visualization performance [24,25]. CiteSpace uses keyword clustering maps, burst detection charts, and timeline diagrams to illustrate research advancements and frontier trends.

Co-citation analysis evaluates the relationships between academic publications by examining their co-citation patterns, forming logically connected thematic clusters by grouping documents that share strong co-citation relationships. This method enables an in-depth examination of the literature within each cluster, allowing researchers to distinguish between foundational documents, which represent the core knowledge base and theoretical underpinnings of the field, and newer studies that extend and innovate upon these foundations. Co-citation analysis involves both cited references, which typically reflect the essential knowledge structure and theoretical context of the domain, and citing references, which showcase research expansions and new developments grounded in these foundational works. A co-citation occurs when two or more references are cited within the same publication, creating systematic interrelationships that reflect the structural connections among documents. The intensity of co-citation relationships is commonly measured using a co-citation index, providing a quantitative foundation for understanding the cohesion and dynamics of research within the domain [26]. The formula for calculating co-citation strength is as follows:

$C_{i,j} = \sum_{k=1}^{N} \left( A_{k,i} \cdot A_{k,j} \right)$ (2.1)

In formula (2.1), $C_{i,j}$ represents the co-citation strength between documents i and j, while $A_{k,i}$ and $A_{k,j}$ indicate whether document k cites documents i and j, respectively (1 if cited, 0 otherwise). N denotes the total number of documents in the dataset.
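As a minimal numerical sketch of formula (2.1), the binary citation matrix below is hypothetical; the co-citation matrix then follows from a single matrix product:

```python
import numpy as np

# Hypothetical binary citation matrix: A[k, i] = 1 if document k cites document i.
A = np.array([
    [1, 1, 0],
    [1, 1, 1],
    [0, 1, 1],
])

# Eq (2.1): C[i, j] = sum_k A[k, i] * A[k, j], i.e. one matrix product.
C = A.T @ A
np.fill_diagonal(C, 0)  # a document's co-citation with itself is not meaningful
print(C)  # C[0, 1] == 2: documents 0 and 1 are cited together by two documents
```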

    Temporal co-citation clustering analysis provides insights into the evolution and development trajectory of research topics, offering valuable support for understanding the origins of a field and predicting future trends. Variations in the distribution of citation frequencies highlight the dissemination and concentration of research outputs. Compared to standard citation counting, co-citation analysis more accurately reveals the substantive relationships and dynamic development within academic clusters. This makes it an indispensable method for exploring scholarly communication's structural and temporal dimensions.

Co-occurrence analysis is a technique used to uncover relationships and latent patterns among elements within a dataset of academic literature. This method explores their interconnections and highlights research hotspots by statistically examining the frequency of simultaneous occurrences of keywords, authors, institutions, and countries. This study employs co-occurrence analysis to identify research themes in emotion perception in intelligent cockpits. The visualization of keyword networks and density maps is achieved based on word frequency. In VOSviewer, the density view uses the color of each point to reflect its density, while the average distance between points is represented by d. The formula is as follows [27]:

$d = \frac{2}{n(n-1)} \sum_{i<j} \lVert x_i - x_j \rVert$ (2.2)

where the density D(x) at a point x is calculated as follows:

$D(x) = \sum_{i=1}^{n} W_i \, k\!\left( \frac{\lVert x - x_i \rVert}{d h} \right)$ (2.3)

In formula (2.3), k is a non-negative kernel function defined on [0, +∞), h > 0 represents the bandwidth parameter, and $W_i$ is the weight of object i, corresponding to the total frequency of its occurrence.
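The following sketch illustrates formulas (2.2) and (2.3) on a hypothetical 2-D keyword layout; the Gaussian kernel k(r) = exp(-r²) and the bandwidth value are illustrative assumptions rather than fixed choices of the software:

```python
import numpy as np

rng = np.random.default_rng(0)
xy = rng.uniform(size=(50, 2))        # hypothetical 2-D layout of 50 keywords
w = rng.integers(1, 20, size=50)      # occurrence frequencies used as weights
n = len(xy)

# Eq (2.2): average distance d over all point pairs.
dist = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
d = dist[np.triu_indices(n, k=1)].mean()

# Eq (2.3): weighted kernel density at a query point x (Gaussian kernel assumed).
def density(x, h=0.5):
    r = np.linalg.norm(xy - x, axis=1) / (d * h)
    return float(np.sum(w * np.exp(-r**2)))  # k(r) = exp(-r^2)

print(density(np.array([0.5, 0.5])))
```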

    For the analysis of authors and research institutions, Price's Law [28] is applied to calculate the minimum publication threshold for core authors and institutions. The formula is expressed as follows:

$n = 0.749 \times \sqrt{\mathrm{max}}$ (2.4)

In formula (2.4), max represents the number of publications by the most prolific author or institution.

    The formula for testing whether core authors and core institutions have formed a cohesive core group is expressed as follows:

$M = (N_{\max}/N) \times 100\%$ (2.5)

In formula (2.5), M represents the proportion of contributions by core authors or institutions, $N_{\max}$ is the cumulative frequency of those above the Price's Law threshold, and N is the total frequency count. A core group has been established when M ≥ 50%.
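A compact illustration of formulas (2.4) and (2.5), using hypothetical per-author publication counts:

```python
import math

# Hypothetical publication counts per author in the dataset.
pubs = {"A": 22, "B": 20, "C": 17, "D": 3, "E": 2, "F": 1, "G": 1}

# Eq (2.4): minimum output of a core author, n = 0.749 * sqrt(max).
threshold = 0.749 * math.sqrt(max(pubs.values()))

# Eq (2.5): M = (N_max / N) * 100%, the share of output above the threshold.
core = {a: c for a, c in pubs.items() if c >= threshold}
M = 100 * sum(core.values()) / sum(pubs.values())
print(f"threshold = {threshold:.2f}, core authors = {sorted(core)}, M = {M:.1f}%")
# A core group is considered established when M >= 50%.
```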

Cluster analysis is a classification method based on the numerical characteristics of research objects, such as words or phrases [29]. It operates by constructing an n × n similarity matrix S over n keywords.

$S = (S_{ij})$ (2.6)

In formula (2.6), $S_{ij} \ge 0$, $S_{ii} = 0$, and $S_{ij} = S_{ji}$ for i, j ∈ {1, …, n}.

    This similarity matrix quantifies the relationships between keywords, with the principle of separating objects with low similarity and clustering those with high similarity. The similarity between two objects i and j is expressed as:

$S_{ij} = \frac{C_{ij}}{W_i W_j}$ (2.7)

In formula (2.7), $C_{ij}$ is the co-occurrence frequency between objects i and j, and $W_i$ and $W_j$ represent their individual frequencies.
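Formula (2.7) reduces to an element-wise division of the co-occurrence matrix by the outer product of the frequency vector, as the hypothetical three-keyword example below shows:

```python
import numpy as np

# Hypothetical co-occurrence counts C[i, j] for three keywords; W[i] is the
# total occurrence frequency of keyword i.
C = np.array([[0., 6., 2.],
              [6., 0., 4.],
              [2., 4., 0.]])
W = np.array([10., 12., 5.])

# Eq (2.7): S_ij = C_ij / (W_i * W_j), the association-strength similarity.
S = C / np.outer(W, W)
np.fill_diagonal(S, 0)  # S_ii = 0, as required by Eq (2.6)
print(S.round(3))
```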

    To improve clustering effectiveness, the weighted Euclidean distance between objects within the same cluster is minimized. The distance between clusters is calculated as follows:

$E(X;S) = \sum_{i<j} S_{ij} \lVert x_i - x_j \rVert^2$ (2.8)

In formula (2.8), X represents the set of data points included in the clustering analysis, S denotes the similarity matrix, and $S_{ij}$ indicates the similarity between data points $x_i$ and $x_j$. The symbol $\lVert \cdot \rVert$ represents the Euclidean norm. The minimization of the objective function is subject to the following constraint:

$\frac{2}{n(n-1)} \sum_{i<j} \lVert x_i - x_j \rVert = 1$ (2.9)

In formula (2.9), $x_i$ and $x_j$ denote individual data points from the set X that are being compared in the clustering analysis. To determine the color of each node in the knowledge map, the cluster color is calculated by taking the weighted average of the colors of all nodes in the cluster, where the density of each node, as defined in formulas (2.2) and (2.3), serves as the weight. Finally, the calculated cluster color is blended with the background color of the cluster density map.
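To make the objective of formulas (2.8) and (2.9) concrete, the toy sketch below minimizes E(X;S) by projected gradient descent, re-imposing the distance constraint after each step; this is an illustrative solver under simplified assumptions, not the optimization algorithm actually used by the mapping software:

```python
import numpy as np

rng = np.random.default_rng(1)
S = np.array([[0.,   .050, .040],
              [.050, 0.,   .067],
              [.040, .067, 0.  ]])   # similarity matrix, e.g. from Eq (2.7)
n = len(S)
X = rng.normal(size=(n, 2))          # initial 2-D coordinates

def normalize(X):
    # Re-impose the constraint of Eq (2.9): average pairwise distance = 1.
    dist = np.linalg.norm(X[:, None] - X[None, :], axis=-1)
    return X / dist[np.triu_indices(n, 1)].mean()

for _ in range(500):
    # Gradient of Eq (2.8): dE/dx_i = 2 * sum_j S_ij (x_i - x_j).
    diff = X[:, None] - X[None, :]
    grad = 2 * (S[:, :, None] * diff).sum(axis=1)
    X = normalize(X - 0.1 * grad)    # descend, then re-impose the constraint

print(X.round(3))  # strongly linked items end up closer together
```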

    Thematic evolution analysis primarily uses timeline diagrams and burst detection to reveal research themes and their development trends. Timeline diagrams allow researchers to observe the temporal evolution of different themes visually. Each node on the timeline represents a theme or keyword, with node size proportional to its frequency and connections between nodes indicating their relevance. Larger nodes signify themes with high density and centrality, suggesting research hotspots during that period. The analysis identifies rapidly emerging research themes or keywords during specific periods. It measures the intensity of these emergent themes using the concept of "burstiness", effectively capturing nascent research trends and highlighting increasingly prominent topics [30]. In CiteSpace, burstiness is calculated by integrating parameters such as frequency, time window, and significance thresholds. This approach identifies statistically significant changes, providing deep insights into research dynamics. The burstiness formula is expressed as:

$B(t) = \frac{n_t}{N_t}$ (2.10)

In formula (2.10), B(t) represents the burstiness at a specific time t, $n_t$ is the frequency of a keyword or theme at time t, and $N_t$ is the total number of publications at time t.
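Formula (2.10) can be evaluated directly from yearly counts, as in the hypothetical example below; note that CiteSpace's Burst Detection additionally applies time-window and significance thresholds on top of such frequency changes:

```python
# Hypothetical yearly counts for one keyword versus total publications.
keyword_freq = {2018: 2, 2019: 5, 2020: 14, 2021: 30}
total_pubs = {2018: 40, 2019: 55, 2020: 70, 2021: 92}

# Eq (2.10): B(t) = n_t / N_t; a sharp rise in B(t) marks a burst period.
for year in sorted(keyword_freq):
    print(year, round(keyword_freq[year] / total_pubs[year], 3))
```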

    Sorting the number of publications by year provides insights into the level of interest, development trajectory, and trends in emotion perception in intelligent cockpits. An analysis of 698 articles from the WOSCC database reveals the annual distribution of publications, as shown in Figure 2.

    Figure 2.  Annual publication distribution of research on emotion perception in intelligent cockpits (2010–2024).

    Figure 2 illustrates the annual publication trends from 2010 to 2024 in the field of intelligent cockpit emotion perception. Based on this data, 2014 and 2019 emerge as two significant turning points, dividing the research progress into three distinct phases. The first phase, spanning 2010–2014, is characterized by low and steady publication activity. This corresponds to the relatively low attention paid to technologies such as machine learning, deep learning, the Internet of Things (IoT), and computer vision during that period. Additionally, as a nascent interdisciplinary field, research output in intelligent cockpit emotion perception was relatively limited. The second phase, covering 2014–2019, demonstrates more variability, with a gradual increase in total publications. This period reflects the growing recognition of intelligent cockpit technologies as a critical research domain. Starting in 2019, the third phase marks a period of rapid growth and sustained upward momentum, indicating that research in this field has entered a stable and mature stage of accelerated development. As intelligent cockpit technologies have advanced and become more widespread, emotion perception has become a key technology for enhancing driving experiences and safety, significantly increasing research output. Overall, the annual publication volume in this field has shown a consistent growth trend since 2010, reflecting the maturation of related research domains such as artificial intelligence and signal processing. This growing interest highlights intelligent cockpit emotion perception technologies' increasing scientific value and practical relevance.

Journal statistics reveal which journals publish the most articles, contribute the most to the field, and exert the most significant influence [31]. Table 1 ranks the top 10 journals by publication volume and Total Global Citation Score (TGCS). TGCS represents the total number of citations received by a journal's articles in the WOS database. High TGCS values indicate a journal's significant role in advancing relevant research [32]. As shown in Table 1, the journal with the highest number of publications is Transportation Research Part F: Traffic Psychology and Behavior (TRF), with 183 articles. This is followed by Accident Analysis and Prevention (AAP), while the number of articles published by journals ranked third through tenth is significantly lower. Moreover, TRF and AAP have far higher citation frequencies than other journals, establishing them as the most influential journals in intelligent cockpit emotion perception. In addition to TRF and AAP, other journals with high TGCS values include the Journal of Safety Research, which focuses on safety topics, as well as the IEEE Transactions on Affective Computing and the International Journal of Human-Computer Interaction, which center on affective computing and HMIs. These journals provide critical references for designing HMI systems, applying emotion recognition algorithms, and evaluating user experiences in intelligent cockpits. In summary, TRF and AAP are highly aligned with intelligent cockpit emotion perception themes and serve as primary platforms for publishing related research. The top 10 journals, in terms of publication volume, predominantly cover transportation, safety, psychology, behavior, public health, and computer science.

    Table 1.  Top 10 journals ranked by the number of papers and TGCS.
    Rank Journal name Count TGCS Impact factor
    1 Transportation Research Part F: Traffic Psychology and Behavior 183 1617 3.5
    2 Accident Analysis and Prevention 165 3563 5.7
    3 IEEE Transactions on Affective Computing 99 988 9.6
    4 International Journal of Human-Computer Interaction 63 710 3.4
    5 Behaviour & Information Technology 45 266 2.9
    6 Journal of Advanced Transportation 37 371 2
7 IEEE Communications Surveys & Tutorials 18 659 34.4
    8 Sensors 15 344 3.4
9 Journal of Personality and Social Psychology 11 135 6.4
    10 Journal of Safety Research 10 558 3.9


    Citation analysis efficiently identifies foundational knowledge bases within a research field by examining highly cited publications [33]. Table 2 lists the top 10 most cited articles in intelligent cockpit emotion perception. Among these, Mollahosseini et al. [34] published a landmark study in IEEE Transactions on Affective Computing (IF = 9.6) titled "AffectNet: A Database for Facial Expression, Valence, and Arousal Computing in the Wild." This study introduced the AffectNet database, containing over a million annotated images sourced through 1250 emotion-related keywords across seven languages. The dataset includes annotations for seven discrete emotions and valence and arousal labels. Using two baseline deep neural networks for classification, their results outperformed traditional machine learning methods and pre-existing facial expression recognition systems. The top 10 cited works in this field can be categorized into two major themes: affective computing analysis and intelligent transportation systems.

    Table 2.  Top 10 most cited publications in the intelligent cockpit emotion perception research field.
    Rank Author Title Citations Impact factor
    1 Mollahosseini A (2017) AffectNet: A Database for Facial Expression, Valence, and Arousal Computing in the Wild 2255 9.6
    2 Li S (2020) Deep Facial Expression Recognition: A Survey 2084 9.6
    3 Barrett LF (2019) Emotional Expressions Reconsidered: Challenges to Inferring Emotion from Human Facial Movements 1892 18.2
    4 Song TF (2018) EEG Emotion Recognition Using Dynamical Graph Convolutional Neural Networks 1304 9.6
    5 Alarcao SM (2017) Emotions Recognition Using EEG Signals: A Survey 1123 9.6
    6 Katsigiannis S (2017) DREAMER: A Database for Emotion Recognition through EEG and ECG Signals from Wireless Low-cost Off-the-Shelf Devices 1044 6.7
    7 Minaee S (2021) Deep-Emotion: Facial Expression Recognition Using Attentional Convolutional Network 754 3.4
    8 Othman K (2021) Public Acceptance and Perception of Autonomous Vehicles: A Comprehensive Review 296 4.2
    9 Zepf S (2020) Driver Emotion Recognition for Intelligent Vehicles: A Survey 223 23.8
    10 Li GF (2020) [43] Influence of Traffic Congestion on Driver Behavior in Post-Congestion Driving 205 5.7


    Mollahosseini [34], Song [35], Barrett [36], and Alarcao [37] focused on advancing affective computing technologies, employing machine learning and deep learning algorithms to improve the accuracy and reliability of emotion recognition. Their work emphasized the importance of standardized datasets and robust algorithmic training while highlighting the significance of diverse and representative data for capturing emotional expressions across varying cultural and environmental contexts. For instance, Song et al. [35] applied recurrent neural networks (RNNs), particularly long short-term memory (LSTM) networks, to video-based emotion analysis. The LSTM architecture effectively captured dynamic changes in time-series data, making it suitable for handling multimodal inputs related to emotional fluctuations. On the other hand, Barrett et al. [36] employed rule-based sentiment analysis methods using emotion dictionaries to evaluate text sentiment, exploring cultural variations in emotional expression.

    In the domain of intelligent transportation systems, researchers such as Li [38], Katsigiannis [39], Minaee [40], Zepf [41], and Othman [42] examined safety and efficiency in autonomous driving. They analyzed environmental complexity, regulatory challenges, and public acceptance. Katsigiannis et al. [39] developed a biomedical health monitoring system based on innovative sensing technology to enhance driver health evaluation. Li et al. [38] used simulation experiments to reveal the impact of environmental complexity on autonomous driving safety. Othman et al. [42] focused on public acceptance, emphasizing the importance of addressing public concerns and needs in the promotion of autonomous driving technologies. Through a systematic literature review, the authors analyzed key factors influencing public acceptance of autonomous vehicles (AVs). The findings highlighted safety, ethical dilemmas, accountability, and regulatory lag as critical determinants of public acceptance. Among these, safety concerns stood out prominently, with the public exhibiting widespread distrust in AVs' decision-making capabilities. Furthermore, media coverage of AV-related accidents was shown to significantly undermine public trust. In summary, the most cited works in this field span a variety of themes and are published in high-impact journals. They offer a comprehensive perspective on the diverse developments in intelligent cockpit emotion perception. These influential studies will continue to shape the future research agenda, enabling readers to better understand this domain's multifaceted evolution.

    Using CiteSpace and VOSviewer, this study analyzed collaboration networks to gain insights into academic collaboration patterns in intelligent cockpit emotion perception. The analysis was conducted across three levels: researcher collaboration, institutional collaboration, and inter-country collaboration, representing micro-, meso-, and macro-level analyses, respectively. The co-occurrence maps reveal clusters of collaborative groups, where the size of each node corresponds to the number of publications, and the color represents the chronological order of publications [44]. The thickness of links between nodes indicates the strength of collaboration. These publications are typically the product of efforts by scholars from different countries, institutions, and disciplines. The analysis also highlights the centrality of specific authors within the collaboration network, indicating their degree of collaboration with researchers from other institutions. This collaborative landscape reflects the growing importance of cross-disciplinary and cross-institutional efforts in advancing the field of intelligent cockpit emotion perception.

    a. Analysis by author publication volume

A co-occurrence knowledge map of significant research groups in intelligent cockpit emotion perception was generated using the WOS database, as shown in Figure 3. As seen in Table 3, the team led by Gaspar, John has the highest publication volume with 22 papers, though no distinct core author group has been formed. Following this are teams led by Yousefian, Reza; Lenne, Michael G; Cao, Dongpu; and Guo, Gang. High-centrality documents are frequently cited or co-cited across various fields. Such works often serve as intersections between different areas of knowledge or critical nodes in the temporal evolution of scholarly disciplines. Most core team members exhibit a centrality greater than 0.01. In particular, authors such as Lenne, Michael G; Gaspar, John; and Aitken, Blair have relatively high centrality, highlighting the influential position of their work within the knowledge structure. Notably, Guo, Gang from Chongqing University's Department of Automotive Engineering exhibits the highest centrality value of 0.12, underscoring their significant role in the field.

    Figure 3.  Collaborative network map of major research groups.
    Table 3.  Distribution of centrality: Number of publications of authors.
    Rank Authors Institution Count Centrality
    1 Gaspar, John G University of Iowa 22 0.11
    2 Yousefian, Reza University of North Carolina 20 0.09
    3 Lenne, Michael G Monash University 17 0.05
    4 Cao, Dongpu Tsinghua University 16 0.09
    5 Guo, Gang Chongqing University 14 0.12
    6 Jipp, Meike German Aerospace Centre 11 0.09
    7 Bosch, Esther German Aerospace Centre 7 0.02
    8 Shiferaw, Brook Swinburne University of Technology 5 0.03
    9 Aitken, Blair Swinburne University of Technology 4 0.07
    10 Schwarz, Chris University of Iowa 3 0.01


    b. Analysis by author citation metrics

Using VOSviewer, a citation node map (Figure 4) and a citation distribution table (Table 4) were created for authors in the intelligent cockpit emotion perception domain. The connection strength in the table represents the number of times an individual author is linked to others within the map. A higher value indicates that the author is cited more frequently by others in the network [45]. Figure 4 shows that the Ekman, P node has the largest font size, signifying the highest total citation count. Ekman's seminal 1985 paper on basic emotion theory, published in Science, has been cited 57 times [46], establishing him as a pioneering figure in emotion classification with profound international influence. Similarly, Picard, RW; Scherer, KR; and Russell, JA are highly regarded in driving emotion recognition. Table 4 reveals that, among them, Picard, RW ranks first in citation count and node connection strength for papers related to driving emotion recognition, affirming his authoritative status in the field. Other significant contributors include Scherer, KR; Russell, JA; and Barrett, LF, who have substantially advanced emotion recognition. Additionally, teams led by Li, Y from Nanjing University of Science and Technology and Braun, M from Salzburg University also exhibit strong influence, each with a total link strength exceeding 340, reflecting their global impact. These research teams have collectively provided critical theoretical and practical support for advancing intelligent cockpit emotion perception.

    Figure 4.  Author co-citation network map.
    Table 4.  Citation distribution of relevant scholars.
    Rank Cited authors Citations Total link strength
    1 Ekman, P 57 1059
    2 Picard, RW 36 606
    3 Scherer, KR 36 543
    4 Russell, JA 34 690
    5 Barrett, LF 25 652
    6 Li, Y 19 348
    7 Li, GF 17 342
    8 Braun, M 16 461
    9 Kashevnik, A 16 71
    10 Klauer, SG 16 194


    A statistical analysis of publications by country and region highlights key nations with significant contributions and influence in intelligent cockpit emotion perception. From 2010 to 2024, researchers from 103 countries published in this domain. Table 5 lists the top 10 countries by publication volume: China, USA, United Kingdom, Germany, Japan, Canada, Australia, India, Italy, and South Korea. Figure 5 presents a global heatmap of publication distribution. The results indicate that China has the highest number of publications, with 205 articles, accounting for 27.26% of the total, far exceeding the United States in second place with 142 articles. This underscores China's dominant position in the field. An analysis using VOSviewer further reveals collaboration networks between countries (Figure 6). Larger nodes represent higher publication volumes, while thicker lines indicate stronger collaboration ties between nations. China demonstrates the highest centrality, followed by the USA, with close collaboration observed between the two nations. Strong collaborative relationships are also evident among the United Kingdom, Australia, Japan, and Canada. These partnerships highlight the global and interdisciplinary nature of intelligent cockpit emotion perception research.

    Table 5.  Top 10 countries by number of publications.
    Rank Countries Count Centrality
    1 China 205 0.23
    2 USA 142 0.19
    3 United Kingdom 81 0.14
    4 Germany 71 0.07
    5 Japan 55 0.08
    6 Canada 45 0.12
    7 Australia 45 0.04
    8 India 35 0.03
    9 Italy 33 0.01
    10 South Korea 30 0.06

    Figure 5.  Heatmap of country contributions in the field of emotion perception in intelligent cockpits.
    Figure 6.  Network map of country and regional collaboration.

    Based on Price's Law, the core institutions contributing to the field (publishing ≥ 2 papers) total 33, collectively producing 86 articles, accounting for 41.15% of all publications (below the 50% threshold). Figure 7 illustrates the institutional co-occurrence network map, which reveals that the 209 institutions involved in this field form clusters centered around Tsinghua University (4 papers), the University of Iowa (3 papers), and the Austin Research Institute (3 papers). Most of the contributing institutions are universities. Notably, Chinese research institutions dominate among the top contributors, aligning with the findings of Section 3.3.2, which identified China as a leading nation in intelligent cockpit emotion perception research. Table 6 lists the top 10 institutions by publication volume. However, as shown in the table, institutional centrality remains low, indicating weak inter-institutional ties and limited cross-regional collaboration. Most institutions conduct independent research or collaborate within national or regional contexts rather than engaging in extensive international partnerships. While these institutions actively study emotion perception and HMI within intelligent cockpits, their research outputs remain relatively limited, suggesting that the field is still in a stage of autonomous development rather than widespread collaborative expansion.

    Figure 7.  Co-occurrence network map of research institution collaboration.
    Table 6.  Top 10 research institutions by number of publications.
    Rank Institution Count Centrality
    1 Tsinghua University 4 0.01
    2 University of Iowa 4 0
    3 Austin Research Institute 3 0
    4 Chongqing University 3 0
    5 Korea Advanced Institute of Science & Technology (KAIST) 3 0
    6 University of Waterloo 3 0
    7 Russian Academy of Sciences 3 0
    8 Southeast University - China 3 0
    9 Electronics & Telecommunications Research Institute - Korea (ETRI) 3 0
    10 Monash University 3 0


    Keywords, as a concise summary of a study's content, provide a focused representation of research themes. Using CiteSpace, a keyword clustering map (Figure 8(a)) and a keyword co-occurrence visualization (Figure 8(b)) were generated from the WoSCC database of 698 publications. As shown in Figure 8, there are 321 keyword nodes and 1235 network connections, with the color of the nodes and lines transitioning from cool to warm hues to represent chronological changes over time. According to the clustering results in Figure 8(a), the research topics in the field of intelligent cockpit emotion perception can be divided into three distinct phases:

    Figure 8.  Keyword clustering and co-occurrence map: (a) keyword clustering map; (b) keyword co-occurrence map.

    The first phase focused on driver monitoring systems (#0), affective gaming (#1), and emotion contagion (#2), with an emphasis on enhancing user engagement and safety. As shown in Figure 8(b), keywords representing earlier research clusters include risky driving, aggressive driving, driving behavior, traffic accidents, driving performance, negative emotion, road rage, and driving anger scale. This phase primarily explored the intrinsic connections between driving anger and risky or aggressive driving behaviors, focusing on how negative emotions serve as key variables influencing driving decisions and road safety. Research during this phase can be divided into two main aspects: first, modeling the relationship between driving anger and risky or aggressive driving behaviors. Early studies used surveys [47], simulated driving experiments [48], and real-world observations [49]. The Driving Anger Scale became widely adopted for assessing drivers' anger levels across various scenarios [50], contributing to the development of quantitative models linking driving anger with risky behaviors. Second, the negative impact of driving anger on driving performance was highlighted, as studies demonstrated that driving anger led to higher speeds, shorter following distances [51], increased red-light violations [52], and more frequent aggressive driving [53]. Additionally, it was associated with behaviors such as more frequent honking and reduced courtesy while driving. Individual traits, such as trait anger, driving-specific anger [54], sensation-seeking, urgency, and perseverance [55], were also significant in moderating the relationship between anger and driving behaviors. Moreover, this phase began to investigate the phenomenon of emotion contagion, which examines how emotional states are transmitted between drivers and their effects on driving behavior and road safety. The challenge of identifying, analyzing, and regulating driver emotions to reduce risky driving became a key focus and laid the foundation for subsequent research.

    The second phase concentrated on convolutional neural networks (#3), affect sensing and analysis (#4), and driver emotions (#5), with an emphasis on emotion recognition and monitoring for emotional regulation. The corresponding keywords in Figure 8(b) include facial expression, emotion recognition, driver emotion recognition, driving experience, emotional states, and negative emotion. Research during this phase showed a continued focus on facial expression-based methods for emotion recognition, which remained a cornerstone of this field. Advances in algorithms and neural networks were central to this phase, as researchers applied convolutional neural networks (CNNs) to automatically extract and classify features such as facial expressions [56], physiological signals [57], and driver posture [58]. Efforts were also directed at developing robust systems [59] for collecting and annotating emotional data from drivers and designing interfaces that visualize this data to provide real-time feedback through visual tools [60]. Ensuring the robustness of intelligent cockpit emotion perception technologies across various environmental conditions, such as changes in lighting or complex driving scenarios, became a critical research priority.

    The third phase focused on meta-analysis (#6), active safety (#7), driver monitoring (#8), and visualization (#9), with an emphasis on integrating emotion perception technologies into intelligent cockpits to enable real-time driver monitoring, provide timely interventions, and enhance active safety functions while reducing driving risks. Corresponding keywords in Figure 8(b) include driving safety, driver monitoring system, significant difference, autonomous vehicles, driving context, and emotion regulation. Research in this phase concentrated on developing emotion-perception modules within driver monitoring systems to detect emotional states and behavioral anomalies in real-time [61], evaluating the impact of these technologies on driving safety and active safety functions through meta-analysis [62], and optimizing the performance of emotion-recognition systems in autonomous or complex driving environments. Particular attention was paid to dynamic driving scenarios, diverse lighting conditions, and individual differences [63]. HMI functionalities were also a significant focus, as intelligent cockpits incorporated emotional data feedback through visual, auditory, or multimodal interactions to offer personalized emotion regulation solutions for drivers [64]. Additionally, active safety features were developed to mitigate traffic risks, such as dynamically adjusting driving assistance strategies or issuing warnings based on the driver's emotional state [65].

The quality of keyword timeline clustering in CiteSpace is evaluated using two metrics: modularity (Q value) and silhouette (S value), which serve as key indicators for assessing the structure of the network and the clarity of clustering results [66]. Generally, a Q value greater than 0.3 indicates significant clustering, while an S value greater than 0.7 suggests highly credible clustering results. In the degenerate case where the S value approaches its maximum of 1 because the number of clusters reduces to one, the network is typically too small to capture diverse research themes. As shown in Table 7, the S values in this study are all greater than 0.7, demonstrating that the clustering maps have strong representational capabilities and align well with the research keywords. Additionally, data from the upper-left corner of CiteSpace (Figure 9) indicate that the S value for the keyword timeline clustering is 0.8493, and the Q value is 0.6036, exceeding the 0.5 threshold for Q values. This confirms that CiteSpace effectively integrates clustered keywords and generates meaningful timeline clustering maps.

    Table 7.  Evaluation metrics for keyword timeline clustering analysis.
    Cluster Node Silhouette(S) Year Cluster labels
    0 67 0.912 2015 driver monitoring system
    1 48 0.874 2010 affective gaming
    2 37 0.851 2012 emotion contagion
    3 31 0.939 2018 convolutional neural network
    4 26 0.846 2011 affect sensing and analysis
    5 25 0.963 2016 driver emotion
    6 23 0.869 2020 meta-analysis
    7 17 0.898 2019 active safety
    8 17 0.927 2011 driving monitoring
    9 14 0.875 2018 visualization

    Figure 9.  Keyword timeline clustering knowledge map in the field of emotion perception in intelligent cockpits.

    In Figure 9, keywords are arranged along a timeline according to their appearance year. The subgroups of research clusters span multiple themes, including driver monitoring systems, affective gaming, emotion contagion, convolutional neural networks, affect sensing and analysis, driver emotions, meta-analysis, active safety, driving monitoring, and visualization. Temporal analysis reveals early research into emotion recognition and HMI focused on theoretical exploration. Around 2015, the rise of machine learning and deep learning technologies drove breakthroughs in multimodal emotion recognition and physiological signal processing. In recent years (post-2020), research has shifted toward driver monitoring systems, active safety, and task analysis, emphasizing the application of emotion perception in driving safety and automated decision-making.

Keyword burst analysis identifies emerging keywords within a specific period and their duration; combined with the changes in node colors in Figure 9, it allows a more intuitive understanding of research hotspots and trends. Burst strength indicates the degree of significant change in keyword citation frequency during a specific period, with higher values reflecting more pronounced citation changes [67]. This strength metric is calculated using the Burst Detection feature in CiteSpace.

    An analysis of the WOSCC database using CiteSpace reveals the results of keyword bursts in Figure 10. High burst strength is observed for keywords such as affect sensing and analysis, driving simulator, convolutional neural networks, machine learning, system, blood pressure, and artificial intelligence. Research in this domain primarily focuses on optimizing multimodal emotion recognition and real-time interaction. The driving simulator has been employed to validate the adaptability of emotion perception algorithms in complex driving scenarios. In contrast, optimizing convolutional neural networks and machine learning algorithms has significantly improved the accuracy of emotion recognition tasks. Furthermore, the multimodal integration of physiological indicators and artificial intelligence has enhanced the practicality and robustness of emotion perception technologies. Additionally, keywords such as "awareness" and "behavior" exhibit long burst durations, spanning several years, highlighting the importance of real-time perception of drivers' cognitive states and behavioral patterns for improving emotion recognition accuracy, HMI, and personalized service experiences. Keywords like "task analysis" and "intelligent vehicles" have demonstrated high burst strength since 2022, persisting to the present. This suggests that in-depth research on integrating sensing, decision-making, and execution in intelligent transportation and autonomous driving, along with safety-oriented task analysis, has become a critical development trend in intelligent cockpit emotion perception.

    Figure 10.  Keyword burst detection map in the field of emotion perception in intelligent cockpits.

    By analyzing and synthesizing the keyword-based knowledge maps, the ten clusters identified in Figure 9 can be further categorized into four layers (Figure 11): the impact of emotion perception on HMI, application development, specific implementations, and assistance technologies. The construction of this classification system is grounded in several logical frameworks [68]. First, it emphasizes the consistency of functional objectives, with research topics in the same category focusing on a shared technological aim; for example, studies on human-machine interaction investigate how emotional perception can enhance feedback mechanisms for driving behavior. Second, these categories illustrate a temporal progression of technological evolution, transitioning from concrete implementation to application development, and finally to auxiliary technology levels, aligning closely with the evolution of technological hotspots identified in literature from 2010 to 2024. Lastly, this classification structure is closely connected to the "perception-decision-execution" process in the development of intelligent cockpits, ensuring that the research framework accurately reflects the practical needs of industry. This layered classification aids in understanding how different technologies work together to drive the overall development of the intelligent cockpit emotion perception field. It reflects the trajectory of emerging technologies and provides a coherent framework to capture the multifaceted evolution and future research trends in this domain.

    Figure 11.  Research on technology for intelligent cockpit emotion perception.

    Emotion perception technology in intelligent cockpits significantly enhances the naturalness, intelligence, and efficiency of HMI, thereby improving the overall user experience [69]. By recognizing and interpreting the user's emotional state, the system can provide personalized and customized services. For instance, when a driver experiences anger or stress, the system can automatically adjust the in-car environment—lowering music volume, dimming the lights, or issuing appropriate reminders—to alleviate negative emotions and improve driving safety. This emotion-based interaction mimics the habits of natural human communication. Moreover, natural language-based interactive systems, such as chatbots and intelligent voice assistants, are key technologies for achieving more intuitive human-machine interfaces within intelligent driving environments. These systems leverage advancements in natural language processing and affective computing to interpret drivers' commands and emotional states from speech patterns and semantic content. Emotion-aware voice assistants can dynamically adjust their response strategies, employing calming language when driver stress is detected or simplifying instructions under high cognitive load, effectively reducing driver distraction. Recent studies have demonstrated that combining natural language processing with multimodal emotion recognition can enhance interaction robustness in noisy driving conditions [70].
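    As a concrete illustration of the adjustment logic described above, the sketch below shows a hypothetical rule-based mapping from a recognized driver state to a dialogue strategy. The state fields, thresholds, and strategy parameters are assumptions for illustration only; a production system would learn such a policy from data and blend it with driving context.

```python
from dataclasses import dataclass

@dataclass
class DriverState:
    emotion: str           # output of the emotion recognizer, e.g., "anger"
    cognitive_load: float  # 0.0 (idle) .. 1.0 (overloaded), from workload estimation

def select_response_strategy(state: DriverState) -> dict:
    """Map a recognized driver state to voice-assistant behavior."""
    if state.emotion in ("anger", "stress"):
        # Calming language and fewer words when the driver is upset.
        return {"tone": "calming", "verbosity": "low", "music_volume_delta": -10}
    if state.cognitive_load > 0.7:
        # Simplified instructions and deferred notifications under high load.
        return {"tone": "neutral", "verbosity": "minimal", "defer_notifications": True}
    return {"tone": "friendly", "verbosity": "normal"}

print(select_response_strategy(DriverState(emotion="stress", cognitive_load=0.4)))
# -> {'tone': 'calming', 'verbosity': 'low', 'music_volume_delta': -10}
```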

    Future research could leverage deep learning and big data analytics so that systems better understand users' emotional preferences and interaction habits [71], allowing them to recommend appropriate music or adjust the in-car ambiance for a more comfortable and tailored interaction experience. Emotion perception technology also plays a crucial role in keeping interaction smooth and coherent, particularly in complex driving environments: by predicting drivers' operational needs from recognized emotions and pre-emptively preparing related assistance measures, it can reduce cognitive load. For example, the system could automatically adjust rearview mirror angles or provide visual cues when the driver is about to change lanes, making the maneuver easier [72]. Additionally, more emphasis should be placed on synergy with vehicle sensor data to develop context-aware dialogue systems that can, for instance, suggest rest stops when voice analysis detects fatigue. These advances will enhance the naturalness of human-machine collaboration and decision-making efficiency by integrating linguistic interaction with multimodal emotion perception.

    Emotion perception technology in intelligent cockpits creates extensive opportunities for innovation in application development for automakers and service providers. Its potential lies in deep integration across fields such as driving assistance [73], in-car entertainment [74], health management [75], and autonomous driving [76]. In driving assistance systems, emotion monitoring technology can identify the driver's fatigue and stress levels in real time and provide timely feedback to enhance safety. In in-car entertainment systems, the cockpit can respond dynamically to the driver's emotional state through music playback and modulation of ambient lighting color and intensity, thereby improving the driving experience. Health management systems leverage physiological data to offer personalized health recommendations, while autonomous driving systems optimize driving strategies based on the driver's emotional state by adjusting driving styles and path planning. Furthermore, insights from industrial human-machine interaction, particularly applications on manufacturing assembly lines, can be transferred to in-vehicle collaboration; in takeover scenarios, gaze-guided control paradigms from robotic systems can effectively improve negotiation efficiency between the driver and the vehicle [77].

    Future research on application development should focus on the following areas: (1) the deep integration of emotion recognition technology with autonomous driving and the Internet of Vehicles, enabling systems to understand user needs and intentions more accurately while continuously optimizing algorithms to adapt to individual users' emotional preferences and interaction habits, thereby providing more intelligent and humanized services; and (2) the application of emotion recognition technology in intelligent cockpits to emergency rescue scenarios, providing emotional support and psychological comfort for trapped or injured individuals [78]. Through such multi-faceted integration and application, emotion recognition technology is driving intelligent cockpits toward greater personalization, safety, and humanization, bringing transformative change and value enhancement to the automotive industry. However, effectively collecting and processing large amounts of personal data while ensuring privacy and data security, and simultaneously achieving more efficient and precise human-machine interaction, remain important directions for future research. Furthermore, emotion perception technologies developed for intelligent cockpits can be extended to other fields: (1) in industrial applications, improving resilience to interference by managing operators' cognitive load; (2) in agricultural machinery, addressing adaptation to outdoor lighting and monitoring operators' fatigue and stress; and (3) in intelligent medical systems, developing surgical-robot interfaces that adapt to changes in the operator's emotional state.

    Implementing emotion perception technologies in intelligent cockpits involves key stages such as data collection, emotion recognition, emotion regulation decision-making, and feedback execution. Future research should focus on the following aspects:

    (1) Data Collection: Robust and reproducible experimental datasets are crucial, given that driving is a complex and prolonged task. The intensity, duration, and naturalness of induced emotional states are critical to dataset quality and significantly impact driving performance. Key requirements for data collection include: high-quality measurement of essential cues related to human emotional experience and expression (e.g., facial, vocal, physiological, and behavioral indicators) with low intrusiveness, prioritizing indirect and non-invasive metrics; accurate annotation of drivers' emotional states, using real-world traffic scenarios and realistic durations to collect authentic driving data; consideration of participants' characteristics, such as age, gender, driving experience, and driving style; and attention to in-cabin interactions among occupants to capture comprehensive emotional cues (see the record schema sketched after this list).

    (2) Emotion Recognition: Future emotion recognition systems in driving contexts should evolve toward multimodal, multi-scenario, and personalized applications. The comprehensiveness and accuracy of emotion recognition can be improved by integrating physiological signals, behavioral data, subjective scales, and contextual information about the vehicle and environment (a minimal fusion sketch follows this list). Research must also address the continuity of emotions by modeling dimensions such as valence, arousal, and dominance. Cognitive appraisal theory and insights from neuroscience can aid in understanding the mechanisms of emotion generation and in optimizing recognition models. Furthermore, cultural differences in emotional expression should be addressed to develop more universal and adaptable emotion detection algorithms.

    (3) Emotion Regulation: Research on the specific effects of visual (e.g., color, shapes, expressions) [79] and tactile (e.g., activation timing, frequency, intensity) [80] parameters on emotions can provide theoretical support for HMI design. Personalized regulation strategies should incorporate drivers' cultural backgrounds, individual traits, and preferences. At the same time, privacy protection will become a critical issue, with future challenges focusing on balancing effective emotion recognition and regulation with safeguarding driver privacy in intelligent cockpits.
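    To make the data-collection requirements in item (1) concrete, the following is an illustrative record schema for an annotated driving-emotion dataset. Every field name, channel, and value here is a hypothetical placeholder rather than a standard from the surveyed literature.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class ParticipantProfile:
    age: int
    gender: str
    driving_experience_years: float
    driving_style: str                    # e.g., "defensive", "aggressive"

@dataclass
class EmotionAnnotation:
    timestamp_s: float
    label: str                            # e.g., "anger", "neutral"
    valence: float                        # -1.0 .. 1.0
    arousal: float                        # -1.0 .. 1.0
    source: str                           # "self-report" or an external rater ID

@dataclass
class DrivingEmotionRecord:
    participant: ParticipantProfile
    scenario: str                         # e.g., "urban congestion, 45 min"
    facial_video_path: Optional[str]      # non-invasive channels are preferred
    speech_audio_path: Optional[str]
    physio_channels: List[str] = field(default_factory=lambda: ["HRV", "EDA"])
    behavior_channels: List[str] = field(default_factory=lambda: ["steering", "pedals"])
    annotations: List[EmotionAnnotation] = field(default_factory=list)

# Example record with one self-reported anger annotation at t = 120 s.
rec = DrivingEmotionRecord(
    participant=ParticipantProfile(34, "female", 10.0, "defensive"),
    scenario="urban congestion, 45 min",
    facial_video_path="p01_face.mp4",
    speech_audio_path="p01_audio.wav",
)
rec.annotations.append(EmotionAnnotation(120.0, "anger", -0.6, 0.7, "self-report"))
```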
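    For item (2), the sketch below outlines one common way to combine modalities: a late-fusion network that encodes each modality separately and regresses the concatenated embeddings onto continuous valence, arousal, and dominance scores. The feature dimensions and architecture are illustrative assumptions, not a model proposed in the surveyed literature; PyTorch is assumed as the framework.

```python
import torch
import torch.nn as nn

class LateFusionVAD(nn.Module):
    """Illustrative late-fusion regressor for valence/arousal/dominance."""
    def __init__(self, dims=(32, 16, 8), hidden=64):
        super().__init__()
        # One encoder per modality (physiological, behavioral, contextual).
        self.encoders = nn.ModuleList(
            [nn.Sequential(nn.Linear(d, hidden), nn.ReLU()) for d in dims]
        )
        # Concatenated embeddings -> (valence, arousal, dominance).
        self.head = nn.Linear(hidden * len(dims), 3)

    def forward(self, feats):
        z = torch.cat([enc(x) for enc, x in zip(self.encoders, feats)], dim=-1)
        return self.head(z)

model = LateFusionVAD()
physio = torch.randn(4, 32)    # e.g., HRV / EDA features per window
behav = torch.randn(4, 16)     # e.g., steering / pedal statistics
context = torch.randn(4, 8)    # e.g., traffic density, time pressure
print(model([physio, behav, context]).shape)  # torch.Size([4, 3])
```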

    Assistance technologies within intelligent cockpits primarily refer to advanced driver-assistance systems (ADAS) enhanced by emotion perception. These systems provide proactive behavioral support for driving operations through the integration of multimodal technologies, aiming to enhance driving safety and the human-machine interaction experience. This integration has led to four key advancements: (1) Integration with artificial intelligence (AI) empowers emotion perception systems with adaptive learning and optimization capabilities, enabling continuous refinement of emotion recognition algorithms and interaction strategies based on user feedback, thereby enhancing system intelligence and personalization. (2) Incorporating Internet of Things (IoT) technologies enables seamless information exchange between vehicles and their environments, enriching the contextual data available for emotion recognition and enhancing the precision and intelligence of interactions. (3) Virtual reality (VR) technology [81] significantly enhances user experience through immersive interactions, enabling more natural and efficient emotional communication within virtual environments. It is particularly suited to providing personalized entertainment content to passengers and to driving-simulation training in complex traffic scenarios, thereby facilitating emotion management. (4) Augmented reality (AR) technology [82] integrates with in-vehicle head-up displays (HUDs) to present information from vehicle sensors and planning systems, including data on surrounding objects, distances, and speeds. By delivering visual cues—such as real-time navigation overlays and dynamic color feedback on the driver's emotional state—AR helps drivers build a mental model of the autonomous driving system, optimizing decisions and improving road safety. The collaboration among these technologies underscores the pivotal role of emotion perception systems in elevating the driving experience and establishes a foundation for their broader application within intelligent transportation ecosystems. Through continuous innovation, emotion perception technologies are poised to drive advancements in road safety and human-vehicle collaboration.

    While current research primarily focuses on general driving scenarios, the underlying emotion-aware ADAS framework could be extended to support diverse user needs, including accessibility adaptations for drivers with disabilities through real-time physiological and behavioral monitoring.

    This study analyzed 698 publications on intelligent cockpit emotion perception technologies from 2010 to 2024, providing insights into the latest development trends and research hotspots. It traced the core research themes and disciplinary knowledge structures of the past 15 years and highlighted the contributions of key authors in the domain. Furthermore, future research trends were outlined and categorized into four dimensions: the impact on HMI, application development, specific implementations, and assistance technologies.

    The study's findings are as follows:

    Development Trends: Since 2010, emotion perception technologies have shown a continuous growth trajectory, entering a phase of rapid development after 2019. Research has expanded from modeling driving anger and risky behavior to incorporating deep learning and multimodal data fusion for emotion recognition and regulation. Recent efforts have focused on integrating emotion perception technologies into intelligent cockpits, particularly for driving monitoring, active safety functions, and personalized HMI.

    Current Research Hotspots: Key areas of focus include optimizing multimodal emotion recognition technologies and developing real-time interaction systems. Core technologies such as convolutional neural networks, machine learning, and physiological signal processing have significantly improved the accuracy and robustness of recognition systems. Current research emphasizes the impact of drivers' emotional states on driving behavior, leveraging emotion perception technologies to enhance driving safety and comfort.

    Future Research Directions: The deep integration of emotion perception technologies with autonomous driving and vehicle-to-network (V2N) systems represents a major future trend, aiming to deliver more intelligent and human-centric driving assistance services. In application development, emotion perception technologies hold significant potential in emergency rescue and health management. From a technological perspective, future efforts will focus on enhancing the diversity and reproducibility of data collection, personalizing emotion regulation strategies, and addressing privacy protection challenges. Assistance technologies, including the integration of artificial intelligence, the Internet of Things, virtual reality, and augmented reality, are expected to further enhance the performance and applicability of emotion perception systems.

    Despite its contributions, this study has certain limitations. First, it relied solely on the Web of Science Core Collection database, which, although one of the most authoritative sources, covers only part of the research on intelligent cockpit emotion perception. Second, due to the data extraction timeframe, literature predating 2010 could not be fully included. Future studies should expand the database scope to platforms such as Scopus and CNKI, enabling deeper exploration of publications over longer periods. Lastly, bibliometric analysis primarily reflects trends of interest within the academic community. This paper strengthens the qualitative analysis of technological evolution and application scenarios on top of the traditional bibliometric framework, combining quantitative data with in-depth interpretation of the relevant literature through multi-dimensional cross-validation of technology and demand. Nevertheless, future research still needs to address the inherent limitations of quantitative statistical analysis.

    The authors declare they have not used Artificial Intelligence (AI) tools in the creation of this article.

    This research is supported by the Industry-University Collaborative Education Project of the Ministry of Education of China (Grant No. 220606545201713), and the "Human Factors and Ergonomics" Education Project by the Department of Higher Education, Ministry of Education of China (Grant No. 220705329115056).

    The authors declare no conflicts of interest.

    Conceptualization, L.S. and B.L.; Methodology, H.Y.; Software, L.S. and X.F.; Validation, L.S. and B.L.; Formal analysis, H.Y. and W.Z.; Investigation, L.S. and W.Z.; Resources, B.L. and H.Y.; Data curation, L.S.; Writing—Original draft preparation, L.S.; Writing—Review and editing, B.L.; Visualization, H.Y., X.F., and W.Z.; Supervision, B.L.; Project administration, B.L.; Funding acquisition, B.L. All authors have read and agreed to the published version of the manuscript.


