
    This study compares the adoption of cartesian and complexity science frameworks in the quantitative risk management of Zimbabwean banks. It pioneers a comparison of traditional and modern theoretical perspectives on financial risk management in the setting of Zimbabwe, a developing country, in contrast to extant studies that focus on developed economies. First, the study advances the theory of cartesianism emanating from classical science and finance theories such as the Efficient Market Hypothesis (Fama, 1970), Modern Portfolio Theory (Markowitz, 1952), the Capital Asset Pricing Model (Sharpe, 1964), Option Pricing Theory (Black and Scholes, 1973; Merton, 1974) and Arbitrage Pricing Theory (Ross, 1976). Second, it advances complexity economics theory following Knight (1921), Hayek (1964), Diebold et al. (2010), Ganegoda and Evans (2014), Flage and Aven (2015), Haldane (2016), Aven (2017), Mazri (2017) and Tepetepe et al. (2021). The objective of this paper is to test and compare the adoption of these two theories using a survey. The paper answers the following research questions: (a) What are the theoretical perspectives on quantitative financial risk management? (b) What are static and complexity science frameworks? (c) What are the similarities and differences in the adoption of cartesian and complexity science theoretical frameworks in the financial risk management of Zimbabwean banks? (d) Is there any point of intersection and complementarity between these two theories? The rest of the paper is structured as follows. Section 2 reviews the literature on cartesian, interpretivist and complexity science theories. Section 3 documents the survey research methodology. Section 4 provides the comparative analysis of survey results, discussion, and recommendations. Section 5 concludes and identifies areas of further research.

    This section reviews three theoretical frameworks, namely cartesian, interpretivist and complexity science, that influence the definition of risk and the practice and policy formulation of quantitative risk management in banks.

    Cartesian or traditional financial risk management is founded on the bedrock of classical science and neoclassical financial economics. It rests on assumptions of closed systems, stability, rational expectations, ordered states, Gaussian normality, independence, homogeneity, optimisation, equilibrium, and the absence of information asymmetries (Muth, 1961; Arthur, 2013; Ganegoda and Evans, 2014; Hull, 2018; Plosser and Santos, 2018). Under this view risk is the magnitude of deviation from a target, measurable with objective probabilities (Knight, 1921; Keynes, 1921; Gebizlioglu and Dhaene, 2009; Sweeting, 2011; McNeil et al., 2015; Pichler, 2017). It is therefore held to be possible to manage risks with linear and reductionist "best practices" based on the laws of probability and available historical data (Ganegoda and Evans, 2014; Aven, 2017).

    According to cartesian theory the whole is the sum of its parts (Arthur, 2021). To understand a closed system, one therefore decomposes it into components, studies each in isolation, and aggregates the components back into the whole. Cartesianism travels under nomenclatures such as positivism, post-positivism, conventional economics, reductionism, conventional science, and scientific objectivism (Arthur, 2021). Its proponents believe that risks are factual and measurable, and hence uphold the claim that standard solutions exist for risk management problems (Dowd and Hutchinson, 2014). To them quantitative risk management is essentially founded on mechanistic Newtonian science and classical financial theories such as the Efficient Market Hypothesis (Fama, 1970), Modern Portfolio Theory (Markowitz, 1952), the Capital Asset Pricing Model (Sharpe, 1964), Option Pricing Theory (Black and Scholes, 1973; Merton, 1974) and Arbitrage Pricing Theory (Ross, 1976).

    This cartesian belief explains the proliferation of static frameworks, books and financial regulations that try to offer standard solutions and practices for bank risk management (Sweeting, 2011; McNeil et al., 2015). Static frameworks focus on specified resilience modeling and silo risk analysis (Sweeting, 2011; Morin, 2014; Cavallo, 2014; McNeil et al., 2015) and are generally built on the principles of establishing context, risk assessment, risk identification, risk analysis, evaluation, monitoring, communication, and consultation (Sweeting, 2011; Cavallo, 2014). Three types of static frameworks exist (Sweeting, 2011). First, advisory frameworks are neither compulsory nor legislative but provide generic guidelines on capital management, for example Risk Analysis and Management for Projects (RAMP), the COSO ERM Integrated Framework, the Treasury Board of Canada Integrated Risk Management Framework, and the Orange Book. Second, proprietary frameworks are provided by credit rating agencies such as Fitch, Standard & Poor's and Moody's for an institution's debtholders and investors. Third, mandatory frameworks are compulsory and regulatory, for example Basel III in banking and Solvency II in insurance.

    Static frameworks promote linear, fact-based risk management founded principally on the doctrine of Gaussian-distribution Value at Risk modeling (Dowd and Hutchison, 2014). By applying these frameworks, present cartesian risk management in banks focuses predominantly on Knightian risks, which are less severe in magnitude. In the real financial world, however, standard solutions do not exist because risks are sometimes emergent and unknown (Dowd and Hutchison, 2014). The elegant modeling methods offered by static frameworks are therefore not applicable to complex systems, where standard solutions do not exist (Sornette, 2003; Helbing, 2013; Cavallo, 2014; Bolton et al., 2020). Arthur (2021) criticises cartesian risk management for producing elegant but unrealistic financial engineering models based on deductive logic that ignores other uncertainties. Several studies highlight that cartesian risk management also suffers from the illusion of risk control, treats risk preferences as exogenous, and omits ethics (Taleb, 2007; Allan et al., 2013; Motet and Bieder, 2017; Righi et al., 2019). Dowd et al. (2011) argue that the cartesian paradigm failed practically during the 2008 great financial crisis, when banks in Europe and the United States that had adopted the flimsy Basel II/III Value at Risk doctrine of quantitative finance were hit hardest. Farmer et al. (2012) and Arthur (2021) highlight that the cartesian paradigm offers no solution to current challenges that violate rational expectations theory, or to those emanating from emerging operational risks such as climate, technological, geopolitical, societal, health and economic risks. Some authors therefore demand a new paradigm to complement standard finance and risk management (Sornette, 2003; Blume and Durlauf, 2006; Helbing, 2013; Cavallo, 2014; Battiston et al., 2016a; Arthur, 2021), arguing that financial risk management will be effective only if static and dynamic frameworks are integrated. Stronger critics of cartesianism call for its radical elimination in favour of either laissez-faire banking or complexity science (Haldane, 2009; Dowd et al., 2011; Farmer et al., 2012; Dowd and Hutchison, 2014), arguing that a laissez-faire banking system, through competition, promotes higher capital ratios, better risk management and financial stability.

    To establish the state of cartesian framework adoption, this paper focuses on Basel II/III rather than other frameworks because it is mandatory for Zimbabwean banks and provides straightforward quantitative risk management methods under Pillar 1. Basel II/III offers risk-weighted asset modeling approaches of two types. First are the simple standardised approaches, where the key parameters are fixed and given by the supervisor (Zimbabwe Basel II Technical Guidelines, 2011). These include the modified standardised approach (MSA) for credit risk, the alternative standardised approach (ASA) for operational risk and the standardised approach (SA) for market risk; the simpler approaches are designed for smaller banks with simpler financial models. Basel II/III currently proposes replacing Value at Risk with expected shortfall for market risk (BCBS, 2017). Second are the internal modeling approaches, where banks are conditionally allowed to determine their own parameters. There are two internal ratings-based approaches for credit risk modeling: under the foundation internal ratings-based approach (FIRB) banks estimate probabilities of default for each borrower but receive the other parameters from the supervisor, while under the advanced internal ratings-based approach (AIRB) banks also estimate the other parameters, such as loss given default and exposure at default, on their own. The asset correlation is provided by the supervisor in all cases. Operational risk is calculated with the advanced measurement approach (AMA) and market risk is forecast with the internal models approach (IMA); the latest method for operational risk modeling is the standardised measurement approach (SMA). Adoption of internal modeling is theoretically supposed to make regulatory and economic capital converge, reducing regulatory arbitrage and promoting financial stability (BCBS, 2006; Zimbabwe Basel II Technical Guidelines, 2011).
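
    To make Pillar 1 concrete, the sketch below implements the Basel II internal ratings-based risk-weight function for corporate exposures as published by the BCBS (2006): supervisory formulas turn a bank-estimated PD (FIRB), or PD, LGD and maturity (AIRB), into a capital requirement. The loan inputs are hypothetical illustration values, not figures from any surveyed bank.

    ```python
    from math import exp, log, sqrt
    from scipy.stats import norm

    def irb_capital_requirement(pd_, lgd, maturity):
        """Basel II IRB capital requirement K for corporate exposures (BCBS, 2006)."""
        # Supervisory asset correlation: decreases with PD and is never bank-estimated.
        w = (1 - exp(-50 * pd_)) / (1 - exp(-50))
        r = 0.12 * w + 0.24 * (1 - w)
        # Supervisory maturity adjustment.
        b = (0.11852 - 0.05478 * log(pd_)) ** 2
        # 99.9th-percentile conditional expected loss, net of expected loss.
        k = lgd * (norm.cdf(sqrt(1 / (1 - r)) * norm.ppf(pd_)
                            + sqrt(r / (1 - r)) * norm.ppf(0.999)) - pd_)
        return k * (1 + (maturity - 2.5) * b) / (1 - 1.5 * b)

    # Hypothetical loan: 2% PD, 45% LGD, 2.5-year maturity, $1m exposure at default.
    k = irb_capital_requirement(0.02, 0.45, 2.5)
    print(f"K = {k:.4f}, RWA = {12.5 * k * 1_000_000:,.0f}")
    ```

    Note how the correlation parameter is fixed by the supervisory formula, matching the observation above that asset correlation is never estimated by the bank itself.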

    Interpretivists view risk as the subjective perception of individuals, grounded in personal experience, social beliefs, subjective notions, and culture. Risk is thus qualitative and can be quantified by subjective or Bayesian probabilities (Bayes and Price, 1763; Ramsey, 1926; Shackle, 1949; Savage, 1954; Aven, 2017; Righi et al., 2019). Interpretivists reject notions of objectivity, certainty in knowledge, rational expectations, subject-centred reasoning, and the referential use of language (Beck, 1992; Cavallo, 2014; Morin, 2014). Policy makers often prefer qualitative risk management because it is more accessible, easily understood, and requires no mathematical competence. Interpretivism explains the proliferation of qualitative frameworks and books on bank risk management (Sweeting, 2011). Table 1 summarises the differences between cartesian, interpretivist and complexity science risk management.

    Table 1.  Cartesian, interpretivist and complexity science risk management.

    | Dimension of contrast | Traditional | Modern | Neo-modernism |
    |---|---|---|---|
    | Philosophy | Cartesianism/reductionism | Interpretivism | Complexity science/holism |
    | Epistemology | Risk is an event analysed primarily at individual levels, in silos or elements | Risk is a social and subjective phenomenon | Risk is a holistic phenomenon analysed as a whole |
    | Expectations | Rational expectations | Irrational expectations | Irrational, adaptive, and evolutionary expectations |
    | Logic | Deduction | Induction | Abduction (induction and deduction) |
    | Risk preferences | Given and unexplained | Preferences, perceptions and responses are learned | Learned and adaptive |
    | Risk quantification | Objectively quantifiable | Subjectively quantifiable | Both objectively and subjectively quantifiable |
    | System | Stable and closed | Any type of system | Dynamic and open |
    | Risk responses | Mitigate negative events | Probe, sense and respond qualitatively to mitigate negative events | Probe, sense and respond; build a safe operating space |
    | Ethics | Omitted | Integral to risk management | Integral to risk management |
    | Modeling tools | Quantitative (basis is the Central Limit Theorem) | Qualitative | Complexity science/algorithmic modeling |


    Table 1 summarises the similarities and differences between cartesian, interpretivist and complexity science financial risk management. The descriptions are drawn from an analysis of Cavallo (2014), Righi et al. (2019) and Arthur (2021).

    Complexity science is part of neo-modernism, which provides an opportunity to understand non-linear, evolutionary, dynamic, and self-organising systems (Zimmerman, 1999). Neo-modernism combines cartesian and interpretivist views of risk management (Althaus, 2005; Righi et al., 2019). Risk is hence an integration of definitions from multiple disciplines such as mathematics, science, economics, finance, engineering, medicine, arts, humanities, philosophy, and information technology (Althaus, 2005). Complexity science is an interdisciplinary approach providing a pluralistic view of the world (Newman, 2011; Mitchell, 2009; Arthur, 2013; Farmer, 2015; Haldane, 2016; Li et al., 2018; Li and Evans, 2019; Tepetepe et al., 2021). It is an epistemological position, traceable to Aristotle, in which the whole is not the sum of its parts: decomposing a complex system into individual components destroys the system's properties (Bruno et al., 2016).

    In physics, biology, chemistry, medicine, and the engineering sciences, complexity science is used to examine complex adaptive systems such as condensed matter, natural ecosystems, immune systems, the internet of things, nervous systems, and ant colonies. In economics, complexity science is rooted in Adam Smith's laissez faire and Pareto's laws (Anderson, 1972; Arthur, 2013) and is used to examine economies, financial systems, and organisations. In most social sciences the approach is used to analyse culture, organisational behaviour, crowds, society, and population dynamics. Complexity science is at once a multidisciplinary approach, framework, theory, and philosophy (Mitleton-Kelly, 2003). Researchers term it the "new economic thinking", the "new scientific way", a "new discipline", a "new world view" and "complexity economics". It is an abductive approach providing an alternative and a challenge to classical economics and science (Senge, 1990; Mitleton-Kelly, 2003; Haldane, 2009; Farmer, 2012; Morin, 2014; Cavallo, 2014; Allan et al., 2013; Bolton et al., 2020).

    There is no universal definition of a complex system. Holmes (2021) defines complex systems by two characteristics: (a) heterogeneity and (b) non-linearity. Heterogeneity means that many interconnected agents interact in a complex system. Non-linearities lead to multiple equilibria, such that small changes at the micro level amplify into critical transitions or tipping points at the macro level. A complex adaptive system is an open, living, and dynamic system characterised by heterogeneous agents, no central control and simple rules of operation that give rise to collective behaviour, sophisticated information processing, and adaptation via learning or evolution (Mitleton-Kelly, 2003; Mitchell, 2009; Baranger, 2010; Newman, 2011). According to Arthur (2013), complexity science is the study of the consequences of interactions: the patterns, structures, or phenomena that emerge from interactions among elements, whether particles, cells, dipoles, agents, or firms.

    Fundamentally, economics answers the problems of allocation and formation (Arthur, 2013). Classical financial modeling, whether it uses Newtonian science or traditional economics theories, focuses on the allocation problem (Arthur, 2021). These traditional models work poorly on questions of formation, exploration, adaptation, and qualitative change, which belong to complex systems (Tabb, 1999), because problems of formation are hard to predict and in many cases unpredictable, owing to emergence, chaos, and creative destruction (Schumpeter, 1912; Arthur, 2021). According to Allan et al. (2013), complexity science is (a) a discipline of seeing wholes, (b) a framework for seeing interrelationships, patterns, and networks, and (c) a methodology for seeing both the forest and the trees. Recent empirical studies affirm that financial risks are complex adaptive systems (Allan et al., 2013; Farmer et al., 2012; Corrigan et al., 2013; Li and Evans, 2019). However, most of these studies concentrate on risks in the banking and insurance companies of developed economies such as the United States, Europe, and Australia. This paper concentrates on the unique setting of a developing country.

    Complexity economists assume that financial risks are complex systems characterised by heterogeneous agents, self-organisation, emergence, adaptation, non-linearity, far-from-equilibrium dynamics, a space of possibilities, co-evolution, historicity and time, path dependence and the creation of new order (Mitleton-Kelly, 2003; Gros, 2011; Corrigan et al., 2013; Turner and Baker, 2019). Heterogeneous agents in the financial system take the form of multiple banks, regulators, insurance companies, asset managers, stock markets, pension funds, and employees (Li and Evans, 2019). These agents learn, evolve, and adapt to the changing beliefs, norms, culture, perceptions, history, and physiology of the system, leading to feedback processes such as tipping points, self-fulfilling processes, and the creation of new order.

    Broadly speaking, financial risks emanate from, exist in, and evolve within complex systems. First, many researchers argue that the economic systems out of which risks emanate are complex systems (Hommes, 2001; Blume and Durlauf, 2006; Haldane and May, 2011; Haldane, 2012; Farmer, 2012; Arthur, 2013; Battiston et al., 2016a; Li, 2017). Second, many studies reveal that institutions such as banks and insurance companies are both economic and social organisations classified as complex adaptive systems (Allan et al., 2013; Mitchell, 2009; Kremer, 2012; Battiston et al., 2016b). Third, risks in banks are managed by human beings, who are influenced by the cultural perceptions of organisations and individuals; organisations and culture are themselves regarded as complex adaptive systems (Mitleton-Kelly, 2003; Ingram and Thompson, 2011; Thompson, 2018).

    Financial risks exist in multiple states. This refers to the space of possibilities: unpredictability arises from intertwined and non-linear relationships. Financial risks are also multidimensional; the classification of risks into four realms, introduced below, is testament to this, and all the dimensions of financial risk interact with and influence each other. Non-linearity means that small changes emanating from interrelationships, networks, interconnections, and feedback loops among heterogeneous agents have larger effects on the system. Financial risks are non-linear, interrelated, networked and systemic, as shown by the difficulty of delineating boundaries between Basel III market, credit, and operational risks (Corrigan et al., 2013; Battiston et al., 2016a; BCBS, 2017).

    A key assumption is that financial risks display emergent and evolutionary behaviour. Emergence is the "unpredictable surprise" arising from the random behaviour and interaction of agents. Evolution implies that there are always novel risks whose impact is highly catastrophic, leading to unforeseen, new, and unexpected behaviours whose causes and effects cannot be understood by analysing the system's components (Pagel et al., 2007; Aven, 2017). Historicity and time imply path dependence and a unique history for each risk and each bank: agents take context-specific decisions that violate best practice and the "one size fits all" treatments proposed by cartesian frameworks such as Basel III (Li, 2017). Path dependence is also called sensitivity to initial conditions; a small difference in starting values can produce significantly different trajectories. New risks are always emerging and disrupting existing patterns, and banks have structure and hierarchy but few leverage points, so risks self-organise. Given these properties, complexity science provides a robust framework for understanding financial risk management in a hyperconnected world (Farmer, 2012; Allan et al., 2013; Battiston et al., 2016b). According to Gros (2011), complexity science is the best method of integrating the language of uncertainty with positivist and interpretivist views of risk.

    Having outlined the complex adaptive nature of financial risks, this section proposes a holistic, dynamic four-realm risk classification framework. The proposed framework is based on the theory of risk and uncertainty of Knight (1921), Hayek (1964), Diebold et al. (2010), Ganegoda and Evans (2014), Flage and Aven (2015), Aven (2017), Mazri (2017) and Tepetepe et al. (2021). Its foundation is Knight's (1921) distinction between risk and uncertainty: risk is measurable uncertainty, where the probabilities and future outcomes are known, whereas Knightian uncertainty is immeasurable risk, where there is limited or no knowledge about the events and one depends on intuition, judgement, and experience. In support, Ellsberg (1961) classifies random variables into known (certain) and unknown (uncertain) probability distributions. The four realms are Knightian risk, Knightian uncertainty, ambiguous uncertainty and ontological uncertainty. These realms are interconnected and overlaid upon each other (Ganegoda and Evans, 2014). Quantitative risk managers and executives can use this framework to classify risks for strategic modeling and decision-making purposes.

    Knightian risk is the state of familiar or fully emerged risks (Mazri, 2017). Risks in this realm are well known to risk managers and are quantifiable with cartesian theoretical frameworks, because there is adequate knowledge of probability distributions and future outcomes. Traditional statistical modeling and Newtonian mechanistic approaches are suitable, given the existence of adequate theory, models, and data (Ganegoda and Evans, 2014; Fadina and Schmidt, 2019). Gaussian distributions, through random walk or martingale theories, are normally used to estimate these risks (Hull, 2018). According to Mikes (2005), Basel II/III capital modeling themes such as silo and integrated risk management draw heavily on the concept of Knightian risk. Standard solutions and best practice are used to deal with this realm (Snowden and Boone, 2007).
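
    As a minimal sketch of the kind of specified-resilience calculation this realm permits, the snippet below computes parametric Value at Risk and expected shortfall under an assumed Gaussian profit-and-loss distribution; the portfolio moments are hypothetical. The expected shortfall line reflects the BCBS (2017) shift away from pure VaR noted earlier.

    ```python
    from scipy.stats import norm

    def gaussian_var_es(mu, sigma, alpha=0.99):
        """One-period VaR and expected shortfall for P&L ~ N(mu, sigma^2)."""
        z = norm.ppf(alpha)
        var = -mu + sigma * z                          # loss exceeded with probability 1 - alpha
        es = -mu + sigma * norm.pdf(z) / (1 - alpha)   # mean loss beyond the VaR threshold
        return var, es

    # Hypothetical daily P&L: mean $10k, standard deviation $250k.
    var99, es99 = gaussian_var_es(10_000, 250_000)
    print(f"99% VaR = {var99:,.0f}; 99% ES = {es99:,.0f}")
    ```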

    Knightian uncertainty is the realm of epistemic uncertainty, where objective probabilities are difficult or impossible to compute. Risk managers have knowledge of the possible future states but cannot measure the underlying probabilities with accuracy because of insufficient data, the lack of well-developed theory and models, wild randomness, and the presence of highly unique events (Mandelbrot and Taleb, 2007; Ganegoda and Evans, 2014; Mousavi and Gigerenzer, 2014; Beissner and Riedel, 2016; Burzoni et al., 2021). Mazri (2017) calls Knightian uncertainty the realm of scientifically controversial risks. To deal with this realm, quantitative risk managers should employ novel practices such as investing in knowledge, the fractal market hypothesis, scenario analysis and stress testing, and complexity science (Mandelbrot and Hudson, 2004; Allan et al., 2013; Ganegoda and Evans, 2014; Haldane, 2016).

    Ambiguous uncertainty refers to risks that emanate from the psychological reactions of the human participants who operate financial systems. Ganegoda and Evans (2014) argue that in environments governed by human beings, possible future states are vaguely defined. This ambiguity emanates from free will, irrational behaviour, and adaptive and cognitive biases. Such psychological behaviour, termed "animal spirits", is manifested in selfishness, greed, and inhumane acts that lead to incomplete information and unpredictability (Keynes, 1921; Shefrin and Statman, 1985; De Bondt and Thaler, 1985; Shefrin, 2010). Contrary to the Efficient Market Hypothesis, ambiguity implies that financial markets are imperfect (Burzoni et al., 2017; Fadina and Schmidt, 2019). The formation of asset bubbles, liquidity market crashes, scams, depressions, and the long-term mispricing of assets that culminate in financial crises is thus attributed to human cognitive biases, or neuroeconomics (Sornette and Quillon, 2012). Quantitative risk managers can deal with this realm by applying behavioural economics and finance, heuristics, nudging and game theory (Ganegoda and Evans, 2014; Haldane, 2016; Thaler and Sunstein, 2008).

    Ontological uncertainty is the realm of unknown-unknowns or hidden risks (Taleb, 2007; Ganegoda and Evans, 2014; Mazri, 2017). Risks in this realm are still unknown or very novel because there is an absence of knowledge, data, probability distributions and future outcomes; in some cases the future outcomes are undefined (Bronk, 2011). To deal with this realm, quantitative risk managers should humble themselves, acknowledge their limitations and admit ignorance (Ganegoda and Evans, 2014). They should sense, respond, and act with emergent modeling practice and pattern-based management (Snowden, 2003; Turner and Baker, 2019), utilising a balanced mix of precaution, regular stress testing, scenario analysis, crisis management, early warning models, investment in research and knowledge, and complexity science (Allan et al., 2013; Ganegoda and Evans, 2014; Aven, 2015; Haldane, 2017).

    This section briefly reviews complex systems modeling methods from a practitioner and strategic quantitative risk management perspective; a complete and detailed review can be found in Allan et al. (2013). Modeling techniques for complex systems are quite different from those used in cartesianism (Farmer et al., 2012). Quantitative risk managers can employ heuristics, behavioural economics, concept mapping, the theory of critical phenomena, data mining, system dynamics modeling, chaos theory, fuzzy set theory, possibility theory, neural networks, Bayesian networks, agent-based computation, network theory, ant colony optimisation, cellular automata, and genetic algorithms to measure and manage financial risks in banks.

    Heuristics are accurate and robust rules of thumb used to solve problems, make inferences, and learn in complex systems (Gigerenzer, 2007; Shefrin, 2010; Haldane, 2015). Using experience, gut feeling, educated guesses, intuitive judgement, stereotyping, profiling and common sense, animals and humans solve complex problems such as finding a nesting position or a mate, or making strategic decisions (Gigerenzer et al., 2011; Katsikopoulos, 2011; Haldane, 2015). Haldane (2012), for example, argues that dogs catch frisbees using simpler rules of thumb than aerodynamic, complicated PhD formulae. In support, empirical studies by Mousavi and Gigerenzer (2014) and Aikman et al. (2014) indicate that simple heuristics outperform complex models in uncertain environments with little data and small sample sizes. Specifically, Aikman et al. (2014) compare fast-and-frugal trees for stress testing and Basel II simple versus advanced credit risk modeling methods. They find that (a) simple methods outperform complex modeling methods for calculating capital requirements where data are limited and distributions fat-tailed, (b) simple indicators outperformed complex metrics in predicting bank failure during the great financial crisis, and (c) fast-and-frugal decision trees perform better than information-intensive regression techniques in bank failure prediction. Haldane (2017) thus proposes replacing Basel III with heuristic capital regulation based on the doctrine that "less information is more", contrary to the traditional quantitative finance doctrine that "more information is better".
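
    The structure of a fast-and-frugal tree is simple enough to show in a few lines. The sketch below is purely illustrative of the Aikman et al. (2014) idea, in which each node asks one question and can classify immediately; the indicators and cut-offs are hypothetical, not their calibrated values.

    ```python
    def classify_bank(bank):
        """Fast-and-frugal tree for flagging bank vulnerability.

        One cue per node, immediate exits, no weighting or regression:
        "less information is more". Thresholds are hypothetical.
        """
        if bank["leverage_ratio"] < 0.04:            # thin capital: flag at once
            return "vulnerable"
        if bank["wholesale_funding_share"] > 0.50:   # fragile funding structure
            return "vulnerable"
        if bank["loan_growth"] > 0.20:               # rapid credit expansion
            return "monitor"
        return "safe"

    print(classify_bank({"leverage_ratio": 0.03,
                         "wholesale_funding_share": 0.20,
                         "loan_growth": 0.05}))      # -> "vulnerable"
    ```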

    Behavioural economics and finance provide frameworks for modeling decision making influenced by "animal spirits", heterogeneous expectations and the psychological behaviour of market agents (Farmer et al., 2012; Battiston et al., 2016b; Virigineni and Rao, 2017). To behavioural scientists, classical theories (e.g., the Efficient Market Hypothesis) do not represent real-world financial markets because they exclude the irrational behaviour, adaptation, learning and dynamism of human agents (Lo, 2011). The adaptive market hypothesis of Hommes (2011) and Lo (2012) offers a theoretical framework for behavioural modeling. The theory postulates that financial markets behave like biological systems: individuals in financial markets are self-interested, make mistakes, learn, and adapt (Farmer and Lo, 1999; Farmer, 2002; Lo, 2004; Lo, 2012); competition among banks drives adaptation and innovation; and evolutionary principles such as competition, natural selection, adaptation, reproduction, and interaction govern financial markets. Quantitative risk managers who pay attention to heuristic and prospect theory biases such as overconfidence, anchoring, herding, framing, availability, representativeness, hindsight bias, mental accounting and loss aversion therefore produce better predictions of risk and capital (Shefrin, 2010; Thaler, 2015; Barberis, 2017).

    Concept mapping is the qualitative, non-numerical representation of situations in the form of mental, naïve, or folk theories that explore different aspects of a matter. Concept maps are imaged, dynamic, outcome-based simulations used in everyday life to understand the world; they enable the visualisation of complex, non-linear relationships between ideas (Allan et al., 2013), with abstract or concrete concepts denoted as nodes and their interrelationships as links. System dynamics modeling is the causal prediction of non-linear relationships among underlying structures of flows, delays, and feedback loops (Forrester, 1969; Bharathy and McShane, 2014). It is based on laws of motion and change, termed calculus in mathematics and mechanics in physics. Once a model is established, the system can simulate future scenarios using deterministic rules as well as random values.
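
    A toy stock-flow simulation illustrates the system dynamics idea: stocks (loans, capital), flows (lending, losses) and a feedback loop in which thin capital throttles new credit. All coefficients are hypothetical and chosen only to make the feedback visible.

    ```python
    import numpy as np

    def simulate_credit_cycle(steps=200, capital=10.0, loans=100.0):
        """Deterministic-plus-cyclical system dynamics loop (illustrative only)."""
        path = []
        for t in range(steps):
            adequacy = capital / (0.08 * loans)           # versus an 8% requirement
            lending = 0.05 * loans * min(1.0, adequacy)   # feedback: weak capital slows credit
            losses = 0.002 * loans * (1 + np.sin(t / 8))  # cyclical loss flow
            loans += lending - losses                     # stock of loans
            capital += 0.01 * loans - losses              # retained margin in, losses out
            path.append((loans, capital))
        return np.array(path)

    print(simulate_credit_cycle()[-1])  # terminal (loans, capital)
    ```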

    The fractal market hypothesis (FMH) provides a framework for modeling the turbulence, feedback loops, jumps, discontinuities, irrational behaviour, butterfly effects, and non-periodicity that truly characterise real-world financial markets (Mandelbrot, 1963; Peters, 1991; Peters, 1994; Liu and Song, 2012; Anderson and Noss, 2013; Karp and Van Vuuren, 2019). The FMH provides the theoretical basis for modeling variables with wild randomness and mean reversion such as market crashes, bubbles, and natural disasters (Joshi, 2014), and is closely associated with chaos theory. Adoption of fractal market theory demands big data and the complete elimination of some current cartesian frameworks such as Basel II/III (Ganegoda and Evans, 2014; Klioutchnikov et al., 2017). Chaos theory is, however, complicated because it requires significant mathematical programming (Allan et al., 2013).
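
    One simple, data-driven entry point to FMH-style diagnostics is estimating the Hurst exponent, the statistic Peters (1991, 1994) used (via rescaled-range analysis) to test for fractal structure. The sketch below uses the simpler aggregated-variance scaling method on synthetic data; H near 0.5 indicates a random walk, above 0.5 persistence, and below 0.5 mean reversion.

    ```python
    import numpy as np

    def hurst_exponent(returns, lags=range(2, 50)):
        """Estimate H from the scaling law std(n-step move) ~ n**H."""
        prices = np.cumsum(returns)                # build a price-like path
        taus = [np.std(prices[lag:] - prices[:-lag]) for lag in lags]
        slope, _ = np.polyfit(np.log(list(lags)), np.log(taus), 1)
        return slope

    rng = np.random.default_rng(42)
    print(hurst_exponent(rng.normal(size=5000)))   # ~0.5 for i.i.d. Gaussian noise
    ```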

    Fuzzy set theory deals with circumstances where information is ambiguous and vague (Chaudhuri and Ghosh, 2016). Set theory, the foundation of probability theory, is converted into fuzzy sets, and all subsequent applications are updated to incorporate vagueness (Georgescu, 2012; Allan et al., 2013; Chaudhuri and Ghosh, 2016). When using fuzzy logic, people's qualitative descriptions and quantitative estimations are elaborated to maximise utility. Possibility theory, due to Zadeh (1965, 1978), is an extension of fuzzy theory: a subjective measure applicable where there are small sample sizes, vague, fuzzy, imprecise or conflicting data, subjective elements, and linguistic information (Georgescu, 2012; Georgescu and Kinnunen, 2016). Chaudhuri and Ghosh (2016) apply possibility theory to operational risk modeling in a mining environment and conclude that it calculates capital charges better than probability theory.
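
    The sketch below shows the basic fuzzy object: a triangular fuzzy number encoding an expert's vague statement such as "operational losses of about $2m, certainly between $1m and $4m" (hypothetical figures), together with an alpha-cut giving the values possible to at least a chosen degree.

    ```python
    import numpy as np

    def triangular_membership(x, low, mode, high):
        """Membership degree of x in the triangular fuzzy number (low, mode, high)."""
        return np.where(x <= mode,
                        np.clip((x - low) / (mode - low), 0.0, 1.0),
                        np.clip((high - x) / (high - mode), 0.0, 1.0))

    x = np.linspace(0, 5, 11)                 # candidate losses in $m
    mu = triangular_membership(x, 1, 2, 4)    # vague expert estimate
    print(x[mu >= 0.5])                       # alpha-cut: losses possible to degree >= 0.5
    ```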

    Neural networks are computational methods inspired by the brain's structure, processing method, and learning abilities; an artificial neural network (ANN) is in essence an automated non-linear regression process (Allan et al., 2013). Bayesian networks (BNs) allow a system to perform inference and learning: a BN is a hierarchical structure with nodes cascading in layers, allowing users to visualise the logical relationships among variables (Allan et al., 2013). Agent-based modeling (ABM) examines collections of autonomous decision-making entities called agents (Bonabeau, 2002; Farmer and Foley, 2009; Haldane, 2016). It is used to model non-linear relationships, scenario analysis, feedback loops, local interactions, and externalities, because agent-based models can reproduce the emergent phenomena, dynamic equilibria and non-linear properties of complex systems (Kremer, 2012; Haldane, 2016). ABM outputs are very sensitive to initial conditions, often producing a vast space of possible outcomes that can only be understood through rigorous statistical scrutiny (Macal et al., 2013). Agent-based modeling is well known for simulating agents' behaviour in four areas: flow, organisational, market, and diffusion (Bonabeau, 2002; Macal et al., 2013; Chan-Lau, 2017).
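
    A minimal heterogeneous-agent market illustrates the ABM idea: fundamentalists push the price toward a fixed fundamental value, chartists chase the last price move, and a linear price-impact rule aggregates their demand. The parameters are hypothetical; the point is that aggregate dynamics emerge from simple agent rules rather than from an assumed distribution.

    ```python
    import numpy as np

    def abm_market(steps=1000, n_fund=50, n_chart=50, seed=1):
        """Toy two-type agent-based market with a linear price-impact rule."""
        rng = np.random.default_rng(seed)
        fundamental, price, prev = 100.0, 100.0, 100.0
        prices = []
        for _ in range(steps):
            d_fund = n_fund * 0.05 * (fundamental - price)   # mean-reverting demand
            d_chart = n_chart * 0.04 * (price - prev)        # trend-chasing demand
            prev, price = price, price + 0.01 * (d_fund + d_chart) + rng.normal(0, 0.5)
            prices.append(price)
        return np.array(prices)

    returns = np.diff(abm_market())
    print(f"return stdev = {returns.std():.3f}")
    ```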

    Network theory originates from Leonhard Euler's paper on the "Seven Bridges of Königsberg", published in 1736, and is based on the proposition that, to some extent, everything is somehow connected (Ellinas et al., 2018; Petrone and Latora, 2018). Mitchell (2006) states that under network theory an object is conceptualised as a node (or vertex) and its relationships with others as edges. A network (or graph) is simply a collection of nodes (vertices) and links (edges); the links can be directed or undirected, and weighted or unweighted. Network theory has applications in macroprudential regulation, systemic risk, the measurement of contagion, emerging risks, credit risk, liquidity risk and stress testing (Eisenberg and Noe, 2001; Haldane and May, 2011; Glasserman and Young, 2015; Gao et al., 2018; Soramaki and Cook, 2018; Battiston et al., 2016b; Ellinas et al., 2018).
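
    The cited Eisenberg and Noe (2001) clearing model is a compact example of network analysis of contagion: each bank pays the minimum of what it owes and what it has, payments feed other banks' assets, and the clearing vector is a fixed point. The three-bank balance sheet below is hypothetical.

    ```python
    import numpy as np

    # Hypothetical interbank network: liabilities[i, j] = amount bank i owes bank j.
    liabilities = np.array([[0., 8., 2.],
                            [4., 0., 6.],
                            [3., 3., 0.]])
    external_assets = np.array([2., 1., 1.])

    p_bar = liabilities.sum(axis=1)            # each bank's total obligations
    pi = liabilities / p_bar[:, None]          # relative liability matrix

    # Fixed-point iteration for the Eisenberg-Noe clearing payment vector:
    # p = min(p_bar, external assets + payments received from others).
    p = p_bar.copy()
    for _ in range(100):
        p_new = np.minimum(p_bar, external_assets + pi.T @ p)
        if np.allclose(p_new, p):
            break
        p = p_new

    print("clearing payments:", p, "defaults:", p < p_bar)
    ```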

    Ant colony optimisation (ACO) is a technique for solving combinatorial optimisation problems by mimicking ant behaviour (Dorigo et al., 1996; Dorigo, 2007), specifically the way ants establish the shortest route to a food source. An ACO algorithm consists of several cycles (iterations) of solution construction. In each iteration a number of ants (a parameter) construct complete solutions using heuristic information and the collected experience of previous groups of ants; this collected experience is represented by a digital analogue of trail pheromone deposited on the constituent elements of a solution. The ACO method is applicable in situations involving self-organisation, including derivatives pricing, credit risk modeling and bankruptcy prediction (Martens et al., 2010). Martens et al. (2010) apply ACO to credit risk under Basel II, but the method remains largely confined to academia.

    Genetic algorithms are methods for solving both constrained and unconstrained optimisation problems based on the laws of natural selection and genetic evolution (Mitchell, 2006). They are applicable where the objective function is stochastic, non-linear, discontinuous, or non-differentiable (Holland, 2002). Genetic algorithms replicate natural evolution and population growth under the "survival of the fittest" principle: risks are modelled following the biological principles of selection, reproduction, and mutation within a population. The method is potentially applicable to credit scoring, stock selection, and bankruptcy prediction (Allan et al., 2013).
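
    The sketch below evolves the weights of a linear credit-scoring rule on synthetic data using selection and mutation only (crossover is omitted for brevity). Everything here, data included, is hypothetical; it simply shows the selection-reproduction-mutation loop the paragraph describes.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Synthetic credit data: 3 borrower features and default labels.
    X = rng.normal(size=(500, 3))
    y = (X @ np.array([1.5, -2.0, 0.8]) + rng.normal(size=500) > 0).astype(int)

    def fitness(w):
        """Accuracy of the scoring rule 'default if X @ w > 0'."""
        return ((X @ w > 0).astype(int) == y).mean()

    pop = rng.normal(size=(40, 3))                    # initial population of weight vectors
    for _ in range(50):
        scores = np.array([fitness(w) for w in pop])
        parents = pop[np.argsort(scores)[-20:]]       # selection: fittest half survives
        children = (parents[rng.integers(0, 20, 20)]  # reproduction ...
                    + rng.normal(0, 0.2, (20, 3)))    # ... with random mutation
        pop = np.vstack([parents, children])

    best = max(pop, key=fitness)
    print("best scoring accuracy:", round(fitness(best), 3))
    ```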

    Cellular automata are spatial models that explore the discrete behaviour of interacting elements in a complex system using simple rules (Mitchell, 2006). The method allows bottom-up analysis of the emergent, diffusive, transportive and dissipative effects of risks (Mitchell, 2006) and is commonly applied where data are scarce and severity high, as in natural hazard modeling of earthquakes, hurricanes, fires, terrorism, and nuclear and hazardous material incidents. Phylogenetic theory is the theory of evolution in systematic biology. It holds that all organisms on earth descended from one common ancestor (Allan et al., 2013; Evans et al., 2017); organisms' historical patterns of descent, path dependence and evolutionary relationships can therefore be evaluated with phylogenetic or cladistic trees using molecular, physiological, and morphological data (Allan et al., 2013). A phylogenetic tree is essentially a connected graph, containing no closed structures, that shows historical patterns of ancestry, divergence, and descent: the nodes symbolise the organisms under investigation, while the branches connecting them represent their ancestry and descent relationships. This theory can be used to model the history, evolution, and emergence of risks.
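
    A one-dimensional lattice gives the smallest possible cellular-automaton illustration: one bank fails, and distress spreads to any neighbour whose (randomly drawn, hypothetical) capital buffer is below a threshold. The global cascade emerges from purely local rules.

    ```python
    import numpy as np

    def contagion_ca(n=40, steps=10, threshold=0.6, seed=3):
        """1-D cellular automaton of distress spreading between neighbouring banks."""
        rng = np.random.default_rng(seed)
        failed = np.zeros(n, dtype=bool)
        failed[n // 2] = True                          # a single initial failure
        buffers = rng.uniform(size=n)                  # hypothetical capital buffers
        for _ in range(steps):
            neighbour_failed = np.roll(failed, 1) | np.roll(failed, -1)
            failed |= neighbour_failed & (buffers < threshold)
        return failed

    print(contagion_ca().astype(int))                  # 1 = failed, 0 = surviving
    ```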

    This section outlines the research methodology used for data collection and analysis. As in Tepetepe et al. (2021), a structured questionnaire was employed to collect primary data from sixteen Zimbabwean banks. The structured questionnaire was adopted because it provides factual, standardised responses and measurable data from a large sample (Zikmund, 2003; Fellegi, 2010; Babbie, 2011; Saunders et al., 2019). The primary purpose of the survey was to obtain descriptive, generalisable information and facilitate independent objective analysis (Babbie, 2011; Ponto, 2015). The questionnaire was designed to collect categorical data with a mixture of Likert-scale and yes/no questions. The questions were designed by the researchers from theoretical constructs in cartesian and complexity science theory (Collis and Hussey, 2009; Fellegi, 2010; Saunders et al., 2012; Easterby-Smith et al., 2015). The state of adoption was examined through the level of adoption of static frameworks, the four realms of uncertainty, and modeling methods.

    The objective of sampling in a survey strategy is to obtain a sample representative of the population (Ponto, 2015). In this study banks were divided into four strata by size and ownership structure: international, Pan African, privately owned indigenous and government-owned indigenous banks. The sample size was determined at the 95% confidence level using Krejcie and Morgan's (1970) table as in Sekaran (2003), and the formulae of Yamane (1967) and Saunders et al. (2012), which give samples of 132, 133 and 132 respectively. Accordingly, 160 self-administered questionnaires were distributed out of a population of 200 risk managers; to reduce non-response bias, questionnaires were distributed beyond the numbers recommended by the three techniques and a follow-up mail was used. In total 131 questionnaires were returned, and 120 adequately completed questionnaires (a 75% usable response rate) were used for data analysis.
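
    The quoted sample sizes can be checked directly. Yamane's (1967) simplified formula is n = N / (1 + N e^2); with N = 200 risk managers and a 5% margin of error it reproduces the 133 reported above.

    ```python
    def yamane(population, e=0.05):
        """Yamane's (1967) sample size: n = N / (1 + N * e**2)."""
        return population / (1 + population * e ** 2)

    print(round(yamane(200)))  # 200 / 1.5 = 133.3 -> 133, matching the text
    ```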

    The structured questionnaire was piloted using Babbie and Quinlan's (2011) two-stage procedure: first, academics selected from the University of Bolton reviewed the questionnaire, and second, ten risk managers chosen by stratified random sampling reviewed it for correctness and feasibility (Saunders et al., 2012; Collis and Hussey, 2010). Ten risk managers were used following Brace (2008), who states that pilot testing succeeds in identifying needed changes if up to ten individuals are willing to complete the questionnaire and provide suggestions.

    Data were collected over a period of six months and analysed with descriptive statistics in the Statistical Package for the Social Sciences (SPSS). Following Tukey's (1977) exploratory data analysis method, the data are presented in the form of frequency tables and bar charts (Field, 2009; Saunders et al., 2012; Hair et al., 2014).

    The validity and reliability of the structured questionnaire were measured by content validity and Cronbach's alpha. Content validity was established by having the questionnaire reviewed by academic experts from the University of Bolton and the ten risk managers who participated in the pilot study (Fink, 1995; Kimberlin and Winterstein, 2008; Pallant, 2010; Mohajan, 2017). Cronbach's alpha was 0.813, which is acceptable; the results are therefore regarded as valid and reliable.
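
    For reference, Cronbach's alpha is computed as alpha = k/(k-1) x (1 - sum of item variances / variance of the total score). The sketch below applies the formula to simulated Likert responses; the data are synthetic, so the printed alpha illustrates the computation rather than reproducing the study's 0.813.

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """Cronbach's alpha for an (n_respondents, k_items) array of Likert scores."""
        items = np.asarray(items, dtype=float)
        k = items.shape[1]
        item_variances = items.var(axis=0, ddof=1).sum()
        total_variance = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1 - item_variances / total_variance)

    rng = np.random.default_rng(7)
    shared = rng.integers(1, 6, size=(120, 1))                         # common attitude
    responses = np.clip(shared + rng.integers(-1, 2, (120, 8)), 1, 5)  # 8 correlated items
    print(round(cronbach_alpha(responses), 3))
    ```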

    Ethical considerations in this study cover permission to conduct the survey, informed consent, data privacy and confidentiality. Permission to conduct the survey was obtained from the University of Bolton. Participants completed the questionnaire under informed consent, with the right to withdraw their participation, and the purpose of the research was explained in a cover letter attached to the questionnaire. The study ensured data privacy in two respects: no reference is made to the names of the banks where questionnaires were distributed, and no reference is made to any individual or bank in the data analysis, presentation, and discussion of outcomes.

    This section reports the findings of the comparative analysis of cartesian and complexity science theoretical framework adoption in the quantitative risk management of Zimbabwean banks. The structured questionnaire was completed by quantitative risk managers (26 percent of respondents), credit risk managers (24 percent), operational risk managers (22 percent) and market risk managers (13 percent). 77 percent of the risk managers who participated in the survey had more than three years' experience in risk management.

    A comparison of results shows that banks in Zimbabwe have adopted both cartesian and complexity science theories at the same speed and in the same direction regardless of size, but with varying strength: adoption is stronger for cartesian than for complexity science frameworks. Figure 1 summarises how Pan African, international, and local banks are moving at the same pace in adopting the four types of frameworks.

    Figure 1.  Cartesian and complexity science framework adoption.

    In particular, Figure 1 shows that Zimbabwean banks have adopted mandatory frameworks (100 percent), proprietary frameworks (89 percent), advisory frameworks (89 percent) and complexity science frameworks (40 percent). This initial assessment reveals that quantitative risk management in Zimbabwean banks is predominantly cartesian in nature, complying with specified resilience modeling and fact-based risk management within established standards of practice (Sweeting, 2011; Morin, 2014; Cavallo, 2014). We perform further analysis of the adoption of complexity science using the four realms: Knightian risk, Knightian uncertainty, ambiguous uncertainty and ontological uncertainty.

    Table 2 shows that, regardless of bank size, there is significant adoption of the four-realm complexity science framework. On average, 97 percent of respondents report that risk managers classify risks as Knightian risks, 88 percent as ontological uncertainty, 87 percent as ambiguous uncertainty and 73 percent as Knightian uncertainty. Our conclusion is that banks in Zimbabwe are adopting both cartesian and complexity science approaches in the same direction and on the same trajectory regardless of size, but that cartesianism predominates over the new paradigm of complexity science.

    Table 2.  Bank size and complexity science four realms of uncertainty adoption.
    Question 4: Do you utilise the classification below in quantitative risk management?

    | Uncertainty realm | International (Yes/No) | Pan African (Yes/No) | Private indigenous (Yes/No) | Government indigenous (Yes/No) |
    |---|---|---|---|---|
    | Ontological uncertainty | 90% / 10% | 89% / 11% | 93% / 7% | 80% / 20% |
    | Ambiguous uncertainty | 93% / 7% | 89% / 11% | 90% / 10% | 74% / 26% |
    | Knightian uncertainty | 57% / 43% | 84% / 16% | 91% / 9% | 61% / 39% |
    | Knightian risks | 98% / 2% | 95% / 5% | 96% / 4% | 97% / 3% |

    Note: This table provides descriptive statistics on the adoption of the complexity science realms in relation to bank size.


    The study finds the adoption of Basel II/III modeling methods to be deeper and more comprehensive than that of complexity science methods. First, the paper analyses cartesian framework adoption using Basel II/III. The results reveal deep, comprehensive, Eurocentric adoption of Basel II/III financial engineering methods in both large (Pan African, international) and smaller domestic banks: banks are adopting all types of Basel II/III capital modeling methods regardless of size. As summarised in Table 3, the adoption of the various Basel II/III modeling methods is high. These results confirm the postulation of Jones and Knaack (2019) that low-income countries sometimes adopt Basel II/III methods extensively, even when ill-suited to them, to enhance their perceived international competitive standing. On average, banks comply with the credit risk modeling methods at 97 percent for the modified standardised approach (MSA), 83 percent for the advanced internal ratings-based approach (AIRB), and 62 percent for the foundation internal ratings-based approach (FIRB); with the operational risk modeling methods at 88 percent for the alternative standardised approach (ASA), 85 percent for the advanced measurement approach (AMA), and 0 percent for the standardised measurement approach (SMA); and with the market risk methods at 100 percent for the standardised approach (SA) and 98 percent for the internal models approach (IMA).

    Table 3.  Basel II/III financial engineering methods adoption.
    Question 3: To what extent have you implemented the following Basel II/III methods for capital modeling? Choose the methods that best apply to your bank from the list below.

    | Risk type | Modeling method | Indigenous private | Indigenous government owned | International | Pan African |
    |---|---|---|---|---|---|
    | Credit | MSA | 100% | 100% | 87% | 92% |
    | | AIRB | 76% | 77% | 67% | 53% |
    | | FIRB | 40% | 71% | 33% | 25% |
    | Operational | ASA | 90% | 100% | 93% | 78% |
    | | AMA | 77% | 74% | 60% | 56% |
    | | SMA | - | - | - | - |
    | Market | SA | 68% | 61% | 73% | 61% |
    | | IMA | 76% | 87% | 80% | 67% |

    Note: This table reports descriptive statistics (adoption frequencies) on the adoption of Basel II/III modeling methods in relation to bank size.


    Second, we analyse the adoption of complexity science modeling. Figure 2 shows low adoption of complexity science modeling methods overall, although behavioural economics and finance (90 percent), heuristics (71 percent) and concept mapping (30 percent) are adopted to a greater extent than the rest.

    Figure 2.  Complexity science modeling methods adoption.

    A comparison of the results indicates that the rate of adoption of Basel II/III capital modeling methods is higher than that of complexity science modeling methods.

    As shown in Table 4, this study empirically identifies diverging capital modeling priorities between banks and their regulator. While Zimbabwean banks place their strategic thrust on general resilience modeling and pattern-based management for the immeasurable, hard-to-predict risks of the ontological, ambiguous and Knightian uncertainty realms, the regulator prioritises specified resilience modeling and fact-based management, in which the modeling of Knightian risks matters more than the other realms of uncertainty.

    Table 4.  Diverging modeling priorities between banks and their supervisor.

    | Realm | Banks' ranking | Supervisor's ranking |
    |---|---|---|
    | Ontological uncertainty | 1 | 4 |
    | Ambiguous uncertainty | 2 | 3 |
    | Knightian uncertainty | 3 | 2 |
    | Knightian risk | 4 | 1 |

    Note: This table analyses the divergence in modeling priorities between the supervisor and banks.


    These divergent capital modeling views result from information asymmetries between banks and the supervisor. Given the different modeling priorities, achieving regulatory capital adequacy with cartesianism alone may not be possible in Zimbabwean banks. Clinging to the cartesian paradigm might instead increase the probability of banks reporting overestimated or underestimated capital adequacy figures to impress the supervisor and comply with its demands under peer pressure.

    Finally, the study finds that complexity science and cartesian frameworks intersect on market discipline. Hence enhancing market discipline will possibly improve financial risk management in banks.

    As previously mentioned, this study finds that the adoption of both cartesian and complexity science frameworks in Zimbabwean banks is moving in the same direction regardless of bank size, although the adoption of Basel II/III modeling methods is deeper and more comprehensive than that of complexity science. Furthermore, owing to information asymmetries between the regulator and the banks, their strategic thrusts for financial risk management differ: the regulator focuses on Knightian risk modeling, whereas banks prioritise ontological, ambiguous and Knightian uncertainty respectively. Lastly, the study shows that complexity science and cartesianism converge on market discipline. These results confirm Blume and Durlauf (2006), Farmer et al. (2012), Battiston et al. (2016b), Haldane (2017), Li and Evans (2019) and Arthur (2021): complexity science provides an additional dimension to the study of quantitative risk management, and its integration into mainstream financial risk management will offer possible solutions to pertinent current non-Gaussian challenges such as financial stability, climate risk modeling, sustainable banking, systemic risk, emerging risks, and interconnected risks. The study argues for the inclusion of uncertainty in the practice of quantitative risk management.

    Based on this study's findings, we propose three optional policy recommendations to improve quantitative risk management in Zimbabwean banks. The first is the integration of cartesian and complexity science approaches in financial regulation and modeling methods. Such an integration is not aimed at achieving equilibrium but at obtaining maximum benefit: a paradigm shift in policy making, complementary viewpoints from the two schools, and application of each to the situations that suit it. We propose that financial modeling regulation such as Basel II/III be reduced to heuristic simplicity. This recommendation leans strongly toward the laissez-faire and complexity science economists who argue that "less is more". Historically, less was more: in the free banking era, with a few simple rules, banks held capital adequacy above 50 percent (Dowd and Hutchison, 2014), whereas today there are ever more rulebooks with ever more pages, yet capital adequacy is only slightly above 12 percent. The recommendation challenges the predominant quantitative finance doctrine that "more is better", which has culminated in excessive and less useful cartesian rulebooks such as Basel II/III, as noted by Dowd and Hutchison (2014). Instead of providing a cure, these rulebooks contain "too many pages of detailed technical modeling", which perpetuates herding behaviour, regulatory gaming, complicated "geek" mathematical models, poor risk management and financial crises. In our view, regulation that improves quantitative finance should embrace nudging rather than coercion, and self-regulation and simplicity rather than excessively complex rules; nudges are any aspect of choice architecture that alters people's behaviour in a predictable way without forbidding any options or significantly changing their economic incentives (Thaler and Sunstein, 2008, p. 6). The second option is to enhance market discipline, since complexity science and cartesianism intersect there. The third option is for the central bank to curtail information asymmetries by accelerating its adoption of new standards and models of risk management.

    Traditionally, financial risk management has been examined through Cartesian and interpretivist theories. The emergence of complexity science offers a different perspective. Complexity economists argue that financial risks are complex adaptive systems characterised by emergence, evolution, non-linearity, interconnectedness, chaos, irrational expectations, adaptive behaviour, and heterogeneous agents. Traditional methods of quantitative finance therefore fail when their assumptions of rational expectations, perfect markets, independence, perfect information, and homogeneity are violated. This paper pioneers a comparative analysis of Cartesian and complexity science adoption in Zimbabwean banks, in the setting of a developing country. A structured questionnaire was employed for data collection.
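
    As a toy illustration of these claims, consider a minimal Kirman-style herding model (a hedged sketch of our own; the agent count and switching intensities are assumed, not drawn from the paper). Heterogeneous agents flip between optimism and pessimism either idiosyncratically or by imitating a randomly met peer. Interaction alone makes aggregate sentiment lurch between regimes, so windowed "returns" exhibit the fat tails that violate the Gaussian, i.i.d. assumptions named above.

```python
import numpy as np

# Kirman-style herding sketch (illustrative only): N agents hold an
# optimistic or pessimistic view; each step one agent reconsiders and
# switches either idiosyncratically (eps) or by imitation (h).
rng = np.random.default_rng(7)

N, T = 100, 200_000       # agents, elementary update steps
eps, h = 0.00015, 0.05    # idiosyncratic vs. herding switch intensity
k = N // 2                # current number of optimists
path = np.empty(T)

for t in range(T):
    if rng.random() < k / N:                         # an optimist reconsiders
        if rng.random() < eps + h * (N - k) / (N - 1):
            k -= 1
    else:                                            # a pessimist reconsiders
        if rng.random() < eps + h * k / (N - 1):
            k += 1
    path[t] = k / N                                  # fraction of optimists

# "Returns" proxied by changes in sentiment over fixed windows:
# long calm stretches punctuated by regime switches produce fat tails.
ret = np.diff(path[::200])
z = (ret - ret.mean()) / ret.std()
excess_kurtosis = (z**4).mean() - 3.0                # 0 for a Gaussian
print(f"excess kurtosis of windowed returns: {excess_kurtosis:.1f}")
```

    No agent here is irrational in isolation; the heavy tails emerge purely from interaction, which is the complexity economists' core point.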

    First, the results suggest that Cartesian and complexity science frameworks are being adopted in the same direction and at the same pace by large and small banks alike, although financial risk management remains predominantly Cartesian, with deeper and more comprehensive compliance with Basel II/III modeling than with complexity science modeling techniques. Second, the study reveals diverging modeling priorities between the regulator and banks, owing to information asymmetries. Third, complexity science and Cartesian frameworks converge on market discipline. Finally, this paper provides recommendations to improve Quantitative Risk Management practices and policy formulation in Zimbabwean banks. Future research should examine two issues: (a) the impact of integrating complexity science with Basel II/III, and (b) the factors that enhance market discipline in Zimbabwean banks.

    Special thanks to the survey participants and the anonymous reviewers of this work.

    All authors declare no conflict of interest in this article.
