Journal articles on the topic "Order-sorted theories"

Listed below are the 28 best journal articles for research on the topic "Order-sorted theories".

1

Lucas, Salvador, and Raúl Gutiérrez. "Automatic Synthesis of Logical Models for Order-Sorted First-Order Theories." Journal of Automated Reasoning 60, no. 4 (July 12, 2017): 465–501. http://dx.doi.org/10.1007/s10817-017-9419-3.

2

Alpuente, María, Angel Cuenca-Ortega, Santiago Escobar, and José Meseguer. "Order-sorted Homeomorphic Embedding Modulo Combinations of Associativity and/or Commutativity Axioms." Fundamenta Informaticae 177, no. 3–4 (December 10, 2020): 297–329. http://dx.doi.org/10.3233/fi-2020-1991.

Abstract
The Homeomorphic Embedding relation has been amply used for defining termination criteria of symbolic methods for program analysis, transformation, and verification. However, homeomorphic embedding has never been investigated in the context of order-sorted rewrite theories that support symbolic execution methods modulo equational axioms. This paper generalizes the symbolic homeomorphic embedding relation to order-sorted rewrite theories that may contain various combinations of associativity and/or commutativity axioms for different binary operators. We systematically measure the performance of different, increasingly efficient formulations of the homeomorphic embedding relation modulo axioms that we implement in Maude. Our experimental results show that the most efficient version indeed pays off in practice.
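For orientation, the pure (syntactic) homeomorphic embedding relation that the paper generalizes can be sketched in a few lines. The tuple representation of terms and the function name below are invented for illustration; none of the order-sorted or modulo-axioms machinery of the paper is modeled:

```python
def embeds(s, t):
    """Pure homeomorphic embedding on first-order terms (no axioms).

    Terms are tuples: ('f', arg1, ..., argN); constants and variables
    are 1-tuples such as ('a',). s embeds in t if s can be obtained
    from t by deleting operators.
    """
    # Diving: s embeds in some immediate subterm of t.
    if any(embeds(s, u) for u in t[1:]):
        return True
    # Coupling: same root symbol and arguments embed pairwise.
    return s[0] == t[0] and len(s) == len(t) and all(
        embeds(si, ti) for si, ti in zip(s[1:], t[1:]))
```

For example, `embeds(('f', ('a',)), ('f', ('g', ('a',))))` holds: `f(a)` is embedded in `f(g(a))`, which is exactly the kind of term growth that embedding-based termination criteria use as a whistle.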
3

Lucas, Salvador. "Synthesis of models for order-sorted first-order theories using linear algebra and constraint solving." Electronic Proceedings in Theoretical Computer Science 200 (December 19, 2015): 32–47. http://dx.doi.org/10.4204/eptcs.200.3.

4

Durán, Francisco, and José Meseguer. "On the Church-Rosser and coherence properties of conditional order-sorted rewrite theories." Journal of Logic and Algebraic Programming 81, no. 7–8 (October 2012): 816–50. http://dx.doi.org/10.1016/j.jlap.2011.12.004.

5

Liu, Fu-Chun. "Lawvere Theorem in Institution of Regular Order-Sorted Equational Logic and Initial (Terminal) Semantics for Its Glued Theories." Journal of Software 16, no. 7 (2005): 1205. http://dx.doi.org/10.1360/jos161205.

6

Decker, Valerie D., Philip D. Suman, Barb J. Burge, Ankita Deka, Melanie Harris, Dwight J. Hymans, Michael Marcussen, Donna Pittman, David Wilkerson, and James G. Daley. "Analysis of Social Work Theory Progression Published in 2004." Advances in Social Work 8, no. 1 (April 30, 2007): 81–103. http://dx.doi.org/10.18060/133.

Abstract
The authors reviewed 67 articles that discussed and/or tested human behavior theories from social work journals published in 2004 in order to assess the level and quality of theory progression. The articles were further sorted into Council on Social Work Education (CSWE) Educational Policy and Accreditation Standards (EPAS) Foundation Curriculum content areas of HBSE, practice, policy, field education, values & ethics, diversity, populations-at-risk/social and economic justice, and research for purposes of categorization. Results indicated that HBSE and practice were by far the largest groups of articles reviewed. Also found was that social work has a limited amount of theory discussion in the content areas of field, values and ethics, diversity, and populations-at-risk/social and economic justice. Thirty-three articles were found to demonstrate theory progression, eight articles presented new/emerging theories, and 26 articles discussed or critiqued theories without presenting evidence of theory progression.
7

Krajíček, Jan. "Discretely ordered modules as a first-order extension of the cutting planes proof system." Journal of Symbolic Logic 63, no. 4 (December 1998): 1582–96. http://dx.doi.org/10.2307/2586668.

Abstract
We define a first-order extension LK(CP) of the cutting planes proof system CP as the first-order sequent calculus LK whose atomic formulas are CP-inequalities ∑_i a_i·x_i ≥ b (the x_i variables, the a_i and b constants). We prove an interpolation theorem for LK(CP), yielding as a corollary a conditional lower bound for LK(CP)-proofs. For a subsystem R(CP) of LK(CP), essentially resolution working with clauses formed by CP-inequalities, we prove a monotone interpolation theorem, obtaining thus an unconditional lower bound (depending on the maximum size of coefficients in proofs and on the maximum number of CP-inequalities in clauses). We also give an interpolation theorem for polynomial calculus working with sparse polynomials. The proof relies on a universal interpolation theorem for semantic derivations [16, Theorem 5.1]. LK(CP) can be viewed as a two-sorted first-order theory of Z, considered itself as a discretely ordered Z-module. Variables of one sort are module elements; those of the other sort are scalars. Quantification is allowed only over the former sort. We shall give a construction of a theory LK(M) for any discretely ordered module M (e.g., LK(Z) extends LK(CP)). The interpolation theorem generalizes to these theories obtained from discretely ordered Z-modules. We shall also discuss a connection to quantifier elimination for such theories. We formulate a communication complexity problem whose (suitable) solution would allow us to improve the monotone interpolation theorem and the lower bound for R(CP).
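For orientation, the atomic formulas of CP are integer linear inequalities, and the characteristic inference of the cutting planes system is the division (rounding) rule. A minimal sketch with an invented function name, modeling only this one rule and not the sequent calculus LK(CP) itself:

```python
import math
from fractions import Fraction

def cp_divide(coeffs, b, c):
    """Cutting-planes division rule.

    From sum(a_i * x_i) >= b with every a_i divisible by c > 0, derive
    sum((a_i / c) * x_i) >= ceil(b / c). This is sound over the
    integers, since the left-hand side stays an integer after division.
    """
    assert c > 0 and all(a % c == 0 for a in coeffs)
    return [a // c for a in coeffs], math.ceil(Fraction(b, c))
```

For instance, `cp_divide([2, 4], 3, 2)` turns 2x₁ + 4x₂ ≥ 3 into x₁ + 2x₂ ≥ 2, a strengthening that is valid for integer solutions but not for rational ones; this rounding is what gives CP its power over purely linear reasoning.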
8

Alpuente, M., S. Escobar, J. Sapiña, and A. Cuenca-Ortega. "Inspecting Maude variants with GLINTS." Theory and Practice of Logic Programming 17, no. 5–6 (August 24, 2017): 689–707. http://dx.doi.org/10.1017/s147106841700031x.

Abstract
This paper introduces GLINTS, a graphical tool for exploring variant narrowing computations in Maude. The most recent version of Maude, version 2.7.1, provides quite sophisticated unification features, including order-sorted equational unification for convergent theories modulo axioms such as associativity, commutativity, and identity. This novel equational unification relies on built-in generation of the set of variants of a term t, i.e., the canonical form of tσ for a computed substitution σ. Variant generation relies on a novel narrowing strategy called folding variant narrowing that opens up new applications in formal reasoning, theorem proving, testing, protocol analysis, and model checking, especially when the theory satisfies the finite variant property, i.e., there is a finite number of most general variants for every term in the theory. However, variant narrowing computations can be extremely involved and are simply presented in text format by Maude, often being too heavy to be debugged or even understood. The GLINTS system provides support for (i) determining whether a given theory satisfies the finite variant property, (ii) thoroughly exploring variant narrowing computations, (iii) automatic checking of node embedding and closedness modulo axioms, and (iv) querying and inspecting selected parts of the variant trees.
9

Diaconescu, Răzvan, and Alexandre Madeira. "Encoding hybridized institutions into first-order logic." Mathematical Structures in Computer Science 26, no. 5 (November 12, 2014): 745–88. http://dx.doi.org/10.1017/s0960129514000383.

Abstract
A ‘hybridization’ of a logic, referred to as the base logic, consists of developing the characteristic features of hybrid logic on top of the respective base logic, both at the level of syntax (i.e. modalities, nominals, etc.) and of the semantics (i.e. possible worlds). By ‘hybridized institutions’ we mean the result of this process when logics are treated abstractly as institutions (in the sense of the institution theory of Goguen and Burstall). This work develops encodings of hybridized institutions into (many-sorted) first-order logic (abbreviated $\mathcal{FOL}$) as a ‘hybridization’ process of abstract encodings of institutions into $\mathcal{FOL}$, which may be seen as an abstraction of the well-known standard translation of modal logic into $\mathcal{FOL}$. The concept of encoding employed by our work is that of comorphism from institution theory, which is a rather comprehensive concept of encoding as it features encodings both of the syntax and of the semantics of logics/institutions. Moreover, we consider the so-called theoroidal version of comorphisms that encode signatures to theories, a feature that accommodates a wide range of concrete applications. Our theory is also general enough to accommodate various constraints on the possible worlds semantics as well as a wide variety of quantifications. We also provide pragmatic sufficient conditions for the conservativity of the encodings to be preserved through the hybridization process, which provides the possibility to shift a formal verification process from the hybridized institution to $\mathcal{FOL}$.
10

Wang, Qingsong, Hongkun Xiao, Qiao Ma, Xueliang Yuan, Jian Zuo, Jian Zhang, Shuguang Wang, and Mansen Wang. "Review of Emergy Analysis and Life Cycle Assessment: Coupling Development Perspective." Sustainability 12, no. 1 (January 2, 2020): 367. http://dx.doi.org/10.3390/su12010367.

Abstract
Two methods of natural ecosystem assessment, emergy analysis (EMA) and life cycle assessment (LCA), are reviewed in this paper. Their advantages, disadvantages, and application areas are summarized, and the similarities and differences between the two evaluation methods are analyzed. Their research progress is also sorted out. The study finds that EMA and LCA share common attributes in evaluation processes and research fields, but they focus on different aspects of macrocosms and microcosms. The assessment of system sustainability is valued by both EMA and LCA, but the former has unique advantages in natural system input analysis, and the latter is more convincing in assessing environmental loading capacity. If the system boundaries of the two methods are expanded, in other words, if factors such as ecosystem services, labor, and infrastructure construction are integrated into the upstream of the target system, and environmental impact is further analyzed using LCA in the downstream of the system, the two approaches would complement each other and the quantified results would be more objective. Therefore, these two theories need coupled development. After reviewing recent coupling application cases, the results show that LCA and EMA have commonality in the upstream of the target system (mainly in inventory database construction), while the environmental impact assessment methods differ in the downstream, so an overall coupled analysis method has not yet been formed. The current paper gives rational suggestions on the coupled development of the two systems in terms of the aggregate emergy flow table, the indicator system construction and indicator evaluation methods. In addition, it is necessary to introduce sensitivity analysis and uncertainty analysis in order to improve the reliability of assessment results. At present, research on the coupled development of the two theories is in a rapid development stage, but there are still many problems that need further exploration.
11

Chatzidakis, Zoé. "Properties of forking in ω-free pseudo-algebraically closed fields." Journal of Symbolic Logic 67, no. 3 (September 2002): 957–96. http://dx.doi.org/10.2178/jsl/1190150143.

Abstract
The study of pseudo-algebraically closed fields (henceforth called PAC) started with the work of J. Ax on finite and pseudo-finite fields [1]. He showed that the infinite models of the theory of finite fields are exactly the perfect PAC fields with absolute Galois group isomorphic to Ẑ, and gave elementary invariants for their first-order theory, thereby proving the decidability of the theory of finite fields. Ax's results were then extended to a larger class of PAC fields by M. Jarden and U. Kiehne [21], and Jarden [19]. The final word on theories of PAC fields was given by G. Cherlin, L. van den Dries and A. Macintyre [10]; see also results by Ju. Ershov [13], [14]. Let K be a PAC field. Then the elementary theory of K is entirely determined by the following data:
• The isomorphism type of the field of absolute numbers of K (the subfield of K of elements algebraic over the prime field).
• The degree of imperfection of K.
• The first-order theory, in a suitable ω-sorted language, of the inverse system of Galois groups Gal(L/K), where L runs over all finite Galois extensions of K.
They also showed that the theory of PAC fields is undecidable, by showing that any graph can be encoded in the absolute Galois group of some PAC field. It turns out that the absolute Galois group controls much of the behaviour of the PAC fields. I will give below some examples illustrating this phenomenon.
12

Goguen, Joseph, and Răzvan Diaconescu. "An Oxford survey of order sorted algebra." Mathematical Structures in Computer Science 4, no. 3 (September 1994): 363–92. http://dx.doi.org/10.1017/s0960129500000517.

Abstract
This paper surveys several different variants of order sorted algebra (abbreviated OSA), comparing some of the main approaches (overloaded OSA, universe OSA, unified algebra, term declaration algebra, etc.), emphasising motivation and intuitions, and pointing out features that distinguish the original ‘overloaded’ OSA approach from some later developments. These features include sort constraints and retracts; the latter is particularly useful for handling multiple data representations (including automatic coercions among them). Many examples are given, for most of which runs are shown on the OBJ3 system. This paper also significantly generalises overloaded OSA by dropping the regularity and monotonicity assumptions, and by adding signatures of non-monotonicities, which support simple semantics for some aspects of object oriented programming. A number of new results for this generalisation are proved, including initiality, variety, and quasi-variety theorems. Axiomatisability results à la Birkhoff are also proved for unified algebras.
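To make the basic setting concrete, here is a toy rendering of an overloaded order-sorted signature with a subsort Nat < Int and least-sort computation for ground terms. The sort names, ranks, and functions are invented for illustration and assume a regular signature; retracts, sort constraints, and the paper's non-monotonic generalisation are not modeled:

```python
SUBSORTS = {("Nat", "Int")}  # declared subsort pairs: Nat < Int

def leq(s, t):
    """Reflexive closure of the declared subsort order."""
    return s == t or (s, t) in SUBSORTS

# Overloaded ranks for '+': Nat Nat -> Nat and Int Int -> Int.
RANKS = {"+": [(("Nat", "Nat"), "Nat"), (("Int", "Int"), "Int")]}

def least_sort(term):
    """Least sort of a ground term; constants are 1-tuples ('Sort',)."""
    if len(term) == 1:
        return term[0]  # constants carry their sort
    op, left, right = term
    sl, sr = least_sort(left), least_sort(right)
    fits = [res for (args, res) in RANKS[op]
            if leq(sl, args[0]) and leq(sr, args[1])]
    # Regularity guarantees a least applicable result sort exists.
    return next(s for s in fits if all(leq(s, t) for t in fits))
```

Adding two Nat constants yields sort Nat, while mixing Nat and Int yields Int, mirroring how overloaded OSA resolves the least rank of an overloaded operator.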
13

Yu, Yan, Ben Qianqian Liu, Jin-Xing Hao, and Chuanqi Wang. "Complicating or simplifying? Investigating the mixed impacts of online product information on consumers’ purchase decisions." Internet Research 30, no. 1 (June 28, 2019): 263–87. http://dx.doi.org/10.1108/intr-05-2018-0247.

Abstract
Purpose: Prior literature indicates conflicting effects of online product information, which may complicate or simplify consumer purchase decisions. Therefore, the purpose of this paper is to investigate how different online product information (i.e. the choice set size and the popularity information and its presentation) affects consumers’ decision making and the related market outcomes.
Design/methodology/approach: This research relies on information-processing theories and social learning theory. By stepwise conducting two 2×2 within-subject factorial design experiments, this research examines the effects of the choice set size, product popularity information and product presentation on consumers’ decision making and the aggregated market outcomes.
Findings: The results show that product popularity information led consumers to either simplify or complicate their decision strategy, depending on the size of the choice sets. Additionally, presenting products by their popularity in descending order resulted in consumers making decisions with a larger decision bias. The results also show that the presence of product popularity was more likely to forge a “superstar” structure in a large market.
Practical implications: The research suggests that e-retailers and e-marketplace operators should carefully utilize product popularity information. Multiple mechanisms that shape different shopping environments with different orders are necessary to create a long-tailed market structure.
Originality/value: This study found the mixed effects of product popularity information when it is presented in different environments (i.e. the large/small choice set and the sorted/randomized product presentation). The overuse of popularity information may induce consumers’ decision bias.
14

Swaen, M. D. G. "The logic of first order intuitionistic type theory with weak sigma-elimination." Journal of Symbolic Logic 56, no. 2 (June 1991): 467–83. http://dx.doi.org/10.2307/2274694.

Abstract
Via the formulas-as-types embedding, certain extensions of Heyting Arithmetic can be represented in intuitionistic type theories. In this paper we discuss the embedding of ω-sorted Heyting Arithmetic HAω into a type theory WL, which can be described as Troelstra's system with so-called weak Σ-elimination rules. By syntactical means it is proved that a formula is derivable in HAω if and only if its corresponding type in WL is inhabited. Analogous results are proved for Diller's so-called restricted system and for a type theory based on predicate logic instead of arithmetic.
15

Miller, Alison R., Ying Liang, and Gary Van Zant. "Stem Cell Aging Studies Reveal Changes in the Proteasome." Blood 106, no. 11 (November 16, 2005): 4207. http://dx.doi.org/10.1182/blood.v106.11.4207.4207.

Abstract
To investigate aging mechanisms in murine hematopoietic stem cells (HSCs), a microarray analysis was performed on sorted Lin- cKit+ Sca-1+ cells of young and old mice of two strains: long-lived C57BL/6 (B6) and short-lived DBA/2 (D2). Following analysis by two-way ANOVA with an FDR of 5%, the data were organized using gene ontology software. The following age-related transcriptional changes were found to be significant: increases in 20S (α and β) and 26S proteasome subunits, ribosomal proteins, mitochondrial enzymatic proteins, and carbohydrate metabolism, and a decrease in ubiquitination proteins. In general, B6 changes were more dramatic than those of D2. These data are surprisingly consistent with well-established theories of aging in post-mitotic cells, most commonly neurons, where the function of the proteasome decreases while the number of proteasome subunits increases in a compensatory manner to maintain protein turnover. Eventually, the proteasome can become overloaded and the lysosomal pathway must assume the load of additional protein degradation. It is common for protein aggregates, lysosomal dysfunction, and damaged mitochondria to be observed in neurons with inhibited proteasome function. Our data are consistent with the initial steps in this theory, but the later, dysfunctional consequences of proteasome inhibition have not yet been observed in HSCs. It is interesting that this pattern would be present in stem cells, since they are responsible for self-renewal and differentiation throughout the lifetime of the organism. It is possible that the lack of protein aggregates or dysfunctional mitochondria is actually due to dilution by cell division, especially considering that the proteasome is responsible for degradation of many cell cycle proteins and that genes involved in energy production were up-regulated. Therefore, it is possible that HSCs show similar aging trends with regard to the proteasome but do not have the same cellular consequences as post-mitotic cells. It is also interesting that B6 and D2 showed the same aging trends but B6 appears better able to compensate for proteasome alterations. For example, B6 increases the beta catalytic subunits of the 20S proteasome core more so than its short-lived counterpart. Additionally, B6 are known to have more HSCs in old age than D2, which is consistent with more cells being able either to compensate for decreased proteasome function or to increase cell cycling to dilute out any deleterious effects of proteasome inhibition. These data suggest that HSCs share common aging trends with other cell types; however, they are able to maintain “healthy aging” more effectively than post-mitotic cells, especially in the case of B6. It is tempting to speculate that the longevity of D2 mice could be limited by their alterations in proteasome function and decreased ability to compensate other cellular responses in order to maintain homeostasis.
16

Wanyonyi, Kizito Simiyu, and Dominic Ngaba. "Digital Financial Services and Financial Performance of Savings and Credit Cooperative Societies in Kakamega County, Kenya." International Journal of Current Aspects in Finance, Banking and Accounting 3, no. 1 (July 9, 2021): 9–20. http://dx.doi.org/10.35942/ijcfa.v3i1.177.

Abstract
Savings and Credit Co-operative Societies (Saccos) in Kenya have realised tremendous growth in the subsector and are investing huge amounts of their scarce financial resources in digital technology to enhance service delivery, offer a wider range of products and services, increase membership mobilisation and size, and ensure better structure and effective financial performance. Digital financial services as used in the Saccos industry are a result of the Information Communication Technology revolution, commonly referred to as digital commerce. Many Saccos are steadily changing from manual banking systems of operations to providing digital financial (e-banking) services that include internet banking, M-banking and Automated Teller Machine support. The adoption of digital financial services by the Saccos is a strategic attempt to deal with increased cut-throat competition from traditional banking institutions and non-banking financial institutions, to cut costs and add value to their services in order to optimise benefits to the shareholders. Despite the fact that Saccos have rapidly adopted digital financial services to provide services, and that they drive a huge section of the financial sector savings of the economy, they have experienced various challenges such as uncertainty and risk due to digital financial services. The study sought to establish the influence of digital financial services on the financial performance of SACCOs in Kakamega County, Kenya. The specific objectives were to determine the effect of mobile banking, internet banking, use of credit cards and digital funds transfer on the financial performance of SACCOs in Kakamega County, Kenya. The research was guided by three theories of innovation and technology: the Diffusion of Innovation Theory, the Task-Technology Fit Theory, and the Technological, Organisational and Environmental context Theory. The study used a descriptive research design. The population of study were staff at the three SACCOs operating in Kakamega County. This consisted of 162 respondents who are the staff of the SACCOs. A sample of 49 respondents was taken, which forms 30% of the target population, spread evenly across the three SACCOs. The primary data was collected by use of a self-administered semi-structured questionnaire. Collected data was analysed through descriptive and inferential statistics by use of SPSS. Findings were presented by use of tables, frequencies, percentages, means and standard deviation. The study found that the financial performance of the SACCOs was significantly influenced by the digital financial services instituted by the SACCO managements. They demonstrated reliable mobile banking systems where most of their customers had enrolled on the mobile banking platform and most customer queries and updates were sorted via the mobile platform. Given the limitations and findings of this study, the researcher recommends that, since there exists a positive relationship between digital financial services and bank performance and e-banking has brought services closer to bank customers, hence improving banking industry performance, SACCOs must also enhance the dynamics of the sector and embrace digital banking fully and extensively. Mobile banking faces various challenges, among them system delays by the mobile money transfer service providers, slow processing of transactions, high transaction costs, limits on the amount of money that can be withdrawn in a day, and fraud.
17

Geist, C., and U. Endriss. "Automated Search for Impossibility Theorems in Social Choice Theory: Ranking Sets of Objects." Journal of Artificial Intelligence Research 40 (January 24, 2011): 143–74. http://dx.doi.org/10.1613/jair.3126.

Abstract
We present a method for using standard techniques from satisfiability checking to automatically verify and discover theorems in an area of economic theory known as ranking sets of objects. The key question in this area, which has important applications in social choice theory and decision making under uncertainty, is how to extend an agent's preferences over a number of objects to a preference relation over nonempty sets of such objects. Certain combinations of seemingly natural principles for this kind of preference extension can result in logical inconsistencies, which has led to a number of important impossibility theorems. We first prove a general result that shows that for a wide range of such principles, characterised by their syntactic form when expressed in a many-sorted first-order logic, any impossibility exhibited at a fixed (small) domain size will necessarily extend to the general case. We then show how to formulate candidates for impossibility theorems at a fixed domain size in propositional logic, which in turn enables us to automatically search for (general) impossibility theorems using a SAT solver. When applied to a space of 20 principles for preference extension familiar from the literature, this method yields a total of 84 impossibility theorems, including both known and nontrivial new results.
18

Otto, Martin. "An Interpolation Theorem." Bulletin of Symbolic Logic 6, no. 4 (December 2000): 447–62. http://dx.doi.org/10.2307/420966.

Abstract
Lyndon's Interpolation Theorem asserts that for any valid implication between two purely relational sentences of first-order logic, there is an interpolant in which each relation symbol appears positively (negatively) only if it appears positively (negatively) in both the antecedent and the succedent of the given implication. We prove a similar, more general interpolation result with the additional requirement that, for some fixed tuple of unary predicates U, all formulae under consideration have all quantifiers explicitly relativised to one of the U. Under this stipulation, existential (universal) quantification over U contributes a positive (negative) occurrence of U. It is shown how this single new interpolation theorem, obtained by a canonical and rather elementary model theoretic proof, unifies a number of related results: the classical characterisation theorems concerning extensions (substructures) with those concerning monotonicity, as well as a many-sorted interpolation theorem focusing on positive vs. negative occurrences of predicates and on existentially vs. universally quantified sorts.
19

Koch, Sebastian. "About Quotient Orders and Ordering Sequences." Formalized Mathematics 25, no. 2 (July 1, 2017): 121–39. http://dx.doi.org/10.1515/forma-2017-0012.

Abstract
In preparation for the formalization in Mizar [4] of lotteries as given in [14], this article closes some gaps in the Mizar Mathematical Library (MML) regarding relational structures. The quotient order is introduced by the equivalence relation identifying two elements x, y of a preorder as equivalent if x ⩽ y and y ⩽ x. This concept is known (see e.g. chapter 5 of [19]) and was first introduced into the MML in [13], and that work is incorporated here. Furthermore, given a set A, a partition D of A and a finite-support function f : A → ℝ, a function Σ_f : D → ℝ, Σ_f(X) = ∑_{x∈X} f(x), can be defined as a kind of natural "restriction" of f to D. The first main result of this article can then be formulated as: $$\sum_{x \in A} f(x) = \sum_{X \in D} \Sigma_f (X) \left( = \sum_{X \in D} \sum_{x \in X} f(x) \right)$$ After that, (weakly) ascending/descending finite sequences (based on [3]) are introduced, in notation analogous to their infinite counterparts introduced in [18] and [13]. The second main result is that any finite subset of any transitive connected relational structure can be sorted as an ascending or descending finite sequence, thus generalizing the results from [16], where finite sequences of real numbers were sorted. The third main result of the article is that any weakly ascending/weakly descending finite sequence on elements of a preorder induces a weakly ascending/weakly descending finite sequence on the projection of these elements into the quotient order. Furthermore, weakly ascending finite sequences can be interpreted as directed walks in a directed graph, when the set of edges is described by ordered pairs of vertices, which is quite common (see e.g. [10]). Additionally, some auxiliary theorems are provided, e.g. two schemes to find the smallest or the largest element in a finite subset of a connected transitive relational structure with a given property, and a lemma I found rather useful: given two finite one-to-one sequences s, t on a set X, such that rng t ⊆ rng s, and a function f : X → ℝ such that f is zero for every x ∈ rng s \ rng t, we have ∑ f ∘ s = ∑ f ∘ t.
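The first main result stated in the abstract is easy to check computationally. A minimal sketch, with invented names, of the "restriction" Σ_f of a finite-support function to a partition and the identity ∑_{x∈A} f(x) = ∑_{X∈D} Σ_f(X):

```python
def sigma_f(f, partition):
    """Sigma_f : D -> R, mapping each block X of the partition to the
    sum of f over X (f given as a dict with finite support)."""
    return {frozenset(X): sum(f.get(x, 0) for x in X) for X in partition}

# The identity: summing f over A equals summing the block sums over D.
A = {1, 2, 3, 4, 5}
D = [{1, 2}, {3}, {4, 5}]               # a partition of A
f = {1: 0.5, 2: 1.0, 3: -2.0, 5: 3.0}   # f(4) = 0 (finite support)
lhs = sum(f.get(x, 0) for x in A)
rhs = sum(sigma_f(f, D).values())
```

Here `lhs` and `rhs` coincide because every element of A lies in exactly one block of D, so each term f(x) is counted exactly once on both sides.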
20

Iovino, José. "On the maximality of logics with approximations." Journal of Symbolic Logic 66, no. 4 (December 2001): 1909–18. http://dx.doi.org/10.2307/2694984.

Abstract
In this paper we analyze some aspects of the question of using methods from model theory to study structures of functional analysis. By a well-known result of P. Lindström, one cannot extend the expressive power of first order logic and yet preserve its most outstanding model theoretic characteristics (e.g., compactness and the Löwenheim-Skolem theorem). However, one may consider extending the scope of first order in a different sense, specifically, by expanding the class of structures that are regarded as models (e.g., including Banach algebras or other structures of functional analysis), and ask whether the resulting extensions of first order model theory preserve some of its desirable characteristics. A formal framework for the study of structures based on Banach spaces from the perspective of model theory was first introduced by C. W. Henson in [8] and [6]. Notions of syntax and semantics for these structures were defined, and it was shown that using them one obtains a model theoretic apparatus that satisfies many of the fundamental properties of first order model theory. For instance, one has compactness, Löwenheim-Skolem, and omitting types theorems. Further aspects of the theory, namely, the fundamentals of stability and forking, were first introduced in [10] and [9]. The classes of mathematical structures formally encompassed by this framework are normed linear spaces, possibly expanded with additional structure, e.g., operations, real-valued relations, and constants. This notion subsumes wide classes of structures from functional analysis. However, the restriction that the universe of a structure be a normed space is not necessary. (This restriction has a historical, rather than technical, origin; specifically, the development of the theory was originally motivated by questions in Banach space geometry.) Analogous techniques can be applied if the universe is a metric space. Now, when the underlying metric topology is discrete, the resulting model theory coincides with first order model theory, so this logic extends first order in the sense described above. Furthermore, without any cost in the mathematical complexity, one can also work in multi-sorted contexts, so, for instance, one sort could be an operator algebra while another is, say, a metric space.
Los estilos APA, Harvard, Vancouver, ISO, etc.
21

Kachapova, Farida. "Multi-sorted version of second order arithmetic". Australasian Journal of Logic 13, n.º 5 (5 de septiembre de 2016). http://dx.doi.org/10.26686/ajl.v13i5.3936.

Texto completo
Resumen
This paper describes the axiomatic theories SA and SAR, which are versions of second order arithmetic with countably many sorts for sets of natural numbers. The theories are intended for application in reverse mathematics, because their multi-sorted language allows some mathematical statements to be expressed in a more natural form than in standard second order arithmetic. We study metamathematical properties of the theories SA, SAR and their fragments. We show that SA is mutually interpretable with the theory of arithmetical truth PATr, obtained from Peano arithmetic by adding infinitely many truth predicates. Corresponding fragments of SA and PATr are also mutually interpretable. We compare the proof-theoretical strengths of the fragments; in particular, we show that each fragment SAs with sorts <=s is weaker than the next fragment SAs+1.
Los estilos APA, Harvard, Vancouver, ISO, etc.
22

Durán, Francisco y José Meseguer. "WITHDRAWN: On the Church-Rosser and coherence properties of conditional order-sorted rewrite theories". Journal of Logic and Algebraic Programming, mayo de 2012. http://dx.doi.org/10.1016/j.jlap.2012.03.012.

Texto completo
Los estilos APA, Harvard, Vancouver, ISO, etc.
23

Calvanese, Diego, Silvio Ghilardi, Alessandro Gianola, Marco Montali y Andrey Rivkin. "Combination of Uniform Interpolants via Beth Definability". Journal of Automated Reasoning, 12 de mayo de 2022. http://dx.doi.org/10.1007/s10817-022-09627-1.

Texto completo
Resumen
Uniform interpolants were largely studied in non-classical propositional logics since the nineties, and their connection to model completeness was pointed out in the literature. A subsequent, parallel research line inside the automated reasoning community investigated uniform quantifier-free interpolants (sometimes referred to as “covers”) in first-order theories. In this paper, we investigate cover transfer to theory combinations in the disjoint signatures case. We prove that, for convex theories, cover algorithms can be transferred to theory combinations under the same hypothesis needed to transfer quantifier-free interpolation (i.e., the equality interpolating property, aka strong amalgamation property). The key feature of our algorithm relies on the extensive usage of the Beth definability property for primitive fragments to convert implicitly defined variables into their explicitly defining terms. In the non-convex case, we show by a counterexample that covers may not exist in the combined theories, even in case combined quantifier-free interpolants do exist. However, we exhibit a cover transfer algorithm operating also in the non-convex case for special kinds of theory combinations; these combinations (called ‘tame combinations’) concern multi-sorted theories arising in many model-checking applications (in particular, the ones oriented to verification of data-aware processes).
Los estilos APA, Harvard, Vancouver, ISO, etc.
24

Dias, Filipe S., Michael Betancourt, Patricia María Rodríguez-González y Luís Borda-de-Água. "Analysing the distance decay of community similarity in river networks using Bayesian methods". Scientific Reports 11, n.º 1 (4 de noviembre de 2021). http://dx.doi.org/10.1038/s41598-021-01149-x.

Texto completo
Resumen
The distance decay of community similarity (DDCS) is a pattern that is widely observed in terrestrial and aquatic environments. Niche-based theories argue that species are sorted in space according to their ability to adapt to new environmental conditions. The ecological neutral theory argues that community similarity decays due to ecological drift. The continuum hypothesis provides an intermediate perspective between niche-based theories and the neutral theory, arguing that niche and neutral factors lie at opposite ends of a continuum that ranges from competitive to stochastic exclusion. We assessed the association between niche-based and neutral factors and changes in community similarity, measured by Sorensen’s index, in riparian plant communities. We assessed the importance of neutral processes using network distances and flow connection, and of niche-based processes using Strahler order differences and precipitation differences. We used a hierarchical Bayesian approach to determine which perspective is best supported by the results. We used a dataset composed of 338 vegetation censuses from eleven river basins in continental Portugal. We observed that changes in Sorensen indices were associated with network distance, flow connection, Strahler order difference and precipitation difference, but to different degrees. The results suggest that changes in community similarity are associated with both environmental and neutral factors, supporting the continuum hypothesis.
Los estilos APA, Harvard, Vancouver, ISO, etc.
25

Zhang, Jian, Fenhua Zhou, Jinxia Jiang, Xia Duan y Xin Yang. "Effective Teaching Behaviors of Clinical Nursing Teachers: A Qualitative Meta-Synthesis". Frontiers in Public Health 10 (28 de abril de 2022). http://dx.doi.org/10.3389/fpubh.2022.883204.

Texto completo
Resumen
Objectives: To identify, appraise, and synthesize the available evidence exploring the effective teaching behaviors of clinical nursing teachers. Design: The Joanna Briggs Institute (JBI) guidelines were followed, and a meta-synthesis was conducted. Review Methods: The following databases were searched for relevant qualitative studies published in English and reporting primary data analysis, including experiences and perceptions of nursing students: PubMed, EBSCOhost, OVID, etc. The Qualitative Assessment and Review Instrument was used to pool the qualitative research findings. Through repeated reading of the original literature, similar findings were combined and sorted into new categories, and then summarized into different synthesized themes. Results: A total of nine articles were included. The review process produced 29 subcategories that were aggregated into seven categories. The categories generated three synthesized findings: good teaching literacy, solid professional competence, and a harmonious faculty-student relationship. Conclusions: The effective teaching behaviors of clinical nursing teachers are a driving force for the progress and growth of nursing students. To improve the effectiveness of clinical nursing teaching, nursing teachers should be fully aware of the effective teaching behaviors that help nursing students master nursing theories and skills.
Los estilos APA, Harvard, Vancouver, ISO, etc.
26

Correa Guerrero, Jose, Jorge Rico Fontalvo, Rodrigo Daza Arnedo, Emilio Abuabara Franco, Nehomar Eduardo Pájaro Galvis, Maria Cardona Blanco, Victor Leal Martínez et al. "Acid-base imbalance: a review with proposed unified diagnostic algorithm". Revista Colombiana de Nefrología 7, n.º 2 (4 de agosto de 2020). http://dx.doi.org/10.22265/acnef.7.2.497.

Texto completo
Resumen
Background: Alterations in the acid-base balance are studied in all medical specialties. Although most cases derive from a preexisting pathology, they can also manifest in a primary context. Proper identification of the acid-base disorder allows the pathological process to be characterized. The correct interpretation of blood gasometry as a technique for monitoring the ventilatory status, oxygenation, and acid-base balance of a patient requires the integration of various physicochemical approaches in order to specify a diagnosis, quantify a therapeutic response, and monitor the severity or progression of a pathological process. Material & Method: A literature review was conducted in the PubMed, Scopus and Science Direct databases. Articles were selected according to title and abstract and sorted by topics relevant to pathophysiology, divergences, clinical approach, diagnosis, and management. Results: A guide to the clinical correlation of the critical patient with blood gasometry parameters, characterizing the acid-base disorder through a proposed diagnostic algorithm. Conclusion: The incorporation of the three theories into a diagnostic algorithm facilitates a greater understanding of the pathophysiological mechanisms and allows us to identify a more precise therapeutic objective to correct the underlying disorder in the patient's different clinical contexts.
Los estilos APA, Harvard, Vancouver, ISO, etc.
27

Kiekens, Anneleen, Bernadette Dierckx de Casterlé, Giampietro Pellizzer, Idda H. Mosha, Fausta Mosha, Tobias F. Rinke de Wit, Raphael Z. Sangeda et al. "Exploring the mechanisms behind HIV drug resistance in sub-Saharan Africa: conceptual mapping of a complex adaptive system based on multi-disciplinary expert insights". BMC Public Health 22, n.º 1 (7 de marzo de 2022). http://dx.doi.org/10.1186/s12889-022-12738-4.

Texto completo
Resumen
Background: HIV drug resistance (HIVDR) continues to threaten the effectiveness of worldwide antiretroviral therapy (ART). Emergence and transmission of HIVDR are driven by several interconnected factors. Though much has been done to uncover the factors influencing HIVDR, the overall interconnectedness between these factors remains unclear, and African policy makers encounter difficulties in setting priorities for combating HIVDR. By viewing HIVDR as a complex adaptive system, through the eyes of multi-disciplinary HIVDR experts, we aimed to make a first attempt at linking the different influencing factors and gaining a deeper understanding of the complexity of the system. Methods: We designed a detailed systems map of factors influencing HIVDR based on semi-structured interviews with 15 international HIVDR experts from, or with experience in, sub-Saharan Africa, from different disciplinary backgrounds and affiliated with different types of institutions. The resulting detailed system map was conceptualized into three main HIVDR feedback loops and further strengthened with literature evidence. Results: Factors influencing HIVDR in sub-Saharan Africa and their interactions were sorted into five categories: biology, individual, social context, healthcare system and ‘overarching’. We identified three causal loops cross-cutting these layers, which relate to three interconnected subsystems of mechanisms influencing HIVDR. The ‘adherence motivation’ subsystem concerns the interplay of factors influencing people living with HIV to alternate between adherence and non-adherence. The ‘healthcare burden’ subsystem is a reinforcing loop leading to an increase in HIVDR at the local population level. The ‘ART overreliance’ subsystem is a balancing feedback loop leading to complacency among program managers when there is overreliance on ART with a perceived low risk of drug resistance. The three subsystems are interconnected at different levels.
Conclusions: Interconnectedness of the three subsystems underlines the need to act on the entire system of factors surrounding HIVDR in sub-Saharan Africa in order to target interventions and to prevent unwanted effects on other parts of the system. The three theories that emerged while studying HIVDR as a complex adaptive system form a starting point for further qualitative and quantitative investigation.
Los estilos APA, Harvard, Vancouver, ISO, etc.
28

Al-Rawi, Ahmed, Carmen Celestini, Nicole Stewart y Nathan Worku. "How Google Autocomplete Algorithms about Conspiracy Theorists Mislead the Public". M/C Journal 25, n.º 1 (21 de marzo de 2022). http://dx.doi.org/10.5204/mcj.2852.

Texto completo
Resumen
Introduction: Google Autocomplete Algorithms
Despite recent attention to the impact of social media platforms on political discourse and public opinion, most people locate their news on search engines (Robertson et al.). When a user conducts a search, millions of outputs, in the form of videos, images, articles, and Websites, are sorted to present the most relevant search predictions. Google, the most dominant search engine in the world, expanded its search index in 2009 to include the autocomplete function, which provides suggestions for query inputs (Dörr and Stephan). Google’s autocomplete function also allows users to “search smarter” by reducing typing time by 25 percent (Baker and Potts 189). Google’s complex algorithm is influenced by factors like search history, location, and keyword searches (Karapapa and Borghi), and there are policies to ensure the autocomplete function does not contain harmful content. In 2017, Google implemented a feedback tool to allow human evaluators to assess the quality of search results; however, the algorithm still provides misleading results that frame far-right actors as neutral. In this article, we use reverse engineering to understand the nature of these algorithms in relation to the descriptive outcome, to illustrate how autocomplete subtitles label conspiracists in three countries. According to Google, these “subtitles are generated automatically”, further stating that the “systems might determine that someone could be called an actor, director, or writer. Only one of these can appear as the subtitle” and that Google “cannot accept or create custom subtitles” (Google). We focused our attention on well-known conspiracy theorists because of their influence and audience outreach. In this article we argue that these subtitles are problematic because they can mislead the public and amplify extremist views. Google’s autocomplete feature is misleading because it does not highlight what is publicly known about these actors.
The labels are neutral or positive but never negative, reflecting primary jobs and/or the actor’s preferred descriptions. This is harmful to the public because Google’s search rankings can influence a user’s knowledge and information preferences through the search engine manipulation effect (Epstein and Robertson). Users’ preferences and understanding of information can be manipulated based upon their trust in Google search results, thus allowing these labels to be widely accepted instead of providing a full picture of the harm their ideologies and beliefs cause.
Algorithms That Mainstream Conspiracies
Search engines establish order and visibility to Web pages that operationalise and stabilise meaning to particular queries (Gillespie). Google’s subtitles and blackbox operate as a complex algorithm for its search index and offer a mediated visibility to aspects of social and political life (Gillespie). Algorithms are designed to perform computational tasks through an operational sequence that computer systems must follow (Broussard), but they are also “invisible infrastructures” that Internet users consciously or unconsciously follow (Gran et al. 1779). The way algorithms rank, classify, sort, predict, and process data is political because it presents the world through a predetermined lens (Bucher 3) decided by proprietary knowledge – a “secret sauce” (O’Neil 29) – that is not disclosed to the general public (Christin). Technology titans, like Google, Facebook, and Amazon (Webb), rigorously protect and defend intellectual property for these algorithms, which are worth billions of dollars (O’Neil). As a result, algorithms are commonly defined as opaque, secret “black boxes” that conceal the decisions that are already made “behind corporate walls and layers of code” (Pasquale 899).
The opacity of algorithms is related to layers of intentional secrecy, technical illiteracy, the size of algorithmic systems, and the ability of machine learning algorithms to evolve and become unintelligible to humans, even to those trained in programming languages (Christin 898-899). The opaque nature of algorithms alongside the perceived neutrality of algorithmic systems is problematic. Search engines are increasingly normalised and this leads to a socialisation where suppositions are made that “these artifacts are credible and provide accurate information that is fundamentally depoliticized and neutral” (Noble 25). Google’s autocomplete and PageRank algorithms exist outside of the veil of neutrality. In 2015, Google’s photos app, which uses machine learning techniques to help users collect, search, and categorise images, labelled two black people as ‘gorillas’ (O’Neil). Safiya Noble illustrates how media and technology are rooted in systems of white supremacy, and how these long-standing social biases surface in algorithms, illustrating how racial and gendered inequities embed into algorithmic systems. Google actively fixes algorithmic biases with band-aid-like solutions, which means the errors remain inevitable constituents within the algorithms. Rising levels of automation correspond to a rising level of errors, which can lead to confusion and misdirection of the algorithms that people use to manage their lives (O’Neil). As a result, software, code, machine learning algorithms, and facial/voice recognition technologies are scrutinised for producing and reproducing prejudices (Gray) and promoting conspiracies – often described as algorithmic bias (Bucher). 
Algorithmic bias occurs because algorithms are trained by historical data already embedded with social biases (O’Neil), and if that is not problematic enough, algorithms like Google’s search engine also learn and replicate the behaviours of Internet users (Benjamin 93), including conspiracy theorists and their followers. Technological errors, algorithmic bias, and increasing automation are further complicated by the fact that Google’s Internet service uses “2 billion lines of code” – a magnitude that is difficult to keep track of, including for “the programmers who designed the algorithm” (Christin 899). Understanding this level of code is not critical to understanding algorithmic logics, but we must be aware of the inscriptions such algorithms afford (Krasmann). As algorithms become more ubiquitous it is urgent to “demand that systems that hold algorithms accountable become ubiquitous as well” (O’Neil 231). This is particularly important because algorithms play a critical role in “providing the conditions for participation in public life”; however, the majority of the public has a modest to nonexistent awareness of algorithms (Gran et al. 1791). Given the heavy reliance of Internet users on Google’s search engine, it is necessary for research to provide a glimpse into the black boxes that people use to extract information especially when it comes to searching for information about conspiracy theorists. Our study fills a major gap in research as it examines a sub-category of Google’s autocomplete algorithm that has not been empirically explored before. Unlike the standard autocomplete feature that is primarily programmed according to popular searches, we examine the subtitle feature that operates as a fixed label for popular conspiracists within Google’s algorithm. Our initial foray into our research revealed that this is not only an issue with conspiracists, but also occurs with terrorists, extremists, and mass murderers. 
Method
Using a reverse engineering approach (Bucher) from September to October 2021, we explored how Google’s autocomplete feature assigns subtitles to widely known conspiracists. The conspiracists were not geographically limited, and we searched for those who reside in the United States, Canada, the United Kingdom, and various countries in Europe. Reverse engineering stems from Ashby’s canonical text on cybernetics, in which he argues that black boxes are not a problem; the problem or challenge is related to the way one can discern their contents. As Google’s algorithms are not disclosed to the general public (Christin), we use this method as an extraction tool to understand the nature of how these algorithms (Eilam) apply subtitles. To systematically document the search results, we took screenshots for every conspiracist we searched in an attempt to archive the Google autocomplete algorithm. By relying on previous literature, reports, and the figures’ public statements, we identified and searched Google for 37 Western-based and influential conspiracy theorists. We initially experimented with other problematic figures, including terrorists, extremists, and mass murderers, to see whether Google applied a subtitle or not. Additionally, we examined whether subtitles were positive, neutral, or negative, and compared this valence to personality descriptions for each figure. Using the standard procedures of content analysis (Krippendorff), we focus on the manifest or explicit meaning of text to inform subtitle valence in terms of their positive, negative, or neutral connotations. These manifest features refer to the “elements that are physically present and countable” (Gray and Densten 420) or what is known as the dictionary definitions of items. Using a manual query, we searched Google for subtitles ascribed to conspiracy theorists, and found the results were consistent across different countries. Searches were conducted on Firefox and Chrome and tested on an Android phone.
Regardless of language input or the country location established by a Virtual Private Network (VPN), the search terms remained stable, regardless of who conducted the search. The conspiracy theorists in our dataset cover a wide range of conspiracies, including historical figures like Nesta Webster and John Robison, who were foundational in Illuminati lore, as well as contemporary conspiracists such as Marjorie Taylor Greene and Alex Jones. Each individual’s name was searched on Google with a VPN set to three countries.
Results and Discussion
This study examines Google’s autocomplete feature associated with subtitles of conspiratorial actors. We first tested Google’s subtitling system with known terrorists, convicted mass shooters, and controversial cult leaders like David Koresh. Garry et al. (154) argue that “while conspiracy theories may not have mass radicalising effects, they are extremely effective at leading to increased polarization within societies”. We believe that the impact of neutral subtitling of conspiracists reflects the integral role conspiracies play in contemporary politics and right-wing extremism. The sample includes contemporary and historical conspiracists to establish consistency in labelling. For historical figures, the labels are less consequential and simply reflect the reality that Google’s subtitles are primarily neutral. Of the 37 conspiracy theorists we searched (see Table 1 in the Appendix), seven (18.9%) do not have an associated subtitle, and the other 30 (81%) have distinctive subtitles, but none of them reflects the public knowledge of the individuals’ harmful role in disseminating conspiracy theories. In the list, 16 (43.2%) are noted for their contribution to the arts, 4 are labelled as activists, 7 are associated with their professional affiliation or original jobs, 2 with the journalism industry, one is linked to his sports career, another one as a researcher, and 7 have no subtitle.
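The simple percentage tally reported above can be reproduced in a few lines. This is an illustrative sketch only, not the authors' code; the counts (37 figures searched, 7 without a subtitle) are taken from the article's own figures.

```python
# Illustrative sketch: percentage breakdown of subtitled vs. unsubtitled
# figures, using the counts reported in the article (37 total, 7 without).
n_total = 37
n_no_subtitle = 7
n_subtitled = n_total - n_no_subtitle

pct_no_subtitle = round(100 * n_no_subtitle / n_total, 1)  # 18.9
pct_subtitled = round(100 * n_subtitled / n_total, 1)      # 81.1

print(f"{n_subtitled} subtitled ({pct_subtitled}%), "
      f"{n_no_subtitle} without ({pct_no_subtitle}%)")
```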
The problem here is that when white nationalists or conspiracy theorists are not acknowledged as such in their subtitles, search engine users could possibly encounter content that may sway their understanding of society, politics, and culture. For example, a conspiracist like Alex Jones is labeled as an “American Radio Host” (see Figure 1), despite losing two defamation lawsuits for declaring that the shooting at Sandy Hook Elementary School in Newtown, Connecticut, was a ‘false flag’ event. Jones’s actions on his InfoWars media platforms led to parents of shooting victims being stalked and threatened. Another conspiracy theorist, Gavin McInnes, the creator of the far-right, neo-fascist Proud Boys organisation, a known terrorist entity in Canada and hate group in the United States, is listed simply as a “Canadian writer” (see Figure 1).
Fig. 1: Screenshots of Google’s subtitles for Alex Jones and Gavin McInnes.
Although subtitles under an individual’s name are not audio, video, or image content, the algorithms that create these subtitles are an invisible infrastructure that could cause harm through their uninterrogated status and pervasive presence. This could then be a potential conduit to media which could cause harm and develop distrust in electoral and civic processes, or all institutions. Examples from our list include Brittany Pettibone, whose subtitle states that she is an “American writer” despite being one of the main propagators of the Pizzagate conspiracy, which led to Edgar Maddison Welch (whose subtitle is “Screenwriter”) travelling from North Carolina to Washington D.C. to violently threaten and confront those who worked at Comet Ping Pong Pizzeria. The same misleading label can be found via searching for James O’Keefe of Project Veritas, who is positively labelled as “American activist”. Veritas is known for releasing audio and video recordings that contain false information designed to discredit academic, political, and service organisations.
In one instance, a 2020 video released by O’Keefe accused Democrat Ilhan Omar’s campaign of illegally collecting ballots. The same dissembling of distrust applies to Mike Lindell, whose Google subtitle is “CEO of My Pillow”, as well as Sidney Powell, who is listed as an “American lawyer”; both are propagators of conspiracy theories relating to the 2020 presidential election. The subtitles attributed to conspiracists on Google do not acknowledge the widescale public awareness of the negative role these individuals play in spreading conspiracy theories or causing harm to others. Some of the selected conspiracists are well known white nationalists, including Stefan Molyneux who has been banned from social media platforms like Twitter, Twitch, Facebook, and YouTube for the promotion of scientific racism and eugenics; however, he is neutrally listed on Google as a “Canadian podcaster”. In addition, Laura Loomer, who describes herself as a “proud Islamophobe,” is listed by Google as an “Author”. These subtitles can pose a threat by normalising individuals who spread conspiracy theories, sow dissension and distrust in institutions, and cause harm to minority groups and vulnerable individuals. Once clicking on the selected person, the results, although influenced by the algorithm, did not provide information that aligned with the associated subtitle. The search results are skewed to the actual conspiratorial nature of the individuals and associated news articles. In essence, the subtitles do not reflect the subsequent search results, and provide a counter-labelling to the reality of the resulting information provided to the user. Another significant example is Jerad Miller, who is listed as “American performer”, despite the fact that he is the Las Vegas shooter who posted anti-government and white nationalist 3 Percenters memes on his social media (SunStaff), even though the majority of search results connect him to the mass shooting he orchestrated in 2014. 
The subtitle “performer” is certainly not the common characteristic that should be associated with Jerad Miller. Table 1 in the Appendix shows that individuals who are not within the contemporary milieux of conspiracists, but have had a significant impact, such as Nesta Webster, Robert Welch Junior, and John Robison, were listed by their original profession or sometimes without a subtitle. David Icke, infamous for his lizard people conspiracies, has a subtitle reflecting his past football career. In all cases, Google’s subtitle was never consistent with the actor’s conspiratorial behaviour. Indeed, the neutral subtitles applied to conspiracists in our research may reflect some aspect of the individuals’ previous careers but are not an accurate reflection of the individuals’ publicly known role in propagating hate, which we argue is misleading to the public. For example, David Icke may be a former footballer, but the 4.7 million search results predominantly focus on his conspiracies, his public fora, and his status of being deplatformed by mainstream social media sites. The subtitles are not only neutral, but they are not based on the actual search results, and so are misleading in what the searcher will discover; most importantly, they do not provide a warning about the misinformation contained in the autocomplete subtitle. To conclude, algorithms automate the search engines that people use in the functions of everyday life, but are also entangled in technological errors, algorithmic bias, and have the capacity to mislead the public. Through a process of reverse engineering (Ashby; Bucher), we searched 37 conspiracy theorists to decode the Google autocomplete algorithms. We identified how the subtitles attributed to conspiracy theorists are neutral, positive, but never negative, which does not accurately reflect the widely known public conspiratorial discourse these individuals propagate on the Web. 
This is problematic because the algorithms that determine these subtitles are invisible infrastructures acting to misinform the public and to mainstream conspiracies within larger social, cultural, and political structures. This study highlights the urgent need for Google to review the subtitles attributed to conspiracy theorists, terrorists, and mass murderers, to better inform the public about the negative nature of these actors, rather than always labelling them in neutral or positive ways.
Funding Acknowledgement
This project has been made possible in part by the Canadian Department of Heritage – the Digital Citizen Contribution program – under grant no. R529384. The title of the project is “Understanding hate groups’ narratives and conspiracy theories in traditional and alternative social media”.
References
Ashby, W. Ross. An Introduction to Cybernetics. Chapman & Hall, 1961.
Baker, Paul, and Amanda Potts. "‘Why Do White People Have Thin Lips?’ Google and the Perpetuation of Stereotypes via Auto-Complete Search Forms." Critical Discourse Studies 10.2 (2013): 187-204.
Benjamin, Ruha. Race after Technology: Abolitionist Tools for the New Jim Code. Polity, 2019.
Broussard, Meredith. Artificial Unintelligence: How Computers Misunderstand the World. MIT P, 2018.
Bucher, Taina. If... Then: Algorithmic Power and Politics. OUP, 2018.
Christin, Angèle. "The Ethnographer and the Algorithm: Beyond the Black Box." Theory and Society 49.5 (2020): 897-918.
D'Ignazio, Catherine, and Lauren F. Klein. Data Feminism. MIT P, 2020.
Dörr, Dieter, and Juliane Stephan. "The Google Autocomplete Function and the German General Right of Personality." Perspectives on Privacy. De Gruyter, 2014. 80-95.
Eilam, Eldad. Reversing: Secrets of Reverse Engineering. John Wiley & Sons, 2011.
Epstein, Robert, and Ronald E. Robertson. "The Search Engine Manipulation Effect (SEME) and Its Possible Impact on the Outcomes of Elections." Proceedings of the National Academy of Sciences 112.33 (2015): E4512-E4521.
Garry, Amanda, et al. "QAnon Conspiracy Theory: Examining its Evolution and Mechanisms of Radicalization." Journal for Deradicalization 26 (2021): 152-216.
Gillespie, Tarleton. "Algorithmically Recognizable: Santorum’s Google Problem, and Google’s Santorum Problem." Information, Communication & Society 20.1 (2017): 63-80.
Google. “Update your Google knowledge panel.” 2022. 3 Jan. 2022 <https://support.google.com/knowledgepanel/answer/7534842?hl=en#zippy=%2Csubtitle>.
Gran, Anne-Britt, Peter Booth, and Taina Bucher. "To Be or Not to Be Algorithm Aware: A Question of a New Digital Divide?" Information, Communication & Society 24.12 (2021): 1779-1796.
Gray, Judy H., and Iain L. Densten. "Integrating Quantitative and Qualitative Analysis Using Latent and Manifest Variables." Quality and Quantity 32.4 (1998): 419-431.
Gray, Kishonna L. Intersectional Tech: Black Users in Digital Gaming. LSU P, 2020.
Karapapa, Stavroula, and Maurizio Borghi. "Search Engine Liability for Autocomplete Suggestions: Personality, Privacy and the Power of the Algorithm." International Journal of Law and Information Technology 23.3 (2015): 261-289.
Krasmann, Susanne. "The Logic of the Surface: On the Epistemology of Algorithms in Times of Big Data." Information, Communication & Society 23.14 (2020): 2096-2109.
Krippendorff, Klaus. Content Analysis: An Introduction to Its Methodology. Sage, 2004.
Noble, Safiya Umoja. Algorithms of Oppression. New York UP, 2018.
O'Neil, Cathy. Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Crown, 2016.
Pasquale, Frank. The Black Box Society. Harvard UP, 2015.
Robertson, Ronald E., David Lazer, and Christo Wilson. "Auditing the Personalization and Composition of Politically-Related Search Engine Results Pages." Proceedings of the 2018 World Wide Web Conference. 2018.
Staff, Sun.
“A Look inside the Lives of Shooters Jerad Miller, Amanda Miller.” Las Vegas Sun 9 June 2014. <https://lasvegassun.com/news/2014/jun/09/look/>. Webb, Amy. The Big Nine: How the Tech Titans and Their Thinking Machines Could Warp Humanity. Hachette UK, 2019. Appendix Table 1: The subtitles of conspiracy theorists on Google autocomplete Conspiracy Theorist Google Autocomplete Subtitle Character Description Alex Jones American radio host InfoWars founder, American far-right radio show host and conspiracy theorist. The SPLC describes Alex Jones as "the most prolific conspiracy theorist in contemporary America." Barry Zwicker Canadian journalist Filmmaker who made a documentary that claimed fear was used to control the public after 9/11. Bart Sibrel American producer Writer, producer, and director of work to falsely claim the Apollo moon landings between 1969 and 1972 were staged by NASA. Ben Garrison American cartoonist Alt-right and QAnon political cartoonist Brittany Pettibone American writer Far-right, political vlogger on YouTube and propagator of #pizzagate. Cathy O’Brien American author Cathy O’Brien claims she was a victim of a government mind control project called Project Monarch. Dan Bongino American radio host Stakeholder in Parler, Radio Host, Ex-Spy, Conspiracist (Spygate, MAGA election fraud, etc.). David Icke Former footballer Reptilian humanoid conspiracist. David Wynn Miller (No subtitle) Conspiracist, far-right tax protester, and founder of the Sovereign Citizens Movement. Jack Posobiec American activist Alt-right, alt-lite political activist, conspiracy theorist, and Internet troll. Editor of Human Events Daily. James O’Keefe American activist Founder of Project Veritas, a far-right company that propagates disinformation and conspiracy theories. John Robison Foundational Illuminati conspiracist. 
Kevin Annett Canadian writer Former minister and writer, who wrote a book exposing the atrocities to Indigenous Communities, and now is a conspiracist and vlogger. Laura Loomer Author Far-right, anti-Muslim, conspiracy theorist, and Internet personality. Republican nominee in Florida's 21st congressional district in 2020. Marjorie Taylor Greene United States Representative Conspiracist, QAnon adherent, and U.S. representative for Georgia's 14th congressional district. Mark Dice American YouTuber Right-wing conservative pundit and conspiracy theorist. Mark Taylor (No subtitle) QAnon minister and self-proclaimed prophet of Donald Trump, the 45th U.S. President. Michael Chossudovsky Canadian economist Professor emeritus at the University of Ottawa, founder of the Centre for Research on Globalization, and conspiracist. Michael Cremo(Drutakarmā dāsa) American researcher Self-described Vedic creationist whose book, Forbidden Archeology, argues humans have lived on earth for millions of years. Mike Lindell CEO of My Pillow Business owner and conspiracist. Neil Patel English entrepreneur Founded The Daily Caller with Tucker Carlson. Nesta Helen Webster English author Foundational Illuminati conspiracist. Naomi Wolf American author Feminist turned conspiracist (ISIS, COVID-19, etc.). Owen Benjamin American comedian Former actor/comedian now conspiracist (Beartopia), who is banned from mainstream social media for using hate speech. Pamela Geller American activist Conspiracist, Anti-Islam, Blogger, Host. Paul Joseph Watson British YouTuber InfoWars co-host and host of the YouTube show PrisonPlanetLive. QAnon Shaman (Jake Angeli) American activist Conspiracy theorist who participated in the 2021 attack on Capitol Hil. Richard B. Spencer (No subtitle) American neo-Nazi, antisemitic conspiracy theorist, and white supremacist. Rick Wiles (No subtitle) Minister, Founded conspiracy site, TruNews. Robert W. Welch Jr. American businessman Founded the John Birch Society. 
Ronald Watkins (No subtitle) Founder of 8kun. Serge Monast Journalist Creator of Project Blue Beam conspiracy. Sidney Powell (No subtitle) One of former President Trump’s Lawyers, and renowned conspiracist regarding the 2020 Presidential election. Stanton T. Friedman Nuclear physicist Original civilian researcher of the 1947 Roswell UFO incident. Stefan Molyneux Canadian podcaster Irish-born, Canadian far-right white nationalist, podcaster, blogger, and banned YouTuber, who promotes conspiracy theories, scientific racism, eugenics, and racist views Tim LaHaye American author Founded the Council for National Policy, leader in the Moral Majority movement, and co-author of the Left Behind book series. Viva Frei (No subtitle) YouTuber/ Canadian Influencer, on the Far-Right and Covid conspiracy proponent. William Guy Carr Canadian author Illuminati/III World War Conspiracist Google searches conducted as of 9 October 2021.
