Journal articles on the topic "Decision Procedures Hypergeometric Sequences"


Consult the 43 best journal articles for your research on the topic "Decision Procedures Hypergeometric Sequences".


1

Wagner, Kyle, Alex Smith, Abigail Allen, Kristen McMaster, Apryl Poch and Erica Lembke. "Exploration of New Complexity Metrics for Curriculum-Based Measures of Writing". Assessment for Effective Intervention 44, no. 4 (May 28, 2018): 256–66. http://dx.doi.org/10.1177/1534508418773448.

Abstract
Researchers and practitioners have questioned whether scoring procedures used with curriculum-based measures of writing (CBM-W) capture growth in complexity of writing. We analyzed data from six independent samples to examine two potential scoring metrics for picture word CBM-W (PW), a sentence-level CBM task. Correct word sequences per response (CWSR) and words written per response (WWR) were compared with the current standard metric of correct word sequences (CWS). Linear regression analyses indicated that CWSR predicted scores on standardized norm-referenced criterion measures in more samples than did WWR or CWS. Future studies should explore the capacity of CWSR and WWR to show growth over time, stability, diagnostic accuracy, and utility for instructional decision making.
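
The three scoring metrics compared here are counting rules, so a short sketch may help make them concrete. The Python fragment below is illustrative only: real CBM-W scoring relies on trained human judgment of spelling and grammatical acceptability, for which the hypothetical VALID_WORDS lexicon is a crude stand-in.

```python
# Toy illustration of the three CBM-W metrics discussed above: correct word
# sequences (CWS), words written per response (WWR), and correct word
# sequences per response (CWSR). The dictionary check below is a stand-in
# for trained-scorer judgment.

VALID_WORDS = {"the", "dog", "runs", "fast", "a", "cat", "sleeps"}  # hypothetical lexicon

def correct_word_sequences(response: str) -> int:
    """Count adjacent word pairs in which both words pass the correctness check."""
    words = response.lower().split()
    return sum(
        1
        for left, right in zip(words, words[1:])
        if left in VALID_WORDS and right in VALID_WORDS
    )

def score_responses(responses: list[str]) -> dict[str, float]:
    n = len(responses)
    cws_total = sum(correct_word_sequences(r) for r in responses)
    words_total = sum(len(r.split()) for r in responses)
    return {
        "CWS": cws_total,        # raw correct word sequences
        "WWR": words_total / n,  # words written per response
        "CWSR": cws_total / n,   # correct word sequences per response
    }

print(score_responses(["the dog runs fast", "a catt sleeps"]))
# -> {'CWS': 3, 'WWR': 3.5, 'CWSR': 1.5}
```
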
2

Strough, JoNell, Wändi Bruine de Bruin and Andrew M. Parker. "Taking the Biggest First: Age Differences in Preferences for Monetary and Hedonic Sequences". Journals of Gerontology: Series B 74, no. 6 (January 4, 2018): 964–74. http://dx.doi.org/10.1093/geronb/gbx160.

Abstract
Objectives: People face decisions about how to sequence payments and events, including when to schedule bigger events relative to smaller ones. We examine age differences in these sequence preferences. Methods: We gave a national adult life-span sample (n = 1,296, mean = 53.06 years, standard deviation = 16.33) four scenarios describing a positive or negative hedonic (enjoyable weekends, painful dental procedures) or monetary (receiving versus paying money) event. We considered associations among age, sequence preferences, three self-reported decision-making processes (emphasizing experience, emotion, and reasoning) and two dimensions of future time perspective (focusing on future opportunities and limited time). Results: Older age was associated with taking the "biggest" event sooner instead of later, especially for receiving money, but also for the other three scenarios. Older age was associated with greater reported use of reason and experience and lesser reported use of emotion. These decision-making processes played a role in understanding age differences in sequence preferences, but future time perspective did not. Discussion: We discuss "taking the biggest first" preferences in light of prior mixed findings on age differences in sequence preferences. We highlight the distinct roles of experience- and emotion-based decision-making processes. We propose applications to financial and health-care settings.
3

Xu, Zhou Bo, Tian Long Gu, Liang Chang and Feng Ying Li. "A Novel Symbolic OBDD Algorithm for Generating Mechanical Assembly Sequences Using Decomposition Approach". Advanced Materials Research 201-203 (February 2011): 24–29. http://dx.doi.org/10.4028/www.scientific.net/amr.201-203.24.

Abstract
The compact storage and efficient evaluation of feasible assembly sequences is a crucial concern for assembly sequence planning. The implicitly symbolic ordered binary decision diagram (OBDD) representation and manipulation technique has been a promising approach. In this paper, Sharafat's recursive contraction algorithm and the cut-set decomposition method are implemented symbolically, and a novel symbolic algorithm for generating mechanical assembly sequences is presented using OBDD formulations of the liaison graph and translation function. The algorithm has the following main procedures: choosing any vertex of the liaison graph G as the seed vertex and scanning all connected subgraphs containing the seed vertex by breadth-first search; transforming the problem of enumerating all cut-sets in the liaison graph into the problem of generating all partitions of the vertex set V into two subsets V1 and V2 such that the induced subgraphs on both V1 and V2 are connected; and checking the geometrical feasibility of each cut-set. Experiments show that the novel algorithm generates feasible assembly sequences correctly and completely.
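
The combinatorial core that the authors implement symbolically, enumerating cut-sets as partitions of the liaison graph into two connected vertex sets, can be sketched by explicit enumeration. This brute-force version is not the paper's OBDD algorithm, and the 4-part liaison graph is hypothetical:

```python
from itertools import combinations

def is_connected(vertices: frozenset, adj: dict) -> bool:
    """Check connectivity of the subgraph induced by `vertices` via graph search."""
    seen, frontier = set(), [next(iter(vertices))]
    while frontier:
        v = frontier.pop()
        if v not in seen:
            seen.add(v)
            frontier.extend(u for u in adj[v] if u in vertices and u not in seen)
    return seen == set(vertices)

def connected_bipartitions(adj: dict):
    """Yield (V1, V2, cut-set) for every split into two connected subgraphs."""
    vertices = list(adj)
    for size in range(1, len(vertices) // 2 + 1):
        for subset in combinations(vertices, size):
            if 2 * size == len(vertices) and vertices[0] not in subset:
                continue  # skip mirrored duplicates of half/half splits
            v1 = frozenset(subset)
            v2 = frozenset(vertices) - v1
            if is_connected(v1, adj) and is_connected(v2, adj):
                cut = {(a, b) for a in v1 for b in adj[a] if b in v2}
                yield v1, v2, cut

# Hypothetical 4-part liaison graph: A-B, B-C, C-D, A-C.
liaison = {"A": {"B", "C"}, "B": {"A", "C"}, "C": {"A", "B", "D"}, "D": {"C"}}
for v1, v2, cut in connected_bipartitions(liaison):
    print(sorted(v1), sorted(v2), sorted(cut))
```
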
4

Khan, M. Shamim, Alex Chong and Tom Gedeon. "A Methodology for Developing Adaptive Fuzzy Cognitive Maps for Decision Support". Journal of Advanced Computational Intelligence and Intelligent Informatics 4, no. 6 (November 20, 2000): 403–7. http://dx.doi.org/10.20965/jaciii.2000.p0403.

Abstract
Differential Hebbian Learning (DHL) was proposed by Kosko as an unsupervised learning scheme for Fuzzy Cognitive Maps (FCMs). DHL can be used with a sequence of state vectors to adapt the causal link strengths of an FCM. However, it does not guarantee that the FCM learns the sequence, and no concrete procedures for the use of DHL have been developed. In this paper a formal methodology is proposed for using DHL in the development of FCMs in a decision support context. The four steps in the methodology are: (1) creation of a crisp cognitive map; (2) identification of event sequences for use in DHL; (3) event sequence encoding using DHL; (4) revision of the trained FCM. Feasibility of the proposed methodology is demonstrated with an example involving a dynamic system with feedback based on a real-life scenario.
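
The DHL update that steps (2)-(3) apply to event sequences can be sketched as follows. The gating and learning-rate details follow a commonly cited formulation of Kosko's rule and should be read as assumptions rather than as the authors' exact procedure.

```python
import numpy as np

def dhl_train(states: np.ndarray, lr: float = 0.1) -> np.ndarray:
    """Differential Hebbian Learning over a (T, n) sequence of FCM activations.

    A causal weight w[i, j] is nudged toward the product of concept changes
    dC_i * dC_j whenever concept i changes between consecutive states.
    """
    n = states.shape[1]
    w = np.zeros((n, n))
    for t in range(1, states.shape[0]):
        delta = states[t] - states[t - 1]
        for i in range(n):
            if delta[i] == 0:  # only adapt edges leaving concepts that changed
                continue
            for j in range(n):
                if i != j:
                    w[i, j] += lr * (delta[i] * delta[j] - w[i, j])
    return w

# Hypothetical event sequence over three concepts (steps 2-3 of the methodology).
events = np.array([[0, 0, 0], [1, 0, 0], [1, 1, 0], [1, 1, 1]], dtype=float)
print(dhl_train(events).round(3))
```
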
5

Orakçı, Erhan and Ali Özdemir. "Using Social Choice Function for Multi Criteria Decision Making Problems". Alphanumeric Journal 12, no. 1 (July 20, 2024): 21–38. http://dx.doi.org/10.17093/alphanumeric.1426694.

Abstract
Many social choice preference functions and aggregation techniques, such as Borda, Copeland, Dodgson, and Kemeny, are employed to obtain integrated solutions in multi-criteria decision problems. On the other hand, the number of studies comparing these techniques as aggregation procedures in multi-criteria decision problems is limited, and their advantages and disadvantages have not been adequately discussed. In this context, the applicability of the Borda, Copeland, Dodgson, and Kemeny techniques to solving multi-criteria decision problems was investigated in this study. Analyses were performed on 500,000 samples containing various alternatives and sequences produced using the R software. The Kendall W test was used to assess the compatibility of the aggregation techniques. As a result, as the number of alternatives in the problem increases, the examined techniques produce an incomplete ranking. The desired features of a new aggregation technique to be developed were also determined in light of the obtained results.
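
Two of the compared preference functions, Borda and Copeland, are compact enough to state in code. A minimal sketch over strict rankings; tie-handling conventions vary in the literature and are chosen here for brevity:

```python
from itertools import combinations

def borda(rankings: list[list[str]]) -> dict[str, int]:
    """Borda count: an alternative earns (m - 1 - position) points per ballot."""
    m = len(rankings[0])
    scores = {alt: 0 for alt in rankings[0]}
    for ranking in rankings:
        for pos, alt in enumerate(ranking):
            scores[alt] += m - 1 - pos
    return scores

def copeland(rankings: list[list[str]]) -> dict[str, int]:
    """Copeland score: pairwise-majority wins minus losses."""
    alts = rankings[0]
    scores = {alt: 0 for alt in alts}
    for a, b in combinations(alts, 2):
        a_wins = sum(r.index(a) < r.index(b) for r in rankings)
        b_wins = len(rankings) - a_wins
        if a_wins != b_wins:
            winner, loser = (a, b) if a_wins > b_wins else (b, a)
            scores[winner] += 1
            scores[loser] -= 1
    return scores

ballots = [["x", "y", "z"], ["y", "x", "z"], ["x", "z", "y"]]
print(borda(ballots))     # {'x': 5, 'y': 3, 'z': 1}
print(copeland(ballots))  # {'x': 2, 'y': 0, 'z': -2}
```
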
6

Marchand, Pascal and Claude Navarro. "Dialog Organization and Functional Communication in a Medical Assistance Task by Phone". Perceptual and Motor Skills 81, no. 2 (October 1995): 451–61. http://dx.doi.org/10.1177/003151259508100218.

Abstract
This study is based on a corpus of 110 dialogs recorded in a medical assistance telephonist's workstation. Given the nature of the task, we have considered five main variables in dialog (topics) as well as their order of occurrence (sequences). These data were analyzed with a lexical analysis program. Results show a great difference between Operator/Specialist dialog and Operator/Nonspecialist dialog. Dialog “script” is very strong in the first case in which the operator merely plays a feedback role (routine procedures). If the caller is a private individual, the situation is often an indefinite problem, and the operator may have to adapt to the person (weak script) to obtain relevant information as quickly as possible (problem-solving procedures). This provides confirmation of the operator's twofold competence (efficient decision-making, dialog management).
7

Bibilo, P. N., Yu Yu Lankevich and V. I. Romanov. "Logical Minimization of Multilevel Representations of Boolean Function Systems". Informacionnye Tehnologii 29, no. 2 (February 20, 2023): 59–71. http://dx.doi.org/10.17587/it.29.59-71.

Abstract
The preliminary stage of technology-independent optimization of the implemented system of Boolean functions has a decisive influence on the complexity and speed of a combinational logic circuit built from library CMOS elements. At present, the main methods of such optimization in the logical synthesis of custom CMOS VLSI blocks are methods for minimizing Binary Decision Diagrams (BDD) or their modifications. Graphical BDD representations are built from the Shannon expansions of Boolean functions: a BDD graph corresponds to a set of interrelated Shannon expansion formulas that form a multilevel representation of the minimized system of Boolean functions. The paper investigates the efficiency of various minimization procedures applied to several types of BDD representations of systems of Boolean functions. These procedures are used for technology-independent optimization in the synthesis of multi-output logic circuits of library CMOS elements. In addition to single logical optimization procedures, sequences of such procedures are studied, forming various methods for the logical optimization of multilevel representations of systems of Boolean functions. Results of experiments on 49 examples of systems of Boolean functions are presented: 25 optimization routes have been studied, and efficient routes have been determined for various types of specifications of function systems. The experimental results are compared with known ones. It is established that, to estimate the complexity of optimized algebraic representations of systems of functions, it is advisable to use the total number of literals (variables or their inversions) as the criterion.
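
The Shannon expansion that all the BDD representations above are built from can be sketched in a few lines. This is a minimal illustration with functions given as truth-table tuples and identical cofactors shared; it is not the paper's minimization tooling:

```python
# Shannon expansion f = x'·f|x=0 + x·f|x=1, applied recursively; identical
# (level, low, high) triples are shared so the resulting diagram stays compact.

node_cache = {}  # (level, low, high) -> shared node

def build_bdd(table: tuple, level: int = 0):
    """Build a reduced BDD for `table`, a truth table of length 2**n."""
    if all(v == table[0] for v in table):      # constant cofactor: terminal node
        return ("const", table[0])
    half = len(table) // 2
    low = build_bdd(table[:half], level + 1)   # cofactor with this variable = 0
    high = build_bdd(table[half:], level + 1)  # cofactor with this variable = 1
    if low == high:                            # redundant test: skip the node
        return low
    key = (level, low, high)
    node_cache.setdefault(key, ("node", level, low, high))
    return node_cache[key]

# f(a, b, c) = (a AND b) OR c, truth table enumerated over a, b, c.
table = tuple((a & b) | c for a in (0, 1) for b in (0, 1) for c in (0, 1))
print(build_bdd(table))
print("non-terminal nodes:", len(node_cache))  # 3 for this function
```
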
8

Pathiraja Rathnayaka Hitige, Nadeesha, Ting Song, Steven J. Craig, Kimberley J. Davis, Xubing Hao, Licong Cui and Ping Yu. "An Ontology-Based Approach for Understanding Appendicectomy Processes and Associated Resources". Healthcare 13, no. 1 (December 24, 2024): 10. https://doi.org/10.3390/healthcare13010010.

Abstract
Background: Traditional methods for analysing surgical processes often fall short in capturing the intricate interconnectedness between clinical procedures, their execution sequences, and associated resources such as hospital infrastructure, staff, and protocols. Aim: This study addresses this gap by developing an ontology for appendicectomy, a computational model that comprehensively represents appendicectomy processes and their resource dependencies to support informed decision making and optimise appendicectomy healthcare delivery. Methods: The ontology was developed using the NeON methodology, drawing knowledge from existing ontologies, scholarly literature, and de-identified patient data from local hospitals. Results: The resulting ontology comprises 108 classes, including 11 top-level classes and 96 subclasses organised across five hierarchical levels. The 11 top-level classes include “clinical procedure”, “appendicectomy-related organisational protocols”, “disease”, “start time”, “end time”, “duration”, “appendicectomy outcomes”, “hospital infrastructure”, “hospital staff”, “patient”, and “patient demographics”. Additionally, the ontology includes 77 object and data properties to define relationships and attributes. The ontology offers a semantic, computable framework for encoding appendicectomy-specific clinical procedures and their associated resources. Conclusion: By systematically representing this knowledge, this study establishes a foundation for enhancing clinical decision making, improving data integration, and ultimately advancing patient care. Future research can leverage this ontology to optimise healthcare workflows and outcomes in appendicectomy management.
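
A fragment of such an ontology can be sketched as a semantic graph. The snippet below uses rdflib to encode a few of the top-level classes named in the abstract, plus one object property and one individual; the property, the individual and the namespace IRI are invented for illustration:

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS

APP = Namespace("http://example.org/appendicectomy#")  # hypothetical IRI
g = Graph()
g.bind("app", APP)

# A few of the 11 top-level classes listed in the abstract.
for cls in ["ClinicalProcedure", "HospitalStaff", "Patient", "Duration"]:
    g.add((APP[cls], RDF.type, RDFS.Class))

# An illustrative object property linking a procedure to the staff performing it.
g.add((APP.performedBy, RDF.type, RDF.Property))
g.add((APP.performedBy, RDFS.domain, APP.ClinicalProcedure))
g.add((APP.performedBy, RDFS.range, APP.HospitalStaff))

# One hypothetical individual with a duration attribute.
g.add((APP.LapAppendicectomy01, RDF.type, APP.ClinicalProcedure))
g.add((APP.LapAppendicectomy01, APP.performedBy, APP.SurgeonA))
g.add((APP.LapAppendicectomy01, APP.duration, Literal("PT45M")))

print(g.serialize(format="turtle"))
```
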
9

Petrosyan, Azniv F. "Importance Rearrangement within Education, Economy and Natural Protection Ministries for Armenian Composite Supportive Progress (ACASP)". Business and Management Studies 2, no. 3 (August 23, 2016): 78. http://dx.doi.org/10.11114/bms.v2i3.1834.

Abstract
Sustainable Development is an innovative concept characterized here as Composite Appraising Supportive Progress of Armenia (ACASP). The biodiversity concept is another topic with a strong bearing on public management and employment. Social, economic and environmental impacts optimize sequences within the Education (S3), Economy (E5), Air (N3) and Land (N1) Ministries. Seven (7) phases of Operations Research (OR) studies correspond to nine (9) procedures of composite progressive indicators. Two (2) techniques are applied for decision making: utility functions and transportation tasks. The North-West Corner Rule (NWCR) and the Low Cost Cell Rule (LCCR) are applied to manage public employment according to the Education (S3), Economy (E5), Land (N1) and Air (N3) magnitudes of ACASP. The motivating approach serves not only to pick the order of significance according to the biodiversity concept, but also to characterize the decision-making process through operations research techniques, particularly transportation assignments with origins and destinations under supply and demand. The resulting sequence is revealed by sustaining the biodiversity concept with public employment of the Armenian CASP.
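
The North-West Corner Rule named above is a standard way to construct an initial feasible allocation for a transportation problem. A minimal sketch, with hypothetical supply and demand figures standing in for the ministry employment magnitudes:

```python
def north_west_corner(supply: list, demand: list) -> list:
    """Fill the transportation tableau from the top-left cell, exhausting
    each row's supply or column's demand before moving on."""
    supply, demand = supply[:], demand[:]   # work on copies
    alloc = [[0] * len(demand) for _ in supply]
    i = j = 0
    while i < len(supply) and j < len(demand):
        qty = min(supply[i], demand[j])
        alloc[i][j] = qty
        supply[i] -= qty
        demand[j] -= qty
        if supply[i] == 0:
            i += 1   # row exhausted: move down
        else:
            j += 1   # column exhausted: move right
    return alloc

supply = [30, 40, 30]   # e.g. staff available per origin (hypothetical)
demand = [20, 50, 30]   # e.g. staff required per destination (hypothetical)
for row in north_west_corner(supply, demand):
    print(row)
```
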
10

Reichelt, Florian, Dietmar Traub and Thomas Maier. "DERIVATION OF A METHOD DNA FOR THE UNIFIED DESCRIPTION OF METHODICAL PROCEDURES IN PRODUCT DEVELOPMENT". Proceedings of the Design Society 3 (June 19, 2023): 1187–96. http://dx.doi.org/10.1017/pds.2023.119.

Abstract
The number of publications on methods in product development is increasing constantly. In addition to scientific models, method guidelines exist in practice to support the selection of suitable methods. When looking more closely, it is noticeable that new methods are not new developments of methodical principles, but rather adaptations and summaries of known methods for specific application areas. Although approaches to standardize methods exist, they are usually formulated too abstractly to be useful to project managers as a support for method decision making. In our contribution, we analyse common methods of technical product development regarding similarities in content and time. In doing so, we were able to derive a method DNA on the basis of which all methods can be described and, above all, distinguished in a verifiable manner. In addition to essential activity blocks, the DNA also includes the description of temporal sequences, which in particular enables a differentiation between agile and classic methods. Ultimately, the method DNA not only offers the chance to make methodical work comprehensible; the classification option also makes it possible to select methods specifically for upcoming development steps.
11

Rowe, Jonathan and James Lester. "A Modular Reinforcement Learning Framework for Interactive Narrative Planning". Proceedings of the AAAI Conference on Artificial Intelligence and Interactive Digital Entertainment 9, no. 4 (June 30, 2021): 57–63. http://dx.doi.org/10.1609/aiide.v9i4.12636.

Abstract
A key functionality provided by interactive narrative systems is narrative adaptation: tailoring story experiences in response to users’ actions and needs. We present a data-driven framework for dynamically tailoring events in interactive narratives using modular reinforcement learning. The framework involves decomposing an interactive narrative into multiple concurrent sub-problems, formalized as adaptable event sequences (AESs). Each AES is modeled as an independent Markov decision process (MDP). Policies for each MDP are induced using a corpus of user interaction data from an interactive narrative system with exploratory narrative adaptation policies. Rewards are computed based on users’ experiential outcomes. Conflicts between multiple policies are handled using arbitration procedures. In addition to introducing the framework, we describe a corpus of user interaction data from a testbed interactive narrative, CRYSTAL ISLAND, for inducing narrative adaptation policies. Empirical findings suggest that the framework can effectively shape users’ interactive narrative experiences.
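
The decomposition described here, one independently learned policy per adaptable event sequence plus an arbitration step, can be sketched with per-module Q-tables. The "greatest mass" rule below (pick the action with the highest summed Q-value across modules) is a standard arbitration choice from the modular RL literature, assumed here rather than taken from the paper:

```python
import numpy as np

def arbitrate(q_tables: list[np.ndarray], state: int) -> int:
    """Resolve conflicts between AES modules by summing their Q-values
    ("greatest mass") and picking the jointly preferred action."""
    combined = sum(q[state] for q in q_tables)
    return int(np.argmax(combined))

# Two hypothetical AES modules over 3 states and 2 actions.
q_plot = np.array([[0.2, 0.8], [0.5, 0.1], [0.0, 0.3]])
q_tutor = np.array([[0.6, 0.1], [0.2, 0.4], [0.9, 0.2]])

# In state 0 the modules disagree (plot prefers action 1, tutor action 0);
# the summed Q-values (0.8 vs 0.9) break the conflict in favor of action 1.
print(arbitrate([q_plot, q_tutor], state=0))
```
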
12

Ding, Shuhui, Zhongyuan Guo, Bin Wang, Haixia Wang and Fai Ma. "MBD-Based Machining Feature Recognition and Process Route Optimization". Machines 10, no. 10 (October 8, 2022): 906. http://dx.doi.org/10.3390/machines10100906.

Abstract
Machining feature recognition is considered the key technique connecting Computer-Aided Design (CAD) and Computer-Aided Process Planning (CAPP); decision-making on the part processing scheme and optimization of the process route can effectively improve processing efficiency and reduce product machining cost. At present, recognition of machining features in CAD models lacks a systematic method that considers process information (such as tolerance and roughness), as well as an effective process route optimization method for planning part processing procedures. Here we present a novel machining feature recognition method and, on the basis of feature processing scheme decisions, realize the optimization of the process route. Building an Attributed Adjacency Graph (AAG) from model geometry, topology, and process information, we propose an AAG decomposition and reconstruction method based on the Decomposed Base Surface (DBS) and Joint Base Surface (JBS), and recognize the model's machining features through Attributed Adjacency Matrix-based (AAM) feature matching. A feature machining scheme decision method based on fuzzy comprehensive evaluation is adopted, and the decision is realized by calculating a comprehensive evaluation index. Finally, a Machining Element Directed Graph (MEDGraph) is established based on the constraint relationships between Machining Elements (MEs). An improved topological sorting algorithm lists the topological sequences of all MEs, and an evaluation function with processing cost or efficiency as the optimization objective is constructed to obtain the optimal process route. Our research provides a new method for machining feature recognition and process route optimization, and applications of the proposed approach are provided to validate the method by case study.
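
The final route-optimization step, enumerating topological sequences of machining elements under precedence constraints and scoring them with an evaluation function, can be sketched directly. The three-element precedence graph and the machine-change cost model below are hypothetical:

```python
def all_topological_orders(graph: dict):
    """Yield every topological ordering of a precedence DAG (edges v -> u)."""
    indeg = {v: 0 for v in graph}
    for v in graph:
        for u in graph[v]:
            indeg[u] += 1

    def backtrack(order):
        extended = False
        for v in graph:
            if indeg[v] == 0 and v not in order:
                for u in graph[v]:
                    indeg[u] -= 1
                order.append(v)
                yield from backtrack(order)
                order.pop()
                for u in graph[v]:
                    indeg[u] += 1
                extended = True
        if not extended and len(order) == len(graph):
            yield list(order)

    yield from backtrack([])

# Hypothetical machining elements: rough milling must precede both other MEs.
constraints = {"rough_mill": ["finish_mill", "drill"], "drill": [], "finish_mill": []}
machine_of = {"rough_mill": "M1", "drill": "M2", "finish_mill": "M1"}

def route_cost(order):
    """Evaluation function: count machine changes between consecutive MEs."""
    return sum(machine_of[a] != machine_of[b] for a, b in zip(order, order[1:]))

best = min(all_topological_orders(constraints), key=route_cost)
print(best, route_cost(best))  # ['rough_mill', 'finish_mill', 'drill'] 1
```
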
13

Nan Chu, Narisa and Yuri E. Shelepin. "Conscious and Unconscious Vision Transmission to Brain over Behavior". International Journal of Clinical Medicine and Bioengineering 1, no. 1 (December 30, 2021): 25–36. http://dx.doi.org/10.35745/ijcmb2021v01.01.0004.

Abstract
Our goal is to apply historical evidence, in the form of longitudinal records of underlying brain-led human behavior, to complement findings based on conventional brain signal and image measurements taken at a resting position. This compensation for unconscious processing is expected to play a major role in one's decision-making process. Using masks to distract vision, and also measuring facial muscle movements, we were able to separate cognition through conscious and unconscious vision. Two types of masks are used over face image arrays under different cultural backgrounds in three test sequences: (1) randomly generated masks anterior and posterior to target images; (2) Tai Chi Tu as masks, to take cognizance of longitudinal changes of the environment; (3) for each mask, contrasting European and Asian perception of facial emotional images. Our proposed approach is an attempt to investigate the spatio-temporal effects on brain decision-making when historical evidence is considered to provoke the unconscious. Evidence and environmental influences over the brain are commonly transmitted through vision. By masking vision at varying epochs, rates of longitudinal change are examined for their impact on perception, even when these rates have not been normalized for testing. Justification of mask usage and of its exposure duration and frequency is designed into our testing procedures for sensitivity studies. Significantly more testing is expected to understand the ambiguity between unconscious and conscious vision and to justify the proposed type of longitudinal changes in brain-triggered behavior.
14

Cummings, Chad W., Becky Habecker, Katherine Tullio, Andrew Rothacker, Nathan A. Pennell, Gregory M. M. Videtic, Daniel Raymond, Peter J. Mazzone and Alison Ibsen. "Identifying delays in care for patients with NSCLC using value-stream mapping." Journal of Clinical Oncology 36, no. 30_suppl (October 20, 2018): 136. http://dx.doi.org/10.1200/jco.2018.36.30_suppl.136.

Abstract
Background: Value-Stream Mapping (VSM) was employed to evaluate non-value-added activities, focused on minimizing time between pathological diagnosis and first treatment (Time-To-Treat, or TTT). The objective is to identify unnecessary delays in care for NSCLC patients treated at a large academic medical center. Methods: A total of 253 patient records were examined between 1/15/2015 and 7/19/2016 and divided into stages: Stage I (Non-Surgical), Stage I-II (Surgical), Stage III, and Stage IV. Selection criteria required a minimum of 50 patients/stage, including internally and externally diagnosed patients. A VSM was developed for each stage. Spreadsheets were used to detail dates and sequences of events, including consults, E&M visits, imaging, procedures, and testing. Results: Overall TTT results by stage (median days) are as follows: Stage I (Non-Surgical) = 46 days (n = 55), Stage I-II (Surgical) = 35 days (n = 50), Stage III = 34 days (n = 71), Stage IV = 19 days (n = 77). Consults were reviewed among 4 specialties (Med/Onc, Rad/Onc, Surgical, Pulmonary), revealing Pulmonary consults most common regardless of stage (38%, 40%, 49%, 29%, respectively). Consults among specialties were rarely coordinated (stage III: 11/70 patients had same-day consults between 2 specialties). Bronchoscopy procedures were the most common method of diagnosis; sampling (n = 60, all stages) revealed MD orders are placed within 1 median day for each stage (15% ≥ 5 days), but lead time to procedure ranged 7-12 median days depending on stage. Comorbidities for surgical patients (n = 46) were reviewed, and TTT delays correlated with the number of comorbidities and FEV1 test results. Interventions included weekly multi-disciplinary identification and review of patients across the 4 specialties, development of a TTT visual dashboard, and creation of communication standards across specialties. Conclusions: A VSM will identify areas where excessive delays occur. Opportunities exist to combine activities (same-day appointments/consults), reduce delays between activities, and/or improve communication. Decision-making can be accelerated when time between events (consults, staging, procedures, and tests) is minimized, regardless of diagnosis origin.
15

Al-Haj Husain, Adib, Daphne Schönegg, Silvio Valdec, Bernd Stadlinger, Thomas Gander, Harald Essig, Marco Piccirelli and Sebastian Winklhofer. "Visualization of Inferior Alveolar and Lingual Nerve Pathology by 3D Double-Echo Steady-State MRI: Two Case Reports with Literature Review". Journal of Imaging 8, no. 3 (March 17, 2022): 75. http://dx.doi.org/10.3390/jimaging8030075.

Abstract
Injury to the peripheral branches of the trigeminal nerve, particularly the lingual nerve (LN) and the inferior alveolar nerve (IAN), is a rare but serious complication that can occur during oral and maxillofacial surgery. Mandibular third molar surgery, one of the most common surgical procedures in dentistry, is most often associated with such a nerve injury. Proper preoperative radiologic assessment is hence key to avoiding neurosensory dysfunction. In addition to the well-established conventional X-ray-based imaging modalities, such as panoramic radiography and cone-beam computed tomography, radiation-free magnetic resonance imaging (MRI) with the recently introduced black-bone MRI sequences offers the possibility to simultaneously visualize osseous structures and neural tissue in the oral cavity with high spatial resolution and excellent soft-tissue contrast. Fortunately, most LN and IAN injuries recover spontaneously within six months. However, permanent damage may cause significant loss of quality of life for affected patients. Therefore, therapy should be initiated early in indicated cases, despite the inconsistency in the literature regarding the therapeutic time window. In this report, we present the visualization of two cases of nerve pathology using 3D double-echo steady-state MRI and evaluate evidence-based decision-making for iatrogenic nerve injury regarding a wait-and-see strategy, conservative drug treatment, or surgical re-intervention.
16

Alam, Mohammad Zulfeequar, Tameem Ahmad and Shaista Parveen. "Assessing social media and influential marketing on brand perception and selection of higher educational institute in India". International Journal of Data and Network Science 9, no. 1 (2025): 27–36. https://doi.org/10.5267/j.ijdns.2024.11.001.

Abstract
In the rapidly evolving landscape of higher education, social media marketing (SMM) has evolved into a critical factor shaping brand perception (B.P.) and the decision-making process of prospective students in India. This paper explores the intricate dynamics among social media (S.M.), influencer marketing, and the selection of higher education institutes (HEIs), focusing on understanding how these factors shape students' perceptions. We conducted this research on 560 students, who represented the research population; data were gathered through internet surveys and analyzed with SEM-PLS. Employing exogenous and endogenous elements to create sequences, the SEM approach examines causal associations among elements and supports research into causation in dimensional and structural frameworks. According to the research's findings, HEI selection is positively impacted by SMM initiatives. The aspects of SMM activities (electronic word-of-mouth (eWOM), personalization, interaction, and trendiness) affect HEI selection. In addition, personalization and two of the other aspects have an impact on perceptions of the brand. Understanding and utilizing the efficacy of S.M. and influencer marketing will be crucial for HEIs looking to draw in and hold on to potential students as the higher education environment keeps evolving in the age of digitization.
17

Gryska, Emilia Agnieszka, Justin Schneiderman and Rolf A. Heckemann. "Automatic brain lesion segmentation on standard MRIs of the human head: a scoping review protocol". BMJ Open 9, no. 2 (February 2019): e024824. http://dx.doi.org/10.1136/bmjopen-2018-024824.

Abstract
Introduction: Automatic brain lesion segmentation from medical images has great potential to support clinical decision making. Although numerous methods have been proposed, significant challenges must be addressed before they will become established in clinical and research practice. We aim to elucidate the state of the art, to provide a synopsis of competing approaches and identify contrasts between them. Methods and analysis: We present the background and study design of a scoping review for automatic brain lesion segmentation methods for conventional MRI according to the framework proposed by Arksey and O'Malley. We aim to identify common image processing steps as well as mathematical and computational theories implemented in these methods. We will aggregate the evidence on the efficacy and identify limitations of the approaches. Methods to be investigated work with standard MRI sequences from human patients examined for brain lesions, and are validated with quantitative measures against a trusted reference. PubMed, IEEE Xplore and Scopus will be searched using search phrases that will ensure an inclusive and unbiased overview. For matching records, titles and abstracts will be screened to ensure eligibility. Studies will be excluded if a full paper is not available or is not written in English, if non-standard MR sequences are used, if there is no quantitative validation, or if the method is not automatic. In the data charting phase, we will extract information about authors, publication details and study cohort. We expect to find information about preprocessing, segmentation and validation procedures. We will develop an analytical framework to collate, summarise and synthesise the data. Ethics and dissemination: Ethical approval for this study is not required since the information will be extracted from published studies. We will submit the review report to a peer-reviewed scientific journal and explore other venues for presenting the work.
18

Kehlmaier, Christian. "A nomenclatural note on European Chalarus (Diptera: Pipunculidae): a new synonymy of C. elegantulus Jervis, 1992". Zootaxa 2656, no. 1 (October 25, 2010): 67. http://dx.doi.org/10.11646/zootaxa.2656.1.4.

Abstract
Recently, Kehlmaier & Assmann (2008) presented a revision of the European representatives of the big-headed fly genus Chalarus Walker, 1834. The work introduced four previously unknown taxa and provided diagnoses of all other known species, illustrated through numerous line drawings and photomicrographs. In total, the authors treated 25 species including one nomen dubium. Three of these were known from males only and five solely from females. Taxonomic decision-making was backed up by molecular evidence, published partly therein and in the context of a phylogenetic study of the subfamily Chalarinae (Kehlmaier & Assmann 2010). Despite large collecting efforts, three taxa could not be analysed genetically, namely C. argenteus Coe, 1966, C. elegantulus Jervis, 1992 and C. proprius Jervis, 1992, all representing species based on females only. Shortly after the publication of the above-mentioned revision, a specimen of C. elegantulus was identified amongst material sent by Dr. Gunilla Ståhls (Finnish Museum of Natural History, Helsinki): 1 female, Finland, Ab (Regio aboënsis), Karjalohja, Karkalinniemi, 66581:33221, control trap #2, 25.VI.–22.VII.2007, leg. G. Ståhls, coll. C. Kehlmaier. The 5' half of the mitochondrial cytochrome oxidase subunit 1 (CO1) gene was sequenced following standard lab procedures (see Kehlmaier & Assmann 2010) using the primer pair LCO1490 and HCO2198 (Folmer et al. 1994). The resulting DNA barcode (GenBank sequence accession number: FN999909) was compared against a set of reference Chalarus barcode sequences and matched C. absconditus Kehlmaier in Kehlmaier & Assmann, 2008 syn. nov., a species known from male specimens only, sharing an identical haplotype.
19

Brown, Nolan, Cathleen Kuo, Zachery Neil and Julian Gendreau. "BIOS-06. MACHINE LEARNING MADE EASY: DESIGN OF A PROPOSAL FOR A MEDICAL STUDENT EDUCATIONAL SESSION ON HIGH YIELD STRATEGIES FOR LEARNING AND APPLYING DATA SCIENCE". Neuro-Oncology 25, Supplement_5 (November 1, 2023): v21. http://dx.doi.org/10.1093/neuonc/noad179.0083.

Abstract
Machine learning is a burgeoning field in data science with many applications in medicine, and it is especially useful in neurosurgery due to the often complex and potentially emergent nature of day-to-day cases. Machine learning models have the potential to deliver accurate patient prognostication while offering the benefit of real-time, AI-supported clinical decision-making with high data granularity. These models can utilize numerical inputs for more straightforward applications and, with practice, can enable clinicians and trainees to incorporate advanced imaging sequences as inputs to optimize the accuracy of patient prognostication. In the future, these models will become much more commonplace in the clinical setting for neurosurgeons. As medical school curricula do not typically incorporate these statistical techniques into biomedical statistics courses, symposia centered on machine learning could prove essential to medical students seeking to learn these skills. This session would include a brief overview of machine learning models and the procedures for using R within RStudio for the development of novel internet-based prognostication calculators. The benefit of RStudio is that it is user friendly, and pre-designed templates are available that enable students to customize code to the needs of their clinical investigation. There will be a brief overview of accuracy verification, including discussion of receiver operating characteristic curves with the use of training and testing datasets - the heart of machine learning. We have designed and outlined a curriculum that will enable students to identify and classify different techniques (support vector machines, logistic regression, clustering, nearest neighbor classifiers), and recognize the utility of machine learning approaches to neurosurgical outcomes prognostication. Our hope is that students will leave this symposium well-equipped with a newfound confidence and the baseline conceptual framework needed to begin exploring the awesome world of machine learning throughout their training.
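
The train/test-then-ROC workflow the session centers on can be made concrete in a few lines. The session uses R within RStudio; the Python sketch below is an equivalent workflow on synthetic data, with a logistic regression standing in for a prognostication model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score, roc_curve
from sklearn.model_selection import train_test_split

# Synthetic cohort: 4 hypothetical predictors and a binary outcome.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 4))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=1.0, size=200)) > 0

# Hold out a testing set; fit on the training set only.
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=1)
model = LogisticRegression().fit(X_tr, y_tr)

# Receiver operating characteristic on the held-out set.
scores = model.predict_proba(X_te)[:, 1]
fpr, tpr, thresholds = roc_curve(y_te, scores)
print("held-out AUC:", round(roc_auc_score(y_te, scores), 3))
```
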
20

Berthold, Daniel P., Lukas Willinger, Lukas N. Muench, Philipp Forkel, Andreas Schmitt, Klaus Woertler, Andreas B. Imhoff and Elmar Herbst. "Visualization of Proximal and Distal Kaplan Fibers Using 3-Dimensional Magnetic Resonance Imaging and Anatomic Dissection". American Journal of Sports Medicine 48, no. 8 (May 14, 2020): 1929–36. http://dx.doi.org/10.1177/0363546520919986.

Abstract
Background: In current magnetic resonance imaging (MRI) of the knee, injuries to the anterolateral ligament complex (ALC) and the Kaplan fibers (KFs) are not routinely assessed. As ruptures of the KFs contribute to anterolateral rotatory instability in the anterior cruciate ligament–deficient knee, detecting these injuries on MRI may help surgeons to individualize treatment. Purpose: To visualize the KFs on 3-T MRI and to conduct a layer-by-layer dissection of the ALC. Study Design: Descriptive laboratory study. Methods: Ten fresh-frozen human cadaveric knees (mean ± SD age, 72 ± 8.5 years) without history of ligament injury were used in this study. Before layer-by-layer dissection of the ALC, MRI was performed to define the radiologic anatomy of the KFs. A coronal T1-weighted 3-dimensional turbo spin echo sequence and a transverse T2-weighted turbo spin echo sequence were obtained. Three-dimensional data sets were used for multiplanar reconstructions. Results: KFs were identified in 100% of cases on MRI and in anatomic dissection. The mean length of the proximal and distal KFs was 17.9 ± 3.6 mm and 12.4 ± 6.5 mm, respectively. On MRI, the distance from the lateral femoral epicondyle to the proximal KFs was 35.9 ± 6.9 mm and to the distal KFs, 16.6 ± 4.1 mm; in anatomic dissection, the distances were 41.4 ± 8.1 mm for proximal KFs and 28.2 ± 8.1 mm for distal KFs. The distance from the lateral joint line to the proximal KFs was 63.5 ± 7.6 mm and to the distal KFs, 45.3 ± 3.7 mm. Interobserver reliability for image analysis was excellent for all measurements. Conclusion: KFs can be consistently identified on MRI with use of 3-dimensional sequences. Subsequent anatomic dissection confirmed their close topography to the superior lateral genicular artery. For clinical implications, the integrity of the KFs should be routinely reviewed on MRI scans. Clinical Relevance: As ruptures of the KFs contribute to anterolateral rotatory instability, accurate visualization of the KFs on MRI may facilitate surgical decision making for additional anterolateral procedures in the anterior cruciate ligament–deficient knee.
21

Soloukey, S., L. Verhoef, F. Mastik, B. S. Generowicz, E. M. Bos, B. S. Harhangi, K. E. Collée et al. "P09.03 Fully integrating functional Ultrasound (fUS) into the onco-neurosurgical operating room: Towards a new real-time, high-resolution image-guided resection tool with multimodal potential". Neuro-Oncology 23, Supplement_2 (September 1, 2021): ii26–ii27. http://dx.doi.org/10.1093/neuonc/noab180.091.

Abstract
BACKGROUND: Onco-neurosurgical practice still relies heavily on pre-operatively acquired images to guide intra-operative decision-making for safe tumor removal, a practice with inherent pitfalls such as registration inaccuracy due to brain shift, and lack of real-time (functional) feedback. Exploiting the opportunity for real-time imaging of the exposed brain can improve intra-operative decision-making, neurosurgical safety and patient outcomes. Previously, we described functional Ultrasound (fUS) as a high-resolution, depth-resolved imaging technique able to detect functional regions and vascular morphology during awake resections. Here, we present for the first time fUS as a fully integrated, MRI/CT-registered imaging modality in the OR. MATERIAL AND METHODS: fUS relies on high-frame-rate (HFR) ultrasound, making the technique sensitive to very small motions caused by vascular dynamics (µDoppler) and allowing measurements of changes in cerebral blood volume (CBV) with micrometer-millisecond precision. This opens up the possibility to 1) detect functional response, as CBV-changes reflect changes in metabolism of activated neurons through neurovascular coupling, and 2) visualize in-vivo vascular morphology of tumor and healthy tissue. During a range of anesthetized and awake onco-neurosurgical procedures we acquired images of brain and spinal cord using conventional linear ultrasound probes connected to an experimental acquisition unit. Building on Brainlab's ‘Cranial Navigation' and ‘Intra-Operative Ultrasound' modules, we could co-register our intra-operative Power Doppler Images (PDIs) to patient-registered MRI/CT-data. Using the ‘IGTLink' research interface, we could access and store real-time tracking data for informed volume reconstructions in post-processing. RESULTS: Intra-operative fUS could be registered to MRI/CT-images in real-time, showing overlays of PDIs at imaging depths of >5 centimeters. During meningioma resections, these co-registered PDIs revealed fUS' ability to visualize the tumor's feeding vessels and surrounding healthy vasculature prior to durotomy, with a level of detail unprecedented by conventional MRI-sequences. Comparing post-operatively reconstructed 3D-vascular maps of pre- and post-durotomy acquisitions further confirmed the dural dependency of the vascular network feeding the tumor. During awake resections, fUS revealed distinct functional areas as activated during motor and language tasks. CONCLUSION: fUS is a new real-time, high-resolution and depth-resolved imaging technique, combining characteristics uniquely beneficial for a potential image-guided resection tool. The successful integration of fUS in the onco-neurosurgical OR demonstrated by our team is an essential step towards clinical integration of fUS, as well as the technique's validation against modalities such as MRI and CT.
22

Granata, Vincenza, Roberta Fusco, Maria Chiara Brunese, Gerardo Ferrara, Fabiana Tatangelo, Alessandro Ottaiano, Antonio Avallone et al. "Machine Learning and Radiomics Analysis for Tumor Budding Prediction in Colorectal Liver Metastases Magnetic Resonance Imaging Assessment". Diagnostics 14, no. 2 (January 9, 2024): 152. http://dx.doi.org/10.3390/diagnostics14020152.

Abstract
Purpose: We aimed to assess the efficacy of machine learning and radiomics analysis using magnetic resonance imaging (MRI) with a hepatospecific contrast agent, in a pre-surgical setting, to predict tumor budding in liver metastases. Methods: Patients with MRI in a pre-surgical setting were retrospectively enrolled. Manual segmentation was performed by means of 3D Slicer image computing, and 851 radiomics features were extracted as median values using the PyRadiomics Python package. Balancing was performed, and inter- and intraclass correlation coefficients were calculated to assess the between-observer and within-observer reproducibility of all extracted radiomics features. A Wilcoxon–Mann–Whitney nonparametric test and receiver operating characteristics (ROC) analysis were carried out. Balancing and feature selection procedures were performed. Linear and non-linear regression models (LRM and NLRM) and different machine learning-based classifiers, including decision tree (DT), k-nearest neighbor (KNN) and support vector machine (SVM), were considered. Results: The internal training set included 49 patients and 119 liver metastases. The validation cohort consisted of a total of 28 single-lesion patients. The best single predictor to classify tumor budding was original_glcm_Idn obtained in the T1-W VIBE sequence arterial phase with an accuracy of 84%; wavelet_LLH_firstorder_10Percentile was obtained in the T1-W VIBE sequence portal phase with an accuracy of 92%; wavelet_HHL_glcm_MaximumProbability was obtained in the T1-W VIBE sequence hepatobiliary excretion phase with an accuracy of 88%; and wavelet_LLH_glcm_Imc1 was obtained in T2-W SPACE sequences with an accuracy of 88%. Considering the linear regression analysis, a statistically significant increase in accuracy to 96% was obtained using a linear weighted combination of 13 radiomic features extracted from the T1-W VIBE sequence arterial phase. Moreover, the best classifier was a KNN trained with the 13 radiomic features extracted from the arterial phase of the T1-W VIBE sequence, obtaining an accuracy of 95% and an AUC of 0.96. The validation set reached an accuracy of 94%, a sensitivity of 86% and a specificity of 95%. Conclusions: Machine learning and radiomics analysis are promising tools in predicting tumor budding. Considering the linear regression analysis, there was a statistically significant increase in accuracy to 96% using a weighted linear combination of 13 radiomics features extracted from the arterial phase compared to a single radiomics feature.
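
The modelling recipe reported here, selecting a small panel of radiomic features and training a KNN classifier, can be sketched with scikit-learn. Synthetic arrays stand in for the 851 extracted features, and the paper's balancing and reproducibility filtering are omitted:

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
X = rng.normal(size=(119, 851))    # lesions x radiomic features (synthetic)
y = rng.integers(0, 2, size=119)   # tumor budding label (synthetic)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

model = make_pipeline(
    StandardScaler(),
    SelectKBest(f_classif, k=13),      # a 13-feature panel, as in the abstract
    KNeighborsClassifier(n_neighbors=5),
)
model.fit(X_tr, y_tr)

proba = model.predict_proba(X_te)[:, 1]
print("accuracy:", accuracy_score(y_te, model.predict(X_te)))
print("AUC:", roc_auc_score(y_te, proba))
```
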
23

Chesnokova, A. A. "A Model of Forming a Dynamic Structure for Establishing the Source of Messages in the Receiver's Memory". Proceedings of the Southwest State University. Series: IT Management, Computer Science, Computer Engineering. Medical Equipment Engineering 13, no. 3 (January 27, 2024): 122–34. http://dx.doi.org/10.21869/2223-1536-2023-13-3-122-134.

Abstract
The purpose of this research is to increase the speed of procedures for determining the data source in block coupling mode, through analysis of the dynamic list structure of messages formed in the receiver's memory as a result of intermediate calculations. Methods: The model of forming a dynamic list structure is based on a hardware implementation of the method of limiting the set of data blocks processed by the receiver. The data package includes a special service word, the contents of which are checked for falling into a range of values formed by the receiver when each data block is received. The described restriction reduces the number of typical comparison operations on service words performed when determining the source of messages, and also reduces the likelihood of errors in determining the data source. Results: Based on the model of the formation of a dynamic tree-like list structure, distributions of a priori probabilities of the number of nodes of a certain level are obtained both with and without errors in determining the data source. This allows us to obtain a posteriori error probabilities depending on the observed number of nodes of a certain level. Criteria are formulated for deciding that source determination has failed, based on counting the nodes before the tree structure is fully formed and before the stage of its analysis. This reduces the computational complexity of the procedure for determining the data source in block coupling mode and reduces the memory costs of storing intermediate results. Conclusion: The study revealed that, for sequences of more than 20 messages, the detection of more than 8 extraneous branches in the dynamic list structure being formed allows us to state with 90% probability that the source determination procedure has ended in error. Refusing to transmit subsequent messages in the sequence and to perform processing operations on the tree structure increases the speed of the source determination procedure and reduces its computational complexity.
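
The early-abort criterion can be framed as a Bayesian posterior over the observed number of extraneous branches. In the sketch below, the Poisson likelihoods and the prior are hypothetical stand-ins for the paper's empirically derived distributions:

```python
from math import exp, factorial

def poisson(k: int, lam: float) -> float:
    return lam ** k * exp(-lam) / factorial(k)

def posterior_error(k_branches: int, prior_error: float = 0.05,
                    lam_error: float = 10.0, lam_ok: float = 2.0) -> float:
    """P(source determination failed | k extraneous branches observed),
    assuming Poisson branch counts under each hypothesis."""
    p_err = poisson(k_branches, lam_error) * prior_error
    p_ok = poisson(k_branches, lam_ok) * (1 - prior_error)
    return p_err / (p_err + p_ok)

# Abort the procedure early once the posterior clears a 90% threshold.
for k in range(0, 13, 2):
    flag = "abort" if posterior_error(k) > 0.9 else "continue"
    print(f"{k:2d} extraneous branches -> P(error) = {posterior_error(k):.3f} ({flag})")
```
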
24

Hahn, Christopher N., Milena Babic, Peter J. Brautigan, Parvathy Venugopal, Kerry Phillips, Julia Dobbins, Peer Arts et al. "Australian Familial Haematological Cancer Study - Findings from 15 Years of Aggregated Clinical, Genomic and Transcriptomic Data". Blood 134, Supplement_1 (November 13, 2019): 1439. http://dx.doi.org/10.1182/blood-2019-131686.

Abstract
The Australian Familial Haematological Cancer Study (AFHCS) was initiated in 2004 with the aim to define genes predisposing to hematological malignancy (HM) to offer better options for clinical decision making and genetic counselling, and to identify therapeutic targets. The study is a referral centre for Australia and New Zealand, and currently has 230 families with multiple cases of myeloid and/or lymphoid malignancies or early onset cases (Figure 1), and is growing as clinical awareness of a germline genetic basis for blood cancers increases. To date, we have identified families with causal germline variants in several predisposition genes (five GATA2, ten RUNX1, one CEBPA, ten DDX41, one SAMD9L) including novel single nucleotide variants, deletions and insertions in coding and intronic sequences using traditional Sanger sequencing and now genomic and transcriptomic technologies. Of these, one GATA2 and four DDX41 germline mutations were identified during the screening of "sporadic" MDS samples. All four DDX41 mutant samples also harbored a somatic DDX41 (R525H) variant on the other allele at a low variant allele frequency. A comprehensive clinical analysis of the RUNX1 families has uncovered segregating phenotypes, in addition to thrombocytopenia and myeloid and lymphoid malignancies, including skin disorders such as psoriasis. In an increasing number of individuals in these families, important clinical decisions have been made dependent on mutation carrier status. Recently, we have identified and characterized a unique myeloproliferative neoplasm (MPN)/acute myeloid leukemia (AML)/myelodysplastic syndrome (MDS) family with a germline Chr14q duplication that overlaps with duplications in two other reported MPN/AML families. This appears to be a unique genotypic/phenotypic entity when compared to other myeloid predisposition genes and their associated phenotypes. Interestingly, we have identified several families carrying heterozygous pathogenic/likely pathogenic variants in genes representing autosomal recessive genomic instability syndromes segregating with HM. Here mutations in the genes NBN, RECQL4, DDX11 and RAD21 appear to act in an autosomal dominant manner. Further, we have found DNA damage repair gene predicted pathogenic variants in PALB2 and BARD1 in families with both solid cancers and HM, predominantly lymphomas, implicating an expansion of the major predisposition phenotype of these gene perturbations. Familial cases of chronic lymphocytic leukemia (CLL) have been well recognized, but it has been particularly difficult to identify predisposing variants. We have identified a number of strong candidate genes/variants in CLL families including PRPF8 (Y208C and N400S) and SAMHD1 (R371H) although more families are required to confirm these. An integral part of the AFHCS is the continued generation of cell and animal models to help define mechanisms of action of predicted or known pathogenic variants, and functional model systems for testing of variants of unknown significance. To facilitate the collection of patient samples, we have adopted the use of hair bulbs as the main germline sample as they are easy to collect, can be easily sent long distance by mail at room temperature, require no culture, are quickly and cheaply processed and provide good quality DNA using automated procedures. 
Overall, collaborative efforts within Australia and New Zealand and internationally have been highly fruitful in solving familial cases of hematopoietic malignancies over the last 15 years, and even more concerted international efforts will be required in the future to uncover the familial basis of unsolved cases, particularly in the lymphoid lineage, and to clarify best approaches for clinical decision making and treatment options. Figure 1. Summary of AFHCS families with associated hematological malignancies. Disclosures: Scott: Celgene: Honoraria.
25

Blocka, Joanna, Seong Gu Heo, Shonali Midha, Nathaniel G. Tadros, Steffen B. Kulp, Julia Frede, Marius Mathies, Nikhil C. Munshi, Andrew J. Yee and Jens G. Lohr. "Single-Cell RNA Sequencing of Circulating Myeloma Cells: Clinical Implications in Times of Immunotherapies". Blood 144, Supplement 1 (November 5, 2024): 1898. https://doi.org/10.1182/blood-2024-198713.

Abstract
Introduction: In times of promising immunotherapies against multiple myeloma (MM), close monitoring of patients' response and disease complexity is crucial for accurate decision making. To date, bone marrow (BM) aspiration and biopsy have been the gold standard for diagnosis and molecular characterization of the disease. These procedures, however, are invasive and associated with discomfort for the patients, and not all existing disease clones can be captured by a one-time aspiration at a single location. Furthermore, current standard diagnostic methods such as FISH and bulk DNA sequencing provide little information relevant to decision making about treatment with modern therapeutics such as monoclonal antibodies, bispecific T-cell engagers, and CAR-T cells. Here, we present data to show that single-cell RNA sequencing (scRNA-seq) of circulating MM cells is an excellent proxy for detecting prognostically relevant inferred copy number variants (CNVs) and translocations, while depicting the heterogeneity of the disease. Moreover, scRNA-seq can yield additional, therapeutically relevant information such as the expression level of the target protein for immunotherapies. Because of its non-invasiveness and the possibility of frequent peripheral blood (PB) draws, this method constitutes a great tool for patient monitoring in both clinical-trial and non-trial settings. Methods: Fresh PB and BM samples of 10 myeloma patients at various disease stages were obtained. Mononuclear cells were isolated by density-gradient centrifugation. CD138-positive cells were selected using magnetic-bead separation. Single MM cells were sorted onto 96-well plates using the following gating strategy: 7AAD-, CD14-, CD138+, CD38+, SLAMF7+, CD45-/CD45dim. RNA isolation and cDNA-library preparation were performed using the Smart-seq2 method (Picelli et al. Nat Prot 2014), which yielded information across the full transcript length. For testing purposes, we used our previously published data (Frede et al. Nat Cell Biol 2021). Clustering was performed with Seurat. Cell-type annotation was performed using SingleR and a reference dataset from the BLUEPRINT consortium. Light- and heavy-chain B-cell receptor sequences were determined using BASIC and igBLAST. CNV analysis was performed with inferCNV. Expression level was assessed for genes expected to be overexpressed in case of translocations (CCND1, CCND3, MAF, MAFB, MMSET, FGFR3) as well as for surface marker genes (CD138, CD38, SLAMF7, BCMA, GPRC5D, FCRH5). A Fisher test (expression of a particular gene vs. all other genes without the gene of interest) was performed to define the p-value cut-off for identifying an inferred translocation. The percentage of MM cells expressing a surface marker was assessed for each sample. Results: We could reproduce the heavy- and light-chain clonality with a sensitivity and specificity of 100%. scRNA-seq proved a valid substitute for BM FISH, with a sensitivity of 94.4% and specificity of 100% in detecting prognostically relevant CNVs (amp/gain(1q), del(1p), del(17p), hyperdiploidy) and translocations (t(11;14), t(6;14), t(14;16), t(14;20), t(4;14)).
Interestingly, we could show a direct correlation between very low expression of BCMA (expression in 8.1% of circulating MM cells) 29 months after the 1st CAR-T cell therapy and progressive disease directly after re-exposure to a 2nd dose of anti-BCMA CAR-T cells (31 months after the 1st administration), suggesting that circulating tumor cells may aid in identifying effective therapies and avoiding those that are not. Furthermore, we observed 2 patients with significant downregulation of BCMA 1.5 months and 14 months (10.5% and 3%, respectively) after disease progression under belantamab mafodotin, as well as primary low expression (8.6%) of FcRH5 in a patient naïve to anti-FcRH5 treatment. Moreover, in 2 patients, FISH from BM was not feasible due to technical difficulties, which further highlights the utility of interrogating single MM cells from PB. Conclusion: scRNA-seq of MM cells from PB is a robust, non-invasive substitute for BM FISH to detect prognostically and therapeutically relevant CNVs and translocations. Furthermore, it yields additional information such as the expression level or absence of surface markers as immunotherapeutic targets. This is of critical use in clinical decision making and provides an opportunity for more personalized therapy approaches.
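
Two of the read-outs described above, the per-gene overexpression test used to call inferred translocations and the fraction of cells expressing a surface target, can be sketched with made-up counts. Real analyses operate on the full single-cell expression matrix:

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table: rows are CCND1-high vs CCND1-low cells,
# columns are the patient's circulating MM cells vs a reference pool.
table = [[84, 12],
         [16, 88]]
odds_ratio, p_value = fisher_exact(table, alternative="greater")
print(f"inferred t(11;14) signal: OR = {odds_ratio:.1f}, p = {p_value:.2e}")

# Surface-target expression, e.g. before re-exposure to anti-BCMA CAR-T cells.
bcma_positive_cells, total_cells = 9, 111   # hypothetical counts
print(f"BCMA expressed in {100 * bcma_positive_cells / total_cells:.1f}% of cells")
```
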
26

DeKeyser, Graham J., Yantarat Sripanich, Jesse Steadman, Chamnanni Rungprai, Justin Haller, Charles L. Saltzman and Alexej Barg. "Mapping of Posterior Talar Dome Access through Standard Posteromedial and Posterolateral Approaches With or Without an External Fixator Distraction: A Match-paired Cadaveric Study". Foot & Ankle Orthopaedics 5, no. 4 (October 1, 2020): 2473011420S0019. http://dx.doi.org/10.1177/2473011420s00191.

Abstract
Category: Trauma; Ankle; Other. Introduction/Purpose: Posterior talar body fractures (AO/OTA 81.1.B/C) are rare injuries that present unique challenges in their access to the treating surgeon. Accessibility to this structure has been investigated extensively in the context of osteochondral lesion interventions, which normally require perpendicular access to perform operative procedures. However, techniques for gaining this access for fracture repair, requiring only adequate visualization, have not been described in the literature. Generally, a pre-operative decision is made between a posterior, soft-tissue-based approach and a peri-articular osteotomy, the latter being associated with comparatively higher morbidity and complication rates. The aim of this study is to evaluate the accessible area of the talar dome via two standard posterior approaches (posteromedial, PM, and posterolateral, PL) with and without external fixator distraction. Methods: Eight male through-knee matched-paired cadaveric legs (mean age: 49.0 ± 14.6 years; mean BMI: 24.5 ± 3.9 kg/m2) were included in this study. A standard PM or PL approach was performed using a randomized crossover design for surgical sequences. The accessible area without distraction was initially outlined by drilling a 1.6-mm Kirschner wire around the periphery of the visualized talus. Five millimeters of distraction, confirmed with fluoroscopy, was then applied to the specimens using an external fixator. The accessible area was again marked using the same method. The tali were then explanted and imaged using a micro-CT scanner to acquire 3-dimensional reconstructions. The accessible area was calculated as a percentage of the total talar dome surface area. The Mann-Whitney U test was used to compare the reported areas between the two surgical approaches, and the Wilcoxon signed-rank test was used to compare values between distracted and non-distracted conditions. Results: For reference, the average total surface area of the talus is 16.94 ± 2.47 cm2. No statistically significant differences were found among match-paired specimens (p=0.63). The PM approach allowed access to 17.1% (11.1 to 23.6%, SD 5.4) of the talar dome surface without distraction and 29.3% (20.0 to 38.6%, SD 8.6) with distraction. The PL approach provided access to 7.4% (4.7 to 11.8%, SD 3.1) and 17.0% (11.0 to 26.1%, SD 6.5) of the talar dome surface without and with distraction, respectively. A statistically significant difference was observed in talar dome accessibility between distracted and non-distracted conditions in both surgical approaches (p=0.008). Additionally, the PM approach provided significantly more access to the talar dome than the PL approach (p=0.043). Conclusion: This matched-paired cadaveric study provides a roadmap that can assist in the pre-operative planning of talar dome access in the treatment of talar body and posterior tubercle fractures. We found no advantage of a PL approach over a PM approach to access these challenging fractures. Additionally, added distraction using an external fixator consistently increased visualization of the talar dome by a magnitude of at least 40% over the non-distracted conditions. These methods can be applied clinically to gain appropriate access to the talar dome, allowing fracture repair.
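
The two statistical comparisons named in the abstract map directly onto SciPy calls: an unpaired Mann-Whitney U test for the PM-vs-PL contrast and a paired Wilcoxon signed-rank test for the distraction effect. The percent-access values below are hypothetical:

```python
from scipy.stats import mannwhitneyu, wilcoxon

# Hypothetical percent-of-talar-dome access values for four specimens each.
pm_access = [17.1, 11.1, 23.6, 15.8]    # posteromedial approach, no distraction
pl_access = [7.4, 4.7, 11.8, 6.2]       # posterolateral approach, no distraction
print(mannwhitneyu(pm_access, pl_access, alternative="two-sided"))

# Paired comparison within the PM approach: same specimens, with vs without
# 5 mm of external fixator distraction.
undistracted = [17.1, 11.1, 23.6, 15.8]
distracted = [29.3, 20.0, 38.6, 27.4]
print(wilcoxon(undistracted, distracted))
```
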
27

Glover, D. M. "New doors to open…and so many!" Journal of Cell Science 113, n.º 3 (1 de febrero de 2000): 359–60. http://dx.doi.org/10.1242/jcs.113.3.359.

Full text
Abstract
The pursuit of science is a wonderful journey of discovery along which there are a myriad of avenues to be explored. There have always been so many objects of fascination, so many questions to ask along the way, so many possibilities to understand new principles, that making the decision about which problem to address and then having the self-discipline to explore it in depth challenge all who practise the art. How then are we, as cell biologists, to cope with the mountain of information that is accumulating as we enter the twenty-first century? We now have the potential to decipher the primary sequences of every single cellular protein for several model organisms. Just how are we to put this information into an intelligible framework for understanding cell physiology? The turn of a century is a time at which we can permit ourselves the luxury of looking backwards as well as forwards. Where were we a century ago, what were the challenges that faced us then and how do these questions relate to our future goals? As a cell biologist standing on the threshold of the twentieth century, one must have had a similar feeling of elation and expectation to that which we have at the present time. The Theory of Cells had been established by Schleiden and Schwann in 1838–1839, and in the following fifty years it had led to unifying ideas about the nature of plants and animals, an understanding of embryonic development, and the mysteries of the fertilisation of the egg and genetic continuity in terms of ‘cellular immortality’. These were truly halcyon days. By the end of the nineteenth century many of the central principles of cell biology were firmly established. Virchow had maintained in 1855 that every cell is the offspring of a pre-existing parent cell, but the realisation that the cell nucleus is essential for this continuity had to wait another 30 years. By this time, Miescher had already made in 1871 his famous discovery of nuclein, a phosphorus-rich substance extracted from preparations of nuclei from sperm and pus cells, and over the next twenty years a spectrum of sophisticated dyes became available that facilitated the visualisation of not only nuclein but also asters, spindle fibres, and microsomal components of cytoplasm in fixed preparations of cells. The centrosome, discovered independently by Flemming in 1875 and Van Beneden in 1876, and named by Boveri in 1888, was already considered to be an autonomous organelle with a central role in cell division. The behaviour of chromosomes, centrosomes, astral fibres and spindle fibres throughout mitosis and meiosis had been described in exquisite detail. Galeotti had even concluded by 1893 that the unequal distribution of chromatin in cancer cells correlates with an inequality of the centrosomes and the development of abnormal spindles - a conclusion reinforced by others over a century later (Pihan et al., 1998; Lingle et al., 1998). It had taken 200 years following Leeuwenhoek's first observation of sperm to Hertwig's demonstration in 1875 that fertilisation of the egg is accomplished by its union with one spermatozoon. This demonstration was rapidly followed by Van Beneden's discovery - eventually to unify genetics and cell biology - that the nuclei of germ cells each contain one half the number of chromosomes characteristic of body cells. By 1902, both Sutton and Boveri had realised that the behaviour of chromosomes in meiosis precisely parallels the behaviour of Mendel's genetic particles described some 35 years earlier.
In many ways we have witnessed during the past 50 years, and particularly in the last quarter century, a series of exciting breakthroughs in establishing an understanding of genetic function and continuity that are comparable to those of the previous century in demonstrating cellular function and continuity. The determination of the structure of DNA in 1953 and the elucidation of the genetic code throughout the 1960s led to the rapid realisation of the code's universality. The parallel development of sophisticated techniques for studying the genetics of the model bacterium Escherichia coli and its plasmids and viruses paved the way for a new era in biology. We were soon to construct recombinant DNA molecules in vitro, propagate them and eventually express them in E. coli, taking full advantage of the universality of the code. The principles of cloning DNA molecules had been clearly enunciated by Berg and Hogness in the early 1970s, and I myself had the great fortune as a young post-doc to share in this excitement and participate in putting some of these principles into their early practice. By the end of that decade, genes had been cloned from a multitude of eukaryotes and, moreover, technologies had been developed by Maxam and Gilbert and by Sanger that enabled these cloned genes to be sequenced. The accelerating accumulation of knowledge enabled by these simple technical breakthroughs has been astounding, leading to the determination of the complete genome sequences of budding yeast, the nematode Caenorhabditis elegans and the fruit fly, Drosophila melanogaster, and the prospect of the complete human sequence within a few years. To date we have managed this accumulating wealth reasonably well. Cloned genes have allowed cell biologists access to the encoded proteins, and as a consequence we have a working knowledge of many cellular processes. The sub-cellular meanderings of molecules have been charted with increasing accuracy, and gene products have been positioned in regulatory pathways. The concerted application of genetic and molecular approaches has given new insights into cell biology. This is particularly evident from work on the yeasts, which have come into their own as model systems with our realisation of the extent to which cell biological processes have been conserved. Nevertheless, the resulting regulatory pathways that emerge from our current ways of looking at the cell are rather unidimensional, gene products being placed into linear pathways as a result of either molecular or genetic analyses. Our current views are often blind to the fact that the cell is a multidimensional structure whose components are arranged in space, have multiple contacts that change with time and can respond simultaneously to a multitude of signals. Glimpses of such complexity are emerging from studies in which microarrays of all the identified open reading frames (ORFs) from the complete budding yeast genome have been screened for changes in patterns of gene expression throughout the cell cycle or upon sporulation. Cell-cycle-dependent periodicity was found for 416 of the 6220 monitored ORFs, and over 25% of these genes were found to be clustered at particular chromosomal sites, suggesting that there are global chromosomal responses in transcriptional control (Cho et al., 1998). The study of sporulation is perhaps the first example of the application of this type of technology to a developmental process.
It revealed that, of the 6220 genes, about 500 undergo repression and 500 induction in seven temporally distinct patterns during the sporulation process, identifying potential functions for many previously uncharacterised genes (Chu et al., 1998). These studies already reveal layers of complexity in the regulation of the levels of transcripts as cells prepare for and pass through the different stages of meiosis. How much more complex are these patterns likely to be when viewed in terms of proteins, and their interactions, locations and functions within the cell? It seems clear, however, that a wonderful molecular description of the events of meiosis that can match the cytological understanding revealed by the work of Van Beneden and Boveri one hundred years ago is within our grasp. The cataloguing of all cellular proteins is now feasible through a combination of 2D-gel analysis and mass spectrometry, from which molecular mass data can be correlated with the fragment sizes of peptides predicted from whole genome sequence data (the emerging field of proteomics). It is not an easy task, but it seems just a matter of time before we have all this information at our fingertips. Yet how can we know the functions of all these proteins and have a full 3D picture of how they interact within a cell and the dynamics with which they do so? Yeast may be the first eukaryote for which some of these problems can be approached. Its genome is six times smaller than that of C. elegans and 200 times smaller than the human genome, and has the further advantage that the genes can be easily disrupted through homologous recombination. Thus the prospect of systematic gene deletion to study the function of the 3700 novel ORFs identified in the whole genome sequence is feasible for this organism (Winzeler et al., 1999). One group in particular has devised a multifaceted approach for doing this: the affected gene is simultaneously tagged with an in-frame transcriptional reporter and further modified to epitope tag the affected protein, which thus allows the latter to be immunolocalised within cells (Ross-MacDonald et al., 1999). We can thus see the glimmerings of a holistic, genome-wide, cell-wide unravelling of cellular physiology. Some of these approaches will be easily adaptable to higher organisms. We will soon have read-outs of RNA expression patterns in cells undergoing a variety of developmental and physiological programmes in normal and diseased states. The analysis of function and the identification of ORFs in higher eukaryotes are likely to be more problematic. However, solutions for the rapid assessment of the functions of novel genes are already emerging. New insights are coming from labs using double-stranded RNA to interfere with cellular processes in C. elegans. It was originally found in this organism that the injection of double-stranded RNA corresponding to part of the mRNA of a gene prevents the expression of that gene through a mechanism that currently remains mysterious (Fire, 1999). The technique works extremely well in the nematode and even in the fruit fly, but doubts had been cast as to whether it would ever be valuable in mammals. The recent finding that the technique does indeed work in the mouse may well accelerate programmes to identify gene function by circumventing the particularly lengthy procedures for disruption of mouse genes (Wianny and Zernicka-Goetz, 2000).
The multiple layers of complexity revealed by these emerging studies give some indication of the computational power that will be needed to model the cell. Is it now time for a new breed of mathematical biologists to emerge? Our present generation of cellular and molecular biologists have lost sight of some of the basic principles of physical chemistry, and quantitative analyses are done poorly if at all. Should the quantification of reaction kinetics now come out of the traditional domain of enzymology and be applied to multiple cellular processes - if we are truly to understand the dynamics of the living cell? If the yeast cell is complex, then how much greater complexity will we find in multicellular eukaryotes, given all the potential for cell-cell interactions? These problems are perhaps most alluring in the field of development, in which many phenomena are now demanding attention at the cellular level. In recent decades we have seen classical embryological approaches supplemented by genetic analyses to define the components of many developmental signalling pathways. This has demonstrated the existence of a conserved collection of molecular switches that can be used in a variety of different developmental circumstances. We are perhaps reaching the limits at which conventional genetic analyses can interpret these processes: often the precise relationships between components of regulatory pathways are not clear. We require a better grasp of how the molecules within the pathways interact, which will require the concerted application of sub-cellular fractionation, to identify molecular complexes, and proteomics. This has to be achieved in a way that allows us to interpret the consequences of multiple signalling events between different cell types. In the introduction to his famous text The Cell in Development and Inheritance, E. B. Wilson wrote almost a century ago: ‘It has only recently become possible adequately to formulate the great problems of development and heredity in terms of cellular biology - indeed we can as yet do little more than so formulate them.’ Has our perspective changed during the past one hundred years? Are not these the same challenges that lie ahead for the twenty-first century? It is now rather like being Alice in Wonderland in a room with many doors, each of which marks the onset of a new journey. Undoubtedly, any of the doors will lead to remarkable opportunities, but to what extent can we, as Alice, rely upon drinking from the bottle, or eating the biscuit, that happens to be at hand? We will have to use the existing resources, but it will be fascinating to see what new ingenuities we can bring to bear to help us on our journey through Wonderland. I have the feeling that we are to witness conceptual challenges to the way we think about cell biology that we cannot yet begin to appreciate…but what I would give to be around in one hundred years' time to witness the progress we have made on our journeys!
28

Brady, Ross. "Semantic Decision Procedures for Some Relevant Logics". Australasian Journal of Logic 1 (1 de julio de 2003). http://dx.doi.org/10.26686/ajl.v1i0.1760.

Full text
Abstract
This paper proves decidability of a range of weak relevant logics using decision procedures based on the Routley-Meyer semantics. Logics are categorized as F-logics, for those proved decidable using a filtration method, and U-logics, for those proved decidable using a direct (unfiltered) method. Both of these methods are set out as reductio methods, in the style of Hughes and Cresswell. We also examine some extensions of the U-logics where the method fails and infinite sequences of worlds can be generated.
29

Satria, Budy, Acihmah Sidauruk, Raditya Wardhana, Abdussalam Al Akbar y M. Arinal Ihsan. "Penerapan Composite Performance Index (CPI) Sebagai Metode Pada Sistem Pendukung Keputusan Seleksi Penerima Beasiswa". Indonesian Journal of Computer Science 11, n.º 2 (17 de agosto de 2022). http://dx.doi.org/10.33022/ijcs.v11i2.3056.

Full text
Abstract
The selection process for scholarship recipients in higher education requires a measurable system. The problem has been that the procedures carried out by the scholarship provider still rely on manual examination of application files. The composite performance index (CPI) is the method used in this study. The purpose of this study is to create a decision support system for the selection of scholarship recipients that is more systematic and time-efficient. There are 10 alternatives and 4 criteria, namely parental income, GPA, electricity usage, and semesters. The five highest-ranked alternatives were A7 with a value of 235.00 (rank 1), A4 with a value of 200.00 (rank 2), A1 with a value of 134.14 (rank 3), A5 with a value of 120.00 (rank 4), and A8 with a value of 91.67 (rank 5).
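The abstract does not spell out the CPI computation, but one common formulation normalizes each criterion against its minimum value and weights the results. A minimal sketch under that assumption; the weights, benefit/cost directions, and raw scores below are hypothetical, since the abstract reports only the criteria names and the final ranking values:

```python
# Hedged sketch of a common Composite Performance Index (CPI) formulation.

criteria = ["parental_income", "gpa", "electricity_usage", "semesters"]
benefit = {"parental_income": False, "gpa": True,
           "electricity_usage": False, "semesters": True}
weights = {"parental_income": 0.35, "gpa": 0.30,
           "electricity_usage": 0.20, "semesters": 0.15}

alternatives = {  # three of the ten alternatives, with invented scores
    "A1": {"parental_income": 3_000_000, "gpa": 3.6, "electricity_usage": 900, "semesters": 5},
    "A4": {"parental_income": 1_500_000, "gpa": 3.8, "electricity_usage": 450, "semesters": 6},
    "A7": {"parental_income": 1_200_000, "gpa": 3.9, "electricity_usage": 450, "semesters": 7},
}

def cpi_scores(alts):
    mins = {c: min(a[c] for a in alts.values()) for c in criteria}
    scores = {}
    for name, vals in alts.items():
        total = 0.0
        for c in criteria:
            # Benefit criteria reward higher values; cost criteria reward lower ones.
            ratio = vals[c] / mins[c] if benefit[c] else mins[c] / vals[c]
            total += 100.0 * ratio * weights[c]
        scores[name] = round(total, 2)
    return scores

# Rank alternatives by composite index, highest first
print(sorted(cpi_scores(alternatives).items(), key=lambda kv: -kv[1]))
```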
30

Wyss, Patric, David Ginsbourger, Haochang Shou, Christos Davatzikos, Stefan Klöppel y Ahmed Abdulkadir. "Adaptive data-driven selection of sequences of biological and cognitive markers in pre-clinical diagnosis of dementia". Scientific Reports 13, n.º 1 (19 de abril de 2023). http://dx.doi.org/10.1038/s41598-023-32867-z.

Full text
Abstract
Effective clinical decision procedures must balance multiple competing objectives such as time-to-decision, acquisition costs, and accuracy. We describe and evaluate POSEIDON, a data-driven method for PrOspective SEquentIal DiagnOsis with Neutral zones to individualize clinical classifications. We evaluated the framework with an application in which the algorithm sequentially proposes to include cognitive, imaging, or molecular markers if a sufficiently more accurate prognosis of clinical decline to manifest Alzheimer’s disease is expected. Over a wide range of cost parameters, data-driven tuning led to quantitatively lower total cost compared to ad hoc fixed sets of measurements. The classification accuracy based on all longitudinal data from participants, acquired over 4.8 years on average, was 0.89. The sequential algorithm selected 14 percent of the available measurements and concluded after an average follow-up time of 0.74 years, at the expense of 0.05 lower accuracy. Sequential classifiers were competitive from a multi-objective perspective, since they could dominate fixed sets of measurements by making fewer errors while using fewer resources. Nevertheless, the trade-off among competing objectives depends on inherently subjective prescribed cost parameters. Thus, despite the effectiveness of the method, its implementation in consequential clinical applications will remain controversial and will evolve around the choice of cost parameters.
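The core loop of such a sequential procedure can be sketched in a few lines: keep acquiring markers while the running risk estimate stays inside a neutral zone, and stop as soon as it leaves. The marker names, costs, thresholds, and `update_risk` rule below are hypothetical placeholders, not POSEIDON's actual model:

```python
# Hedged sketch of sequential classification with a neutral zone.
markers = [("cognitive", 1.0), ("imaging", 5.0), ("molecular", 10.0)]  # (name, cost)

def update_risk(risk, measurement):
    # Placeholder update rule standing in for the study's prognostic model.
    return 0.5 * risk + 0.5 * measurement

def sequential_diagnosis(measure, lower=0.2, upper=0.8):
    """Acquire markers in a fixed order until the risk leaves the neutral zone."""
    risk, total_cost = 0.5, 0.0
    for name, cost in markers:
        if not (lower <= risk <= upper):   # already decisive: stop measuring
            break
        risk = update_risk(risk, measure(name))
        total_cost += cost
    decision = "decline" if risk > upper else "stable" if risk < lower else "undecided"
    return decision, risk, total_cost

# Toy usage with fixed measurement values
values = {"cognitive": 0.6, "imaging": 0.9, "molecular": 0.95}
print(sequential_diagnosis(lambda name: values[name]))
```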
31

Clauss, Günther F., Sascha Kosleck y Daniel Testa. "Critical Situations of Vessel Operations in Short Crested Seas—Forecast and Decision Support System". Journal of Offshore Mechanics and Arctic Engineering 134, n.º 3 (2 de febrero de 2012). http://dx.doi.org/10.1115/1.4004515.

Full text
Abstract
The encounter of extreme waves, extreme wave groups, or unfavorable wave sequences poses dangerous threats for ships and floating or stationary marine structures. The impact of extreme waves causes enormous forces, whereas an unfavorable wave sequence—not necessarily of extreme waves—can arouse critical motions or even resonance, often leading to loss of cargo, ship, or crew. Thus, besides a well thought-out maritime design, a system detecting critical incoming wave sequences in advance can help avoid those dangerous situations, increasing the safety of sea transport and offshore operations. During the last two years a new system for decision support onboard a ship or floating/fixed marine structure, named CASH—Computer Aided Ship Handling—has been introduced. The preceding papers showed the stepwise development of the main components of the program code—3d-wave forecast and 3d-ship motion forecast. These procedures provide a deterministic approach to predicting the short-crested sea state within radar range of the ship, as well as the resulting ship motions in six degrees of freedom. Both methods have been enhanced with special focus on the speed of calculation to ensure a just-in-time forecast. A newly developed component is the adaptive 3d-pressure distribution. This method calculates the pressure distribution along the wetted surface of the ship hull using a newly developed stretching approach. With the end of the joint project Loads on Ships in Seaway (LaSSe), funded by the German Government, the paper presents the CASH system, giving the possibility to detect critical situations in advance. Thus, not only can decision support be provided onboard a cruising ship, but time windows for offshore operations can also be identified well in advance.
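The deterministic idea behind such forecasts can be illustrated in one dimension: decompose a measured elevation record into harmonic components and propagate each with the deep-water dispersion relation ω² = gk. A minimal linear sketch with toy data only; the CASH 3d short-crested procedure is far more elaborate:

```python
# Hedged 1D sketch of deterministic linear wave forecasting (toy data).
import numpy as np

g = 9.81
t = np.linspace(0, 600, 4096)                        # elevation record at a probe
eta = np.sin(0.5 * t) + 0.4 * np.sin(0.9 * t + 1.0)  # toy measured surface

A = np.fft.rfft(eta)                                  # complex amplitudes
omega = 2 * np.pi * np.fft.rfftfreq(len(t), d=t[1] - t[0])
k = omega**2 / g                                      # deep-water dispersion

def forecast(x, tau):
    """Predict the elevation record a distance x down-wave, tau seconds ahead."""
    phase = np.exp(1j * (omega * tau - k * x))        # assumes +x propagation
    return np.fft.irfft(A * phase, n=len(t))

print(forecast(x=200.0, tau=60.0)[:5])
```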
32

Haroutunian, Evgueni y Aram Yesayan. "A Neyman-Pearson Proper Way to Universal Testing of Multiple Hypotheses Formed by Groups of Distributions". Mathematical Problems of Computer Science, 25 de diciembre de 2020, 18–33. http://dx.doi.org/10.51408/1963-0056.

Full text
Abstract
The asymptotically optimal Neyman-Pearson procedures of detection for models characterized by M discrete probability distributions arranged into K, 2 ≤ K ≤ M, groups considered as hypotheses are investigated. A sequence of tests based on a growing number of observations is logarithmically asymptotically optimal (LAO) when a certain part of the given error probability exponents (reliabilities) provides positive values for all other reliabilities. LAO test sequences are designed for some models of objects, including cases when rejection of decision may be permitted and when part, or all, of the given error probabilities decrease subexponentially with an increase in the number of experiments. For all reliabilities of such tests, single-letter characterizations are obtained. A simple case with three distributions and two hypotheses is considered.
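In standard notation (not necessarily the paper's own symbols), the reliabilities are the exponents with which the error probabilities decay in the number of observations N:

```latex
% Reliability (error exponent) of the l-th kind of error for a test sequence
% \varphi_N based on N observations; generic notation, for orientation only.
\[
  E_{l} \;=\; \lim_{N \to \infty} \left( - \frac{1}{N}\, \log \alpha_{l}(\varphi_N) \right)
\]
% A test sequence is logarithmically asymptotically optimal (LAO) when, for
% preassigned values of a subset of the E_l, all remaining reliabilities
% attain their maximal possible (positive) values.
```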
33

Simić, Stella, Alberto Bemporad, Omar Inverso y Mirco Tribastone. "Tight Error Analysis in Fixed-point Arithmetic". Formal Aspects of Computing, 4 de mayo de 2022. http://dx.doi.org/10.1145/3524051.

Full text
Abstract
We consider the problem of estimating the numerical accuracy of programs with operations in fixed-point arithmetic and variables of arbitrary, mixed precision and possibly non-deterministic value. By applying a set of parameterised rewrite rules, we transform the relevant fragments of the program under consideration into sequences of operations in integer arithmetic over vectors of bits, thereby reducing the problem of whether the error enclosures in the initial program can ever exceed a given order of magnitude to simple reachability queries on the transformed program. We describe a possible verification flow and a prototype analyser that implements our technique. We present an experimental evaluation on a particularly complex industrial case study, including a preliminary comparison between bit-level and word-level decision procedures.
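The problem setting can be illustrated with a toy error-enclosure calculation: represent each quantity as a value on a grid of spacing 2^-f and carry a worst-case error bound through every operation. This sketch only illustrates the kind of quantity being bounded; the paper's rewrite rules and bit-vector reachability encoding are exact rather than interval-based:

```python
# Hedged sketch of fixed-point quantization and worst-case error propagation.

def quantize(x, f):
    """Round x to f fractional bits; return (value, error bound of half an ulp)."""
    step = 2.0 ** -f
    return round(x / step) * step, step / 2

def add(a, ea, b, eb):
    # Addition of same-format operands: error bounds simply accumulate.
    return a + b, ea + eb

def mul(a, ea, b, eb, f):
    # Cross terms bound the propagated error; re-quantizing the product to
    # f fractional bits adds another half ulp.
    prod, _ = quantize(a * b, f)
    return prod, abs(a) * eb + abs(b) * ea + ea * eb + 2.0 ** -(f + 1)

f = 8
x, ex = quantize(0.1, f)
y, ey = quantize(0.3, f)
s, es = add(x, ex, y, ey)
p, ep = mul(x, ex, y, ey, f)
print(f"sum = {s} +/- {es:.6f}")
print(f"product = {p} +/- {ep:.6f}")
```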
34

Mentz, Marian, Karin Barac y Elza Odendaal. "An audit evidence planning model for the public sector". Journal of Economic and Financial Sciences 11, n.º 1 (29 de marzo de 2018). http://dx.doi.org/10.4102/jef.v11i1.166.

Full text
Abstract
Orientation: Auditors have to exercise complex, multi-dimensional evidence-planning judgements. Research purpose: Drawing on social closure theory, the aim of this study is to develop a model to inform the flexible exercise of judgement regarding the types, extent and combinations of audit procedures implemented to gather sufficient appropriate audit evidence to respond to the assessed risks of material misstatement. Motivation for the study: The exercise of considerable judgements by auditors may mean that little consistency is achieved regarding the quantity and quality of the audit evidence obtained, especially in the public sectors of developing countries (which are often plagued by corruption), where auditors and auditees have limited skills and experience. Research approach, design and method: The study employs a theory-building approach to develop a model intended to guide public sector auditors (following an audit risk approach) to exercise planning judgements for a class of transactions, account balance and/or disclosure. Main findings: The model clarifies the audit evidence decision-making sequences, interrelationships and contingent dependencies of the different audit procedures, and quantifies the compensatory interrelationships between the types of audit procedures to be performed and the overall levels of assurance desired in response to the assessed risks of material misstatement. Practical and managerial implications: The model could help public sector auditors reduce uncertainty, ambiguity and judgement errors during their planning decision-making. Contribution or value-add: The model has been incorporated into the audit methodology of the Auditor-General of South Africa, and has been assessed for compliance with the International Standards on Auditing by the Independent Regulatory Board for Auditors in South Africa.
35

Zhang, Mengyi, Arianna Alfieri y Andrea Matta. "Generation of mathematical programming representations for discrete event simulation models of timed petri nets". Discrete Event Dynamic Systems, 6 de diciembre de 2023. http://dx.doi.org/10.1007/s10626-023-00387-7.

Full text
Abstract
This work proposes a mathematical programming (MP) representation of discrete event simulation of timed Petri nets (TPN). Currently, mathematical programming techniques are not widely applied to optimize discrete event systems, due to the difficulty of formulating models capable of correctly representing the system dynamics. This work connects two fruitful research fields, i.e., mathematical programming and timed Petri nets. In the MP formalism, the decision variables of the model correspond to the transition firing times and the markings of the TPN, whereas the constraints represent the state transition logic and temporal sequences among events. The MP model and a simulation run of the TPN are then totally equivalent, and this equivalence has been validated through an application in the queuing network field. Using a TPN model as input, the MP model can be routinely generated and used as a white box for further tasks such as sensitivity analysis, cut generation in optimization procedures, and proofs of formal properties.
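A classic single-server queue illustrates the general idea (the textbook example of such simulation-equivalent programs, not the paper's full TPN construction): with arrival times a_k and service times s_k given, the firing (departure) times x_k solve a linear program whose constraints encode the event-precedence logic of the simulation:

```latex
% Single-server illustration: the LP below reproduces the simulation trajectory.
\begin{align*}
  \min\; & \sum_{k=1}^{K} x_k \\
  \text{s.t.}\;\; & x_k \ge a_k + s_k,
      && k = 1,\dots,K \quad \text{(service cannot end before arrival plus service)}\\
  & x_k \ge x_{k-1} + s_k,
      && k = 2,\dots,K \quad \text{(the server must free up first)}
\end{align*}
% At the optimum, x_k = \max(a_k, x_{k-1}) + s_k: exactly the recursion a
% discrete event simulation of the queue would compute, event by event.
```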
36

Igual Munoz, B., C. F. D. Fernandez Diaz, M. F. V. Ferre Vallverdu, A. B. J. Berenguer Jofesa, E. S. C. Sanchez Lacuesta, A. P. Pirola, F. R. S. Ridocci Soriano et al. "Impact of blood pressure on the extent of microvascular damage in the setting of reperfusion injury in STEMI assessed with magnetic resonance imaging". European Heart Journal 42, Supplement_1 (1 de octubre de 2021). http://dx.doi.org/10.1093/eurheartj/ehab724.1309.

Full text
Abstract
Hypertension is a poor prognostic factor following STEMI; however, the impact of blood pressure (BP) on the extent of microvascular damage in the setting of acute ischemia-reperfusion injury has not yet been fully evaluated, and this evaluation is the purpose of the present study. Methods: A cohort of patients with acute STEMI referred for primary percutaneous intervention was prospectively included. Angiographic analyses were performed according to standard clinical practice, and decisions regarding type of stent and antiplatelet drugs were left to the discretion of the cardiologist. Information about high blood pressure needing drug therapy before the acute event, and about BP levels in the catheterization laboratory during reperfusion procedures, was assessed. All patients underwent cardiac MRI during the first week post reperfusion with a standardized protocol including 8–10 short-axis slices in order to assess: the area of myocardium at risk (AR), as an area of signal hyperintensity >2 SD with respect to the remote myocardium in TSE-T2 sequences; necrosis size (NS), as signal hyperintensity >5 SD relative to the remote myocardium in IR-FGE sequences; microvascular obstruction (MVO), as signal hypointensity in the infarct core in IR-FGE sequences; and intra-infarct haemorrhage (IIH), as an area with T2*<20 msec in T2* mapping sequences. The extent was quantified using a semi-automatic approach and expressed as a percentage of left ventricular mass. Results: 94 patients (mean age 62, SD 13; 67% male) were included. Hypertensive patients were more likely to have diabetes mellitus, 43 vs 28% (p<0.05), while smoking was more frequent in non-hypertensive patients, 41% vs 80% (p<0.05). No differences were found in the extent of AR or myocardial salvage between groups; nevertheless, hypertensive patients had a significantly lower extent of NS, MVO and IIH. Regarding microvascular damage, hypertension showed a protective effect: IIH was observed in 35% of hypertensive patients vs. 65% of non-hypertensive patients (p=0.019; OR: 0.9, p=0.04, CI: 0.1–0.9) and MVO in 40% vs 70% (OR: 0.2, p=0.02, CI: 0.1–0.6). Furthermore, patients with BP levels below 120 mmHg in the catheterization laboratory showed a higher percentage of MVO (62% vs 36%, p<0.05) and IIH (36 vs 15%, p>0.05). Conclusions: 1. Hypertension is associated with a lower extent of necrosis and microvascular damage, without differences in myocardial salvage or area of myocardium at risk. 2. Low BP levels in the catheterization laboratory were associated with a significantly higher extent of microvascular damage, so maintaining systolic blood pressure over 120 mmHg during reperfusion procedures should be advised. Funding Acknowledgement: Type of funding sources: None. [Figures: typical CMR sequences (T2w, T2*, IR-FGE); CMR segmental analysis with T2* mapping]
37

Vogel, Frederike, Nils M. Vahle, Jan Gertheiss y Martin J. Tomasik. "Supervised learning for analysing movement patterns in a virtual reality experiment". Royal Society Open Science 9, n.º 4 (abril de 2022). http://dx.doi.org/10.1098/rsos.211594.

Full text
Abstract
The projection into a virtual character and the concomitant illusory body ownership can lead to transformations of one’s entity. Both during and after the exposure, behavioural and attitudinal changes may occur, depending on the characteristics or stereotypes associated with the embodied avatar. In the present study, we investigated the effects on physical activity when young students experience being old. After random assignment to a young or an older avatar, the participants’ body movements were tracked while they performed upper-body exercises. We propose and discuss the use of supervised learning procedures to assign these movement patterns to the underlying avatar class in order to detect behavioural differences. This approach can be seen as an alternative to classical feature-wise testing. We found that the classification accuracy was remarkably good for support vector machines with a linear kernel and for deep learning by convolutional neural networks, when time sub-sequences extracted randomly and repeatedly from the original data were used as input. For hand movements, the associated decision boundaries revealed a higher level of local, vertical positions for the young avatar group, indicating increased agility in their performances. This held for both guided movements and achievement-orientated exercises.
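A minimal sketch of this classification setup with scikit-learn, assuming synthetic stand-ins for the tracked movement data (the array shapes, window width, and two-class labelling below are assumptions, not the study's configuration):

```python
# Hedged sketch: classify randomly extracted movement sub-sequences by avatar class.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def random_windows(series, label, n_windows=50, width=100):
    """Extract flattened sub-sequences at random positions from one recording."""
    starts = rng.integers(0, len(series) - width, size=n_windows)
    X = np.stack([series[s:s + width].ravel() for s in starts])
    return X, np.full(n_windows, label)

# Hypothetical tracked hand positions: (time, 3) arrays per participant;
# the young group is given a slightly higher vertical level, as reported.
young = [rng.normal(1.2, 0.1, size=(1000, 3)) for _ in range(10)]
old = [rng.normal(1.0, 0.1, size=(1000, 3)) for _ in range(10)]

parts = [random_windows(s, c) for c, group in enumerate([old, young]) for s in group]
X = np.concatenate([p[0] for p in parts])
y = np.concatenate([p[1] for p in parts])

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
clf = SVC(kernel="linear").fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2f}")
```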
38

Yoshizawa, Tomohiko, Makoto Ito y Kenji Doya. "Neuronal representation of a working memory-based decision strategy in the motor and prefrontal cortico-basal ganglia loops". eneuro, 1 de junio de 2023, ENEURO.0413–22.2023. http://dx.doi.org/10.1523/eneuro.0413-22.2023.

Full text
Abstract
While animal and human decision strategies are typically explained by model-free and model-based reinforcement learning, their choice sequences often follow simple procedures based on working memory of past actions and rewards. Here we address how working memory-based choice strategies, such as win-stay-lose-switch (WSLS), are represented in the prefrontal and motor cortico-basal ganglia loops by simultaneous recording of neuronal activities in the dorsomedial striatum (DMS), the dorsolateral striatum (DLS), the medial prefrontal cortex (mPFC), and the primary motor cortex (M1). In order to compare neuronal representations when rats employ working memory-based strategies, we developed a new task paradigm, a continuous/intermittent choice task, consisting of choice and no-choice trials. While the continuous condition (CC) consisted of only choice trials, in the intermittent condition (IC) a no-choice trial was inserted after each choice trial to disrupt working memory of the previous choice and reward. Behaviors in CC showed high proportions of win-stay and lose-switch choices, which could be regarded as “a noisy WSLS strategy.” Poisson regression of neural spikes revealed encoding, specifically in CC, of the previous action and reward before action choice, and prospective coding of the WSLS action during action execution. A striking finding was that the DLS and M1 in the motor cortico-basal ganglia loop carry substantial working-memory information about previous choices, rewards, and their interactions, in addition to current action coding. Significance Statement: Working memory-based decision strategies, such as win-stay-lose-switch (WSLS), are widely observed in humans and animals. To address the neuronal bases of these strategies, we recorded neuronal activities of rat prefrontal and motor cortico-basal ganglia loops during continuous/intermittent choice tasks. The rat choice strategy was a noisy WSLS in the continuous choice condition, whereas non-WSLS was selected in the intermittent choice condition. In the continuous choice condition, the primary motor cortex and the dorsolateral striatum in the motor loop more strongly conveyed information about previous choices, rewards, and their interactions than the medial prefrontal cortex and the dorsomedial striatum in the prefrontal loop. These results demonstrate that the motor cortico-basal ganglia loop contributes to working memory-based decision strategies.
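For concreteness, WSLS with an occasional random lapse (the "noisy WSLS" described above) can be written in a few lines; the lapse rate and the toy two-armed environment below are hypothetical:

```python
# Hedged sketch of a noisy win-stay-lose-switch (WSLS) choice policy.
import random

def wsls_choice(prev_action, prev_reward, actions=("left", "right"), noise=0.1):
    """Repeat a rewarded action, switch after no reward; lapse with prob. `noise`."""
    if prev_action is None or random.random() < noise:
        return random.choice(actions)                    # first trial, or random lapse
    if prev_reward:                                      # win -> stay
        return prev_action
    return actions[1 - actions.index(prev_action)]       # lose -> switch

# Toy bandit: "left" pays off 70% of the time, "right" 30%
pay = {"left": 0.7, "right": 0.3}
action, reward, wins = None, None, 0
for _ in range(1000):
    action = wsls_choice(action, reward)
    reward = random.random() < pay[action]
    wins += reward
print(f"reward rate: {wins / 1000:.2f}")
```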
39

Lo, Chien-Chi, Migun Shakya, Ryan Connor, Karen Davenport, Mark Flynn, Adán Myers y. Gutiérrez, Bin Hu et al. "EDGE COVID-19: a web platform to generate submission-ready genomes from SARS-CoV-2 sequencing efforts". Bioinformatics, 24 de marzo de 2022. http://dx.doi.org/10.1093/bioinformatics/btac176.

Full text
Abstract
Summary: Genomics has become an essential technology for surveilling emerging infectious disease outbreaks. A range of technologies and strategies for pathogen genome enrichment and sequencing are being used by laboratories worldwide, together with different, and sometimes ad hoc, analytical procedures for generating genome sequences. A fully integrated analytical process from raw sequence to consensus genome determination, suited to outbreaks such as the ongoing COVID-19 pandemic, is critical to provide a solid genomic basis for epidemiological analyses and well-informed decision making. We have developed a web-based platform and integrated bioinformatic workflows that help to provide consistent high-quality analysis of SARS-CoV-2 sequencing data generated with either Illumina or Oxford Nanopore Technologies (ONT) platforms. Using an intuitive web-based interface, this workflow automates data quality control, SARS-CoV-2 reference-based genome variant and consensus calling, and lineage determination, and provides the ability to submit the consensus sequence and necessary metadata to GenBank, GISAID, and INSDC raw data repositories. We tested workflow usability using real-world data and validated the accuracy of variant and lineage analysis using several test datasets, and further performed detailed comparisons with results from the COVID-19 Galaxy Project workflow. Our analyses indicate that EC-19 workflows generate high-quality SARS-CoV-2 genomes. Finally, we share a perspective on patterns and impact observed with Illumina versus ONT technologies on workflow congruence and differences. Availability and implementation: https://edge-covid19.edgebioinformatics.org and https://github.com/LANL-Bioinformatics/EDGE/tree/SARS-CoV2. Supplementary information: Supplementary data are available at Bioinformatics online.
40

Sirois, Caroline, Richard Khoury, Audrey Durand, Pierre-Luc Deziel, Olga Bukhtiyarova, Yohann Chiu, Denis Talbot et al. "Exploring polypharmacy with artificial intelligence: data analysis protocol". BMC Medical Informatics and Decision Making 21, n.º 1 (20 de julio de 2021). http://dx.doi.org/10.1186/s12911-021-01583-x.

Full text
Abstract
Background: Polypharmacy is common among older adults and it represents a public health concern, due to the negative health impacts potentially associated with the use of several medications. However, the large number of medication combinations and sequences of use makes it complicated for traditional statistical methods to predict which therapy is genuinely associated with health outcomes. The project aims to use artificial intelligence (AI) to determine the quality of polypharmacy among older adults with chronic diseases in the province of Québec, Canada. Methods: We will use data from the Quebec Integrated Chronic Disease Surveillance System (QICDSS). QICDSS contains information about prescribed medications in older adults in Quebec collected over 20 years. It also includes diagnostic codes and procedures, and sociodemographic data linked through a unique identification number for each individual. Our research will be structured around three interconnected research axes: AI, Health, and Law&Ethics. The AI research axis will develop algorithms for finding frequent patterns of medication use that correlate with health events, considering data locality and temporality (explainable AI or XAI). The Health research axis will translate these patterns into polypharmacy indicators relevant to public health surveillance and clinicians. The Law&Ethics axis will assess the social acceptability of the algorithms developed using AI tools and the indicators developed by the Health axis, and will ensure that the developed indicators neither discriminate against any population group nor increase the disparities already present in the use of medications. Discussion: The multi-disciplinary research team consists of specialists in AI, health data, statistics, pharmacy, public health, law, and ethics, which will allow investigation of polypharmacy from different points of view and will contribute to a deeper understanding of the clinical, social, and ethical issues surrounding polypharmacy and its surveillance, as well as the use of AI for health record data. The project results will be disseminated to the scientific community, healthcare professionals, and public health decision-makers in peer-reviewed publications, scientific meetings, and reports. The diffusion of the results will ensure the confidentiality of individual data.
41

Boellaard, Thierry N., Margriet C. van Dijk-de Haan, Stijn W. T. P. J. Heijmink, Corinne N. Tillier, Hans Veerman, Laura S. Mertens, Henk G. van der Poel, Pim J. van Leeuwen y Ivo G. Schoots. "Membranous urethral length measurement on preoperative MRI to predict incontinence after radical prostatectomy: a literature review towards a proposal for measurement standardization". European Radiology, 22 de septiembre de 2023. http://dx.doi.org/10.1007/s00330-023-10180-7.

Full text
Abstract
Objectives: To investigate the membranous urethral length (MUL) measurement and its interobserver agreement, and to propose literature-based recommendations to standardize MUL measurement for increasing interobserver agreement. MUL measurements based on prostate MRI scans, for urinary incontinence risk assessment before radical prostatectomy (RP), may influence treatment decision-making in men with localised prostate cancer. Before implementation in clinical practice, MRI-based MUL measurements need standardization to improve observer agreement. Methods: Online libraries were searched up to August 5, 2022, on MUL measurements. Two reviewers performed article selection and critical appraisal. Papers reporting on preoperative MUL measurements and urinary continence correlation were selected. Extracted information included measuring procedures, MRI sequences, population mean/median values, and observer agreement. Results: Fifty papers were included. Studies that specified the MRI sequence used T2-weighted images and used either coronal images (n = 13), sagittal images (n = 18), or both (n = 12) for MUL measurements. ‘Prostatic apex’ was the most common description of the proximal membranous urethra landmark and ‘level/entry of the urethra into the penile bulb’ was the most common description of the distal landmark. The population mean (median) MUL value range was 10.4–17.1 mm (7.3–17.3 mm), suggesting either population or measurement differences. Detailed measurement technique descriptions for reproducibility were lacking. Recommendations on MRI-based MUL measurement were formulated by using anatomical landmarks and detailed descriptions and illustrations. Conclusions: To reduce measurement variability, a literature-based measuring method for the MUL was proposed, supported by several illustrative case studies, in an attempt to standardize MRI-based MUL measurements for appropriate preoperative urinary incontinence risk assessment. Clinical relevance statement: Implementation of MUL measurements into clinical practice for personalized post-prostatectomy continence prediction is hampered by lack of standardization and suboptimal interobserver agreement. Our proposed standardized MUL measurement aims to facilitate standardization and to improve the interobserver agreement. Key Points: • Variable approaches for membranous urethral length measurement are being used, without detailed description and with substantial differences in length of the membranous urethra, hampering standardization. • Limited interobserver agreement for membranous urethral length measurement was observed in several studies, while preoperative incontinence risk assessment necessitates high interobserver agreement. • Literature-based recommendations are proposed to standardize MRI-based membranous urethral length measurement for increasing interobserver agreement and improving preoperative incontinence risk assessment, using anatomical landmarks on sagittal T2-weighted images.
42

Hardisty, Alex, Hannu Saarenmaa, Ana Casino, Mathias Dillen, Karsten Gödderz, Quentin Groom, Helen Hardy et al. "Conceptual design blueprint for the DiSSCo digitization infrastructure - DELIVERABLE D8.1". Research Ideas and Outcomes 6 (18 de mayo de 2020). http://dx.doi.org/10.3897/rio.6.e54280.

Full text
Abstract
DiSSCo, the Distributed System of Scientific Collections, is a pan-European Research Infrastructure (RI) mobilising, unifying bio- and geo-diversity information connected to the specimens held in natural science collections and delivering it to scientific communities and beyond. Bringing together 120 institutions across 21 countries and combining earlier investments in data interoperability practices with technological advancements in digitisation, cloud services and semantic linking, DiSSCo makes the data from natural science collections available as one virtual data cloud, connected with data emerging from new techniques and not already linked to specimens. These new data include DNA barcodes, whole genome sequences, proteomics and metabolomics data, chemical data, trait data, and imaging data (Computer-assisted Tomography (CT), Synchrotron, etc.), to name but a few; and will lead to a wide range of end-user services that begins with finding, accessing, using and improving data. DiSSCo will deliver the diagnostic information required for novel approaches and new services that will transform the landscape of what is possible in ways that are hard to imagine today. With approximately 1.5 billion objects to be digitised, bringing natural science collections to the information age is expected to result in many tens of petabytes of new data over the next decades, used on average by 5,000 – 15,000 unique users every day. This requires new skills, clear policies and robust procedures and new technologies to create, work with and manage large digital datasets over their entire research data lifecycle, including their long-term storage and preservation and open access. Such processes and procedures must match and be derived from the latest thinking in open science and data management, realising the core principles of 'findable, accessible, interoperable and reusable' (FAIR). Synthesised from results of the ICEDIG project ("Innovation and Consolidation for Large Scale Digitisation of Natural Heritage", EU Horizon 2020 grant agreement No. 777483) the DiSSCo Conceptual Design Blueprint covers the organisational arrangements, processes and practices, the architecture, tools and technologies, culture, skills and capacity building and governance and business model proposals for constructing the digitisation infrastructure of DiSSCo. In this context, the digitisation infrastructure of DiSSCo must be interpreted as that infrastructure (machinery, processing, procedures, personnel, organisation) offering Europe-wide capabilities for mass digitisation and digitisation-on-demand, and for the subsequent management (i.e., curation, publication, processing) and use of the resulting data. The blueprint constitutes the essential background needed to continue work to raise the overall maturity of the DiSSCo Programme across multiple dimensions (organisational, technical, scientific, data, financial) to achieve readiness to begin construction. Today, collection digitisation efforts have reached most collection-holding institutions across Europe. Much of the leadership and many of the people involved in digitisation and working with digital collections wish to take steps forward and expand the efforts to benefit further from the already noticeable positive effects. 
The collective results of examining technical, financial, policy and governance aspects show the way forward to operating a large distributed initiative, i.e., the Distributed System of Scientific Collections (DiSSCo) for natural science collections across Europe. Ample examples, opportunities and need for innovation and consolidation for large scale digitisation of natural heritage have been described. The blueprint makes one hundred and four (104) recommendations to be considered by other elements of the DiSSCo Programme of linked projects (i.e., SYNTHESYS+, COST MOBILISE, DiSSCo Prepare, and others to follow) and the DiSSCo Programme leadership as the journey towards organisational, technical, scientific, data and financial readiness continues. Nevertheless, significant obstacles must be overcome as a matter of priority if DiSSCo is to move beyond its Design and Preparatory Phases during 2024. Specifically, these include: Organisational: Strengthen common purpose by adopting a common framework for policy harmonisation and capacity enhancement across broad areas, especially in respect of digitisation strategy and prioritisation, digitisation processes and techniques, data and digital media publication and open access, protection of and access to sensitive data, and administration of access and benefit sharing. Pursue the joint ventures and other relationships necessary to the successful delivery of the DiSSCo mission, especially ventures with GBIF and other international and regional digitisation and data aggregation organisations, in the context of infrastructure policy frameworks, such as EOSC. Proceed with the explicit aim of avoiding divergences of approach in global natural science collections data management and research. Technical: Adopt and enhance the DiSSCo Digital Specimen Architecture and, specifically as a matter of urgency, establish the persistent identifier scheme to be used by DiSSCo and (ideally) other comparable regional initiatives. Establish the (software) engineering development and (infrastructure) operations team and direction essential to the delivery of services and functionalities expected from DiSSCo, such that earnest engineering can lead to an early start of DiSSCo operations.
Scientific: Establish a common digital research agenda leveraging Digital (extended) Specimens as anchoring points for all specimen-associated and -derived information, demonstrating to research institutions and policy/decision-makers the new possibilities, opportunities and value of participating in the DiSSCo research infrastructure. Data: Adopt the FAIR Digital Object Framework and the International Image Interoperability Framework as the low-entropy means to achieving uniform access to rich data (image and non-image) that is findable, accessible, interoperable and reusable (FAIR). Develop and promote best practice approaches towards achieving the best digitisation results in terms of quality (best, according to agreed minimum information and other specifications), time (highest throughput, fast), and cost (lowest, minimal per specimen). Financial: Broaden the attractiveness (i.e., improve the bankability) of DiSSCo as an infrastructure to invest in. Plan for finding ways to bridge the funding gap to avoid disruptions in the critical funding path that risk interrupting core operations, especially when the gap opens between the end of preparations and the beginning of implementation due to unsolved political difficulties. Strategically, it is vital to balance the multiple factors addressed by the blueprint against one another to achieve the desired goals of the DiSSCo programme. Decisions cannot be taken on one aspect alone without considering other aspects, and here the various governance structures of DiSSCo (General Assembly, advisory boards, and stakeholder forums) play a critical role over the coming years.
43

Horrigan, Matthew. "A Flattering Robopocalypse". M/C Journal 23, n.º 6 (28 de noviembre de 2020). http://dx.doi.org/10.5204/mcj.2726.

Full text
Abstract
RACHAEL. It seems you feel our work is not a benefit to the public.
DECKARD. Replicants are like any other machine. They're either a benefit or a hazard. If they're a benefit it's not my problem.
RACHAEL. May I ask you a personal question?
DECKARD. Yes.
RACHAEL. Have you ever retired a human by mistake? (Scott 17:30)
CAPTCHAs (henceforth "captchas") are commonplace on today's Internet. Their purpose seems clear: block malicious software, allow human users to pass. But as much as they exclude spambots, captchas often exclude humans with visual and other disabilities (Dzieza; W3C Working Group). Worse yet, more and more advanced captcha-breaking technology has resulted in more and more challenging captchas, raising the barrier between online services and those who would access them. In the words of inclusive design advocate Robin Christopherson, "CAPTCHAs are evil". In this essay I describe how the captcha industry implements a posthuman process that speculative fiction has gestured toward but not grasped. The hostile posthumanity of captcha is not just a technical problem, nor just a problem of usability or access. Rather, captchas convey a design philosophy that asks humans to prove themselves by performing well at disembodied games. This philosophy has its roots in the Turing Test itself, whose terms guide speculation away from the real problems that today's authentication systems present. Drawing the concept of "procedurality" from game studies, I argue that, despite a design goal of separating machines and humans to the benefit of the latter, captchas actually and ironically produce an arms race in which humans have a systematic and increasing disadvantage. This arms race results from the Turing Test's equivocation between human and machine bodies, an assumption whose influence I identify in popular film, science fiction literature, and captcha design discourse.
The Captcha Industry and Its Side-Effects
Exclusion is an essential function of every cybersecurity system. From denial-of-service attacks to data theft, toxic automated entities constantly seek admission to services they would damage. To remain functional and accessible, Websites need security systems to keep out "abusive agents" (Shet). In cybersecurity, the term "user authentication" refers to the process of distinguishing between abusive agents and welcome users (Jeng et al.). Of the many available authentication techniques, CAPTCHA, "Completely Automated Public Turing test[s] to tell Computers and Humans Apart" (Von Ahn et al. 1465), is one of the most iconic. Although some captchas display a simple checkbox beside a disclaimer to the effect that "I am not a robot" (Shet), these frequently give way to more difficult alternatives: perception tests (fig. 1). Test captchas may show sequences of distorted letters, which a user is supposed to recognise and then type in (Godfrey). Others effectively digitize a game of "I Spy": an image appears, with an instruction to select the parts of it that show a specific type of object (Zhu et al.). A newer type of captcha involves icons rotated upside-down or sideways, the task being to right them (Gossweiler et al.). These latter developments show the influence of gamification (Kani and Nishigaki; Kumar et al.), the design trend where game-like elements figure in serious tasks.
Fig. 1: A series of captchas followed by multifactor authentication as a "quick security check" during the author's suspicious attempt to access LinkedIn over a Virtual Private Network.
Gamified captchas, in using tests of ability to tell humans from computers, invite three problems, of which only the first has received focussed critical attention. I discuss each briefly below, and at greater length in subsequent sections. First, as many commentators have pointed out (W3C Working Group), captchas can accidentally categorise real humans as nonhumans—a technical problem that becomes more likely as captcha-breaking technologies improve (e.g. Tam et al.; Brown et al.). Indeed, the design and breaking of captchas has become an almost self-sustaining subfield in computer science, as researchers review extant captchas, publish methods for breaking them, and publish further captcha designs (e.g. Weng et al.). Such research fuels an industry of captcha-solving services (fig. 2), of which some use automated techniques, and some are "human-powered", employing groups of humans to complete large numbers of captchas, thus clearing the way for automated incursions (Motoyama et al. 2). Captchas now face the quixotic task of using ability tests to distinguish legitimate users from abusers with similar abilities.
Fig. 2: Captcha production and captcha breaking: a feedback loop.
Second, gamified captchas import the feelings of games. When they defeat a real human, the human seems not to have encountered the failure state of an automated procedure, but rather to have lost, or given up on, a game. The same frame of "gameful"-ness (McGonigal, under "Happiness Hacking") or "gameful work" (under "The Rise of the Happiness Engineers"), supposed to flatter users with a feeling of reward or satisfaction when they complete a challenge, has a different effect in the event of defeat. Gamefulness shifts the fault from procedure to human, suggesting, for the latter, the shameful status of loser. Third, like games, gamified captchas promote a particular strain of logic. Just as other forms of media can be powerful venues for purveying stereotypes, so are gamified captchas, in this case conveying the notion that ability is a legitimate means, not only of apportioning privilege, but of humanising and dehumanising. Humanity thus appears as a status earned, and disability appears not as a stigma, nor an occurrence, but an essence. The latter two problems emerge because the captcha reveals, propagates and naturalises an ideology through mechanised procedures. Below I invoke the concept of "procedural rhetoric" to critique the disembodied notion of humanity that underlies both the original Turing Test and the "Completely Automated Public Turing test." Both tests, I argue, ultimately play to the disadvantage of their human participants.
Rhetorical Games, Procedural Rhetoric
When videogame studies emerged as an academic field in the early 2000s, one of its first tasks was to legitimise games relative to other types of artefact, especially literary texts (Eskelinen; Aarseth). Scholars sought a framework for discussing how video games, like other more venerable media, can express ideas (Weise). Janet Murray and Ian Bogost looked to the notion of procedure, devising the concepts of "procedurality" (Bogost 3), "procedural authorship" (Murray 171), and "procedural rhetoric" (Bogost 1). From a proceduralist perspective, a videogame is both an object and a medium for inscribing processes.
Second, gamified captchas import the feelings of games. When they defeat a real human, the human seems not to have encountered the failure state of an automated procedure, but rather to have lost, or given up on, a game. The same frame of "gameful"-ness (McGonigal, under "Happiness Hacking") or "gameful work" (under "The Rise of the Happiness Engineers"), supposed to flatter users with a feeling of reward or satisfaction when they complete a challenge, has a different effect in the event of defeat. Gamefulness shifts the fault from procedure to human, suggesting, for the latter, the shameful status of loser.

Third, like games, gamified captchas promote a particular strain of logic. Just as other forms of media can be powerful venues for purveying stereotypes, so too are gamified captchas, in this case conveying the notion that ability is a legitimate means, not only of apportioning privilege, but of humanising and dehumanising. Humanity thus appears as a status earned, and disability appears not as a stigma, nor an occurrence, but an essence.

The latter two problems emerge because the captcha reveals, propagates and naturalises an ideology through mechanised procedures. Below I invoke the concept of "procedural rhetoric" to critique the disembodied notion of humanity that underlies both the original Turing Test and the "Completely Automated Public Turing test." Both tests, I argue, ultimately play to the disadvantage of their human participants.

Rhetorical Games, Procedural Rhetoric

When videogame studies emerged as an academic field in the early 2000s, one of its first tasks was to legitimise games relative to other types of artefact, especially literary texts (Eskelinen; Aarseth). Scholars sought a framework for discussing how video games, like other more venerable media, can express ideas (Weise). Janet Murray and Ian Bogost looked to the notion of procedure, devising the concepts of "procedurality" (Bogost 3), "procedural authorship" (Murray 171), and "procedural rhetoric" (Bogost 1). From a proceduralist perspective, a videogame is both an object and a medium for inscribing processes. Those processes have two basic types: procedures the game's developers have authored, which script the behaviour of the game as a computer program; and procedures human players respond with, the "operational logic" of gameplay (Bogost 13).

Procedurality's two types of procedure, the computerised and the human, have a kind of call-and-response relationship, where the behaviour of the machine calls upon players to respond with their own behaviour patterns. Games thus train their players. Through the training that is play, players acquire habits they bring to other contexts, giving videogames the power not only to express ideas but to "disrupt and change fundamental attitudes and beliefs about the world, leading to potentially significant long-term social change" (Bogost ix). That social change can be positive (McGonigal), or it can involve "dark patterns", cases where game procedures provoke and exploit harmful behaviours (Zagal et al.). For example, embedded in many game paradigms is the procedural rhetoric of "toxic meritocracy" (Paul 66), where players earn rewards, status and personal improvement by overcoming challenges, and, especially, excelling where others fail. While meritocracy may seem logical within a strictly competitive arena, its effect in a broader cultural context is to legitimise privileges as the spoils of victory, and maltreatment as the just result of defeat.

As game design has influenced other fields, so too has procedurality's applicability expanded. Gamification, "the use of game design elements in non-game contexts" (Deterding et al. 9), is a popular trend in which designers seek to imbue diverse tasks with some of the enjoyment of playing a game (10). Gamification discourse has drawn heavily upon Mihaly Csikszentmihalyi's "positive psychology" (Seligman and Csikszentmihalyi), and especially the speculative psychology of flow (Csikszentmihalyi 51), which promise enormously broad benefits for individuals acting in the "flow state" that challenging play supposedly promotes (75). Gamification has become a celebrated cause, advocated by a group of scholars and designers Sebastian Deterding calls the "Californian league of gamification evangelists" (120), before becoming an object of critical scrutiny (Fuchs et al.). Where gamification goes, it brings its dark patterns with it. In gamified user authentication (Kroeze and Olivier), and particularly in gamified captcha, there occurs an intersection of deceptively difficult games, real-world stakes, and users whose differences often go ignored.

The Disembodied Arms Race

In captcha design research, the concept of disability occurs under the broader umbrella of usability. Usability studies emphasise that some pieces of technology are easier to access than others (Yan and El Ahmad). Disability studies, in contrast, emphasises that different users have different capacities to overcome access barriers. Ability is contextual, an intersection of usability and disability, use case and user (Reynolds 443). When used as an index of humanness, ability yields illusive results.

In Posthuman Knowledge, Rosi Braidotti begins her conceptual enquiry into the posthuman condition with a contemplation of captcha, asking what it means to tick that checkbox claiming that "I am not a robot" (8), and noting the baffling multiplicity of possible answers.
From a practical angle, Junya Kani and Masakatsu Nishigaki write candidly about the problem of distinguishing robot from human: "no matter how advanced malicious automated programs are, a CAPTCHA that will not pass automated programs is required. Hence, we have to find another human cognitive processing capability to tackle this challenge" (40). Kani and Nishigaki try out various human cognitive processing capabilities for the task. Narrative comprehension and humour become candidates: might a captcha ascribe humanity based on human users' ability to determine the correct order of scenes in a film (43)? What about panels in a cartoon (40)? As they seek to assess the soft skills of machines, Kani and Nishigaki set up a drama similar to that of Philip K. Dick's Do Androids Dream of Electric Sheep?

Do Androids Dream of Electric Sheep?, and its film adaptation, Blade Runner (Scott), describe a spacefaring society populated by both humans and androids. Androids have lesser legal privileges than humans, and in particular face execution—euphemistically called "retirement"—for trespassing on planet Earth (Dick 60). Blade Runner gave these androids their more famous name: "replicant". Replicants mostly resemble humans in thought and action, but are reputed to lack the capacity for empathy, so human police, seeking a cognitive processing capability unique to humans, test for empathy to test for humanness (30). But as with captchas, Blade Runner's testing procedure depends upon an automated device whose effectiveness is not certain, prompting the haunting question: "have you ever retired a human by mistake?" (Scott 17:50).

Blade Runner's empathy test is part of a long philosophical discourse about the distinction between human and machine (e.g. Putnam; Searle). At the heart of the debate lies Alan Turing's "Turing Test", which a machine hypothetically passes when it can pass itself off as a human conversationalist in an exchange of written text. Turing's motivation for the test runs as follows: there may be no absolute way of defining what makes a human mind, so the best we can do is assess a computer's ability to imitate one (Turing 433). The aporia, however—how can we determine what makes a human mind?—is the result of an unfair question. Turing's test, dealing only with information expressed in strings of text, purposely disembodies both humans and machines. The Blade Runner universe similarly evens the playing field: replicants look, feel and act like humans to such an extent that distinguishing between the two becomes, again, the subject of a cognition test.
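Turing's protocol can be rendered as a type signature, which makes the disembodiment visible at a glance: everything the judge ever receives is a string. This is an illustrative sketch of the imitation game in Python, not Turing's own formulation.

from typing import Callable

# A respondent is anything that maps a typed question to a typed reply:
# a human at a teletype or a program, indistinguishable at the type level.
Respondent = Callable[[str], str]

def imitation_game(questions: list[str],
                   respondent: Respondent,
                   verdict: Callable[[list[tuple[str, str]]], bool]) -> bool:
    """The judge sees only a transcript of question/answer pairs; no body,
    voice, or face ever enters the exchange."""
    transcript = [(question, respondent(question)) for question in questions]
    return verdict(transcript)

Nothing in this interface can ask for flesh; any entity that handles strings well enough passes.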
The Turing Test, obsessed with information processing and steeped in mind-body dualism, assesses humanness using criteria that automated users can master relatively easily. In contrast, in everyday life, I use a suite of much more intuitive sensory tests to distinguish between my housemate and my laptop. My intuitions capture what the Turing Test masks: a human is a fleshy entity, possessed of the numerous trappings and capacities of a human body.

The result of the automated Turing Test's focus on cognition is an arms race that places human users at an increasing disadvantage. Loss, in such a race, manifests not only as exclusion by and from computer services, but as a redefinition of proper usership, the proper behaviour of the authentic, human, user. Thus the Turing Test implicitly provides for a scenario where a machine becomes able to super-imitate humanness: to be perceived as human more often than a real human would be. In such an outcome, it would be the human conversationalist who would begin to fail the Turing Test; to fail to pass themself off according to new criteria for authenticity.

This scenario is possible because, through procedural rhetoric, machines shift human perspectives: about what is and is not responsible behaviour; about what humans should and should not feel when confronted with a challenge; about who does and does not deserve access; and, fundamentally, about what does and does not signify authentic usership. In captcha, as in Blade Runner, it is ultimately a machine that adjudicates between human and machine cognition. As users we rely upon this machine to serve our interests, rather than pursue some emergent automated interest, some by-product of the feedback loop that results from the ideologies of human researchers both producing and being produced by mechanised procedures. In the case of captcha, that faith is misplaced.

The Feeling of Robopocalypse

A rich repertory of fiction has speculated upon what novelist Daniel Wilson calls the "Robopocalypse", the scenario where machines overthrow humankind. Most versions of the story play out as a slave-owner's nightmare, featuring formerly servile entities (which happen to be machines) violently revolting and destroying the civilisation of their masters. Blade Runner's rogue replicants, for example, are effectively fugitive slaves (Dihal 196). Popular narratives of robopocalypse, despite showing their antagonists as lethal robots, are fundamentally human stories with robots playing some of the parts.

In contrast, the exclusion a captcha presents when it defeats a human is not metaphorical or emancipatory. There, in that moment, is a mechanised entity defeating a human. The defeat takes place within an authoritative frame that hides its aggression. For a human user, to be defeated by a captcha is to fail to meet an apparently common standard, within the framework of a common procedure. This is a robopocalypse of baffling systems rather than anthropomorphic soldiers.

Likewise, non-human software clients pose threats that humanoid replicants do not. In particular, software clients replicate much faster than physical bodies. The sheer sudden scale of a denial-of-service attack makes Philip K. Dick's vision of android resistance seem quaint. The task of excluding unauthorised software, unlike the impulse to exclude replicants, is more a practical necessity than an exercise in colonialism. Nevertheless, dystopia finds its way into the captcha process through the peril inherent in the test, whenever humans are told apart from authentic users. This is the encroachment of the hostile posthuman, naturalised by us before it denaturalises us.

The hostile posthuman sometimes manifests as a drone strike, Terminator-esque (Cameron), a dehumanised decision to kill (Asaro). But it is also a process of gradual exclusion, detectable from moment to moment as a feeling of disdain or impatience for the irresponsibility, incompetence, or simply unusualness of a human who struggles to keep up with a rising standard. "We are in this together", Braidotti writes, "between the algorithmic devil and the acidified deep blue sea" (9). But we are also in this separately, divided along lines of ability. Captcha's danger, as a broken procedure, hides in plain sight, because it lashes out at some only while continuing to flatter others with a game that they can still win.
Conclusion

Online security systems may always have to define some users as legitimate and others as illegitimate. Is there a future where they do so on the basis of behaviour rather than identity or essence? Might some future system accord each user, human or machine, the same authentic status, and provide all with an initial benefit of the doubt? In the short term, such a system would seem grossly impractical. The type of user that most needs to be excluded is the disembodied type, the type that can generate orders of magnitude more demands than a human, that can proliferate suddenly and in immense number because it does not lag behind the slow processes of human bodies. This type of user exists in software alone.

Rich in irony, then, is the captcha paradigm, which depends on the disabilities of the threats it confronts. We dread malicious software not for its disabilities—which are momentary and all too human—but for its abilities. Attenuating the threat presented by those abilities requires inverting a habit that meritocracy trains and overtrains: specifically, we have here a case where the plight of the human user calls for negative action toward ability rather than disability.

References

Aarseth, Espen. "Computer Game Studies, Year One." Game Studies 1.1 (2001): 1–15.
Asaro, Peter. "On Banning Autonomous Weapon Systems: Human Rights, Automation, and the Dehumanization of Lethal Decision-Making." International Review of the Red Cross 94.886 (2012): 687–709.
Blade Runner. Dir. Ridley Scott. Warner Bros, 1982.
Bogost, Ian. Persuasive Games: The Expressive Power of Videogames. Cambridge, MA: MIT Press, 2007.
Braidotti, Rosi. Posthuman Knowledge. Cambridge: Polity Press, 2019.
Brown, Samuel S., et al. "I Am 'Totally' Human: Bypassing the Recaptcha." 13th International Conference on Signal-Image Technology & Internet-Based Systems (SITIS), 2017.
Christopherson, Robin. "AI Is Making CAPTCHA Increasingly Cruel for Disabled Users." AbilityNet 2019. 17 Sep. 2020 <https://abilitynet.org.uk/news-blogs/ai-making-captcha-increasingly-cruel-disabled-users>.
Csikszentmihalyi, Mihaly. Flow: The Psychology of Optimal Experience. New York: Harper & Row, 1990.
Deterding, Sebastian. "Eudaimonic Design, Or: Six Invitations to Rethink Gamification." Rethinking Gamification. Eds. Mathias Fuchs et al. Lüneburg: Meson Press, 2014.
Deterding, Sebastian, et al. "From Game Design Elements to Gamefulness: Defining Gamification." Proceedings of the 15th International Academic MindTrek Conference: Envisioning Future Media Environments. ACM, 2011.
Dick, Philip K. Do Androids Dream of Electric Sheep? 1968. New York: Del Rey, 1996.
Dihal, Kanta. "Artificial Intelligence, Slavery, and Revolt." AI Narratives: A History of Imaginative Thinking about Intelligent Machines. Eds. Stephen Cave, Kanta Dihal, and Sarah Dillon. Oxford: Oxford UP, 2020. 189–212.
Dzieza, Josh. "Why Captchas Have Gotten So Difficult." The Verge 2019. 17 Sep. 2020 <https://www.theverge.com/2019/2/1/18205610/google-captcha-ai-robot-human-difficult-artificial-intelligence>.
Eskelinen, Markku. "Towards Computer Game Studies." Digital Creativity 12.3 (2001): 175–83.
Fuchs, Mathias, et al., eds. Rethinking Gamification. Lüneburg: Meson Press, 2014.
Godfrey, Philip Brighten. "Text-Based CAPTCHA Algorithms." First Workshop on Human Interactive Proofs, 15 Dec. 2001. 14 Nov. 2020 <http://www.aladdin.cs.cmu.edu/hips/events/abs/godfreyb_abstract.pdf>.
Gossweiler, Rich, et al. "What's Up CAPTCHA? A CAPTCHA Based on Image Orientation." Proceedings of the 18th International Conference on World Wide Web. WWW, 2009.
Jeng, Albert B., et al. "A Study of CAPTCHA and Its Application to User Authentication." International Conference on Computational Collective Intelligence. Springer, 2010.
Kani, Junya, and Masakatsu Nishigaki. "Gamified Captcha." International Conference on Human Aspects of Information Security, Privacy, and Trust. Springer, 2013.
Kroeze, Christien, and Martin S. Olivier. "Gamifying Authentication." 2012 Information Security for South Africa. IEEE, 2012.
Kumar, S. Ashok, et al. "Gamification of Internet Security by Next Generation Captchas." 2017 International Conference on Computer Communication and Informatics (ICCCI). IEEE, 2017.
McGonigal, Jane. Reality Is Broken: Why Games Make Us Better and How They Can Change the World. Penguin, 2011.
Motoyama, Marti, et al. "Re: Captchas – Understanding CAPTCHA-Solving Services in an Economic Context." USENIX Security Symposium. 2010.
Murray, Janet. Hamlet on the Holodeck: The Future of Narrative in Cyberspace. New York: The Free Press, 1997.
Paul, Christopher A. The Toxic Meritocracy of Video Games: Why Gaming Culture Is the Worst. Minneapolis: University of Minnesota Press, 2018.
Putnam, Hilary. "Robots: Machines or Artificially Created Life?" The Journal of Philosophy 61.21 (1964): 668–91.
Reynolds, Joel Michael. "The Meaning of Ability and Disability." The Journal of Speculative Philosophy 33.3 (2019): 434–47.
Searle, John. "Minds, Brains, and Programs." Behavioral and Brain Sciences 3.3 (1980): 417–24.
Seligman, Martin, and Mihaly Csikszentmihalyi. "Positive Psychology: An Introduction." Flow and the Foundations of Positive Psychology. 2000. Springer, 2014. 279–98.
Shet, Vinay. "Are You a Robot? Introducing No Captcha Recaptcha." Google Security Blog, 3 Dec. 2014.
Tam, Jennifer, et al. "Breaking Audio Captchas." Proceedings of the 21st International Conference on Neural Information Processing Systems. ACM, 2008. 1625–32.
The Terminator. Dir. James Cameron. Orion, 1984.
Turing, Alan. "Computing Machinery and Intelligence." Mind 59.236 (1950): 433–60.
Von Ahn, Luis, et al. "Recaptcha: Human-Based Character Recognition via Web Security Measures." Science 321.5895 (2008): 1465–68.
W3C Working Group. "Inaccessibility of CAPTCHA: Alternatives to Visual Turing Tests on the Web." W3C 2019. 17 Sep. 2020 <https://www.w3.org/TR/turingtest/>.
Weise, Matthew. "How Videogames Express Ideas." DiGRA Conference. 2003.
Weng, Haiqin, et al. "Towards Understanding the Security of Modern Image Captchas and Underground Captcha-Solving Services." Big Data Mining and Analytics 2.2 (2019): 118–44.
Wilson, Daniel H. Robopocalypse. New York: Doubleday, 2011.
Yan, Jeff, and Ahmad Salah El Ahmad. "Usability of Captchas or Usability Issues in CAPTCHA Design." Proceedings of the 4th Symposium on Usable Privacy and Security. 2008.
Zagal, José P., Staffan Björk, and Chris Lewis. "Dark Patterns in the Design of Games." 8th International Conference on the Foundations of Digital Games. 2013. 25 Aug. 2020 <http://soda.swedish-ict.se/5552/1/DarkPatterns.1.1.6_cameraready.pdf>.
Zhu, Bin B., et al. "Attacks and Design of Image Recognition Captchas." Proceedings of the 17th ACM Conference on Computer and Communications Security. 2010.