Academic literature on the topic 'Chained computer codes'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Chained computer codes.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Chained computer codes"

1

Yang, Lei, Yixuan Xie, Jinhong Yuan, Xingqing Cheng, and Lei Wan. "Chained LDPC Codes for Future Communication Systems." IEEE Communications Letters 22, no. 5 (May 2018): 898–901. http://dx.doi.org/10.1109/lcomm.2018.2808958.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Gil-González, Julián, Andrés Valencia-Duque, Andrés Álvarez-Meza, Álvaro Orozco-Gutiérrez, and Andrea García-Moreno. "Regularized Chained Deep Neural Network Classifier for Multiple Annotators." Applied Sciences 11, no. 12 (June 10, 2021): 5409. http://dx.doi.org/10.3390/app11125409.

Full text
Abstract:
The increasing popularity of crowdsourcing platforms, e.g., Amazon Mechanical Turk, changes how datasets for supervised learning are built. In these cases, instead of having datasets labeled by one source (which is supposed to be an expert who provided the absolute gold standard), databases holding multiple annotators are provided. However, most state-of-the-art methods devoted to learning from multiple experts assume that the labeler’s behavior is homogeneous across the input feature space. Besides, independence constraints are imposed on annotators’ outputs. This paper presents a regularized chained deep neural network to deal with classification tasks from multiple annotators. The introduced method, termed RCDNN, jointly predicts the ground truth label and the annotators’ performance from input space samples. In turn, RCDNN codes interdependencies among the experts by analyzing the layers’ weights and includes l1, l2, and Monte-Carlo Dropout-based regularizers to deal with the over-fitting issue in deep learning models. Obtained results (using both simulated and real-world annotators) demonstrate that RCDNN can deal with multi-labeler scenarios for classification tasks, defeating state-of-the-art techniques.
APA, Harvard, Vancouver, ISO, and other styles
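The abstract above describes learning a classifier and per-annotator reliabilities jointly. As a minimal, hypothetical illustration of the multiple-annotators setting (not the authors' RCDNN, which is a deep network), the sketch below aggregates simulated crowd labels by majority vote and then scores each annotator's agreement with the consensus; all annotator accuracies are made-up values:

```python
import random
from collections import Counter

def majority_vote(labels_per_item):
    """Aggregate multi-annotator labels by simple majority vote."""
    return [Counter(labels).most_common(1)[0][0] for labels in labels_per_item]

def annotator_reliability(annotations, consensus):
    """Fraction of items on which each annotator agrees with the consensus."""
    n_annotators = len(annotations[0])
    agree = [0] * n_annotators
    for item_labels, truth in zip(annotations, consensus):
        for a, label in enumerate(item_labels):
            agree[a] += (label == truth)
    return [agree[a] / len(annotations) for a in range(n_annotators)]

# Simulated binary-labeling task: annotator 0 is accurate,
# annotator 1 is mediocre, annotator 2 labels at random.
random.seed(0)
truth = [random.randint(0, 1) for _ in range(200)]
annotations = [
    [t if random.random() < 0.95 else 1 - t,   # reliable annotator
     t if random.random() < 0.75 else 1 - t,   # mediocre annotator
     random.randint(0, 1)]                     # random annotator
    for t in truth
]
consensus = majority_vote(annotations)
rel = annotator_reliability(annotations, consensus)
```

Methods like the paper's go further by making these reliabilities depend on the input features and by modeling annotator interdependence, but the estimated `rel` values already recover the simulated ordering of annotator quality.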
3

Tu, Chia-Heng, Qihui Sun, and Hsiao-Hsuan Chang. "RAP: A Software Framework of Developing Convolutional Neural Networks for Resource-constrained Devices Using Environmental Monitoring as a Case Study." ACM Transactions on Cyber-Physical Systems 5, no. 4 (October 31, 2021): 1–28. http://dx.doi.org/10.1145/3472612.

Full text
Abstract:
Monitoring environmental conditions is an important application of cyber-physical systems. Typically, the monitoring is to perceive surrounding environments with battery-powered, tiny devices deployed in the field. While deep learning-based methods, especially the convolutional neural networks (CNNs), are promising approaches to enriching the functionalities offered by the tiny devices, they demand more computation and memory resources, which makes these methods difficult to adopt on such devices. In this article, we develop a software framework, RAP, that permits the construction of the CNN designs by aggregating the existing, lightweight CNN layers, which are able to fit in the limited memory (e.g., several KBs of SRAM) on the resource-constrained devices satisfying application-specific timing constraints. RAP leverages the Python-based neural network framework Chainer to build the CNNs by mounting the C/C++ implementations of the lightweight layers, trains the built CNN models as the ordinary model-training procedure in Chainer, and generates the C version codes of the trained models. The generated programs are compiled into target machine executables for the on-device inferences. With the vigorous development of lightweight CNNs, such as binarized neural networks with binary weights and activations, RAP facilitates the model building process for the resource-constrained devices by allowing them to alter, debug, and evaluate the CNN designs over the C/C++ implementation of the lightweight CNN layers. We have prototyped the RAP framework and built two environmental monitoring applications for protecting endangered species using image- and acoustic-based monitoring methods. Our results show that the built model consumes less than 0.5 KB of SRAM for buffering the runtime data required by the model inference while achieving up to 93% of accuracy for the acoustic monitoring with less than one second of inference time on the TI 16-bit microcontroller platform.
APA, Harvard, Vancouver, ISO, and other styles
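Generated inference code for KB-scale microcontrollers, as described in the abstract above, typically replaces floating-point layers with fixed-point multiply-accumulate loops. This is not RAP's generated C, just a plain-Python sketch of the kind of arithmetic such a layer performs: an integer 1-D convolution with a wide accumulator, a power-of-two downscale, and saturation to the int8 range:

```python
def conv1d_int8(signal, kernel, shift):
    """1-D convolution in 8-bit fixed point: multiply-accumulate in a wider
    accumulator, scale down by 2**shift, then saturate to [-128, 127]."""
    out = []
    for i in range(len(signal) - len(kernel) + 1):
        # accumulate in full integer precision (the "int32 accumulator")
        acc = sum(signal[i + j] * kernel[j] for j in range(len(kernel)))
        scaled = acc >> shift          # requantize by a power of two
        out.append(max(-128, min(127, scaled)))  # saturate to int8
    return out

# Smoothing kernel [1, 2, 1] with a 2**2 scale-down.
y = conv1d_int8([10, 20, 30, 40, 50], [1, 2, 1], shift=2)  # -> [20, 30, 40]
```

Keeping weights and activations in 8 bits is what lets a model's runtime buffers fit in a fraction of a KB of SRAM, at the cost of the quantization error introduced by the shift-and-saturate step.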
4

Horvath, Kaleb, Mohamed Riduan Abid, Thomas Merino, Ryan Zimmerman, Yesem Peker, and Shamim Khan. "Cloud-Based Infrastructure and DevOps for Energy Fault Detection in Smart Buildings." Computers 13, no. 1 (January 16, 2024): 23. http://dx.doi.org/10.3390/computers13010023.

Full text
Abstract:
We have designed a real-world smart building energy fault detection (SBFD) system on a cloud-based Databricks workspace, a high-performance computing (HPC) environment for big-data-intensive applications powered by Apache Spark. By avoiding a Smart Building Diagnostics as a Service approach and keeping a tightly centralized design, the rapid development and deployment of the cloud-based SBFD system was achieved within one calendar year. Thanks to Databricks’ built-in scheduling interface, a continuous pipeline of real-time ingestion, integration, cleaning, and analytics workflows capable of energy consumption prediction and anomaly detection was implemented and deployed in the cloud. The system currently provides fault detection in the form of predictions and anomaly detection for 96 buildings on an active military installation. The system’s various jobs all converge within 14 min on average. It facilitates the seamless interaction between our workspace and a cloud data lake storage provided for secure and automated initial ingestion of raw data provided by a third party via the Secure File Transfer Protocol (SFTP) and BLOB (Binary Large Objects) file system secure protocol drivers. With a powerful Python binding to the Apache Spark distributed computing framework, PySpark, these actions were coded into collaborative notebooks and chained into the aforementioned pipeline. The pipeline was successfully managed and configured throughout the lifetime of the project and is continuing to meet our needs in deployment. In this paper, we outline the general architecture and how it differs from previous smart building diagnostics initiatives, present details surrounding the underlying technology stack of our data pipeline, and enumerate some of the necessary configuration steps required to maintain and develop this big data analytics application in the cloud.
APA, Harvard, Vancouver, ISO, and other styles
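The abstract above chains ingestion, cleaning, and analytics notebooks into one pipeline. The sketch below illustrates that chaining pattern in plain Python (no PySpark or Databricks assumed); the building names, CSV layout, and z-score threshold are all illustrative, not taken from the paper:

```python
from statistics import mean, stdev

def ingest(raw_rows):
    """Parse raw 'building,kwh' strings into (building, kWh) records."""
    return [(b, float(v)) for b, v in (row.split(",") for row in raw_rows)]

def clean(records):
    """Drop records with non-physical (negative) consumption."""
    return [(b, v) for b, v in records if v >= 0]

def detect_anomalies(records, z_threshold=2.0):
    """Flag buildings whose consumption deviates from the fleet mean
    by more than z_threshold sample standard deviations."""
    values = [v for _, v in records]
    mu, sigma = mean(values), stdev(values)
    return [b for b, v in records if abs(v - mu) / sigma > z_threshold]

def pipeline(raw_rows):
    """Chain the stages, mirroring an ingest -> clean -> analytics workflow."""
    return detect_anomalies(clean(ingest(raw_rows)))

raw = ["b1,100", "b2,101", "b3,99", "b4,100", "b5,102", "b6,98",
       "b7,100", "b8,101", "b9,99", "b10,100", "b11,500", "b12,-5"]
flagged = pipeline(raw)  # b12 is dropped by clean(); b11 is flagged
```

In a Spark setting each stage would be a notebook or job operating on DataFrames and scheduled by the platform, but the composition of small, single-purpose stages is the same idea.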
5

Quinan, C. L., and Hannah Pezzack. "A Biometric Logic of Revelation: Zach Blas’s SANCTUM (2018)." M/C Journal 23, no. 4 (August 12, 2020). http://dx.doi.org/10.5204/mcj.1664.

Full text
Abstract:
Ubiquitous in airports, border checkpoints, and other securitised spaces throughout the world, full-body imaging scanners claim to read bodies in order to identify if they pose security threats. Millimetre-wave body imaging machines—the most common type of body scanner—display to the operating security agent a screen with a generic body outline. If an anomaly is found or if an individual does not align with the machine’s understanding of an “average” body, a small box is highlighted and placed around the “problem” area, prompting further inspection in the form of pat-downs or questioning. In this complex security regime governed by such biometric, body-based technologies, it could be argued that nonalignment with bodily normativity as well as an attendant failure to reveal oneself—to become “transparent” (Hall 295)—marks a body as dangerous. As these algorithmic technologies become more pervasive, so too does the imperative to critically examine their purported neutrality and operative logic of revelation and readability.

Biometric technologies are marketed as excavators of truth, with their optic potency claiming to demask masquerading bodies. Failure and bias are, however, an inescapable aspect of such technologies that work with narrow parameters of human morphology. Indeed, surveillance technologies have been taken to task for their inherent racial and gender biases (Browne; Pugliese). Facial recognition has, for example, been critiqued for its inability to read darker skin tones (Buolamwini and Gebru), while body scanners have been shown to target transgender bodies (Keyes; Magnet and Rodgers; Quinan).
Critical security studies scholar Shoshana Magnet argues that error is endemic to the technological functioning of biometrics, particularly since they operate according to the faulty notion that bodies are “stable” and unchanging repositories of information that can be reified into code (Magnet 2).

Although body scanners are presented as being able to reliably expose concealed weapons, they are riddled with incompetencies that misidentify and over-select certain demographics as suspect. Full-body scanners have, for example, caused considerable difficulties for transgender travellers, breast cancer patients, and people who use prosthetics, such as artificial limbs, colonoscopy bags, binders, or prosthetic genitalia (Clarkson; Quinan; Spalding). While it is not in the scope of this article to detail the workings of body imaging technologies and their inconsistencies, a growing body of scholarship has substantiated the claim that these machines unfairly impact those identifying as transgender and non-binary (see, e.g., Beauchamp; Currah and Mulqueen; Magnet and Rodgers; Sjoberg). Moreover, they are constructed according to a logic of binary gender: before each person enters the scanner, transportation security officers must make a quick assessment of their gender/sex by pressing either a blue (corresponding to “male”) or pink (corresponding to “female”) button. In this sense, biometric, computerised security systems control and monitor the boundaries between male and female.

The ability to “reveal” oneself is henceforth predicated on having a body free of “abnormalities” and fitting neatly into one of the two sex categorisations that the machine demands. Transgender and gender-nonconforming individuals, particularly those who do not have a binary gender presentation or whose presentation does not correspond to the sex marker in their documentation, also face difficulties if the machine flags anomalies (Quinan and Bresser).
Drawing on a Foucauldian analysis of power as productive, Toby Beauchamp similarly illustrates how surveillance technologies not only identify but also create and reshape the figure of the dangerous subject in relation to normative configurations of gender, race, and able-bodiedness. By mobilizing narratives of concealment and disguise, heightened security measures frame gender nonconformity as dangerous (Beauchamp, Going Stealth). Although national and supranational authorities market biometric scanning technologies as scientifically neutral and exact methods of identification and verification and as an infallible solution to security risks, such tools of surveillance are clearly shaped by preconceptions and prejudgements about race, gender, and bodily normativity. Not only are they encoded with “prototypical whiteness” (Browne) but they are also built on “grossly stereotypical” configurations of gender (Clarkson).

Amongst this increasingly securitised landscape, creative forms of artistic resistance can offer up a means of subverting discriminatory policing and surveillance practices by posing alternate visualisations that reveal and challenge their supposed objectivity. In his 2018 audio-video artwork installation entitled SANCTUM, UK-based American artist Zach Blas delves into how biometric technologies, like those described above, both reveal and (re)shape ontology by utilising the affectual resonance of sexual submission.
Evoking the contradictory notions of oppression and pleasure, Blas describes SANCTUM as “a mystical environment that perverts sex dungeons with the apparatuses and procedures of airport body scans, biometric analysis, and predictive policing” (see full description at https://zachblas.info/works/sanctum/).

Depicting generic mannequins that stand in for the digitalised rendering of the human forms that pass through body scanners, the installation transports the scanners out of the airport and into a queer environment that collapses sex, security, and weaponry; an environment that is “at once a prison-house of algorithmic capture, a sex dungeon with no genitals, a weapons factory, and a temple to security.” This artistic reframing gestures towards full-body scanning technology’s germination in the military, prisons, and other disciplinary systems, highlighting how its development and use has originated from punitive—rather than protective—contexts.

In what follows, we adopt a methodological approach that applies visual analysis and close reading to scrutinise a selection of scenes from SANCTUM that underscore the sadomasochistic power inherent in surveillance technologies. Analysing visual and aural elements of the artistic intervention allows us to complicate the relationship between transparency and recognition and to problematise the dynamic of mandatory complicity and revelation that body scanners warrant. In contrast to a discourse of visibility that characterises algorithmically driven surveillance technology, Blas suggests opacity as a resistance strategy to biometrics' standardisation of identity. Taking an approach informed by critical security studies and queer theory, we also argue that SANCTUM highlights the violence inherent to the practice of reducing the body to a flat, inert surface that purports to align with some sort of “core” identity, a notion that contradicts feminist and queer approaches to identity and corporeality as fluid and changing.
In close reading this artistic installation alongside emerging scholarship on the discriminatory effects of biometric technology, this article aims to highlight the potential of art to queer the supposed objectivity and neutrality of biometric surveillance and to critically challenge normative logics of revelation and readability.

Corporeal Fetishism and Body Horror

Throughout both his artistic practice and scholarly work, Blas has been critical of the above narrative of biometrics as objective extractors of information. Rather than looking to dominant forms of representation as a means for recognition and social change, Blas’s work asks that we strive for creative techniques that precisely queer biometric and legal systems in order to make oneself unaccounted for. For him, “transparency, visibility, and representation to the state should be used tactically, they are never the end goal for a transformative politics but are, ultimately, a trap” (Blas and Gaboury 158). While we would simultaneously argue that invisibility is itself a privilege that is unevenly distributed, his creative work attempts to refuse a politics of visibility and to embrace an “informatic opacity” that is attuned to differences in bodies and identities (Blas).

In particular, Blas’s artistic interventions titled Facial Weaponization Suite (2011-14) and Face Cages (2013-16) protest against biometric recognition and the inequalities that these technologies propagate by making masks and wearable metal objects that cannot be detected as human faces. This artistic-activist project contests biometric facial recognition and their attendant inequalities by, as detailed on the artist’s website, making ‘collective masks’ in workshops that are modelled from the aggregated facial data of participants, resulting in amorphous masks that cannot be detected as human faces by biometric facial recognition technologies.
The masks are used for public interventions and performances.

One mask explores blackness and the racist implications that undergird biometric technologies’ inability to detect dark skin. Meanwhile another mask, which he calls the “Fag Face Mask”, points to the heteronormative underpinnings of facial recognition. Created from the aggregated facial data of queer men, this amorphous pink mask implicitly references—and contests—scientific studies that have attempted to link the identification of sexual orientation through rapid facial recognition techniques.

Building on this body of creative work that has advocated for opacity as a tool of social and political transformation, SANCTUM resists the revelatory impulses of biometric technology by turning to the use and abuse of full-body imaging. The installation opens with a shot of a large, dark industrial space. At the far end of a red, spotlighted corridor, a black mask flickers on a screen. A shimmering, oscillating sound reverberates—the opening bars of a techno track—that breaks down in rhythm while the mask evaporates into a cloud of smoke. The camera swivels, and a white figure—the generic mannequin of the body scanner screen—is pummelled by invisible forces as if in a wind tunnel. These ghostly silhouettes appear and reappear in different positions, with some being whipped and others stretched and penetrated by a steel anal hook. Rather than conjuring a traditional horror trope of the body’s terrifying, bloody interior, SANCTUM evokes a new kind of feared and fetishized trope that is endemic to the current era of surveillance capitalism: the abstracted body, standardised and datafied, created through the supposedly objective and efficient gaze of AI-driven machinery.

Resting on the floor in front of the ominous animated mask are neon fragments arranged in an occultist formation—hands or half a face.
By breaking the body down into component parts—“from retina to fingerprints”—biometric technologies “purport to make individual bodies endlessly replicable, segmentable and transmissible in the transnational spaces of global capital” (Magnet 8). The notion that bodies can be seamlessly turned into blueprints extracted from biological and cultural contexts has been described by Donna Haraway as “corporeal fetishism” (Haraway, Modest). In the context of SANCTUM, Blas illustrates the dangers of mistaking a model for a “concrete entity” (Haraway, “Situated” 147). Indeed, the digital cartography of the generic mannequin becomes no longer a mode of representation but instead a technoscientific truth.

Several scenes in SANCTUM also illustrate a process whereby substances are extracted from the mannequins and used as tools to enact violence. In one such instance, a silver webbing is generated over a kneeling figure. Upon closer inspection, this geometric structure, which is reminiscent of Blas’s earlier Face Cages project, is a replication of the triangulated patterns produced by facial recognition software in its mapping of distance between eyes, nose, and mouth. In the next scene, this “map” breaks apart into singular shapes that float and transform into a metallic whip, before eventually reconstituting themselves as a penetrative douche hose that causes the mannequin to spasm and vomit a pixelated liquid. Its secretions levitate and become the webbing, and then the sequence begins anew.

In another scene, a mannequin is held upside-down and force-fed a bubbling liquid that is being pumped through tubes from its arms, legs, and stomach. These depictions visualise Magnet’s argument that biometric renderings of bodies are understood not to be “tropic” or “historically specific” but are instead presented as “plumbing individual depths in order to extract core identity” (5).
In this sense, this visual representation calls to mind biometrics’ reification of body and identity, obfuscating what Haraway would describe as the “situatedness of knowledge”. Blas’s work, however, forces a critique of these very systems, as the materials extracted from the bodies of the mannequins in SANCTUM allude to how biometric cartographies drawn from travellers are utilised to justify detainment. These security technologies employ what Magnet has referred to as “surveillant scopophilia,” that is, new ways and forms of looking at the human body “disassembled into component parts while simultaneously working to assuage individual anxieties about safety and security through the promise of surveillance” (17). The transparent body—the body that can submit and reveal itself—is ironically represented by the distinctly genderless translucent mannequins. Although the generic mannequins are seemingly blank slates, the installation simultaneously forces a conversation about the ways in which biometrics draw upon and perpetuate assumptions about gender, race, and sexuality.

Biometric Subjugation

On her 2016 critically acclaimed album HOPELESSNESS, openly transgender singer, composer, and visual artist Anohni performs a deviant subjectivity that highlights the above dynamics that mark the contemporary surveillance discourse. To an imagined “daddy” technocrat, she sings:

Watch me… I know you love me
'Cause you're always watching me
'Case I'm involved in evil
'Case I'm involved in terrorism
'Case I'm involved in child molesters

Evoking a queer sexual frisson, Anohni describes how, as a trans woman, she is hyper-visible to state institutions. She narrates a voyeuristic relation where trans bodies are policed as threats to public safety rather than protected from systemic discrimination.
Through the seemingly benevolent “daddy” character and the play on ‘cause (i.e., because) and ‘case (i.e., in case), she highlights how gender-nonconforming individuals are predictively surveilled and assumed to already be guilty. Reflecting on daddy-boy sexual paradigms, Jack Halberstam reads the “sideways” relations of queer practices as an enactment of “rupture as substitution” to create a new project that “holds on to vestiges of the old but distorts” (226). Upending power and control, queer art has the capacity to both reveal and undermine hegemonic structures while simultaneously allowing for the distortion of the old to create something new.

Employing the sublimatory relations of bondage, discipline, sadism, and masochism (BDSM), Blas’s queer installation similarly creates a sideways representation that re-orientates the logic of the biometric scanners, thereby unveiling the always already sexualised relations of scrutiny and interrogation as well as the submissive complicity they demand. Replacing the airport environment with a dark and foreboding mise-en-scène allows Blas to focus on capture rather than mobility, highlighting the ways in which border checkpoints (including those instantiated by the airport) encourage free travel for some while foreclosing movement for others. Building on Sara Ahmed’s “phenomenology of being stopped”, Magnet considers what happens when we turn our gaze to those “who fail to pass the checkpoint” (107). In SANCTUM, the same actions are played out again and again on spectral beings who are trapped in various states: they shudder in cages, are chained to the floor, or are projected against the parameters of mounted screens. One ghostly figure, for instance, lies pinned down by metallic grappling hooks, arms raised above the head in a recognisable stance of surrender, conjuring up the now-familiar image of a traveller standing in the cylindrical scanner machine, waiting to be screened.
In portraying this extended moment of immobility, Blas lays bare the deep contradictions in the rhetoric of “freedom of movement” that underlies such spaces.

On a global level, media reporting, scientific studies, and policy documents proclaim that biometrics are essential to ensuring personal safety and national security. Within the public imagination, these technologies become seductive because of their marked ability to identify terrorist attackers—to reveal threatening bodies—thereby appealing to the anxious citizen’s fear of the disguised suicide bomber. Yet for marginalised identities prefigured as criminal or deceptive—including transgender and black and brown bodies—the inability to perform such acts of revelation via submission to screening can result in humiliation and further discrimination, public shaming, and even tortuous inquiry – acts that are played out in SANCTUM.

Masked Genitals

Feminist surveillance studies scholar Rachel Hall has referred to the impetus for revelation in the post-9/11 era as a desire for a universal “aesthetics of transparency” in which the world and the body is turned inside-out so that there are no longer “secrets or interiors … in which terrorists or terrorist threats might find refuge” (127). Hall takes up the case study of Umar Farouk Abdulmutallab (infamously known as “the Underwear Bomber”) who attempted to detonate plastic explosives hidden in his underwear while onboard a flight from Amsterdam to Detroit on 25 December 2009. Hall argues that this event signified a coalescence of fears surrounding bodies of colour, genitalia, and terrorism. News reports following the incident stated that Abdulmutallab tucked his penis to make room for the explosive, thereby “queer[ing] the aspiring terrorist by indirectly referencing his willingness … to make room for a substitute phallus” (Hall 289).
Overtly manifested in the Underwear Bomber incident is also a desire to voyeuristically expose a hidden, threatening interiority, which is inherently implicated with anxieties surrounding gender deviance. Beauchamp elaborates on how gender deviance and transgression have coalesced with terrorism, which was exemplified in the wake of the 9/11 attacks when the United States Department of Homeland Security issued a memo that male terrorists “may dress as females in order to discourage scrutiny” (“Artful” 359). Although this advisory did not explicitly reference transgender populations, it linked “deviant” gender presentation—to which we could also add Abdulmutallab’s tucking of his penis—with threats to national security (Beauchamp, Going Stealth). This also calls to mind a broader discussion of the ways in which genitalia feature in the screening process. Prior to the introduction of millimetre-wave body scanning technology, the most common form of scanner used was the backscatter imaging machine, which displayed “naked” body images of each passenger to the security agent. Due to privacy concerns, these machines were replaced by the scanners currently in place which use a generic outline of a passenger (exemplified in SANCTUM) to detect possible threats.

It is here worth returning to Blas’s installation, as it also implicitly critiques the security protocols that attempt to reveal genitalia as both threatening and as evidence of an inner truth about a body. At one moment in the installation a bayonet-like object pierces the blank crotch of the mannequin, shattering it into holographic fragments. The apparent genderlessness of the mannequins is contrasted with these graphic sexual acts. The penetrating metallic instrument that breaks into the loin of the mannequin, combined with the camera shot that slowly zooms in on this action, draws attention to a surveillant fascination with genitalia and revelation. As Nicholas L.
Clarkson documents in his analysis of airport security protocols governing prostheses, including limbs and packies (silicone penis prostheses), genitals are a central component of the screening process. While it is stipulated that physical searches should not require travellers to remove items of clothing, such as underwear, or to expose their genitals to staff for inspection, prosthetics are routinely screened and examined. This practice can create tensions for trans or disabled passengers with prosthetics in so-called “sensitive” areas, particularly as guidelines for security measures are often implemented by airport staff who are not properly trained in transgender-sensitive protocols.

Conclusion

According to media technologies scholar Jeremy Packer, “rather than being treated as one to be protected from an exterior force and one’s self, the citizen is now treated as an always potential threat, a becoming bomb” (382). Although this technological policing impacts all who are subjected to security regimes (which is to say, everyone), this amalgamation of body and bomb has exacerbated the ways in which bodies socially coded as threatening or deceptive are targeted by security and surveillance regimes. Nonetheless, others have argued that the use of invasive forms of surveillance can be justified by the state as an exchange: that citizens should willingly give up their right to privacy in exchange for safety (Monahan 1). Rather than subscribing to this paradigm, Blas’ SANCTUM critiques the violence of mandatory complicity in this “trade-off” narrative. Because their operationalisation rests on normative notions of embodiment that are governed by preconceptions around gender, race, sexuality and ability, surveillance systems demand that bodies become transparent. This disproportionally affects those whose bodies do not match norms, with trans and queer bodies often becoming unreadable (Kafer and Grinberg).
The shadowy realm of SANCTUM illustrates this tension between biometric revelation and resistance, but also suggests that opacity may be a tool of transformation in the face of such discriminatory violations that are built into surveillance.

References

Ahmed, Sara. “A Phenomenology of Whiteness.” Feminist Theory 8.2 (2007): 149–68.
Beauchamp, Toby. “Artful Concealment and Strategic Visibility: Transgender Bodies and U.S. State Surveillance after 9/11.” Surveillance & Society 6.4 (2009): 356–66.
———. Going Stealth: Transgender Politics and U.S. Surveillance Practices. Durham, NC: Duke UP, 2019.
Blas, Zach. “Informatic Opacity.” The Journal of Aesthetics and Protest 9 (2014). <http://www.joaap.org/issue9/zachblas.htm>.
Blas, Zach, and Jacob Gaboury. “Biometrics and Opacity: A Conversation.” Camera Obscura: Feminism, Culture, and Media Studies 31.2 (2016): 154-65.
Buolamwini, Joy, and Timnit Gebru. “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification.” Proceedings of Machine Learning Research 81 (2018): 1-15.
Browne, Simone. Dark Matters: On the Surveillance of Blackness. Durham, NC: Duke UP, 2015.
Clarkson, Nicholas L. “Incoherent Assemblages: Transgender Conflicts in US Security.” Surveillance & Society 17.5 (2019): 618-30.
Currah, Paisley, and Tara Mulqueen. “Securitizing Gender: Identity, Biometrics, and Transgender Bodies at the Airport.” Social Research 78.2 (2011): 556-82.
Halberstam, Jack. The Queer Art of Failure. Durham: Duke UP, 2011.
Hall, Rachel. “Terror and the Female Grotesque: Introducing Full-Body Scanners to U.S. Airports.” Feminist Surveillance Studies. Eds. Rachel E. Dubrofsky and Shoshana Amielle Magnet. Durham, NC: Duke UP, 2015. 127-49.
Haraway, Donna. “Situated Knowledges: The Science Question in Feminism and the Privilege of Partial Perspective.” Feminist Studies 14.3 (1988): 575-99.
———. Modest_Witness@Second_Millennium. FemaleMan_Meets_OncoMouse: Feminism and Technoscience. New York: Routledge, 1997.
Kafer, Gary, and Daniel Grinberg. “Queer Surveillance.” Surveillance & Society 17.5 (2019): 592-601.
Keyes, O.S. “The Misgendering Machines: Trans/HCI Implications of Automatic Gender Recognition.” Proceedings of the ACM on Human-Computer Interaction 2. CSCW, Article 88 (2018): 1-22.
Magnet, Shoshana Amielle. When Biometrics Fail: Gender, Race, and the Technology of Identity. Durham: Duke UP, 2011.
Magnet, Shoshana, and Tara Rodgers. “Stripping for the State: Whole Body Imaging Technologies and the Surveillance of Othered Bodies.” Feminist Media Studies 12.1 (2012): 101–18.
Monahan, Torin. Surveillance and Security: Technological Politics and Power in Everyday Life. New York: Routledge, 2006.
Packer, Jeremy. “Becoming Bombs: Mobilizing Mobility in the War of Terror.” Cultural Studies 10.5 (2006): 378-99.
Pugliese, Joseph. “In Silico Race and the Heteronomy of Biometric Proxies: Biometrics in the Context of Civilian Life, Border Security and Counter-Terrorism Laws.” Australian Feminist Law Journal 23 (2005): 1-32.
Pugliese, Joseph. Biometrics: Bodies, Technologies, Biopolitics. New York: Routledge, 2010.
Quinan, C.L. “Gender (In)securities: Surveillance and Transgender Bodies in a Post-9/11 Era of Neoliberalism.” Security/Mobility: Politics of Movement. Eds. Stef Wittendorp and Matthias Leese. Manchester: Manchester UP, 2017. 153-69.
Quinan, C.L., and Nina Bresser. “Gender at the Border: Global Responses to Gender Diverse Subjectivities and Non-Binary Registration Practices.” Global Perspectives 1.1 (2020). <https://doi.org/10.1525/gp.2020.12553>.
Sjoberg, Laura. “(S)he Shall Not Be Moved: Gender, Bodies and Travel Rights in the Post-9/11 Era.” Security Journal 28.2 (2015): 198-215.
Spalding, Sally J. “Airport Outings: The Coalitional Possibilities of Affective Rupture.” Women’s Studies in Communication 39.4 (2016): 460-80.
APA, Harvard, Vancouver, ISO, and other styles

Dissertations / Theses on the topic "Chained computer codes"

1

Balde, Oumar. "Calage bayésien sous incertitudes des outils de calcul scientifique couplés : application en simulation numérique du combustible." Electronic Thesis or Diss., Université de Toulouse (2023-....), 2024. http://www.theses.fr/2024TLSES035.

Full text
Abstract:
Nowadays, numerical models have become essential tools for modeling, understanding, analyzing and predicting the physical phenomena involved in complex physical systems such as nuclear power plants. Such numerical models often take a large number of uncertain input parameters, thus leading to uncertain outputs as well. Before any industrial use of those numerical models, an important step is therefore to reduce and quantify these uncertainties as much as possible. In this context, the goal of model calibration is to reduce and quantify the uncertainties of the input parameters based on available experimental and simulated data. There are two types of model calibration: deterministic calibration and Bayesian calibration. The latter quantifies parameter uncertainties by probability distributions. This thesis deals with the conditional Bayesian calibration of two chained numerical models. The objective is to calibrate the uncertain parameters of the second model while taking into account the uncertain parameters of the first model. To achieve this, a new Bayesian inference methodology called GP-LinCC (Gaussian Process and Linearization-based Conditional Calibration) was proposed. In practice, the deployment of this new approach has required a preliminary step of global sensitivity analysis to identify the most significant input parameters to calibrate in the second model, while considering the uncertainty of the parameters of the first model. To do this, an integrated version of the HSIC (Hilbert-Schmidt Independence Criterion) was used to define well-suited sensitivity measures and the theoretical properties of their nested Monte Carlo estimators were investigated. 
Finally, these two methodological contributions have been applied to the multi-physics application called ALCYONE, to quantify the uncertain parameters of the CARACAS code (second model), which simulates the behavior of fission gases in a pressurized water reactor, conditionally on the uncertainty of the thermal-conductivity parameter of the thermal model (first model).
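The HSIC dependence measure used in that sensitivity-analysis step can be illustrated with a minimal sketch (a standard biased HSIC estimator with Gaussian kernels; the data, bandwidth, and function names below are illustrative, not taken from the thesis):

```python
import numpy as np

def _gaussian_gram(x, bandwidth):
    """Gram matrix K[i, j] = exp(-(x_i - x_j)^2 / (2 * bandwidth^2))."""
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2.0 * bandwidth ** 2))

def hsic(x, y, bandwidth=1.0):
    """Biased HSIC estimator: tr(K H L H) / n^2, with centering H = I - 11^T/n."""
    n = len(x)
    K = _gaussian_gram(np.asarray(x, float), bandwidth)
    L = _gaussian_gram(np.asarray(y, float), bandwidth)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / n ** 2

# A dependent pair scores markedly higher than an independent pair.
rng = np.random.default_rng(0)
x = rng.normal(size=200)
dep = hsic(x, x + 0.1 * rng.normal(size=200))     # strong dependence
indep = hsic(x, rng.normal(size=200))             # no dependence
```

In a screening setting such as the one described above, inputs whose HSIC with the output is near zero are candidates for exclusion from calibration.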
APA, Harvard, Vancouver, ISO, and other styles
2

Houghton, Michael Kevin. "Image feature matching using polynomial representation of chain codes." Thesis, University of Central Lancashire, 1993. http://clok.uclan.ac.uk/20359/.

Full text
Abstract:
In this thesis the development of a novel descriptor for boundary images represented in a chain code format is reported. This descriptor is based on a truncated series of orthogonal polynomials used to represent a piecewise continuous function derived from a chain code. This piecewise continuous function is generated from a chain code by mapping individual chain links onto real numbers. A variety of alternative mappings of chain links onto real numbers are evaluated, along with two specific orthogonal polynomials; namely Legendre polynomials and Chebychev polynomials. The performance of this series descriptor for chain codes is evaluated initially by applying it to the problem of locating short chains within a long chain; and then extending the application and critically evaluating the descriptor when attempting to match features from pairs of similar images. In addition, a formal algebra is developed that provides the rule base that enables the transformation and manipulation of chain encoded boundary images. The foundation of this algebra is based on the notion that the labelling of the directions of an 8-connected chain code is essentially arbitrary and 7 other, different and consistent labellings can be distinguished.
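One concrete reading of the descriptor described above can be sketched as follows, assuming a simple link-to-angle mapping and an arbitrary truncation order (both are illustrative choices, not necessarily those evaluated in the thesis):

```python
import numpy as np

def chain_to_function(chain):
    """Map 8-connected Freeman links (0..7) to angles (link k -> k * 45 deg),
    sampled as a step function on a uniform grid."""
    angles = np.array(chain, dtype=float) * (np.pi / 4.0)
    samples_per_link = 16
    return np.repeat(angles, samples_per_link)

def legendre_descriptor(chain, order=6):
    """Truncated Legendre-series descriptor: least-squares fit of the
    leading Legendre coefficients on the natural domain [-1, 1]."""
    y = chain_to_function(chain)
    t = np.linspace(-1.0, 1.0, len(y))
    return np.polynomial.legendre.legfit(t, y, order)

# Two different boundary fragments yield clearly different descriptors.
d1 = legendre_descriptor([0, 0, 1, 2, 2, 3, 4, 4])
d2 = legendre_descriptor([0, 2, 4, 6, 0, 2, 4, 6])
```

Matching short chains within a long chain then reduces to comparing such coefficient vectors (e.g., by Euclidean distance) over a sliding window.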
APA, Harvard, Vancouver, ISO, and other styles
3

Jiang, Yi. "Design and implementation of tool-chain framework to support OpenMP single source compilation on cell platform." Access to citation, abstract and download form provided by ProQuest Information and Learning Company; downloadable PDF file, 83 p, 2008. http://proquest.umi.com/pqdweb?did=1459924771&sid=29&Fmt=2&clientId=8331&RQT=309&VName=PQD.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Herms, Paolo. "Certification of a Tool Chain for Deductive Program Verification." Phd thesis, Université Paris Sud - Paris XI, 2013. http://tel.archives-ouvertes.fr/tel-00789543.

Full text
Abstract:
This thesis belongs to the domain of software verification. The goal of verifying software is to ensure that an implementation, a program, satisfies the requirements, the specification. This is especially important for critical computer programs, such as control systems for air planes, trains and power plants. Here a malfunctioning occurring during operation would have catastrophic consequences. Software requirements can concern safety or functioning. Safety requirements, such as not accessing memory locations outside valid bounds, are often implicit, in the sense that any implementation is expected to be safe. On the other hand, functional requirements specify what the program is supposed to do. The specification of a program is often expressed informally by describing in English or some other natural language the mission of a part of the program code. Usually program verification is then done by manual code review, simulation and extensive testing. But this does not guarantee that all possible execution cases are captured. Deductive program proving is a complete way to ensure soundness of the program. Here a program along with its specification is a mathematical object and its desired properties are logical theorems to be formally proved. This way, if the underlying logic system is consistent, we can be absolutely sure that the proven property holds for the program in any case. Generation of verification conditions is a technique helping the programmer to prove the properties he wants about his programs. Here a VCG tool analyses a program and its formal specification and produces a mathematical formula, whose validity implies the soundness of the program with respect to its specification. This is particularly interesting when the generated formulas can be proved automatically by external SMT solvers. This approach is based on works of Hoare and Dijkstra and is well-understood and shown correct in theory.
Deductive verification tools have nowadays reached a maturity allowing them to be used in industrial context where a very high level of assurance is required. But implementations of this approach must deal with all kinds of language features and can therefore become quite complex and contain errors -- in the worst case stating that a program is correct even if it is not. This raises the question of the level of confidence granted to these tools themselves. The aim of this thesis is to address this question. We develop, in the Coq system, a certified verification-condition generator (VCG) for ACSL-annotated C programs. Our first contribution is the formalisation of an executable VCG for the Whycert intermediate language, an imperative language with loops, exceptions and recursive functions, and its soundness proof with respect to the blocking big-step operational semantics of the language. A second contribution is the formalisation of the ACSL logical language and the semantics of ACSL annotations of Compcert's Clight. From the compilation of ACSL-annotated Clight programs to Whycert programs and its semantics preservation proof, combined with a Whycert axiomatisation of the Compcert memory model, results our main contribution: an integrated certified tool chain for verification of C programs on top of Compcert. By combining our soundness result with the soundness of the Compcert compiler we obtain a Coq theorem relating the validity of the generated proof obligations with the safety of the compiled assembly code.
APA, Harvard, Vancouver, ISO, and other styles
5

Huet, Fabrice. "Objets mobiles : conception d'un middleware et évaluation de la communication." Phd thesis, Université de Nice Sophia-Antipolis, 2002. http://tel.archives-ouvertes.fr/tel-00505420.

Full text
Abstract:
This thesis deals with weak mobility of applications and in particular with communication between mobile entities. We first examined the relationship between the active-object paradigm and that of mobile applications. Many protocols for communication between mobile objects have been described in the literature, but their performance has never been studied formally. We identified properties allowing them to be classified into three families: poste restante, search, and routing. After selecting two protocols used in Java libraries, we studied them using Markov chains, the goal being to determine the mean time needed to communicate with a mobile agent. The forwarder mechanism is represented by a chain with an infinite state space. We derived two metrics: the mean response time of the system and the mean number of forwarders. The server case requires the analysis of a chain with a finite state space, which is solved numerically. To validate our models we used a discrete-event simulator, and then ran experiments on a local-area network and on a regional network. The results were compared with those obtained theoretically, which allowed us to show that the performance of our models is quite acceptable; they can therefore be used for performance prediction. Finally, we conclude this work by presenting a new communication protocol using forwarders with a limited lifetime together with a server, and we show that good performance can be obtained without the negative aspects of the two previous protocols.
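Generically, the mean-time metrics studied in finite-state models of this kind reduce to a linear solve on the transient part of the chain. A sketch with made-up transition probabilities (not taken from the thesis):

```python
import numpy as np

# Transition matrix among transient states only (illustrative values):
# state 0 = message at sender, state 1 = message in transit; the missing
# probability mass in each row is the jump to the absorbing "delivered" state.
Q = np.array([
    [0.2, 0.6],   # from 0: stay with 0.2, go to transit with 0.6
    [0.1, 0.3],   # from 1: go back with 0.1, stay with 0.3
])

# Expected number of steps to absorption from each transient state:
# t = (I - Q)^{-1} 1, via the fundamental matrix of an absorbing chain.
t = np.linalg.solve(np.eye(2) - Q, np.ones(2))
```

Here `t[0]` is the mean number of steps for a message starting at the sender to be delivered; scaling steps by a mean step duration would give a mean response time.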
APA, Harvard, Vancouver, ISO, and other styles
6

Robbeloth, Michael Christopher. "Recognition of Incomplete Objects based on Synthesis of Views Using a Geometric Based Local-Global Graphs." Wright State University / OhioLINK, 2019. http://rave.ohiolink.edu/etdc/view?acc_num=wright1557509373174391.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Li, Yunming. "Machine vision algorithms for mining equipment automation." Thesis, Queensland University of Technology, 2000.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Chained computer codes"

1

Gareau, Jaël Champagne, Éric Beaudry, and Vladimir Makarenkov. "PcTVI: Parallel MDP Solver Using a Decomposition into Independent Chains." In Studies in Classification, Data Analysis, and Knowledge Organization, 101–9. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-09034-9_12.

Full text
Abstract:
Abstract Markov Decision Processes (MDPs) are useful to solve real-world probabilistic planning problems. However, finding an optimal solution in an MDP can take an unreasonable amount of time when the number of states in the MDP is large. In this paper, we present a way to decompose an MDP into Strongly Connected Components (SCCs) and to find dependency chains for these SCCs. We then propose a variant of the Topological Value Iteration (TVI) algorithm, called parallel chained TVI (pcTVI), which is able to solve independent chains of SCCs in parallel leveraging modern multicore computer architectures. The performance of our algorithm was measured by comparing it to the baseline TVI algorithm on a new probabilistic planning domain introduced in this study. Our pcTVI algorithm led to a speedup factor of 20, compared to traditional TVI (on a computer having 32 cores).
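The SCC decomposition step underlying pcTVI can be sketched with a generic Kosaraju pass (illustrative code, not the authors' implementation):

```python
def strongly_connected_components(graph):
    """Kosaraju's algorithm; graph is {node: [successors]}. Returns SCCs in
    topological order of the condensation (source components first)."""
    order, seen = [], set()

    def dfs1(u):                      # first pass: record finish order
        seen.add(u)
        for v in graph.get(u, []):
            if v not in seen:
                dfs1(v)
        order.append(u)

    for u in graph:
        if u not in seen:
            dfs1(u)

    rev = {}                          # reversed graph for the second pass
    for u, succs in graph.items():
        for v in succs:
            rev.setdefault(v, []).append(u)

    sccs, assigned = [], set()
    for u in reversed(order):         # second pass on the reversed graph
        if u in assigned:
            continue
        comp, stack = [], [u]
        while stack:
            x = stack.pop()
            if x in assigned:
                continue
            assigned.add(x)
            comp.append(x)
            stack.extend(w for w in rev.get(x, []) if w not in assigned)
        sccs.append(comp)
    return sccs

# Two cycles {0,1} and {2,3} joined by one edge form a 2-link dependency chain.
g = {0: [1], 1: [0, 2], 2: [3], 3: [2]}
comps = strongly_connected_components(g)
```

In a TVI-style solver, each component would then be solved to convergence in condensation order; components on independent chains can be dispatched to separate cores.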
APA, Harvard, Vancouver, ISO, and other styles
2

Ahmad, Muhammad Bilal, Jong-An Park, Min Hyuk Chang, Young-Suk Shim, and Tae-Sun Choi. "Shape Registration Based on Modified Chain Codes." In Lecture Notes in Computer Science, 600–607. Berlin, Heidelberg: Springer Berlin Heidelberg, 2003. http://dx.doi.org/10.1007/978-3-540-39425-9_70.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Dingli, Alexiei, Mark Bugeja, Dylan Seychell, and Simon Mercieca. "Recognition of Handwritten Characters Using Google Fonts and Freeman Chain Codes." In Lecture Notes in Computer Science, 65–78. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-319-99740-7_5.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Jusoh, Nor Amizam, and Jasni Mohamad Zain. "Malaysian Car Plates Recognition Using Freeman Chain Codes and Characters’ Features." In Software Engineering and Computer Systems, 581–91. Berlin, Heidelberg: Springer Berlin Heidelberg, 2011. http://dx.doi.org/10.1007/978-3-642-22170-5_50.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Melen, Trond, and Takouhi Ozanian. "A fast algorithm for dominant point detection on chain-coded contours." In Computer Analysis of Images and Patterns, 245–53. Berlin, Heidelberg: Springer Berlin Heidelberg, 1993. http://dx.doi.org/10.1007/3-540-57233-3_33.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Henzgen, Sascha, Marc Strickert, and Eyke Hüllermeier. "Rule Chains for Visualizing Evolving Fuzzy Rule-Based Systems." In Proceedings of the 8th International Conference on Computer Recognition Systems CORES 2013, 279–88. Heidelberg: Springer International Publishing, 2013. http://dx.doi.org/10.1007/978-3-319-00969-8_27.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Loebecke, Claudia. "RFID in the Retail Supply Chain." In Information Security and Ethics, 1923–30. IGI Global, 2008. http://dx.doi.org/10.4018/978-1-59904-937-3.ch130.

Full text
Abstract:
The use of RFID (radio-frequency identification) in the retail supply chain and at the point of sale (POS) holds much promise to revolutionize the process by which products pass from manufacturer to retailer to consumer. The basic idea of RFID is a tiny computer chip placed on pallets, cases, or items. The data on the chip can be read using a radio beam. RFID is a newer technology than bar codes, which are read using a laser beam. RFID is also more effective than bar codes at tracking moving objects in environments where bar code labels would be suboptimal or could not be used as no direct line of sight is available, or where information needs to be automatically updated. RFID is based on wireless (radio) systems, which allows for noncontact reading of data about products, places, times, or transactions, thereby giving retailers and manufacturers alike timely and accurate data about the flow of products through their factories, warehouses, and stores.
APA, Harvard, Vancouver, ISO, and other styles
8

Choi, Thomas Y. "Supply Chains as Networks." In The Nature of Supply Networks, 138—C7P158. Oxford University PressNew York, 2023. http://dx.doi.org/10.1093/oso/9780197673249.003.0007.

Full text
Abstract:
Abstract This chapter maps the actual supply networks of the center console module for the Honda Accord, Acura CL/TL, and DaimlerChrysler Grand Cherokee. The chapter characterizes the structural behavior of supply networks. The chapter then subjects the data to software to compute the network structural indexes both at the node level and network level. Building on the lessons learned from the network analysis, a new type of critical supplier whose importance comes from their structural position in the network is proposed. They are called nexus suppliers, and an algorithm is developed for the nexus supplier index. It is then discussed how supply network mapping should be considered as one anticipates more extreme events that will disrupt global supply networks in the coming years.
APA, Harvard, Vancouver, ISO, and other styles
9

P, Sheela Rani, Sankara Revathi S, Dharshini J S, and Rekha M. "Identification of Product Originality Based on Supply Chain Management Using Block Chain." In Intelligent Systems and Computer Technology. IOS Press, 2020. http://dx.doi.org/10.3233/apc200141.

Full text
Abstract:
The Internet of Things (IoT) is integrated with the supply chain management process to track products. Smart tags, such as QR codes and NFC, are used for tracking. With advances in technology, blockchain has been introduced into the supply chain management process. Blockchain is a major shift in which data held in centralized form is transformed into a decentralized form. Distributed Ledger Technology (DLT) is one of the methods used in the Ethereum blockchain. The main advantage of using DLT is that it offers a decentralized, privacy-preserving and verifiable process for the smart tags. In the existing system, only a single server was used to maintain all the processes, such as supplier, manufacturer and distributor. In this application we use different servers, which is more secure than the existing system. The solution proposed in this paper checks the product evidence during the entire lifecycle of the product by using a smart contract. The data is made immutable by using a smart contract with the Ethereum blockchain. Duplication is handled by the blockchain server.
APA, Harvard, Vancouver, ISO, and other styles
10

Eksili, Nisa. "Human Resources Management." In Increasing Supply Chain Performance in Digital Society, 262–77. IGI Global, 2022. http://dx.doi.org/10.4018/978-1-7998-9715-6.ch013.

Full text
Abstract:
In the digital world, computers have turned into objects that know more than humans. Devices are redefining the human way of living in a network that allows people to talk to each other. On the one hand, suspicion and fear about technology increased; on the other hand, it became impossible to stay away from a life without technology. In this process, the contribution of digitalization to productivity, the convenience and speed it provides to the business world has made digitalization necessary in all sectors. The role of human resources management (HRM) is very important in the realization of digital transformation in businesses. This role comes to the fore both in adapting to changing social norms and adapting to changes within the enterprise. HRM also has difficulties in this process. This chapter provides background on the challenges HRM faces in the digital society. The digitalization process, digitalization and its effects in businesses, the effects of the digitalization process on employees, working environment, and HR technologies are mentioned.
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Chained computer codes"

1

Shin, Jae Kyun, and Sundar Krishnamurty. "Development of a Standard Code for Colored Graphs and its Application to Kinematic Chains." In ASME 1992 Design Technical Conferences. American Society of Mechanical Engineers, 1992. http://dx.doi.org/10.1115/detc1992-0387.

Full text
Abstract:
Abstract The development of an efficient solution procedure for the detection of isomorphism and canonical numbering of vertices of colored graphs is introduced. This computer based algorithm for colored graphs is formed by extending the standard code approach earlier developed for the canonical numbering of simple noncolored graphs, which fully utilizes the capabilities of symmetry analysis of such noncolored graphs. Its application to various kinematic chains and mechanisms is investigated with the aid of examples. The method never failed to produce unique codes, and is also found to be robust and efficient. Using this method, every kinematic chain and mechanism, as well as path generators and function generators, will have their own unique codes and a corresponding canonical numbering of their respective links. Thus, based on its efficiency and applicability, this method can be used as a universal standard code for identifying isomorphisms, as well as for enumerating nonisomorphic kinematic chains and mechanisms.
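The notion of a standard code can be illustrated by a brute-force canonical form for small graphs: take the lexicographically smallest adjacency bit-string over all vertex relabellings, so two graphs are isomorphic exactly when their codes match. This exhaustive search is exponential and purely illustrative; the paper's symmetry-analysis method is far more efficient:

```python
from itertools import permutations

def canonical_code(n, edges):
    """Lexicographically smallest adjacency bit-string over all relabellings
    of an n-vertex undirected graph: a labelling-independent identifier."""
    adj = [[0] * n for _ in range(n)]
    for u, v in edges:
        adj[u][v] = adj[v][u] = 1
    best = None
    for p in permutations(range(n)):
        code = tuple(adj[p[i]][p[j]] for i in range(n) for j in range(n))
        if best is None or code < best:
            best = code
    return best

# A 4-bar loop labelled two different ways yields the same code,
# while a 4-link open chain yields a different one.
c1 = canonical_code(4, [(0, 1), (1, 2), (2, 3), (3, 0)])
c2 = canonical_code(4, [(2, 0), (0, 3), (3, 1), (1, 2)])
```

The permutation that achieves the minimum also supplies a canonical numbering of the links, which is the other use of standard codes described in the abstract.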
APA, Harvard, Vancouver, ISO, and other styles
2

Liu, Chuan-He, and Ting-Li Yang. "The Topological Structure Code Permutation Group Method of Polyhedral Solid for Identification of Kinematic Chains Isomorphism." In ASME 1998 Design Engineering Technical Conferences. American Society of Mechanical Engineers, 1998. http://dx.doi.org/10.1115/detc98/mech-5915.

Full text
Abstract:
Abstract This paper puts forward a new method for the identification of kinematic chain isomorphism, called the topological structure code permutation group method of polyhedral solids. It can be implemented on a computer, and it is exact, dependable, quick and efficient. When the isomorphism of two kinematic chains is identified by this method, one only needs to determine whether a permutation can be found between the topological structure codes of the two kinematic chains, or whether the adjoint codes of the two topological structure codes are equal; there is no need to find a symmetry group at all. It applies to all kinds of non-separable closed kinematic chains that do not contain compound hinges. In this paper, a new concept called the polyhedral solid is put forward, the polyhedral solid model is given, and a theorem describing the relation of faces, vertices, edges and geometric solids of a connected graph is given and proved. The paper expounds the definitions and general forms of the topological structure code and its adjoint code, the methods and rules of coding, and the procedure for generating them. It also states and proves the existence, accuracy and uniqueness of the topological structure code and its adjoint code, as well as the necessary and sufficient condition and criteria for kinematic chain isomorphism. The algorithm and some typical examples for the identification of kinematic chain isomorphism are given.
APA, Harvard, Vancouver, ISO, and other styles
3

Hay, Akara, and Shanzhong (Shawn) Duan. "Implementation of an Integrated Sequential Procedure for Computer Simulation of Dynamics of Multibody Molecular Structures in Polymers and Biopolymers." In ASME 2019 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2019. http://dx.doi.org/10.1115/imece2019-11752.

Full text
Abstract:
Abstract This paper presents the implementation results of an integrated sequential algorithm, which the second author previously developed mathematically in pseudo-code form to improve the computational efficiency of computer simulation of the dynamical behaviors of multibody molecular structures in polymers and biopolymers. This new algorithm is a seamless integration of the multibody molecular algorithm (MMA: a multibody-dynamics-based procedure for motion simulation of molecular structures) and the fast multipole method (FMM). The fast multipole method is used to calculate interatomic forces from potentials, and the multibody molecular algorithm is used to generate the equations of motion associated with molecular structures. The algorithm improves computational efficiency compared with its counterpart procedures. A study case of an open-chain molecular structure was used to demonstrate that the algorithm works and to study the improvement in its computing efficiency. The algorithm is coded in MATLAB and run on both laptop and workstation computers with various numbers of molecules along the chain. FMM starts by scaling all atoms into a box with bounded coordinate ranges to ensure the numerical stability of subsequent operations. The flow of calculations in FMM is carried out along the chain structure in five computational passes. MMA begins by numbering subsets, forming a bond graph, and developing three computing passes along the chain structure. The flows of both calculations and data in FMM and MMA are lined up recursively along the chain structure to obtain O(N) computational efficiency. Simulation results were compared with results produced by MMA and traditional FMM methods for the interatomic force calculation procedure.
The implementation presented in this paper first shows that the integration between FMM and MMA works and that the integrated algorithm improves the computing efficiency of both the calculation of interatomic forces from potentials and the formation/solution of the equations of motion. The implementation results also indicate that the integrated algorithm works more efficiently for a large-sized molecular chain than for a small-sized one. Further work is needed to optimize the related FMM codes.
APA, Harvard, Vancouver, ISO, and other styles
4

Wang, Wan. "A Model for Mechanism Data Storage." In ASME 1992 Design Technical Conferences. American Society of Mechanical Engineers, 1992. http://dx.doi.org/10.1115/detc1992-0150.

Full text
Abstract:
Abstract A data model for the kinematic structure of mechanisms and its coding principle are proposed, based on the topological graph and the contracted graph. In the model, every basic chain is mapped to a code of 5 decimal digits and a mechanism is mapped to a set of basic-chain codes. The model occupies minimal memory, contains a complete set of useful primary structural parameters, and significantly reduces computer time for isomorphism identification.
APA, Harvard, Vancouver, ISO, and other styles
5

Jongsma, T. J., and W. Zhang. "An Efficient Algorithm for Finding Optimum Code Under the Condition of Incident Degree." In ASME 1992 Design Technical Conferences. American Society of Mechanical Engineers, 1992. http://dx.doi.org/10.1115/detc1992-0409.

Full text
Abstract:
Abstract This paper deals with the identification of kinematic chains. A kinematic chain can be represented by a weighted graph, so the identification of kinematic chains is transformed into the graph isomorphism problem. When a computer program has to detect isomorphism between two graphs, the first step is to set up the corresponding connectivity matrices for each graph; these are adjacency matrices that record adjacent vertices and the weighted edges between them. Because these adjacency matrices depend on the initial labelling, one cannot conclude that the graphs differ when the matrices differ. The isomorphism problem therefore needs an algorithm that is independent of the initial labelling. This paper provides such an algorithm.
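A cheap labelling-independent invariant often used as a pre-filter in such problems (an illustrative technique, not the paper's algorithm) is the sorted adjacency spectrum: equal spectra are necessary, though not sufficient, for isomorphism, since non-isomorphic cospectral graphs exist:

```python
import numpy as np

def spectrum(n, weighted_edges):
    """Sorted eigenvalues of the weighted adjacency matrix. The spectrum is
    invariant under relabelling, so differing spectra prove non-isomorphism."""
    A = np.zeros((n, n))
    for u, v, w in weighted_edges:
        A[u, v] = A[v, u] = w
    return np.sort(np.linalg.eigvalsh(A))

# The same weighted chain under two labellings gives identical spectra.
s1 = spectrum(3, [(0, 1, 1.0), (1, 2, 2.0)])
s2 = spectrum(3, [(2, 1, 1.0), (1, 0, 2.0)])
```

Only the (usually few) pairs that pass such filters need a full labelling-independent comparison.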
APA, Harvard, Vancouver, ISO, and other styles
6

Jonnalagadda, Srinath, and Sundar Krishnamurty. "Modified Standard Codes in Enumeration and Automatic Sketching of Mechanisms." In ASME 1996 Design Engineering Technical Conferences and Computers in Engineering Conference. American Society of Mechanical Engineers, 1996. http://dx.doi.org/10.1115/96-detc/mech-1192.

Full text
Abstract:
Abstract This paper presents the application of modified standard codes in the systematic enumeration and graphical display of non-isomorphic mechanisms. A two-stage enumeration procedure is presented that involves the identification and generation of contracted graphs using modified standard codes. In this scheme, unique numbering reflecting the symmetry properties of the higher order links deduced from these contracted graphs is propagated to the second stage in which all the non-isomorphic kinematic chains corresponding to each contracted graph are enumerated. The integration of the resulting numerical code values with a robust automatic sketching algorithm provides the basis for the systematic enumeration and pictorial displays of basic kinematic chains. Illustrative examples from the enumeration of basic kinematic chains with up to 12 bar 3 degree of freedom system are presented, and the results are discussed.
APA, Harvard, Vancouver, ISO, and other styles
7

Huang, Z., H. F. Ding, and Y. Cao. "The Novel Characteristic Representations of Kinematic Chains and Their Applications." In ASME 2005 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2005. http://dx.doi.org/10.1115/detc2005-84282.

Full text
Abstract:
In this paper, based on the perimeter topological graphs of kinematic chains, several novel topological concepts are proposed, including the synthetic degree-sequence, the characteristic adjacency matrix and the characteristic representation code of a kinematic chain. Both the characteristic adjacency matrix and the characteristic representation code are unique for any kinematic chain and easy to set up. Therefore a quite effective isomorphism identification method, based on the characteristic adjacency matrix, is presented; its high effectiveness is demonstrated by many examples. With an object-oriented programming language, a program which can sketch the topological graphs of kinematic chains has been developed based on the characteristic representation code. Finally, an application software system establishing an atlas database of topological graphs is introduced, and some functions of the atlas database are also presented.
APA, Harvard, Vancouver, ISO, and other styles
8

Jain, Anil K., and David W. Paglieroni. "Object Boundary Encoding Methodologies with Applications to Computer Vision*." In Machine Vision. Washington, D.C.: Optica Publishing Group, 1985. http://dx.doi.org/10.1364/mv.1985.thc3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Liu, Chuan-He, and Ting-Li Yang. "The Compound Union Theory and Method of Type Synthesis of Geared Linkage Kinematic Chains." In ASME 2000 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2000. http://dx.doi.org/10.1115/detc2000/mech-14179.

Full text
Abstract:
Abstract The compound union theory and method for type synthesis of geared linkage kinematic chains is developed in this paper. The theory includes new concepts such as the basic geared kinematic chain, the compound union, and the type code; a new constitution principle for geared linkage kinematic chains; and four new theorems essential for the type synthesis of geared linkage kinematic chains. A typical example and its type-synthesis results are also given. The compound union theory and method is shown to be systematic, rigorous, simple, practical, and efficient.
APA, Harvard, Vancouver, ISO, and other styles
10

Azarkhail, Mohammadreza, Victor Ontiveros, and Mohammad Modarres. "A Bayesian Framework for Model Uncertainty Considerations in Fire Simulation Codes." In 17th International Conference on Nuclear Engineering. ASMEDC, 2009. http://dx.doi.org/10.1115/icone17-75684.

Full text
Abstract:
Over the past twenty years, computer-based simulation codes have emerged as the leading tools to assess risks of severe events such as fire. The results of such simulation codes are usually used to estimate the magnitude, frequency, and consequence of hazards. A typical simulation program/model utilizes many different sub-models, each characterizing a physical or chemical process that contributes to exposure to the hazard or to the occurrence of certain adverse failure events. The final prediction made by such simulation codes can be temporal, spatial, or just a single estimate of the measure of interest. The predictions made by the simulation codes are subject to different contributing uncertainties, including uncertainty about the inputs to the code, uncertainty of the sub-models used in the code, and uncertainty in the parameters of the probabilistic models (if applicable) used in the code to characterize (e.g., validate) code outputs. A primary way to measure the model uncertainties is to perform independent experiments and assess the conformance of the models to observations from such experiments. It is important to note that the experimental results themselves may also involve uncertainties, for example due to measurement errors and lack of precision in instrumentation. In this research, experimental data collected as part of the Fire Model Verification and Validation effort [1] are used to characterize the share of model uncertainty in the total code-output uncertainty when experimental data are compared to the code predictions. In this particular case, one assumes the uncertainty of the experiments (e.g., due to sensor or material variability) is available from independent sources. The outcome of this study is a probabilistic estimate of the uncertainty associated with the model and the corresponding uncertainty in the predictions made by the simulation code.
A Bayesian framework was developed in this research to assess fire-model prediction uncertainties in light of uncertain experimental observations. The complexity of the Bayesian inference equations was overcome by adopting a Markov Chain Monte Carlo (MCMC) simulation technique. This paper discusses the Bayesian framework, gives examples of using it to assess fire-model uncertainties, and discusses how the results can be used in risk-informed analyses.
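The MCMC approach mentioned in the abstract can be illustrated with a minimal random-walk Metropolis sampler. The sketch below is a generic illustration, not the paper's actual model: all names, predictions, and observations are hypothetical. It infers a multiplicative model-bias factor b relating code predictions to uncertain experimental observations, using a Gaussian likelihood and a Gaussian prior centered on b = 1 (no bias):

```python
import math
import random

def log_posterior(b, predictions, observations, sigma=0.2):
    """Gaussian likelihood for obs = b * pred + noise, plus a N(1, 1) prior on b."""
    log_like = sum(-(o - b * p) ** 2 / (2 * sigma ** 2)
                   for p, o in zip(predictions, observations))
    log_prior = -(b - 1.0) ** 2 / 2.0
    return log_like + log_prior

def metropolis(log_post, x0, steps=5000, scale=0.2, seed=1):
    """Random-walk Metropolis: propose x' ~ N(x, scale), accept with prob min(1, p'/p)."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(steps):
        prop = x + rng.gauss(0.0, scale)
        lp_prop = log_post(prop)
        if math.log(rng.random()) < lp_prop - lp:  # Metropolis acceptance test
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# Hypothetical code predictions and noisy experimental measurements.
preds = [1.0, 2.0, 3.0]
obs = [1.25, 2.35, 3.70]
chain = metropolis(lambda b: log_posterior(b, preds, obs), x0=1.0)
posterior = chain[1000:]  # discard burn-in
bias_mean = sum(posterior) / len(posterior)
```

The resulting posterior distribution on b quantifies the model uncertainty probabilistically; in a risk-informed analysis, subsequent code predictions could then be reported with this posterior propagated through them rather than as point estimates.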
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Chained computer codes"

1

Riggle, K. J., and J. W. Roddy. A preliminary assessment of selected atmospheric dispersion, food-chain transport, and dose-to-man computer codes for use by the DOE Office of Civilian Radioactive Waste Management. Office of Scientific and Technical Information (OSTI), February 1989. http://dx.doi.org/10.2172/6187511.

Full text
APA, Harvard, Vancouver, ISO, and other styles
