Academic literature on the topic 'Scripts generated'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Scripts generated.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Scripts generated"

1

Usui, Toshinori, Yuto Otsuki, Tomonori Ikuse, Yuhei Kawakoya, Makoto Iwamura, Jun Miyoshi, and Kanta Matsuura. "Automatic Reverse Engineering of Script Engine Binaries for Building Script API Tracers." Digital Threats: Research and Practice 2, no. 1 (March 2021): 1–31. http://dx.doi.org/10.1145/3416126.

Abstract:
Script languages are designed to be easy to use and require low learning costs. These features give attackers options when choosing a script language for developing their malicious scripts. This diversity of choice on the attacker's side unexpectedly imposes a significant cost on the preparation of analysis tools on the defense side. That is, we have to prepare for multiple script languages to analyze malicious scripts written in them. We call this unbalanced cost across script languages the asymmetry problem. To solve this problem, we propose a method for automatically detecting the hook and tap points in a script engine binary that are essential for building a script Application Programming Interface (API) tracer. Our method allows us to reduce the cost of reverse engineering a script engine binary, which is the largest portion of the development of a script API tracer, and to build a script API tracer for a script language with minimal manual intervention. This advantage results in solving the asymmetry problem. The experimental results showed that our method generated script API tracers for three script languages popular among attackers (Visual Basic for Applications (VBA), Microsoft Visual Basic Scripting Edition (VBScript), and PowerShell). The results also demonstrated that these script API tracers successfully analyzed real-world malicious scripts.
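As a loose illustration of what a script API tracer records (not the authors' binary-level hooking method), the sketch below uses CPython's profiling hook to log calls into one chosen module; the module name and log format are invented:

```python
import sys
import math

def make_api_tracer(api_module_names, log):
    """Return a profiling hook that records calls into the named modules.

    This is only an interpreter-level analogy: the paper hooks the script
    engine binary itself, whereas this sketch piggybacks on CPython's
    built-in profiling interface.
    """
    def tracer(frame, event, arg):
        if event == "c_call" and getattr(arg, "__module__", None) in api_module_names:
            log.append((arg.__module__, arg.__name__))        # built-in API call
        elif event == "call":
            module = frame.f_globals.get("__name__", "")
            if module in api_module_names:
                log.append((module, frame.f_code.co_name))    # pure-Python API call
    return tracer

calls = []
sys.setprofile(make_api_tracer({"math"}, calls))
math.sqrt(2.0)                       # the traced "script API" call
sys.setprofile(None)
print(calls)                         # e.g. [('math', 'sqrt')]
```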
2

Liu, Shuang Mei, Yong Po Liu, and Ji Wu. "Building a Distributed Testing Execution System Based on TTCN-3." Applied Mechanics and Materials 556-562 (May 2014): 2772–78. http://dx.doi.org/10.4028/www.scientific.net/amm.556-562.2772.

Abstract:
In this paper, a distributed testing execution system is designed, which provides mechanisms for node communication, test script deployment, test scheduling, executor driving, and test result collection in a distributed environment. A workload model is established, by which testers can describe the performance testing requirements. A performance testing framework is given, which simulates user behaviors in a real environment based on virtual users so as to generate workload for the system under test (SUT). It can control the execution of virtual users through the TTCN-3 standard interface. After the performance testing is executed, a test report is generated by extracting the logs. A method of generating performance test cases by reusing functional test scripts is studied. By executing performance testing on an online bookstore, this paper demonstrates the feasibility of reusing TTCN-3 functional test scripts and the capability of the distributed performance testing system that was established.
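To make the virtual-user idea concrete, here is a minimal Python sketch (not TTCN-3, and not the authors' system) in which concurrent virtual users replay a reused functional test case and latency statistics are collected afterwards; the test body and all numbers are placeholders:

```python
import statistics
import threading
import time

def functional_test():
    """Stand-in for a reused functional test case (hypothetical; no real SUT)."""
    time.sleep(0.01)   # pretend to exercise the system under test

def virtual_user(iterations, latencies):
    for _ in range(iterations):
        start = time.perf_counter()
        functional_test()
        latencies.append(time.perf_counter() - start)

def run_workload(num_users=20, iterations=10):
    """Run the workload model: num_users concurrent virtual users."""
    latencies = []
    users = [threading.Thread(target=virtual_user, args=(iterations, latencies))
             for _ in range(num_users)]
    for u in users:
        u.start()
    for u in users:
        u.join()
    ordered = sorted(latencies)
    return {"requests": len(ordered),
            "mean_s": statistics.mean(ordered),
            "p95_s": ordered[int(0.95 * len(ordered)) - 1]}

if __name__ == "__main__":
    print(run_workload())
```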
3

Liu, Shuang Mei, Xue Mei Liu, and Yong Po Liu. "Realization of an Execution System of Distributed Performance Testing Based on TTCN-3." Applied Mechanics and Materials 713-715 (January 2015): 486–90. http://dx.doi.org/10.4028/www.scientific.net/amm.713-715.486.

Abstract:
An execution system for distributed performance testing is designed in this paper, which provides mechanisms for node communication, test script deployment, test scheduling, execution driving, and test result collection in a distributed environment. A workload model is established, by which testers can describe the performance testing requirements. A performance testing framework is given, which simulates user behaviors in a real environment based on virtual users so as to generate workload for the system under test (SUT). It can control the execution of virtual users through the TTCN-3 standard interface. After the performance testing is executed, a test report is generated by extracting the logs. A method of generating performance test cases by reusing functional test scripts is studied. By executing performance testing on an online bookstore, this paper demonstrates the feasibility of reusing TTCN-3 functional test scripts and the capability of the distributed performance testing system established.
4

Wenzel, Amy. "Schema Content for a Threatening Situation: Responses to Expected and Unexpected Events." Journal of Cognitive Psychotherapy 23, no. 2 (May 2009): 136–46. http://dx.doi.org/10.1891/0889-8391.23.2.136.

Abstract:
Although previous research has identified the components of event-based schemas, or scripts, for threatening situations in anxious individuals, no studies have examined how scripts change when anxious individuals are faced with a deviation in the expected sequence of events. In the present study, blood fearful (n = 49) and nonfearful (n = 48) participants assigned subjective units of discomfort (SUD) ratings to the events comprising the script for getting a bleeding cut on the arm. Subsequently, they listed a series of 10 events that would occur following 1 of 2 unexpected events that interrupted the script. Results indicated that blood fearful participants assigned higher SUD ratings to scripted events than nonfearful participants. Participants in the two groups generated largely similar sequences of events that would occur after the unexpected events. However, relative to nonfearful participants, blood fearful participants listed more events characterized by negative affect. These results suggest that blood fearful individuals are able to recover from deviations from the standard script for a common but threatening situation, although their associated emotional experiences are more distressing than those of nonfearful individuals.
5

Friedland, Gerald, Luke Gottlieb, and Adam Janin. "Narrative theme navigation for sitcoms supported by fan-generated scripts." Multimedia Tools and Applications 63, no. 2 (September 20, 2011): 387–406. http://dx.doi.org/10.1007/s11042-011-0877-z.

6

Briggs, Pamela, and Ken Goryo. "Biscriptal Interference: A Study of English and Japanese." Quarterly Journal of Experimental Psychology Section A 40, no. 3 (August 1988): 515–31. http://dx.doi.org/10.1080/02724988843000050.

Abstract:
It was once assumed that alphabetic, syllabic, and logographic scripts could be clearly differentiated in terms of their respective processing demands, but recent evidence suggests that, as visual stimuli, they all draw upon common “configurational” processing resources. Two experiments are reported which address this issue. Both employ cross-lingual interference paradigms, with the rationale that competition for limited processing resources will be reflected in the degree of interference generated when two scripts are presented simultaneously. The experiments differ in terms of task requirements, the first being a word-naming task, biased towards reliance upon the more rule-based decoding skills; whereas the second is a colour naming task, with a more configurational bias. In the first study, the locus of the interference effect was clearly pre-lexical, and interference was only generated by those scripts that could feasibly draw upon grapheme-phoneme correspondence rules. No interference was generated by logographs that could be accessed “directly” without recourse to any prelexical phonological code. In the second study, the locus of interference was twofold: early in processing, as a result of competition for configurational processes, and later, phonological output competition prior to articulation. These results clearly demonstrate major differences in the ways in which logographic, syllabic, and alphabetic scripts are processed.
7

Wilson, Christine, Dave Smith, Adrian Burden, and Paul Holmes. "Participant-generated imagery scripts produce greater EMG activity and imagery ability." European Journal of Sport Science 10, no. 6 (November 2010): 417–25. http://dx.doi.org/10.1080/17461391003770491.

8

Oz, Mert, Caner Kaya, Erdi Olmezogullari, and Mehmet S. Aktas. "On the Use of Generative Deep Learning Approaches for Generating Hidden Test Scripts." International Journal of Software Engineering and Knowledge Engineering 31, no. 10 (October 2021): 1447–68. http://dx.doi.org/10.1142/s0218194021500480.

Abstract:
With the advent of Web 2.0, web application architectures have evolved, and their complexity has grown enormously. Due to this complexity, testing web applications is becoming a time-consuming and labor-intensive process. In today's web applications, users can achieve the same goal by performing different actions. To ensure that the entire system is safe and robust, developers try to test all possible user action sequences in the testing phase. Since the space of all possibilities is enormous, covering all user action sequences can be impossible. To automate the test script generation task and reduce the space of possible user action sequences, we propose a novel method based on a long short-term memory (LSTM) network for generating test scripts from user clickstream data. The experimental results clearly show that the generated hidden test sequences are user-like sequences, and that generating test scripts with the proposed model is less time-consuming than writing them manually.
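As a toy illustration of next-action modelling on clickstream data (not the authors' model, data, or hyperparameters; it assumes TensorFlow/Keras is installed), the sketch below trains a small LSTM to predict the next UI action and then samples a synthetic, user-like action sequence that could be turned into a test script:

```python
import numpy as np
import tensorflow as tf

# Hypothetical clickstream vocabulary; id 0 is reserved for padding.
ACTIONS = ["home", "search", "view_item", "add_to_cart", "checkout", "logout"]
vocab = {a: i + 1 for i, a in enumerate(ACTIONS)}
inv = {i: a for a, i in vocab.items()}

sessions = [
    ["home", "search", "view_item", "add_to_cart", "checkout", "logout"],
    ["home", "view_item", "view_item", "logout"],
    ["home", "search", "view_item", "logout"],
] * 50  # tiny, repeated toy data set

WINDOW = 4
X, y = [], []
for s in sessions:
    ids = [vocab[a] for a in s]
    for t in range(1, len(ids)):
        prefix = ids[max(0, t - WINDOW):t]
        X.append([0] * (WINDOW - len(prefix)) + prefix)  # left-pad with 0
        y.append(ids[t])
X, y = np.array(X), np.array(y)

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(len(ACTIONS) + 1, 8),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(len(ACTIONS) + 1, activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(X, y, epochs=5, verbose=0)

# Sample a "user-like" sequence; mask the padding class before sampling.
seq = [vocab["home"]]
while len(seq) < 8 and seq[-1] != vocab["logout"]:
    window = np.array([([0] * WINDOW + seq)[-WINDOW:]])
    probs = model.predict(window, verbose=0)[0]
    probs[0] = 0.0
    seq.append(int(np.random.choice(len(probs), p=probs / probs.sum())))
print(" -> ".join(inv[i] for i in seq))
```

In practice the sampled action ids would be mapped back to executable UI test steps.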
9

Souza, Fernando, and Adolfo Maia Jr. "A Mathematical, Graphical and Visual Approach to Granular Synthesis Composition." Revista Vórtex 9, no. 2 (December 10, 2021): 1–27. http://dx.doi.org/10.33871/23179937.2021.9.2.4.

Abstract:
We show a method for granular synthesis composition based on mathematical modeling of the musical gesture. Each gesture is drawn as a curve generated from a particular mathematical model (or function) and coded as a MATLAB script. The gestures can be deterministic, defined through mathematical time functions, drawn freehand, or even randomly generated. This parametric gesture information is interpreted, through OSC messages, by a granular synthesizer (Granular Streamer). The musical composition is then realized with the models (scripts) written in MATLAB and exported to a graphical score (Granular Score). The method is amenable to statistical analysis of the granular sound streams and of the final music composition. We also offer a way to create granular streams based on correlated pairs of grain parameters.
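The gesture-as-curve idea can be pictured with a short sketch. The Python code below (the paper works in MATLAB with OSC messages; all curve shapes, parameter names, and values here are invented) maps normalized time to grain parameters and prints a simple granular 'score':

```python
import numpy as np

def gesture(t):
    """Map normalized time t in [0, 1] to grain parameters.
    The curve shapes and ranges below are invented for illustration."""
    return {
        "density_hz": 5.0 + 45.0 * t ** 2,        # grains per second, accelerating
        "duration_s": 0.08 - 0.06 * t,            # grains get shorter over time
        "pitch_hz": 220.0 * 2 ** (2.0 * t),       # glissando over two octaves
        "amp": 0.2 + 0.8 * float(np.sin(np.pi * t)),
    }

def grain_stream(total_s=4.0):
    """Turn the gesture curve into a granular 'score': one row per grain."""
    grains, now = [], 0.0
    while now < total_s:
        p = gesture(now / total_s)
        grains.append((round(now, 4), p["duration_s"], p["pitch_hz"], p["amp"]))
        now += 1.0 / p["density_hz"]              # inter-onset time from density
    return grains

for onset, dur, freq, amp in grain_stream()[:5]:
    print(f"t={onset:6.3f}s  dur={dur:5.3f}s  f={freq:7.1f}Hz  amp={amp:4.2f}")
```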
10

Pino, Rodney, Renier Mendoza, and Rachelle Sambayan. "A Baybayin word recognition system." PeerJ Computer Science 7 (June 16, 2021): e596. http://dx.doi.org/10.7717/peerj-cs.596.

Abstract:
Baybayin is a pre-Hispanic Philippine writing system used on the island of Luzon. With the effort to reintroduce the script, in 2018 the Committee on Basic Education and Culture of the Philippine Congress approved House Bill 1022, or the ”National Writing System Act,” which declares the Baybayin script the Philippines’ national writing system. Since then, Baybayin OCR has become a field of research interest. Numerous works have proposed different techniques for recognizing Baybayin scripts. However, all of those studies were anchored on classification and recognition at the character level. In this work, we propose an algorithm that provides the Latin transliteration of a Baybayin word in an image. The proposed system relies on a Baybayin character classifier generated using a Support Vector Machine (SVM). The method involves isolating each Baybayin character, classifying each character according to its equivalent syllable in Latin script, and finally concatenating the results to form the transliterated word. The system was tested using a novel dataset of Baybayin word images and achieved a competitive 97.9% recognition accuracy. Based on our review of the literature, this is the first work that recognizes Baybayin scripts at the word level. The proposed system can be used in automated transliteration of Baybayin texts transcribed in old books, tattoos, signage, graphic designs, and documents, among others.
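The character-to-syllable pipeline described above can be mimicked with scikit-learn. The sketch below is purely illustrative: the glyph features, syllable labels, and data are random stand-ins rather than the authors' dataset or feature extraction:

```python
import numpy as np
from sklearn.svm import SVC

# Fake stand-ins: each isolated glyph becomes a 64-dimensional feature vector,
# and labels are the Latin syllables the classifier should output.
rng = np.random.default_rng(0)
SYLLABLES = ["ba", "ka", "da", "ga", "ha"]
X_train = rng.normal(size=(200, 64))          # placeholder glyph features
y_train = rng.choice(SYLLABLES, size=200)     # placeholder syllable labels

clf = SVC(kernel="rbf", gamma="scale")        # the character-level classifier
clf.fit(X_train, y_train)

def transliterate(word_glyphs):
    """Classify each isolated glyph and concatenate the predicted syllables."""
    syllables = clf.predict(np.vstack(word_glyphs))
    return "".join(syllables)

# Three isolated glyphs from one word image (random stand-ins here).
print(transliterate([rng.normal(size=(1, 64)) for _ in range(3)]))
```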

Dissertations / Theses on the topic "Scripts generated"

1

Dogantimur, Erkan. "A method to generate modern city buildings with the aid of Python-scripting." Thesis, University of Gävle, Ämnesavdelningen för datavetenskap, 2009. http://urn.kb.se/resolve?urn=urn:nbn:se:hig:diva-4660.

Abstract:

It takes time to model buildings for a 3D city environment, for example in a game. Time is usually very constrained in the production stage of anything, whether it is a personal project at home or at school or, more commonly, in the 3D industry. This report presents a method to quickly generate detailed buildings with the help of Python scripting integrated in Maya 2009. The script works with modules that are assembled together to create a modern city type of building. A comparison is made between this script and a couple of other scripts that offer the same solution in different ways.
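As a much simplified, hypothetical stand-in for such a module-stacking script (kept outside Maya, so no maya.cmds calls; module names and heights are invented), the following Python sketch picks modules and computes where each one would be placed:

```python
import random

# Hypothetical module catalogue: (name, height in metres). In the thesis the
# modules are pre-modelled assets; here they are just labels.
GROUND = [("lobby", 5.0), ("retail", 4.5)]
MIDDLE = [("office_floor", 3.5), ("apartment_floor", 3.0)]
TOP = [("roof_garden", 2.0), ("plant_room", 2.5)]

def generate_building(num_middle_floors, seed=None):
    """Stack modules bottom-up and return (module, z_offset) placements."""
    rng = random.Random(seed)
    stack = [rng.choice(GROUND)]
    stack += [rng.choice(MIDDLE) for _ in range(num_middle_floors)]
    stack.append(rng.choice(TOP))

    placements, z = [], 0.0
    for name, height in stack:
        placements.append((name, z))   # in Maya this is where the module would be instanced and moved
        z += height
    return placements

for name, z in generate_building(num_middle_floors=6, seed=42):
    print(f"{name:16s} at z = {z:5.1f} m")
```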

2

Challco, Geiser Chalco. "Planejamento instrucional automatizado em aprendizagem colaborativa com suporte computacional utilizando planejamento hierárquico." Universidade de São Paulo, 2012. http://www.teses.usp.br/teses/disponiveis/45/45134/tde-06122012-030238/.

Abstract:
In Computer-Supported Collaborative Learning (CSCL), instructional design consists in obtaining a sequence of instructional interactions that defines the instructional content as a representation of what should be taught and of the way in which participants must interact, called instructional planning information. The development, adaptation, and personalization of concise units of study composed of instructional resources and instructional planning information, called units of learning, involves a complex instructional planning process that is time-consuming and repetitive. In this work, instructional design in CSCL is modelled as a hierarchical planning problem to support the automatic development, adaptation, and personalization of units of learning. The modelling consists in representing the domain to be taught, the characteristics of the students, and the instructional planning strategies in the language of JSHOP2ip, a hierarchical planning system developed to solve instructional planning problems. To evaluate the proposed modelling, we developed a collaborative course generator as a Web service using the proposed modelling and the JSHOP2ip system, with which we evaluated the performance, the modelling of the strategies, and the planner output. Furthermore, to demonstrate the feasibility of the proposed model in real situations, we developed an authoring tool for units of learning that employs the collaborative course generator.
3

ANDREATTA, DANIELA. "Un’analisi esplorativa delle determinanti della gestione illegale dei rifiuti: il caso italiano." Doctoral thesis, Università Cattolica del Sacro Cuore, 2019. http://hdl.handle.net/10280/55868.

Abstract:
In the last several decades, illegal waste management (IWM) has attracted great academic and public attention. Due to its negative consequences not only for the environment but also for public health and economic growth, scholars have become interested in the dynamics of IWM and in how to prevent it. Some studies have stressed the existence of different factors that can determine the phenomenon, but very few of them have empirically tested their validity. Consequently, developing new research on the topic is still necessary. The present study conducts an exploratory analysis of the socio-economic, policy- and performance-driven, and criminal factors influencing IWM in Italy. After identifying the most relevant determinants according to the literature, the objective is to test them empirically. First, thanks to a unique dataset focused on the Italian context, the study quantitatively investigates the effect of different factors on the phenomenon through an econometric analysis. Second, the study carries out a crime script analysis to explore which factors suggested by the literature and tested in the quantitative part also emerge in concrete case studies, and how they effectively intervene in the Italian waste cycle. Results indicate that IWM is determined by: i) a low level of economic development and population density, a high level of education, and the presence of tourists; ii) inefficiency in environmental regulation, enforcement, and waste management performance; iii) the presence of organised crime and the diffusion of economic and fiscal crimes. Based on these findings, the study not only deepens knowledge of the phenomenon but is also able to provide some policy suggestions to efficiently hinder illegal conduct related to waste management.
4

Kalenský, Ondřej. "Design přenosného veterinárního rentgenového přístroje." Master's thesis, Vysoké učení technické v Brně. Fakulta strojního inženýrství, 2017. http://www.nusl.cz/ntk/nusl-318804.

Abstract:
This diploma thesis focuses on the design of a product family of portable veterinary X-ray generators. The thesis deals with the possibilities of using parameterization when designing products that are part of a product series. The main body of the thesis lies in the design of a parametric script which generates variations of portable veterinary X-ray generators depending on the size and position of the inner components. The outer surfaces are defined by algorithms from the input parameters. It is possible to alter individual attributes (e.g., the progression of curves, the dimensions of component parts, and the proportions between the individual parts). The output of the parametric script is three size variations of the product.
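A very reduced illustration of parameter-driven sizing (not the thesis's actual script or CAD environment; component names, dimensions, and margins are invented) might look like this in Python:

```python
from dataclasses import dataclass

@dataclass
class Component:
    name: str
    w: float   # mm
    d: float
    h: float

def enclosure(components, wall=3.0, clearance=5.0):
    """Derive outer housing dimensions from the inner components:
    a simplified stand-in for a parametric surface script."""
    w = max(c.w for c in components) + 2 * (wall + clearance)
    d = max(c.d for c in components) + 2 * (wall + clearance)
    h = sum(c.h for c in components) + 2 * wall + clearance * (len(components) + 1)
    return round(w, 1), round(d, 1), round(h, 1)

variants = {
    "small":  [Component("tube", 120, 80, 60), Component("battery", 100, 70, 30)],
    "medium": [Component("tube", 150, 95, 70), Component("battery", 120, 80, 40)],
    "large":  [Component("tube", 180, 110, 85), Component("battery", 140, 90, 50)],
}
for name, parts in variants.items():
    print(name, enclosure(parts))
```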
5

Cutumisu, Maria. "Using behaviour patterns to generate scripts for computer role-playing games." 2009. http://hdl.handle.net/10048/583.

Abstract:
Thesis (Ph.D.)--University of Alberta, 2009.
Title from PDF file main screen (viewed on Sept. 9, 2009). "A thesis submitted to the Faculty of Graduate Studies and Research in partial fulfillment of the requirements for the degree of Doctor of Philosophy, Department of Computing Science, University of Alberta." Includes bibliographical references.
6

Wang, Han-Lin, and 王漢麟. "A Parser-based Call-graph Generator for Script Languages." Thesis, 2012. http://ndltd.ncl.edu.tw/handle/18426342696223501567.

Abstract:
Master's thesis, National Chiao Tung University, Institute of Information Management, academic year 100 of the ROC calendar (2011–2012).
A call graph is a directed graph that represents the calling relationships between subroutines in a computer program; it is also a basic program analysis result that can be used for human understanding of programs. There are already many call-graph tools, but most of them only support compiled languages. Call-graph tools for script languages (e.g., phpCallGraph, pyCallGraph, etc.) are relatively few. As a result, we present a parser-based call-graph generator for script languages in this thesis. The proposed call-graph generator is constructed by using a compiler generator, such as yacc (Yet Another Compiler Compiler) or PLY (Python Lex-Yacc). The proposed call-graph generator is composed of the call-graph lexer, the call-graph parser, and the call-graph renderer. In the case studies, we implement two call-graph generators, for Lua and PHP respectively. We compare the generated call graphs with phpCallGraph, and the graphical results show that the call graphs we generate are easier to understand in handling nested functions and entry functions.
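For readers unfamiliar with the idea, a call graph can be extracted from parsed source without a full compiler pipeline. The sketch below uses Python's ast module on a toy Python snippet, rather than the thesis's PLY-based Lua and PHP parsers, and records which functions call which by name:

```python
import ast
from collections import defaultdict

SOURCE = """
def parse(text): return tokenize(text)
def tokenize(text): return text.split()
def render(tree): print(tree)
def main():
    tree = parse("a b c")
    render(tree)
"""

def build_call_graph(source):
    """Map each function definition to the names it calls (direct, name-based calls only)."""
    tree = ast.parse(source)
    graph = defaultdict(set)
    for node in ast.walk(tree):
        if isinstance(node, ast.FunctionDef):
            for call in ast.walk(node):
                if isinstance(call, ast.Call) and isinstance(call.func, ast.Name):
                    graph[node.name].add(call.func.id)
    return graph

for caller, callees in build_call_graph(SOURCE).items():
    for callee in sorted(callees):
        print(f"{caller} -> {callee}")
```

Rendering the resulting edge list with a graph-drawing tool would give the kind of diagram a call-graph renderer produces.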
7

Hsu, Han-Wei, and 許瀚蔚. "Script-Controlled Constrained-Random Pattern Generator for Processor Verification." Thesis, 2009. http://ndltd.ncl.edu.tw/handle/91148451249666852058.

Abstract:
Master's thesis, National Chiao Tung University, Department of Electronics Engineering, academic year 97 of the ROC calendar (2008–2009).
IC complexity is increasing so rapidly that the time spent on the whole design flow increases accordingly. It is necessary to reduce development time due to time-to-market pressure. Verification represents about 60–70% of the total design effort, and advances in verification methodology can improve time to market considerably. Directed tests and golden reference models are becoming the basic tools in the modern design verification environment. Verification strategies are consequently developing towards advanced methodologies such as the constrained-random approach, to reduce verification pattern development time and to shorten the time it takes to achieve complete verification. Constrained-random pattern generation tools create tests for corner cases that the microprocessor designers may not expect and hence find bugs early in the verification stage. This thesis describes the details of the constrained-random generator and the script file that helps easily produce a huge number of constrained-random patterns for designated corner cases.
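A constrained-random generator can be pictured as weighted sampling under script-supplied constraints. The Python sketch below is purely illustrative (the thesis targets a hardware verification flow; the opcode set, weights, and hazard bias are invented):

```python
import random

# A hypothetical "constraint script": opcode weights plus a corner-case bias.
CONSTRAINTS = {
    "opcodes": {"ADD": 4, "SUB": 4, "LOAD": 2, "STORE": 2, "BRANCH": 1},
    "registers": list(range(8)),
    "hazard_rate": 0.3,   # probability of reusing the previous destination register
}

def generate_pattern(length, spec, seed=0):
    """Generate a weighted, constrained-random instruction sequence."""
    rng = random.Random(seed)
    ops, weights = zip(*spec["opcodes"].items())
    program, prev_dst = [], None
    for _ in range(length):
        op = rng.choices(ops, weights)[0]
        dst = rng.choice(spec["registers"])
        if prev_dst is not None and rng.random() < spec["hazard_rate"]:
            src = prev_dst                      # deliberately create a data hazard
        else:
            src = rng.choice(spec["registers"])
        program.append(f"{op} r{dst}, r{src}")
        prev_dst = dst
    return program

print("\n".join(generate_pattern(8, CONSTRAINTS)))
```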
8

Wang, Shih Chi, and 王仕祺. "Automatic UI Interaction Script Generator for Android Applications Using Activity Call Graph Analysis." Thesis, 2015. http://ndltd.ncl.edu.tw/handle/6aukng.


Books on the topic "Scripts generated"

1

Vasset, Philippe. Script generator: Machines. London: Serpent's Tail, 2004.

2

Barucci, Piero, Piero Bini, and Lucilla Conigliello, eds. Il Corporativismo nell'Italia di Mussolini. Florence: Firenze University Press, 2019. http://dx.doi.org/10.36253/978-88-6453-793-1.

Abstract:
The volume brings together ten essays on the general theme of the expansion of public control over the economy under Fascist corporatism. Some of the contributions address the problematic relationship between intellectuals (such as Luigi Einaudi, Attilio Cabiati, Ugo Spirito, and Nino Massimo Fovel) and the Mussolini dictatorship. Other essays critically discuss the fiscal policy choices of the period and the various aspects connected with the reform of public finance. Several authors then investigate the relationship between administrative law and corporatist law, devoting detailed commentary to the two 1939 laws on the protection of the landscape and of artistic heritage. A place of their own within the volume is occupied by an essay on the exchange between Italian and German economic thought from 1918 to 1945, and by the contribution of Sabino Cassese, who, drawing on his many studies, reflects on the overall meaning of the historical experience of Fascism.
3

Kornicki, Peter Francis. Scripts and Writing. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198797821.003.0003.

Abstract:
This chapter deals with the diversity of scripts in East Asia. By the time that neighbouring societies came into contact with Chinese writing, it had already gone a long way towards standardization and had already generated a substantial literature. In many of those societies the Chinese script was used to write official and then literary texts in Sinitic, and subsequently to write the vernaculars using characters phonographically. This was true of Japan, Korea, and Vietnam, but it was not true of the Tanguts or of Tibet, where the arrival of Buddhist scriptures stimulated the desire to translate, and for this purpose scripts were invented for the Tangut and Tibetan vernaculars. In Japan the two kana scripts emerged in the ninth century, and later vernacular scripts were developed in Vietnam (nôm), Korea (hangul or chosongul) and other societies. This diversity, it is argued, is largely due to the principle of disassociation.
4

Schneider, Florian. The User-Generated Nation. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780190876791.003.0007.

Abstract:
Chapter 7 turns to user-generated content, social media, and ‘Web 2.0’ technologies in digital China’s message boards and comment sections. The cases of the Nanjing Massacre and the Diaoyu Islands then show that online commentaries often provide a nuanced picture of how to make sense of Sino-Japanese relations, and yet the overarching discursive patterns combine with digital mechanisms such as ‘likes’ and algorithmic popularity rankings to push the discussion into nationalist media scripts. In contrast, China’s microblogging spheres at first sight offer a different story: discussions on Weibo or Weixin are diverse, dynamic, and can have impressive reach. Yet the nature of such social networks ultimately either skews them in favour of a few influential users or moves discussions into the walled gardens of small social groups, making nationalist discourse reverberate through the echo chambers of digital China and contributing to a visceral sense of a shared nationhood.
5

Maier, Esther R. Creating Production Values in a Dramatic Television Series. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198827436.003.0003.

Abstract:
This chapter draws on the insights generated through an ethnographic study of a dramatic television series production, to show how the creative and financial imperatives of the project are integrated through the calculative practices of the mid-level managers on the project. The analysis draws on a broader definition of calculative practice that does not restrict calculation to mathematical or numerical computations but also incorporates judgement and intuition. The findings show how the final configuration of the product emerges through monetary valuations or ‘estimates’ prepared by the mid-level managers for each scene in the scripts. In their focus on the creation of ‘production values’ project members also incorporate properties of the product that are not related to money or price into their calculative practice.
6

Harries, Matthew, and Benedict Wilkinson. Strategic Scripts and Nuclear Disarmament. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780190851163.003.0017.

Abstract:
This chapter spans Freedman’s earliest focus on nuclear weapons and his development of strategic scripts as an analytical tool over three decades later. It discusses the way in which opposing logics of disarmament and armament co-existed in relation to nuclear weapons. It deploys the notion of strategic scripts to explain the contradictions inherent in approaches to nuclear disarmament, developing the concept of strategic scripts as it does so. The notion of scripts can be used to explore and even to promote nuclear disarmament. Two scripts, one of ‘stable reduction’, the other of ‘disarmament’, each serve to frame thinking. These scripts and the interactions they generate facilitate understanding of the way in which opposite instinctive reactions and, stemming from these, scripts about nuclear weapons co-exist, but are fragile as either an analytical or a strategic tool.
7

National Aeronautics and Space Administration (NASA) Staff. Tutorial for Using the CGT Script Library to Generate and Assemble Overset Meshes. Independently Published, 2019.

8

Drouet, Pascale, and Nathalie Rivère de Carles. French Receptions of Shakespearean Tragedy. Edited by Michael Neill and David Schalkwyk. Oxford University Press, 2016. http://dx.doi.org/10.1093/oxfordhb/9780198724193.013.44.

Abstract:
The philosopher Alain synthesized the French approach to Shakespeare when he announced: ‘If Hamlet fell down to Earth, naked, without its procession of admirers, the critics would mock it, not without the semblance of a reason.’ Contrasting the quest for the essence of a work of art as opposed to its mere existence, he emphasized an enduring dualism in the French reception of Shakespearean tragedy. Between Voltaire’s rejection and the Romantics’ adoration of the ‘black sun’, Shakespeare generated a dialogue in French art. Starting with the critical reception, this chapter shows how Shakespearean tragedy became the instrument of a reflection on Liberty. The stage dynamic of destruction and regeneration in the treatment of Shakespearean tragedies produced a call for new liberated approaches to French dramatic language. Shakespeare thus became a laboratory where translators, adaptors, stage directors, and actors would put their trade to the test of a protean script and free themselves from the strictures of national tradition.
9

Weiss, Harvey. 4.2 ka BP Megadrought and the Akkadian Collapse. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780199329199.003.0004.

Abstract:
The Akkadians, of southern Mesopotamia, created the first empire ca. 2300 BC with the conquest and imperialization of southern irrigation agriculture and northern Mesopotamian dry-farming landscapes. The Akkadian Empire conquered and controlled a territory of roughly 30,000 square kilometers and, importantly, its wealth in labor and cereal crop-yields. The Empire maintained a standing army, weaponry, and a hierarchy of administrators, scribes, surveyors, craft specialists, and transport personnel, sustainable and profitable for about one hundred years. Archaeological excavations indicate the empire was still in the process of expansion when the 2200 BC–1900 BC/4.2–3.9 ka BP global abrupt climate change deflected or weakened the Mediterranean westerlies and the Indian Monsoon and generated synchronous megadrought across the Mediterranean, west Asia, the Indus, and northeast Africa. Dry-farming agriculture domains and their productivity across west Asia were reduced severely, forcing adaptive societal collapses, regional abandonments, habitat-tracking, nomadization, and the collapse of the Akkadian Empire. These adaptive processes extended across the hydrographically varied landscapes of west Asia and thereby provided demographic and societal resilience in the face of the megadrought’s abruptness, magnitude, and duration.
10

Canestrari, Stefano. Ferite dell'anima e corpi prigionieri. Bononia University Press, 2021. http://dx.doi.org/10.30682/sg308.

Abstract:
The theme of suicide demands engagement with fundamental existential questions, and the main risk for anyone who tackles such a dizzying subject is failing to recognize its complexity. The perennial temptation of the jurist is to "close the doors" on a problematic and painful reflection by resorting to equations that have simplifying effects and conceal the multiplicity of issues at stake. This work, on the contrary, aims to reveal the inadequacy of certain assimilations: the "lawfulness of suicide", a fundamental principle of criminal biolaw, does not automatically entail the "lawfulness of assisting suicide". The very concept of "assistance in suicide" must itself be analysed in its many facets: facilitating "traditional" suicide, driven by intense psychological and existential suffering, cannot be equated with the different and dilemmatic phenomenon of medical aid in dying. The task of drawing these distinctions sharply is necessary and urgent, also in light of the recent and well-known rulings of the Italian and German constitutional courts, which for different reasons risk generating ambiguity and misunderstanding. Assistance in suicide and medically assisted suicide are not conjoined twins, nor even siblings: they are merely relatives rebelling against a forced cohabitation. Within these coordinates, the present volume provides a contribution of fundamental importance, also with a view to a considered public debate on end-of-life issues.

Book chapters on the topic "Scripts generated"

1

Milevsky, Moshe Arye. "Building a Tontine Simulation in R." In How to Build a Modern Tontine, 27–48. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-00928-0_3.

Abstract:
In this chapter I explain the core of the (basic, version 1.0) modern tontine simulation algorithm and provide R-scripts that can be used to generate forecasted values for what I have called the Modern Tontine (MoTo) Fund.
2

Pereira, Daniel J. C., Kenedy Marconi G. Santos, Douglas O. Campos, Polyane A. Santos, Lucas S. Ribeiro, Marcelo B. Perotoni, Tagleorge M. Silveira, Marcela S. Novo, and Willian F. S. Maia. "A Python Script to Generate a 3D Model of a Coaxial Cable." In Proceedings of the 7th Brazilian Technology Symposium (BTSym’21), 615–22. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-04435-9_66.

3

Paulsen, Brandon, and Chao Wang. "Example Guided Synthesis of Linear Approximations for Neural Network Verification." In Computer Aided Verification, 149–70. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-13185-1_8.

Abstract:
Linear approximations of nonlinear functions have a wide range of applications such as rigorous global optimization and, recently, verification problems involving neural networks. In the latter case, a linear approximation must be hand-crafted for the neural network’s activation functions. This hand-crafting is tedious, potentially error-prone, and requires an expert to prove the soundness of the linear approximation. Such a limitation is at odds with the rapidly advancing deep learning field – current verification tools either lack the necessary linear approximation, or perform poorly on neural networks with state-of-the-art activation functions. In this work, we consider the problem of automatically synthesizing sound linear approximations for a given neural network activation function. Our approach is example-guided: we develop a procedure to generate examples, and then we leverage machine learning techniques to learn a (static) function that outputs linear approximations. However, since the machine learning techniques we employ do not come with formal guarantees, the resulting synthesized function may produce linear approximations with violations. To remedy this, we bound the maximum violation using rigorous global optimization techniques, and then adjust the synthesized linear approximation accordingly to ensure soundness. We evaluate our approach on several neural network verification tasks. Our evaluation shows that the automatically synthesized linear approximations greatly improve the accuracy (i.e., in terms of the number of verification problems solved) compared to hand-crafted linear approximations in state-of-the-art neural network verification tools. An artifact with our code and experimental scripts is available at: https://zenodo.org/record/6525186#.Yp51L9LMIzM.
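To give a feel for what a sound linear approximation of an activation function looks like, here is a small, hypothetical Python sketch; it fits a candidate line and then shifts it by the worst violation observed on a sample grid, a simplification of the rigorous procedure the abstract describes:

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sound_linear_bounds(f, lo, hi, n=10001):
    """Fit one candidate line by least squares, then shift it up/down by the
    worst violation observed on a dense grid so the shifted lines bound f.
    (The paper bounds violations with rigorous global optimization instead of
    sampling, and learns the candidate line from generated examples.)"""
    x = np.linspace(lo, hi, n)
    y = f(x)
    a, b = np.polyfit(x, y, 1)                  # candidate line a*x + b
    upper_shift = float(np.max(y - (a * x + b)))
    lower_shift = float(np.max((a * x + b) - y))
    return (a, b + upper_shift), (a, b - lower_shift)

(ua, ub), (la, lb) = sound_linear_bounds(sigmoid, -3.0, 3.0)
print(f"upper bound: {ua:.4f}*x + {ub:.4f}")
print(f"lower bound: {la:.4f}*x + {lb:.4f}")
```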
4

Svoren, Martin, Elena Camerini, Merijn van Erp, Feng Wei Yang, Gert-Jan Bakker, and Katarina Wolf. "Approaches to Determine Nuclear Shape in Cells During Migration Through Collagen Matrices." In Cell Migration in Three Dimensions, 97–114. New York, NY: Springer US, 2023. http://dx.doi.org/10.1007/978-1-0716-2887-4_7.

Abstract:
Fibrillar collagen is an abundant extracellular matrix (ECM) component of interstitial tissues which supports the structure of many organs, including the skin and breast. Many different physiological processes, but also pathological processes such as metastatic cancer invasion, involve interstitial cell migration. Often, cell movement takes place through small ECM gaps and pores and depends upon the ability of the cell and its stiff nucleus to deform. Such nuclear deformation during cell migration may impact nuclear integrity, such as of chromatin or the nuclear envelope, and therefore the morphometric analysis of nuclear shapes can provide valuable insight into a broad variety of biological processes. Here, we describe a protocol on how to generate a cell-collagen model in vitro and how to use confocal microscopy for the static and dynamic visualization of labeled nuclei in single migratory cells. We developed, and here provide, two scripts that (1) enable the semi-automated and fast quantification of static single nuclear shape descriptors, such as aspect ratio or circularity, and the nuclear irregularity index that forms a combination of four distinct shape descriptors, as well as (2) a quantification of their changes over time. Finally, we provide quantitative measurements on nuclear shapes from cells that migrated through collagen either in the presence or the absence of an inhibitor of collagen degradation, showing the distinctive power of this approach. This pipeline can also be applied to cell migration studied in different assays, ranging from 3D microfluidics to migration in the living organism.
5

Yekkala, Indu, and Sunanda Dixit. "Prediction of Heart Disease Using Random Forest and Rough Set Based Feature Selection." In Coronary and Cardiothoracic Critical Care, 208–19. IGI Global, 2019. http://dx.doi.org/10.4018/978-1-5225-8185-7.ch011.

Abstract:
A great deal of data is generated by the medical industry. Often this data is of a very complex nature (electronic records, handwritten scripts, etc.) since it is generated from multiple sources. The complexity and sheer volume of this data necessitate techniques that can extract insight from it in a quick and efficient way. These insights not only help diagnose diseases but can also predict and prevent them. One such use of these techniques is for cardiovascular diseases. Heart disease or coronary artery disease (CAD) is one of the major causes of death all over the world. Comprehensive research using single data mining techniques has not resulted in acceptable accuracy. Further research is being carried out on the effectiveness of hybridizing more than one technique to increase accuracy in the diagnosis of heart disease. In this article, the authors worked on the heart Statlog dataset collected from the UCI repository, and used the Random Forest algorithm together with feature selection based on rough sets to accurately predict the occurrence of heart disease.
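A minimal scikit-learn sketch of the classification stage is shown below. It is not the chapter's implementation: the data are synthetic stand-ins for the Statlog attributes, and importance-based selection is used only as a placeholder for the rough-set reduct computation:

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for the Statlog heart data (13 attributes, binary target).
X, y = make_classification(n_samples=270, n_features=13, n_informative=6,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Placeholder feature selection: keep the most important attributes according
# to a preliminary forest (the chapter uses rough-set reducts instead).
prelim = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
keep = np.argsort(prelim.feature_importances_)[-8:]

clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr[:, keep], y_tr)
print("accuracy:", accuracy_score(y_te, clf.predict(X_te[:, keep])))
```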
6

Ali, Ahmad Zamzuri Mohamad, and Wee Hoe Tan. "3D Talking-Head Mobile App." In Encyclopedia of Mobile Phone Behavior, 1130–37. IGI Global, 2015. http://dx.doi.org/10.4018/978-1-4666-8239-9.ch092.

Abstract:
The 3D talking-head mobile app is a type of mobile app that presents the head of a computer generated three-dimensional animated character that can talk or hold a conversation with human users. It is commonly used for language learning or entertainment, thus the quality of the mobile app is determined by the accuracy and the authenticity of lip synchronization and facial expressions. A typical 3D talking-head mobile app is structured by six key components, i.e., animated 3D head model, voice over scripts, background audio, background graphics, navigational buttons, and instructional captions and subtitles. When the app is meant for educational purposes, the integration of these components requires proficiency in creating an animated 3D talking head, authoring a mobile app, and understanding pedagogical principles for mobile assisted language learning. The mastery of scientific knowledge in these areas is essential to keep abreast with the advancement of mobile technologies and future research direction.
7

Segal, Yoram, Ofer Hadar, and Lenka Lhotska. "Assessing Human Mobility by Constructing a Skeletal Database and Augmenting it Using a Generative Adversarial Network (GAN) Simulator." In Studies in Health Technology and Informatics. IOS Press, 2022. http://dx.doi.org/10.3233/shti220967.

Abstract:
This paper presents a neural network simulator that measures, categorizes, and infers human gestures based on a library of anonymized patient motions. There is a need for sufficient training sets for deep learning (DL) applications. Our proposal is to extend a database that includes a limited number of videos of human physiotherapy activities with synthetic data. Our posture generator produces skeletal vectors that depict human movement. A human skeletal model is generated by using OpenPose (OP) from multiple-person videos and photographs. In every video frame, OP represents each human skeletal position as a vector in Euclidean space. The GAN is used to generate new samples and to control the parameters of the motion. The joints in our skeletal model have been restructured to emphasize their linkages using depth-first search (DFS), a method for searching tree structures. Additionally, this work explores solutions to common problems associated with the acquisition of human gesture data, such as synchronizing activities and linking them to time and space. A new simulator is proposed that generates a sequence of virtual coordinated human movements based upon a script.
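The DFS restructuring mentioned above can be illustrated with a toy joint tree. In the hypothetical Python sketch below (the joint names and topology are invented, not the OpenPose keypoint set), a preorder traversal lists parent-to-child links so that each limb chain is emitted contiguously:

```python
# Simplified skeleton tree: joint -> child joints (a stand-in, not OpenPose's keypoints).
SKELETON = {
    "neck": ["head", "l_shoulder", "r_shoulder", "hip"],
    "l_shoulder": ["l_elbow"], "l_elbow": ["l_wrist"],
    "r_shoulder": ["r_elbow"], "r_elbow": ["r_wrist"],
    "hip": ["l_knee", "r_knee"], "l_knee": ["l_ankle"], "r_knee": ["r_ankle"],
    "head": [], "l_wrist": [], "r_wrist": [], "l_ankle": [], "r_ankle": [],
}

def dfs_links(joint="neck", links=None):
    """Collect parent->child links in depth-first (preorder) order, so each limb
    chain is traversed from the torso outwards before the next one starts."""
    if links is None:
        links = []
    for child in SKELETON[joint]:
        links.append((joint, child))
        dfs_links(child, links)
    return links

for parent, child in dfs_links():
    print(f"{parent:10s} -> {child}")
```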
8

"Matlab Scripts to generate example matrix decompositions." In Understanding Complex Datasets, 203–21. Chapman and Hall/CRC, 2007. http://dx.doi.org/10.1201/9781584888338.axa.

9

Callies, Sophie, Mathieu Gravel, Eric Beaudry, and Josianne Basque. "Logs Analysis of Adapted Pedagogical Scenarios Generated by a Simulation Serious Game Architecture." In Natural Language Processing, 1178–98. IGI Global, 2020. http://dx.doi.org/10.4018/978-1-7998-0951-7.ch057.

Abstract:
This paper presents an architecture designed for simulation serious games, which automatically generates game-based scenarios adapted to the learner's progression. We present three central modules of the architecture: (1) the learner model, (2) the adaptation module, and (3) the logs module. The learner model estimates the progression of the development of the skills targeted in the game. The adaptation module uses this estimation to automatically plan an adapted sequence of in-game situations that optimizes learning. We implemented our architecture in Game of Homes, a simulation serious game that aims to teach adults the basics of real estate. We built a scripted version of Game of Homes in order to compare the impact of scripted scenarios versus generated scenarios on learning progression. We qualitatively analyzed the log files of thirty-six adults who played Game of Homes for 90 minutes. The main results highlighted the specificity of the generated pedagogical scenarios for each learner and, more specifically, the optimization of the guidance provided and of the presentation of the learning content throughout the game.
10

Chien, Been-Chian, and Shiang-Yi He. "A Generic Context Interpreter for Pervasive Context-Aware Systems." In Mobile and Handheld Computing Solutions for Organizations and End-Users, 308–21. IGI Global, 2013. http://dx.doi.org/10.4018/978-1-4666-2785-7.ch017.

Abstract:
Developing pervasive context-aware systems to construct smart space applications has attracted much attention from researchers in recent decades. Although many different kinds of context-aware computing paradigms have been built in recent years, it is still a challenge for researchers to extend an existing system to different application domains and to interoperate with other service systems, due to the heterogeneity among systems. This paper proposes a generic context interpreter to overcome the dependency between context and hardware devices. The proposed generic context interpreter contains two modules: the context interpreter generator and the generic interpreter. The context interpreter generator imports sensor data from sensor devices as an XML schema and produces interpretation scripts instead of interpretation widgets. The generic interpreter generates the semantic context for context-aware applications. A context editor is also designed, employing schema matching algorithms to support context mapping between devices and the context model.

Conference papers on the topic "Scripts generated"

1

Friedland, Gerald, Luke Gottlieb, and Adam Janin. "Narrative theme navigation for sitcoms supported by fan-generated scripts." In the 3rd international workshop. New York, New York, USA: ACM Press, 2010. http://dx.doi.org/10.1145/1877850.1877854.

2

Zhao, M., and N. Tailor. "Development of a Test Facility and Its Application for Validation and Reliability Testing of Safety-Critical Software." In 18th International Conference on Nuclear Engineering. ASMEDC, 2010. http://dx.doi.org/10.1115/icone18-29958.

Abstract:
This paper describes a versatile test facility developed by AECL for validation and reliability (V&R) testing of safety-critical software used in the process trip computers for CANDU reactors. It describes the hardware and software aspects of the test facility. The test hardware consists of a test rig with a test computer used for executing the test software and a process trip computer emulator. The test software is comprised of an operating system, a test interpreter, a test oracle, and a man-machine interface. This paper also discusses the application of the test facility in V&R testing of the process trip computer, how test scripts are prepared and automatically run on the test computer, and how test results are automatically generated by the test computer, thus eliminating potential human errors. The test scripts, which contain specific instructions for testing, are text files written in a special AECL test language. An AECL Test Language Interpreter (ATLIN) program interprets the test scripts and translates structured English statements in the test scripts into test actions. The intuitive nature of the special AECL test language, the version controlled test scripts in text format and automatic test logging feature facilitate the preparation of test cases, which are easy to repeat, review and readily modifiable, and production of consistent results. This paper presents the concept of adding a process trip computer emulator for use in preparation of V&R testing. The process trip computer emulator is designed independently from the actual process trip computer but based on the same functional specification as for the process trip computer. The use of the process trip computer emulator allows the test scripts to be exercised before the actual process trip computers are available for V&R testing, thereby, resulting in a significant improvement to the project schedule. The test facility, with the built-in process trip computer emulator, is also a valuable training tool for the V&R staff and plant personnel.
3

Aktassov, Kanat, Dauletbek Ayaganov, Kanat Imagambetov, Ruslan Alissov, Said Muratbekov, Zhaksylyk Kali, Bagdad Amangaliyev, Dmitry Sidorov, and Alikhan Kurmankulov. "High Resolution Reservoir Simulator Driven Custom Scripts as the Enabler for Solving Reservoir to Surface Network Coupling Challenges." In Abu Dhabi International Petroleum Exhibition & Conference. SPE, 2021. http://dx.doi.org/10.2118/207444-ms.

Abstract:
This paper presents a practical methodology for optimizing and building a detailed field surface network system by using high-resolution reservoir simulator driven, custom-made Python scripts to efficiently predict the future performance of a vast oil and gas-condensate carbonate field. All existing surface hydraulic tables were quality checked and lifting issue constraints corrected. Pressure losses at the wellhead chokes were incorporated into the high-resolution reservoir simulator in the form of an equation, by using the custom scripts instead of a table format, to calculate gas-rate-dependent pressure losses more precisely. Consequently, all 400+ surface production system manifold, pipe, and well choke Horizontal Flow Performance (HFP) tables were updated and coupled to the reservoir simulator through a Field Management (FM) controller, which in turn generates Inflow Performance Relationship (IPR) tables for the coupled wells and passes them on to solve the network. The methodology described in this paper was applied to complex field development planning at Karachaganak. At present, the reservoir management strategy requires constant balancing effort to spread gas re-injection uniformly into the lower Voidage Replacement Ratio areas in the upper gas-condensate part of the reservoir, due to reservoir heterogeneity. Additionally, an increase in field and well gas-oil ratio and water cut creates bottlenecks in the surface gathering system and requires robust solutions to decongest the surface network. Current simulation tools are not always effective, due to longer run times and simulation instability caused by the complex network system. As a solution, project-specific network balancing challenges were resolved by incorporating custom-made scripts into the high-resolution simulator. The faster and more flexible integrated model based on hydraulic tables reproduced the historical pressure losses of the surface pipelines at similar resolution and generated accurate prediction profiles in half the time of the existing reservoir simulator. Overall, this approach helped to generate more stable production profiles by identifying bottlenecks in the surface network and to evaluate future projects with more confidence, achieving significant CAPEX savings. The comprehensive guidelines provided in this paper can aid reservoir modeling by setting up flexible integrated models that account for surface network effects. The value of incorporating Python scripts is demonstrated for implementing non-standard and project-specific network balancing solutions, leveraging the flexibility and openness of the modelling tool.
4

Jo, Yong-Wook, David Farnsworth, and Jacob Wiest. "Pier 55, NYC: A Case Study for the Future of Design, Documentation and Fabrication." In IABSE Congress, New York, New York 2019: The Evolving Metropolis. Zurich, Switzerland: International Association for Bridge and Structural Engineering (IABSE), 2019. http://dx.doi.org/10.2749/newyork.2019.2787.

Abstract:
The Pier 55 project in New York City represents an achievement in design, documentation, fabrication, and construction achievable only through recent advances in construction technology. Pier 55 is a new park built over the Hudson River, constructed from complex precast concrete. It is a one-of-a-kind pier with a signature design by Heatherwick Studio that undulates in elevation and is structurally composed of tulip-shaped concrete “pots”. Heatherwick's vision required significant collaborative efforts by all involved to define a geometry that satisfied the often-competing needs for prefabrication efficiency, durability, accessibility, design aesthetics, and construction feasibility. Arup and Heatherwick developed parametric tools to automate much of the design process so that multiple iterations of geometry could be tested and refined to find optimal solutions. Initial scripts to define the surface geometry of the “pot” structures for coordination evolved into additional scripts which created analysis models, full structural geometry, and shop-drawing-level documentation. As the project moved into construction, Arup and the fabrication team at the Fort Miller precast concrete manufacturer and the Fab3 steel fabricator utilized the models and scripts generated during the design process for direct digital input of the structural geometry to create complex CNC-milled foam formwork, 3-dimensional rebar documentation, and documentation and digital fabrication of the steel components required for assembly and erection of the various pieces by Weeks Marine. This paper discusses significant innovations, including the use of sophisticated parametric modeling to digitally design, document, fabricate, and construct geometrically complex structures.
APA, Harvard, Vancouver, ISO, and other styles
5

Peña-Cortés, Fernando, Miguel Escalona, Gonzalo Rebolledo, Gustavo Donoso, and Fredy Lara. "SIG_PDUT, aplicación para la planificación territorial de La Araucanía." In International Conference Virtual City and Territory. Concepción: Centre de Política de Sòl i Valoracions, 2005. http://dx.doi.org/10.5821/ctv.7358.

Full text
Abstract:
The development of GIS tools involves the generation of territorial information, organized according to the specific consultation, input, and processing requirements of a group of users, and must be flexible enough to meet both present and future requirements. In the context of the Regional Urban Development Plan (UDRP) for the Araucanía Region, a GIS application has been developed as a set of Avenue scripts for ArcView 3.x in order to administer the information generated by the project, making use of the potential offered by GIS and its extensions in a customized GUI oriented towards data loading, display, consultation, data cleaning, and graphic products.
APA, Harvard, Vancouver, ISO, and other styles
6

Wilkerson, Patrick W., Andrzej J. Przekwas, and Chung-Lung Chen. "Multiphysics Design and Analysis Simulations for Power Electronic Device Wirebonds." In ASME 2003 International Electronic Packaging Technical Conference and Exhibition. ASMEDC, 2003. http://dx.doi.org/10.1115/ipack2003-35170.

Full text
Abstract:
Multiscale multiphysics simulations were performed to analyze wirebonds for power electronic devices. Modern power-electronic devices can be subjected to extreme electrical and thermal conditions. Fully coupled electro-thermo-mechanical simulations were performed utilizing CFDRC’s CFD-ACE+ multiphysics simulation software and scripting capabilities. Use of such integrated multiscale multiphysics simulation and design tools in the design process can cut cost, shorten product development cycle time, and result in optimal designs. The parametrically designed multiscale multiphysics simulations performed allowed for a streamlined parametric analysis of the electrical, thermal, and mechanical effects on the wirebond geometry, bonding sites, and power electronic device geometry. Multiscale analysis allowed for full device thermo-mechanical analysis as well as detailed analysis of wirebond structures. The multiscale simulations were parametrically scripted, allowing for parametric simulations of the device and wirebond geometry as well as all other simulation variables. Heat dissipation from heat generated in the power-electronic device and through Joule heating was analyzed. The multiphysics analysis allowed for investigation of the location and magnitude of stress concentrations in the wirebond and device. These stress concentrations are investigated not only for the deformed wirebond itself, but also at the wirebond bonding sites and contacts. Changes in the wirebond geometry and bonding geometry, easily made through the parametrically designed simulation scripts, allow for investigation of various wirebond geometries and operating conditions.
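The following minimal Python sketch shows the general pattern of a parametrically scripted sweep: enumerate geometry variants and write one input deck per case for a solver to pick up. The file format, parameter names, and value ranges are placeholders, not the CFD-ACE+ scripting interface:

import itertools

# Illustrative parametric-sweep sketch: enumerate wirebond geometry variants
# and write a simple key=value input deck per case.

WIRE_DIAMETERS_MM = [0.20, 0.25, 0.30]
LOOP_HEIGHTS_MM = [1.0, 1.5, 2.0]
BOND_PITCHES_MM = [2.5, 3.0]


def write_case(case_id, diameter, loop_height, pitch):
    """Write one geometry variant to its own input file and return the path."""
    lines = [
        f"case_id = {case_id}",
        f"wire_diameter_mm = {diameter}",
        f"loop_height_mm = {loop_height}",
        f"bond_pitch_mm = {pitch}",
    ]
    path = f"wirebond_case_{case_id:03d}.txt"
    with open(path, "w") as f:
        f.write("\n".join(lines) + "\n")
    return path


if __name__ == "__main__":
    cases = itertools.product(WIRE_DIAMETERS_MM, LOOP_HEIGHTS_MM, BOND_PITCHES_MM)
    for i, (d, h, p) in enumerate(cases):
        print("wrote", write_case(i, d, h, p))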
APA, Harvard, Vancouver, ISO, and other styles
7

Johnson, Clayton, Stephen Schmidt, Justin Taylor, and John deLaChapelle. "Geospatial Database Development: Supporting Geohazard Risk Assessments Through Real-Time Data and Geospatial Analytics." In 2022 14th International Pipeline Conference. American Society of Mechanical Engineers, 2022. http://dx.doi.org/10.1115/ipc2022-87139.

Full text
Abstract:
TC Energy owns and operates over 32,000 miles of natural gas pipelines within 38 states in the United States, a sizable portion of which crosses through terrain highly susceptible to geologic hazards. To better support geohazard risk management, TC Energy has implemented a customized web-based geohazard platform (GeoForce) to identify, inventory, and track geohazards across their U.S. pipeline system. The platform was built within the Environmental Systems Research Institute (ESRI) ArcGIS Enterprise environment and leverages a diverse set of ESRI product offerings. The platform is hosted on ArcGIS Portal and includes multiple custom apps and dashboards which allow users to efficiently view, summarize, add, and update geohazard data. Another key component of the system is a connection to ArcGIS GeoEvent Server. GeoEvent Server allows for the delivery of near real-time geohazard threat notifications through emails and dashboards (i.e., seismic events, flooding, precipitation). The notifications also provide detailed information for the sections of the system affected by the event, and in the case of a seismic event, a suggested course of action in alignment with TC Energy procedures. ArcGIS Image Server was leveraged to host over 4 terabytes of LiDAR imagery, which can be used in concert with the other geohazard datasets present in the database. Custom geospatial scripts were also developed to create a near real-time link between the GeoForce master database and TC Energy’s master data mart, asset information, regulatory, and other organizational data. These scripts flag and report where spatial and/or attribute data have changed or may no longer be valid and therefore require follow-up action to support risk management (e.g., when pipelines are abandoned, HCAs are updated, or pipe properties are updated). The scripts also store the centerline attribute changes in a table for further review to identify potential trends. The GeoForce database is also built as a launching platform for proactive analytics, and eventually predictive analytics for critical precipitation thresholds for landslide risk management, landslide susceptibility mapping, system-wide risk scoring, seismic events, and ILI bending strain coincident with geohazards. Details related to the threat alerts that are issued by the geospatial system are stored and visualized in a PowerBI integration. With the incorporation of ArcGIS Notebook Server, algorithms will be developed that will review the historical threat repository that is being generated and issue threat alerts based on the probability of a hazardous event occurring in proximity to the pipeline system.
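The change-flagging idea behind those custom scripts can be sketched with a small, self-contained Python example that compares attribute records held in a geohazard database against a fresh export from an asset data source and reports differences for review. It deliberately uses plain dictionaries rather than any ESRI API; the field names and sample data are placeholders:

# Illustrative sketch: flag records whose attributes differ between two
# snapshots so they can be queued for follow-up review.

def diff_attributes(current, incoming, keys=("status", "hca_class", "wall_thickness")):
    """Return a list of (record_id, field, old, new) for changed attributes."""
    changes = []
    for rec_id, new_row in incoming.items():
        old_row = current.get(rec_id)
        if old_row is None:
            changes.append((rec_id, "<new record>", None, None))
            continue
        for field in keys:
            if old_row.get(field) != new_row.get(field):
                changes.append((rec_id, field, old_row.get(field), new_row.get(field)))
    return changes


if __name__ == "__main__":
    geoforce = {"PL-001": {"status": "active", "hca_class": "HCA", "wall_thickness": 9.5}}
    data_mart = {"PL-001": {"status": "abandoned", "hca_class": "HCA", "wall_thickness": 9.5}}
    for change in diff_attributes(geoforce, data_mart):
        print("flag for review:", change)

Keeping the detected changes as structured tuples mirrors the abstract's point about storing centerline attribute changes in a table for later trend analysis.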
APA, Harvard, Vancouver, ISO, and other styles
8

Cobos, Miguel, and Daniel Ripalda. "Production of educational videogame from the design document." In Intelligent Human Systems Integration (IHSI 2022) Integrating People and Intelligent Systems. AHFE International, 2022. http://dx.doi.org/10.54941/ahfe100996.

Full text
Abstract:
This work aims to promote environmental awareness in children as a philosophy of life, fostering a culture of care for the ecosystem in their family and social environment. We sought to achieve this objective from the field of video games; to do so, we started from the design document, based on Rogers' model. Prototypes were developed, levels were designed, resources were placed in the scenario, and physics and mechanics were tested. The agile Scrum methodology and the Unity video game engine with C# scripts were used for the development. The video game, which features a nature superhero and three levels, was produced in its initial phase to be tested with the target audience. The results obtained from a control group of children aged 7 to 10 are presented. A user experience evaluation method was applied by inspection to obtain results related to usability heuristics, Gestalt principles, interactions, and perception of aesthetics.
APA, Harvard, Vancouver, ISO, and other styles
9

Stori, James A., and Paul K. Wright. "A Knowledge-Based System for Machining Operation Planning in Feature Based, Open-Architecture Manufacturing." In ASME 1996 Design Engineering Technical Conferences and Computers in Engineering Conference. American Society of Mechanical Engineers, 1996. http://dx.doi.org/10.1115/96-detc/dfm-1286.

Full text
Abstract:
Abstract Within the Integrated Design And Manufacturing Environment (IMADE), operation planning provides a mapping from geometric design primitives to machining operation sequences for manufacturing processes. Operation planning includes tool selection, machining parameter selection, and tool path generation. An object-oriented approach to program structure is adopted, whereby features, operations, and tools inherit behaviors and attributes from the appropriate class hierarchies for the part, the manufacturing operations, and the tooling classes. A detailed example is presented illustrating the operation planning search algorithm. Scripts are generated by the individual machining operations for execution on a machine tool. Tooling information is maintained in an object-oriented database through the FAR libraries for Common LISP. Examples of particular process plans show that the inherent trade-offs between specified precision and machining time can be investigated. An Open Architecture Machine Tool (MOSAIC-PM) has been used to machine the parts created by the feature-based design and planning system. The novel contributions of this paper relate to the demonstration of “seamless” links between a) design, b) planning, and c) actual fabrication by milling.
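The feature-to-operation mapping described above can be illustrated with a short Python sketch in which feature classes plan their own operation sequences, echoing the object-oriented structure of the cited system (the original used Common LISP; the feature types, tools, and parameters here are invented for illustration):

from dataclasses import dataclass

# Illustrative sketch: each feature class knows how to plan its own
# machining operations, mirroring the class-hierarchy approach.

@dataclass
class Tool:
    name: str
    diameter_mm: float


class Feature:
    def plan(self):
        raise NotImplementedError


@dataclass
class Pocket(Feature):
    depth_mm: float
    width_mm: float

    def plan(self):
        tool = Tool("flat_end_mill", min(self.width_mm / 2, 12.0))
        return [("rough_pocket", tool), ("finish_walls", tool)]


@dataclass
class Hole(Feature):
    diameter_mm: float
    depth_mm: float

    def plan(self):
        return [("center_drill", Tool("center_drill", 3.0)),
                ("drill", Tool("twist_drill", self.diameter_mm))]


if __name__ == "__main__":
    part = [Pocket(depth_mm=10, width_mm=30), Hole(diameter_mm=8, depth_mm=20)]
    for feature in part:
        for op_name, tool in feature.plan():
            print(f"{type(feature).__name__}: {op_name} with {tool.name} ({tool.diameter_mm} mm)")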
APA, Harvard, Vancouver, ISO, and other styles
10

Gribel Ito, Luana, Mariana Helena Inês Moreira, Sarah Brandão Souza, Sinara Pimenta Medeiros, and Phyllipe Lima. "What are the Top Used Modules in Python Open-Source Projects?" In Computer on the Beach. Itajaí: Universidade do Vale do Itajaí, 2022. http://dx.doi.org/10.14210/cotb.v13.p037-044.

Full text
Abstract:
When a team of developers is creating new software, they most likely will use libraries of code that can assist with a given required feature. One source for finding these libraries can be popular question-answer websites, blogs, personal web pages, and tools that can automatically suggest libraries. Popularity might be one criterion that developers use when choosing a library. In this work, we performed an empirical evaluation by mining Python projects hosted on GitHub to identify the most popular modules used. We selected 129 projects based on specific criteria, one of them being the number of stars, which reflects their popularity. To automate the data extraction process, we developed PySniffer, an open-source tool that performs static code analysis of Python scripts, checking which modules from both the standard library and external packages are used in a project. Our tool also has a front-end that can display the data in a friendlier way with statistical information. As a result, we generated a list of the top used modules in Python projects hosted on GitHub, serving as complementary information alongside the most popular libraries reported in personal blogs and websites.
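The general static-analysis technique the tool relies on, parsing Python sources and counting top-level imports, can be sketched with the standard-library ast module. This is an illustration of the approach, not the PySniffer implementation itself:

import ast
import sys
from collections import Counter

# Illustrative sketch: statically parse Python files and count which
# top-level modules they import, without executing any of the code.

def modules_in_file(path):
    """Yield top-level module names imported in a single Python source file."""
    with open(path, "r", encoding="utf-8") as f:
        tree = ast.parse(f.read(), filename=path)
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):
            for alias in node.names:
                yield alias.name.split(".")[0]
        elif isinstance(node, ast.ImportFrom) and node.module and node.level == 0:
            yield node.module.split(".")[0]


if __name__ == "__main__":
    counts = Counter()
    for path in sys.argv[1:]:
        counts.update(modules_in_file(path))
    for module, count in counts.most_common(10):
        print(f"{module}: {count}")

Aggregating these per-file counts across many repositories is what produces a "top used modules" ranking like the one reported in the paper.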
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Scripts generated"

1

Wenren, Yonghu, Joon Lim, Luke Allen, Robert Haehnel, and Ian Dettwiler. Helicopter rotor blade planform optimization using parametric design and multi-objective genetic algorithm. Engineer Research and Development Center (U.S.), December 2022. http://dx.doi.org/10.21079/11681/46261.

Full text
Abstract:
In this paper, an automated framework is presented to perform helicopter rotor blade planform optimization. This framework contains three elements, Dakota, ParBlade, and RCAS. These elements are integrated into an environment control tool, Galaxy Simulation Builder, which is used to carry out the optimization. The main objective of this work is to conduct rotor performance design optimizations for forward flight and hover. The blade design variables manipulated by ParBlade are twist, sweep, and anhedral. The multi-objective genetic algorithm method is used in this study to search for the optimum blade design; the optimization objective is to minimize the rotor power required. Following design parameter substitution, ParBlade generates the modified blade shape and updates the rotor blade properties in the RCAS script before running RCAS. After the RCAS simulations are complete, the desired performance metrics (objectives and constraints) are extracted and returned to the Dakota optimizer. Demonstrative optimization case studies were conducted using a UH-60A main rotor as the base case. Rotor power in hover and in forward flight, at advance ratio 𝜇 = 0.3, is used as the objective functions. The results of this study show improvements in rotor power of 6.13% in hover and 8.52% at an advance ratio of 0.3, respectively. This configuration also yields greater reductions in rotor power for high advance ratios, e.g., a 12.42% reduction at 𝜇 = 0.4.
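The parameter-substitution step in such a loop, writing design variables into a solver input file and reading a performance metric back for the optimizer, can be sketched in a few lines of Python. The template keys, file names, and result format below are placeholders, not the ParBlade/RCAS interface:

from string import Template

# Illustrative sketch: substitute blade design variables into a solver input
# template, then parse a rotor-power value from a simple results file.

INPUT_TEMPLATE = Template(
    "blade_twist_deg = $twist\n"
    "blade_sweep_deg = $sweep\n"
    "blade_anhedral_deg = $anhedral\n"
)


def write_case(path, twist, sweep, anhedral):
    """Write one candidate blade design as a key=value input file."""
    with open(path, "w") as f:
        f.write(INPUT_TEMPLATE.substitute(twist=twist, sweep=sweep, anhedral=anhedral))


def read_rotor_power(result_path):
    """Pick the rotor power value out of a 'key value' results file."""
    with open(result_path) as f:
        for line in f:
            key, _, value = line.partition(" ")
            if key == "rotor_power_hp":
                return float(value)
    raise ValueError("rotor power not found in results")


if __name__ == "__main__":
    write_case("blade_case.inp", twist=-12.0, sweep=20.0, anhedral=-3.0)
    # Stand-in for a completed solver run producing a results file.
    with open("blade_results.out", "w") as f:
        f.write("rotor_power_hp 2450.7\n")
    print("objective:", read_rotor_power("blade_results.out"))

In the published workflow, an optimizer such as Dakota proposes the twist, sweep, and anhedral values, and the extracted rotor power is returned to it as the objective to minimize.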
APA, Harvard, Vancouver, ISO, and other styles