Academic literature on the topic 'Heterogeneous Environments Present Trade-offs'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Heterogeneous Environments Present Trade-offs.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Heterogeneous Environments Present Trade-offs"

1

García, Sebastián, Diego F. Larios, Julio Barbancho, Enrique Personal, Javier M. Mora-Merchán, and Carlos León. "Heterogeneous LoRa-Based Wireless Multimedia Sensor Network Multiprocessor Platform for Environmental Monitoring." Sensors 19, no. 16 (August 7, 2019): 3446. http://dx.doi.org/10.3390/s19163446.

Full text
Abstract:
Data acquisition in protected natural environments is constrained by the need to avoid stressing the life-forms present in that environment. Researchers therefore face two conflicting interests: autonomous and robust systems that minimize physical interaction with sensors once installed, and systems complex enough to capture and process higher volumes of data. Given this situation, this paper analyses the current state of the art of wireless multimedia sensor networks, identifying the limitations and needs of these solutions. To improve the trade-off between autonomy and computational capability, the paper proposes a heterogeneous multiprocessor sensor platform, consisting of an ultra-low-power microcontroller and a high-performance processor, which transfers control between processors as needed. This architecture allows idle systems to be shut down and supports fail-safe remote reprogramming. The sensor equipment can be adapted to the needs of the project; the deployed equipment incorporates, in addition to environmental meteorological variables, a microphone input and two cameras (visible and thermal) to capture multimedia data. Beyond the hardware description, the paper briefly describes how LoRa (long range) communication can be used to send large messages (such as an image or new firmware), presents an economic analysis of the platform, and studies its energy consumption under different use cases.
APA, Harvard, Vancouver, ISO, and other styles
2

Stevens, James D., and Andreas Klöckner. "A mechanism for balancing accuracy and scope in cross-machine black-box GPU performance modeling." International Journal of High Performance Computing Applications 34, no. 6 (June 3, 2020): 589–614. http://dx.doi.org/10.1177/1094342020921340.

Full text
Abstract:
The ability to model, analyze, and predict execution time of computations is an important building block that supports numerous efforts, such as load balancing, benchmarking, job scheduling, developer-guided performance optimization, and the automation of performance tuning for high performance, parallel applications. In today’s increasingly heterogeneous computing environment, this task must be accomplished efficiently across multiple architectures, including massively parallel coprocessors like GPUs, which are increasingly prevalent in the world’s fastest supercomputers. To address this challenge, we present an approach for constructing customizable, cross-machine performance models for GPU kernels, including a mechanism to automatically and symbolically gather performance-relevant kernel operation counts, a tool for formulating mathematical models using these counts, and a customizable parameterized collection of benchmark kernels used to calibrate models to GPUs in a black-box fashion. With this approach, we empower the user to manage trade-offs between model accuracy, evaluation speed, and generalizability. A user can define their own model and customize the calibration process, making it as simple or complex as desired, and as application-targeted or general as desired. As application examples of our approach, we demonstrate both linear and nonlinear models; these examples are designed to predict execution times for multiple variants of a particular computation: two matrix-matrix multiplication variants, four discontinuous Galerkin differentiation operation variants, and two 2D five-point finite difference stencil variants. For each variant, we present accuracy results on GPUs from multiple vendors and hardware generations. We view this highly user-customizable approach as a response to a central question arising in GPU performance modeling: how can we model GPU performance in a cost-explanatory fashion while maintaining accuracy, evaluation speed, portability, and ease of use, an attribute that we believe precludes approaches requiring manual collection of kernel or hardware statistics?
APA, Harvard, Vancouver, ISO, and other styles
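To make the abstract's idea of a linear, operation-count-based performance model concrete, the sketch below shows a least-squares calibration of per-operation costs from measured kernel timings. This is only an illustration of the general technique, not the paper's tooling: the operation-count columns, the timings, and the coefficient names are hypothetical placeholders.

```python
import numpy as np

# Hypothetical per-kernel operation counts gathered for several benchmark kernels:
# columns = [global-memory bytes moved, floating-point ops, synchronizations].
op_counts = np.array([
    [1.2e9, 4.0e9, 2.0e4],
    [3.5e8, 9.0e8, 5.0e3],
    [2.1e9, 1.6e9, 1.0e4],
    [7.0e8, 2.4e9, 8.0e3],
])

# Measured execution times (seconds) for the same kernels on one target GPU.
measured_times = np.array([0.042, 0.011, 0.055, 0.021])

# Calibrate a linear model t = c_mem * bytes + c_flop * flops + c_sync * syncs
# by least squares; the coefficients act as per-unit "costs" for this GPU.
coeffs, *_ = np.linalg.lstsq(op_counts, measured_times, rcond=None)

# Predict the execution time of a new kernel variant from its counts alone.
new_kernel_counts = np.array([1.0e9, 3.0e9, 1.5e4])
predicted_time = new_kernel_counts @ coeffs
print(f"predicted execution time: {predicted_time:.4f} s")
```

Swapping the least-squares fit for a nonlinear regression, or changing which counts are collected, is where the accuracy-versus-generality trade-off discussed in the abstract shows up.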
3

Schulte, Ingrid, Juliana Eggers, Jonas Ø. Nielsen, and Sabine Fuss. "What influences the implementation of natural climate solutions? A systematic map and review of the evidence." Environmental Research Letters 17, no. 1 (December 30, 2021): 013002. http://dx.doi.org/10.1088/1748-9326/ac4071.

Full text
Abstract:
Emerging research points to large greenhouse gas mitigation opportunities for activities that are focused on the preservation and maintenance of ecosystems, also known as natural climate solutions (NCS). Despite large quantifications of the potential biophysical and carbon benefits of these activities, these estimates hold large uncertainties and few capture the socio-economic bounds. Furthermore, the uptake of NCS remains slow and information on the enabling factors needed for successful implementation, co-benefits, and trade-offs of these activities remains underrepresented at scale. As such, we present a systematic review that synthesizes and maps the bottom-up evidence on the contextual factors that influence the implementation of NCS in the peer-reviewed literature. Drawing from a large global collection of (primarily case study-based, N = 211) research, this study (1) clarifies the definition of NCS, including in the context of nature-based solutions and other ecosystem-based approaches to addressing climate change; (2) provides an overview of the current state of the literature, including research trends, opportunities, gaps, and biases; and (3) critically reflects on factors that may affect implementation in different geographies. We find that the content of the reviewed studies overwhelmingly focuses on tropical regions and activities in forest landscapes. We observe that implementation of NCS relies not on one factor but on a suite of interlinked enabling factors. Specifically, engagement of indigenous peoples and local communities, performance-based finance, and technical assistance are important drivers of NCS implementation. While the broad categories of factors mentioned in the literature are similar across regions, the combination of factors and how and for whom they are taken up remains heterogeneous globally, and even within countries. Thus, our results highlight the need to better understand what trends may be generalizable to inform best practices in policy discussions and where more nuance may be needed for interpreting research findings and applying them outside of their study contexts.
APA, Harvard, Vancouver, ISO, and other styles
4

Viviani, Marco, Nadia Bennani, and Elöd Egyed-Zsigmond. "G-Profile." Information Resources Management Journal 25, no. 3 (July 2012): 61–77. http://dx.doi.org/10.4018/irmj.2012070103.

Full text
Abstract:
In the digital world, many organizations are developing different applications (with different purposes) where users are generally represented by a heterogeneous set of attributes. From time to time, depending on the context, different attributes can provide different digital identities for the same user, often involved in the identification/authentication processes. From the perspective of personalized service provision, the scope of identity management becomes much larger, and takes into account information susceptible to change, such as user profile information as a whole. Many purely user-centric identity management systems have emerged in the last few years, among them the Higgins project, which provides the user with direct control over his/her data and covers some data security issues. However, a complete user-centric view of extended user identity management is not realistic, in our opinion. In this paper, the authors present G-Profile: a hybrid, open, general-purpose and flexible user modeling system for extended identity management in multi-application environments. G-Profile also tackles the trade-off between users’ and applications’ requirements.
APA, Harvard, Vancouver, ISO, and other styles
5

Ostrak, Andre, Jaak Randmets, Ville Sokk, Sven Laur, and Liina Kamm. "Implementing Privacy-Preserving Genotype Analysis with Consideration for Population Stratification." Cryptography 5, no. 3 (August 20, 2021): 21. http://dx.doi.org/10.3390/cryptography5030021.

Full text
Abstract:
In bioinformatics, genome-wide association studies (GWAS) are used to detect associations between single-nucleotide polymorphisms (SNPs) and phenotypic traits such as diseases. Significant differences in SNP counts between case and control groups can signal association between variants and phenotypic traits. Most traits are affected by multiple genetic locations. To detect these subtle associations, bioinformaticians need access to more heterogeneous data. Regulatory restrictions in cross-border health data exchange have created a surge in research on privacy-preserving solutions, including secure computing techniques. However, in studies of such scale, one must account for population stratification, as under- and over-representation of sub-populations can lead to spurious associations. We improve on the state of the art of privacy-preserving GWAS methods by showing how to adapt principal component analysis (PCA) with stratification control (EIGENSTRAT), FastPCA, EMMAX and the genomic control algorithm for secure computing. We implement these methods using secure computing techniques—secure multi-party computation (MPC) and trusted execution environments (TEE). Our algorithms are the most complex ones at this scale implemented with MPC. We present performance benchmarks and a security and feasibility trade-off discussion for both techniques.
APA, Harvard, Vancouver, ISO, and other styles
6

Atkins, Justine L., George L. W. Perry, and Todd E. Dennis. "Effects of mis-alignment between dispersal traits and landscape structure on dispersal success in fragmented landscapes." Royal Society Open Science 6, no. 1 (January 2019): 181702. http://dx.doi.org/10.1098/rsos.181702.

Full text
Abstract:
Dispersal is fundamental to population dynamics and hence extinction risk. The dispersal success of animals depends on the biophysical structure of their environments and their biological traits; however, comparatively little is known about how evolutionary trade-offs among suites of biological traits affect dispersal potential. We developed a spatially explicit agent-based simulation model to evaluate the influence of trade-offs among a suite of biological traits on the dispersal success of vagile animals in fragmented landscapes. We specifically chose traits known to influence dispersal success: speed of movement, perceptual range, risk of predation, need to forage during dispersal, and amount of suitable habitat required for successful settlement in a patch. Using the metric of relative dispersal success rate, we assessed how the costs and benefits of evolutionary investment in these biological traits varied with landscape structure. In heterogeneous environments with low habitat availability and scattered habitat patches, individuals with more equal allocation across the trait spectrum dispersed most successfully. Our analyses suggest that the dispersal success of animals in heterogeneous environments is highly dependent on hierarchical interactions between trait trade-offs and the geometric configurations of the habitat patches in the landscapes through which they disperse. In an applied sense, our results indicate potential for ecological mis-alignment between species' evolved suites of dispersal-related traits and altered environmental conditions as a result of rapid global change. In many cases identifying the processes that shape patterns of animal dispersal, and the consequences of abiotic changes for these processes, will require consideration of complex relationships among a range of organism-specific and environmental factors.
APA, Harvard, Vancouver, ISO, and other styles
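As a rough, hypothetical illustration of the trait trade-off idea in the abstract above, the sketch below gives a dispersing agent a fixed investment budget split between movement speed and perceptual range, then measures settlement success on a randomly fragmented grid. It is not the authors' spatially explicit model; the grid size, budget, energy cost, and settlement rule are invented for demonstration only.

```python
import random

GRID, N_PATCHES, BUDGET, ENERGY = 50, 30, 6, 200

def simulate(speed, perception, trials=500):
    """Fraction of random-walk dispersers that find habitat before exhausting energy."""
    assert speed + perception == BUDGET, "traits share a fixed investment budget"
    successes = 0
    for _ in range(trials):
        # A scattered set of habitat cells in an otherwise hostile matrix.
        habitat = {(random.randrange(GRID), random.randrange(GRID))
                   for _ in range(N_PATCHES)}
        x = y = GRID // 2
        energy = ENERGY
        while energy > 0:
            # Settle if any habitat cell lies within the perceptual range.
            if any(abs(hx - x) + abs(hy - y) <= perception for hx, hy in habitat):
                successes += 1
                break
            # Otherwise take a random step whose length is set by speed;
            # faster movement costs more energy per step.
            dx, dy = random.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
            x, y = (x + dx * speed) % GRID, (y + dy * speed) % GRID
            energy -= speed
    return successes / trials

# Compare allocations along the trait spectrum (all sum to the same budget).
for speed in range(1, BUDGET):
    rate = simulate(speed, BUDGET - speed)
    print(f"speed={speed}, perception={BUDGET - speed}: success rate {rate:.2f}")
```

Re-running the sweep with different GRID and N_PATCHES values is one way to explore how the most successful allocation shifts with habitat availability and configuration.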
7

Hao, Guang-You, Mary E. Lucero, Stewart C. Sanderson, Elizabeth H. Zacharias, and N. Michele Holbrook. "Polyploidy enhances the occupation of heterogeneous environments through hydraulic related trade-offs in Atriplex canescens (Chenopodiaceae)." New Phytologist 197, no. 3 (December 3, 2012): 970–78. http://dx.doi.org/10.1111/nph.12051.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

López-Matencio, Pablo, Javier Vales-Alonso, and Juan J. Alcaraz. "LBTM: Listen-before-Talk Protocol for Multiclass UHF RFID Networks." Sensors 20, no. 8 (April 18, 2020): 2313. http://dx.doi.org/10.3390/s20082313.

Full text
Abstract:
Radio Frequency Identification (RFID) is considered one of the pioneering technologies of the Internet of Things (IoT). It makes it possible to bind physical environments to information processing systems, adding new capabilities like automatic inventorying, location, or sensing with batteryless tags. Indeed, many data flows of physical objects can be tracked using this technology, and it is common to find heterogeneous traffics present in the same facility, each managed by different sets of readers. For example, in a grocery store, typically we have two kinds of readers: those carrying out a continuous inventory, whose goal is knowing the contents of the shelves as accurately as possible; and a set of checking-out readers at exit gates for the billing process that has to minimize the waiting time of customers. Another example of multiclass traffic is a hospital, where new families of sensing tags allow staff to wirelessly monitor patients—which obviously must be done as a priority—and coexist with other readers aimed at precisely knowing the location of equipment or drugs. Even with the same goal, there could be readers requiring different setups; for example, in the hospital case, readers located at doors for inventorying purposes have a short time available to identify passing-by objects or people, and thus they have to work with a higher priority than regular readers performing inventorying tasks. In this work, we investigate a modification of the standard listen-before-talk (LBT) protocol for RFID networks which can support this kind of multipriority environment, by offering different qualities of service to each traffic. Results demonstrate that by tuning the protocol setup, it is possible to establish a trade-off between the performance of each traffic. This is shown for the two cited examples, the grocery shop and the hospital, using a simulation tool allowing us to implement a full-scale RFID model. In addition, we present a greedy mechanism for online reader setup. Instead of selecting a hard priority level offline, this greedy algorithm is able to adapt the priority to achieve the required quality-of-service (QoS) level.
APA, Harvard, Vancouver, ISO, and other styles
9

Liu, Miao, Helena Korpelainen, and Chunyang Li. "Sexual differences and sex ratios of dioecious plants under stressful environments." Journal of Plant Ecology 14, no. 5 (April 20, 2021): 920–33. http://dx.doi.org/10.1093/jpe/rtab038.

Full text
Abstract:
Dioecious plants exhibit sexual dimorphism in both sexual features (reproductive organs) and secondary sex characteristics (vegetative traits). Sexual differences in secondary traits, including morphological, physiological and ecological characters, have been commonly associated with trade-offs between the cost of reproduction and other plant functions. Such trade-offs may be modified by environmental stressors, although there is evidence that sexually dimorphic responses to stress do not always exist in all plant species. When sexual dimorphism exists, sexually different responses appear to depend on the species and stress types. Yet, further studies on dioecious plant species are needed to allow the generalization of stress effects on males and females. Additionally, sexual dimorphism may influence the frequency and distribution of the sexes along environmental gradients, likely causing niche differentiation and spatial segregation of sexes. At present, the causes and mechanisms governing sex ratio biases are poorly understood. This review aims to discuss sex-specific responses and sex ratio biases occurring under adverse conditions, which will advance our knowledge of sexually dimorphic responses to environmental stressors.
APA, Harvard, Vancouver, ISO, and other styles
10

Dutta, Anik, Fanny E. Hartmann, Carolina Sardinha Francisco, Bruce A. McDonald, and Daniel Croll. "Mapping the adaptive landscape of a major agricultural pathogen reveals evolutionary constraints across heterogeneous environments." ISME Journal 15, no. 5 (January 15, 2021): 1402–19. http://dx.doi.org/10.1038/s41396-020-00859-w.

Full text
Abstract:
The adaptive potential of pathogens in novel or heterogeneous environments underpins the risk of disease epidemics. Antagonistic pleiotropy or differential resource allocation among life-history traits can constrain pathogen adaptation. However, we lack understanding of how the genetic architecture of individual traits can generate trade-offs. Here, we report a large-scale study based on 145 global strains of the fungal wheat pathogen Zymoseptoria tritici from four continents. We measured 50 life-history traits, including virulence and reproduction on 12 different wheat hosts and growth responses to several abiotic stressors. To elucidate the genetic basis of adaptation, we used genome-wide association mapping coupled with genetic correlation analyses. We show that most traits are governed by polygenic architectures and are highly heritable, suggesting that adaptation proceeds mainly through allele frequency shifts at many loci. We identified negative genetic correlations among traits related to host colonization and survival in stressful environments. Such genetic constraints indicate that pleiotropic effects could limit the pathogen’s ability to cause host damage. In contrast, adaptation to abiotic stress factors was likely facilitated by synergistic pleiotropy. Our study illustrates how comprehensive mapping of life-history trait architectures across diverse environments allows us to predict evolutionary trajectories of pathogens confronted with environmental perturbations.
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Heterogeneous Environments Present Trade-offs"

1

Martin, Graham R. The Sensory Ecology of Birds. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780199694532.001.0001.

Full text
Abstract:
The natural world contains a huge amount of constantly changing information. Limitations on, and specializations within, sensory systems mean that each species receives only a small part of that information. In essence, information is filtered by sensory systems. Sensory ecology aims to understand the nature and functions of those filters for each species and sensory system. Fluxes of information, and the perceptual challenges posed by different natural environments, are so large that sensory and behavioural specializations have been inevitable. There have been many trade-offs in the evolution of sensory capacities, and trade-offs and complementarity between different sensory capacities within species. Many behavioural tasks may have influenced the evolution of sensory capacities in birds, but the principal drivers have been associated with just two tasks: foraging and predator detection. The key task is the control of the position and timing of the approach of the bill towards a target. Other tasks, such as locomotion and reproduction, are achieved within the requirements of foraging and predator detection. Information that guides behaviours may often be sparse and partial, and key behaviours may only be possible because of cognitive abilities which allow adequate interpretation of partial information. Human modifications of natural environments present perceptual challenges that cannot always be met by the information available to particular birds. Mitigations of the negative effects of human intrusions into natural environments must take account of the sensory ecology of the affected species. Effects of environmental changes cannot be understood sufficiently by viewing them through the filters of human sensory systems.
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Heterogeneous Environments Present Trade-offs"

1

Shlomo Agon, Sivan. "Extending the Analysis." In International Adjudication on Trial, 265–311. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780198788966.003.0013.

Full text
Abstract:
The present chapter extends the goal-based analytic framework applied in Parts II and III of the book to an additional category of disputes filed with the World Trade Organization (WTO) Dispute Settlement System (DSS)—those reflecting the growing friction between the WTO’s multilateral trade regime and the network of regional trade agreements (RTAs) proliferating around the globe. Looking at a series of prominent RTA-related cases that came before the WTO DSS, the extensive analysis carried out in this chapter shows that the dynamic reality of goal shifts and goal conflicts experienced within the DSS is not unique to 'trade-and' and perennial disputes. Similar processes can be observed in the histories of other classes of WTO disputes, an analysis of which is likely to disclose different DSS goal-attainment patterns evidencing new goal priorities and trade-offs, and resulting in varying dimensions of judicial effectiveness and ineffectiveness, adjusted to the new operational environments.
APA, Harvard, Vancouver, ISO, and other styles
2

Ratnasingam, Pauline. "A Security Framework for E-Marketplace Participation." In Encyclopedia of Multimedia Technology and Networking, Second Edition, 1272–83. IGI Global, 2009. http://dx.doi.org/10.4018/978-1-60566-014-1.ch172.

Full text
Abstract:
The increasing trend in the use of Internet-based e-marketplace applications has created tremendous opportunities for businesses to achieve effective supply chain management. An electronic market exists when a supplier provides goods and services to a customer in a transaction partially or fully automated by information technology. E-Marketplaces can be defined as a digital infrastructure that supports industrial commerce, such as auctions, catalogues and exchanges (Ivang & Sorenson, 2005). IDC predicts IT and e-marketplace spending will reach $496.7 billion in the U.S. and $1.3 trillion globally by the year 2009. Despite extensive research on this topic, there has been limited work in the realm of e-marketplace security. These e-marketplaces are generally implemented on the Internet, whose original purpose was to provide a robust heterogeneous distributed computing environment for applications that may not yet be developed. Previous researchers have noted that the formation of electronic marketplaces has been declining and that the failure rates are high. For instance, Dai and Kauffman (2002) suggest that only one-fifth of the electronic marketplaces in operation would succeed since firms have to face serious technical challenges. Theoretically, e-marketplaces should enable firms to trade and collaborate more efficiently. The reason for this is the proliferation of affordable technology and the explosive growth of B2B transactions that have allowed buyers and sellers to conduct transactions electronically and to generate substantial savings and revenue for participants and owners (Sharifi, Kehoe, & Hopkins, 2006). Nevertheless, in reality, many e-marketplaces disappeared during the major consolidation phase (Tran, 2006). This study aims to examine the nature of security in e-marketplaces. We identify four types of risks, namely economic, technological, implementation, and relational risks in seven e-marketplace firms from a cross-section of different industries. We then present the control measures, that is, the responses that the seven firms enforced in order to reduce and manage their risks. The contribution of this study is the development of a security framework based on the findings for e-marketplace participation.
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Heterogeneous Environments Present Trade-offs"

1

Baughman, Matt, Nathaniel Hudson, Ian Foster, and Kyle Chard. "Balancing Federated Learning Trade-Offs for Heterogeneous Environments." In 2023 IEEE International Conference on Pervasive Computing and Communications Workshops and other Affiliated Events (PerCom Workshops). IEEE, 2023. http://dx.doi.org/10.1109/percomworkshops56833.2023.10150228.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Mycek, Andrzej, Daniel Grzonka, and Jacek Tchorzewski. "Agent-Based Simulation And Analysis Of Infrastructure-As-Code Process To Build And Manage Cloud Environment." In 37th ECMS International Conference on Modelling and Simulation. ECMS, 2023. http://dx.doi.org/10.7148/2023-0513.

Full text
Abstract:
Widespread cloud systems present new challenges time and time again. An essential element of such environments is their management. The Infrastructure as Code model has been gaining popularity for some time. In the work presented here, we have proposed an agent-based approach to process execution within the Infrastructure as Code approach and have performed several numerical experiments. The work also includes an original formal agent model of the system. The results obtained allow us to evaluate trade-offs regarding computational demand and utilization.
APA, Harvard, Vancouver, ISO, and other styles
3

Piacenza, Joseph, Irem Y. Tumer, Christopher Hoyle, and John Fields. "Power Grid System Design Optimization Considering Renewable Energy Strategies and Environmental Impact." In ASME 2012 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/imece2012-88451.

Full text
Abstract:
The North American power grid is a highly heterogeneous and dispersed complex system that has been constructed ad hoc over the past century. Large-scale propagating system failures have remained constant over the past 30 years as the rising population and its energy-centric culture continue to drive increases in energy demand. In addition, various types of energy generation strategies, including renewables, continue to have negative effects on the environment. This paper presents a methodology for high-level system optimization of a power grid, capturing annual cost, energy use, and environmental impact for use during early design trade studies. A model has been created to explore the system state of a power grid based on various types of energy generation, including both fossil fuel and renewable strategies. In addition, energy conservation practices for commercial and residential applications are explored as an alternative solution to meet predicted demand. A component for incorporating design trades within the model has been developed to analyze the feasibility of trading surplus energy between interconnections as a means to address issues with excess generation and mitigate the need for additional generation. The result is a set of Pareto-optimal solutions, considering both cost and environmental impact, that meet predicted energy demand constraints.
APA, Harvard, Vancouver, ISO, and other styles
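To illustrate the Pareto-front construction at the core of the abstract above, here is a small, self-contained sketch that enumerates candidate generation mixes, keeps those meeting a demand constraint, and filters out dominated designs on the two objectives (annual cost and CO2). The technology list and the per-GWh figures are invented placeholders, not data from the paper.

```python
from itertools import product

DEMAND = 100.0   # required annual energy, GWh (placeholder value)
TECH = {         # name: (cost per GWh, tonnes CO2 per GWh), placeholder figures
    "coal":  (40.0, 980.0),
    "gas":   (55.0, 430.0),
    "wind":  (70.0,  12.0),
    "solar": (85.0,  45.0),
}

def evaluate(mix):
    """Return (total cost, total emissions) for a generation mix given in GWh."""
    cost = sum(gwh * TECH[t][0] for t, gwh in mix.items())
    co2 = sum(gwh * TECH[t][1] for t, gwh in mix.items())
    return cost, co2

# Candidate designs: each technology contributes 0-100 GWh in steps of 20.
levels = range(0, 101, 20)
candidates = []
for alloc in product(levels, repeat=len(TECH)):
    mix = dict(zip(TECH, alloc))
    if sum(mix.values()) >= DEMAND:          # meet predicted demand
        candidates.append((evaluate(mix), mix))

# A design is Pareto-optimal if no other feasible design is at least as good
# on both objectives and strictly better on one.
pareto = [
    (obj, mix) for obj, mix in candidates
    if not any(o[0] <= obj[0] and o[1] <= obj[1] and o != obj
               for o, _ in candidates)
]

for (cost, co2), mix in sorted(pareto, key=lambda p: p[0]):
    print(f"cost={cost:7.1f}  CO2={co2:8.1f}  mix={mix}")
```

The study described above works with a richer design space (conservation practices, surplus-energy trades between interconnections), but the dominance filter that yields a Pareto set works in the same way.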
4

Kane, S. C., S. Croft, P. McClay, R. Venkataraman, and M. F. Villani. "Relative Performance of a TGS for the Assay of Drummed Waste as Function of Collimator Opening." In The 11th International Conference on Environmental Remediation and Radioactive Waste Management. ASMEDC, 2007. http://dx.doi.org/10.1115/icem2007-7174.

Full text
Abstract:
Improving the safety, accuracy and overall cost effectiveness of the processes and methods used to characterize and handle radioactive waste is an on-going mission for the nuclear industry. An important contributor to this goal is the development of superior non-destructive assay instruments. The Tomographic Gamma Scanner (TGS) is a case in point. The TGS combines low spatial resolution experimental computed tomography (CT) linear attenuation coefficient maps with three-dimensional high-energy resolution single photon emission reconstructions. The results are presented as quantitative matrix attenuation corrected images and assay values for gamma-emitting radionuclides. Depending on a number of operational factors, this extends the diversity of waste forms that can be assayed, to a given accuracy, to items containing more heterogeneous matrix distributions and less uniform emission activity distributions. Recent advances have significantly extended the capability to a broader range of matrix density and to a wider dynamic range of surface dose rate. Automated systems sense the operational conditions, including the container type, and configure themselves accordingly. The TGS also provides a flexible data acquisition platform and can be used to perform far-field style measurements, classical segmented gamma scanner measurements, or to implement hybrid methods, such as reconstructions that use a priori knowledge to constrain the image reconstruction or the underlying energy dependence of the attenuation. A single, yet flexible, general purpose instrument of this kind adds several tiers of strategic and tactical value to facilities challenged by diverse and difficult-to-assay waste streams. The TGS is still in the early phase of industrial uptake. There are only a small number of general purpose TGS systems operating worldwide, most being configured to automatically select between a few configurations appropriate for routine operations. For special investigations, one may wish to widen the repertoire, but there is currently little guidance as to the trade-offs involved. In this work, we address this weakness by studying the performance of a typical TGS arrangement as a function of collimator opening, scan pattern and scan time for a representative selection of simulated waste forms. Our focus is on assessing the impact on the precision and accuracy of the quantitative assay result, but we also report the utility of the imaging information in confirming acceptable knowledge about the packages.
APA, Harvard, Vancouver, ISO, and other styles
5

Lall, Pradeep, Ganesh Hariharan, Guoyun Tian, Jeff Suhling, Mark Strickland, and Jim Blanche. "Risk Management Models for Flip-Chip Electronics in Extreme Environments." In ASME 2006 International Mechanical Engineering Congress and Exposition. ASMEDC, 2006. http://dx.doi.org/10.1115/imece2006-15443.

Full text
Abstract:
In this work, risk-management and decision-support models for reliability prediction of flip chip packages in harsh environments have been presented. The models presented in this paper provide decision guidance for smart selection of component packaging technologies and perturbing product designs for minimal risk insertion of new packaging technologies. In addition, qualitative parameter interaction effects, which are often ignored in closed-form modeling, have been incorporated in this work. Previous studies have focused on development of modeling tools at sub-scale or component level. The tools are often available only in an offline manner for decision support and risk assessment of advanced technology programs. There is a need for a turnkey approach for making trade-offs between geometry and materials and quantitatively evaluating the impact on reliability. Multivariate linear regression and robust principal components regression methods were used for developing these models. The first approach uses the potentially important variables from stepwise regression, and the second approach uses the principal components obtained from the eigenvalues and eigenvectors for model building. Principal-component models have been included because of their added ability to address multi-collinearity. The statistical models are based on accelerated test data in harsh environments, while failure mechanics models are based on damage mechanics and material constitutive behavior. Statistical models developed in the present work are based on failure data collected from the published literature and an extensive accelerated test reliability database in harsh environments, collected by the Center of Advanced Vehicle Electronics. Sensitivity relations for geometry, materials, and architectures based on statistical models, failure mechanics-based closed-form models, and FEA models have been developed. Convergence of statistical, failure mechanics, and FEA-based model sensitivities with experimental data has been demonstrated.
APA, Harvard, Vancouver, ISO, and other styles
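The abstract above names principal components regression (PCR) as a remedy for multi-collinearity among predictors. The sketch below shows the generic PCR recipe (centre the predictors, project onto the leading components, regress on the scores) using randomly generated, correlated placeholder data; it is not the paper's model and does not use its accelerated-test database.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_features, n_components = 60, 5, 2

# Correlated placeholder predictors (think: die size, ball pitch, standoff height, ...).
base = rng.normal(size=(n_samples, 2))
X = base @ rng.normal(size=(2, n_features)) + 0.1 * rng.normal(size=(n_samples, n_features))
# Placeholder response (e.g., cycles to failure on a log scale).
y = X @ np.array([1.0, -0.5, 0.3, 0.0, 0.8]) + 0.2 * rng.normal(size=n_samples)

# 1. Centre the predictors and extract principal components via SVD.
X_centred = X - X.mean(axis=0)
_, _, vt = np.linalg.svd(X_centred, full_matrices=False)
components = vt[:n_components]            # top-k loading vectors

# 2. Regress the centred response on the component scores (ordinary least squares).
scores = X_centred @ components.T
beta, *_ = np.linalg.lstsq(scores, y - y.mean(), rcond=None)

# 3. Predict the response for a new design point by projecting it the same way.
x_new = X[0]
y_pred = y.mean() + ((x_new - X.mean(axis=0)) @ components.T) @ beta
print(f"predicted response: {y_pred:.3f} (actual {y[0]:.3f})")
```

Regressing on a handful of orthogonal scores rather than the raw, correlated predictors is what gives PCR its robustness to multi-collinearity, at the cost of discarding the variance carried by the trailing components.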
6

Booth, Joseph J. "Automatic selection of correlation filters by using expert networks." In OSA Annual Meeting. Washington, D.C.: Optica Publishing Group, 1992. http://dx.doi.org/10.1364/oam.1992.thpp3.

Full text
Abstract:
Optical correlators have successfully been integrated with missile guidance electronics to track and intercept cooperative targets in benign environments. Subsequent development of optical correlators stresses the use of realistic targets in cluttered backgrounds. This requirement results in the use of a large filter space representative of a wide class of targets. Therefore, initial selection of a correlation filter by the tracker is computationally expensive because of the large search space involved. A hybrid paradigm employing expert systems technology and neural networks can efficiently locate likely target regions in the missile's field of view and provide tentative identification of target type for use by the tracker. This presentation will introduce expert network architectures currently under consideration and compare their performance against test images with varying degrees of target and background complexity. Hardware utilization and processing latency will be evaluated for each network architecture in an attempt to present some of the design trade-offs inherent in the construction of these systems.
APA, Harvard, Vancouver, ISO, and other styles
7

Bach, Christian, Christian Drobny, and Martin Tajmar. "Trade-off between concurrent engineering software tools for utilisation in space education and beyond." In Symposium on Space Educational Activities (SSAE). Universitat Politècnica de Catalunya, 2022. http://dx.doi.org/10.5821/conference-9788419184405.125.

Full text
Abstract:
Concurrent engineering is an approach to the development of complex systems that is characterised by direct communication between the disciplines involved. Instead of processing the individual disciplines one after the other, as in sequential design, or processing via a single contact person, as in centralised design, all systems work simultaneously. Learning this interaction and understanding what information needs to be communicated between disciplines are among the central learning objectives of the course "Spacecraft Design" at Technische Universität Dresden, Institute of Aerospace Engineering. In this course, the students represent different disciplines and work out a mission study that is commissioned by the lecturers. The lecturers thus participate in the development process in the role of customers. Key to the concurrent engineering approach is that each discipline has access to the most current design data at all times. This can be done via a dedicated software solution. Both commercial and open-source software tools are available. Within the frame of the above-mentioned course, several tools have been tested. The covered software solutions comprise ESA Open Concurrent Design Tool (OCDT), RHEA Concurrent Design Platform (CDP), Valispace and IBM Rhapsody. This contribution presents the experience that we gathered with these concurrent engineering software tools. First, the tools are described and their commonalities and distinctions are highlighted. Subsequently, a detailed trade-off between the tools is presented. This trade-off will particularly focus on the utilisation of these tools within the scope of course work at universities, as this entails special requirements and boundary conditions, such as very limited time for introducing the software, a highly heterogeneous user group, and limited utilisation of the software in terms of depth and functionality, to name only a few. Within this contribution, we will also explore alternative approaches, such as using no software at all. The aim of this contribution is to offer other teachers and students some guidance for selecting a concurrent engineering software solution and implementing it in course work, in a way that using the tool itself does not become the central learning challenge of the course. The results might be of interest beyond university courses, as some requirements, like short times to get familiar with the software or certain interface requirements, also apply to other environments in research and development.
APA, Harvard, Vancouver, ISO, and other styles
8

Arroyos, Marina Roche, Javier Arturo Corea Araujo, Didac Sabria, Vinayak Padmaji, Pablo Cano, and Patrice Garmier. "Model based component co-optimization and scalability of virtual testing for electric drivetrain vehicle." In FISITA World Congress 2021. FISITA, 2021. http://dx.doi.org/10.46720/f2021-dgt-045.

Full text
Abstract:
"Within the automotive product development cycle virtual and heterogeneous testing is becoming increasingly established through component, module and vehicle-level simulation. Though a number of standards in this field have been established, models are still mostly created in a fragmented manner: using domain-specific tools to create, manage and execute simulations without standardization of the content of the functional interfaces (FMI does only standardize the format) and limited scalability. This fragmentation leads to a lot of redundant effort as models of the same component or system are re-created several times. HIFI-ELEMENTS project addressed this fragmentation through two main mechanisms: Firstly, developing, validating and publishing a recommendation for standardization of model interfaces for common e-drive components (e-machine, inverter, battery, DC/DC converter, thermal management) and implementation of compliant versions of existing models. Secondly, implementing a seamless workflow linking extended versions of existing tools with effort-saving automated methods for model parameterization and test case generation. This seamless integration will substantially increase the number of integrations and test cases that can be early validated through simulation, leading to optimized efficiency designs and development effort reduction. The standardization also guarantees scalability among fidelity levels, from concept design to XiL through detailed modelling. In this paper we present the results of the Use Case C: Component co-optimization. The purpose of use cases is the demonstration of the advantages of the standardized models and workflow industry relevant scenarios. The work content performed in the use case is very extensive and multidisciplinary. In the first step, the high fidelity models from the expert components developers were validated independently with automated testing tools and later integrated to create a complete vehicle architecture integration. The standardization permitted to seamlessly test several component variants developed within the project for the same architecture, including tens of motor models with different technologies, inverters and high voltage converters with different IGBT technology and various battery packs. This possibility was exploited through co-optimization with multi objective Genetic Algorithm, permitting to select the optimal component combination, powertrain architecture (with and without high voltage DCDC converter) and components parametrization considering the trade-off of consumption and performances. The optimized and baseline variants were used to demonstrate the scalability of the models to different simulation objectives. The model was co-simulated with a traffic simulation environment in order to evaluate the impact of eco-driving recommendation algorithms in a realistic driving situation. The optimized solution was also validated against a wide database of driving conditions including real driving cycles, performance and vehicle dynamics. Finally, the integrated models were seamlessly transferred to real time simulation platform for Model-in-the-Loop testing with a simulated 3D environment aimed at ADAS testing. Real-time capability demonstrates that next steps such as Driver-in-the-Loop and Hardware-in-the loop can be achieved smoothly. The extensive simulation activities performed in this use case demonstrate the benefits of the standard in models exchangeability and effort reduction in model based development. 
This project received funding from the European Union’s (EU) Horizon 2020 Research and innovation program under grant agreement N 769935."
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Heterogeneous Environments Present Trade-offs"

1

Martin, Ciaran. Five tests for risk-based approaches to national cybersecurity in resource-constrained environments. Digital Pathways at Oxford, April 2022. http://dx.doi.org/10.35489/bsg-dp-wp_2022/05.

Full text
Abstract:
While we cannot currently accurately specify what good cybersecurity looks like, we can analyse what good risk-based approaches to national cybersecurity should aim at achieving. This is particularly important in low- and middle-income countries operating in resource-constrained environments in the early stages of economic development and digitalisation. This paper, therefore, discusses key considerations for risk-based cybersecurity by investigating the trade-offs that decision-makers should address so that scarce resources are best deployed to fend off threats that are more likely to happen and cause significant harm. The analysis is presented in the form of five tests that can be used to analyse the robustness of risk-based cybersecurity when resources are limited and to think about the potential paths that nations can take as they grapple with various economic and digitalisation challenges. As such, this framework does not present an exhaustive list of all the fundamental components of a cybersecurity strategy, but rather analyses the most important trade-offs and challenges that a cybersecurity strategy should address.
APA, Harvard, Vancouver, ISO, and other styles