Academic literature on the topic 'PRECEDE framework'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'PRECEDE framework.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "PRECEDE framework"

1

Singh, Harjit, Rajiv Kumar Garg, and Anish Sachdeva. "Framework to precede collaboration in supply chain." Benchmarking: An International Journal 25, no. 8 (November 29, 2018): 2635–59. http://dx.doi.org/10.1108/bij-04-2017-0061.

Full text
Abstract:
Purpose: The purpose of this paper is to help supply chain (SC) decision makers successfully penetrate the global market through SC collaboration and strengthen their SC by understanding collaborative activities and how these activities are related to each other in the SC. Design/methodology/approach: This paper develops a set of collaborative activities from the literature; the resulting model, built using interpretive structural modeling (ISM) and MICMAC analysis, helps SC decision makers monitor their SC activities and take corrective actions to improve collaboration in their SC. Findings: This study reveals that collaborative activities increase the value of the whole SC. The various activities are modeled on the basis of “an activity influencing other activities” and “an activity influenced by other activities,” which is useful for SC managers when making decisions. Research limitations/implications: The current study is literature based; therefore, further explanation of the activities is needed to understand and implement SC collaboration in the service and manufacturing industries. Practical implications: The model of this study helps decision makers implement supply chain collaboration (SCC) and understand various SCC activities on the basis of their driving and dependence power. Originality/value: This research provides insight into the skills SC decision makers need to implement collaboration in the SC using ISM. The results of the study could be adopted to monitor an existing SCC program or to design a new collaboration program to meet global market requirements. To the best of our knowledge, there is no reference that discusses SC collaborative activities on the basis of their driving and dependence powers.
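For readers unfamiliar with ISM/MICMAC, the short sketch below illustrates how driving power and dependence power are typically computed from a reachability matrix. The activity names and the adjacency matrix are hypothetical placeholders, not data from the paper.

```python
# Illustrative ISM/MICMAC-style calculation of driving and dependence power.
# The activities and the influence matrix below are hypothetical examples.
import numpy as np

activities = ["information sharing", "joint planning", "resource sharing", "trust building"]

# adjacency[i][j] = 1 means activity i influences activity j (diagonal = self-influence)
adjacency = np.array([
    [1, 1, 0, 1],
    [0, 1, 1, 0],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
])

# Warshall's algorithm gives the final (transitively closed) reachability matrix
reach = adjacency.copy()
n = len(activities)
for k in range(n):
    for i in range(n):
        for j in range(n):
            reach[i, j] = 1 if reach[i, j] or (reach[i, k] and reach[k, j]) else 0

driving = reach.sum(axis=1)     # how many activities each activity reaches (influences)
dependence = reach.sum(axis=0)  # how many activities reach (influence) each activity

for name, drv, dep in zip(activities, driving, dependence):
    print(f"{name}: driving power = {drv}, dependence power = {dep}")
```

In MICMAC analysis these two scores are then plotted against each other to sort activities into driving, linkage, dependent and autonomous clusters.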
APA, Harvard, Vancouver, ISO, and other styles
2

Ribeiro Junior, Luiz Antonio, and Wiliam Ferreira da Cunha. "Nonadiabatic dynamics of injected holes in conjugated polymers." Physical Chemistry Chemical Physics 19, no. 15 (2017): 10000–10008. http://dx.doi.org/10.1039/c7cp00729a.

Full text
Abstract:
The dynamics of injected holes in short transient times that precede polaron formation is numerically studied in the framework of a tight-binding electron–phonon interacting approach aimed at describing organic one-dimensional lattices.
APA, Harvard, Vancouver, ISO, and other styles
3

Males, Jonathan R., and John H. Kerr. "Stress, Emotion, and Performance in Elite Slalom Canoeists." Sport Psychologist 10, no. 1 (March 1996): 17–36. http://dx.doi.org/10.1123/tsp.10.1.17.

Full text
Abstract:
This paper examines the relationship between precompetitive affect and performance, using elements of reversal theory (Apter, 1982): a conceptual framework that incorporates a full range of pleasant and unpleasant moods. Nine elite male slalom canoeists completed questionnaires prior to each event of a season that included the world championships. Results were analyzed using a time-series model to make comparisons of each subject’s best and worst performance of the season. Predicted variations in precompetitive levels of pleasant and unpleasant mood did not occur, despite variations in subsequent performances. As predicted, good performances were preceded by low discrepancies between felt and preferred arousal levels, but there was no support for the hypothesis that a large discrepancy between perceived stress and coping efforts would precede a poor performance.
APA, Harvard, Vancouver, ISO, and other styles
4

Taylor, S., A. Cairns, and B. Glass. "Application of the PRECEDE-PROCEED model for the development of a community pharmacy ear health intervention for rural populations." International Journal of Pharmacy Practice 29, Supplement_1 (March 26, 2021): i14–i15. http://dx.doi.org/10.1093/ijpp/riab016.018.

Full text
Abstract:
Introduction: The World Health Organisation has identified ear disease to be a major public health problem in rural and remote communities, with access to services an identified barrier. (1) Rural community pharmacists are recognised as highly skilled, accessible and trusted health professionals. An innovative service, “LISTEN UP” (Locally Integrated Screening and Testing Ear aNd aUral Program), has been implemented in two remote community pharmacies in Australia. The service involves patients with an ear complaint self-presenting to a participating pharmacy and receiving a clinical examination by a pharmacist who has completed accredited training in ear health, otoscopy and tympanometry. “LISTEN UP” has been developed using the PRECEDE-PROCEED planning model.(2) The PRECEDE component of the model assesses social, epidemiological, behavioural, environmental, educational and ecological factors to inform the development of an intervention.(2) The PROCEED component consists of pilot testing and evaluation. Aim: To describe an ecological approach to health promotion via the application of the PRECEDE-PROCEED planning model to develop a rural community pharmacy-based ear health intervention. Methods: PRECEDE (Predisposing, Reinforcing, and Enabling Constructs in Educational Diagnosis) provided a framework to plan and develop a locally relevant and community-focused program. This included research and engagement via meetings, surveys and interviews of consumers, pharmacists, health professionals and stakeholders. PROCEED (Policy, Regulatory, and Organisational Constructs in Educational and Environmental Development) outlined the structure for implementing and evaluating the intervention that was developed in the PRECEDE process. A pilot study has been included in the PROCEED segment to allow improvement before implementing and evaluating the final model. Data will be collected in the pilot study via semi-structured interviews and surveys; these will be analysed using descriptive statistics and thematic analysis of qualitative data. Results: As part of the PRECEDE segment, a social assessment was undertaken via mixed-method studies of rural consumers, pharmacists and health professionals. Hearing testing was ranked as the seventh (of twenty-six) most important expanded pharmacy service by both consumer and health professional groups. An epidemiological assessment found extensive ear disease in rural and remote locations resulting in complications and hearing loss. Behavioural and environmental assessments identified eleven ear health interventions, which include hearing screening [3], otoscopy pilot studies [2], audiometry services [1], specific education for undergraduate pharmacy students [2] and a pharmacy-based clinic [3]. However, none of the interventions described a framework for continued service provision. A policy and regulation assessment was undertaken to align the intervention within the regulatory framework. The application of this model is partially complete, with the study protocol for the intervention developed and the initial pilot study in progress. This study’s strengths include its applicability to rural populations and its contribution to the limited evidence base that currently exists. It is, however, limited by the small size of the pilot study, and application of this model to a national intervention would be useful in the future.
Conclusions: The application of the PRECEDE-PROCEED model demonstrates the applicability of this planning model for developing and evaluating an ear health intervention, with a particular focus on community pharmacies in rural and remote locations. References: 1. World Health Organisation. Deafness and hearing loss; 2020. Available from: https://www.who.int/health-topics/hearing-loss#tab=tab_1 [Accessed: 15/9/2020]. 2. Binkley CJ, Johnson KW. Application of the PRECEDE-PROCEED Planning Model in Designing an Oral Health Strategy. J Theory Pract Dent Public Health. 2013;1(3). Available from: http://www.sharmilachatterjee.com/ojs-2.3.8/index.php/JTPDPH/article/view/89
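For orientation, the sketch below lays out the PRECEDE-PROCEED phases as a simple data structure and maps the LISTEN UP activities described in the abstract onto them. The phase wording follows a common textbook statement of Green and Kreuter's model and is not necessarily the exact formulation used in the study.

```python
# Outline of the PRECEDE-PROCEED planning model with the LISTEN UP activities
# (paraphrased from the abstract) mapped onto its phases. Phase names follow a
# common statement of Green and Kreuter's model; treat this as a sketch only.
precede_proceed = {
    "PRECEDE": {
        "social assessment": "meetings, surveys and interviews of rural consumers, pharmacists, health professionals and stakeholders",
        "epidemiological assessment": "extent of ear disease, complications and hearing loss in rural and remote locations",
        "behavioural and environmental assessment": "review of eleven existing ear health interventions",
        "educational and ecological assessment": "predisposing, reinforcing and enabling factors for the service",
        "administrative and policy assessment": "alignment of the intervention with the regulatory framework",
    },
    "PROCEED": {
        "implementation": "pilot of LISTEN UP in two remote community pharmacies",
        "evaluation": "semi-structured interviews and surveys, descriptive statistics and thematic analysis",
    },
}

for stage, phases in precede_proceed.items():
    print(stage)
    for phase, activity in phases.items():
        print(f"  {phase}: {activity}")
```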
APA, Harvard, Vancouver, ISO, and other styles
5

Marchand, Calena, Farshad Farshidfar, Jodi Rattner, and Oliver Bathe. "A Framework for Development of Useful Metabolomic Biomarkers and Their Effective Knowledge Translation." Metabolites 8, no. 4 (September 30, 2018): 59. http://dx.doi.org/10.3390/metabo8040059.

Full text
Abstract:
Despite the significant advantages of metabolomic biomarkers, no diagnostic tests based on metabolomics have been introduced to clinical use. There are many reasons for this, centered around substantial obstacles in developing clinically useful metabolomic biomarkers. Most significant is the need for interdisciplinary teams with expertise in metabolomics, analysis of complex clinical and metabolomic data, and clinical care. Importantly, the clinical need must precede biomarker discovery, and the experimental design for discovery and validation must reflect the purpose of the biomarker. Standard operating procedures for procuring and handling samples must be developed from the beginning, to ensure experimental integrity. Assay design is another challenge, as there is not much precedent informing this. Another obstacle is that it is not yet clear how to protect any intellectual property related to metabolomic biomarkers. Viewing a metabolomic biomarker as a natural phenomenon would inhibit patent protection and potentially stifle commercial interest. However, demonstrating that a metabolomic biomarker is actually a derivative of a natural phenomenon that requires innovation would enhance investment in this field. Finally, effective knowledge translation strategies must be implemented, which will require engagement with end users (clinicians and lab physicians), patient advocate groups, policy makers, and payer organizations. Addressing each of these issues comprises the framework for introducing a metabolomic biomarker to practice.
APA, Harvard, Vancouver, ISO, and other styles
6

Quick, V., and C. Byrd-Bredbenner. "Behavioral Factors Affecting Young Adults' Health and Body Weight: A PRECEDE-PROCEED Framework Approach." Journal of the American Dietetic Association 110, no. 9 (September 2010): A76. http://dx.doi.org/10.1016/j.jada.2010.06.288.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Armstrong, Leisa J., Dean A. Diepeveen, and Khumphicha Tantisantisom. "An eAgriculture-Based Decision Support Framework for Information Dissemination." International Journal of Human Capital and Information Technology Professionals 1, no. 4 (October 2010): 1–13. http://dx.doi.org/10.4018/jhcitp.2010100101.

Full text
Abstract:
The ability of farmers to acquire knowledge to make decisions is limited by the information quality and applicability. Inconsistencies in information delivery and standards for the integration of information also limit decision making processes. This research uses a similar approach to the Knowledge Discovery in Databases (KDD) methodology to develop an ICT based framework which can be used to facilitate the acquisition of knowledge for farmers’ decision making processes. This is one of the leading areas of research and development for information technology in an agricultural industry, which is yet to utilize such technologies fully. The Farmer Knowledge and Decision Support Framework (FKDSF) takes information provided to farmers and utilizes processes that deliver this critical information for knowledge acquisition. The framework comprises data capture, analysis, and data processing, which precede the delivery of the integrated information for the farmer. With information collected, captured, and validated from disparate sources, according to defined sets of rules, data mining tools are then used to process and integrate the data into a format that contributes to the knowledge base used by the farmer and the agricultural industry.
APA, Harvard, Vancouver, ISO, and other styles
8

Onken, Lisa. "PRECEDE-PROCEED and the NIDA stage model: the value of a conceptual framework for intervention research." Journal of Public Health Dentistry 71 (January 2011): S18–S19. http://dx.doi.org/10.1111/j.1752-7325.2011.00221.x.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Gallani, Maria Cecília Bueno Jayme, Marilia Estevam Cornélio, Rúbia de Freitas Agondi, and Roberta Cunha Matheus Rodrigues. "Conceptual framework for research and clinical practice concerning cardiovascular health-related behaviors." Revista Latino-Americana de Enfermagem 21, spe (February 2013): 207–15. http://dx.doi.org/10.1590/s0104-11692013000700026.

Full text
Abstract:
OBJECTIVE: To present a conceptual framework based on the PRECEDE model conceived to guide research and the clinical practice of nurses in the clinical follow-up of patients with cardiovascular diseases. METHOD: The conceptual bases as well as the study designs used in the framework are discussed. The contextualization of the proposed structure is presented in the clinical follow-up of hypertensive patients. Examples of the intervention planning steps according to the intervention mapping protocol are provided. RESULTS: This conceptual framework coherently and rationally guided the diagnostic steps related to excessive salt intake among hypertensive individuals, as well as the development and assessment of specific interventions designed to change this eating behavior. CONCLUSION: The use of this conceptual framework enables a greater understanding of health-related behaviors implied in the development and progression of cardiovascular risk factors and is useful in proposing nursing interventions with a greater chance of success. This model is a feasible strategy to improve the cardiovascular health of patients cared for by the Brazilian Unified Health System.
APA, Harvard, Vancouver, ISO, and other styles
10

Meyer, Christian T., Megan P. Jewell, Eugene J. Miller, and Joel M. Kralj. "Machine Learning Establishes Single-Cell Calcium Dynamics as an Early Indicator of Antibiotic Response." Microorganisms 9, no. 5 (May 5, 2021): 1000. http://dx.doi.org/10.3390/microorganisms9051000.

Full text
Abstract:
Changes in bacterial physiology necessarily precede cell death in response to antibiotics. Herein we investigate the early disruption of Ca2+ homeostasis as a marker for antibiotic response. Using a machine learning framework, we quantify the temporal information encoded in single-cell Ca2+ dynamics. We find Ca2+ dynamics distinguish kanamycin-sensitive and -resistant cells before changes in gross cell phenotypes such as cell growth or protein stability. The onset time (pharmacokinetics) and probability (pharmacodynamics) of these aberrant Ca2+ dynamics are dose- and time-dependent, even at the resolution of single cells. Of the compounds profiled, we find Ca2+ dynamics are also an indicator of Polymyxin B activity. In Polymyxin B-treated cells, we find aberrant Ca2+ dynamics precede the entry of propidium iodide marking membrane permeabilization. Additionally, we find modifying membrane voltage and external Ca2+ concentration alters the time between these aberrant dynamics and membrane breakdown, suggesting a previously unappreciated role of Ca2+ in membrane destabilization during Polymyxin B treatment. In conclusion, leveraging live, single-cell Ca2+ imaging coupled with machine learning, we have demonstrated the discriminative capacity of Ca2+ dynamics in identifying antibiotic-resistant bacteria.
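The abstract does not name the classifier or the features used, so the snippet below is only a generic illustration of a supervised pipeline that could separate sensitive from resistant cells using summary features of single-cell Ca2+ traces. The synthetic data, feature choices and model are assumptions for illustration, not the authors' method.

```python
# Generic illustration (not the authors' pipeline): classify antibiotic-sensitive
# vs. resistant cells from simple summary features of single-cell Ca2+ traces.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

def trace_features(trace):
    """Crude per-cell features of a Ca2+ time series."""
    diffs = np.diff(trace)
    return [trace.mean(), trace.std(), np.abs(diffs).mean(), diffs.max()]

# Synthetic stand-in data: 200 cells x 300 time points; label 1 = "sensitive"
labels = rng.integers(0, 2, size=200)
traces = rng.normal(size=(200, 300))
# give "sensitive" cells a drifting (aberrant) Ca2+ signal
traces[labels == 1] += np.cumsum(rng.normal(0, 0.05, size=(int((labels == 1).sum()), 300)), axis=1)

X = np.array([trace_features(t) for t in traces])
clf = RandomForestClassifier(n_estimators=200, random_state=0)
print("cross-validated accuracy:", cross_val_score(clf, X, labels, cv=5).mean())
```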
APA, Harvard, Vancouver, ISO, and other styles

Dissertations / Theses on the topic "PRECEDE framework"

1

O'Meara, Carmel M. "Childbirth and parenting education in the ACT: a review and analysis." University of Canberra. Education, 1990. http://erl.canberra.edu.au./public/adt-AUC20060710.161652.

Full text
Abstract:
The study reviewed the provision of childbirth and parenting education in the ACT for indicators of effectiveness and needs. Users (n = 207) and providers (n = 7) were surveyed for information on educational and administrative aspects of the service. An original design questionnaire was based on the PRECEDE framework (predisposing, reinforcing and enabling factors in educational diagnosis and evaluation) and the social model of health. Items were drawn from the relevant literature, concerning individual, social and service delivery elements of the health fields concept interpreted for pregnancy, childbirth and parenting. Individual factors were related to Maslow's hierarchy and the valuing approach to health education. The provider survey covered information on organisational elements, comprising inputs, processes, products, outputs and outcomes of childbirth education. The study comprised a literature review, cross-sectional non-experimental surveys of users and providers, and a needs assessment combining information from each of the three sources. Descriptive statistical techniques, analysis of variance and valuing analysis were used to extract information on effectiveness indicators and needs from the user data. Comparisons were made between present and past users, and between women of different ages, experience of pregnancy and preferences for public or private methods of education for childbirth. No evidence was found of individual differences in the women's attitudes, beliefs and values that could be attributed to education. However, users expressed strong approval and positive views of the service and its providers. The level of personal health skills, confidence and emotional preparation they achieved through childbirth and parenting education did not fully meet their expectations. The survey also found that the organisation of childbirth and parenting education has not developed professionally like other health services. Service goals and objectives are ill-defined; planning and coordinating are inadequate for an integrated maternal health care system. The service's main resources are its highly motivated and dedicated teachers and clients. Several recommendations are made for educational and administrative measures to enhance service effectiveness within present organisational constraints, based on the needs identified by the study.
APA, Harvard, Vancouver, ISO, and other styles
2

Adair, Joel C. "Resolving Problems in Engineering Ethics: Precept and Example." BYU ScholarsArchive, 1999. https://scholarsarchive.byu.edu/etd/3448.

Full text
Abstract:
This thesis has served to accomplish several objectives. First, a foundation was laid for the consideration of ethical factors in an engineering context. This was done by first establishing the need for ethical judgement in the engineering disciplines. A summary of several significant classical ethical theories followed, providing several tools with which to evaluate decisions that have ethical implications. Finally, the conclusion was made that the best framework for making ethical decisions is found in the application of the virtues espoused by the gospel of Jesus Christ.
APA, Harvard, Vancouver, ISO, and other styles
3

Makkawi, Khoder. "An adaptive fault tolerant fusion framework for a precise, available and fail-safe localization." Thesis, Lille 1, 2020. http://www.theses.fr/2020LIL1I069.

Full text
Abstract:
The adoption of a technological solution as a means of localization for an Intelligent Transport System requires validation of the usual performance metrics. These are mainly accuracy, availability, continuity, and safety. However, they exhibit antagonistic behavior, insofar as ensuring operational safety generally comes at the expense of availability. This localization brick can be used in functions that do not involve the safety of the system and its surrounding environment, such as fleet tracking or passenger information. But when it comes to providing localization information to the vehicle's trajectory control module, the unknown positioning error must be properly bounded; this is called positioning integrity. To increase integrity, the literature recommends the integration of a diagnostic and monitoring layer. Similarly, the coupling of complementary localization solutions, such as GNSS for its absolute positioning capabilities and odometry for the short-term accuracy of its relative data, is recommended to increase the accuracy, availability, and continuity of the system. In this work, we propose a framework for fusing raw GNSS data and odometric data through a stochastic data fusion filter, the Maximum Correntropy Criterion Nonlinear Information Filter, which is robust to different measurement noises (shot noises, multi-Gaussian noise, etc.). This framework also integrates a diagnostic layer designed to adapt to the navigation context or to changing operational requirements through an informational metric, the α-Rényi Divergence, which generalizes the metrics usually used for these purposes, such as the Bhattacharyya Divergence and the Kullback-Leibler Divergence. This divergence allows the design of parametric residuals that take into account the change in environment and thus the change in the a priori probability of facing, or not facing, GNSS measurement faults. We study the possibility of implementing a selection policy for this parameter and the impact of this policy on all of the above-mentioned performance metrics. The encouraging results allow us to consider, as a perspective for this work, more sophisticated policies and algorithms for setting the value of the α parameter, drawing on artificial intelligence techniques, in order to increase the discernibility of faults, minimize the probability of false alarms (and thus increase availability), and minimize the probability of missed detections (and thus increase operational safety). Real data provided by the PRETIL platform of the CRIStAL laboratory were used to test and validate the proposed approach.
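As background on why the α-Rényi divergence generalizes the residual metrics named above: for α = 1/2 it equals twice the Bhattacharyya distance, and for α → 1 it tends to the Kullback-Leibler divergence. A standard closed form for Gaussian state estimates (given here as general background, not as the thesis's exact residual definition) is:

```latex
% Renyi divergence of order alpha between N(mu_0, Sigma_0) and N(mu_1, Sigma_1),
% valid when Sigma_alpha is positive definite.
D_\alpha\bigl(\mathcal{N}_0 \,\|\, \mathcal{N}_1\bigr)
  = \frac{\alpha}{2}\,(\mu_0-\mu_1)^{\top}\Sigma_\alpha^{-1}(\mu_0-\mu_1)
  - \frac{1}{2(\alpha-1)}
    \ln\frac{\det\Sigma_\alpha}{(\det\Sigma_0)^{1-\alpha}(\det\Sigma_1)^{\alpha}},
\qquad
\Sigma_\alpha = \alpha\,\Sigma_1 + (1-\alpha)\,\Sigma_0 .
```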
APA, Harvard, Vancouver, ISO, and other styles
4

Yu, Ann Tit-wan. "A value management framework for systematic identification and precise representation of client requirements in the briefing process." online access from Digital Dissertation Consortium, 2006. http://libweb.cityu.edu.hk/cgi-bin/er/db/ddcdiss.pl?3265624.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Karlsson, Erik. "Behavior recording with the scoring program MouseClick : A study in cross platform and precise timing developing." Thesis, Uppsala universitet, Informationssystem, 2010. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-132079.

Full text
Abstract:
This thesis deals with problems and solutions of cross-platform development using the Mono framework as a replacement for the Microsoft .NET framework on Linux and Mac OS X platforms. It takes into account matters ranging from limitations in the file system to problems with deploying released programs. It also deals with the demands of precise timing and the need for efficient code in time-critical tasks in order to construct a program used for creating data from recordings of animals. These animals are set to perform a task, for example exploring a labyrinth or running on a rod, and it is all recorded on video. These videos are later reviewed by an observer who transcribes the recordings into data based on predefined behaviors and the time and frequency with which the animal expresses them.
APA, Harvard, Vancouver, ISO, and other styles
6

Kratzl, Kathrin [Verfasser], Roland A. [Akademischer Betreuer] Fischer, Ulrich K. [Gutachter] Heiz, and Roland A. [Gutachter] Fischer. "Encapsulation of Atom-Precise Clusters in Metal-Organic Frameworks for Electrocatalytic Applications / Kathrin Kratzl ; Gutachter: Ulrich K. Heiz, Roland A. Fischer ; Betreuer: Roland A. Fischer." München : Universitätsbibliothek der TU München, 2020. http://d-nb.info/1210163799/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Pflüger, Stefan [Verfasser], Miriam [Gutachter] Fritsch, and Ulrich [Gutachter] Wiedner. "Precise determination of the luminosity with the PANDA-luminosity detector and implementation of the helicity formalism for the ComPWA framework for an extraction of the scalar wave in the channel J/ψ → π⁰π⁰γ / Stefan Pflüger ; Gutachter: Miriam Fritsch, Ulrich Wiedner ; Fakultät für Physik und Astronomie." Bochum : Ruhr-Universität Bochum, 2018. http://d-nb.info/1154308030/34.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Alhilaly, Mohammad Jaber. "Atomically Precise Silver Nanoclusters: Controlled Synthesis and Assembly into Structurally Diverse Frameworks with Tailored Optical Properties." Diss., 2019. http://hdl.handle.net/10754/660299.

Full text
Abstract:
Ligand-protected metal nanoclusters (NCs), which are ultra-small nanoparticles marked by their atomic precision, are of distinct importance for contemporary nanomaterials. NCs have attracted significant research attention for utilizing their novel optical and physicochemical properties in various applications, including fluorescence sensing, catalysis, and biomedical applications. This dissertation deals with ligand-protected atomically precise silver NCs and is divided into two main parts. The first part is focused on the exploration and design of well-defined silver NCs through surface co-ligand engineering. The second part is related to the development of silver NC-based frameworks (NCFs). In the first part, we designed a synthetic strategy based on engineering the structure of the phosphine co-ligands with thiols to generate the large box-shaped [Ag67(SPhMe2)32(PPh3)8]3+ (referred to as Ag67) NC. The strategy demonstrates that the combined use of judiciously chosen thiol and phosphine co-ligands can result in stable, highly anisotropic box-like shapes. The optical absorption spectrum of the Ag67 NC displays highly structured multiple sharp peaks. The crystal structure shows a Ag23 core formed of a centered cuboctahedron (an unprecedented core geometry in silver clusters), which is encased by a layer with a composition of Ag44S32P8 arranged in the shape of a box. The electronic structure of this box-shaped cluster resembles a jellium box model with 32 free electrons. In the second part, a novel approach is developed for the assembly and linkage of atomically precise Ag NCs into one-dimensional (1D) and two-dimensional (2D) NC-based frameworks (NCFs) with atomic-level control over cluster size and dimensionality. With this approach, three novel but related crystal structures (one silver NC and two NCFs) were synthesized. These structures have the same protecting ligands and also the same organic linker. The three structures exhibit a similar square gyrobicupola geometry of the building NC unit with only a single Ag atom difference. The critical role of using a chloride template in controlling the NC’s nuclearity was demonstrated, as well as the effect of a single Ag atom difference in the NC’s size on the NCF’s dimensionality, optical properties, and thermal stability.
APA, Harvard, Vancouver, ISO, and other styles
9

Vijaya, Krishna A. "A Filterbank Precoding Framework For MIMO Frequency Selective Channels." Thesis, 2006. http://hdl.handle.net/2005/1084.

Full text
Abstract:
Wireless systems with multiple antennas at both the transmitter and receiver (MIMO systems) have been the focus of research in the recent past due to their ability to provide higher data rates and better reliability than their single antenna counterparts. Designing a communication system for MIMO frequency selective channels provides many signal processing challenges. Popular methods like MIMO-OFDM and space-time precoding linearly process blocks of data at both the transmitter and the receiver. Independence between the blocks is ensured by introducing sufficient redundancy between successive blocks. This approach has many pitfalls, including the limit on achievable data rate due to redundancy requirements and the need for additional coding/processing. In this thesis, we provide a filterbank precoding framework (FBP) for communication over MIMO frequency selective channels. By viewing the channel as a polynomial matrix, we derive the minimum redundancy required for achieving FIR equalization of the precoded channel. It is shown that, for most practical channels, a nominal redundancy is enough. The results are general, and hold for channels of any dimension and order. We derive the zero-forcing and MMSE equalizers for the precoded channel. The role of equalizer delay in system performance is analyzed. We extend the minimum redundancy result to the case of space-time filterbank precoding (STFP). Introducing the time dimension allows the channel to be represented by a block pseudocirculant matrix. By using the Smith form of block pseudocirculant matrices, we show that very high data rates can be achieved with STFP. When channel information is available at the transmitter, we derive an iterative algorithm for obtaining the MMSE optimal precoder-equalizer pair. We then provide a comparison of FBP with the block processing methods. It is shown that FBP provides better BER performance than the block processing methods at a lower computational cost. The reasons for the better performance of FBP are discussed.
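The thesis's filterbank construction is not reproduced here; as a reference point for the equalizers it mentions, the snippet below sketches the standard linear MMSE equalizer for a generic precoded block model y = H P x + n. The dimensions, the trivial identity precoder and the noise level are arbitrary assumptions for illustration.

```python
# Textbook linear MMSE equalizer for a precoded MIMO block model y = H @ P @ x + n.
# This is only the generic building block referred to in the abstract, not the
# thesis's filterbank precoder (FBP/STFP) design.
import numpy as np

rng = np.random.default_rng(1)
nt, nr, k = 4, 4, 4        # transmit antennas, receive antennas, symbols per block
sigma2 = 0.1               # noise variance

H = (rng.normal(size=(nr, nt)) + 1j * rng.normal(size=(nr, nt))) / np.sqrt(2)
P = np.eye(nt, k)                                                    # placeholder precoder
x = rng.choice([-1, 1], size=k) + 1j * rng.choice([-1, 1], size=k)   # QPSK symbols
n = np.sqrt(sigma2 / 2) * (rng.normal(size=nr) + 1j * rng.normal(size=nr))
y = H @ P @ x + n

G = H @ P                                                            # effective channel
W = np.linalg.solve(G.conj().T @ G + sigma2 * np.eye(k), G.conj().T) # MMSE equalizer
x_hat = W @ y
print(np.sign(x_hat.real) + 1j * np.sign(x_hat.imag))                # hard symbol decisions
```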
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "PRECEDE framework"

1

Hadda, Lamia, ed. Médina. Florence: Firenze University Press, 2021. http://dx.doi.org/10.36253/978-88-5518-248-5.

Full text
Abstract:
Dedicated to the medina in the Mediterranean space, this book is essentially based on detailed historical and photographic research into the characteristics of city design and its evolution, as well as some case studies from direct experience. The main objective of the present study consists of its documentary and evocative value, without forgetting the analysis of the multiple architectural spaces with monumental complexes of extraordinary cultural importance arranged according to precise hierarchies and specific uses. The research summarises the different experiences from this immense Arab-Muslim architectural heritage and its urban evolution. These aspects are expressed both by the large number of case studies (from Cordoba to Palermo, passing through Fez, Séfrou, Marrakech and Tunis) as well as by the quality of the built spaces as a whole. The several contributions show an urban framework that is still legible and significant, consisting of grids of houses with forms, structures and functions that show a concentration of spaces, places and monuments stratified over time and developed in the Mediterranean countries, producing extremely diverse situations.
APA, Harvard, Vancouver, ISO, and other styles
2

Rivadossi, Silvia. Sciamani urbani. Venice: Fondazione Università Ca’ Foscari, 2020. http://dx.doi.org/10.30687/978-88-6969-414-1.

Full text
Abstract:
What does it mean to be a ‘shaman’ in present-day Tokyo? In what way(s) is the role of the shamanic practitioner represented at a popular level? Are certain characteristics emphasised and others downplayed? This book offers an answer to these questions through the analysis of a specific discourse on shamans that emerged in the Japanese metropolitan context between the late 20th century and the first decade of the 21st century, a discourse that the more ‘traditional’ approaches to the study of shamanism do not take into account. In order to better contextualise this specific discourse, the volume opens with a brief historical account of the formation of the academic discourse on shamans. Within the theoretical framework offered by critical discourse analysis and by means of multi-sited ethnographic research, it then weaves together different case studies: three novels by Taguchi Randy, a manga, a TV series and the case of an urban shaman who is mostly active in Tokyo. The main elements emerging from these case studies are explored by situating them in the precise historical and social context within which the discourse has been developed. This shows that the new discourse analysed shares several characteristics with the more ‘traditional’ and accepted discourses on shamanism, while at the same time differing in certain respects. In this work, particular attention is given to how the category and term ‘shaman’ is defined, used and re-negotiated in the Japanese metropolitan context. Through this approach, the book aims to further problematize the categories of ‘shaman’ and ‘shamanism’, by highlighting certain aspects that are not yet accepted by many scholars, even though they constitute a discourse that is relevant and effective.
APA, Harvard, Vancouver, ISO, and other styles
3

Campney, Brent M. S. Hostile Heartland. University of Illinois Press, 2019. http://dx.doi.org/10.5622/illinois/9780252042492.001.0001.

Full text
Abstract:
Hostile Heartland examines racial violence—or, more aptly, racist violence—against blacks (African Americans) in the Midwest, emphasizing lynching, whipping, and violence by police (or police brutality). It also focuses on black responses, including acts of armed resistance, the development of local and regional civil rights organizations, and the work of individual activists. Within that broad framework the book considers patterns of institutionalized violence in studies of individual states, like Ohio, Indiana, Illinois, Missouri, and Kansas over a number of decades; it also targets specific incidents of such violence or resistance in case studies representative of changes in these patterns like the lynching of Joseph Spencer in Cairo, Illinois, in 1854 and the lynching of Luke Murray in South Point, Ohio, in 1932. Significantly, Hostile Heartland not only addresses the years from the Civil War to World War I, which are the typical focus of such studies, but also incorporates the twenty-five years that precede the Civil War and the additional twenty-five that follow World War I. It pioneers new research methodologies, as exemplified by Chapter 4’s analysis of the relations between and among racist violence, family history, and the black freedom struggle. Finally, Hostile Heartland situates its findings within the historiography more broadly.
APA, Harvard, Vancouver, ISO, and other styles
4

Anderson, Cheryl P., and Debra L. Martin, eds. Massacres. University Press of Florida, 2018. http://dx.doi.org/10.5744/florida/9781683400691.001.0001.

Full text
Abstract:
Bioarchaeology and forensic anthropology offer unique perspectives on studies of mass violence and present opportunities to interpret human skeletal remains in a broader cultural context. Massacres and other forms of large-scale violence have been documented in many different ancient and modern contexts. Moving the analysis from the victims to the broader political and cultural context necessitates using social theories about the nature of mass violence. Massacres can be seen as a process, that is, as the unfolding of nonrandom patterns or chains of events that precede the events and continue long after. Mass violence has a cultural logic of its own that is shaped by social and historical dynamics. Massacres can have varying aims, including subjugation or total eradication of a group based on status, ethnicity, or religion. The goal of this edited volume is to present case studies that integrate the evidence from human remains within the broader cultural and historical contexts through the utilization of social theory to provide a framework for interpretation. This volume highlights case studies of massacres across time and space that stress innovative theoretical models that help make sense of this unique form of violence. The primary focus will be on how massacres are used as a strategy of violence across time and cultural/geopolitical landscapes.
APA, Harvard, Vancouver, ISO, and other styles
5

Mevorach, Irit. A Normative Framework for Promoting Compliance. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198782896.003.0005.

Full text
Abstract:
This chapter completes the proposed normative framework for cross-border insolvency. It considers the problem of compliance with a cross-border insolvency system by countries and implementing institutions. The previous chapters have shown how the choice and use of certain international legal sources, such as customary international law (CIL), can strengthen the system, close gaps, and address biases that may otherwise impede the choices of optimal solutions. Yet, notwithstanding the pervasiveness and behavioural force of CIL, the observance of the norms is not guaranteed. Written instruments, even if precise and comprehensive, and designed effectively, do not assure compliance either. Even where so-called soft law is in fact hard in important ways, countries might still underperform. This chapter suggests how compliance can be induced, and discusses which measures can be more, or less effective in that regard, including in view of decision-making constraints.
APA, Harvard, Vancouver, ISO, and other styles
6

Salleh, Dani, and Mazlan Ismail. Infrastructure procurement framework for local authority. UUM Press, 2015. http://dx.doi.org/10.32890/9789670474434.

Full text
Abstract:
The spread of infrastructure requirements and the variety of mechanisms used to secure contributions (infrastructure provision) from the private sector reflect the institutional framework of the planning system. The study has identified that although both private developers and local authorities have a good understanding of the fundamental concept of local infrastructure provision and the arguments for and against the use of private provision, there are still considerable areas of uncertainty surrounding the precise definition (as prescribed in the relevant legislation) and measurement of the key elements pertaining to local infrastructure. The findings revealed that previous studies have tended to examine the nature of the practice of infrastructure delivery within the framework of the national economy, and very little focus has been given to a comprehensive examination of how private developers can be involved in local infrastructure development. The primary problem is that there is no single framework available at the local level that might be considered or applied to secure infrastructure from private developers. The study then provides the parameters for securing contributions towards infrastructure provision. To achieve a complete understanding of this issue, it is necessary to appreciate the broader picture of what is required in terms of infrastructure for the operation of the urban environment.
APA, Harvard, Vancouver, ISO, and other styles
7

Hansford, Thomas G. Vertical Stare Decisis. Edited by Lee Epstein and Stefanie A. Lindquist. Oxford University Press, 2017. http://dx.doi.org/10.1093/oxfordhb/9780199579891.013.18.

Full text
Abstract:
This chapter critically assesses the current state of the literature on vertical stare decisis. It begins with a consideration of how stare decisis does, or does not, fit with the principal–agent framework that is often used as a starting point for theories of the relationship between high and low courts. Various approaches to testing the existence of vertical stare decisis and the factors that might condition the strength of this constraint are then addressed. While there is a good deal of evidence that is consistent with the claim that High Court precedent constrains lower court decision-making, this evidence is not as conclusive as it might first appear. There is also ambiguity regarding the precise causal mechanism at work. This chapter then considers recent scholarship focused on the potential for bottom-up influences on the operation of precedent in a judicial hierarchy.
APA, Harvard, Vancouver, ISO, and other styles
8

Jan, Paulsson. Part I Investment Treaties and the Settlement of Investment Disputes: The Framework, 4 The Role of Precedent in Investment Treaty Arbitration. Oxford University Press, 2018. http://dx.doi.org/10.1093/law/9780198758082.003.0004.

Full text
Abstract:
This chapter examines the role of precedent in investment treaty arbitration. The technical rules of precedent are practice rules developed within legal systems. A system that enforces the rule of precedent requires a supreme court authorised both to impose a rule on inferior courts and to modify it when it sees fit. However, there is nothing like it in the international realm, and even less so in the context of arbitration. Nonetheless, it is possible to imagine the development of an international ‘law on investment protection’ by something akin to the common-law process of developing authoritative rules by case-by-case accretion, though this type of precedent must be qualified by the word ‘persuasive’ rather than ‘binding’.
APA, Harvard, Vancouver, ISO, and other styles
9

LO, Gane Samb, Aladji Babacar Niang, and Lois Chinwendu Okereke. A course of Elementary Probability Course. SPAS-EDS, 2020. http://dx.doi.org/10.16929/sts/2020.001.

Full text
Abstract:
This book introduces the theory of probability from the beginning. Assuming that the reader possesses the normal mathematical level acquired at the end of secondary school, we aim to equip him with a solid basis in probability theory. The theory is preceded by a general chapter on counting methods. Then, the theory of probabilities is presented in a discrete framework. Two objectives are sought. The first is to give the reader the ability to solve a large number of problems related to probability theory, including application problems in a variety of disciplines. The second is to prepare the reader before he takes a course on the mathematical foundations of probability theory. In that later book, the reader will concentrate more on mathematical concepts, while in the present text, experimental frameworks are mostly found. If both objectives are met, the reader will have already acquired definitive experience in problem solving with the tools of probability theory and will at the same time be ready to move on to a theoretical course on probability theory based on the theory of Measure and Integration. The book ends with a chapter that allows the reader to begin an intermediate course in mathematical statistics.
APA, Harvard, Vancouver, ISO, and other styles
10

Heitzeg, Mary M. Brain Functional Contributors to Vulnerability for Substance Abuse. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780190676001.003.0006.

Full text
Abstract:
Substance use disorder (SUD) is one of the most significant health concerns worldwide; therefore, understanding the mechanisms that precede the onset and contribute to the escalation of substance use from childhood to adulthood is vital. Evidence suggests that behavioral undercontrol and negative affectivity are two behavioral pathways through which risk for SUD emerges across development. This chapter discusses studies that probe the neural systems underlying these behavioral phenotypes in high-risk youth from the Michigan Longitudinal Study, a prospective study of families with high levels of parental SUDs. The focus is on work that integrates behavioral trait, developmental, neurobiological, and, in some cases, genetic frameworks to develop a better understanding of the risk factors leading to SUDs.
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "PRECEDE framework"

1

Hartson, H. Rex, and Kevin A. Mayo. "A Framework for Precise, Reusable Task Abstractions." In Interactive Systems: Design, Specification, and Verification, 279–97. Berlin, Heidelberg: Springer Berlin Heidelberg, 1995. http://dx.doi.org/10.1007/978-3-642-87115-3_17.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Wang, Yue. "Precise Simulation Based Task Planning Framework of Earth Observing Satellites." In Lecture Notes in Electrical Engineering, 544–51. Singapore: Springer Singapore, 2020. http://dx.doi.org/10.1007/978-981-15-4163-6_65.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Abdollahi, Behnoush, Ahmed Soliman, A. C. Civelek, X. F. Li, G. Gimel’farb, and Ayman El-Baz. "A Novel 3D Joint MGRF Framework for Precise Lung Segmentation." In Machine Learning in Medical Imaging, 86–93. Berlin, Heidelberg: Springer Berlin Heidelberg, 2012. http://dx.doi.org/10.1007/978-3-642-35428-1_11.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Mattoccia, Stefano, Federico Tombari, and Luigi Di Stefano. "Stereo Vision Enabling Precise Border Localization Within a Scanline Optimization Framework." In Computer Vision – ACCV 2007, 517–27. Berlin, Heidelberg: Springer Berlin Heidelberg, 2007. http://dx.doi.org/10.1007/978-3-540-76390-1_51.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Backes, Michael, Fabian Bendun, Jörg Hoffmann, and Ninja Marnau. "PriCL: Creating a Precedent, a Framework for Reasoning about Privacy Case Law." In Lecture Notes in Computer Science, 344–63. Berlin, Heidelberg: Springer Berlin Heidelberg, 2015. http://dx.doi.org/10.1007/978-3-662-46666-7_18.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Varró, Dániel, and András Pataricza. "Metamodeling Mathematics: A Precise and Visual Framework for Describing Semantics Domains of UML Models." In «UML» 2002 — The Unified Modeling Language, 18–33. Berlin, Heidelberg: Springer Berlin Heidelberg, 2002. http://dx.doi.org/10.1007/3-540-45800-x_3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Armentano, D., and E. Pardo. "Chapter 14. Atomically Precise Metal Clusters in Confined Spaces of Metal–Organic Frameworks." In Reactivity in Confined Spaces, 428–61. Cambridge: Royal Society of Chemistry, 2021. http://dx.doi.org/10.1039/9781788019705-00428.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Beyer, Dirk, and Heike Wehrheim. "Verification Artifacts in Cooperative Verification: Survey and Unifying Component Framework." In Leveraging Applications of Formal Methods, Verification and Validation: Verification Principles, 143–67. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-61362-4_8.

Full text
Abstract:
The goal of cooperative verification is to combine verification approaches in such a way that they work together to verify a system model. In particular, cooperative verifiers provide exchangeable information (verification artifacts) to other verifiers or consume such information from other verifiers with the goal of increasing the overall effectiveness and efficiency of the verification process. This paper first gives an overview of approaches for leveraging strengths of different techniques, algorithms, and tools in order to increase the power and abilities of the state of the art in software verification. To limit the scope, we restrict our overview to tools and approaches for automatic program analysis. Second, we specifically outline cooperative verification approaches and discuss their employed verification artifacts. Third, we formalize all artifacts in a uniform way, thereby fixing their semantics and providing verifiers with a precise meaning of the exchanged information.
APA, Harvard, Vancouver, ISO, and other styles
9

Zhang, Yedi, Zhe Zhao, Guangke Chen, Fu Song, and Taolue Chen. "BDD4BNN: A BDD-Based Quantitative Analysis Framework for Binarized Neural Networks." In Computer Aided Verification, 175–200. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-81685-8_8.

Full text
Abstract:
Verifying and explaining the behavior of neural networks is becoming increasingly important, especially when they are deployed in safety-critical applications. In this paper, we study verification and interpretability problems for Binarized Neural Networks (BNNs), the 1-bit quantization of general real-numbered neural networks. Our approach is to encode BNNs into Binary Decision Diagrams (BDDs), which is done by exploiting the internal structure of the BNNs. In particular, we translate the input-output relation of blocks in BNNs to cardinality constraints which are in turn encoded by BDDs. Based on the encoding, we develop a quantitative framework for BNNs where precise and comprehensive analysis of BNNs can be performed. We demonstrate the application of our framework by providing quantitative robustness analysis and interpretability for BNNs. We implement a prototype tool and carry out extensive experiments, confirming the effectiveness and efficiency of our approach.
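The chapter's encoding is not reproduced here; the toy sketch below only illustrates the kind of cardinality constraint ("at least t of n binary inputs are 1") that the abstract says is derived from BNN blocks, and counts its satisfying assignments with the same layer-by-layer dynamic programme that a BDD for the constraint represents. The values of n and t are arbitrary.

```python
# Toy quantitative analysis of a single cardinality constraint (not the BDD4BNN tool):
# count the 0/1 assignments to n variables with at least t ones, layer by layer,
# exactly as a reduced ordered BDD for the constraint would.
from math import comb

def count_at_least(n: int, t: int) -> int:
    dp = [1] + [0] * n              # dp[j] = assignments seen so far with j ones
    for _ in range(n):
        new = [0] * (n + 1)
        for j, ways in enumerate(dp):
            if ways:
                new[j] += ways              # next variable set to 0
                if j + 1 <= n:
                    new[j + 1] += ways      # next variable set to 1
        dp = new
    return sum(dp[t:])

n, t = 8, 5
print(count_at_least(n, t))                      # dynamic-programming count
print(sum(comb(n, j) for j in range(t, n + 1)))  # closed-form cross-check
```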
APA, Harvard, Vancouver, ISO, and other styles
10

Späth, Johannes. "Applications of Synchronized Pushdown Systems." In Ernst Denert Award for Software Engineering 2019, 19–45. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-58617-1_3.

Full text
Abstract:
A precise static data-flow analysis transforms the program into a context-sensitive and field-sensitive approximation of the program. It is challenging to design an analysis of this precision efficiently due to the fact that the analysis is undecidable per se. Synchronized pushdown systems (SPDS) present a highly precise approximation of context-sensitive and field-sensitive data-flow analysis. This chapter presents some data-flow analyses that SPDS can be used for. Further on, this chapter summarizes two other contributions of the thesis “Synchronized Pushdown System for Pointer and Data-Flow Analysis”, called Boomerang and IDEal. Boomerang is a demand-driven pointer analysis that builds on top of SPDS and minimizes the high computational effort of a whole-program pointer analysis by restricting the computation to the minimal program slice necessary for an individual query. IDEal is a generic and efficient framework for data-flow analyses, e.g., typestate analysis. IDEal resolves pointer relations automatically and efficiently with the help of Boomerang. This reduces the burden of implementing pointer relations into an analysis. Further on, IDEal performs strong updates, which makes the analysis sound and precise.
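None of SPDS, Boomerang or IDEal is reproduced here; the snippet below only sketches the generic worklist propagation that any such data-flow framework builds on, using a hypothetical toy program.

```python
# Generic forward taint propagation with a worklist -- textbook data-flow machinery
# only, not the SPDS/Boomerang/IDEal algorithms. The tiny program is hypothetical.
from collections import deque

# each variable: (kind, variables it is assigned from); "source" introduces taint
program = {
    "a": ("source", []),
    "b": ("assign", ["a"]),
    "c": ("assign", ["d"]),
    "d": ("assign", ["b", "c"]),
}

tainted = set()
worklist = deque(program)
while worklist:
    var = worklist.popleft()
    kind, deps = program[var]
    if (kind == "source" or any(d in tainted for d in deps)) and var not in tainted:
        tainted.add(var)
        # re-examine every statement that reads this variable
        worklist.extend(v for v, (_, ds) in program.items() if var in ds)

print(sorted(tainted))   # -> ['a', 'b', 'c', 'd']
```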
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "PRECEDE framework"

1

Nimr, Ahmad, Marwa Chafii, and Gerhard Fettweis. "Precoded-OFDM within GFDM Framework." In 2019 IEEE 89th Vehicular Technology Conference (VTC2019-Spring). IEEE, 2019. http://dx.doi.org/10.1109/vtcspring.2019.8746539.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Alanwar, Amr, Fatima M. Anwar, Yi-Fan Zhang, Justin Pearson, Joao Hespanha, and Mani B. Srivastava. "Cyclops: PRU programming framework for precise timing applications." In 2017 IEEE International Symposium on Precision Clock Synchronization for Measurement, Control, and Communication (ISPCS). IEEE, 2017. http://dx.doi.org/10.1109/ispcs.2017.8056744.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Vlado, Porobic, Adzic Evgenije, Grabic Stevan, Vekic Marko, and Rapaic Milan. "Precise PV active power — Converter control rapid prototyping framework." In 2017 International Symposium on Power Electronics (Ee). IEEE, 2017. http://dx.doi.org/10.1109/pee.2017.8171708.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Nellippallil, Anand Balu, Vignesh Rangaraj, B. P. Gautham, Amarendra Kumar Singh, Janet K. Allen, and Farrokh Mistree. "A Goal-Oriented, Inverse Decision-Based Design Method to Achieve the Vertical and Horizontal Integration of Models in a Hot Rod Rolling Process Chain." In ASME 2017 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2017. http://dx.doi.org/10.1115/detc2017-67570.

Full text
Abstract:
Reducing the manufacturing and marketing time of products by means of integrated simulation-based design and development of the material, product, and the associated manufacturing processes is the need of the hour for industry. This requires the design of materials to targeted performance goals through bottom-up and top-down modeling and simulation practices that enable handshakes between modelers and designers along the entire product realization process. Manufacturing a product involves a host of unit operations, and the final properties of the manufactured product depend on the processing steps carried out at each of these unit operations. In order to effectively couple the material processing-structure-property-performance spaces, there needs to be an interplay of the systems-based design of materials with enhancement of models of various unit operations through multiscale modeling methodologies and integration of these models at different length scales (vertical integration). This ensures the flow of information from one unit operation to another, thereby establishing the integration of manufacturing processes (horizontal integration). Together, these types of integration will support the decision-based design of the manufacturing process chain so as to realize the end product. In this paper, we present a goal-oriented, inverse decision-based design method to achieve the vertical and horizontal integration of models for the hot rolling and cooling stages of the steel manufacturing process chain for the production of a rod with defined properties. The primary mathematical construct used for the method presented is the compromise Decision Support Problem (cDSP), supported by the proposed Concept Exploration Framework (CEF) to generate satisficing solutions under uncertainty. The efficacy of the method is illustrated by exploring the design space for the microstructure after cooling that satisfies the requirements identified by the end mechanical properties of the product. The design decisions made are then communicated in an inverse manner to carry out the design exploration of the cooling stage and identify the design set points for cooling that satisfy the new target microstructure requirements identified. Specific requirements, such as managing the banded microstructure to minimize distortion in forged gear blanks, are considered in the problem. The proposed method is generic, and we plan to extend the work by carrying out the integrated decision-based design exploration of the rolling and reheating stages that precede it in order to realize the end product.
APA, Harvard, Vancouver, ISO, and other styles
5

Saha, Ratna, Mariusz Bajger, and Gobert Lee. "SRM Superpixel Merging Framework for Precise Segmentation of Cervical Nucleus." In 2019 Digital Image Computing: Techniques and Applications (DICTA). IEEE, 2019. http://dx.doi.org/10.1109/dicta47822.2019.8945887.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Vora, Urjaswala, Peeyush Chomal, and Avani Vakharwala. "Precept-Based Framework for Using Crowdsourcing in IoT-Based Systems." In 2019 IEEE International Conference on Smart Computing (SMARTCOMP). IEEE, 2019. http://dx.doi.org/10.1109/smartcomp.2019.00077.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Goo, Young-Hoon, Kyu-Seok Shim, Byeong-Min Chae, and Myung-Sup Kim. "Framework for precise protocol reverse engineering based on network traces." In NOMS 2018 - 2018 IEEE/IFIP Network Operations and Management Symposium. IEEE, 2018. http://dx.doi.org/10.1109/noms.2018.8406307.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Cui, Baojiang, Fuwei Wang, Tao Guo, Guowei Dong, and Bing Zhao. "FlowWalker: A Fast and Precise Off-Line Taint Analysis Framework." In 2013 Fourth International Conference on Emerging Intelligent Data and Web Technologies (EIDWT). IEEE, 2013. http://dx.doi.org/10.1109/eidwt.2013.105.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Satake, Takashi. "Precise and fast interactive area QoE management framework toward 5G era." In 2016 17th International Telecommunications Network Strategy and Planning Symposium (Networks). IEEE, 2016. http://dx.doi.org/10.1109/netwks.2016.7751182.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Horn, Alex, Ali Kheradmand, and Mukul R. Prasad. "A Precise and Expressive Lattice-theoretical Framework for Efficient Network Verification." In 2019 IEEE 27th International Conference on Network Protocols (ICNP). IEEE, 2019. http://dx.doi.org/10.1109/icnp.2019.8888144.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "PRECEDE framework"

1

Rycroft, Taylor, Kerry Hamilton, Charles Haas, and Igor Linkov. A quantitative risk assessment method for synthetic biology products in the environment. Engineer Research and Development Center (U.S.), July 2021. http://dx.doi.org/10.21079/11681/41331.

Full text
Abstract:
The need to prevent possible adverse environmental health impacts resulting from synthetic biology (SynBio) products is widely acknowledged in both the SynBio risk literature and the global regulatory community. However, discussions of potential risks of SynBio products have been largely speculative, and the attempts to characterize the risks of SynBio products have been non-uniform and entirely qualitative. As the discipline continues to accelerate, a standardized risk assessment framework will become critical for ensuring that the environmental risks of these products are characterized in a consistent, reliable, and objective manner that incorporates all SynBio-unique risk factors. Current established risk assessment frameworks fall short of the features required of this standard framework. To address this, we propose the Quantitative Risk Assessment Method for Synthetic Biology Products (QRASynBio) – an incremental build on established risk assessment methodologies that supplements traditional paradigms with the SynBio risk factors that are currently absent and necessitates quantitative analysis for more transparent and objective risk characterizations. The proposed framework facilitates defensible quantification of the environmental risks of SynBio products in both foreseeable and hypothetical use scenarios. Additionally, we show how the proposed method can promote increased experimental investigation into the likelihood of hazard and exposure parameters and highlight the parameters where uncertainty should be reduced, leading to more targeted risk research and more precise characterizations of risk.
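
As a rough illustration of what a quantitative, uncertainty-aware risk characterization of the kind advocated here can look like, the sketch below runs a simple Monte Carlo simulation of an exponential dose-response model with uncertain exposure and dose-response parameters, then checks which uncertain input co-varies most strongly with the resulting risk. The distributions, parameter values, and model form are hypothetical placeholders and are not taken from the QRASynBio framework itself.

    # Hypothetical Monte Carlo sketch of quantitative risk characterization.
    import numpy as np

    rng = np.random.default_rng(42)
    N = 100_000

    # Uncertain inputs (illustrative): exposure dose per event and the
    # dose-response parameter k of an exponential model P(ill) = 1 - exp(-k*dose).
    dose = rng.lognormal(mean=np.log(10.0), sigma=1.0, size=N)
    k = rng.uniform(1e-4, 1e-2, size=N)

    risk = 1.0 - np.exp(-k * dose)

    print(f"mean risk per event: {risk.mean():.4f}")
    print(f"95th percentile:     {np.quantile(risk, 0.95):.4f}")

    # Crude sensitivity check: which uncertain parameter drives the risk spread?
    for name, x in [("dose", dose), ("k", k)]:
        r = np.corrcoef(np.log(x), np.log(risk))[0, 1]
        print(f"correlation of log-risk with log-{name}: {r:.2f}")

The parameter with the stronger correlation is the natural candidate for the targeted uncertainty-reduction research the report calls for.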
APA, Harvard, Vancouver, ISO, and other styles
2

Kofler, Jakob, Elisabeth Nindl, Dorothea Sturn, and Magdalena Wailzer. Participatory Approaches in Research, Technology and Innovation (RTI) Policy and their Potential Impact. Fteval - Austrian Platform for Research and Technology Policy Evaluation, July 2021. http://dx.doi.org/10.22163/fteval.2021.518.

Full text
Abstract:
The present article reviews various concepts of participatory science and research and discusses their potential to affect the relationship between science and society. Starting with an overview of rationales, concepts, and challenges, we discuss different forms and intensities of participatory approaches in research and innovation. We then look at the situation in Austria and sort selected Austrian funding programmes and initiatives into a diagram according to the intensity of participation and the social groups involved in each case. Finally, we try to gain more precise indications of the impact of participatory programmes on the relationship between science and society. Many questions remain unanswered, as precise analyses and evaluation results are usually lacking. While various surveys provide general-level insights into society's information, interest, involvement, and attitude towards science and research, approaches for impact assessment are fragmented and remain superficial. We therefore propose to develop an analytical framework based on existing approaches and to include collaboratively developed indicators in it.
APA, Harvard, Vancouver, ISO, and other styles
3

Latzman, Natasha E., Cecilia Casanueva, and Melissa Dolan. Defining and understanding the Scope of Child Sexual Abuse: Challenges and Opportunities. RTI Press, November 2017. http://dx.doi.org/10.3768/rtipress.2017.op.0044.1711.

Full text
Abstract:
The enormous individual, familial, and societal burden of child sexual abuse has underscored the need to address the problem from a public health framework. Much work remains, however, at the first step of this framework — defining and understanding the scope of the problem, or establishing incidence and prevalence estimates. In this occasional paper, we provide an overview of the ways researchers have defined and estimated the scope of child sexual abuse, focusing on agency tabulations and large-scale surveys conducted over the last several decades. More precise estimates of the number of children affected by child sexual abuse would improve the ability of the public health, child welfare, pediatrics, and other communities to prevent and respond to the problem. We recommend using a comprehensive surveillance system to assess and track the scope of child sexual abuse. This system should be grounded by common definitional elements and draw from multiple indicators and sources to estimate the prevalence of a range of sexually abusive experiences.
APA, Harvard, Vancouver, ISO, and other styles
4

Villamizar-Villegas, Mauricio, and Yasin Kursat Onder. Uncovering Time-Specific Heterogeneity in Regression Discontinuity Designs. Banco de la República de Colombia, November 2020. http://dx.doi.org/10.32468/be.1141.

Full text
Abstract:
The literature that employs Regression Discontinuity Designs (RDD) typically stacks data across time periods and cutoff values. While practical, this procedure omits useful time heterogeneity. In this paper we decompose the RDD treatment effect into its weighted time-value parts. This analysis adds richness to the RDD estimand, where each time-specific component can be different and informative in a manner that is not expressed by the single cutoff or pooled regressions. To illustrate our methodology, we present two empirical examples: one using repeated cross-sectional data and another using time-series. Overall, we show a significant heterogeneity in both cutoff and time-specific effects. From a policy standpoint, this heterogeneity can pick up key differences in treatment across economically relevant episodes. Finally, we propose a new estimator that uses all observations from the original design and which captures the incremental effect of policy given a state variable. We show that this estimator is generally more precise compared to those that exclude observations exposed to other cutoffs or time periods. Our proposed framework is simple and easily replicable and can be applied to any RDD application that carries an explicitly traceable time dimension.
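
The decomposition idea described in this abstract can be illustrated with a toy simulation: estimate a separate local-linear RDD effect for each time period, then compare the pooled estimate obtained by stacking all periods with a sample-size-weighted average of the period-specific effects. The data-generating process, bandwidth, and OLS-within-bandwidth estimator below are simplifying assumptions for illustration, not the estimator proposed in the paper.

    # Toy illustration of time-specific vs. pooled RDD estimates.
    import numpy as np

    rng = np.random.default_rng(0)

    def rdd_estimate(running, outcome, bandwidth=0.5):
        """Local-linear RDD at cutoff 0 via OLS on observations within the bandwidth."""
        keep = np.abs(running) <= bandwidth
        r, y = running[keep], outcome[keep]
        d = (r >= 0).astype(float)                       # treatment indicator
        X = np.column_stack([np.ones_like(r), d, r, d * r])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        return beta[1], keep.sum()                       # jump at cutoff, local n

    true_effects = [0.5, 1.0, 2.0]                       # heterogeneous effects by period
    all_r, all_y, per_period = [], [], []
    for tau in true_effects:
        r = rng.uniform(-1, 1, 2000)
        y = 0.8 * r + tau * (r >= 0) + rng.normal(0, 0.3, r.size)
        est, n = rdd_estimate(r, y)
        per_period.append((est, n))
        all_r.append(r)
        all_y.append(y)

    pooled, _ = rdd_estimate(np.concatenate(all_r), np.concatenate(all_y))
    weighted = sum(e * n for e, n in per_period) / sum(n for _, n in per_period)
    print("period-specific estimates:", [round(e, 2) for e, _ in per_period])
    print("pooled estimate:", round(pooled, 2), "| weighted average:", round(weighted, 2))

The pooled figure tracks the weighted average of the period-specific effects and therefore masks the heterogeneity across periods, which is precisely the information the decomposition is meant to recover.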
APA, Harvard, Vancouver, ISO, and other styles
5

Herbert, George, and Lucas Loudon. The Size and Growth Potential of the Digital Economy in ODA-eligible Countries. Institute of Development Studies (IDS), December 2021. http://dx.doi.org/10.19088/k4d.2021.016.

Full text
Abstract:
This rapid review synthesises evidence on the current size of the digital market, the countries promoting development of digital business and their approach through Trade Policies or Incentive Frameworks, and the current and potential size of the market with the UK / China / US / other significant countries. It draws on a variety of sources, including reports by international organisations (such as the World Bank and OECD), grey literature produced by think tanks and the private sector, and peer reviewed academic papers. A high proportion of estimates of the size of the digital economy come from research conducted by or for corporations and industry bodies, such as Google and the GSMA (which represents the telecommunications industry). Their research may be influenced by their business interests, the methodologies and data sources they utilise are often opaque, and the information required to critically assess findings is sometimes missing. Given this, the estimates presented in this review are best seen as ballpark figures rather than precise measurements. A limitation of this rapid evidence review stems from the lack of consistent methodologies for estimating the size of the digital economy. The OECD is attempting to develop a standard approach to measuring the digital economy across the national accounts of the G20, but this has not yet been finalised. This makes comparing the results of different studies very challenging. The problem is particularly stark in low income countries, where there are frequently huge gaps in the relevant data.
APA, Harvard, Vancouver, ISO, and other styles