A selection of scholarly literature on the topic "Depth level-sets"

Cite a source in APA, MLA, Chicago, Harvard, and other citation styles

Choose a type of source:

Browse the lists of recent articles, books, dissertations, reports, and other scholarly sources on the topic "Depth level-sets".

Next to each work in the bibliography there is an "Add to bibliography" option. Use it, and the bibliographic entry for the selected work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).

You can also download the full text of the scholarly publication as a PDF and read an online abstract of the work, whenever the relevant parameters are available in its metadata.

Journal articles on the topic "Depth level-sets"

1. Brunel, Victor-Emmanuel. "Concentration of the empirical level sets of Tukey's halfspace depth". Probability Theory and Related Fields 173, no. 3-4 (May 11, 2018): 1165–96. http://dx.doi.org/10.1007/s00440-018-0850-0.

2. Grünewald, T., Y. Bühler, and M. Lehning. "Elevation dependency of mountain snow depth". Cryosphere Discussions 8, no. 4 (July 11, 2014): 3665–98. http://dx.doi.org/10.5194/tcd-8-3665-2014.

Abstract:
Elevation strongly affects the quantity and distribution of precipitation and snow. Positive elevation gradients were identified by many studies, usually based on data from sparse precipitation stations or snow depth measurements. We present a systematic evaluation of the elevation–snow depth relationship. We analyse areal snow depth data obtained by remote sensing for seven mountain sites. Snow depths were averaged to 100 m elevation bands and then related to their respective elevation level. The assessment was performed at three scales, ranging from the complete data sets through km-scale sub-catchments to slope transects. We show that most elevation–snow depth curves at all scales are characterised by a single shape. Mean snow depths increase with elevation up to a certain level, where they have a distinct peak, followed by a decrease at the highest elevations. We explain this typical shape with a generally positive elevation gradient of snowfall that is modified by the interaction of snow cover and topography. These processes are preferential deposition of precipitation and redistribution of snow by wind, sloughing and avalanching. Furthermore, we show that the elevation level of the peak of mean snow depth correlates with the dominant elevation level of rocks.
3. Bogicevic, Milica, and Milan Merkle. "Approximate calculation of Tukey's depth and median with high-dimensional data". Yugoslav Journal of Operations Research 28, no. 4 (2018): 475–99. http://dx.doi.org/10.2298/yjor180520022b.

Abstract:
We present a new fast approximate algorithm for Tukey (halfspace) depth level sets and its implementation, ABCDepth. Given a d-dimensional data set for any d ≥ 1, the algorithm is based on a representation of level sets as intersections of balls in R^d. Our approach does not need calculations of projections of sample points onto directions. This novel idea enables calculations of approximate level sets in very high dimensions with complexity that is linear in d, which provides a great advantage over all other approximate algorithms. Using different versions of this algorithm, we demonstrate approximate calculations of the deepest set of points ("Tukey median") and of Tukey's depth of a sample point or out-of-sample point, all with complexity linear in d. An additional theoretical advantage of this approach is that the data points are not assumed to be in "general position". Examples with real and synthetic data show that the execution time of the algorithm, in all mentioned versions, is much smaller in high dimensions than that of other implemented algorithms. Also, our algorithms can be used with thousands of multidimensional observations.
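As a point of reference for the entry above, Tukey's halfspace depth can be approximated with the classic projection-based Monte Carlo scheme, which is exactly the kind of projection computation the ball-based ABCDepth algorithm is designed to avoid. The sketch below is illustrative only; the function name and parameters are ours, not the authors' implementation.

```python
import numpy as np

def approx_halfspace_depth(x, data, n_dirs=1000, seed=0):
    """Monte Carlo approximation of Tukey's halfspace depth of a point x
    with respect to a sample `data` (shape n x d): minimise, over random
    unit directions u, the fraction of sample points lying in the closed
    halfspace {p : <u, p> >= <u, x>}."""
    rng = np.random.default_rng(seed)
    u = rng.normal(size=(n_dirs, data.shape[1]))
    u /= np.linalg.norm(u, axis=1, keepdims=True)   # unit directions
    proj_data = data @ u.T          # (n, n_dirs): projections of the sample
    proj_x = np.asarray(x) @ u.T    # (n_dirs,): projections of the query point
    return (proj_data >= proj_x).mean(axis=0).min()
```

A depth level set at level α is then simply the set of points whose approximate depth is at least α; deep points near the Tukey median score close to 1/2, while outliers score close to 0.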
4. Gupta, Pawan, Lorraine A. Remer, Falguni Patadia, Robert C. Levy, and Sundar A. Christopher. "High-Resolution Gridded Level 3 Aerosol Optical Depth Data from MODIS". Remote Sensing 12, no. 17 (September 2, 2020): 2847. http://dx.doi.org/10.3390/rs12172847.

Abstract:
The state-of-the-art satellite observations of atmospheric aerosols over the last two decades from NASA’s Moderate Resolution Imaging Spectroradiometer (MODIS) instruments have been extensively utilized in climate change and air quality research and applications. The operational algorithms now produce Level 2 aerosol data at varying spatial resolutions (1, 3, and 10 km) and Level 3 data at 1 degree. Local and global applications have benefited from the coarse-resolution gridded data sets (i.e., Level 3, 1 degree), as they are easier to use: the data volume is low, and several online and offline tools are readily available to access and analyze the data with minimal computing resources. At the same time, researchers who require data at much finer spatial scales have to go through a challenging process of obtaining, processing, and analyzing larger volumes of data that require high-end computing resources and coding skills. Therefore, we created a high-spatial-resolution (high-resolution gridded (HRG), 0.1 × 0.1 degree) daily and monthly aerosol optical depth (AOD) product by combining two MODIS operational algorithms, namely Deep Blue (DB) and Dark Target (DT). The new HRG AODs meet the accuracy requirements of Level 2 AOD data and provide the same or greater spatial coverage on daily and monthly scales. The data sets are provided in daily and monthly files through an open FTP server, with Python scripts to read and map the data. The reduced data volume, together with an easy-to-use format and tools to access the data, will encourage more users to utilize the data for research and applications.
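The gridding step this product relies on, aggregating point retrievals onto a regular 0.1 × 0.1 degree grid, can be sketched with plain 2-D binning; this is a hypothetical illustration with names of our choosing, not the DB/DT merging logic of the actual HRG product.

```python
import numpy as np

def grid_aod(lat, lon, aod, res=0.1):
    """Average point (Level-2-style) AOD retrievals onto a regular
    res-degree latitude/longitude grid; cells with no data are NaN."""
    nlat, nlon = round(180 / res), round(360 / res)
    lat_edges = np.linspace(-90.0, 90.0, nlat + 1)
    lon_edges = np.linspace(-180.0, 180.0, nlon + 1)
    # Sum of AOD values per cell, and number of retrievals per cell
    total, _, _ = np.histogram2d(lat, lon, bins=[lat_edges, lon_edges],
                                 weights=aod)
    count, _, _ = np.histogram2d(lat, lon, bins=[lat_edges, lon_edges])
    return np.where(count > 0, total / np.maximum(count, 1), np.nan)
```

At 0.1 degree resolution this yields a 1800 × 3600 array, i.e. the cell-mean AOD field the abstract describes, with empty cells left as NaN.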
5. Grünewald, T., Y. Bühler, and M. Lehning. "Elevation dependency of mountain snow depth". Cryosphere 8, no. 6 (December 20, 2014): 2381–94. http://dx.doi.org/10.5194/tc-8-2381-2014.

Abstract:
Elevation strongly affects the quantity and distribution patterns of precipitation and snow. Positive elevation gradients were identified by many studies, usually based on data from sparse precipitation stations or snow depth measurements. We present a systematic evaluation of the elevation–snow depth relationship. We analyse areal snow depth data obtained by remote sensing for seven mountain sites, close to the time of the maximum seasonal snow accumulation. Snow depths were averaged to 100 m elevation bands and then related to their respective elevation level. The assessment was performed at three scales: (i) the complete data sets (10 km scale), (ii) sub-catchments (km scale), and (iii) slope transects (100 m scale). We show that most elevation–snow depth curves at all scales are characterised by a single shape. Mean snow depths increase with elevation up to a certain level, where they have a distinct peak, followed by a decrease at the highest elevations. We explain this typical shape with a generally positive elevation gradient of snowfall that is modified by the interaction of snow cover and topography. These processes are preferential deposition of precipitation and redistribution of snow by wind, sloughing and avalanching. Furthermore, we show that the elevation level of the peak of mean snow depth correlates with the dominant elevation level of rocks (if present).
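The band-averaging step described in the abstract (averaging snow depths over 100 m elevation bands before relating them to elevation) can be sketched as follows; function and variable names are assumptions for illustration, not the authors' processing chain.

```python
import numpy as np

def mean_depth_by_band(elevation, snow_depth, band=100.0):
    """Group snow-depth measurements into fixed elevation bands
    (default 100 m) and return (band lower edges, mean depth per band)."""
    elevation = np.asarray(elevation, dtype=float)
    snow_depth = np.asarray(snow_depth, dtype=float)
    idx = np.floor(elevation / band).astype(int)   # band index per sample
    bands = np.unique(idx)
    means = np.array([snow_depth[idx == b].mean() for b in bands])
    return bands * band, means
```

Plotting the band means against the band edges yields the elevation–snow depth curves whose single characteristic shape the paper reports.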
6. Tavkhelidze, Avto, Amiran Bibilashvili, Larissa Jangidze, and Nima E. Gorji. "Fermi-Level Tuning of G-Doped Layers". Nanomaterials 11, no. 2 (February 17, 2021): 505. http://dx.doi.org/10.3390/nano11020505.

Abstract:
Recently, geometry-induced quantum effects were observed in periodic nanostructures. Nanograting (NG) geometry significantly affects the electronic, magnetic, and optical properties of semiconductor layers. Silicon NG layers exhibit geometry-induced doping. In this study, G-doped junctions were fabricated and characterized, and the Fermi-level tuning of the G-doped layers by changing the NG depth was investigated. Samples with various indent depths were fabricated using laser interference lithography and a consecutive series of reactive ion etching steps. Four adjacent areas with NG depths of 10, 20, 30, and 40 nm were prepared on the same chip. A Kelvin probe was used to map the work function and determine the Fermi level of the samples. The G-doping-induced Fermi-level increase was recorded for eight sample sets cut separately from p-, n-, p+-, and n+-type silicon substrates. The maximum increase in the Fermi level was observed at a 10 nm depth, and this decreased with increasing indent depth in the p- and n-type substrates. In particular, this reduction was more pronounced in the p-type substrates. However, the Fermi-level increase in the n+- and p+-type substrates was negligible. The obtained results are explained using the G-doping theory and the G-doped-layer formation mechanism introduced in previous works.
7. Lowrey, Wilson, Ryan Broussard, and Lindsey A. Sherrill. "Data journalism and black-boxed data sets". Newspaper Research Journal 40, no. 1 (March 2019): 69–82. http://dx.doi.org/10.1177/0739532918814451.

Abstract:
This study explores the level of scrutiny that data journalists from national, local, traditional, and digital outlets apply to data sets and data categories, and the reasons that this scrutiny varies. The study applies a sociology-of-quantification framework that assumes a tendency for data categories to become “black-boxed”, or taken for granted and unquestioned. Results of in-depth interviews with 15 data journalists suggested that these journalists were more concerned with data accessibility and ease of use than with the validity of data categories, though this varied across outlet size and level of story complexity.
8. Harms-Ringdahl, Lars. "Analysis of Results from Event Investigations in Industrial and Patient Safety Contexts". Safety 7, no. 1 (March 5, 2021): 19. http://dx.doi.org/10.3390/safety7010019.

Abstract:
Accident investigations are probably the most common approach to evaluate the safety of systems. The aim of this study is to analyse event investigations and especially their recommendations for safety reforms. Investigation reports were studied with a methodology based on the characterisation of organisational levels and types of recommendations. Three sets of event investigations from industrial companies and hospitals were analysed. Two sets employed an in-depth approach, while the third was based on the root-cause concept. The in-depth approach functioned in a similar way for both industrial organisations and hospitals. The number of suggested reforms varied between 56 and 143 and was clearly greater for the industry. Two sets were from health care, but with different methodologies. The number of suggestions was eight times higher with the in-depth approach, which also addressed higher levels in the organisational hierarchy and more often safety management issues. The root-cause investigations had a clear emphasis on reforms at the local level and improvement of production. The results indicate a clear need for improvements of event investigations in the health care sector, for which some suggestions are presented.
9. Letang, D. L., and W. J. de Groot. "Forest floor depths and fuel loads in upland Canadian forests". Canadian Journal of Forest Research 42, no. 8 (August 2012): 1551–65. http://dx.doi.org/10.1139/x2012-093.

Abstract:
Forest floor data are important for many forest resource management applications. In terms of fire and forest carbon dynamics, these data are critical for modeling direct carbon emissions from wildfire in Canadian forests because forest floor organic material is usually the greatest emissions source. However, there are very few data available to initialize wildfire emission models. Six data sets representing 41 534 forest stands across Canada were combined to provide summary statistics and to analyze factors controlling forest floor fuel loads and depths. The impacts of dominant tree species, ecozone, drainage-class, and age-class data on forest floor fuel loads and depth were examined using ANOVA and regression. All four parameters were significant factors affecting forest floor fuel load and depth, but only tree species and ecozone were substantially influential. Although forest floor depths summarized in this study are similar to those of previous studies, forest floor fuel loads are higher. Average forest floor fuel loads and depths are summarized by species and ecozone and can be used to initialize dynamic stand-level forest models.
10. Xi, Qingkui, Weiming Wu, Junjie Ji, Zhenghui Zhang, and Feng Ni. "Comparing the Level of Commitment to In-Depth Reference and Research Support Services in Two Sets of Chinese Universities". Science & Technology Libraries 38, no. 2 (March 7, 2019): 204–23. http://dx.doi.org/10.1080/0194262x.2019.1583624.


Dissertations on the topic "Depth level-sets"

1. Armaut, Elisabeth. "Estimation d'ensembles de niveau d'une fonction de profondeur pour des données fonctionnelles. Applications au clustering et à la théorie du risque". Electronic thesis or dissertation, Université Côte d'Azur, 2024. http://www.theses.fr/2024COAZ5021.

Abstract:
Statistical depth functions play a fundamental role in analyzing and characterizing complex data structures. Depth functions provide a measure of centrality or outlyingness for individual observations or entire datasets, aiding in the understanding of their relative positions and underlying distributions. The concepts related to depth, as found in the literature, originate from the notion of Tukey's depth, also known as the median depth, introduced by the statistician John W. Tukey in his 1975 article "Mathematics and the Picturing of Data" [170]. The fundamental idea underlying Tukey's depth is to generalize the univariate median of a one-dimensional dataset to higher dimensions. First, our interest focuses on multivariate depths, followed by functional depths, for both of which we give an overall review in Chapter 1. In the second part of this thesis, Chapter 2, we undertake a rigorous study of multivariate depth level sets and establish several analytical and statistical properties. First, we show that, when the underlying multivariate depth is smooth enough, the symmetric difference between the estimated depth level set and its theoretical counterpart converges to zero, both in terms of the d-dimensional volume and in terms of the probability under the unknown distribution. Apart from these contributions, the novelty of Chapter 2 is the introduction and study of a depth-based risk measure called the Covariate-Conditional-Tail-Expectation (CCTE), within a risk-theory setup. Roughly, the CCTE aims at computing an average cost given that at least one of the risk factors at hand is 'high' in some direction. The latter risk area is modelled by a level set of low depth value. In contrast to risk measures based on distribution tails, our definition of the CCTE is direction-free, owing to the involvement of depth level sets. We establish that, as the sample size goes to infinity, the empirical depth-based CCTE is consistent for its theoretical version, and we provide rates of convergence for the depth-based CCTE, both for fixed risk levels and when the risk level goes to zero as the sample size goes to infinity. In this last case, we also analyze the behavior of the original CCTE definition based on a distribution function, a case that was not studied in [56]. On top of several simulations performed on the CCTE, we illustrate its usefulness on environmental data. The final part of this thesis, Chapter 3, wraps up our work: we contribute a new type of depth for functional data based on functional principal component analysis, built on a generic multivariate depth. In this view, we use the well-known Karhunen-Loève decomposition as a tool to project a centered square-integrable random process onto a finite linear combination of orthogonal functions called the principal components, and we apply a multivariate depth function to the vector of projected principal components. To the best of our knowledge, this is a novel approach in the functional depth literature. Naturally, we provide an estimator of our functional depth, for which we demonstrate uniform consistency with a rate of convergence. We complement our study with several simulations and real-data applications to functional classification, where our new depth equals or outperforms most conventional competitors.
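In symbols (the notation here is ours, not the thesis's), the depth-based CCTE described above can be written as a conditional expectation over a lower level set of the depth function: for a depth D, a cost Y, risk factors X, and a small level α,

```latex
\mathrm{CCTE}_\alpha(Y \mid X) \;=\; \mathbb{E}\bigl[\, Y \;\big|\; X \in L_\alpha \,\bigr],
\qquad
L_\alpha \;=\; \{\, x : D(x) \le \alpha \,\},
```

so that the conditioning event "at least one risk factor is high" is captured, free of any fixed direction, by the low-depth region $L_\alpha$.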

Books on the topic "Depth level-sets"

1. Bennett, D. Scott. Teaching the Scientific Study of International Processes. Oxford University Press, 2017. http://dx.doi.org/10.1093/acrefore/9780190846626.013.314.

Abstract:
The Scientific Study of International Processes (SSIP) is an approach aimed at teaching international politics scientifically. Teaching scientifically means teaching students how to use evidence to support or disprove some particular logical argument or hypothesis that reaches some level of generalization about relationships between concepts. Closely related to simply asking what evidence there is, is teaching students to address the breadth, depth, and quality of that evidence. The scientific approach may also draw attention to the logic of arguments and policies. Are policies, positions, and the arguments behind them logical? Or is some policy or position based on assumptions that are not logically related, or only true if certain auxiliary assumptions hold? Teaching methods for SSIP include comparative case studies, experiments and surveys, data sets, and game theory and simulation. Instructors also face several challenges when seeking to teach scientifically, in particular when they try to make time to teach methodology as part of an international politics course. Some problems are relatively easily overcome just by focusing on effective teaching. Others are unique to SSIP and cannot be dealt with quite so easily. Among these are the need to appeal to a broad audience, students' negative reactions to the term “science”, and the constraint of finite time in a course.
2. O’Leary, Brendan. A Treatise on Northern Ireland, Volume I. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780199243341.001.0001.

Abstract:
O’Leary’s authoritative treatment of the history of Northern Ireland and its current prospects is genuinely unique. Beginning with an in-depth account of the scale of the recent conflict, he sets out to explain why Northern Ireland recently had the highest incidence of political violence in twentieth-century western Europe. Volume 1 demonstrates the salience of the colonial past in accounting for current collective mentalities, institutions, and rivalrous animosities, culminating in a distinct comparative account of the partition of the island in 1920. The major moments in the development of Irish republicanism and Ulster unionism are freshly treated by this Irish-born political scientist who has spent thirty-five years mastering the relevant historiography. Volume 2 shows how Ulster Unionists improvised a distinctive control system, driven by their fear of abandonment by the metropolitan power in Great Britain, their anxieties about Irish nationalist irredentism, and their inherited settler colonial culture. British political institutions were exploited to organize a sustained political monopoly on power and to disorganize the cultural Catholic minority. At the same juncture, the Irish Free State’s punctuated movement from restricted dominion-level autonomy to sovereign republican independence led to the full-scale political decolonization of the South. Irish state-building had a price, however: it further estranged Ulster Unionists, and Northern nationalists felt abandoned. Volume 3 unpacks the consequences and takes the reader to the present, explaining Northern Ireland’s distinctive consociational settlement, accomplished in 1998, and its subsequently turbulent and currently imperiled implementation. An assessment of the confederation of the European Union and the prospects for an Irish confederation close the book, which vividly engages with feasible futures that may unfold from the UK’s exit from the EU.
3. O’Leary, Brendan. A Treatise on Northern Ireland, Volume II. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780198830573.001.0001.

Abstract:
O’Leary’s authoritative treatment of the history of Northern Ireland and its current prospects is genuinely unique. Beginning with an in-depth account of the scale of the recent conflict, he sets out to explain why Northern Ireland recently had the highest incidence of political violence in twentieth-century western Europe. Volume 1 demonstrates the salience of the colonial past in accounting for current collective mentalities, institutions, and rivalrous animosities, culminating in a distinct comparative account of the partition of the island in 1920. The major moments in the development of Irish republicanism and Ulster unionism are freshly treated by this Irish-born political scientist who has spent thirty-five years mastering the relevant historiography. Volume 2 shows how Ulster Unionists improvised a distinctive control system, driven by their fear of abandonment by the metropolitan power in Great Britain, their anxieties about Irish nationalist irredentism, and their inherited settler colonial culture. British political institutions were exploited to organize a sustained political monopoly on power and to disorganize the cultural Catholic minority. At the same juncture, the Irish Free State’s punctuated movement from restricted dominion-level autonomy to sovereign republican independence led to the full-scale political decolonization of the South. Irish state-building had a price, however: it further estranged Ulster Unionists, and Northern nationalists felt abandoned. Volume 3 unpacks the consequences and takes the reader to the present, explaining Northern Ireland’s distinctive consociational settlement, accomplished in 1998, and its subsequently turbulent and currently imperiled implementation. An assessment of the confederation of the European Union and the prospects for an Irish confederation close the book, which vividly engages with feasible futures that may unfold from the UK’s exit from the EU.
4. O’Leary, Brendan. A Treatise on Northern Ireland, Volume III. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780198830580.001.0001.

Abstract:
O’Leary’s authoritative treatment of the history of Northern Ireland and its current prospects is genuinely unique. Beginning with an in-depth account of the scale of the recent conflict, he sets out to explain why Northern Ireland recently had the highest incidence of political violence in twentieth-century western Europe. Volume 1 demonstrates the salience of the colonial past in accounting for current collective mentalities, institutions, and rivalrous animosities, culminating in a distinct comparative account of the partition of the island in 1920. The major moments in the development of Irish republicanism and Ulster unionism are freshly treated by this Irish-born political scientist who has spent thirty-five years mastering the relevant historiography. Volume 2 shows how Ulster Unionists improvised a distinctive control system, driven by their fear of abandonment by the metropolitan power in Great Britain, their anxieties about Irish nationalist irredentism, and their inherited settler colonial culture. British political institutions were exploited to organize a sustained political monopoly on power and to disorganize the cultural Catholic minority. At the same juncture, the Irish Free State’s punctuated movement from restricted dominion-level autonomy to sovereign republican independence led to the full-scale political decolonization of the South. Irish state-building had a price, however: it further estranged Ulster Unionists, and Northern nationalists felt abandoned. Volume 3 unpacks the consequences and takes the reader to the present, explaining Northern Ireland’s distinctive consociational settlement, accomplished in 1998, and its subsequently turbulent and currently imperiled implementation. An assessment of the confederation of the European Union and the prospects for an Irish confederation close the book, which vividly engages with feasible futures that may unfold from the UK’s exit from the EU.
5. Doak, Brian R. Heroic Bodies in Ancient Israel. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780190650872.001.0001.

Abstract:
The bodies of a people encode and continually retell the story of their families, cities, and nations. In the Hebrew Bible, the bodies of notable heroic figures—warriors, kings, and cultural founders—not only communicate values on an individual level but also bear meaning for the fate of the nation. The patriarch Jacob, who takes on the name of the nation, “Israel,” engages in an intense bodily drama by way of securing the family blessing and passing on his identity to the Tribes of Israel. Judges is a deeply bodily book: left-handed, mutilating and mutilated, long haired, and fractured like the nation itself, its warriors revel in bodies and violence. The David and Saul drama, throughout 1–2 Samuel, repeatedly juxtaposes the bodies of the two kings and sets them on a collision course. Saul’s body continues to act in strange and powerful ways beyond his death, and in the final episodes of Saul’s bone movement and reburial, the last heroic body goes underground. Thus, Israel’s heroic national body rises and falls on the bodies of its heroes, and the Hebrew Bible takes up a profound place in the ancient literary landscape in its treatment of heroic and body themes.

Book chapters on the topic "Depth level-sets"

1. Ouellette, Nadine, France Meslé, Jacques Vallin, and Jean-Marie Robine. "Supercentenarians and Semi-supercentenarians in France". In Demographic Research Monographs, 105–23. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-49970-9_9.

Abstract:
The purpose of this study is twofold. Firstly, it attempts to exhaustively identify cases of French supercentenarians and semi-supercentenarians and to validate their alleged age at death. Secondly, it seeks to carefully uncover patterns and trends in probabilities of death and life expectancy at very old ages in France. We use three sets of data with varying degrees of accuracy and coverage: nominative transcripts from the RNIPP (Répertoire national d’identification des personnes physiques), death records from the vital statistics system, and “public” lists of individual supercentenarians. The RNIPP stands out as the most reliable source. Based on all deaths registered in the RNIPP at the alleged ages of 110+ for extinct cohorts born between 1883 and 1901, errors are few, at least for individuals who were born and died in France. For alleged semi-supercentenarians, age validation on a very large sample shows that errors are extremely rare, suggesting that the RNIPP data can be used without any verification up to age 108 at the minimum. Moreover, a comparison with “public” lists of individual supercentenarians reveals only a single missing occurrence in the RNIPP transcripts since 1991. While the quality of vital statistics data remains quite deficient at very old ages compared to the RNIPP, the analytical results show a significant improvement over time at younger old ages. Our RNIPP-based probabilities of death for females appear to level off at 0.5 between ages 108 and 111, but the data become too scarce afterwards to assess the trend. We also obtain a quite low life expectancy value of 1.2 years at age 108.
2

Sborgi, Anna Viola. „Housing Problems: Britain’s Housing Crisis and Documentary“. In Cinema of Crisis, 180–97. Edinburgh University Press, 2020. http://dx.doi.org/10.3366/edinburgh/9781474448505.003.0012.

Annotation:
This chapter explores recent documentary production on the housing crisis in the UK and sets it within the wider context of social and economic crisis at the European level and beyond. I argue that this output not only provides an in-depth representation of growing social and economic inequality in housing, but also (through different channels of distribution and in its interconnection with the world beyond the screen, housing activism in particular) increasingly shapes the debate on the home, opening a potential platform for discussion that goes well beyond the national level.
3

Zhan, Tao. „Multi-Granulation-Based Optimal Scale Selection in Multi-Scale Information Systems“. In Advances in Systems Analysis, Software Engineering, and High Performance Computing, 160–72. IGI Global, 2024. http://dx.doi.org/10.4018/979-8-3693-4292-3.ch006.

Annotation:
This chapter establishes belief and plausibility functions within the context of multigranulation, delving into the structures of belief and plausibility. The focus extends to the examination of multigranulation rough sets within multi-scale information systems. Subsequently, to determine the optimal level within the multigranulation rough set, a method for optimal scale selection is introduced. This method caters to diverse requirements in optimistic and pessimistic multigranulation within the multi-scale information system. In-depth analyses of the characteristics of optimistic and pessimistic multigranulation optimal scale selection for multi-scale information systems are conducted separately, revealing intrinsic connections between distinct optimal scale selection methodologies.
4

Baines, Susan, Judit Csoba, Flórián Sipos and Andrea Bassi. „Social Investment in welfare: a sub-national perspective“. In Implementing Innovative Social Investment, 1–22. Policy Press, 2019. http://dx.doi.org/10.1332/policypress/9781447347828.003.0001.

Annotation:
Taking a critical but sympathetic perspective, this chapter discusses recent debates around Social Investment as a new welfare paradigm. The scholarly and policy literature on Social Investment focuses on aggregate effects and macro-comparative analysis, with limited reference to local and micro-level implementation and practice. Innovation is an essential element of Social Investment, as social policies require constant adaptation to new challenges, yet the literatures on Social Investment and social innovation rarely connect. This chapter sets the scene for the edited collection, highlighting the aim to advance empirical and conceptual insight into Social Investment from a social innovation and a sub-national perspective. It briefly introduces in-depth, multi-method case studies in ten EU countries of innovative, strategic approaches to delivering Social Investment policy at a sub-national level.
5

Harcourt, Alison, George Christou and Seamus Simpson. „Informal Governance and Decision-making Through Multiple Streams“. In Global Standard Setting in Internet Governance, 15–33. Oxford University Press, 2020. http://dx.doi.org/10.1093/oso/9780198841524.003.0002.

Annotation:
Chapter 2 develops the analytical framework for the book. It sets out the book's twin theoretical premises, the first relating to global governance at the level of actor influence and the second applying the multiple streams (MS) framework to the level of decision-making within standards-developing organizations (SDOs). It provides a critical discussion of the literature on the influence of different actors in international self-regulatory fora and explains how certain actors can uphold the public interest in fora where states and public actors are largely absent. The multiple streams approach provides an in-depth account of how problems appear on SDO agendas, the nature of the contestation in discussions of standards development, and the extent to which, and how, standards are translated into final adoption. The chapter identifies mechanisms at the conceptual level which facilitate our understanding of how deliberation over standards development leads to agreement and policy decisions.
6

Costumado, Maximino G. S., Delma C. Da Silva and A. D. Chemane. „Mozambique's Singular Path in Southern Africa's Coalition Governance Landscape“. In Advances in Public Policy and Administration, 92–111. IGI Global, 2024. http://dx.doi.org/10.4018/979-8-3693-1654-2.ch006.

Annotation:
This in-depth comparative study explores the complexities of coalition governance in Southern Africa, focusing on Mozambique's remarkable absence of coalition governments, which sets it apart from neighboring countries where coalition experiences and party alliances are firmly established. The tumultuous contestation of the October 2023 municipal election results, marked by allegations of fraud and favoritism, sheds light on critical challenges within Mozambique's electoral system. This contentious episode highlights the need for comprehensive changes in the country's electoral regulatory framework, which is crucial for alignment with regional contexts and for cultivating a political environment conducive to joint governance, particularly at the local level. The framework also requires re-evaluation leading to improvements that increase transparency, fairness, and public confidence in the nation's electoral processes, and that could bring the country closer to standards of electoral integrity.
7

Goodfellow, Tom. „Transformation and divergence“. In Politics and the Urban Frontier, 30–54. Oxford University PressOxford, 2022. http://dx.doi.org/10.1093/oso/9780198853107.003.0002.

Annotation:
Abstract This chapter elaborates the book's theoretical framework in depth. It begins with a discussion of some contemporary strands of urban theory and the extent to which these engage with or deflect questions of causality. Drawing on Critical Realism, it explains the book's interdisciplinary approach to causal force, particularly in relation to questions of structure and infrastructure, and the human and non-human elements that shape urban transformations. It then sets out in detail each of the four causal factors used to explain divergent trajectories of urban development: the distribution of associational power, the pursuit of social legitimacy, modalities of political informality, and legacies and practices of infrastructural reach. Finally, it discusses how these are conceptualized in relation to one another and at different scales, building a ‘level-abstracted’ analytical lens on the politics of urban transformation.
8

Martell, Luke. „Introduction“. In Alternative Societies, 1–5. Policy Press, 2023. http://dx.doi.org/10.1332/policypress/9781529229660.003.0001.

Annotation:
The introduction sets out the aim of the book to go beyond critique of present society and look for alternatives. It outlines: the breadth of alternatives covered, in theory and practice, current and future; the thrust of the book in going beyond polarizations and dichotomies and its argument for pluralism and complexity in pursuing alternatives; the emphasis on socialism as coming out of the many alternatives but also how socialism is pursued in the book, seeing the need for a pluralist and liberal approach. The introduction explains why the following themes were chosen for more in-depth analysis: utopianism, socialism, the democratic economy, and local/global levels. The chapter outlines how the book is international, discussing alternatives at a global level and located and relevant internationally, including in the Global South. The introduction also outlines how the book is designed for students and lay readers as well as experts.
9

Chamon, Merijn, Annalisa Volpato and Mariolina Eliantonio. „Introduction“. In Boards of Appeal of EU Agencies, 1–7. Oxford University Press, 2022. http://dx.doi.org/10.1093/oso/9780192849298.003.0001.

Annotation:
This chapter provides the introduction to the edited volume by spotlighting the Boards of Appeal (BoAs) in the context of the rise of agencies in the EU administration. It sets out how BoAs are internal review bodies of EU agencies that provide a certain level of administrative protection, which must be exhausted before private parties can seize the EU Courts but which, in principle, also allows a more in-depth review of agency decisions than the review offered by the courts. Sketching this context, the chapter identifies the overarching research question of the edited volume: how the Boards of Appeal should be conceptualized and assessed, both as a mechanism of legal protection and as to the degree to which they deliver on their theoretical potential for more intense scrutiny. The introduction also clarifies the approach adopted in the edited volume, the first part of which is devoted to a series of case studies of specific BoAs or agencies, while the second part brings together chapters devoted to horizontal issues.
10

Lantschner, Emma. „Reaction to Discrimination by Judicial Mechanisms“. In Reflexive Governance in EU Equality Law, 127–97. Oxford University Press, 2021. http://dx.doi.org/10.1093/oso/9780192843371.003.0005.

Annotation:
Chapter 4 pools the results of more than fifteen years of implementation of those provisions of the RED and EED that give interested organizations an important role in providing a more effective level of protection to victims of discrimination. The chapter comparatively assesses the legal standing of NGOs in discrimination disputes and the extent to which Member States have also introduced collective forms of redress. It then carries out in-depth case studies of implementation practice in Romania, Hungary, and Germany to understand which factors (legal framework or implementation practice) influence the success or failure of a system. The research finds that the positive results of NGO litigation at the individual and societal level were achieved not because of, but despite, the role played by state authorities. Legal challenges relate to limited legal standing, sometimes restricted to certain levels of jurisdiction or certain bodies, and the fact that collective redress is foreseen in only about half of the Member States. Even where legislation is permissive, practical challenges include insufficient territorial coverage by NGOs acting in support of victims of discrimination, lack of funding, lack of awareness among victims that they can turn to NGOs, lack of referencing systems, and an increasingly hostile environment vis-à-vis NGOs working for vulnerable groups. On the basis of these findings, structural, process, and outcome indicators are deduced to monitor the effective implementation of the provisions giving a role to NGOs in judicial dispute resolution.

Conference papers on the topic "Depth level-sets"

1

Wang, Lifu, Bo Shen, Ning Zhao and Zhiyuan Zhang. „Is the Skip Connection Provable to Reform the Neural Network Loss Landscape?“ In Twenty-Ninth International Joint Conference on Artificial Intelligence and Seventeenth Pacific Rim International Conference on Artificial Intelligence {IJCAI-PRICAI-20}. California: International Joint Conferences on Artificial Intelligence Organization, 2020. http://dx.doi.org/10.24963/ijcai.2020/387.

Annotation:
The residual network is now one of the most effective structures in deep learning; it uses skip connections to "guarantee" that performance will not get worse. However, the non-convexity of the neural network makes it unclear whether skip connections provably improve the learning ability, since the nonlinearity may create many local minima. Previous work [Freeman and Bruna, 2016] showed that, despite the non-convexity, the loss landscape of the two-layer ReLU network has good properties when the number m of hidden nodes is very large. In this paper, we follow this line to study the topology (sub-level sets) of the loss landscape of deep ReLU neural networks with a skip connection and theoretically prove that the skip-connection network inherits the good properties of the two-layer network, and that skip connections help to control the connectedness of the sub-level sets, such that any local minimum worse than the global minimum of some two-layer ReLU network will be very "shallow". The "depth" of these local minima is at most O(m^(η-1)/n), where n is the input dimension and η < 1. This provides a theoretical explanation for the effectiveness of the skip connection in deep learning.
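The intuition that a skip connection cannot make a block worse than the identity map can be seen in a minimal two-layer ReLU block. This is a toy numpy sketch of the residual structure only, not the paper's construction:

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def plain_block(x, W1, W2):
    # plain two-layer ReLU network: x -> W2 @ relu(W1 @ x)
    return W2 @ relu(W1 @ x)

def skip_block(x, W1, W2):
    # same block with a skip connection: the all-zero-weight setting
    # now realizes the identity map instead of the zero map
    return x + W2 @ relu(W1 @ x)

x = np.array([1.0, -2.0, 3.0])
W1 = np.zeros((4, 3))
W2 = np.zeros((3, 4))
zeroed_plain = plain_block(x, W1, W2)  # collapses to the zero vector
zeroed_skip = skip_block(x, W1, W2)    # returns x unchanged
```

Because zero residual weights recover the identity, bad configurations of the residual branch cost little, which is the informal picture behind the "shallow local minima" result above.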
2

Habibullah, Saleha. „Extension of a 2018 sample-based study on the level of awareness regarding big data in the statistics community of Pakistan“. In Decision Making Based on Data. International Association for Statistical Education, 2019. http://dx.doi.org/10.52041/srap.19202.

Annotation:
Whereas statisticians in advanced countries are developing new methodologies to capture trends inherent in extremely large data sets, the statistics communities of developing countries are deficient in this regard. In 2018, a sample survey was carried out in Pakistan to ascertain the level of awareness regarding big data among academics and practitioners of statistics in the country. The survey revealed that, for many terms related to big data, there was little awareness among the statisticians of the country. This paper extends the 2018 study in terms of coverage, scope and depth of analysis. Results of the extended survey seem to confirm the findings of the previous year, indicating that multi-pronged strategies are needed to create awareness in the statistical community of Pakistan regarding big data and its role in evidence-based decision-making conducive to the development and progress of the country.
3

Prokša, Miroslav, Zuzana Haláková and Anna Drozdíková. „CHEMICAL EQUILIBRIUM IN TERMS OF ITS CONCEPTUAL UNDERSTANDING IN THE CONTEXT OF SUBMICROSCOPIC, MACROSCOPIC AND SYMBOLIC INTERPRETATION BY LEARNERS“. In Proceedings of the 2nd International Baltic Symposium on Science and Technology Education (BalticSTE2017). Scientia Socialis Ltd., 2017. http://dx.doi.org/10.33225/balticste/2017.104.

Annotation:
The research addressed the following question: what is the depth and breadth of 16-year-old learners' knowledge of chemical equilibrium in Slovakia? The main aim was to assess conceptual understanding of this part of chemistry in the context of submicroscopic, macroscopic and symbolic representations. A special research tool consisting of five sets of tasks was created for this purpose, and the sample comprised 473 learners. The results indicate that their knowledge remains largely at the level of memory reproduction and algorithmic use, and that learners struggle with a genuine conceptual understanding of the concept. Keywords: chemical equilibrium, submicroscopic, macroscopic and symbolic interpretation, conceptual understanding.
4

Arntsen, Martin, Juliane Borge, Ole-Hermann Strømmesen and Edmond Hansen. „The Effect of Temporal Length of Current Measurements on the Derived Design Level“. In ASME 2018 37th International Conference on Ocean, Offshore and Arctic Engineering. American Society of Mechanical Engineers, 2018. http://dx.doi.org/10.1115/omae2018-77769.

Annotation:
The duration of current measurements is often short, ranging from a few weeks up to a year. Applying extreme value statistics to derive design levels requires relatively long time series. To mitigate the lack of long-term measurements, the Norwegian standard NS9415 for fish farm design requires the design level with a 50-year return period to be derived by multiplying the current maximum from month-long current measurements by a prescribed conversion factor of 1.85. Here we use twelve data sets of yearlong coastal current measurements to explore the validity of this factor. For each yearlong time series, a design level with a 50-year return period is calculated by extreme value statistics and used to estimate the conversion factor. The mean value of the resulting conversion factor is close to that of NS9415: 1.85 and 1.80 at 5 and 15 m depth, respectively. However, the spread in values is large, both geographically and between months. A conversion factor ranging from 1 to 4 reflects the different relative dominance of the driving forces in different coastal regions and seasons. The absence of a significant seasonal cycle in the conversion factors calculated here illustrates the difficulty of adjusting for season. The results illustrate and quantify the uncertainty and, often, the lack of conservatism in design levels derived from month-long current observations.
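The conversion-factor idea can be sketched with a method-of-moments Gumbel fit to monthly current maxima. This is an illustrative recipe only, not the NS9415 procedure or the paper's exact extreme-value analysis; the function names and the moment-based fit are our assumptions:

```python
import numpy as np

EULER_GAMMA = 0.5772156649015329

def gumbel_fit(block_maxima):
    # method-of-moments fit of a Gumbel distribution to block maxima
    mean, std = np.mean(block_maxima), np.std(block_maxima, ddof=1)
    beta = std * np.sqrt(6.0) / np.pi   # scale parameter
    mu = mean - EULER_GAMMA * beta      # location parameter
    return mu, beta

def return_level(mu, beta, n_blocks):
    # level exceeded on average once every n_blocks blocks
    return mu - beta * np.log(-np.log(1.0 - 1.0 / n_blocks))

def conversion_factor(monthly_maxima, return_period_years=50):
    # 50-year design level relative to a typical monthly maximum
    mu, beta = gumbel_fit(monthly_maxima)
    design = return_level(mu, beta, 12 * return_period_years)
    return design / np.mean(monthly_maxima)
```

For twelve monthly maxima with moderate spread this yields a factor of order 2, inside the 1 to 4 range the abstract reports; the factor grows with the spread of the maxima, which mirrors the geographic variability the study finds.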
5

Biswas, Arpan, Yong Chen and Christopher Hoyle. „A Bi-Level Optimization Approach for Energy Allocation Problems“. In ASME 2018 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2018. http://dx.doi.org/10.1115/detc2018-85139.

Annotation:
In a previous paper [1], we integrated the Robust Optimization framework with the Real Options model to evaluate flexibility, introducing the Flexible-Robust Objective. Flexibility is defined as the energy left to allocate after meeting daily demands. This integration proved more efficient for risk evaluation in energy allocation problems. However, it has some limitations in applying the operational and physical constraints of the reservoirs. In this paper, these limitations are analyzed in depth. To overcome them and ensure a conceptually correct approach, a bi-level programming approach has been introduced in the second stage of the model to solve the energy allocation problem. We call the proposed model Two-Stage, Bi-Level Flexible-Robust Optimization. Stage 1 provides the maximum total flexibility that can be allocated throughout the optimization period. Stage 2 uses bi-level optimization: the upper level sets the target allocation of flexibility in each iteration and maximizes net revenue, with the allocated flexibility evaluated by the real options model; the lower level minimizes the deviation between the upper-level target and the achievable solution, ensuring no violation of the physical and operational constraints of the reservoirs. Some compatibility issues in integrating the two levels have been identified and resolved; the model provides an optimal achievable allocation of flexibility by maximizing net revenue and minimizing constraint violations. Uncertainty in the objective function and constraints is handled by conversion into a robust objective and probabilistic constraints, respectively. Both a classical method (SQP) and an evolutionary method (GA) with continuous decision variables have been applied to the optimization problem, and the results are compared.
The results are also compared with the simplified version in the previous paper, which was limited to randomly generated discrete decision variables; the new results provide an 8% improvement over the previous simplified model.
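The two-level structure described above (an upper level that proposes a target, a lower level that finds the nearest feasible allocation) can be caricatured in a few lines. This is a toy sketch of the bi-level idea only; the paper's model additionally involves robust objectives, real-options valuation, and reservoir constraints:

```python
def lower_level(target, lo, hi):
    # nearest allocation to the upper-level target that respects
    # the physical/operational bounds [lo, hi]
    return min(max(target, lo), hi)

def upper_level(revenue, lo, hi, candidate_targets):
    # pick the target whose feasible allocation maximizes revenue
    best = max(candidate_targets,
               key=lambda t: revenue(lower_level(t, lo, hi)))
    return best, lower_level(best, lo, hi)

# Example: concave revenue curve, feasible band [2, 8]
rev = lambda x: 10 * x - x ** 2
target, alloc = upper_level(rev, 2.0, 8.0, [0.0, 2.0, 4.0, 5.0, 6.0, 10.0])
# alloc == 5.0: the unconstrained optimum lies inside the band
```

An infeasible target (e.g. 10.0) is simply projected back onto the band by the lower level, which is the role the paper assigns to its deviation-minimizing stage.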
6

Li, Yanping, Gordon Fredine, Yvan Hubert and Sherif Hassanien. „Making Integrity Decisions Using Metal Loss ILI Validation Process“. In 2016 11th International Pipeline Conference. American Society of Mechanical Engineers, 2016. http://dx.doi.org/10.1115/ipc2016-64601.

Annotation:
With the increased number of In-Line Inspections (ILI) on pipelines, it is important to evaluate ILI tool performance to support rational integrity decisions. API 1163, "In-Line Inspection Systems Qualification", outlines an ILI data set validation process that is mainly based on comparing ILI data with field measurements. The concept of comparing ILI results with previous ILI data is briefly mentioned in API 1163 Level 1 validation and discussed in detail in the CEPA metal loss ILI tool validation guidance document; however, the CEPA document recommends an approach different from API 1163. Although methodologies for validating ILI performance are available, beyond determining whether an inspection data set is acceptable, the role of ILI validation in integrity management decision making is not well defined in these documents. Enbridge has reviewed the API 1163 and CEPA methodologies and developed a process to validate metal loss ILI results. This process uses API 1163 as the tool performance acceptance criterion, while the CEPA method provides additional information such as depth over-call or under-call. The process captures the main concepts of both methodologies and adds a new dimension to the validation procedure by evaluating different corrosion morphologies, depth ranges, and proximity to the long seam and girth welds. The process also checks ILI results against previous ILI data sets and combines the results of several inspections. The validation results of one inspection indicate whether the inspection data set is acceptable based on the ILI specification; this information is useful for excavation selection. Tool performance review based on several inspection data sets identifies the strengths and weaknesses of an inspection tool; this information is used to ensure the tool selection is appropriate for the expected feature types on the pipeline.
Applications of the validation process are provided to demonstrate how the process can aid in making integrity decisions and managing metal loss threats.
7

Shady, Sally Fouad. „Traditional, Active and Problem-Based Learning Methods Used to Improve an Undergraduate Biomechanics Course“. In ASME 2018 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2018. http://dx.doi.org/10.1115/imece2018-87478.

Annotation:
Biomechanics is a core curriculum course taught in many biomedical engineering programs, and biomechanical analysis has become a necessary tool for both industry and research when developing a medical device. Despite its significance both inside and outside the classroom, most students struggle to master biomechanical concepts effectively. Biomechanics requires adaptive skill sets for solving a multitude of problems from various disciplines and physiological systems, yet many students taking biomechanics have not taken the foundational courses necessary for in-depth learning and mastery, which limits their ability to solve complex problems requiring strong foundations in statics, dynamics, fluid mechanics, and physiology. Active learning (AL) and problem-based learning (PBL) are techniques that have been widely used in medical education; they allow faculty to place engineering concepts in the context of disease and to solve real-world medical problems. This study investigates using both traditional and problem-based learning pedagogy to enhance student learning in a senior-level undergraduate biomechanics course. Results have shown an increase in student performance and in self-assessment scores.
8

WILLIAMS, W. M. „Yield stress and work hardening behavior of extruded AA6082 profiles under different homogenization and extrusion conditions“. In Material Forming. Materials Research Forum LLC, 2023. http://dx.doi.org/10.21741/9781644902479-51.

Annotation:
Abstract. The mechanical properties of extruded AlMgSi alloys are affected by the thermo-mechanical parameters applied during the entire production process. In the following, the effect of different extrusion speeds and homogenization conditions, paired with either air or water quenching, is examined for four different sets of rectangular hollow AA6082-T4 profiles. These profiles were fabricated and extruded under industrial conditions, and selected cross sections of each profile were examined by optical microscopy to determine the microstructure and level of recrystallization. Uniaxial tension testing was used to explore the effect of homogenization conditions and extrusion cooling rate on mechanical properties. Tensile tests showed that the water-quenched extrusions had higher yield and ultimate strengths than the air-quenched extrusions. Moreover, samples with a recrystallized microstructure typically showed a larger standard deviation of mechanical properties, which may lead to product quality and consistency issues in metal forming operations. Overall, the present work provides a more in-depth understanding of how the selected thermo-mechanical parameters affect the resulting properties of such profiles, which can further contribute to expanding the potential for effective accommodation of extrusion parameters for zero-defect products.
9

Lim, Shi Ying Candice, Bradley Adam Camburn, Diana Moreno, Zack Huang and Kristin Wood. „Design Concept Structures in Massive Group Ideation“. In ASME 2016 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2016. http://dx.doi.org/10.1115/detc2016-59805.

Annotation:
Empirical work in design science has highlighted that the process of ideation can significantly affect design outcomes. Exploring the design space with both breadth and depth increases the likelihood of achieving better design outcomes. Furthermore, iteratively attempting to solve challenging design problems in large groups over a short time period may be more effective than protracted exploration by an isolated set of individuals. There remains a substantial opportunity to explore the structure of various design concept sets. In addition, many empirical studies cap analysis at sample sizes of fewer than one hundred individuals, which has provided substantial, though partial, models of the ideation space. This work explores new territory in large-scale ideation. Two conditions are evaluated. In the first condition, an ideation session was run with 2400 practicing designers and engineers from one organization. In the second condition, 1000 individuals ideated on the same problem in a completely distributed environment, without awareness of each other. We compare properties of the solution sets produced by each of these groups and activities. Analytical tools from network modeling theory are applied, as well as traditional ideation metrics such as concept binning with saturation analysis. Structural network modeling is applied to evaluate the interconnectivity of design concepts; this is a strictly quantitative, and at the same time graphically expressive, means of evaluating the diversity of a design solution set. Observations indicate that the group condition approached saturation of distinct categories more rapidly than the individual, distributed condition. The total number of solution categories developed in the group condition was also higher, and individuals generally provided concepts across a greater number of solution categories in the group condition.
The indication for design practice is that groups of just under forty individuals would provide category saturation within group ideation for a system-level design, while distributed individuals may provide additional concept differentiation. This evidence can support the development of more systematic ideation strategies. Furthermore, we provide an algorithmic approach for the quantitative evaluation of variety in design solution sets using network analysis techniques. These methods can be used for complex or wicked problems and for system development where the design space is vast.
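A simple quantitative notion of solution-set variety of the kind gestured at here can be built from pairwise Jaccard similarity between concept descriptors. This is our illustrative stand-in, not the authors' network metric:

```python
from itertools import combinations

def jaccard(a, b):
    # overlap of two keyword sets: 1.0 identical, 0.0 disjoint
    a, b = set(a), set(b)
    union = a | b
    return len(a & b) / len(union) if union else 0.0

def variety_score(concepts):
    """1 - mean pairwise Jaccard similarity over a list of keyword
    sets: 0.0 for identical concepts, 1.0 for fully disjoint ones."""
    pairs = list(combinations(concepts, 2))
    if not pairs:
        return 0.0
    return 1.0 - sum(jaccard(a, b) for a, b in pairs) / len(pairs)
```

Treating concepts as nodes and high Jaccard overlap as edges would recover a graph on which the connectivity-based diversity measures described in the abstract could be computed.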
10

Hasan, Syed Danish, Nazrul Islam and Khalid Moin. „Structural Response of a Multi-Hinged Articulated Offshore Tower Under Seismic Excitation“. In ASME 2009 28th International Conference on Ocean, Offshore and Arctic Engineering. ASMEDC, 2009. http://dx.doi.org/10.1115/omae2009-79315.

Annotation:
Articulated towers are compliant offshore structures designed with a high degree of compliance in the horizontal direction while remaining relatively stiff in the vertical direction. The nonlinear effects due to large displacements, large rotations and high environmental forces are of prime importance in the analysis. This paper investigates the structural response of a 580 m high multi-hinged articulated tower under different seismic sea environments in a water depth of 545 m. The articulated tower is represented as an upright flexible pendulum supported on the sea bed by a mass-less rotational spring of zero stiffness, while its top rigidly supports a deck in the air, a concentrated mass above still water level (SWL). For computation of seismic loads, the tower is idealized as a "stick" model of finite elements with masses lumped at the nodes. The earthquake response is carried out by time-history analysis using sets of real Californian earthquake records. Disturbed water particle kinematics due to seismic shaking of the sea bed is taken into consideration. The nonlinear dynamic equation of motion is formulated using a Lagrangian approach, based on the energy principle relating the kinetic energy, potential energy and work of the system in terms of the rotational degree of freedom. The solution to the equation of motion is obtained by the Newmark-β scheme in the time domain, which treats the nonlinearities associated with the system in an iterative fashion. It is observed that with increasing water depth, additional hinges are required to compensate for the increased bending moment due to additional earthquake loads. Analysis results are compared and presented in the form of time histories and PSDFs of various responses, along with combined responses due to the horizontal and vertical components of ground motion using the direct sum and SRSS methods.

Reports by organizations on the topic "Depth level-sets"

1

Goulet, Christine, Yousef Bozorgnia, Norman Abrahamson, Nicolas Kuehn, Linda Al Atik, Robert Youngs, Robert Graves and Gail Atkinson. Central and Eastern North America Ground-Motion Characterization - NGA-East Final Report. Pacific Earthquake Engineering Research Center, University of California, Berkeley, CA, December 2018. http://dx.doi.org/10.55461/wdwr4082.

Full text of the source
Annotation:
This document is the final project report of the Next Generation Attenuation for Central and Eastern North America (CENA) project (NGA-East). The NGA-East objective was to develop a new ground-motion characterization (GMC) model for the CENA region. The GMC model consists of a set of new ground-motion models (GMMs) for the median and standard deviation of ground motions and their associated weights, to be used with logic trees in probabilistic seismic hazard analyses (PSHA). NGA-East is a large multidisciplinary project coordinated by the Pacific Earthquake Engineering Research Center (PEER) at the University of California. The project has two components: (1) a set of scientific research tasks, and (2) a model-building component following the framework of the "Senior Seismic Hazard Analysis Committee (SSHAC) Level 3" process (Budnitz et al. 1997; NRC 2012). Component (2) is built on the scientific results of component (1) of the NGA-East project. This report documents the tasks under component (2) of the project. Under component (1) of NGA-East, several scientific issues were addressed, including: (a) development of a new database of ground-motion data recorded in CENA; (b) development of a regionalized ground-motion map for CENA; (c) definition of the reference site condition; (d) simulations of ground motions based on different methodologies; and (e) development of numerous GMMs for CENA. The scientific tasks of NGA-East were all documented as a series of PEER reports. The scope of component (2) of NGA-East was to develop the complete GMC. This component was designed as a SSHAC Level 3 study with the goal of capturing the center, body, and range of technically defensible interpretations of ground motions in light of the available data and models. The SSHAC process involves four key tasks: evaluation, integration, formal review by the Participatory Peer Review Panel (PPRP), and documentation (this report).
Key tasks documented in this report include the review and evaluation of the empirical ground-motion database, the regionalization of ground motions, and the screening of sets of candidate GMMs. These are followed by the development of new median and standard deviation GMMs, the development of new analysis tools for quantifying the epistemic uncertainty in ground motions, and implementation guidelines for using the complete GMC in PSHA computations. Appendices include further documentation of the relevant SSHAC process and additional supporting technical documentation of numerous sensitivity-analysis results. The PEER reports documenting component (1) of NGA-East are also considered "attachments" to the current report and are all available online on the PEER website (https://peer.berkeley.edu/). The final NGA-East GMC model includes a set of 17 GMMs defined for 24 ground-motion intensity measures, applicable to CENA in the moment-magnitude range of 4.0 to 8.2 and covering distances up to 1500 km. Standard deviation models are also provided for site-specific analysis (single-station standard deviation) and for general PSHA applications (ergodic standard deviation). Adjustment factors are provided for source-depth effects and hanging-wall effects, as well as for hazard computations at sites in the Gulf Coast region.
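The logic-tree idea described in this abstract, several GMMs each carrying a weight, can be sketched as a weighted combination in natural-log ground-motion units. The median values and weights below are purely illustrative and are not taken from the NGA-East model, which uses 17 GMMs and formally derived weights.

```python
import math

# Hypothetical ln-median predictions from three candidate GMMs at one
# magnitude-distance scenario, with logic-tree weights summing to 1.
ln_medians = [-1.20, -1.05, -1.35]
weights = [0.5, 0.3, 0.2]

# Weighted mean of the branch medians in ln space
ln_mean = sum(w * x for w, x in zip(weights, ln_medians))

# Weight-averaged spread of the branches about that mean: a simple
# measure of the epistemic (model-to-model) uncertainty
epi_var = sum(w * (x - ln_mean) ** 2 for w, x in zip(weights, ln_medians))
epi_sigma = math.sqrt(epi_var)
```

In an actual PSHA, each branch is carried through the hazard integral separately and the weights are applied to the resulting hazard curves; collapsing to a mean and spread as above is only a compact illustration of how the weights enter.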
APA, Harvard, Vancouver, ISO and other citation styles
2

Study of Social Entrepreneurship and Innovation Ecosystems in the Latin American Pacific Alliance Countries: Regional Analysis: Chile, Colombia, Costa Rica, Mexico & Peru. Inter-American Development Bank, July 2016. http://dx.doi.org/10.18235/0009320.

Full text of the source
Annotation:
This report presents highlights from a more in-depth study of social entrepreneurship and innovation ecosystems in Chile, Colombia, Costa Rica, Mexico and Peru, carried out as part of a wider comparative study between the Latin American Pacific Alliance countries and six countries in Asia (Japan, South Korea, China, Singapore, Thailand and the Philippines). The study comprises global, regional and country-level perspectives as well as a detailed analysis of 25 examples of social enterprise within the two regions. We begin by providing the context of the Pacific Alliance agenda and observe the opportunity this regional integration effort may offer social enterprise across the region. Secondly, we outline some of the ways each country's ecosystem has evolved over the last few years in terms of public policies, intermediaries, financial support mechanisms and universities. Different stages of evolution can be observed depending on the sector. For example, Chile and Colombia have followed similar processes to develop public policies for social innovation (building on the maturity of their existing entrepreneurship and innovation support systems), whereas Costa Rica has leap-frogged this process with the creation of its new Social Innovation Council. Mexico and Colombia are leading the way in social impact investment, and Peru, although its ecosystem is far more incipient, has seen rapid growth in the last two years, which above all has stimulated social entrepreneurship activity within the university sector. Thirdly, we consider the different degrees of social and financial motivation of social enterprises, how these are made to fit within existing legal frameworks, and the profile of the social entrepreneur in the region.
Finally, we observe the emerging phenomenon of social innovation labs as a new way of responding to social problems using diverse systemic perspectives, new modes of experimentation and learning, and unique participatory design approaches.
APA, Harvard, Vancouver, ISO and other citation styles
