Dissertations on the topic "Information-Based Complexity"
Cite a source in APA, MLA, Chicago, Harvard, and other citation styles
Consult the top 26 dissertations for research on the topic "Information-Based Complexity."
Next to every work in the list of references there is an "Add to bibliography" option. Use it, and a bibliographic reference for the chosen work will be formatted automatically in the required citation style (APA, MLA, Harvard, Chicago, Vancouver, etc.).
You can also download the full text of the scholarly publication in PDF format and read an online abstract of the work, if the relevant parameters are present in its metadata.
Browse dissertations from a wide variety of disciplines and compile a correct bibliography.
Schmitt, Wagner. „A new 3D shape descriptor based on depth complexity and thickness information“. reponame:Biblioteca Digital de Teses e Dissertações da UFRGS, 2015. http://hdl.handle.net/10183/127030.
Geometric models play a vital role in several fields, from the entertainment industry to scientific applications. To reduce the high cost of model creation, reusing existing models is the solution of choice. Model reuse is supported by content-based shape retrieval (CBR) techniques that help find the desired models in massive repositories, many publicly available on the Internet. Key to efficient and effective CBR techniques are shape descriptors that accurately capture the characteristics of a shape and are able to discriminate between different shapes. We present a descriptor based on the distribution of two global features measured on a 3D shape, depth complexity and thickness, which respectively capture aspects of the geometry and topology of 3D shapes. The final descriptor, called DCT (depth complexity and thickness histogram), is a 2D histogram that is invariant to the translation, rotation and scale of geometric shapes. We efficiently implement the DCT on the GPU, allowing its use in real-time queries of large model databases. We validate the DCT with the Princeton and Toyohashi Shape Benchmarks, containing 1815 and 10000 models respectively. Results show that DCT can discriminate meaningful classes of these benchmarks, and is fast to compute and robust against shape transformations and different levels of subdivision and smoothness.
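The descriptor described above is, at its core, a normalized 2D histogram over two per-sample features. A minimal sketch of that idea follows; the bin count and min-max normalization here are illustrative assumptions, not the authors' exact scheme:

```python
def feature_histogram_2d(depth_complexity, thickness, bins=4):
    """Build a normalized 2D histogram from two per-sample feature lists.

    Using the *distribution* of features (rather than their positions on the
    surface) is what makes a descriptor of this kind invariant to translation
    and rotation; min-max normalization removes scale.
    """
    def minmax(values):
        lo, hi = min(values), max(values)
        return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

    n = len(depth_complexity)
    dc, th = minmax(depth_complexity), minmax(thickness)
    hist = [[0.0] * bins for _ in range(bins)]
    for a, b in zip(dc, th):
        i = min(int(a * bins), bins - 1)  # clamp the value-1.0 edge case
        j = min(int(b * bins), bins - 1)
        hist[i][j] += 1.0 / n             # histogram cells sum to 1
    return hist
```

Two such histograms can then be compared with any standard histogram distance to rank candidate shapes.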
Alamoudi, Rami Hussain. „Interaction Based Measure of Manufacturing Systems Complexity and Supply Chain Systems Vulnerability Using Information Entropy“. Scholarly Repository, 2008. http://scholarlyrepository.miami.edu/oa_dissertations/76.
Thost, Veronika. „Using Ontology-Based Data Access to Enable Context Recognition in the Presence of Incomplete Information“. Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2017. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-227633.
Dash, Santanu Kumar. „Adaptive constraint solving for information flow analysis“. Thesis, University of Hertfordshire, 2015. http://hdl.handle.net/2299/16354.
Der volle Inhalt der QuelleEugénio, António Luís Beja. „The information systems and technology innovation process: a study using an agent-based approach“. Master's thesis, Instituto Superior de Economia e Gestão, 2007. http://hdl.handle.net/10400.5/636.
An abstract agent-based model is used to study Information Systems and Information Technology innovation in an organizational realm, using a socio-cognitive approach. The conclusion is that the power of knowledge workers in the decision to adopt an IS/IT innovation within an organization varies with the degree to which their ideas match those of top management, while also depending on the transactions' depreciation rate, leading to strong fluctuations of power when the environment is unstable.
Syed, Tamseel Mahmood. „Precoder Design Based on Mutual Information for Non-orthogonal Amplify and Forward Wireless Relay Networks“. University of Akron / OhioLINK, 2014. http://rave.ohiolink.edu/etdc/view?acc_num=akron1392043776.
Clément, François. „An Optimization Perspective on the Construction of Low-Discrepancy Point Sets“. Electronic Thesis or Diss., Sorbonne université, 2024. http://www.theses.fr/2024SORUS138.
Discrepancy measures are metrics designed to quantify how well spread a point set is in a given space. Among these, the L∞ star discrepancy is arguably one of the most popular. Indeed, by the Koksma-Hlawka inequality, when replacing an integral by the average of function evaluations at specific points, the error made is bounded by a product of two terms, one depending only on the function and the other on the L∞ star discrepancy of the points. This leads to a variety of applications, from computer vision to financial mathematics and the design of experiments, where well-spread points covering a space are essential. Low-discrepancy sets used in such applications usually correspond to number-theoretic designs, with a wide variety of possible constructions. Despite the high demand in practice, the design of these point sets remains largely the work of mathematicians, often more interested in finding asymptotic bounds than in adapting the point sets to the desired applications. This results in point sets that, while theoretically excellent, sometimes leave a lot to be desired for applications, in particular high-dimensional ones. Indeed, the constructions are not tailored to the many different settings found in applications and are thus suboptimal. Furthermore, not only do we not know how low the discrepancy of point sets of a given size in a fixed dimension can go, but often we do not even know the discrepancy of existing constructions. This leaves essential questions unanswered in the design of low-discrepancy sets and sequences. In this thesis, we tackle the problem of constructing low-discrepancy sets from a computational perspective. With optimization approaches applied in isolation or on top of existing sets and sequences, we provide a diverse set of methods to generate excellent low-discrepancy sets, largely outperforming the discrepancy of known constructions in a wide variety of contexts.
In particular, we describe a number of examples, such as provably optimal sets for very few points in dimension 2, or improved sets of hundreds of points in moderate dimensions via subset selection. Finally, we extend recent work on greedy one-dimensional sequence construction to show that greedy L2 construction of point sets provides excellent empirical results with respect to the L∞ star discrepancy.
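For intuition, the L∞ star discrepancy of a small 2D point set can be evaluated by brute force over the critical grid spanned by the point coordinates. This is only a sketch of the definition; the thesis concerns far more efficient exact and heuristic computations:

```python
def linf_star_discrepancy(points):
    """Brute-force L-infinity star discrepancy of a 2D point set in [0,1]^2.

    D*(P) is the supremum over origin-anchored boxes of |A(box)/n - vol(box)|,
    where A counts the points inside. The supremum is approached on the grid
    of point coordinates (plus 1.0), checking both closed and open counts to
    cover boxes that just include or just exclude a point.
    """
    n = len(points)
    xs = sorted({p[0] for p in points} | {1.0})
    ys = sorted({p[1] for p in points} | {1.0})
    worst = 0.0
    for x in xs:
        for y in ys:
            vol = x * y
            closed = sum(px <= x and py <= y for px, py in points) / n
            open_ = sum(px < x and py < y for px, py in points) / n
            worst = max(worst, closed - vol, vol - open_)
    return worst
```

For example, a single point at (0.5, 0.5) has star discrepancy 0.75: the box reaching just past the point has volume 0.25 but contains the entire set.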
Domercant, Jean Charles. „ARC-VM: an architecture real options complexity-based valuation methodology for military systems-of-systems acquisitions“. Diss., Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/42928.
Мельничук, Андрій Богданович, and Andrii Melnychuk. „Методи захисту інформації в рамках предметно-орієнтованого проєктування інформаційних систем“. Master's thesis, ТНТУ, 2021. http://elartu.tntu.edu.ua/handle/lib/36742.
Domain-Driven Design (DDD) is an approach to software design. The method defines practices for communicating with specialists of the subject area, together with a set of design rules under which the final code reflects all the concepts of the domain itself. In some cases the separation of concerns envisioned by the domain-driven approach is difficult to achieve, namely when considering functionality that is independent of the domain yet closely intertwined with domain-related functionality. These issues concern the part of an application responsible for security. Unfortunately, the founder of the researched approach did not define exactly how such logic should be combined with the domain; therefore, different methods of data protection were researched and the best implementation options, applicable in real projects, were analyzed.
Kuhn, John. „A THEORY OF COMPLEX ADAPTIVE INQUIRING ORGANIZATIONS: APPLICATION TO CONTINUOUS ASSURANCE OF CORPORATE FINANCIAL INFORMATION“. Doctoral diss., University of Central Florida, 2009. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/2432.
Ph.D.
Department of Management Information Systems
Business Administration
Business Administration PhD
Holm, Cyril. „F. A. Hayek's Critique of Legislation“. Doctoral thesis, Uppsala universitet, Juridiska institutionen, 2014. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-236890.
Tröger, Ralph. „Supply Chain Event Management – Bedarf, Systemarchitektur und Nutzen aus Perspektive fokaler Unternehmen der Modeindustrie“. Doctoral thesis, Universitätsbibliothek Leipzig, 2014. http://nbn-resolving.de/urn:nbn:de:bsz:15-qucosa-155014.
Fujdiak, Radek. „Analýza a optimalizace datové komunikace pro telemetrické systémy v energetice“. Doctoral thesis, Vysoké učení technické v Brně. Fakulta elektrotechniky a komunikačních technologií, 2017. http://www.nusl.cz/ntk/nusl-358408.
„The information-based complexity of dynamic programming“. Laboratory for Information and Decision Systems, Massachusetts Institute of Technology, 1989. http://hdl.handle.net/1721.1/3123.
Cover title.
Includes bibliographical references.
Supported by the NSF under grant ECS-8552419, with matching funds from Bellcore and DuPont, and by the ARO under grant DAAL03-86-K-0171.
Petras, Iasonas. „Contributions to Information-Based Complexity and to Quantum Computing“. Thesis, 2013. https://doi.org/10.7916/D80V8M04.
Wu, Ming-Lu, and 吳明錄. „Low Complexity Antenna Array-Assisted Multiuser Detection Based on Information Theoretic Criteria“. Thesis, 2002. http://ndltd.ncl.edu.tw/handle/12168067347673016093.
National Taiwan University of Science and Technology
Department of Electronic Engineering
90
In this dissertation, a low complexity antenna array-assisted multiuser detection (MUD) scheme based on information theoretic criteria is proposed to detect the desired signal in TDMA systems. The contributions of this dissertation include the following. First, a low complexity antenna array-assisted minimum mean-squared error (MMSE) MUD is addressed, which utilizes only partial information about the co-channel interferences (CCIs) in the demodulation process. In light of the fact that the power of the CCIs is lower than that of the desired user, as the CCIs are out-of-cell in TDMA, the proposed approach truncates the channel length of the CCIs adaptively based on the power of the CCI channel taps. As the truncated channel taps account for only negligible information about the CCIs, performance does not substantially degrade. Moreover, analytic expressions for the bit error rate (BER) performance of both the full complexity and the proposed low complexity MMSE MUD are derived. Simulation results show that the simulations and the analytic results agree well in various scenarios, and that performance remains close even if we truncate 60% of the CCI channel length. Second, information theoretic criteria, including the Akaike information criterion (AIC) and Rissanen's minimum description length (MDL) criterion, are employed to form a theoretical foundation for determining the effective CCI channel length in the developed MUD. Two information theoretic approaches are considered. The first is a direct extension of previous work. Aiming to keep the information of the desired user intact during truncation, we propose modified information theoretic criteria, which first project the received signals onto the CCI subspace and the noise subspace before the information theoretic analysis is carried out. To assess the statistical behavior of the proposed criteria, the consistency property is also investigated.
Simulations show that the developed MUD, with the effective CCI channel length determined by the information theoretic criteria, yields performance indistinguishable from its full complexity counterpart. Finally, to enhance spectrum efficiency, we address a simple yet effective dynamic channel assignment (DCA) scheme, which employs angle constraints and distance constraints as the criteria for assigning frequency bands. By combining the developed low complexity MUD and DCA, which complement each other, we can truncate more redundant CCI information and thus achieve lower complexity, while still maintaining acceptable BER performance.
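The MDL criterion invoked above selects a model order by trading goodness of fit against a complexity penalty. A generic sketch in the classic Wax and Kailath form, applied to sample covariance eigenvalues (not the thesis's modified, projection-based criterion), is:

```python
import math

def mdl_order(eigenvalues, n_snapshots):
    """Pick the model order k minimizing the MDL criterion (Wax-Kailath form).

    eigenvalues: sample covariance eigenvalues, sorted in decreasing order.
    The fit term measures how far the trailing (noise) eigenvalues are from
    being equal; the penalty term charges for the number of free parameters.
    """
    p = len(eigenvalues)
    scores = []
    for k in range(p):
        tail = eigenvalues[k:]
        m = len(tail)
        geo = math.exp(sum(math.log(v) for v in tail) / m)  # geometric mean
        arith = sum(tail) / m                               # arithmetic mean
        fit = -n_snapshots * m * math.log(geo / arith)
        penalty = 0.5 * k * (2 * p - k) * math.log(n_snapshots)
        scores.append(fit + penalty)
    return min(range(p), key=scores.__getitem__)
```

With two dominant eigenvalues over a flat noise floor, e.g. [10, 8, 1, 1, 1, 1] and 100 snapshots, the criterion selects order 2; the same machinery, applied to CCI channel taps, yields an effective channel length.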
簡添福. „A Study and Implementation in Software Product Complexity Measurement Based on Information Theory“. Thesis, 2002. http://ndltd.ncl.edu.tw/handle/07165585496918960997.
Fo Guang College of Humanities and Social Sciences
Graduate Institute of Informatics
90
The rapid development of the software industry, the fast-changing industry environment, and growing consumer power have put software development to a severe test. Developers and purchasers must weigh strategy carefully with respect to software quality and cost. If development cost can be predicted successfully at the beginning of a project, the software company gains a competitive advantage. Software engineering has an important subject, software complexity metrics, which studies the relationship between software complexity and development cost. Most researchers agree that software complexity and development cost are positively related, and many metric methods have been proposed. Among these, the Simple Stack-Based Markov (SSBM) model is an excellent method. SSBM overcomes the main drawback of metrics that can measure only one attribute (e.g., size, control flow, data flow), and it accommodates newer software development techniques such as object orientation. Although SSBM has excellent capability, it has a weakness: it cannot express the nesting complexity of control flow, and it treats program control flow the same as expression control flow. Later researchers refined the model with respect to nesting complexity, but retained that assumption. In this thesis, we take a different view and propose a method that uses two kinds of parentheses to solve this problem. We follow a disciplined and rigorous process for developing software metrics, which makes our model more complete, and we conduct an experiment to verify the model's correctness.
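Information-theoretic complexity measures of this family ultimately reduce to a Shannon entropy over a distribution of program symbols. The following is a generic sketch of that idea, not the SSBM model itself (whose stack-based state definitions are specific to the thesis):

```python
import math
from collections import Counter

def entropy_complexity(tokens):
    """Shannon entropy (bits per symbol) of a token stream.

    A program whose operators and operands are drawn from a richer, more
    even distribution scores higher, i.e. is judged more complex; a stream
    of one repeated symbol scores zero.
    """
    counts = Counter(tokens)
    n = len(tokens)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

For instance, a stream alternating between two symbols scores exactly 1 bit per symbol, while a constant stream scores 0.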
Huang, Cheng-Chun, und 黃政鈞. „Lossless Information Hiding Schemes Based on Pixels Complexity Analysis and Histogram of Predicted Coding“. Thesis, 2008. http://ndltd.ncl.edu.tw/handle/83578901036736550668.
Chaoyang University of Technology
Master's Program, Department of Information Management
96
With the rapid spread of the Internet, people can easily share and obtain information, which also imperceptibly increases the probability that transmitted information is intercepted by an illegitimate third party. Information security is therefore important, especially for military or medical images, where no error can be tolerated. To achieve this goal, the lossless property of information hiding is an important subject. Motivated by the above, we first propose an effective lossless information hiding scheme in which a host image is quantized to generate spare space for hiding secret messages. The proposed scheme applies a complexity analysis of neighboring pixels to predict the number of secret message bits that can be concealed in a pixel. In other words, the scheme preserves the differences between the host image and the quantized image so that the host image can be completely restored. According to the experimental results, the information capacity of the proposed scheme is 0.9 BPP for the standard Lena image, whereas that of Maniccam and Bourbakis's scheme is only 0.3 BPP. In addition, many lossless information hiding techniques have been proposed, such as difference expansion, the integer transformation method, histogram modification, and so on. Among them, histogram modification modifies the maximum pixel value in a histogram to embed secret messages into a host image. The quality of the stego image generated by histogram modification is good, but its capacity is low. Therefore, in this thesis we also propose a lossless information hiding scheme that improves on this problem by using block-based internal and external prediction. According to the experimental results, the proposed scheme increases the information capacity while maintaining image quality.
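The histogram-modification idea mentioned above can be sketched as a reversible embed/extract pair in the classic peak/zero-bin shifting style (a simplification, not the thesis's block-based predictor). The sketch assumes an empty histogram bin exists above the peak bin:

```python
from collections import Counter

def hist_shift_embed(pixels, bits):
    """Embed bits losslessly by shifting the histogram between peak and zero bins.

    Pixels at the peak value carry one bit each (0 -> stay at peak,
    1 -> move to peak+1); pixels strictly between peak and the empty bin
    are shifted up by one to open the gap. Capacity = count of peak pixels.
    """
    counts = Counter(pixels)
    peak = max(counts, key=counts.__getitem__)
    zero = next(v for v in range(peak + 1, 256) if counts.get(v, 0) == 0)
    stego, it = [], iter(bits)
    for v in pixels:
        if peak < v < zero:
            stego.append(v + 1)            # open a gap at peak+1
        elif v == peak:
            stego.append(v + next(it, 0))  # embed one message bit
        else:
            stego.append(v)
    return stego, peak, zero

def hist_shift_extract(stego, peak, zero):
    """Recover the embedded bits and restore the original pixels exactly."""
    bits, restored = [], []
    for v in stego:
        if v == peak:
            bits.append(0); restored.append(peak)
        elif v == peak + 1:
            bits.append(1); restored.append(peak)
        elif peak + 1 < v <= zero:
            restored.append(v - 1)         # undo the histogram shift
        else:
            restored.append(v)
    return bits, restored
```

The receiver only needs the (peak, zero) pair and the message length; extraction returns one bit per former peak pixel, so the caller truncates to the message length.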
Erar, Bahar. „Mixture model cluster analysis under different covariance structures using information complexity“. 2011. http://trace.tennessee.edu/utk_gradthes/968.
Zhang, Rui. „Model selection techniques for kernel-based regression analysis using information complexity measure and genetic algorithms“. 2007. http://etd.utk.edu/2007/ZhangRui.pdf.
Broadbent, Anne Lise. „Quantum nonlocality, cryptography and complexity“. Thèse, 2008. http://hdl.handle.net/1866/6448.
Kao, Min-Chi, and 高明志. „QMF Banks Optimization Based on Derivative Information and Low-Complexity Design of Two-Channel Subband Filters Using Short Modular Half-Band Filters“. Thesis, 2000. http://ndltd.ncl.edu.tw/handle/40212442694390363151.
National Chiao Tung University
Department of Electronics Engineering
88
The dissertation is concerned with three key issues of filter bank design, namely, response optimization, low computational complexity, and low finite-precision-error realization of subband filters. In particular, this dissertation is divided into two parts: (I) Quadrature-Mirror-Filter (QMF) banks optimization based on derivative information, and (II) low-complexity design and realization of 1-D/2-D two-channel subband filters using short modular half-band/Nyquist(M) filters. The first part focuses on the optimization of QMF banks. New types of objective functions, utilizing derivative information of the reconstruction error in the z-domain, are proposed. New designs of QMF banks using these objective functions are studied, and efficient design algorithms for low-delay QMF banks and linear-phase QMF banks are developed. Simulations show that the new designs achieve better results than the conventional design based on the standard least-square-error objective function. The second part focuses on the low-complexity design and realization of subband filters with good numerical properties. We devise novel low-complexity composition schemes for the design and realization of 1-D half-band filters, 1-D two-channel biorthogonal filter banks, 2-D Nyquist(M) filters, and 2-D two-channel diamond/quadrant filter banks, all with narrow transition band and high frequency selectivity. Existing design methods result either in high-performance but high-complexity subband filters or in low-complexity but low-performance ones. The new schemes provide simple and efficient methods for synthesizing high-performance, low-complexity subband filters with good numerical properties for finite-precision realization. The synthesis process involves frequency response sharpening. For the low-complexity design and realization of 1-D half-band filters, the proposed scheme is based on an algebraic iterative composition method using adjustable short modular half-band filters.
The modular filters can be chosen by the user to be as simple as desired. In particular, the designed higher-order half-band filters can be made multiplierless if the modular filters are multiplierless. For the low-complexity design and realization of 1-D biorthogonal linear-phase filter banks, the proposed algebraic iterative composition scheme utilizes the solution of a filter bank built from two half-band filters. The resulting analysis filters are not only sharp but also low-complexity, being composed of several short modular half-band filters. The 1-D schemes are extended to the synthesis of 2-D Nyquist(M) filters and two-channel nonseparable diamond/quadrant filter banks with sharp responses, using short modular 2-D Nyquist(M) filters, preferably multiplier-free ones. Based on the proposed schemes, half-band/Nyquist(M) filters and 1-D/2-D filter banks can be synthesized in a tree-like multi-stage cascaded structure with considerably reduced arithmetic operations (which can be made multiplierless). Simulations validate the effectiveness of the proposed schemes.
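The defining property exploited throughout these composition schemes is that a half-band filter has every second coefficient equal to zero except the center tap, which halves the arithmetic cost. A truncated ideal half-band impulse response (a sketch only; in practice a window would be applied) makes this visible:

```python
import math

def half_band_taps(order=8):
    """Truncated ideal half-band impulse response, h[n] = sin(pi*n/2)/(pi*n).

    The center tap is 1/2, and for every even n != 0 the numerator
    sin(pi*n/2) vanishes, so half of the taps are (numerically) zero:
    the source of the 2x arithmetic savings when such filters are
    cascaded into sharper subband filters.
    """
    taps = []
    for n in range(-order, order + 1):
        if n == 0:
            taps.append(0.5)
        else:
            taps.append(math.sin(math.pi * n / 2) / (math.pi * n))
    return taps
```

The response is also symmetric (linear phase), which is why such modular filters compose cleanly in the tree-like cascades described above.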
„Managing Distributed Information: Implications for Energy Infrastructure Co-production“. Doctoral diss., 2018. http://hdl.handle.net/2286/R.I.49360.
Dissertation/Thesis
Doctoral Dissertation Sustainability 2018
Naik, Debendra Kumar. „Fuzzy Rule Based Approach for Quality Analysis of Web Service Composition Using Complexity Metrics“. Thesis, 2015. http://ethesis.nitrkl.ac.in/7779/1/2015_Fuzzy_Rule__Naik.pdf.
Howe, John Andrew. „A New Generation of Mixture-Model Cluster Analysis with Information Complexity and the Genetic EM Algorithm“. 2009. http://trace.tennessee.edu/utk_graddiss/863.
Baek, Seung Hyun. „Kernel-Based Data Mining Approach with Variable Selection for Nonlinear High-Dimensional Data“. 2010. http://trace.tennessee.edu/utk_graddiss/676.