Follow this link to see other types of publications on the topic: Data commons.

Theses on the topic "Data commons"

Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles

Choose a source type:

Consult the top 50 theses for your research on the topic "Data commons".

Next to each source in the list of references you will find an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Explore theses on a wide variety of disciplines and organize your bibliography correctly.

1

Sharad, Chakravarthy Namindi. "Public Commons for Geospatial Data: A Conceptual Model". Fogler Library, University of Maine, 2003. http://www.library.umaine.edu/theses/pdf/SharadCN2003.pdf.

Full text
2

McCurry, David B. "Provenance Tracking in a Commons of Geographic Data". Fogler Library, University of Maine, 2007. http://www.library.umaine.edu/theses/pdf/McCurryDB2007.pdf.

Full text
3

John, Shirley Diane. "The analysis of House of Commons' division list data". Thesis, University of Bath, 1989. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.235796.

Full text
4

Pinto, Evelyn Cristina. "Repensando os commons na comunicação científica". Universidade de São Paulo, 2006. http://www.teses.usp.br/teses/disponiveis/45/45134/tde-07052007-092617/.

Full text
Abstract
Recent studies by Benkler, Lessig, Boyle, Hess, and Ostrom revisit the concept of commons, this time in relation to information in general and to scientific information in particular. In this study, we use the term to highlight the cooperative character of scientific research, the importance of transparency and neutrality in access to the scientific commons, and the anti-rival nature of scientific information. The concept of commons is highly useful for focusing on the whole body of scientific articles already published, whether in print or in digital form; it also allows study through multidisciplinary lenses and emphasizes the dynamics of scientific communities as a whole. In any information commons, the greater the distribution of knowledge, the more dynamic and efficient the process of knowledge evolution. Printing technology has played a key role in the dissemination of information, and its emergence marked a revolution in the knowledge and culture of our civilization. Digital technology has proven even more efficient: its implementation in bits comes closer to the anti-rival nature of ideas than any other technology used today to preserve and distribute information. In our work, we found that the commons of science can be greatly enriched through cooperative practices and open access in academic publishing. We also found that digital technology in the scientific commons, especially in the publication of research results, greatly increases the distribution of scholarly knowledge, its opportunities for scrutiny and validation, and the maturation dynamics of scientific ideas, and can consequently make the development of science faster and more efficient. However, the digital medium has been used both to create an environment of free circulation of ideas and to control them. On one side, computational code has been implemented to grant access only to those who pay the high prices of scientific journals. On the other, open access online journals and other alternative forms of disseminating scientific content have proliferated. Moreover, shrinking library budgets, rising journal subscription prices, and growing restrictions imposed by intellectual property law have undermined the free nature of scientific ideas and put Scholarly Communication into a crisis. We are in the middle of a paradigm transition in the publishing of scientific research results, in which legal, technological, and socio-economic aspects are being renegotiated. We place our analysis in a larger context, the Scholarly Communication paradigm, which supports a broader study of the complex questions surrounding our subject, analyzing the technological, legal, and socio-economic aspects of a possible transition to the open access publishing model. So great are the opportunities of this new model that it has gathered around itself socio-academic initiatives known as the Open Access Movement, and many publishing models and experiments for this literature currently exist. We focus on the open access model for scientific results, its advantages, the difficulties of establishing it, and how it has developed. Finally, we analyze the feasibility of creating an ecosystem of open access digital libraries specialized in each branch of science. Our starting models are based on aspects of services such as arXiv, CiteSeer, and Google Scholar. Among the many conclusions of this study, we found that digital libraries of this kind greatly enhance the dynamics of circulation, generation, transformation, and renewal of scientific knowledge, making the production of resources in the scientific commons much more efficient.
5

Alliot, Sandie. "Essai de qualification de la notion de données à caractère personnel". Thesis, Bourgogne Franche-Comté, 2018. http://www.theses.fr/2018UBFCB001.

Full text
Abstract
Personal data has undergone major changes that justify a new study of this notion. This particular data is at the heart of the digital economy and thus stirs numerous claims. What is at stake is working out a balance between competing claims, such as operators' desire to appropriate the data and the demand for protection from the people the data concerns. This is why it is essential to find a precise definition and an adequate qualification of personal data in order to balance the various interests at play. The thesis will focus on the necessity of adopting a new vision of personal data, one that reflects its current characteristics, so as to regulate it effectively.
6

Kang, Heechan. "Essays on methodologies in contingent valuation and the sustainable management of common pool resources". Columbus, Ohio : Ohio State University, 2006. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1141240444.

Full text
7

Hodges, Glenn A. "Designing a common interchange format for unit data using the Command and Control information exchange data model (C2IEDM) and XSLT". Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2004. http://library.nps.navy.mil/uhtbin/hyperion/04Sep%5FHodges.pdf.

Full text
Abstract
Thesis (M.S. in Modeling Virtual Environments and Simulation (MOVES))--Naval Postgraduate School, Sept. 2004.
Thesis advisor(s): Curtis Blais, Don Brutzman. Includes bibliographical references (p. 95-98). Also available online.
8

Chen, Te-Ching. "Estimating common odds ratio with missing data". College Park, Md. : University of Maryland, 2005. http://hdl.handle.net/1903/2725.

Full text
Abstract
Thesis (Ph. D.) -- University of Maryland, College Park, 2005.
Thesis research directed by: Mathematical Statistics. Title from t.p. of PDF. Includes bibliographical references. Published by UMI Dissertation Services, Ann Arbor, Mich. Also available in paper.
9

Slocumb, Calvin D. "Common data sharing system infrastructure : an object-oriented approach /". Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1995. http://handle.dtic.mil/100.2/ADA304500.

Full text
Abstract
Thesis (M.S. in Systems Technology (Joint Command, Control, Communications, Computers, and Intelligence Systems))--Naval Postgraduate School, June 1995.
"June 1995." Thesis advisor(s): Orin E. Marvel. Includes bibliographical references. Also available online.
10

Doldirina, Catherine. "The common good and access to remote sensing data". Thesis, McGill University, 2011. http://digitool.Library.McGill.CA:80/R/?func=dbin-jump-full&object_id=104766.

Full text
Abstract
The thesis represents a search for the appropriate regime for protecting remote sensing data and information. Based on the technical and societal characteristics of this type of data, it argues in favour of the necessity to secure access to it. Using the regulatory examples of the USA and Europe, the research compares the effectiveness of such relevant legal regimes as intellectual property protection, in particular copyright, on the one hand, and the regulation of public sector information on the other. On the basis of this analysis the argument is made that the unnecessary commodification of remote sensing data through a private property-like protection regime would adversely influence their use and diminish their value. The principle of sharing, based on the theories of common property and the common good, is proposed as the best and most appropriate solution to avoid such a scenario. Its viability and effectiveness lie in the emphasis on the balance between the private and the public in the achievement of the common good of a better life, which today manifests itself inter alia in being information rich. The principle of sharing has survived centuries of philosophical thought and is relevant today, particularly with regard to establishing the protection and distribution regime for remote sensing data, as the highlighted examples of geographic information infrastructures and the Global Earth Observation System of Systems show. The metaphor of information as a waterway rounds off the discussion of the relevance of the principle of sharing and emphasises the indispensability of an access-to-data oriented approach to regulating relationships over the generation, distribution and use of remote sensing data and information.
11

Abnaof, Khalid [Verfasser]. "Finding Common Patterns In Heterogeneous Perturbation Data / Khalid Abnaof". Bonn : Universitäts- und Landesbibliothek Bonn, 2016. http://d-nb.info/1103024337/34.

Full text
12

Shi, Wei. "Essays on Spatial Panel Data Models with Common Factors". The Ohio State University, 2016. http://rave.ohiolink.edu/etdc/view?acc_num=osu1461300292.

Full text
13

Kovács, Zsolt. "The integration of product data with workflow management systems through a common data model". Thesis, University of Bristol, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.312062.

Full text
14

Jones, Sidney R. Jr. "The Common Airborne Instrumentation System Program Overview". International Foundation for Telemetering, 1993. http://hdl.handle.net/10150/608869.

Full text
Abstract
International Telemetering Conference Proceedings / October 25-28, 1993 / Riviera Hotel and Convention Center, Las Vegas, Nevada
The Common Airborne Instrumentation System (CAIS) is being developed by the Department of Defense through a Tri-Service Program Office. The goals of the program are twofold. The first is to develop an instrumentation system that will meet the needs of the Air Force, Army, and Navy into the next century; the system is designed to support a full breadth of applications, from a few parameters to engineering and management development programs. The second is to provide a system that is airframe- as well as activity-independent. To accomplish these goals, the CAIS consists of two segments. The airborne segment consists of a system controller with a suite of data acquisition units; the system is configured with only the units that are required. The ground segment consists of a variety of support equipment that enables the user to generate formats, load/verify airborne units, perform system-level diagnostics, and more.
15

Wegener, John A. and Rodney L. Davis. "EXTENSION OF A COMMON DATA FORMAT FOR REAL-TIME APPLICATIONS". International Foundation for Telemetering, 2004. http://hdl.handle.net/10150/604961.

Full text
Abstract
International Telemetering Conference Proceedings / October 18-21, 2004 / Town & Country Resort, San Diego, California
The HDF5 (Hierarchical Data Format) family is a widely accepted industry-standard container for data storage, developed by the National Center for Supercomputing Applications (NCSA) at the University of Illinois at Urbana-Champaign, that allows data to be stored in a common format and retrieved by a wide range of common tools. The HDF5 data storage family includes HDF-Time History, intended for data processing, and HDF-Packet, intended for real-time data collection; each is an extension to the basic HDF5 format that defines data structures and associated interrelationships optimized for its particular purpose. HDF-Time History, developed jointly by Boeing and NCSA, is in the process of being adopted throughout the Boeing test community and by its external partners. The Boeing/NCSA team is currently developing HDF-Packet to support real-time streaming applications, such as airborne data collection and recording of received telemetry. The advantages include significant cost reduction from storing the data in its final format, avoiding conversion between a myriad of recording and intermediate formats; by eliminating intermediate file translations and conversions, data integrity is maintained from recording through processing and archival storage. HDF5 is also a general-purpose wrapper into which processed data and other documentation (such as calibrations) can be stored, making the final data file self-documenting. This paper describes the basics of HDF-Time History, the extensions required to support real-time acquisition with HDF-Packet, and implementation issues unique to real-time acquisition. It also describes potential future implementations for data acquisition systems in different segments of the test data industry.
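The abstract's two core ideas, streaming-style appends and self-describing metadata, are easy to see in ordinary HDF5 code. The sketch below uses the h5py Python library; it is not the Boeing/NCSA HDF-Packet specification (whose structures the abstract does not detail), and the file name, channel name, and attribute names are invented for illustration.

```python
# Minimal sketch: a growable time-history channel in HDF5 with calibration
# metadata stored alongside the samples, so the file is self-documenting.
import h5py
import numpy as np

with h5py.File("flight_test.h5", "w") as f:
    # Unlimited first dimension so new samples can be appended as they
    # arrive, loosely mimicking a real-time packet-style recorder.
    dset = f.create_dataset(
        "channels/left_wing_strain",
        shape=(0,), maxshape=(None,), dtype="f4", chunks=(4096,),
    )
    dset.attrs["units"] = "microstrain"
    dset.attrs["sample_rate_hz"] = 1000.0
    dset.attrs["calibration_slope"] = 2.5e-3   # hypothetical calibration
    dset.attrs["calibration_offset"] = -0.12

    # Append a block of "acquired" samples by resizing the dataset.
    block = np.random.randn(4096).astype("f4")
    dset.resize(dset.shape[0] + block.size, axis=0)
    dset[-block.size:] = block
```

Because the calibration lives in the same file as the samples, a downstream tool needs no side files to interpret the data, which is the self-documenting property the paper emphasizes.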
16

Brown, Thomas R. Jr. "The Common Airborne Instrumentation System Program Management Overview". International Foundation for Telemetering, 1994. http://hdl.handle.net/10150/611744.

Full text
Abstract
International Telemetering Conference Proceedings / October 17-20, 1994 / Town & Country Hotel and Conference Center, San Diego, California
The Department of Defense, through a Tri-Service Program Office, is developing the Common Airborne Instrumentation System (CAIS) to promote standardization, commonality, and interoperability among aircraft test instrumentation systems. The advent of CAIS will change how the DoD test community conducts business. The CAIS program will allow aircraft test and evaluation facilities to utilize common airborne systems, ground support equipment, and technical knowledge for airborne instrumentation systems. The CAIS Program Office will conduct requirements analyses, manage system upgrades, and provide full life cycle support for this system. It is initiating several requirements contracts to provide direct ordering opportunities for DoD users to easily procure defined test instrumentation hardware. The program office will provide configuration management, inventory control, maintenance support, system integration, engineering support, and software management. In addition, it will continue to enhance the current system and develop new items to meet future requirements. Where existing equipment provides added benefit, this equipment may be added to the official CAIS family.
17

Walker, Thaddeus Owens. "Real-time compressed video transmission across the common data link". Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1995. http://handle.dtic.mil/100.2/ADA304462.

Full text
Abstract
Thesis (M.S. in Electrical Engineering and Electrical Engineer)--Naval Postgraduate School, June 1995.
Thesis advisor(s): Murali Tummala, S.B. Shukla. "June 1995." Includes bibliographical references. Also available online.
18

Lindberg, Sandra. "Common cause failure analysis : Methodology evaluation using Nordic experience data". Thesis, Uppsala universitet, Institutionen för informationsteknologi, 2007. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-131841.

Full text
Abstract
Within the nuclear industry there is an extensive need for evaluating the safety of plants. In such evaluations one phenomenon requires particular treatment, namely common cause failure (CCF): components failing dependently, meaning failures that can overcome the applied redundancy or diversity. The impact of CCF is relatively large, but the process of CCF analysis is complicated by the complex nature of CCF events and the very sparse availability of CCF data. Today, a number of methods for CCF analysis are available with different characteristics, especially concerning their qualitative and quantitative features. The most common working procedure for CCF treatment is to divide the analysis into a qualitative and a quantitative part, but the development of tools for the qualitative part has to a certain extent lagged behind. This subject is further explored in a comparative study focused on two quite different approaches to CCF analysis, the impact vector method and the unified partial method. Based on insights from this study, an integrated impact vector and 'Relations of Defences, Root causes and Coupling factors' (RDRC) methodology is suggested for further exploration, as progress towards a methodology incorporating both qualitative and quantitative aspects.
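To make the quantitative side concrete, here is a minimal sketch of how assessed impact vectors can feed a parametric CCF model, using the alpha-factor model as the example. This is a generic textbook-style calculation, not the thesis's specific methodology, and all event numbers are invented.

```python
# Each event's impact vector gives the probability that the event involved
# k = 0..n component failures; averaging the vectors yields expected event
# counts per multiplicity, from which alpha factors are estimated.
import numpy as np

# Impact vectors for three assessed events in a group of n = 4 components.
impact_vectors = np.array([
    # P(k=0) P(k=1) P(k=2) P(k=3) P(k=4)
    [0.0,   0.8,   0.2,   0.0,   0.0],
    [0.0,   0.1,   0.6,   0.3,   0.0],
    [0.0,   0.0,   0.0,   0.2,   0.8],
])

# Expected number of events involving exactly k failed components.
n_k = impact_vectors.sum(axis=0)

# Alpha-factor estimates: fraction of failure events involving k components
# (k >= 1); the k = 0 entry is excluded.
alpha = n_k[1:] / n_k[1:].sum()
print({f"alpha_{k+1}": round(a, 3) for k, a in enumerate(alpha)})
```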
19

Dervos, Dimitris A. and Anita Sundaram Coleman. "A Common Sense Approach to Defining Data, Information, and Metadata". Ergon-Verlag, 2006. http://hdl.handle.net/10150/105530.

Full text
Abstract
This is a presentation (~25 slides) made at ISKO 2006 in Vienna based on the paper (same title) published in the Proceedings of the Ninth International ISKO 2006 Conference, Vienna, Edited by Swertz, C. Berlin: Ergon, 2006.
20

Brown, Thomas R. Jr. "THE COMMON AIRBORNE INSTRUMENTATION SYSTEM TEST PROGRAM". International Foundation for Telemetering, 1995. http://hdl.handle.net/10150/608403.

Full text
Abstract
International Telemetering Conference Proceedings / October 30-November 02, 1995 / Riviera Hotel, Las Vegas, Nevada
The Department of Defense (DoD), through a Tri-Service Program Office, is developing the Common Airborne Instrumentation System (CAIS) to promote standardization, commonality, and interoperability among aircraft test instrumentation systems. The advent of CAIS will change how the DoD test community conducts business. The CAIS program will allow aircraft test and evaluation facilities to utilize common airborne systems, ground support equipment, and technical knowledge for airborne instrumentation systems. During the development of the CAIS, the Program Office will conduct a broad spectrum of tests: engineering design, acceptance, environmental qualification, system demonstration, and flight qualification. Each of these tests addresses specific aspects of the overall functional requirements and specifications. The use of test matrices enables the program office to ensure each specific test covers the optimum requirements, and that the combination of all testing efforts addresses the total system functional requirements.
21

Barrera, Raymond C. "Command and Control Data dissemination using IP multicast". Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1999. http://handle.dtic.mil/100.2/ADA376707.

Full text
Abstract
Thesis (M.S. in Software Engineering)--Naval Postgraduate School, December 1999.
"December 1999". Thesis advisor(s): Bert Lundy. Includes bibliographical references (p. 75-76). Also available online.
22

Gordon, Michael. "SGLS COMMAND DATA ENCODING USING DIRECT DIGITAL SYNTHESIS". International Foundation for Telemetering, 1992. http://hdl.handle.net/10150/608937.

Full text
Abstract
International Telemetering Conference Proceedings / October 26-29, 1992 / Town and Country Hotel and Convention Center, San Diego, California
The Space Ground Link Subsystem (SGLS) provides full duplex communications for commanding, tracking, telemetry and ranging between spacecraft and ground stations. The up-link command signal is an S-band carrier phase modulated with the frequency shift keyed (FSK) command data. The command data format is a ternary (S, 1, 0) signal. Command data rates of 1, 2, and 10 kbps are used. The method presented uses direct digital synthesis (DDS) to generate the SGLS command data and clock signals. The ternary command data and clock signals are input to the encoder, and an FSK subcarrier with an amplitude modulated clock is digitally generated. The command data rate determines the frequencies of the S, 1, 0 tones. DDS ensures that phase continuity will be maintained, and frequency stability will be determined by the microprocessor crystal accuracy. Frequency resolution can be maintained to within a few Hz from DC to over 2 MHz. This allows for the generation of the 1 and 2 kbps command data formats as well as the newer 10 kbps format. Additional formats could be accommodated through software modifications. The use of digital technology provides for encoder self-testing and more comprehensive error reporting.
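The phase-continuity property the abstract attributes to DDS can be illustrated with a short simulation: a phase accumulator advances by a per-sample increment set by the current symbol's tone, so switching tones never jumps the phase. The sample rate, symbol pattern, and tone frequencies below are invented, not the actual SGLS tone assignments.

```python
import numpy as np

# Hypothetical parameters -- not the real SGLS values.
fs = 1_000_000                                   # DAC sample rate, Hz
baud = 1_000                                     # 1 kbps command rate
tones = {"S": 65_000, "1": 95_000, "0": 76_000}  # Hz per ternary symbol

symbols = "S10S1100S"                            # example ternary stream
samples_per_symbol = fs // baud

# DDS-style phase accumulator: the increment changes per symbol, but the
# accumulated phase carries over, so the waveform stays phase-continuous.
phase = 0.0
out = []
for sym in symbols:
    inc = 2 * np.pi * tones[sym] / fs            # phase step per sample
    for _ in range(samples_per_symbol):
        out.append(np.sin(phase))
        phase = (phase + inc) % (2 * np.pi)      # wraps without discontinuity

waveform = np.asarray(out)                       # ready for a modulator model
```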
23

Schwier, Jason Montgomery. "Pattern recognition for command and control data systems". Connect to this title online, 2009. http://etd.lib.clemson.edu/documents/1252424695/.

Full text
24

Skogetun, Erik. "Reduction of Common Operations in a Neural Network". Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2020. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-281880.

Full text
Abstract
Machine learning models are becoming increasingly complex, and particularly artificial neural networks. Meanwhile, solutions are moving closer towards the edge with implementations on devices such as smartphones, TVs and cameras. This creates a demand for efficient models that perform well despite restricted computational resources. One particularly successful convolutional neural network that was first introduced in 2014 is the Inception network, which in its first edition was named GoogLeNet. The Inception network’s structure was highly successful for complex image classification and the model has since then been developed iteratively, resulting in several new versions. The latest version, Inception-V4, was presented in 2016 and the model structure is still highly utilized, and many modern neural networks draw inspiration from it. The aim of this thesis is to develop and evaluate a more lightweight and potentially more efficient version of the Inception network, and discuss its opportunities and challenges. The proposed lighter version, named LightInception, is based on the original Inception-V4 model, where a process called reduction of common operations was implemented to lower the network’s complexity. Reduction of common operations is a method developed in this study for simplifying networks with parallel structures, such as the Inception network. The process draws inspiration from multiple modern network structures and best practices. In practice, its implementation on Inception-V4 lowered the redundancy in the network and reduced the total number of parameters by 33%. LightInception was evaluated on six diverse data sets with regard to inference time, accuracy, loss, convergence, training volatility, and weight utilization. The model showed promising results with higher or equal accuracy for at least half of the evaluated data sets. This indicates that reduction of common operations may be an efficient means to reduce model complexity without losing representative power, and the process is suggested for further investigation.
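The abstract does not spell out the exact transformation, but the general idea of reducing common operations in a parallel block can be sketched: if every branch of an Inception-style module begins with the same operation, it can be hoisted out and computed once. The toy PyTorch module below only illustrates that principle; it is not the LightInception architecture, and all layer sizes are invented.

```python
# Toy "shared stem" block: one 1x1 convolution is computed once and reused
# by both branches, instead of each branch owning its own copy.
import torch
import torch.nn as nn

class SharedStemBlock(nn.Module):
    def __init__(self, in_ch, mid_ch):
        super().__init__()
        self.stem = nn.Conv2d(in_ch, mid_ch, kernel_size=1)  # shared operation
        self.branch3 = nn.Conv2d(mid_ch, mid_ch, kernel_size=3, padding=1)
        self.branch5 = nn.Conv2d(mid_ch, mid_ch, kernel_size=5, padding=2)

    def forward(self, x):
        s = torch.relu(self.stem(x))      # computed once, used by both branches
        return torch.cat([torch.relu(self.branch3(s)),
                          torch.relu(self.branch5(s))], dim=1)

y = SharedStemBlock(64, 32)(torch.randn(1, 64, 56, 56))  # -> (1, 64, 56, 56)
```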
25

Barrows, Sam George. "Political Responses to Educational Performance Data". Thesis, Harvard University, 2014. http://nrs.harvard.edu/urn-3:HUL.InstRepos:13065019.

Full text
Abstract
Researchers have found considerable evidence that information about school performance affects people's choices about which schools to send their children to and even where to live. In contrast, little attention has been paid to the effects of school performance information on people's political behavior. Yet Hirschman (1970) famously highlighted the importance of taking seriously not only economic forces, but also the role of "political mechanisms", that is, "non-market forces" or "voice", in analyzing people's responses to school performance and the implications of these responses for school outcomes. This dissertation explores the effect of information about student and school performance on people's political attitudes and behavior. I first present findings from an original dataset of school board elections in Florida that indicate that voters fail to punish school board incumbents in response to information signaling poor school performance. There is even evidence that voters sometimes reward incumbents for failure. I next analyze a dataset that links student test scores in England to a subsequent survey, and find that informational signals about individual student performance can have long-lasting effects on parental behavior. Finally, I analyze the results of a survey experiment administered to a nationally representative sample of Americans, and find that information about the relative performance of local schools depresses average perceptions of local school quality and increases support for school reforms.
Government
26

Echegaray, Daniel. "Making a common graphical language for the validation of linked data". Thesis, KTH, Skolan för datavetenskap och kommunikation (CSC), 2017. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-211100.

Full text
Abstract
A variety of embedded systems is used in the design and construction of trucks at Scania. Because of their heterogeneity and complexity, such systems require many software tools to support embedded systems development. These tools need to form a well-integrated and effective development environment in order to ensure that product data is consistent and correct across the developing organisation. A prototype is under development that adapts a linked data approach to data integration; more specifically, it adopts the Open Services for Lifecycle Collaboration (OSLC) specification. The prototype allows users to design OSLC interfaces between product management tools and OSLC links between their data. The user can further apply constraints on the data conforming to the OSLC validation language Resource Shapes (ReSh). The problem lies in the prototype conforming only to the language of Resource Shapes, whose constraints are often too coarse-grained for Scania's needs, and in the absence of a standardised language for the validation of linked data. To frame this study, two research questions were formulated: (1) How can a common graphical language be created to support all validation technologies for RDF data? and (2) How can this graphical language support the automatic generation of RDF graphs? A case study is conducted on a software tool named SESAMM-tool at Scania, comprising a constraint language comparison and a prototype extension. Furthermore, a design science research strategy is followed, in which an effective artefact was sought to answer the stated research questions. Design science promotes an iterative process including implementation and evaluation. Data has been collected empirically in an iterative development process and evaluated using the methods of informed argument and controlled experiment, respectively, for the constraint language comparison and the extension of the prototype. Two constraint languages were investigated: Shapes Constraint Language (SHACL) and Shape Expressions (ShEx). The comparison concluded that SHACL is the constraint language with the larger domain of constraints, offering finer-grained constraints and the possibility of defining new ones. This was based on SHACL constraints being measured to cover 89.5% of ShEx constraints, against 67.8% for the converse; SHACL and ShEx coverage of ReSh property constraints was measured at 75% and 50%, respectively. SHACL was recommended and chosen for extending the prototype. In extending the prototype, abstract superclasses were introduced into the underlying data model, with constraint language classes, including SHACL, stated as subclasses. This design offered increased code reuse within the prototype but gave rise to issues relating to the plug-in technologies the prototype is based upon. The current solution still has the issue that properties of one constraint language may be added to classes of another constraint language.
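For readers unfamiliar with SHACL, the kind of fine-grained constraint the comparison favours looks like this in practice. The sketch below validates a tiny RDF graph with the pySHACL library; the ex: vocabulary is invented and has no relation to Scania's actual data model.

```python
# A SHACL shape requiring every ex:Product to carry exactly one ex:partNumber,
# checked against a deliberately non-conforming data graph.
from rdflib import Graph
from pyshacl import validate

data = Graph().parse(data="""
@prefix ex: <http://example.org/> .
ex:widget a ex:Product .    # missing ex:partNumber -> should fail validation
""", format="turtle")

shapes = Graph().parse(data="""
@prefix sh: <http://www.w3.org/ns/shacl#> .
@prefix ex: <http://example.org/> .
ex:ProductShape a sh:NodeShape ;
    sh:targetClass ex:Product ;
    sh:property [ sh:path ex:partNumber ; sh:minCount 1 ; sh:maxCount 1 ] .
""", format="turtle")

conforms, _, report = validate(data, shacl_graph=shapes)
print(conforms)   # False
print(report)     # human-readable validation report
```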
27

Onyeka, Reginald Chukwuma. "Techniques for the common-channel transmission of data and speech signals". Thesis, Imperial College London, 1989. http://hdl.handle.net/10044/1/47599.

Full text
28

Jazayeri, Sajad. "Full-waveform Inversion of Common-Offset Ground Penetrating Radar (GPR) data". Scholar Commons, 2019. https://scholarcommons.usf.edu/etd/7815.

Full text
Abstract
Maintenance of aging buried infrastructure and reinforced concrete are critical issues in the United States. Inexpensive non-destructive techniques for mapping and imaging infrastructure and defects are an integral component of maintenance. Ground penetrating radar (GPR) is a widely-used non-destructive tool for locating buried infrastructure and for imaging rebar and other features of interest to civil engineers. Conventional acquisition and interpretation of GPR profiles are based on the arrival times of strong reflected/diffracted returns and qualitative interpretation of return amplitudes. Features are thereby generally well located, but their material properties are only qualitatively assessed. For example, in typical imaging of buried pipes, the average radar wave velocity through the overlying soil is estimated, but the properties of the pipe itself are not quantitatively resolved. For pipes on the order of the radar wavelength (<5-35 cm), pipe dimensions and infilling material remain ambiguous. Full waveform inversion (FWI) methods exploit the entire radar return rather than only arrival times and peak amplitudes, and can generate better quantitative estimates of subsurface properties. In recent decades FWI methods, developed for seismic oil exploration, have been adapted and advanced for GPR with encouraging results. To date, however, FWI methods have not been specifically tuned and applied to surface-collected common-offset GPR data, which are the most common type of GPR data for engineering applications. I present an effective FWI method specifically tailored for common-offset GPR data. The method is composed of three main components: the forward modeling, wavelet estimation, and inversion tools. For the forward modeling and iterative data inversion I use two open-source software packages, gprMax and PEST. The source wavelet, the most challenging component and the one on which the success of the method depends, is estimated with a novel Sparse Blind Deconvolution (SBD) algorithm that I have developed. The dissertation indicates that with FWI, GPR can yield better quantitative estimates of both the diameters of small pipes and rebar and their electromagnetic properties (permittivity, conductivity), as well as better estimates of the electrical properties of the surrounding media (i.e., soil or concrete).
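The contrast with travel-time interpretation can be shown with a toy example: FWI-style fitting minimizes the sample-by-sample misfit between observed and simulated traces rather than matching picked arrivals. The sketch below fits a delayed Ricker wavelet by least squares; it merely stands in for the real workflow, in which gprMax provides the forward FDTD simulation and PEST drives the inversion, and all numbers are invented.

```python
# Toy waveform inversion: recover [t0, amplitude] of a Ricker wavelet by
# minimizing the full-trace residual against noisy "observed" data.
import numpy as np
from scipy.optimize import least_squares

t = np.linspace(0.0, 40.0, 800)              # time axis in nanoseconds

def ricker(t0, amp, f0=0.5):
    """Ricker wavelet centred at t0 (ns) with peak frequency f0 (GHz)."""
    a = (np.pi * f0 * (t - t0)) ** 2
    return amp * (1.0 - 2.0 * a) * np.exp(-a)

rng = np.random.default_rng(1)
observed = ricker(t0=12.0, amp=1.3) + 0.02 * rng.standard_normal(t.size)

def residual(m):                             # m = [t0, amp]
    return ricker(m[0], m[1]) - observed

fit = least_squares(residual, x0=[10.0, 1.0])
print(fit.x)                                 # approximately [12.0, 1.3]
```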
29

Beaghen, Michael Jr. "Canonical Variate Analysis and Related Methods with Longitudinal Data". Diss., Virginia Tech, 1997. http://hdl.handle.net/10919/29840.

Full text
Abstract
Canonical variate analysis (CVA) is a widely used method for analyzing group structure in multivariate data. It is mathematically equivalent to a one-way multivariate analysis of variance and often goes by the name of canonical discriminant analysis. Change over time is a central feature of many phenomena of interest to researchers. This dissertation extends CVA to longitudinal data. It develops models whose purpose is to determine what is changing and what is not changing in the group structure. Three approaches are taken: a maximum likelihood approach, a least squares approach, and a covariance structure analysis approach. All methods have in common that they hypothesize canonical variates which are stable over time. The maximum likelihood approach models the positions of the group means in the subspace of the canonical variates. It also requires modeling the structure of the within-groups covariance matrix, which is assumed to be constant or proportional over time. In addition to hypothesizing stable variates over time, one can also hypothesize canonical variates that change over time. Hypothesis tests and confidence intervals are developed. The least squares methods are exploratory. They are based on three-mode PCA methods such as the Tucker2 and parallel factor analysis. Graphical methods are developed to display the relationships between the variables over time. Stable variates over time imply a particular structure for the between-groups covariance matrix. This structure is modeled using covariance structure analysis, which is available in the SAS package Proc Calis. Methods related to CVA are also discussed. First, the least squares methods are extended to canonical correlation analysis, redundancy analysis, Procrustes rotation and correspondence analysis with longitudinal data. These least squares methods lend themselves equally well to data from multiple datasets. Lastly, a least squares method for the common principal components model is developed.
Ph. D.
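As background for the longitudinal extensions, classical cross-sectional CVA reduces to a generalized eigenproblem: find directions a maximizing the ratio of between-group to within-group variance, a'Ba / a'Wa. The sketch below is a minimal version of that standard method, with invented data, not the dissertation's longitudinal models.

```python
# Classical CVA via the generalized symmetric eigenproblem B a = lambda W a,
# where B and W are between- and within-group scatter matrices.
import numpy as np
from scipy.linalg import eigh

rng = np.random.default_rng(0)
groups = [rng.normal(loc=m, size=(30, 3))
          for m in ([0, 0, 0], [1, 0, 1], [0, 2, 0])]

grand_mean = np.vstack(groups).mean(axis=0)
W = sum((g - g.mean(axis=0)).T @ (g - g.mean(axis=0)) for g in groups)
B = sum(len(g) * np.outer(g.mean(axis=0) - grand_mean,
                          g.mean(axis=0) - grand_mean) for g in groups)

# eigh solves the generalized problem; eigenvalues come back ascending.
evals, evecs = eigh(B, W)
canonical_variates = evecs[:, ::-1]      # columns = canonical directions
print(evals[::-1])                       # discriminatory power of each variate
```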
30

Lindmark, Fanny. "Master Data Management : Creating a Common Language for Master Data Across an Extended and Complex Supply Chain". Thesis, Linköpings universitet, Programvara och system, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:liu:diva-149500.

Full text
Abstract
Connectivity provided by technology and the liberalization of trade have led to a globalization of organizations, causing supply chains to expand in complexity. As a result, many organizations today face the challenge of managing information in a consistent manner throughout a complex system environment. This study aims to identify the most valuable attributes of a solution for managing master data, in an efficient and consistent manner, across an extended and complex supply chain. Master data, such as products, customers and suppliers, can be defined as valuable core business information, since it is vital for supporting business operations. A requirements elicitation was performed, including interviews conducted internally with employees at IFS and externally with customers. Furthermore, a requirements analysis resulted in a specification of the most desirable attributes of a future Master Data Management (MDM) solution. Five main themes of attributes were identified: architecture, availability and integration, governance, user interface, and lifecycle management. The study contributes to the area of research by identifying challenges and valuable attributes to consider when developing or investing in a solution for MDM.
31

Barber, Mark H. and Paul R. Richey. "Naval Supply Systems Command: Data Administration planning and implementation". Thesis, Monterey, California. Naval Postgraduate School, 1989. http://hdl.handle.net/10945/27154.

Full text
Abstract
The Paperwork Reduction Act of 1980 mandated that federal government activities establish and enforce Information Resource Management policies. It also recommended the establishment of a Data Administration Branch within federal activities to provide an organizational entity devoted to effective information management. This thesis presents guidelines for the successful implementation of Data Administration, describes a standard for an Information Resources Dictionary System (the Data Administrator's primary tool), and makes recommendations for planning an Information Resources Dictionary System implementation.
32

Chen, Mingdeng. "Low-voltage, low-power circuits for data communication systems". Diss., Texas A&M University, 2003. http://hdl.handle.net/1969.1/1585.

Full text
Abstract
There are growing industrial demands for low-voltage, low-power circuits and systems. This is especially true for highly integrated, very large scale integrated (VLSI) mixed-signal chips and systems-on-a-chip, mainly due to the limited power dissipation within a small area and the costs related to packaging and thermal management. In this research work, two low-voltage, low-power integrated circuits used for data communication systems are introduced. The first is a high performance continuous-time linear phase filter with automatic frequency tuning. The filter can be used in hard disk drive systems and wired communication systems such as 1000Base-T transceivers. A pseudo-differential operational transconductance amplifier (OTA) based on transistors operating in the triode region is used to achieve a large linear signal swing with low-voltage supplies. A common-mode (CM) control circuit that combines common-mode feedback (CMFB), common-mode feedforward (CMFF), and adaptive bias has been proposed. With a 2.3V single supply, the filter's total harmonic distortion is less than -44 dB for a 2VPP differential input, owing to the well-controlled CM behavior. The ratio of the root mean square value of the ac signal to the power supply voltage is around 31%, which is much better than previous realizations. The second integrated circuit includes two LVDS drivers used for high-speed point-to-point links. By removing the stacked switches used in conventional structures, both LVDS drivers can operate with ultra low-voltage supplies. Although the Double Current Sources (DCS) LVDS driver draws twice the minimum static current required by the signal swing, it is quite simple and achieves very high speed operation. The Switchable Current Sources (SCS) LVDS driver, by dynamically switching the current sources, draws minimum static current and reduces power consumption by 60% compared to previously reported LVDS drivers. Both LVDS drivers are compliant with the standards and operate at data rates up to gigabits per second.
33

Davis, Rodney, Greg Hupf and Chad Woolf. "ADVANCED DATA DESCRIPTION EXCHANGE SERVICES FOR HETEROGENEOUS SYSTEMS". International Foundation for Telemetering, 2004. http://hdl.handle.net/10150/605341.

Full text
Abstract
International Telemetering Conference Proceedings / October 18-21, 2004 / Town & Country Resort, San Diego, California
CCT is conducting research to provide a cross-platform software capability that enables common semantics for control and monitoring of highly distributed system-of-systems C2 architectures by auto-generating semantic processing services from standardized metadata specifications. This new capability is significant because it will reduce development, operations, and support costs for legacy and future systems that are part of ground- and space-based distributed command and control systems. It will also establish a space systems information exchange model that can support future highly interoperable and mobile software systems.
34

Zobair, Hamza A. "A method for finding common attributes in heterogeneous DoD databases". Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 2004. http://library.nps.navy.mil/uhtbin/hyperion/04Jun%5FZobair.pdf.

Full text
35

Carter, Liam and Sofia Söderström. "Comparing Common Criteria's Vulnerability Analysis with SAFECode's Secure Coding Practices". Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-259201.

Full text
Abstract
Common Criteria is today used by multiple countries and authorities to evaluate and certify secure IT products. This is a lengthy process that can take upwards of eighteen months. This thesis addresses the problem by examining whether the vulnerability analysis part of the Common Criteria at evaluation assurance level two can be replaced by ensuring that the product was developed according to the secure coding practices presented by SAFECode. To reach our conclusion we applied both the vulnerability analysis of Common Criteria and the secure coding practices of SAFECode to a product to see what vulnerabilities we could find. After performing both evaluations and analysing the results, we could see whether Common Criteria and SAFECode had any connections or overlaps. We found that the vulnerabilities Common Criteria identified would not have been present if SAFECode's secure coding practices had been used during development, meaning SAFECode could in some way be used with Common Criteria. We did not find evidence proving that the vulnerability analysis cannot be replaced; we therefore suggest that the possibility to replace or supplement it exists for evaluation assurance level two. More research is needed on this question to provide any guarantee for real-world application.
36

Brauer, David A. "EMBEDDED VIDEO TRANSMISSION IN A CAIS DATA ACQUISITION SYSTEM". International Foundation for Telemetering, 2000. http://hdl.handle.net/10150/608287.

Full text
Abstract
International Telemetering Conference Proceedings / October 23-26, 2000 / Town & Country Hotel and Conference Center, San Diego, California
Acquiring real-time video data during flight testing has become an integral component of aircraft design and performance evaluation. This unique data acquisition capability has been successfully integrated into the JSF (Joint Strike Fighter) CAIS-compliant FTIDAS (Flight Test Instrumentation Data Acquisition System) developed by L-3 Communications Telemetry-East.
APA, Harvard, Vancouver, ISO, etc. styles
37

Song, Huadong. "People's commune and China's fertility: evidence from county-level data". View abstract or full-text, 2009. http://library.ust.hk/cgi/db/thesis.pl?SOSC%202009%20SONG.

Full text
APA, Harvard, Vancouver, ISO, etc. styles
38

Homan, Rodney M. "THE COMMON AIRBORNE INSTRUMENTATION SYSTEM (CAIS) TOOLSET SOFTWARE (CTS)". International Foundation for Telemetering, 1996. http://hdl.handle.net/10150/608359.

Full text
Abstract
International Telemetering Conference Proceedings / October 28-31, 1996 / Town and Country Hotel and Convention Center, San Diego, California
The Department of Defense (DoD), through a Tri-Service Program Office, is developing the Common Airborne Instrumentation System (CAIS) to promote standardization, commonality, and interoperability among aircraft test instrumentation systems. The advent of CAIS will change how the DoD test community conducts business. The CAIS program will allow aircraft test and evaluation facilities to utilize common airborne systems, ground support equipment, and technical knowledge for airborne instrumentation systems. The CAIS Toolset Software (CTS) provides the capability to generate formats and load/verify airborne memories. The CTS is primarily a software applications program hosted on an IBM compatible portable personal computer with several interface cards. The software will perform most functions without the presence of the interface cards to allow the user to develop test configurations and format loads on a desktop computer.
APA, Harvard, Vancouver, ISO, etc. styles
39

Bakhuizen, Ellinor y Cecilia Landelius. "Diverge - Investigating the Consequences of Bad Comments". Thesis, KTH, Skolan för elektroteknik och datavetenskap (EECS), 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-255047.

Full text
Abstract
In large software development projects the majority of code comments are written at the beginning of the project and tend not to be updated when the code is rewritten. This commonly results in code with incorrect comments or no comments at all. This study intends to answer whether incorrect comments can mislead programmers and whether well-written comments assist programmers in interpreting new code. Furthermore, the attitudes towards code comments are investigated. The research questions were answered with data from forms and code tests run on 35 engineering students. For the code tests, eye trackers were used to provide a clear picture of how much the participants read the code and the associated comments. A majority of the test subjects agreed that comments are important, whilst 8% considered comments to be unnecessary. 50% of the test subjects expressed positive feelings towards writing comments. The data from the eye tracker showed that the test subjects read comments and code equally. The study found that incorrect comments in many cases lead to misconceptions. Correct comments were shown to assist the programmer if the code contained library functions that the programmer was not familiar with. Regarding correct comments versus no comments at all, there was no difference in readability if the code did not contain any library functions.
APA, Harvard, Vancouver, ISO, etc. styles
40

Zarebski, David. "Ontologie naturalisée et ingénierie des connaissances". Thesis, Paris 1, 2018. http://www.theses.fr/2018PA01H232/document.

Full text
Abstract
«What do I need to know about something to know it?» It is no wonder that such a general, hard-to-grasp, riddle-like question remained the exclusive domain of a single discipline for centuries: Philosophy. In this context, the distinction of the primitive components of reality – the so-called "world's furniture" – and their relations is called an Ontology. This work investigates the emergence of similar questions in two different though related fields, namely Artificial Intelligence and Knowledge Engineering. We show here that the way these disciplines apply an ontological methodology to either cognition or knowledge representation is not a mere analogy but raises a set of relevant questions and challenges from both an applied and a speculative point of view. More specifically, we suggest that some of the technical answers to the issues raised by Big Data, i.e. the multiplication and diversification of online data, invite us to revisit many traditional philosophical positions concerning the role of language and common-sense reasoning in thought, as well as the existence of a mind-independent structure of reality.
APA, Harvard, Vancouver, ISO, etc. styles
41

Pratt, Everett S. "Data compression standards and applications to Global Command and Control System". Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1994. http://handle.dtic.mil/100.2/ADA293262.

Full text
Abstract
Thesis (M.S. in Electrical Engineering and M.S. in Systems Technology)--Naval Postgraduate School, December 1994.
Thesis advisor(s): Murali Tummala, Paul H. Moose. "December 1994." Includes bibliographical references. Also available online.
APA, Harvard, Vancouver, ISO, etc. styles
42

Sing, David Kent. "Post Common Envelope Pre-Cataclysmic and Cataclysmic Variable Binaries". Diss., Tucson, Arizona : University of Arizona, 2005. http://etd.library.arizona.edu/etd/GetFileServlet?file=file:///data1/pdf/etd/azu%5Fetd%5F1400%5F1%5Fm.pdf&type=application/pdf.

Full text
APA, Harvard, Vancouver, ISO, etc. styles
43

Eichelberger, John W. "Design and modelling of a link monitoring mechanism for the Common Data Link (CDL)". Thesis, Monterey, Calif. : Springfield, Va. : Naval Postgraduate School ; Available from National Technical Information Service, 1994. http://handle.dtic.mil/100.2/ADA286167.

Full text
APA, Harvard, Vancouver, ISO, etc. styles
44

Ozkaya, Eren Aysegul. "A Method To Decrease Common Problems In Effort Data Collection In The Software Industry". Master's thesis, METU, 2012. http://etd.lib.metu.edu.tr/upload/12614691/index.pdf.

Full text
Abstract
Efficient project planning and project management are crucial to completing software projects on time and to requirements. The most critical stage in project planning is estimation of the software size, time and budget. At this stage, effort data is used for benchmarking data sets, effort estimation, and project monitoring and control. However, there are some problems related to effort data collection in the software industry. In this thesis, a pilot study and a survey study are conducted to observe common practices and problems in effort data collection in the industry, and the results are analyzed. These problems are explained in terms of tool, process and people factors, and solution suggestions are presented for these problems. In accordance with the findings, a method and a tool that facilitate the collection of more accurate data are developed. A case study is performed in order to validate the method and the applicability of the tool in the industry.
APA, Harvard, Vancouver, ISO, etc. styles
45

Kempen, Markus [Verfasser]. "EU wide analysis of the Common Agricultural Policy using spatially disaggregated data / Markus Kempen". Bonn : Universitäts- und Landesbibliothek Bonn, 2013. http://d-nb.info/1045345814/34.

Full text
APA, Harvard, Vancouver, ISO, etc. styles
46

Kalibjian, Jeffrey R. "The Impact of the Common Data Security Architecture (CDSA) on Telemetry Post Processing Architectures". International Foundation for Telemetering, 1999. http://hdl.handle.net/10150/608706.

Full text
Abstract
International Telemetering Conference Proceedings / October 25-28, 1999 / Riviera Hotel and Convention Center, Las Vegas, Nevada
There is an increasing requirement that commercial satellite telemetry data products be protected from unauthorized access during transmission to ground stations. While the technology (cryptography) to secure telemetry data products is well known, the software infrastructure to support such security is costly and highly customized, and many software packages have difficulty interoperating. The Common Data Security Architecture [1] [2] [3] (originally proposed by the Intel Corporation, and now adopted by the Open Group) is a set of common cryptographic [4] and public key infrastructure (PKI) application programming interfaces (APIs) that will facilitate better cryptographic interoperability as well as make cryptographic resources more readily available in telemetry post-processing environments.
APA, Harvard, Vancouver, ISO, etc. styles
47

Hunt, Trent W. "Common Airborne Processing System (CAPS) 2.0: Data Reduction Software on a Personal Computer (PC)". International Foundation for Telemetering, 1997. http://hdl.handle.net/10150/609756.

Full text
Abstract
International Telemetering Conference Proceedings / October 27-30, 1997 / Riviera Hotel and Convention Center, Las Vegas, Nevada
CAPS 2.0 provides a flexible, PC-based tool for meeting evolving data reduction and analysis requirements while supporting standardization of instrumentation data processing. CAPS 2.0 will accept a variety of data types including raw instrumentation, binary, ASCII, and Internet protocol message data and will output Engineering Unit data to files, static or dynamic plots, and Internet protocol message exchange. Additionally, CAPS 2.0 will input and output data in accordance with the Digital Data Standard. CAPS 2.0 will accept multiple input sources of PCM, MIL-STD-1553, or DDS data to create an output for every Output Product Description and Dictionary grouping specified for a particular Session. All of this functionality is performed on a PC within the framework of the Microsoft Windows 95/NT graphical user interface.
APA, Harvard, Vancouver, ISO, etc. styles
48

Jansen, Dorine Yvette Manon. "The use of ringing data in the study of climatic influences on common passerines". Doctoral thesis, University of Cape Town, 2016. http://hdl.handle.net/11427/20645.

Full text
Abstract
To understand the potential impact of forecasted increases in climatic variability we need to determine the impact of climatic stochasticity on demographic rates. This thesis used available long-term ringing data collected by volunteers, augmented by data from research projects, to investigate the influence of climatic variation on survival of 10 common passerines in southern Africa. Through sheer numbers common species are fundamental to ecosystem functioning. Migratory species are subject to climatic stochasticity in breeding and wintering grounds, and during migration. In a population of African Reed Warblers Acrocephalus baeticatus (an azonal wetland specialist) a capture-mark-recapture model correlated higher temperature in the breeding grounds with higher adult survival (1998-2010), but - contrary to expectations - not wetter winters. A spatial analysis using a multi-state model in a Bayesian framework did not link survival in populations across southern Africa to environmental seasonality. However, as hypothesised, migratory populations appeared to survive better than sedentary populations. Increased climatic variation could synchronize survival of species assemblages and colonies in meta-populations. I investigated a 3-species assemblage in climatically stable fynbos (2000-2007) and a 4-species assemblage in more seasonal wetland (1999-2013) with a hierarchical model, run in WinBUGS, with a temporal, synchronous (common) and asynchronous (species-specific) component. Comparison of models with and without climatic covariates quantified the impact of climatic stochasticity as a synchronizing and desynchronizing agent. As expected, the wetland assemblage exhibited more synchronous and asynchronous variation in survival than the fynbos assemblage, but the analysis did not find evidence of climatic forcing. Demographic rates of a population of 25 colonies of a Sociable Weaver Philetairus socius meta-population in savanna near Kimberley did not correlate with climatic indices during 1993-2014. Age-specific survival and fecundity of the largest colony were influenced by climatic variation reinforcing earlier inference that colonies respond differently to environmental stochasticity. The integrated population model using count, ringing, and productivity data enabled the first estimation of annual fecundity, juvenile survival and recruitment. The volunteer data yielded the first estimates of adult survival of two African endemics and estimates of a second population for three other species. A review of volunteer ringing resulted in recommendations to improve its use from a demographic perspective.
APA, Harvard, Vancouver, ISO, etc. styles
49

Sundland, Joseph J. Carroll Christopher J. "Transforming data and metadata into actionable intelligence and information within the maritime domain". Monterey, Calif. : Naval Postgraduate School, 2008. http://handle.dtic.mil/100.2/ADA483649.

Full text
Abstract
Thesis (M.S. in Information Technology Management)--Naval Postgraduate School, June 2008. Thesis (M.S. in Systems Technology (Command, Control, and Communications (C3)))--Naval Postgraduate School, June 2008.
Thesis Advisor(s): MacKinnon, Douglas. "June 2008." Description based on title screen as viewed on August 25, 2008. M.S. in Information Technology Management awarded to Joseph J. Sundland, June 2008. M.S. in Systems Technology (Command, Control, and Communications (C3)) awarded to Christopher J. Carroll, June 2008. Includes bibliographical references (p. 93-96). Also available in print.
APA, Harvard, Vancouver, ISO, etc. styles
50

Jeong, Ki Tai. "A Common Representation Format for Multimedia Documents". Thesis, University of North Texas, 2002. https://digital.library.unt.edu/ark:/67531/metadc3336/.

Full text
Abstract
Multimedia documents are composed of multiple file format combinations, such as image and text, image and sound, or image, text and sound. The type of multimedia document determines the form of analysis for knowledge architecture design and retrieval methods. Over the last few decades, theories of text analysis have been proposed and applied effectively. In recent years, theories of image and sound analysis have been proposed to work with text retrieval systems and have progressed quickly, due in part to rapid advances in computer processing speed. Retrieval of multimedia documents formerly was divided into the categories of image and text, and image and sound. While the standard retrieval process begins from text only, methods are being developed that allow the retrieval process to be accomplished simultaneously using text and image. Although image processing for feature extraction and text processing for term extraction are well understood, there are no prior methods that can combine these two features into a single data structure. This dissertation introduces a common representation format for multimedia documents (CRFMD) composed of both images and text. For image and text analysis, two techniques are used: the Lorenz Information Measurement and the Word Code. A new process named Jeong's Transform is demonstrated for extraction of text and image features, combining the two previous measurements to form a single data structure. Finally, this single data structure is analyzed by using multi-dimensional scaling. This allows multimedia objects to be represented on a two-dimensional graph as vectors, where the distance between vectors represents the magnitude of the difference between multimedia documents. This study shows that image classification on a given test set is dramatically improved when text features are encoded together with image features. This effect appears to hold true even when the available text is diffuse and not uniform with the image features. This retrieval system works by representing a multimedia document as a single data structure. CRFMD is applicable to other areas of multimedia document retrieval and processing, such as medical image retrieval, World Wide Web searching, and museum collection retrieval.
APA, Harvard, Vancouver, ISO, etc. styles