Dissertations on the topic "Bid Optimizer"

To view other types of publications on this topic, follow the link: Bid Optimizer.

Format your source in APA, MLA, Chicago, Harvard and other citation styles

Choose a source type:

Consult the top 41 dissertations for your research on the topic "Bid Optimizer".

Next to every source in the list of references there is an "Add to bibliography" button. Press it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scholarly publication as a .pdf file and read its abstract online, whenever such details are available in the metadata.

Browse dissertations on a wide variety of disciplines and compile your bibliography correctly.

1

Bhandare, Ashray Sadashiv. "Bio-inspired Algorithms for Evolving the Architecture of Convolutional Neural Networks." University of Toledo / OhioLINK, 2017. http://rave.ohiolink.edu/etdc/view?acc_num=toledo1513273210921513.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
2

Correll, David. "Optimized landscape plans for bio-oil production." [Ames, Iowa : Iowa State University], 2009. http://gateway.proquest.com/openurl?url_ver=Z39.88-2004&rft_val_fmt=info:ofi/fmt:kev:mtx:dissertation&res_dat=xri:pqdiss&rft_dat=xri:pqdiss:1464191.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
3

Schamberger, Stefan. "Shape optimized graph partitioning." [S.l.] : [s.n.], 2006. http://deposit.ddb.de/cgi-bin/dokserv?idn=983282455.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
4

Mazzotti, Matilde <1986>. "Physiological studies to optimize algal biomass production in phytoremediation processes." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2015. http://amsdottorato.unibo.it/6934/1/tesi_Matilde_Mazzotti.pdf.

Full text of the source
Abstract:
Nowadays microalgae are studied, and a number of species already mass-cultivated, for their application in many fields: food and feed, chemicals, pharmaceutical, phytoremediation and renewable energy. Phytoremediation, in particular, can become a valid integrated process in many algae biomass production systems. This thesis is focused on the physiological and biochemical effects of different environmental factors, mainly macronutrients, lights and temperature on microalgae. Microalgal species have been selected on the basis of their potential in biotechnologies, and nitrogen occurs in all chapters due to its importance in physiological and applicative fields. There are 5 chapters, ready or in preparation to be submitted, with different specific matters: (i) to measure the kinetic parameters and the nutrient removal efficiencies for a selected and local strain of microalgae; (ii) to study the biochemical pathways of the microalga D. communis in presence of nitrate and ammonium; (iii) to improve the growth and the removal efficiency of a specific green microalga in mixotrophic conditions; (iv) to optimize the productivity of some microalgae with low growth-rate conditions through phytohormones and other biostimulants; and (v) to apply the phyto-removal of ammonium in an effluent from anaerobic digestion. From the results it is possible to understand how a physiological point of view is necessary to provide and optimize already existing biotechnologies and applications with microalgae.
Styles: APA, Harvard, Vancouver, ISO, etc.
5

Mazzotti, Matilde <1986>. "Physiological studies to optimize algal biomass production in phytoremediation processes." Doctoral thesis, Alma Mater Studiorum - Università di Bologna, 2015. http://amsdottorato.unibo.it/6934/.

Full text of the source
Abstract:
Nowadays microalgae are studied, and a number of species already mass-cultivated, for their application in many fields: food and feed, chemicals, pharmaceutical, phytoremediation and renewable energy. Phytoremediation, in particular, can become a valid integrated process in many algae biomass production systems. This thesis is focused on the physiological and biochemical effects of different environmental factors, mainly macronutrients, lights and temperature on microalgae. Microalgal species have been selected on the basis of their potential in biotechnologies, and nitrogen occurs in all chapters due to its importance in physiological and applicative fields. There are 5 chapters, ready or in preparation to be submitted, with different specific matters: (i) to measure the kinetic parameters and the nutrient removal efficiencies for a selected and local strain of microalgae; (ii) to study the biochemical pathways of the microalga D. communis in presence of nitrate and ammonium; (iii) to improve the growth and the removal efficiency of a specific green microalga in mixotrophic conditions; (iv) to optimize the productivity of some microalgae with low growth-rate conditions through phytohormones and other biostimulants; and (v) to apply the phyto-removal of ammonium in an effluent from anaerobic digestion. From the results it is possible to understand how a physiological point of view is necessary to provide and optimize already existing biotechnologies and applications with microalgae.
Styles: APA, Harvard, Vancouver, ISO, etc.
6

Vanden Berg, Andrew M. "Optimization-simulation framework to optimize hospital bed allocation in academic medical centers." Thesis, Massachusetts Institute of Technology, 2018. http://hdl.handle.net/1721.1/120223.

Full text of the source
Abstract:
Thesis: S.M., Massachusetts Institute of Technology, Sloan School of Management, Operations Research Center, 2018.
Cataloged from PDF version of thesis.
Includes bibliographical references (pages 99-100).
Congestion, overcrowding, and increasing patient wait times are major challenges that many large, academic centers currently face. To address these challenges, hospitals must effectively utilize available beds through proper strategic bed allocation and robust operational day-to-day bed assignment policies. Since patient daily demand for beds is highly variable, it is frequent that the physical capacity allocated to a given clinical service is not sufficient to accommodate all of the patients who belong to that service. This situation could lead to extensive wait time of patients in various locations in the hospital (e.g., the emergency department), as well as clinically and operationally undesirable misplacements of patients in hospital floors/beds that are managed by other clinical services than the ones to which the patients belong. In this thesis, we develop an optimization-simulation framework to optimize the bed allocation at Mass General Hospital. Detailed, data-driven simulation suggests that the newly proposed bed allocation would lead to significant reduction in patient intra-day wait time in the emergency department and other hospital locations, as well as a major reduction in the misplacements of patients in the Medicine service, which is the largest service in the hospital. We employ a two-pronged approach. First, we developed a detailed simulation setting of the entire hospital that could be used to assess the effectiveness of day-to-day operational bed assignment policies given a specific bed allocation. However, the simulation does not allow tractable optimization that seeks to find the best bed allocation among all possible allocations. This motivates the development of a network-flow/network design inspired mixed integer program that approximates the operational performance of bed allocations and allows us to effectively search for approximately the best allocation. The mixed integer program can be solved via a scenario sampling approach to provide candidate bed allocations. These are then tested and evaluated via the simulation setting. These tools facilitate expert discussions on how to modify the existing bed allocation at MGH to improve the day-to-day performance of the bed assignment process.
by Andrew M. Vanden Berg.
S.M.
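The abstract above pairs a data-driven hospital simulation with a network-design mixed integer program; neither model is published here. As a rough illustration of the simulation side only, the Python sketch below scores a fixed bed allocation by the average number of patients who would not fit their service's beds. The service names and demand figures are invented, not taken from the thesis.

```python
import random

# Hypothetical illustration: compare two fixed bed allocations by the number of
# "misplaced" patients they produce under random daily demand. This is NOT the
# MGH model from the thesis, only a toy Monte Carlo sketch of the idea that a
# simulator can score candidate allocations before experts review them.

SERVICES = {"medicine": 120, "surgery": 60, "cardiology": 40}  # mean daily census (made up)

def simulate_misplacements(allocation, days=365, seed=0):
    rng = random.Random(seed)
    misplaced = 0.0
    for _ in range(days):
        for service, mean_demand in SERVICES.items():
            demand = rng.gauss(mean_demand, mean_demand * 0.15)  # noisy daily demand
            overflow = max(0.0, demand - allocation[service])    # patients without a "home" bed
            misplaced += overflow
    return misplaced / days  # average misplacements per day

current = {"medicine": 110, "surgery": 70, "cardiology": 45}
proposed = {"medicine": 130, "surgery": 55, "cardiology": 40}

print("current  :", round(simulate_misplacements(current), 1), "misplaced patients/day")
print("proposed :", round(simulate_misplacements(proposed), 1), "misplaced patients/day")
```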
Styles: APA, Harvard, Vancouver, ISO, etc.
7

Thelen, Andrea. "Optimized surface extraction from holographic data." [S.l.] : [s.n.], 2006. http://deposit.ddb.de/cgi-bin/dokserv?idn=980418798.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
8

Bär, Werner. "Optimized delivery of intensity modulated radiotherapy." [S.l.] : [s.n.], 2002. http://deposit.ddb.de/cgi-bin/dokserv?idn=965610934.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
9

Abbruzzese, Vito. "Using bio-manipulation to optimise nutrient management within intensive farm systems." Thesis, Lancaster University, 2017. http://eprints.lancs.ac.uk/86607/.

Full text of the source
Abstract:
Optimising the use of organic amendments, such as livestock slurry, on commercial farms represents one route through which the reliance of agricultural production on inorganic fertiliser use might be reduced. For economic, environmental and geopolitical reasons, decoupling future agricultural production from inorganic fertiliser use is desirable, particularly if increases in future demand for food at global scale are to be met sustainably. However, there remains substantial uncertainty surrounding the impacts of organic amendments on many of the key physico-chemical and microbial properties of agricultural soils. This uncertainty reduces the likelihood that land owners and land managers will adjust farming practices in order to deliver more widespread use of organic amendments to support production. In this context, the research reported in this thesis sought to understand how the management of livestock slurry within intensive grassland systems can be optimised to support production. The thesis had a particular focus on understanding how the soil microbial community mediates the input of livestock slurry, in terms of the influence of this community on the cycling and crop-availability of macronutrients within soil. The thesis first examined the impact of a biological slurry additive, SlurryBugs, on the nutrient content of livestock slurry during storage, finding positive effects of the additive particularly with respect to the total phosphorus (P), where an increase by 27% was observed compared to the control slurry treatment, and the total solids contents of slurry during storage. It was hypothesised that the SB additive may have altered the emission of phosphine (PH3) from slurry during storage. Subsequently, the impacts of slurry application, both with and without the biological additive, on soil organic matter (SOM), as well as on the nitrogen (N) and P content of grassland soils were examined, in comparison to inorganic fertiliser and control treatments. Positive effects following slurry application were observed, spanning SOM, Olsen P, mineral N and soil pH conditions. Finally, the impacts of applying slurry alongside a range of carbon (C) substrates of different quality (glucose, glucose-6-phosphate (G6P), and cellulose) to a grassland soil were examined, in terms of the partitioning of C within soil as mediated by the microbial community and in terms of changes in the structure and biomass of the soil microbial community. The results revealed an increase in the soil microbial biomass, as well as a decrease in the cumulative respiration, following the application of both slurry types, alongside a carbohydrate, compared to the treatment with the carbohydrate alone, likely due to a microbial metabolic mechanism known as preferential substrate utilisation. In addition, a bacterial predominance within the soil microbial community was observed in all treatments, with increasing dominance of fungi toward the end of the 49-day incubations. This thesis also revealed that the quality of C substrates represented a major factor affecting both the extent of mineralisation and of incorporation of externally-derived C into microbial biomass. The application of 14C-glucose or 14C-G6P to soil resulted in a significantly greater incorporation of 14C into microbial biomass by 68 or 57%, respectively, compared to 41% following the 14C-cellulose application. 
Further, the addition of US slurry alongside 14C-glucose generated a significantly greater extent of mineralisation by 30%, compared to the treatments with AS slurry or with only 14C-glucose added with 19 and 21%, respectively. Taken together, the data reported within this thesis have potentially important implications for the way in which livestock slurry is managed as a nutrient resource on commercial farms, as well as for broader environmental concerns including the acidification of agricultural soils and the impact of agricultural soils on the global C cycle.
Styles: APA, Harvard, Vancouver, ISO, etc.
10

陳從輝 and Chung-fai Chan. "MOS parameter extraction globally optimized with genetic algorithm." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1996. http://hub.hku.hk/bib/B31212785.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
11

Peters, Mathias. "Scheduling workflows to optimize for execution time." Thesis, Uppsala universitet, Institutionen för informatik och media, 2018. http://urn.kb.se/resolve?urn=urn:nbn:se:uu:diva-354464.

Full text of the source
Abstract:
Many functions in today’s society are immensely dependent on data. Data drives everything from business decisions to self-driving cars to intelligent home assistants like Amazon Echo and Google Home. To make good decisions based on data, of which exabytes are generated every day, somehow that data has to be processed. Data processing can be complex and time-consuming. One way of reducing the complexity is to create workflows that consist of several steps that together produce the right result. Klarna is an example of a company that relies on workflows for transforming and analyzing data. As a company whose core business involves analyzing customer data, being able to do those analyses faster will lead to direct business value in the form of more well-informed decisions. The workflows Klarna use are currently all written in a sequential form. However, workflows, where independent tasks are executed in parallel, are more performant than workflows where only one task is executed at any point in time. Due to limitations in human attention span, parallelized workflows are harder for humans to write, compared to sequential workflows. In this work, a computer application was created that automates the parallelization of a workflow to let humans write sequential workflows while still getting the performance of parallelized workflows. The application does this by taking a simple sequential workflow, identifies dependencies in the workflow and then schedules it in a way that is as parallel as possible given the identified dependencies. Such a solution has not been created before. However, experimental evaluation shows that parallelization of a sequential workflow used in daily production at Klarna can reduce execution time by up to 80%, showing that the application can bring value to Klarna and other organizations that use workflows to analyze big data.
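The abstract describes extracting dependencies from a sequential workflow and then scheduling independent tasks in parallel. A minimal sketch of that scheduling step, assuming the dependencies have already been identified (the task names below are invented, and this is not Klarna's actual tool), is a level-by-level topological sort in which every task within a level can run concurrently:

```python
from collections import defaultdict, deque

def parallel_levels(dependencies):
    """Group tasks into levels; tasks within a level have no mutual dependencies
    and can run in parallel (Kahn-style topological layering)."""
    indegree = defaultdict(int)
    children = defaultdict(list)
    tasks = set(dependencies)
    for task, deps in dependencies.items():
        tasks.update(deps)
        for d in deps:
            indegree[task] += 1
            children[d].append(task)
    level = deque(t for t in tasks if indegree[t] == 0)
    levels = []
    while level:
        levels.append(sorted(level))
        nxt = deque()
        for t in level:
            for c in children[t]:
                indegree[c] -= 1
                if indegree[c] == 0:
                    nxt.append(c)
        level = nxt
    return levels

# Hypothetical workflow: extract -> (clean, dedupe) -> join -> report
deps = {"clean": ["extract"], "dedupe": ["extract"], "join": ["clean", "dedupe"], "report": ["join"]}
print(parallel_levels(deps))  # [['extract'], ['clean', 'dedupe'], ['join'], ['report']]
```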
Styles: APA, Harvard, Vancouver, ISO, etc.
12

Chirinos, Santander Lizett Rosario, and Llacta Julio Cesar Pecho. "Implementación de la metodología BIM en la construcción del proyecto multifamiliar DUPLO para optimizar el costo establecido." Master's thesis, Universidad Peruana de Ciencias Aplicadas (UPC), 2019. http://hdl.handle.net/10757/626030.

Full text of the source
Abstract:
The world of construction is changing, always looking for continuous improvement and optimization of processes. This creates the need to use new methodologies already applied in other countries where construction processes are being industrialized. Among these methodologies are: Information and Communication Technologies, which point to the correct way to manage a construction project; the Lean Project Delivery System, introduced by Ballard (2000); Virtual Design and Construction (VDC); and the BIM Execution Plan (BEP), the BEP being the plan for implementing BIM in a project. We chose to use the BIM methodology in the DUPLO multifamily project located in the district of Breña, which has 28 floors and 5 basements. The execution period granted under the contract was 12 months. The purpose of applying the BIM methodology is to control the cost of the project against the awarded budget. With the application of BIM we intend to avoid: downtime caused by queries to the designers during construction; schedule extensions due to undefined items or unresolved RFIs; and the extra cost that a schedule extension entails (higher overheads, additional cost of renting equipment, machines and tools, and labor). Applying the BIM methodology to the Duplo project serves to identify all the clashes between the various specialties before execution in the field. By holding collaborative meetings and/or ICE sessions led by a BIM Manager, incompatibilities are identified and resolved, and each participant knows the role they play. At the end of the project, the results of applying the BIM methodology are evaluated; in our case the result was favorable. Keyword: the BIM methodology reduces incompatibilities in a project, which optimizes and controls the cost established in the budget.
Research paper
Styles: APA, Harvard, Vancouver, ISO, etc.
13

Hartunians, Jordan. "High temperature H2 bio-production in Thermococcales models : setting up bases optimized high pressure solutions." Thesis, Brest, 2020. http://www.theses.fr/2020BRES0033.

Full text of the source
Abstract:
H2, a promising energy vector, can be synthesized by Thermococcales. High pressure (HP) could influence the associated metabolism but had not been considered in practice. After screening isolates for substrate degradation and H2 yields, T. barophilus MPT, which grows optimally at 40 MPa, was chosen as a model and its metabolism was characterized in an applied context. Methods for HP culture were optimized for H2 studies. Our HP bioreactor for continuous culture underwent major improvements: this 400 mL vessel, able to maintain corrosive fluids at hydrostatic (up to 120 MPa) and gas (up to 40 MPa) pressures at up to 150 °C, served to assess the H2 production of our strain at high gas pressure. We also created a compressible device for discontinuous, leak-free gas-phase incubations, which allowed measurement of T. barophilus H2 production under hydrostatic HP. HP adaptations of T. barophilus were observed thanks to prior deletions of key genes (mbh, mbs, co-mbh, shI, shII). We refined the roles of each enzyme concerned by assessing growth, end-products (H2, H2S, acetate) and gene expression of the mutants at 0.1 and 40 MPa. Additionally, we enhanced H2 tolerance in our model by adaptive laboratory evolution. "Evol", the resulting strain acclimatized to H2-saturating conditions for 76 generations, grew in 10% H2, unlike the parent strain. To understand this adaptation, we compared the end-products (H2, H2S, acetate), transcriptomes and genomes of both strains. 119 mutations were detected, and the H2 metabolism was changed in the new variant. This work underlines the interest of Thermococcales' piezophily for H2 bio-production and allows optimization strategies to be proposed.
Styles: APA, Harvard, Vancouver, ISO, etc.
14

Hildebrand, Matthias. "Optimized network access in heterogeneous wireless networks." Kassel : Kassel Univ. Press, 2005. http://deposit.d-nb.de/cgi-bin/dokserv?idn=977677540.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
15

Poon, Tung-chung Jensen, and 潘冬松. "Laparoscopic colorectal resection: the impact on clinical outcomes & strategies to further optimize its results." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2010. http://hub.hku.hk/bib/B45205711.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
16

Fobel, Oliver. "Auditory brainstem and middle latency responses with optimized stimuli experiments and models /." [S.l.] : [s.n.], 2003. http://deposit.ddb.de/cgi-bin/dokserv?idn=967608074.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
17

Serfontein, Dawid Eduard. "Deep burn strategy for the optimized incineration of reactor waste plutonium in pebble bed high temperature gas–cooled reactors / Serfontein D.E." Thesis, North-West University, 2013. http://hdl.handle.net/10394/8069.

Full text of the source
Abstract:
In this thesis, advanced fuel cycles for the incineration, i.e. deep burn, of weapons-grade plutonium, reactor-grade plutonium from pressurised light water reactors, and reactor-grade plutonium plus the associated Minor Actinides in the 400 MWth Pebble Bed Modular Reactor Demonstration Power Plant were simulated with the VSOP 99/05 diffusion code. These results were also compared to the standard 9 g/fuel sphere U/Pu 9.6% enriched uranium fuel cycle. The addition of the Minor Actinides to the reactor-grade plutonium caused an unacceptable decrease in the burn-up and thus an unacceptable increase in the heavy metal (HM) content in the spent fuel, which is intended for direct disposal in a deep geological repository, without chemical reprocessing. All the Pu fuel cycles failed the adopted safety limits in that either the maximum fuel temperature of 1130°C, during normal operation, or the maximum power of 4.5 kW/sphere was exceeded. All the Pu cycles also produced positive Uniform Temperature Reactivity Coefficients, i.e. the coefficient where the temperature of the fuel and the graphite moderator in the fuel spheres are varied together. These positive temperature coefficients were experienced at low temperatures, typically below 700°C. This was due to the influence of the thermal fission resonance of 241Pu. The safety performance of the weapons-grade plutonium was the worst. The safety performance of the reactor-grade plutonium also deteriorated when the heavy metal loading was reduced from 3 g/sphere to 2 g or 1 g. In view of these safety problems, these Pu fuel cycles were judged to be not licensable in the PBMR DPP-400 reactor. Therefore a redesign of the fuel cycle for reactor-grade plutonium, the power conversion system and the reactor geometry was proposed in order to solve these problems. The main elements of these proposals are: 1. The use of 3 g reactor-grade plutonium fuel spheres should be the point of departure. 232Th will then be added in order to restore negative Uniform Temperature Reactivity Coefficients. 2. The introduction of neutron poisons into the reflectors, in order to suppress the power density peaks and thus the temperature peaks. 3. In order to counter the reduction in burn-up by this introduction of neutron poisons, a thinning of the central reflector was proposed.
Thesis (PhD (Nuclear Engineering))--North-West University, Potchefstroom Campus, 2012.
Styles: APA, Harvard, Vancouver, ISO, etc.
18

Brockmeier, Ulf. "New strategies to optimize the secretion capacity for heterologous proteins in Bacillus subtilis." [S.l.] : [s.n.], 2006. http://deposit.ddb.de/cgi-bin/dokserv?idn=980650771.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
19

Sedaghat, Maman Reza. "Fault emulation reconfigurable hardware based fault simulation using logic emulation systems with optimized mapping /." [S.l. : s.n.], 1999. http://deposit.ddb.de/cgi-bin/dokserv?idn=95853893X.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
20

Huapaya, Vásquez José Alberto. "Integración de entorno big data en la gestión financiera de un sistema bancario para optimizar productos, servicios internos y toma de decisiones." Bachelor's thesis, Universidad Nacional Mayor de San Marcos, 2021. https://hdl.handle.net/20.500.12672/17040.

Full text of the source
Abstract:
Describes the integration of a Big Data environment into the product and service management of a banking institution, optimizing financial products and decision-making, in a setting where the different business areas have isolated systems and databases, causing higher consumption of computing resources, greater maintenance effort and, in many cases, process delays. This problem is common in financial institutions and becomes critical when the company is transnational, multiplying these isolated databases, as in our case. To address this, the Data Architecture area set out guidelines such as centralizing the information in a big data environment, ensuring progressive access for business users to new financial analytics initiatives and thereby gradually retiring the isolated databases. The big data environment comprises a data ingestion layer into a Data Lake using Spark's distributed processing, and an information consumption layer based on sandboxes, where users can perform advanced analytics and reporting with business intelligence tools. As a result of the integration, several projects now run on this Big Data environment and are in production with governed data, defined flows and better response times for processes that previously ran on traditional databases. With this, the business areas have more efficient products and services.
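The abstract outlines a Data Lake ingestion layer based on Spark's distributed processing plus sandboxes for analytics. As a hypothetical sketch of such an ingestion step (the paths, table name and columns are invented; the thesis does not publish its code), a PySpark job might read raw extracts and persist them as partitioned Parquet in the lake:

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

# Hypothetical ingestion job: raw CSV extracts -> governed Parquet zone of a data lake.
spark = SparkSession.builder.appName("ingest_transactions").getOrCreate()

raw = (spark.read
       .option("header", True)
       .csv("s3a://bank-raw/transactions/2021/*.csv"))   # invented source path

curated = (raw
           .withColumn("ingest_date", F.current_date())   # simple lineage column
           .dropDuplicates(["transaction_id"]))           # assumed business key

(curated.write
 .mode("overwrite")
 .partitionBy("ingest_date")
 .parquet("s3a://bank-lake/curated/transactions/"))       # invented lake path

spark.stop()
```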
Styles: APA, Harvard, Vancouver, ISO, etc.
21

Zhang, Jingdong, and 張敬東. "Development of optimized deconvoluted coincidence doppler broadening spectroscopy and deep level transient spectroscopies with applications to various semiconductor materials." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 2006. http://hub.hku.hk/bib/B38279010.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
22

Stroka, Jörg. "Determination of aflatoxins in food and feed with simple and optimised methods." [S.l. : s.n.], 2000. http://deposit.ddb.de/cgi-bin/dokserv?idn=963266624.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
23

Hokam, Essam Mohamed. "Computer based expert system to optimize the water supply for modern irrigation systems in selected regions in Egypt." [S.l. : s.n.], 2002. http://deposit.ddb.de/cgi-bin/dokserv?idn=964676680.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
24

Urbanovsky, Joshua C. "Computational Methods to Optimize High-Consequence Variants of the Vehicle Routing Problem for Relief Networks in Humanitarian Logistics." Thesis, University of North Texas, 2018. https://digital.library.unt.edu/ark:/67531/metadc1248473/.

Full text of the source
Abstract:
Optimization of relief networks in humanitarian logistics often exemplifies the need for solutions that are feasible given a hard constraint on time. For instance, the distribution of medical countermeasures immediately following a biological disaster event must be completed within a short time-frame. When these supplies are not distributed within the maximum time allowed, the severity of the disaster is quickly exacerbated. Therefore emergency response plans that fail to facilitate the transportation of these supplies in the time allowed are simply not acceptable. As a result, all optimization solutions that fail to satisfy this criterion would be deemed infeasible. This creates a conflict with the priority optimization objective in most variants of the generic vehicle routing problem (VRP). Instead of efficiently maximizing usage of vehicle resources available to construct a feasible solution, these variants ordinarily prioritize the construction of a minimum cost set of vehicle routes. Research presented in this dissertation focuses on the design and analysis of efficient computational methods for optimizing high-consequence variants of the VRP for relief networks. The conflict between prioritizing the minimization of the number of vehicles required or the minimization of total travel time is demonstrated. The optimization of the time and capacity constraints in the context of minimizing the required vehicles are independently examined. An efficient meta-heuristic algorithm based on a continuous spatial partitioning scheme is presented for constructing a minimized set of vehicle routes in practical instances of the VRP that include critically high-cost penalties. Multiple optimization priority strategies that extend this algorithm are examined and compared in a large-scale bio-emergency case study. The algorithms designed from this research are implemented and integrated into an existing computational framework that is currently used by public health officials. These computational tools enhance an emergency response planner's ability to derive a set of vehicle routes specifically optimized for the delivery of resources to dispensing facilities in the event of a bio-emergency.
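Because the variant described here treats the delivery deadline as a hard feasibility constraint rather than a cost to minimize, a simple way to convey the difference is a greedy route builder that opens a new vehicle whenever adding the next site would break the deadline. The sketch below uses made-up coordinates and is not the spatial-partitioning meta-heuristic from the dissertation:

```python
import math

def build_routes(depot, sites, deadline, speed=1.0):
    """Greedy nearest-neighbour construction: keep appending the closest unvisited
    site to the current route; start a new vehicle when the route would exceed the
    hard time limit. Minimising vehicles is secondary to meeting the deadline."""
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    unvisited = set(sites)
    routes = []
    while unvisited:
        route, pos, elapsed = [], depot, 0.0
        while unvisited:
            nxt = min(unvisited, key=lambda s: dist(pos, s))
            t = dist(pos, nxt) / speed
            if elapsed + t > deadline:
                break                      # this vehicle is "full" time-wise
            route.append(nxt)
            elapsed += t
            pos = nxt
            unvisited.remove(nxt)
        if not route:                      # a single site is unreachable in time
            raise ValueError("infeasible: some site cannot be reached within the deadline")
        routes.append(route)
    return routes

# Made-up dispensing sites on a grid, 6 time units allowed per vehicle (two routes result).
sites = [(1, 0), (2, 0), (0, 2), (0, 3), (4, 4)]
print(build_routes(depot=(0, 0), sites=sites, deadline=6.0))
```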
Styles: APA, Harvard, Vancouver, ISO, etc.
25

Rivas, Montenegro Alfredo, and Lévano Jose María Paredes. "Solución tecnológica para optimizar el proceso de identificación y atención de accidentes de tránsito en Lima Metropolitana utilizando Smart Traffic." Bachelor's thesis, Universidad Peruana de Ciencias Aplicadas (UPC), 2020. http://hdl.handle.net/10757/653688.

Full text of the source
Abstract:
Currently, Peru has the highest mortality rate in the region for traffic accidents, with more than 50,000 vehicle accidents in Lima and more than 500 deaths per year, according to the Policía Nacional del Perú. This is mainly due to the recklessness of drivers and/or pedestrians, but also to delays in response caused by the traffic conditions in the city, which increase the rate of fatalities. The responsible local entities are developing projects to improve traffic; however, it has been identified that current solutions do not have sufficient features to cover the problems presented. The main objective of this project is to implement a technological solution based on Smart Traffic to optimize the identification of and response to traffic accidents in Lima Metropolitana, using road infrastructure, technology and analytical solutions. The solution consists of data capture through various devices, data processing and analysis through an analytics engine, the execution of real-time actions for the detection of and response to accidents, and the presentation of results and reports, all supported by the existing road infrastructure and by technologies such as Internet of Things, Big Data & Analytics and Cloud Computing. The proposed model has been validated on test scenarios with existing metrics, and a continuity plan has been drawn up to ensure the viability and high availability of the proposal.
Thesis
Styles: APA, Harvard, Vancouver, ISO, etc.
26

Rukskul, Pataravit. "Analysis of different respiratory and blood gas parameters to optimize brain tissue oxygen tension (PtiO2) in patients with acute subarachnoid hemorrhage." [S.l.] : [s.n.], 2003. http://deposit.ddb.de/cgi-bin/dokserv?idn=970018304.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
27

Debellemanière, Eden. "Optimiser le sommeil lent profond par des méthodes innovantes et non-invasives : mise au point technique, applicabilité et conséquences chez le sujet sain en restriction chronique de sommeil." Thesis, Sorbonne Paris Cité, 2018. https://wo.app.u-paris.fr/cgi-bin/WebObjects/TheseWeb.woa/wa/show?t=2369&f=17185.

Full text of the source
Abstract:
More than one third of European workers are affected by situations that feed chronic sleep restriction (CSR), which impairs their cognitive and physiological functions and, in the long run, affects their health. Identifying and developing countermeasures that can easily be used outside the laboratory and that optimize sleep by increasing the proportion or quality of slow-wave sleep (SWS) is necessary to limit the short- and long-term consequences of this sleep debt. The aim of this thesis was to develop and evaluate the benefits of two countermeasures, hypnotic suggestions and auditory stimulation of slow waves during SWS, in healthy subjects without sleep debt and then during a period of CSR. Our main results show that hypnotic suggestions increase total sleep time and SWS duration during short naps in workers without sleep debt. Under CSR, they induce earlier recovery of cognitive and physiological capacities compared with standard naps. Our research also shows that slow-wave auditory stimulation algorithms can be optimized using pink noise, and that such stimulation allows earlier recovery of subjective sleepiness and of response speed on a sustained-attention task after a period of CSR. In this study, the sounds were delivered by the Dreem headband, after its reliability for measuring sleep and delivering such stimulation autonomously had first been validated.
Styles: APA, Harvard, Vancouver, ISO, etc.
28

Kannan, Ramakrishnan. "A Nash Bargaining Based Bid Optimizer for Sponsored Search Auctions." Thesis, 2008. https://etd.iisc.ac.in/handle/2005/4609.

Full text of the source
Abstract:
The on-line advertising market involving displaying of Ads against search results by a search engine is growing at a fast rate. A majority of the search engine companies sell their advertising space through auctions which are popularly called sponsored search auctions. In the face of fierce competition, each search engine company faces the threat of its advertisers switching to an alternative search engine in quest of better bang-per-buck. In order to retain customers, the search engine companies have introduced many utilities such as bid optimizer, ad slotting, and budget optimizer to promise enhanced bang-per-buck to the advertisers. The focus of this work is on the bid optimizer which is a software agent that automatically chooses bid values on behalf of the bidders. Bidders will provide a target budget to exhaust and a maximum bid value (willingness to pay). The bid optimizer aims to maximize the bang-per-buck by adjusting the bid amount based on the projected keyword traffic and remaining budget. In this thesis, we develop a novel bid optimizer based on Nash bargaining theory. The thesis is in two parts. In the first part, we address continuous bids while in the second part, we address discrete bids. Our formulation consists of a two person Nash bargaining model, where, the auctioneer is one player and a virtual aggregated agent representing all the n bidders is the other player. In the continuous bids case, the Nash bargaining formulation leads to a convex hull of the utility space of these two players. We show that this convex hull has exactly three corner points which can be computed in O(nlogn) time. Next, we show that the Nash bargaining solution, a point in this convex hull can be mapped to a bid profile of the bidders in O(n) time. Such a bid profile happens to be a local envy-free equilibrium of the underlying game that gets induced among the bidders due to the combined effect of auction rules and the bid optimizer. In the discrete bids case, we show that the Nash bargaining solution always lies on a certain edge of the convex hull such that one end point of the edge is the vector of maximum willingness to pay of all the bidders. We show that the other endpoint of this edge can be computed as a solution of a linear programming problem. We also map this solution to a bid profile of the bidders. For both continuous bids and discrete bids, we describe an algorithmic approach for implementing our idea. We also show experimentally that the proposed bid optimization algorithm, in conjunction with the generalized second price (GSP) auction as the underlying mechanism, outperforms the GSP when we take into account the realistic possibility of bidders dropping from the auction.
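The dissertation's optimizer computes a Nash bargaining solution between the auctioneer and an aggregated bidder agent and maps it back to a bid profile; that computation is not reproduced here. The sketch below only illustrates the two standard ingredients the abstract relies on, the generalized second price (GSP) allocation rule and the idea of throttling bids against a remaining budget, with invented numbers and a deliberately naive pacing rule:

```python
def gsp_allocate(bids, ctrs):
    """Generalized second-price (GSP) auction: slots go to the highest bidders;
    each winner pays the next-highest bid per click. `ctrs` are the slot
    click-through rates, highest slot first."""
    ranked = sorted(bids.items(), key=lambda kv: kv[1], reverse=True)
    outcome = []
    for slot, ctr in enumerate(ctrs):
        if slot >= len(ranked):
            break
        bidder, _ = ranked[slot]
        price = ranked[slot + 1][1] if slot + 1 < len(ranked) else 0.0
        outcome.append({"bidder": bidder, "slot": slot, "ctr": ctr, "cpc": price})
    return outcome

def paced_bid(max_bid, remaining_budget, expected_clicks_left):
    """Toy budget-pacing rule (not the Nash-bargaining optimizer of the thesis):
    never bid above the willingness to pay, and throttle the bid so the
    remaining budget is spread over the clicks still expected."""
    if expected_clicks_left <= 0:
        return 0.0
    return min(max_bid, remaining_budget / expected_clicks_left)

bids = {"a": 2.0, "b": 1.5, "c": 0.8}
print(gsp_allocate(bids, ctrs=[0.10, 0.05]))
print(paced_bid(max_bid=2.0, remaining_budget=30.0, expected_clicks_left=40))
```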
Styles: APA, Harvard, Vancouver, ISO, etc.
29

Ρήγα, Γεωργία. "Βελτιστοποίηση ερωτημάτων με πολλαπλά κριτήρια σε βάσεις δεδομένων". Thesis, 2006. http://nemertes.lis.upatras.gr/jspui/handle/10889/531.

Full text of the source
Abstract:
The problem of multi-criteria query optimization in databases is a difficult and interesting research problem, because it is characterized by conflicting requirements. Each step in answering a query can be executed in more than one way. Several algorithms have been proposed for solving such queries, the most recent being Mariposa, M' and Generate Partitions. Mariposa and M' operate on the Mariposa database system, which lets the user specify the desired delay/cost tradeoff for each query posed. The Mariposa algorithm follows a greedy approach, trying at each step to maximize the "gain", whereas M' uses sets of Pareto-optimal solutions to select the next step instead of the greedy criterion. Finally, the Generate Partitions algorithm partitions the answer space using R-tree structures, achieving very good performance.
The optimization of queries in distributed database systems is known to be subject to delicate trade-offs. For example, the Mariposa database system allows users to specify a desired delay-cost tradeoff (that is, to supply a decreasing function u(d) specifying how much the user is willing to pay in order to receive the query results within time d). Mariposa divides a query graph into horizontal strides, analyzes each stride, and uses a greedy heuristic to find the best plan for all strides.
Styles: APA, Harvard, Vancouver, ISO, etc.
30

Liu, Hsin-Liang, and 劉信良. "Discussion Possibilities to Optimize Water as Bio-Battery Source." Thesis, 2018. http://ndltd.ncl.edu.tw/handle/rfxmy6.

Full text of the source
Abstract:
Master's thesis
National Pingtung University of Science and Technology
Department of Vehicle Engineering
106
In this study, we used a self-designed copper-zinc electrode photoreactive bio-battery device in which the bioelectricity came from Chlorella, Spirogyra and Grammitidaceae. Different artificial environments were set up as the research conditions: the wavelength and illuminance of the light source were varied, different concentrations of TiO2 nanoparticles and domestic sewage were added at the electrodes, and the voltage and current were measured for each configuration. The power of the bio-battery was calculated from these bioelectric data, and the advantages, disadvantages and practical usability of the different configurations were assessed. The experimental results show that Chlorella performs better than Spirogyra and that the intensity and wavelength of light have a direct impact on current. The performance of the bio-battery can be further improved through the electrode material and electrode area. The best-performing bio-battery was Chlorella exposed to a solar irradiance of 87500 Lux, with an average maximum voltage of 0.813 V, a current of 0.898 mA and a power of 0.73 mW. The experiments were precisely controlled to keep the experimental standard deviation below 5%. We confirmed that two groups of bio-batteries connected in series can drive a single red LED lamp bead. However, the volume power density still differs by 107 times from that of solar panels sold on the market. Although the bio-battery is feasible, there is still considerable room for improving its performance.
Styles: APA, Harvard, Vancouver, ISO, etc.
31

TU, MENG-JIA, and 凃孟佳. "Kalman filter optimized by bio-inspired computation for underwater glider localization." Thesis, 2018. http://ndltd.ncl.edu.tw/handle/cejrvm.

Full text of the source
Abstract:
Master's thesis
National Kaohsiung University of Applied Sciences
Department of Electronic Engineering
106
Ocean data sampling is an important technique for observing weather and managing fishery resources in Taiwan. An underwater glider is a kind of unmanned underwater vehicle that can take on long-term and dangerous underwater detection tasks, saving labor cost while intelligently gathering and effectively recording marine data. To cruise at sea for a long time and complete the task of collecting marine information, the navigation system and path planning are significant issues, because the oceanic environment is highly variable, which makes positioning and path planning difficult and affects task completion. Without an adequate positioning system, the mission requirements cannot be met, and the underwater glider could even be lost or damaged. Thus, a good positioning system for the unmanned underwater glider is key to solving the problems of path planning, underwater working time and energy consumption of the glider. Therefore, to improve on the traditional positioning method, this paper proposes a Kalman filter combined with a genetic algorithm to estimate the glider position and mitigate the positioning error. The random search of the genetic algorithm helps discover the optimal Kalman filter settings so that the proposed system can quickly adapt to various environmental variability problems. The results show that the average error of the proposed approach with the Kalman filter is reduced by 27%; when the genetic algorithm is introduced to work with the Kalman filter, the average error is reduced by 33%. This work presents a milestone and is helpful for developing positioning technology for underwater vehicles.
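The thesis tunes a Kalman filter with a genetic algorithm so that the filter adapts to changing ocean conditions. The Python sketch below shows the shape of that idea on a synthetic one-dimensional track: a scalar Kalman filter whose process-noise parameter is chosen by a search loop. A plain random search stands in for the genetic algorithm, and all numbers are invented:

```python
import random

def kalman_1d(measurements, q, r, x0=0.0, p0=1.0):
    """Scalar Kalman filter (random-walk model). q: process noise variance,
    r: measurement noise variance. Returns the filtered positions."""
    x, p = x0, p0
    estimates = []
    for z in measurements:
        p = p + q                      # predict
        k = p / (p + r)                # Kalman gain
        x = x + k * (z - x)            # update with the new fix
        p = (1 - k) * p
        estimates.append(x)
    return estimates

def fitness(q, truth, measurements, r):
    est = kalman_1d(measurements, q, r)
    return sum((e - t) ** 2 for e, t in zip(est, truth)) / len(truth)

# Synthetic drifting position with noisy fixes (all values invented).
rng = random.Random(1)
truth = [0.1 * k for k in range(50)]
meas = [t + rng.gauss(0, 0.5) for t in truth]

# Stand-in for the genetic algorithm of the thesis: a plain random search over q.
best_q, best_err = None, float("inf")
for _ in range(200):
    q = rng.uniform(1e-4, 1.0)
    err = fitness(q, truth, meas, r=0.25)
    if err < best_err:
        best_q, best_err = q, err
print(f"best q = {best_q:.4f}, mean squared error = {best_err:.4f}")
```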
Styles: APA, Harvard, Vancouver, ISO, etc.
32

Hounsell, Mathew Francis. "Using Big Data from TOTOR ETS to optimise public transport operations." Thesis, 2020. http://hdl.handle.net/10453/144073.

Full text of the source
Abstract:
University of Technology Sydney. Institute for Sustainable Futures.
This thesis contributes to development of the trans-disciplinary field of Transport Analytics that aims to better target customer preferences and needs while optimising public transport operations. It demonstrates how use of empirical data acquired from Electronic Ticketing Systems (ETS) such as the Opal card system in Sydney, can provide more accurate and unexpected insights into demand patterns for public transport services. Systems that record every Tap-On and Tap-Off (TOTOR) pair, such as Opal, effectively provide a census of passenger responses to the services delivered, potentially increasing certainty and consensus on key aspects of operations such as required capacity, appropriate frequency and interchanging. Previously, the lack of detailed empirical data led public transport service providers to rely on top-down system-level models of macro-behaviour. The availability of high resolution TOTOR datasets provides an opportunity to develop bottom-up human-level models of micro-behaviour that can then be used to construct more accurate macroscopic system models. As will be shown, this difference in approach can lead to significantly different conclusions about patronage and appropriate service levels. The thesis approaches this new data and analytical opportunities in two ways. Firstly, by acknowledging and addressing concerns about privacy through development of a method to construct privacy-safe datasets; and secondly through new analytical methods to take advantage of the TOTOR data. The travel histories of individual customers contained within TOTOR datasets provide detailed biographical information; and so, to protect the privacy of individuals, access to these datasets has been highly restricted. In response, this thesis describes the methodological barrier created by the need for privacy protection, proposing a method to overcome this. The thesis provides several example case studies undertaken using analytics developed for activity datasets to improve public transport operations. These leverage inherent aspects of the data that were not available in previous data forms. The methods proposed leverages the ability to transform biographical-datasets into privacy-safe activity-datasets through deidentification, disassociation, aggregation and elimination. The method proposes three stages of transformation to create three levels of activity datasets with increasing privacy that can be distributed and shared appropriately between service providers and coordination agencies, to collaborators (such as researchers), and then to the general public. At all times, the research has been carried out within a customer-centric (customer service) approach to public transport service delivery. In the case studies, improved analytics has been shown to assist service partners in analysing and interpreting passenger behaviour in transport operations.
Styles: APA, Harvard, Vancouver, ISO, etc.
33

Yang, Bin [Verfasser]. "Modified particle swarm optimizers and their application to robust design and structural optimization / Bin Yang." 2009. http://d-nb.info/99460453X/34.

Full text of the source
Styles: APA, Harvard, Vancouver, ISO, etc.
34

Lin, Bo-Wei, and 林柏維. "Improvement of Deep Learning Optimizer Based on Metaheuristic Using Batch Update." Thesis, 2019. http://ndltd.ncl.edu.tw/cgi-bin/gs32/gsweb.cgi/login?o=dnclcdr&s=id=%22107NCHU5394029%22.&searchmode=basic.

Full text of the source
Abstract:
Master's thesis
National Chung Hsing University
Department of Computer Science and Engineering
107
Deep learning is able to find the hidden rules of a given problem, so it is often applied in research fields such as classification, prediction and image recognition. To do this, it continuously adjusts weights and biases to train a model that approximates a function similar to the rule of the original problem. Because the effectiveness and efficiency of adjusting these parameters depend on the mechanism of the optimizer, the choice of optimizer determines the strategy of the approximated function. One commonly used optimizer is backpropagation. In addition to plain backpropagation, there are also hybrid methods that combine backpropagation with a metaheuristic algorithm. A metaheuristic is a family of search algorithms for finding the optimal solution in a solution space, and because of this feature it is better than backpropagation at global search. Comparing the two in deep learning, a metaheuristic can find good solutions globally, while backpropagation performs well in local search. By combining them, the hybrid metaheuristic-backpropagation approach outperforms metaheuristic-only and backpropagation-only training in both global and local search. However, the training time of the metaheuristic-backpropagation optimizer is too long. This thesis reduces training time by using batch updates on regression problems. Batch updates achieve better results than the original method on complicated problems, slightly improving accuracy while reducing training time to 87.74% of the original cost.
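As a rough illustration of the hybrid training loop the abstract describes, the sketch below combines a population of random candidates (standing in for the metaheuristic's global search) with a few mini-batch gradient steps per candidate (the backpropagation-style local search) on a one-parameter regression problem. It is not the thesis's exact algorithm, and all constants are invented:

```python
import random

# Toy hybrid optimizer: global proposals refined by mini-batch gradient descent.
# The model is a one-parameter linear fit y = w * x; the thesis applies the idea
# to deep networks, so treat this only as a sketch of the loop structure.

rng = random.Random(0)
data = [(x, 3.0 * x + rng.gauss(0, 0.1)) for x in [i / 100 for i in range(100)]]

def mse(w, batch):
    return sum((w * x - y) ** 2 for x, y in batch) / len(batch)

def refine(w, epochs=5, batch_size=16, lr=0.3):
    """Local search: mini-batch gradient descent on the squared error."""
    for _ in range(epochs):
        rng.shuffle(data)
        for i in range(0, len(data), batch_size):
            batch = data[i:i + batch_size]
            grad = sum(2 * (w * x - y) * x for x, y in batch) / len(batch)
            w -= lr * grad
    return w

# Global search: keep the best of several refined random candidates
# (a stand-in for the metaheuristic population, not the thesis's exact operator).
candidates = [rng.uniform(-5, 5) for _ in range(8)]
refined = [refine(w) for w in candidates]
best = min(refined, key=lambda w: mse(w, data))
print(f"recovered slope ≈ {best:.3f} (true slope 3.0)")
```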
Styles: APA, Harvard, Vancouver, ISO, etc.
35

Javidsharifi, M., T. Niknam, J. Aghaei, and Geev Mokryani. "Multi-objective short-term scheduling of a renewable-based microgrid in the presence of tidal resources and storage devices." 2017. http://hdl.handle.net/10454/15243.

Full text of the source
Abstract:
Daily increasing use of tidal power generation proves its outstanding features as a renewable source. Due to environmental concerns, tidal current energy which has no greenhouse emission attracted researchers’ attention in the last decade. Additionally, the significant potential of tidal technologies to economically benefit the utility in long-term periods is substantial. Tidal energy can be highly forecasted based on short-time given data and hence it will be a reliable renewable resource which can be fitted into power systems. In this paper, investigations of effects of a practical stream tidal turbine in Lake Saroma in the eastern area of Hokkaido, Japan, allocated in a real microgrid (MG), is considered in order to solve an environmental/economic bi-objective optimization problem. For this purpose, an intelligent evolutionary multi-objective modified bird mating optimizer (MMOBMO) algorithm is proposed. Additionally, a detailed economic model of storage devices is considered in the problem. Results show the efficiency of the suggested algorithm in satisfying economic/environmental objectives. The effectiveness of the proposed approach is validated by making comparison with original BMO and PSO on a practical MG.
Iran National Science Foundation; Royal Academy of Engineering Distinguished Visiting Fellowship under Grant DVF1617\6\45
Styles: APA, Harvard, Vancouver, ISO, etc.
36

Wadia, Anosh Porus. "Developing Biomimetic Design Principles for the Highly Optimized and Robust Design of Products and Their Components." Thesis, 2011. http://hdl.handle.net/1969.1/ETD-TAMU-2011-08-9832.

Full text of the source
Abstract:
Engineering design methods focus on developing products that are innovative, robust, and multi-functional. In this context, the term robust refers to a product's ability to accomplish successfully its predetermined functions. Owing to the abundance of optimized and robust biological systems, engineering designers are now looking to nature for inspiration. Researchers believe that biomimetic or bio-inspired engineering systems can leverage the principles, mechanisms, processes, strategies, and/or morphologies of nature's successful designs. Unfortunately, two important problems associated with biomimetic design are a designer's limited knowledge of biology and the difference in biological and engineering terminologies. This research developed a new design tool that addresses these problems and proposes to help engineering designers develop candidate bio-inspired products or solutions. A methodology that helps users infer or extract biomimetic design principles from a given natural system or biomimetic product pair is described in this thesis. The method incorporates and integrates five existing design tools and theories to comprehensively investigate a given natural system or biomimetic product. Subsequently, this method is used to extract biomimetic design principles from 23 biomimetic products and natural systems. It is proposed that these principles have the potential to inspire ideas for candidate biomimetic products that are novel, innovative, and robust. The principle extraction methodology and the identified principles are validated using two separate case studies and a detailed analysis using the validation square framework. In the first case study, two students and the author use the principle extraction methodology to extract characteristics from a natural system and a biomimetic product pair. Results from this case study showed that the methodology effectively and repeatedly identifies system characteristics that exemplify inherent biomimetic design principles. In the second case study, the developed biomimetic design principles are used to inspire a solution for an engineering design problem. The resulting solution and its evaluation show that the design's achieved usefulness is linked to applying the biomimetic design principles. Similar to the TRIZ principles, the biomimetic design principles can inspire ideas for solutions to a given problem. The key difference is that designers using TRIZ leverage the solution strategies of engineering patents, while designers using the biomimetic design principles leverage nature’s solution strategies. The biomimetic design principles are compared to TRIZ and the BioTRIZ matrix.
Styles: APA, Harvard, Vancouver, ISO, etc.
37

Chen, Chao-Pin, and 陳肇斌. "Optical Design and Optimized Rapid Prototyping Process for Automotive Thick-wall Lens." Thesis, 2019. http://ndltd.ncl.edu.tw/cgi-bin/gs32/gsweb.cgi/login?o=dnclcdr&s=id=%22107NCHU5693017%22.&searchmode=basic.

Full text of source
Abstract:
Master's thesis
National Chung Hsing University
Department of Precision Engineering
107
This paper aims to improve the manufacturing process of thick-wall lenses in the automotive industry. The thick-wall lens is optically designed within the size limits of the headlight and in compliance with vehicle regulations. However, the current forming time of a thick-wall lens exceeds 600 seconds, resulting in low production efficiency. This work therefore covers the structural design (geometry/curvature), optical simulation (light pattern/light intensity), heat-source calculation for the mold, plastic flow analysis of the thick-wall lens (mold temperature uniformity control/cooling rate/deformation/cooling time), structural design of a rapid turbulent-flow cooling module for the mold, design and manufacture of the mold, verification on a plastic injection machine, and inspection of the finished-product characteristics. Based on the optical design, mold cooling design, mold flow analysis, and mold production verification, the injection molding process time of the thick-wall lens can be reduced by 35% without affecting the optical properties of the finished product. The productivity and quality of the thick-wall lens are improved by reducing defects and dimensional tolerance problems caused by uneven mold-temperature distribution, which effectively lowers production costs, increases production value, and meets the optical requirements of the automotive industry.
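As background to the cooling-time discussion above, a classical one-dimensional plate estimate (a textbook relation offered only for orientation, not the thesis's mold-flow model) shows why forming time grows with the square of the wall thickness; here s is the wall thickness, α the polymer's thermal diffusivity, and T_melt, T_mold, T_eject the melt, mold-wall, and ejection temperatures:

```latex
% Classical plate cooling-time estimate (illustrative only, not the thesis model):
% s = wall thickness, \alpha = thermal diffusivity of the polymer,
% T_melt / T_mold / T_eject = melt, mold-wall, and ejection temperatures.
\[
  t_{\mathrm{cool}} \approx \frac{s^{2}}{\pi^{2}\alpha}\,
  \ln\!\left(\frac{4}{\pi}\,
  \frac{T_{\mathrm{melt}} - T_{\mathrm{mold}}}{T_{\mathrm{eject}} - T_{\mathrm{mold}}}\right)
\]
```

The quadratic dependence on s is what makes thick-wall optics so slow to mold and why improved mold cooling, rather than simply lowering coolant temperature, is the lever targeted in the work.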
APA, Harvard, Vancouver, ISO, and other styles
38

Peng, Huai-Chien, and 彭淮謙. "Multiobjective Optimized Hybrid Neural Networks for Hexapod Robot Gait Generation and Navigation." Thesis, 2019. http://ndltd.ncl.edu.tw/cgi-bin/gs32/gsweb.cgi/login?o=dnclcdr&s=id=%22107NCHU5441088%22.&searchmode=basic.

Full text of source
Abstract:
Master's thesis
National Chung Hsing University
Department of Electrical Engineering
107
This thesis proposes an evolutionary control approach to realize the navigation of a hexapod robot in unknown environments. The control approach consists of three controllers. The first is a sixteen-node fully connected recurrent neural network (FCRNN) responsible for generating the basic walking gait of the robot. The FCRNN sends six signals to control the six hip joints of the robot, with the two control objectives of fast walking speed and straightness in forward walking. The second is a fuzzy neural network (FNN) controller that makes the robot execute obstacle boundary-following (OBF) behavior, with the objectives of fast walking speed and maintaining a proper distance to an obstacle. The inputs of this FNN come from the robot's distance sensors, which measure the distance between the robot and obstacles, and its two outputs adjust the hip-joint swing amplitudes on the left and right sides of the robot. The last is another FNN controller that makes the robot execute target-searching (TS) behavior; it also adjusts the hip-joint swing amplitudes, according to the error between the orientation of the robot and the target and the change of that error. All three controllers are optimized in simulation using the non-dominated sorting genetic algorithm (NSGA-II) with their own designed objective functions. In the navigation application, this thesis uses a Stargazer sensor to localize the position and orientation of the robot from labels on the ceiling, enabling navigation in unknown environments. In addition to simulations, experiments on a real robot are performed to verify the effectiveness and feasibility of the proposed method.
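Since all three controllers are tuned with NSGA-II, its core ranking step, fast non-dominated sorting, is sketched below as a minimal illustration of how candidate controller parameter sets would be ranked on two objectives (e.g. speed error vs. straightness error); the objective values are made up and this is not the thesis's implementation:

```python
# Minimal sketch of fast non-dominated sorting (the ranking step of NSGA-II), assuming
# both objectives are minimized. Objective pairs below are hypothetical.

def non_dominated_sort(objectives):
    """Return a list of fronts (lists of indices), front 0 being the Pareto-best."""
    n = len(objectives)
    dominated_by = [[] for _ in range(n)]   # solutions that i dominates
    dom_count = [0] * n                     # number of solutions dominating i
    fronts = [[]]
    for i in range(n):
        for j in range(n):
            if i == j:
                continue
            i_le_j = all(a <= b for a, b in zip(objectives[i], objectives[j]))
            i_lt_j = any(a < b for a, b in zip(objectives[i], objectives[j]))
            j_le_i = all(b <= a for a, b in zip(objectives[i], objectives[j]))
            j_lt_i = any(b < a for a, b in zip(objectives[i], objectives[j]))
            if i_le_j and i_lt_j:
                dominated_by[i].append(j)   # i dominates j
            elif j_le_i and j_lt_i:
                dom_count[i] += 1           # j dominates i
        if dom_count[i] == 0:
            fronts[0].append(i)
    k = 0
    while fronts[k]:
        nxt = []
        for i in fronts[k]:
            for j in dominated_by[i]:
                dom_count[j] -= 1
                if dom_count[j] == 0:
                    nxt.append(j)
        fronts.append(nxt)
        k += 1
    return fronts[:-1]

# Hypothetical (speed error, straightness error) pairs for four candidate gaits.
print(non_dominated_sort([(0.2, 0.9), (0.4, 0.4), (0.1, 1.2), (0.5, 0.5)]))
```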
APA, Harvard, Vancouver, ISO, and other styles
39

Liu, En Fu, and 劉恩福. "Optimized study of Compton camera with prompt gamma imaging for proton range verification." Thesis, 2019. http://ndltd.ncl.edu.tw/cgi-bin/gs32/gsweb.cgi/login?o=dnclcdr&s=id=%22107CGU05770010%22.&searchmode=basic.

Full text of source
APA, Harvard, Vancouver, ISO, and other styles
40

Muwawa, Jean Nestor Dahj. "Data mining and predictive analytics application on cellular networks to monitor and optimize quality of service and customer experience." Diss., 2018. http://hdl.handle.net/10500/25875.

Full text of source
Abstract:
This research study focuses on application models of data mining and machine learning for cellular network traffic, with the objective of giving Mobile Network Operators a full view of performance across services, devices, and subscribers. The purpose is to optimize, and to minimize the time needed to detect, service and subscriber behaviour patterns. Different data mining techniques and predictive algorithms have been applied to real cellular network datasets to uncover data usage patterns using specific Key Performance Indicators (KPIs) and Key Quality Indicators (KQIs). The following tools are used to develop the concept: RStudio for machine learning and process visualization, Apache Spark and SparkSQL for data and big-data processing, and clicData for service visualization. Two use cases were studied. In the first, data and predictive analytics are applied in the telecommunications field to address user experience, with the goal of increasing customer loyalty and decreasing churn (customer attrition). Using real cellular network transactions, predictive analytics identify customers who are likely to churn, which would otherwise result in revenue loss. Prediction algorithms and models including classification trees, random forests, neural networks, and gradient boosting are used together with an exploratory data analysis that determines the relationships between predictor variables. The data are split into two parts: a training set to train each model and a testing set to evaluate it. The best-performing model is selected based on prediction accuracy, sensitivity, specificity, and the confusion matrix on the test set. The second use case analyses Service Quality Management (SQM) using modern data mining techniques and the advantages of in-memory big-data processing with Apache Spark and SparkSQL to save on tool investment; on this basis, a low-cost SQM model is proposed and analysed. With the increase in smartphone adoption and access to mobile internet services, applications such as streaming and interactive chat require a certain service level to ensure customer satisfaction. As a result, an SQM framework is developed with a Service Quality Index (SQI) and Key Performance Indicators (KPIs). The research concludes with recommendations and future studies on modern technology applications in telecommunications, including the Internet of Things (IoT), cloud computing, and recommender systems.
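A minimal, self-contained sketch of the churn-prediction workflow described above (train/test split, a Random Forest classifier, and confusion-matrix evaluation) is given below using scikit-learn; the predictor names and synthetic data are assumptions for illustration, not the study's dataset:

```python
# Illustrative churn-prediction sketch: synthetic KPI-style features, a Random Forest,
# and confusion-matrix evaluation on a held-out test set. Not the dissertation's data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix, accuracy_score

rng = np.random.default_rng(0)
n = 1000
# Hypothetical predictors: monthly data volume (GB), dropped-call rate, tenure (months).
X = np.column_stack([rng.gamma(2.0, 5.0, n), rng.beta(2, 20, n), rng.integers(1, 60, n)])
# Synthetic churn label loosely tied to the dropped-call rate, for illustration only.
y = (X[:, 1] + rng.normal(0, 0.05, n) > 0.15).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
pred = model.predict(X_test)

print("accuracy:", accuracy_score(y_test, pred))
print("confusion matrix:\n", confusion_matrix(y_test, pred))
```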
Cellular networks have evolved and are still evolving, from traditional circuit-switched GSM (Global System for Mobile Communication), which supported only voice services and extremely low data rates, to all-packet LTE networks accommodating high-speed data for applications such as video streaming, video conferencing, and heavy torrent downloads, with the roll-out of fifth-generation (5G) networks, intended to support complex technologies such as the IoT (Internet of Things) and high-definition video streaming and projected to carry massive amounts of data, expected in the near future. With high demand for network services and easy access to mobile phones, billions of transactions are performed by subscribers, appearing as SMSs, handovers, voice calls, web browsing, video and audio streaming, and heavy downloads and uploads. Nevertheless, the rapid growth in data traffic and the high requirements of new services pose major challenges to Mobile Network Operators (MNOs) in analysing the big data traffic flowing through the network, so that Quality of Service (QoS) and Quality of Experience (QoE) become difficult to guarantee. Inefficiency in mining and analysing data and in applying predictive intelligence to network traffic can produce a high rate of unhappy subscribers, loss of revenue, and a negative perception of services. Researchers and service providers are therefore investing in data mining, machine learning, and AI (Artificial Intelligence) methods to manage services and experience; this study applies such methods, as summarised above, using RStudio for machine learning, Apache Spark and SparkSQL for data processing, and clicData for visualization.
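For the Service Quality Management use case, the kind of in-memory KPI aggregation that SparkSQL makes inexpensive can be sketched as follows; the file name, schema, and toy Service Quality Index formula are assumptions for illustration, not the framework defined in the dissertation:

```python
# Illustrative SparkSQL sketch of per-cell, per-service KPI aggregation and a toy SQI.
# Path, schema, and thresholds are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("sqm-kpi-sketch").getOrCreate()

# Hypothetical per-session records: cell_id, service, throughput_kbps, latency_ms.
sessions = spark.read.csv("network_sessions.csv", header=True, inferSchema=True)
sessions.createOrReplaceTempView("sessions")

# Aggregate per cell and service, deriving a toy SQI: the share of sessions that meet
# both a throughput floor and a latency ceiling.
kpi = spark.sql("""
    SELECT cell_id,
           service,
           AVG(throughput_kbps)                                   AS avg_throughput_kbps,
           AVG(latency_ms)                                        AS avg_latency_ms,
           AVG(CASE WHEN throughput_kbps >= 1500
                     AND latency_ms <= 100 THEN 1.0 ELSE 0.0 END) AS sqi
    FROM sessions
    GROUP BY cell_id, service
""")
kpi.show(truncate=False)
```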
Electrical and Mining Engineering
M. Tech (Electrical Engineering)
APA, Harvard, Vancouver, ISO, and other styles
41

Abd-Rabbo, Diala. "Beyond hairballs: depicting complexity of a kinase-phosphatase network in the budding yeast." Thèse, 2017. http://hdl.handle.net/1866/19318.

Full text of source
APA, Harvard, Vancouver, ISO, and other styles
