Dissertations / Theses on the topic 'Soft computing'
Consult the top 50 dissertations / theses for your research on the topic 'Soft computing.'
Keukelaar, J. H. D. "Topics in Soft Computing." Doctoral thesis, KTH, Numerical Analysis and Computer Science, NADA, 2002. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-3294.
Medaglia, Andres L. "Simulation Optimization Using Soft Computing." NCSU, 2001. http://www.lib.ncsu.edu/theses/available/etd-20010124-233615.
To date, most of the research in simulation optimization has been focused on single response optimization on the continuous space of input parameters. However, the optimization of more complex systems does not fit this framework. Decision makers often face the problem of optimizing multiple performance measures of systems with both continuous and discrete input parameters. Knowledge of the system previously acquired by experts is seldom incorporated into the simulation optimization engine. Furthermore, when the goals of the system design are stated in natural language or vague terms, current techniques are unable to deal with this situation. For these reasons, we define and study the fuzzy single response simulation optimization (FSO) and fuzzy multiple response simulation optimization (FMSO) problems.
The primary objective of this research is to develop an efficient and robust method for simulation optimization of complex systems with multiple vague goals. This method uses a fuzzy controller to incorporate existing knowledge to generate high quality approximate Pareto optimal solutions in a minimum number of simulation runs.
For comparison purposes, we also propose an evolutionary method for solving the FMSO problem. Extensive computational experiments on the design of a flow line manufacturing system (in terms of tandem queues with blocking) have been conducted. Both methods are able to generate high quality solutions in terms of Zitzler and Thiele's "dominated space" metric. Both methods are also able to generate an even sample of the Pareto front. However, the fuzzy controlled method is more efficient, requiring fewer simulation runs than the evolutionary method to achieve the same solution quality.
To accommodate the complexity of natural language, this research also provides a new Bezier curve-based mechanism to elicit knowledge and express complex vague concepts. To date, this is perhaps the most flexible and efficient mechanism for both automatic and interactive generation of membership functions for convex fuzzy sets.
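The Bezier-curve mechanism for membership functions can be illustrated with a small sketch. This is our own minimal reconstruction of the general idea, not Medaglia's actual implementation: a convex fuzzy set is built from two quadratic Bezier segments (one rising, one falling), and the control points are invented for the example.

```python
def bezier_side(p0, p1, p2, n=100):
    """Sample one quadratic Bezier segment (one side of the membership curve).

    p0, p2 are the endpoints (x, mu); p1 is the shaping control point that
    bends the side without leaving the convex hull of the three points.
    """
    pts = []
    for i in range(n + 1):
        t = i / n
        # Bernstein form of a quadratic Bezier curve.
        x = (1 - t) ** 2 * p0[0] + 2 * (1 - t) * t * p1[0] + t ** 2 * p2[0]
        mu = (1 - t) ** 2 * p0[1] + 2 * (1 - t) * t * p1[1] + t ** 2 * p2[1]
        pts.append((x, mu))
    return pts

def bezier_membership(left, right):
    """Concatenate a rising and a falling Bezier side into one fuzzy set."""
    return left + right[1:]

# A convex fuzzy set "around 5": membership rises from x=2 to full
# membership at x=5, then falls back to zero at x=8. The inner control
# points (3, 0.15) and (7, 0.15) shape how quickly each side rises/falls.
curve = bezier_membership(
    bezier_side((2.0, 0.0), (3.0, 0.15), (5.0, 1.0)),
    bezier_side((5.0, 1.0), (7.0, 0.15), (8.0, 0.0)),
)
```

Because a Bezier curve stays inside the convex hull of its control points, the sampled memberships remain in [0, 1]; dragging the inner control points gives the kind of flexible, interactive shape control the abstract refers to.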
Di Tomaso, Enza. "Soft computing for Bayesian networks." Thesis, University of Bristol, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.409531.
Thomas, Anna. "Error detection for soft computing applications." Thesis, University of British Columbia, 2013. http://hdl.handle.net/2429/44811.
Machaka, Pheeha. "Situation recognition using soft computing techniques." Master's thesis, University of Cape Town, 2012. http://hdl.handle.net/11427/11225.
The last decades have witnessed the emergence of a large number of devices pervasively launched into our daily lives as systems producing and collecting data from a variety of information sources to provide different services to different users via a variety of applications. These include infrastructure management, business process monitoring, crisis management and many other system-monitoring activities. Being processed in real time, these information production/collection activities raise an interest in live performance monitoring, analysis and reporting, and call for data-mining methods for recognizing, predicting, reasoning about and controlling the performance of these systems by managing changes in the system and/or deviations from normal operation. In recent years, soft computing methods and algorithms have been applied to data mining to identify patterns and provide new insight into data. This thesis revisits the issue of situation recognition for systems producing massive datasets by assessing the relevance of soft computing techniques for finding hidden patterns in these systems.
Mitra, Malay. "Medical diagnosis using soft computing technology." Thesis, University of North Bengal, 2018. http://hdl.handle.net/123456789/2722.
Mitra, Malay. "Medical diagnosis using soft computing technologies." Thesis, University of North Bengal, 2018. http://ir.nbu.ac.in/handle/123456789/3659.
Rajkhowa, Priyanka. "Exploiting soft computing for real time performance." College Park, Md. : University of Maryland, 2006. http://hdl.handle.net/1903/3928.
Thesis research directed by: Dept. of Electrical and Computer Engineering. Title from t.p. of PDF. Includes bibliographical references. Published by UMI Dissertation Services, Ann Arbor, Mich. Also available in paper.
Hirschen, Kai. "Soft computing methods for applied shape optimization." Phd thesis, [S.l. : s.n.], 2004. http://elib.tu-darmstadt.de/diss/000499.
Full textAbraham, Ajith 1968. "Hybrid soft computing : architecture optimization and applications." Monash University, Gippsland School of Computing and Information Technology, 2002. http://arrow.monash.edu.au/hdl/1959.1/8676.
Full textShrestha, Pranav Nath. "Applying soft computing to early obesity prediction /." Available to subscribers only, 2006. http://proquest.umi.com/pqdweb?did=1240705441&sid=9&Fmt=2&clientId=1509&RQT=309&VName=PQD.
Full textHiziroglu, Abdulkadir. "A soft computing approach to customer segmentation." Thesis, University of Manchester, 2009. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.503072.
Full textStolpmann, Alexander. "An intelligent soft-computing texture classification system." Thesis, University of South Wales, 2005. https://pure.southwales.ac.uk/en/studentthesis/an-intelligent-softcomputing-texture-classification-system(a43eb831-a799-438b-9112-3ce1df432fe9).html.
Full textFernando, Kurukulasuriya Joseph Tilak Nihal. "Soft computing techniques in power system analysis." Thesis, full-text, 2008. https://vuir.vu.edu.au/2025/.
Full textFernando, Kurukulasuriya Joseph Tilak Nihal. "Soft computing techniques in power system analysis." full-text, 2008. http://eprints.vu.edu.au/2025/1/thesis.pdf.
Full textEsteves, João Trevizoli. "Climate and agrometeorology forecasting using soft computing techniques. /." Jaboticabal, 2018. http://hdl.handle.net/11449/180833.
Abstract: Precipitation, over short periods of time, is a phenomenon associated with high levels of uncertainty and variability. Given its nature, traditional forecasting techniques are expensive and computationally demanding. This work presents a model to forecast the occurrence of rainfall over short ranges of time using Artificial Neural Networks (ANNs), in accumulated periods from 3 to 7 days for each climatic season, mitigating the necessity of predicting its amount. The premise is to reduce the variance and raise the bias of the data, lowering the responsibility of the model, which acts as a filter for quantitative models by removing subsequent occurrences of zero rainfall values, which bias quantitative models and reduce their performance. The model was developed with time series from 10 agriculturally relevant regions in Brazil; these places have the longest available weather time series and are among the most deficient in accurate climate predictions. Sixty years of daily mean air temperature and accumulated precipitation were available and were used to estimate the potential evapotranspiration and water balance; these were the variables used as inputs for the ANN models. The mean accuracy of the model for all the accumulated periods was 78% in summer, 71% in winter, 62% in spring and 56% in autumn. It was identified that the effect of continentality, the effect of altitude and the volume of normal precipitation have a direct impact on the accuracy of the ANNs. The models have ... (Complete abstract: click electronic access below)
Master's
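The occurrence-only formulation above (predicting whether it rains, not how much) reduces to binary classification. As a hedged illustration of that idea only, the sketch below trains a single logistic unit on made-up water-balance-style features; the thesis uses full ANNs trained on 60 years of real data, and the feature names and values here are invented.

```python
import math

def train_logistic(data, lr=0.5, epochs=200):
    """Batch gradient descent for one logistic unit (a stand-in for the ANN
    classifier; rain occurrence is the binary target)."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        gw, gb = [0.0, 0.0], 0.0
        for x, y in data:
            p = 1.0 / (1.0 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b)))
            err = p - y                      # gradient of log-loss w.r.t. z
            gw[0] += err * x[0]
            gw[1] += err * x[1]
            gb += err
        w = [w[0] - lr * gw[0], w[1] - lr * gw[1]]
        b -= lr * gb
    return w, b

def predict(w, b, x):
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

# Invented features: (water-balance anomaly, accumulated-precipitation
# anomaly) over one accumulation window; label 1 = rain occurred.
data = [((-2.0, -1.0), 0), ((-1.0, -2.0), 0), ((1.0, 2.0), 1), ((2.0, 1.0), 1)]
w, b = train_logistic(data)
accuracy = sum(predict(w, b, x) == y for x, y in data) / len(data)
```

On this separable toy set the unit classifies all four windows correctly; the point is only the shape of the pipeline (derived water-balance inputs, binary occurrence output), not the model capacity.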
Struhar, Vaclav. "Improving Soft Real-time Performance of Fog Computing." Licentiate thesis, Mälardalens högskola, Inbyggda system, 2021. http://urn.kb.se/resolve?urn=urn:nbn:se:mdh:diva-55679.
Erman, Maria. "Applications of Soft Computing Techniques for Wireless Communications." Licentiate thesis, Blekinge Tekniska Högskola, Institutionen för tillämpad signalbehandling, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:bth-17314.
Tang, Yuchun. "Granular Support Vector Machines Based on Granular Computing, Soft Computing and Statistical Learning." Digital Archive @ GSU, 2006. http://digitalarchive.gsu.edu/cs_diss/5.
Castro Espinoza, Félix. "A soft computing decision support framework for e-learning." Doctoral thesis, Universitat Politècnica de Catalunya, 2018. http://hdl.handle.net/10803/619802.
Supported by technological development and its impact on everyday activities, e-Learning and b-Learning (Blended Learning) have experienced vertiginous growth, mainly in higher education and training. Their inherent ability to bridge both physical and cultural distances, to disseminate knowledge and to reduce the costs of the teaching-learning process allows them to reach anywhere and anyone. The educational community is divided regarding their role in the future. It is believed that by 2019 half of the world's higher-education courses will be delivered through e-Learning. While supporters claim it will be the educational modality of the future, detractors point out that it is a fashion, that dropout rates are enormous, and that its massification and potentially low quality will cause its downfall, reserving for it an important accompanying role alongside traditional education. There are, however, two interrelated characteristics on which there seems to be consensus. On the one hand, there is the enormous amount of information and evidence that Learning Management Systems (LMS) generate during the electronic educational process, which is the basis of the part of the process that can be automated. In contrast, there is the fundamental role of e-tutors and e-trainers, who are the guarantors of educational quality. They are continually overwhelmed by the need to provide timely and effective feedback to students, to manage an endless number of particular situations and cases that require decision making, and to process the stored information. In this regard, the tools that e-Learning platforms currently provide for obtaining reports and a certain level of monitoring are neither sufficient nor well suited.
It is at this information-trainer point of convergence that current LMS developments are focused, and it is here that the proposed thesis aims to innovate. This research proposes and develops a platform focused on decision-making support in e-Learning environments. Using Soft Computing and data-mining techniques, it extracts knowledge from the data produced and stored by e-Learning systems, allowing the extracted knowledge to be classified, analysed and generalized. It includes tools to identify models of students' learning behaviour and, from them, to predict their future performance and allow trainers to provide adequate feedback. Likewise, students can assess themselves, avoid ineffective behaviour patterns and obtain real hints on how to improve their performance in the course, through routes and strategies derived from the behavioural model of successful students. The methodological basis of these functionalities is Fuzzy Inductive Reasoning (FIR), which is particularly useful for modelling dynamic systems. During this research, the FIR methodology has been improved and enhanced through the inclusion of several algorithms. First, an algorithm called CR-FIR determines the Causal Relevance of the variables involved in modelling students' learning and assessment. In this thesis, CR-FIR has been tested on a broad set of classical benchmark data, as well as on real data sets from different knowledge areas. Second, the detection of atypical behaviour in virtual campuses was addressed through the Generative Topographic Mapping (GTM) approach, a probabilistic alternative to the well-known Self-Organizing Maps.
GTM was used simultaneously for clustering, visualization and outlier detection. The core of the platform has been the development of an algorithm for extracting linguistic rules in a language understandable to education experts, helping them obtain the patterns of students' learning behaviour. To achieve this functionality, the LR-FIR algorithm (Linguistic Rule extraction in FIR) was designed and developed as an extension of FIR that allows both characterizing general behaviour and identifying interesting patterns. When the platform was applied to several real e-Learning courses, the results obtained demonstrated its feasibility and originality. Teachers' perception of the tool's usability is very good, and they consider it could be a valuable resource for mitigating the trainer-time requirements that e-Learning courses demand. The identification of student behaviour models and the prediction processes have been validated as useful by expert trainers. LR-FIR has been applied and evaluated on a wide set of real problems, not all of them from the educational domain, obtaining good results. The structure of the platform suggests that its use is potentially valuable in domains where knowledge management plays a predominant role, or where decision-making processes are a key element, for example e-business, e-marketing and customer management, to mention just a few. The Soft Computing tools used and developed in this research (FIR, CR-FIR, LR-FIR and GTM) have been successfully applied in other real domains, such as music, medicine and climate behaviour.
Ahmad, Alzghoul. "Screening Web Breaks in a Pressroom by Soft Computing." Thesis, Halmstad University, School of Information Science, Computer and Electrical Engineering (IDE), 2008. http://urn.kb.se/resolve?urn=urn:nbn:se:hh:diva-1144.
Full textWeb breaks are considered as one of the most significant runnability problems
in a pressroom. This work concerns the analysis of relation between various
parameters (variables) characterizing the paper, printing press, the printing
process and the web break occurrence. A large number of variables, 61 in
total, obtained off-line as well as measured online during the printing process
are used in the investigation. Each paper reel is characterized by a vector x
of 61 components.
Two main approaches are explored. The first one treats the problem as a
data classification task into "break" and "non break" classes. The procedures
of classifier training, the selection of relevant input variables and the selection
of hyper-parameters of the classifier are aggregated into one process based on
genetic search. The second approach combines procedures of genetic search
based variable selection and data mapping into a low dimensional space. The
genetic search process results into a variable set providing the best mapping
according to some quality function.
The empirical study was performed using data collected at a pressroom
in Sweden. The total number of data points available for the experiments
was equal to 309. Amongst those, only 37 data points represent the web
break cases. The results of the investigations have shown that the linear
relations between the independent variables and the web break frequency
are not strong.
Three important groups of variables were identified, namely Lab data
(variables characterizing paper properties and measured off-line in a paper
mill lab), Ink registry (variables characterizing operator actions aimed to
adjust ink registry) and Web tension. We found that the most important
variables are: Ink registry Y LS MD (adjustments of yellow ink registry
in machine direction on the lower paper side), Air permeability (character-
izes paper porosity), Paper grammage, Elongation MD, and four variables
characterizing web tension: Moment mean, Min sliding Mean, Web tension
variance, and Web tension mean.
The proposed methods were helpful in finding the variables influencing
the occurrence of web breaks and can also be used for solving other industrial
problems.
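The genetic-search variable selection described above can be sketched in miniature. Everything below is a toy stand-in: the bitmask encoding, tournament selection, crossover and mutation follow the generic GA recipe, but the quality function (reward informative variables, penalize subset size) and the variable set are invented, not the thesis's actual break/non-break data.

```python
import random

def quality(mask, informative=(0, 3), penalty=0.1):
    """Toy stand-in for the mapping-quality function: reward selecting the
    (invented) informative variables, penalize large subsets."""
    return sum(mask[i] for i in informative) - penalty * sum(mask)

def genetic_select(n_vars=6, pop_size=20, generations=30, seed=0):
    """Genetic search over variable subsets encoded as 0/1 masks."""
    rng = random.Random(seed)
    # Seed the population with the "all variables" mask plus random masks.
    pop = [[1] * n_vars] + [[rng.randint(0, 1) for _ in range(n_vars)]
                            for _ in range(pop_size - 1)]
    best = max(pop, key=quality)
    for _ in range(generations):
        nxt = [best[:]]                               # elitism: keep the best
        while len(nxt) < pop_size:
            p1 = max(rng.sample(pop, 3), key=quality)  # tournament selection
            p2 = max(rng.sample(pop, 3), key=quality)
            cut = rng.randrange(1, n_vars)             # one-point crossover
            child = p1[:cut] + p2[cut:]
            for i in range(n_vars):                    # bit-flip mutation
                if rng.random() < 0.1:
                    child[i] = 1 - child[i]
            nxt.append(child)
        pop = nxt
        best = max(pop + [best], key=quality)
    return best, quality(best)

best_mask, best_q = genetic_select()
```

With elitism the best score never decreases, so the search settles on a small mask containing the informative variables; in the thesis the same loop is driven by a classification or mapping quality measure instead of this synthetic score.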
Perez, Ruben E. "Soft Computing techniques and applications in aircraft design optimization." Thesis, National Library of Canada = Bibliothèque nationale du Canada, 2001. http://www.collectionscanada.ca/obj/s4/f2/dsk3/ftp05/MQ63122.pdf.
Islam, Nilufar. "Evaluating source water protection strategies : a soft computing approach." Thesis, University of British Columbia, 2010. http://hdl.handle.net/2429/30842.
Darus, Intan Zaurah Mat. "Soft computing adaptive active vibration control of flexible structures." Thesis, University of Sheffield, 2004. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.408305.
McClintock, Shaunna. "Soft computing : a fuzzy logic controlled genetic algorithm environment." Thesis, University of Ulster, 1999. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.268579.
Снитюк, О. І., and Л. В. Бережна. "Сценарний аналіз економічних процесів з використанням технологій Soft Computing" [Scenario analysis of economic processes using Soft Computing technologies]. Thesis, НТУ "ХПІ", 2012. http://repository.kpi.kharkov.ua/handle/KhPI-Press/27332.
Adam, Otmar. "Soft Business Process Management : Darstellung, Überwachung und Verbesserung von Geschäftsprozessen mit Methoden des Soft Computing /" [Soft Business Process Management: representation, monitoring and improvement of business processes with soft computing methods]. Berlin : Logos, 2009. http://d-nb.info/999025317/04.
Kumar, Vikas. "Soft computing approaches to uncertainty propagation in environmental risk management." Doctoral thesis, Universitat Rovira i Virgili, 2008. http://hdl.handle.net/10803/8558.
Full textIn the first part of this thesis different uncertainty propagation methods have been investigated. The first methodology is generalized fuzzy α-cut based on the concept of transformation method. A case study of uncertainty analysis of pollutant transport in the subsurface has been used to show the utility of this approach. This approach shows superiority over conventional methods of uncertainty modelling. A Second method is proposed to manage uncertainty and variability together in risk models. The new hybrid approach combining probabilistic and fuzzy set theory is called Fuzzy Latin Hypercube Sampling (FLHS). An important property of this method is its ability to separate randomness and imprecision to increase the quality of information. A fuzzified statistical summary of the model results gives indices of sensitivity and uncertainty that relate the effects of variability and uncertainty of input variables to model predictions. The feasibility of the method is validated to analyze total variance in the calculation of incremental lifetime risks due to polychlorinated dibenzo-p-dioxins and dibenzofurans (PCDD/F) for the residents living in the surroundings of a municipal solid waste incinerator (MSWI) in Basque Country, Spain.
The second part of this thesis deals with the use of artificial intelligence technique for generating environmental indices. The first paper focused on the development of a Hazzard Index (HI) using persistence, bioaccumulation and toxicity properties of a large number of organic and inorganic pollutants. For deriving this index, Self-Organizing Maps (SOM) has been used which provided a hazard ranking for each compound. Subsequently, an Integral Risk Index was developed taking into account the HI and the concentrations of all pollutants in soil samples collected in the target area. Finally, a risk map was elaborated by representing the spatial distribution of the Integral Risk Index with a Geographic Information System (GIS). The second paper is an improvement of the first work. New approach called Neuro-Probabilistic HI was developed by combining SOM and Monte-Carlo analysis. It considers uncertainty associated with contaminants characteristic values. This new index seems to be an adequate tool to be taken into account in risk assessment processes. In both study, the methods have been validated through its implementation in the industrial chemical / petrochemical area of Tarragona.
The third part of this thesis deals with decision-making framework for environmental risk management. In this study, an integrated fuzzy relation analysis (IFRA) model is proposed for risk assessment involving multiple criteria. The fuzzy risk-analysis model is proposed to comprehensively evaluate all risks associated with contaminated systems resulting from more than one toxic chemical. The model is an integrated view on uncertainty techniques based on multi-valued mappings, fuzzy relations and fuzzy analytical hierarchical process. Integration of system simulation and risk analysis using fuzzy approach allowed to incorporate system modelling uncertainty and subjective risk criteria. In this study, it has been shown that a broad integration of fuzzy system simulation and fuzzy risk analysis is possible.
In conclusion, this study has broadly demonstrated the usefulness of soft computing approaches in environmental risk analysis. The proposed methods could significantly advance practice of risk analysis by effectively addressing critical issues of uncertainty propagation problem.
Real-world problems, especially those involving natural systems, are complex and composed of many indeterminate components, which in many cases show non-linear relationships. The conventional models based on analytical techniques that are currently used to understand and predict the behaviour of such systems can be very complicated and inflexible when dealing with the imprecision and complexity of the system in the real world. Dealing with such systems means facing a high level of uncertainty as well as considering imprecision. Classical models based on numerical analysis and exact or binary-valued logic are characterized by their precision and categorization and are classified as hard computing approaches. In contrast, soft computing techniques, such as probabilistic reasoning and artificial neural networks, are characterized by approximation. While in hard computing imprecision and uncertainty are undesirable properties, in soft computing the tolerance for imprecision and uncertainty is exploited to achieve tractability, low computation cost, effective communication and a high Machine Intelligence Quotient (MIQ). This thesis explores the use of different soft computing approaches to handle uncertainty in environmental risk management. The work is divided into three sections comprising five papers.
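The α-cut idea behind the first part of the thesis can be illustrated with a minimal interval sketch. This is a textbook-style reduction, not the thesis's generalized transformation method: triangular fuzzy numbers for two invented model inputs are cut at several α levels and pushed through a monotone model by interval arithmetic.

```python
def alpha_cut(tri, alpha):
    """Interval [lo, hi] of a triangular fuzzy number (a, b, c) at level alpha."""
    a, b, c = tri
    return (a + alpha * (b - a), c - alpha * (c - b))

def propagate(tri_x, tri_y, alphas):
    """Push two positive fuzzy inputs through the monotone model f(x, y) = x * y.

    For a model increasing in both (positive) inputs, the image of an
    interval pair is simply [lo_x * lo_y, hi_x * hi_y]; a non-monotone model
    would need the transformation method or an optimization at each cut.
    """
    out = {}
    for alpha in alphas:
        (lx, hx), (ly, hy) = alpha_cut(tri_x, alpha), alpha_cut(tri_y, alpha)
        out[alpha] = (lx * ly, hx * hy)
    return out

# Invented inputs: an uncertain source concentration and a dilution factor.
result = propagate((1.0, 2.0, 3.0), (4.0, 5.0, 6.0), [0.0, 0.5, 1.0])
```

At α = 1 the cut collapses to the crisp product of the modal values, and the cuts are nested (the α = 1 interval sits inside the α = 0.5 interval, which sits inside the α = 0 support), which is exactly the structure a fuzzy output requires.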
Ahmed, Mahmud. "The use of advanced soft computing for machinery condition monitoring." Thesis, University of Huddersfield, 2014. http://eprints.hud.ac.uk/id/eprint/25504/.
Full textTurel, Mesut. "Soft computing based spatial analysis of earthquake triggered coherent landslides." Diss., Georgia Institute of Technology, 2011. http://hdl.handle.net/1853/45909.
Full textWang, Lijuan. "Multiphase flow measurement using Coriolis flowmeters incorporating soft computing techniques." Thesis, University of Kent, 2017. https://kar.kent.ac.uk/63877/.
Lubasch, Peer. "Identifikation von Verkehrslasten unter Einsatz von Methoden des soft computing" [Identification of traffic loads using soft computing methods]. Dresden TUDpress, 2009. http://d-nb.info/995246319/04.
Yang, Yingjie. "Investigation on soft computing techniques for airport environment evaluation systems." Thesis, Loughborough University, 2008. https://dspace.lboro.ac.uk/2134/35015.
Gardiner, Michael Robert. "An Evaluation of Soft Processors as a Reliable Computing Platform." BYU ScholarsArchive, 2015. https://scholarsarchive.byu.edu/etd/5509.
Bennert, Reinhard. "Soft Computing-Methoden in Sanierungsprüfung und -controlling : Entscheidungsunterstützung durch Computional Intelligence /" [Soft computing methods in restructuring audits and controlling: decision support through computational intelligence]. Wiesbaden : Dt. Univ.-Verl, 2004. http://www.gbv.de/dms/zbw/386511489.pdf.
Garcia, Raymond Christopher. "A soft computing approach to anomaly detection with real-time applicability." Diss., Georgia Institute of Technology, 2001. http://hdl.handle.net/1853/21808.
Osanlou, Ardeshir. "Soft computing and fractal geometry in signal processing and pattern recognition." Thesis, De Montfort University, 2000. http://hdl.handle.net/2086/4242.
Chen, Mingwu. "Motion planning and control of mobile manipulators using soft computing techniques." Thesis, University of Sheffield, 1997. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.266128.
Baraka, Ali. "Soft-computing and human-centric approaches for modelling complex manufacturing systems." Thesis, University of Sheffield, 2017. http://etheses.whiterose.ac.uk/16183/.
Lucic, Panta. "Modeling Transportation Problems Using Concepts of Swarm Intelligence and Soft Computing." Diss., Virginia Tech, 2002. http://hdl.handle.net/10919/26396.
Full textPh. D.
Bhupatiraju, Murali K. "Direct and inverse models in metal forming : a soft computing approach /." The Ohio State University, 1999. http://rave.ohiolink.edu/etdc/view?acc_num=osu1488190595941775.
Full text
Chaoui, Hicham. "Soft-computing based intelligent adaptive control design of complex dynamic systems." Thèse, Université du Québec à Trois-Rivières, 2011. http://depot-e.uqtr.ca/2676/1/030295752.pdf.
Full text
Amina, Mahdi. "Dynamic non-linear system modelling using wavelet-based soft computing techniques." Thesis, University of Westminster, 2011. https://westminsterresearch.westminster.ac.uk/item/8zwwz/dynamic-non-linear-system-modelling-using-wavelet-based-soft-computing-techniques.
Full text
Ma, Xi. "One-diode photovoltaic model parameter extraction based on Soft-Computing Approaches." Thesis, Mittuniversitetet, Institutionen för elektronikkonstruktion, 2019. http://urn.kb.se/resolve?urn=urn:nbn:se:miun:diva-36302.
Full text
Falcone, Roberto. "Optimal seismic retrofitting of existing RC frames through soft-computing approaches." Doctoral thesis, Università degli studi di Salerno, 2018. http://hdl.handle.net/10556/3092.
Full textPh.D. Thesis proposes a Soft-Computing approach capable of supporting the engineer judgement in the selection and design of the cheapest solution for seismic retrofitting of existing RC framed structure. Chapter 1 points out the need for strengthening the existing buildings as one of the main way of decreasing economic and life losses as direct consequences of earthquake disasters. Moreover, it proposes a wide, but not-exhaustive, list of the most frequently observed deficiencies contributing to the vulnerability of concrete buildings. Chapter 2 collects the state of practice on seismic analysis methods for the assessment the safety of the existing buildings within the framework of a performancebased design. The most common approaches for modeling the material plasticity in the frame non-linear analysis are also reviewed. Chapter 3 presents a wide state of practice on the retrofitting strategies, intended as preventive measures aimed at mitigating the effect of a future earthquake by a) decreasing the seismic hazard demands; b) improving the dynamic characteristics supplied to the existing building. The chapter presents also a list of retrofitting systems, intended as technical interventions commonly classified into local intervention (also known “member-level” techniques) and global intervention (also called “structure-level” techniques) that might be used in synergistic combination to achieve the adopted strategy. In particular, the available approaches and the common criteria, respectively for selecting an optimum retrofit strategy and an optimal system are discussed. Chapter 4 highlights the usefulness of the Soft-Computing methods as efficient tools for providing “objective” answer in reasonable time for complex situation governed by approximation and imprecision. 
In particular, Chapter 4 collects the applications found in the scientific literature for Fuzzy Logic, Artificial Neural Network and Evolutionary Computing in the fields of structural and earthquake engineering with a taxonomic classification of the problems in modeling, simulation and optimization. Chapter 5 “translates” the search for the cheapest retrofitting system into a constrained optimization problem. To this end, the chapter includes a formulation of a novel procedure that assembles a numerical model for seismic assessment of framed structures within a Soft-Computing-driven optimization algorithm capable to minimize the objective function defined as the total initial cost of intervention. The main components required to assemble the procedure are described in the chapter: the optimization algorithm (Genetic Algorithm); the simulation framework (OpenSees); and the software environment (Matlab). Chapter 6 describes step-by-step the flow-chart of the proposed procedure and it focuses on the main implementation aspects and working details, ranging from a clever initialization of the population of candidate solutions up to a proposal of tuning procedure for the genetic parameters. Chapter 7 discusses numerical examples, where the Soft-Computing procedure is applied to the model of multi-storey RC frames obtained through simulated design. A total of fifteen “scenarios” are studied in order to assess its “robustness” to changes in input data. Finally, Chapter 8, on the base of the outcomes observed, summarizes the capabilities of the proposed procedure, yet highlighting its “limitations” at the current state of development. Some possible modifications are discussed to enhance its efficiency and completeness. [edited by author]
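The constrained cost-minimization idea behind such a Genetic-Algorithm-driven procedure can be illustrated with a deliberately simplified sketch. The member costs, the `capacity` check, and the `DEMAND` threshold below are invented stand-ins for the thesis's OpenSees-based seismic assessment, not the author's actual model:

```python
import random

MEMBER_COSTS = [3.0, 1.5, 2.0, 4.0, 1.0, 2.5]  # hypothetical retrofit costs
DEMAND = 3  # toy safety requirement: at least 3 members strengthened

def capacity(design):
    # Stand-in for a non-linear structural analysis: each retrofitted
    # member (gene = 1) is assumed to contribute one unit of capacity.
    return sum(design)

def cost(design):
    # Total initial cost of intervention for a binary retrofit design.
    return sum(c * g for g, c in zip(design, MEMBER_COSTS))

def fitness(design):
    # Cost plus a large penalty when the (toy) seismic check fails,
    # turning the constrained problem into an unconstrained one.
    penalty = 1e3 if capacity(design) < DEMAND else 0.0
    return cost(design) + penalty

def ga(pop_size=30, generations=100, p_mut=0.1, seed=1):
    rng = random.Random(seed)
    n = len(MEMBER_COSTS)
    pop = [[rng.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness)
        parents = pop[: pop_size // 2]            # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = rng.sample(parents, 2)
            cut = rng.randrange(1, n)             # one-point crossover
            child = a[:cut] + b[cut:]
            child = [g ^ (rng.random() < p_mut) for g in child]  # bit-flip mutation
            children.append(child)
        pop = parents + children
    return min(pop, key=fitness)

best = ga()
```

Because the best half of the population is carried over unchanged, the cheapest feasible design found so far is never lost between generations.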
XVI n.s.
Tiwari, A., J. Knowles, E. Avineri, Keshav P. Dahal, and R. Roy. "Applications of Soft Computing." 2006. http://hdl.handle.net/10454/2291.
Full text
Tsai, Pei-Wei, and 蔡沛緯. "Soft Computing for Information Hiding." Thesis, 2007. http://ndltd.ncl.edu.tw/handle/00402221571722548598.
Full text
National Kaohsiung University of Applied Sciences
Master's Program, Institute of Electronic and Information Engineering
95
An innovative optimization algorithm, based on observing and imitating the innate behaviors of creatures, is proposed in this thesis. Drawing on the behaviors of natural organisms to build an optimization algorithm is an effective way of solving optimization problems. The main purpose of this thesis is to establish an optimization algorithm by modeling the observed innate behaviors of a specific species, the cat. We present two sub-models, the tracing mode and the seeking mode, for moving the solution sets from one position to another in the solution space. By properly allocating these two sub-models during the evolution, we imitate the cat's behaviors of tracing moving objects and of resting. Applying these sub-models in the algorithm lets the solution sets move from one position to a new one, yielding the evolutionary algorithm Cat Swarm Optimization (CSO). To investigate the performance of CSO, we compare it with an existing technique, Particle Swarm Optimization (PSO), in experiments on several test functions. According to the experimental results, as expected, CSO locates the global optimum more swiftly and more precisely than PSO does. Furthermore, we apply CSO to information hiding. Increasing the robustness of the hidden information degrades the quality of the media containing it; conversely, the robustness of the hidden information weakens as the cover media become more similar to the original media. Balancing the robustness of the hidden information against the similarity of the cover media is a common trade-off problem in information hiding, and optimization algorithms are very useful for solving this kind of problem.
Based on the results, applying CSO and the optimization approach proposed in this thesis obtains the best search results while at the same time exerting a positive influence on the hidden information. Without doubt, the proper decision can be found with the application of CSO.
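The two sub-models described above can be sketched in a minimal form. The parameter values, the clamping scheme, and the `sphere` benchmark below are illustrative assumptions, not taken from the thesis:

```python
import random

def sphere(x):
    """Benchmark objective: sum of squares, global minimum 0 at the origin."""
    return sum(v * v for v in x)

def cso(objective, dim=5, n_cats=20, mr=0.2, smp=5, srd=0.2,
        iterations=200, bounds=(-5.0, 5.0), seed=42):
    """Minimal Cat Swarm Optimization sketch.

    mr  - mixture ratio: fraction of cats put in tracing mode each iteration
    smp - seeking memory pool: number of candidate copies in seeking mode
    srd - seeking range: relative mutation size in seeking mode
    """
    rng = random.Random(seed)
    lo, hi = bounds
    cats = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_cats)]
    vel = [[0.0] * dim for _ in range(n_cats)]
    best = min(cats, key=objective)[:]
    for _ in range(iterations):
        for i in range(n_cats):
            if rng.random() < mr:
                # Tracing mode: chase the best position found so far,
                # much like a velocity update in PSO.
                for d in range(dim):
                    vel[i][d] += rng.random() * 2.0 * (best[d] - cats[i][d])
                    cats[i][d] = min(hi, max(lo, cats[i][d] + vel[i][d]))
            else:
                # Seeking mode (resting): make perturbed copies of the
                # current position and keep the best of them.
                candidates = [cats[i]]
                for _ in range(smp):
                    copy = [min(hi, max(lo, v + srd * v * rng.uniform(-1, 1)))
                            for v in cats[i]]
                    candidates.append(copy)
                cats[i] = min(candidates, key=objective)
        cur = min(cats, key=objective)
        if objective(cur) < objective(best):
            best = cur[:]
    return best

best = cso(sphere)
```

Including the cat's current position among the seeking-mode candidates keeps each resting cat from getting worse, so the best solution found is monotonically non-increasing over the run.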
DAO, THI-KIEN, and 陶氏建 (Thi-Kien Dao). "Soft Computing with Industrial Applications." Thesis, 2019. http://ndltd.ncl.edu.tw/handle/3vvvve.
Full text
National Kaohsiung University of Science and Technology
Department of Electronic Engineering
107
Industrialization is one of the priority concerns of government departments for socio-economic development, so industrial applications receive considerable attention from the research community. Successful applications have generated enormous economic benefits: working environments improved, heavy labor reduced, incomes increased. Government policies for developing the economy increasingly focus on industrial development. However, many real industrial applications face challenges arising from the demand for industrial development with creative processes, robustness, and efficiency. Soft Computing (SC) is one of the promising solutions to these challenges. SC is an evolving collection of methodologies that aims to exploit tolerance for imprecision, uncertainty, and partial truth to achieve robustness, tractability, closeness to the human mind, and low cost. SC is proving robust in delivering globally optimal solutions and in overcoming the limitations encountered in traditional methods. Soft-computing technologies such as evolutionary algorithms (EA), swarm intelligence (SI), fuzzy logic (FL), rough sets (RS), soft sets (SS), and artificial neural networks (ANN) have been applied successfully to industrial applications. This dissertation attempts to partially bridge the gap between the theory of soft computing and industrial applications. The primary methodology of our research is to learn how to analyze, redesign, and improve SC techniques, e.g., EAs and SI, for solving particular industrial problems. Our approach typically has two parts: the algorithm itself, and the solution obtained by applying it.
For the first part, the algorithmic side of SC, we consider techniques such as parallel computing, compact computing, hybrid computing, multi-objective optimization, and discrete transforms to enhance or improve the methodologies according to the specifications of the problems. The second part is the solution of related industrial applications obtained by applying the analyzed algorithms. The principal problems addressed in this dissertation are optimization tasks such as scheduling, balancing, and topology control. We also discuss the advantages and disadvantages of SC over traditional solutions, and we present early research results of the dissertation. These results include an optimal makespan for job-shop scheduling problems, a topology control scheme for Wireless Sensor Networks (WSN), a solution to the economic load dispatch problem, and an optimal base-station (BS) formation in WSNs. Given the opportunities and challenges of industrial application development, the dissertation's results should be feasible for practical application in industrial life.
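One of the results listed above, minimizing the makespan of a job-shop schedule with an evolutionary search, can be sketched on a toy instance. The three-job instance and the mutation-only search below are illustrative assumptions, not the dissertation's actual algorithm or data:

```python
import random

# Toy job-shop instance: each job is an ordered list of (machine, duration)
# operations. Three jobs on three machines, purely for illustration.
JOBS = [
    [(0, 3), (1, 2), (2, 2)],
    [(0, 2), (2, 1), (1, 4)],
    [(1, 4), (2, 3), (0, 1)],
]
N_MACHINES = 3

def makespan(seq):
    """Decode a job-repetition sequence: the k-th occurrence of job j
    schedules job j's k-th operation as early as possible."""
    next_op = [0] * len(JOBS)
    job_ready = [0] * len(JOBS)
    mach_ready = [0] * N_MACHINES
    for j in seq:
        m, d = JOBS[j][next_op[j]]
        start = max(job_ready[j], mach_ready[m])
        job_ready[j] = mach_ready[m] = start + d
        next_op[j] += 1
    return max(job_ready)

def evolve(iterations=500, seed=7):
    """Mutation-only evolutionary search over operation sequences."""
    rng = random.Random(seed)
    seq = [j for j in range(len(JOBS)) for _ in JOBS[j]]
    rng.shuffle(seq)
    best, best_ms = seq, makespan(seq)
    for _ in range(iterations):
        cand = best[:]
        a, b = rng.randrange(len(cand)), rng.randrange(len(cand))
        cand[a], cand[b] = cand[b], cand[a]   # swap mutation
        ms = makespan(cand)
        if ms <= best_ms:                      # accept ties to drift on plateaus
            best, best_ms = cand, ms
    return best, best_ms

best_seq, best_ms = evolve()
```

The job-repetition encoding guarantees every mutated sequence decodes to a feasible schedule, which is why swap mutation alone suffices for this sketch.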
Saad, A., E. Avineri, Keshav P. Dahal, M. Sarfraz, and R. Roy. "Soft Computing in Industrial Applications." 2007. http://hdl.handle.net/10454/2290.
Full text