Academic literature on the topic "Expectation-Minimization"

Create an accurate citation in APA, MLA, Chicago, Harvard, and other styles

Choose a source type:

Consult the topical lists of articles, books, theses, conference papers, and other academic sources on the topic "Expectation-Minimization".

Next to every source in the list of references there is an "Add to bibliography" button. Click this button, and we will automatically generate the bibliographic reference for the chosen work in whichever citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Expectation-Minimization"

1

Sekine, Jun. "Dynamic Minimization of Worst Conditional Expectation of Shortfall". Mathematical Finance 14, no. 4 (October 2004): 605–18. http://dx.doi.org/10.1111/j.0960-1627.2004.00207.x.

2

Power, J. F., and M. C. Prystay. "Expectation Minimum (EM): A New Principle for the Solution of Ill-Posed Problems in Photothermal Science". Applied Spectroscopy 49, no. 6 (June 1995): 709–24. http://dx.doi.org/10.1366/0003702953964499.

Abstract:
The expectation-minimum (EM) principle is a new strategy for recovering robust solutions to the ill-posed inverse problems of photothermal science. The expectation-minimum principle uses the addition of well-characterized random noise to a model basis to be fitted to the experimental response by linear minimization or projection techniques. The addition of noise to the model basis improves the conditioning of the basis by many orders of magnitude. Multiple projections of the data onto the basis in the presence of noise are averaged, to give the solution vector as an expectation value which reliably estimates the global minimum solution for general cases, while the conventional approaches fail. This solution is very stable in the presence of random error on the data. The expectation-minimum principle has been demonstrated in conjunction with several projection algorithms. The nature of the solutions recovered by the expectation minimum principle is nearly independent of the minimization algorithms used and depends principally on the noise level set in the model basis.
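
The abstract describes an algorithmic recipe that is easy to prototype. Below is a minimal Python sketch of the idea as we read it: perturb an ill-conditioned basis with well-characterized noise, solve the linear least-squares projection, and average the solutions so the estimate is recovered as an expectation value. The exponential basis, noise level, and solver choice are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative ill-posed problem: near-collinear decaying exponentials.
t = np.linspace(0.0, 1.0, 100)
rates = np.array([1.0, 1.1, 1.2, 1.3])          # nearly indistinguishable decay rates
A = np.exp(-np.outer(t, rates))                 # badly conditioned model basis
x_true = np.array([0.0, 1.0, 0.0, 0.5])
y = A @ x_true + 0.01 * rng.standard_normal(t.size)

# Expectation-minimum idea: add known random noise to the basis, solve the
# linear minimization, and average the solutions over many realizations.
n_trials, basis_noise = 500, 0.05
solutions = np.empty((n_trials, rates.size))
for k in range(n_trials):
    A_k = A + basis_noise * rng.standard_normal(A.shape)
    solutions[k], *_ = np.linalg.lstsq(A_k, y, rcond=None)

x_em = solutions.mean(axis=0)                   # solution as an expectation value
```

In expectation, the basis perturbation acts much like Tikhonov regularization, which is one way to understand the improved conditioning the abstract reports.
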
3

Cheung, Ka Chun. "Optimal Reinsurance Revisited – A Geometric Approach". ASTIN Bulletin 40, no. 1 (May 2010): 221–39. http://dx.doi.org/10.2143/ast.40.1.2049226.

Abstract:
In this paper, we reexamine the two optimal reinsurance problems studied in Cai et al. (2008), in which the objectives are to find the optimal reinsurance contracts that minimize the value-at-risk (VaR) and the conditional tail expectation (CTE) of the total risk exposure under the expectation premium principle. We provide a simpler and more transparent approach to solve these problems by using intuitive geometric arguments. The usefulness of this approach is further demonstrated by solving the VaR-minimization problem when the expectation premium principle is replaced by Wang's premium principle.
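
The quantities being minimized are straightforward to reproduce empirically. The sketch below sets up VaR minimization over the retention level of a stop-loss treaty under the expectation premium principle, as the abstract describes the problem; the loss distribution, loading factor, and retention grid are our illustrative assumptions, not anything from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.pareto(3.0, 200_000) * 10.0      # heavy-tailed loss sample (illustrative)
alpha, theta = 0.95, 0.2                 # confidence level, premium loading

def total_exposure(d):
    """Retained loss under a stop-loss treaty with retention d, plus the
    reinsurance premium under the expectation premium principle."""
    ceded = np.maximum(X - d, 0.0)
    premium = (1.0 + theta) * ceded.mean()
    return np.minimum(X, d) + premium

def var(sample, a):
    return np.quantile(sample, a)

def cte(sample, a):
    """Conditional tail expectation: mean loss beyond the VaR."""
    q = np.quantile(sample, a)
    return sample[sample > q].mean()

# Crude one-dimensional scan over the retention level d.
grid = np.linspace(1.0, 60.0, 200)
best_d = min(grid, key=lambda d: var(total_exposure(d), alpha))
print(best_d, var(total_exposure(best_d), alpha))
```
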
4

Chen, Fenge, Xingchun Peng, and Wenyuan Wang. "Risk minimization for an insurer with investment and reinsurance via g-expectation". Communications in Statistics - Theory and Methods 48, no. 20 (February 20, 2019): 5012–35. http://dx.doi.org/10.1080/03610926.2018.1504077.

5

Cohen, Shay B., and Noah A. Smith. "Empirical Risk Minimization for Probabilistic Grammars: Sample Complexity and Hardness of Learning". Computational Linguistics 38, no. 3 (September 2012): 479–526. http://dx.doi.org/10.1162/coli_a_00092.

Abstract:
Probabilistic grammars are generative statistical models that are useful for compositional and sequential structures. They are used ubiquitously in computational linguistics. We present a framework, reminiscent of structural risk minimization, for empirical risk minimization of probabilistic grammars using the log-loss. We derive sample complexity bounds in this framework that apply both to the supervised setting and the unsupervised setting. By making assumptions about the underlying distribution that are appropriate for natural language scenarios, we are able to derive distribution-dependent sample complexity bounds for probabilistic grammars. We also give simple algorithms for carrying out empirical risk minimization using this framework in both the supervised and unsupervised settings. In the unsupervised case, we show that the problem of minimizing empirical risk is NP-hard. We therefore suggest an approximate algorithm, similar to expectation-maximization, to minimize the empirical risk.
6

Galkanov, Allaberdi G. "ABOUT INNOVATIVE METHODS OF NUMERICAL DATA AVERAGING". RSUH/RGGU Bulletin. Series Information Science. Information Security. Mathematics, no. 2 (2023): 81–101. http://dx.doi.org/10.28995/2686-679x-2023-2-81-101.

Abstract:
Numerical data refers to any finite set of data in the form of numbers, vectors, functions, or matrices representing the results of an experiment or field observations. The averaging of deterministic values, random variables, and matrices is treated from a single point of view, as minimization of a function in the form of a generalized least-squares problem. A new definition of the mean is given, and three generalizations of averages are obtained as solutions to the minimization problem. Whereas the known averages are the harmonic, geometric, arithmetic, and quadratic means (and perhaps a few others), the first generalization alone already yields an uncountable set of averages, and two new averages are derived from it. For particular types of averages arising from the first generalization, interpretations are given in terms of absolute and relative deviations (errors). A sufficient condition for the mean is proved for all averages, inequalities for six averages are proved, and a "law of nine numbers" is established. The concepts of a complex average and of an optimal mean are introduced. New definitions of mathematical expectation and variance, together with their generalizations, are proposed; in the resulting family of mathematical expectations, only the classical mathematical expectation turns out to be linear. Applying the generalized mathematical expectation leads to two new distributions in probability theory: the harmonic and relative distributions of a continuous random variable, which are determined and presented analytically.
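
The paper's starting point, that familiar means arise as minimizers of weighted least-squares objectives, can be checked numerically. The sketch below verifies three classical identities of this kind; the specific generalizations proposed in the paper are not reproduced here.

```python
import numpy as np
from scipy.optimize import minimize_scalar

x = np.array([2.0, 3.0, 6.0])
bounds = (x.min(), x.max())   # every mean of x lies in this interval

# Arithmetic mean: argmin_m sum (x_i - m)^2
m_a = minimize_scalar(lambda m: np.sum((x - m) ** 2),
                      bounds=bounds, method="bounded").x

# Harmonic mean: the same least-squares problem weighted by 1/x_i
m_h = minimize_scalar(lambda m: np.sum((x - m) ** 2 / x),
                      bounds=bounds, method="bounded").x

# Geometric mean: least squares on the logarithmic scale
m_g = minimize_scalar(lambda m: np.sum((np.log(x) - np.log(m)) ** 2),
                      bounds=bounds, method="bounded").x

print(m_a, m_h, m_g)   # ~3.667 (arithmetic), 3.0 (harmonic), ~3.302 (geometric)
```
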
7

Fotakis, Dimitris, Piotr Krysta, and Carmine Ventre. "Efficient Truthful Scheduling and Resource Allocation through Monitoring". Proceedings of the AAAI Conference on Artificial Intelligence 35, no. 6 (May 18, 2021): 5423–31. http://dx.doi.org/10.1609/aaai.v35i6.16683.

Abstract:
We study the power and limitations of the Vickrey-Clarke-Groves mechanism with monitoring (VCGmon) for cost minimization problems with objective functions that are more general than the social cost. We identify a simple and natural sufficient condition for VCGmon to be truthful for general objectives. As a consequence, we obtain that for any cost minimization problem with non-decreasing objective μ, VCGmon is truthful, if the allocation is Maximal-in-Range and μ is 1-Lipschitz (e.g., μ can be the Lp-norm of the agents’ costs, for any p ≥ 1 or p = ∞). We apply VCGmon to scheduling on restricted-related machines and obtain a polynomial-time truthful-in-expectation 2-approximate (resp. O(1)-approximate) mechanism for makespan (resp. Lp-norm) minimization. Moreover, applying VCGmon, we obtain polynomial-time truthful O(1)-approximate mechanisms for some fundamental bottleneck network optimization problems with single-parameter agents. On the negative side, we provide strong evidence that VCGmon could not lead to computationally efficient truthful mechanisms with reasonable approximation ratios for binary covering social cost minimization problems. However, we show that VCGmon results in computationally efficient approximately truthful mechanisms for binary covering problems.
8

Ansley, Craig F., and Robert Kohn. "On the equivalence of two stochastic approaches to spline smoothing". Journal of Applied Probability 23, A (1986): 391–405. http://dx.doi.org/10.2307/3214367 (also registered as http://dx.doi.org/10.1017/s002190020011722x).

Abstract:
Wahba (1978) and Weinert et al. (1980), using different models, show that an optimal smoothing spline can be thought of as the conditional expectation of a stochastic process observed with noise. This observation leads to efficient computational algorithms. By going back to the Hilbert space formulation of the spline minimization problem, we provide a framework for linking the two different stochastic models. The last part of the paper reviews some new efficient algorithms for spline smoothing.
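
The conditional-expectation view described here is easy to illustrate with a Gaussian process observed with noise. In the sketch below, an RBF kernel stands in for the integrated-Wiener kernel that would make the posterior mean an exact smoothing spline, so this is an analogy under our own assumptions rather than the paper's construction.

```python
import numpy as np

def gp_posterior_mean(t_obs, y, t_new, kernel, noise_var):
    """Posterior (conditional) mean of a Gaussian process observed with noise:
    E[f(t_new) | y] = K(t_new, t_obs) (K(t_obs, t_obs) + noise_var * I)^{-1} y."""
    K = kernel(t_obs[:, None], t_obs[None, :]) + noise_var * np.eye(t_obs.size)
    K_star = kernel(t_new[:, None], t_obs[None, :])
    return K_star @ np.linalg.solve(K, y)

# RBF kernel as a simple surrogate for the spline-inducing covariance.
rbf = lambda a, b: np.exp(-0.5 * (a - b) ** 2 / 0.1 ** 2)

t = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * t) + 0.2 * np.random.default_rng(0).standard_normal(50)
smooth = gp_posterior_mean(t, y, t, rbf, noise_var=0.04)   # smoothed curve
```
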
9

Weng, Wenting, and Wen Luo. "A Comparative Analysis of Data Mining Methods and Hierarchical Linear Modeling Using PISA 2018 Data". International Journal of Database Management Systems 15, no. 2/3 (June 27, 2023): 1–16. http://dx.doi.org/10.5121/ijdms.2023.15301.

Abstract:
Educational research often encounters clustered data sets, where observations are organized into multilevel units, consisting of lower-level units (individuals) nested within higher-level units (clusters). However, many studies in education utilize tree-based methods like Random Forest without considering the hierarchical structure of the data sets. Neglecting the clustered data structure can result in biased or inaccurate results. To address this issue, this study aimed to conduct a comprehensive survey of three tree-based data mining algorithms and hierarchical linear modeling (HLM). The study utilized the Programme for International Student Assessment (PISA) 2018 data to compare different methods, including non-mixed-effects tree models (e.g., Random Forest) and mixed-effects tree models (e.g., the random effects expectation minimization recursive partitioning method, mixed-effects Random Forest), as well as the HLM approach. Based on the findings of this study, mixed-effects Random Forest demonstrated the highest prediction accuracy, while the random effects expectation minimization recursive partitioning method had the lowest prediction accuracy. However, it is important to note that tree-based methods limit deep interpretation of the results. Therefore, further analysis is needed to gain a more comprehensive understanding. In comparison, the HLM approach retains its value in terms of interpretability. Overall, this study offers valuable insights for selecting and utilizing suitable methods when analyzing clustered educational datasets.

Theses on the topic "Expectation-Minimization"

1

Barbault, Pierre. "Un ticket pour le sparse : de l'estimation des signaux et des paramètres en problèmes inverses bernoulli-gaussiens". Electronic Thesis or Diss., université Paris-Saclay, 2023. http://www.theses.fr/2023UPASG049.

Abstract:
Magneto/electroencephalography (M/EEG) imaging can be used to reconstruct the focal points of cerebral activity by measuring the electromagnetic field they produce. Even though the characteristic time of the recorded signals is short enough for a linear acquisition model to be considered, the number of possible sources remains very large compared to the number of sensors; the problem is therefore ill-posed and, moreover, large-scale. A common assumption that restores well-posedness, and that makes sense for neurons, is that the sources are sparse, i.e. that the number of non-zero values is very small. The problem is then modeled probabilistically with a Bernoulli-Gaussian (BG) prior on the sources. Many methods can solve such a problem, but most require knowledge of the parameters of the BG law. The objective of this thesis is to propose a completely unsupervised approach that estimates the parameters of the BG law and, where possible, the sources themselves. To this end, Expectation-Maximization (EM) algorithms are explored. First, the simplest case is treated: denoising, where the linear operator is the identity. In this framework, three algorithms are proposed: a method of moments based on the data statistics, an EM algorithm, and a joint source-and-parameter estimation algorithm. The results show that the EM algorithm initialized by the method of moments is the best candidate for parameter estimation. Second, the previous results are extended to the general case of an arbitrary linear operator through the introduction of a latent variable. By decoupling the sources from the observations, this variable makes it possible to derive so-called "latent" algorithms that alternate between a gradient-descent step and a denoising step corresponding exactly to the problem treated previously. The results show that the most effective strategy is to use the "latent" joint estimate to initialize the "latent" EM. Finally, the last part of this work is devoted to theoretical considerations on the choice of joint or marginal estimators of the support and/or the sources in the supervised case. This work shows that the cost functions associated with the marginal problems can be bounded by those associated with the joint problems through a reparameterization, which leads to a general estimation strategy based on initializing marginal estimation algorithms with joint estimation algorithms.
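
In the denoising case the thesis treats first (identity operator), the observations follow a two-component zero-mean Gaussian mixture, and EM applies directly. The sketch below is our own minimal reading of that setup: fit the mixture by EM, then read the Bernoulli-Gaussian parameters off the fit. Initialization and iteration count are arbitrary choices, not the thesis's algorithm.

```python
import numpy as np

def bg_em(y, n_iter=200):
    """EM sketch for Bernoulli-Gaussian denoising: y_i = s_i * z_i + n_i with
    s_i ~ Bernoulli(rho), z_i ~ N(0, sx2), n_i ~ N(0, sn2). Then y_i follows a
    zero-mean Gaussian mixture with variances v0 = sn2 (inactive) and
    v1 = sx2 + sn2 (active); we fit the mixture and read off (rho, sx2, sn2)."""
    v = np.var(y)
    rho, v1, v0 = 0.5, 2.0 * v, 0.1 * v              # crude initialization
    for _ in range(n_iter):
        # E-step: posterior probability that each sample is active.
        g1 = rho * np.exp(-0.5 * y ** 2 / v1) / np.sqrt(v1)
        g0 = (1.0 - rho) * np.exp(-0.5 * y ** 2 / v0) / np.sqrt(v0)
        gam = g1 / (g1 + g0)
        # M-step: re-estimate the mixture weight and variances.
        rho = gam.mean()
        v1 = np.sum(gam * y ** 2) / np.sum(gam)
        v0 = np.sum((1.0 - gam) * y ** 2) / np.sum(1.0 - gam)
    sn2, sx2 = v0, max(v1 - v0, 0.0)
    return rho, sx2, sn2

# Synthetic check: 10% active sources, source std 2.0, noise std 0.5.
rng = np.random.default_rng(1)
s = rng.random(100_000) < 0.1
y = s * rng.normal(0.0, 2.0, s.size) + rng.normal(0.0, 0.5, s.size)
print(bg_em(y))   # roughly (0.1, 4.0, 0.25)
```
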

Books on the topic "Expectation-Minimization"

1

Annala, Helka. Alisuoriutumiseen liittyvistä tekijöistä ja niihin vaikuttamisesta: Odotusvaikutusteorian sovellus alisuoriutumisen lieventämiseen = Factors associated with underachievement and ways of influencing them : application of the theory of expectation effects on minimization of underachievement. Oulu: Oulun yliopisto, 1986.


Book chapters on the topic "Expectation-Minimization"

1

O’Sullivan, Joseph A. "Alternating Minimization Algorithms: From Blahut-Arimoto to Expectation-Maximization". In Codes, Curves, and Signals, 173–92. Boston, MA: Springer US, 1998. http://dx.doi.org/10.1007/978-1-4615-5121-8_13.

2

Nordin, Nurdiana. "Monitoring Organic Synthesis via Density Functional Theory". In Density Functional Theory - New Perspectives and Applications [Working Title]. IntechOpen, 2023. http://dx.doi.org/10.5772/intechopen.112290.

Abstract:
A preliminary molecular structure for a system, which may or may not be known, is the first step in a typical investigation using ab initio techniques. A stable system is generated by a geometry search using an energy minimization method (usually a local minimum or transition state). Subsequently, it is easy to obtain any energetic properties (such as atomization energies, formation temperatures, binding energies) or expectation values or quantifiable quantities from the wave function of the molecular system and its fragments. The stability of such a system can be determined by considering the second derivative of the energy with respect to the spatial coordinates (also known as the Hessian matrix). It could be a goal to find out how the system interacts with other systems and eventually to decipher the synthesis pathways. Therefore, this chapter presents a recent application of approaches based on density functional theory (DFT) to study chemical processes at the catalytic sites of enzymes. The focus is on the interaction of small organic molecules with the ability to inhibit a catalytic cysteine of the malaria parasite, in the area of drug design.

Conference papers on the topic "Expectation-Minimization"

1

Gripsy, J. Viji, and A. Jayanthiladevi. "Energy Hole Minimization in Wireless Mobile Ad Hoc Networks Using Enhanced Expectation-Maximization". In 2023 9th International Conference on Advanced Computing and Communication Systems (ICACCS). IEEE, 2023. http://dx.doi.org/10.1109/icaccs57279.2023.10112728.

2

Shanbhag, Uday V., and Farzad Yousefian. "Zeroth-order randomized block methods for constrained minimization of expectation-valued Lipschitz continuous functions". In 2021 Seventh Indian Control Conference (ICC). IEEE, 2021. http://dx.doi.org/10.1109/icc54714.2021.9703135.

3

Zhang, Hu, Pan Zhou, Yi Yang, and Jiashi Feng. "Generalized Majorization-Minimization for Non-Convex Optimization". In Twenty-Eighth International Joint Conference on Artificial Intelligence {IJCAI-19}. California: International Joint Conferences on Artificial Intelligence Organization, 2019. http://dx.doi.org/10.24963/ijcai.2019/591.

Abstract:
Majorization-Minimization (MM) algorithms optimize an objective function by iteratively minimizing a majorizing surrogate and offer an attractively fast convergence rate for convex problems. However, their convergence behavior for non-convex problems remains unclear. In this paper, we propose a novel MM surrogate function, moving from strictly upper-bounding the objective to bounding the objective in expectation. With this generalized surrogate conception, we develop a new optimization algorithm, termed SPI-MM, that leverages the recently proposed SPIDER for more efficient non-convex optimization. We prove that for finite-sum problems, the SPI-MM algorithm converges to a stationary point within deterministic and lower stochastic gradient complexity. To the best of our knowledge, this work gives the first non-asymptotic convergence analysis for MM-alike algorithms in general non-convex optimization. Extensive empirical studies on non-convex logistic regression and sparse PCA demonstrate the advantageous efficiency of the proposed algorithm and validate our theoretical results.
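
The paper generalizes the classical deterministic MM template, which is worth seeing in its simplest form. The sketch below applies standard MM (not the paper's SPI-MM) to least-absolute-deviations regression, majorizing each |r_i| by the quadratic r_i^2 / (2|r_i^k|) + |r_i^k| / 2, which is tight at the current residual, so every MM step reduces to a weighted least-squares solve.

```python
import numpy as np

def lad_regression_mm(A, y, n_iter=50, eps=1e-8):
    """Least-absolute-deviations regression by majorization-minimization:
    each iteration minimizes the quadratic majorizer of sum_i |y_i - a_i^T x|,
    i.e. solves a weighted least-squares problem with weights 1/(2|r_i|)."""
    x = np.linalg.lstsq(A, y, rcond=None)[0]          # L2 solution as a start
    for _ in range(n_iter):
        r = y - A @ x
        w = 1.0 / (2.0 * np.maximum(np.abs(r), eps))  # majorizer weights
        Aw = A * w[:, None]                           # rows of A scaled by w
        x = np.linalg.solve(A.T @ Aw, Aw.T @ y)       # weighted normal equations
    return x

rng = np.random.default_rng(0)
A = np.column_stack([np.ones(200), rng.standard_normal(200)])
y = A @ np.array([1.0, 2.0]) + rng.standard_t(df=2, size=200)  # heavy-tailed noise
print(lad_regression_mm(A, y))   # close to [1, 2], robust to outliers
```
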
4

Zhang, Xinyu, Yaohang Li, Arvid Myklebust, and Paul Gelhausen. "Optimization of Geometrically Trimmed B-Spline Surfaces". In ASME 2005 International Mechanical Engineering Congress and Exposition. ASMEDC, 2005. http://dx.doi.org/10.1115/imece2005-81862.

Abstract:
Unlike the visual trimming of B-spline surfaces, which hides unwanted portions in rendering, the geometric trimming approach provides a mathematically clean representation without redundancy. However, the process may lead to significant deviation from the corresponding portion on the original surface. Optimization is required to minimize approximation errors and obtain higher accuracy. In this paper, we describe the application of a novel global optimization method, so-called hybrid Parallel Tempering (PT) and Simulated Annealing (SA) method, for the minimization of B-spline surface representation errors. The high degree of freedom within the configuration of B-spline surfaces as well as the “rugged” landscapes of objective functions complicate the error minimization process. The hybrid PT/SA method, which is an effective algorithm to overcome the slow convergence, waiting dilemma, and initial value sensitivity, is a good candidate for optimizing geometrically trimmed B-spline surfaces. Examples of application to geometrically trimmed wing components are presented and discussed. Our preliminary results confirm our expectation.
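
The hybrid PT/SA method itself is not reproduced here, but the simulated-annealing kernel on which such hybrids build is compact. Below is a generic sketch with geometric cooling and a Metropolis acceptance rule; all constants and the test objective are illustrative assumptions, not the paper's configuration.

```python
import numpy as np

def simulated_annealing(f, x0, n_iter=5000, t0=1.0, cooling=0.999,
                        step=0.1, seed=0):
    """Generic simulated annealing: random perturbations are always accepted
    when they lower the objective, and with Metropolis probability
    exp(-(fc - fx) / T) otherwise; T decays geometrically."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    fx = f(x)
    best_x, best_f = x.copy(), fx
    t = t0
    for _ in range(n_iter):
        cand = x + step * rng.standard_normal(x.shape)
        fc = f(cand)
        if fc < fx or rng.random() < np.exp(-(fc - fx) / t):
            x, fx = cand, fc
            if fx < best_f:
                best_x, best_f = x.copy(), fx
        t *= cooling
    return best_x, best_f

# Rosenbrock function as a stand-in for a rugged fitting-error landscape.
rosen = lambda v: (1.0 - v[0]) ** 2 + 100.0 * (v[1] - v[0] ** 2) ** 2
print(simulated_annealing(rosen, [-1.0, 1.0]))
```
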
5

Clarkson, Eric, Jack Denny, Harrison Barrett, Craig Abbey, and Brandon Gallas. "Night-sky reconstructions for linear digital imaging systems". In Signal Recovery and Synthesis. Washington, D.C.: Optica Publishing Group, 1998. http://dx.doi.org/10.1364/srs.1998.sthc.5.

Abstract:
In tomographic and other digital imaging systems the goal is often to reconstruct an object function from a finite amount of noisy data generated by that function through a system operator. One way to determine the reconstructed function is to minimize the distance between the noiseless data vector it would generate via the system operator, and the data vector created through the system by the real object and noise. The former we will call the reconstructed data vector, and the latter the actual data vector. A reasonable constraint to place on this minimization problem is to require that the reconstructed function be non-negative everywhere. Different measures of distance in data space then result in different reconstruction methods. For example, the ordinary Euclidean distance results in a positively constrained least squares reconstruction, while the Kullback-Leibler distance results in a Poisson maximum likelihood reconstruction. In many cases though, if the reconstruction algorithm is continued until it converges, the end result is a reconstructed function that consists of many point-like structures and little else. These are called night-sky reconstructions, and they are usually avoided by stopping the reconstruction algorithm early or using regularization. The expectation-maximization algorithm for Poisson maximum likelihood reconstructions is an example of this situation.
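
The expectation-maximization algorithm mentioned at the end of this abstract has a famously compact multiplicative form for Poisson maximum likelihood (MLEM, equivalently Richardson-Lucy). A minimal sketch follows, with a random system matrix standing in for a real imaging operator; run to convergence it tends toward the sparse, point-like "night sky" solutions the abstract describes.

```python
import numpy as np

def mlem(H, g, n_iter=100):
    """MLEM / Richardson-Lucy iteration for Poisson maximum-likelihood
    reconstruction with an implicit nonnegativity constraint:
    f <- f * H^T(g / (H f)) / (H^T 1)."""
    f = np.ones(H.shape[1])                      # positive initialization
    sens = H.sum(axis=0)                         # H^T 1, the sensitivity image
    for _ in range(n_iter):
        ratio = g / np.maximum(H @ f, 1e-12)     # measured / predicted data
        f *= (H.T @ ratio) / np.maximum(sens, 1e-12)
    return f

rng = np.random.default_rng(0)
H = rng.random((64, 32))                         # toy system operator
f_true = np.zeros(32)
f_true[[5, 20]] = [3.0, 1.5]                     # point-like object
g = rng.poisson(H @ f_true)                      # Poisson-noisy data
f_hat = mlem(H, g, n_iter=500)                   # increasingly point-like
```
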
6

Shipway, P. H., D. G. McCartney, and T. Sudaprasert. "HVOF Spraying of WC-Co Coatings with Liquid-Fuelled and Gas-Fuelled Systems: Competing Mechanisms of Structural Degradation". In ITSC2005, edited by E. Lugscheider. Verlag für Schweißen und verwandte Verfahren DVS-Verlag GmbH, 2005. http://dx.doi.org/10.31399/asm.cp.itsc2005p0963.

Abstract:
It is widely known that during high velocity oxy-fuel (HVOF) spraying of tungsten carbide – cobalt (WC-Co) coatings, decomposition occurs resulting in the formation of W2C and a relatively brittle amorphous binder phase (along with other carbides and even metallic tungsten). Decomposition has generally been seen to be deleterious to the wear resistance of these coatings and, as such, there have been moves to reduce it. Since decomposition during spraying initiates with WC dissolution into the molten binder phase, strategies for its minimization have been based on reduction of particle temperatures and exposure times during spraying. Moves in spraying from gas-fuelled systems to liquid-fuelled systems have contributed towards these goals. This paper examines microstructural features and wear behaviour of WC-Co coatings deposited with both a liquid-fuelled and a gas-fuelled system. Contrary to expectation, it was found that the wear rate of the liquid-fuel sprayed coating was five to ten times higher than that of the gas-fuel sprayed coating. It was shown that whilst the degree of decomposition was limited during spraying with a liquid-fuelled system, the solid core of WC-Co suffers significant mechanical damage on impact as it is deposited, resulting in carbide fracture and size reduction and thus to the low observed wear resistance.
7

Komanovics, Adrienne. "WORKPLACE PRIVACY IN THE EU: THE IMPACT OF EMERGING TECHNOLOGIES ON EMPLOYEE'S FUNDAMENTAL RIGHTS". In International Scientific Conference “Digitalization and Green Transformation of the EU“. Faculty of Law, Josip Juraj Strossmayer University of Osijek, 2023. http://dx.doi.org/10.25234/eclic/27458.

Abstract:
Over the last decade, several new technologies have been adopted that enable more systematic surveillance of employees, creating significant challenges to privacy and data protection. The risks posed by the new devices and methods were exacerbated with the advent of Covid, with the involuntary introduction of digital tools to measure work output and efforts to get visibility back in the workplace through new means. Against this backdrop, the article aims to examine the main issues in workplace surveillance. After a brief overview of the range of surveillance methods, such as video surveillance, network and e-mail monitoring, and employee-tracking software (the so-called "bossware"), as well as the challenges posed by the new technologies, the paper goes on to individually analyse the legal aspects of monitoring employees for security or performance-related reasons. The phenomenon is examined in light of relevant EU legislation (the General Data Protection Regulation of 2016 being the most relevant one), as well as the opinions adopted by the Article 29 Working Party established by Directive 95/46 and the guidelines drawn up by the European Data Protection Board, established by the GDPR and replacing the Working Party. In doing so, the paper elaborates on the concepts of transparency, consent, purpose limitation, data minimization, data retention, the so-called expectation of privacy, and the lawfulness of processing, especially the issue of balancing the legitimate interests of the employer against the interests or fundamental rights of the data subject. The results of the analysis suggest that new and emerging technologies developed to monitor employees in order to address productivity issues, security risks, and sexual harassment, combined with the fact that remote and hybrid work is becoming the norm, inevitably increase the porosity between work and private life and blur the line between public and private. Such an extensive intrusion into privacy calls for enhanced institutional efforts to protect workers from the surveillance overreach of the new digital devices.
8

Morris, Lloyd, Homero Murzi, Hernan Espejo, Olga Jamin Salazar De Morris, and Juan Luis Arias Vargas. "Big Data Analysis in Vehicular Market Forecasts for Business Management". In 13th International Conference on Applied Human Factors and Ergonomics (AHFE 2022). AHFE International, 2022. http://dx.doi.org/10.54941/ahfe1002299.

Abstract:
Information about markets constitutes the primary basis for making the right decisions in a modern and globalized world. Opportunities therefore grow with the availability of data and with how the data are structured to obtain information that supports decision-making processes (Ogrean 2018; Neubert 2018), all the more so when business dynamics revolve around satisfying the demand for the products or services offered (Jacobs and Chase 2009). This article analyzes the new-vehicle market through operations research techniques, addressing the behavior of vehicle sales for medium- and long-term projections for business management. The analysis is developed through Markov chains and time-series techniques, a complementary approach for obtaining predictions of future scenarios such as sales levels related to market shares. Choi et al. (2018) indicate that one of the important applications of Big Data in business management is demand forecasting, one of the common prediction settings for data series over time. The data are taken from the statistics of the National Association of Sustainable Mobility for new vehicles in the Colombian market from 2016 to 2019 (Andemos 2021). Merkuryeba (2019) proposes procedures in which techniques complement one another to give a comprehensive approach to forecasting; here, Markov chain models (Kiral and Uzun 2017) are combined with time-series analysis (Stevenson et al. 2015) to reach a more detailed and comprehensive analysis of the variable of interest: vehicle sales for business management.

The results showed that Markov chains were very useful in long-term sales forecasting and in its analysis by market segmentation, for which the sales levels were ranked using the Pareto technique. For example, in ranking 1 (the first five brands), an expected value of 67.1% of the total sales level was obtained, and an internal analysis of this ranking was also carried out. For the time-series alternative, the analysis starts from the demand, where seasonal behavior of vehicle sales is detected; Rockwell and Davis (2016) and Stevenson et al. (2015) establish a procedure for estimating and eliminating seasonal components through the seasonal index. Additionally, Weller and Crone (2012) and Lau et al. (2018) recommend two common measures of forecast error for selecting the most adequate technique: the mean absolute deviation (MAD) and the mean absolute percentage error (MAPE). Of the three techniques developed (moving average, weighted moving average, and exponential smoothing), simple exponential smoothing optimized through MAPE minimization was selected, and with it short- and medium-term forecasts were defined. This study contributes directly to decision-making in the marketing of new vehicles, as well as to academic research on data series in a big-data setting.

It was thus demonstrated that sales behavior, segmented by market level according to the participating brands, can be transformed into estimates of future behavior that map business objectives onto the possible level of market share. Finally, the methodological scheme, an epistemological perspective supported by technical decisions, represents a relevant academic contribution to business management: time-series techniques are recommended for short- and medium-term forecasts, and Markov chains for predicting and analyzing the sales structure in medium- to long-term forecasts.
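
The selected technique, simple exponential smoothing with the smoothing constant chosen by MAPE minimization, is compact enough to sketch. The series below is synthetic, standing in for the Andemos sales data, and the implementation is our own, not the authors' code.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def ses(y, alpha):
    """Simple exponential smoothing: level_t = alpha*y_t + (1-alpha)*level_{t-1}.
    level[t] serves as the one-step-ahead forecast of y[t+1]."""
    level = np.empty_like(y, dtype=float)
    level[0] = y[0]
    for t in range(1, y.size):
        level[t] = alpha * y[t] + (1.0 - alpha) * level[t - 1]
    return level

def mape(y, alpha):
    """Mean absolute percentage error of the one-step-ahead SES forecasts."""
    f = ses(y, alpha)
    return 100.0 * np.mean(np.abs((y[1:] - f[:-1]) / y[1:]))

# Illustrative seasonal monthly sales series (synthetic stand-in).
rng = np.random.default_rng(0)
sales = (1000.0 + 50.0 * np.sin(np.arange(48) * 2.0 * np.pi / 12.0)
         + rng.normal(0.0, 30.0, 48))

best = minimize_scalar(lambda a: mape(sales, a), bounds=(0.01, 0.99),
                       method="bounded")
alpha_opt = best.x                        # smoothing constant minimizing MAPE
next_period_forecast = ses(sales, alpha_opt)[-1]
```
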