
Journal articles on the topic 'DATA BALANCING TECHNIQUE'

Consult the top 50 journal articles for your research on the topic 'DATA BALANCING TECHNIQUE.'


1

Economou, Nikos, and Antonis Vafidis. "Spectral balancing GPR data using time-variant bandwidth in the t-f domain." GEOPHYSICS 75, no. 3 (May 2010): J19–J27. http://dx.doi.org/10.1190/1.3374464.

Abstract:
Ground-penetrating radar (GPR) sections lose resolution with depth because, for electromagnetic (EM) waves propagating in the subsurface, attenuation is typically more pronounced at higher frequencies. To correct for these effects, we have applied a spectral balancing technique using the S-transform (ST). This signal-processing technique avoids the drawbacks of inverse Q filtering techniques, namely the need to estimate the attenuation factor from the GPR section and the instability caused by scattering effects in methods that estimate it from the dominant frequency. The method designs and applies a gain in the time-frequency (t-f) domain and involves the selection of a time-variant bandwidth to reduce high-frequency noise. It requires a reference amplitude spectrum for spectral shaping and performs spectral balancing efficiently for GPR data when applied in very narrow time windows. Furthermore, we have found that spectral balancing must be applied prior to deconvolution, instead of being an alternative technique.
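As background for readers, the following is a minimal, generic sketch of spectral balancing in the time-frequency domain. It uses a short-time Fourier transform rather than the S-transform described in the paper, and the synthetic trace, window length, and whitening rule are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch of spectral balancing in the time-frequency domain.
# A short-time Fourier transform stands in for the paper's S-transform; the
# window and whitening parameters are arbitrary choices.
import numpy as np
from scipy.signal import stft, istft

def spectral_balance(trace, fs, nperseg=64, eps=1e-8):
    """Push each short time window toward a common reference amplitude
    spectrum, then reconstruct the trace."""
    f, t, Z = stft(trace, fs=fs, nperseg=nperseg)
    amp = np.abs(Z)
    reference = amp.mean(axis=1, keepdims=True)   # reference amplitude spectrum
    gain = reference / (amp + eps)                # time-variant gain per frequency
    _, balanced = istft(Z * gain, fs=fs, nperseg=nperseg)
    return balanced[: len(trace)]

# Example: a synthetic trace whose high-frequency content decays with time
fs = 1000.0
time = np.arange(0, 1, 1 / fs)
trace = np.exp(-3 * time) * np.sin(2 * np.pi * 150 * time) + np.sin(2 * np.pi * 30 * time)
balanced = spectral_balance(trace, fs)
```

The point of the sketch is only that "balancing" here means estimating a reference spectrum and applying a time-variant gain that flattens each short window toward it.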
2

Pattanapairoj, Sirorat, Danaipong Chetchotsak, and Banchar Arnonkijpanich. "Hybrid Balancing Technique Using GRSOM and Bootstrap Algorithms for Classifiers with Imbalanced Data." Advanced Materials Research 931-932 (May 2014): 1375–81. http://dx.doi.org/10.4028/www.scientific.net/amr.931-932.1375.

Abstract:
To deal with imbalanced data, this paper proposes a hybrid data balancing technique which incorporates both over- and under-sampling approaches. This technique determines how much the minority data should be grown as well as how much the majority data should be reduced. In this manner, noise introduced to the data by excessive over-sampling can be avoided. In addition, the proposed data balancing technique helps to determine the appropriate size of the balanced data, so the computation time required to construct classifiers becomes more efficient. The technique over-samples the minority data through the GRSOM method and then under-samples the majority data using the bootstrap sampling approach. GRSOM is used in this study because it grows new samples in a non-linear fashion and preserves the original data structure. Performance of the proposed method is tested using four data sets from the UCI Machine Learning Repository. Once the data sets are balanced, a committee of classifiers is constructed using these balanced data. The experimental results reveal that our proposed data balancing method provides the best performance.
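The hybrid strategy summarized above, growing the minority class while shrinking the majority class, can be illustrated with off-the-shelf samplers. The sketch below uses imbalanced-learn's SMOTE and RandomUnderSampler as stand-ins; it is not an implementation of GRSOM or the authors' bootstrap scheme, and the sampling ratios are arbitrary.

```python
# Hedged sketch: combined over- and under-sampling on a synthetic imbalanced set.
# SMOTE stands in for the paper's GRSOM over-sampler; RandomUnderSampler stands
# in for its bootstrap-based under-sampler.
from collections import Counter
from imblearn.over_sampling import SMOTE
from imblearn.under_sampling import RandomUnderSampler
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

X, y = make_classification(n_samples=5000, weights=[0.95, 0.05], random_state=0)
print("original class counts:", Counter(y))

# Grow the minority class to half the majority size, then shrink the majority
# class down to a 1:1 ratio.
X_over, y_over = SMOTE(sampling_strategy=0.5, random_state=0).fit_resample(X, y)
X_bal, y_bal = RandomUnderSampler(sampling_strategy=1.0, random_state=0).fit_resample(X_over, y_over)
print("balanced class counts:", Counter(y_bal))

clf = RandomForestClassifier(random_state=0).fit(X_bal, y_bal)
```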
3

Chetchotsak, Danaipong, Sirorat Pattanapairoj, and Banchar Arnonkijpanich. "Integrating new data balancing technique with committee networks for imbalanced data: GRSOM approach." Cognitive Neurodynamics 9, no. 6 (July 31, 2015): 627–38. http://dx.doi.org/10.1007/s11571-015-9350-4.

4

Muszala, S. P., D. A. Connors, J. J. Hack, and G. Alaghband. "The Promise of Load Balancing the Parameterization of Moist Convection Using a Model Data Load Index." Journal of Atmospheric and Oceanic Technology 23, no. 4 (April 1, 2006): 525–37. http://dx.doi.org/10.1175/jtech1865.1.

Abstract:
The parameterization of physical processes in atmospheric general circulation models contributes to load imbalances among individual processors of message-passing distributed-multiprocessor systems. Load imbalances increase the overall time to completion of a model run and should be eliminated or reduced as much as possible. Presented herein is a new technique that shows promise for load balancing the parameterization of moist convection found in the Community Climate System Model's (CCSM's) Community Atmosphere Model version 3 (CAM3). At the heart of this technique is a load index that is a marker for moist convection (called a model data load index). The marker for moist convection correlates directly to the amount of processing time per model grid cell and can therefore be used to effect a load balance. Spatial locality on the model grid and temporal locality between model time steps exist that allow a decomposition from a load-balancing step to be retained for multiple time steps. The analysis in this paper shows that the load balance does not need to be applied at every time step and that the number of steps in which the previous load balance remains effective is large enough for the overhead to be cost effective. Tests performed indicate that this technique is scalable to higher-resolution models as well as to higher processor counts than those presented. Through the use of the Load Balancing and Scheduling Framework (LBSF), this technique shows promise in reducing (by ∼47%) the time of the unbalanced load of one particular subroutine in CAM3 at the T85 spectral truncation. A maximum of 3.75 s of total execution time is saved over a 2430-time-step simulation. When extrapolated to a 1000-yr simulation, this translates to a potential savings of ∼22 h in that subroutine alone. Similar methods applied to remaining subroutines can add up to a significant savings. These results are encouraging in that a fine-grained load-balancing technique using the evolving characteristics of geophysical data paves the way for load balancing a broad range of physical calculations, both in CAM3 and other scientific applications, where more general techniques are not practical.
5

Muktar Yahuza, Yamani Idna Bin Idris, Ainuddin Wahid Bin Abdul Wahab, Mahdi A. Musa, and Adamu Abdullahi Garba. "A LIGHTWEIGHT AUTHENTICATION TECHNIQUE FOR SECURE COMMUNICATION OF EDGE/FOG DATA-CENTERS." Science Proceedings Series 2, no. 1 (April 25, 2020): 76–81. http://dx.doi.org/10.31580/sps.v2i1.1319.

Abstract:
Edge computing has significantly enhanced the capabilities of cloud computing. Edge data-centres are used for storing data of the end-user devices. Secure communication between legitimate edge data-centres during the load balancing process has attracted industrial and academic researchers. Recently, Puthal et al. proposed a technique for authenticating edge data-centres to enable secure load balancing. However, the resource-constrained nature of the edge data-centres is ignored: the scheme relies on a computationally complex and memory-intensive cryptographic protocol. It is also vulnerable to a key escrow attack, because the secret key used for encrypting and decrypting the communicated messages is created by the trusted cloud data-centre. Additionally, the key sharing phase of their algorithm is complex. Therefore, to address the highlighted challenges, this paper proposes a lightweight, key escrow-less authentication algorithm that ensures secure communication of resource-constrained edge data-centres during the load balancing process. The security capability of the proposed scheme has been formally evaluated using the automatic cryptographic analytical tool ProVerif. The relatively low computation and communication costs of the proposed scheme compared to the benchmark schemes prove that it is lightweight and thus suitable for resource-constrained edge data-centres.
6

Thilagavathi, N., D. Divya Dharani, R. Sasilekha, Vasundhara Suruliandi, and V. Rhymend Uthariaraj. "Energy Efficient Load Balancing in Cloud Data Center Using Clustering Technique." International Journal of Intelligent Information Technologies 15, no. 1 (January 2019): 84–100. http://dx.doi.org/10.4018/ijiit.2019010104.

Abstract:
Cloud computing has seen tremendous growth in recent years. As a result, there has been a great increase in the number of data centers all over the world. These data centers consume a lot of energy, resulting in high operating costs. An imbalance in load distribution among the servers in a data center results in increased energy consumption. Server consolidation can be handled by migrating all virtual machines away from underutilized servers. Migration degrades job performance, depending on the migration time and the number of migrations. Considering these aspects, the proposed clustering agent-based model improves energy saving by efficient allocation of the VMs to the hosting servers, which reduces the response time for initial allocation. A Middle VM migration (MVM) strategy for server consolidation minimizes the number of VM migrations. Further, the extra resource requirement is randomized to cater to real-time scenarios that need more resources than the initial requirement. Simulation results show that the proposed approach reduces the number of migrations and the response time for user requests and improves energy saving in the cloud environment.
7

Shivaliya, Shikha, and Vijay Anand. "Design of Load Balancing Technique for Cloud Computing Environment." ECS Transactions 107, no. 1 (April 24, 2022): 2911–18. http://dx.doi.org/10.1149/10701.2911ecst.

Abstract:
Cloud computing allows for the provision of IT resources on demand and has various advantages. Because the majority of firms have shifted their activities to the cloud, data centers are frequently flooded with sporadic loads. When dealing with high network traffic in the cloud, it is necessary to balance the load among servers, and this is what load balancing provides. The primary goal of load balancing is to distribute demand evenly among all available servers such that no server is under- or overloaded. Load balancing is the process of dispersing load among several nodes to make the best use of resources when a node is overwhelmed with work. When a node becomes too overburdened to support its load, load balancing is essential: the excess load is dispersed to the remaining, less loaded nodes.
8

Gui, Chun. "Analysis of imbalanced data set problem: The case of churn prediction for telecommunication." Artificial Intelligence Research 6, no. 2 (June 27, 2017): 93. http://dx.doi.org/10.5430/air.v6n2p93.

Abstract:
Class-imbalanced datasets are common in the mobile Internet industry. We tested three kinds of feature selection techniques: Random Forest (RF), Relative Weight (RW) and Standardized Regression Coefficients (SRC); three kinds of balancing methods: over-sampling (OS), under-sampling (US) and synthetic minority over-sampling (SMOTE); and a widely used classification method, RF. The combined models are composed of feature selection techniques, balancing techniques and the classification method. The original dataset, which has 45 thousand records and 22 features, was used to evaluate the performance of both the feature selection and the balancing techniques. The experimental results revealed that SRC combined with the SMOTE technique attained the minimum value of Cost = 1085. Through the calculation of the Cost on all models, the most important features for the minimum cost of telecommunication were identified. The application of these combined models makes it possible to maximize profit with minimum expenditure for customer retention and helps reduce customer churn rates.
9

Sapna, P. Fathima, and R. Lal Raja Singh. "Smart Meter Data based Load Analysis Using Clustering Technique." International Academic Journal of Science and Engineering 9, no. 1 (December 6, 2022): 39–48. http://dx.doi.org/10.9756/iajse/v9i1/iajse0918.

Abstract:
In this study, clustering is investigated as a method for improving energy efficiency based on smart meters for a number of household applications that are either currently available or expected to become available in the near future, such as smart thermostats, smart lights, smart water heaters, smart washing machines, and smart refrigerators. We describe a novel approach to load-balanced clustering founded on the K-means clustering algorithm. The major goal of our algorithm is to optimize network lifetime while maintaining acceptable sensing coverage in scenarios in which sensor nodes generate either uniform or non-uniform data traffic. To achieve this, we also provide a new clustering cost function that takes into consideration not only the volume of traffic but also the effort required to communicate across substantial geographic distances. Extensive simulations comparing the proposed algorithm with leading state-of-the-art clustering approaches demonstrate that it improves both load analysis and load balancing in the domestic area. In addition, the work demonstrates a dataset-processing stage that computes the typical energy load used by consumers.
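For readers unfamiliar with the clustering step, here is a minimal sketch of grouping daily smart-meter load profiles with K-means. It is a generic illustration only: the data are synthetic and the load-balanced cost function proposed in the paper is not reproduced.

```python
# Hedged sketch: clustering daily smart-meter load profiles with K-means.
# Synthetic profiles and the number of clusters are illustrative choices only.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
hours = np.arange(24)

# Fake "morning peak" and "evening peak" household shapes plus noise
morning = 1.0 + np.exp(-((hours - 8) ** 2) / 8)
evening = 1.0 + np.exp(-((hours - 19) ** 2) / 8)
profiles = np.vstack([
    morning + 0.1 * rng.standard_normal((100, 24)),
    evening + 0.1 * rng.standard_normal((100, 24)),
])

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(profiles)
centroids = kmeans.cluster_centers_      # typical load shape per cluster
print("cluster sizes:", np.bincount(kmeans.labels_))
print("average hourly load:", round(float(profiles.mean()), 3))
```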
10

Ramzan, Bajwa, Kazmi, and Amna. "Challenges in NoSQL-Based Distributed Data Storage: A Systematic Literature Review." Electronics 8, no. 5 (April 30, 2019): 488. http://dx.doi.org/10.3390/electronics8050488.

Abstract:
Key-Value stores (KVSs) are the most flexible and simplest model of NoSQL databases, which have become highly popular over the last few years due to their salient features such as availability, portability, reliability, and low operational cost. From the perspective of software engineering, the chief obstacle for KVSs is to achieve software quality attributes (consistency, throughput, latency, security, performance, load balancing, and query processing) to ensure quality. The presented research is a Systematic Literature Review (SLR) to find the state-of-the-art research in the KVS domain, and through doing so determine the major challenges and solutions. This work reviews the 45 papers between 2010–2018 that were found to be closely relevant to our study area. The results show that performance is addressed in 31% of the studies, consistency is addressed in 20% of the studies, latency and throughput are addressed in 16% of the studies, query processing is addressed in 13% of studies, security is addressed in 11% of the studies, and load balancing is addressed in 9% of the studies. Different models are used for execution. The indexing technique was used in 20% of the studies, the hashing technique was used in 13% of the studies, the caching and security techniques were used together in 9% of the studies, the batching technique was used in 5% of the studies, the encoding techniques and Paxos technique were used together in 4% of the studies, and 36% of the studies used other techniques. This systematic review will enable researchers to design key-value stores as efficient storage. Regarding future collaborations, trust and privacy are the quality attributes that can be addressed; KVS is an emerging facet due to its widespread popularity, opening the way to deploy it with proper protection.
11

Cai, Wenfang, Songyuan Lu, Zhengfeng Wu, Guangyao Ying, and Wenjian Wu. "Strategy and Technique of High Efficiency Balancing in Field for Turbo-Generator Units with Large Capacity." Journal of Physics: Conference Series 2101, no. 1 (November 1, 2021): 012011. http://dx.doi.org/10.1088/1742-6596/2101/1/012011.

Abstract:
This paper addresses high-efficiency field balancing of large-capacity turbo-generator units and introduces the strategies and key techniques of rotor system balancing through practical power plant cases. It covers the acquisition, analysis, and pre-processing of the original vibration data used in the balance calculation, including complete measuring points and operating conditions, accurate judgment of the type of unbalance exciting force, and selection of stable vibration data, all of which effectively reduce guesswork during balancing. The strategies and techniques also include determining the axial plane of the unbalance by the modal method, from which the optimal steps and the weight-adding plane are chosen during the implementation of balancing. In addition, the paper discusses the analysis and selection of influence coefficients and the phase of the trial weight, which help determine the final correction weight accurately so that the balancing process remains prompt and efficient. Meanwhile, the practical constraints on the location for adding weight and on the maintenance and production schedule of the units must be considered during high-efficiency field balancing. These strategies and techniques have practical application value in advancing field balancing technology for large-capacity turbo-generator units.
12

Al-Batahari, M., Azlinda Abdul Aziz, and Nur Syufiza Ahmad Shukor. "The Techniques in Enhancing Bandwidth Load Balancing QoS at Local Area Network: A Review Paper." International Journal on Cybernetics & Informatics 12, no. 5 (August 12, 2023): 23–30. http://dx.doi.org/10.5121/ijci.2023.120503.

Abstract:
Bandwidth load balancing is a critical technique for improving the quality of service (QoS) in local area networks (LANs). Insufficient load balancing and QoS utilization can lead to network congestion, reduced performance, latency, packet loss, and an overall degradation of network performance. This paper presents an overview of load balancing techniques, load balancing algorithms, QoS parameters, load balancing strategies, and considerations for calculating bandwidth. The challenges of implementing load balancing algorithms and preferences for their implementation are discussed. Accurate network measurements and monitoring, coordination between network devices, security and privacy considerations, and careful evaluation of load balancing algorithms are among the challenges faced. The paper also discusses optimization and monitoring of bandwidth use, including options such as increasing bandwidth, bandwidth throttling, and data transfer throttling. The findings presented in this paper provide insights into the importance of load balancing and QoS in LANs and offer guidance for addressing related challenges.
13

Arca, Sevgi, and Rattikorn Hewett. "Analytics on Anonymity for Privacy Retention in Smart Health Data." Future Internet 13, no. 11 (October 28, 2021): 274. http://dx.doi.org/10.3390/fi13110274.

Abstract:
Advancements in smart technology, wearable and mobile devices, and Internet of Things, have made smart health an integral part of modern living to better individual healthcare and well-being. By enhancing self-monitoring, data collection and sharing among users and service providers, smart health can increase healthy lifestyles, timely treatments, and save lives. However, as health data become larger and more accessible to multiple parties, they become vulnerable to privacy attacks. One way to safeguard privacy is to increase users’ anonymity as anonymity increases indistinguishability making it harder for re-identification. Still the challenge is not only to preserve data privacy but also to ensure that the shared data are sufficiently informative to be useful. Our research studies health data analytics focusing on anonymity for privacy protection. This paper presents a multi-faceted analytical approach to (1) identifying attributes susceptible to information leakages by using entropy-based measure to analyze information loss, (2) anonymizing the data by generalization using attribute hierarchies, and (3) balancing between anonymity and informativeness by our anonymization technique that produces anonymized data satisfying a given anonymity requirement while optimizing data retention. Our anonymization technique is an automated Artificial Intelligent search based on two simple heuristics. The paper describes and illustrates the detailed approach and analytics including pre and post anonymization analytics. Experiments on published data are performed on the anonymization technique. Results, compared with other similar techniques, show that our anonymization technique gives the most effective data sharing solution, with respect to computational cost and balancing between anonymity and data retention.
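Step (1) above, scoring attributes by how much identifying information they carry, can be sketched with a simple entropy calculation. The example below is a generic illustration with invented records, not the authors' measure or their heuristic search procedure.

```python
# Hedged sketch: per-attribute Shannon entropy as a rough indicator of how
# identifying an attribute is. The sample records are invented.
import math
from collections import Counter

records = [
    {"age": 34, "zip": "79415", "diagnosis": "flu"},
    {"age": 41, "zip": "79423", "diagnosis": "flu"},
    {"age": 34, "zip": "79401", "diagnosis": "asthma"},
    {"age": 29, "zip": "79415", "diagnosis": "flu"},
]

def normalized_entropy(values):
    """Shannon entropy divided by its maximum for this many records."""
    counts = Counter(values)
    n = len(values)
    h = -sum((c / n) * math.log2(c / n) for c in counts.values())
    return h / math.log2(n) if n > 1 else 0.0

for attribute in records[0]:
    column = [r[attribute] for r in records]
    print(f"{attribute}: normalized entropy = {normalized_entropy(column):.2f}")
```

Attributes with entropy near 1.0 take many distinct, evenly spread values and are therefore the most likely candidates for generalization before the data are shared.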
14

Singh, Prabhdeep, Rajbir Kaur, Junaid Rashid, Sapna Juneja, Gaurav Dhiman, Jungeun Kim, and Mariya Ouaissa. "A Fog-Cluster Based Load-Balancing Technique." Sustainability 14, no. 13 (June 29, 2022): 7961. http://dx.doi.org/10.3390/su14137961.

Abstract:
The Internet of Things has recently been a popular topic of study for developing smart homes and smart cities. Most IoT applications are very sensitive to delays, and IoT sensors provide a constant stream of data. The cloud-based IoT services that were first employed suffer from increased latency and inefficient resource use. Fog computing is used to address these issues by moving cloud services closer to the edge in a small-scale, dispersed fashion. Fog computing is quickly gaining popularity as an effective paradigm for providing customers with real-time processing, platforms, and software services. Real-time applications may be supported at a reduced operating cost using an integrated fog-cloud environment that minimizes resources and reduces delays. Load balancing is a critical problem in fog computing because it ensures that the dynamic load is distributed evenly across all fog nodes, avoiding the situation where some nodes are overloaded while others are underloaded. Numerous algorithms have been proposed to accomplish this goal. In this paper, a framework was proposed that contains three subsystems named user subsystem, cloud subsystem, and fog subsystem. The goal of the proposed framework is to decrease bandwidth costs while providing load balancing at the same time. To optimize the use of all the resources in the fog sub-system, a Fog-Cluster-Based Load-Balancing approach along with a refresh period was proposed. The simulation results show that “Fog-Cluster-Based Load Balancing” decreases energy consumption, the number of Virtual Machines (VMs) migrations, and the number of shutdown hosts compared with existing algorithms for the proposed framework.
15

Shimada, Ken-ichi, Makoto Hosaka, Kazuyoshi Yamazaki, Shinsuke Onoe, and Tatsuro Ide. "Technique for positioning hologram for balancing large data capacity with fast readout." Japanese Journal of Applied Physics 56, no. 9S (August 7, 2017): 09NA04. http://dx.doi.org/10.7567/jjap.56.09na04.

16

Arshed, Muhammad Asad, Muhammad Abdul Jabbar, Farrukh Liaquat, Usman Mohy-ud-Din Chaudhary, Danial Karim, Hina Alam, and Shahzad Mumtaz. "Machine Learning with Data Balancing Technique for IoT Attack and Anomalies Detection." IJIST 4, no. 2 (May 29, 2022): 490–98. http://dx.doi.org/10.33411/ijist/2022040218.

Abstract:
Nowadays, a significant concern in IoT infrastructure is anomaly and attack detection for IoT devices. With advancing technology, attack issues are increasing gradually. Many attacks, such as Data Type Probing, Denial of Service, Malicious Operation, Malicious Control, Spying, Scan, and Wrong Setup, cause the failure of IoT-based systems. In this paper, the performance of several machine learning models is compared to effectively predict attacks and anomalies. The models are compared using evaluation metrics (accuracy) and a confusion matrix for the final version of the most effective model. Most recent studies performed experiments on an unbalanced dataset; it is clear that a model will be biased on such a dataset, so we performed the experiments in two settings, with unbalanced and with balanced data samples. For the unbalanced dataset, we achieved the highest accuracy of 98.0% with the Generalized Linear Model as well as with Random Forest. Because an unbalanced dataset makes it likely that the model is biased, we also performed the experiments with the Random Under-Sampling technique (data balancing) and achieved the highest accuracy of 94.3% with the Generalized Linear Model. The confusion matrix in this study also supports the performance of the Generalized Linear Model.
17

Ross, Christopher P., and Paul L. Beale. "Seismic offset balancing." GEOPHYSICS 59, no. 1 (January 1994): 93–101. http://dx.doi.org/10.1190/1.1443538.

Abstract:
The ability to successfully predict lithology and fluid content from reflection seismic records using AVO techniques is contingent upon accurate pre-analysis conditioning of the seismic data. However, all too often, residual amplitude effects remain after the many offset-dependent processing steps are completed. Residual amplitude effects often represent a significant error when compared to the amplitude variation with offset (AVO) response that we are attempting to quantify. We propose a model-based, offset-dependent amplitude balancing method that attempts to correct for these residuals and other errors due to sub-optimal processing. Seismic offset balancing attempts to quantify the relationship between the offset response of background seismic reflections and corresponding theoretical predictions for average lithologic interfaces thought to cause these background reflections. It is assumed that any deviation from the theoretical response is a result of residual processing phenomenon and/or suboptimal processing, and a simple offset-dependent scaling function is designed to correct for these differences. This function can then be applied to seismic data over both prospective and nonprospective zones within an area where the theoretical values are appropriate and the seismic characteristics are consistent. A conservative application of the above procedure results in an AVO response over both gas sands and wet sands that is much closer to theoretically expected values. A case history from the Gulf of Mexico Flexure Trend is presented as an example to demonstrate the offset balancing technique.
18

Singh, Jaspreet, Deepali Gupta, and Neha Sharma. "Cloud Load Balancing Algorithms: A Comparative Assessment." Journal of Computational and Theoretical Nanoscience 16, no. 9 (September 1, 2019): 3989–94. http://dx.doi.org/10.1166/jctn.2019.8282.

Abstract:
Nowadays, cloud computing is developing quickly and customers are requesting more services and superior outcomes. In the cloud domain, load balancing has turned into an extremely interesting and crucial research area. A number of algorithms have been recommended to provide an efficient mechanism for distributing cloud users' requests across the pooled cloud resources. Load balancing in the cloud should also provide notable functional benefits to cloud users and, at the same time, prove valuable for cloud service providers. In this paper, the pre-existing load balancing techniques are explored. The paper intends to provide a landscape for classifying distinct load balancing algorithms based upon several parameters and also addresses the performance assessment of various load balancing algorithms. This comparative assessment will help in proposing a competent load balancing technique for improving the performance of cloud data centers.
19

Ul Hassan, Ietezaz, Raja Hashim Ali, Zain Ul Abideen, Talha Ali Khan, and Rand Kouatly. "Significance of Machine Learning for Detection of Malicious Websites on an Unbalanced Dataset." Digital 2, no. 4 (October 31, 2022): 501–19. http://dx.doi.org/10.3390/digital2040027.

Abstract:
It is hard to trust any data entry on online websites as some websites may be malicious, and gather data for illegal or unintended use. For example, bank login and credit card information can be misused for financial theft. To make users aware of the digital safety of websites, we have tried to identify and learn the pattern on a dataset consisting of features of malicious and benign websites. We treated the problem of differentiation between malicious and benign websites as a classification problem and applied several machine learning techniques, for example, random forest, decision tree, logistic regression, and support vector machines to this data. Several evaluation metrics such as accuracy, precision, recall, F1 score, and false positive rate, were used to evaluate the performance of each classification technique. Since the dataset was imbalanced, the machine learning models developed a bias during training toward a specific class of websites. Multiple data balancing techniques, for example, undersampling, oversampling, and SMOTE, were applied for balancing the dataset and removing the bias. Our experiments showed that after balancing the data, the random forest algorithm using the oversampling technique showed the best results in all evaluation metrics for the benign and malicious website feature dataset.
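The kind of comparison this entry reports can be reproduced generically as follows: train the same classifier on the raw imbalanced data and on data rebalanced by undersampling, oversampling, and SMOTE, then compare several metrics. The dataset below is synthetic and the settings are illustrative, not the authors' experimental setup.

```python
# Hedged sketch: comparing data balancing techniques for an imbalanced
# classification task (synthetic stand-in for a malicious-website dataset).
from imblearn.over_sampling import RandomOverSampler, SMOTE
from imblearn.under_sampling import RandomUnderSampler
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, precision_recall_fscore_support
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=4000, weights=[0.9, 0.1], random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=1)

samplers = {
    "none": None,
    "undersampling": RandomUnderSampler(random_state=1),
    "oversampling": RandomOverSampler(random_state=1),
    "SMOTE": SMOTE(random_state=1),
}
for name, sampler in samplers.items():
    # Only the training split is resampled; the test split stays untouched.
    Xb, yb = (X_tr, y_tr) if sampler is None else sampler.fit_resample(X_tr, y_tr)
    model = RandomForestClassifier(random_state=1).fit(Xb, yb)
    pred = model.predict(X_te)
    prec, rec, f1, _ = precision_recall_fscore_support(y_te, pred, average="binary")
    print(f"{name:>13}: acc={accuracy_score(y_te, pred):.3f} "
          f"precision={prec:.3f} recall={rec:.3f} f1={f1:.3f}")
```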
20

Ratnasari, Dwi, and Dianita Meirini. "PAD, Dana Perimbangan, Belanja Modal, SILPA dan Kinerja Keuangan Pemerintah Daerah di Jawa Timur." E-Jurnal Akuntansi 32, no. 5 (May 28, 2022): 1189. http://dx.doi.org/10.24843/eja.2022.v32.i05.p06.

Abstract:
The purpose of the study was to examine the effect of local revenue, balancing funds, capital expenditures and SILPA on the financial performance of local governments, as proxied through financial ratios in the form of efficiency ratios. A quantitative approach is used in this study, with saturated sampling as the sampling technique. The data analysis technique was panel data regression. Based on the results of the t-test, financial performance is not significantly affected by local revenue, is significantly and negatively affected by balancing funds and SILPA, and is significantly and positively affected by capital expenditures. Keywords: PAD; Balancing Fund; Capital Expenditures; SILPA; Local Government Financial Performance.
21

Belalem, Ghalem, Naima Belayachi, Radjaa Behidji, and Belabbes Yagoubi. "Load Balancing to Increase the Consistency of Replicas in Data Grids." International Journal of Distributed Systems and Technologies 1, no. 4 (October 2010): 42–57. http://dx.doi.org/10.4018/jdst.2010100104.

Abstract:
Data grids are current solutions to the needs of large-scale systems and provide a set of different, geographically distributed resources. Their goal is to offer substantial parallel computing capacity, ensure effective and rapid data access, improve availability, and tolerate breakdowns. In such systems, however, these advantages are possible only by using the replication technique, which raises the problem of maintaining consistency among replicas of the same data set. In order to guarantee the reliability of a replica set, high coherence is necessary, which, however, penalizes performance. In this paper, the authors propose studying the influence of load balancing on replica quality. For this reason, a hybrid consistency management service is developed, which combines the pessimistic and optimistic approaches and is extended by a load balancing service to improve service quality. This service is articulated on a hierarchical model with two levels.
22

Bura, Deepa, Meeta Singh, and Poonam Nandal. "Analysis and Development of Load Balancing Algorithms in Cloud Computing." International Journal of Information Technology and Web Engineering 13, no. 3 (July 2018): 35–53. http://dx.doi.org/10.4018/ijitwe.2018070103.

Abstract:
This article describes how cloud computing exploits the benefits of web engineering and its applications by improving performance and reducing the load on cloud providers. As one of the emerging technologies in the field of computing, the cloud is used to provide various services to users through the internet. One of the major concerns in cloud computing is the accessibility of the cloud. For estimating the availability of the cloud, various load balancing algorithms are deployed in the data centers of the cloud environment. Load balancing is a technique that distributes load across various computers to optimize resource usage, reduce response time, and so on. There are different load balancing algorithms for performing load distribution across various centers. This article analyses different load balancing algorithms and develops a new algorithm for efficient load balancing. The proposed load balancing algorithm utilizes the concepts of web engineering to prioritize the requests of end users using a parsing technique, which assigns resources to end users based on the priority set by the data centers.
23

Mohammed, Rafiq Ahmed, Kok-Wai Wong, Mohd Fairuz Shiratuddin, and Xuequn Wang. "PWIDB: A framework for learning to classify imbalanced data streams with incremental data re-balancing technique." Procedia Computer Science 176 (2020): 818–27. http://dx.doi.org/10.1016/j.procs.2020.09.077.

24

Wen-Hsuan, Liang, Cheng Dun-Wei, Hsu Chih-Wei, Lee Chia-Wei, Ke Chih-Heng, Zomaya Albert Y., and Hsieh Sun-Yuan. "Dynamic Flow Scheduling Technique for Load Balancing in Fat-Tree Data Center Networks." International Journal of Performability Engineering 17, no. 6 (2021): 491. http://dx.doi.org/10.23940/ijpe.21.06.p1.491503.

25

Kansal, Nidhi Jain, and Inderveer Chana. "An empirical evaluation of energy-aware load balancing technique for cloud data center." Cluster Computing 21, no. 2 (September 19, 2017): 1311–29. http://dx.doi.org/10.1007/s10586-017-1166-z.

26

de Hollander, J. A. "Application of a metabolic balancing technique to the analysis of microbial fermentation data." Antonie van Leeuwenhoek 60, no. 3-4 (September 1991): 275–92. http://dx.doi.org/10.1007/bf00430370.

27

Subasinghe, G. K. N. S. "A transparent technique for mass balancing and data adjustment of complex metallurgical circuits." Mineral Processing and Extractive Metallurgy 118, no. 3 (September 2009): 162–67. http://dx.doi.org/10.1179/174328509x431418.

28

Ullah, Arif, and Nazri Mohd Nawi. "Enhancing the dynamic load balancing technique for cloud computing using HBATAABC algorithm." International Journal of Modeling, Simulation, and Scientific Computing 11, no. 05 (July 23, 2020): 2050041. http://dx.doi.org/10.1142/s1793962320500415.

Abstract:
Cloud computing has brought changes to many fields of life and has several characteristics, one of which is virtualization. The virtual machine (VM) is one of the main elements of virtualization: a physical server is turned into virtual machines that work like physical servers. When a user sends data or requests data in a cloud data center, the virtual machines may become underloaded or overloaded, which can cause system failure or delay the user's task. Appropriate load balancing techniques are therefore required to overcome these two problems. Load balancing is a technique used in cloud computing to manage resources such that maximum throughput is achieved with the smallest reaction time, dividing the traffic between different servers or VMs so that data can be obtained without delay. To improve load balancing, this study uses a novel technique that combines the BAT and ABC algorithms, both of which are nature-inspired: the local search section of the ABC algorithm is replaced with the local search section of the BAT algorithm, and a second modification is made in the fitness function of the BAT algorithm. The proposed technique is known as the HBATAABC algorithm. Implemented with a transfer strategy policy for VMs, the technique improves the performance of the data allocation system of VMs in the cloud data center. To check the performance of the proposed algorithm, three main parameters are used: network average time, network stability, and throughput. The performance of the proposed technique is verified and tested with the help of the CloudSim simulator. The results show that the modified algorithm improves network average time, network stability, and throughput by 1.30% compared with the BAT, ABC, and RRA algorithms, and that the proposed algorithm is more precise and expeditious than the three baseline models.
29

Rutherford, Steven R. "Noise‐discriminating, statistical‐amplitude compensation for AVO analysis." GEOPHYSICS 58, no. 12 (December 1993): 1831–39. http://dx.doi.org/10.1190/1.1443398.

Abstract:
Statistical amplitude balancing/compensation techniques are widely used in the industry to prepare seismic data for amplitude variation with offset (AVO) processing and analysis. The intent of such statistical techniques is to compensate the data for the average signal decay with offset such that reflectors that are anomalous with respect to this average decay can be detected and analyzed. Statistical amplitude compensation techniques, however, suffer from a serious flaw when applied to data sets having low signal‐to‐noise ratios (S/N) that vary with offset. An artifact of this flaw is often a suppression of the AVO effects one is trying to detect. When S/N is low and decreases with offset, as is usually the case, the rms amplitude measurements that statistical techniques are based upon become increasingly dominated by noise as offset increases. This can lead to a suppression of the far offsets by the balancing scalars responding to a noise level that is increasing with offset. A noise‐discriminating, statistical‐amplitude compensation technique can be designed that counteracts the detrimental effects of noise. This technique is based on the premise that a common‐midpoint (CMP) ensemble average of crosscorrelations of like offset data is proportional to the average signal amplitude corresponding to that offset. The average signal decay with offset can be estimated with this technique and used to amplitude compensate a data set for AVO analysis. The noise‐discriminating statistical technique performs extremely well on synthetic data. When applied to a Gulf of Mexico data set having poor S/N characteristics, the technique also performs well and offers encouragement that it will be useful in actual practice.
30

Parent, Johan, Katja Verbeeck, Jan Lemeire, Ann Nowe, Kris Steenhaut, and Erik Dirkx. "Adaptive Load Balancing of Parallel Applications with Multi-Agent Reinforcement Learning on Heterogeneous Systems." Scientific Programming 12, no. 2 (2004): 71–79. http://dx.doi.org/10.1155/2004/987356.

Abstract:
We report on the improvements that can be achieved by applying machine learning techniques, in particular reinforcement learning, for the dynamic load balancing of parallel applications. The applications being considered in this paper are coarse grain data intensive applications. Such applications put high pressure on the interconnect of the hardware. Synchronization and load balancing in complex, heterogeneous networks need fast, flexible, adaptive load balancing algorithms. Viewing a parallel application as a one-state coordination game in the framework of multi-agent reinforcement learning, and by using a recently introduced multi-agent exploration technique, we are able to improve upon the classic job farming approach. The improvements are achieved with limited computation and communication overhead.
31

Ramadani, Alkansa Fadila, and Muslimin Muslimin. "Pendapatan Asli Daerah dan Dana Perimbangan terhadap Kinerja Keuangan Pemerintah Daerah." Journal of Management and Bussines (JOMB) 4, no. 1 (June 27, 2022): 362–72. http://dx.doi.org/10.31539/jomb.v4i1.3710.

Abstract:
This study aims to determine the effect of PAD and Balancing Funds on the financial performance of local governments. The research method is descriptive quantitative, using non-probability sampling with a saturated sampling technique. The data are secondary data obtained from the 2016-2020 Kediri City Government Budget Realization Reports and were analyzed using multiple linear regression with the help of SPSS 25. The results show that PAD has a positive effect on local government financial performance, while the Balancing Fund individually has no effect; taken together, however, PAD and the Balancing Fund have a significant effect on local government financial performance. In conclusion, the higher the PAD, the better the local government's financial performance, whereas the amount of balancing funds alone does not affect it. Keywords: Balancing Fund, Regional Financial Performance, Regional Original Income
32

Chopra, Satinder, and Kurt J. Marfurt. "Coherence attribute applications on seismic data in various guises — Part 1." Interpretation 6, no. 3 (August 1, 2018): T521–T529. http://dx.doi.org/10.1190/int-2018-0006.1.

Abstract:
The iconic coherence attribute is very useful for imaging geologic features such as faults, deltas, submarine canyons, karst collapse, mass-transport complexes, and more. In addition to its preconditioning, the interpretation of discrete stratigraphic features on seismic data is also limited by its bandwidth, where in general the data with higher bandwidth yield crisper features than data with lower bandwidth. Some form of spectral balancing applied to the seismic amplitude data can help in achieving such an objective so that coherence run on spectrally balanced seismic data yields a better definition of the geologic features of interest. The quality of the generated coherence attribute is also dependent in part on the algorithm used for its computation. In the eigenstructure decomposition procedure for coherence computation, spectral balancing equalizes each contribution to the covariance matrix, and thus it yields crisper features on coherence displays. There are other ways to modify the spectrum of the input data in addition to simple spectral balancing, including the amplitude-volume technique, taking the derivative of the input amplitude, spectral bluing, and thin-bed spectral inversion. We compare some of these techniques, and show their added value in seismic interpretation, which forms part of the more elaborate exercise that we had carried out. In other work, we discuss how different spectral components derived from the input seismic data allow interpretation of different scales of discontinuities, what additional information is provided by coherence computed from narrow band spectra, and the different ways to integrate them.
33

Lim, JongBeom, and DaeWon Lee. "A Load Balancing Algorithm for Mobile Devices in Edge Cloud Computing Environments." Electronics 9, no. 4 (April 23, 2020): 686. http://dx.doi.org/10.3390/electronics9040686.

Abstract:
As current data centers and servers are growing in size by orders of magnitude when needed, load balancing is a great concern in scalable computing systems, including mobile edge cloud computing environments. In mobile edge cloud computing systems, a mobile user can offload its tasks to nearby edge servers to support real-time applications. However, when users are located in a hot spot, several edge servers can be overloaded due to suddenly offloaded tasks from mobile users. In this paper, we present a load balancing algorithm for mobile devices in edge cloud computing environments. The proposed load balancing technique features an efficient complexity by a graph coloring-based implementation based on a genetic algorithm. The aim of the proposed load balancing algorithm is to distribute offloaded tasks to nearby edge servers in an efficient way. Performance results show that the proposed load balancing algorithm outperforms previous techniques and increases the average CPU usage of virtual machines, which indicates a high utilization of edge servers.
34

Gadam, M. A., Maryam Abdulazeez Ahmed, Chee Kyun Ng, Nor Kamariah Nordin, Aduwati Sali, and Fazirulhisyam Hashim. "Review of Adaptive Cell Selection Techniques in LTE-Advanced Heterogeneous Networks." Journal of Computer Networks and Communications 2016 (2016): 1–12. http://dx.doi.org/10.1155/2016/7394136.

Abstract:
Poor cell selection is the main challenge in Picocell (PeNB) deployment in Long Term Evolution- (LTE-) Advanced heterogeneous networks (HetNets) because it results in load imbalance and intercell interference. A selection technique based on cell range extension (CRE) has been proposed for LTE-Advanced HetNets to extend the coverage of PeNBs for load balancing. However, poor CRE bias setting in cell selection inhibits the attainment of desired cell splitting gains. By contrast, a cell selection technique based on adaptive bias is a more effective solution to traffic load balancing in terms of increasing data rate compared with static bias-based approaches. This paper reviews the use of adaptive cell selection in LTE-Advanced HetNets by highlighting the importance of cell load estimation. The general performances of different techniques for adaptive CRE-based cell selection are compared. Results reveal that the adaptive CRE bias of the resource block utilization ratio (RBUR) technique exhibits the highest cell-edge throughput. Moreover, more accurate cell load estimation is obtained in the extended RBUR adaptive CRE bias technique through constant bit rate (CBR) traffic, which further improved load balancing as against the estimation based on the number of user equipment (UE). Finally, this paper presents suggestions for future research directions.
35

Chopra, Satinder, and Kurt J. Marfurt. "Coherence attribute applications on seismic data in various guises — Part 2." Interpretation 6, no. 3 (August 1, 2018): T531–T541. http://dx.doi.org/10.1190/int-2018-0007.1.

Abstract:
We have previously discussed some alternative means of modifying the frequency spectrum of the input seismic data to modify the resulting coherence image. The simplest method was to increase the high-frequency content by computing the first and second derivatives of the original seismic amplitudes. We also evaluated more sophisticated techniques, including the application of structure-oriented filtering to different spectral components before spectral balancing, thin-bed reflectivity inversion, bandwidth extension, and the amplitude volume technique. We further examine the value of coherence computed from individual spectral voice components, and alternative means of combining three or more such coherence images, providing a single volume for interpretation.
36

Everett, Louis J. "Two-Plane Balancing of a Rotor System Without Phase Response Measurements." Journal of Vibration and Acoustics 109, no. 2 (April 1, 1987): 162–67. http://dx.doi.org/10.1115/1.3269409.

Abstract:
This paper presents, and experimentally verifies, a two-plane balancing technique for rigid rotors and possibly flexible rotors operating at a constant speed. The technique, based upon influence coefficients, extends the single-plane four-run balancing procedure to two planes. Like the four-run method, this technique is most easily performed graphically and does not require response phase measurement. Despite the additional runs required to obtain data, its simplicity and applicability to a wide range of equipment renders it more useful, in some cases, than the standard two-plane influence coefficient method.
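For contrast with the phase-less, graphical method this entry describes, the sketch below works through the classical two-plane influence-coefficient calculation, which does use phase measurements. All vibration readings and trial weights are invented numbers for illustration; this is not the paper's four-run extension.

```python
# Hedged sketch: classical two-plane influence-coefficient balancing (with
# phase). Vibration readings are complex phasors (amplitude and phase angle).
import numpy as np

def phasor(amplitude_mils, phase_deg):
    return amplitude_mils * np.exp(1j * np.deg2rad(phase_deg))

# Baseline vibration at the two measurement planes
V0 = np.array([phasor(5.0, 30.0), phasor(4.0, 110.0)])

# Responses with a 10 g trial weight placed in plane 1, then in plane 2
trial = 10.0  # grams
V1 = np.array([phasor(6.5, 55.0), phasor(4.2, 120.0)])
V2 = np.array([phasor(5.3, 40.0), phasor(2.9, 150.0)])

# Influence coefficients: vibration change per unit trial weight
A = np.column_stack([(V1 - V0) / trial, (V2 - V0) / trial])

# Correction weights that drive the baseline vibration to zero: V0 + A @ W = 0
W = np.linalg.solve(A, -V0)
for plane, w in enumerate(W, start=1):
    print(f"plane {plane}: {abs(w):.1f} g at {np.degrees(np.angle(w)) % 360:.0f} deg")
```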
37

Kolo, SILUE. "IMPACT OF DATA PREPROCESSING AND BALANCING ON DIABETES PREDICTION USING THE DECISION TREE TECHNIQUE." International Journal of Numerical Methods and Applications 23, no. 2 (June 5, 2023): 157–80. http://dx.doi.org/10.17654/0975045223008.

38

Chakravarthy, V. Deeban, and B. Amutha. "Path based load balancing for data center networks using SDN." International Journal of Electrical and Computer Engineering (IJECE) 9, no. 4 (August 1, 2019): 3279. http://dx.doi.org/10.11591/ijece.v9i4.pp3279-3285.

Abstract:
Due to the increase in the number of internet users and in the number of applications available in the cloud, Data Center Networking (DCN) has become the backbone of computing. These data centers require high operational costs and also experience link failures and congestion often. One solution is to use a Software Defined Networking (SDN) based load balancer, which improves the efficiency of the network by distributing the traffic across multiple paths. Traditional load balancers are very expensive and inflexible. SDN load balancers do not require costly hardware and can be programmed, which makes it easier to implement user-defined algorithms and load balancing strategies. In this paper, we have proposed an efficient load balancing technique that considers different parameters to maintain the load efficiently, using OpenFlow switches connected to an ONOS controller.
39

Barbar, Aziz, and Anis Ismail. "Stochastically Balancing Trees for File and Database Systems." International Journal of Green Computing 4, no. 1 (January 2013): 58–70. http://dx.doi.org/10.4018/jgc.2013010104.

Abstract:
With the constant improvement in data storage technologies, a new generation of indexing mechanisms can be created to exploit improvements in disk access speeds that were previously impractical. The self-balancing B-Tree has long been the indexing structure of choice for reducing the amount of disk access at the expense of the size of the data block to be read or written. A new technique based on a dynamically growing multilevel list structure, which is stochastically balanced rather than self-balanced, is discussed and compared to the B-Tree. An analogy between the technique and the structures is established to better compare their computational complexities.
40

Barkah, Azhari Shouni, Siti Rahayu Selamat, Zaheera Zainal Abidin, and Rizki Wahyudi. "Impact of Data Balancing and Feature Selection on Machine Learning-based Network Intrusion Detection." JOIV : International Journal on Informatics Visualization 7, no. 1 (February 28, 2023): 241. http://dx.doi.org/10.30630/joiv.7.1.1041.

Abstract:
Unbalanced datasets are a common problem in supervised machine learning: the model learns the majority classes more thoroughly and is therefore more effective at recognizing the majority classes than the minority classes. Imbalanced data, such as disease data and network data, arises naturally in real life. DDoS is one of the network intrusions found to happen more often than R2L, and there is an imbalance in the composition of network attacks in public Intrusion Detection System (IDS) datasets such as NSL-KDD and UNSW-NB15. Researchers have proposed many techniques to transform such data into balanced data by duplicating the minority class and producing synthetic data; the Synthetic Minority Oversampling Technique (SMOTE) and Adaptive Synthetic (ADASYN) algorithms duplicate the data and construct synthetic data for the minority classes. Meanwhile, machine learning algorithms can capture the pattern of the labeled data by considering the input features. Unfortunately, not all input features have an equal impact on the output (predicted class or value); some features are interrelated and misleading, so the important features should be selected to produce a good model. In this research, we implement the recursive feature elimination (RFE) technique to select important features from the available dataset. According to the experiments, SMOTE provides a better synthetic dataset than ADASYN for the UNSW-NB15 dataset, which has a high level of imbalance. RFE feature selection slightly reduces the model's accuracy but improves the training speed. The Decision Tree classifier consistently achieves a better recognition rate than Random Forest and KNN.
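A minimal sketch of the pipeline described above, SMOTE for rebalancing followed by recursive feature elimination and a decision tree, is shown below. Synthetic data stand in for NSL-KDD or UNSW-NB15, and the feature counts and model settings are assumptions rather than the authors' configuration.

```python
# Hedged sketch: SMOTE rebalancing, RFE feature selection, then a Decision
# Tree classifier, on synthetic data standing in for an IDS dataset.
from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification
from sklearn.feature_selection import RFE
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=6000, n_features=30, n_informative=10,
                           weights=[0.92, 0.08], random_state=42)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=42)

# 1) Rebalance the training split only (never the test split)
X_bal, y_bal = SMOTE(random_state=42).fit_resample(X_tr, y_tr)

# 2) Keep the 10 most useful features according to RFE
selector = RFE(DecisionTreeClassifier(random_state=42), n_features_to_select=10)
X_bal_sel = selector.fit_transform(X_bal, y_bal)

# 3) Train and evaluate on the reduced feature set
clf = DecisionTreeClassifier(random_state=42).fit(X_bal_sel, y_bal)
print("test accuracy:", clf.score(selector.transform(X_te), y_te))
```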
41

Anggreni, Ni Ketut Ayu, and Luh Gede Sri Artini. "PENGARUH PAD, DANA PERIMBANGAN DAN BELANJA MODAL TERHADAP KINERJA KEUANGAN DAERAH KABUPATEN BADUNG PROVINSI BALI." E-Jurnal Manajemen Universitas Udayana 8, no. 3 (December 5, 2018): 1315. http://dx.doi.org/10.24843/ejmunud.2019.v08.i03.p06.

Abstract:
The purpose of this study is to determine the effect of PAD, the Balancing Fund, and Capital Expenditure on regional financial performance. The data used in this research are secondary data obtained through a non-behavioral observation method as the data collection method, so no sampling technique or questionnaire is required. The data analysis technique is multiple regression analysis. The results indicate that local revenue has a positive and significant effect on the financial performance of Badung regency, balancing funds have a positive and significant impact on the financial performance of Badung regency, and capital expenditure has a positive and significant impact on the financial performance of Badung regency.
42

Lee, Taejun, Minju Kim, and Sung-Phil Kim. "Improvement of P300-Based Brain–Computer Interfaces for Home Appliances Control by Data Balancing Techniques." Sensors 20, no. 19 (September 29, 2020): 5576. http://dx.doi.org/10.3390/s20195576.

Abstract:
The oddball paradigm used in P300-based brain–computer interfaces (BCIs) intrinsically poses the issue of data imbalance between target stimuli and nontarget stimuli. Data imbalance can cause overfitting problems and, consequently, poor classification performance. The purpose of this study is to improve BCI performance by solving this data imbalance problem with sampling techniques. The sampling techniques were applied to BCI data in 15 subjects controlling a door lock, 15 subjects an electric light, and 14 subjects a Bluetooth speaker. We explored two categories of sampling techniques: oversampling and undersampling. Oversampling techniques, including random oversampling, synthetic minority oversampling technique (SMOTE), borderline-SMOTE, support vector machine (SVM) SMOTE, and adaptive synthetic sampling, were used to increase the number of samples for the class of target stimuli. Undersampling techniques, including random undersampling, neighborhood cleaning rule, Tomek’s links, and weighted undersampling bagging, were used to reduce the class size of nontarget stimuli. The over- or undersampled data were classified by an SVM classifier. Overall, some oversampling techniques improved BCI performance while undersampling techniques often degraded performance. Particularly, using borderline-SMOTE yielded the highest accuracy (87.27%) and information transfer rate (8.82 bpm) across all three appliances. Moreover, borderline-SMOTE led to performance improvement, especially for poor performers. A further analysis showed that borderline-SMOTE improved SVM by generating more support vectors within the target class and enlarging margins. However, there was no difference in the accuracy between borderline-SMOTE and the method of applying the weighted regularization parameter of the SVM. Our results suggest that although oversampling improves performance of P300-based BCIs, it is not just the effect of the oversampling techniques, but rather the effect of solving the data imbalance problem.
APA, Harvard, Vancouver, ISO, and other styles
43

Bennis, Saad, and Narut Kang. "Multivariate Technique for Validating Historical Hydrometric Data with Redundant Measurements." Hydrology Research 31, no. 2 (April 1, 2000): 107–26. http://dx.doi.org/10.2166/nh.2000.0008.

Full text
Abstract:
The aim of this research was to develop an automated methodology for validating chronological series of natural inflows to reservoirs. Theoretically, gauges located on the same reservoir should indicate the same reading. However, under the influence of meteorological and hydraulic factors, or simply because of failed measuring equipment, there may be large deviations between the various measurements. Since the calculation of historical natural inflows is directly linked to the measurement of reservoir level through the water balance equation, there will be as many series of natural inflows as there are series of reservoir levels. A multivariate filtering technique is used to validate the historical natural inflow computed from each water level variation. The multi-filter methodology has the advantage of balancing the water volume of natural inflows to the reservoir when applied over a relatively long period of time. As a result, the validated flood peaks are not systematically overestimated or underestimated, and the validated natural inflows are nearly identical for all the gauges. The proposed technique has been incorporated into a software program called ValiDeb, which has been successfully tested on-site on the Gatineau River in Quebec.
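The abstract does not spell out the multivariate filter used in ValiDeb, so the following sketch only illustrates the underlying setup: one natural-inflow series is derived from each level gauge through the water balance equation, and the redundant series are then reconciled, here with a simple inverse-variance weighting chosen purely for illustration. The storage curve, gauge records, and outflow values are hypothetical.

```python
# Rough sketch (not the ValiDeb method): compute one natural-inflow series per
# reservoir level gauge from the water balance equation
#   I_k(t) = O(t) + (S_k(t) - S_k(t-1)) / dt
# and reconcile the redundant series with inverse-variance weights.
# Storage S_k is taken here as level * area (a hypothetical linear storage curve).
import numpy as np

def inflow_from_levels(levels_m, outflow_m3s, area_m2, dt_s=86400.0):
    """Natural inflow (m^3/s) implied by one gauge's level record."""
    storage = levels_m * area_m2                       # m^3
    dS = np.diff(storage, prepend=storage[0])          # m^3 per time step
    return outflow_m3s + dS / dt_s

def reconcile(inflow_series):
    """Fuse redundant inflow series, weighting each gauge by how little it
    deviates from the cross-gauge mean (inverse variance of its residuals)."""
    series = np.asarray(inflow_series)                 # (n_gauges, n_steps)
    residual_var = ((series - series.mean(axis=0)) ** 2).mean(axis=1)
    w = (1.0 / residual_var) / (1.0 / residual_var).sum()
    return w @ series

# Hypothetical daily records from three gauges on the same reservoir.
rng = np.random.default_rng(1)
true_levels = 100 + np.cumsum(rng.normal(0, 0.01, 365))
outflow = np.full(365, 50.0)                           # m^3/s, assumed known
gauges = [true_levels + rng.normal(0, 0.02, 365) for _ in range(3)]

inflows = [inflow_from_levels(g, outflow, area_m2=5e7) for g in gauges]
validated = reconcile(inflows)
print(validated[:5])
```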
APA, Harvard, Vancouver, ISO, and other styles
44

Ryan, Gregorius, Pricillia Katarina, and Derwin Suhartono. "MBTI Personality Prediction Using Machine Learning and SMOTE for Balancing Data Based on Statement Sentences." Information 14, no. 4 (April 3, 2023): 217. http://dx.doi.org/10.3390/info14040217.

Full text
Abstract:
The rise of social media as a platform for self-expression and self-understanding has led to increased interest in using the Myers–Briggs Type Indicator (MBTI) to explore human personalities. However, more research is needed on how different word-embedding techniques, machine learning algorithms, and imbalanced-data-handling techniques can improve MBTI personality-type predictions. Our research aimed to investigate the efficacy of these techniques by utilizing the Word2Vec model to obtain a vector representation of words in the corpus data. We implemented several machine learning approaches, including logistic regression, linear support vector classification, stochastic gradient descent, random forest, the extreme gradient boosting classifier, and the CatBoost classifier. In addition, we used the synthetic minority oversampling technique (SMOTE) to address the issue of imbalanced data. The results showed that our approach could achieve a relatively high F1 score (between 0.7383 and 0.8282), depending on the chosen model for predicting and classifying MBTI personality. Furthermore, we found that using SMOTE could improve the selected models’ performance (F1 score between 0.7553 and 0.8337), proving that the machine learning approach integrated with Word2Vec and SMOTE could predict and classify MBTI personality well, thus enhancing the understanding of MBTI.
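No code accompanies the abstract; the sketch below is a condensed, hypothetical version of the kind of pipeline it outlines: averaged Word2Vec vectors as sentence features, SMOTE to balance the training classes, a classifier, and a macro-averaged F1 score. The toy corpus, labels, and parameters are placeholders rather than the authors' setup.

```python
# Minimal sketch of a Word2Vec + SMOTE + classifier pipeline for text labels
# such as MBTI types. The corpus and labels below are tiny toy placeholders.
import numpy as np
from gensim.models import Word2Vec
from imblearn.over_sampling import SMOTE
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

corpus = [
    "i love quiet evenings with a good book".split(),
    "planning every detail keeps me calm".split(),
    "i prefer writing to speaking in meetings".split(),
    "big parties give me so much energy".split(),
] * 50
labels = np.array(["I", "I", "I", "E"] * 50)   # deliberately imbalanced toy labels

w2v = Word2Vec(corpus, vector_size=50, window=5, min_count=1, epochs=20, seed=0)

def sentence_vector(tokens):
    # Average the word vectors of the tokens present in the vocabulary.
    vecs = [w2v.wv[t] for t in tokens if t in w2v.wv]
    return np.mean(vecs, axis=0) if vecs else np.zeros(w2v.vector_size)

X = np.vstack([sentence_vector(doc) for doc in corpus])
X_tr, X_te, y_tr, y_te = train_test_split(X, labels, test_size=0.3,
                                          stratify=labels, random_state=0)

# SMOTE synthesizes minority-class vectors so both classes are equally sized.
X_bal, y_bal = SMOTE(random_state=0, k_neighbors=3).fit_resample(X_tr, y_tr)
clf = LogisticRegression(max_iter=1000).fit(X_bal, y_bal)
print("macro F1:", f1_score(y_te, clf.predict(X_te), average="macro"))
```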
APA, Harvard, Vancouver, ISO, and other styles
45

Barros, Thiago M., Plácido A. SouzaNeto, Ivanovitch Silva, and Luiz Affonso Guedes. "Predictive Models for Imbalanced Data: A School Dropout Perspective." Education Sciences 9, no. 4 (November 15, 2019): 275. http://dx.doi.org/10.3390/educsci9040275.

Full text
Abstract:
Predicting school dropout rates is an important issue for the smooth execution of an educational system. The problem is addressed by classifying students into two classes using statistical datasets related to educational activities: one class identifies the students who tend to persist, and the other identifies the students who tend to drop out. This classification is often affected by class imbalance, a phenomenon that masks the obtained results. This study delves into this phenomenon and provides a reliable educational data mining technique that accurately predicts dropout rates. In particular, three classification techniques, namely decision tree, neural networks, and Balanced Bagging, are used. The performance of these classifiers is tested with and without downsampling, SMOTE, and ADASYN data balancing. It is found that, among the evaluated metrics, the geometric mean and unweighted average recall (UAR) provide reliable results when predicting dropout rates with the Balanced Bagging classifier.
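As a hedged sketch of the modelling setup the abstract mentions, the snippet below trains imbalanced-learn's BalancedBaggingClassifier on a synthetic imbalanced dataset and reports the geometric mean and UAR (balanced accuracy), the metrics the study singles out; the data and parameters are stand-ins, not the paper's school records.

```python
# Sketch (not the paper's experiment): Balanced Bagging on an imbalanced
# two-class problem, scored with the geometric mean and UAR (balanced accuracy).
from imblearn.ensemble import BalancedBaggingClassifier
from imblearn.metrics import geometric_mean_score
from sklearn.datasets import make_classification
from sklearn.metrics import balanced_accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for a dropout dataset: roughly 90% "persist", 10% "dropout".
X, y = make_classification(n_samples=2000, n_features=20, weights=[0.9, 0.1],
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, stratify=y,
                                          random_state=0)

# Each ensemble member is trained on a class-balanced bootstrap sample
# (decision trees by default), so the majority class does not dominate.
clf = BalancedBaggingClassifier(n_estimators=50, random_state=0).fit(X_tr, y_tr)
pred = clf.predict(X_te)
print("geometric mean:", geometric_mean_score(y_te, pred))
print("UAR (balanced accuracy):", balanced_accuracy_score(y_te, pred))
```

Both metrics average performance over the two classes, which is why they remain informative when plain accuracy is inflated by the majority class.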
APA, Harvard, Vancouver, ISO, and other styles
46

Kaliappan, M., E. Mariappan, M. Viju Prakash, and B. Paramasivan. "Load Balanced Clustering Technique in MANET using Genetic Algorithms." Defence Science Journal 66, no. 3 (April 25, 2016): 251. http://dx.doi.org/10.14429/dsj.66.9205.

Full text
Abstract:
A mobile ad hoc network (MANET) has a dynamic topology, owing to factors such as energy conservation and node movement, which leads to the dynamic load-balanced clustering problem (DLBCP). Load balancing and reliable data transfer between all nodes are essential to prolong the lifetime of the network. A MANET can also be partitioned into clusters to maintain the network structure; clustering is generally used to reduce the size of the topology and to aggregate topology information, and an effective clustering algorithm is needed to adapt to topology changes. In this work, we used an energy metric within a genetic algorithm (GA) to solve the DLBCP, since selecting an energy-efficient cluster head is important for maintaining the cluster structure and balancing the load effectively. Specifically, we used the elitism-based immigrants genetic algorithm (EIGA) and the memory-enhanced genetic algorithm (MEGA), which select an optimal cluster head by considering distance and energy parameters. EIGA maintains the diversity level of the population, while MEGA stores old environments in memory. This promotes load balancing in the cluster structure and increases the lifetime of the network. Experimental results show that the proposed schemes increase the network lifetime and reduce total energy consumption, and the simulation results show that MEGA and EIGA give better performance in terms of load balancing.
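The abstract gives only the outline of EIGA/MEGA; the sketch below is a generic elitist GA for cluster-head selection that scores candidate head sets by distance-to-nearest-head plus a residual-energy penalty, illustrating the shared idea rather than the paper's algorithms. Node positions, energies, and all GA parameters are hypothetical.

```python
# Simplified sketch of GA-based cluster-head selection in a MANET-like setting.
# This is a generic elitist GA over a fixed number of cluster heads, not the
# EIGA/MEGA variants of the paper; node positions and energies are synthetic.
import numpy as np

rng = np.random.default_rng(0)
N_NODES, N_HEADS, POP, GENS = 60, 6, 40, 100
pos = rng.uniform(0, 100, size=(N_NODES, 2))       # node coordinates
energy = rng.uniform(0.2, 1.0, size=N_NODES)       # residual energy per node

def cost(heads):
    """Lower is better: members' distance to the nearest head, plus a penalty
    for choosing heads with low residual energy."""
    d = np.linalg.norm(pos[:, None, :] - pos[heads][None, :, :], axis=2)
    return d.min(axis=1).sum() + 100.0 * (1.0 - energy[heads]).sum()

def random_individual():
    return rng.choice(N_NODES, size=N_HEADS, replace=False)

def crossover(a, b):
    # Child head set is drawn from the union of both parents' head sets.
    pool = np.unique(np.concatenate([a, b]))
    return rng.choice(pool, size=N_HEADS, replace=False)

def mutate(ind, rate=0.1):
    ind = ind.copy()
    for i in range(N_HEADS):
        if rng.random() < rate:
            ind[i] = rng.choice(np.setdiff1d(np.arange(N_NODES), ind))
    return ind

population = [random_individual() for _ in range(POP)]
for _ in range(GENS):
    population.sort(key=cost)
    elite = population[: POP // 5]                 # elitism: keep the best 20%
    offspring = [mutate(crossover(elite[rng.integers(len(elite))],
                                  population[rng.integers(POP)]))
                 for _ in range(POP - len(elite))]
    population = elite + offspring

best = min(population, key=cost)
print("cluster heads:", sorted(best.tolist()), "cost:", round(cost(best), 1))
```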
APA, Harvard, Vancouver, ISO, and other styles
47

Paikaray, Divya, Divyanshi Chhabra, Sachin Sharma, Sachin Goswami, Shashikala H K, and Prof Gordhan Jethava. "Energy Efficiency Based Load Balancing Optimization Routing Protocol In 5G Wireless Communication Networks." International Journal of Communication Networks and Information Security (IJCNIS) 14, no. 3 (December 31, 2022): 187–98. http://dx.doi.org/10.17762/ijcnis.v14i3.5605.

Full text
Abstract:
How to distribute the workload among virtual machines and resources is a significant study area in cloud computing that still requires attention. The main goal of this research is to develop an efficient cloud load-balancing approach that improves response time, decreases readiness time, maximises resource utilisation, and decreases activity rejection time. This research proposes a novel load-balancing technique based on network optimisation using a routing protocol for 5G wireless communication networks. Network load balancing is carried out using a cloud-based, software-defined, multi-objective optimisation routing protocol, and network security is then enhanced through data classification using a deep belief Boltzmann neural network. Experimental analysis is carried out on load balancing and security data classification in terms of throughput, packet delivery ratio, energy efficiency, latency, accuracy, precision, and recall.
APA, Harvard, Vancouver, ISO, and other styles
48

Dlaska, Constantin, Petros Ismailidis, Kenji Doma, Benjamin Brandon, Matthew Wilkinson, and Kaushik Hazratwala. "Femoral Component Rotation in Total Knee Arthroplasty Using a Tibia-First, Gap-Balancing, “Functional Alignment” Technique." Journal of Clinical Medicine 11, no. 22 (November 10, 2022): 6680. http://dx.doi.org/10.3390/jcm11226680.

Full text
Abstract:
Background: The purpose of this study was to describe the femoral component rotation in total knee arthroplasty (TKA) using a tibia-first, gap-balancing, “functional alignment” technique. Methods: Ninety-seven patients with osteoarthritis received a TKA using computer navigation. The tibial resection was performed according to the kinematic alignment (KA) principles, while the femoral rotation was set according to the gap-balancing technique. Preoperative MRIs and intraoperative resection depth data were used to calculate the following rotational axes: the transepicondylar axis (TEA), the posterior condylar axis (PCA) and the prosthetic posterior condylar axis (rPCA). The angles between the PCA and the TEA (PCA/TEA), between the rPCA and the PCA (rPCA/PCA) and between the rPCA and the TEA (rPCA/TEA) were measured. Data regarding patellar maltracking and PROMs were collected for 24 months postoperatively. Results: The mean PCA/TEA, rPCA/TEA and rPCA/PCA angles were −5.1° ± 2.1°, −4.8° ± 2.6° and −0.4° ± 1.7°, respectively (the negative values denote the internal rotation of the PCA to the TEA, rPCA to TEA and rPCA to PCA, respectively). There was no need for lateral release and no cases of patellar maltracking. Conclusion: A tibia-first, gap-balancing, “functional alignment” approach allows incorporating a gap-balancing technique with kinematic principles. Sagittal complexities in the proximal tibia (variable medial and lateral slopes) can be accounted for, as the tibial resection is completed prior to setting the femoral rotation. The prosthetic femoral rotation is internally rotated relative to the TEA, almost parallel to the PCA, similar to the femoral rotation of the KA-TKA technique. This technique did not result in patellar maltracking.
APA, Harvard, Vancouver, ISO, and other styles
49

GÜMÜŞ, İbrahim Halil, and Serkan GÜLDAL. "Tıbbi Verilerde Heinz Ortalamasına Dayalı Yeni Sentetik Veriler Üreterek Veri Kümesini Dengeleme." Afyon Kocatepe University Journal of Sciences and Engineering 22, no. 3 (June 30, 2022): 570–76. http://dx.doi.org/10.35414/akufemubid.1011058.

Full text
Abstract:
Advances in science and technology have caused data sizes to increase rapidly, and unbalanced datasets have become common. A dataset is unbalanced if the classes are not approximately equally represented. In this case, classification performance decreases because classification algorithms are developed on the assumption that the datasets are balanced: the accuracy of the classification favors the majority class, so the minority class is often misclassified. The majority of datasets, especially those used in the medical field, have an unbalanced distribution. Several studies have recently been performed to balance this distribution through undersampling and oversampling. In this study, a distance- and mean-based resampling method is used to produce synthetic samples from the minority class. For the resampling process, the closest neighbors of all data points belonging to the minority class are determined using the Euclidean distance. Based on these neighbors and the Heinz mean, the desired number of new synthetic samples is generated between each sample and its neighbors to obtain balance. The Random Forest (RF) and Support Vector Machine (SVM) algorithms are used to classify the raw and balanced datasets, and the results are compared. In addition, other well-known methods (Random Over-Sampling, ROS; Random Under-Sampling, RUS; and the Synthetic Minority Oversampling TEchnique, SMOTE) are compared with the proposed method. It was shown that the dataset balanced with the proposed resampling method increases classification efficiency compared with the raw dataset and the other methods. Accuracy measurements of RF are 0.751 and 0.799, and accuracy measurements of SVM are 0.762 and 0.781, for the raw and resampled data, respectively. Likewise, there are improvements in the other metrics, such as Precision, Recall, and F1 score.
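The generation rule is described only at a high level, so the sketch below is an assumption-laden reading rather than the authors' exact algorithm: Euclidean nearest neighbours are found within the minority class, and each synthetic sample takes, feature by feature, the Heinz mean H_p(a, b) = (a^p b^(1-p) + a^(1-p) b^p) / 2 of a minority point and its neighbour. The parameter p and the number of samples generated are arbitrary choices here, and features are assumed non-negative since the Heinz mean requires it.

```python
# Sketch of Heinz-mean-based minority oversampling (an interpretation of the
# abstract, not the authors' exact algorithm). For each minority sample, find
# its nearest minority neighbour by Euclidean distance, then create a synthetic
# point whose features are Heinz means of the corresponding feature pair.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def heinz_mean(a, b, p=0.25):
    # Defined for non-negative a, b; p=0 gives the arithmetic mean,
    # p=0.5 the geometric mean.
    return (a**p * b**(1 - p) + a**(1 - p) * b**p) / 2.0

def heinz_oversample(X_min, n_new, p=0.25, seed=0):
    """Generate n_new synthetic minority samples from nearest-neighbour pairs."""
    rng = np.random.default_rng(seed)
    nn = NearestNeighbors(n_neighbors=2).fit(X_min)  # neighbour 0 is the point itself
    _, idx = nn.kneighbors(X_min)
    synthetic = []
    for _ in range(n_new):
        i = rng.integers(len(X_min))
        j = idx[i, 1]                                # nearest other minority sample
        synthetic.append(heinz_mean(X_min[i], X_min[j], p))
    return np.array(synthetic)

# Toy imbalanced data with non-negative features (hypothetical).
rng = np.random.default_rng(0)
X_major = rng.uniform(0.0, 1.0, size=(300, 5))
X_minor = rng.uniform(0.4, 1.4, size=(30, 5))
X_new = heinz_oversample(X_minor, n_new=len(X_major) - len(X_minor))
print("synthetic minority samples:", X_new.shape)    # (270, 5)
```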
APA, Harvard, Vancouver, ISO, and other styles
50

Marbun, Susiana, Erna Putri Manalu, and Yois Nelsari Malau. "Pengaruh pajak daerah, retribusi daerah, dana perimbangan, SiLPA terhadap alokasi belanja daerah pada Kabupaten/Kota Provinsi Sumatera Selatan Tahun 2017-2019." Jurnal Paradigma Ekonomika 17, no. 1 (May 9, 2022): 19–30. http://dx.doi.org/10.22437/jpe.v17i1.14370.

Full text
Abstract:
This research examines the influence of Regional Taxes, Regional Levies, Balancing Funds, and SiLPA on Regional Expenditure Allocations in the regencies/cities of South Sumatra Province in 2017-2019. The research uses a quantitative descriptive approach with a sample of 17 regencies/cities selected through a saturated sampling technique. Data were obtained using documentation techniques from the website www.bpssumsel.go.id. The coefficient-of-determination test yielded an Adjusted R2 of 0.942, meaning the independent variables explain 94.2% of the variation, while the remaining 5.8% is explained by other variables not examined in this research. The results show that regional taxes, regional levies, and balancing funds affect regional expenditure allocation, whereas SiLPA does not; taken together, regional taxes, regional levies, balancing funds, and SiLPA affect regional expenditure allocation.
APA, Harvard, Vancouver, ISO, and other styles
