Academic literature on the topic 'Database transaction safety'

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Database transaction safety.'

Next to every source in the list of references there is an 'Add to bibliography' button. Press it, and we will automatically generate a bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Database transaction safety"

1. Sheard, Tim, and David Stemple. "Automatic verification of database transaction safety." ACM Transactions on Database Systems 14, no. 3 (September 1989): 322–68. http://dx.doi.org/10.1145/68012.68014.
2. Lin, Jerry Chun-Wei, Matin Pirouz, Youcef Djenouri, Chien-Fu Cheng, and Usman Ahmed. "Incrementally updating the high average-utility patterns with pre-large concept." Applied Intelligence 50, no. 11 (June 30, 2020): 3788–807. http://dx.doi.org/10.1007/s10489-020-01743-y.

Abstract:
High-utility itemset mining (HUIM) is an emerging approach for detecting high-utility patterns in databases. Most existing HUIM algorithms consider only the itemset utility, regardless of length, so utility inflates as itemset size grows. High average-utility itemset mining (HAUIM) takes the size of the itemset into account, providing a more balanced measure of average utility for decision-making. Several algorithms efficiently mine the set of high average-utility itemsets (HAUIs), but most focus on static databases. A fast-updated (FUP)-based algorithm was previously developed to handle the incremental problem efficiently, but it must still re-scan the database when an itemset is small in the original database yet is a high average-utility upper-bound itemset (HAUUBI) in the newly inserted transactions. In this paper, an efficient framework called PRE-HAUIMI for transaction insertion in dynamic databases is developed, relying on average-utility-list (AUL) structures. Moreover, we apply the pre-large concept to HAUIM. The pre-large concept speeds up mining: it ensures that if the total utility in the newly inserted transactions is within a safety bound, itemsets that are small in the original database cannot become large after the update. This, in turn, reduces recurring database scans while still producing the correct HAUIs. Experiments demonstrate that PRE-HAUIMI outperforms the state-of-the-art batch-mode HAUI-Miner and the state-of-the-art incremental IHAUPM and FUP-based algorithms in terms of runtime, memory, number of assessed patterns, and scalability.
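The average-utility measure behind HAUIM is easy to state: sum an itemset's utility over the transactions that contain it, then divide by the itemset's length. Below is a minimal, naive Python sketch over an invented toy database — not the paper's PRE-HAUIMI algorithm, which adds AUL structures and pre-large pruning.

```python
from itertools import combinations

# Toy transaction database: each transaction maps item -> utility.
# (Illustrative numbers, not taken from the paper.)
DB = [
    {"a": 4, "b": 2, "c": 6},
    {"a": 3, "c": 1},
    {"b": 5, "c": 2, "d": 7},
]

def average_utility(itemset, db):
    """Utility of `itemset` summed over supporting transactions,
    divided by the itemset's length (the HAUIM measure)."""
    items = set(itemset)
    total = sum(
        sum(t[i] for i in items) for t in db if items <= t.keys()
    )
    return total / len(items)

def high_average_utility_itemsets(db, min_au):
    """Naive enumeration of all HAUIs (exponential; real miners prune)."""
    universe = sorted({i for t in db for i in t})
    return {
        combo: average_utility(combo, db)
        for k in range(1, len(universe) + 1)
        for combo in combinations(universe, k)
        if average_utility(combo, db) >= min_au
    }

print(high_average_utility_itemsets(DB, min_au=7.0))
```

Real miners avoid this exponential enumeration by pruning with upper bounds such as the HAUUBI mentioned in the abstract.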
3. Chen, Chao, Abdelsalam (Sumi) Helal, Zhi Jin, Mingyue Zhang, and Choonhwa Lee. "IoTranx: Transactions for Safer Smart Spaces." ACM Transactions on Cyber-Physical Systems 6, no. 1 (January 31, 2022): 1–26. http://dx.doi.org/10.1145/3471937.

Abstract:
Smart spaces such as smart homes deliver digital services to optimize space use and enhance user experience. They are composed of an Internet of Things (IoT), people, and physical content. They differ from traditional computer systems in that their cyber-physical nature ties intimately with the users and the built environment. The impact of ill-programmed applications in such spaces goes beyond loss of data or a computer crash, potentially risking physical harm to the space and its users. Ensuring smart space safety is therefore critically important to successfully deliver intimate and convenient services surrounding our daily lives. By modeling smart space as a highly dynamic database, we present IoT Transactions, an analogy to database transactions, as an abstraction for programming and executing the services as the handling of the devices in smart space. Unlike traditional database management systems that take a “clear room approach,” smart spaces take a “dirty room approach” where imperfection and unattainability of full control and guarantees are the new normal. We identify Atomicity, Isolation, Integrity and Durability (AI2D) as the set of properties necessary to define the safe runtime behavior for IoT transactions for maintaining “permissible device settings” of execution and to avoid or detect and resolve “impermissible settings.” Furthermore, we introduce a lock protocol, utilizing variations of lock concepts, that enforces AI2D safety properties during transaction processing. We show a brief proof of the protocol correctness and a detailed analytical model to evaluate its performance.
4. Rahmawati, Rianti, Anak Agung Gde Agung, and Fitri Sukmawati. "Aplikasi Perhitungan Persediaan Bahan Baku dengan Metode Economic Order Quantity Berdasarkan Varian Produk." Jurnal Nasional Pendidikan Teknik Informatika (JANAPATI) 5, no. 1 (March 13, 2016): 34. http://dx.doi.org/10.23887/janapati.v5i1.9915.

Abstract:
CV Dwi Sumber is a manufacturing company that sells a range of product variants. Recording of raw materials, from the warehouse through the production process, was done manually, and the company ordered raw material in the same quantities without checking the stock available in the warehouse; every order also incurs an ordering fee. Transactions were likewise recorded manually, so warehouse stock was not reduced immediately, which accumulated into overstock. The application described here uses the Economic Order Quantity (EOQ) method to calculate the optimal raw-material order and thereby reduce ordering fees. It also records transactions, automatically reduces warehouse stock, and calculates safety stock so management can decide when to reorder raw materials. Transaction records can be viewed as an accounting journal and general ledger. The application is built with the PHP programming language and a MySQL database.
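The EOQ method named in the abstract follows the standard formula EOQ = √(2DS/H), for annual demand D, cost per order S, and holding cost per unit H; a reorder point combines lead-time demand with safety stock. A sketch with illustrative figures (not the paper's data):

```python
import math

def eoq(annual_demand, order_cost, holding_cost_per_unit):
    """Economic Order Quantity: order size minimizing ordering + holding cost."""
    return math.sqrt(2 * annual_demand * order_cost / holding_cost_per_unit)

def reorder_point(daily_demand, lead_time_days, safety_stock):
    """Reorder when stock falls to lead-time demand plus safety stock."""
    return daily_demand * lead_time_days + safety_stock

# Illustrative figures only (not from the paper).
q = eoq(annual_demand=12_000, order_cost=50_000, holding_cost_per_unit=1_500)
print(round(q))                                                   # 894 units per order
print(reorder_point(daily_demand=40, lead_time_days=5, safety_stock=60))  # 260
```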
5. Huang, Chen, Dan Wang, Zhen Wang, and Jian Yun. "Development of Cultural Investment Trading Services Platform Based on Drupal Framework." Applied Mechanics and Materials 373-375 (August 2013): 1647–51. http://dx.doi.org/10.4028/www.scientific.net/amm.373-375.1647.

Abstract:
We used a non-traditional technique, Drupal, to create a cultural investment services platform. Drupal is superior to traditional techniques in terms of efficiency and safety. The site's foreground uses the Bartik theme, a MySQL database provides background support, and core transaction functionality is implemented with the Ubercart e-commerce module. The site has a friendly search and browsing interface and provides a forum, private messages, and other means of communication.
6. Mohamed, Asghaiyer. "Network load traffic on MySQL atomic transaction database." Bulletin of Social Informatics Theory and Application 4, no. 1 (April 23, 2020): 35–39. http://dx.doi.org/10.31763/businta.v4i1.188.

Abstract:
Internet technology is developing very rapidly, especially around database systems. Today's databases have led to data that cannot be processed in the traditional way, which we call big data. Data stored on a server requires a mechanism to keep it valid and intact; hence the transaction mechanism in RDBMSs, which ensures that stored data forms a unified whole, as in customer account data, ATM withdrawals, e-commerce transactions, and so on. Using transactions in a database, versus not using atomic transactions, makes a difference in network traffic. This research analyzes the network traffic, or density, of a database that uses transactions versus one that does not, for the users who access it. The method is quantitative, with questionnaires distributed to 300 respondents. The results show that after 300 people accessed a database with transactions and one without, the densest network belonged to the system using the transaction feature, with roughly 13% more traffic than the network without transactions. This reflects the two-way communication of a transactional database, which gives feedback to the user so the data is reliable, an indicator that the data has been stored safely. Further research could examine big data systems that use the atomic transaction model.
7. Vandevoort, Brecht, Bas Ketsman, Christoph Koch, and Frank Neven. "Robustness against read committed for transaction templates." Proceedings of the VLDB Endowment 14, no. 11 (July 2021): 2141–53. http://dx.doi.org/10.14778/3476249.3476268.

Abstract:
The isolation level Multiversion Read Committed (RC), offered by many database systems, is known to trade consistency for increased transaction throughput. Sometimes, transaction workloads can be safely executed under RC obtaining the perfect isolation of serializability at the lower cost of RC. To identify such cases, we introduce an expressive model of transaction programs to better reason about the serializability of transactional workloads. We develop tractable algorithms to decide whether any possible schedule of a workload executed under RC is serializable (referred to as the robustness problem). Our approach yields robust subsets that are larger than those identified by previous methods. We provide experimental evidence that workloads that are robust against RC can be evaluated faster under RC compared to stronger isolation levels. We discuss techniques for making workloads robust against RC by promoting selective read operations to updates. Depending on the scenario, the performance improvements can be considerable. Robustness testing and safely executing transactions under the lower isolation level RC can therefore provide a direct way to increase transaction throughput without changing DBMS internals.
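The robustness analysis builds on classic conflict-serializability. As background (this is the textbook test, not the paper's template-based decision procedure), a schedule is conflict-serializable iff its conflict graph is acyclic:

```python
# Schedule: (transaction id, operation 'r'/'w', item). Two operations
# conflict if they touch the same item, come from different transactions,
# and at least one of them is a write.
def conflict_graph(schedule):
    edges = set()
    for i, (t1, op1, x1) in enumerate(schedule):
        for t2, op2, x2 in schedule[i + 1:]:
            if t1 != t2 and x1 == x2 and "w" in (op1, op2):
                edges.add((t1, t2))
    return edges

def is_conflict_serializable(schedule):
    """True iff the conflict graph has no cycle (simple DFS check)."""
    edges = conflict_graph(schedule)
    nodes = {t for t, _, _ in schedule}
    adj = {n: [b for a, b in edges if a == n] for n in nodes}
    WHITE, GRAY, BLACK = 0, 1, 2
    color = dict.fromkeys(nodes, WHITE)
    def dfs(n):
        color[n] = GRAY
        for m in adj[n]:
            if color[m] == GRAY or (color[m] == WHITE and dfs(m)):
                return True  # back edge -> cycle
        color[n] = BLACK
        return False
    return not any(dfs(n) for n in nodes if color[n] == WHITE)

# T1 and T2 each read x then write x, interleaved: the classic lost update.
bad = [(1, "r", "x"), (2, "r", "x"), (1, "w", "x"), (2, "w", "x")]
good = [(1, "r", "x"), (1, "w", "x"), (2, "r", "x"), (2, "w", "x")]
print(is_conflict_serializable(bad), is_conflict_serializable(good))  # False True
```

The promotion technique the abstract mentions (turning selected reads into updates, e.g. `SELECT ... FOR UPDATE`) removes exactly the read-write interleavings that create such cycles under RC.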
8. Haoxiang, Wang, and Smys S. "A Survey on Digital Fraud Risk Control Management by Automatic Case Management System." March 2021 3, no. 1 (May 10, 2021): 1–14. http://dx.doi.org/10.36548/jeea.2021.1.001.

Abstract:
In this digital era, a huge amount of money has been laundered via digital frauds, which mainly occur during electronic payment transactions made by first-time credit/debit card users. Finance organizations currently face frequent fraud attempts, largely because their infrastructure relies on an outdated database, which degrades the working environment of any finance organization. The proposed research article provides an overview of the development of an automated prevention system that protects a finance organization from fraudulent attacks. The proposed automated case management system monitors expenses and studies user behavior while avoiding undesirable contact. The work develops a new management procedure to prevent electronic fraud in a finance organization; the existing procedure predicts digital fraud with an old database, which leads to disastrous, destructive analysis in the finance segment. Cyber-fraud prediction is used to anticipate fraud attempts with content-based analysis. The lack of resources is one of the enormous challenges in the digital fraud identification domain. The proposed scheme integrates all safety techniques to safeguard stakeholders and finance institutions from cyber-attacks.
9. Kuznetsov, Alexandr, Inna Oleshko, Vladyslav Tymchenko, Konstantin Lisitsky, Mariia Rodinko, and Andrii Kolhatin. "Performance Analysis of Cryptographic Hash Functions Suitable for Use in Blockchain." International Journal of Computer Network and Information Security 13, no. 2 (April 8, 2021): 1–15. http://dx.doi.org/10.5815/ijcnis.2021.02.01.

Abstract:
A blockchain, or in other words a chain of transaction blocks, is a distributed database that maintains an ordered chain of blocks that reliably connect the information contained in them. Copies of the chain's blocks are usually stored on multiple computers and synchronized in accordance with the rules for building a chain of blocks, which provides secure and change-resistant storage of information. Linked lists of blocks are built with hashing, a cryptographic primitive that provides one-wayness, collision resistance, and preimage resistance in the computation of a hash value (hash or message digest). This paper conducts a comparative performance analysis of hashing algorithms that can be used in modern decentralized blockchain networks. Specifically, hash performance on different desktop systems is investigated: the number of cycles per byte (Cycles/byte), the amount of data hashed per second (MB/s), and the hash rate (KHash/s). The comparative analysis of different hashing algorithms allows us to choose the most suitable candidates for building decentralized blockchain-type systems.
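Throughput figures like those in the paper (MB/s) can be approximated for the standard library's hashes; a rough sketch (the `benchmark` function and its parameters are ours, and exact numbers depend on hardware):

```python
import hashlib
import time

def benchmark(algorithm, size_mb=16, block=1 << 20):
    """Hash `size_mb` megabytes in 1 MB chunks and return throughput in MB/s."""
    data = b"\x00" * block
    h = hashlib.new(algorithm)
    start = time.perf_counter()
    for _ in range(size_mb):
        h.update(data)
    h.digest()
    return size_mb / (time.perf_counter() - start)

# Candidates commonly considered for blockchain use.
for algo in ("md5", "sha1", "sha256", "sha3_256", "blake2b"):
    print(f"{algo:9s} {benchmark(algo):8.1f} MB/s")
```

Cycles/byte can be derived from MB/s and the CPU clock rate; hash rate (KHash/s) additionally depends on the message size being hashed.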
10. Ou, Jiarui, and Jianglin Zhang. "Data Mining and Meta-Analysis of Psoriasis Based on Association Rules." Journal of Healthcare Engineering 2022 (January 27, 2022): 1–11. http://dx.doi.org/10.1155/2022/9188553.

Abstract:
Psoriasis is a common chronic and recurrent disease in dermatology, which has a great impact on the physical and mental health of patients. Meta-analysis can evaluate the effectiveness and safety of defubao in the treatment of psoriasis vulgaris. This article observes psoriasis skin lesions treated with topical defubao and the changes in blood vessels under dermoscopy. Considering that the Apriori algorithm and its existing improvements ignore item weights and repeatedly scan the database, this paper proposes a matrix association-rule method based on random-forest weighting. The method uses the random forest algorithm to assign a weight to each item in the data set, and introduces matrix theory to convert the transaction data set into matrix form for storage, thereby improving operating efficiency. The article included 11 studies, of which 7 used the Investigator's Global Assessment (IGA) to evaluate efficacy, 5 used the Patient Overall Assessment (PGA) as the efficacy evaluation index, and the Psoriasis Area and Severity Index (PASI) was used as an observation index to evaluate efficacy. Seven studies conducted safety comparisons. Using IGA and PGA as evaluation indicators, the treatment effect of the defubao group was better than that of the calcipotriol and betamethasone groups, and the differences were statistically significant. The effect of defubao treatment for 8 weeks is significantly better than that of 4 weeks and 2 weeks, and the differences are statistically significant. Using PASI as the evaluation index, a descriptive study found that after 4 weeks of treatment for psoriasis vulgaris, the average PASI reduction rate of patients was higher than that of the calcipotriol and betamethasone groups.
The safety evaluation found that after 8 weeks of treatment, the incidence of adverse events in the defubao group was significantly lower than that in the calcipotriol group.

Dissertations / Theses on the topic "Database transaction safety"

1. Lawley, Michael John. "Program Transformation for Proving Database Transaction Safety." Griffith University. School of Computing and Information Technology, 2000. http://www4.gu.edu.au:8080/adt-root/public/adt-QGU20070228.150125.

Abstract:
In this thesis we propose the use of Dijkstra's concept of a predicate transformer [Dij75] for the determination of database transaction safety [SS89] and the generation of simple conditions to check that a transaction will not violate the integrity constraints in the case that it is not safe. The generation of this simple condition is something that can be done statically, thus providing a mechanism for generating safe transactions. Our approach treats a database as state, a database transaction as a program, and the database's integrity constraints as a postcondition in order to use a predicate transformer [Dij75] to generate a weakest precondition. We begin by introducing a set-oriented update language for relational databases for which a predicate transformer is then defined. Subsequently, we introduce a more powerful update language for deductive databases and define a new predicate transformer to deal with this language and the more powerful integrity constraints that can be expressed using recursive rules. Next we introduce a data model with object-oriented features including methods, inheritance and dynamic overriding. We then extend the predicate transformer to handle these new features. For each of the predicate transformers, we prove that they do indeed generate a weakest precondition for a transaction and the database integrity constraints. However, the weakest precondition generated by a predicate transformer still involves much redundant checking. For several general classes of integrity constraint, including referential integrity and functional dependencies, we prove that the weakest precondition can be substantially further simplified to avoid checking things we already know to be true under the assumption that the database currently satisfies its integrity constraints. In addition, we propose the use of the predicate transformer in combination with meta-rules that capture the exact incremental change to the database of a particular transaction.
This provides a more general approach to generating simple checks for enforcing transaction safety. We show that this approach is superior to previously known approaches to the problem of efficient integrity constraint checking and transaction safety for relational, deductive, and deductive object-oriented databases. Finally we demonstrate several further applications of the predicate transformer to the problems of schema constraints, dynamic integrity constraints, and determining the correctness of methods for view updates. We also show how to support transactions embedded in procedural languages such as C.
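As a toy illustration of the thesis's central idea (far simpler than its set-oriented update language), Dijkstra's assignment axiom gives wp(x := e, Q) as Q with x replaced by e, so an integrity constraint can be checked before the update runs instead of after:

```python
# Toy predicate transformer: for an assignment x := e, Dijkstra's
# assignment axiom gives wp(x := e, Q) = Q with x replaced by e.
def wp_assign(var, expr, postcondition):
    """Weakest precondition of `var := expr` w.r.t. a predicate string.
    (Naive textual substitution -- fine for this single-variable toy.)"""
    return postcondition.replace(var, f"({expr})")

constraint = "balance >= 0"  # the integrity constraint (postcondition)
pre = wp_assign("balance", "balance - amount", constraint)
print(pre)  # (balance - amount) >= 0

def safe_withdraw(state, amount):
    """Run the transaction only if the precondition holds beforehand,
    so the constraint never needs re-checking afterwards."""
    if eval(pre, {}, {"balance": state["balance"], "amount": amount}):
        state["balance"] -= amount
        return True
    return False  # the update would violate the constraint

account = {"balance": 100}
print(safe_withdraw(account, 30), account["balance"])   # True 70
print(safe_withdraw(account, 200), account["balance"])  # False 70
```

The thesis's contribution is doing this soundly for set-oriented updates and recursive constraints, and simplifying the resulting precondition under the assumption that the constraints already hold.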
2. Lawley, Michael John. "Program Transformation for Proving Database Transaction Safety." Thesis, Griffith University, 2000. http://hdl.handle.net/10072/365511.

Thesis (PhD Doctorate)
Doctor of Philosophy (PhD)
School of Computing and Information Technology
Faculty of Information and Communication Technology

Book chapters on the topic "Database transaction safety"

1. Lawley, Michael. "Transaction safety in deductive object-oriented databases." In Deductive and Object-Oriented Databases, 395–410. Berlin, Heidelberg: Springer Berlin Heidelberg, 1995. http://dx.doi.org/10.1007/3-540-60608-4_52.
2. Indraratne, Harith, and Gábor Hosszú. "Fine-Grained Data Access for Networking Applications." In Encyclopedia of Multimedia Technology and Networking, Second Edition, 568–73. IGI Global, 2009. http://dx.doi.org/10.4018/978-1-60566-014-1.ch076.

Abstract:
Current-day network applications require much more secure data storage than anticipated before. With millions of anonymous users using the same networking applications, the security of the data behind the applications has become a major concern of database developers and security experts. In most security incidents, the databases attached to the applications are the targets of attack. Most of these applications need to allow data manipulation at several granular levels for the users accessing the applications: not just at table and view level, but at tuple level. A database that supports fine-grained access control restricts the rows a user sees, based on his/her credentials. Generally, this restriction is enforced by a query-modification mechanism applied automatically at the database. This feature enables per-user data access within a single database, with the assurance of physical data separation. It is enabled by associating one or more security policies with tables, views, table columns, and table rows. Such a model is ideal for minimizing the complexity of security enforcement in databases behind network applications. With fine-grained access controls, one can create fast, scalable, and secure network applications. Each application can be written to find the correct balance between performance and security, so that each data transaction is performed as quickly and safely as possible. Today, database vendors such as Oracle 10g and IBM DB2 provide commercial implementations of fine-grained access control methods, such as filtering rows, masking columns selectively based on the policy, and applying the policy only when certain columns are accessed. The behavior of the fine-grained access control model can also be extended through the use of multiple types of policies based on the nature of the application, making the feature applicable to multiple situations.
Meanwhile, Microsoft SQL Server 2005 has also introduced features to control access to databases using fine-grained access controls. Fine-grained access control does not cover all the security issues related to Internet databases, but when implemented, it supports building secure databases rapidly and brings down the complexity of security management.
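The query-modification mechanism the chapter describes can be emulated portably by appending the row-level predicate to each query; a minimal SQLite sketch with an invented table and policy (commercial engines such as Oracle's Virtual Private Database attach the predicate automatically):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (id INTEGER, owner TEXT, total REAL);
    INSERT INTO orders VALUES
        (1, 'alice', 120.0), (2, 'bob', 80.0), (3, 'alice', 45.5);
""")

def rows_visible_to(user):
    """Append the row-level predicate to the query on the user's behalf,
    mimicking the automatic query modification described above."""
    return conn.execute(
        "SELECT id, total FROM orders WHERE owner = ? ORDER BY id", (user,)
    ).fetchall()

print(rows_visible_to("alice"))  # [(1, 120.0), (3, 45.5)]
print(rows_visible_to("bob"))    # [(2, 80.0)]
```

Each user queries the same physical table but sees only their own tuples, which is the per-user data access within a single database that the abstract describes.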
3. Alisawi, Muthana, Aras Al-Dawoodi, Yousif Mohammed Wahab, Layth Hammood, Asmaa Yaseen Nawaf, and Alaan Ghazi. "Developing the Real Estate Rental Sector in Third World Countries Using Blockchain Technology." In Blockchain Technologies for Sustainable Development in Smart Cities, 87–109. IGI Global, 2022. http://dx.doi.org/10.4018/978-1-7998-9274-8.ch006.

Abstract:
Blockchain technology is attractive to sectors that need to manage and keep their transactions in a database recording every movement of each transaction. To keep pace with the digital transformation taking place in the real estate sector, this chapter proposes implementing this technology in the Iraqi real estate sector. The chapter clarifies the current mechanism adopted in Iraq for completing real estate rental transactions and the problems that accompany it, then explains the advantages of applying blockchain technology through smart contracts, including eliminating tax evasion in this sector. Despite the technology's drawbacks, it still represents a safe environment for preserving, managing, and exchanging information.
4. Jayasena, K. Pubudu Nuwnthika, and Poddivila Marage Nimasha Ruwandi Madhunamali. "Blockchain and IoT-Based Diary Supply Chain Management System for Sri Lanka." In Advances in Data Mining and Database Management, 246–73. IGI Global, 2021. http://dx.doi.org/10.4018/978-1-7998-6694-7.ch015.

Abstract:
The central problem addressed in this research is how blockchain technology can be used in today's food supply chains to deliver greater traceability of assets. The aim is to create a blockchain model for the dairy supply chain that can be implemented across any food supply chain, and to present the advantages and limitations of its implementation. Blockchain allows all types of transactions in a supply chain to be monitored more safely and transparently. Acceptance of blockchain in supply chains and logistics is currently slow because of perceived risks and the lack of demonstrable models. The proposed solution removes the need for a trusted centralized authority and intermediaries, and provides records of transactions, improving integrity, reliability, security, and protection. All transactions are registered and maintained in the blockchain's immutable database, with access through a shared file network.
5. Yapa Bandara, Kosala, Subhasis Thakur, and John G. Breslin. "End-to-End Tracing and Congestion in a Blockchain." In Advances in Data Mining and Database Management, 68–91. IGI Global, 2021. http://dx.doi.org/10.4018/978-1-7998-6650-3.ch004.

Abstract:
Modern supply chain applications are complex systems that play an important role in many different sectors. Supply chain management systems are implemented to handle increasing complexity and flows of goods. However, most of these systems also increase the complexity of providing trust and a global view of transactions in a distributed supply chain system. Blockchain technology introduces a new architectural style to support the traceability and trust of transactions performed by participants in a network. This chapter uses this emerging technology to realize a supply chain use case from JLP Meats in the UK with improved transparency, trust, and end-to-end querying, while discussing potential challenges of realizing large-scale enterprise blockchain applications. The farm-to-fork process is implemented and tested for traceability, item recall, block analysis, and congestion, enabling food safety and sustainable agriculture. Potential challenges are highlighted in complex supply chains that need heterogeneous trade compliance and scalability.
6. Show, Arnab Kumar, Abhishek Kumar, Achintya Singhal, Gayathri N., and K. Vengatesan. "Future Blockchain Technology for Autonomous Applications/Autonomous Vehicle." In Advances in Data Mining and Database Management, 165–77. IGI Global, 2021. http://dx.doi.org/10.4018/978-1-7998-3295-9.ch010.

Abstract:
The autonomous industry has grown rapidly around self-driving cars. Its main purpose is to provide every kind of security, privacy, and secured traffic information to self-driving cars. Blockchain is another newly established security technology whose main aim is to provide more secure and convenient online transactions. By using it, the autonomous industry can provide more suitable, safe, and efficient transportation to passengers, and secured traffic information to vehicles; this information can easily be gathered by roadside units or by passing vehicles. Economic transactions also become more efficient, since blockchain technology allows peer-to-peer communication between nodes and eliminates the need for a third party. This chapter proposes how the autonomous industry can provide more adequate, proper, and safe transportation with the help of blockchain, and examines the possibility that autonomous vehicles become the future of transportation.
7. Gupta, Priti, Abhishek Kumar, Achintya Singhal, Shantanu Saurabh, and V. D. Ambeth Kumar. "Security, Privacy, and Trust Management and Performance Optimization of Blockchain." In Advances in Data Mining and Database Management, 134–46. IGI Global, 2021. http://dx.doi.org/10.4018/978-1-7998-3295-9.ch008.

Abstract:
Blockchain provides innovative ideas for storing information, executing transactions, performing functions, and creating trust in an open environment. Even though cryptographers, mathematicians, and coders have been trying to bring the most trustworthy protocols for authentication guarantees across various systems, blockchain technology is secure with no central authority in an open network system because of its large distributed network of independent users. If anyone tries to change the blockchain database, the current hash also changes and no longer matches the previous hash; in this way, blockchain creates privacy and trust in digital data by preventing malleability attacks. This chapter focuses on security and privacy in blockchain, which mainly concern two things: first, uncovering attacks suffered by blockchain systems and, second, putting forward specific, advanced proposals against such attacks.
APA, Harvard, Vancouver, ISO, and other styles
8

Jayasena, K. Pubudu Nuwnthika, and Poddivila Marage Nimasha Ruwandi Madhunamali. "Blockchain and IoT-Based Dairy Supply Chain Management System for Sri Lanka." In Research Anthology on Convergence of Blockchain, Internet of Things, and Security, 1264–91. IGI Global, 2022. http://dx.doi.org/10.4018/978-1-6684-7132-6.ch067.

Full text
Abstract:
The central problem addressed in this research is how blockchain technology can be used in today's food supply chains to deliver greater traceability of assets. The aim is to create a blockchain model in the dairy supply chain that can be implemented across any food supply chain, and to present the advantages and limitations of its implementation. Blockchain allows monitoring all types of transactions in a supply chain more safely and transparently. Acceptance of blockchain in supply chains and logistics is currently slow because of related risks and the lack of demonstrable models. The proposed solution removes the need for a trusted centralized authority and intermediaries, and provides records of transactions with high integrity, reliability, security, and efficiency. All transactions are registered and maintained in the immutable database of the blockchain, with access through a shared file network.
APA, Harvard, Vancouver, ISO, and other styles
9

Manjula, Rangu. "A Review to Leverage the Integration of Blockchain and Artificial Intelligence." In Advances in Data Mining and Database Management, 1–21. IGI Global, 2021. http://dx.doi.org/10.4018/978-1-7998-3295-9.ch001.

Full text
Abstract:
Information is the input for many transactions in blockchain technology and AI algorithms. Information on the internet is scattered everywhere and controlled by different stakeholders, which makes it hard to authorize or validate. In this chapter, the authors propose a novel approach: the InterPlanetary File System (IPFS) and Ethereum offer safer data storing, sharing, and computing in the large-scale internet environment. The authors integrate two key components: 1) blockchain-based data sharing with ownership guarantees and trustworthy data sharing in a large-scale environment to build real big data, and 2) AI-based secure computing technology to supply more intelligent security policies and build a trustworthy internet. IPFS makes it possible to distribute high volumes of data with high efficiency and no duplication.
APA, Harvard, Vancouver, ISO, and other styles
10

Gupta, Priti, Abhishek Kumar, Achintya Singhal, Shantanu Saurabh, and V. D. Ambeth Kumar. "Security, Privacy, and Trust Management and Performance Optimization of Blockchain." In Research Anthology on Convergence of Blockchain, Internet of Things, and Security, 1115–27. IGI Global, 2022. http://dx.doi.org/10.4018/978-1-6684-7132-6.ch059.

Full text
Abstract:
Blockchain provides innovative ideas for storing information, executing transactions, performing functions, creating trust in an open environment, etc. Even though cryptographers, mathematicians, and coders have been trying to produce the most trustworthy protocols for authentication guarantees across various systems, blockchain technology is secure with no central authority in an open network system because of its large distributed network of independent users. If anyone tries to change the blockchain database, the current hash also changes and no longer matches the previously stored hash. In this way, blockchain creates privacy and trust in digital data by preventing malleability attacks. This chapter focuses on security and privacy in blockchain. The safety and privacy of blockchain are mainly concerned with two things: firstly, uncovering the attacks suffered by blockchain systems, and secondly, putting forward specific and advanced proposals against such attacks.
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Database transaction safety"

1

Wolfson, Ouri, and Mihalis Yannakakis. "Deadlock-freedom (and safety) of transactions in a distributed database." In the fourth ACM SIGACT-SIGMOD symposium. New York, New York, USA: ACM Press, 1985. http://dx.doi.org/10.1145/325405.325418.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Fauvel, Owen R. "On the Distinction Between Design Parameters and Functional Requirements." In ASME 1994 Design Technical Conferences collocated with the ASME 1994 International Computers in Engineering Conference and Exhibition and the ASME 1994 8th Annual Database Symposium. American Society of Mechanical Engineers, 1994. http://dx.doi.org/10.1115/detc1994-0036.

Full text
Abstract:
Abstract The working of the design process has been described as a process of mapping Functional Requirements into Design Parameters. The definitions of these two types of information appear to be based upon intuitive differences. It is posited that by generating an operational distinction between the attributes in these two information domains, useful information patterns can be described for use in the design process. The following distinction is observed: whereas Design Parameters are deemed to have meaning which is insensitive to context, Functional Requirements and attributes can only be assessed with reference to the operative context or environment within which the designed object exists. Functional attributes such as usability, manufacturability, serviceability, safety, and affordability are seen not as intrinsic properties of a designed object but rather as measures of the interaction between the designed object and the relevant context; for the attributes cited, it would be necessary to characterize in turn the user environment, the manufacturing infrastructure, the servicing facilities and skills, the operational/legal environment, and the economic situation. The distinction as outlined serves as a premise upon which a fundamental information structure can be based. The proposed structure involves the categorization of design information into not only the Function Domain and the Design Parameter domain but also embraces a third - contextual - domain identified herein as the Environment Domain. Operational definitions have been devised for each type of information. These definitions also point to the nature of the interactions between the three types of information which take place during the process of design. It is suggested that what is presented here is not a new design paradigm but rather a new way to describe in a clear and explicit fashion the information and information transactions which are known to constitute the design processes. 
As such, it is seen to be of particular value in design education. However, it may also prove to be useful in organizing information systems for concurrent design activities. This view of design information has emerged through efforts to improve the effectiveness of teaching both design and manufacturing courses as well as the desire to improve the management of graduate design projects. Additionally, it has been influenced through ongoing research and development in the design of specific mechanical systems. As such, it is firmly rooted in the practicalities of design and design teaching and is constantly being put to the tests of utility, practicality, and veracity. For example, assessment of the attribute “manufacturability” has led to a systematic structuring of knowledge and information about manufacturing infrastructure in a way which facilitates decision-making as well as explanation and justification of the decision-making process. Some progress is also being made in developing information patterns which embrace all three information domains by way of providing pre-packaged design solutions for well-established types of design problem. The “bolted-joint”, for example, represents an extremely common design element about which much can be determined analytically but about which many other functional aspects are less accessible. Manufacturability, serviceability, reliability are attributes which can be assessed when due consideration is given to context regarding manufacture, use, placement, etc. The use of this information structure has also been useful in examining various models of the design process whether along traditional problem-solving lines or using artificial intelligence oriented systems. This approach has been used in examining the design process at the graduate level but student feedback has been sufficiently strong to suggest that it would be useful at the undergraduate level. 
In particular, while the traditional approach to teaching design provides an “activity map”, the addition of an “information map” is seen to be highly complementary. The notion of the information map is also seen to be useful for the management of concurrent design endeavours. It would be expected to provide a picture of both communication pathways and indicate the nature of the communications required. For example, the attribute “affordability” will usually be of particular importance for most designed things. Assessment of this attribute requires knowledge of the marketplace as well as the cost of the article and its performance capability. The cost attribute will require knowledge of the manufacturability of the article and hence the capability of the manufacturing infrastructure. In this way diverse interests can be visibly linked. And of course the map need not be a static one but would be expected to reflect the dynamics of the design process. If the distinction between attribute types continues to prove a useful and valid one, the door is opened to a new generation of parameterized design within which not only geometric relationships are programmed but more fuzzily-defined functions are determined by propagation of information along function-oriented pathways. The language for communication between disparate role-players in the design process has far to grow but the form of the communication can start to take on shape. Finally, the proposed information map will provide an explicit history of a design project thereby facilitating such activities as design audits and accident investigations. Perhaps as important is the role of the information map in recording the knowledge of expert designers and the generation of case histories which more explicitly illustrate the role of specific pieces of information in the generation of design solutions.
APA, Harvard, Vancouver, ISO, and other styles