Journal articles on the topic "File"

Follow this link to see other types of publications on the topic: File.

Create an accurate citation in APA, MLA, Chicago, Harvard and other styles


Consult the top 50 journal articles for your research on the topic "File".

Next to each source in the list of references there is an "Add to bibliography" button. Click this button, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Vancouver, Chicago, etc.

You can also download the full text of the academic publication in PDF format and read its abstract online whenever it is available in the metadata.

Explore journal articles on a wide variety of disciplines and organize your bibliography correctly.

1

Sahni, Priyanka, and Nonika Sharma. "File System". Journal of Advance Research in Computer Science & Engineering (ISSN: 2456-3552) 1, no. 4 (April 30, 2014): 01–05. http://dx.doi.org/10.53555/nncse.v1i4.513.

Full text
Abstract
A file is a named collection of related information that is recorded on secondary storage such as magnetic disks, magnetic tapes and optical disks. In general, a file is a sequence of bits, bytes, lines or records whose meaning is defined by the file's creator and user. In our research paper we focus on file structure, file types, file names and pathnames. File structure is an organization that follows a required format the operating system can understand; file types can also be used to indicate the internal structure of a file. File type refers to the ability of the operating system to distinguish different kinds of file, such as text files, source files and binary files. Many operating systems, including MS-DOS and UNIX, support several such file types.
2

Herlambang, Herlambang, Antory Royan, and Samhori Samhori. "PROCESS OF PROSPEROUS RESEARCH FILES IN THE CASE DETERMINING NEW INSTITUTIONS IN CRIMINAL MEASURES CORRUPTION". Bengkoelen Justice : Jurnal Ilmu Hukum 10, no. 1 (June 8, 2020): 1–12. http://dx.doi.org/10.33369/j_bengkoelenjust.v10i1.11339.

Full text
Abstract
This thesis is the result of research on the process by which case-file examining prosecutors determine new suspects in corruption cases. Specifically, it discusses how new suspects are determined by the prosecutors who examine corruption case files, and the factors that inhibit those prosecutors from naming a new suspect in a corruption case file. In particular, the thesis examines several corruption cases submitted by police investigators of the Arga Makmur Resort Police to the Arga Makmur District Public Prosecutor's Office. The results illustrate that, in determining new suspects in the case files submitted by police investigators, the examining prosecutors use the instrument of direct instructions from the examining prosecution team (P19) and the instrument of case exposition; the prosecutors then give instructions to the investigators to be fulfilled within 14 days, and once those instructions have been met the case file is declared complete by the prosecutors (P21).
3

Wang, Ya Rong, Pei Rong Wang, and Rui Liu. "Hybrid File System - A Strategy for the Optimization of File System". Advanced Materials Research 734-737 (August 2013): 3129–32. http://dx.doi.org/10.4028/www.scientific.net/amr.734-737.3129.

Full text
Abstract
The hybrid file system is designed to optimize the response latency of file system I/Os and extend the capacity of the local file system to the cloud by taking advantage of the Internet. Our hybrid file system consists of an SSD, an HDD and the Amazon S3 cloud file system. We store small files, the directory tree and the metadata of all files on the SSD, because the SSD performs well on small, random I/Os. The HDD is good at serving large, sequential I/Os, so we use it like a warehouse to store big files, which are linked by symbolic files on the SSD. We also extend the local file system to the cloud in order to enlarge its capacity. In this paper we describe the design and implementation details of our hybrid file system as well as its test data.
4

Zhao, Zhen Zhou, and Zhong Yong Fu. "The Research of Excel File Fragmentation Data Recovery". Applied Mechanics and Materials 496-500 (January 2014): 2274–78. http://dx.doi.org/10.4028/www.scientific.net/amm.496-500.2274.

Full text
Abstract
Because an Excel file is often edited, the data blocks on the hard disk that make up the file are often not contiguous. When such discontinuously stored, fragmented files are deleted or the disk is formatted, current data recovery software restores them poorly: the file is often recovered but cannot be opened or is garbled. It therefore makes sense to study an effective method for recovering fragmented Excel files. Based on the "Laura" file format, the paper first introduces fragmented files on the Windows platform, then gives a method for recognizing Excel file fragments, and finally, based on the identified fragments, gives a method for reassembling the Excel file.
5

Taufik, Ichsan, Undang Syaripudin, Faiz M. Kaffah, Jaka Giri Sobirin, Nanang Ismail, and Teddy Surya Gunawan. "Concealment of Files Blocked by Gmail with EOF-Based Image Steganography". Indonesian Journal of Electrical Engineering and Computer Science 12, no. 2 (November 1, 2018): 716. http://dx.doi.org/10.11591/ijeecs.v12.i2.pp716-721.

Full text
Abstract
Nowadays, due to security concerns, not every process of sending files via email runs smoothly. Several file extensions are blocked when sent via email; for example, a number of extensions are blocked by Gmail. This paper discusses a steganographic implementation that uses the End of File (EOF) algorithm to insert a special file into a JPG or PNG cover image so that files with these extensions can be sent via email. Before a file with a blocked extension is inserted into the cover file, it is first compressed to make it smaller. The proposed algorithm is implemented in Visual Basic.Net. Based on the tests performed, the application can insert Gmail-blocked files into the cover image, without altering the bits of either the cover image or the inserted file, with a 100% success rate. The stego-image file is also successfully sent via email without being blocked.
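The EOF technique described above can be illustrated with a short sketch: most image viewers stop reading a JPG/PNG at its end-of-image data, so extra bytes appended after it are ignored by the viewer but still travel with the file. The sketch below is a minimal illustration of that idea in Python, not the authors' VB.Net implementation; the marker string and the zlib compression step are assumptions made for the example.

```python
import zlib

MARKER = b"--EOF-STEGO--"  # hypothetical delimiter separating image data from payload

def embed(cover_path: str, payload_path: str, stego_path: str) -> None:
    """Append a compressed payload after the image's own data (EOF method)."""
    with open(cover_path, "rb") as f:
        cover = f.read()
    with open(payload_path, "rb") as f:
        payload = zlib.compress(f.read())      # shrink payload before embedding
    with open(stego_path, "wb") as f:
        f.write(cover + MARKER + payload)      # image bytes stay untouched

def extract(stego_path: str, out_path: str) -> None:
    """Recover the payload appended after the marker."""
    with open(stego_path, "rb") as f:
        data = f.read()
    _, _, hidden = data.partition(MARKER)
    with open(out_path, "wb") as f:
        f.write(zlib.decompress(hidden))
```

A viewer still renders the stego image normally, because it only parses the image data that precedes the appended payload.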
6

Narjis Mezaal Shati and Ali Jassim Mohamed Ali. "Hiding Any Data File Format into Wave Cover". journal of the college of basic education 16, no. 69 (October 31, 2019): 1–10. http://dx.doi.org/10.35950/cbej.v16i69.4739.

Full text
Abstract
In the current study a steganography approach is utilized to hide various data file formats in a wave file cover. Least significant bit (LSB) insertion is used to embed regular computer files (such as graphics, executables (exe), sound, text, hyper text markup language (HTML), etc.) in a wave file at a 2-bit hiding rate. The test results show good performance in hiding any data file in a wave file.
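As a rough illustration of 2-bit LSB embedding in a wave cover, the sketch below hides a byte stream in the two least significant bits of each byte of the PCM frame data. It is a simplified reading of the abstract, not the authors' code, and it ignores sample width and channel layout for brevity.

```python
import wave

def hide(cover_wav: str, payload: bytes, stego_wav: str) -> None:
    """Embed payload into the 2 least significant bits of each PCM byte."""
    with wave.open(cover_wav, "rb") as w:
        params = w.getparams()
        frames = bytearray(w.readframes(w.getnframes()))
    # split the payload into 2-bit chunks, most significant pair first
    bits = [(b >> shift) & 0b11 for b in payload for shift in (6, 4, 2, 0)]
    if len(bits) > len(frames):
        raise ValueError("payload too large for this cover file")
    for i, pair in enumerate(bits):
        frames[i] = (frames[i] & 0b11111100) | pair
    with wave.open(stego_wav, "wb") as w:
        w.setparams(params)
        w.writeframes(bytes(frames))
```

Extraction simply reads the low two bits of the same positions back and reassembles the bytes in the same order.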
7

Pitchiah, Pragadeesh A., and Prathima G. Shivashankarappa. "Rotary Files in Pediatric Dentistry: From Then Till Now". Journal of Scientific Dentistry 10, no. 2 (2020): 55–57. http://dx.doi.org/10.5005/jp-journals-10083-0926.

Full text
Abstract
ABSTRACT The introduction of adult NiTi rotary file systems for children was a revolution in the field of pediatric endodontics. With the use of these files, cost-effective, consistent obturations were made possible in a shorter instrumentation time. The various constraints of adult rotary files, such as file length and taper, created the need for a newer rotary file system. These voids were filled with the advent of an exclusive pediatric rotary file, the Kedo-S file system. In this article, we discuss how the innovative pediatric rotary files have made their mark by overcoming the limitations of the existing adult rotary file systems in children. How to cite this article: Pitchiah PA, Shivashankarappa PG. Rotary Files in Pediatric Dentistry: From Then Till Now. J Sci Dent 2020;10(2):55–57.
8

Bhat, Anirudh, Aryan Likhite, Swaraj Chavan, and Leena Ragha. "File Fragment Classification using Content Based Analysis". ITM Web of Conferences 40 (2021): 03025. http://dx.doi.org/10.1051/itmconf/20214003025.

Full text
Abstract
One of the major components in Digital Forensics is the extraction of files from a criminal's hard drives. To achieve this, several techniques are used. One of these techniques is using file carvers. File carvers are used when the system metadata or the file table is damaged but the contents of the hard drive are still intact. File carvers work on the raw fragments in the hard disk and reconstruct files by classifying the fragments and then reassembling them to form the complete file. Hence the classification of file fragments has been an important problem in the field of digital forensics. The work on this problem has mainly relied on finding specific byte sequences in the file header and footer. However, classification based on header and footer is not reliable as they may be modified or missing. In this project, the goal is to present a machine learning-based approach for content-based analysis to recognize the file types of file fragments. It does so by training a Feed-Forward Neural Network with a 2-byte sequence histogram feature vector which is calculated for each file. These files are obtained from a publicly available file corpus named Govdocs1. The results show that content-based analysis is more reliable than relying on the header and footer data of files.
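The feature described in this abstract, a histogram of 2-byte (bigram) sequences, is straightforward to compute. The sketch below shows one plausible way to turn a raw fragment into a normalized 65,536-dimensional feature vector that a feed-forward network could consume; the exact preprocessing used by the authors is not specified here, so treat the details as assumptions of the example.

```python
import numpy as np

def bigram_histogram(fragment: bytes) -> np.ndarray:
    """Return a normalized 256*256 histogram of consecutive byte pairs."""
    arr = np.frombuffer(fragment, dtype=np.uint8)
    if arr.size < 2:
        return np.zeros(256 * 256, dtype=np.float32)
    # index of each overlapping pair: 256 * first_byte + second_byte
    pair_idx = arr[:-1].astype(np.uint32) * 256 + arr[1:]
    hist = np.bincount(pair_idx, minlength=256 * 256).astype(np.float32)
    return hist / hist.sum()   # normalize so fragments of any size are comparable

# Example: feature vector for one 4 KiB fragment read from disk
# with open("fragment.bin", "rb") as f:
#     x = bigram_histogram(f.read(4096))
```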
9

Karanović, Lj, and D. Poleti. "A FORTRAN Program for Conversion of PC-APD Data Files into ASCII Files". Powder Diffraction 7, no. 3 (September 1992): 179. http://dx.doi.org/10.1017/s0885715600018595.

Full text
Abstract
Recently, Dahan and co-workers (Dahan, 1991) suggested processing XRD data with spreadsheet computer programs. Treated in this manner, the XRD data become very flexible and make comparison with other data sets, as well as graphical presentation, much easier. In this note a simple FORTRAN 77 program for conversion of PC-APD data files into ASCII files suitable for import into spreadsheets is reported. In our laboratory XRD data are collected on a Philips 1710 diffractometer operated by PC-APD version 2.0 (PC-APD Software, 1989). Each experiment usually generates files containing the collected raw intensity data (.RD file), background data (.BK file) and a file with peak positions and their intensities (.DI file). The XRD data can be further processed: after smoothing, data are stored in files with the extension .SM (.SM file) and, after Kα2 stripping, in files with the extension .A2 (.A2 file). All files are stored in binary format.
10

Abhilisha Pandurang Bhalke. "Enhancement the Efficiency of Work with P2P System". International Journal for Modern Trends in Science and Technology 06, no. 09 (October 29, 2020): 68–72. http://dx.doi.org/10.46501/ijmtst060911.

Full text
Abstract
A P2P system should use proximity information to minimize the load of file requests and improve the efficiency of its work. Clustering peers by their physical proximity can also raise file-request performance; however, few existing works cluster peers based on both their interests and their physical proximity. Although structured P2P provides more efficient file requests than unstructured P2P, it is difficult to apply because of its strictly defined topology. In this work, we introduce PAIS, a proximity- and interest-based file sharing system built on structured P2P, in which physically close nodes form clusters and physically close nodes with common interests form sub-clusters in a hierarchical topology. Efficient file querying is important for overall P2P file-sharing performance, and clustering peers by their common interests can significantly enhance the efficiency of file requests. PAIS uses an intelligent file replication algorithm to further raise this efficiency, creating replicas of files that are frequently requested by a group of physically close nodes in their location. In addition, PAIS improves intra-sub-cluster file searching through several approaches. First, it further classifies the interests of a sub-cluster into a number of sub-interests and groups nodes with a common sub-interest for file sharing. Second, PAIS builds an overlay for each group that connects lower-capacity nodes to higher-capacity nodes, so that file requests are distributed and node overload is avoided. Third, to reduce file-search delay, PAIS uses proactive file information collection so that a requester can learn whether its requested file is in nearby nodes. Fourth, to reduce the overhead of collecting file information, PAIS uses Bloom filter based file information collection and a corresponding distributed file search. Fifth, to improve file-sharing efficiency, PAIS ranks the Bloom filter results in order. Sixth, considering that a recently visited file is likely to be visited again, the Bloom filter based approach is enhanced so that only newly added file information is kept, reducing file-search delay. Experimental results from a real-world PlanetLab deployment show that PAIS significantly reduces overhead and improves the efficiency of file sharing. In addition, the experimental results show the high effectiveness of the intra-sub-cluster file search approaches in improving file-search efficiency.
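Several of the mechanisms above rest on Bloom filters for compact, low-overhead exchange of file information. The sketch below is a generic Bloom filter in Python, included only to make that building block concrete; the bit-array size, hash count and the way PAIS actually propagates the filters are assumptions, not parameters taken from the paper.

```python
import hashlib

class BloomFilter:
    """Compact set-membership test with false positives but no false negatives."""

    def __init__(self, size_bits: int = 8192, num_hashes: int = 4):
        self.size = size_bits
        self.k = num_hashes
        self.bits = bytearray(size_bits // 8)

    def _positions(self, item: str):
        # derive k positions from salted SHA-256 digests
        for i in range(self.k):
            h = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.size

    def add(self, item: str) -> None:
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def might_contain(self, item: str) -> bool:
        return all(self.bits[pos // 8] & (1 << (pos % 8)) for pos in self._positions(item))

# A node could advertise its shared file names this way:
# bf = BloomFilter(); bf.add("lecture01.mp4"); bf.might_contain("lecture01.mp4")  # -> True
```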
11

Karresand, Martin, Stefan Axelsson, and Geir Olav Dyrkolbotn. "Disk Cluster Allocation Behavior in Windows and NTFS". Mobile Networks and Applications 25, no. 1 (December 19, 2019): 248–58. http://dx.doi.org/10.1007/s11036-019-01441-1.

Full text
Abstract
The allocation algorithm of a file system has a huge impact on almost all aspects of digital forensics, because it determines where data is placed on storage media. Yet there is only basic information available on the allocation algorithm of the currently most widely spread file system: NTFS. We have therefore studied the NTFS allocation algorithm and its behavior empirically. To do that we used two virtual machines running Windows 7 and 10 on NTFS-formatted fixed-size virtual hard disks, the first being 64 GiB and the latter 1 TiB in size. Files of different sizes were written to disk using two writing strategies and the $Bitmap files were manipulated to emulate file system fragmentation. Our results show that files written as one large block are allocated areas of decreasing size when the files are fragmented. The decrease in size is seen not only within files, but also between them. Hence a file having smaller fragments than another file is written after the file having larger fragments. We also found that a file written as a stream gets the opposite allocation behavior, i.e., its fragments increase in size as the file is written. The first allocated unit of a stream-written file is always very small and hence easy to identify. The results of the experiment are of importance to the digital forensics field and will help improve the efficiency of, for example, file carving and timestamp verification.
12

Aqilah Mohd Nahar, Nur Farah, Nurul Hidayah Ab Rahman, and Kamarudin Malik Mohammad. "E-Raser: File Shredder Application With Content Replacement by Using Random Words Function". JOIV : International Journal on Informatics Visualization 2, no. 4-2 (September 10, 2018): 313. http://dx.doi.org/10.30630/joiv.2.4-2.175.

Full text
Abstract
Data shredding denotes a process of irreversible file destruction, while a file shredder is a program designed to render computer-based files unreadable by overwriting the content of a file to destroy its data. The problem addressed is that the existence of file recovery tools may lead to data leakage, exploitation or dissemination by an unauthorized person. Thus, this study proposes a file shredding application named E-Raser which replaces the content of a file using a random words function algorithm. E-Raser was developed to shred Microsoft Word documents in (.doc) or (.docx) format. The implemented algorithm replaces the original content of the files with uninformative words provided by the application. After the rewriting phase is complete, the shredding process takes place to make the file unrecoverable. Object Oriented Software Development was used as the methodology to develop this application. As a result, E-Raser achieved the objectives to add, remove, rewrite, display and shred files. E-Raser also significantly helps users securely dispose of their files and protect the confidentiality and privacy of the files' content.
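The overwrite-then-delete idea behind a shredder of this kind can be sketched in a few lines: fill the file with meaningless words of roughly the original length, flush to disk, then remove it. This is only an illustration of the concept under simple assumptions (a local filesystem that rewrites in place); journaling, copy-on-write filesystems and SSD wear levelling can keep old copies around, so it is not a substitute for the actual tool.

```python
import os
import random
import string

def shred_with_random_words(path: str, passes: int = 3) -> None:
    """Overwrite a document with junk words several times, then delete it."""
    size = os.path.getsize(path)
    for _ in range(passes):
        words, written = [], 0
        while written < size:
            w = "".join(random.choices(string.ascii_lowercase, k=random.randint(3, 10)))
            words.append(w)
            written += len(w) + 1
        junk = " ".join(words)[:size].encode()
        with open(path, "r+b") as f:          # rewrite in place, keeping the same length
            f.write(junk)
            f.flush()
            os.fsync(f.fileno())
    os.remove(path)
```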
13

Kim, Hyungchan, Sungbum Kim, Yeonghun Shin, Wooyeon Jo, Seokjun Lee, and Taeshik Shon. "Ext4 and XFS File System Forensic Framework Based on TSK". Electronics 10, no. 18 (September 20, 2021): 2310. http://dx.doi.org/10.3390/electronics10182310.

Full text
Abstract
Recently, the number of Internet of Things (IoT) devices, such as artificial intelligence (AI) speakers and smartwatches, using a Linux-based file system has increased. Moreover, these devices are connected to the Internet and generate vast amounts of data. To efficiently manage these generated data and improve the processing speed, the function is improved by updating the file system version or using new file systems, such as an Extended File System (XFS), B-tree file system (Btrfs), or Flash-Friendly File System (F2FS). However, in the process of updating the existing file system, the metadata structure may be changed or the analysis of the newly released file system may be insufficient, making it impossible for existing commercial tools to extract and restore deleted files. In an actual forensic investigation, when deleted files become unrecoverable, important clues may be missed, making it difficult to identify the culprit. Accordingly, a framework for extracting and recovering files based on The Sleuth Kit (TSK) is proposed by deriving the metadata changed in Ext4 file system journal checksum v3 and XFS file system v5. Thereafter, by comparing the accuracy and recovery rate of the proposed framework with existing commercial tools using the experimental dataset, we conclude that sustained research on file systems should be conducted from the perspective of forensics.
14

Silitonga, Parasian D. P., and Irene Sri Morina. "Compression and Decompression of Audio Files Using the Arithmetic Coding Method". Scientific Journal of Informatics 6, no. 1 (May 24, 2019): 73–81. http://dx.doi.org/10.15294/sji.v6i1.17839.

Full text
Abstract
Audio file sizes are relatively large when compared to files in text format. Large files can cause various obstacles in the form of large storage space requirements and long transmission times. File compression is one solution that can overcome the problem of large file sizes. Arithmetic coding is one algorithm that can be used to compress audio files. The arithmetic coding algorithm encodes the audio file by replacing a sequence of input symbols with a single floating-point number, producing an encoded output greater than 0 and smaller than 1. The compression and decompression of audio files in this study is performed on several wave files. Wave files are a standard audio file format developed by Microsoft and IBM that is stored using PCM (Pulse Code Modulation) coding. The wave file compression ratio obtained in this study was 16.12 percent, with an average compression time of 45.89 seconds and an average decompression time of 0.32 seconds.
15

Rizal, Randi, Ruuhwan Ruuhwan, and Septian Chandra. "Signature File Analysis Using The National Institute Standard Technology Method Base on Digital Forensic Concepts". Jurnal Informatika Universitas Pamulang 5, no. 3 (September 30, 2020): 364. http://dx.doi.org/10.32493/informatika.v5i3.6073.

Full text
Abstract
The number of crimes committed by exploiting advances in information technology, such as information leakage, embezzlement of money in banks, credit card fraud, pornography, terrorism and drug trafficking, is invariably related to digital data. File signatures, or magic numbers, are one of the forensic science techniques that assist in processing this digital data. The method used in this research is the National Institute of Standards and Technology method for analyzing the authenticity of digital data, together with a method of proof to obtain valid evidence during the identification of data or file content. The research is presented as an analysis of the use of file signatures in investigations to determine file types in a case of leaked information at company xyz, with the research stage following laboratory evidence-handling procedures. A series of case investigations using file signatures was successfully carried out with AccessData FTK Imager version 4.2.0 and WinHex version 18.6. File signatures can be used in case investigations to identify and verify file types, so that files that have been modified can be restored and read by the operating system, by checking the hexadecimal values in the file header (file prefix) that characterize each file type.
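File signature analysis of the kind described here boils down to comparing the first bytes of a file against known magic numbers. The sketch below checks a few well-known signatures and flags a mismatch between the header and the file's extension; the signature table is limited to formats whose magic numbers are widely documented, and the snippet is an illustration rather than a reimplementation of FTK Imager or WinHex.

```python
SIGNATURES = {
    b"\xFF\xD8\xFF": "jpg",          # JPEG
    b"\x89PNG\r\n\x1a\n": "png",     # PNG
    b"%PDF": "pdf",                  # PDF
    b"PK\x03\x04": "zip/docx/xlsx",  # ZIP container (also Office OOXML)
    b"GIF87a": "gif",
    b"GIF89a": "gif",
}

def identify_by_header(path: str) -> str:
    """Return the file type implied by the header bytes, or 'unknown'."""
    with open(path, "rb") as f:
        header = f.read(16)
    for magic, ftype in SIGNATURES.items():
        if header.startswith(magic):
            return ftype
    return "unknown"

def extension_matches_header(path: str) -> bool:
    """A renamed file (e.g. a JPEG saved as .txt) fails this check."""
    ext = path.rsplit(".", 1)[-1].lower()
    return ext in identify_by_header(path)
```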
16

Fitriyanto, Rachmad, Anton Yudhana, and Sunardi Sunardi. "Boyer-Moore String Matching Algorithm and SHA512 Implementation for Jpeg/exif File Fingerprint Compilation in DSA". JUITA: Jurnal Informatika 8, no. 1 (May 4, 2020): 1. http://dx.doi.org/10.30595/juita.v8i1.4413.

Full text
Abstract
Jpeg/exif is the file format for images produced by digital cameras such as those in smartphones. The security methods currently applied to jpeg/exif files in digital communication fulfil only the prevention aspect of the three aspects of information security: prevention, detection and response. The Digital Signature Algorithm (DSA) is a cryptographic method that provides the detection aspect of information security by using a hash value as the fingerprint of a digital document. The purpose of this research is to compile a jpeg/exif file data fingerprint using hash values from DSA. The research was conducted in four stages. The first stage is the identification of the jpeg/exif file structure using the Boyer-Moore string matching algorithm to locate the positions of the file's segments. The second stage is the acquisition of segment contents. The third stage consists of image modification experiments to select the suitable elements of the jpeg/exif file data fingerprint. The fourth stage is the compilation of hash values to form the data fingerprint. The obtained results show that the jpeg/exif file fingerprint comprises three hash values, from the SOI segment, the APP1 segment and the SOF0 segment. The jpeg/exif file fingerprint can be used to detect modified images, covering six types of image modification: image resizing, text addition, metadata modification, image resizing, image cropping and file type conversion.
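The fingerprinting idea in this abstract, hashing the SOI, APP1 and SOF0 segments of a JPEG, can be sketched as follows. The JPEG marker values (SOI = FF D8, APP1 = FF E1, SOF0 = FF C0) and the two-byte big-endian segment length that follows APP1 and SOF0 are standard; everything else, including using Python's built-in byte search instead of a hand-written Boyer-Moore and SHA-512 via hashlib, is an assumption made for the example.

```python
import hashlib

def jpeg_fingerprint(path: str) -> dict:
    """Hash the SOI, APP1 (Exif) and SOF0 segments of a JPEG file."""
    with open(path, "rb") as f:
        data = f.read()
    fp = {}
    # SOI is the bare two-byte marker at the very start of the file
    fp["SOI"] = hashlib.sha512(data[0:2]).hexdigest() if data[:2] == b"\xFF\xD8" else None
    for name, marker in (("APP1", b"\xFF\xE1"), ("SOF0", b"\xFF\xC0")):
        pos = data.find(marker)
        if pos == -1:
            fp[name] = None
            continue
        length = int.from_bytes(data[pos + 2:pos + 4], "big")  # length covers itself, not the marker
        segment = data[pos:pos + 2 + length]
        fp[name] = hashlib.sha512(segment).hexdigest()
    return fp

# Comparing jpeg_fingerprint("original.jpg") with the fingerprint of a suspect copy
# shows which of the three segments was touched by a modification.
```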
17

El Anwar, Yusuf Jordan, Roni Habibi, and Noviana Riza. "PENERAPAN METODE KRIPTOGRAFI AES UNTUK MENGAMANKAN FILE DOKUMEN". Jurnal Tekno Insentif 16, no. 2 (December 21, 2022): 92–104. http://dx.doi.org/10.36787/jti.v16i2.852.

Full text
Abstract
The development of science and technology in the digital era is growing very rapidly, and the security of document files is one of the consequences of its arrival. Company X therefore needs effective file security management to deal with this; unfortunately, it currently has no medium for securing document files. The main purpose of this research is to create a document file security model. The file security process consists of encryption and decryption, implemented using a cryptographic approach with the Advanced Encryption Standard (AES) method. Document file security also needs to be visualized in real time so that company X can secure document files quickly; the results are displayed on a web basis using the PHP programming language.
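For readers who want to see what AES-based encryption and decryption of a document file looks like in practice, the sketch below uses AES-256-GCM from the widely used `cryptography` package. The paper does not state which mode of operation or key management scheme was used, so the choices here (GCM, a random 96-bit nonce stored in front of the ciphertext) are assumptions for the example only.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_file(key: bytes, src: str, dst: str) -> None:
    """Encrypt a document with AES-256-GCM; the nonce is prepended to the output."""
    nonce = os.urandom(12)
    with open(src, "rb") as f:
        plaintext = f.read()
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, None)
    with open(dst, "wb") as f:
        f.write(nonce + ciphertext)

def decrypt_file(key: bytes, src: str, dst: str) -> None:
    with open(src, "rb") as f:
        blob = f.read()
    nonce, ciphertext = blob[:12], blob[12:]
    with open(dst, "wb") as f:
        f.write(AESGCM(key).decrypt(nonce, ciphertext, None))

# key = AESGCM.generate_key(bit_length=256)   # keep this secret, never store it with the files
# encrypt_file(key, "contract.docx", "contract.docx.enc")
# decrypt_file(key, "contract.docx.enc", "contract_restored.docx")
```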
18

Yu, Liguo. "Using Kolmogorov Complexity to Study the Coevolution of Header Files and Source Files of C-alike Programs". International Journal of Knowledge and Systems Science 8, no. 2 (April 2017): 17–26. http://dx.doi.org/10.4018/ijkss.2017040102.

Full text
Abstract
In C-alike programs, the source code is separated into header files and source files. During the software evolution process, both these two kinds of files need to adapt to changing requirement and changing environment. This paper studies the coevolution of header files and source files of C-alike programs. Using normalized compression distance that is derived from Kolmogorov complexity, we measure the header file difference and source file difference between versions of an evolving software product. Header files distance and source files distance are compared to understand their difference in pace of evolution. Mantel tests are performed to investigate the correlation of header file evolution and source file evolution. The study is performed on the source code of Apache HTTP web server.
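The normalized compression distance mentioned in this abstract has a standard definition, NCD(x, y) = (C(xy) − min(C(x), C(y))) / max(C(x), C(y)), where C(·) is the compressed size of a string. The sketch below computes it with bzip2 as the compressor; the paper does not say which compressor was used, so that choice is an assumption.

```python
import bz2

def c(data: bytes) -> int:
    """Compressed size, used as a computable stand-in for Kolmogorov complexity."""
    return len(bz2.compress(data))

def ncd(x: bytes, y: bytes) -> float:
    """Normalized compression distance: near 0 for near-identical inputs, near 1 for unrelated ones."""
    cx, cy, cxy = c(x), c(y), c(x + y)
    return (cxy - min(cx, cy)) / max(cx, cy)

# Distance between two versions of a header file, e.g. across Apache HTTP server releases:
# with open("v1/httpd.h", "rb") as a, open("v2/httpd.h", "rb") as b:
#     d = ncd(a.read(), b.read())
```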
19

Moon, Yangchan, and Mingyu Lim. "An Enhanced File Transfer Mechanism Using an Additional Blocking Communication Channel and Thread for IoT Environments". Sensors 19, no. 6 (March 13, 2019): 1271. http://dx.doi.org/10.3390/s19061271.

Full text
Abstract
In this paper, we propose an enhanced file transfer mechanism for a communication framework (CM) for Internet of Things (IoT) applications. Our previous file transfer method uses a basic non-blocking communication channel and thread for the CM (non-blocking method), but this method has a cost of adding additional bytes to each original file block. Therefore, it is not suitable for the transfer of large-sized files. Other existing file transfer methods use a separate channel to transfer large-sized files. However, the creation of a separate channel increases the total transmission delay as the transfer frequency increases. The proposed method uses a dedicated blocking communication channel in a separate thread (blocking method). The blocking method uses a separate channel and thread which are dedicated to transferring file blocks. As it creates the separate channel in advance before the file transfer task, the proposed method does not have an additional channel creation cost at the moment of the file transfer. Through file transfer experiments, the blocking method showed a shorter file transfer time than the non-blocking method, and the transmission delay was increased as the file size grew. By supporting both non-blocking and blocking methods, an application can flexibly select the desirable method according to its requirement. If the application requires the transfer of small-sized files infrequently, it can use the non-blocking method. If the application needs to transfer small-sized or large-sized files frequently, a good alternative is to use the blocking method.
20

Malhotra, Jyoti, and Jagdish Bakal. "FiLeD: File Level Deduplication Approach". International Journal of Computer Trends and Technology 44, no. 2 (February 25, 2017): 74–79. http://dx.doi.org/10.14445/22312803/ijctt-v44p113.

Full text
21

Clapton, Gary. "Thoughts on files". Qualitative Social Work 20, no. 6 (October 21, 2021): 1461–76. http://dx.doi.org/10.1177/14733250211039511.

Full text
Abstract
With the benefit of an anthropological attention to the importance of ‘things’ and the relations between ourselves and things (‘artefacts’), this paper gives attention to the Social Work File. Despite the rise of electronic recording, social work archives remain full of thousands of files that are increasingly accessed, especially by those who have been in care, and physical file-keeping remains a regular feature of practice. There is already a body of literature relating to the information in social work files, however this paper shifts the focus from this to the nature and role of the File itself. ‘Hidden in plain sight’ but laden with meaning and capacity, I identify the little we know already about the file. The various ways files and their authors and subjects, can interact are explored together with the file’s symbolic properties and the power held by the file’s owner, and the ability of the file to ‘other’ its subject. Whilst we understand that the practice shapes the file, how might the file shape practice?
22

Kolte, Prof Roshan R. "FTS: File Tracking System for Railway Board". International Journal for Research in Applied Science and Engineering Technology 9, no. 12 (December 31, 2021): 2278–80. http://dx.doi.org/10.22214/ijraset.2021.39741.

Full text
Abstract
Abstract: This paper proposes a project-based implementation of a File Tracking System (FTS), used in the private and, mostly, in the government sector. The project is built for the Railway System using a three-tier architecture and is intended to help prevent corruption. There are a large number of departments in the Railway Department where files are transferred from one department to another, and it is necessary to know the number of files moving within a specific time period, the employee to whom a file is allocated, and whether the file has been forwarded or not. Sometimes, if someone intends to commit corruption, they do not forward a file; seeing this problem, we have developed the File Tracking System (FTS), which tracks the movements of files within the organization. This application keeps track of the movement of files from one desk to another inside the department.
23

Alrobieh, Ziad Saif, and Ali Mohammed Abdullah Ali Raqpan. "File Carving Survey on Techniques, Tools and Areas of Use". Transactions on Networks and Communications 8, no. 1 (February 28, 2020): 16–26. http://dx.doi.org/10.14738/tnc.81.7636.

Full text
Abstract
In digital and computer forensics, file carving is a very active research topic, and research needs to focus on improving file carving techniques so that digital investigations can obtain important data and evidence from damaged or corrupted storage media. In a digital forensic investigation, analyzing the unallocated space of storage media is necessary to extract the deleted or pre-written files when the file system metadata is missing or corrupted. Data carving can be defined as a method to recover files from unallocated space based on different factors such as the file type, file information (header/footer), or the contents of the file. Research in this area has focused on technological improvements in tools and techniques over the past years. This survey examines different data carving techniques, especially for multimedia files (e.g. images and videos). File carving work is classified into three categories: classic carving techniques, intelligent carving techniques and smart carving techniques. Moreover, seven popular multimedia carving tools that are widely used and experimentally evaluated are presented. We conclude that proposing new advanced methods for carving multimedia files remains an open and promising direction for future research, because fragmentation and compression are very commonly used in these kinds of files.
24

Effendi, Noverta, and Fauzan Azim. "Analysis And Implementation of Noekeon Algorithms For Encryption and Description of Text Data". Jurnal Info Sains : Informatika dan Sains 11, no. 1 (March 1, 2021): 11–18. http://dx.doi.org/10.54209/infosains.v11i1.37.

Full text
Abstract
As times evolve, human needs increase, including information needs. Therefore, sending and storing files through electronic media requires a process that can ensure the security and integrity of the file, and for this an encoding process is required. Encryption is performed when the file is sent; this process converts the original file into a confidential, unreadable file. The decryption process is then performed by the recipient of the sent file, converting the received confidential file back into the original file. Through this encoding, the original file cannot be read by unauthorized parties but only by recipients who hold the decryption key.
25

Lee, Wan Yeon, Kyong Hoon Kim, and Heejo Lee. "Extraction of Creation-Time for Recovered Files on Windows FAT32 File System". Applied Sciences 9, no. 24 (December 15, 2019): 5522. http://dx.doi.org/10.3390/app9245522.

Full text
Abstract
In this article, we propose a creation order reconstruction method of deleted files for the FAT32 file system with Windows operating systems. Creation order of files is established using a correlation between storage locations of the files and their directory entry locations. This method can be utilized to derive the creation-time bound of files recovered without the creation-time information. In this article, we first examine the file allocation behavior of Windows FAT32 file system. Next, based on the examined behavior, we propose a novel method that finds the creation order of deleted files after being recovered without the creation-time information. Due to complex behaviors of Windows FAT32 file system, the method may find multiple creation orders although the actual creation order is unique. In experiments with a commercial device, we confirm that the actual creation order of each recovered file belongs to one of the creation orders found by the method.
26

Andrian, Dimas Pamilih Epin, Dhomas Hatta Fudholi, and Yudi Prayudi. "KARAKTERISTIK METADATA PADA SHARING FILE DI MEDIA SOSIAL UNTUK MENDUKUNG ANALISIS BUKTI DIGITAL". Jurnal Ilmiah SINUS 19, no. 1 (January 12, 2021): 13. http://dx.doi.org/10.30646/sinus.v19i1.494.

Full text
Abstract
Metadata is information in a file whose contents describe the file; it contains information about the data for file management purposes. In various cases involving digital evidence, investigators can uncover a case through file metadata. Problems arise when metadata information has been changed or deleted, for example at the moment a file is shared via social media; basically, all files shared through social media undergo changes in their metadata. This study conducted a detailed analysis of changes in metadata information and hex dump values to determine the changing characteristics of the metadata of files shared on social media. The method applied a comparison table to examine the details of changes in metadata values for all files and social media platforms used as research objects. The results of this study are expected to help forensic analysts identify the metadata characteristics of files shared on social media, so that the source of shared files can be determined and forensic analysts can identify the social media used by cybercrime perpetrators.
27

Wu, Xing, and Mengqi Pei. "Image File Storage System Resembling Human Memory". International Journal of Software Science and Computational Intelligence 7, no. 2 (April 2015): 70–84. http://dx.doi.org/10.4018/ijssci.2015040104.

Full text
Abstract
The Big Data era is characterized by an explosive increase in image files on the Internet, and massive numbers of image files bring great challenges to storage. What is required is not only storage efficiency for massive image files but also accuracy and robustness in their management and retrieval. To meet these requirements, a distributed image file storage system based on cognitive theory is proposed. Following the way the human brain functions, humans can associate images with thousands of distinct object and action categories and store them in a sorted way. Thus the authors propose to store image files sorted according to different visual categories, based on human cognition, to resemble human memory. The experimental results demonstrate that the proposed cognition-based distributed image file system (DIFS) performs better than the Hadoop Distributed File System (HDFS) and FastDFS.
28

Li, Ting Ting, and Rui Li Jiao. "Design and Implementation of a Fast Algorithm for File Integrity Verification". Applied Mechanics and Materials 631-632 (September 2014): 918–22. http://dx.doi.org/10.4028/www.scientific.net/amm.631-632.918.

Full text
Abstract
At present, the MD5 algorithm is commonly used in file transfer systems to test file integrity, but when a file is very large it takes a long time to generate the MD5 code for the whole file, which reduces the transmission efficiency. Based on this observation, a fast file integrity detection algorithm is provided, which generates an MD5 code from the file size and some randomly selected content of the file. Although the algorithm cannot examine files exhaustively, it is integrated into the file transfer system. The test results show that this algorithm can check file integrity faster, and when the file is damaged or tampered with, the probability of detecting changes to the file content is higher than the theoretical value. The algorithm has great value in engineering applications.
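A sampled-hash check of the kind this abstract describes can be sketched as follows: hash the file size together with a handful of blocks read at pseudo-randomly chosen offsets, so that very large files are fingerprinted without reading them end to end. The block size, sample count and the use of a shared seed (so sender and receiver sample the same offsets) are assumptions of this example, not parameters taken from the paper.

```python
import hashlib
import os
import random

def sampled_md5(path: str, seed: int, samples: int = 16, block: int = 4096) -> str:
    """MD5 over the file size plus `samples` blocks at seeded pseudo-random offsets."""
    size = os.path.getsize(path)
    h = hashlib.md5(str(size).encode())
    rng = random.Random(seed)               # both sides must use the same seed
    offsets = sorted(rng.randrange(0, max(size - block, 1)) for _ in range(samples))
    with open(path, "rb") as f:
        for off in offsets:
            f.seek(off)
            h.update(f.read(block))
    return h.hexdigest()

# Sender and receiver compare sampled_md5(path, seed=42); a mismatch signals damage
# or tampering, at a fraction of the cost of hashing the whole file.
```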
29

Cui, Litao. "A Preliminary Study on the Management Strategy of University Personnel Files based on Artificial Intelligence Technology". Journal of Electronic Research and Application 5, no. 2 (June 18, 2021): 1–4. http://dx.doi.org/10.26689/jera.v5i2.2192.

Full text
Abstract
In order to improve the management strategy for personnel files in colleges and universities, simplify the complex process of file management, and improve file management security and the preservation of file content, this paper elaborates on the application of Artificial Intelligence (AI) technology in university personnel file management through theoretical analysis based on an understanding of AI technology.
30

Preethi Mariona, Delphine Priscilla Antony S, and Sreedevi Dharman. "Association of type of file fracture and method of removal in a university setting". International Journal of Research in Pharmaceutical Sciences 11, SPL3 (October 22, 2020): 1761–65. http://dx.doi.org/10.26452/ijrps.v11ispl3.3509.

Full text
Abstract
Some mishaps, such as file fractures, tend to occur during treatment of the root canal system. The difficulty of removal depends on the type of file fractured and the method used to remove it, which is generally decided based on the level of the fracture. Any file can fracture inside the root canal, depending on the canal's curvature and anatomy, and specific techniques and measures have been employed to remove such files from the root canal system. The aim of the study is to find the association between the type of file fracture and the method of removal. The details of all patients who underwent root canal treatment were noted, and 16 patients with file fractures during the procedure were shortlisted. The types of fractured files were analyzed, such as K files, rotary files, H files and other instruments. The method of removal was usually ultrasonic, mechanical, manual or a combination of the above. Excel tabulation was done and imported to SPSS for results, and a chi-square test was performed. The most commonly fractured file in the root canal system was the rotary file, with a fracture incidence of 31%, and the most common retrieval method was ultrasonic, with an incidence of 37.5%. The chi-square test shows p>0.05, which is statistically not significant. The study concludes that rotary files fracture the most; the method used to retrieve them was mechanical, but overall the most common retrieval method was ultrasonic.
31

Kim, Jungil, and Eunjoo Lee. "A Change Recommendation Approach Using Change Patterns of a Corresponding Test File". Symmetry 10, no. 11 (October 23, 2018): 534. http://dx.doi.org/10.3390/sym10110534.

Full text
Abstract
Change recommendation improves the development speed and quality of software projects. Through change recommendation, software project developers can find the relevant source files that they must change for their modification tasks. In an existing change-recommendation approach based on the change history of source files, the reliability of the recommended change patterns for a source file is determined according to the change history of the source file. If a source file has insufficient change history to identify its change patterns or has frequently been changed with unrelated source files, the existing change-recommendation approach cannot identify meaningful change patterns for the source file. In this paper, we propose a novel change-recommendation approach to resolve the limitation of the existing change-recommendation method. The basic idea of the proposed approach is to consider the change history of a test file corresponding to a given source file. First, the proposed approach identifies the test file corresponding to a given source file by using a source–test traceability linking method based on the popular naming convention rule. Then, the change patterns of the source and test files are identified according to their change histories. Finally, a set of change recommendations is constructed using the identified change patterns. In an experiment involving six open-source projects, the accuracy of the proposed approach is evaluated. The results show that the accuracy of the proposed approach can be significantly improved from 21% to 62% compared with the existing approach.
32

Stockinger, Heinz, Asad Samar, Shahzad Muzaffar, and Flavia Donno. "Grid Data Mirroring Package (GDMP)". Scientific Programming 10, no. 2 (2002): 121–33. http://dx.doi.org/10.1155/2002/514278.

Full text
Abstract
The GDMP client-server software system is a generic file replication tool that replicates files securely and efficiently from one site to another in a Data Grid environment using Globus Grid tools. In addition, it manages replica catalogue entries for file replicas and thus maintains a consistent view on names and locations of replicated files. Files to be transferred can be of any particular file format and GDMP treats them all in the same way. However, for Objectivity database files a particular plug-in exists. All files are assumed to be read-only.
33

Cearnaigh, Seán Ua. "Nuair nach File File". Comhar 63, no. 5 (2003): 15. http://dx.doi.org/10.2307/25574607.

Full text
34

Abdulameer, Sadeer Dheyaa. "PRIVACY PRESERVING ADVANCED SECURITY ON UNTRUSTED CLOUD WITH SELF BACKUP PROCESS". International Journal of Research -GRANTHAALAYAH 5, no. 4 (April 30, 2017): 176–81. http://dx.doi.org/10.29121/granthaalayah.v5.i4.2017.1810.

Full text
Abstract
Cloud storage services are frequently required by many corporate and government organizations. Most cloud storage service providers are untrusted, so it is not safe to keep data in the cloud for a long period. Many use cloud storage for data sharing, since it is not possible to send a big file by email (a maximum of 25 GB is allowed); for big files, the files are uploaded to cloud storage and a link is given to the data consumer. After the data consumer downloads the file, the data owner has to delete the file from the cloud for security reasons, but most of the time the data owner forgets to delete it. To overcome this problem, data self-destruction has been proposed in many papers, and the proposed system adds a Self-Destruction cum Self-Backup Process, which lets the file stay in the public cloud for a certain period of time, after which it is removed from the cloud storage and securely stored in another storage. To verify the integrity of the file, an HMAC is created when the file is uploaded, and the data consumer is able to download the file, generate the HMAC and check the file's integrity.
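The HMAC integrity check mentioned at the end of this abstract is simple to reproduce. In the sketch below the data owner computes an HMAC-SHA256 tag over the file before uploading and shares the tag (and, out of band, the key) with the data consumer, who recomputes and compares it after downloading; the hash choice and the key handling are assumptions made for the example.

```python
import hmac
import hashlib

def file_hmac(path: str, key: bytes) -> str:
    """HMAC-SHA256 tag of a file, computed in streaming fashion."""
    mac = hmac.new(key, digestmod=hashlib.sha256)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 16), b""):
            mac.update(chunk)
    return mac.hexdigest()

def verify(path: str, key: bytes, expected_tag: str) -> bool:
    """Constant-time comparison so the check itself leaks nothing."""
    return hmac.compare_digest(file_hmac(path, key), expected_tag)

# owner:    tag = file_hmac("report.pdf", key)          # published alongside the download link
# consumer: assert verify("report_downloaded.pdf", key, tag)
```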
35

Saidhbi, Sheik. "An Intelligent Multimedia Data Encryption and Compression and Secure Data Transmission of Public Cloud". Asian Journal of Engineering and Applied Technology 8, no. 2 (May 5, 2019): 37–40. http://dx.doi.org/10.51983/ajeat-2019.8.2.1141.

Full text
Abstract
Data compression is a method of reducing the size of a data file so that the file takes less disk space for storage. Compression of a file depends upon the encoding of the file. In a lossless data compression algorithm there is no data loss while compressing a file, therefore confidential data can be reproduced if it is compressed using lossless data compression. Compression reduces redundancy, and if a compressed file is encrypted it has better security and a faster transfer rate across the network than encrypting and transferring the uncompressed file. Most computer applications related to health are not secure, and these applications exchange a lot of confidential health data in different file formats such as HL7, DICOM images, and other audio, image, textual and video formats. These types of confidential data need to be transmitted securely and stored efficiently. Therefore this paper proposes a learning compression-encryption model for identifying the files that should be compressed before encrypting and the files that should be encrypted without compressing them.
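The compress-before-encrypt pipeline discussed here, including the decision of which files are worth compressing, can be sketched as below: estimate compressibility from a small sample, compress only when it pays off, then encrypt. Fernet (an AES-based authenticated scheme from the `cryptography` package) and the 0.9-ratio threshold are stand-ins chosen for the example, not the learning model proposed in the paper.

```python
import zlib
from cryptography.fernet import Fernet

def worth_compressing(data: bytes, threshold: float = 0.9) -> bool:
    """Probe the first 64 KiB: already-compressed media (e.g. JPEG-encoded DICOM) won't shrink."""
    sample = data[:65536]
    return len(zlib.compress(sample)) / max(len(sample), 1) < threshold

def protect(data: bytes, key: bytes) -> bytes:
    """Optionally compress, then always encrypt; a 1-byte flag records the choice."""
    if worth_compressing(data):
        return Fernet(key).encrypt(b"\x01" + zlib.compress(data))
    return Fernet(key).encrypt(b"\x00" + data)

def recover(token: bytes, key: bytes) -> bytes:
    blob = Fernet(key).decrypt(token)
    return zlib.decompress(blob[1:]) if blob[:1] == b"\x01" else blob[1:]

# key = Fernet.generate_key()
# token = protect(open("record.hl7", "rb").read(), key)
```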
36

Romadhon, Rifki Hari, Nugroho Suharto, and Aad Hariyadi. "RASPBERRY PI SEBAGAI STORAGE VERSION CONTROL SYSTEM TUGAS MAHASISWA POLITEKNIK NEGERI MALANG". JASIEK (Jurnal Aplikasi Sains, Informasi, Elektronika dan Komputer) 3, no. 1 (June 14, 2021): 11–20. http://dx.doi.org/10.26905/jasiek.v3i1.7793.

Full text
Abstract
File transfer technology is now developing rapidly: many researchers have begun to develop file transfer methods, and file transfer is present in daily life in various forms. For example, flash memory is still widely used for moving files, which carries the risk of picking up a virus from another computer or laptop and losing files. In this study, we develop a file transfer service that also lets the user apply file version control, so that files are written to a repository and each uploaded file keeps its own version. It is hoped that implementing a version control system will help students keep all the files they are working on in the repository and track the changes to each file. It is also expected that this study will provide an alternative for assessment by each faculty member and may speed up and facilitate decisions related to student assessment at the Malang State Polytechnic Institute. The results show that the file transfer QoS on the Polinema network has a packet loss value of 0%, latency of 0.672 ms (download) and 0.447 ms (upload), and a throughput of 330 kbps (download) and 544.50 kbps (upload).
37

Chan, Anthony, William Gropp, and Ewing Lusk. "An Efficient Format for Nearly Constant-Time Access to Arbitrary Time Intervals in Large Trace Files". Scientific Programming 16, no. 2-3 (2008): 155–65. http://dx.doi.org/10.1155/2008/749874.

Full text
Abstract
A powerful method to aid in understanding the performance of parallel applications uses log or trace files containing time-stamped events and states (pairs of events). These trace files can be very large, often hundreds or even thousands of megabytes. Because of the cost of accessing and displaying such files, other methods are often used that reduce the size of the tracefiles at the cost of sacrificing detail or other information. This paper describes a hierarchical trace file format that provides for display of an arbitrary time window in a time independent of the total size of the file and roughly proportional to the number of events within the time window. This format eliminates the need to sacrifice data to achieve a smaller trace file size (since storage is inexpensive, it is necessary only to make efficient use of bandwidth to that storage). The format can be used to organize a trace file or to create a separate file of annotations that may be used with conventional trace files. We present an analysis of the time to access all of the events relevant to an interval of time and we describe experiments demonstrating the performance of this file format.
38

Zivkovic, Slavoljub, Marijana Popovic-Bajic, and Marija Zivkovic. "Reciprocial movements of endodontic files - simpler and more certain therapeutic procedure". Serbian Dental Journal 69, no. 1 (2022): 22–30. http://dx.doi.org/10.2298/sgs2201022z.

Full text
Abstract
Numerous technological solutions in recent years have significantly improved the cleaning and shaping of canals and made canal instrumentation simpler, more efficient and safer. Significantly faster and less stressful canal instrumentation for the therapist is enabled by the specific design of the file working part and a special thermal modification of NiTi alloy with a change in the movement dynamics of the file in the canal. Research has shown that the problem of cyclic fatigue and torsional stress of the file during canal preparation can be solved by changing usual continuous rotation of the file. Dental technology has introduced the technique of reciprocal movements as an alternative to full file rotation. This change in file rotation direction during instrumentation, based on the technique of balanced forces, significantly reduces contact surface with the canal wall, eliminates the effect of screwing, extends the life of the file and further increases safety of instrumentation of different canal systems. An important advantage of changing the usual dynamics of file movements is that the concept of reciprocal movements is based on the use of only one file, which in addition to shortening treatment time also makes this intervention safer and with significantly lower percentage of defects and fractures of NiTi files. The aim of this paper was to present the concept of canal instrumentation with NiTi files with reciprocal movements as well as development, properties and possibility of application of these files in different clinical situations.
39

Kang, Peng, Wenzhong Yang, and Jiong Zheng. "Blockchain Private File Storage-Sharing Method Based on IPFS". Sensors 22, no. 14 (July 7, 2022): 5100. http://dx.doi.org/10.3390/s22145100.

Full text
Abstract
Under the current national network environment, anyone can participate in publishing. As an important information resource, knowledge files reflect the workload of publishers. Moreover, high-quality knowledge files can promote the progress of society. However, pirated inferior files have the opposite effect. At present, most organizations use centralized servers to centrally manage the knowledge files released by users. In addition, it is necessary to introduce an untrusted third party to examine and encrypt the contents of files, which leads to an opaque process of file storage transactions, tampering with intellectual copyright, and the inability to have consistent systems of file management among institutions due to the lack of uniform standards for the same intellectual files. The purpose of this paper is to ensure the safe storage of knowledge files on the one hand and to realize efficient sharing of copyrighted files on the other hand. Therefore, this paper combines NDN (Named Data Network) technology with a distributed blockchain and an Interplanetary File System (IPFS) and proposes a blockchain knowledge file storage and sharing method based on an NDN. The method uses the NDN itself for the file content signature and encryption, thereby separating the file security and transmission process. At the same time, the method uses a flexible NDN reverse path forwarding and routing strategy, combining an IPFS private storage network to improve the safety of the encrypted data storage security. Finally, the method takes advantage of all participating nodes consensus and shares files in the synchronized blockchain to ensure traceability. This paper introduces the structure and principles of the method and describes the process of file upload and transfer. Finally, the performance of the method is compared and evaluated, and the advantages and disadvantages of the method and the future research direction are summarized.
40

Salunke, Nagesh Rajendra. "Files Storage & Sharing Platform Using Cloud". International Journal for Research in Applied Science and Engineering Technology 9, no. 11 (November 30, 2021): 1338–44. http://dx.doi.org/10.22214/ijraset.2021.38994.

Full text
Abstract
Abstract: The concept of cloud computing has become more popular in recent years, and data storage is a very important and valuable research field in cloud computing. Cloud-based file sharing concerns the security of files shared in the cloud: the required protection against unauthorized access to a file in the cloud is provided by encryption and decryption functions. The admin can grant file access to authorized users, and this facility lets the admin limit the number and duration of accesses to the shared files by an authorized user. Cloud data storage technology is the core area of cloud computing and determines the data storage mode of the cloud environment. This project first introduces the concepts of cloud computing and cloud storage as well as the architecture of cloud storage. Then we analyze cloud data storage technologies such as Amazon Web Services, Wasabi and DigitalOcean. We improve on the traditional file storage method and build a platform that offers more capabilities. Keywords: Cloud, Storage, AWS, Wasabi, File Management, Files Storage, Files Sharing, DMS, CMS, Drive store, Private Cloud.
Los estilos APA, Harvard, Vancouver, ISO, etc.
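As a rough illustration of the storage-and-sharing flow described above, the sketch below uses boto3 against Amazon S3: the file is stored with server-side encryption and then shared through a time-limited presigned URL. The bucket name, object key and expiry are placeholder assumptions, and valid AWS credentials are required; this is not the paper's platform.

```python
# Encrypted object storage plus time-limited sharing via a presigned URL.
import boto3

s3 = boto3.client("s3")
BUCKET, KEY = "example-file-platform-bucket", "reports/2021/q4.pdf"

# Store the file with server-side encryption at rest.
s3.upload_file(
    Filename="q4.pdf",
    Bucket=BUCKET,
    Key=KEY,
    ExtraArgs={"ServerSideEncryption": "AES256"},
)

# Share it with an authorized user for a limited time (here, one hour),
# which mirrors the limited-access idea in the abstract.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": BUCKET, "Key": KEY},
    ExpiresIn=3600,
)
print(url)
```

Wasabi and DigitalOcean Spaces expose S3-compatible endpoints, so the same client code can usually be pointed at them by passing an endpoint_url to boto3.client.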
41

Kancherla, Jayaram, Yifan Yang, Hyeyun Chae, and Hector Corrada Bravo. "Epiviz File Server: Query, transform and interactively explore data from indexed genomic files". Bioinformatics 36, no. 18 (July 3, 2020): 4682–90. http://dx.doi.org/10.1093/bioinformatics/btaa591.

Full text
Abstract
Motivation: Genomic data repositories such as The Cancer Genome Atlas, the Encyclopedia of DNA Elements, and Bioconductor's AnnotationHub and ExperimentHub provide public access to large amounts of genomic data as flat files. Researchers often download a subset of data files from these repositories to perform exploratory data analysis. We developed Epiviz File Server, a Python library that implements an in situ data query system for local or remotely hosted indexed genomic files, not only for visualization but also for data transformation. The File Server library decouples data retrieval and transformation from specific visualization and analysis tools and provides an abstract interface to define computations independent of the location, format or structure of the file. We demonstrate the File Server in two use cases: (i) integration with Galaxy workflows and (ii) using Epiviz to create a custom genome browser from the Epigenome Roadmap dataset. Availability and implementation: Epiviz File Server is open source and available on GitHub at http://github.com/epiviz/epivizFileServer. The documentation for the File Server library is available at http://epivizfileserver.rtfd.io. (A generic illustration of range queries over an indexed genomic file follows this entry.)
APA, Harvard, Vancouver, ISO, etc. styles
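The sketch below is not the Epiviz File Server API; it is a generic illustration, using pysam, of the underlying idea of in situ querying: fetching only the records in a genomic range from an indexed file and transforming them on the fly. The file path, region and column layout are assumptions, and a tabix (.tbi) index must accompany the bgzipped file.

```python
# Range query over an indexed genomic file without reading the whole file.
import pysam

bed = pysam.TabixFile("methylation_signal.bed.gz")

# Fetch only the records overlapping chr1:1,000,000-1,050,000 and apply a
# simple transformation (mean of the score column) on the fly.
scores = []
for row in bed.fetch("chr1", 1_000_000, 1_050_000):
    fields = row.split("\t")
    scores.append(float(fields[4]))  # assumes the score sits in the 5th column

mean_signal = sum(scores) / len(scores) if scores else 0.0
print("records:", len(scores), "mean signal:", mean_signal)
bed.close()
```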
42

Wu, Long Tao, Tie Ning Wang, and Hai Rong Hu. "Research on Storage Strategy of Unstructured Small Files in HDFS". Applied Mechanics and Materials 644-650 (September 2014): 3053–56. http://dx.doi.org/10.4028/www.scientific.net/amm.644-650.3053.

Full text
Abstract
With the wide use of HDFS and the growing number of small files, the problems HDFS has with small-file storage have gradually been exposed. This article therefore puts forward a storage strategy for unstructured small files based on file type and optimizes the cluster architecture to save memory and improve the efficiency of file access. Experiments show the strategy to be effective and reliable. (A minimal sketch of the merge-and-index idea that underlies such strategies follows this entry.)
APA, Harvard, Vancouver, ISO, etc. styles
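A common family of small-file strategies for HDFS merges many small files into one large container object and keeps an index, so the NameNode tracks far fewer objects. The pure-Python sketch below illustrates only that merge-and-index idea on a local disk; it is not the authors' type-aware HDFS implementation, and the function names are invented.

```python
# Pack many small files into one container and keep an offset index for
# random access. Local illustration of the merge-and-index idea only.
import os

def pack(small_files, container_path):
    """Concatenate small files; return {name: (offset, length)} as the index."""
    index = {}
    with open(container_path, "wb") as out:
        offset = 0
        for path in small_files:
            with open(path, "rb") as src:
                data = src.read()
            out.write(data)
            index[os.path.basename(path)] = (offset, len(data))
            offset += len(data)
    return index

def read_one(container_path, index, name):
    """Random access to a packed small file via the index."""
    offset, length = index[name]
    with open(container_path, "rb") as f:
        f.seek(offset)
        return f.read(length)

# Example usage:
#   idx = pack(["a.log", "b.log"], "container.bin")
#   data = read_one("container.bin", idx, "a.log")
```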
43

Taloni, Alessandro, and Fabio Marchesoni. "Interacting Single-File System: Fractional Langevin Formulation Versus Diffusion-Noise Approach". Biophysical Reviews and Letters 09, no. 04 (December 2014): 381–96. http://dx.doi.org/10.1142/s1793048014400050.

Full text
Abstract
We review the latest advances in the analytical modelling of single-file diffusion. We focus first on the derivation of the fractional Langevin equation that describes the motion of a tagged file particle. We then propose an alternative derivation of the very same stochastic equation, starting from the diffusion-noise formalism for the time evolution of the file density. Special Issue Comments: This article presents mathematical formulations and results on the dynamics of files with an applied potential, as well as general files. It is connected to the Special Issue articles on the zig-zag phenomenon [72], advanced statistical properties in single-file dynamics [73], and expanding files [74]. (A commonly quoted form of the tagged-particle equation and its subdiffusive mean-squared displacement is sketched after this entry.)
APA, Harvard, Vancouver, ISO, etc. styles
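For orientation, the block below states a commonly quoted form of the tagged-particle fractional Langevin equation and the hallmark subdiffusive mean-squared displacement of single-file diffusion. The exact prefactors and conventions used in the paper may differ, so treat this as a hedged reminder rather than the authors' derivation.

```latex
% Commonly quoted forms, assumed here for illustration.
% x(t): tagged-particle position; \xi(t): fractional Gaussian noise;
% \overline{\gamma}: effective friction; \rho: particle density;
% D: free single-particle diffusion coefficient.
\[
  \overline{\gamma}\, {}_{0}D_{t}^{1/2}\, x(t) \;=\; \xi(t),
  \qquad
  \langle \xi(t)\,\xi(t') \rangle \;\propto\; k_{\mathrm B}T\,\overline{\gamma}\,|t-t'|^{-1/2}.
\]
% Hallmark of single-file diffusion: subdiffusive growth of the tagged
% particle's mean-squared displacement,
\[
  \langle \delta x^{2}(t) \rangle \;\simeq\; \frac{2}{\rho}\sqrt{\frac{D\,t}{\pi}}
  \;\propto\; t^{1/2}.
\]
```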
44

Zhu, Ming Ying, Jian Hua Tao, Xiao Chu Liu, and Hua Liu. "Research on NC Process Information Integration Method Based on Database and Heterogeneous CAM Programming Platforms". Advanced Materials Research 328-330 (September 2011): 767–70. http://dx.doi.org/10.4028/www.scientific.net/amr.328-330.767.

Full text
Abstract
This paper focuses on NC process information integration methods based on a database and on extracting process information from the macro files of heterogeneous commercial CAM platforms. First, the macro file characteristics of the studied CAM systems were analyzed to build a variable mapping table across the heterogeneous systems and to establish a standard macro file for each system. Second, a macro file self-learning mechanism based on these characteristics was designed, and the implicit process information in macro files was extracted and saved to build a general process database. Finally, the general process database and the standard macro files were used to generate target macro files and achieve automatic programming. The process information integration method developed in this research improves the efficiency and quality of NC machining and allows excellent process experience to be inherited and reused. (A minimal sketch of the variable-mapping and extraction step follows this entry.)
APA, Harvard, Vancouver, ISO, etc. styles
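The sketch below illustrates, with invented variable names and a made-up macro syntax, the mapping step the abstract describes: parameters extracted from different CAM systems' macro files are translated through a per-system mapping table into one neutral vocabulary and stored in a general process database (here an in-memory SQLite table). It is not the authors' system.

```python
# Translate heterogeneous CAM macro variables into a neutral vocabulary and
# store them in a "general process database". All names are illustrative.
import re
import sqlite3

# Heterogeneous-system variable mapping table (assumed names).
MAPPING = {
    "cam_a": {"SPDL_RPM": "spindle_speed", "FEEDRATE": "feed_rate", "CUT_DEPTH": "depth_of_cut"},
    "cam_b": {"S_SPEED": "spindle_speed", "F_RATE": "feed_rate", "AP": "depth_of_cut"},
}

def extract(macro_text: str, system: str) -> dict:
    """Pull 'NAME = value' pairs out of a macro file and normalize the names."""
    params = {}
    for name, value in re.findall(r"(\w+)\s*=\s*([-\d.]+)", macro_text):
        if name in MAPPING[system]:
            params[MAPPING[system][name]] = float(value)
    return params

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE process (parameter TEXT, value REAL)")
record = extract("SPDL_RPM = 3500\nFEEDRATE = 120.0\nCUT_DEPTH = 0.5", "cam_a")
db.executemany("INSERT INTO process VALUES (?, ?)", record.items())
print(record)
```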
45

Riza Suci Ernaman Putri, Retno Kusumo, and Yuni Utami. "Analisis Penyebab Terjadinya Missfile Berkas Rekam Medis di Ruangan Filling RS St Elisabeth Batam Kota". SEHATMAS: Jurnal Ilmiah Kesehatan Masyarakat 1, no. 3 (July 29, 2022): 309–17. http://dx.doi.org/10.55123/sehatmas.v1i3.646.

Full text
Abstract
This research is motivated by the occurrence of missfiles when searching for medical record files: officers who borrow medical record documents do not record the loan in the expedition book, which hampers patient care, and officers are not careful when retrieving files, so documents end up away from their proper shelf. The study aimed to determine the flow of medical record file retrieval, the flow of medical record file storage, and the causes of missfiles in the medical record filing room at St Elisabeth Hospital, Batam City. The study used a qualitative method; data were collected through interviews and observations, with a population of four medical record officers and one nurse. Results: viewed from the man and method factors, the man factor covers the officers' level of education and work experience, while the method factor is that the SOPs for returning and borrowing medical record files are not fully implemented, because some officers do not know how to apply them correctly; this is what causes missfiles in the filing room. Conclusion: officers lose focus when handling medical record files because of fatigue, lack of thoroughness and haste, which can cause missfiles, and the return and loan SOPs are not followed.
APA, Harvard, Vancouver, ISO, etc. styles
46

Han, Sung-Hwa, and Daesung Lee. "Kernel-Based Real-Time File Access Monitoring Structure for Detecting Malware Activity". Electronics 11, no. 12 (June 14, 2022): 1871. http://dx.doi.org/10.3390/electronics11121871.

Full text
Abstract
Obfuscation and cryptography technologies are applied to malware to make its detection by intrusion prevention systems (IPSs), intrusion detection systems (IDSs) and antivirus software difficult. To address this problem, security requirements for post-detection and proper response are presented, with emphasis on a real-time file access monitoring function. However, current operating systems provide only file access control techniques, such as SELinux (version 2.6, Red Hat, Raleigh, NC, USA) and AppArmor (version 2.5, Immunix, Portland, OR, USA), to protect system files and do not provide real-time file access monitoring. Thus, the service manager or data owner cannot detect unauthorized modification and leakage of important files by malware in real time. In this paper, a structure for monitoring user access to important files in real time is proposed. The proposed structure has five components, with a kernel module interrelated with the application process. Thanks to this structural feature, all file accesses can be monitored in real time and malicious attackers cannot bypass the monitoring function. Verification of the positive and negative functions of the proposed structure showed that it accurately provides real-time file access monitoring, that its resource overhead is sufficiently low, and that its monitoring performance is high, confirming its effectiveness. (A rough user-space analogue of such monitoring follows this entry.)
APA, Harvard, Vancouver, ISO, etc. styles
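The paper's structure is a kernel module, which user-space code cannot reproduce; as a rough stand-in, the sketch below uses the third-party watchdog library to report file creations and modifications under a watched directory in real time. Unlike a kernel hook it can be bypassed, and the watched path is a placeholder.

```python
# User-space approximation of real-time file access monitoring with watchdog.
import time
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

class AccessLogger(FileSystemEventHandler):
    def on_created(self, event):
        if not event.is_directory:
            print(f"[ALERT] created:  {event.src_path}")

    def on_modified(self, event):
        if not event.is_directory:
            print(f"[ALERT] modified: {event.src_path}")

observer = Observer()
observer.schedule(AccessLogger(), path="/srv/important-files", recursive=True)
observer.start()
try:
    while True:
        time.sleep(1)
except KeyboardInterrupt:
    observer.stop()
observer.join()
```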
47

Angelia Putriana, and Erlindai. "Sosialisasi Sistem Peyimpanan Rekam Medis Berdasarkan Standar Akreditasi Puskesmas Medan Johor Tahun 2021". ABDIKAN: Jurnal Pengabdian Masyarakat Bidang Sains dan Teknologi 1, no. 3 (August 28, 2022): 421–25. http://dx.doi.org/10.55123/abdikan.v1i3.851.

Full text
Abstract
Health center accreditation standards include an assessment of the medical record file storage system. The medical record file storage system at the Medan Johor Health Center is very important because it helps officers retrieve and return files on the medical record storage racks. Socialization of a medical record file storage system based on this accreditation standard was carried out at the Medan Johor Health Center to update officers' knowledge and implement standardized medical record file storage. The purpose of this community service was to help the Medan Johor Health Center prepare all the files and requirements needed to meet the second edition, 2021 version, of the Public Health Center accreditation standard regarding the medical record file storage system. The method was to provide socialization of the medical record file storage system in accordance with the second edition, 2021 version, of the Puskesmas accreditation instrument. As a result, medical record officers are expected to be able to apply the procedures for retrieving and returning medical record files according to the storage-system accreditation standards, which will help them when the accreditation period for the Medan Johor Health Center arrives.
APA, Harvard, Vancouver, ISO, etc. styles
48

Alfadley, Abdulmohsen, Abdalrhman Alrajhi, Hamad Alissa, Faisal Alzeghaibi, Lubna Hamadah, Khalid Alfouzan, and Ahmed Jamleh. "Shaping Ability of XP Endo Shaper File in Curved Root Canal Models". International Journal of Dentistry 2020 (February 17, 2020): 1–6. http://dx.doi.org/10.1155/2020/4687045.

Full text
Abstract
The aim of this study was to assess the shaping ability of the XP Shaper (XPS) file in severely curved canal models at simulated body temperature and to compare it with that of the WaveOne Gold (WOG) file. Ninety-six simulated root canals were equally distributed between the XPS and WOG systems, to be shaped by eight files each. Files were assessed under a stereomicroscope prior to canal shaping to detect any deformation. The canals were shaped at 35 ± 1°C using the X-Smart Plus motor. Images of the canals were obtained before and after instrumentation using a stereomicroscope to measure the amount of resin removed from both the inner and outer curvature sides at the apex (0 mm) and at 3 mm and 6 mm from the apex. The shaping time was recorded. The data were statistically analyzed with the independent t-test at a 5% significance level. The XPS and WOG systems shaped the canals in 37.0 ± 9.5 and 62.6 ± 11.3 seconds, respectively (P<0.05). At the apex level, the amount of resin removal on both sides did not differ significantly between the tested groups (P>0.05). At the 3 mm and 6 mm levels, WOG removed more resin than XPS on both sides (P<0.05). In XPS, deformation was observed in four files: one file after the first use, one after the fourth use, and two after the sixth use. In WOG, two files were deformed: one after the fifth use and one after the sixth use. One XPS file fractured after the sixth use. In short, XPS and WOG files can be used to shape severely curved canals, as they showed the ability to maintain the original canal shape with minimal transportation. Both file systems showed signs of deformation after use, with fewer deformed files observed for WOG throughout the experiment.
APA, Harvard, Vancouver, ISO, etc. styles
49

Zhu, Juan, Jipeng Huang, and Lianming Wang. "Laser Printing Files Detection Method Based on Double Features". International Journal of Pattern Recognition and Artificial Intelligence 32, no. 10 (June 20, 2018): 1854028. http://dx.doi.org/10.1142/s0218001418540289.

Full text
Abstract
A novel detection method for laser-printed files, based on an improved scale-invariant feature transform (SIFT) feature and a histogram feature, is proposed in this paper to overcome the low efficiency and difficulty of traditional detection. First, the graphical features of different laser-printed files are analyzed: different files have different printing texture features in the valid data area, so the valid data area is segmented to remove background interference. Second, the histogram feature of the same character in the printed file is extracted; the histogram is normalized, and the Bhattacharyya coefficient between the detected file and the original file is calculated to determine whether the detected file is genuine or fake. At the same time, SIFT features are computed and matched between the detected file and the original file; to focus on the letter or character region, SIFT features outside the contour are discarded. Finally, the two results are combined: if either indicates a fake, the file is judged to be fake. In experiments on a self-built database of printed files from different printers, whose inked areas possess different image features, the detection accuracy was higher than 97% when files were scanned at 600 dpi, so the method is able to meet legal reliability requirements. (A sketch of this two-feature comparison using OpenCV follows this entry.)
APA, Harvard, Vancouver, ISO, etc. styles
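The sketch below illustrates the double-feature check with OpenCV (opencv-python 4.4 or later for SIFT): a Bhattacharyya comparison of grayscale histograms plus SIFT keypoint matching between a questioned scan and a reference scan. The file names, thresholds and decision rule are assumptions; the authors' segmentation of the valid data area and contour filtering are not reproduced.

```python
# Two-feature comparison: histogram Bhattacharyya distance + SIFT matching.
import cv2

ref = cv2.imread("original_scan.png", cv2.IMREAD_GRAYSCALE)
qst = cv2.imread("questioned_scan.png", cv2.IMREAD_GRAYSCALE)

# Feature 1: histogram similarity. cv2.compareHist with HISTCMP_BHATTACHARYYA
# returns a distance, so smaller means more similar.
h_ref = cv2.calcHist([ref], [0], None, [256], [0, 256])
h_qst = cv2.calcHist([qst], [0], None, [256], [0, 256])
cv2.normalize(h_ref, h_ref)
cv2.normalize(h_qst, h_qst)
hist_dist = cv2.compareHist(h_ref, h_qst, cv2.HISTCMP_BHATTACHARYYA)

# Feature 2: SIFT keypoint matching with Lowe's ratio test.
sift = cv2.SIFT_create()
_, d_ref = sift.detectAndCompute(ref, None)
_, d_qst = sift.detectAndCompute(qst, None)
matches = cv2.BFMatcher().knnMatch(d_ref, d_qst, k=2)
good = []
for pair in matches:
    if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
        good.append(pair[0])

# Declare the file fake if either feature fails (assumed thresholds).
is_fake = hist_dist > 0.3 or len(good) < 50
print(f"Bhattacharyya distance: {hist_dist:.3f}, SIFT matches: {len(good)}, fake: {is_fake}")
```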
50

Bergmann, Frank T., Nicolas Rodriguez, and Nicolas Le Novère. "COMBINE Archive Specification Version 1". Journal of Integrative Bioinformatics 12, no. 2 (June 1, 2015): 104–18. http://dx.doi.org/10.1515/jib-2015-261.

Full text
Abstract
Summary: Several standard formats have been proposed that can be used to describe models, simulations, data or other essential information in a consistent fashion. These constitute the various separate components required to reproduce a given published scientific result. The Open Modeling EXchange format (OMEX) supports the exchange of all the information necessary for a modeling and simulation experiment in biology. An OMEX file is a ZIP container that includes a manifest file, an optional metadata file, and the files describing the model. The manifest is an XML file listing all files included in the archive and their type. The metadata file provides additional information about the archive and its content; although any format can be used, we recommend an XML serialization of the Resource Description Framework. Together with the other standard formats from the Computational Modeling in Biology Network (COMBINE), OMEX is the basis of the COMBINE Archive. The content of a COMBINE Archive consists of files encoded in COMBINE standards whenever possible, but may include additional files defined by an Internet Media Type. The COMBINE Archive facilitates the reproduction of modeling and simulation experiments in biology by embedding all the relevant information in one file. Having all the information stored and exchanged at once also helps in building activity logs and audit trails. (A minimal sketch of assembling such an archive follows this entry.)
APA, Harvard, Vancouver, ISO, etc. styles
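As a small illustration of the container format described above, the sketch below assembles a ZIP archive containing an SBML model and a manifest.xml listing each entry with a format identifier. The format IRIs follow the OMEX conventions to the best of my knowledge, and model.xml is assumed to exist on disk; for real work a dedicated library such as python-libcombine is the safer route.

```python
# Assemble a minimal OMEX-style archive: ZIP container + manifest.xml.
import zipfile

manifest = """<?xml version="1.0" encoding="UTF-8"?>
<omexManifest xmlns="http://identifiers.org/combine.specifications/omex-manifest">
  <content location="." format="http://identifiers.org/combine.specifications/omex"/>
  <content location="./manifest.xml"
           format="http://identifiers.org/combine.specifications/omex-manifest"/>
  <content location="./model.xml"
           format="http://identifiers.org/combine.specifications/sbml"/>
</omexManifest>
"""

with zipfile.ZipFile("experiment.omex", "w", zipfile.ZIP_DEFLATED) as archive:
    archive.writestr("manifest.xml", manifest)
    archive.write("model.xml", arcname="model.xml")  # an existing SBML model file
```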
