
Journal articles on the topic "Eventus (Computer file)"

Create an accurate reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 50 journal articles for your research on the topic "Eventus (Computer file)".

Next to every source in the list of references there is an "Add to bibliography" button. Click on it, and we will automatically generate the bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the scientific publication in .pdf format and read its abstract online, when these are available in the metadata.

Browse journal articles from many different scientific fields and compile an accurate bibliography.

1

Syafar, Faisal, Halimah Husain, Sutarsih Suhaeb, Putri Ida, and Supriadi. "Blowfish Advanced CS Untuk Solusi Keamanan Sistem Komputer Sekolah". Vokatek: Jurnal Pengabdian Masyarakat 1, no. 3 (October 30, 2023): 353–61. http://dx.doi.org/10.61255/vokatekjpm.v1i3.271.

Full text of the source
Abstract:
Data security and privacy are very important issues for businesses, colleges, governments, and even individuals. The TKJ Laboratory is where the PKM activities took place; students are free to use the computers there as they please, as long as they do not break the security system on each one. The types of files that can be protected in this case are text-based document files, picture files, digitally stored audio and video files, and question files. The Community Partnership Program, which had 15 participants, produced a guidebook and instructions on how to use Blowfish Advanced CS for computer system security cryptography. This study is qualitative research built around a case study that aims to show how to use the Blowfish Advanced CS tool to secure text messages, files, and documents with cryptography. The expected outcomes of this PKM are: 1) the key is combined to make the Blowfish algorithm stronger during the key setup process; 2) the Blowfish tool needs a key of at least 4 characters to encrypt a data file during the file/folder simulation; 3) the Blowfish method uses symmetric keys, meaning the simulation always uses the same key to encrypt and decrypt data files and folders; the same key is also used to split and join files.
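To make the symmetric-key idea concrete, here is a minimal Python sketch of Blowfish file encryption using the pycryptodome library. It illustrates the algorithm the abstract discusses, not the Blowfish Advanced CS application itself; the file path and key are hypothetical, and pycryptodome likewise requires a key of at least 4 bytes.

```python
# Hedged sketch: Blowfish file encryption/decryption with pycryptodome,
# illustrating the symmetric-key workflow described in the abstract.
from Crypto.Cipher import Blowfish
from Crypto.Util.Padding import pad, unpad

def encrypt_file(path: str, key: bytes) -> None:
    # Blowfish accepts keys of 4 to 56 bytes, matching the abstract's
    # minimum of 4 characters.
    data = open(path, "rb").read()
    cipher = Blowfish.new(key, Blowfish.MODE_CBC)
    with open(path + ".enc", "wb") as f:
        f.write(cipher.iv + cipher.encrypt(pad(data, Blowfish.block_size)))

def decrypt_file(path: str, key: bytes) -> None:
    raw = open(path, "rb").read()
    iv, ct = raw[:Blowfish.block_size], raw[Blowfish.block_size:]
    cipher = Blowfish.new(key, Blowfish.MODE_CBC, iv)
    data = unpad(cipher.decrypt(ct), Blowfish.block_size)
    with open(path.removesuffix(".enc"), "wb") as f:
        f.write(data)

encrypt_file("exam_questions.docx", b"tkj!")  # same key encrypts and decrypts
decrypt_file("exam_questions.docx.enc", b"tkj!")
```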
ABNT, Harvard, Vancouver, APA, etc. styles
2

Xu, Haochen, Guanhua Fang, Yunxiao Chen, Jingchen Liu, and Zhiliang Ying. "Latent Class Analysis of Recurrent Events in Problem-Solving Items". Applied Psychological Measurement 42, no. 6 (April 9, 2018): 478–98. http://dx.doi.org/10.1177/0146621617748325.

Full text of the source
Abstract:
Computer-based assessment of complex problem-solving abilities is becoming increasingly popular. In such an assessment, the entire problem-solving process of an examinee is recorded, providing detailed information about the individual, such as behavioral patterns, speed, and learning trajectory. The problem-solving processes are recorded in a computer log file, a time-stamped documentation of events related to task completion. As opposed to cross-sectional response data from traditional tests, process data in log files are massive and irregularly structured, calling for effective exploratory data analysis methods. Motivated by a specific complex problem-solving item, “Climate Control”, in the 2012 Programme for International Student Assessment, the authors propose a latent class analysis approach to analyzing the events that occur in problem-solving processes. The exploratory latent class analysis yields meaningful latent classes, and simulation studies are conducted to evaluate the proposed approach.
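As a small illustration of the preprocessing such an analysis needs, the sketch below parses a time-stamped log file into per-examinee event sequences, the kind of structure a latent class analysis of recurrent events would consume. The CSV layout and column order are hypothetical, not the PISA log format.

```python
# Hedged sketch: turn a time-stamped process-data log into per-examinee
# event sequences (assumed CSV columns: examinee_id, timestamp, event).
import csv
from collections import defaultdict

def event_sequences(path: str) -> dict:
    seqs = defaultdict(list)
    with open(path, newline="") as f:
        for examinee, ts, event in csv.reader(f):
            seqs[examinee].append((float(ts), event))
    for events in seqs.values():
        events.sort()  # order each examinee's events by timestamp
    return seqs

# Per-examinee event-type counts could then feed a latent class model.
```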
ABNT, Harvard, Vancouver, APA, etc. styles
3

Alkenani, Jawad, and Khulood Ahmed Nassar. "Enhance work for java based network analyzer tool used to analyze network simulator files". Indonesian Journal of Electrical Engineering and Computer Science 29, no. 2 (February 1, 2023): 954. http://dx.doi.org/10.11591/ijeecs.v29.i2.pp954-962.

Full text of the source
Abstract:
Network performance measurement is important in computer networks, but measuring performance by installing instrumentation in peripheral devices can require replacing those devices and thus increases cost. In light of this, it is better to simulate the network to assess its performance rather than build the actual design. NS-2 is one of the most popular and widely used open-source network simulators in many organizations; it generates trace files during a simulation run. The trace file contains all network events and can be used to calculate performance. However, NS-2 does not offer any visualization options for analyzing simulation results (trace files), and the difficulty of parsing trace files is the fundamental problem. This paper provides a graphical user interface tool that enables researchers to quickly and efficiently analyze and visualize NS-2 trace files. The tool is a development of the JDNA tool, which could not analyze more than one trace file at a time. In addition, this work can serve as a useful guide for network researchers or other programmers analyzing their networks and understanding how to calculate network performance metrics.
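For readers unfamiliar with trace files, the sketch below computes one simple performance metric, packet delivery ratio, from an NS-2 trace. It assumes the classic old-format wired trace layout (event, time, from-node, to-node, packet type, size, ...) and a placeholder file name; the GUI tool in the paper goes much further.

```python
# Hedged sketch: packet delivery ratio from an NS-2 trace file, assuming
# the old-format layout: event time from_node to_node pkt_type size ...
def delivery_ratio(trace_path: str, pkt_type: str = "tcp") -> float:
    sent = received = 0
    with open(trace_path) as f:
        for line in f:
            fields = line.split()
            if len(fields) < 6 or fields[4] != pkt_type:
                continue
            if fields[0] == "s":      # packet sent
                sent += 1
            elif fields[0] == "r":    # packet received
                received += 1
    return received / sent if sent else 0.0

print(f"PDR: {delivery_ratio('out.tr'):.2%}")  # 'out.tr' is a placeholder
```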
ABNT, Harvard, Vancouver, APA, etc. styles
4

Groh, Micah, Norman Buchanan, Derek Doyle, James B. Kowalkowski, Marc Paterno, and Saba Sehrish. "PandAna: A Python Analysis Framework for Scalable High Performance Computing in High Energy Physics". EPJ Web of Conferences 251 (2021): 03033. http://dx.doi.org/10.1051/epjconf/202125103033.

Full text of the source
Abstract:
Modern experiments in high energy physics analyze millions of events recorded in particle detectors to select the events of interest and make measurements of physics parameters. These data can often be stored as tabular data in files with detector information and reconstructed quantities. Most current techniques for event selection in these files lack the scalability needed for high performance computing environments. We describe our work to develop a high energy physics analysis framework suitable for high performance computing. This new framework utilizes modern tools for reading files and implicit data parallelism. Framework users analyze tabular data using standard, easy-to-use data analysis techniques in Python while the framework handles the file manipulations and parallelism without the user needing advanced experience in parallel programming. In future versions, we hope to provide a framework that can be utilized on a personal computer or a high performance computing cluster with little change to the user code.
ABNT, Harvard, Vancouver, APA, etc. styles
5

Marr, Charles. "210 CUSTOM-PRINTED ELECTRONIC FACT SHEETS AT HORTICULTURAL EVENTS". HortScience 29, no. 5 (May 1994): 459d—459. http://dx.doi.org/10.21273/hortsci.29.5.459d.

Full text of the source
Abstract:
A series of 62 fact sheets on a variety of topics related to vegetable gardening were constructed in WordPerfect and, using a series of macros and styles, were modified to a standard format and printed to a file for storage. Sheets were organized into a hierarchical menu so they could be copied to the printer upon request using only DOS commands. A portable laptop computer and a Hewlett Packard Portable DeskJet printer were used to print files at such remote locations as nurseries/garden centers, shopping malls, fairs, and public events, where Master Gardener volunteers set up and operated the equipment. Single sheets could be printed in about 20 seconds. At garden shows with larger attendance, an HP Laser 4 printer and a standard computer were used to print fact sheets in about 5 seconds each. Fact sheets consisted of text and tables; no graphics were included. Most sheets were single pages, although several were 2 pages in length. Additional information was available from a comprehensive for-sale publication sold or available for ordering on-site. Costs of custom-printed fact sheets compared to standard printing will be discussed.
ABNT, Harvard, Vancouver, APA, etc. styles
6

Bántay, László, Gyula Dörgö, Ferenc Tandari, and János Abonyi. "Simultaneous Process Mining of Process Events and Operator Actions for Alarm Management". Complexity 2022 (September 19, 2022): 1–13. http://dx.doi.org/10.1155/2022/8670154.

Full text of the source
Abstract:
Alarm management is an important task to ensure the safety of industrial process technologies. A well-designed alarm system can reduce operators' workload while supporting production, in line with the Industry 5.0 approach. Using process mining tools to explore operator-related event scenarios requires a goal-oriented log file format that contains the start and end of each alarm along with the triggered operator actions. The key contribution of this work is a method that transforms the historical event data of control systems into goal-oriented log files used as inputs to process mining algorithms. The applicability of the proposed process mining-based method is demonstrated through the analysis of a hydrofluoric acid alkylation plant, and detailed application examples illustrate how the extracted process models can be interpreted and utilized. The results confirm that applying the tools of process mining in alarm management requires a goal-oriented log-file design.
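A minimal sketch of the log transformation the abstract describes might look as follows: pair each alarm's start and end with the operator actions triggered in between. The CSV layouts and column names are hypothetical, not the actual historian format of any control system.

```python
# Hedged sketch: build a goal-oriented event log from alarm start/end
# records and operator actions (hypothetical CSV layouts).
import pandas as pd

def build_event_log(alarms_csv: str, actions_csv: str) -> pd.DataFrame:
    alarms = pd.read_csv(alarms_csv, parse_dates=["start_time", "end_time"])
    actions = pd.read_csv(actions_csv, parse_dates=["time"])
    rows = []
    for _, a in alarms.iterrows():
        rows.append({"case": a.tag, "time": a.start_time, "event": "alarm_start"})
        during = actions[(actions.time >= a.start_time) & (actions.time <= a.end_time)]
        for _, act in during.iterrows():
            rows.append({"case": a.tag, "time": act.time, "event": act.action})
        rows.append({"case": a.tag, "time": a.end_time, "event": "alarm_end"})
    return pd.DataFrame(rows).sort_values(["case", "time"])
```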
ABNT, Harvard, Vancouver, APA, etc. styles
7

Marjai, Péter, Péter Lehotay-Kéry, and Attila Kiss. "The Use of Template Miners and Encryption in Log Message Compression". Computers 10, no. 7 (June 23, 2021): 83. http://dx.doi.org/10.3390/computers10070083.

Full text of the source
Abstract:
Presently, almost all computer software produces many log messages based on events and activities during its use. These files contain valuable runtime information that can be used in a variety of applications such as anomaly detection, error prediction, and template mining. Usually, the generated log messages are raw, meaning they have an unstructured format, so these messages have to be parsed before data mining models can be applied. After parsing, template miners can be applied to the data to retrieve the events occurring in the log file. These events consist of two parts: the template, which is the fixed part and is the same for all instances of the same event type, and the parameter part, which varies across instances. To decrease the size of the log messages, we use the mined templates to build a dictionary for the events, and only store the dictionary, the event ID, and the parameter list. We use six template miners to acquire the templates, namely IPLoM, LenMa, LogMine, Spell, Drain, and MoLFI. In this paper, we evaluate the compression capacity of our dictionary method with each of these algorithms. Since parameters could be sensitive information, we also encrypt the files after compression and measure the changes in file size, and we examine the speed of the log miner algorithms. Based on our experiments, LenMa has the best compression rate with an average of 67.4%; however, because of its high runtime, we would suggest combining our dictionary method with IPLoM and FFX, since it is the fastest of all the methods and has a 57.7% compression rate.
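The core of the dictionary method can be sketched in a few lines: match each message against mined templates and store only the template ID plus the parameters. The two templates below are toy stand-ins for what a miner such as IPLoM or Drain would produce.

```python
# Hedged sketch: dictionary encoding of log messages. <*> marks a
# parameter slot; real templates would come from a template miner.
import re

TEMPLATES = {
    1: "Connection from <*> closed",
    2: "User <*> logged in from <*>",
}

def _compile(tpl: str) -> re.Pattern:
    return re.compile("^" + re.escape(tpl).replace(re.escape("<*>"), "(.+?)") + "$")

PATTERNS = {tid: _compile(t) for tid, t in TEMPLATES.items()}

def encode(line: str):
    """Return (template_id, parameters) instead of the raw message."""
    for tid, pat in PATTERNS.items():
        m = pat.match(line)
        if m:
            return tid, list(m.groups())
    return None, [line]  # unmatched lines are kept verbatim

print(encode("User alice logged in from 10.0.0.7"))  # (2, ['alice', '10.0.0.7'])
```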
ABNT, Harvard, Vancouver, APA, etc. styles
8

Starenkyi, Ivan, and Oleksandra Donchenko. "DETECTION OF TRACES OF THE USE OF SOFTWARE SUCH AS «STEALER» IN THE MEMORY OF THE STORAGE DEVICE". Criminalistics and Forensics, no. 68 (July 3, 2023): 469–77. http://dx.doi.org/10.33994/kndise.2023.68.46.

Full text of the source
Abstract:
The purpose of this work is to determine experimentally the characteristic features of the use of the «Mars Stealer» software, which is positioned as a software product of the «Stealer» type, among the available and deleted data in the memory of an information storage device. The following conclusions can be drawn from the conducted experiment. The «Autopsy» software is a good tool for attempting to reproduce and trace the events and processes that took place on the storage device. By studying the events and processes in the memory of the storage device, it was possible to establish the sequence of actions an expert should take to detect the executable file(s) of third-party software that may be malicious. When the examination of a detected executable build file is not carried out immediately after receiving the research objects but after a certain time, for example within 90 calendar days, there is a high probability of «losing» communication with the C2 server that the build file calls. However, in order to establish the IP address to which the initiator file tries to connect on the affected computer, it is recommended to run the build file repeatedly (this may increase the chances of detecting the IP address of the C2 server) while simultaneously monitoring the Internet traffic passing through a virtual machine. The executable initiator file on a PC affected by the «Mars Stealer» software, even if there is an error in its operation, will be «forced» to save the information collected on the affected PC in the root directory where the build file itself is located. Based on the results of the work carried out, it is possible to establish the characteristic features of the use of the «Mars Stealer» software, which include logging the number of launches of the executable build file, interaction with the folders of the web browsers installed on the storage device, and interaction with a large number of libraries. The results of the experiment described in this paper can be used when conducting examinations under expert speciality 10.9, «Research of computer equipment and software products», in the study of storage devices whose available and deleted data contain information about samples of executable files of the «Mars Stealer» software, as confirmed by the evidence collected during the examination.
ABNT, Harvard, Vancouver, APA, etc. styles
9

N. Sangeeta and Seung Yeob Nam. "Blockchain and Interplanetary File System (IPFS)-Based Data Storage System for Vehicular Networks with Keyword Search Capability". Electronics 12, no. 7 (March 24, 2023): 1545. http://dx.doi.org/10.3390/electronics12071545.

Full text of the source
Abstract:
Closed-circuit television (CCTV) cameras and black boxes are indispensable for road safety and accident management. Visible highway surveillance cameras can promote safe driving habits while discouraging moving violations. According to CCTV laws, footage captured by roadside cameras must be securely stored, and only authorized persons may access it. Footage collected by CCTV cameras and black boxes is usually saved to the camera’s microSD card, the cloud, or local hard drives, but there are concerns about security and data integrity. These issues may be addressed by blockchain technology; the cost of storing data on the blockchain, however, is prohibitively expensive. We can have decentralized and cost-effective storage with the InterPlanetary File System (IPFS) project, a file-sharing protocol that stores and distributes data in a distributed file system. We propose a decentralized IPFS and blockchain-based application for distributed file storage. Various types of files can be uploaded into our decentralized application (DApp), and hashes of the uploaded files are permanently saved on the Ethereum blockchain with the help of smart contracts. Because a stored hash cannot be removed, the record is immutable. By clicking on the file description, we can also view the file. The DApp also includes a keyword search feature to assist in quickly locating sensitive information. We used Ethers.js’ smart contract event listener and contract.queryFilter to filter and read data from the blockchain; the smart contract events are then written to a text file for the DApp’s keyword search functionality. Our experiment demonstrates that the DApp is resilient to system failure while preserving the transparency and integrity of data due to the immutability of the blockchain.
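The keyword-search step, where contract events are dumped to a text file and scanned, reduces to a simple filter. The sketch below assumes a hypothetical one-event-per-line file layout; reading the events from the chain itself would use the Ethers.js calls named in the abstract.

```python
# Hedged sketch: keyword search over smart-contract events previously
# written to a text file, one event per line (hypothetical layout, e.g.
# "block=17 tx=0xabc name=dashcam_0412.mp4 cid=Qm...").
def keyword_search(path: str, keyword: str) -> list[str]:
    keyword = keyword.lower()
    with open(path, encoding="utf-8") as f:
        return [line.rstrip("\n") for line in f if keyword in line.lower()]

for hit in keyword_search("events.txt", "dashcam"):
    print(hit)
```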
ABNT, Harvard, Vancouver, APA, etc. styles
10

Mohamed, Mohamed, and James C. L. Chow. "A comprehensive computer database for medical physics on-call program". Journal of Radiotherapy in Practice 19, no. 1 (May 14, 2019): 10–14. http://dx.doi.org/10.1017/s1460396919000244.

Full text of the source
Abstract:
Purpose: A comprehensive and robust computer database was built to record and analyse medical physics on-call data in emergency radiotherapy. The probability distributions of on-call events across days and weeks were studied. Materials and methods: Variables of medical physics on-call events, such as the date and time of the event, the number of events per day/week/month, the treatment site, and the identity of the on-call physicist, were entered into a programmed Excel file. The Excel file was linked to the MATLAB platform for data transfer and analysis. The total numbers of on-call events per day in a week and per month in a year were calculated from the physics on-call data for 2010–18. In addition, probability distributions of on-call events across the days of the week (Monday–Sunday) and the months of the year (January–December) were determined. Results: For the total number of medical physics on-call events per week in 2010–18, the number was similar from Sundays to Thursdays but increased significantly on Fridays, before the weekend. The total number of events in a year showed that physics on-call events increased gradually from January to March, decreased in April, and slowly increased to another peak in September. The number of events decreased from September to October, then increased again to reach another peak in December. It should be noted that March, September and December are months close to Easter, Labour Day and Christmas, when radiation staff usually take long holidays. Conclusions: A database to record and analyse medical physics on-call data was created. Different variables, such as the number of events per week and per year, could be plotted. The roster could take these statistical results into account to prepare a schedule with a better balance of workload than scheduling randomly. Moreover, the emergency radiotherapy team could use the analysed results to improve their budget/resource allocation and strategic planning.
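A pandas equivalent of the Excel-to-MATLAB pipeline is easy to sketch: group the events by weekday and month and normalize the counts into empirical distributions. The file name and column are hypothetical stand-ins for the paper's Excel sheet.

```python
# Hedged sketch: empirical distributions of on-call events by weekday
# and month, assuming a CSV with one dated row per event.
import pandas as pd

events = pd.read_csv("oncall.csv", parse_dates=["date"])  # hypothetical file

by_weekday = events["date"].dt.day_name().value_counts(normalize=True)
by_month = events["date"].dt.month_name().value_counts(normalize=True)

print(by_weekday)  # e.g. a Friday peak would show up here
print(by_month)    # e.g. peaks near March, September, December
```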
ABNT, Harvard, Vancouver, APA, etc. styles
11

Eljack, Sarah, Mahdi Jemmali, Mohsen Denden, Sadok Turki, Wael M. Khedr, Abdullah M. Algashami, and Mutasim ALsadig. "A secure solution based on load-balancing algorithms between regions in the cloud environment". PeerJ Computer Science 9 (December 1, 2023): e1513. http://dx.doi.org/10.7717/peerj-cs.1513.

Full text of the source
Abstract:
The problem treated in this article is the storage of sensitive data in the cloud environment and how to choose regions and zones so as to minimize the number of file transfer events. Repeatedly handling sensitive data on the global internet can increase risk and reduce security. Our work consists of scheduling several files across the different regions based on the security and load-balancing parameters in the cloud. Each file is characterized by its size. If data is misplaced from the start, it will require a transfer from one region to another and sometimes from one zone to another. The objective is to find a schedule that assigns these files to the appropriate regions while ensuring load balancing within each region, so as to guarantee the minimum number of migrations. This problem is NP-hard. A novel model of regional security and load balancing of files in the cloud environment is proposed in this article. The model is based on a component called the “Scheduler”, which applies the proposed algorithms to solve the problem. The model is a secure solution that guarantees efficient dispersion of the stored files, avoiding the concentration of most of the storage in one region; consequently, damage to one region does not cause a large loss of data. In addition, a novel method called the “Grouping method” is proposed, and several variants of this method are used to derive novel algorithms for solving the studied problem. Initially, seven algorithms are proposed. The experimental results show that there is no dominance among these algorithms, so three combinations of the seven algorithms generate three further algorithms with better results. Based on the dominance rule, only six algorithms are selected to discuss performance. Four classes of instances are generated to measure and test the performance of the algorithms; in total, 1,360 instances are tested. Three metrics are used to assess the algorithms and compare them. The experimental results show that the best algorithm is the “Best-value of four algorithms” in 86.5% of cases, with an average gap of 0.021 and an average running time of 0.0018 s.
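As a baseline for this kind of scheduling problem, here is a greedy longest-processing-time sketch: repeatedly place the largest remaining file in the currently least-loaded region. This is a generic heuristic for balanced assignment, not the paper's Grouping method or any of its seven algorithms.

```python
# Hedged sketch: greedy LPT assignment of files (name -> size) to regions,
# approximating load balance; a stand-in for the paper's algorithms.
import heapq

def assign(files: dict, n_regions: int) -> dict:
    heap = [(0, r) for r in range(n_regions)]  # (current load, region id)
    heapq.heapify(heap)
    placement = {}
    for name, size in sorted(files.items(), key=lambda kv: -kv[1]):
        load, region = heapq.heappop(heap)     # lightest region so far
        placement[name] = region
        heapq.heappush(heap, (load + size, region))
    return placement

print(assign({"a": 70, "b": 50, "c": 40, "d": 30}, 2))  # e.g. {'a': 0, 'b': 1, ...}
```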
ABNT, Harvard, Vancouver, APA, etc. styles
12

Smith, Glenn Gordon, and Barry Grant. "From Players to Programmers: A Computer Game Design Class for Middle-School Children". Journal of Educational Technology Systems 28, no. 3 (March 2000): 263–75. http://dx.doi.org/10.2190/rvx6-61b0-8m2q-dul3.

Full text of the source
Abstract:
The prospect of making computer games has often been used to “hook” students into learning programming or cognitive skills. There is, however, little research on using computer game design classes to teach computer skills. This article provides an answer to the question: can a computer game design course employing the new generation of game authoring tools set middle school students on the path of learning a broad and sophisticated range of computer skills? The answer, based on the senior author's experience teaching such a course eight times, is yes. Students learned: an authoring system specifically designed for creating computer games; Windows 95 file management and other basic computer literacy skills; how to integrate outputs from several programs in one project, a form of computer literacy vital for multimedia designers; “if-then-else” logic; and rudimentary knowledge of programming with real-time events. Students also mastered a process for creating unique games and developed skills as autonomous learners.
ABNT, Harvard, Vancouver, APA, etc. styles
13

Peditto, Matteo, Riccardo Nucera, Erasmo Rubino, Antonia Marcianò, Marco Bitto, Antonio Catania, and Giacomo Oteri. "Improving Oral Surgery: A Workflow Proposal to Create Custom 3D Templates for Surgical Procedures". Open Dentistry Journal 14, no. 1 (February 14, 2020): 35–44. http://dx.doi.org/10.2174/1874210602014010035.

Full text of the source
Abstract:
Background: Computer-guided technologies are adopted in various fields of surgery to limit invasiveness and obtain patient benefits in terms of surgery duration and post-operative course. Surgical templates realized through CAD/CAM technologies are widely diffused in implant dentistry. The aim of this work is to demonstrate, beyond implantology, the feasibility of applying 3D-printed surgical templates in oral surgery procedures requiring osteotomies (such as maxillary cyst enucleation and tooth disimpaction) in order to obtain accurate surgeries, avoid anatomical damage to surrounding structures, and decrease patient morbidity, using a simple, low-cost fabrication protocol. Objective: To provide a reliable CAD/CAM workflow for the realization of surgical templates in oral surgery. Methods: Three clinical scenarios are described: a maxillary canine disimpaction, a mandibular cyst removal, and an orthodontic miniscrew placement. Each was managed using custom surgical templates realized with the proposed workflow. A stereolithography (STL) file of the maxillary structures was obtained with the segmentation toolbox of a 3D medical image processing software (Materialise Mimics 20.0), from X-ray volumes acquired by Cone-Beam Computed Tomography (CBCT). Digital models of the teeth, acquired directly as STL files, are imported into the same software to merge the STL files of the maxillary structures and teeth. The data are then transferred into Blue Sky Plan 4.0 (Blue Sky Bio, LLC), a software package for 3D implant guide fabrication, together with the DICOM image package of the maxillary volumes, to carry out pre-surgical treatment planning. Anatomical structures at risk are identified; a contour of the ideal incision shape and bone osteotomy extent is drawn. Finally, the resulting three-dimensional guide is digitally generated and the surgical guide printed. The resulting 3D template shows the following major features: teeth support, flap management, and bone osteotomy design. Results: The proposed workflow aided the surgeon in both pre-operative and intra-operative phases through accurate virtual planning and the fabrication of precise surgical guides for use in oral surgery practice. In each clinical scenario, the use of custom 3D templates allowed better control of the osteotomy planes and flap management. No adverse events occurred during either the surgical or the healing phase. Conclusion: The proposed digital workflow represents a reliable and straightforward way to produce a surgical guide for oral surgery procedures. These templates represent a versatile tool in maxillary cyst enucleations, tooth disimpaction, and other surgical procedures, increasing accuracy, minimizing surgical complications, and decreasing patient morbidity.
ABNT, Harvard, Vancouver, APA, etc. styles
14

Hidayat, Arif, Sudarmaji, Dharmawan, Dedi Irawan, Lilik Joko Susanto, Mustika, and Hadi Pranoto. "Comparative Analysis Of Applications OSforensics, GetDataBack, Genius and Diskdigger On Digital Data Recovery in the Computer Device". International Journal of Engineering & Technology 7, no. 4.7 (September 27, 2018): 445. http://dx.doi.org/10.14419/ijet.v7i4.7.27356.

Full text of the source
Abstract:
Computer devices are now routinely used to process data in the form of text, images, or video quickly and easily. In practice, however, work stored as computer files can be lost, leaving required files missing from the computer's storage media. This research discusses and presents a comparative analysis of four software applications for recovering deleted data: OSforensics, GetDataBack, Disk Genius, and Diskdigger. The capabilities of these applications in recovering deleted data were tested and analyzed on flash drives. The tests indicate that all four applications work well at finding deleted data and are also able to recover or retrieve what was deleted.
ABNT, Harvard, Vancouver, APA, etc. styles
15

Alnajjar, Ibrahim Ali, and Massudi Mahmuddin. "The Enhanced Forensic Examination and Analysis for Mobile Cloud Platform by Applying Data Mining Methods". Webology 18, SI01 (January 13, 2021): 47–74. http://dx.doi.org/10.14704/web/v18si01/web18006.

Full text of the source
Abstract:
Investigating the mobile cloud environment is a challenging task due to the characteristics of voluminous data, dispersion of data, virtualization, and diverse data. Recent research works focus on applying the latest forensic methodologies to the mobile cloud investigation. This paper proposes an enhanced forensic examination and analysis model for the mobile cloud environment that incorporates timeline analysis, hash filtering, data carving, and data transformation sub-phases to improve the performance of cloud evidence identification and overall forensic decision-making. It analyzes the timeline of events and filters the case-specific files based on hash values and metadata using data mining methods. The proposed forensic model performs in-place carving on the filtered data to guide the investigation and integrates the heterogeneous file types and distributed pieces of evidence with the assistance of data mining. Finally, the proposed approach employs an LSTM-based model that significantly improves forensic decision making.
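The hash-filtering sub-phase can be illustrated with a few lines of standard-library Python: hash every file under a directory and drop those whose digest appears in a known-file set (an NSRL-style whitelist). The known-hash set and evidence path below are hypothetical placeholders.

```python
# Hedged sketch: hash filtering for forensic triage; files whose MD5 is
# in the known-benign set are filtered out of further examination.
import hashlib
from pathlib import Path

KNOWN = {"d41d8cd98f00b204e9800998ecf8427e"}  # placeholder known-file hashes

def filter_by_hash(root: str):
    for p in Path(root).rglob("*"):
        if p.is_file():
            digest = hashlib.md5(p.read_bytes()).hexdigest()
            if digest not in KNOWN:
                yield p, digest  # candidate evidence, not a known file

for path, digest in filter_by_hash("/evidence/image"):  # hypothetical mount
    print(digest, path)
```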
ABNT, Harvard, Vancouver, APA, etc. styles
16

He, Hai-yan, Chih-Yang Lin, and Hollis T. Cline. "In Vivo Time-Lapse Imaging and Analysis of Dendritic Structural Plasticity in Xenopus laevis Tadpoles". Cold Spring Harbor Protocols 2022, no. 1 (March 31, 2021): pdb.prot106781. http://dx.doi.org/10.1101/pdb.prot106781.

Full text of the source
Abstract:
In vivo time-lapse imaging of complete dendritic arbor structures in tectal neurons of Xenopus laevis tadpoles has served as a powerful in vivo model to study activity-dependent structural plasticity in the central nervous system during early development. In addition to quantitative analysis of gross arbor structure, dynamic analysis of the four-dimensional data offers particularly valuable insights into the structural changes occurring in subcellular domains over experience/development-driven structural plasticity events. Such analysis allows not only quantifiable characterization of branch additions and retractions with high temporal resolution but also identification of the loci of action. This allows for a better understanding of the spatiotemporal association of structural changes to functional relevance. Here we describe a protocol for in vivo time-lapse imaging of complete dendritic arbors from individual neurons in the brains of anesthetized tadpoles with two-photon microscopy and data analysis of the time series of 3D dendritic arbors. For data analysis, we focus on dynamic analysis of reconstructed neuronal filaments using a customized open source computer program we developed (4D SPA), which allows aligning and matching of 3D neuronal structures across different time points with greatly improved speed and reliability. File converters are provided to convert reconstructed filament files from commercial reconstruction software to be used in 4D SPA. The program and user manual are publicly accessible and operate through a graphical user interface on both Windows and Mac OSX.
ABNT, Harvard, Vancouver, APA, etc. styles
17

Haislip, Jacob Z., Khondkar E. Karim, Karen Jingrong Lin, and Robert E. Pinsker. "The Influences of CEO IT Expertise and Board-Level Technology Committees on Form 8-K Disclosure Timeliness". Journal of Information Systems 34, no. 2 (August 2, 2019): 167–85. http://dx.doi.org/10.2308/isys-52530.

Full text of the source
Abstract:
Recent research documents the improvement of Form 8-K disclosure timeliness in the post-Sarbanes-Oxley Act (SOX) era. However, it remains unclear why disclosure timeliness overall has improved while disclosure timeliness for certain events has not. We examine firms' information technology (IT) management and IT governance in order to investigate their potential positive impacts on 8-K reporting timeliness. We find that, on average, IT-expert Chief Executive Officers (CEOs) and firms with board-level technology committees file Form 8-Ks in a timelier manner. Specifically, firms with IT-expert CEOs file a half-day sooner and firms with technology committees file a full day sooner. Additional analyses show that firms with technology committees file 8-Ks in a timelier manner than firms without technology committees, even when the events are complicated or surprising. In aggregate, our evidence suggests that IT-expert CEOs and IT expertise on the board facilitate efficient IT utilization and are associated with timely disclosure. Data Availability: The data used are publicly available from the sources cited in the text.
ABNT, Harvard, Vancouver, APA, etc. styles
18

Gouveia, José Rafael Ferreira de, and Cristina Rodrigues Nascimento. "Uso e Cobertura do Solo após Eventos de Queimadas no Município de Floresta em Pernambuco". Revista Brasileira de Geografia Física 15, no. 6 (2022): 3121–35. http://dx.doi.org/10.26848/rbgf.v15.6.p3121-3135.

Full text of the source
Abstract:
Fires are recurrently used in Brazil to clear land and expand territories. Temporal analyses of the subject show that the municipality of Floresta in Pernambuco stood out in the number of cases in November 2019. This article aims to collect and quantify the incidence of hotspots and burned areas in that region and month, and to assess land use and land cover conditions based on remote sensing techniques. Information from the Pernambuco Water and Climate Agency (APAC) was used for climatology, and the Burning Database was used to collect and quantify hotspots. MODIS/TERRA, MODIS/Combined, and Landsat-8/OLI images were applied to characterize burned areas, interpret images, and calculate vegetation and fire indices. Thirty-five fire areas were identified, 31 of which coincided with confirmed hotspots, totaling 208; 7 of these areas coincided with November 6, 2019 (Julian day 310). The WDRVI, IAF, and NDVI vegetation indices showed that healthy vegetation behaves uniformly, with low to moderate density. After the fires, land use and cover regenerated, with no lasting change. SAVI shows very strong, and NBR and NBR2 strong, positive correlation coefficients, indicating the dependence of one variable on the other. Remote sensing thus proved effective for studying land use and cover conditions, as well as for identifying fires in the municipality of Floresta.
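The vegetation and burn indices named above are simple band ratios; a numpy sketch is below. The band choices (NIR, red, SWIR2) follow the standard NDVI and NBR definitions, with a small epsilon guarding against division by zero.

```python
# Hedged sketch: standard NDVI and NBR band-ratio indices with numpy.
import numpy as np

def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    """NDVI = (NIR - Red) / (NIR + Red), per pixel."""
    nir, red = nir.astype(float), red.astype(float)
    return (nir - red) / np.clip(nir + red, 1e-6, None)

def nbr(nir: np.ndarray, swir2: np.ndarray) -> np.ndarray:
    """NBR = (NIR - SWIR2) / (NIR + SWIR2); it drops sharply after a burn."""
    nir, swir2 = nir.astype(float), swir2.astype(float)
    return (nir - swir2) / np.clip(nir + swir2, 1e-6, None)
```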
ABNT, Harvard, Vancouver, APA, etc. styles
19

Lei, Yan, Linxiang He, and Houqiang Huang. "Enhancement of Nursing Effect in Emergency General Surgery Based on Computer Aid". Journal of Healthcare Engineering 2022 (March 10, 2022): 1–10. http://dx.doi.org/10.1155/2022/6745993.

Full text of the source
Abstract:
In order to improve the nursing effect of emergency general surgery, this paper applies computer algorithms to the intelligent management of general surgery nursing, using informatization to standardize nursing information, create electronic nursing files, measure nursing workload precisely, and make nursing quality control intelligent. This truly and objectively reflects nursing operations and treatment, prevents some adverse events, and effectively reduces the nursing workload. Moreover, this paper uses a standardized software design method to define the software concept, and then conducts a detailed requirements analysis of the nursing display function through detailed investigation, class work, discussion and analysis, and comparative decision-making methods. In addition, the software is implemented under strict coding standards, and finally test cases are designed to test and improve it. Actual case studies show that the computer-assisted emergency general surgery nursing method proposed in this paper represents an advance over traditional nursing methods.
ABNT, Harvard, Vancouver, APA, etc. styles
20

Sezonov, V. S. "The concept of a document in jurisprudence". Law and Safety 82, no. 3 (September 29, 2021): 200–208. http://dx.doi.org/10.32631/pb.2021.3.23.

Full text of the source
Abstract:
At the present stage of development of legal document science and forensic document examination techniques, there is a need to expand the concept of the document using the achievements of various fields of knowledge. A document is a multifaceted, generalized concept covering all important sources of information. Today, there is no consensus on the classification of legal documents. In our opinion, a document in jurisprudence has the following properties: it is a carrier and source of necessary information, and it is in documents that information is first recorded; it has legal meaning and economic significance and can serve as written evidence or a means of certifying something; and it allows information to be saved, retrieved from archives when necessary, and processed in current activities. As the main unit of clerical correspondence, the document performs certain functions and must meet certain drafting requirements. According to the method of presenting the material, all documents are divided into documents with a low level of standardization and documents with a high level of standardization. Legal documents in electronic form have a completely different nature. The information that constitutes the essence of an electronic document is tied to the term "file", which in the scientific literature means a specific place on a computer medium with its own specific name. In addition, an electronic document can exist both in the form of a single file (or a set of files) and in the form of a link (on the Internet). It therefore seems appropriate to define a legal document as a material object with information recorded on it that is meaningful for establishing legal circumstances and exists in appropriate forms and formats. A legal document is an information object that is compiled in accordance with legal requirements and contains legal information (confirmation of rights or obligations, or confirmation of legal facts, events, or actions that give rise to certain rights and obligations). The main way to modernize the current legal norms and provisions relating to legal documents and records is to approve new types of legal documents in electronic form as having the same legal force as paper documents. Certain provisions require approval of their scope and of responsibility for their protection.
ABNT, Harvard, Vancouver, APA, etc. styles
21

Kobayashi, Kent D., and H. C. Bittenbender. "155 Farmer's Bookshelf: Evolution of an Information System for Crops in Hawaii". HortScience 34, no. 3 (June 1999): 468E—469. http://dx.doi.org/10.21273/hortsci.34.3.468e.

Full text of the source
Abstract:
In 1988, the Farmer's Bookshelf started out as a computerized information system on crops grown in Hawaii. The first version was created on an Apple Macintosh computer using a hypermedia program called HyperCard. Because HyperCard came with each Macintosh computer, only the crop files needed to be sent to clientele. As demand for an IBM-compatible version of the Farmer's Bookshelf increased, a Windows version was created using a hypermedia program called Plus; in addition to the crop files, the runtime version of Plus was also distributed to clientele. Later, other files were added to the Farmer's Bookshelf, including files to diagnose problems of macadamia in the field, select ground covers, select landscape trees, recommend fertilization, calculate nut loss for macadamia growers, and calculate turfgrass irrigation. Cost-analysis spreadsheets for several crops were also added. Recently, the Farmer's Bookshelf was moved to the World Wide Web, which has the advantages of reaching a worldwide clientele, easier updating and modification, and linking to sites of related information. We have added links to newspaper articles on agriculture in Hawaii, to related sites on particular crops, to on-line agricultural magazines and newsletters, to agricultural software, to upcoming agricultural events, and to Y2K sites. Because of the benefits of the Web version, the diskette versions (Macintosh and Windows) are no longer supported. Putting the Farmer's Bookshelf on the Web has allowed us to better meet our clientele's needs for up-to-date information.
ABNT, Harvard, Vancouver, APA, etc. styles
22

Mueggler, Elias, Henri Rebecq, Guillermo Gallego, Tobi Delbruck, and Davide Scaramuzza. "The event-camera dataset and simulator: Event-based data for pose estimation, visual odometry, and SLAM". International Journal of Robotics Research 36, no. 2 (February 2017): 142–49. http://dx.doi.org/10.1177/0278364917691115.

Full text of the source
Abstract:
New vision sensors, such as the dynamic and active-pixel vision sensor (DAVIS), incorporate a conventional global-shutter camera and an event-based sensor in the same pixel array. These sensors have great potential for high-speed robotics and computer vision because they allow us to combine the benefits of conventional cameras with those of event-based sensors: low latency, high temporal resolution, and very high dynamic range. However, new algorithms are required to exploit the sensor characteristics and cope with its unconventional output, which consists of a stream of asynchronous brightness changes (called “events”) and synchronous grayscale frames. For this purpose, we present and release a collection of datasets captured with a DAVIS in a variety of synthetic and real environments, which we hope will motivate research on new algorithms for high-speed and high-dynamic-range robotics and computer-vision applications. In addition to global-shutter intensity images and asynchronous events, we provide inertial measurements and ground-truth camera poses from a motion-capture system. The latter allows comparing the pose accuracy of ego-motion estimation algorithms quantitatively. All the data are released both as standard text files and binary files (i.e. rosbag). This paper provides an overview of the available data and describes a simulator that we release open-source to create synthetic event-camera data.
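To give a feel for the released text files, the sketch below loads a plain-text event stream (one "timestamp x y polarity" line per event, as we understand the dataset's text format) and accumulates an event-count image at the 240x180 resolution of the DAVIS240 sensor. The file name is a placeholder.

```python
# Hedged sketch: load an event-camera text file ("timestamp x y polarity"
# per line) and build an event-count image.
import numpy as np

def load_events(path: str):
    data = np.loadtxt(path)
    t = data[:, 0]
    x, y, p = (data[:, i].astype(int) for i in (1, 2, 3))
    return t, x, y, p

def event_count_image(x, y, width=240, height=180):  # DAVIS240 resolution
    img = np.zeros((height, width), dtype=np.int32)
    np.add.at(img, (y, x), 1)   # handles repeated pixel coordinates
    return img

t, x, y, p = load_events("events.txt")  # placeholder file name
print(event_count_image(x, y).max(), "events at the busiest pixel")
```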
ABNT, Harvard, Vancouver, APA, etc. styles
23

Poquerusse, M., and P. S. McIntosh. "Type III Radio Burst Productivity of Solar Flares". International Astronomical Union Colloquium 104, no. 2 (1989): 177–80. http://dx.doi.org/10.1017/s0252921100154107.

Full text of the source
Abstract:
We study the statistical relationship between optical flares and type III radio bursts, using modern and extensive computer files. Results emerge along two main lines, concerning the physical mechanism of ejection of energetic particles and the magnetic field geometry, respectively. First, we find that type III probability of occurrence increases strongly with the brightness of a flare and its proximity to a sunspot, and with accompanying prominence activity. This suggests that Bornmann's class I and III events correspond to distinct physical phenomena, particle acceleration and magnetic expansion respectively, both working simultaneously in class II events, which are the most favorable to the ejection of energetic particles out of flaring sites.
ABNT, Harvard, Vancouver, APA, etc. styles
24

Patil, Anand N., and Sujata V. Mallapur. "Novel machine learning based authentication technique in VANET system for secure data transmission". Journal of Autonomous Intelligence 6, no. 2 (August 8, 2023): 828. http://dx.doi.org/10.32629/jai.v6i2.828.

Full text of the source
Abstract:
Adaptive transport technologies based on vehicular ad hoc networks (VANETs) have shown considerable potential in light of the growing expansion of driver assistance and automobile telecommunication systems. However, confidentiality and safety are vital challenges in vehicular ad hoc networks and can be seriously impaired by malicious attackers. While protecting vehicle privacy from threats, it is imperative to stop internal vehicles from putting out bogus messages. Considering these issues, a novel machine learning based message authentication scheme combined with blockchain and the InterPlanetary File System (IPFS) is proposed to achieve message dissemination in a secure way. Blockchain is an emerging technology that attempts to solve these problems by producing tamper-proof records of events in a distributed environment, and the InterPlanetary File System used in the framework is a protocol designed to store the events with content addressability. Along with this combined technology, the source metadata information collected from the InterPlanetary File System is stored via a smart contract and uploaded to the distributed ledger technology (DLT). For event authentication, K-means clustering and a support vector machine (SVM) classifier are employed in this framework. K-means clustering groups the vehicles, and each is marked as malicious or not malicious. After clustering, the support vector machine classifier detects malicious event messages. In this way, malicious messages are identified and dropped, and only secure messages are forwarded in the network. Finally, theoretical study and simulations show that our approach is capable of creating a safe and decentralized vehicular ad hoc network architecture with accountability and confidentiality.
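The two-stage learning pipeline can be sketched with scikit-learn: cluster the messages with K-means, append the cluster label as a feature, and let an SVM flag malicious messages. The features and labels below are synthetic stand-ins; the paper's real inputs come from VANET event messages.

```python
# Hedged sketch: K-means clustering followed by an SVM classifier,
# mirroring the two-stage detection pipeline on synthetic stand-in data.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))              # stand-in message features
y = (X[:, 0] + X[:, 1] > 0).astype(int)    # stand-in malicious labels

clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
X_aug = np.column_stack([X, clusters])     # cluster id as extra feature

clf = SVC(kernel="rbf").fit(X_aug[:150], y[:150])
print("held-out accuracy:", clf.score(X_aug[150:], y[150:]))
```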
ABNT, Harvard, Vancouver, APA, etc. styles
25

Sugier, Jarosław. "Scripting Scenarios of Pedestrian Behavior in a Computer Simulator of Security Monitoring System: A Practitioner's Perspective". Transport and Telecommunication Journal 24, no. 4 (November 1, 2023): 349–60. http://dx.doi.org/10.2478/ttj-2023-0027.

Full text of the source
Abstract:
The subject of this paper is the AvatarTraffic simulator: a computer system capable of modelling, in real time, environments such as subway stations or airport halls populated with tens or hundreds of moving figures which, in addition to the pedestrian traffic typical of this type of space, can perform predefined sequences of events and actions formulated as a simulation scenario. Thanks to integration with a real monitoring system, the simulator not only provides the data streams (including video) generated by the virtual scene but is also able to respond dynamically to actions taken by the system's staff. Using the Unity simulation engine as the implementation platform, a number of practical problems had to be solved during development, two of which are the subject of this article: a) supervising and correcting the AI algorithms used in Unity to simulate the pedestrian movement of avatars, and b) a textual description of the scenario of events taking place on the stage, editable by the experts planning tests of the monitoring system. Some of the more challenging cases of pedestrian movement are discussed (including forming queues and passing through doors), and the paper presents original algorithms that correct the work of Unity's built-in methods in situations where coordinated behaviour of groups of people is required. Because of the specifics of the simulator environment, the scenario needed to be expressed in a JSON text file; the article presents the implemented mechanisms for compiling it directly to the C# runtime environment and discusses the original command language created to model the sequences of events and actions making up the scenario.
ABNT, Harvard, Vancouver, APA, etc. styles
26

Marjai, Péter, Péter Lehotay-Kéry, and Attila Kiss. "A Novel Dictionary-Based Method to Compress Log Files with Different Message Frequency Distributions". Applied Sciences 12, no. 4 (February 16, 2022): 2044. http://dx.doi.org/10.3390/app12042044.

Full text of the source
Abstract:
In the present day, virtually all application software generates large numbers of log entries while it runs. The log files made from these entries are a collection of information about what happened while the program was running. This record can be used for multiple purposes, such as performance monitoring, maintaining security, or improving business decision making. Log entries are usually generated in a disorganized manner. Using template miners, the different ‘event types’ can be distinguished (each log entry is an event), and the set of all entries is split into disjoint subsets according to the event types. These events consist of two parts: the constant part, which is the same for all occurrences of the same event type, and the parameter part, which can differ for each occurrence. Since software mass-produces log files, in our previous paper we introduced an algorithm that uses the templates mined from the data to create a dictionary, which is then used to encode the log entries so that only the ID and the parameter list are stored. In this paper, we enhance our algorithm with the use of template frequencies, by encoding the parameters and also making use of Huffman coding. With these measures, compared to the previous 67.4% compression rate, a 94.98% compression rate can be achieved (where the compression rate is 1 minus the ratio of the compressed file size to the uncompressed size). The running times of the different measures used to enhance our algorithm are also compared. We also analyze the difference between the compression rate of the enhanced algorithm and general compressors such as LZMA, Bzip2, and PPMd, and examine whether the size of the log files can be further decreased by combining our enhanced method with the general compressors. We also generate log files that follow different distributions to examine the compression capability when the distribution does not follow the power law. Based on our experiments, we would recommend using the MoLFI (Multi-objective Log message Format Identification) template miner with our enhanced algorithm together with PPMd.
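The Huffman step rewards frequent templates with short codes. Below is a compact, generic Huffman-code builder over template IDs; it sketches the idea rather than reproducing the paper's exact encoder.

```python
# Hedged sketch: Huffman codes over template IDs, so frequent templates
# get shorter codes (generic algorithm, not the paper's exact encoder).
import heapq
from collections import Counter

def huffman_codes(symbols) -> dict:
    freq = Counter(symbols)
    heap = [[w, [sym, ""]] for sym, w in freq.items()]
    heapq.heapify(heap)
    while len(heap) > 1:
        lo, hi = heapq.heappop(heap), heapq.heappop(heap)
        for pair in lo[1:]:
            pair[1] = "0" + pair[1]   # left branch
        for pair in hi[1:]:
            pair[1] = "1" + pair[1]   # right branch
        heapq.heappush(heap, [lo[0] + hi[0]] + lo[1:] + hi[1:])
    return dict(heap[0][1:])

print(huffman_codes([1, 1, 1, 2, 2, 3]))  # e.g. {1: '0', 2: '10', 3: '11'}
```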
ABNT, Harvard, Vancouver, APA, etc. styles
27

Colligan, Thomas, Kayla Irish, Douglas J. Emlen, and Travis J. Wheeler. "DISCO: A deep learning ensemble for uncertainty-aware segmentation of acoustic signals". PLOS ONE 18, no. 7 (July 26, 2023): e0288172. http://dx.doi.org/10.1371/journal.pone.0288172.

Full text of the source
Abstract:
Recordings of animal sounds enable a wide range of observational inquiries into animal communication, behavior, and diversity. Automated labeling of sound events in such recordings can improve both throughput and reproducibility of analysis. Here, we describe our software package for labeling elements in recordings of animal sounds, and demonstrate its utility on recordings of beetle courtships and whale songs. The software, DISCO, computes sensible confidence estimates and produces labels with high precision and accuracy. In addition to the core labeling software, it provides a simple tool for labeling training data, and a visual system for analysis of resulting labels. DISCO is open-source and easy to install, it works with standard file formats, and it presents a low barrier of entry to use.
ABNT, Harvard, Vancouver, APA, etc. styles
28

Machard, Anaïs, Christian Inard, Jean-Marie Alessandrini, Charles Pelé, and Jacques Ribéron. "A Methodology for Assembling Future Weather Files Including Heatwaves for Building Thermal Simulations from the European Coordinated Regional Downscaling Experiment (EURO-CORDEX) Climate Data". Energies 13, no. 13 (July 2, 2020): 3424. http://dx.doi.org/10.3390/en13133424.

Full text of the source
Abstract:
With increasing mean and extreme temperatures due to climate change, it becomes necessary to use not only future typical conditions but also future heatwaves in building thermal simulations. Future typical weather files are widespread, but few researchers have put together methodologies to reproduce future extreme conditions. Furthermore, climate uncertainties need to be considered, which is often difficult due to limited data accessibility. In this article, we propose a methodology to re-assemble future weather files, ready to use for building simulations, from the European Coordinated Regional Downscaling Experiment (EURO-CORDEX) dynamically downscaled regional climate multi-year projections. This is the first time this database has been used to assemble weather files for building simulations, owing to its recent availability. Two types of future weather files are produced: typical weather years (TWY) and heatwave events (HWE). Combined, they can be used to fully assess building resilience to overheating in future climate conditions. A case study building in Paris is modelled to compare the impact of the different weather files on the indoor operative temperature of the building. The results confirm that it is better to use multiple types of future weather files, climate models, and/or scenarios to fully grasp climate projection uncertainties.
ABNT, Harvard, Vancouver, APA, etc. styles
29

Kern, Fabian, Jeremy Amand, Ilya Senatorov, Alina Isakova, Christina Backes, Eckart Meese, Andreas Keller, and Tobias Fehlmann. "miRSwitch: detecting microRNA arm shift and switch events". Nucleic Acids Research 48, W1 (May 1, 2020): W268—W274. http://dx.doi.org/10.1093/nar/gkaa323.

Full text of the source
Abstract:
Arm selection, the preferential expression of a 3′ or 5′ mature microRNA (miRNA), is a highly dynamic and tissue-specific process. Time-dependent expression shifts or switches between the arms are also relevant for human diseases. We present miRSwitch, a web server to facilitate the analysis and interpretation of arm selection events. Our species-independent tool evaluates pre-processed small non-coding RNA sequencing (sncRNA-seq) data, i.e. expression matrices or output files from miRNA quantification tools (miRDeep2, miRMaster, sRNAbench). miRSwitch highlights potential changes in the distribution of mature miRNAs from the same precursor. Group comparisons from one or several user-provided annotations (e.g. disease states) are possible. Results can be dynamically adjusted by choosing from a continuous range of highly specific to very sensitive parameters. Users can compare potential arm shifts in the provided data to a human reference map of pre-computed arm shift frequencies. We created this map from 46 tissues and 30,521 samples. As case studies we present novel arm shift information in an Alzheimer's disease biomarker data set and from a comparison of tissues in Homo sapiens and Mus musculus. In summary, miRSwitch offers a broad range of customized arm switch analyses along with comprehensive visualizations, and is freely available at: https://www.ccb.uni-saarland.de/mirswitch/.
ABNT, Harvard, Vancouver, APA, etc. styles
30

Hepisuthar, Ms., et al. "Comparative Analysis Study on SSD, HDD, and SSHD". Turkish Journal of Computer and Mathematics Education (TURCOMAT) 12, no. 3 (April 10, 2021): 3635–41. http://dx.doi.org/10.17762/turcomat.v12i3.1644.

Full text of the source
Abstract:
In the current century, permanent storage devices and methods of storing data have changed from the traditional HDD to the SSD. In this document, we discuss the merger of HDD and SSD: the SSHD, or solid-state hybrid drive, a combination of both secondary storage devices that enhances system performance. Inside the SSHD, data movement events occur without any user input. Recent research has suggested the SSD mainly as a replacement for secondary storage. The HDD also has a long life span and remains reliable for long-term data retention. HDD storage uses magnetic fields to store data, whereas the SSD contains NAND flash memory to write data to the drive, so the two differ in storage method and material. Both HDD and SSD features continue to improve with advancing computer technology. SSHDs are a good choice for enhancing computing speed and processing performance, and their share of usage in current laptop and desktop computer systems is increasing.
ABNT, Harvard, Vancouver, APA, etc. styles
31

Szpyrka, Marcin, Edyta Brzychczy, Aneta Napieraj, Jacek Korski, and Grzegorz J. Nalepa. "Conformance Checking of a Longwall Shearer Operation Based on Low-Level Events". Energies 13, no. 24 (December 15, 2020): 6630. http://dx.doi.org/10.3390/en13246630.

Full text of the source
Abstract:
Conformance checking is a process mining technique that compares a process model with an event log of the same process to check whether the current execution stored in the log conforms to the model and vice versa. This paper deals with the conformance checking of a longwall shearer process. The approach uses place-transition Petri nets with inhibitor arcs for modeling purposes. We use event log files collected from a few coal mines located in Poland by Famur S.A., one of the global suppliers of coal mining machines. One of the main advantages of the approach is the possibility for both offline and online analysis of the log data. The paper presents a detailed description of the longwall process, an original formal model we developed, selected elements of the approach’s implementation and the results of experiments.
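In miniature, replay-based conformance checking on a Petri net looks like this: fire each logged event if its input places are marked, and report the first deviation. The two-transition net is a hypothetical fragment of a shearer cycle, and the sketch uses a plain place-transition net without the inhibitor arcs the paper's model adds.

```python
# Hedged sketch: token replay on a tiny place-transition net (no inhibitor
# arcs); each transition maps to (input places, output places).
NET = {
    "cut_left": ({"idle"}, {"cutting"}),      # hypothetical shearer states
    "stop":     ({"cutting"}, {"idle"}),
}

def replay(trace: list, marking: set) -> str:
    for event in trace:
        pre, post = NET[event]
        if not pre <= marking:                # inputs not all marked
            return f"trace deviates at '{event}'"
        marking = (marking - pre) | post      # consume and produce tokens
    return "trace conforms"

print(replay(["cut_left", "stop"], {"idle"}))  # trace conforms
print(replay(["stop"], {"idle"}))              # trace deviates at 'stop'
```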
Estilos ABNT, Harvard, Vancouver, APA, etc.
32

Krämer, Stefan D., Johannes Wöhrle, Christin Rath e Günter Roth. "Anabel: An Online Tool for the Real-Time Kinetic Analysis of Binding Events". Bioinformatics and Biology Insights 13 (janeiro de 2019): 117793221882138. http://dx.doi.org/10.1177/1177932218821383.

Texto completo da fonte
Resumo:
Anabel (Analysis of binding events + l) is an open source online software tool (www.skscience.org/anabel) for the convenient analysis of molecular binding interactions. Currently, exported datasets from Biacore (surface plasmon resonance [SPR]), FortéBio (biolayer interferometry [BLI]), and Biametrics (single color reflectometry [SCORE]) can be uploaded and evaluated in Anabel using 2 different evaluation methods. Moreover, a universal data template format is provided to upload any other binding dataset to Anabel. This enables an easier comparison of different analysis methods for all users. Furthermore, a guide was established in Anabel to help inexperienced users obtain optimal results. In addition, expert features can be used to optimize and control the fit of the binding model to the measured data. We tried to make the process of fitting and evaluating as easy as possible through the use of an intuitive user interface. At the end of every analysis, a single Excel file containing all results and graphs of the performed analysis can be downloaded.
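As an illustration of the kind of kinetic fit such tools perform, here is a minimal sketch of fitting the 1:1 association phase of a sensorgram with SciPy; the synthetic signal and all parameter values are invented and do not come from Anabel.

```python
# A minimal sketch of fitting a 1:1 binding association phase, the kind of
# kinetic model applied to SPR/BLI sensorgrams. Rmax, kobs and the analyte
# concentration below are illustrative values only.
import numpy as np
from scipy.optimize import curve_fit

def association(t, rmax, kobs):
    return rmax * (1.0 - np.exp(-kobs * t))

rng = np.random.default_rng(0)
t = np.linspace(0, 120, 200)                               # seconds
signal = association(t, 80.0, 0.03) + rng.normal(0, 0.5, t.size)

(rmax_fit, kobs_fit), _ = curve_fit(association, t, signal, p0=[50.0, 0.01])
# For a 1:1 model, kobs = ka * C + kd; with kd known from the dissociation
# phase and C the analyte concentration, ka follows directly.
print(f"Rmax = {rmax_fit:.1f}, kobs = {kobs_fit:.4f} 1/s")
```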
Estilos ABNT, Harvard, Vancouver, APA, etc.
33

Manihda, O. V., e V. A. Hnera. "PREFERENCES OF USING GEOINFORMATION SYSTEMS FOR FIXATION ON ARCHAEOLOGICAL OBJECTS". Archaeology and Early History of Ukraine 30, n.º 1 (25 de março de 2019): 218–30. http://dx.doi.org/10.37445/adiu.2019.01.17.

Texto completo da fonte
Resumo:
The paper presents examples of fixing archaeological objects using a geoinformation system (GIS), an effective computer-supported system for the digital visualization and analysis of geographic features and the events happening on them. The main advantages of these methods are demonstrated through the work of specialists of the Architectural-Archaeological Expedition of the Archaeology Institute of NASU over several years, with experience gained both in the field and in urban space. According to the authors, the main advantages are: 1) accuracy of fixation in difficult conditions; 2) a multipurpose and flexible coordinate system; 3) a unified format for different file types; 4) the possibility of reconstructing an object from earlier drawings; 5) the creation of a topographic base plan for future excavations; 6) the combination, in one GIS model, of the different types of information appropriate to an archaeological object; 7) the joining of database attribute tables related to archaeological objects fixed during excavation in GIS formats. An effective algorithm of object fixation using the most basic GIS methods is proposed.
Estilos ABNT, Harvard, Vancouver, APA, etc.
34

Prime, Sunantha. "Forecasting the changes in daily stock prices in Shanghai Stock Exchange using Neural Network and Ordinary Least Squares Regression". Investment Management and Financial Innovations 17, n.º 3 (1 de outubro de 2020): 292–307. http://dx.doi.org/10.21511/imfi.17(3).2020.22.

Texto completo da fonte
Resumo:
The research focuses on finding a superior forecasting technique to predict stock movement and behavior on the Shanghai Stock Exchange. The author’s interest is in stock market activity during a period of high volatility, specifically the 13 years from 2002 to 2015, a period fueled by events such as the dot-com bubble, the SARS outbreak, political leadership transitions, and the global financial crisis. The study aims to analyze changes in stock prices during this unstable period. The author used advanced computer science, namely machine learning through information processing and training, and the traditional statistical approach, a multiple linear regression model fitted with the least-squares method. Both techniques are accurate predictors as measured by absolute percent error, with a range of 1.50% to 1.65%, on a data file containing 3,283 observations recording the daily close prices of individual Chinese companies. The paired-difference t-test shows the superiority of the neural network in the finance sector, but potentially not in other sectors, where the multiple linear regression model performs equivalently to the neural network.
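A minimal sketch of the comparison design follows, on synthetic data: a small multilayer-perceptron regressor against ordinary least squares on lagged closing prices, scored by absolute percent error with a paired t-test. The lag count, network size and random-walk series are assumptions, not the paper's setup.

```python
# A minimal sketch: neural network vs. OLS on lagged close prices,
# compared by absolute percent error (APE) and a paired t-test.
import numpy as np
from scipy.stats import ttest_rel
from sklearn.linear_model import LinearRegression
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
prices = np.cumsum(rng.normal(0, 1, 600)) + 100.0          # synthetic closes
X = np.column_stack([prices[i:i - 5] for i in range(5)])   # 5 lagged closes
y = prices[5:]
X_tr, X_te, y_tr, y_te = X[:450], X[450:], y[:450], y[450:]

ols = LinearRegression().fit(X_tr, y_tr)
nn = MLPRegressor(hidden_layer_sizes=(32,), max_iter=5000,
                  random_state=0).fit(X_tr, y_tr)

ape_ols = np.abs(ols.predict(X_te) - y_te) / np.abs(y_te) * 100
ape_nn = np.abs(nn.predict(X_te) - y_te) / np.abs(y_te) * 100
t_stat, p_val = ttest_rel(ape_nn, ape_ols)                 # paired-difference test
print(f"MAPE OLS {ape_ols.mean():.2f}%  NN {ape_nn.mean():.2f}%  p = {p_val:.3f}")
```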
Estilos ABNT, Harvard, Vancouver, APA, etc.
35

Kumar, Vijay, e Talwinder Kaur. "Cloud Functions and Serverless Computing". International Journal for Research in Applied Science and Engineering Technology 10, n.º 5 (31 de maio de 2022): 3426–27. http://dx.doi.org/10.22214/ijraset.2022.43163.

Texto completo da fonte
Resumo:
Abstract: Cloud Function in the simplest words is a Function-as-a-Service (FaaS). FaaS is actually a family of the serverless computing category. “Serverless” means that the user can focus on the application logic without dealing with infrastructure at all. Painless development, deployment, and maintenance of a web API is still not a turn-key solution, although modern web application frameworks have improved dramatically in the last few years. Serverless is without any doubt a game-changer. The event-driven approach combined with a scalable and robust cloud ecosystem offered by the main top cloud vendors opens endless opportunities. Cloud Functions is Google Cloud’s event-driven serverless compute platform. FaaS is a real NoOps technology; it completely abstracts away servers. Cloud Functions are developed with the sole purpose of building event-driven architectures. These can react to events like file changes in storage, messages in a queue, or any HTTP request. Index Terms: Cloud Computing, Serverless Technology, Function as a Service, Lambda Functions and Functions
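A minimal sketch of the two trigger styles described above, using the Python Functions Framework for Google Cloud Functions; the function names and the event fields read here are illustrative.

```python
# A minimal sketch of HTTP-triggered and event-triggered cloud functions,
# written against the functions-framework package for Python.
import functions_framework

@functions_framework.http
def hello_http(request):
    """Reacts to any HTTP request."""
    name = request.args.get("name", "world")
    return f"Hello, {name}!"

@functions_framework.cloud_event
def on_file_change(cloud_event):
    """Reacts to a storage event, e.g. a file uploaded to a bucket."""
    data = cloud_event.data
    print(f"File {data.get('name')} changed in bucket {data.get('bucket')}")
```

For local testing, the framework's CLI can serve the HTTP function, e.g. `functions-framework --target hello_http`.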
Estilos ABNT, Harvard, Vancouver, APA, etc.
36

Kumar, Vijay. "Cloud Functions Using Server less Computing". International Journal for Research in Applied Science and Engineering Technology 9, n.º 9 (30 de setembro de 2021): 418–19. http://dx.doi.org/10.22214/ijraset.2021.37986.

Texto completo da fonte
Resumo:
Abstract: Cloud Function in the simplest words is a Function-as-a-Service (FaaS). FaaS is actually a family of the serverless computing category. “Serverless” means that the user can focus on the application logic without dealing with infrastructure at all. Painless development, deployment, and maintenance of a web API is still not a turn-key solution, although modern web application frameworks have improved dramatically in the last few years. Serverless is without any doubt a game-changer. The event-driven approach combined with a scalable and robust cloud ecosystem offered by the main top cloud vendors opens endless opportunities. Cloud Functions is Google Cloud’s event-driven serverless compute platform. FaaS is a real NoOps technology; it completely abstracts away servers. Cloud Functions are developed with the sole purpose of building event-driven architectures. These can react to events like file changes in storage, messages in a queue, or any HTTP request. Index Terms: Cloud Computing, Serverless Technology, Function as a Service, Lambda Functions and Functions
Estilos ABNT, Harvard, Vancouver, APA, etc.
37

Yang, Jing. "Media Evolution, “Double-edged Sword” Technology and Active Spectatorship: investigating “Desktop Film” from media ecology perspective". Lumina 14, n.º 1 (30 de abril de 2020): 125–38. http://dx.doi.org/10.34019/1981-4070.2020.v14.30260.

Texto completo da fonte
Resumo:
Desktop film, or computer screen film, is a film subgenre in which all events and actions take place on a computer screen, shot from the protagonist’s first-person perspective, exemplified by The Den (2013), Open Windows (2014), Unfriended (2014), Unfriended: Dark Web (2018), Profile (2018) and Searching (2018). This paper focuses on desktop films within the theoretical framework of “Media Ecology”, aiming to investigate how the desktop film evolves and interacts with new media and digital technology while influencing communication and spectatorship. Firstly, this paper discusses the evolution of cinema, which evolves through interaction, co-existence and convergence with other media, corresponding to the anthropotropic trend. Secondly, this paper investigates digital media and technology in desktop films. Desktop films create cyberspaces and reproduce people’s virtual lives, revealing the influences of media technology, which is considered a double-edged sword. Thirdly, this paper analyzes how the desktop film affects cinematic communication while reshaping spectatorship and the audience’s viewing mechanism. Desktop films are suitable for watching on a computer, thus making audiences more active and giving them more autonomy.
Estilos ABNT, Harvard, Vancouver, APA, etc.
38

Chun, Hein, e Sangwoo Kim. "BAMixChecker: an automated checkup tool for matched sample pairs in NGS cohort". Bioinformatics 35, n.º 22 (14 de junho de 2019): 4806–8. http://dx.doi.org/10.1093/bioinformatics/btz479.

Texto completo da fonte
Resumo:
Abstract Summary: Mislabeling in the process of next generation sequencing is a frequent problem that can cause an entire genomic analysis to fail, and a regular cohort-level checkup is needed to ensure that it has not occurred. We developed a new, automated tool (BAMixChecker) that accurately detects sample mismatches from a given BAM file cohort with minimal user intervention. BAMixChecker uses a flexible, data-specific set of single-nucleotide polymorphisms and detects orphan (unpaired) and swapped (mispaired) samples based on genotype-concordance score and entropy-based file name analysis. BAMixChecker shows ∼100% accuracy in real WES, RNA-Seq and targeted sequencing data cohorts, even for small panels (<50 genes). BAMixChecker provides an HTML-style report that graphically outlines the sample matching status in tables and heatmaps, with which users can quickly inspect any mismatch events. Availability and implementation: BAMixChecker is available at https://github.com/heinc1010/BAMixChecker. Supplementary information: Supplementary data are available at Bioinformatics online.
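The genotype-concordance idea at the heart of such checks can be illustrated with a minimal sketch; the 0/1/2 genotype encoding, the toy calls and the matching threshold are assumptions for illustration, not BAMixChecker's implementation.

```python
# A minimal sketch of genotype concordance: two samples from the same
# individual should agree at most informative SNP sites.
# Genotypes are encoded as alt-allele counts (0/1/2); None means no call.
def concordance(gt_a: list, gt_b: list) -> float:
    """Fraction of SNP sites with identical genotype calls in both samples."""
    shared = [(a, b) for a, b in zip(gt_a, gt_b)
              if a is not None and b is not None]
    if not shared:
        return 0.0
    return sum(a == b for a, b in shared) / len(shared)

sample_tumor  = [0, 1, 2, 1, 0, 2, 1, 1]
sample_normal = [0, 1, 2, 1, 0, 2, 1, 0]
score = concordance(sample_tumor, sample_normal)
print(f"concordance = {score:.2f}")   # pairs scoring near 1.0 likely match
```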
Estilos ABNT, Harvard, Vancouver, APA, etc.
39

Baftiu, Naim, e Raif Bytyqi. "APPLICATION OF BLASTWARE SOFTWARE FOR MEASURING MICROCOLIMIC CONDITIONS". Teacher of the future 31, n.º 4 (5 de junho de 2019): 1093–98. http://dx.doi.org/10.35120/kij31041093b.

Texto completo da fonte
Resumo:
Under the legislation, safety and health at work are conceived as an integral part of the organization of work and the work process; on this basis they are provided to every worker and for every kind of useful work, regardless of the type and complexity of the work, in accordance with the constitutional principle of every worker's right to protection at work. Instantel Blastware, the Windows software companion to Instantel vibration monitors, offers powerful, easy-to-use features for event management, compliance reporting and advanced data analysis. Blastware is designed to perform several tasks to assist with monitoring operations: the software can be used to program any Series II, III or IV Instantel monitor, manage recorded events, remotely control monitors, and customize report content, language, frequency standard, and more. The program consists of two modules: the Compliance Module and the Advanced Module. The Compliance Module comes standard with each Instantel monitor. The Advanced Module, which is optional, includes powerful data analysis features and extended monitor setup options. The powerful Event Manager simplifies file transfer from the monitor and file management on the computer; the operator interface is intuitive and user-friendly; customized event reports with over 20 selectable national frequency standards support compliance reporting; frequency (FFT) analysis and reporting are easy to use; monitoring setup utilities configure systems for remote monitoring with modem communications; Blastware Mail automatically distributes event data to email and text messaging devices; and event data can be transferred to ASCII format. The purpose of this paper is to correlate the periodic measurements for the summer season at the company "Newko Balkan L.L.C." in Suharekë using the Instantel Blastware software.
Estilos ABNT, Harvard, Vancouver, APA, etc.
40

Ghozia, Ahmed, Gamal Attiya, Emad Adly e Nawal El-Fishawy. "Intelligence Is beyond Learning: A Context-Aware Artificial Intelligent System for Video Understanding". Computational Intelligence and Neuroscience 2020 (23 de dezembro de 2020): 1–15. http://dx.doi.org/10.1155/2020/8813089.

Texto completo da fonte
Resumo:
Understanding video files is a challenging task. While the current video understanding techniques rely on deep learning, the obtained results suffer from a lack of genuinely trustworthy meaning. Deep learning recognizes patterns from big data, leading to deep feature abstraction, not deep understanding. Deep learning tries to understand multimedia production by analyzing its content. We cannot understand the semantics of a multimedia file by analyzing its content only. Events occurring in a scene earn their meanings from the context containing them. A screaming kid could be scared of a threat or surprised by a lovely gift or just playing in the backyard. Artificial intelligence is a heterogeneous process that goes beyond learning. In this article, we discuss the heterogeneity of AI as a process that includes innate knowledge, approximations, and context awareness. We present a context-aware video understanding technique that makes the machine intelligent enough to understand the message behind the video stream. The main purpose is to understand the video stream by extracting real meaningful concepts, emotions, temporal data, and spatial data from the video context. The diffusion of heterogeneous data patterns from the video context leads to accurate decision-making about the video message and outperforms systems that rely on deep learning. Objective and subjective comparisons prove the accuracy of the concepts extracted by the proposed context-aware technique in comparison with the current deep learning video understanding techniques. Both systems are compared in terms of retrieval time, computing time, data size consumption, and complexity analysis. Comparisons show a significantly more efficient resource usage by the proposed context-aware system, which makes it a suitable solution for real-time scenarios. Moreover, we discuss the pros and cons of deep learning architectures.
Estilos ABNT, Harvard, Vancouver, APA, etc.
41

Gómez-Déniz, Emilio, e Enrique Calderín-Ojeda. "A Priori Ratemaking Selection Using Multivariate Regression Models Allowing Different Coverages in Auto Insurance". Risks 9, n.º 7 (20 de julho de 2021): 137. http://dx.doi.org/10.3390/risks9070137.

Texto completo da fonte
Resumo:
A comprehensive auto insurance policy usually provides the broadest protection for the most common events for which the policyholder would file a claim. On the other hand, some insurers offer extended third-party car insurance to adapt to the personal needs of every policyholder. The extra coverage includes cover against fire, natural hazards, theft, windscreen repair, and legal expenses, among some other coverages that apply to specific events that may cause damage to the insured’s vehicle. In this paper, a multivariate distribution, based on a conditional specification, is proposed to account for different numbers of claims for different coverages. Then, the premium is computed for each type of coverage separately rather than for the total claims number. Closed-form expressions are given for moments and cross-moments, parameter estimates, and for a priori premiums when different premiums principles are considered. In addition, the severity of claims can be incorporated into this multivariate model to derive multivariate claims’ severity distributions. The model is extended by developing a zero-inflated version. Regression models for both multivariate families are derived. These models are used to fit a real auto insurance portfolio that includes five types of coverage. Our findings show that some specific covariates are statistically significant in some coverages, yet they are not so for others.
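As a simplified illustration of per-coverage a priori ratemaking, the sketch below fits an independent Poisson GLM per coverage and sums the expected counts into a premium under the expected-value principle. This stands in for, and is much simpler than, the paper's conditionally specified multivariate model; all covariates and data are invented.

```python
# A minimal sketch: one count regression per coverage, priced separately,
# rather than a single model for the total claims number.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
df = pd.DataFrame({
    "age": rng.integers(18, 80, n),
    "urban": rng.integers(0, 2, n),
})
X = sm.add_constant(df[["age", "urban"]])
coverages = {"third_party": 0.12, "theft": 0.05, "windscreen": 0.08}

premiums = pd.Series(0.0, index=df.index)
for name, base_rate in coverages.items():
    y = rng.poisson(base_rate * (1 + 0.5 * df["urban"]))  # synthetic claim counts
    fit = sm.GLM(y, X, family=sm.families.Poisson()).fit()
    premiums += fit.predict(X)       # expected-value premium, per coverage
print(premiums.head())
```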
Estilos ABNT, Harvard, Vancouver, APA, etc.
42

Philp, Cassie, Barbara Geller e Fiona Alexander. "Psychiatric Induction Programme in Fife". BJPsych Open 8, S1 (junho de 2022): S170. http://dx.doi.org/10.1192/bjo.2022.475.

Texto completo da fonte
Resumo:
AimsTo improve the Psychiatry induction for DiTs in Fife.MethodsThe purpose of induction is to provide Doctors in Training (DiT) with a smooth, supported transition between roles. Delivered well, it will promote confidence and also provide a thorough grounding in the key requirements of the role and clarity regarding sources of help.A recent report, commissioned by the GMC, identified the key areas which should be covered in induction. The findings demonstrated a clear link between inadequate inductions to the impact on doctors’ well-being and patient safety issues.A questionnaire was issued to DiTs completing Psychiatry inductions in August and December 2021. Questions focused on the following key areas highlighted in the GMC report: •Gaining access to workplace settings and systems•Physical orientation of workplace•Team inductions•Daytime role and out of hours working and rotas.•Familiarisation with common cases/procedures that doctors may deal with in this speciality: risk management, use of the MHAResultsQuestionnaire Results: Key Issues highlightedAugust 2021 •FY2 to ST6 inducted together: differing experience levels•Differences in site inductions (psychiatry is spread across 3 hospitals in Fife)•Issues obtaining swipe cards/keys•IT access for emails and various computer systems delayed•Computer systems training not doneDecember 2021 •Lack of psychiatry experience of FY2s•Continued IT access issues initiallyConclusionIn September 2021, a working group was established comprising DiT representatives and those responsible for induction. The August 2021 results were disseminated and key improvements were identified in areas covered by the clinical induction: •An improved induction check list universal for all sites.•Induction documents for each role detailing responsibilities and useful information.•Integration of IT training.The December results highlighted improvements in many areas but continued a theme of concerns for FY2s starting in Psychiatry. The transition to this speciality is a significant adjustment as it operates differently to most specialities, requiring different skills and knowledge.Plans have been made to provide simulation events which would give DiTs practical experience in a safe environment of various topics e.g., risk management in psychiatry. Additionally, there are plans to revise induction for speciality trainees.
Estilos ABNT, Harvard, Vancouver, APA, etc.
43

Yang, Ting, Guanghua Zhang, Yin Li, Yiyu Yang, He Wang e Yuqing Zhang. "Detecting Privacy Leakage of Smart Home Devices through Traffic Analysis". Security and Communication Networks 2022 (15 de julho de 2022): 1–10. http://dx.doi.org/10.1155/2022/5655314.

Texto completo da fonte
Resumo:
Under the management of an Internet of Things platform, smart home devices can be operated remotely by users and greatly facilitate people’s lives. Currently, smart home devices have been widely accepted by consumers, and their number is rising rapidly. This increase introduces various security hazards to users: smart home devices are vulnerable to side-channel attacks based on network traffic, and the events of smart home devices can be identified by network surveillants. Given this situation, we designed a set of standardized workflows for traffic capturing, fingerprint feature extraction, and fingerprint event detection. Based on this workflow, we present IoTEvent, a semiautomatic tool to detect vulnerable smart home devices which is not limited to specific types of communication protocols. IoTEvent first collects device traffic by simulating touch events for the companion app. It then pairs the packet sequences with events and generates a signature file. We also test the usability and performance of IoTEvent on five cloud platforms of smart home devices. Finally, we discuss the reasons for privacy leakage of smart home devices and security countermeasures.
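The fingerprint-matching step can be pictured with a minimal sketch in which a device event is represented by its bucketed packet-length sequence; the signatures, bucket size and traffic values below are invented, not IoTEvent's actual features.

```python
# A minimal sketch of packet-length fingerprinting of smart home device events.
def fingerprint(packet_lengths: list, bucket: int = 64) -> tuple:
    """Coarsen packet sizes into buckets so minor payload jitter still matches."""
    return tuple(length // bucket for length in packet_lengths)

# Invented signature store mapping fingerprints to device events.
signatures = {
    fingerprint([199, 1514, 66, 310]): "camera: motion alert upload",
    fingerprint([120, 120, 540]): "plug: toggle on/off",
}

def detect(observed: list) -> str:
    return signatures.get(fingerprint(observed), "unknown event")

print(detect([210, 1500, 70, 300]))  # matches the camera signature after bucketing
```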
Estilos ABNT, Harvard, Vancouver, APA, etc.
44

Isupov, Alexander. "VME–based DAQ system for the Deuteron Spin Structure setup at the Nuclotron internal target station". EPJ Web of Conferences 204 (2019): 10003. http://dx.doi.org/10.1051/epjconf/201920410003.

Texto completo da fonte
Resumo:
A new, powerful VME-based data acquisition (DAQ) system has been designed for the Deuteron Spin Structure setup [1] placed at the Nuclotron Internal Target Station [2]. The DAQ system is built using the netgraph-based data acquisition and processing framework ngdp [3, 4]. The software dealing with the VME hardware is a set of netgraph nodes in the form of loadable kernel modules. The nodes specific to the current implementation are described, while the specific user-context software utilities are the following. The b2r (binary-to-ROOT) server converts raw data into per-trigger and per-accelerator-spill representations, which are based on C++ classes derived from those of the ROOT framework [5]. This approach allows us to generalize the code for histogram filling and polarization calculations. The b2r optionally stores ROOT events as a ROOT TTree in file(s) on HDD, and supports the design of some express offline analysis. The histGUI software module provides interactive online access for a human operator to histograms filled by the r2h (ROOT-to-histograms) server, which obtains the ROOT event representations from b2r. The r2h supports the calculation and histogramming of runtime-configurable variables as well as raw data variables, and optionally stores ROOT histograms in file(s) on HDD. Since the spin studies at the Nuclotron require fast and precise determination of the deuteron and proton beam polarization, a polarization calculator software module is introduced. This calculator, based on runtime-configurable r2h code, allows us to compute polarization values online and integrate them into the Web-based scheme of representation and control of the polarimeters [6, 7].
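As a rough sketch of the b2r/r2h pattern of filling ROOT histograms from decoded events and writing them out per spill, assuming PyROOT is available; the event layout (one ADC value per trigger) and all values are invented for illustration.

```python
# A minimal PyROOT sketch: fill a histogram from per-trigger values and
# store it in a per-spill ROOT file.
import ROOT

hist = ROOT.TH1F("adc", "ADC spectrum;channel;counts", 256, 0.0, 4096.0)
raw_events = [812.0, 2350.0, 1207.0, 990.0]   # stand-in for decoded VME data

for value in raw_events:
    hist.Fill(value)                          # per-trigger representation

out = ROOT.TFile("spill.root", "RECREATE")    # per-spill output file
hist.Write()
out.Close()
```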
Estilos ABNT, Harvard, Vancouver, APA, etc.
45

Ryciak, Piotr, Katarzyna Wasielewska e Artur Janicki. "Anomaly Detection in Log Files Using Selected Natural Language Processing Methods". Applied Sciences 12, n.º 10 (18 de maio de 2022): 5089. http://dx.doi.org/10.3390/app12105089.

Texto completo da fonte
Resumo:
In this article, we address the problem of detecting anomalies in system log files. Computer systems generate huge numbers of events, which are noted in event log files. While most of them report normal actions, an unusual entry may inform about a failure or malware infection. A human operator may easily miss such an entry; therefore, anomaly detection methods are used for this purpose. In our work, we used an approach known from the natural language processing (NLP) domain, which operates on so-called embeddings, that is vector representations of words or phrases. We describe an improved version of the LogEvent2Vec algorithm, proposed in 2020. In contrast to the original version, we propose a significant shortening of the analysis window, which both increased the accuracy of anomaly detection and made further analysis of suspicious sequences much easier. We experimented with various binary classifiers, such as decision trees or multilayer perceptrons (MLPs), and the Blue Gene/L dataset. We showed that selecting an optimal classifier (in this case, MLP) and a short log sequence gave very good results. The improved version of the algorithm yielded the best F1-score of 0.997, compared to 0.886 in the original version of the algorithm.
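A minimal sketch of the embedding-plus-classifier pipeline follows, assuming gensim and scikit-learn: log-event keys are embedded with word2vec, window vectors are averaged, and an MLP labels each window. The toy windows, labels and dimensions are invented, not the Blue Gene/L setup.

```python
# A minimal sketch: embed log-event keys, average them per window, classify.
import numpy as np
from gensim.models import Word2Vec
from sklearn.neural_network import MLPClassifier

windows = [["boot", "mount", "ok"], ["boot", "mount", "fail"],
           ["ok", "ok", "ok"], ["fail", "retry", "fail"]]
labels = [0, 1, 0, 1]                        # 1 = window contains an anomaly

w2v = Word2Vec(sentences=windows, vector_size=16, window=2, min_count=1, seed=0)
X = np.array([np.mean([w2v.wv[k] for k in win], axis=0) for win in windows])

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                    random_state=0).fit(X, labels)
print(clf.predict(X))   # short windows keep suspicious spans easy to localize
```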
Estilos ABNT, Harvard, Vancouver, APA, etc.
46

Broeckmann, Andreas. "Reseau/Resonance: Connective Processes and Artistic Practice". Leonardo 37, n.º 4 (agosto de 2004): 280–86. http://dx.doi.org/10.1162/0024094041724544.

Texto completo da fonte
Resumo:
Most Internet art projects use the Net solely as a telematic and telecommunicative transmission medium that connects computers and servers and through which artists, performers and users exchange data, communicate and collaboratively create files and events. At the same time, however, some artists are exploring the electronic networks as specific socio-technical structures with their respective forms of social and machinic agency, in which people and machines interact in ways unique to this environment. The author discusses recent projects that use the Net as a performative space of social and aesthetic resonance in which notions of subjectivity, action and production are being articulated and reassessed. This text discusses the notion of “resonance” in order to think through these approaches to network-based art practices.
Estilos ABNT, Harvard, Vancouver, APA, etc.
47

Davis, Allan Peter, Thomas C. Wiegers, Cynthia J. Grondin, Robin J. Johnson, Daniela Sciaky, Jolene Wiegers e Carolyn J. Mattingly. "Leveraging the Comparative Toxicogenomics Database to Fill in Knowledge Gaps for Environmental Health: A Test Case for Air Pollution-induced Cardiovascular Disease". Toxicological Sciences 177, n.º 2 (14 de julho de 2020): 392–404. http://dx.doi.org/10.1093/toxsci/kfaa113.

Texto completo da fonte
Resumo:
Abstract Environmental health studies relate how exposures (eg, chemicals) affect human health and disease; however, in most cases, the molecular and biological mechanisms connecting an exposure with a disease remain unknown. To help fill in these knowledge gaps, we sought to leverage content from the public Comparative Toxicogenomics Database (CTD) to identify potential intermediary steps. In a proof-of-concept study, we systematically compute the genes, molecular mechanisms, and biological events for the environmental health association linking air pollution toxicants with 2 cardiovascular diseases (myocardial infarction and hypertension) as a test case. Our approach integrates 5 types of curated interactions in CTD to build sets of “CGPD-tetramers,” computationally constructed information blocks relating a Chemical- Gene interaction with a Phenotype and Disease. This bioinformatics strategy generates 653 CGPD-tetramers for air pollution-associated myocardial infarction (involving 5 pollutants, 58 genes, and 117 phenotypes) and 701 CGPD-tetramers for air pollution-associated hypertension (involving 3 pollutants, 96 genes, and 142 phenotypes). Collectively, we identify 19 genes and 96 phenotypes shared between these 2 air pollutant-induced outcomes, and suggest important roles for oxidative stress, inflammation, immune responses, cell death, and circulatory system processes. Moreover, CGPD-tetramers can be assembled into extensive chemical-induced disease pathways involving multiple gene products and sequential biological events, and many of these computed intermediary steps are validated in the literature. Our method does not require a priori knowledge of the toxicant, interacting gene, or biological system, and can be used to analyze any environmental chemical-induced disease curated within the public CTD framework. This bioinformatics strategy links and interrelates chemicals, genes, phenotypes, and diseases to fill in knowledge gaps for environmental health studies, as demonstrated for air pollution-associated cardiovascular disease, but can be adapted by researchers for any environmentally influenced disease-of-interest.
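The tetramer-assembly step can be pictured as a pair of relational joins on the shared gene and chemical keys; below is a minimal pandas sketch with invented rows (not CTD content).

```python
# A minimal sketch of assembling CGPD-tetramer-like blocks by joining
# curated interaction tables, in the spirit of the CTD approach.
import pandas as pd

chem_gene = pd.DataFrame({"chemical": ["ozone", "PM2.5"],
                          "gene": ["IL6", "NOS2"]})
gene_pheno = pd.DataFrame({"gene": ["IL6", "NOS2"],
                           "phenotype": ["inflammatory response",
                                         "oxidative stress"]})
chem_disease = pd.DataFrame({"chemical": ["ozone", "PM2.5"],
                             "disease": ["myocardial infarction",
                                         "hypertension"]})

tetramers = (chem_gene.merge(gene_pheno, on="gene")
                      .merge(chem_disease, on="chemical"))
print(tetramers)   # chemical-gene-phenotype-disease blocks
```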
Estilos ABNT, Harvard, Vancouver, APA, etc.
48

Tse, William T., Kevin K. Duh e Morris Kletzel. "A Low-Cost, Open-Source Informatics Framework for Clinical Trials and Outcomes Research". Blood 118, n.º 21 (18 de novembro de 2011): 4763. http://dx.doi.org/10.1182/blood.v118.21.4763.4763.

Texto completo da fonte
Resumo:
Abstract 4763 Data collection and analysis in clinical studies in hematology often require the use of specialized databases, which demand extensive information technology (IT) support and are expensive to maintain. With the goal of reducing the cost of clinical trials and promoting outcomes research, we have devised a new informatics framework that is low-cost, low-maintenance, and adaptable to both small- and large-scale clinical studies. This framework is based on the idea that most clinical data are hierarchical in nature: a clinical protocol typically entails the creation of sequential patient files, each of which documents multiple encounters, during which clinical events and data are captured and tagged for later retrieval and analysis. These hierarchical trees of clinical data can be easily stored in a hypertext markup language (HTML) document format, which is designed to represent similar hierarchical data on web pages. In this framework, the stored clinical data will be structured according to a web standard called the Document Object Model (DOM), for which powerful informatics techniques have been developed to allow efficient retrieval and collation of data from the HTML documents. The proposed framework has many potential advantages. The data will be stored in plain text files in the HTML format, which is both human and machine readable, hence facilitating data exchange between collaborative groups. The framework requires only a regular web browser to function, thereby easing its adoption in multiple institutions. There will be no need to set up or maintain a relational database for data storage, thus minimizing data fragmentation and reducing the demand for IT support. Data entry and analysis will be performed mostly on the client computer, requiring the use of a backend server only for central data storage. Utility programs for data management and manipulation will be written in JavaScript and jQuery, computer languages that are free, open-source and easy to maintain. Data can be captured, retrieved, and analyzed on different devices, including desktop computers, tablets or smart phones. Encryption and password protection can be applied in document storage and data transmission to ensure data security and HIPAA compliance. In a pilot project to implement and test this informatics framework, we designed prototype programming modules to perform individual tasks commonly encountered in clinical data management. The functionalities of these modules included user-interface creation, patient data entry and retrieval, visualization and analysis of aggregate results, and exporting and reporting of extracted data. These modules were used to access simulated clinical data stored in a remote server, employing standard web browsers available on all desktop computers and mobile devices. To test the capability of these modules, benchmark tests were performed. Simulated datasets of complete patient records, each with 1000 data items, were created and stored in the remote server. Data were retrieved via the web in a gzip-compressed format. Retrieval of 100, 300, and 1000 such records took only 1.01, 2.45, and 6.67 seconds using a desktop computer via a broadband connection, or 3.67, 11.39, and 30.23 seconds using a tablet computer via a 3G connection. Filtering of specific data from the retrieved records was equally speedy. Automated extraction of relevant data from 300 complete records for a two-sample t-test analysis took 1.97 seconds. A similar extraction of data for a Kaplan-Meier survival analysis took 4.19 seconds. The program allowed the data to be presented separately for individual patients or in aggregate for different clinical subgroups. A user-friendly interface enabled viewing of the data in either tabular or graphical form. Incorporation of a new web browser technique permitted caching of the entire dataset locally for off-line access and analysis. Adaptable programming allowed efficient export of data in different formats for regulatory reporting purposes. Once the system was set up, no further intervention from the IT department was necessary. In summary, we have designed and implemented a prototype of a new informatics framework for clinical data management, which should be low-cost and highly adaptable to various types of clinical studies. Field-testing of this framework in real-life clinical studies will be the next step to demonstrate its effectiveness and potential benefits. Disclosures: No relevant conflicts of interest to declare.
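A minimal sketch of the DOM-style retrieval idea follows. The framework itself works in the browser with JavaScript and jQuery; this Python stand-in only illustrates pulling tagged data items out of a hierarchical HTML record, and the tags, attributes and values are invented.

```python
# A minimal sketch: clinical data stored as hierarchical HTML, retrieved
# by attribute lookup, in the spirit of DOM traversal.
from html.parser import HTMLParser

record = """
<div class="patient" data-id="001">
  <div class="encounter" data-date="2011-03-14">
    <span data-item="wbc">5.4</span>
    <span data-item="hgb">11.2</span>
  </div>
</div>
"""

class ItemExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.current = None
        self.items = {}
    def handle_starttag(self, tag, attrs):
        self.current = dict(attrs).get("data-item")
    def handle_data(self, data):
        if self.current and data.strip():
            self.items[self.current] = data.strip()
            self.current = None

parser = ItemExtractor()
parser.feed(record)
print(parser.items)   # {'wbc': '5.4', 'hgb': '11.2'}
```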
Estilos ABNT, Harvard, Vancouver, APA, etc.
49

Dempsey, Paul, Andy Eadon e Gerard Morris. "Simpol: a simplified urban pollution modelling tool". Water Science and Technology 36, n.º 8-9 (1 de outubro de 1997): 83–88. http://dx.doi.org/10.2166/wst.1997.0648.

Texto completo da fonte
Resumo:
In the UK, the standards which protect river aquatic life against the effects of short-term intermittent pollution events are expressed in terms of river concentration/duration thresholds for return periods ranging from 1 month to 1 year. To demonstrate compliance with these standards, it is necessary to simulate the performance of a drainage system and the river impact for a number of wet weather events, chosen such that the return periods of critical concentrations/durations can be estimated with confidence. Many events must be simulated, leading to long computer runs and large result files when using detailed simulation models. This paper describes how this problem has been addressed within the UK Urban Pollution Management procedure by developing a simplified urban pollution model called SIMPOL. This model is designed to represent the key urban processes in a relatively simple way. Accuracy is preserved by calibrating the model against a small number of detailed model results. As multiple runs can be carried out easily, SIMPOL modelling allows greater account to be taken of the variability in rainfall, river conditions and foul flow quality. Potential solutions can be rapidly tested against both environmental and emission standards. A case study is used to illustrate the application of SIMPOL.
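The return-period estimation that drives such compliance checks can be sketched minimally: rank the simulated event peaks and assign each an empirical return period of record length over rank. The pollutant, threshold and values below are invented for illustration.

```python
# A minimal sketch of empirical return periods from many simulated events.
def exceedance_return_periods(peaks: list, years: float) -> list:
    """Empirical return period of each event: record length / rank of its peak."""
    ranked = sorted(peaks, reverse=True)
    return [(peak, years / (rank + 1)) for rank, peak in enumerate(ranked)]

peaks_mg_l = [4.1, 2.8, 6.3, 1.9, 3.5, 5.0, 2.2, 1.4]  # peak concentration per event
for peak, rp in exceedance_return_periods(peaks_mg_l, years=2.0):
    print(f"peak {peak:.1f} mg/l  ~ return period {rp:.2f} yr")
# Compliance: the 1-month to 1-year thresholds are read off this empirical curve.
```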
Estilos ABNT, Harvard, Vancouver, APA, etc.
50

Hancock, Megan. "Museums and 3D Printing: More Than a Workshop Novelty, Connecting to Collections and the Classroom". Bulletin of the Association for Information Science and Technology 42, n.º 1 (outubro de 2015): 32–35. http://dx.doi.org/10.1002/bul2.2015.1720420110.

Texto completo da fonte
Resumo:
EDITOR'S SUMMARY: A typical museum visitor views an artwork for about half a minute, but 3D scanning and printing technology can help provide a more meaningful experience. Several museums have invited audiences to hands‐on 3D scanning and printing events, encouraging them to engage with museum pieces more deeply, creating 3D photo files and ultimately constructing 3D printed models of the pieces that can be manipulated and interacted with in new ways. The ARTLAB+ program in Washington, D.C., and the Parachute Factory in New Mexico are examples of makerspaces focusing on hands‐on activities with cutting-edge technologies, including 3D printers, for a variety of users. A special exhibit on 3D technology at the British Museum and Samsung Digital Discovery Center enabled visitors to recreate museum pieces with computer‐aided design technology, 3D pens and 3D printing. A resource list describes additional projects and information sources.
Estilos ABNT, Harvard, Vancouver, APA, etc.