Journal articles on the topic 'Hardware Construction Languages (HCLs)'

Consult the top 34 journal articles for your research on the topic 'Hardware Construction Languages (HCLs).'

1

Kamkin, Alexander Sergeevich, Mikhail Mikhaylovich Chupilko, Mikhail Sergeevich Lebedev, Sergey Aleksandrovich Smolov, and Georgi Gaydadjiev. "Comparison of High-Level Synthesis and Hardware Construction Tools." Proceedings of the Institute for System Programming of the RAS 34, no. 5 (2022): 7–22. http://dx.doi.org/10.15514/ispras-2022-34(5)-1.

Abstract:
Application-specific systems with FPGA accelerators are often designed using high-level synthesis or hardware construction tools. Nowadays, there are many frameworks available, both open-source and commercial. In this work, we attempt to fairly compare several existing solutions (languages and tools), including Verilog (our baseline), Chisel, Bluespec SystemVerilog (Bluespec Compiler), DSLX (XLS), MaxJ (MaxCompiler), and C (Bambu and Vivado HLS). Our analysis has been carried out using a representative example of the 8×8 inverse discrete cosine transform (IDCT), a widely used algorithm found in, among others, JPEG and MPEG decoders. The metrics under consideration include: (a) the degree of automation (how much less code is required compared to Verilog), (b) the controllability (the possibility of achieving given design characteristics, namely a given ratio of performance and area), and (c) the flexibility (the ease of modifying the design to achieve certain characteristics). Rather than focusing on computational kernels only, we have developed AXI-Stream wrappers for the synthesized implementations, which allows the designs' characteristics to be evaluated adequately when they are used as parts of real computer systems. Our study shows clear examples of what impact specific optimizations (tool settings and source code modifications) have on the overall system performance and area. It emphasizes how important it is to be able to control the balance between communication interface utilization and computational kernel performance, and it delivers clear guidelines for the next generation of tools for designing FPGA-accelerator-based systems.
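
For readers unfamiliar with the high-level-synthesis side of this comparison, the sketch below shows the kind of C kernel that HLS tools such as Bambu or Vivado HLS accept as input; a naive 8-point IDCT row pass is used here as an illustrative stand-in, not the authors' benchmark code:

```c
#include <math.h>

#define N  8
#define PI 3.14159265358979323846

/* One row pass of a naive 8-point inverse DCT.  An HLS tool unrolls
 * and pipelines these loops according to its settings, which is
 * exactly the "controllability" axis the paper measures against
 * hand-written Verilog and hardware construction languages. */
void idct_row(const double in[N], double out[N])
{
    for (int x = 0; x < N; x++) {
        double sum = 0.0;
        for (int u = 0; u < N; u++) {
            double cu = (u == 0) ? sqrt(0.5) : 1.0; /* DC normalization */
            sum += cu * in[u] * cos((2.0 * x + 1.0) * u * PI / 16.0);
        }
        out[x] = 0.5 * sum;
    }
}
```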
2

Заризенко, Инна Николаевна, and Артём Евгеньевич Перепелицын. "АНАЛИЗ СРЕДСТВ И ТЕХНОЛОГИЙ РАЗРАБОТКИ FPGA КАК СЕРВИС" [Analysis of Tools and Technologies for the Development of FPGA as a Service]. RADIOELECTRONIC AND COMPUTER SYSTEMS, no. 4 (December 25, 2019): 88–93. http://dx.doi.org/10.32620/reks.2019.4.10.

Abstract:
This article analyzes the most effective integrated development environments from leading programmable logic device (PLD) manufacturers. Heterogeneous computing and the applicability of a general approach to the description of hardware accelerator designs are considered. An analytical review of the use of the OpenCL language in the construction of high-performance FPGA-based solutions is performed. The features of OpenCL usage in heterogeneous computing for FPGA-based accelerators are discussed. The experience of a unified description of projects for solutions based on CPUs, GPUs, signal processors, and FPGAs is analyzed. The advantages of using such a description for tasks that perform parallel processing are shown. Differences in productivity and labor costs when developing FPGA systems with parallel data processing in hardware description languages versus OpenCL are shown. The results of comparing commercially available solutions for building services with FPGA accelerators are presented. The advantages of the Xilinx platform and tools for building an FPGA service are discussed. The stages of creating solutions based on FaaS are proposed. Some FaaS-related tasks are listed and development trends are discussed. The SDAccel platform of the Xilinx SDx family is considered, as well as the possible role of these tools in creating an FPGA computing platform as a service. An example of using SDAccel to develop parallel processing based on FPGA is given. The advantages and disadvantages of using hardware description languages with such design automation tools are discussed. Results comparing the simulation speed of systems described in programming languages and in hardware description languages are presented. The advantages of modeling complex systems are discussed, especially for testing solutions where tens of gigabytes of data must be processed and truncated test sets cannot be created. Based on practical experience, the characteristics of development environments, including undocumented ones, are formulated.
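
As a concrete anchor for the OpenCL discussion, here is a minimal kernel in OpenCL C (a C dialect); FPGA back-ends such as the SDAccel flow mentioned above compile the work-item body into a pipelined datapath. The kernel is a generic illustration, not code from the article:

```c
/* vadd.cl -- minimal OpenCL C kernel.  On an FPGA back-end each
 * work-item's body becomes a stage of a pipelined hardware datapath. */
__kernel void vadd(__global const float *a,
                   __global const float *b,
                   __global float *c,
                   const unsigned int n)
{
    size_t i = get_global_id(0);   /* one work-item per element */
    if (i < n)
        c[i] = a[i] + b[i];
}
```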
3

Kohen, Hanan, and Dov Dori. "Improving Conceptual Modeling with Object-Process Methodology Stereotypes." Applied Sciences 11, no. 5 (March 5, 2021): 2301. http://dx.doi.org/10.3390/app11052301.

Abstract:
As system complexity is on the rise, there is a growing need for standardized building blocks to increase the likelihood of systems’ success. Conceptual modeling is the primary activity required for engineering systems to be understood, designed, and managed. Modern modeling languages enable describing the requirements and design of systems in a formal yet understandable way. These languages use stereotypes to standardize, clarify the model semantics, and extend the meaning of model elements. An Internet of things (IoT) system serves as an example to show the significant contributions of stereotypes to model construction, comprehension, error reduction, and increased productivity during design, simulation, and combined hardware–software system execution. This research emphasizes stereotype features that are unique to Object-Process Methodology (OPM) ISO 19450, differentiating it from stereotypes in other conceptual modeling languages. We present the implementation of stereotypes in OPCloud, an OPM modeling software environment, explore stereotype-related problems, propose solutions, and discuss future enhancements.
4

Giraldo, Carlos Alberto, Beatriz Florian-Gaviria, Eval Bladimir Bacca-Cortés, Felipe Gómez, and Francisco Muñoz. "A programming environment having three levels of complexity for mobile robotics." Ingeniería e Investigación 32, no. 3 (September 1, 2012): 76–82. http://dx.doi.org/10.15446/ing.investig.v32n3.35947.

Abstract:
This paper presents a programming environment for supporting learning in STEM, particularly mobile robotics learning. It was designed to support progressive learning for people with and without previous knowledge of programming and/or robotics. The environment is multi-platform and built with open-source tools. Perception, mobility, communication, navigation, and collaborative behaviour functionalities can be programmed for different mobile robots. A learner is able to program robots using different programming languages and editor interfaces: a graphical programming interface (basic level), an XML-based meta-language (intermediate level), or the ANSI C language (advanced level). The environment supports program translation into different languages, either transparently or explicitly on the learner's demand. Learners can access proposed challenges and learning interfaces by examples. The environment was designed to provide characteristics such as extensibility, adaptive interfaces, persistence, and low software/hardware coupling. Functionality tests were performed to verify the programming environment's specifications. UV-BOT mobile robots were used in these tests.
5

Zielenkiewicz, Maciej, and Aleksy Schubert. "Automata theory approach to predicate intuitionistic logic." Journal of Logic and Computation 32, no. 3 (November 16, 2021): 554–80. http://dx.doi.org/10.1093/logcom/exab069.

Abstract:
Predicate intuitionistic logic is a well-established fragment of dependent types. Proof construction in this logic, as the Curry–Howard isomorphism states, is the process of program synthesis. We present automata that can handle proof construction and program synthesis in full intuitionistic first-order logic. Given a formula, we can construct an automaton such that the formula is provable if and only if the automaton has an accepting run. As further research, this construction makes it possible to discuss formal languages of proofs or programs, the closure properties of the automata and their connections with the traditional logical connectives.
6

Akay, Abdullah E., and John Sessions. "Applying the Decision Support System, TRACER, to Forest Road Design." Western Journal of Applied Forestry 20, no. 3 (July 1, 2005): 184–91. http://dx.doi.org/10.1093/wjaf/20.3.184.

Abstract:
A three-dimensional forest road alignment model, TRACER, was developed to assist a forest road designer with rapid evaluation of alternative road paths. The objective is to design a route with the lowest total cost considering construction, maintenance, and transportation costs, while conforming to design specifications, environmental requirements, and driver safety. The model integrates two optimization techniques: linear programming for earthwork allocation and a heuristic approach for vertical alignment selection. The model enhances user efficiency through automated horizontal and vertical curve fitting routines, cross-section generation, and cost routines for construction, maintenance, and vehicle use. The average sediment delivered to a stream from the road section is estimated using a GIS-based road erosion/delivery model. It is anticipated that the development of a design procedure incorporating modern graphics capability, hardware, software languages, modern optimization techniques, and environmental considerations will improve the design process for forest roads. West. J. Appl. For. 20(3):184–191.
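
The earthwork-allocation step that TRACER solves with linear programming is classically posed as a transportation problem; the formulation below is the textbook version, given only to make that step concrete (the paper's exact objective and constraints may differ):

```latex
\min_{x \ge 0} \sum_{i \in \mathrm{cuts}} \sum_{j \in \mathrm{fills}} c_{ij}\, x_{ij}
\quad \text{subject to} \quad
\sum_{j} x_{ij} = V_i^{\mathrm{cut}} \;\; \forall i,
\qquad
\sum_{i} x_{ij} = V_j^{\mathrm{fill}} \;\; \forall j
```

where x_ij is the volume hauled from cut section i to fill section j and c_ij is the unit cost of that haul (excavation plus transport).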
7

Popescu, Natalie, Ziyang Xu, Sotiris Apostolakis, David I. August, and Amit Levy. "Safer at any speed: automatic context-aware safety enhancement for Rust." Proceedings of the ACM on Programming Languages 5, OOPSLA (October 20, 2021): 1–23. http://dx.doi.org/10.1145/3485480.

Abstract:
Type-safe languages improve application safety by eliminating whole classes of vulnerabilities, such as buffer overflows, by construction. However, this safety sometimes comes with a performance cost. As a result, many modern type-safe languages provide escape hatches that allow developers to manually bypass the safety checks. The relative value of performance versus safety and the degree of performance obtained depend upon the application context, including user goals and the hardware upon which the application is to be executed. Since libraries may be used in many different contexts, library developers cannot make safety-performance trade-off decisions appropriate for all cases. Application developers can tune libraries themselves to increase safety or performance, but this requires extra effort and makes libraries less reusable. To address this problem, we present NADER, a Rust development tool that makes applications safer by automatically transforming unsafe code into equivalent safe code according to developer preferences and application context. In end-to-end system evaluations in a given context, NADER automatically reintroduces numerous library bounds checks, in many cases making application code that uses popular Rust libraries safer with no corresponding loss in performance.
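
To make the safety-performance trade-off concrete in language-neutral terms, the C fragment below contrasts unchecked and bounds-checked access; NADER's unsafe-to-safe transformation in Rust reintroduces checks analogous to the second function (the C rendering is ours, not the paper's):

```c
#include <stdio.h>
#include <stdlib.h>

/* Unchecked access: fast, but an out-of-range index is undefined
 * behavior -- the vulnerability class that safe code rules out. */
static int get_unchecked(const int *buf, size_t i) {
    return buf[i];
}

/* Checked access: the branch is the runtime cost that safe indexing
 * pays; NADER re-enables such checks wherever profiling shows the
 * cost to be negligible in the application's context. */
static int get_checked(const int *buf, size_t len, size_t i) {
    if (i >= len) {
        fprintf(stderr, "index %zu out of bounds (len %zu)\n", i, len);
        abort();
    }
    return buf[i];
}
```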
8

Banyasad, Omid, and Philip T. Cox. "Integrating design synthesis and assembly of structured objects in a visual design language." Theory and Practice of Logic Programming 5, no. 6 (October 31, 2005): 601–21. http://dx.doi.org/10.1017/s1471068404002285.

Abstract:
Computer Aided Design systems provide tools for building and manipulating models of solid objects. Some also provide access to programming languages so that parametrised designs can be expressed. There is a sharp distinction, therefore, between building models, a concrete graphical editing activity, and programming, an abstract, textual, algorithm-construction activity. The recently proposed Language for Structured Design (LSD) was motivated by a desire to combine the design and programming activities in one language. LSD achieves this by extending a visual logic programming language to incorporate the notions of solids and operations on solids. Here we investigate another aspect of the LSD approach, namely, that by using visual logic programming as the engine to drive the parametrised assembly of objects, we also gain the powerful symbolic problem-solving capability that is the forte of logic programming languages. This allows the designer/programmer to work at a higher level, giving declarative specifications of a design in order to obtain the design descriptions. Hence LSD integrates problem solving, design synthesis, and prototype assembly in a single homogeneous programming/design environment. We demonstrate this specification-to-final-assembly capability using the masterkeying problem for designing systems of locks and keys.
9

Izatri, Dini Idzni, Nofita Idaroka Rohmah, and Renny Sari Dewi. "Identifikasi Risiko pada Perpustakaan Daerah Gresik dengan NIST SP 800-30" [Risk Identification at the Gresik Regional Library Using NIST SP 800-30]. JURIKOM (Jurnal Riset Komputer) 7, no. 1 (February 15, 2020): 50. http://dx.doi.org/10.30865/jurikom.v7i1.1756.

Abstract:
With the rapid development of technology in Indonesia, several companies and government institutions have begun to implement IT in their systems, including the Gresik Regency Regional Library. Information Technology is a field of technology management that covers various areas, including but not limited to processes, computer software, information systems, computer hardware, programming languages, and data constructs. In short, anything that presents data, information, or knowledge in a visual format, through any multimedia distribution mechanism, is considered part of Information Technology. The Regional Library of Gresik Regency is one of the government institutions that has implemented Information Technology in its system. The library holds about thirty thousand books, consisting of novels, magazines, school textbooks, literature, and others. The Regional Library of Gresik Regency now uses the INLIS LITE application, which is used throughout the library, from the collection of books to the list of library members.
10

Wooldridge, Michael, and Nicholas R. Jennings. "Intelligent agents: theory and practice." Knowledge Engineering Review 10, no. 2 (June 1995): 115–52. http://dx.doi.org/10.1017/s0269888900008122.

Abstract:
The concept of an agent has become important in both artificial intelligence (AI) and mainstream computer science. Our aim in this paper is to point the reader at what we perceive to be the most important theoretical and practical issues associated with the design and construction of intelligent agents. For convenience, we divide these issues into three areas (though as the reader will see, the divisions are at times somewhat arbitrary). Agent theory is concerned with the question of what an agent is, and the use of mathematical formalisms for representing and reasoning about the properties of agents. Agent architectures can be thought of as software engineering models of agents; researchers in this area are primarily concerned with the problem of designing software or hardware systems that will satisfy the properties specified by agent theorists. Finally, agent languages are software systems for programming and experimenting with agents; these languages may embody principles proposed by theorists. The paper is not intended to serve as a tutorial introduction to all the issues mentioned; we hope instead simply to identify the most important issues, and point to work that elaborates on them. The article includes a short review of current and potential applications of agent technology.
11

Lin, Shaokai, Yatin A. Manerkar, Marten Lohstroh, Elizabeth Polgreen, Sheng-Jung Yu, Chadlia Jerad, Edward A. Lee, and Sanjit A. Seshia. "Towards Building Verifiable CPS using Lingua Franca." ACM Transactions on Embedded Computing Systems 22, no. 5s (September 9, 2023): 1–24. http://dx.doi.org/10.1145/3609134.

Abstract:
Formal verification of cyber-physical systems (CPS) is challenging because it has to consider real-time and concurrency aspects that are often absent in ordinary software. Moreover, the software in CPS is often complex and low-level, making it hard to assure that a formal model of the system used for verification is a faithful representation of the actual implementation, which can undermine the value of a verification result. To address this problem, we propose a methodology for building verifiable CPS based on the principle that a formal model of the software can be derived automatically from its implementation. Our approach requires that the system implementation is specified in Lingua Franca (LF), a polyglot coordination language tailored for real-time, concurrent CPS, which we made amenable to the specification of safety properties via annotations in the code. The program structure and the deterministic semantics of LF enable automatic construction of formal axiomatic models directly from LF programs. The generated models are automatically checked using Bounded Model Checking (BMC) by the verification engine Uclid5 using the Z3 SMT solver. The proposed technique enables checking a well-defined fragment of Safety Metric Temporal Logic (Safety MTL) formulas. To ensure the completeness of BMC, we present a method to derive an upper bound on the completeness threshold of an axiomatic model based on the semantics of LF. We implement our approach in the LF Verifier and evaluate it using a benchmark suite with 22 programs sampled from real-life applications and benchmarks for Erlang, Lustre, actor-oriented languages, and RTOSes. The LF Verifier correctly checks 21 out of 22 programs automatically.
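
The Safety MTL fragment mentioned above contains, for example, bounded-response properties; the formula below is an illustrative instance of that shape (our example, not one of the paper's 22 benchmarks):

```latex
\square \left( \mathit{request} \rightarrow \Diamond_{\le 5\,\mathrm{ms}}\ \mathit{grant} \right)
```

That is, every request must be followed by a grant within 5 ms; a violation is witnessed by a finite prefix, which is what makes such properties amenable to bounded model checking up to a completeness threshold.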
12

Ji, Jianmin, Fangfang Liu, and Jia-Huai You. "Well-founded operators for normal hybrid MKNF knowledge bases." Theory and Practice of Logic Programming 17, no. 5-6 (September 2017): 889–905. http://dx.doi.org/10.1017/s1471068417000291.

Abstract:
Hybrid MKNF knowledge bases have been considered one of the dominant approaches to combining open world ontology languages with closed world rule-based languages. Currently, the only known inference methods are based on the approach of guess-and-verify, while most modern SAT/ASP solvers are built under the DPLL architecture. The central impediment here is that it is not clear what constitutes a constraint propagator, a key component employed in any DPLL-based solver. In this paper, we address this problem by formulating the notion of unfounded sets for non-disjunctive hybrid MKNF knowledge bases, based on which we propose and study two new well-founded operators. We show that by employing a well-founded operator as a constraint propagator, a sound and complete DPLL search engine can be readily defined. We compare our approach with the operator based on the alternating fixpoint construction by Knorr et al. (2011. Artificial Intelligence 175, 9, 1528–1554) and show that, when applied to arbitrary partial partitions, the new well-founded operators not only propagate more truth values but also circumvent the non-converging behavior of the latter. In addition, we study the possibility of simplifying a given hybrid MKNF knowledge base by employing a well-founded operator and show that, out of the two operators proposed in this paper, the weaker one can be applied for this purpose and the stronger one cannot. These observations are useful in implementing a grounder for hybrid MKNF knowledge bases, which can be applied before the computation of MKNF models.
13

Li, Bin, Yunlong Fan, Miao Gao, Yikemaiti Sataer, and Zhiqiang Gao. "A Joint-Learning-Based Dynamic Graph Learning Framework for Structured Prediction." Electronics 12, no. 11 (May 23, 2023): 2357. http://dx.doi.org/10.3390/electronics12112357.

Abstract:
Graph neural networks (GNNs) have achieved remarkable success in structured prediction, owing to GNNs' powerful ability to learn expressive graph representations. However, most of these works learn graph representations based on a static graph constructed by an existing parser, suffering from two drawbacks: (1) the static graph might be error-prone, and the errors introduced in the static graph cannot be corrected and might accumulate in later stages, and (2) the graph construction stage and graph representation learning stage are disjoint, which negatively affects the model's running speed. In this paper, we propose a joint-learning-based dynamic graph learning framework and apply it to two typical structured prediction tasks: syntactic dependency parsing, which aims to predict a labeled tree, and semantic dependency parsing, which aims to predict a labeled graph, jointly learning the graph structure and graph representations. Experiments are conducted on four datasets: the Universal Dependencies 2.2, the Chinese Treebank 5.1, and the English Penn Treebank 3.0 in 13 languages for syntactic dependency parsing, and the SemEval-2015 Task 18 dataset in three languages for semantic dependency parsing. The experimental results show that our best-performing model achieves a new state-of-the-art performance on most language sets of syntactic dependency and semantic dependency parsing. In addition, our model also has an advantage in running speed over the static graph-based learning model. The outstanding performance demonstrates the effectiveness of the proposed framework in structured prediction.
14

Michael, Alexandra E., Anitha Gollamudi, Jay Bosamiya, Evan Johnson, Aidan Denlinger, Craig Disselkoen, Conrad Watt, et al. "MSWasm: Soundly Enforcing Memory-Safe Execution of Unsafe Code." Proceedings of the ACM on Programming Languages 7, POPL (January 9, 2023): 425–54. http://dx.doi.org/10.1145/3571208.

Abstract:
Most programs compiled to WebAssembly (Wasm) today are written in unsafe languages like C and C++. Unfortunately, memory-unsafe C code remains unsafe when compiled to Wasm—and attackers can exploit buffer overflows and use-after-frees in Wasm almost as easily as they can on native platforms. Memory-Safe WebAssembly (MSWasm) proposes to extend Wasm with language-level memory-safety abstractions to precisely address this problem. In this paper, we build on the original MSWasm position paper to realize this vision. We give a precise and formal semantics of MSWasm, and prove that well-typed MSWasm programs are, by construction, robustly memory safe. To this end, we develop a novel, language-independent memory-safety property based on colored memory locations and pointers. This property also lets us reason about the security guarantees of a formal C-to-MSWasm compiler—and prove that it always produces memory-safe programs (and preserves the semantics of safe programs). We use these formal results to then guide several implementations: Two compilers of MSWasm to native code, and a C-to-MSWasm compiler (that extends Clang). Our MSWasm compilers support different enforcement mechanisms, allowing developers to make security-performance trade-offs according to their needs. Our evaluation shows that on the PolyBenchC suite, the overhead of enforcing memory safety in software ranges from 22% (enforcing spatial safety alone) to 198% (enforcing full memory safety), and 51.7% when using hardware memory capabilities for spatial safety and pointer integrity. More importantly, MSWasm’s design makes it easy to swap between enforcement mechanisms; as fast (especially hardware-based) enforcement techniques become available, MSWasm will be able to take advantage of these advances almost for free.
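
The defect class MSWasm targets is easy to state in C; in the hypothetical fragment below, plain Wasm compilation lets the overflow silently corrupt adjacent linear memory, whereas under MSWasm's colored-pointer semantics the out-of-bounds store traps:

```c
#include <stddef.h>

/* A classic spatial-safety violation: writes one byte past an
 * 8-byte destination buffer.  Compiled to plain Wasm this corrupts
 * neighboring linear memory; under MSWasm the pointer's bounds
 * ("color") metadata causes the final store to trap. */
void copy_name(char dst[8], const char *src)
{
    for (size_t i = 0; i <= 8; i++)   /* off-by-one: should be i < 8 */
        dst[i] = src[i];
}
```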
15

Perepelitsyn, Artem. "Method of creation of FPGA based implementation of artificial intelligence as a service." Radioelectronic and Computer Systems, no. 3 (September 29, 2023): 27–36. http://dx.doi.org/10.32620/reks.2023.3.03.

Abstract:
The subject of study in this article is Field Programmable Gate Array (FPGA) technologies, methods, and tools for prototyping hardware accelerators of Artificial Intelligence (AI) and providing them as a service. The goal is to reduce the effort of creating and modifying FPGA implementations of AI projects and to provide such solutions as a service. Tasks: to analyze the possibilities of heterogeneous computing for the implementation of AI projects; analyze advanced FPGA technologies and accelerator cards that allow the organization of a service; analyze the languages, frameworks, and integrated environments for the creation of AI projects for FPGA implementation; propose a technique for modifiable FPGA project prototyping to ensure a long period of compatibility with integrated environments and target devices; propose a technique for the prototyping of high-performance FPGA services to improve the efficiency of FPGA-based AI projects; propose a sequence of optimization of neural networks for FPGA implementation; and provide an example of the practical implementation of the research results. According to these tasks, the following results were obtained. An analysis of the biggest companies and vendors of FPGA technology is performed. Existing heterogeneous technologies and potential non-electronic mediums for AI computations are discussed. FPGA accelerator cards with a large amount of High Bandwidth Memory (HBM) on the same chip package for the implementation of AI projects are analyzed and compared. Languages, frameworks, and technologies, as well as the capabilities of libraries and integrated environments for prototyping FPGA projects for AI applications, are analyzed in detail. A sequence for prototyping FPGA projects that are stable against changes in the environment is proposed. A sequence for prototyping highly efficient pipelined projects for data processing is proposed. The steps of optimizing neural networks for FPGA implementation of AI applications are provided. An example of practical use of the results of the research, including the use of the proposed sequences, is provided. Conclusions. One of the main contributions of this research is the proposed method of creating FPGA-based implementations of AI projects in the form of services. The proposed sequence of neural network optimization for FPGA allows the complexity of the initial program model to be reduced by more than five times for hardware implementation, depending on the required accuracy. The described solutions allow the construction of completely scalable and modifiable FPGA implementations of AI projects to provide them as a service.
16

Perbawa, Laodikia Galih Krisna, Muhammad Hasbi, and Bebas Widada. "Rekomendasi Tempat Wisata Di Kabupaten Grobogan" [Recommendations for Tourist Attractions in Grobogan Regency]. Jurnal Ilmiah SINUS 19, no. 2 (July 15, 2021): 57. http://dx.doi.org/10.30646/sinus.v19i2.560.

Abstract:
A Geographic Information System (GIS) is a computer system used to manipulate geographic data. Such a system is implemented with computer hardware and software that input and verify data and support data compilation, storage, change and update, management and exchange, manipulation, retrieval and presentation, and analysis. In Grobogan Regency there are still many tourist locations that are not known to the public, especially tourists from outside the area. There are approximately 21 tourist attractions in Grobogan Regency, including natural tourism such as waterfalls, springs (sendang), and caves (goa); religious tourism such as tombs; and artificial tourism such as swimming pools and reservoirs. Because of time and information constraints, people find it difficult to locate these attractions and obtain information about them. This study uses the Haversine formula method, following analysis, system design, coding/construction, testing, and implementation stages. At the analysis stage, data were collected through observation and literature studies, drawing on both primary and secondary sources. At the system design stage, Unified Modelling Language (UML) artifacts were used, including use case diagrams, activity diagrams, sequence diagrams, and class diagrams. At the construction stage, the Google Maps API and the PHP and MySQL programming languages were used, and at the testing stage the black-box technique was applied. The implemented system can be used on desktop devices, searching for the nearest route and recommending attractions within a radius of 20 km. The expected result of this study is an app that can recommend tourist attractions based on distance and provide information about attractions located in Grobogan, so that tourists can easily reach the places they want to visit.
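
Since the abstract names the Haversine formula but does not reproduce it, a standard C implementation is sketched below; the study's 20 km radius would simply be a comparison against the returned distance:

```c
#include <math.h>

#define PI              3.14159265358979323846
#define EARTH_RADIUS_KM 6371.0

/* Great-circle distance in kilometers between two points given as
 * (latitude, longitude) in degrees, via the Haversine formula. */
double haversine_km(double lat1, double lon1, double lat2, double lon2)
{
    double rad  = PI / 180.0;
    double dlat = (lat2 - lat1) * rad;
    double dlon = (lon2 - lon1) * rad;
    double a = sin(dlat / 2) * sin(dlat / 2)
             + cos(lat1 * rad) * cos(lat2 * rad)
             * sin(dlon / 2) * sin(dlon / 2);
    return 2.0 * EARTH_RADIUS_KM * asin(sqrt(a));
}
```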
17

Le Guernic, Paul, Jean-Pierre Talpin, and Jean-Christophe Le Lann. "POLYCHRONY for System Design." Journal of Circuits, Systems and Computers 12, no. 03 (June 2003): 261–303. http://dx.doi.org/10.1142/s0218126603000763.

Abstract:
Rising complexities and performances of integrated circuits and systems, shortening time-to-market demands for electronic equipment, growing installed bases of intellectual property (IP), and requirements for adapting existing IP blocks to new services all stress high-level design as a prominent research topic and call for the development of appropriate methodological solutions. To this end, system design based on the so-called "synchronous hypothesis" consists of abstracting the nonfunctional implementation details of a system and lets one benefit from focused reasoning on the logic behind the instants at which the system functionalities should be secured. With this point of view, synchronous design models and languages provide intuitive (ontological) models for integrated circuits. This affinity explains the ease of generating synchronous circuits and verifying their functionality using compilers and related tools that implement this approach. In the relational mathematical model behind the design language SIGNAL, this affinity goes beyond the domain of purely synchronous circuits and embraces the context of complex architectures consisting of synchronous circuits and desynchronization protocols: globally asynchronous, locally synchronous architectures (GALS). The unique features of the relational model behind SIGNAL are to provide the notion of polychrony, the capability to describe circuits and systems with several clocks, and to support refinement, the ability to assist and support system design from the early stages of requirement specification to the later stages of synthesis and deployment. The SIGNAL model provides a design methodology that forms a continuum from synchrony to asynchrony, from specification to implementation, from abstraction to concretization, and from interfaces to implementations. SIGNAL gives the opportunity to seamlessly model circuits and devices at multiple levels of abstraction, by implementing mechanisms found in many hardware simulators, while reasoning within a simple and formally defined mathematical model. In the same manner, the flexibility inherent to the abstract notion of a signal, handled in the synchronous-desynchronized design model of SIGNAL, invites and favors the design of correct-by-construction systems by means of well-defined transformations of system specifications (morphisms) that preserve the intended semantics and stated properties of the architecture under design. The aim of the present article is to review and summarize these formal, correct-by-construction design transformations. Most of them are implemented in the POLYCHRONY tool-set, allowing for a mixed bottom-up and top-down design of an embedded hardware-software system using the SIGNAL design language.
18

Szyszka, Michał, Łukasz Tomczyk, and Aneta M. Kochanowicz. "Digitalisation of Schools from the Perspective of Teachers’ Opinions and Experiences: The Frequency of ICT Use in Education, Attitudes towards New Media, and Support from Management." Sustainability 14, no. 14 (July 7, 2022): 8339. http://dx.doi.org/10.3390/su14148339.

Abstract:
The digitalisation of education has become an irreversible process, and Poland is no exception. However, the issue of ICT usage in education raises many concerns and controversies, posing numerous methodological challenges at the same time. In interpreting our research, the Unified Theory of Acceptance and Use of Technology (UTAUT), one of the most frequently used and validated models in empirical research, was applied. The aim of the research was to show the frequency of use of popular hardware, software and websites among teachers from Poland (in the Silesia Province) and to find answers to questions about the main determinants of the use of digital teaching aids. The frequency of ICT use in education was juxtaposed with attitudes towards new media in the school environment and with the support of managers in that area. Quantitative research was conducted on a sample of N = 258 in 2020. A questionnaire in the form of a digital diagnostic survey was used to collect data. Based on the data collected, it was noted that: (1) teachers use virtual systems (eRegisters) and interactive whiteboards most often, while educational podcasts and software for learning foreign languages are used least often; (2) approximately 40% of teachers use ICT often or very often in their school teaching; (3) in public schools, digital teaching aids are used slightly more often than in non-public institutions; (4) teachers are consistent in their use of digital teaching aids, so the individuals experimenting with and implementing ICT at school do so regardless of the type of software and hardware; (5) active support from school head teachers strengthens the frequency of ICT use in education; and (6) in schools where the use of smartphones by students is prohibited, the frequency of ICT use in education is at a lower level.
19

Bogaerts, Bart, Tomi Janhunen, and Shahab Tasharrofi. "Stable-unstable semantics: Beyond NP with normal logic programs." Theory and Practice of Logic Programming 16, no. 5-6 (September 2016): 570–86. http://dx.doi.org/10.1017/s1471068416000387.

Abstract:
Standard answer set programming (ASP) targets solving search problems from the first level of the polynomial time hierarchy (PH). Tackling search problems beyond NP using ASP is less straightforward. The class of disjunctive logic programs offers the most prominent way of reaching the second level of the PH, but encoding the respective hard problems as disjunctive programs typically requires sophisticated techniques such as saturation or meta-interpretation. The application of such techniques easily leads to encodings that are inaccessible to non-experts. Furthermore, while disjunctive ASP solvers often rely on calls to a (co-)NP oracle, it may be difficult to detect from the input program where the oracle is being accessed. In other formalisms, such as Quantified Boolean Formulas (QBFs), the interface to the underlying oracle is more transparent, as it is explicitly recorded in the quantifier prefix of a formula. On the other hand, ASP has advantages over QBFs from the modeling perspective. Rich high-level languages such as ASP-Core-2 offer a wide variety of primitives that enable concise and natural encodings of search problems. In this paper, we present a novel logic programming-based modeling paradigm that combines the best features of ASP and QBFs. We develop so-called combined logic programs in which oracles are directly cast as (normal) logic programs themselves. Recursive incarnations of this construction enable logic programming on arbitrarily high levels of the PH. We develop a proof-of-concept implementation for our new paradigm.
20

Kalai, Yael Tauman, Ran Raz, and Ron D. Rothblum. "How to Delegate Computations: The Power of No-Signaling Proofs." Journal of the ACM 69, no. 1 (February 28, 2022): 1–82. http://dx.doi.org/10.1145/3456867.

Abstract:
We construct a 1-round delegation scheme (i.e., argument-system) for every language computable in time t = t(n), where the running time of the prover is poly(t) and the running time of the verifier is n · polylog(t). In particular, for every language in P we obtain a delegation scheme with almost linear time verification. Our construction relies on the existence of a computationally sub-exponentially secure private information retrieval (PIR) scheme. The proof exploits a curious connection between the problem of computation delegation and the model of multi-prover interactive proofs that are sound against no-signaling (cheating) strategies, a model that was studied in the context of multi-prover interactive proofs with provers that share quantum entanglement, and is motivated by the physical principle that information cannot travel faster than light. For any language computable in time t = t(n), we construct a multi-prover interactive proof (MIP) that is sound against no-signaling strategies, where the running time of the provers is poly(t), the number of provers is polylog(t), and the running time of the verifier is n · polylog(t). In particular, this shows that the class of languages that have polynomial-time MIPs that are sound against no-signaling strategies is exactly EXP. Previously, this class was only known to contain PSPACE. To convert our MIP into a 1-round delegation scheme, we use the method suggested by Aiello et al. (ICALP, 2000), which makes use of a PIR scheme. This method lacked a proof of security. We prove that this method is secure assuming the underlying MIP is secure against no-signaling provers.
21

Wrona, Zofia, Maria Ganzha, Marcin Paprzycki, and Stanisław Krzyżanowski. "Dynamic Knowledge Management in an Agent-Based Extended Green Cloud Simulator." Energies 17, no. 4 (February 6, 2024): 780. http://dx.doi.org/10.3390/en17040780.

Abstract:
Cloud infrastructures operate in highly dynamic environments, and today energy-focused optimization has become crucial. Moreover, the concept of extended cloud infrastructure, which, among others, uses green energy, has started to gain traction. This introduces a new level of dynamicity to the ecosystem, as "processing components" may "disappear" and "come back", specifically in scenarios where the lack/return of green energy leads to shutting down/booting back servers at a given location. Considered use cases may involve introducing new types of resources (e.g., adding containers with server racks with "next-generation processors"). All such situations require the dynamic adaptation of "system knowledge", i.e., runtime system adaptation. In this context, an agent-based digital twin of the extended green cloud infrastructure is proposed. Here, knowledge management is facilitated with an explainable Rule-Based Expert System, combined with Expression Languages. The tests were run using the Extended Green Cloud Simulator, which allows the modelling of cloud infrastructures powered (partially) by renewable energy sources. Specifically, the work describes scenarios in which: (1) a new hardware resource is introduced in the system; (2) a system component changes its resources; and (3) a system user changes energy-related preferences. The case study demonstrates how rules can facilitate control of energy efficiency, with an example of an adaptable compromise between pricing and energy consumption.
22

Cook, Sebastien, and Paulo Garcia. "Arbitrarily Parallelizable Code: A Model of Computation Evaluated on a Message-Passing Many-Core System." Computers 11, no. 11 (November 18, 2022): 164. http://dx.doi.org/10.3390/computers11110164.

Abstract:
The number of processing elements per solution is growing. From embedded devices now employing (often heterogeneous) multi-core processors, across many-core scientific computing platforms, to distributed systems comprising thousands of interconnected processors, parallel programming of one form or another is now the norm. Understanding how to efficiently parallelize code, however, is still an open problem, and the difficulties are exacerbated across heterogeneous processing, and especially at run time, when it is sometimes desirable to change the parallelization strategy to meet non-functional requirements (e.g., load balancing and power consumption). In this article, we investigate the use of a programming model based on series-parallel partial orders: computations are expressed as directed graphs that expose parallelization opportunities and necessary sequencing by construction. This programming model is suitable as an intermediate representation for higher-level languages. We then describe a model of computation for such a programming model that maps such graphs into a stack-based structure more amenable to hardware processing. We describe the formal small-step semantics for this model of computation and use this formal description to show that the model can be arbitrarily parallelized, at compile and runtime, with correct execution guaranteed by design. We empirically support this claim and evaluate parallelization benefits using a prototype open-source compiler, targeting a message-passing many-core simulation. We empirically verify the correctness of arbitrary parallelization, supporting the validity of our formal semantics, analyze the distribution of operations within cores to understand the implementation impact of the paradigm, and assess execution-time improvements when five micro-benchmarks are automatically and randomly parallelized across 2 × 2 and 4 × 4 multi-core configurations, resulting in execution-time decreases of up to 95% in the best case.
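
A minimal rendering of a series-parallel partial order is a recursive type with exactly two composition operators; the C sketch below is our illustration of the idea, not the authors' intermediate representation:

```c
#include <stdlib.h>

/* A series-parallel partial order: either a leaf operation, or the
 * series/parallel composition of two sub-orders.  SER forces
 * sequencing; PAR exposes a parallelization opportunity by
 * construction, so a scheduler may split it at compile or run time. */
typedef enum { OP_LEAF, OP_SER, OP_PAR } sp_kind;

typedef struct sp_node {
    sp_kind kind;
    int op_id;                    /* meaningful for OP_LEAF only */
    struct sp_node *left, *right; /* children for OP_SER / OP_PAR */
} sp_node;

static sp_node *sp_new(sp_kind kind, int op_id, sp_node *l, sp_node *r)
{
    sp_node *n = malloc(sizeof *n);
    if (!n) exit(EXIT_FAILURE);
    n->kind = kind; n->op_id = op_id; n->left = l; n->right = r;
    return n;
}

/* (op0 ; op1) || op2 : op0-then-op1 may run in parallel with op2. */
sp_node *example_graph(void)
{
    return sp_new(OP_PAR, -1,
                  sp_new(OP_SER, -1,
                         sp_new(OP_LEAF, 0, NULL, NULL),
                         sp_new(OP_LEAF, 1, NULL, NULL)),
                  sp_new(OP_LEAF, 2, NULL, NULL));
}
```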
23

Kravets, Alla, and Vitaly Egunov. "The Software Cache Optimization-Based Method for Decreasing Energy Consumption of Computational Clusters." Energies 15, no. 20 (October 12, 2022): 7509. http://dx.doi.org/10.3390/en15207509.

Abstract:
Reducing the electricity consumption of computing devices is currently an urgent task. While this problem once belonged to hardware developers and the design of more cost-effective equipment, software developers have recently taken an increased interest in the issue. The scope of these studies is broad, ranging from the energy efficiency of various programming languages to the development of energy-saving software for smartphones and other gadgets. However, to the best of our knowledge, no study has reported an analysis of the impact of cache optimizations on computing devices' power consumption. Hence, this paper aims to analyze such impact on software energy efficiency using an original software design procedure and computational experiments. The proposed Software Cache Optimization (SCO)-based Methodology was applied to one of the key linear algebra transformations. Experiments were carried out to determine software energy efficiency using RAPL (Running Average Power Limit), an interface developed by Intel that provides built-in counters of Central Processing Unit (CPU) energy consumption. Measurements have shown that optimized software versions reduce power consumption by up to 4 times relative to the basic transformation scheme. Experimental results confirm the effectiveness of the SCO-based Methodology used to reduce energy consumption and the applicability of this technique for software optimization.
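
The abstract does not list the cache optimizations themselves, but loop blocking is the canonical member of the class studied; the sketch below shows it for a transpose-and-add, with the block size B as a tunable assumption:

```c
#define N 2048
#define B 64   /* block size chosen so two B x B tiles fit in cache */

/* Baseline: the column-wise reads of a[] miss in cache repeatedly. */
void add_transpose_naive(double c[N][N], const double a[N][N])
{
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            c[i][j] += a[j][i];
}

/* Blocked version: identical arithmetic, but each B x B tile of a[]
 * is reused while cache-resident, cutting DRAM traffic -- the kind
 * of saving the paper measures with RAPL energy counters. */
void add_transpose_blocked(double c[N][N], const double a[N][N])
{
    for (int ii = 0; ii < N; ii += B)
        for (int jj = 0; jj < N; jj += B)
            for (int i = ii; i < ii + B; i++)
                for (int j = jj; j < jj + B; j++)
                    c[i][j] += a[j][i];
}
```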
24

Porto, António. "A structured alternative to Prolog with simple compositional semantics." Theory and Practice of Logic Programming 11, no. 4-5 (July 2011): 611–27. http://dx.doi.org/10.1017/s1471068411000202.

Abstract:
Prolog's very useful expressive power is not captured by traditional logic programming semantics, due mainly to the cut and to goal and clause order. Several alternative semantics have been put forward, exposing operational details of the computation state. We propose instead to redesign Prolog around structured alternatives to the cut and clauses, keeping the expressive power and computation model but with a compositional denotational semantics over much simpler states—just variable bindings. This considerably eases reasoning about programs, by programmers and by tools such as a partial evaluator, with safe unfolding of calls through predicate definitions. An if-then-else across clauses replaces most uses of the cut, but the cut's full power is achieved by an until construct. Disjunction, conjunction and until, along with unification, are the primitive goal types with a compositional semantics yielding sequences of variable-binding solutions. This extends to programs via the usual technique of a least fixpoint construction. A simple interpreter for Prolog in the alternative language, and a definition of until in Prolog, establish the identical expressive power of the two languages. Many useful control constructs are derivable from the primitives, and the semantic framework illuminates the discussion of alternative ones. The formalisation rests on a term language with variable abstraction as in the λ-calculus. A clause is an abstraction on the call arguments, a continuation, and the local variables. It can be inclusive or exclusive, expressing a local case bound to a continuation by either a disjunction or an if-then-else. Clauses are open definitions, composed (and closed) with simple functional application (β-reduction). This paves the way for a simple account of flexible module composition mechanisms. Cube, a concrete language embodying these principles, has been implemented on top of a Prolog engine and successfully used to build large real-world applications.
25

Bazydło, Grzegorz. "Designing Reconfigurable Cyber-Physical Systems Using Unified Modeling Language." Energies 16, no. 3 (January 25, 2023): 1273. http://dx.doi.org/10.3390/en16031273.

Abstract:
Technological progress in recent years in the Cyber-Physical Systems (CPSs) area has given designers unprecedented possibilities and computational power, but as a consequence, the modeled CPSs are becoming increasingly complex, hierarchical, and concurrent. Therefore, new methods of CPS design (especially using abstract modeling) are needed. The paper presents an approach to modeling the control part of a CPS using state machine diagrams from the Unified Modeling Language (UML). The proposed design method attempts to combine the advantages of graphical notation (intuitiveness, convenience, readability) with the benefits of textual specification languages (unambiguity, precision, versatility). The UML specification is transformed using Model-Driven Development (MDD) techniques into an effective program in a Hardware Description Language (HDL), using a Concurrent Finite State Machine (CFSM) as an intermediate model. The obtained HDL specification can be analyzed, validated, synthesized, and finally implemented in Field Programmable Gate Array (FPGA) devices. Dynamic partial reconfiguration, a feature of modern FPGAs, allows part of the implemented CPS algorithm to be exchanged without stopping the device. To use this feature, however, the model must be safe, which in the proposed approach means that it should possess special idle states to which control is transferred during the reconfiguration process. Applying the CFSM model greatly facilitates this task. The proposed design method offers efficient graphical modeling of the control part of a CPS and automatic translation of the behavior model into a synthesizable Verilog description, which can be directly implemented in FPGA devices and dynamically reconfigured as needed. A practical example illustrating the successive stages of the proposed method is also presented.
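
The role of the special idle states can be illustrated independently of HDL syntax; the C-style next-state function below parks control in an idle state whenever a reconfiguration request is pending. This is our schematic reading of the mechanism, not code generated by the method:

```c
/* Schematic FSM with a dedicated idle state: when reconfig_req is
 * asserted, control is parked in S_IDLE so a partial bitstream can
 * safely replace the logic behind the working states. */
typedef enum { S_IDLE, S_RUN1, S_RUN2 } state_t;

state_t next_state(state_t s, int start, int done, int reconfig_req)
{
    if (reconfig_req)             /* idle state reachable from anywhere */
        return S_IDLE;
    switch (s) {
    case S_IDLE: return start ? S_RUN1 : S_IDLE;
    case S_RUN1: return S_RUN2;
    case S_RUN2: return done ? S_IDLE : S_RUN1;
    default:     return S_IDLE;
    }
}
```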
26

Riveiro, Luigi Quintans, Kristine Sheila Schuster, Cristina Cavalli Bertolucci, and Leandra Anversa Fioreze. "A Construção de um Estado da Arte Sobre Introdução, Dificuldades e Perspectivas de Conceitos e Simbologias da Álgebra no Ensino Fundamental" [The Construction of a State of the Art on the Introduction, Difficulties, and Perspectives of Algebra Concepts and Symbols in Elementary Education]. Jornal Internacional de Estudos em Educação Matemática 16, no. 3 (February 26, 2024): 330–42. http://dx.doi.org/10.17921/2176-5634.2023v16n3p330-342.

Abstract:
Knowing the scientific production about the introduction of Algebra in Elementary Education in teaching and learning processes was the theme of interest of this State of the Art. In this direction, questions arose: how do students in the final years of elementary school deal with the experience of the "world" of letters in mathematics? What difficulties do they have identifying and operating with symbols other than numbers? How does the process of building algebraic thought take place? Through a carefully described selection process, a corpus of analysis was built based on the Brazilian Digital Library of Theses and Dissertations, consulted from 2000 until today. By carefully analyzing eight dissertations, two categories were created: the first highlights aspects related to learning theories and the second emphasizes algebraic reasoning. In this study, it was observed that there is a recurrence of theoretical references that complement each other, and a constant concern about the way algebra is presented to students in Elementary School. Regarding the difficulties, the works highlight that the construction of concepts by students is directly linked to the way those concepts are presented to them. The study points to the need to better establish the relationships and connections between arithmetic and algebra, attention to the transition between languages (usual, symbolic, algebraic), the production of algebraic meaning beyond the mechanized manipulation of symbols, discernment between the various possible uses of the variable, and the search for teaching strategies aimed at building algebraic thinking. Keywords: Introduction to Algebra. Algebraic Thinking. State of the Art. Learning. Elementary Education.
27

Toliupa, Serhii, Yuri Samokhvalov, and Serhii Shtanenko. "Ensuring cyber security of ACS TP by using FPGA technology." Information systems and technologies security, no. 1 (5) (2021): 46–54. http://dx.doi.org/10.17721/ists.2021.1.44-52.

Abstract:
In modern conditions, cybersecurity issues are moving from the level of information protection at a separate object of computer technology to the level of creating a single state cybersecurity system, as part of the information and national security system responsible for protecting not only information in the narrow sense but all of cyberspace. In the process of forming global cyberspace, military and civilian computer technologies are converging, new means and methods of influencing the information infrastructure of a potential adversary are being developed, and specialized cyber centers are being created and implemented on high-tech platforms. At present, the cybersecurity procedure does not fully reflect the issues related to the cybersecurity of ACS TP (automated process control systems). This is because ACS TP were originally developed based on the ideology of physical isolation from external networks and strict delimitation of access by service personnel, using specific software and exchanging information via industrial communication protocols such as Modbus and Profibus, which often run on top of the TCP/IP protocol. Accordingly, there are many vulnerabilities in ACS TP, and the probability of their exploitation in various cyber incidents is directly proportional to the importance and significance of the object. Given that ACS TP have become an integral part of our existence, the cybersecurity of these systems is an urgent and timely problem. The article discusses an approach to ensuring the cybersecurity of automated process control systems (APCS) by creating intelligent cybersecurity systems (ISCs). It is assumed that the construction of the proposed systems should be based on the concept of "evolution (development)", that is, the system's ability to adapt, by changing its parameters under the influence of external and internal cyber threats (cyberattacks) and through the applied technologies, to counter cyberattacks throughout the entire life cycle. Technically, it is proposed to implement the ISCs using an expert system and disaster-tolerant information systems (DIS), whose characteristic feature, in contrast to fault-tolerant systems, is the continuation of work under massive and possibly successive failures of the system or its subsystems as a result of cyberattacks. These properties (catastrophe tolerance, i.e., system survivability) are possessed by field-programmable gate arrays (FPGA), a class of devices whose characteristic feature is the ability to implement a multiprocessor (parallelized) structure that can withstand external influences (cyberattacks). An FPGA itself is an integrated circuit whose internal configuration is set by programming using special hardware description languages.
28

Dogru, A. H., H. A. Sunaidi, L. S. Fung, W. A. Habiballah, N. Al-Zamel, and K. G. Li. "A Parallel Reservoir Simulator for Large-Scale Reservoir Simulation." SPE Reservoir Evaluation & Engineering 5, no. 01 (February 1, 2002): 11–23. http://dx.doi.org/10.2118/75805-pa.

Full text
Abstract:
Summary A new parallel, black-oil-production reservoir simulator (Powers) has been developed and fully integrated into the pre- and post-processing graphical environment. Its primary use is to simulate the giant oil and gas reservoirs of the Middle East using millions of cells. The new simulator has been created for parallelism and scalability, with the aim of making megacell simulation a day-to-day reservoir-management tool. Upon its completion, the parallel simulator was validated against published benchmark problems and other industrial simulators. Several giant oil-reservoir studies have been conducted with million-cell descriptions. This paper presents the model formulation, parallel linear solver, parallel locally refined grids, and parallel well management. The benefits of using megacell simulation models are illustrated by a real field example used to confirm bypassed oil zones and obtain a history match in a short time period. With the new technology, preprocessing, construction, running, and post-processing of megacell models is finally practical. A typical history-match run for a field with 30 to 50 years of production takes only a few hours. Introduction With the development of early parallel computers, the attractive speed of these machines got the attention of oil-industry researchers. Initial questions were concentrated along these lines: Can one develop a truly parallel reservoir-simulator code? What type of hardware and programming languages should be chosen? Contrary to seismic processing, it is well known that reservoir-simulator algorithms are not naturally parallel; they are more recursive, and variables display a strong dependency on each other (strong coupling and nonlinearity). This poses a big challenge for parallelization. On the other hand, if one could develop a parallel code, the speed of computations would increase by at least an order of magnitude; as a result, many large problems could be handled. This capability would also aid our understanding of fluid flow in complex reservoirs. Additionally, the proper handling of reservoir heterogeneities should result in more realistic predictions. The other benefit of megacell description is the minimization of upscaling effects and numerical dispersion. Megacell simulation has a natural application in simulating the world's giant oil and gas reservoirs. For example, a grid size of 50 m or less is widely used for the small and medium-size reservoirs of the world. In contrast, many giant reservoirs in the Middle East use a gridblock size of 250 m or larger; this easily yields a model with more than 1 million cells. Therefore, it is of specific interest to have a megacell description and still be able to run fast. Such capability is important for the day-to-day reservoir management of these fields. This paper is organized as follows: first, the relevant work in the petroleum-reservoir-simulation literature is reviewed. This is followed by the description of the new parallel simulator and the presentation of the numerical solution and parallelism strategies. (Details of the data structures, well handling, and parallel input/output operations are placed in the appendices.) The main text also contains a brief description of the parallel linear solver, locally refined grids, and well management. A brief description of megacell pre- and post-processing is presented. Next, we address performance and parallel scalability; this is a key section that demonstrates the degree of parallelization of the simulator. The last section presents four real field simulation examples. These example cases cover all stages of the simulator and provide actual central processing unit (CPU) execution time for each case. As a byproduct, the benefits of megacell simulation are demonstrated by two examples: locating bypassed oil zones, and obtaining a quicker history match. Details of each section can be found in the appendices. Previous Work In the 1980s, research on parallel reservoir simulation was intensified by the further development of shared-memory and distributed-memory machines. In 1987, Scott et al.1 presented a Multiple Instruction Multiple Data (MIMD) approach to reservoir simulation. Chien2 investigated parallel processing on shared-memory computers. In early 1990, Li3 presented a parallelized version of a commercial simulator on a shared-memory Cray computer. For distributed-memory machines, Wheeler4 developed a black-oil simulator on a hypercube in 1989. In the early 1990s, Killough and Bhogeswara5 presented a compositional simulator on an Intel iPSC/860, and Rutledge et al.6 developed an Implicit Pressure Explicit Saturation (IMPES) black-oil reservoir simulator for the CM-2 machine. They showed that reservoir models of over 2 million cells could be run on this type of machine with 65,536 processors, and that computational speeds on the order of 1 gigaflop in the matrix construction and solution were achievable. In mid-1995, more investigators published reservoir-simulation papers focused on distributed-memory machines. Kaarstad7 presented a 2D oil/water research simulator running on a 16,384-processor MasPar MP-2 machine, showing that a model problem using 1 million gridpoints could be solved in a few minutes of computer time. Rame and Delshad8 parallelized a chemical flooding code (UTCHEM) and tested it on a variety of systems for scalability; their paper also included test results on the Intel iPSC/860, CM-5, Kendall Square, and Cray T3D.
APA, Harvard, Vancouver, ISO, and other styles
29

Ferres, Bruno, Olivier Muller, and Frédéric Rousseau. "A Chisel Framework for Flexible Design Space Exploration through a Functional Approach." ACM Transactions on Design Automation of Electronic Systems, April 5, 2023. http://dx.doi.org/10.1145/3590769.

Full text
Abstract:
As the need for efficient digital circuits keeps growing in industry, the design of such systems remains daunting, requiring both expertise and time. In an attempt to close the gap between software development and hardware design, powerful features such as functional and object-oriented programming have been used to define new languages, known as Hardware Construction Languages. In this article, we investigate the use of such languages, more precisely of Chisel, in the context of Design Space Exploration (DSE), and propose a novel design methodology to build custom and adaptable design flows. We apply an innovative functional approach to define flexible exploration strategies based on the composition of basic exploration steps, and provide a library of basic strategies along with a proof-of-concept framework, which we believe to be the first Chisel-based DSE framework. This framework integrates fully with the Chisel ecosystem, allowing users to define their DSE processes in the same framework (and language) they use to describe their designs. We demonstrate the methodology through several use cases, illustrating how the functional approach makes it possible to consider various metrics of interest when building exploration processes; in particular, we provide a quality-of-service-driven exploration example. The methodology presented in this work draws on designers' expertise to reduce the time required for hardware design, in particular for Design Space Exploration, and its application should ease digital design and enhance hardware developers' productivity.
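The abstract's core idea, exploration strategies built by composing basic steps as functions, can be sketched in plain Scala, Chisel's host language. The sketch below is a hedged illustration under assumed names (it is not the authors' framework API): a step maps a set of candidate design points to a new set, and whole flows are just function compositions.

```scala
// A design point: a parameter map plus estimated metrics of interest.
// All names and cost models here are hypothetical.
case class DesignPoint(params: Map[String, Int], area: Double, latency: Double)

object Dse {
  // A basic exploration step transforms a set of candidate points.
  type Step = Set[DesignPoint] => Set[DesignPoint]

  // Steps compose like ordinary functions, so flows are built by chaining.
  def pipeline(steps: Step*): Step = steps.reduce(_ andThen _)

  // Expand the space: double a buffer depth, trading area for latency.
  val widenBuffers: Step = pts =>
    pts ++ pts.map { p =>
      p.copy(
        params  = p.params.updated("bufDepth", p.params.getOrElse("bufDepth", 1) * 2),
        area    = p.area * 1.3,
        latency = p.latency * 0.8)
    }

  // Enforce a resource budget, then keep the fastest surviving point.
  def pruneByArea(budget: Double): Step = _.filter(_.area <= budget)
  val keepFastest: Step = pts =>
    if (pts.isEmpty) pts else Set(pts.minBy(_.latency))
}

object Demo extends App {
  import Dse._
  val seed = Set(DesignPoint(Map("bufDepth" -> 1), area = 100.0, latency = 10.0))
  // A budget-constrained flow: expand twice, prune by area, keep the fastest.
  val flow = pipeline(widenBuffers, widenBuffers, pruneByArea(200.0), keepFastest)
  println(flow(seed))
}
```

Because a strategy is an ordinary function value, a quality-of-service-driven flow is simply another composition with a different filter, which is the kind of flexibility the article argues for.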
APA, Harvard, Vancouver, ISO, and other styles
30

Guellil, Imane, Antonio Garcia-Dominguez, Peter R. Lewis, Shakeel Hussain, and Geoffrey Smith. "Entity linking for English and other languages: a survey." Knowledge and Information Systems, April 2, 2024. http://dx.doi.org/10.1007/s10115-023-02059-2.

Full text
Abstract:
Extracting named entities from text forms the basis for many crucial tasks such as information retrieval and extraction, machine translation, opinion mining, sentiment analysis, and question answering. This paper presents a survey of the research literature on named entity linking, including named entity recognition and disambiguation. We cover 200 works, focusing on 43 papers (5 surveys and 38 research works). We also describe and classify 56 resources, including 25 tools and 31 corpora. We focus on the most recent papers: more than 95% of the described research works were published after 2015. To show the efficiency of our construction methodology and the importance of this state of the art, we compare it to other surveys in the research literature that were based on different criteria (such as domain, novelty, and the models and resources presented). We also present a set of open issues related to entity linking, based on the research questions this survey aims to answer, including the dominance of the English language in the proposed studies and the frequent use of NER alone rather than end-to-end systems that combine NED and EL.
APA, Harvard, Vancouver, ISO, and other styles
31

Toliupa, S., S. Shtanenko, T. Poberezhets, and V. Lozunov. "Methodology for designing robotic systems based on CAD Intel Quartus Prime." Communication, informatization and cybersecurity systems and technologies 2, no. 2 (2022). http://dx.doi.org/10.58254/viti.2.2022.08.54.

Full text
Abstract:
At present, field-programmable gate arrays (FPGAs) are increasingly used in the development of robotic systems. A significant advantage of FPGAs is their versatility and the ability to be quickly programmed to perform the functions of almost any digital device of a robotic system. An FPGA is a semi-finished product on the basis of which a developer with a personal computer can design a digital device in record time; this is made possible by simple and relatively inexpensive hardware together with special software known as computer-aided design (CAD) tools. An FPGA is an electronic component used to create digital integrated circuits; unlike conventional digital chips, the logic of FPGA operation is set by programming with special tools: programmers and software. FPGA programming is performed using the hardware description languages Verilog HDL and VHDL. At the upper level these languages are very similar: the hardware model is described as a set of interacting blocks (modules), and for each module an interface and an implementation are defined. A module interface describes the input, output, and bidirectional ports through which modules connect to one another to exchange data and control signals. The implementation defines the elements of the internal state, the order of computing the values of the output ports from this state and from the values of the input ports, and the rules for updating the internal state. The article presents the stages of designing digital devices of robotic systems using FPGAs and considers the principles of construction and operation of the main nodes of combinational circuits, the logic elements implementing the specified functions, which are subsequently programmed into the FPGA using the Quartus Prime CAD system and its built-in simulation tools.
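The module structure the abstract describes for Verilog HDL and VHDL, an interface of ports plus an implementation holding internal state and its update rules, has the same shape in a hardware construction language. The Chisel sketch below is a hedged illustration of that split, with names invented for the example; it is not code from the article.

```scala
import chisel3._

class PulseCounter extends Module {
  // Interface: the ports through which this module exchanges data
  // and control signals with other modules.
  val io = IO(new Bundle {
    val enable = Input(Bool())
    val count  = Output(UInt(8.W))
  })
  // Internal state element, reset to zero.
  val counter = RegInit(0.U(8.W))
  // Update rule: the next state is computed from the current state and inputs.
  when(io.enable) { counter := counter + 1.U }
  // Output port values are derived from the internal state.
  io.count := counter
}
```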
APA, Harvard, Vancouver, ISO, and other styles
32

Vale, Arthur Oliveira, Zhong Shao, and Yixuan Chen. "A Compositional Theory of Linearizability." Journal of the ACM, January 27, 2024. http://dx.doi.org/10.1145/3643668.

Full text
Abstract:
Compositionality is at the core of programming languages research and has become an important goal toward scalable verification of large systems. Despite that, there is no compositional account of linearizability, the gold standard of correctness for concurrent objects. In this paper, we develop a compositional semantics for linearizable concurrent objects. We start by showcasing a common issue, which is independent of linearizability, in the construction of compositional models of concurrent computation: interaction with the neutral element for composition can lead to emergent behaviors, a hindrance to compositionality. Category theory provides a solution for the issue in the form of the Karoubi envelope. Surprisingly, and this is the main discovery of our work, this abstract construction is deeply related to linearizability and leads to a novel formulation of it. Notably, this new formulation neither relies on atomicity nor directly upon happens-before ordering and is only possible because of compositionality, revealing that linearizability and compositionality are intrinsically related to each other. We use this new, and compositional, understanding of linearizability to revisit much of the theory of linearizability, providing novel, simple, algebraic proofs of the locality property and of an analogue of the equivalence with observational refinement. We show our techniques can be used in practice by connecting our semantics with a simple program logic that is nonetheless sound with respect to this generalized linearizability.
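For reference, the Karoubi envelope (idempotent completion) the abstract invokes is a standard construction; the definition below is textbook material, not taken from the paper itself. For a category $\mathcal{C}$:

```latex
\[
  \mathrm{Ob}\,\mathrm{Kar}(\mathcal{C})
    = \{\, (A,e) \mid e : A \to A,\; e \circ e = e \,\},
  \qquad
  \mathrm{Hom}\big((A,e),(B,e')\big)
    = \{\, f : A \to B \mid e' \circ f \circ e = f \,\},
\]
with identity $\mathrm{id}_{(A,e)} = e$.
```

Intuitively, every idempotent is forced to split, which is how interaction with the neutral element for composition is tamed in the paper's setting.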
APA, Harvard, Vancouver, ISO, and other styles
33

LIU, FANGFANG, and JIA-HUAI YOU. "Alternating Fixpoint Operator for Hybrid MKNF Knowledge Bases as an Approximator of AFT." Theory and Practice of Logic Programming, September 1, 2021, 1–30. http://dx.doi.org/10.1017/s1471068421000168.

Full text
Abstract:
Approximation fixpoint theory (AFT) provides an algebraic framework for the study of fixpoints of operators on bilattices and has found applications in characterizing semantics for various classes of logic programs and nonmonotonic languages. In this paper, we show one more application of this kind: the alternating fixpoint operator by Knorr et al. for the study of the well-founded semantics of hybrid minimal knowledge and negation as failure (MKNF) knowledge bases is in fact an approximator of AFT in disguise, which, thanks to the abstraction power of AFT, characterizes not only the well-founded semantics but also two-valued as well as three-valued semantics for hybrid MKNF knowledge bases. Furthermore, we show an improved approximator for these knowledge bases, whose least stable fixpoint is richer in information than the one obtained from Knorr et al.'s construction. This leads to an improved computation of the well-founded semantics. This work is built on an extension of AFT that supports consistent as well as inconsistent pairs in the induced product bilattice, in order to deal with the inconsistencies that arise in the context of hybrid MKNF knowledge bases. This part of the work can be seen as generalizing the original AFT from symmetric approximators to arbitrary approximators.
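For orientation, the basic AFT setup the abstract builds on can be stated in a few lines (standard definitions from the AFT literature, not specific to this paper). Over a complete lattice $L$, a pair $(x,y) \in L^2$ approximates the elements $z$ with $x \le z \le y$; pairs are compared by precision, and an approximator of an operator $O : L \to L$ is a precision-monotone operator on pairs that agrees with $O$ on exact pairs:

```latex
\[
  (x,y) \le_p (x',y') \iff x \le x' \ \text{and}\ y' \le y,
  \qquad
  A : L^2 \to L^2 \ \text{$\le_p$-monotone},
  \qquad
  A(x,x) = \big(O(x),\, O(x)\big).
\]
```

In this framework, the well-founded fixpoint is the $\le_p$-least fixpoint of the stable revision operator induced by $A$, which is how the various semantics mentioned in the abstract are recovered uniformly.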
APA, Harvard, Vancouver, ISO, and other styles
34

Lillie, Jonathan. "Tackling Identity with Constructionist Concepts." M/C Journal 1, no. 3 (October 1, 1998). http://dx.doi.org/10.5204/mcj.1712.

Full text
Abstract:
Did you wake up this morning wondering: "What really is my true identity?" Or have you ever seen your favorite television news program do a spot on cultural identity? "Today we ask you, the viewer, about your cultural identity." Not likely. It is certainly not vital for each of us to be able to expound upon our personal identity issues and definitions (you don't necessarily have to talk about identity to know yourself and to be happy and well-rounded). And yet, with this said, a casual visit to the local "mall" for a dose of people/culture-watching is all that it might take to be reminded of the multitude of social, economic and political institutions that vie every day for a piece of your identity, and the identity of everyone else we share this society with. Some of these identity-mongers can be considered beneficial and welcome influences on our understandings of who we are and how we see the world and life itself. These groups may include your family, friends, religious community and the cultural knowledge or background within which you were raised. Other groups that seek strong identification with themselves or their products include nation states, corporations, entertainment products, political parties and some civic institutions as well. From our observations in the mall, you can see how many aspects of identity have to do with collective identifications common to members of groups, such as those mentioned above. Indeed, much of the recent work in academia on identity analyses how social systems in the current era of late modernity affect identity construction. Yet, if we are to try to glue together a total picture or concept of what identity is, we must also consider the elements of an individual's identity which can be better understood within the unique experiences and feelings of each person. To be sure, it would be a sad reality if the identifications that influence my behavior in the mall encompassed the totality of "my identity". To get at what identity is, or might be made of, we can first venture into a tragically brief history lesson on the evolution of the concept of identity. This evolution has been rather drastic over the past few centuries. Chapter One -- Identity before Hegel: in Western society, before the beginning stages of the industrial revolution, you were considered to be born with your identity. It was a mixture, perhaps, of your soul and your situation/position in society and family (i.e., depending on your father's occupation, your gender, ethnic group, etc.). This view varies greatly from the modern, "constructionist" conceptualisation of identity. Chapter Two -- Modern Identity: in intellectual and academic circles much of the constructionist work on identity was begun by Existentialist philosophers such as Nietzsche and Sartre. The most recent inquiries on the issue of identity have been within Cultural Studies and Postmodernist thought. The constructionist view sees identity as "constructed on the back of a recognition of some common origin or shared characteristic with another person or group, or with an idea" (Hall 2). Thus, identity is formed through experiences of, and identification with, certain events, rituals, social institutions and symbols of culture(s) in which an individual was raised and lives. In short, identity is not a given or static; it is an evolving construction within each of us. Now that history class is over, perhaps we should highlight three principal concepts from the constructionist's viewpoint on identity.
First, cultural environment is of utmost importance to personal and collective identity construction. "Cultural environment" must be seen as encompassing (1) the plethora of entertainment and information technologies -- cultural spaces that corporations fill with new and reconstructed cultural products -- and (2) more temporal symbolic spaces such as oral and written languages. So, the Power Rangers will have their say in the identities of their young minions, but family heritages will as well, provided that such spaces are available and experienced. Secondly, the amount of cultural/social power that different groups and interests have to influence identity at the individual and collective (group) levels is also a vital element in the identity continuum. The last point is that identity itself is inherently a social phenomenon; it is a product of society, rather than a preexistent element of being human. Identity is here seen as a way in which people make sense of and understand the self through affiliation and bonds with other people and the signs (i.e., the culture) that societies have created. Manuel Castells, a prolific writer and social observer, offers some compelling ideas about how social structures in modern societies are instrumental in collective identity construction. Castells's hypothesis is that identity construction can be separated into three categories: (1) legitimising identity, which is introduced by the dominant (hegemonic) institutions of society to further reproduce and rationalise their privileges, power and domination vis-à-vis social actors; (2) resistance identity, emerging from actors within cultures that are marginalised by dominant discourses and power relations, and who therefore build "trenches of resistance and survival" against these forces; and (3) project identity, "where social actors, on the basis of whichever cultural materials are available to them, build a new identity that redefines their position in society and, by doing so, seek the transformation of overall social structure" (Castells 8). While Castells's theories deserve more in-depth consideration than can be offered here, for our purposes nevertheless they help to distinguish some of the boundaries and anomalies within identity. Resistance identity, for example, is for me a useful concept for explaining the impact of ethnicity and nationality on how people use various cultural products to build and maintain their identities. In the USA, there are many groups who share common histories, experiences of persecution and discrimination, and culture with other members of the group. African-Americans are the best known and most studied sub-cultural (i.e., not the dominant) cultural/social groups in the USA. Being African-American, or "Black", is experienced by the individual and the group in the home, at school and work, and through the mass media and literature. For Castells, being Black in the USA is a resistance identity which is constructed through negative experiences of bigotry, discrimination and, for some, a lower economic status, and also through positive experiences of Black culture, history and family. Returning briefly to the international scene, resistance identity may also be a reaction to the proliferation of US and English-language cultural products in local settings.
With "American" mass media and political-economic dominance (at present in the form of neo-liberal policies), nationalism, regional cultural pride and preservation may involve some resistance to this increasingly intrusive order. We must remember that Castells's typology here deals with collective identity only. This is important to keep in mind, particularly because common stereotypes of people's identities often play on the ethnic and social-economic groups which people may or may not be a part of. An endemic assumption is that an "American", "Black", "Latino", or even a "yuppie" will possess an identity and personality common to their stereotyped groupings. One problem with concepts of identity is that it is easy to generalise or overdetermine them. A face-value understanding of legitimising identity, for example, may posit that it is the embodied association and identification with the dominant institutions of society. Yet, if you think about it, most members of society, including members of marginalised groups, possess aspects of a legitimised identification with mainstream society. Most people do identify with capitalist dreams of being important, wealthy and living a specific lifestyle. Furthermore, many people, regardless of ethnicity or other groupings, do participate in the capitalist society, political systems and parties, Western ideologies, religious institutions and values. My point here is not to generalise, but rather to suggest that most people who have or feel some resistance to the dominant society also identify with certain legitimised and accepted aspects of that same society or culture. One way to think about the difference between resistance identity and legitimised identity is to consider how members of marginalised groups have access to specialised social and cultural spaces which other groups do not. Blacks have access to the black community, Latinos to Latino communities, homosexuals to homosexual communities. Specific processes of socialisation, identity-building and reaffirmation go on within these groups that non-group members miss out on for a variety of reasons. What members of the dominant society have are opportunities for membership in other specialised spaces that they seek membership in due to interests, unique personalities, physical traits or situational experiences. These cultural phenomena include musical tastes, gangs or civil groups, sports and other school activities, and the list goes on and on. Depending on the level of marginalisation, many members of "resistance" groups may or may not participate in a variety of other identity groups such as these. Furthermore, the type of identification involved may be collective or largely unique to the individual. Even with identities that we may call collective, as with my example of African-American identity, the actual types of identifications, feelings and interpretations that an individual feels with reference to her or his group(s) certainly can vary greatly. Another place we might look for a better understanding of identity groups is the wide gamut of communities of interest thriving in cyberspace. The development of online communities-of-interest, which are seen by some writers as allowing breaks from some of the traditional social constraints of modern society, has led to theories and excitement about the postmodern nature of cyberspace. These communities have developed because they allow individuals to express parts of themselves which do not have many outlets in real-world lives. 
The ability to play with gender and other personal characteristics in chat rooms or MUDs also offers identity variations that are refreshing, exciting and at times empowering for some people (see Braddlee; Lillie). Yet these considerations, like many others that accompany discussions of "post-modern" identity, dwell on the positive. Identity developments can also lead to harmful behaviors and thought processes. The Internet has also grown to offer a plethora of spaces for many people, particularly middle and upper-class men, to engage with sexual fetishes via pornographic Web sites, which certainly can have long-term effects on their identities and perhaps on intimate relations with real people. The Internet offers a vast number of cultural spaces that those who have the chance to be online can tap into and identify with. Many of these spaces have been colonised by corporate interests, and more importantly, these capitalist forces are the primary drivers of new software and hardware production that will shape the look and feel, if not the content, of the Net of tomorrow (Schiller). As dangerous and unfortunate as this may be, identity is not yet in danger of being the proxy and total creation of mega-multinationals. Collective identification often has its roots in temporal cultures, tradition, and, for some, resistance identity. The audio-visual and Internet industries might have installed themselves as cultural gatekeepers and producers (a dangerous development in itself), but they cannot create cultural identities so easily. Drawing on the ideas laid out above, we can posit that the individual (whether they know it or not) and the cultural background and family/community influences in which he or she grows up most likely have the largest role. Concepts of identity, particularly newer work in the constructionist legacy (the example here being Castells), can serve us well by helping to forge understandings of the role of (1) the individual and (2) group influences in our day-to-day integration of cultural spaces, products and genres into our identities, behaviors and belief systems. Although constructionist ideas are implicitly represented in how much of the popular culture and society articulates "identity", it is all too easy to get caught up in concepts of identity based on bigotry, religious fanaticism or over-generalisation. As you stroll through the mall this week you might then pause to consider, not so much the extent to which our collective selves are casualties of a vapid consumer culture, but rather, I suggest, how to productively conceptualise the complexities of modern identities. References Berland, Jodi. "Angels Dancing: Cultural Technologies and the Production of Space." Cultural Studies. Eds. Lawrence Grossberg, Cary Nelson, and Paula Treichler. London: Routledge, 1992. Braddlee. "Virtual Communities: Computer-Mediated Communication and Communities of Association." Master's Thesis. U of Indiana, 1993. Castells, Manuel. The Power of Identity. Oxford: Blackwell, 1997. Hall, Stuart. "Introduction: Who Needs Identity?" Questions of Cultural Identity. Eds. Stuart Hall and Paul du Gay. London: Sage, 1996. Lillie, Jonathan. "The Empowerment Potential of Internet Use." Homepage of Jonathan Lillie. 3 Apr. 1998. 14 Oct. 1998 <http://www.unc.edu/~jlillie/340.php>. Schiller, H.I. "The Global Information Highway: Project for an Ungovernable World." Resisting the Virtual Life: The Culture and Politics of Information. Eds. James Brook and Iain A. Boal.
San Francisco: City Lights, 1995.
APA, Harvard, Vancouver, ISO, and other styles