Journal articles on the topic 'Other Information, Computing and Communication Sciences'

To see the other types of publications on this topic, follow the link: Other Information, Computing and Communication Sciences.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 50 journal articles for your research on the topic 'Other Information, Computing and Communication Sciences.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles in a wide variety of disciplines and organise your bibliography correctly.

1

Wang, Mian. "Mobile Information System of Ceramic Appreciation and Communication Management Based on Mobile Edge Computing." Mobile Information Systems 2021 (September 15, 2021): 1–11. http://dx.doi.org/10.1155/2021/4540664.

Abstract:
Mobile edge computing is now a very popular technology, originally proposed to address the shortage of global computing resources. This article studies how the latest mobile edge computing technology can support a mobile information system for the appreciation, exchange, and management of the traditional ceramic industry. The system accesses the network wirelessly and provides nearby users with the required services and cloud computing functions, allowing users to easily query the information and data they want. The mobile information system enables people to use mobile phones, tablets, and other mobile terminals to query information about the ceramic industry and to perform functions such as appreciation, communication, and management. From 2016 to 2020, China's ceramic industry exports increased from US$3.067 billion to US$6.826 billion, and traditional Chinese ceramics are loved by various industries at home and abroad. The number of employees in the ceramic industry has also grown to 5 million, an increase of 30% year-on-year. The ceramic industry is also very promising in the long term.
2

Hakken, David. "Computing and the Crisis: The Significant Role of New Information Technologies in the Current Socio-economic Meltdown." tripleC: Communication, Capitalism & Critique. Open Access Journal for a Global Sustainable Information Society 8, no. 2 (August 28, 2010): 205–20. http://dx.doi.org/10.31269/vol8iss2pp205-220.

Abstract:
There is good reason to be concerned about the long-term implications of the current crisis for the reproduction of contemporary social formations. Thus there is an urgent need to understand its character, especially its distinctive features. This article identifies profound ambiguities in valuing assets as new and key economic features of this crisis, ambiguities traceable to the dominant, “computationalist” computing used to develop new financial instruments. After some preliminaries, the article identifies four specific ways in which the computerization of finance is generative of crisis. It then demonstrates how computationalist computing is linked to other efforts to extend commodification based on the ideology of so-called “intellectual property” (IP). Several other accounts of the crisis are considered and then demonstrated to have less explanatory value. After considering how some commons-oriented forms of computing (e.g., Free/Libre and/or Open Source Software development projects) also undermine the IP project, the article concludes with a brief discussion of what research on Socially Robust and Enduring Computing might contribute to fostering alternative, non-crisis-generative ways to compute.
3

Al-Aufi, Ali, and Crystal Fulton. "Impact of social networking tools on scholarly communication: a cross-institutional study." Electronic Library 33, no. 2 (April 7, 2015): 224–41. http://dx.doi.org/10.1108/el-05-2013-0093.

Abstract:
Purpose – This paper aims to investigate the extent to which social networking tools had an impact on academics’ patterns of informal scholarly communication in humanities and social science disciplines. Social networking tools, reinforced by proliferation and advances in portable computing and wireless technologies, have reshaped how information is produced, communicated and consumed. Design/methodology/approach – A cross-institutional quantitative study utilized an online questionnaire survey sent to 382 academics affiliated with humanities and social science disciplines in two different academic institutions: one that belongs to a Western tradition of scholarly communication in Ireland, and the other to a developing country in Oman. Descriptive interpretation of data compared findings from both universities. Frequencies, percentages and means were displayed in tables to enhance the meaning of collected data. Inferential analysis was also conducted to determine statistical significance. Findings – Overall findings indicate progressive use of social networking tools for informal scholarly communication, and respondents perceive these tools as useful to their patterns of informal scholarly communication. However, nearly one-third of the respondents have never used social networking tools for informal scholarly communication. Institution-based data comparison revealed no significant differences except for a few activities of informal scholarly communication. Research limitations/implications – Given that the number of study subjects was eventually small (total = 382) and that academics by their very nature are disinclined to respond to online surveys, results of the study may suggest non-response errors, and these may impact negatively on the acceptability of inferences and statistical conclusions. The results of the study are, therefore, unlikely to be useful for generalization, but they remain suggestive of a growing tendency among humanities and social sciences’ academics to use social networking tools for informal scholarly communication. Originality/value – Empirical findings provide a broad understanding of the potential of social networking tools for informal scholarly communication in the humanities and social sciences. Multi-disciplinary investigation and qualitative studies may further deepen our understanding of the impact of social networking tools on patterns of scholarly communication.
4

Dai, Yu, Qiuhong Zhang, and Lei Yang. "Virtual Machine Migration Strategy Based on Multi-Agent Deep Reinforcement Learning." Applied Sciences 11, no. 17 (August 29, 2021): 7993. http://dx.doi.org/10.3390/app11177993.

Abstract:
Mobile edge computing is a new computing model, which pushes cloud computing power from the centralized cloud to the network edge. However, with this sinking of computing power, user mobility brings new challenges: since connectivity is usually unstable, services should be dynamically migrated between multiple edge servers to maintain service performance, that is, user-perceived latency. Considering that mobile edge computing is a highly distributed computing environment in which it is difficult to synchronize information between servers, and in order to ensure the real-time performance of the migration strategy, this paper proposes a virtual machine migration strategy based on multi-agent deep reinforcement learning. It adopts centralized training with distributed execution: migration actions are guided by global information during training, but only local observations are needed to obtain a migration action at execution time. Compared with centralized control methods, the proposed method alleviates the communication bottleneck. Compared with other distributed control methods, it needs only local information, requires no communication between servers, and speeds up perception of the current environment, so migration strategies can be generated faster. Simulation results show that the proposed strategy outperforms the baseline strategy in terms of convergence and energy consumption.
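The centralized-training, distributed-execution pattern described above can be illustrated with a minimal sketch (assuming a PyTorch-style actor-critic setup; the class names, layer sizes, and dimensions below are illustrative, not taken from the paper):

```python
# Minimal sketch of centralized training / distributed execution (CTDE).
import torch
import torch.nn as nn

class Actor(nn.Module):
    """Chooses a migration action from *local* observations only."""
    def __init__(self, obs_dim, n_actions):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(obs_dim, 64), nn.ReLU(),
                                 nn.Linear(64, n_actions))
    def forward(self, local_obs):
        return self.net(local_obs)  # action logits

class CentralCritic(nn.Module):
    """Scores joint behaviour using *global* state; used only during training."""
    def __init__(self, global_state_dim, n_agents):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(global_state_dim + n_agents, 64),
                                 nn.ReLU(), nn.Linear(64, 1))
    def forward(self, global_state, joint_actions):
        return self.net(torch.cat([global_state, joint_actions], dim=-1))

# At execution time each edge server runs only its actor, so no
# server-to-server communication is needed to pick a migration action.
obs = torch.randn(4)                      # one server's local observation
actor = Actor(obs_dim=4, n_actions=3)
print(actor(obs).argmax().item())
```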
5

Konkoly, K. R., and K. A. Paller. "0431 Two-Way Communication Between Dreamers and Experimenters." Sleep 43, Supplement_1 (April 2020): A166. http://dx.doi.org/10.1093/sleep/zsaa056.428.

Abstract:
Introduction: Dreams are emblematic of human sleep, but they have yet to be adequately explained. In part, this is due to the limited options available for peering into dream experiences. Mapping neural measures onto dreams is problematic when those dreams are recounted after waking: retrospective dream reports are subject to distortion and rapid forgetting. Methods: Here, we describe a method to overcome these obstacles through two-way communication between dreamers and experimenters. To demonstrate proof-of-concept, we presented softly spoken math problems to participants during lucid REM sleep, and they provided answers using covert physiological signals such as eye movements. We confirmed REM sleep using standard polysomnographic methods. Results: Thus far, 3 out of 8 participants who had lucid dreams correctly answered problems during REM sleep. Conclusion: Results document that sleeping individuals can have sufficient abilities for veridical perceptual analysis, maintaining information, computing simple answers using working memory, and expressing volitional replies. Dreamers can thus be capable of interacting and exchanging information with other individuals. In this way, the mental content experienced by the dreamer can be interrogated to characterize the phenomenological experiences and cognitive abilities of dreaming. Support: Mind Science Foundation, National Science Foundation
6

Wang, Jian Hui, and Li Liu. "Research on Computer Graphics Design and Visual Communication Design." Applied Mechanics and Materials 713-715 (January 2015): 2191–94. http://dx.doi.org/10.4028/www.scientific.net/amm.713-715.2191.

Abstract:
In today's society, with the rapid development of science and technology, computer graphic design is widely used in various industries: film and television, advertising, exhibitions, art, and video games all depend on it. With the development of computer information technology, graphic design technology continues to progress. The technology is no longer content with simple graphics rendering; it pays more attention to the visual communication of information and to enhancing the aesthetic appeal and expressiveness of shapes and images. This paper mainly discusses computer graphics design and visual communication design.
7

Mujević, Mersad, and Safet Korać. "Significance and role of using electronic communications in entrepreneurial companies." Ekonomski izazovi 9, no. 18 (2020): 86–95. http://dx.doi.org/10.5937/ekoizazov2018086m.

Abstract:
With the development of computer science and computer networks, primarily the Internet, and the increasing use of information and communication technologies in company business, a new form of business, and thus of the economy, is being established. For several years, computing has ranked high on political agendas in Europe and worldwide. Today, the European Commission considers computing a form of literacy, the basis for understanding how digital technologies work, and a foundation for 21st-century skills such as e-business (the digital economy) and analytical thinking. E-business operates on different principles from the old economy and requires a different economic philosophy: information, ideas, innovation and knowledge are what create value, growth and productivity. The modern way of doing business guarantees better market access and thus strengthens the position of companies, especially small and medium enterprises (SMEs), letting them make better use of their own resources through information and communication technologies. For SMEs, the challenge in the later stages of development will be precisely this: a better position in the global Internet market, with the basic premise of building a good material base and making their offer accessible to potential consumers in time.
8

Cao, Shihua, Xin Lin, Keyong Hu, Lidong Wang, Wenjuan Li, Mengxin Wang, and Yuchao Le. "Cloud Computing-Based Medical Health Monitoring IoT System Design." Mobile Information Systems 2021 (July 7, 2021): 1–12. http://dx.doi.org/10.1155/2021/8278612.

Abstract:
With the continuous improvement of the national medical system, health monitoring combined with cloud computing and the Internet of Things has attracted growing attention. This study mainly discusses the design of a medical health monitoring IoT system based on cloud computing. Between the user and the health service provider there are three components: the sensor terminal, the gateway terminal, and the service platform. The sensor terminal measures physiological indicators such as blood pressure, electrocardiogram, blood oxygen saturation, and heart rate; the gateway terminal links to the sensor terminal to receive these indicators and forward them to the business platform, and also receives health information and other instructions issued by the server. In the community service mode, users can be divided into groups by community and region, and corresponding service doctors and agent customer service personnel (nurses) can be assigned. Users can collect personal physiological indicators at home or outside through the medical terminal; these indicators are transmitted to the background health platform system through the mobile GSM-TD communication network. Users can also view their own historical health records and the opinions of health consultants through the web/WAP website. Through the integration capability of the health cloud platform, relying on interconnection with the HIS, LIS, and other information systems of professional medical institutions, special value-added services are jointly operated, such as appointment registration, maternal and child healthcare, and medical communication (doctor-patient interaction), so that users can enjoy the remote service and guidance of professional medical institutions by subscribing to health value-added services. In testing, the CPU utilization rate is 40%, memory usage is 7.44 GB with a memory utilization rate of 11.8%, and network bandwidth is 591.87 M. Throughout the test, these indicators remain stable, with no restarts, crashes, or other failures, so the system performance meets the design requirements.
9

Daberdaku, Sebastian, and Carlo Ferrari. "Computing voxelised representations of macromolecular surfaces." International Journal of High Performance Computing Applications 32, no. 3 (May 15, 2016): 407–32. http://dx.doi.org/10.1177/1094342016647114.

Abstract:
Voxel-based representations of surfaces have received a lot of interest in bioinformatics and computational biology as a simple and effective way of representing geometrical and physicochemical properties of proteins and other biomolecules. Processing such surfaces for large molecules can be challenging, as space-demanding data structures with associated high computational costs are required. In this paper, we present a methodology for the fast computation of voxelised macromolecular surface representations (namely the van der Waals, solvent-accessible and solvent-excluded surfaces). The proposed method implements a spatial slicing procedure on top of compact data structures to efficiently calculate the three molecular surface representations at high resolutions, in parallel. The spatial slicing protocol ensures a balanced workload distribution and allows the computation of the solvent-excluded surface with minimal synchronisation and communication between processes. This is achieved by adapting a multi-step region-growing EDT algorithm. At each step, distance values are first calculated independently for every slice; then, a small portion of the borders’ information is exchanged between adjacent slices. Very little process communication is also required in the pocket detection procedure, where the algorithm distinguishes surface portions belonging to solvent-accessible pockets from cavities buried inside the molecule. Experimental results are presented to validate the proposed approach.
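A minimal sketch of the slab-decomposition idea the abstract describes, assuming a NumPy voxel grid split along one axis; the function names and the per-slice placeholder work are invented for illustration:

```python
# Sketch of spatial slicing for a voxelised grid: contiguous slabs are
# processed in parallel, and the real algorithm exchanges only a thin border
# of distance values between neighbours at each multi-step EDT iteration.
import numpy as np
from multiprocessing import Pool

def process_slab(slab):
    # Placeholder for the per-slice work (e.g. one step of the
    # region-growing EDT restricted to this slab of planes).
    return slab > 0

def sliced_map(grid, n_slices):
    # Splitting along one axis gives each worker a contiguous slab,
    # which balances the workload across processes.
    slabs = np.array_split(grid, n_slices, axis=0)
    with Pool(n_slices) as pool:
        parts = pool.map(process_slab, slabs)
    return np.concatenate(parts, axis=0)

if __name__ == "__main__":
    grid = np.random.rand(64, 64, 64)   # stand-in for a voxelised surface grid
    print(sliced_map(grid, 4).shape)    # (64, 64, 64)
```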
10

Ferneley, Elaine, and Ben Light. "Unpacking User Relations in an Emerging Ubiquitous Computing Environment: Introducing the Bystander." Journal of Information Technology 23, no. 3 (September 2008): 163–75. http://dx.doi.org/10.1057/palgrave.jit.2000123.

Abstract:
The move towards technological ubiquity is allowing a more idiosyncratic and dynamic working environment to emerge that may result in the restructuring of information communication technologies, and changes in their use through different user groups’ actions. Taking a ‘practice’ lens to human agency, we explore the evolving roles of, and relationships between these user groups and their appropriation of emergent technologies by drawing upon Lamb and Kling's social actor framework. To illustrate our argument, we draw upon a study of a UK Fire Brigade that has introduced a variety of technologies in an attempt to move towards embracing mobile and ubiquitous computing. Our analysis of the enactment of such technologies reveals that Bystanders, a group yet to be taken as the central unit of analysis in information systems research, or considered in practice, are emerging as important actors. The research implications of our work relate to the need to further consider Bystanders in deployments other than those that are mobile and ubiquitous. For practice, we suggest that Bystanders require consideration in the systems development life cycle, particularly in terms of design and education in processes of use.
11

Zhao, Jianming, Peng Zeng, Yingjun Liu, and Tianyu Wang. "On Improving the Robustness of MEC with Big Data Analysis for Mobile Video Communication." Security and Communication Networks 2021 (July 6, 2021): 1–12. http://dx.doi.org/10.1155/2021/4539540.

Abstract:
Mobile video communication and the Internet of Things are playing an increasingly important role in our daily life. Mobile Edge Computing (MEC), as an essential network architecture for the Internet, can significantly improve the quality of video streaming applications. The mobile devices carrying video flows are often exposed to hostile environments, where they can be damaged by different attackers. Accordingly, MEC networks are often vulnerable to disruptions, whether natural disasters or intentional human attacks. Research on secure hub location in MEC, which can markedly enhance the robustness of the network, is therefore highly valuable. At present, most of the attacks encountered by edge nodes in MEC in the IoT are random attacks or random failures. According to network science, scale-free networks are more robust than other types of network under random failures. In this paper, an optimization algorithm is proposed to reorganize the structure of the network according to the amount of information transmitted between edge nodes. BA networks are more robust under random attacks, while WS networks behave better under intentional human attacks; we therefore change the structure of the network according to the attack type. Besides, in MEC networks for mobile video communication, the capacity of each device and the size of the video data influence the structure significantly, so the algorithm takes the capability of edge nodes and the amount of information between them fully into consideration. In the robustness test, we set the number of network nodes to 200 and 500 and increase the attack scale from 0% to 100% to observe the size of the giant component and the robustness calculated for each attack method. Evaluation results show that the proposed algorithm can significantly improve the robustness of MEC networks and has good potential to be applied in real-world MEC systems.
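The robustness test described here can be sketched with networkx: generate BA and WS graphs, remove a growing fraction of nodes either at random or hubs-first, and track the relative size of the giant component. The parameters are illustrative, not the paper's exact setup:

```python
# Giant-component robustness under random failures vs. targeted attacks.
import random
import networkx as nx

def giant_fraction(g):
    if g.number_of_nodes() == 0:
        return 0.0
    return max(len(c) for c in nx.connected_components(g)) / g.number_of_nodes()

def robustness_curve(g, targeted=False, steps=10):
    g = g.copy()
    n = g.number_of_nodes()
    curve = []
    for _ in range(steps):
        k = min(n // steps, g.number_of_nodes())
        if targeted:  # intentional human attack: remove the hubs first
            victims = sorted(g.degree, key=lambda d: -d[1])[:k]
            g.remove_nodes_from(v for v, _ in victims)
        else:         # random failure
            g.remove_nodes_from(random.sample(list(g.nodes), k))
        curve.append(giant_fraction(g))
    return curve

ba = nx.barabasi_albert_graph(200, 2)    # scale-free: robust to random failures
ws = nx.watts_strogatz_graph(200, 4, 0.1)
print(robustness_curve(ba), robustness_curve(ws, targeted=True))
```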
12

Adu, Kofi Koranteng, and Patrick Ngulube. "Preserving the digital heritage of public institutions in Ghana in the wake of electronic government." Library Hi Tech 34, no. 4 (November 21, 2016): 748–63. http://dx.doi.org/10.1108/lht-07-2016-0077.

Abstract:
Purpose – The purpose of this paper is to examine the digital preservation of e-government in Ghana under the research question: what are the current digital preservation strategies being deployed across the ministries and agencies in Ghana? Design/methodology/approach – Guided by a conceptual framework and underpinned by a quantitative approach, the paper uses a survey to address the digital preservation strategies deployed across public sector organisations in Ghana. It underscores the link between the conceptual framework and the literature to analyse the various digital preservation strategies. Findings – Backup strategy, migration, metadata and trusted repositories were noted as the most widely implemented preservation strategies across the ministries and agencies. On the other hand, cloud computing, refreshing and emulation were the least implemented strategies used to address the digital preservation challenges. Research limitations/implications – The paper adds to the existing conceptual underpinnings that have dominated the debate about data management, archival storage, preservation strategies, challenges and best practices of digital preservation of e-government. Originality/value – This study draws its originality from the evidence of studies on digital preservation in Ghana, as most studies have rather focussed on the preservation of documentary heritage, preservation and security of microfilms, and preservation practices in the public records (Akussah, 2002; Ampofo, 2009; Festus, 2010). This study addresses the knowledge gap in the preservation of digital records in a country where little attention has been accorded to digital preservation. The study also feeds into Ghana’s Vision 2020 and the information communication technology policy document of the Ministry of Communication, which aims at ensuring that Ghanaians have access to information and communication technology products and services.
13

Piotelat, Elisabeth, and Florence Raulin Cerceau. "New SETI prospects opened up by current information networking." International Journal of Astrobiology 12, no. 3 (June 20, 2013): 208–11. http://dx.doi.org/10.1017/s1473550413000141.

Abstract:
This paper discusses ideas that impact the fc factor as defined by Frank Drake in 1961, i.e. the fraction of planets with intelligent creatures capable of interstellar communication. This factor remains one of the most speculative terms of the equation. We suggest that the ability to share information is an important parameter to take into account in evaluating the tendency of a civilization to make contact (or share data) with other civilizations. Thus, we give special consideration to the fraction of planets with intelligent creatures capable of producing and sharing large amounts of data. First, we determine the level of our own civilization in the framework of Sagan's energy- and information-based classification, taking into account recent improvements in computing and networking technologies. Second, we distinguish two types of organization, hierarchical and heterarchical, with respect to information sharing. We illustrate this distinction in the case of SETI and show that the probability of detecting a civilization would be greater if that civilization is heterarchical rather than hierarchical, and if we utilize heterarchical principles for SETI.
14

Wu, J. Y., R. Xin, J. B. Zhao, T. Zheng, D. Jiang, and P. F. Zhang. "Study on Delay Optimization of Fog Computing Edge Nodes Based on the CPSO-LB Algorithm." Wireless Communications and Mobile Computing 2020 (December 1, 2020): 1–12. http://dx.doi.org/10.1155/2020/8811175.

Abstract:
With the development of modern science and technology and the steady advance of urbanization, intelligent networks have emerged and are replacing traditional networks as the next generation of networks. Information security is one of the most important research directions in intelligent network construction. To resist the threat of privacy leakage during data transmission from intelligent terminals, this paper establishes an original four-layer fog computing system suited to intelligent network data collection, transmission, and processing. Using the Paillier algorithm for encryption and fine-grained aggregation, the fine-grained aggregates are embedded as the coefficients of a univariate polynomial evaluated by Horner's rule at the cloud node; this further aggregation reduces the amount of transmitted data and therefore the communication overhead. Meanwhile, the invertibility of Horner's rule allows the EPSI to recover the subregional plaintexts, which are summed to obtain cloud-level information data. Comparative simulation experiments against other algorithms show that the proposed optimization achieves a higher level of security.
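The Horner-packing step lends itself to a short sketch (Paillier encryption omitted; the base and the sample values are assumptions, and the real scheme operates on ciphertexts rather than plaintext integers):

```python
# Fold several sub-region aggregates into one integer as coefficients of a
# univariate polynomial; the base must exceed any aggregate value, so the
# packing is exactly invertible and one value carries all sub-regions.

BASE = 10**6  # assumption: every sub-region aggregate is < BASE

def pack(aggregates):
    """Evaluate a0 + a1*B + a2*B^2 + ... via Horner's rule."""
    packed = 0
    for a in reversed(aggregates):
        packed = packed * BASE + a
    return packed

def unpack(packed, n):
    """Invert the packing to recover the n sub-region plaintexts."""
    out = []
    for _ in range(n):
        packed, a = divmod(packed, BASE)
        out.append(a)
    return out

vals = [1200, 450, 98]            # per-sub-region sums
assert unpack(pack(vals), 3) == vals
print(sum(vals))                  # cloud-level total
```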
15

Walton, Douglas N. "Using conversation policies to solve problems of ambiguity in argumentation and artificial intelligence." Pragmatics and Cognition 14, no. 1 (August 22, 2006): 3–36. http://dx.doi.org/10.1075/pc.14.1.03wal.

Abstract:
This investigation joins recent research on problems with ambiguity in two fields, argumentation and computing. In argumentation, there is a concern with fallacies arising from ambiguity, including equivocation and amphiboly. In computing, the development of agent communication languages is based on conversation policies that make it possible to have information exchanges on the internet, as well as other forms of dialogue like persuasion and negotiation, in which ambiguity is a problem. Because it is not possible to sharply differentiate between problems arising from ambiguity and those arising from vagueness, obscurity and indeterminacy, some study of the latter is included. The semantic web is based on what are called ontologies, or systems of classification of concepts, shown to be useful tools for dealing with these problems.
16

Kakkar, Latika, Deepali Gupta, Sapna Saxena, and Sarvesh Tanwar. "An Analysis of Integration of Internet of Things and Cloud Computing." Journal of Computational and Theoretical Nanoscience 16, no. 10 (October 1, 2019): 4345–49. http://dx.doi.org/10.1166/jctn.2019.8523.

Abstract:
The Internet of Things (IoT) comprises various smart devices which are networked together to detect, accumulate, process, improve and exchange significant data over the Internet. IoT has improved our lifestyle by offering various applications such as the intelligent home, smart healthcare, traffic monitoring and smart city devices. IoT devices are constrained in power, battery life, memory and network capacity, so the cloud can be used for accumulating and analyzing IoT data. Due to the considerable increase in data transfer over the Internet and other devices, confidential information from IoT sources needs to be secured from any third-party access. Cloud computing (CC), on the other hand, provides protected, rapid and advantageous data storage and computing services over the Internet. The integration of these two technologies can prove beneficial for both. Therefore, we need an efficient and authentic method for secure communication in the IoT and cloud-based big data environment. This paper provides a review of the amalgamation of the IoT and the cloud, featuring the implementation challenges and integration benefits.
17

Shahryari, Om-Kolsoom, Amjad Anvari-Moghaddam, and Shadi Shahryari. "Demand side management using the internet of energy based on LoRaWAN technology." Kurdistan Journal of Applied Research 2, no. 3 (August 27, 2017): 112–19. http://dx.doi.org/10.24017/science.2017.3.35.

Abstract:
The smart grid, as a communication network, allows numerous connected devices such as sensors, relays and actuators to interact and cooperate with each other. An Internet-based solution for electricity that provides a bidirectional flow of information and power is the internet of energy (IoE), an extension of the smart grid concept. The large number of connected devices, the huge amount of data generated by the IoE, and issues related to data transmission, processing and storage push the IoE to be integrated with cloud computing. Furthermore, in order to enhance performance and reduce the volume of transmitted data while processing information in acceptable time, fog computing is introduced as a layer between the IoE layer and the cloud layer. This layer serves as a local processing level that reduces data transmissions to the cloud. It also saves energy that IoE devices would otherwise consume transmitting data to the cloud, thanks to LoRaWAN, a long-range, low-power, wide-area, low-bit-rate wireless telecommunication system. All devices in the fog domain are connected over long-range wide-area network (LoRa) links to a smart gateway. The gateway, which bridges the fog domain and the cloud, schedules devices/appliances by maintaining a priority queue, which can perform demand-side management dynamically. The queue is affected not only by consumer importance but also by consumer policies and the status of energy resources.
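A minimal sketch of the gateway's priority queue for demand-side management; the priority formula and its weights are invented for illustration, since the abstract only states that consumer importance, consumer policies, and energy-resource status feed into the queue:

```python
# Gateway-side scheduling: appliances are served in priority order, where the
# priority blends consumer importance with the state of energy resources.
import heapq

def priority(importance, demand_kw, renewable_share):
    # Higher importance and higher renewable availability -> served earlier.
    # heapq is a min-heap, so the score is negated.
    return -(2.0 * importance + renewable_share - 0.5 * demand_kw)

queue = []
for name, imp, kw in [("fridge", 0.9, 0.2), ("ev_charger", 0.4, 7.0),
                      ("heater", 0.6, 2.0)]:
    heapq.heappush(queue, (priority(imp, kw, renewable_share=0.7), name))

while queue:
    _, appliance = heapq.heappop(queue)
    print("schedule:", appliance)   # fridge, heater, ev_charger
```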
18

Calderbank, Robert, and Andrew Thompson. "CHIRRUP: a practical algorithm for unsourced multiple access." Information and Inference: A Journal of the IMA 9, no. 4 (December 5, 2019): 875–97. http://dx.doi.org/10.1093/imaiai/iaz029.

Abstract:
Unsourced multiple access abstracts grantless simultaneous communication of a large number of devices (messages) each of which transmits (is transmitted) infrequently. It provides a model for machine-to-machine communication in the Internet of Things, including the special case of radio-frequency identification, as well as neighbour discovery in ad hoc wireless networks. This paper presents a fast algorithm for unsourced multiple access that scales to ${\mathscr{C}}=2^{100}$ (active or non-active) devices (arbitrary $100$ bit messages). The primary building block is multiuser detection of binary chirps, which are simply codewords in the second-order Reed–Muller code. The chirp detection algorithm originally presented by Howard et al. (2008, 42nd Annual Conference on Information Sciences and Systems) is enhanced and integrated into a peeling decoder designed for a patching and slotting framework. In terms of both energy per bit and number of active devices (number of transmitted messages), the proposed algorithm is within a factor of $2$ of state-of-the-art approaches. A significant advantage of our algorithm is its computational efficiency. We prove that the worst-case complexity of the basic chirp reconstruction algorithm is ${\mathscr{O}}[nK(\log _2^2 n + K)]$, where $n$ is the codeword length and $K$ is the number of active users. Crucially, the complexity is sublinear in ${\mathscr{C}}$, which makes the reconstruction computationally feasible—a claim we support by reporting computing times for our algorithm. Our performance and computing time results represent a benchmark against which other practical algorithms can be measured.
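A back-of-the-envelope check of why the stated worst-case complexity matters: it depends on the codeword length $n$ and the number of active users $K$, not on the $2^{100}$ possible messages. The numbers below are illustrative, not from the paper:

```python
# The chirp decoder's worst case is O(n*K*(log2(n)**2 + K)), sublinear in the
# message space C = 2**100 that an exhaustive scan would have to cover.
from math import log2

n, K = 2**15, 100                       # example codeword length, active users
chirp_ops = n * K * (log2(n) ** 2 + K)  # ~1.1e9 elementary operations
exhaustive = 2**100                     # scanning every possible message
print(f"{chirp_ops:.2e} vs {exhaustive:.2e}")
```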
19

Lahlou, Saadi. "Identity, social status, privacy and face-keeping in digital society." Social Science Information 47, no. 3 (September 2008): 299–330. http://dx.doi.org/10.1177/0539018408092575.

Abstract:
Digitization of society raises concerns about privacy. This article first describes privacy threats of life-logging. It gives the technically novice reader a quick overview of what information and communication technology (ICT) is currently preparing for society, based on state-of-the art research in the industry laboratories: ubiquitous computing, aware environments, the Internet of Things, and so on. We explain how geolocation systems work and how they can provide detailed accounts of personal activity that will deeply affect privacy. At present, system designers rarely implement privacy-enhancing technologies — we explain why, based on empirical research. On the other hand, users, while expressing concern, do not protect themselves in practice — we list reasons for this. The problem is complex because the very nature of identity and social relations works against protecting personal data; this is the privacy dilemma. At least two key mechanisms in the production of good interaction and in the construction of social status are based on personal data disclosure. Then we discuss the nature of privacy, based on field observation. Privacy loss can be seen as `losing face'. We detail this notion, based on a discussion of the notion of face, and especially the Asian social construct of `Chemyon'. We then propose a new, positive, definition of privacy as `keeping face'. This positive notion can be used to build constructive guidelines for enhancing privacy in systems design, compatible with the way designers perceive their role. These guidelines are presented in an annex, after a short conclusion that advocates a constructive — perhaps risky — role for social science in the construction of future information and communication technology.
20

Ellen Frederick, Donna. "Libraries, data and the fourth industrial revolution (Data Deluge Column)." Library Hi Tech News 33, no. 5 (July 4, 2016): 9–12. http://dx.doi.org/10.1108/lhtn-05-2016-0025.

Abstract:
Purpose – The World Economic Forum, held in Davos, Switzerland, in January 2016, brought together leaders from the areas of science and technology, business, health, education, government and other fields, as well as representatives from the media. A key theme of the forum was what has come to be known as the “fourth industrial revolution”. Design/methodology/approach – News reports and blog posts about the forum gave the impression that this new “revolution” would bring unprecedented advances in science and medicine and holds the potential for a future dominated by intelligent robots and massive levels of unemployment. Findings – For example, on January 24, 2016, Elliot of The Guardian reported that the “Fourth Industrial Revolution brings promise and peril for humanity”. Sensational headlines and sound bites are good at attracting attention, but they are not very effective at communicating what this revolution is about and what it could mean for our lives, communities, governments and workplaces in the near and distant future. The snippets of information reported here and there give the impression that robots, artificial intelligence, cloud-based computing, big data and a combination of other technologies are gradually merging to create a new reality which has the potential to revolutionize our way of life. Originality/value – This installment of the Data Deluge explores the fourth industrial revolution, what role libraries might play in this revolution and how our information environment could be forever changed.
21

Shammar, Elham Ali, and Ammar Thabit Zahary. "The Internet of Things (IoT): a survey of techniques, operating systems, and trends." Library Hi Tech 38, no. 1 (October 5, 2019): 5–66. http://dx.doi.org/10.1108/lht-12-2018-0200.

Abstract:
Purpose – The Internet has radically changed the way people interact in the virtual world, in their careers or social relationships. IoT technology has added a new vision to this process by enabling connections between smart objects and humans, and also between smart objects themselves, which leads to anything, anytime, anywhere, and any-media communications. IoT allows objects to physically see, hear, think, and perform tasks by making them talk to each other, share information and coordinate decisions. To enable this vision, IoT utilizes technologies such as ubiquitous computing, context awareness, RFID, WSN, embedded devices, CPS, communication technologies, and internet protocols. IoT is considered to be the future internet, which is significantly different from the Internet we use today. The purpose of this paper is to provide up-to-date literature on trends in IoT research, which is driven by the need for convergence of several interdisciplinary technologies and new applications. Design/methodology/approach – A comprehensive IoT literature review has been performed in this paper as a survey. The survey starts by providing an overview of IoT concepts, visions and evolutions. IoT architectures are also explored. Then, the most important components of IoT are discussed, including a thorough discussion of IoT operating systems such as TinyOS, Contiki OS, FreeRTOS, and RIOT. A review of IoT applications is also presented and, finally, IoT challenges that researchers have recently encountered are introduced. Findings – Studies of the IoT literature and projects show the disproportionate importance of technology in IoT projects, which are often driven by technological interventions rather than innovation in the business model. There are a number of serious concerns about the dangers of IoT growth, particularly in the areas of privacy and security; hence, industry and government have begun addressing these concerns. In the end, what makes IoT exciting is that we do not yet know the exact use cases which would have the ability to significantly influence our lives. Originality/value – This survey provides a comprehensive literature review on IoT techniques, operating systems and trends.
22

Ghazzawi, Nizar, Benoît Robichaud, Patrick Drouin, and Fatiha Sadat. "Automatic extraction of specialized verbal units." Terminology 23, no. 2 (December 31, 2017): 207–37. http://dx.doi.org/10.1075/term.00002.gha.

Abstract:
This paper presents a methodology for the automatic extraction of specialized Arabic, English and French verbs of the field of computing. Since nominal terms are predominant in terminology, our interest is to explore to what extent verbs can also be part of a terminological analysis. Hence, our objective is to verify how an existing extraction tool will perform when it comes to specialized verbs in a given specialized domain. Furthermore, we want to investigate any particularities that a language can represent regarding verbal terms from the automatic extraction perspective. Our choice to operate on three different languages reflects our desire to see whether the chosen tool can perform better on one language compared to the others. Moreover, given that Arabic is a morphologically rich and complex language, we consider investigating the results yielded by the extraction tool. The extractor used for our experiment is TermoStat (Drouin 2003). So far, our results show that the extraction of verbs of computing represents certain differences in terms of quality and particularities of these units in this specialized domain between the languages under question.
23

Lin, Dongmei. "Research on the Information Construction of Accounting Audit Based on the Big Data of Computer." International Journal of Information Technology and Web Engineering 12, no. 3 (July 2017): 74–82. http://dx.doi.org/10.4018/ijitwe.2017070107.

Abstract:
For a distributed network content audit system, there are two important communication needs: first, how to send audit rules quickly to the entire network; second, how audit nodes transmit Internet record data among each other. These two issues correspond to network-wide broadcasting and large-scale data transmission. This article focuses on large-scale data transmission, proposes an approach for it, and puts forward its application in the distributed network content audit system. The main idea is to divide the large-scale data into blocks and transmit them over multiple routes. The paper analyzes the main current methods of large-scale data transmission and summarizes their shortcomings. Drawing on task-allocation algorithms from grid computing, a large-scale data transmission algorithm based on node performance is put forward. This algorithm uses the APDG broadcasting algorithm to find relay nodes and, judging by the performance of each relay node, assigns data blocks of different sizes to the relay nodes for forwarding. The experimental results show that the node-performance-based algorithm has better flexibility and transmission performance than current large-scale data transmission algorithms.
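The core allocation idea, splitting a payload into blocks sized in proportion to each relay node's measured performance, can be sketched as follows (the scores, names, and sizes are invented; the APDG relay discovery step is not shown):

```python
# Proportional block allocation: faster relays forward more data.
def allocate_blocks(total_bytes, node_scores):
    total_score = sum(node_scores.values())
    alloc, assigned = {}, 0
    nodes = list(node_scores)
    for node in nodes[:-1]:
        share = int(total_bytes * node_scores[node] / total_score)
        alloc[node] = share
        assigned += share
    alloc[nodes[-1]] = total_bytes - assigned  # remainder goes to the last relay
    return alloc

print(allocate_blocks(10_000_000, {"relayA": 5.0, "relayB": 3.0, "relayC": 2.0}))
# -> {'relayA': 5000000, 'relayB': 3000000, 'relayC': 2000000}
```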
24

Salamí, Esther, Antonia Gallardo, Georgy Skorobogatov, and Cristina Barrado. "On-the-Fly Olive Trees Counting Using a UAS and Cloud Services." Remote Sensing 11, no. 3 (February 5, 2019): 316. http://dx.doi.org/10.3390/rs11030316.

Abstract:
Unmanned aerial systems (UAS) are becoming a common tool for aerial sensing applications. Nevertheless, sensed data need further processing before becoming useful information, and this processing requires large computing power and time before delivery. In this paper, we present a parallel architecture that includes an unmanned aerial vehicle (UAV), a small embedded computer on board, a communication link to the Internet, and a cloud service, with the aim of providing useful real-time information directly to end-users. The potential of parallelism as a solution in remote sensing has not been addressed for a distributed architecture that includes the UAV processors. The architecture is demonstrated for a specific problem: counting olive trees in a crop field where the trees are regularly spaced. During the flight, the embedded computer is able to process individual images on board the UAV and provide the total count. The tree counting algorithm obtains an F1 score of 99.09% for a sequence of ten images with 332 olive trees. The detected trees are geolocated and can be visualized on the Internet seconds after take-off, with no further processing required. This is a use case demonstrating near real-time results obtained from UAS usage. Other more complex UAS applications, such as tree inventories, search and rescue, fire detection, or stock breeding, can potentially benefit from this architecture and obtain faster outcomes, accessible while the UAV is still in flight.
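The F1 score quoted above combines precision and recall; a quick worked example with invented counts for a 332-tree field (not the paper's actual confusion matrix):

```python
# F1 = harmonic mean of precision and recall; counts below are illustrative.
tp, fp, fn = 329, 3, 3
precision = tp / (tp + fp)
recall = tp / (tp + fn)
f1 = 2 * precision * recall / (precision + recall)
print(f"F1 = {f1:.4f}")  # ~0.9910, i.e. roughly the 99.09% reported
```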
25

Redkina, N. S. "Global trends of libraries development: optimism vs pessimism (foreign literature review) Part 1." Bibliosphere, no. 4 (December 30, 2018): 87–94. http://dx.doi.org/10.20913/1815-3186-2018-4-87-94.

Abstract:
The dynamic development of the external technological environment, on the one hand, challenges libraries and questions their future existence; on the other, it helps libraries work more productively, increases their competitiveness and efficiency, expands the range of social projects, and develops new ways and forms of working with users that take into account their preferences in information and services. The review is based on over 500 articles found in the world's largest databases (Google Scholar, Web of Science, Scopus, etc.) that discuss trends and the future development of libraries. The documents were classified according to sections and types of libraries, as well as advanced technologies. Examples of information technologies were collected and reviewed, as were articles related to the implementation of information technologies in creating new services, with emphasis on those that may affect libraries in the future. The latest information technologies that can be applied to the next-generation library have been studied. The material is structured in blocks and presented in two parts. The first presents the following sections: 1) challenges of the external environment and the future of libraries; 2) modern information technologies in library development (mobile technologies and applications, cloud computing, big data, the internet of things, virtual and augmented reality, technical innovations, etc.); 3) the Library 4.0 concept as a new direction for library development. The second part of the review article (Bibliosphere, 2019, 1) will touch on the following issues: 1) user preferences and new library services (software for information literacy development, research data management, web archiving, etc.); 2) libraries as centers of intellectual leisure, communication platforms, and places for learning, co-working, renting equipment, creativity, work, scientific experiments and leisure; 3) smart buildings and smart libraries; 4) future optimism. Based on content analysis of the publications, it is concluded that libraries should not only accumulate resources and provide access to them, but renew existing approaches to the forms and content of their activities, as well as the goals, missions and prospects for their development, using various hardware and software, cloud computing technologies, mobile technologies and apps, social networks, etc.
26

Tak, Sehyun, Jinsu Yoon, Soomin Woo, and Hwasoo Yeo. "Sectional Information-Based Collision Warning System Using Roadside Unit Aggregated Connected-Vehicle Information for a Cooperative Intelligent Transport System." Journal of Advanced Transportation 2020 (July 21, 2020): 1–12. http://dx.doi.org/10.1155/2020/1528028.

Abstract:
Vehicular collision and hazard warning is an active field of research that seeks to improve road safety by providing an earlier warning to drivers to help them avoid potential collision danger. In this study, we propose a new type of collision warning system based on aggregated sectional information describing vehicle movement, processed by a roadside unit (RSU). The proposed sectional information-based collision warning system (SCWS) overcomes the limitations of existing collision warning systems such as high installation costs, the need for high market penetration rates, and the lack of consideration of traffic dynamics. The proposed SCWS gathers vehicle operation data through on-board units (OBUs) and shares this aggregated information through an RSU. All the data for each road section are locally processed by the RSU using edge computing, allowing the SCWS to effectively estimate the information describing the vehicles surrounding the subject vehicle in each road section. The performance of the SCWS was evaluated through comparison with other collision warning systems: the vehicle-to-vehicle communication-based collision warning system (VCWS), which solely uses in-vehicle sensors; the hybrid collision warning system (HCWS), which uses information from both infrastructure and in-vehicle sensors; and the infrastructure-based collision warning system (ICWS), which only uses data from infrastructure. In this study, the VCWS with a 100% market penetration rate was considered to provide the result theoretically closest to the actual collision risk. The comparison results show that in both aggregation and disaggregation level analyses, the proposed SCWS exhibits a similar collision risk trend to the VCWS. Furthermore, the SCWS shows high potential for practical application because it provides acceptable performance even with a low market penetration rate (30%) at the relatively low cost of OBU installation, compared to the VCWS requirement of a high market penetration rate at a high installation cost.
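An illustrative sketch of the RSU-side aggregation: OBU reports from one road section are reduced to sectional statistics that stand in for a vehicle's surroundings. The fields, thresholds, and warning rule below are assumptions, not the paper's actual risk model:

```python
# Edge aggregation at the RSU, plus a crude sectional warning rule.
from statistics import mean

def sectional_state(obu_reports):
    """obu_reports: list of dicts like {'speed': <m/s>} from one road section."""
    speeds = [r["speed"] for r in obu_reports]
    return {"mean_speed": mean(speeds),
            "speed_spread": max(speeds) - min(speeds),
            "density": len(obu_reports)}   # vehicles currently in the section

def warn(state, my_speed, gap_m):
    # Surrogate rule: closing fast on a slower section with <3 s to close
    # the gap triggers a warning.
    closing = my_speed - state["mean_speed"]
    return closing > 0 and gap_m / max(closing, 1e-6) < 3.0

section = sectional_state([{"speed": 8.0}, {"speed": 7.5}, {"speed": 9.0}])
print(warn(section, my_speed=20.0, gap_m=30.0))  # True: ~2.5 s to close
```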
27

Ahmed, Mohammed Imtyaz, and G. Kannan. "Cloud-Based Remote RFID Authentication for Security of Smart Internet of Things Applications." Journal of Information & Knowledge Management 20, Supp01 (January 30, 2021): 2140004. http://dx.doi.org/10.1142/s0219649221400049.

Abstract:
Radio frequency identification (RFID) is the technology that attaches smart labels to things, so that even physical things can participate in the computing process. It is becoming popular due to its technological innovation and its ability to overcome the line-of-sight problem. With Internet of Things (IoT) technology, RFID usage has become ubiquitous in smart applications and IoT use cases such as smart education, smart homes, smart healthcare and smart cities, to mention a few. The integration of the digital and physical worlds is made possible by IoT, RFID and a host of other technologies and standards. When connected devices and things are uniquely identified using RFID technology, it is essential to know its utility in the authentication process and the security challenges it poses. The RFID tag and RFID reader are involved in wireless communication and identification. RFID tags may carry sensitive information, and any vulnerabilities they have may be exploited by adversaries. Moreover, heavy computation is involved in RFID-based authentication. To overcome issues of privacy, security and overhead, improvements have been proposed in the literature: researchers have used cryptographic tools, hash functions and symmetric key encryption for secure RFID communications. However, the level of security is still inadequate. In this paper, we propose a cloud-based remote RFID authentication scheme with the smart home as the case study. The proposed scheme has features like forward secrecy, anonymity and untraceability besides being lightweight, and it can withstand various security attacks. Our simulation study revealed that the proposed system model and the cloud-based remote RFID authentication scheme are effective in providing privacy and security as part of the access control system in the smart home IoT use case.
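For context, a generic hash-based challenge-response exchange of the lightweight family the abstract refers to; this is not the authors' actual protocol, and the key size, nonce handling, and ratcheting rule are illustrative assumptions:

```python
# Generic hash-based RFID challenge-response sketch.
import hashlib, hmac, os

def h(*parts):
    return hashlib.sha256(b"|".join(parts)).digest()

# Tag and server share a secret key; the tag never sends its ID in the clear.
key = os.urandom(16)

def tag_response(key, challenge):
    return h(key, challenge)

def server_verify(key, challenge, response):
    return hmac.compare_digest(h(key, challenge), response)

challenge = os.urandom(16)              # fresh nonce from the reader
resp = tag_response(key, challenge)
assert server_verify(key, challenge, resp)

# Forward secrecy is commonly approximated by ratcheting the key after each
# successful round, so a later key exposure does not reveal past sessions.
key = h(key)
```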
28

Janani, K. "Mitigation of Malware Effect using Cyber Threat Analysis using Ensemble Deep Belief Networks." International Journal of Innovative Technology and Exploring Engineering 10, no. 11 (September 30, 2021): 40–46. http://dx.doi.org/10.35940/ijitee.k9477.09101121.

Abstract:
Cybersecurity entails developing security models and techniques to prevent the illegal access, modification, or destruction of computing resources, networks, programs, and data. Due to tremendous developments in information and communication technologies, new dangers to cybersecurity have arisen and are rapidly changing. The creation of a deep learning system requires a substantial number of input samples, and it can take a great deal of time and resources to gather and process them. Building and maintaining such a system requires a huge amount of resources, including memory, data and computational power. In this paper, we develop an ensemble of Deep Belief Networks to classify cybersecurity threats in large-scale networks. An extensive simulation is conducted to test the efficacy of the model under different security attacks. The results show that the proposed method achieves a higher level of security than the other methods.
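The abstract gives no architectural details, so the sketch below shows only the generic ensemble pattern it names, with scikit-learn MLPs standing in for the deep belief networks:

```python
# Ensemble of neural classifiers with soft (probability-averaging) voting.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier
from sklearn.neural_network import MLPClassifier

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
members = [(f"dbn{i}", MLPClassifier(hidden_layer_sizes=(64, 32),
                                     max_iter=300, random_state=i))
           for i in range(3)]
ensemble = VotingClassifier(members, voting="soft")
ensemble.fit(X[:1500], y[:1500])
print("held-out accuracy:", ensemble.score(X[1500:], y[1500:]))
```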
29

Baldini, Gianmarco, Jose Luis Hernandez Ramos, and Irene Amerini. "Intrusion Detection Based on Gray-Level Co-Occurrence Matrix and 2D Dispersion Entropy." Applied Sciences 11, no. 12 (June 16, 2021): 5567. http://dx.doi.org/10.3390/app11125567.

Abstract:
The Intrusion Detection System (IDS) is an important tool to mitigate cybersecurity threats in an Information and Communication Technology (ICT) infrastructure. The function of the IDS is to detect an intrusion into an ICT system or network so that adequate countermeasures can be adopted. Desirable features of an IDS are computing efficiency and high intrusion detection accuracy. This paper proposes a new anomaly detection algorithm for IDS, where a machine learning algorithm is applied to detect deviations from legitimate traffic, which may indicate an intrusion. To improve computing efficiency, a sliding window approach is applied in which the analysis operates on long sequences of network flow statistics. The paper proposes a novel approach based on transforming the network flow statistics into gray images, to which the Gray-Level Co-occurrence Matrix (GLCM) is applied together with an entropy measure recently proposed in the literature: the 2D Dispersion Entropy. The approach is applied to the recently published IDS data set CIC-IDS2017. The results show that the proposed approach is competitive with other approaches proposed in the literature on the same data set. The approach is applied to two attacks from the CIC-IDS2017 data set, DDoS and Port Scan, achieving Error Rates of 0.0016 and 0.0048, respectively.
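The feature-extraction step can be sketched with scikit-image: a window of network-flow statistics is rescaled to an 8-bit gray image and GLCM texture features are computed from it. The window size and the chosen features are illustrative, and the 2D Dispersion Entropy step is omitted:

```python
# Flow statistics -> gray image -> GLCM texture features.
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def flows_to_image(window):
    w = np.asarray(window, dtype=float)
    w = (w - w.min()) / (w.max() - w.min() + 1e-12)   # normalise to [0, 1]
    return (w * 255).astype(np.uint8)

window = np.random.rand(32, 32)        # stand-in for a window of flow stats
img = flows_to_image(window)
glcm = graycomatrix(img, distances=[1], angles=[0], levels=256, normed=True)
features = [graycoprops(glcm, p)[0, 0] for p in ("contrast", "homogeneity")]
print(features)   # fed to the anomaly classifier in the full pipeline
```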
30

Ramamurthy, M. K. "A new generation of cyberinfrastructure and data services for earth system science education and research." Advances in Geosciences 8 (June 6, 2006): 69–78. http://dx.doi.org/10.5194/adgeo-8-69-2006.

Abstract:
A revolution is underway in the role played by cyberinfrastructure and modern data services in the conduct of research and education. We live in an era of unprecedented data volume from diverse sources, multidisciplinary analysis and synthesis, and an active, learner-centered education emphasis. Complex environmental problems such as global change and the water cycle transcend disciplinary and geographic boundaries, and their solution requires integrated earth system science approaches. Contemporary education strategies recommend adopting an Earth system science approach for teaching the geosciences, employing pedagogical techniques such as enquiry-based learning. The resulting transformation in geoscience education and research creates new opportunities for advancement and poses many challenges. The success of the scientific enterprise depends heavily on the availability of a state-of-the-art, robust, and flexible cyberinfrastructure, and on timely access to quality data, products, and tools to process, manage, analyze, integrate, publish, and visualize those data. Concomitantly, rapid advances in computing, communication, and information technologies have revolutionized the provision and use of data, tools and services. The profound consequences of Moore's Law and the explosive growth of the Internet are well known. On the other hand, how other technological trends have shaped the development of data services is less well understood. For example, the advent of digital libraries, web services, open standards and protocols have been important factors in shaping a new generation of cyberinfrastructure for solving key scientific and educational problems. This paper presents a broad overview of these issues, along with a survey of key information technology trends, and discusses how those trends are enabling new approaches to applying data services for solving geoscientific problems.
APA, Harvard, Vancouver, ISO, and other styles
31

N., Ramachandran, Sivaprakasam P., Thangamani G., and Anand G. "Selecting a suitable Cloud Computing technology deployment model for an academic institute." Campus-Wide Information Systems 31, no. 5 (October 28, 2014): 319–45. http://dx.doi.org/10.1108/cwis-09-2014-0018.

Full text
Abstract:
Purpose – Cloud Computing (CC) technology is being implemented rapidly in the educational sector to improve learning, research and other administrative processes. As evident from the literature review, most of these implementations are happening in western countries such as the USA and UK, while implementations of CC in developing countries such as India are rare. Moreover, implementing CC technology in the educational sector requires various decisions to be made by the managers of the Information Technology (IT) department, such as selecting a suitable deployment model or a vendor providing the cloud service, in their respective university or institute. The purpose of this paper is to address one such decision. Since different types of CC deployment are available, selecting a suitable one plays a key role, as it might have an impact on the requirements of various stakeholders such as students, teachers and administrative staff (especially the staff members in the IT department), apart from affecting the overall performance of facilities such as a laboratory. Naturally, a proper decision analysing multiple perspectives has to be made while carrying out such strategic initiatives in any educational institute. Design/methodology/approach – A case study methodology was chosen to discuss and demonstrate the above decision problem, which was faced in real time by one of the educational institutes in India offering high-quality management education. The IT managers of this institute were planning to switch over to CC technology for the computer laboratory and had to choose among alternative CC deployment models such as private cloud (PRC), public cloud (PUC), community cloud (COC) and hybrid cloud (HYC) by analysing and comparing them based on various factors and perspectives such as elasticity, availability and scalability. Since multiple factors are involved in making such a strategic decision, the most commonly used Multi-Criteria Decision Making (MCDM) model – namely, the Analytic Hierarchy Process (AHP) – was used as a decision support during the decision-making process. Findings – The team of decision makers, who were planning to implement CC in the case institute, found that PRC is best, as they believed it would provide adequate cost savings apart from providing the security necessary to maintain confidential information such as students' details and grades. Research limitations/implications – The results obtained are based on a single case study. Hence, they cannot be generalized for institutions across the educational sector. However, the decision-making situation and an understanding of its impact on the stakeholders of the educational institute can be common across various educational institutes. Practical implications – Using a real-life case study of an educational institute, this paper presents a strategic decision-making situation that needs to be considered by the IT managers of educational institutes when they decide to switch over to CC technology. Various criteria to be considered during the decision-making process were identified from the literature review and enumerated. These factors would be useful for the IT managers of different educational institutes, who can suitably add or delete decision criteria as per their requirements and the situation at hand.
Moreover, the algorithm of AHP, which was used as a decision support, is presented in a step-by-step manner, which should be beneficial for practitioners wishing to apply it to similar decision-making situations. Originality/value – It is believed that this paper is the first to report on the strategic decision of choosing a deployment model for CC technology, especially in the educational sector. Similarly, this paper also contributes to the field of CC, as it lists the decision criteria to be considered for making the above decision, which has not received adequate attention. Lastly, this paper is also unique in the realm of AHP, because its application to a decision problem in the field of CC, especially in the educational sector, is rarely reported.
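For readers unfamiliar with AHP, the sketch below shows its core computation: deriving priority weights for the four deployment models from a pairwise comparison matrix via the principal eigenvector, plus Saaty's consistency check. The matrix values are illustrative assumptions, not numbers from the paper.

```python
# A minimal AHP sketch: illustrative 4x4 pairwise comparisons over the four
# deployment models (Saaty 1-9 scale); not the case institute's actual data.
import numpy as np

A = np.array([
    [1,   5,   3,   4],
    [1/5, 1,   1/2, 1/3],
    [1/3, 2,   1,   1],
    [1/4, 3,   1,   1],
])

# The principal eigenvector gives the priority weights.
vals, vecs = np.linalg.eig(A)
w = np.abs(np.real(vecs[:, np.argmax(np.real(vals))]))
w = w / w.sum()

# Consistency ratio: CI = (lambda_max - n) / (n - 1); RI = 0.90 for n = 4.
lam = np.real(vals).max()
ci = (lam - len(A)) / (len(A) - 1)
print("weights:", dict(zip(["PRC", "PUC", "COC", "HYC"], w.round(3))))
print("consistency ratio:", round(ci / 0.90, 3))   # should be below 0.10
```

In a full AHP study, such weights would be computed per criterion (elasticity, availability, scalability, etc.) and aggregated up the hierarchy.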
APA, Harvard, Vancouver, ISO, and other styles
32

Zhao, Bozuo, Rui Kong, and Wei Miao. "Information Security of New Media Art Platform of Distributed System Based on Blockchain Technology." Mobile Information Systems 2021 (July 21, 2021): 1–8. http://dx.doi.org/10.1155/2021/6607130.

Full text
Abstract:
New media art is different from ready-made art, installation art, body art, and land art. New media art is a new art subject category with "optical" media and electronic media as its basic language. Because of the continuous development of computer networks, distributed database management systems are becoming more and more popular. This article studies the information security control problem in distributed new media systems. Through comprehensive access control, trusted support, and many-to-many random mutual encryption, it addresses the security requirements of supporting mobile computing in a distributed network environment, including communication, hierarchical grouping, and group key management, and it studies several key security technologies for building secure distributed information systems. The article proposes a behavior-based access control model (ABAC, an operation-based access control model), the architecture of a new Trusted Platform Module, the architecture of multithreaded crypto chips and their service methods, as well as distributed information system and key management solutions. The test statistic approximately obeys the chi-square distribution with 2 degrees of freedom; when the significance level is taken as 5%, the critical chi-square value is 5.99. Different initial values are selected for the chaotic sequence, 200 groups of chaotic sequences with a length of 2000 are selected for the sequence inspection, and the pass rate is 97.5%. The autocorrelation and cross-correlation characteristics of the improved spatiotemporal chaotic sequence are thus still relatively ideal, so the usability of the platform is relatively high.
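As a hedged illustration of the statistical check quoted above, the snippet below runs a chi-square test with 2 degrees of freedom on a chaotic sequence; the logistic map is an assumed stand-in for the paper's spatiotemporal chaotic generator.

```python
# Illustrative chi-square check of a chaotic sequence (logistic map as an
# assumed stand-in; the paper uses an improved spatiotemporal chaotic system).
import numpy as np
from scipy.stats import chisquare

def logistic_sequence(x0, n, r=4.0):
    x, xs = x0, []
    for _ in range(n):
        x = r * x * (1 - x)
        xs.append(x)
    return np.array(xs)

seq = logistic_sequence(x0=0.123456, n=2000)
# Bin edges chosen to be equiprobable under the r = 4 logistic map's
# invariant (arcsine) distribution, giving 3 cells -> 2 degrees of freedom.
observed, _ = np.histogram(seq, bins=[0.0, 0.25, 0.75, 1.0])
stat, p = chisquare(observed)
print(f"chi-square = {stat:.2f}, pass = {stat < 5.99}")  # 5.99: 5% critical value, df = 2
```

Repeating this for many initial values, as the abstract describes for its 200 groups of length 2000, yields a pass rate for the generator.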
APA, Harvard, Vancouver, ISO, and other styles
33

Ali, Omer, Mohamad Khairi Ishak, and Muhammad Kamran Liaquat Bhatti. "Emerging IoT domains, current standings and open research challenges: a review." PeerJ Computer Science 7 (August 16, 2021): e659. http://dx.doi.org/10.7717/peerj-cs.659.

Full text
Abstract:
Over the last decade, the Internet of Things (IoT) domain has grown dramatically, from ultra-low-power hardware design to cloud-based solutions, and now, with the rise of 5G technology, a new horizon for edge computing on IoT devices will be introduced. A wide range of communication technologies has steadily evolved in recent years, representing a diverse range of domain areas and communication specifications. Because of the heterogeneity of technology and interconnectivity, the true realisation of the IoT ecosystem is currently hampered by multiple dynamic integration challenges. In this context, several emerging IoT domains necessitate a complete re-modeling, design, and standardisation from the ground up in order to achieve seamless IoT ecosystem integration. The Internet of Nano-Things (IoNT), Internet of Space-Things (IoST), Internet of Underwater-Things (IoUT) and Social Internet of Things (SIoT) are investigated in this paper with a broad future scope based on their integration and ability to source other IoT domains by highlighting their application domains, state-of-the-art research, and open challenges. To the best of our knowledge, there is little or no information on the current state of these ecosystems, which is the motivating factor behind this article. Finally, the paper summarises the integration of these ecosystems with current IoT domains and suggests future directions for overcoming the challenges.
APA, Harvard, Vancouver, ISO, and other styles
34

Rezaei Aghdam, Atae, Jason Watson, Cynthia Cliff, and Shah Jahan Miah. "Improving the Theoretical Understanding Toward Patient-Driven Health Care Innovation Through Online Value Cocreation: Systematic Review." Journal of Medical Internet Research 22, no. 4 (April 24, 2020): e16324. http://dx.doi.org/10.2196/16324.

Full text
Abstract:
Background Patient participation in the health care domain has surged dramatically through the availability of digital health platforms and online health communities (OHCs). Such patient-driven service innovation has both potential and challenges for health care organizations. Over the last 5 years, articles have surfaced that focus on value cocreation in health care services and the importance of engaging patients and other actors in service delivery. However, a theoretical understanding of how to use OHCs for this purpose is still underdeveloped within the health care service ecosystem. Objective This paper aimed to introduce a theoretical discussion for better understanding of the potential of OHCs for health care organizations, in particular, for patient empowerment. Methods This literature review study involved a comprehensive search using 12 electronic databases (EMBASE, PsycINFO, Web of Science, Scopus, ScienceDirect, Medical Literature Analysis and Retrieval System Online, PubMed, Elton B Stephens Co [academic], Cumulative Index of Nursing and Allied Health Literature, Accelerated Information Sharing for Law Enforcement, Association for Computing Machinery, and Google Scholar) from 2013 to 2019. A total of 1388 studies were identified from the database search. After removing duplicates and applying inclusion criteria, we thematically analyzed 56 articles using the Braun and Clarke thematic analysis approach. Results We identified a list of 5 salient themes: communication extension, improved health literacy for patients and health care organizations, communication transparency with patients, informational and social support for patients, and patient empowerment in self-management. The most frequent theme was communication extension, which covers 39% (22/56) of the literature. This theme reported that an extension of communication between patients, caregivers, and physicians and organizations led to new opportunities to create value with minimal time and cost restrictions. Improved health literacy and communication transparency with patients were the second and third most frequent themes, respectively, covering 26% (15/56) and 25% (14/56) of the literature, respectively. The frequency of these themes indicated that the use of OHCs to generate new knowledge from patients’ interactions helped health care organizations to customize treatment plans and establish transparent and effective communication between health care organizations and patients. Furthermore, of the 56 studies, 13 (23%) and 10 (17%) studies contended the opportunity of using OHCs in terms of informational and emotional support and empowering patients in their self-management of diseases. Conclusions This review enables better understanding of the current state of the art of the online value cocreation and its potential for health care organizations. This study found that the opportunities for health care organizations through enhancement of patient participation and their cocreation of value in digital health platforms have been rapidly increasing. The identified gaps and opportunities in this study would identify avenues for future directions in modernized and more effective value-oriented health care informatics research.
APA, Harvard, Vancouver, ISO, and other styles
35

Ceia, Carlos. "Promoting a Teacher Education Research-Oriented Curriculum for Initial Teacher Training in English as a Foreign Language." e-TEALS 7, no. 1 (December 1, 2016): 1–10. http://dx.doi.org/10.1515/eteals-2016-0005.

Full text
Abstract:
Research-oriented programs related to pre-service teacher education are practically non-existent in many countries. Since in Portugal we now have a stable legal system for initial teacher training, how can we help these countries to respond to their teacher training needs and accomplish these same standards? How can we create an international program at MA level that could serve such an objective? What are the research priorities for teachers in primary and secondary education? I will argue for a new general research policy using small-scale research projects in foreign language teaching (FLT), which illustrates a turning point in advanced research in foreign languages teacher training. Presently, researchers no longer narrow their inquiries to linguistic questions or school and student-centered actions. Instead, they focus on a range of issues such as teacher-centered actions, beliefs and policies, and aspects of FLT such as literacy education, special educational needs or methods for teaching gifted students. Despite a lack of funding at all levels, many research projects in teacher education have been undertaken, and new areas have been explored, such as didactic transposition, literary and information literacies, intercultural learning, corpora in FLT, new information and communication technologies in FLT, interlingual inferencing, national standards for foreign language education, FLT for specific purposes, digital narratives in education, CLIL, assessment, and language learning behaviors. This small sample of the many areas covered proves that advanced research in teacher education can also be very useful in promoting the growing interest in further internationalization in other sciences (beyond human and social areas) traditionally linked to politics, business and industry (computing, chemistry, biology, medicine, etc.), something that can only be attained by focusing on multilingualism, multi-literacy and lifelong learning.
APA, Harvard, Vancouver, ISO, and other styles
36

Floridi, Luciano. "Web 2.0 vs. the Semantic Web: A Philosophical Assessment." Episteme 6, no. 1 (February 2009): 25–37. http://dx.doi.org/10.3366/e174236000800052x.

Full text
Abstract:
The paper develops some of the conclusions, reached in Floridi (2007), concerning the future developments of Information and Communication Technologies (ICTs) and their impact on our lives. The two main theses supported in that article were that, as the information society develops, the threshold between online and offline is becoming increasingly blurred, and that once there won't be any significant difference, we shall gradually re-conceptualise ourselves not as cyborgs but rather as inforgs, i.e. socially connected, informational organisms. In this paper, I look at the development of the so-called Semantic Web and Web 2.0 from this perspective and try to forecast their future. Regarding the Semantic Web, I argue that it is a clear and well-defined project, which, despite some authoritative views to the contrary, is not a promising reality and will probably fail in the same way AI has failed in the past. Regarding Web 2.0, I argue that, although it is a rather ill-defined project, which lacks a clear explanation of its nature and scope, it does have the potentiality of becoming a success (and indeed it is already, as part of the new phenomenon of Cloud Computing) because it leverages the only semantic engines available so far in nature, us. I conclude by suggesting what other changes might be expected in the future of our digital environment.
APA, Harvard, Vancouver, ISO, and other styles
37

Albert, Sylvie, and Don Flournoy. "Think Global, Act Local." International Journal of Sociotechnology and Knowledge Development 2, no. 1 (January 2010): 59–79. http://dx.doi.org/10.4018/jskd.2010100804.

Full text
Abstract:
Being able to connect high-speed computing and other information technologies into broadband communication networks presents local communities with some of their best chances for renewal. Such technologies are now widely perceived to be not just a nice amenity among corporations and such non-profit organizations as universities but a social and economic necessity for communities struggling to find their place in a rapidly changing world. Today, citizens want and expect their local communities to be “wired” for broadband digital transactions, whether for family, business, education or leisure. Such networks have become a necessity for attracting and retaining the new “knowledge workforce” that will be key to transforming communities into digital societies where people will want to live and work. Since the Internet is a global phenomenon, some of the challenges of globalization for local communities and regions are introduced in this article and suggestions for turning those challenges into opportunities are offered. To attain maximum benefit from the new wired and wireless networks, local strategies must be developed for its implementation and applications must be chosen with some sensitivity to local needs. New Growth theory is used to show why communities must plan their development agenda, and case studies of the Intelligent Community Forum are included to show how strategically used ICTs are allowing local communities to be contributors in global markets.
APA, Harvard, Vancouver, ISO, and other styles
38

Wang, Yiming, and Xidan Gong. "Optimization of Data Processing System for Exercise and Fitness Process Based on Internet of Things." Wireless Communications and Mobile Computing 2021 (July 6, 2021): 1–11. http://dx.doi.org/10.1155/2021/7132301.

Full text
Abstract:
In the digital network era, people have higher requirements for physical fitness. In the future, physical fitness will require not only good fitness equipment and a good fitness environment but also more convenient and intelligent health management, service guidance, social entertainment, and other refined fitness services. The innovation of sports and fitness equipment for the digital network era will certainly depend on the development of information technology and network technology. Based on cutting-edge Internet of Things (IoT) technology, this thesis focuses on the development and application of a new generation of digital fitness equipment adapted to future development, advocating the new concept of seamless integration of fitness exercise and information services through human-oriented systematic design thinking, and providing implementable solutions to make public fitness scientific, convenient, and part of everyday life. Guided by the goal of fully meeting the diversified fitness needs of the fitness crowd with modern science and technology, especially IoT technology, the thesis develops design and application research on IoT digital fitness equipment, using a variety of research methods to explore the functional design and application of IoT fitness equipment; the goal is to create a more intelligent and three-dimensional IoT fitness service model for the future. Through applied research on intelligent devices in IoT fitness equipment, the functions of identity identification, environment perception, and data transmission of IoT fitness equipment can be realized faster. Intelligent devices can become the interaction channel between fitness service personnel, fitness equipment, and fitness users, and they also reduce the development cost of IoT fitness equipment. The construction of an IoT fitness cloud service platform and data management system integrates IoT, cloud computing, mobile communication, and other technologies to make the supply of IoT fitness services remote, real-time, and diversified. While providing convenient and value-added fitness services for fitness people, it also brings sustainable development space for the health service industry.
APA, Harvard, Vancouver, ISO, and other styles
39

Kelkar, Bhagyashri A., Sunil F. Rodd, and Umakant P. Kulkarni. "A Novel Parameter-Light Subspace Clustering Technique Based on Single Linkage Method." Journal of Information & Knowledge Management 18, no. 01 (March 2019): 1950007. http://dx.doi.org/10.1142/s0219649219500072.

Full text
Abstract:
Subspace clustering is a challenging high-dimensional data mining task. Several approaches have been proposed in the literature to identify clusters in subspaces; however, their performance and quality are highly affected by input parameters. Little research has been done so far on identifying proper parameter values automatically. Other observed drawbacks are the requirement of multiple database scans, resulting in an increased demand for computing resources, and the generation of many redundant clusters. Here, we propose a parameter-light subspace clustering method for numerical data, hereafter referred to as CLUSLINK. The algorithm is based on the single linkage clustering method and works in a bottom-up, greedy fashion. The only input the user has to provide is how coarse or fine the resulting clusters should be, and if it is not given, the algorithm operates with default values. The empirical results obtained over synthetic and real benchmark datasets show significant improvement in terms of accuracy and execution time.
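As background, the snippet below shows plain single-linkage clustering restricted to a candidate subspace using SciPy. CLUSLINK's subspace search and parameter-light defaults are not reproduced here, so treat this as an assumed illustration of the base method only.

```python
# Single-linkage clustering on a 2-D subspace of 5-D data (SciPy); an
# illustration of the base method CLUSLINK builds on, not CLUSLINK itself.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)
# Two clusters that only separate in the first two dimensions (a "subspace").
X = np.vstack([
    rng.normal(0, 0.3, size=(50, 5)),
    rng.normal([3, 3, 0, 0, 0], 0.3, size=(50, 5)),
])

Z = linkage(X[:, :2], method="single")            # single linkage on the subspace
labels = fcluster(Z, t=2, criterion="maxclust")   # cut tree into two clusters
print("cluster sizes:", np.bincount(labels)[1:])
```

Running single linkage on the full 5-D space would blur these clusters, which is exactly why subspace-aware methods are needed.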
APA, Harvard, Vancouver, ISO, and other styles
40

Chen, Qian Matteo, Alberto Finzi, Toni Mancini, Igor Melatti, and Enrico Tronci. "MILP, Pseudo-Boolean, and OMT Solvers for Optimal Fault-Tolerant Placements of Relay Nodes in Mission Critical Wireless Networks*." Fundamenta Informaticae 174, no. 3-4 (September 28, 2020): 229–58. http://dx.doi.org/10.3233/fi-2020-1941.

Full text
Abstract:
In critical infrastructures like airports, much care has to be devoted to protecting radio communication networks from external electromagnetic interference. Protection of such mission-critical radio communication networks is usually tackled by exploiting radiogoniometers: at least three suitably deployed radiogoniometers, and a gateway gathering information from them, make it possible to monitor and localise sources of electromagnetic emissions that are not supposed to be present in the monitored area. Typically, radiogoniometers are connected to the gateway through relay nodes. As a result, some degree of fault-tolerance for the network of relay nodes is essential in order to offer reliable monitoring. On the other hand, deployment of relay nodes is typically quite expensive. As a result, we have two conflicting requirements: minimise costs while guaranteeing a given fault-tolerance. In this paper, we address the problem of computing a deployment for relay nodes that minimises the overall cost while at the same time guaranteeing proper working of the network even when some of the relay nodes (up to a given maximum number) become faulty (fault-tolerance). We show that, by means of a computation-intensive pre-processing on a HPC infrastructure, the above optimisation problem can be encoded as a 0/1 Linear Program, becoming suitable to be approached with standard Artificial Intelligence reasoners like MILP, PB-SAT, and SMT/OMT solvers. Our problem formulation enables us to present experimental results comparing the performance of these three solving technologies on a real case study of a relay node network deployment in areas of the Leonardo da Vinci Airport in Rome, Italy.
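The flavour of such a 0/1 Linear Program can be conveyed with a toy coverage-style model: pick minimum-cost relay sites so that every radiogoniometer keeps k + 1 usable relays. The PuLP sketch below is an assumed simplification; the paper's actual encoding captures full network connectivity, not mere coverage.

```python
# A toy 0/1 LP for fault-tolerant relay placement (PuLP + bundled CBC solver).
# Coverage constraints stand in for the paper's connectivity encoding.
import pulp

sites = ["s1", "s2", "s3", "s4"]
cost = {"s1": 4, "s2": 3, "s3": 5, "s4": 2}
# covers[g] = candidate relay sites that can serve radiogoniometer g.
covers = {"g1": ["s1", "s2"], "g2": ["s2", "s3", "s4"], "g3": ["s1", "s3", "s4"]}
k = 1   # tolerate up to k relay faults -> require k + 1 covering relays each

prob = pulp.LpProblem("relay_placement", pulp.LpMinimize)
x = {s: pulp.LpVariable(s, cat="Binary") for s in sites}
prob += pulp.lpSum(cost[s] * x[s] for s in sites)         # minimise total cost
for g, ss in covers.items():
    prob += pulp.lpSum(x[s] for s in ss) >= k + 1         # fault-tolerant coverage

prob.solve(pulp.PULP_CBC_CMD(msg=False))
print("chosen sites:", [s for s in sites if x[s].value() == 1])
```

The same binary model transfers directly to PB-SAT (the constraints are pseudo-Boolean) and to OMT solvers with a cost objective, which is what makes the three-way solver comparison possible.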
APA, Harvard, Vancouver, ISO, and other styles
41

Schulte, Stephanie J. "Information Professional Job Advertisements in the U.K. Indicate Professional Experience is the Most Required Skill." Evidence Based Library and Information Practice 4, no. 2 (June 14, 2009): 158. http://dx.doi.org/10.18438/b8ts51.

Full text
Abstract:
A Review of: Orme, Verity. “You will be…: A Study of Job Advertisements to Determine Employers’ Requirements for LIS Professionals in the UK in 2007.” Library Review 57.8 (2008): 619-33. Objective – To determine what skills employers in the United Kingdom (U.K.) want from information professionals as revealed through their job advertisements. Design – Content analysis, combining elements of both quantitative and qualitative content analysis; Orme describes it as “a descriptive non-experimental approach of content analysis” (62). Setting – Data for this study were obtained from job advertisements in the Chartered Institute of Library and Information Professionals’ (CILIP) Library and Information Gazette published from June 2006 through May 2007. Subjects – A total of 180 job advertisements. Methods – Job advertisements were selected using a random number generator, purposely selecting only 15 advertisements per first issue of each month of the Library and Information Gazette (published every two weeks). The author created an initial list of skills required by information professionals from sources such as prior studies that examined this topic, the Library and Information Science Abstracts (LISA) database thesaurus, and personal knowledge. Synonyms for the skills were then added to the coding framework. Skills were coded only when the employer plainly stated that the employee would have a certain skill or attribute, that a skill was being sought, or that a particular skill was essential or desirable. Skills that were stated in synonymous ways within the same advertisement were counted as two incidences of that skill. Duties for the position were not counted unless they were listed as a specific skill. Data were all coded by hand and then tallied. The author claims to have triangulated the results of this study with the literature review, the synonym ring used to prepare the coding framework, and a few notable studies. Main Results – A wide variety of job titles was observed, including “Copyright Clearance Officer,” “Electronic Resources and Training Librarian,” and “Assistant Information Advisor.” Employers represented private, school, and university libraries, as well as legal firms and prisons. Fifty-nine skills were found a total of 1,021 times across all of the advertisements; each advertisement averaged 5.67 requirements. These skills were classified in four categories: professional, generic, personal, and experience. The most highly noted requirement was professional experience, noted 129 times, followed by interpersonal/communication skills (94), general computing skills (63), enthusiasm (48), and team-working skills (39). Professional skills were noted just slightly more than generic and personal skills in the top twenty skills found. Other professional skills that were highly noted were customer service skills (34), chartership (30), cataloguing/classification/metadata skills (25), and information retrieval skills (20). Some notable skills that occurred rarely included Web design and development skills (6), application of information technology in the library (5), and knowledge management skills (3). Conclusion – Professional, generic, and personal qualities were all important to employers in the U.K.; however, without experience, possessing these qualities may not be enough for new professionals in the field.
APA, Harvard, Vancouver, ISO, and other styles
42

Haliti, Nusret, Arbana Kadriu, and Mensur Jusufi. "An approach for speed limit determination for vehicle tracking in case of GID ambiguity and lack of information in navigation maps." International Journal of Pervasive Computing and Communications 13, no. 3 (September 4, 2017): 252–63. http://dx.doi.org/10.1108/ijpcc-02-2017-0020.

Full text
Abstract:
Purpose – The research presented in this paper concerns an algorithm for speed limit determination in cases where there is ambiguity in determining the correct road data for a tracked vehicle. The algorithm resolves the glitch that occurs when emitted global positioning system (GPS) data are ambiguous regarding roads that are very near to each other. Furthermore, we give a solution for other speed-limit-related difficulties involving the accuracy of emitted data and the lack of information in navigation maps. Design/methodology/approach – Our vehicle tracking system parses all GPS data from different vehicles into a single centralized database. It uses balancers and parsers to parse these data; balancers use algorithms like round-robin to choose between different parsers. Information gained by the GPS unit is parsed and then sent to a central server at regular intervals. For the gained data, we analyze the speed limit problem when tracking vehicles, analyze the challenges linked with the speed limit problem, define a solution for the drawbacks discussed above, and measure driving performance. Findings – We have developed a fully functioning tracking system, which uses the above-described algorithm to track a few hundred vehicles, making approximately 1,300,000 requests per day and resulting in more than 4,000,000 tracking records gained in six months. The system monitors the motion of different vehicles using the gained GPS data at first hand. This monitoring is exposed through web and mobile applications for third-party actors, and it covers not only the raw data but also new metrics derived from them. Originality/value – To our knowledge, there is no similar algorithm/technology that can help in the case of geographical identifier (GID) ambiguity. This research presents a solution to a real problem that we faced and which could not be answered by any of the current algorithms and technologies regarding the speed limit. Therefore, we consider this paper highly original, bringing value to the field of pervasive computing and machine-to-machine communication.
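One plausible core of such a disambiguation step is sketched below: when a GPS fix matches several nearby roads, score each candidate by distance to the fix plus a penalty for heading mismatch, and take the speed limit of the best-scoring road. The data structures, field names, and weighting are illustrative assumptions, not the paper's implementation.

```python
# A hedged sketch of GID disambiguation: pick the candidate road whose
# position and bearing best match the GPS fix (illustrative scoring).
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def pick_road(fix, candidate_roads):
    """Score candidates by distance to the fix, penalising heading mismatch."""
    def score(road):
        d = haversine_m(fix["lat"], fix["lon"], road["lat"], road["lon"])
        mismatch = abs(fix["heading"] - road["bearing"]) % 360
        mismatch = min(mismatch, 360 - mismatch)      # angular difference, 0-180
        return d + 2.0 * mismatch                     # illustrative weighting
    return min(candidate_roads, key=score)

fix = {"lat": 41.3300, "lon": 19.8200, "heading": 90}
roads = [
    {"id": "A1", "lat": 41.3301, "lon": 19.8202, "bearing": 92, "limit": 110},
    {"id": "R2", "lat": 41.3302, "lon": 19.8199, "bearing": 0,  "limit": 50},
]
print("speed limit:", pick_road(fix, roads)["limit"], "km/h")
```

Here the heading penalty is what separates a motorway from a parallel service road that is only metres away, which is exactly the near-road ambiguity the abstract describes.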
APA, Harvard, Vancouver, ISO, and other styles
43

Singh Negi, Dheeraj. "Open source software using New GEN LIB: a case study of international management institutue Bhubaneswar." Library Hi Tech News 31, no. 9 (October 28, 2014): 9–10. http://dx.doi.org/10.1108/lhtn-07-2014-0056.

Full text
Abstract:
Purpose – The purpose of this study was to develop and update the database of books at the International Management Institute Bhubaneswar. The study presents the status of automation at the institute: a properly computerized library helps its users with quick services. Library automation refers to the mechanization of library housekeeping operations, predominantly by computerization. The institute implemented an automated system using NewGenLib (NGL), an integrated open source library software, to carry out the functions of the circulation section more effectively, to provide various search options for checking the availability of books in the library, and to generate lists of books due from a particular member along with overdue charges. NGL is an integrated software system with the modules required for small to very large libraries. Being open source, any library that wants to automate its housekeeping operations can make use of this software. Design/methodology/approach – Open source is both a software development model and a software distribution model. In this model, the source code of programs is made freely available with the software itself, so that anyone can see, change and distribute it provided they abide by the accompanying license. In this sense, open source is similar to peer review, which is used to strengthen the progress of scholarly communication. Open source software differs from closed source or proprietary software, which may only be obtained by some form of payment, either by purchase or by leasing; the primary difference between the two is the freedom to modify the software. An open system is a design philosophy antithetical to solutions designed to be proprietary. The idea behind it is that institutions, such as libraries, can build a combination of components and deliver services that include several vendors' offerings. Thus, for instance, a library might use an integrated library system from one of the major vendors in combination with an open source product developed by another library or by itself, to better meet its internal or users' requirements. Findings – NGL free software is constantly being updated, changed and customized to meet the library's needs. While all of this sounds like the win-win solution for your library, there are still pitfalls and hurdles to overcome. Hopefully, this article provides some introductory information on how to wean your library off traditional computing products and dive into the pool of open source resources available today. Libraries in developing countries are able to support electronic access, digital libraries and resource sharing because they are able to use Open Source Software (OSS); even libraries in well-developed countries are becoming more inclined toward OSS to improve their services. Originality/value – To develop and update the database of books and other online/printed resources of the International Management Institute Bhubaneswar; to implement an automated system using the NGL integrated open source software; to carry out the charging and discharging functions of the circulation section; and to provide various search options for checking the availability of books in the library.
APA, Harvard, Vancouver, ISO, and other styles
44

Kang, Ho-Seok, Sung-Ryul Kim, and Pankoo Kim. "Traffic deflection method for dos attack defense using a location-based routing protocol in the sensor network." Computer Science and Information Systems 10, no. 2 (2013): 685–701. http://dx.doi.org/10.2298/csis120914029k.

Full text
Abstract:
As the ubiquitous computing environment gains more attention and development, WSN (Wireless Sensor Network) is becoming popular as well. In particular, the development of wireless communication and sensor equipment has greatly contributed to the popularization of WSN. On the other hand, this development and distribution make the safety and security of WSN attract much attention. The DoS (Denial of Service) attack, which grows more sophisticated and broadens its domain into various service fields, may have negative effects on WSN, making it vulnerable. Since a WSN collects information through sensors that are already deployed, it is difficult to have their energy recharged. When a WSN is under a DoS attack, sensor nodes consume a lot of energy, bringing about a fatal result for the sensor network. In this paper, we propose a method to efficiently defend against DoS attacks by modifying routing protocols in the WSN. This method uses a location-based routing protocol that is simple and easy to implement. In a WSN environment where the location-based routing protocol is implemented, the method disperses the concentration of DoS attack traffic by using a traffic deflection technique and blocks it out before it arrives at the target destinations. To find the number of traffic redirection nodes proper for this method, we have performed a few experiments, through which the number of such nodes was optimized.
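A toy reading of the deflection idea appears below: ordinary traffic is forwarded greedily toward the destination (standard location-based routing), while suspected attack traffic is detoured through non-optimal neighbours so it disperses before reaching the target. The node model and selection rule are assumptions for illustration, not the paper's protocol.

```python
# A toy sketch of traffic deflection on top of greedy geographic routing.
# Thresholds, topology, and the suspicion flag are illustrative assumptions.
import math
import random

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def next_hop(neighbors, dest, suspicious):
    if not suspicious:
        # Normal case: greedy forwarding to the neighbour closest to dest.
        return min(neighbors, key=lambda n: dist(n, dest))
    # Deflection: spread suspected DoS traffic over non-optimal neighbours.
    detours = sorted(neighbors, key=lambda n: dist(n, dest))[1:]
    return random.choice(detours) if detours else None   # drop if no detour

neighbors = [(1.0, 0.1), (0.9, 0.9), (0.1, 1.0)]
dest = (5.0, 0.0)
print("normal hop:   ", next_hop(neighbors, dest, suspicious=False))
print("deflected hop:", next_hop(neighbors, dest, suspicious=True))
```

The point of the detour is that attack flows burn energy away from the victim's neighbourhood and can be discarded en route, instead of converging on the target node.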
APA, Harvard, Vancouver, ISO, and other styles
45

Suplatov, Dmitry, Nina Popova, Sergey Zhumatiy, Vladimir Voevodin, and Vytas Švedas. "Parallel workflow manager for non-parallel bioinformatic applications to solve large-scale biological problems on a supercomputer." Journal of Bioinformatics and Computational Biology 14, no. 02 (April 2016): 1641008. http://dx.doi.org/10.1142/s0219720016410080.

Full text
Abstract:
Rapid expansion of online resources providing access to genomic, structural, and functional information associated with biological macromolecules opens an opportunity to gain a deeper understanding of the mechanisms of biological processes through the systematic analysis of large datasets. This, however, requires novel strategies to optimally utilize computer processing power. Some methods in bioinformatics and molecular modeling require extensive computational resources. Other algorithms have fast implementations which take at most several hours to analyze a common input on a modern desktop station; however, due to multiple invocations for a large number of subtasks, the full task requires significant computing power. Therefore, an efficient computational solution to large-scale biological problems requires both a wise parallel implementation of resource-hungry methods and a smart workflow to manage multiple invocations of relatively fast algorithms. In this work, a new software tool, mpiWrapper, has been developed to accommodate non-parallel implementations of scientific algorithms within the parallel supercomputing environment. The Message Passing Interface has been implemented to exchange information between nodes. Two specialized threads — one for task management and communication, and another for subtask execution — are invoked on each processing unit to avoid deadlock while using blocking calls to MPI. mpiWrapper can be used to launch all conventional Linux applications without the need to modify their original source codes and supports resubmission of subtasks on node failure. We show that this approach can be used to process huge amounts of biological data efficiently by running non-parallel programs in parallel mode on a supercomputer. The C++ source code and documentation are available from http://biokinet.belozersky.msu.ru/mpiWrapper .
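The master/worker pattern the abstract describes can be sketched as follows (mpi4py is used for brevity, whereas the actual tool is C++; the command strings are placeholders). One rank hands out subtasks and gathers results; the other ranks execute unmodified shell commands, which is the key idea of wrapping non-parallel programs.

```python
# A minimal master/worker sketch in the spirit of mpiWrapper (mpi4py).
# Run with: mpiexec -n 4 python worker_pool.py
from mpi4py import MPI
import subprocess

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

if rank == 0:                                     # management side
    tasks = [f"echo subtask {i}" for i in range(10)]   # placeholder commands
    results, active = [], 0
    for w in range(1, size):                      # seed one task per worker
        if tasks:
            comm.send(tasks.pop(), dest=w)
            active += 1
        else:
            comm.send(None, dest=w)               # nothing to do: release worker
    while active:
        status = MPI.Status()
        results.append(comm.recv(source=MPI.ANY_SOURCE, status=status))
        if tasks:                                 # feed the idle worker again
            comm.send(tasks.pop(), dest=status.Get_source())
        else:
            comm.send(None, dest=status.Get_source())
            active -= 1
    print(f"collected {len(results)} results")
else:                                             # execution side
    while True:
        cmd = comm.recv(source=0)
        if cmd is None:                           # shutdown signal
            break
        out = subprocess.run(cmd, shell=True, capture_output=True, text=True)
        comm.send(out.stdout.strip(), dest=0)
```

Separating management from execution this way keeps the blocking MPI calls deadlock-free, mirroring the two-thread design the abstract mentions.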
APA, Harvard, Vancouver, ISO, and other styles
46

Miranda, Enrique, and Jordi Suñé. "Memristors for Neuromorphic Circuits and Artificial Intelligence Applications." Materials 13, no. 4 (February 20, 2020): 938. http://dx.doi.org/10.3390/ma13040938.

Full text
Abstract:
Artificial Intelligence has found many applications in the last decade due to increased computing power. Artificial Neural Networks are inspired by the brain structure and consist of the interconnection of artificial neurons through artificial synapses in so-called Deep Neural Networks (DNNs). Training these systems requires huge amounts of data and, after the network is trained, it can recognize unforeseen data and provide useful information. As far as training is concerned, we can distinguish between supervised and unsupervised learning. The former requires labelled data and is based on the iterative minimization of the output error using the stochastic gradient descent method, followed by the recalculation of the strength of the synaptic connections (weights) with the backpropagation algorithm. Unsupervised learning, on the other hand, does not require data labeling and is not based on explicit output error minimization. Conventional ANNs can function with supervised learning algorithms (perceptrons, multi-layer perceptrons, convolutional networks, etc.) but also with unsupervised learning rules (Kohonen networks, self-organizing maps, etc.). Besides, another type of neural network is the so-called Spiking Neural Network (SNN), in which learning takes place through the superposition of voltage spikes launched by the neurons. Their behavior is much closer to the brain's functioning mechanisms, and they can be used with supervised and unsupervised learning rules. Since learning and inference are based on short voltage spikes, energy efficiency improves substantially. Up to this moment, all these ANNs (spiking and conventional) have been implemented as software tools running on conventional computing units based on the von Neumann architecture. However, this approach reaches important limits due to the required computing power, physical size and energy consumption. This is particularly true for applications at the edge of the internet. Thus, there is an increasing interest in developing AI tools directly implemented in hardware for this type of application. The first hardware demonstrations have been based on Complementary Metal-Oxide-Semiconductor (CMOS) circuits and specific communication protocols. However, to further increase training speed and energy efficiency while reducing system size, the combination of CMOS neuron circuits with memristor synapses is now being explored. It has also been pointed out that the short-term non-volatility of some memristors may even allow fabricating purely memristive ANNs. The memristor is a new device (first demonstrated in solid state in 2008) which behaves as a resistor with memory and which has been shown to have potentiation and depression properties similar to those of biological synapses. In this Special Issue, we explore the state of the art of neuromorphic circuits implementing neural networks with memristors for AI applications.
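As a concrete reminder of the supervised rule described above, this toy numpy example minimises the output error of a two-layer network with gradient-descent updates and backpropagated gradients (full-batch here, rather than strictly stochastic, for brevity). In a memristive implementation, the weight matrices would be realised by crossbar conductances; the example itself is generic, not code from the Special Issue.

```python
# Toy two-layer network trained on XOR: output-error minimisation via
# gradient descent with backpropagation (batch updates for brevity).
import numpy as np

rng = np.random.default_rng(1)
W1, W2 = rng.normal(0, 0.5, (2, 4)), rng.normal(0, 0.5, (4, 1))
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], float)
y = np.array([[0], [1], [1], [0]], float)          # XOR targets
sigmoid = lambda z: 1 / (1 + np.exp(-z))

for _ in range(10000):
    h = sigmoid(X @ W1)                            # forward pass, hidden layer
    out = sigmoid(h @ W2)                          # forward pass, output layer
    delta2 = (out - y) * out * (1 - out)           # output-layer error signal
    delta1 = (delta2 @ W2.T) * h * (1 - h)         # backpropagated to hidden layer
    W2 -= 1.0 * h.T @ delta2                       # gradient-descent weight updates
    W1 -= 1.0 * X.T @ delta1

print(out.round(2).ravel())                        # should approach [0, 1, 1, 0]
```

The two weight-update lines are precisely the operations that memristor crossbars aim to perform in place, in analog, and in parallel.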
APA, Harvard, Vancouver, ISO, and other styles
47

Devika, G., D. Ramesh, and Asha Gowda Karegowda. "Analysis of Binary and Discrete Grey Wolf Optimization Algorithms Applied for Enhancing Performance of Energy Efficient Low Energy Adaptive Clustering Hierarchy." Journal of Computational and Theoretical Nanoscience 17, no. 9 (July 1, 2020): 3850–59. http://dx.doi.org/10.1166/jctn.2020.8974.

Full text
Abstract:
Wireless sensor networks (WSN) are a product of advances in information technology and the requirement for large-scale communication infrastructures. Routing of data via selected paths is a critical task in WSN, as the process must be carried out under resource-constrained conditions. This route identification problem can be better handled by employing an appropriate heuristic bio-inspired computational intelligence optimization method. The most frequently applied hierarchical routing algorithm is the Low Energy Adaptive Clustering Hierarchy (LEACH) algorithm, which has limitations in identifying energy-efficient inter- and intra-route communication, identifying the number of cluster heads (CH), identifying an eminent node to communicate with the CH and Base Station (BS), selecting the CH, computing residual energy levels, etc. Hence, researchers are focusing on boosting the capability of the LEACH clustering algorithm by applying heuristic bio-inspired computational intelligence optimization methods. The proposed work is an attempt in this direction, applying the heuristic bio-inspired Grey Wolf Optimization algorithm (GWO) to improve the performance of the LEACH algorithm. In this paper, the focus is on increasing the overall network lifetime by adopting two modifications to the conventional algorithms: (i) selection of a vice cluster head (VCH) in addition to the CH (the VCH node replaces the CH when the CH goes down for unexpected reasons, as sensor nodes work in critical and uninterruptible environments) and (ii) selection of intra- and inter-relay nodes (the intra-relay node enhances the lifespan during CH data gathering, and the inter-relay node further enhances the lifespan of the CH by acting as a mediator between the CH and BS). The Spyder-py3 tool is used to simulate the proposed algorithms, LEACH Binary Grey Wolf search based Optimization (LEACH-BGWO) and LEACH Discrete Grey Wolf search based Optimization (LEACH-DGWO). The proposed work is compared with the cluster-based LEACH algorithm, the chain-based power-efficient gathering in sensor information systems (PEGASIS) algorithm, bio-inspired GWO, and Genetic Algorithm Data Aggregation (GADA) LEACH protocols. The results show that both proposed algorithms outperformed the other conventional algorithms in terms of prolonged network lifespan and increased throughput. Among the proposed algorithms, LEACH-BGWO outperformed LEACH-DGWO.
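A hedged sketch of the CH/VCH idea follows: rank a cluster's nodes by residual energy (used here as a stand-in for the paper's grey-wolf fitness function) and keep the runner-up as a backup head that takes over on CH failure. The node model is an illustrative assumption.

```python
# Toy CH / vice-CH election: residual energy as a stand-in fitness.
# The paper's actual selection is driven by Grey Wolf Optimization.
import random

random.seed(42)
nodes = [{"id": i, "energy": random.uniform(0.2, 1.0)} for i in range(20)]

def elect_heads(cluster):
    """Return (cluster head, vice cluster head) by descending residual energy."""
    ranked = sorted(cluster, key=lambda n: n["energy"], reverse=True)
    return ranked[0], ranked[1]

ch, vch = elect_heads(nodes)
print("CH:", ch["id"], "| VCH (takes over on CH failure):", vch["id"])
```

Electing the VCH at the same time as the CH avoids a fresh cluster-wide election when the head fails, which is where the lifetime gain described in the abstract comes from.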
APA, Harvard, Vancouver, ISO, and other styles
48

Ciobanu, Gabriel. "Writing as a Form of Freedom and Happiness Celebrating the 60th birthday of Gheorghe Păun." International Journal of Computers Communications & Control 5, no. 5 (December 1, 2010): 613. http://dx.doi.org/10.15837/ijccc.2010.5.2215.

Full text
Abstract:
Essentially writing is a form of thinking on paper, and a way of learning. According to Winston Churchill, writing a book is an adventure: "To begin with, it is a toy and an amusement; then it becomes a mistress, and then it becomes a master, and then a tyrant. The last phase is that just as you are about to be reconciled to your servitude, you kill the monster, and fling him out to the public." On the other hand, writing can be a form of freedom, a way of escaping the madness of a period and reducing anxiety. In many situations authors write to save themselves, to survive as individuals.

Gheorghe Păun is an example of a person affirming his own existence by writing. He is a prolific writer with a huge number of papers: tens of scientific books, hundreds of articles, several novels, poems, and books on games. A list of his scientific publications is posted at http://www.imar.ro/~gpaun/papers.php [2], while his books are listed at http://www.imar.ro/~gpaun/books.php [1]. His way of distributing information is not by speaking, but by writing. Gheorghe Păun did not much like teaching in universities; he preferred a form of "teaching by researching", combining ideas with nice metaphors and distributing his knowledge in articles and books. In this way he wrote several papers with a high impact on the scientific community. His seminal paper "Computing with membranes", published in the Journal of Computer and System Sciences in 2000, and his fundamental book on computation theory, "Membrane Computing" (Springer, 2003), have over 1,000 citations [6] (and their author was recognized as an "ISI highly cited researcher" [5]). He has defined new branches, new theories. The field of membrane computing was initiated by Gheorghe Păun as a branch of natural computing [3]; P systems are inspired by the hierarchical membrane structure of eukaryotic cells [4]. An impressive handbook of membrane computing was published recently (2010) by Oxford University Press.

After 1990 he became a traveling scientist, visiting several countries and receiving many research fellowships and awards, with fruitful scientific collaborations at Magdeburg University (Germany) and at the University of Turku (Finland). The trio Gheorghe Păun, Grzegorz Rozenberg and Arto Salomaa is well-known for several successful books. The last years were spent in Spain, first in Tarragona and now in Sevilla. Several collaborations were possible during his trips, and there are over 100 co-authors from many countries. His scientific reputation is reflected in the large number of invited talks given at many international conferences and universities. He is a member of the editorial boards of several international journals, a corresponding member of the Romanian Academy (from 1997), and a member of Academia Europaea (from 2006).

It is not possible to understand the personality of Gheorghe Păun without mentioning his activity as a writer of novels and poems; he has long been a member of the Romanian Writers Association. Another aspect of his life is related to the intellectual seduction of games; he was the promoter of GO in Romania, writing many books about GO and other "mathematical" games.

Personally, I am impressed by the speed of his mind (it is enough to say a few words about some new results, and he is able to complete the whole approach quickly), his wide-ranging curiosity and intelligence, rich imagination and humor, talent and passion. He is highly motivated by challenging projects, and works hard to conclude them successfully. There are very few scientists with such an interesting profile, and I am very happy to learn a lot from him.

Celebrating his 60th birthday, we wish him good health, a long life, and new interesting achievements!
APA, Harvard, Vancouver, ISO, and other styles
49

Bilò, Davide, Luciano Gualà, Stefano Leucci, and Guido Proietti. "Network Creation Games with Traceroute-Based Strategies." Algorithms 14, no. 2 (January 26, 2021): 35. http://dx.doi.org/10.3390/a14020035.

Full text
Abstract:
Network creation games have been extensively used as mathematical models to capture the key aspects of the decentralized process that leads to the formation of interconnected communication networks by selfish agents. In these games, each user of the network is identified with a node and selects which links to activate by strategically balancing his/her building cost with his/her usage cost (which is a function of the distances towards the other players in the network to be built). In these games, a widespread assumption is that players have common and complete information about the evolving network topology. This is only realistic for small-scale networks: when the network size grows, it quickly becomes impractical for single users to gather such global and fine-grained knowledge of the network in which they are embedded. In this work, we weaken this assumption by only allowing players to have a partial view of the network. To this aim, we borrow three popular traceroute-based knowledge models used in network discovery: (i) distance vector, (ii) shortest-path tree view, and (iii) layered view. We settle many of the classical game theoretic questions in all of the above models. More precisely, we introduce a suitable (and unifying) equilibrium concept which we then use to study the convergence of improving and best response dynamics, the computational complexity of computing a best response, and to provide matching upper and lower bounds on the price of anarchy.
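The cost a player balances can be made concrete with a small sketch: a per-link price alpha for each bought edge plus the sum of hop distances to every other node, computed by BFS. This is the full-information variant; the paper's contribution is precisely to study what happens when players only see the restricted traceroute-based views listed above.

```python
# Player cost in a network creation game: building cost + usage cost.
# Full-information toy; the paper studies restricted knowledge models.
from collections import deque

def bfs_distances(adj, src):
    """Hop distances from src to all reachable nodes."""
    dist = {src: 0}
    q = deque([src])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                q.append(v)
    return dist

def player_cost(adj, player, bought, alpha):
    d = bfs_distances(adj, player)
    return alpha * len(bought) + sum(d.values())   # building + usage cost

adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}       # a path network
print(player_cost(adj, player=0, bought={1}, alpha=2.0))   # 2*1 + (0+1+2+3) = 8
```

A best response for a player is the set of bought links minimising this quantity; computing it under partial views is one of the complexity questions the paper settles.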
APA, Harvard, Vancouver, ISO, and other styles
50

Egorova, P. A., T. G. Mukhina, S. N. Sorokoumova, and D. D. Mukhina. "MONITORING ANALYSIS OF EFFICIENCY OF PROFESSIONAL COMPUTERS AND SYSTEM REQUIREMENTS FOR THE ORGANIZATION OF ELECTRONIC-EDUCATIONAL ENVIRONMENT OF HIGHER EDUCATION INSTITUTIONS IN THE CONDITIONS OF EDUCATIONAL INCLUSION." Vestnik of Minin University 6, no. 3 (November 10, 2018): 12. http://dx.doi.org/10.26795/2307-1281-2018-6-3-12.

Full text
Abstract:
Introduction: in the process of training specialists for various fields of activity, one strategic direction is the education of persons with disabilities. The main task is to make higher education more "sensitive" towards students with disabilities and to give them more freedom of choice, based primarily on the students' own desire to obtain a profession that interests them. One of the main activities on this path is the elimination of all kinds of barriers in education, based on the use of information and communication technologies. The multipurpose use of computer equipment in working with students with disabilities allows the introduction of the main classical didactic principles for the process of training and education (scientific rigor, systematic and consistent instruction, conscientiousness and durability of learning, student activity, and the use of visual aids), and it also increases the effectiveness of the principle of accounting for the age and individual characteristics of students. In the professional activity of educational organizations there is a growing demand for high-performance computing resources - personal computers (PCs) and workstations united by telecommunications computer networks - and the organization faces the problem of prioritizing investment: which information technology assets should take priority in forming an electronic educational environment. On the one hand, it makes sense to assume that investment should go into technological equipment whose physical deterioration (in many cases extreme) and obsolescence contribute to a deterioration in the quality of educational activities, its technological capabilities and its cost. On the other hand, scientific and technical progress offers new solutions and functionality in the form of modern information technologies (IT). However, in order to obtain maximum benefit from IT, monitoring and evaluation of the computing power of the PCs is necessary. In this regard, the goal of the research is to conduct a monitoring analysis of the effectiveness of professional computers and system requirements for the timely identification of obsolete personal computers and for forecasting the potential service life of personal computers in an educational organization, in order to determine the readiness of a university to create and maintain an inclusive educational environment. Materials and methods: for monitoring the effectiveness of professional computers and system requirements for the organization of the electronic educational environment of the university in terms of educational inclusion, the educational process was divided into three areas: "Engineering graphics", "Programming" and "Computer Science". Within each area, a reference software product was identified - the product that is most often used in this area of study and/or the most costly in terms of the resources consumed. For "Engineering graphics" the reference product was AutoCAD, for "Programming" it was Microsoft Visual Studio, and for "Computer Science" it was Microsoft Office. Results: forecasting in our study is presented as the construction of a trend line (a graph of a prediction function).
Using this function, approximate ratings of software products of subsequent versions were calculated, which made it possible to track the physical deterioration and obsolescence of computer classrooms. Discussion and Conclusions: long-term experience of using information technologies in the learning process demonstrates the readiness of an educational organization to effectively create an inclusive educational environment, with the goal of supporting successful socialization and professional development for students with disabilities.
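The forecasting step can be illustrated with a simple linear trend fit; the ratings below are invented placeholders, since the paper's actual measurements are not reproduced here.

```python
# A minimal version of the trend-line forecasting described above:
# fit a linear prediction function to past ratings and extrapolate.
import numpy as np

years = np.array([2012, 2013, 2014, 2015, 2016, 2017])
rating = np.array([100, 92, 85, 76, 70, 63])       # illustrative PC ratings

slope, intercept = np.polyfit(years, rating, 1)    # linear trend line
forecast = lambda year: slope * year + intercept

print(f"predicted rating in 2019: {forecast(2019):.1f}")
print(f"rating falls below 50 around {(50 - intercept) / slope:.0f}")
```

Solving the fitted line for a chosen obsolescence threshold, as in the last line, is one way to predict when a classroom's PCs will no longer meet the reference product's system requirements.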
APA, Harvard, Vancouver, ISO, and other styles