Journal articles on the topic 'COTS applications software vendors'




Consult the top 50 journal articles for your research on the topic 'COTS applications software vendors.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles in a wide variety of disciplines and organise your bibliography correctly.

1

Zhang, Xiong, Wei T. Yue, and Wendy Hui. "Software piracy and bundling in the cloud-based software era." Information Technology & People 32, no. 4 (August 5, 2019): 1085–122. http://dx.doi.org/10.1108/itp-05-2018-0210.

Full text
Abstract:
Purpose: In the cloud computing era, three emerging developments in the software industry are: cloud and on-premises software may offer complementary value to each other; cloud software service requires the support of significant information technology infrastructure; and software piracy problems can be better managed in the cloud. However, how these developments together impact a vendor's bundling strategy has not yet been investigated. This paper aims to discuss the issue. Design/methodology/approach: Drawing on the product bundling framework, this research establishes stylized models to study a software vendor's bundling decision in the cloud-based era, with special consideration of the issue of software piracy. Findings: The authors find that different key parameters associated with the cloud era exert different effects on the bundling decision. When on-premises software and cloud software generate additional value by complementing each other, software vendors can make greater profits under the pure components (PC) strategy. Given a low infrastructure cost, software vendors should favor pure bundling (PB). The impact of piracy deterrence effectiveness is less straightforward: it favors PC when piracy deterrence effectiveness is low, but PB when it is high. Originality/value: This study makes key contributions to theory and practice. First, it is the first study to examine software bundling strategies in the cloud computing era, whereby the three factors relevant to the cloud phenomenon are considered. Second, it contributes to the literature on bundling and software piracy by examining the intersection of these two streams. Third, it sheds light on a vendor's bundling decision when facing piracy problems in the emerging cloud software era.
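A rough way to see the trade-off these stylized models formalize (an illustrative sketch only; the prices, demand functions, and cost symbol below are assumptions, not the paper's actual formulation): with separate prices p_o and p_c for on-premises and cloud software, a bundle price p_B, and an infrastructure cost k incurred only on cloud provision, the vendor compares

```latex
% Illustrative profit comparison, not the paper's exact model.
\begin{align*}
\pi_{PC} &= p_o\,D_o(p_o) + (p_c - k)\,D_c(p_c) && \text{(pure components)}\\
\pi_{PB} &= (p_B - k)\,D_B(p_B)                 && \text{(pure bundling)}
\end{align*}
```

and chooses PB whenever \pi_{PB} > \pi_{PC}. In the paper's findings, complementary value between the two products works in favor of PC, while a low infrastructure cost k favors PB.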
APA, Harvard, Vancouver, ISO, and other styles
2

Singh Sagar, Mahendra, Babita Singh, and Waseem Ahmad. "Study on Cloud Computing Resource Allocation Strategies." International Journal of Advance Research and Innovation 1, no. 3 (2013): 20–28. http://dx.doi.org/10.51976/ijari.131303.

Full text
Abstract:
Cloud computing offers utility-oriented IT services to users worldwide. Based on a pay-as-you-go model, it enables the hosting of pervasive applications from consumer, scientific, and business domains. It is a revolution of the traditional data centre, offering subscription-based access to infrastructure, platforms, and applications, popularly referred to as Infrastructure, Platform, and Software as a Service. Numerous IT vendors promise to offer computation, storage, and application-hosting services and to provide coverage on several continents. These vendors require a huge amount of energy, contributing to high dynamic costs along with the drawback of environmental pollution. The current scenario therefore needs green computing to save energy and reduce dynamic costs. Because of the increasing demand for high-speed computation and data storage, distributed computing systems have attracted a lot of attention. Resource allocation plays an indispensable role in distributed systems where clients have service level agreements, and IT vendors' total profit depends on these service level agreements.
APA, Harvard, Vancouver, ISO, and other styles
3

Seth, Dinesh, and Subhash Rastogi. "Application of vendor rationalization strategy for manufacturing cycle time reduction in engineer to order (ETO) environment." Journal of Manufacturing Technology Management 30, no. 1 (January 21, 2019): 261–90. http://dx.doi.org/10.1108/jmtm-03-2018-0095.

Full text
Abstract:
Purpose: The purpose of this paper is to demonstrate the application of a vendor rationalization strategy for streamlining supplies and reducing manufacturing cycle time in an Indian engineer-to-order (ETO) company. ETO firms are known for a large number of vendors, coordination hassles, rework problems, and the impact of these on cycle time and operational excellence. Design/methodology/approach: The research demonstrates the case-based application of Kraljic's matrix for supply and leverage items, on-the-job observations, field visits, discussions, and analysis of supplies reports. Findings: The study guides the rationalization of supplies and the necessary strategic alignments that can significantly reduce supply risk, costs, manufacturing and delivery cycle time, and coordination hassles. It depicts the challenges of the ETO environment with respect to supplies, and demonstrates the effectiveness of the vendor rationalization application for the case company as well as the weaknesses of commonly practiced vendor management approaches. Practical implications: To be competitive, companies should rationalize supply items and vendors based on the nature of the items and their subsequent usage by applying Kraljic's matrix-based classification. The immediate implication of vendor rationalization is often misunderstood as merely reducing the supply base, but it does much more, including a review of supplies, the nature of items, and strategic alignments, leading to a win-win situation for the company and its suppliers. Originality/value: For the rationalization of supplies, while procuring and dealing with vendors, executives should consider the engineering nature of components, cross-functional requirements, and the integration of components in the context of ETO products/projects environments. There is a dearth of studies focusing on vendor rationalization in ETO setups in a fast-developing-country context.
APA, Harvard, Vancouver, ISO, and other styles
4

Shao, Guo Dong, Swee Leong, and Charles McLean. "Simulation-Based Manufacturing Interoperability Standards and Testing." Key Engineering Materials 407-408 (February 2009): 283–86. http://dx.doi.org/10.4028/www.scientific.net/kem.407-408.283.

Full text
Abstract:
Software applications for manufacturing systems developed using software from different vendors typically cannot work together. Development of custom integrations of manufacturing software incurs costs and delays that hurt industry productivity and competitiveness. Software applications need to be tested in live operational systems. It is impractical to use real industrial systems to support dynamic interoperability testing and research due to: 1) access issues - manufacturing facilities are not open to outsiders, as proprietary data and processes may be compromised; 2) technical issues - operational systems are not instrumented to support testing; and 3) cost issues - productivity suffers when actual production systems are taken offline to allow testing. Publicly available simulations do not exist to demonstrate simulation integration issues, validate potential standards solutions, or dynamically test the interoperability of simulation systems and other software applications. A new, dynamic, simulation-based interoperability testing facility for manufacturing software applications is being developed at the National Institute of Standards and Technology (NIST).
APA, Harvard, Vancouver, ISO, and other styles
5

Salma, Charkaoui, Marzak Abdelaziz, El Habib Ben Lahma, and Abdelbaki Issam. "Cross-Platform Mobile Development Framework Based on MDA Approach." International Journal of Technology Diffusion 9, no. 1 (January 2018): 45–59. http://dx.doi.org/10.4018/ijtd.2018010104.

Full text
Abstract:
This article describes how the mobile application market keeps growing because of the different mobile operating systems. It has become a challenge for application vendors to provide an application intended for multiple platforms whose operating systems use different technologies. This fragmentation makes the development of mobile applications quite difficult and very expensive. This can be observed at all levels: data storage, software architecture, user interface, access to phone data, communication between applications, etc. To resolve this problem, several solutions exist in the mobile market to develop mobile applications according to the principle "develop once, use everywhere". In this article, the authors propose a solution based on the MDA approach called "TimPhoneGenerator". Using TimPhoneGenerator, applications only need to be coded once for all targeted platforms, which reduces development time and costs.
APA, Harvard, Vancouver, ISO, and other styles
6

Rachappa, Halkar. "Experimental Study of BaaS, Its Implementation Methods and Advantages and Challenges." International Journal on Recent and Innovation Trends in Computing and Communication 9, no. 8 (August 31, 2021): 05–08. http://dx.doi.org/10.17762/ijritcc.v9i8.5518.

Full text
Abstract:
BaaS is the provision of the backend as a service by various vendors to companies. Developers outsource backend jobs to a private or public vendor, which provides all the backend services using cloud techniques at a reasonable cost. The cost depends upon the types and number of backend services required by the organisation; it can be modified at any point in time, and the cost changes accordingly. By using the BaaS provided by the vendor, developers can spend their time on the front-end coding of the system's applications. In this way, efficient software and systems can be developed that use the latest techniques and give efficient performance. The paper explains the characteristics of backend as a service and how it benefits organisations. There are certain challenges when shifting existing backend services to the public cloud services provided by a vendor, which are discussed in this paper.
APA, Harvard, Vancouver, ISO, and other styles
7

Balint, Bryon. "Maximizing the Value of Packaged Software Customization." International Journal of Enterprise Information Systems 13, no. 1 (January 2017): 1–16. http://dx.doi.org/10.4018/ijeis.2017010101.

Full text
Abstract:
Organizations that purchase packaged application software – for example, an Enterprise Resource Planning system – must make choices about customization. Packaged software vendors and practitioners recommend that organizations customize software as little as possible, and instead adapt their processes to meet the "best practices" of the software. However, organizations continue to exceed their budgets on implementing and maintaining customized software. This suggests that either these organizations are making poor decisions, or that the conventional wisdom about customization is incorrect. In this paper the author models the primary factors in the customization decision, most notably the "fit" between desired processes and the procedures inherent in the packaged software. The author then considers costs related to development, maintenance, and technical corrections due to poor integration and performance, and benefits related to increased fit, technical corrections, and user acceptance. This paper extends prior work by (1) modelling nonlinear relationships between the amount of time spent on custom development and the resulting benefits, (2) modelling nonlinear relationships between development costs and maintenance costs, and (3) modelling corrective development as a function of development related to fit and user acceptance. The author uses simulation techniques to illustrate the conditions under which customization is likely to provide value to the organization, as well as conditions under which customization should be avoided.
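The nonlinear relationships modelled here lend themselves to a small numeric sketch (the functional forms and constants below are illustrative assumptions, not the paper's calibrated model): benefit from custom development rises with diminishing returns in effort, while maintenance cost grows faster than linearly, so net value peaks at an interior level of customization that shifts with the fit gap.

```python
import numpy as np

# Assumed shapes, for illustration only: concave benefit in development
# effort, convex maintenance cost; "fit_gap" scales how far the packaged
# software's built-in procedures deviate from the desired process.
def net_value(effort, fit_gap):
    benefit = fit_gap * (1 - np.exp(-1.5 * effort))  # diminishing returns
    dev_cost = effort                                # linear build cost
    maint_cost = 0.4 * effort**2                     # nonlinear upkeep
    return benefit - dev_cost - maint_cost

efforts = np.linspace(0.0, 3.0, 301)
for fit_gap in (1.0, 2.5, 4.0):
    best = efforts[np.argmax(net_value(efforts, fit_gap))]
    print(f"fit gap {fit_gap}: net value peaks at effort ~ {best:.2f}")
```

The larger the fit gap, the more customization effort pays off before the convex maintenance cost dominates, which mirrors the paper's point that blanket "customize as little as possible" advice can be wrong.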
APA, Harvard, Vancouver, ISO, and other styles
8

Saravanan, K., and E. Poorna Chandra Prasad. "Open Source Software Test Automation Tools: A Competitive Necessity." Scholedge International Journal of Management & Development 3, no. 6 (August 15, 2016): 103. http://dx.doi.org/10.19085/journal.sijmd030601.

Full text
Abstract:
Software testing is one of the critical activities in developing quality software. To enhance test efficiency and to improve the repeatability of tests, several testing tools were developed and rolled out by proprietary commercial vendors like HP, IBM, etc. In the past decade, proprietary software test automation tools dominated the automation market and were extensively adopted. But in the recent past, open source test automation tools have been acknowledged as reliable and are being adopted rapidly by IT companies. When companies started adopting open source test automation tools, these tools were perceived as a competitive advantage to keep them abreast of technology developments, reduce cost, train testers, test in-house developed applications, and so on. But today, we are witnessing open source test automation tools evolving into an industry standard and becoming a competitive necessity in the IT industry. This paper explores how open source test automation tools have become a competitive necessity.
APA, Harvard, Vancouver, ISO, and other styles
9

Balint, Bryon. "Obtaining Value from the Customization of Packaged Business Software." International Journal of Enterprise Information Systems 11, no. 1 (January 2015): 33–49. http://dx.doi.org/10.4018/ijeis.2015010103.

Full text
Abstract:
Businesses that purchase packaged application software – for example, an Enterprise Resource Planning system – must make choices about customization. Packaged software vendors, anecdotal evidence, and practitioner-oriented research all recommend that businesses should customize software as little as possible, and instead adapt their processes to meet the “best practices” of the software. However, businesses continue to outspend their budgets on implementing and maintaining customized software, often to a significant extent. This suggests that either these businesses are making poor decisions, or that the conventional wisdom about customization is incorrect. In this paper the authors model the primary factors in the customization decision: “fit” between the desired business process and the packaged software; costs related to development, maintenance, integration, and performance; and benefits related to increased fit, integration, performance, and user acceptance. They use simulation techniques to illustrate the conditions under which customization is likely to provide value to the organization, as well as conditions under which customization should be avoided.
APA, Harvard, Vancouver, ISO, and other styles
10

Schraven, Markus Hans, Kai Droste, Carlo Guarnieri Calò Carducci, Dirk Müller, and Antonello Monti. "Open-Source Internet of Things Gateways for Building Automation Applications." Journal of Sensor and Actuator Networks 11, no. 4 (November 8, 2022): 74. http://dx.doi.org/10.3390/jsan11040074.

Full text
Abstract:
Due to its potential benefits in data transparency, maintenance, and optimization of operation, the Internet of Things (IoT) has recently emerged in the building automation system (BAS) domain. However, while various IoT devices have been developed, their integration into BAS remains a challenging task due to the variety of conventional interfaces used in existing BAS. From an objective point of view, integrating IoT connectivity on existing devices' printed circuit boards (PCBs) would be the most efficient option in terms of cost and resources, but it requires adaptation of product lines, and vendors would often couple this with their own services and without an option for customization. By contrast, the majority of research activities focus on developing alternative or additional measurement systems, rather than connecting with legacy system components. Furthermore, most research applications cover very simple and individual use-cases with a do-it-yourself character and limited applicability in industrial settings. In this study, we present a scalable, industrial-like embedded solution to connect to common interfaces in BAS applications and share the entire hardware and software design as an open-source platform for public use, customization, and further enhancement. Moreover, a thorough measurement performance analysis was conducted, suggesting an acceptable trade-off among accuracy, flexibility, and costs, e.g., achieving a performance increase of over 75% and a cost reduction of roughly 34% compared to a previous design.
APA, Harvard, Vancouver, ISO, and other styles
11

Calì, Davide, Ekkart Kindler, Razgar Ebrahimy, Peder Bacher, Kevin Hu, Michelle Lind Østrup, Magnus Bachalarz, and Henrik Madsen. "climify.org: an online solution for easy control and monitoring of the indoor environment." E3S Web of Conferences 111 (2019): 05006. http://dx.doi.org/10.1051/e3sconf/201911105006.

Full text
Abstract:
Real energy performance of new and retrofitted buildings often differs considerably from expectations. While occupants might complain about poor indoor climate, the energy use in such buildings is often higher than expected, leading to the well-known phenomenon called the "Energy Performance Gap". In past years, monitoring of buildings, in terms of both energy use and indoor climate conditions, was realised mostly for office buildings only, and at high financial cost. However, the exponential growth in the availability of IoT devices over the last years now opens new scenarios for low-cost monitoring and control solutions for buildings. Yet modern IoT devices are often only accessible online through the vendors' software, although some devices make use of open communication protocols and can therefore be connected to open platforms such as openHAB. However, the use of open platforms still involves a big effort for many final users. We therefore propose climify.org, an open platform for plug-and-play connection of IoT sensors and actuators, for easy monitoring and control of buildings and buildings' HVAC systems. At the time of writing, the climify.org platform offers three main applications. The first is an IoT device installation app, to be used on portable devices (e.g. the mobile phones or tablets of system administrators): this app allows a sensor or an actuator to be easily installed and located within a building. The second is an online service for data visualisation and HVAC control: the monitoring data can be plotted, the service offers several data evaluation methods, and the settings of the connected actuators can be modified and controlled. The third can be installed on portable devices (the mobile phones and tablets of buildings' occupants) and allows occupants to provide feedback on their perception of the indoor climate through several questionnaire formats. Through the three applications developed within climify.org, we aim to provide the best indoor climate and the lowest energy use through a low-cost solution.
APA, Harvard, Vancouver, ISO, and other styles
12

Pakvasa, Mikhail, Hannes Prescher, Bryce Hendren-Santiago, Tony Da Lomba, Nicholas McKenzie, Courtney Orsbon, Zachary Collier, Richard Ramirez-Garcia, Isabella Gomez, and Russell R. Reid. "An Easy-to-Use Protocol for Segmenting and 3-D Printing Craniofacial CT-Images Using Open-Source Software." FACE 3, no. 1 (January 10, 2022): 66–73. http://dx.doi.org/10.1177/27325016211072286.

Full text
Abstract:
Introduction: Stereolithography, also known as 3D printing (3DP), is a versatile and useful technology with many healthcare applications. While 3DP has gained tremendous popularity, it remains a daunting and perceptibly time-consuming process for the inexperienced user, with most turning to commercially printed products. Commercial vendors are expensive. We propose that 3DP is feasible for the inexperienced user given the appropriate knowledge and tools. Methods: A 3DP protocol was created for model design and printing using open-source software and a low-cost desktop printer. It was beta-tested by 3 inexperienced users. The fidelity of the protocol was then tested in direct comparison to industry models made for 3 patients undergoing mandibular distraction osteogenesis, using standard cephalometric measurements. Results: All inexperienced testers were able to successfully create a 3D model using the easy-to-follow protocol without the use of any other resources. The models were created in a mean time of 170 minutes. All cephalometric measurements on the open-source printed models were within 0.5 to 1.0 mm of the respective industry models. Conclusions: As the 3DP process is simplified and desktop printers and materials become more affordable, we anticipate that its implementation will become more commonplace. We describe a step-by-step protocol using open-source software and affordable materials to create 3D models.
APA, Harvard, Vancouver, ISO, and other styles
13

Keyes, D. E., H. Ltaief, and G. Turkiyyah. "Hierarchical algorithms on hierarchical architectures." Philosophical Transactions of the Royal Society A: Mathematical, Physical and Engineering Sciences 378, no. 2166 (January 20, 2020): 20190055. http://dx.doi.org/10.1098/rsta.2019.0055.

Full text
Abstract:
A traditional goal of algorithmic optimality, squeezing out flops, has been superseded by evolution in architecture. Flops no longer serve as a reasonable proxy for all aspects of complexity. Instead, algorithms must now squeeze memory, data transfers, and synchronizations, while extra flops on locally cached data represent only small costs in time and energy. Hierarchically low-rank matrices realize a rarely achieved combination of optimal storage complexity and high-computational intensity for a wide class of formally dense linear operators that arise in applications for which exascale computers are being constructed. They may be regarded as algebraic generalizations of the fast multipole method. Methods based on these hierarchical data structures and their simpler cousins, tile low-rank matrices, are well proportioned for early exascale computer architectures, which are provisioned for high processing power relative to memory capacity and memory bandwidth. They are ushering in a renaissance of computational linear algebra. A challenge is that emerging hardware architecture possesses hierarchies of its own that do not generally align with those of the algorithm. We describe modules of a software toolkit, hierarchical computations on manycore architectures, that illustrate these features and are intended as building blocks of applications, such as matrix-free higher-order methods in optimization and large-scale spatial statistics. Some modules of this open-source project have been adopted in the software libraries of major vendors. This article is part of a discussion meeting issue ‘Numerical algorithms for high-performance computational science’.
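The compression at the heart of these methods is easy to demonstrate (a toy sketch assuming a smooth far-field kernel and two well-separated point clusters; it does not use the toolkit itself): the off-diagonal interaction block collapses to a small numerical rank under truncated SVD, so storing the factors costs O(kn) instead of O(n^2).

```python
import numpy as np

# Two well-separated 1-D point clusters and a smooth far-field kernel;
# the resulting off-diagonal interaction block is numerically low-rank.
n = 256
x = np.linspace(0.0, 1.0, n)
y = np.linspace(2.0, 3.0, n)
block = 1.0 / np.abs(x[:, None] - y[None, :])

U, s, Vt = np.linalg.svd(block)
k = int((s > s[0] * 1e-10).sum())          # numerical rank at rel. tol 1e-10
approx = (U[:, :k] * s[:k]) @ Vt[:k, :]    # rank-k factorized approximation

storage = k * 2 * n / n**2                 # factor storage vs dense storage
err = np.linalg.norm(block - approx) / np.linalg.norm(block)
print(f"numerical rank {k}/{n}, storage ratio {storage:.3f}, rel. error {err:.1e}")
```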
APA, Harvard, Vancouver, ISO, and other styles
14

Stevens, James D., and Andreas Klöckner. "A mechanism for balancing accuracy and scope in cross-machine black-box GPU performance modeling." International Journal of High Performance Computing Applications 34, no. 6 (June 3, 2020): 589–614. http://dx.doi.org/10.1177/1094342020921340.

Full text
Abstract:
The ability to model, analyze, and predict execution time of computations is an important building block that supports numerous efforts, such as load balancing, benchmarking, job scheduling, developer-guided performance optimization, and the automation of performance tuning for high performance, parallel applications. In today’s increasingly heterogeneous computing environment, this task must be accomplished efficiently across multiple architectures, including massively parallel coprocessors like GPUs, which are increasingly prevalent in the world’s fastest supercomputers. To address this challenge, we present an approach for constructing customizable, cross-machine performance models for GPU kernels, including a mechanism to automatically and symbolically gather performance-relevant kernel operation counts, a tool for formulating mathematical models using these counts, and a customizable parameterized collection of benchmark kernels used to calibrate models to GPUs in a black-box fashion. With this approach, we empower the user to manage trade-offs between model accuracy, evaluation speed, and generalizability. A user can define their own model and customize the calibration process, making it as simple or complex as desired, and as application-targeted or general as desired. As application examples of our approach, we demonstrate both linear and nonlinear models; these examples are designed to predict execution times for multiple variants of a particular computation: two matrix-matrix multiplication variants, four discontinuous Galerkin differentiation operation variants, and two 2D five-point finite difference stencil variants. For each variant, we present accuracy results on GPUs from multiple vendors and hardware generations. We view this highly user-customizable approach as a response to a central question arising in GPU performance modeling: how can we model GPU performance in a cost-explanatory fashion while maintaining accuracy, evaluation speed, portability, and ease of use (an attribute we believe precludes approaches requiring manual collection of kernel or hardware statistics)?
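The linear variant of such a model can be sketched in a few lines (all operation counts, timings, and the four cost categories below are invented for illustration; the paper's models are calibrated against its own benchmark kernel collection): runtimes of calibration kernels are regressed on their symbolic operation counts, and the fitted per-operation costs then predict a new kernel's runtime.

```python
import numpy as np

# Hypothetical calibration data: per-kernel operation counts gathered
# symbolically; columns = [flops, global loads, global stores, syncs].
counts = np.array([
    [2.0e9, 1.5e8, 7.5e7, 1.0e4],
    [8.0e8, 9.0e7, 4.5e7, 5.0e3],
    [4.0e9, 2.2e8, 9.0e7, 2.0e4],
    [1.0e9, 1.2e8, 5.0e7, 8.0e3],
])
# Measured wall-clock times (seconds) for the calibration kernels
times = np.array([0.0200, 0.0099, 0.0344, 0.0136])

# Calibrate: least-squares fit of per-operation costs for this GPU
coeffs, *_ = np.linalg.lstsq(counts, times, rcond=None)

# Predict the runtime of an unseen kernel from its counts alone
new_kernel = np.array([3.0e9, 1.8e8, 9.0e7, 1.5e4])
print(f"predicted runtime: {new_kernel @ coeffs:.4f} s")  # ~0.0273 s
```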
APA, Harvard, Vancouver, ISO, and other styles
15

Marehn, David Thomas, Detlef Wilhelm, Heike Pospisil, and Roberto Pizzoferrato. "Double entry method for the verification of data a chromatography data system receives." Journal of Sensors and Sensor Systems 8, no. 1 (May 17, 2019): 207–14. http://dx.doi.org/10.5194/jsss-8-207-2019.

Full text
Abstract:
The importance of software validation increases as the need for high usability and suitability of software applications grows. In order to reduce costs and manage risk factors, more and more recommendations and rules have been established. In the field of pharmacy, the vendors of so-called chromatography data systems (CDSs) have had to implement the guidelines of the Code of Federal Regulations Title 21 (CFR 21) over the last few years in order to fulfill the increasing requirements. CFR 21 part 11 deals with electronic records and signatures. This part is binding for each company in the regulated environment that wishes to create, edit and sign electronic information instead of printing it on paper. Subsection CFR 21 part 11.10(h) explains how to perform an input check for manual user entries as well as for data collected from an external device. In this article we present an approach that performs the double entry method on data provided by the hardware instrument in order to investigate possible influences on the raw data by the handling CDS. A software tool has been written that allows us to communicate with a high-performance liquid chromatography (HPLC) detector and acquire data from it. The communication is completely independent of a CDS, which is started separately and connected to the same system. Using this configuration we made parallel data acquisition by two instances at the same time possible. Two CDSs have been tested, and for at least one of them it has been shown that a comparison of the acquired data can be done as with the double entry method for data verification. For the second CDS we checked whether it would be applicable after a few modifications. The given approach could be used either for live verification of produced raw data or as a single test during a software operational qualification to verify the data acquisition functionality of the software.
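A minimal sketch of the comparison step (assuming both the CDS and the independent tool return the detector trace as equally spaced samples; the function name and tolerance are illustrative, not from the authors' tool):

```python
import numpy as np

def verify_double_entry(cds_trace, reference_trace, abs_tol=1e-6):
    """Compare CDS-stored raw data against the independent acquisition."""
    a = np.asarray(cds_trace, dtype=float)
    b = np.asarray(reference_trace, dtype=float)
    if a.shape != b.shape:
        return False, "acquisitions differ in length"
    diff = np.abs(a - b)
    return bool(np.all(diff <= abs_tol)), f"max deviation {diff.max():.3g}"

# Hypothetical traces: identical except for one corrupted sample
ref = np.sin(np.linspace(0.0, 6.28, 1000))
cds = ref.copy()
cds[500] += 1e-3
print(verify_double_entry(cds, ref))   # -> (False, 'max deviation 0.001')
```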
APA, Harvard, Vancouver, ISO, and other styles
16

Caswell, Greg, and Joelle Arnold. "The Reliability Impact of Reballing COTS Pb-Free BGAs to Sn/Pb for Military Applications." International Symposium on Microelectronics 2011, no. 1 (January 1, 2011): 000703–10. http://dx.doi.org/10.4071/isom-2011-wp2-paper3.

Full text
Abstract:
The electronics assembly market has experienced a material shift from lead (Pb) based solders to Pb-free solders. This is a result of the widespread adoption of Reduction of Hazardous Substances (RoHS) legislation and practices in commercial industry. As a result, it is becoming increasingly difficult to procure commercial off-the-shelf (COTS) components with tin-lead (SnPb) solder balls or finish. There are essentially three responses to the scarcity of acceptable SnPb parts: custom order, post process or adapt. Custom ordering parts with SnPb finishes negates the benefits of COTS-based acquisition; however, it carries a reduced reliability risk because the materials and processes are known. Reprocessing parts once in house saves money because the parts are COTS, but expends money and resources on post processing. Also, the additional touch labor and handling increases the risk of damaging the part. Finally, adapting to Pb-free finishes is the preferred long-term approach because it preserves the cost benefits of using COTS parts and does not require post processing. It is the riskiest approach due to the lack of historical data in the DoD environment. This paper presents results regarding reballing 208 I/O Ball Grid Array (BGA) parts from tin-silver-copper (SAC305) solder to SnPb eutectic solder. It is important to understand the reliability risks associated with the reballing procedure, particularly as it relates to thermal cycling, shock and vibration environments. Three major efforts are presented to answer these concerns. First, a survey of reballing vendors was performed to better understand the processes and variables associated with that industry. The results of that survey were used to down-select to five vendors for the physical testing portion of the effort. Finally, physical testing consisting of thermal cycling, shock, and vibration was performed on parts from the five different reballing vendors as well as native SnPb parts and native SAC305 parts. The results of these activities are presented.
APA, Harvard, Vancouver, ISO, and other styles
17

Irshad, Lubna, Li Yan, and Zongmin Ma. "Schema-Based JSON Data Stores in Relational Databases." Journal of Database Management 30, no. 3 (July 2019): 38–70. http://dx.doi.org/10.4018/jdm.2019070103.

Full text
Abstract:
JSON is a simple, compact and lightweight data exchange format used to communicate between web services and client applications. NoSQL document stores evolved with the popularity of JSON; they support schema-less JSON storage, reduce cost, and facilitate quick development. However, NoSQL still lacks a standard query language and supports the eventually consistent BASE transaction model rather than the ACID transaction model. This is very challenging and a burden on the developer. Relational database management systems (RDBMS) support JSON in binary format with SQL functions (also known as SQL/JSON). However, these functions are not yet standardized and vary across vendors, with different limitations and complexities. More importantly, complex searches, partial updates, composite queries, and analyses are cumbersome and time-consuming in SQL/JSON compared to standard SQL operations. It is essential to integrate JSON into databases that use standard SQL features, support the ACID transaction model, and are capable of managing and organizing data efficiently. In this article, the authors enable JSON to use relational databases for analysis and complex queries. They reveal that the descriptive nature of the JSON schema can be utilized to create a relational schema for the storage of the JSON document. Then, powerful SQL features can be used to gain consistency and ACID compatibility for querying JSON instances from the relational schema. This approach opens a gateway to combining the best features of both worlds: the fast development of JSON, the consistency of the relational model, and the efficiency of SQL.
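The idea of deriving a relational schema from a JSON schema can be sketched in a few lines (the schema, type mapping, and generated DDL below are illustrative assumptions, not the authors' algorithm): scalar properties become typed columns, and "required" properties become NOT NULL constraints.

```python
import json

# Hypothetical JSON Schema for an order document; field names are invented.
schema = json.loads("""
{
  "title": "orders",
  "type": "object",
  "properties": {
    "id":       {"type": "integer"},
    "customer": {"type": "string"},
    "total":    {"type": "number"},
    "paid":     {"type": "boolean"}
  },
  "required": ["id", "customer"]
}
""")

# Map JSON Schema scalar types to SQL column types
TYPE_MAP = {"integer": "BIGINT", "string": "TEXT",
            "number": "DOUBLE PRECISION", "boolean": "BOOLEAN"}

def ddl_from_schema(s):
    cols = []
    for name, spec in s["properties"].items():
        not_null = " NOT NULL" if name in s.get("required", []) else ""
        cols.append(f"  {name} {TYPE_MAP[spec['type']]}{not_null}")
    return f"CREATE TABLE {s['title']} (\n" + ",\n".join(cols) + "\n);"

print(ddl_from_schema(schema))
```

Nested objects and arrays would map to child tables with foreign keys in the same spirit; the sketch covers only the flat scalar case.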
APA, Harvard, Vancouver, ISO, and other styles
18

Parthasarathy, Sudhaman, C. Sridharan, Thangavel Chandrakumar, and S. Sridevi. "Quality Assessment of Standard and Customized COTS Products." International Journal of Information Technology Project Management 11, no. 3 (July 2020): 1–13. http://dx.doi.org/10.4018/ijitpm.2020070101.

Full text
Abstract:
Software quality is a very important aspect of strategy for IT vendors involved in commercial off-the-shelf (COTS) product development (COTS products are also referred to as packaged software). Software metrics are widely accepted measures for monitoring and managing quality in software projects. Enterprise resource planning (ERP) systems are COTS products that attempt to integrate data and processes in organizations and often require extensive customization. Using software quality metrics already established in the literature, the software quality attributes defined by the quality model ISO/IEC 9126 were evaluated for a standard and a customized ERP product. This helps the ERP team identify the specific quality attributes affected by customization. The study infers that ERP system customization has a considerable impact on the quality of the ERP product. The implications of the findings for both practice and research are discussed, and possible areas of future research are identified.
APA, Harvard, Vancouver, ISO, and other styles
19

Agarwal, Reita N., Rajesh Aggarwal, Pridhviraj Nandarapu, Hersheth Aggarwal, Ashmit Verma, Absarul Haque, and Manish K. Tripathi. "COVID-19 Vaccination Drive in a Low-Volume Primary Care Clinic: Challenges & Lessons Learned in Using Homegrown Self-Scheduling Web-Based Mobile Platforms." Vaccines 10, no. 7 (July 3, 2022): 1072. http://dx.doi.org/10.3390/vaccines10071072.

Full text
Abstract:
Background: The whole of humanity has suffered dire consequences related to the novel coronavirus disease 2019 (COVID-19). Vaccination of the world's population is considered the most promising, and most challenging, approach to achieving herd immunity. As healthcare organizations took on the extensive task of vaccinating the entire U.S. population, digital health companies expanded their automated health platforms to help ease the administrative burdens of mass inoculation. Although some software companies offer free applications to large organizations, for a small clinic such as the Good Health Associates Clinic (GHAC) the costs of integrating and implementing new self-scheduling software into our e-Clinical Works (ECW) Electronic Health Record (EHR) are prohibitive. These cost burdens resulted in a search that extended beyond existing technology, and in investing in new solutions to make the process easier, more efficient, more cost-effective, and more scalable. Objective: In comparison to commercial entities, primary care clinics (PCCs) have the advantage of engaging the population for vaccination through personalized continuity of clinical care, owing to the good rapport between their patients and the PCC team. To support the overall national campaign to prevent COVID-19 infections and restore public health, the GHAC wanted to make COVID-19 vaccination accessible to its patients and to the communities it serves. We aimed to achieve a coordinated COVID-19 vaccination drive in our community through our small primary care clinic by developing and using an easily implementable, cost-effective self-registration and scheduling web-based mobile platform, using the principle of the "CDS Five Rights". Results: Overall, the Moderna vaccination drive using our self-registration and scheduling web portal and SMS messaging mobile platform improved vaccination uptake (51%) compared to overall vaccination uptake in our town, county (36%), and state (39%) during April–July 2021. Conclusions: Based on our experience during this COVID-19 vaccination drive, we conclude that PCCs have significant leverage as "invaluable warriors", along with the government and media education available, to engage patients for vaccination uptake; this advances national preventive health in our population and reduces expenses related to acute illness and hospitalization. In terms of cost-effectiveness, small PCCs are worthy of government-sponsored funding and incentives, including mandating EHR vendors to provide free (or minimal-fee) software for patient self-registration and scheduling, in order to improve vaccination drive access. Hence, improved access to personalized, informative continuity of clinical care in the PCC setting is a "critical link" in accelerating similar cost-effective campaigns in patient vaccine uptake.
APA, Harvard, Vancouver, ISO, and other styles
20

Cresswell, Kathrin, Jamie Coleman, Pam Smith, Charles Swainson, Ann Slee, and Aziz Sheikh. "Qualitative analysis of multi-disciplinary round-table discussions on the acceleration of benefits and data analytics through hospital electronic prescribing (ePrescribing) systems." Journal of Innovation in Health Informatics 23, no. 2 (July 4, 2016): 501. http://dx.doi.org/10.14236/jhi.v23i2.178.

Full text
Abstract:
Background: Electronic systems that facilitate prescribing, administration and dispensing of medicines (ePrescribing systems) are at the heart of international efforts to improve the safety, quality and efficiency of medicine management. Considering the initial costs of procuring and maintaining ePrescribing systems, there is a need to better understand how to accelerate and maximise the financial benefits associated with these systems. Objectives: We sought to investigate how different sectors are approaching the realisation of returns on investment from ePrescribing systems in U.K. hospitals and what lessons can be learned for future developments and implementation strategies within healthcare settings. Methods: We conducted international, multi-disciplinary, round-table discussions with 21 participants from different backgrounds including policy makers, healthcare organisations, academic researchers, vendors and patient representatives. The discussions were audio-recorded, transcribed and then thematically analysed with the qualitative analysis software NVivo 10. Results: There was an over-riding concern that realising financial returns from ePrescribing systems was challenging. The underlying reasons included substantial fixed costs of care provision, the difficulties in radically changing the medicines management process and the lack of capacity within NHS hospitals to analyse and exploit the digital data being generated. Any future data strategy should take into account the need to collect and analyse local and national data (i.e. within and across hospitals), setting comparators to measure progress (i.e. baseline measurements) and clear standards guiding data management so that data are comparable across settings. Conclusions: A more coherent national approach to realising financial benefits from ePrescribing systems is needed as implementations progress, and the range of tools to collect information will lead to exponential data growth. The move towards more sophisticated closed-loop systems that integrate prescribing, administration and dispensing, as well as increasingly empowered patients accessing their data through portals and portable devices, will accelerate these developments. Meaningful analysis of data will be the key to realising the benefits associated with these systems.
APA, Harvard, Vancouver, ISO, and other styles
21

Rahman, Hanif Ur, Mushtaq Raza, Palwasha Afsar, Abdullah Alharbi, Sultan Ahmad, and Hashym Alyami. "Multi-Criteria Decision Making Model for Application Maintenance Offshoring Using Analytic Hierarchy Process." Applied Sciences 11, no. 18 (September 14, 2021): 8550. http://dx.doi.org/10.3390/app11188550.

Full text
Abstract:
The phenomenon of Global Software Development (GSD) has attracted the interest of businesses all over the world. It brings together partners from various national and corporate cultures to develop applications, with numerous advantages including access to a vast labor pool, cost savings, and round-the-clock development. GSD, on the other hand, is technologically and organizationally diverse and poses a number of obstacles for the development team, such as geographical distance, cultural differences, and communication and language barriers. Global services are provided by selecting one of the suitable global delivery options, i.e., the onshore, nearshore or offshore model. Experts typically choose one of the models based on the nature of the project and the needs of the customer. However, vendors and clients lack an adequate decision support system that can assist them in making suitable sourcing decisions. The current study therefore presents a Multi-Criteria Decision Making (MCDM) model for offshore outsourcing decisions in application maintenance. To achieve our target, two systematic literature reviews were conducted that yielded a list of 15 influencing factors. The identified factors were further evaluated in the outsourcing industry through an empirical study that resulted in a list of 10 critical success factors. We propose a sourcing framework based on the critical success factors that can assist decision makers in adopting a suitable sourcing strategy for the offshore outsourcing of application maintenance. To further enhance the decision-making process, the MCDM model is developed using the Analytic Hierarchy Process (AHP). The model is evaluated with three case studies in highly reputable international companies: IBM Stockholm, Sweden; Vattenfall AB, Stockholm, Sweden; and a London-based company in the United Kingdom. The outcomes of these case studies are further reviewed and validated by outsourcing specialists in other firms. The proposed model serves as a decision support system that ranks the sourcing alternatives and suggests the most suitable option for application maintenance offshoring.
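The AHP machinery underlying such a model is compact: priorities are the normalized principal eigenvector of a pairwise comparison matrix, with Saaty's consistency ratio guarding against incoherent judgments. A minimal sketch for one criterion (the comparison values are invented, not taken from the case studies):

```python
import numpy as np

# Hypothetical pairwise comparisons of three sourcing options
# (onshore, nearshore, offshore) on one criterion, Saaty 1-9 scale.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Priorities = principal right eigenvector, normalized to sum to 1
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
w = np.abs(eigvecs[:, k].real)
w /= w.sum()

# Saaty's consistency ratio: CI / RI, with random index RI = 0.58 for n = 3
n = A.shape[0]
CI = (eigvals.real[k] - n) / (n - 1)
CR = CI / 0.58           # CR < 0.10 is conventionally acceptable
print("priorities:", np.round(w, 3), " consistency ratio:", round(CR, 3))
```

In a full AHP model, the same computation runs once per criterion and once for the criteria themselves, and the final ranking is the weighted sum of the per-criterion priorities.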
APA, Harvard, Vancouver, ISO, and other styles
22

Cresswell, Kathrin, Andrés Domínguez Hernández, Robin Williams, and Aziz Sheikh. "Key Challenges and Opportunities for Cloud Technology in Health Care: Semistructured Interview Study." JMIR Human Factors 9, no. 1 (January 6, 2022): e31246. http://dx.doi.org/10.2196/31246.

Full text
Abstract:
Background: The use of cloud computing (involving storage and processing of data on the internet) in health care has increasingly been highlighted as having great potential in facilitating data-driven innovations. Although some provider organizations are reaping the benefits of using cloud providers to store and process their data, others are lagging behind. Objective: We aim to explore the existing challenges and barriers to the use of cloud computing in health care settings and investigate how perceived risks can be addressed. Methods: We conducted a qualitative case study of cloud computing in health care settings, interviewing a range of individuals with perspectives on supply, implementation, adoption, and integration of cloud technology. Data were collected through a series of in-depth semistructured interviews exploring current applications, implementation approaches, challenges encountered, and visions for the future. The interviews were transcribed and thematically analyzed using NVivo 12 (QSR International). We coded the data based on a sociotechnical coding framework developed in related work. Results: We interviewed 23 individuals between September 2020 and November 2020, including professionals working across major cloud providers, health care provider organizations, innovators, small and medium-sized software vendors, and academic institutions. The participants were united by a common vision of a cloud-enabled ecosystem of applications and by drivers surrounding data-driven innovation. The identified barriers to progress included the cost of data migration and skill gaps to implement cloud technologies within provider organizations, the cultural shift required to move to externally hosted services, a lack of user pull as many benefits were not visible to those providing frontline care, and a lack of interoperability standards and central regulations. Conclusions: Implementations need to be viewed as a digitally enabled transformation of services, driven by skill development, organizational change management, and user engagement, to facilitate the implementation and exploitation of cloud-based infrastructures and to maximize returns on investment.
APA, Harvard, Vancouver, ISO, and other styles
23

Yang, Ye, J. Bhuta, B. Boehm, and D. N. Port. "Value-Based Processes for COTS-Based Applications." IEEE Software 22, no. 4 (July 2005): 54–62. http://dx.doi.org/10.1109/ms.2005.112.

Full text
APA, Harvard, Vancouver, ISO, and other styles
24

Rabiei, Elaheh, Lixian Huang, Hao-Yu Chien, Arjun Earthperson, Mihai A. Diaconeasa, Jason Woo, Subramanian Iyer, Mark White, and Ali Mosleh. "Method and software platform for electronic COTS parts reliability estimation in space applications." Proceedings of the Institution of Mechanical Engineers, Part O: Journal of Risk and Reliability 235, no. 5 (March 23, 2021): 744–60. http://dx.doi.org/10.1177/1748006x21998231.

Full text
Abstract:
Adoption of electronic Commercial-Off-The-Shelf (COTS) parts in various industrial products is rapidly increasing due to the accessibility and appealing lower cost of these commodities. Depending on the type of application, an accurate understanding of COTS failure information can be crucial to ensure the reliability and safety of the final products. On the other hand, frequent large-scale testing is often cost prohibitive and time consuming for emerging technologies, especially in the consumer electronics sector where minimizing time-to-market and cost is critical. This paper presents a comprehensive Bayesian approach and software platform (named the COTS Reliability Expert System) that integrates multiple pieces of heterogeneous information about the failure rate of COTS parts. The ultimate goal is to reduce dependency on testing for reliability analysis and yet to obtain a more accurate "order of magnitude" estimate of the failure rate through an efficient process. The method provides a foundation for incorporating manufacturers' reliability data, estimates based on underlying physics-of-failure mechanisms and circuit simulations, partially relevant life test data of similar (but not necessarily identical) parts, and expert opinions on the manufacturing process of the COTS part of interest. The developed expert system uses Bayesian estimation to integrate all these types of evidence. The methodology is demonstrated by estimating the failure rate of a static random-access memory (SRAM) part.
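One standard way to realize the Bayesian integration described here is a conjugate gamma-Poisson update, where a gamma prior over the failure rate (e.g., elicited from physics-of-failure estimates or handbook data) is updated with failures observed over accumulated device-hours. A minimal sketch with invented numbers; this is not the COTS Reliability Expert System itself, which combines several such evidence types:

```python
from scipy import stats

# Hypothetical prior over the failure rate (failures per device-hour),
# Gamma(shape=a, rate=b); prior mean = a / b = 5e-8 failures/hour.
a, b = 2.0, 4.0e7

# Partially relevant life-test evidence: k failures over T device-hours
k, T = 3, 2.0e7

# Conjugate Bayesian update: posterior is Gamma(a + k, b + T)
a_post, b_post = a + k, b + T
post = stats.gamma(a_post, scale=1.0 / b_post)

print(f"posterior mean: {a_post / b_post:.2e} failures/hour")
print(f"90% credible interval: {post.ppf(0.05):.2e} .. {post.ppf(0.95):.2e}")
```

Expert opinions and simulation-based estimates can enter the same scheme by reweighting or broadening the prior before the data update.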
APA, Harvard, Vancouver, ISO, and other styles
25

Lemahieu, W., M. Snoeck, F. Goethals, M. De Backer, R. Haesen, J. Vandenbulcke, and G. Dedene. "Coordinating COTS Applications via a Business Event Layer." IEEE Software 22, no. 4 (July 2005): 28–35. http://dx.doi.org/10.1109/ms.2005.90.

Full text
APA, Harvard, Vancouver, ISO, and other styles
26

Gupta, Pankaj, Shilpi Verma, and Mukesh Kumar Mehlawat. "Fuzzy COTS Selection for Modular Software Systems Based on Cohesion and Coupling under Multiple Applications Environment." International Journal of Applied Evolutionary Computation 3, no. 4 (October 2012): 1–18. http://dx.doi.org/10.4018/jaec.2012100101.

Full text
Abstract:
Due to the rapid growth of component-based software system development, the selection of optimal commercial off-the-shelf (COTS) components has become key to the optimization techniques used for this purpose. In this paper, the authors use fuzzy mathematical programming (FMP) to develop bi-objective fuzzy optimization models that aim to select the best-fit COTS components for a modular software system under a multiple-applications development task. The proposed models maximize functional performance and minimize the total cost of the software system while satisfying constraints of a minimum threshold on intra-modular coupling density and the reusability of COTS components. The efficiency of the models is illustrated using a real-world scenario of developing two financial applications for two small-scale industries.
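Before fuzzification, the crisp skeleton of such a bi-objective selection model can be written as follows (the symbols are illustrative; the paper's FMP models attach fuzzy membership functions to the objectives and thresholds):

```latex
% Illustrative crisp core; x_{ij} = 1 if COTS alternative j is chosen
% for module i. f, c, D, R mirror the abstract's objectives/constraints.
\begin{align*}
\max\ & F(x) = \sum_{i=1}^{m}\sum_{j=1}^{n_i} f_{ij}\,x_{ij}
      && \text{(functional performance)}\\
\min\ & C(x) = \sum_{i=1}^{m}\sum_{j=1}^{n_i} c_{ij}\,x_{ij}
      && \text{(total cost)}\\
\text{s.t.}\ & \sum_{j=1}^{n_i} x_{ij} = 1 \quad \forall i,
      && \text{(one component per module)}\\
 & D(x) \ge D_{\min}, \quad R(x) \ge R_{\min},
      && \text{(coupling density, reusability)}\\
 & x_{ij} \in \{0,1\}.
\end{align*}
```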
APA, Harvard, Vancouver, ISO, and other styles
27

Riemann, Ute. "Benefits and Challenges for BPM in the Cloud." International Journal of Organizational and Collective Intelligence 5, no. 1 (January 2015): 32–61. http://dx.doi.org/10.4018/ijoci.2015010103.

Full text
Abstract:
Business processes are not only variable, they are dynamic as well. A key benefit of BPM is the ability to adjust processes in response to changing market requirements. In parallel to BPM, enterprise cloud computing technology has emerged to provide a more cost-effective solution to businesses and services while making use of inexpensive computing solutions, which combine pervasive, internet, and virtualization technologies. Despite the slow start, the business benefits of cloud computing are such that the transition of BPM to the cloud is now underway. Cloud services refer to the operation of a virtualized, automated, and service-oriented IT landscape that allows the flexible provision and usage-based invoicing of resources, services, and applications via a network or the Internet. The generic term "X-as-a-Service" summarizes the business models delivering almost everything as a service. BPM in the cloud is often regarded as a SaaS application. More recently, BPM has been regarded as a PaaS, as it facilitates the creation and deployment of applications, in this case business process solutions. The PaaS landscape is the least developed of the four cloud-based software delivery models previously discussed. PaaS vendors such as IBM, Oracle, and Microsoft delivered an application platform with managed cloud infrastructure services; more recently, however, the PaaS market has begun to evolve to include other middleware capabilities, including process management. BPM PaaS is the delivery of BPM technology as a service via a cloud service provider. In order to be classified as a PaaS, a BPM suite requires the following capabilities: the architecture should be multi-tenant, it should be hosted off premise, and it should offer elasticity and metering-by-use capabilities. When we refer to BPM in the cloud, what we are really referring to is a combination of BPM PaaS and BPaaS (Business Process as a Service). Business Process as a Service (BPaaS) is a set of pre-defined business processes that allows the execution of customized business processes in the cloud. BPaaS is a complete pre-integrated BPM platform hosted in the cloud and delivered as a service for the development and execution of general-purpose business process applications. Although such a service harbors economic potential, questions that need to be answered are as follows: Can an individual, company-specific business process be supported by a standardized cloud solution, or should we protect process creativity and competitive differentiation by allowing the company to design its processes individually and solely support basic data flows and structures? Does it make sense to take a software solution "out of the box" that handles both data and process in a cloud environment, or would this hinder the creativity of business (process) development, leading to a lower quality of processes and consequently to a decrease in the competitive positioning of a company? And how should the inherent compliance and security topics be managed? Within a completely integrated business application system, all required security aspects can be implemented as safeguards at reasonable expense. Within the cloud, however, advanced standards and identity proof are required to monitor and measure information exchange across the federation. There seems to be no need to develop new protocols, but rather a standardized way to collect and evaluate the collected information.
APA, Harvard, Vancouver, ISO, and other styles
28

Riemann, Ute. "Benefits and Challenges for Business Process Management in the Cloud." International Journal of Organizational and Collective Intelligence 5, no. 2 (April 2015): 80–104. http://dx.doi.org/10.4018/ijoci.2015040104.

Full text
Abstract:
Business processes are not only variable, they are dynamic as well. A key benefit of Business Process Management (BPM) is the ability to adjust business processes in response to changing market requirements. In parallel to BPM, enterprise cloud computing technology has emerged to provide a more cost-effective solution to businesses and services while making use of inexpensive computing solutions, which combine pervasive, internet, and virtualization technologies. Despite the slow start, the business benefits of cloud computing are such that the transition of BPM to the cloud is now underway. Cloud services refer to the operation of a virtualized, automated, and service-oriented IT landscape allowing the flexible provision and usage-based invoicing of resources, services, and applications via a network or the internet. The generic term "X-as-a-Service" summarizes the business models delivering almost everything as a service. BPM in the cloud is often regarded as a SaaS application. More recently, BPM has been regarded as a PaaS, as it facilitates the creation and deployment of applications, in this case business process solutions. The PaaS landscape is the least developed of the four cloud-based software delivery models previously discussed. PaaS vendors such as IBM, Oracle, and Microsoft delivered an application platform with managed cloud infrastructure services; more recently, however, the PaaS market has begun to evolve to include other middleware capabilities, including process management. BPM PaaS is the delivery of BPM technology as a service via a cloud service provider. For classification as a PaaS, a BPM suite requires the following capabilities: the architecture should be multi-tenant, hosting should be off premise, and it should offer elasticity and metering-by-use capabilities. When we refer to BPM in the cloud, what we are really referring to is a combination of BPM PaaS and BPaaS (Business Process as a Service). Business Process as a Service (BPaaS) is a set of pre-defined business processes that allows the execution of customized business processes in the cloud. BPaaS is a complete pre-integrated BPM platform hosted in the cloud and delivered as a service for the development and execution of general-purpose business process applications. Although such a service harbors economic potential, questions remain: Can an individual, company-specific business process be supported by a standardized cloud solution, or should we protect process creativity and competitive differentiation by allowing the company to design its processes individually and solely support basic data flows and structures? Does it make sense to take a software solution "out of the box" that handles both data and process in a cloud environment, or would this hinder the creativity of business (process) development, leading to a lower quality of processes and consequently to a decrease in the competitive positioning of a company? And how should the inherent compliance and security topics be managed? Within a completely integrated business application system, all required security aspects can be implemented as safeguards at reasonable expense. Within the cloud, however, advanced standards and identity proof are required to monitor and measure information exchange across the federation. There seems to be no need to develop new protocols, but rather a standardized way to collect and evaluate the collected information.
APA, Harvard, Vancouver, ISO, and other styles
29

VILLALBA, M. TERESA, LUIS FERNÁNDEZ-SANZ, JUAN J. CUADRADO-GALLEGO, and JOSE J. MARTÍNEZ. "SOFTWARE QUALITY EVALUATION FOR SECURITY COTS PRODUCTS." International Journal of Software Engineering and Knowledge Engineering 20, no. 01 (February 2010): 27–48. http://dx.doi.org/10.1142/s0218194010004633.

Full text
Abstract:
Increasing demand for commercial security products requires improved methods for evaluating their software quality. Existing standards offer general frameworks, but more specific models are needed that reflect the perceptions of experts and customers as well as the particular characteristics of this type of product. This article presents a method for generating domain-oriented software quality models for specific types of applications. It is applied to generating a model for security COTS products, based on a systematic review of standards, related literature, and conclusions from evaluation experiences, as well as statistical analysis of information collected from 203 security experts and practitioners. The results reveal interesting conclusions about the importance users assign to the different quality characteristics of commercial security software products.
APA, Harvard, Vancouver, ISO, and other styles
30

Hoffman, Jo, and Catherine A. Cook. "Designing for Usability with Cots: How Useful is a Style Guide?" Proceedings of the Human Factors and Ergonomics Society Annual Meeting 42, no. 18 (October 1998): 1295–99. http://dx.doi.org/10.1177/154193129804201804.

Full text
Abstract:
The recent increase in the use of Commercial Off The Shelf (COTS) equipment and software in future military procurements creates a variety of challenges for human factors practitioners to address. There is a need to tailor our approach in order to provide suitable human factors tools to support the design of COTS-based systems. The human factors approach adopted, and experiences gained, in the development of a command planning aid are reported. This system, which is currently under development, utilises a large number of COTS products together with significant bespoke software development. It is one of the first major procurements in the United Kingdom to be based heavily around the use of COTS. A major challenge has been to optimise the usability of the overall system by providing future users with as seamless as possible integration of the various COTS products, rather than a series of unrelated, separate applications. One of the main activities has therefore been the design of the human-computer interface (HCI). A comprehensive Style Guide was developed against which the proposed COTS products could be evaluated, and new bespoke software could be designed. This paper evaluates the utility of a Style Guide in this context, and reports a number of lessons learned from our experiences.
APA, Harvard, Vancouver, ISO, and other styles
31

Cherednychenko, Anna. "Application of free software in secondary schools of Ukraine." Ukrainian Journal of Educational Studies and Information Technology 6, no. 3 (September 30, 2018): 21–32. http://dx.doi.org/10.32919/uesit.2018.03.03.

Full text
Abstract:
Information and communication technologies and software are now used in many areas of human activity. The information technology industry is constantly expanding, offering new software products and services to consumers. Proprietary software vendors deliver quality programs and guarantee their upgrade and support, but set fairly hard restrictions on their use. Licenses clearly stipulate the conditions for the installation, use, and transfer of software, the number of copies, etc. Consequently, besides the fact that such software has a high cost in most cases, it significantly limits the freedom of users and makes them dependent on developers and vendors. One of the global software industry trends is the development of free software. Its important advantage is the ability to install, transmit, and modify it without restrictions, and to use it to solve a variety of tasks, including commercial ones. In addition, such programs are often free of charge, which is extremely important for the education sector. The implementation of free software in educational institutions can be considered an important economic factor, a factor in the development of students' competence and vocational guidance, and a factor in the development of civic consciousness. The article analyzes the phenomenon of free software, the movement for which began in the 1970s. The idea of free software is based on a philosophy of respect for individual freedom, realized through the freedom to run programs in any way, to study and modify programs, to transfer copies of programs to other users, and to distribute modified versions to others. We show the advantages and disadvantages of free software, its role in the information and civil society, and the importance of its use in education.
APA, Harvard, Vancouver, ISO, and other styles
32

Yao, Yurong, Denis M. Lee, and Yang W. Lee. "Cost and Service Capability Considerations on the Intention to Adopt Application Service Provision Services." Journal of Database Management 21, no. 3 (July 2010): 90–113. http://dx.doi.org/10.4018/jdm.2010070104.

Full text
Abstract:
The Application Service Provision (ASP) model offers a new form of IS/IT resource management option for which the vendor remotely provides the usage of applications over a network. Currently, the ASP industry appears to be more vendor-driven. But without a good understanding of how the ASP offerings might appeal to prospective customers, the industry might not survive. This study investigates empirically the intention to adopt an ASP service from the customers’ perspective, using survey data collected from a national sample of IS/IT executives. Based on the Transaction Cost Theory (Williamson, 1979, 1985) and service capability, a causal model is developed to examine the effects of perceived cost savings and service capability, as well as their antecedent factors, on the intention to adopt an ASP service. The results show a dominant effect of cost savings consideration on ASP adoption intention.
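Schematically, the causal model this abstract describes can be written as a pair of structural equations; the notation below is illustrative (our symbols, not the authors'):

% Illustrative structural equations for the ASP adoption model; symbols are ours.
\begin{align*}
\mathrm{Intention} &= \beta_1\,\mathrm{CostSavings} + \beta_2\,\mathrm{ServiceCapability} + \varepsilon_1,\\
\mathrm{CostSavings} &= \gamma^{\top}\mathbf{a} + \varepsilon_2, \qquad
\mathrm{ServiceCapability} = \delta^{\top}\mathbf{b} + \varepsilon_3,
\end{align*}

where $\mathbf{a}$ and $\mathbf{b}$ stand for the antecedent factors of perceived cost savings and service capability, respectively; the reported finding of a dominant cost-savings effect corresponds to a large $\beta_1$ relative to $\beta_2$.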
APA, Harvard, Vancouver, ISO, and other styles
33

Mozaffar, Hajar, Robin Williams, Kathrin Cresswell, Zoe Morrison, David W. Bates, and Aziz Sheikh. "The evolution of the market for commercial computerized physician order entry and computerized decision support systems for prescribing." Journal of the American Medical Informatics Association 23, no. 2 (September 2, 2015): 349–55. http://dx.doi.org/10.1093/jamia/ocv095.

Full text
Abstract:
Objective: To understand the evolving market of commercial off-the-shelf Computerized Physician Order Entry (CPOE) and Computerized Decision Support (CDS) applications and its effects on their uptake and implementation in English hospitals. Methods: Although CPOE and CDS vendors have been quick to enter the English market, uptake has been slow and uneven. To investigate this, the authors undertook qualitative ethnography of vendors and adopters of hospital CPOE/CDS systems in England. The authors collected data from semi-structured interviews with 11 individuals from 4 vendors, including the 2 most entrenched suppliers, and 6 adopter hospitals, plus 21 h of ethnographic observation of 2 user groups and 1 vendor event. The research and analysis were informed by insights from studies of the evolution of technology fields and the emergence of generic COTS enterprise solutions. Results: Four key themes emerged: (1) adoption of systems that had been developed outside of England, (2) vendors' configuration and customization strategies, (3) localized adopter practices vs generic systems, and (4) unrealistic adopter demands. Evidence for our over-arching finding concerning the current immaturity of the market was derived from vendors' strategies, adopters' reactions to the technology, and policy makers' incomplete insights. Conclusions: The CPOE/CDS market in England is still in an emergent phase. The rapid entrance of diverse products, triggered by federal policy initiatives, has resulted in premature adoption of systems that do not yet adequately meet the needs of hospitals. Vendors and adopters lacked understanding of how to design and implement generic solutions to meet diverse user needs.
APA, Harvard, Vancouver, ISO, and other styles
34

Gupta, Pankaj, Shilpi Verma, and Mukesh Kumar Mehlawat. "A Membership Function Approach for Cost-Reliability Trade-Off of COTS Selection in Fuzzy Environment." International Journal of Reliability, Quality and Safety Engineering 18, no. 06 (December 2011): 573–95. http://dx.doi.org/10.1142/s0218539311004251.

Full text
Abstract:
The optimization techniques used in the commercial-off-the-shelf (COTS) selection process face challenges in dealing with uncertainty in many important selection parameters, for example, cost, reliability, and delivery time. In this paper, we propose a fuzzy optimization model for selecting the best COTS product among the available alternatives for each module in the development of modular software systems. The proposed model minimizes the total cost of the software system subject to constraints of a minimum threshold on system reliability, a maximum threshold on the delivery time of the software, and incompatibility among COTS products. In order to deal with uncertainty in real-world applications of COTS selection, the coefficients of the cost objective function, the delivery time constraints, and the minimum threshold on reliability are treated as fuzzy numbers. The fuzzy optimization model is converted into a pair of mathematical programming problems parameterized by the possibility (feasibility) level α using Zadeh's extension principle. The solutions of the resultant problems at different α-cuts provide lower and upper bounds of the fuzzy minimum total cost, which helps in constructing the membership function of the cost objective function. The solution approach provides fuzzy solutions instead of a single crisp solution, thereby giving the decision maker enough flexibility to maintain the cost-reliability trade-off of COTS selection while meeting other important system requirements. A real-world case study is discussed to demonstrate the effectiveness of the proposed model in a fuzzy environment.
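To make the shape of such a model concrete, the following schematic formulation (our illustrative notation, not the authors' exact model) captures the ingredients named in the abstract, with binary x_ij selecting COTS alternative j for module i and tildes marking fuzzy coefficients:

% Schematic fuzzy COTS-selection model; all symbols are illustrative.
\begin{align*}
\min \quad & \sum_{i=1}^{m} \sum_{j=1}^{n_i} \tilde{c}_{ij}\, x_{ij} && \text{(fuzzy total cost)}\\
\text{s.t.} \quad & \sum_{j=1}^{n_i} x_{ij} = 1 \quad \forall i && \text{(one COTS product per module)}\\
& \prod_{i=1}^{m} \Big( \sum_{j=1}^{n_i} r_{ij}\, x_{ij} \Big) \ge \tilde{R}_0 && \text{(minimum system reliability)}\\
& \sum_{j=1}^{n_i} \tilde{t}_{ij}\, x_{ij} \le \tilde{T} \quad \forall i && \text{(delivery-time threshold)}\\
& x_{ij} + x_{kl} \le 1 \quad \forall (i,j,k,l) \in \mathcal{I} && \text{(incompatible COTS pairs)}\\
& x_{ij} \in \{0,1\}.
\end{align*}

At a given possibility level α, each fuzzy coefficient is replaced by its α-cut interval, e.g. $\tilde{c}_{ij}$ by $[c_{ij}^{L}(\alpha), c_{ij}^{U}(\alpha)]$; solving the model once with all lower endpoints and once with all upper endpoints yields the lower and upper bounds of the fuzzy minimum total cost from which the membership function is assembled.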
APA, Harvard, Vancouver, ISO, and other styles
35

Paul Chinnaraju, Swaraj, G. Gunasekaran, N. Kumar, and R. Anandan. "Transformation from legacy storage to software defined storage–a review." International Journal of Engineering & Technology 7, no. 2.21 (April 20, 2018): 306. http://dx.doi.org/10.14419/ijet.v7i2.21.12387.

Full text
Abstract:
In the IT industry, the new trend is for everything to be software-defined, and so the storage industry has started its transformation away from legacy storage such as storage area networks and network attached storage. This is enabled through software-defined networking, which helps organizations accelerate application deployment and thereby reduce IT costs by applying policy-enabled workflows. In this paper, we review the need for this transformation from the storage vendor's perspective.
APA, Harvard, Vancouver, ISO, and other styles
36

Cechich, Alejandra, and Mario Piattini. "Early detection of COTS component functional suitability." Information and Software Technology 49, no. 2 (February 2007): 108–21. http://dx.doi.org/10.1016/j.infsof.2006.03.008.

Full text
APA, Harvard, Vancouver, ISO, and other styles
37

Feder, Judy. "Data Exchange and Collaboration Realize Automated Drilling Control Potential." Journal of Petroleum Technology 73, no. 02 (February 1, 2021): 47–48. http://dx.doi.org/10.2118/0221-0047-jpt.

Full text
Abstract:
This article, written by JPT Technology Editor Judy Feder, contains highlights of paper SPE 201763, “Exploiting the Full Potential in Automated Drilling Control by Increased Data Exchange and Multidisciplinary Collaboration,” by Kristian Gjerstad, SPE, and Ronny Bergerud, Sekal, and Stig Tore Thorsen, SPE, Equinor, prepared for the 2020 SPE Annual Technical Conference and Exhibition, originally scheduled to be held in Denver, Colorado, 5-7 October. The paper has not been peer reviewed. The complete paper describes challenges that must be overcome to reach the goal of drilling systems automation (DSA). The authors explore steps necessary to realize the full potential of performance-enhancing functionalities in automated drilling control (ADC) software, highlight current gaps, and present relatively easily achievable goals that can enable significant cost reduction and improvements in automation and safety. They also emphasize that automation is a multidisciplinary task, and that success requires collaboration between different sectors of the drilling industry. Overview The 19-page complete paper includes detailed technical discussion of topics ranging from the basic principles of an ADC system and practical challenges experienced with a model-based digital twin approach to suggested solutions and improvements. Each topic is divided into numerous related discussions. Because delving into each of these discussions is not possible in this synopsis, these have been outlined, with a few supporting points included for each. The Potential of ADC Systems Dedicated software applications - referred to by the authors as ADC systems - for protecting the well, increasing safety, automating repetitive operations, and optimizing the drilling process, have been available for some time. Several projects in which sophisticated ADC systems evaluate downhole conditions to assist the driller with judgments and decisions have been reported, with promising results including noticeable improvements in cost savings, reduced incidents, and improved safety. However, the number of rigs with sophisticated ADC systems running actively in real time is not high, and even on rigs where an ADC system is in use, the potential of the system generally is not fully leveraged. One reason is that these ADC systems are based on models of the drilling process running in parallel with the real process, with each requiring the exact same inputs in real time to work optimally. Many of these inputs are entered manually because the instrumentation, equipment, and infrastructure needed to automate the data transfer are not in place. The inputs that are automated may not be sufficiently accurate or reliable, so manual interactions are needed. Experience shows that even on relatively new rigs with modern instrumentation, a large untapped potential exists. An underlying reason for this lack of automated inputs is that different parties involved in establishing the required instrumentation and automated signal transfer are not well aligned. Thus, increased automation and repeatability can introduce increased staffing and cost for operating the ADC system. To overcome this paradox, better collaboration is required among the vendors in the complete production loop.
APA, Harvard, Vancouver, ISO, and other styles
38

Morris, A. Terry, and Peter A. Beling. "Extracting Acyclic Dependency Models from Quality Standards for COTS Software Evaluation." Journal of Aerospace Computing, Information, and Communication 3, no. 7 (July 2006): 327–39. http://dx.doi.org/10.2514/1.19106.

Full text
APA, Harvard, Vancouver, ISO, and other styles
39

Adam, George K., Nikos Petrellis, Georgia Garani, and Tilemachos Stylianos. "COTS-Based Architectural Framework for Reliable Real-Time Control Applications in Manufacturing." Applied Sciences 10, no. 9 (May 6, 2020): 3228. http://dx.doi.org/10.3390/app10093228.

Full text
Abstract:
The challenge of keeping the development and implementation of real-time control systems reliable and efficient while, at the same time, low-cost and low-energy is getting harder. This is because system designers and developers are faced with the dependability, inflexibility, and often high cost of specialized or custom-built hardware and software components. This research attempts to tackle issues such as the reliability and efficiency of real-time control systems and to advance the current state of the art. For this purpose, a strong emphasis is placed on finding novel, efficient solutions based on standardized and commercially available off-the-shelf hardware/software components. In this direction, this research applies credible and feasible methodologies (e.g., model-based design, component-based design, formal verification, real-time scheduling, prototyping, and validation) in an innovative, enhanced way. As an important outcome, a versatile integrative design approach and architectural framework (VIDAF) is proposed, which supports the development and implementation of reliable real-time control systems and applications using commercial off-the-shelf (COTS) components. The feasibility and applicability of the proposed system architecture are evaluated and validated through a system application in embedded real-time control in manufacturing. The research outcomes are expected to have a positive impact on emerging areas such as the Industrial Internet of Things (IIoT).
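Among the methodologies the abstract names is real-time scheduling. As a hedged, self-contained illustration of the kind of check a COTS-based real-time framework might perform (not code from the paper; the task set is hypothetical), this C snippet applies the classic Liu-Layland utilization bound for rate-monotonic scheduling:

#include <math.h>
#include <stdio.h>

/* One periodic task: worst-case execution time and period (same time unit). */
struct task { double wcet; double period; };

/* Sufficient (not necessary) rate-monotonic schedulability test:
 * total utilization U must not exceed n * (2^(1/n) - 1). */
static int rm_schedulable(const struct task *tasks, int n)
{
    double u = 0.0;
    for (int i = 0; i < n; i++)
        u += tasks[i].wcet / tasks[i].period;
    double bound = n * (pow(2.0, 1.0 / n) - 1.0);
    printf("U = %.3f, bound = %.3f\n", u, bound);
    return u <= bound;
}

int main(void)
{
    /* Hypothetical control tasks: sensor read, control law, actuation. */
    struct task set[] = { {1.0, 10.0}, {2.0, 20.0}, {4.0, 40.0} };
    printf("RM schedulable: %s\n",
           rm_schedulable(set, 3) ? "yes" : "no (test inconclusive)");
    return 0;
}

Note that failing the bound is inconclusive; an exact response-time analysis would then be needed before rejecting the task set.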
APA, Harvard, Vancouver, ISO, and other styles
40

Hamid, Arief Abdul. "Comparison of Software Licensing and Development Models Using Val IT." ACMIT Proceedings 3, no. 1 (March 18, 2019): 208–12. http://dx.doi.org/10.33555/acmit.v3i1.46.

Full text
Abstract:
Today, small and medium enterprises (SMEs) in Indonesia are growing rapidly, and as their business grows it must be matched by the IT in the company, since IT supports the company's performance. Daily jobs that were originally done manually are being shifted to software that helps with the work. However, a typical obstacle to developing such IT applications is the budget for the IT investment itself. In this paper, the authors therefore compare the advantages and disadvantages of open source software and proprietary software, using Val IT as a framework. The result: open source software is low-cost because the code is free, but the company should budget for training, and for reporting the company needs to learn to build its own reports. Proprietary software, in contrast, is high-cost, but support is provided regularly for operations and the company can simply ask the vendor to create reports.
APA, Harvard, Vancouver, ISO, and other styles
41

Jha, P. C., Ramandeep Kaur, Shivani Bali, and Sushila Madan. "Optimal Component Selection Approach for Fault-Tolerant Software System under CRB Incorporating Build-or-Buy Decision." International Journal of Reliability, Quality and Safety Engineering 20, no. 06 (December 2013): 1350024. http://dx.doi.org/10.1142/s0218539313500241.

Full text
Abstract:
Application Package Software (APS) has emerged as a ready-to-use solution for the software industry. A software system comprises a number of components which can either be purchased from a vendor in the form of COTS (commercial off-the-shelf) products or be built in-house; such a decision is known as a build-or-buy decision. In situations where the software is responsible for supervising life-critical systems, the introduction of errors into the software due to inadequate or incomplete testing is not acceptable. Such life-critical systems make meeting the software's quality standards non-negotiable. This can be achieved by incorporating a fault-tolerant design that enables a system to continue its intended operation rather than failing completely when some part of the system fails. Moreover, while designing a fault-tolerant system, it must be understood that 100% fault tolerance can never be achieved, and the closer we try to get to 100%, the more costly the system becomes. The proposed model incorporates the consensus recovery block scheme of fault-tolerance techniques. In this paper, we focus on the build-or-buy decision for an APS in order to facilitate optimal component selection, thereby maximizing the reliability and minimizing the overall cost and source lines of code of the entire system. Further, since the proposed problem involves incomplete and unreliable input information, such as execution time and cost, the environment in the proposed model is taken to be fuzzy.
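To sketch how such a build-or-buy selection can be posed, consider the following simplified crisp, single-objective form (assumed symbols; the paper's model is fuzzy, multi-criteria, and uses the consensus recovery block), where y_i = 1 means module i is built in-house and x_ij picks a COTS alternative:

% Schematic build-or-buy component selection; all symbols are illustrative.
\begin{align*}
\max \quad & R(x, y) = \prod_{i=1}^{m} \Big( y_i\, r_i^{\mathrm{build}} + (1 - y_i) \sum_{j=1}^{n_i} r_{ij}\, x_{ij} \Big) && \text{(system reliability)}\\
\text{s.t.} \quad & \sum_{j=1}^{n_i} x_{ij} = 1 - y_i \quad \forall i && \text{(buy exactly one COTS product unless building)}\\
& \sum_{i=1}^{m} \Big( y_i\, c_i^{\mathrm{build}} + (1 - y_i) \sum_{j=1}^{n_i} c_{ij}\, x_{ij} \Big) \le B && \text{(budget)}\\
& \sum_{i=1}^{m} y_i\, s_i^{\mathrm{build}} \le S && \text{(source-lines-of-code limit)}\\
& x_{ij}, y_i \in \{0,1\}.
\end{align*}

In the paper's setting, the consensus recovery block would replace the single per-module reliability term with the reliability of a set of redundant versions voted on at run time, and the cost and time coefficients would be fuzzy numbers rather than crisp constants.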
APA, Harvard, Vancouver, ISO, and other styles
42

Bartel, Timothy, and Mark Finster. "A TQM Process for Systems Integration: Getting the Most from COTS Software." Information Systems Management 12, no. 3 (January 1995): 19–29. http://dx.doi.org/10.1080/07399019508962982.

Full text
APA, Harvard, Vancouver, ISO, and other styles
43

Alford, Molly, Isuru Udugama, Wei Yu, and Brent Young. "Flexible digital twins from commercial off-the-shelf software solutions: a driver for energy efficiency and decarbonisation in process industries?" Chemical Product and Process Modeling 17, no. 4 (August 1, 2022): 395–407. http://dx.doi.org/10.1515/cppm-2021-0045.

Full text
Abstract:
The term 'digital twin' refers to a virtual simulation/model (virtual twin) of a physical plant or object (physical twin), where data flows between the virtual and physical twins. A digital twin can be used for different purposes, such as process optimisation/control, design, training, and maintenance/service. This manuscript found an increasing number of simulation and modelling publications in the literature year on year, which illustrates the current trend towards implementing digital twins in a broad range of process engineering applications. A targeted literature review of the area found several commercial off-the-shelf (COTS) software solutions for different industrial applications, providing the flexibility needed to analyse a broad range of industries. However, most process modelling software is designed for petroleum and fine chemicals processes; there is still a need for software solutions that can model a broader range of applications. While most of the software found was licensed, open-source process modelling software was also available. There is a lack of independent research into the accuracy of these software solutions. The literature review also found that 37% of the research based on process simulation is carried out to improve energy efficiency, while 27% of the research found decarbonisation to be a secondary "added" benefit. It can be concluded that digital twins are ideally suited to driving energy-efficiency improvements and decarbonisation goals. However, none of the COTS solutions identified in the literature meets all the requirements for a digital twin. A solution to this problem is to create a layered digital twin, combining and interfacing different tools to produce a visually similar, self-optimising, self-learning virtual plant.
APA, Harvard, Vancouver, ISO, and other styles
44

Tamimi, Moutasm, and Issam Jebreen. "A Systematic Snapshot of Small Packaged Software Vendors' Enterprises." International Journal of Enterprise Information Systems 14, no. 2 (April 2018): 21–42. http://dx.doi.org/10.4018/ijeis.2018040102.

Full text
Abstract:
This article describes how small packaged software vendors' enterprises (SPSVEs) have played a massive role in the software environment and contributed dramatically to economies. The purpose of this article is to investigate and categorize the most recent literature addressing SPSVEs through a systematic snapshot study, in order to identify current research topics and highlight areas needing more consideration. The authors' systematic approach is based on developing a classification scheme targeting a collection of papers published within the period 2007-2017. The authors analysed one hundred and one papers from peer-reviewed conferences, journals, and workshops to examine the current state of SPSVE research and to provide a systematic snapshot mapping (SSM) covering the small packaged software life cycle, the research methods used, and the country of study. The systematic snapshot of 101 papers reveals that the majority of the literature has focused on the planning and implementation phases of SPSVEs. A new model of the packaged software life cycle in SMEs emerges from applying the classification scheme to the life cycle with its factors and sub-factors; the scheme also yields classifications by research method, region, top-ten citations, and article type. This research is targeted at small packaged software vendors' enterprises (SPSVEs), and the findings are intended for software research areas more than economic research areas. The article offers substantial benefits to researchers engaged in evidence-based decision making when investigating hot research areas in line with SPSVEs.
APA, Harvard, Vancouver, ISO, and other styles
45

Minahil Ahmad. "Multi-core Commercial Off-the-Shelf (COTS) under the Implementation of Fault Tolerance." Lahore Garrison University Research Journal of Computer Science and Information Technology 2, no. 1 (March 30, 2018): 45–54. http://dx.doi.org/10.54692/lgurjcsit.2018.020144.

Full text
Abstract:
Fault-tolerant design is important for high-performance IT systems, in which fault rates have increased as a result of ever-smaller structures. As a substitute for hardware solutions, software-based error-tolerance mechanisms can increase the reliability of commercial off-the-shelf (COTS) multicore processors at lower cost and with sufficient efficiency for practical use. This paper aims to create a hybrid hardware/software approach for current Intel x86 Core and Xeon multicore platforms. Hardware transactional memory support (Intel TSX) is used to create implicit checkpoints and fast rollback; redundant process execution with signature comparison provides error detection, and error recovery is achieved by wrapping execution in transactions. Existing applications, instrumented post-binary for redundant execution, achieve improved performance together with tolerance to compound errors. Evaluation against the SPEC CPU 2006 benchmark suite indicates overheads in cost and rated performance averaging over 47%, assuming the proposed hardware support exists. The paper covers the hardware- and software-level techniques that help remove faults in multicore COTS systems and enable the system to keep working properly.
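As a hedged illustration of the transaction-wrapped redundant-execution idea described above (a minimal sketch assuming an RTM-capable CPU and a compiler flag such as GCC's -mrtm, not the paper's actual implementation), the following C fragment runs a computation twice inside a hardware transaction and aborts, rolling back, when the two result signatures disagree:

#include <immintrin.h>  /* RTM intrinsics: _xbegin, _xend, _xabort */
#include <stdint.h>
#include <stdio.h>

/* Hypothetical computation to protect; its result doubles as a signature. */
static uint64_t compute(uint64_t x)
{
    uint64_t acc = x;
    for (int i = 0; i < 1000; i++)
        acc = acc * 6364136223846793005ULL + 1442695040888963407ULL;
    return acc;
}

/* Execute twice inside one transaction; a mismatch aborts and rolls back,
 * so a transient fault in either run leaves no committed side effects. */
static uint64_t protected_compute(uint64_t x)
{
    for (int retry = 0; retry < 10; retry++) {
        unsigned status = _xbegin();
        if (status == _XBEGIN_STARTED) {
            uint64_t r1 = compute(x);
            uint64_t r2 = compute(x);      /* redundant execution */
            if (r1 != r2)
                _xabort(0xFF);             /* signature mismatch: roll back */
            _xend();                        /* commit */
            return r1;
        }
        /* Transaction aborted (mismatch, capacity, interrupt): retry. */
    }
    /* Fallback: run once unprotected; a real system would vote or safe-stop. */
    return compute(x);
}

int main(void)
{
    printf("result: %llu\n", (unsigned long long)protected_compute(42));
    return 0;
}

The design choice mirrors the abstract: the transaction supplies the implicit checkpoint and fast rollback, while the duplicated run and comparison supply detection, so no explicit state-saving code is needed.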
APA, Harvard, Vancouver, ISO, and other styles
46

Ramdani, Boumediene, Delroy Chevers, and Densil A. Williams. "SMEs' adoption of enterprise applications." Journal of Small Business and Enterprise Development 20, no. 4 (October 28, 2013): 735–53. http://dx.doi.org/10.1108/jsbed-12-2011-0035.

Full text
Abstract:
Purpose – This paper aims to empirically explore the TOE (technology-organisation-environment) factors influencing small to medium-sized enterprises' (SMEs') adoption of enterprise applications (EA). Design/methodology/approach – Direct interviews were used to collect data from a random sample of SMEs located in the northwest of England. Using the partial least squares (PLS) technique, 102 responses were analysed. Findings – Results indicate that technology, organisation and environment contexts impact SMEs' adoption of EA. This suggests that the TOE model is indeed a robust tool to predict the adoption of EA by SMEs. Research limitations/implications – Although this study focused on examining factors that influence SMEs' adoption of a set of systems such as CRM and e-procurement, it fails to differentiate between factors influencing each of these applications. The model used in this study can be used by software vendors not only in developing marketing strategies that can target potential SMEs, but also to develop strategies to increase the adoption of EA among SMEs. Practical implications – This model could be used by software vendors to determine which SMEs they should target with their products. It can also be used by policy makers to develop strategies to increase the rate of EA adoption among SMEs. Originality/value – This paper provides a model that can predict SMEs' adoption of EA. Keywords – SMEs, adoption, enterprise applications, enterprise systems, ICT, PLS, technology-organisation-environment framework, TOE
APA, Harvard, Vancouver, ISO, and other styles
47

Pourali, Parsa, Maria Toeroe, and Ferhat Khendek. "Pattern based configuration generation for highly available COTS components based systems." Information and Software Technology 74 (June 2016): 143–59. http://dx.doi.org/10.1016/j.infsof.2016.02.008.

Full text
APA, Harvard, Vancouver, ISO, and other styles
48

Kartadie, Rikie. "Performance Test of Openflow Agent on Openflow Software-Based Mikrotik RB750 Switch." Scientific Journal of Informatics 3, no. 2 (November 23, 2016): 217–28. http://dx.doi.org/10.15294/sji.v3i2.7987.

Full text
Abstract:
A network is usually built from several devices such as routers, switches, etc. Every device forwards and manipulates data packets using complicated protocols embedded in its hardware. An operator is responsible for the configuration, managing both the rules and the applications applied in the network, and human error may occur when device configuration is carried out manually by the operator. Several well-known vendors, MikroTik among them, have been implementing OpenFlow in their operation, providing an implementation of the SDN/OpenFlow architecture at an affordable cost. The results of the second research phase showed that the software-based MikroTik OpenFlow (OF) switch produced higher latency values than both Mininet and the software-based OpenWRT OF switch. On average, the software-based MikroTik OF switch measured 2012 kbps lower than the software-based OpenWRT OF switch. For the UDP protocol, the average bandwidth throughput of the software-based MikroTik OF switch was 3.6176 kBps lower than that of the software-based OpenWRT OF switch and 8.68 kBps lower than Mininet's, while its average UDP jitter was 0.0103 ms lower than the software-based OpenWRT OF switch's and 0.0093 ms lower than Mininet's.
APA, Harvard, Vancouver, ISO, and other styles
49

Agnesina, Anthony, James Yamaguchi, Christian Krutzik, John Carson, Jean Yang-Scharlotta, and Sung Kyu Lim. "A COTS-Based Novel 3-D DRAM Memory Cube Architecture for Space Applications." IEEE Transactions on Very Large Scale Integration (VLSI) Systems 28, no. 9 (September 2020): 2055–68. http://dx.doi.org/10.1109/tvlsi.2020.3002211.

Full text
APA, Harvard, Vancouver, ISO, and other styles
50

Govindan Kannan, Pravein, Priyanka Naik, Praveen Tammana, and Mythili Vutukuru. "Special Issue on The Workshop on Performance of host-based Network Applications (PerfNA 2022)." ACM SIGMETRICS Performance Evaluation Review 50, no. 2 (August 30, 2022): 45. http://dx.doi.org/10.1145/3561074.3561091.

Full text
Abstract:
With the advancement of highly network-powered paradigms like 5G and microservices, which are typically deployed as containers/VMs, there is a growing imperative for host nodes to perform specialized network tasks like monitoring, filtering, tunneling, and load-balancing. While traditionally these tasks were performed using switches and specialized middleboxes in the network, there is a demand to perform them on commodity hardware comprising COTS servers. However, a major challenge is to perform these tasks with low overhead and high reliability while maintaining low latency, high throughput, and flexibility.
APA, Harvard, Vancouver, ISO, and other styles