Journal articles on the topic 'Timed business protocols'

Consult the top 50 journal articles for your research on the topic 'Timed business protocols.'

1

Mokhtari-Aslaoui, Karima, Salima Benbernou, Soror Sahri, Vasilios Andrikopoulos, Frank Leymann, and Mohand-Said Hacid. "Timed Privacy-Aware Business Protocols." International Journal of Cooperative Information Systems 21, no. 02 (June 2012): 85–109. http://dx.doi.org/10.1142/s0218843012500013.

Abstract:
Web services privacy issues have attracted increasing attention in recent years. As the number of Web services-based business applications grows, so will the demand for privacy-enhancing technologies for Web services. In this paper, we investigate an extension of business protocols, i.e. the specification of which message exchange sequences are supported by a Web service, to accommodate privacy aspects and time-related properties. For this purpose we introduce the notion of Timed Privacy-aware Business Protocols (TPBPs). We also discuss which TPBP properties can be checked and describe their verification process.
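To make the notion concrete: a business protocol is essentially a state machine over message exchanges, and a timed protocol adds time constraints to its transitions. Below is a minimal Python sketch in that spirit; it is far simpler than the TPBPs of the paper, all states, messages, and deadlines are invented, and the privacy annotations are omitted.

```python
# Minimal, illustrative sketch of a timed business protocol: a state machine
# whose transitions are message exchanges guarded by time windows. States,
# messages, and deadlines are invented, not taken from the paper.
from dataclasses import dataclass

@dataclass(frozen=True)
class Transition:
    source: str
    message: str        # message whose exchange triggers the transition
    target: str
    max_delay: float    # seconds allowed since entering `source`

TRANSITIONS = [
    Transition("Start", "login", "LoggedIn", max_delay=float("inf")),
    Transition("LoggedIn", "placeOrder", "Ordered", max_delay=300.0),
    Transition("Ordered", "pay", "Paid", max_delay=3600.0),   # pay within 1 h
    Transition("Ordered", "cancel", "Start", max_delay=3600.0),
]

def conforms(trace):
    """Check whether a timed trace [(message, timestamp), ...] is a
    message-exchange sequence supported by the protocol."""
    state, entered_at = "Start", 0.0
    for message, t in trace:
        for tr in TRANSITIONS:
            if (tr.source == state and tr.message == message
                    and t - entered_at <= tr.max_delay):
                state, entered_at = tr.target, t
                break
        else:
            return False   # no enabled transition: sequence not supported
    return True

print(conforms([("login", 0), ("placeOrder", 100), ("pay", 500)]))   # True
print(conforms([("login", 0), ("placeOrder", 100), ("pay", 9999)]))  # False
```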
2

Elabd, Emad, Emmanuel Coquery, and Mohand-Said Hacid. "From Implicit to Explicit Transitions in Business Protocols." International Journal of Web Services Research 9, no. 4 (October 2012): 69–95. http://dx.doi.org/10.4018/jwsr.2012100104.

Abstract:
Modeling Web services is a major step towards their automated analysis. For the majority of Web services, one of the important parameters in this modeling is time. A Web service can be represented by its behavior, which can be described by a business protocol capturing the possible sequences of message exchanges. To the best of the authors' knowledge, automated analysis of timed Web services (e.g., compatibility and replaceability checking) is very difficult, and in some cases impossible, in the presence of implicit transitions (internal transitions) based on time constraints. The semantics of implicit transitions is the source of this difficulty, because most well-known modeling tools do not express it (e.g., the epsilon transition in timed automata has a different semantics). This paper presents an approach for converting any protocol containing implicit transitions into an equivalent one without implicit transitions before performing analysis.
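As a rough intuition only (the paper's equivalence-preserving construction is more involved than this), an implicit transition changes the protocol's state when a time constraint expires, without any message being exchanged; making it explicit amounts to materializing a synthetic, time-guarded transition that analysis tools can see. All names in this sketch are invented.

```python
# Rough intuition for explicating implicit transitions: give each
# message-less, time-triggered state change an explicit synthetic label
# carrying the same time guard. Not the paper's actual construction.

def explicate(transitions):
    """transitions: dicts with keys source, target, label (None means
    implicit, i.e. no message), and guard (a time constraint string)."""
    explicit = []
    for tr in transitions:
        if tr["label"] is None:                          # implicit transition
            tr = {**tr, "label": f"timeout({tr['guard']})"}
        explicit.append(tr)
    return explicit

protocol = [
    {"source": "Quoted", "target": "Expired", "label": None,     "guard": "t > 30"},
    {"source": "Quoted", "target": "Ordered", "label": "accept", "guard": "t <= 30"},
]
for tr in explicate(protocol):
    print(tr)   # the quote's silent expiry is now an explicit transition
```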
3

AlMahmoud, Abdelrahman, Maurizio Colombo, Chan Yeob Yeun, and Hassan Al Muhairi. "Secure communication protocol for real-time business process monitoring." International Journal of Internet Technology and Secured Transactions 5, no. 3 (2014): 223. http://dx.doi.org/10.1504/ijitst.2014.065183.

4

Biörnstad, Biörn, Cesare Pautasso, and Gustavo Alonso. "Enforcing web services business protocols at run-time: a process-driven approach." International Journal of Web Engineering and Technology 2, no. 4 (2006): 396. http://dx.doi.org/10.1504/ijwet.2006.010422.

5

Rao, N. H. "Electronic Commerce and Opportunities for Agribusiness in India." Outlook on Agriculture 32, no. 1 (March 2003): 29–33. http://dx.doi.org/10.5367/000000003101294235.

Abstract:
Electronic commerce (or e-commerce) using Internet technologies helps businesses to cut costs and cycle times, raise efficiency and provide more information, choice and value to consumers. Agribusinesses in India will need to deploy Internet technologies to gain competitive advantage and avoid isolation from mainstream businesses. Some challenges to becoming e-commerce-enabled are technical (limited infrastructure for Internet access), some are government policy-related (bandwidth, free movement of goods across states, market and trade policies), and some are legal. Many of these challenges are being addressed through both public and private initiatives. Some are specific to agribusiness: for example, relating to scope, regional specificity, the multidisciplinary nature of agricultural services, and trade restrictions on agro-products. Low levels of computer literacy and innumerable local languages compound these challenges. A two-stage strategy is suggested for agribusiness, one for improving operational efficiencies within businesses by using Internet technologies in back office business operations, and the other for delivering both knowledge and products to farmers. The first requires deploying new, generic and cost-effective Internet technologies with open standards and protocols. The second requires using Internet technologies for strategic positioning of products and services to gain long-term competitive advantage. The latter would mean persisting with conventional business strategy while using the Internet as an effective front end.
6

UBERMORGEN, /. "Everything is always (the liquid protocol)." Finance and Society 2, no. 2 (December 19, 2016): 175–79. http://dx.doi.org/10.2218/finsoc.v2i2.1731.

Abstract:
UBERMORGEN are lizvlx (AT, 1973) and Hans Bernhard (CH/USA, 1971), European artists and net.art pioneers. They tenaciously convert code & language and concept & aesthetics into digital objects, software art, net.art, installation, new painting, videos, press releases and actions. CNN described them as ‘maverick Austrian business people’ and the New York Times called Google Will Eat Itself ‘simply brilliant’. Their main influences are Rammstein, Samantha Fox, Guns N’ Roses & Duran Duran, Olanzapine, LSD & Kentucky Fried Chicken’s Coconut Shrimps Deluxe. Visit their website at http://ubermorgen.com
7

Sayeed, Sarwar, and Hector Marco-Gisbert. "Proof of Adjourn (PoAj): A Novel Approach to Mitigate Blockchain Attacks." Applied Sciences 10, no. 18 (September 22, 2020): 6607. http://dx.doi.org/10.3390/app10186607.

Abstract:
The blockchain is a distributed ledger technology that has grown in importance since its inception. Beyond cryptocurrencies, it has crossed its original boundaries, inspiring organizations, enterprises, and business establishments to adopt this technology and benefit from its most innovative security features. Decentralization and immutability have been the key aspects that endorse blockchain as one of the most secure technologies at the present time. In recent times, however, such features have seemed to fade due to new attacking techniques. One of the biggest challenges remains within the consensus protocol itself, an essential component for bringing all network participants to an agreed state. Cryptocurrencies adopt suitable consensus protocols based on their mining requirements, and Proof of Work (PoW) is the consensus protocol that predominates in major cryptocurrencies. Recent consensus protocol-based attacks, such as the 51% attack, Selfish Mining, the Miner Bribe Attack, the Zero Confirmation Attack, and the One Confirmation Attack, have been demonstrated to be feasible. To overcome these attacks, we propose Proof of Adjourn (PoAj), a novel consensus protocol that provides strong protection regardless of attackers' hashing capability. After analyzing the five major attacks and the current protection techniques, indicating the causes of their failure, we compared PoAj against the most widely used PoW, showing that PoAj is able to mitigate not only the five attacks but also attacks relying on a large amount of hashing power. In addition, the proposed PoAj proved to be an effective approach to mitigating the processing-time issue of large-sized transactions. PoAj is not tailored to any particular attack; therefore, it is effective against powerful malicious players. The proposed approach provides a strong barrier not only to current and known attacks but also to future unknown attacks based on different strategies that rely on controlling the majority of the hashing power.
8

Herschel, Richard T., and Nicolle Clements. "The Importance of Storytelling in Business Intelligence." International Journal of Business Intelligence Research 8, no. 1 (January 2017): 26–39. http://dx.doi.org/10.4018/ijbir.2017010102.

Abstract:
This paper examines the relevance and importance of storytelling to business intelligence. Business intelligence provides analytics to inform a decision-making process. However, there are often issues with understanding the analytics presented and contextualizing them within the overall decision-making process. This paper examines these issues and then assesses the value of storytelling in conveying BI findings. The SOAP protocol used for physician/patient clinical encounters is presented to illustrate the benefit of providing structure to a decision-making process. The importance of structure to storytelling is then further examined and shown to help facilitate the communication of data insights. Media richness is also examined to reveal the impact it has on BI storytelling, helping to explain why visualizations have become so important to BI. An example of BI storytelling with a high level of media richness is then presented. The paper concludes that storytelling is inextricably linked to BI success.
9

Rigin, Anton Mikhailovich, and Sergey Andreevich Shershakov. "Method of Performance Analysis of Time-Critical Applications Using DB-Nets." Proceedings of the Institute for System Programming of the RAS 33, no. 3 (2021): 109–22. http://dx.doi.org/10.15514/ispras-2021-33(3)-9.

Abstract:
These days, most time-critical business processes are performed using computer technologies. As an example, one can consider financial processes, including trading on stock exchanges powered by electronic communication protocols such as the Financial Information eXchange (FIX) Protocol. One of the main challenges emerging with such processes concerns maintaining the best possible performance, since any unspecified delay may cause a large financial loss or other damage. Therefore, performance analysis of time-critical systems and applications is required. In the current work, we develop a novel method for the performance analysis of time-critical applications based on the db-net formalism, which combines the ability of colored Petri nets to model a system's control flow with the ability to model relational database states. This method makes it possible to analyze the performance of time-critical applications that work as transactional systems and whose log messages can be represented as table records in a relational database. One such application is a FIX protocol-based trading communication system, which is used in this work to demonstrate the applicability of the proposed method; many similar systems exist in other domains, and the method can be applied to their performance analysis as well. A software prototype, based on an extension of the Renew tool (a reference net simulator), was developed to test and demonstrate the method's abilities. The testing input for the prototype includes a test log of FIX messages provided by a developer of testing solutions for one of the global stock exchanges. An application of the method to the quantitative analysis of maximum-acceptable-delay violations is presented. The developed method allows performance analysis to be conducted as part of conformance checking of the system under consideration, and it can be used both in further research in this domain and in testing the performance of real time-critical software systems.
10

Jain, Tarun, and Bijendra Nath Jain. "Infection Testing at Scale: An Examination of Pooled Testing Diagnostics." Vikalpa: The Journal for Decision Makers 46, no. 1 (March 2021): 13–26. http://dx.doi.org/10.1177/02560909211018906.

Abstract:
Executive Summary: In pandemics or epidemics, public health authorities need to rapidly test a large number of individuals without adequate testing kits. We propose a testing protocol to accelerate infection diagnostics by combining multiple samples and, in case of a positive result, re-testing the individual samples. The key insight is that a negative result in the first stage implies negative infection for all individuals, so a single test can rule out infection in multiple individuals. Using simulations, we show that this protocol reduces the required number of testing kits, especially when the infection rate is low, alleviating a key bottleneck for public health authorities in times of pandemics and epidemics such as COVID-19. Our proposed protocol is expected to be more effective when the infection rate is low, which suggests that it is better suited for early-stage and large-scale, population-wide testing. However, the managerial trade-off is that the protocol has costs in additional time for returning test results and an increased number of false negatives. We discuss applications of pooled testing in population-wide testing to understand infection prevalence, in diagnosing infections in high-risk groups of individuals, and in identifying disease cold spots.
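A small simulation of the two-stage protocol makes the savings concrete; the population, pool size, and prevalence values below are arbitrary illustration choices, not the paper's.

```python
# Two-stage pooled testing: test each pool of k samples once; a negative
# pool clears all k individuals, a positive pool triggers k individual
# re-tests. Assumes a perfectly accurate test for simplicity.
import random

def tests_needed(population, infection_rate, pool_size, rng):
    samples = [rng.random() < infection_rate for _ in range(population)]
    tests = 0
    for i in range(0, population, pool_size):
        pool = samples[i:i + pool_size]
        tests += 1                    # one test for the whole pool
        if any(pool):                 # positive pool: re-test individually
            tests += len(pool)
    return tests

rng = random.Random(42)
for rate in (0.01, 0.05, 0.20):
    used = tests_needed(10_000, rate, pool_size=10, rng=rng)
    print(f"infection rate {rate:4.0%}: {used:5d} tests instead of 10000")
```

Under this scheme the expected number of tests per person is 1/k + 1 − (1−p)^k for pool size k and prevalence p, which is why the savings shrink as the infection rate grows.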
11

Lai, Byron, Chia-Ying Chiu, Emily Pounds, Tracy Tracy, Tapan Mehta, Hui-Ju Young, Emily Riser, and James Rimmer. "COVID-19 Modifications for Remote Teleassessment and Teletraining of a Complementary Alternative Medicine Intervention for People With Multiple Sclerosis: Protocol for a Randomized Controlled Trial." JMIR Research Protocols 9, no. 7 (July 3, 2020): e18415. http://dx.doi.org/10.2196/18415.

Abstract:
Background: Access to comprehensive exercise and rehabilitation services for people with multiple sclerosis (MS) remains a major challenge, especially in rural, low-income areas. Hence, the Tele-Exercise and Multiple Sclerosis (TEAMS) study aims to provide patient-centered, coordinated care by implementing a 12-week complementary and alternative medicine (CAM) intervention for adults with MS. However, due to the societal impact of coronavirus disease (COVID-19) in mid-March 2020, the University of Alabama at Birmingham announced a limited business model halting all nonessential research requiring on-site visits, which includes the TEAMS study. Objective: In compliance with the shelter-in-place policy and quarantine guidance, a modified testing and training protocol was developed to allow participants to continue the study. Methods: The modified protocol, which replaces on-site data collection and training procedures, includes a teleassessment package (computer tablet, blood pressure cuff, hand dynamometer, mini disc cone, measuring tape, an 8” step, and a large-print 8” × 11” paper with ruler metrics and wall-safe tape) and a virtual meeting platform for synchronous interactive training between the therapist and the participant. The teleassessment measures include resting blood pressure and heart rate, grip strength, Five Times Sit to Stand, Timed Up & Go, and the Berg Balance Scale. The teletraining component includes 20 synchronous training sessions of dual tasking, yoga, and Pilates exercises designed and customized for a range of functional levels. Teletraining lasts 12 weeks, and participants are instructed to continue exercising for a posttraining period of 9 months. Results: The protocol modifications were supported with supplemental funding (from the Patient-Centered Outcomes Research Institute) and approved by the University Institutional Review Board for Human Use. At the time nonessential research visits were halted by the university, there were 759 people enrolled and baseline tested, accounting for 92.5% of our baseline testing completion target (N=820). Specifically, 325 participants completed the 12-week intervention and follow-up testing visits, and 289 participants needed to complete either the intervention or follow-up assessments. A modified analysis plan will include sensitivity analyses to ensure the robustness of the study results in the presence of uncertainty and protocol deviations. Study results are projected to be published in 2021. Conclusions: This modified remote teleassessment/teletraining protocol will impact a large number of participants with MS who would otherwise have been discontinued from the study. Trial Registration: ClinicalTrials.gov NCT03117881; https://clinicaltrials.gov/ct2/show/NCT03117881. International Registered Report Identifier (IRRID): DERR1-10.2196/18415
12

Marentes, Luis Andres, Tilman Wolf, Anna Nagurney, and Yezid Donoso. "Towards Pricing Mechanisms for Delay Tolerant Services." International Journal of Computers Communications & Control 11, no. 1 (November 16, 2015): 77. http://dx.doi.org/10.15837/ijccc.2016.1.1438.

Abstract:
One of the applications of Delay Tolerant Networking (DTN) is rural networks. For this application, researchers have argued benefits in lowering costs and in overcoming challenging conditions under which, for instance, protocols such as TCP/IP cannot work because their underlying prerequisites are not satisfied. New responses are required in order to understand the true adoption opportunities of this technology. Constraints in service level agreements and viable alternative pricing schemes are some of the new issues that arise as a consequence of this particular operation mode. In this paper, we propose a novel model for pricing delay tolerant services, which adjusts prices to demand variability subject to constraints imposed by DTN operation. With this model we also show how important parameters such as channel rental costs, providers' cycle times, and market sensitivities affect the business opportunities of operators.
13

Sharma, Gaurav, Stilianos Vidalis, Catherine Menon, Niharika Anand, and Somesh Kumar. "Analysis and Implementation of Threat Agents Profiles in Semi-Automated Manner for a Network Traffic in Real-Time Information Environment." Electronics 10, no. 15 (July 31, 2021): 1849. http://dx.doi.org/10.3390/electronics10151849.

Abstract:
Threat assessment is the continuous process of monitoring the threats identified in the network of the real-time information environment of an organisation and a company's business. Sound security assurance for an organisation's systems and a company's business seems to require that information security practice handle threat agents' attacks unambiguously and effectively. How well does the present-day state of information security practice achieve this? Given the prevalence of threats in the modern information environment, it is essential to guarantee the security of national information infrastructure. However, the existing models and methodologies do not address threat-agent attributes such as motivation, opportunity, and capability (M, O, C), and the critical threat intelligence (CTI) feed on threat agents during the penetration process is ineffective, which raises security assurance concerns for organisations and businesses. This paper proposes a semi-automatic information security model that can deal with situational awareness data, strategies prevailing in information security activities, and protocols for monitoring specific types of network traffic in a real-time information environment. The paper analyses and implements threat assessment of network traffic in one particular real-time information environment. To achieve this, we determined various unique attributes of threat agents from Packet Capture Application Programming Interface (PCAP) files/data streams collected from the network between 2012 and 2019. We used hypothetical and real-world examples of a threat agent to evaluate the three factors of threat agents, i.e., Motivation, Opportunity, and Capability (M, O, C). Based on this, we also designed and determined the threat profiles, critical threat intelligence (CTI), and complexity of threat agents that are not addressed or covered in existing threat-agent taxonomies, models, and methodologies.
14

Maniza, Lalu Hendra, Sudarta Sudarta, and Nur' Aini. "PEMBUATAN ABON IKAN GUNA MEMBANTU EKONOMI KELUARGA DI MASA COVID DI DESA JATISELA KECAMATAN GUNUNGSARI KABUPATEN LOBAR." SELAPARANG Jurnal Pengabdian Masyarakat Berkemajuan 4, no. 2 (May 8, 2021): 492. http://dx.doi.org/10.31764/jpmb.v4i2.4231.

Abstract:
The purpose of this community service project is to help UKM (small and medium enterprise) members in Jatisela village, Gunungsari sub-district, to market their products online and offline, to innovate with raw materials available in their own environment, and to create a source of livelihood that can help meet household needs in difficult times such as the pandemic. The service was carried out through counseling, training, and evaluation of the results achieved. The partners invited were those who want to increase their income and those whose businesses are not developing. Education was provided through socialization and hands-on demonstrations of shredded-fish (abon) production with 13 partners, chaired by one of the UKM members. The three methods carried out by the service team included evaluating how far the partners' businesses developed following the training provided in marketing the shredded-fish business, and their ability to produce shredded fish while following health protocols during the COVID period. Keywords: making shredded fish; helping the family economy; during the pandemic
15

Arshadi, Nasser. "Blockchain Platform for Real-Time Payments: A Less Costly and More Secure Alternative to ACH." Technology & Innovation 21, no. 1 (October 31, 2019): 3–9. http://dx.doi.org/10.21300/21.1.2019.3.

Abstract:
This paper examines the historical development of banking and automated clearing house legacy systems and offers a blockchain platform for real-time payments as an alternative. Institutions with legacy systems resist disruptive change unless there are demands from businesses and customers and an opportunity for new products at reduced costs and additional revenues. Although the Automated Clearing House (ACH) in the U.S. has phased in a twice-a-day clearing and settlement system, it still lags the real-time models in use in several other countries. ACH uses individual clearance and settlement for large payments and batches for smaller transactions. Using ACH data, this paper calculates the annual opportunity cost of a real-time model such as blockchain versus ACH's discrete, twice-a-day clearance and settlement procedure. For 2016, the real-time protocol would have resulted in benefits for businesses and customers of $10 billion.
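The $10 billion figure is computed in the paper from ACH data; the sketch below only shows the generic structure of such a settlement-float opportunity-cost calculation. Every number in it is a placeholder assumption, not one of the paper's inputs.

```python
# Back-of-envelope structure of a settlement-float opportunity cost:
# value in transit earns nothing while it waits for the next clearing
# window. All numbers are placeholders, not the paper's inputs.
annual_volume_usd = 43e12   # hypothetical annual value cleared
avg_delay_days = 0.5        # hypothetical average wait for settlement
annual_rate = 0.02          # hypothetical short-term interest rate

opportunity_cost = annual_volume_usd * (avg_delay_days / 365) * annual_rate
print(f"~${opportunity_cost / 1e9:.1f}B per year forgone while funds are in transit")
```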
16

Mobo, Froilan D. "The Impact of Video Conferencing Platform in All Educational Sectors Amidst Covid-19 Pandemic." Aksara: Jurnal Ilmu Pendidikan Nonformal 7, no. 1 (January 2, 2021): 15. http://dx.doi.org/10.37905/aksara.7.1.15-18.2021.

Abstract:
The impact of the COVID-19 pandemic brought the business world, and with it the economic sector, to a halt. Standard health protocols have been observed, such as social distancing, the mandatory wearing of face masks, and the avoidance of mass gatherings. The researcher proposes using a video conferencing platform alongside the learning management system, because this replaces the face-to-face setup and provides real-time feedback from the students to the teacher. Video conferencing has always been a key ingredient in the recipe for success for enterprises and other educational sectors hoping to connect with customers, remote workers, and even students (BEAUFORD, 2020). The results suggest that current policies and teaching strategies can be adapted due to the outbreak of COVID-19. In relation to previous studies on the use of videoconferencing in higher education, platforms such as Zoom and Google Meet meet the demands, given a broader consideration of the relevant challenges that arise when using certain videoconferencing systems in learning and teaching situations, and can be used in the current scenario (Khatib, 2020). Using video conferencing will not violate any quarantine protocols and will ensure the safety of both students and teachers; in times like this, we really need to adopt the new technology platforms, embracing the effects of COVID-19, which might lead to the opening of the New Normal in all sectors.
17

Cao, Hanyu, Meiying Zhang, Huanxi Cai, Wei Gong, Min Su, and Bin Li. "A zero-forcing beamforming based time switching protocol for wireless powered internet of things system." Journal of Industrial & Management Optimization 16, no. 6 (2020): 2913–22. http://dx.doi.org/10.3934/jimo.2019086.

18

Gili, Claudia, Mauro Vasconi, and Flavio Gagliardi. "Impact of SARS-CoV-2 on Aquaria: An Italian Perspective." Journal of Applied Animal Ethics Research 3, no. 1 (May 4, 2021): 74–90. http://dx.doi.org/10.1163/25889567-bja10015.

Abstract:
Aquatic animals have been kept by humans in confined spaces since very ancient times. In the last century, both the need to expand seafood production and the popularity of aquatic exhibits have facilitated the professional, scientific development of live-fish management techniques. In this context, aquatic animal welfare has become an important standpoint for guaranteeing good and safe seafood quality and sustainable aquaria and zoological collections. At the end of 2019, SARS-CoV-2 severely affected human health in China and shortly became pandemic, influencing most types of businesses globally. All animal industries fully dependent on daily human activities and resources have been severely impacted by human distancing and isolation protocols. During this world crisis, extensive changes in aquarium management procedures had to be applied, and specific contingency plans were developed to protect humans and to guarantee animal care, in order to avoid the risk of aquaria fading away.
19

Fiet, James O., Robert D. Nixon, Mahesh Gupta, and Pankaj C. Patel. "Entrepreneurial Discovery by the Working Poor." Journal of Developmental Entrepreneurship 11, no. 03 (September 2006): 255–73. http://dx.doi.org/10.1142/s1084946706000428.

Abstract:
We test the proposition that it is possible to train the economically vulnerable, working poor of inner cities to make entrepreneurial discoveries. We demonstrate the effective use of a model of constrained, systematic search. We employ an experimental design with a control group using the alertness approach recommended by received theory and a treatment group using systematic search. Our results indicate the systematic search approach works 25 times better among a sample of the working poor. In addition, we operationalize systematic search training protocols and implementation. We conclude by discussing special challenges inherent in training the economically disadvantaged and suggest that the lack of trust of those from outside the local community necessitates the building of bridges to targeted residents. Bridge-building, as an integral part of public policy utilizing large-scale training, might be accomplished through reliance on established community relationships.
20

Boutrous Saab, C., D. Coulibaly, S. Haddad, T. Melliti, P. Moreaux, and S. Rampacek. "An Integrated Framework for Web Services Orchestration." International Journal of Web Services Research 6, no. 4 (October 2009): 1–29. http://dx.doi.org/10.4018/jwsr.2009071301.

Abstract:
Currently, Web services are the subject of active research, due to both industrial and theoretical factors. On one hand, Web services are essential as the design model of applications dedicated to electronic business. On the other hand, this model aims to become one of the major formalisms for designing distributed and cooperative applications in an open environment (the Internet). In this article, the authors focus on two features of Web services. The first concerns the interaction problem: given the interaction protocol of a Web service described in BPEL, how can the appropriate client be generated? Their approach is based on a formal semantics for BPEL via process algebra and yields an algorithm that decides whether such a client exists and synthesizes its description as a (timed) automaton. The second concerns the design process of a service. They propose a method that proceeds by two successive refinements: the service is first described via UML, then refined into a BPEL model, and finally enlarged with Java code using JCSWL, a new language introduced here. Their solutions are integrated in a service development framework that is presented in a synthetic way.
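The paper's synthesis works on a timed process-algebra semantics of BPEL and produces a (timed) automaton; as a crude, untimed intuition only, client generation can be pictured as mirroring the service's protocol, so that the client receives what the service sends and sends what the service receives. The states and messages below are invented.

```python
# Crude "mirroring" intuition for client synthesis: flip the polarity of
# every message exchange in the service's interaction protocol. The real
# algorithm additionally handles time constraints and decides whether a
# client exists at all.
SERVICE = [
    ("s0", "recv", "order",   "s1"),   # service receives an order...
    ("s1", "send", "invoice", "s2"),   # ...then replies with an invoice
    ("s1", "send", "reject",  "s0"),   # ...or rejects it
]

def mirror(protocol):
    flip = {"send": "recv", "recv": "send"}
    return [(src, flip[d], msg, dst) for src, d, msg, dst in protocol]

for transition in mirror(SERVICE):
    print(transition)   # the client: sends the order, receives the reply
```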
21

Schadeberg, Amanda, Marloes Kraan, and Katell G. Hamon. "Beyond métiers: social factors influence fisher behaviour." ICES Journal of Marine Science 78, no. 4 (April 1, 2021): 1530–41. http://dx.doi.org/10.1093/icesjms/fsab050.

Abstract:
Fisheries management is usually supported by technical and financial measurements (i.e. logbooks and market data), which are helpful for ecological or economic assessments. Yet this information is not able to address social heterogeneity and fisher motivations, which are key to understanding fisher behaviour. This case study of the demersal segment in the Netherlands shows that combining quantitative analysis of logbooks with qualitative data collected by engaging with fishers can capture both fishing activity and its motivations, generating a more social understanding of fisher behaviour. A métier analysis of logbook data describes five dominant fishing practices among the selected segment. Twenty-five in-depth interviews with fishers, along with focus groups including other experts, identify three social factors that influence fisher behaviour in the Dutch demersal fleet: business structure, working rhythm, and polyvalence. The results show that motivations for fisher behaviour are more complex than complying with regulations or seeking profit: social factors also influence fishing activity. Furthermore, these social factors have real implications for the impacts of management measures on both fishing communities and the environment, especially in times of change. These results are useful for management strategy development or evaluation because they are feasibly observable through existing data collection protocols.
22

Kumar, Bathula Prasanna, and Edara Srinivasa Reddy. "An Efficient Security Model for Password Generation and Time Complexity Analysis for Cracking the Password." International Journal of Safety and Security Engineering 10, no. 5 (November 30, 2020): 713–20. http://dx.doi.org/10.18280/ijsse.100517.

Abstract:
Passwords are one of the most popular approaches to protecting operating systems and users' data. Most businesses rely on password protection schemes, and secure passwords are extremely important to them. The proposed model aims to impose protection by requiring users to follow protocols when building passwords. The main problem with a password is its consistency or strength, i.e. how easily (or with what difficulty) a third person can guess it and enter the tool you use while claiming to be you. In operating systems, text-based passwords remain the primary form of authentication, despite major improvements in attackers' skills at breaking passwords. The proposed Random Character Utilization with Hashing (RCUH) is used to generate new passwords from user-supplied parameters. The proposed model introduces a new framework that designs a password from nearly 10 user parameters and also analyzes the time required to crack the generated password, as a measure of system strength. To assess the consistency of a set of passwords, an analysis of password-cracking time is also performed. The tests show a close positive correlation between guessing complexity and password consistency. The proposed model is compared with traditional password generation and cracking models; the generated passwords take much longer to crack, which improves system security.
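As an illustration of the general idea only (not the paper's RCUH algorithm, whose roughly 10 user parameters and exact construction are specific to the paper), the sketch below generates a password from random characters, stores only a hash, and makes a crude worst-case cracking-time estimate; the parameter values and the attacker's guess rate are assumptions.

```python
# Generic sketch of random-character password generation with hashing,
# plus a naive exhaustive-search time estimate. Not the paper's RCUH.
import hashlib
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + string.punctuation

def generate_password(user_params, length=12):
    # Mix user-supplied parameters into a salt; draw characters with a CSPRNG.
    salt = hashlib.sha256("|".join(user_params).encode()).hexdigest()
    password = "".join(secrets.choice(ALPHABET) for _ in range(length))
    stored = hashlib.sha256((salt + password).encode()).hexdigest()
    return password, stored       # persist only the hash, never the password

pwd, digest = generate_password(["alice", "1990-01-01", "blue", "london"])
guesses_per_second = 1e10         # assumed offline attacker throughput
keyspace = len(ALPHABET) ** len(pwd)
print(f"worst-case exhaustive search: {keyspace / guesses_per_second:.2e} s")
```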
23

KRABS, WERNER, and STEFAN PICKL. "A GAME-THEORETIC TREATMENT OF A TIME-DISCRETE EMISSION REDUCTION MODEL." International Game Theory Review 06, no. 01 (March 2004): 21–34. http://dx.doi.org/10.1142/s0219198904000058.

Abstract:
We present a game-theoretic treatment of the so-called TEM model, which leads to new results in the area of time-discrete dynamical games. The TEM model describes the economic interaction between several actors (players) who intend to minimize their emissions (E_i) caused by technologies (T_i) by means of expenditures of money (M_i), or financial means, respectively. The index i stands for the i-th player, i=1,…,n. The players are linked by technical cooperations and by the market, which expresses itself in the nonlinear time-discrete dynamics of the Technology-Emissions-Means model, in short: TEM model. In the sense of environmental protection, the aim is to reach a state mentioned in the Kyoto Protocol by choosing the control parameters such that the emissions of each player are minimized. The focal point is the realization of the necessary optimal control parameters via a played cost game, which is determined by the way the actors cooperate. Following the work of G. Leitmann [1974], but not regarding solution sets as feasible sets, the τ-value of S. H. Tijs and T. S. H. Driessen [1986] is taken as a control parameter. This leads to a new class of problems in the area of 1-convex games. We solve the problem by both a non-cooperative and a cooperative treatment. We prove that the core gained by cooperation of the players is nonempty and can be used as the feasible set for our control problem. With this solution, a reasonable model for a Joint-Implementation process is developed, whose necessary fund is represented by the non-empty core of the analyzed game. Steering with parameters from this feasible set, the TEM model can be regarded as a useful tool to implement and verify a technical Joint-Implementation program. For the necessary data given to the Clearing House, we are able to compare the numerical results with real-world phenomena.
24

Wiener, M. B., H. M. Newman, and E. A. Spradley. "Revolutionizing oncology patient enrollment in clinical trials: Just-in-time approach." Journal of Clinical Oncology 25, no. 18_suppl (June 20, 2007): 6577. http://dx.doi.org/10.1200/jco.2007.25.18_suppl.6577.

Abstract:
6577 Background: In the US, adult oncology patient enrollment in clinical trials is estimated at 4% (Cancer 2006; 106(2):426–33). This is a major bottleneck in the development of new cancer treatments, especially for rare indications, like stage IV pancreatic cancer in treatment-naive patients. To overcome the challenge of low enrollment in clinical trials, Pharmatech developed the Just-In-Time (JIT) approach. JIT expands the potential patient population and allows research sites to pre-identify patients using standard-of-care information prior to site activation. This differs from a traditional approach, where sites are activated prior to screening for patients. We postulated the JIT approach would impact patient accrual. Methods: With the JIT approach, a central IRB-approved protocol was presented to 38 research-ready sites, but only 8 sites with a potential patient obtained IRB site approval over a 6-month period. Within 5–10 business days after identifying a potential patient, sites obtained IRB approval, and investigators at the site were trained, received study materials, and dosed the first patient. We compared traditional site activation conducted by the sponsor to the JIT approach in this study. Results: Traditional approach vs JIT approach: no. of patients enrolled: 42 vs 18; no. of months open: 19 vs 6; no. of sites open: 13 vs 8; no. of non-enrolling sites: 2 vs 0; patient accrual rate (pts/mo/site): 0.17 vs 0.38. Sites participating in the JIT approach accrued subjects at more than twice the rate of traditionally activated sites. The JIT approach eliminated non-enrolling sites, increasing sites' and the sponsor's satisfaction with enrollment. JIT also enhanced patient options by providing access to a novel treatment alternative for this rare indication, even at smaller research sites. Conclusions: With JIT, the patient accrual rate, enrollment success, and trial-related cost in a pancreatic cancer study were markedly better than with the traditional approach. Expanding JIT to other rare indications and to trials requiring specific genetic tumor profiles will show whether these results are reproducible and whether JIT is a viable approach for trials with challenging eligibility criteria. No significant financial relationships to disclose.
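The accrual rates quoted above follow from patients ÷ (months × sites); a quick check of that arithmetic (a naive aggregate that ignores how long each individual site was actually open):

```python
# Verify the reported accrual rates as patients / (months * sites).
for label, patients, months, sites in [("traditional", 42, 19, 13),
                                       ("JIT", 18, 6, 8)]:
    print(f"{label}: {patients / (months * sites):.2f} pts/mo/site")
# traditional: 0.17, JIT: 0.38  -> a bit over twice the accrual rate
```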
25

Nosheen, Mariam, Zahwa Sayed, Muhammad Saad Malik, and Muhammad Abuzar Fahiem. "An Evaluation Model for Measuring the Usability of Mobile Office Applications through User Interface Design Metrics." July 2019 38, no. 3 (July 1, 2019): 641–54. http://dx.doi.org/10.22581/muet1982.1903.10.

Abstract:
Usage of mobile devices, and particularly smartphones, has seen an enormous hike due to the advancement of mobile phone technology in recent times. People of different age groups are, one way or another, now connected to different mobile phone applications, such as social networking, chatting, VoIP (Voice over Internet Protocol) applications, and gaming. This rapid advancement has made it necessary for user interface designers of mobile applications to design user-friendly interfaces, so that users can interact with and use those applications with ease irrespective of their location. Usability plays a vital role in measuring the usefulness of such applications. After examining different experimental studies on usability assessment techniques by various researchers, it has been determined that there is still much room for application designers to guarantee more adept and improved usability of offline mobile applications, such as conversion apps, offline encyclopedias, translation apps, and business-use (office) applications. The primary objective of this study is to provide a model of usability metrics for measuring the usability of office applications for smartphones. The effectiveness, usefulness, and reliability of the proposed model are measured through two office applications, Office Suite Pro 7 for Android and Office 365 for a Windows 8 touch-screen smartphone. The results of usability testing and a t-test show the significance of the proposed approach. The model in this study will enable application designers to guarantee more adept and enhanced usability of office applications for smartphones during the design stage.
26

Walther, James H. "Teaching ethical dilemmas in LIS coursework." Bottom Line 29, no. 3 (November 14, 2016): 180–90. http://dx.doi.org/10.1108/bl-05-2016-0020.

Abstract:
Purpose: This paper aims to examine the teaching of library graduate students in an introductory course on the foundations of librarianship. To examine the specific skill of developing an ethical foundation for their future profession of librarianship, an examination is offered here using a multiple-step teaching strategy, introducing specific instructional materials, including a model of assessing ethics and a proposed integration of research skills with problem-based learning (PBL) as the suggested teaching delivery. As the experience produced positive outcomes for student learning, the paper provides not only this operational examination but also the theoretical justification for further adaptation and usage of PBL as a teaching method in library and information science (LIS) education. Details that LIS faculty should consider in implementing the method in teaching, especially on the topic of professional ethics, are described. Methodology/approach: This research project explored a new way of teaching ethical behaviors in the library profession by examining real-world examples of ethics in trade news sources. It was therefore determined that the best strategy was to design a teaching activity that assists students in learning two sets of skills: information-seeking behavior and developing the ethical boundaries and standards that a librarian would use in professional practice. Findings: The process is often taught in a linear manner, but in practice, ethical situations arise and are expressed in non-linear ways. In practice, the profession is rife with ethical questions, yet offers no rules, lists, or checklists for how to behave. Ethical dilemmas are extemporaneous, and yet decisions regarding them can be made with guidance from professional associations combined with thoughtful analysis. Originality/value: Redefining any pedagogical activity in graduate teaching is, at times, more herculean than it seems at the start; yet by distilling the process into workable steps with appropriate protocols, we can successfully teach ethics in new ways. More integration of PBL throughout the LIS curriculum, in a variety of contexts, is hereby advocated.
27

Doevendans, Hans J. T., Nigel Peter Grigg, and Jane Goodyer. "Exploring Lean deployment in New Zealand apple pack-houses." Measuring Business Excellence 19, no. 1 (March 16, 2015): 46–60. http://dx.doi.org/10.1108/mbe-11-2014-0042.

Abstract:
Purpose – This paper aims to present findings from a research project that investigated the suitability of Lean in a seasonal horticultural setting, specifically the New Zealand (NZ) apple and pear (pipfruit) industry. The paper focusses on improvements made while deploying Lean elements in several apple pack-houses. Design/methodology/approach – The literature review discusses how common theoretical Lean themes are not industry- or contextually bound and may be transferable to other industries. An industry-wide survey assesses the current state of knowledge and Lean deployment within the industry using a unique "single-question-per-day" approach. Two case studies and one action research study are used to obtain rich data from organisations that have implemented Lean in recent times. Reliability and validity are achieved by selecting representative samples, using a case study protocol, a single researcher for consistency, participant verification, multiple sources of evidence within cases, and replication logic. Findings – The industry survey shows a low level of knowledge and applied Lean within the industry. Data demonstrate that significant progress is made, using different implementation approaches that lead to a measurable increase in Lean, supported by some positive financial indicators. Research limitations/implications – This research is restricted to NZ apple pack-houses, but indicates that Lean can contribute significantly to horticultural pack-house performance in general. Originality/value – The literature research shows that little research has been done to study Lean in the horticultural field generally, and in the NZ pipfruit industry specifically. This paper contributes to filling that knowledge gap.
28

David, Solomon Arulraj, and Christopher Hill. "Postgraduate students' experiences and perspectives on transformation of teaching and learning in tertiary education." Education + Training 63, no. 4 (February 9, 2021): 562–78. http://dx.doi.org/10.1108/et-05-2020-0122.

Abstract:
Purpose: Tertiary education has been going through dramatic transformation in recent times. Such transformation is seen in teaching and learning at the tertiary level. This study therefore aims to understand the transformation of teaching and learning in tertiary education, particularly by accounting for the experiences and perspectives of postgraduate learners. Design/methodology/approach: The study narrowed higher education transformation down to four key drivers (expansion, excellence, extension, external) and explored their dynamics and impacts on teaching and learning in tertiary education. The data were gathered from 25 doctoral students from three different cohorts, who shared critical reflections on their experiences and perspectives on the transformation of teaching and learning in a reflective journal. The 25 reflective journals were used as the qualitative transcripts for analysis. Standard required ethical protocols were followed in the research. The results were analysed using thematic analysis. Findings: The findings indicate that teaching and learning in higher education are transformed largely using technology, by engaging various stakeholders, several pedagogic methods, a range of assessments, and numerous contents and materials. The findings suggest that higher education transformation has affected teaching and learning in tertiary education positively in the UAE, while identifying some relevant areas for improvement. Research limitations/implications: A single data source and a small sample size (although suitable for the study) are the limitations. The experiences and perspectives of the postgraduate scholars on teaching and learning offer relevant insights for postgraduate learners, academics, researchers, curriculum developers, and policymakers. The study asserts that accounting for students' experiences and perspectives supports the understanding of the transformation of teaching and learning in tertiary education. Originality/value: The study contributes to the ongoing debate on how students are helping shape teaching and learning practices in tertiary education, particularly from the UAE context, using informed critical reflection. The study contends and concludes that teaching and learning in tertiary education continue to be shaped by emerging trends and developments.
29

Nagaraj, Kalyan, Biplab Bhattacharjee, Amulyashree Sridhar, and Sharvani GS. "Detection of phishing websites using a novel twofold ensemble model." Journal of Systems and Information Technology 20, no. 3 (August 13, 2018): 321–57. http://dx.doi.org/10.1108/jsit-09-2017-0074.

Abstract:
Purpose: Phishing is one of the major threats affecting businesses worldwide in current times. Organizations and customers face the hazards arising out of phishing attacks because of anonymous access to vulnerable details. Such attacks often result in substantial financial losses. Thus, there is a need for effective intrusion detection techniques to identify and possibly nullify the effects of phishing. Classifying phishing and non-phishing web content is a critical task in information security protocols, and foolproof mechanisms have yet to be implemented in practice. The purpose of the current study is to present an ensemble machine learning model for classifying phishing websites. Design/methodology/approach: A publicly available data set comprising 10,068 instances of phishing and legitimate websites was used to build the classifier model. Feature extraction was performed by deploying a group of methods, and the relevant features extracted were used for building the model. A twofold ensemble learner was developed by integrating results from a random forest (RF) classifier, fed into a feedforward neural network (NN). Performance of the ensemble classifier was validated using k-fold cross-validation. The twofold ensemble learner was implemented as a user-friendly, interactive decision support system for classifying websites as phishing or legitimate. Findings: Experimental simulations were performed to assess and compare the performance of the ensemble classifiers. The statistical tests estimated that the RF_NN model gave superior performance, with an accuracy of 93.41 per cent and a minimal mean squared error of 0.000026. Research limitations/implications: The research data set used in this study is publicly available and easy to analyze. Comparative analysis with other real-time data sets of recent origin must be performed to ensure generalization of the model against various security breaches. Different variants of phishing threats must be detected rather than focusing particularly on phishing website detection. Originality/value: To the authors' knowledge, the twofold ensemble model has not been applied to the classification of phishing websites in any previous study.
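A minimal sketch of the twofold ensemble described above, assuming scikit-learn and synthetic data in place of the paper's 10,068-instance phishing data set; the forest's class probabilities are stacked onto the original features and fed into a feedforward network. Hyperparameters are illustrative, not the paper's.

```python
# Twofold ensemble sketch: random forest -> feedforward neural network.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
import numpy as np

X, y = make_classification(n_samples=10_000, n_features=30, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Stage 1: random forest produces class-probability estimates.
rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)

# Stage 2: a small feedforward network consumes the original features
# augmented with the forest's probabilities.
nn = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
nn.fit(np.hstack([X_tr, rf.predict_proba(X_tr)]), y_tr)

accuracy = nn.score(np.hstack([X_te, rf.predict_proba(X_te)]), y_te)
print(f"twofold RF->NN accuracy on held-out data: {accuracy:.3f}")
```

A faithful version would, as the paper does, validate with k-fold cross-validation, and would generate the stage-one probabilities out-of-fold to avoid leaking training labels into stage two.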
30

Abraham, Ajith, Sung-Bae Cho, Thomas Hite, and Sang-Yong Han. "Special Issue on Web Services Practices." Journal of Advanced Computational Intelligence and Intelligent Informatics 10, no. 5 (September 20, 2006): 703–4. http://dx.doi.org/10.20965/jaciii.2006.p0703.

Abstract:
Web services – a new breed of self-contained, self-describing, modular applications published, located, and invoked across the Web – handle functions from simple requests to complicated business processes. They are defined as network-based application components with a services-oriented architecture (SOA) using standard interface description languages and uniform communication protocols. SOA enables organizations to grasp and respond to changing trends and to adapt their business processes rapidly without major changes to the IT infrastructure. The Inaugural International Conference on Next-Generation Web Services Practices (NWeSP'05) attracted researchers who are also the world's most respected authorities on the semantic Web, Web-based services, and Web applications and services. NWeSP'05 was held in cooperation with the IEEE Computer Society Task Force on Electronic Commerce, the Technical Committee on Internet, and the Technical Committee on Scalable Computing. This special issue presents eight papers focused on different aspects of Web services and their applications. Papers were selected based on fundamental ideas and concepts rather than the thoroughness of techniques employed. The papers are organized as follows: Taher et al. present the first paper, on a Quality of Service Information and Computational framework (QoS-IC) supporting QoS-based service selection for SOA. The framework's functionality is expanded using a QoS constraints model that establishes an association relationship between different QoS properties and is used to govern QoS-based service selection in the underlying algorithm. Using a prototype implementation, the authors demonstrate how QoS constraints improve QoS-based service selection and save consumers valuable time. Due to the complex infrastructure of web applications, response times perceived by clients may be significantly longer than desired. To overcome some of the current problems, Vilas et al., in the second paper, propose a cache-based extension that enhances the current web services architecture, which is mainly based on program-logic or protocol-dependent optimization. In the third paper, Jo and Yoo present authorization for securing XML sources on the Web. One of the disadvantages of existing access control is that the DOM tree must be loaded into memory while all XML documents are parsed to generate it, so a great deal of memory is consumed by the repetitive tree searches needed to authorize access to every node; the complex authorization evaluation process required thus lowers system performance. Existing access control also fails to consider information structure and semantics sufficiently due to basic HTML limitations. The authors overcome some of these limitations in the proposed model. In the fourth paper, Jung and Cho propose a novel behavior-network-based method for Web service composition. The behavior network selects services automatically through internal and external links with environmental information from sensors and goals. An optimal service is selected at each step, resulting in a globally optimal service sequence for achieving preset goals. The authors detail experimental results for the proposed model by comparing them with a rule-based system and user tests. Kong et al. present, in the fifth paper, an efficient method for merging heterogeneous ontologies. No ontology-building standard currently exists, and the many ontology-building tools available are based on different ontology languages, mostly focusing on how to create, edit, and infer the ontology efficiently. Even ontologies about the same domain differ, because ontology experts hold different viewpoints; for these reasons, interoperability between ontologies is very low. The authors propose merging heterogeneous domain ontologies by overcoming some of the above limitations. In the sixth paper, Chen and Che provide a polynomial-time tree pattern query minimization algorithm whose efficiency stems from two key observations: (i) inherent redundant "components" usually exist inside the rudimentary query provided by the user, and (ii) nonredundant nodes may become redundant when constraints such as co-occurrence and required child/descendant are given. They show that the algorithm obtained by first augmenting the input tree pattern using constraints, then applying minimization, invariably finds a unique minimal equivalent to the original query. In the seventh paper, Chen and Che present a polynomial-time algorithm for tree pattern query (TPQ) minimization without XML constraints. The two-part algorithm is a dynamic programming strategy for finding all matching subtrees within a TPQ, consisting of one part for subtree recognition and a second for subtree deletion. In the last paper, Bagchi et al. present the mobile distributed virtual memory (MDVM) concept and architecture for cellular networks containing server-groups (SG). They detail a two-round randomized distributed algorithm to elect a unique leader and co-leader of the SG that is free of any assumptions about network topology and buffer space limitations, and that is based on dynamically elected coordinators, eliminating single points of failure. As guest editors, we thank all the authors featured in this special issue for their contributions and the referees for critically evaluating the papers within the short time allotted. We sincerely believe that readers will share our enjoyment of this special issue and find the information it presents both timely and useful.
APA, Harvard, Vancouver, ISO, and other styles
31

Vlacic, Ljubo, Toshio Fukuda, Yasuhisa Hasegawa, and Michel Parent. "Special Issue on Cybernetic City Transport Systems and Technologies." Journal of Robotics and Mechatronics 22, no. 6 (December 20, 2010): 683–84. http://dx.doi.org/10.20965/jrm.2010.p0683.

Full text
Abstract:
The publication of this issue was driven by the vision that, in the not too distant future, Cybernetic Transport Systems (CTS) will be seen on city roads and dedicated infrastructures. The World Council for Sustainability has projected that CTS will be seen in cities as early as 2030 (Mobility 2030: Meeting the Challenges to Sustainability; World Business Council for Sustainable Mobility, July 2004). CTS are based on fully automated driverless urban road vehicles (CyberCars). They can also be based on Dual-Mode Vehicles (DMV) - conventional vehicles with Advanced Driver Assistance Technology (ADAT) that are capable of driverless driving on request by a driver. ADAT covers electronic and software products that assist drivers in driving. DMV assumes that a driver is not in control of the vehicle at all times but is fully responsible for vehicle operation throughout. Both CyberCars and DMVs co-operate through vehicle-to-vehicle and vehicle-to-infrastructure communication links, thus enabling cybernetic transport to achieve higher traffic flows and improve network efficiency. The main CTS building blocks are CyberCars and/or Dual-Mode Vehicles, Road Infrastructure Elements, and the CTS Traffic Management & Control Centre. These four blocks are interconnected, integrated, and made interoperable through Communication Architecture and Protocols, and Operational Safety & Reliability Certification Procedures. A variety of CTS concepts have been prototyped and evaluated within the scope of projects such as: (i) Toyota’s Intelligent Multimode Transportation System (http://www.expo2005.or.jp/en/technology/imts.html); (ii) the CyberCars (http://www.cybercars.org) and CyberMove (http://www.cybermove.org) projects; (iii) CityMobil (http://www.citymobil-project.eu/); (iv) Safespot (http://www.safespot-eu.org/); (v) CVIS (http://www.cvisproject.org); (vi) Group Rapid Transit (http://www.2getthere.eu/Group Transit). The figure above shows a CTS prototyped by the CyberCars-2 Project Consortium. Extensive in-field, i.e., on-road, testing of the operational performance of co-operative cybernetic transport solutions was conducted at several road tracks, the last held at La Rochelle, France, in September 2008. This issue addresses a broad spectrum of theoretical and implementational topics related to CTS development and deployment, including: • Cooperative Cybernetic Transport System Architecture • Real-time Decision Making by driverless vehicles • On-road testing of operational performance of CTS • Road-Crossing Landmarks Detection algorithm • Landmark Shape Detection algorithm • Road Shape Estimation algorithm, and • Vehicle-to-road infrastructure (traffic lights) communication solutions. In addition, this issue presents papers that deal with ADAT and analyses: • Acceptability and Usability of a Parking Assistance System for Elderly Drivers • Relationships between Car Accidents and a Driver’s Physiology and Psychology • 2D Localization in Urban Environment, and • Sustainability and Reusability aspects of Common Robotic Technology components. We hope you enjoy the issue!
APA, Harvard, Vancouver, ISO, and other styles
32

Wear, Sandra, Paul G. Richardson, Carolyn Revta, Ravi Vij, Mark Fiala, Sagar Lonial, Dixil Francis, et al. "The Multiple Myeloma Research Consortium (MMRC) Model: Reduced Time to Trial Activation and Improved Accrual Metrics." Blood 116, no. 21 (November 19, 2010): 3803. http://dx.doi.org/10.1182/blood.v116.21.3803.3803.

Full text
Abstract:
Background: The MMRC is a non-profit, disease-focused consortium founded in 2004, comprising 13 North American centers with expertise in multiple myeloma (MM). The MMRC Inc. (Norwalk, CT), MMRC's member institutions, and several pharmaceutical partners work closely to speed early development of new treatment options for MM patients. In December 2007, MMRC headquarters staff implemented multiple project management (PM) business solutions to address trial barriers that delay activation of our phase I-II clinical trials and established trial metric benchmarks considered attainable at our member institutions. In November 2009, we reported initial data [1] on trial activation: MMRC trials initiated between Sep08-Jul09 (n=5) demonstrated a 38% decrease in mean time to first patient dosed (FPFD), compared with the Early Group trials (EG) initiated before PM solutions (Jun06-Sept08; n=7 trials). These results also confirmed improvement over published metrics from Dilts et al. [2, 3]. We present additional data on time to FPFD and now enrollment, to assess the impact of the MMRC PM resources and processes on trial efficiency. Methods: Twenty-one (21) trials conducted within the MMRC from May 2006 to June 2010 had sufficient trial data for review. Data were collected by MMRC-funded project managers at the clinical centers using a web-based clinical trial management system (CTMS). FPFD was defined as the time from the member institutions' receipt of the final protocol (FP) from the trial sponsor to the time the first patient was dosed on the trial at any participating MMRC member institution. With respect to enrollment, the pre-study enrollment commitment (EC) established between MMRC and the study sponsor was defined as the total number of subjects committed to receive at least one dose of study drug across all participating MMRC centers on a trial; the baseline enrollment timeline (BET) was prospectively defined as the target time period to attain EC. Results: Mean time to FPFD was 181 calendar days for the early group trials (EG, n=7) and 122 days for the recent group (RG) trials (n=14, Sep08 – Jun10), representing a 32% reduction of time to FPFD in the RG. Additionally, time from final protocol to first patient dosed at all MMRC centers on a trial decreased 18%, from a mean of 7.7 months (EG) to 6.3 months (RG). With respect to MMRC trial enrollment, data were available for 16/21 trials (5 EG and 11 RG). Two EG trials were missing data and three RG trials were still enrolling as of June 2010. The mean MMRC pre-study EC was 39 subjects per trial (n=16 trials; 626 enrollment target); the mean actual MMRC enrollment was 49 subjects per trial (n=16; 783 enrolled through Jun 29, 2010), representing a 25% increase in actual versus committed enrollment. Two trials did not meet MMRC EC: MMRC investigators discontinued their involvement in these trials at approximately 30% of target EC due to trial complexity or low patient enrollment. 14/16 evaluable trials (88%) met their EC; 11/16 trials (69%) met EC within BET, of which 8/16 trials (50%) reached EC 34% faster than their BET (representing a mean reduction of 4.5 months). The overall pre-study mean BET for 16 trials was 13.1 months. MMRC's actual mean enrollment timeline was 11.3 months for the group of 14 evaluable trials, representing improvement over the original BET by a mean of 1.9 months (14%). We believe FP to FPFD is a more meaningful metric than time to first patient consented.
Moreover, we believe that if all participating trial centers focus their efforts on dosing the first patient within a targeted timeframe, it may improve our research efforts overall. Conclusion: Development of drugs in the clinical setting has become time and resource intensive. Activating and enrolling trials promptly is a priority for drug development. The MMRC's standardization of processes and support for site-based PM resources results in improved trial metrics. MMRC member centers met or exceeded pre-specified enrollment targets in 88% of the trials analyzed to date. Ongoing monitoring of trial conduct continues to reveal areas where increased focus is needed to realize further trial efficiencies. These trial metrics and measures to improve efficiency may be applied with similar expected benefit in all oncology disciplines. Disclosures: Wear: Multiple Myeloma Research Consortium (MMRC): Employment. Richardson: Multiple Myeloma Research Consortium (MMRC): Member Institution of the MMRC; Millennium: Membership on an entity's Board of Directors or advisory committees; Celgene: Membership on an entity's Board of Directors or advisory committees; Novartis: Membership on an entity's Board of Directors or advisory committees; Johnson & Johnson: Membership on an entity's Board of Directors or advisory committees. Revta: Multiple Myeloma Research Consortium (MMRC): Member Institution of the MMRC. Vij: Multiple Myeloma Research Consortium (MMRC): Research Funding; Multiple Myeloma Research Consortium (MMRC): Member Institution of the MMRC. Fiala: Multiple Myeloma Research Consortium (MMRC): Member Institution of the MMRC. Lonial: Bristol-Myers Squibb: Consultancy, Membership on an entity's Board of Directors or advisory committees, Research Funding; Celgene Corporation: Consultancy, Membership on an entity's Board of Directors or advisory committees, Research Funding; Millennium Pharmaceuticals Inc: Consultancy, Membership on an entity's Board of Directors or advisory committees, Research Funding; Novartis Pharmaceuticals: Consultancy, Membership on an entity's Board of Directors or advisory committees, Research Funding; Onyx: Consultancy, Membership on an entity's Board of Directors or advisory committees, Research Funding. Francis: Multiple Myeloma Research Consortium (MMRC): Member Institution of the MMRC. Siegel: Multiple Myeloma Research Consortium (MMRC): Member Institution of the MMRC; Celgene: Speakers Bureau; Millennium: Speakers Bureau. Schramm: Multiple Myeloma Research Consortium (MMRC): Member Institution of the MMRC. Jakubowiak: Multiple Myeloma Research Consortium (MMRC): Member Institution of the MMRC, Research Funding; Millennium: Honoraria, Membership on an entity's Board of Directors or advisory committees, Speakers Bureau; Celgene: Honoraria, Speakers Bureau; Centocor Ortho-Biotech: Honoraria, Speakers Bureau; Exelixis: Honoraria, Speakers Bureau; Bristol-Myers Squibb: Honoraria, Membership on an entity's Board of Directors or advisory committees, Speakers Bureau. Harvey: Multiple Myeloma Research Consortium (MMRC): Member Institution of the MMRC. Reece: Multiple Myeloma Research Consortium (MMRC): Member Institution of the MMRC; Celgene: Honoraria, Research Funding; Ortho Biotech: Honoraria, Research Funding; Merck: Honoraria, Research Funding; Amgen: Honoraria, Research Funding; Facet: Research Funding; Otsuka: Honoraria, Research Funding. Gul: Multiple Myeloma Research Consortium (MMRC): Member Institution of the MMRC.
Jagannath: Celgene: Honoraria; Millennium: Honoraria; Orthobiotec: Honoraria; Onyx Pharma: Honoraria; Merck: Honoraria; Proteolix: Honoraria; Multiple Myeloma Research Consortium (MMRC): Member Institution of the MMRC. La: Multiple Myeloma Research Consortium (MMRC): Member Institution of the MMRC. Hofmeister: Multiple Myeloma Research Consortium (MMRC): Member Institution of the MMRC. Jansak: Multiple Myeloma Research Consortium (MMRC): Member Institution of the MMRC. Stewart: Millennium: Consultancy; Celgene: Honoraria; Multiple Myeloma Research Consortium: Member Institution of the MMRC. Hagerty: Mayo Clinic: Employment; Multiple Myeloma Research Consortium (MMRC): Member Institution of the MMRC. Wolf: Multiple Myeloma Research Consortium (MMRC): Member Institution of the MMRC; Celgene: Speakers Bureau; Millennium: Speakers Bureau; Novartis: Speakers Bureau; Orthobiotech: Speakers Bureau. Davis: Multiple Myeloma Research Consortium (MMRC): Member Institution of the MMRC. Krishnan: Celgene: Speakers Bureau; Multiple Myeloma Research Consortium (MMRC): Member Institution of the MMRC. Duarte: Multiple Myeloma Research Consortium (MMRC): Member Institution of the MMRC. Zimmerman: Millennium, Celgene: Speakers Bureau. Cisneros: Multiple Myeloma Research Consortium (MMRC): Member Institution of the MMRC. Kumar: Celgene: Consultancy, Research Funding; Millennium: Research Funding; Merck: Consultancy, Research Funding; Novartis: Research Funding; Genzyme: Consultancy, Research Funding; Cephalon: Research Funding; Bayer: Research Funding; Multiple Myeloma Research Consortium: Member Institution of the MMRC. Birgin: Multiple Myeloma Research Consortium (MMRC): Member Institution of the MMRC. Ott: Multiple Myeloma Research Consortium (MMRC): Employment. Tasca: Multiple Myeloma Research Consortium (MMRC): Employment. Kelley: Multiple Myeloma Research Consortium (MMRC): Employment. Anderson: Millennium: Consultancy; Multiple Myeloma Research Consortium (MMRC): Member Institution of the MMRC; Celgene: Consultancy; Novartis: Consultancy; Onyx: Consultancy; Merck: Consultancy; Bristol-Myers Squibb: Consultancy; Acetylon: Equity Ownership, Membership on an entity's Board of Directors or advisory committees. Giusti: Multiple Myeloma Research Consortium (MMRC): Employment.
APA, Harvard, Vancouver, ISO, and other styles
33

Panagiotou, Thomai, and Robert J. Fisher. "Producing micron- and nano-size formulations for functional foods applications." Functional Foods in Health and Disease 3, no. 7 (July 8, 2013): 274. http://dx.doi.org/10.31989/ffhd.v3i7.48.

Full text
Abstract:
Background: Nutrient deficiencies affect the health and wellness of large populations around the world. For example, the majority suffer from vitamin, essential fatty acid (such as omega-3), dietary fiber, and other important ingredient deficiencies due to their limited supply in the human food chain. Current trends in the nutraceutics industry toward placing these substances in higher, more efficiently dispersed quantities in our food have become critically essential to its business plans. Nutrients in the form of small solids or droplets improve bioavailability. However, there remain numerous barriers to the successful implementation of cost-effective manufacturing processes. These challenges are addressed in the work presented here, with particular focus on stability, bioavailability, and consumer acceptance. The goal is to develop large-scale manufacturing systems that implement efficient platform technologies, with their respective operational maps, to produce functional food formulations with particle sizes of these specially formulated nutraceutical ingredients in the micron and nano range. Objective: Demonstrating that stable micron- and nano-size emulsions, liposomes, and aqueous suspensions of functional food formulations can be produced using both “top-down” and “bottom-up” methods is our main objective. Addressing the challenges associated with the incorporation of these ingredients into large-scale manufacturing systems, mainly mechanical stability and related shelf-life issues, is also a focus. That is, the aim is to develop proper processing protocols providing improved-quality foods enriched with ingredients that are in limited supply in our food chain, to enhance human health and wellness worldwide. Methods: The formulations considered here are typical of those used for increasing the bioavailability of the infused, specially formulated ingredients with anti-cancer, anti-aging, and general wellness properties, for lowering fat content, and for enhancing shelf-life stability. Included are (a) an oil-in-water (fish oil/omega-3) emulsion, (b) liposome chaperones for vitamin C, and (c) aqueous suspensions (curcumin crystals, lutein/carotenoids, and fiber in soy milk). The production techniques include both “top-down” particle size reduction and “bottom-up” formation of crystals/precipitates via solubility adjustments. Both techniques are based on high-shear processing of multiple liquid feeds; using an impinging jet system, micro-mixing scales of less than 100 nm were obtained. Results: (a) All nano-emulsion types – single, double, and larger, either oil-in-water or water-in-oil – can effectively be produced from various formulations using “top-down” methods. Illustrated here are single, oil-in-water systems; concentrations of 12-14 wt.% fish oil/omega-3 were mixed with water containing food-grade surfactants. The high-shear processing produced stable submicron particles, with median particle sizes of 119-163 nm and no particles larger than 1 micron, and the “fish” odor was suppressed. Pertinent discussions related to the other types are also given as suggested path-forward approaches for the development of nutrient-enriched functional foods. This includes water-in-oil formulations for reduced fat content and the delivery of multiple species via double and triple emulsions, as compared to liposome configurations.
(b) Although liposomes may be used to encapsulate both hydrophobic and hydrophilic substances, we selected liposomal vitamin C as our initial proof-of-concept system since it is absorbed into the body over four times more easily than its non-encapsulated form. After top-down processing, the median size was 200 nm, compared to a median size of about 5 microns obtained by traditional self-assembly protocols. (c) Aqueous suspensions of micron- and nano-size formulations were also achieved. The top-down size reduction technique was used for processing soybean fibers and lutein, and the bottom-up method for curcumin crystals. The fibers initially had a median size of 150 microns, and a bimodal distribution was obtained after processing: 99% of the particles were smaller than 15 microns, with median sizes at 10 microns and the larger peak at about 200 nm. The curcumin submicron particles were formed via anti-solvent crystallization, with stable particles in the range of 300-500 nm. Conclusions: Our study demonstrates that stable micron- and nano-size emulsions, liposomes, and aqueous suspensions can be produced using both “top-down” and “bottom-up” methods. The formulation properties, in terms of particle size and stability, strongly depend on the processing parameters used, in terms of energy input and temperature history. The energy requirements of the “bottom-up” methods may be substantially lower than those of “top-down” methods. Although some of the processes presented here have been scaled up to commercial levels, more work is needed in terms of fully assessing the bioavailability of the produced formulations and optimizing the processes to minimize cost. Key words: nano-emulsion, nano-suspension, high-shear processing, crystallization, curcumin, fish oil, liposomal vitamins: C and E, lutein, nutraceuticals, omega-3, soybean fiber
APA, Harvard, Vancouver, ISO, and other styles
34

Siembieda, William. "Toward an Enhanced Concept of Disaster Resilience: A Commentary on Behalf of the Editorial Committee." Journal of Disaster Research 5, no. 5 (October 1, 2010): 487–93. http://dx.doi.org/10.20965/jdr.2010.p0487.

Full text
Abstract:
1. Introduction This Special Issue (Part 2) expands upon the theme “Building Local Capacity for Long-term Disaster Resilience” presented in Special Issue Part 1 (JDR Volume 5, Number 2, April 2010) by examining the evolving concept of disaster resilience and providing additional reflections upon various aspects of its meaning. Part 1 provided a mixed set of examples of resiliency efforts, ranging from administrative challenges of integrating resilience into recovery to the analysis of hazard mitigation plans directed toward guiding local capability for developing resiliency. Resilience was broadly defined in the opening editorial of Special Issue Part 1 as “the capacity of a community to: 1) survive a major disaster, 2) retain essential structure and functions, and 3) adapt to post-disaster opportunities for transforming community structure and functions to meet new challenges.” In this editorial essay we first explore in Section 2 the history of resilience and then locate it within current academic and policy debates. Section 3 presents summaries of the papers in this issue. 2. Why is Resilience a Contemporary Theme? There is growing scholarly and policy interest in disaster resilience. In recent years, engineers [1], sociologists [2], geographers [3], economists [4], public policy analysts [5, 6], urban planners [7], hazards researchers [8], governments [9], and international organizations [10] have all contributed to the literature about this concept. Some authors view resilience as a mechanism for mitigating disaster impacts, with framework objectives such as resistance, absorption, and restoration [5]. Others, who focus on resiliency indicators, see it as an early warning system to assess community resiliency status [3, 8]. Recently, it has emerged as a component of social risk management that seeks to minimize social welfare loss from catastrophic disasters [6]. Manyena [11] traces scholarly exploration of resilience as an operational concept back at least five decades. Interest in resilience began in the 1940s with studies of children and trauma in the family and in the 1970s in the ecology literature as a useful framework to examine and measure the impact of assault or trauma on a defined eco-system component [12]. This led to modeling resilience measures for a variety of components within a defined ecosystem, leading to the realization that the systems approach to resiliency is attractive as a cross-disciplinary construct. The ecosystem analogy, however, has limits when applied to disaster studies in that, historically, all catastrophic events have changed the place in which they occurred and a “return to normalcy” does not occur. This is true for modern urban societies as well as traditional agrarian societies. The adoption of “The Hyogo Framework for Action 2005-2015” (also known as The Hyogo Declaration) provides a global linkage and follows the United Nations 1990s International Decade for Natural Disaster Reduction effort.
The 2005 Hyogo Declaration’s definition of resilience is: “The capacity of a system, community or society potentially exposed to hazards to adapt by resisting or changing in order to reach and maintain an acceptable level of functioning and structure.” The proposed measurement of resilience in the Hyogo Declaration is determined by “the degree to which the social system is capable of organizing itself to increase this capacity for learning from past disasters for better future protection and to improve risk reduction measures.” While very broad, this definition contains two key concepts: 1) adaptation, and 2) maintaining acceptable levels of functioning and structure. While adaptation requires certain capacities, maintaining acceptable levels of functioning and structure requires resources, forethought, and normative action. Some of these attributes are now reflected in the 2010 National Disaster Recovery Framework published by the U.S. Federal Emergency Management Agency (FEMA) [13]. With the emergence of this new thinking on resilience related to disasters, it is now a good time to reflect on the concept and assess what has recently been said in the literature. Bruneau et al. [1] offer an engineering sciences definition for community seismic resilience: “The ability of social units (e.g., organizations, communities) to mitigate hazards, contain the effects of disasters when they occur, and carry out recovery activities in ways that minimize social disruption and mitigate the effects of future earthquakes.” Rose [4] writes that resiliency is the ability of a system to recover from a severe shock. He distinguishes two types of resilience: (1) inherent – ability under normal circumstances and (2) adaptive – ability in crisis situations due to ingenuity or extra effort. By opening up resilience to categorization, he provides a pathway to establish multi-disciplinary approaches, something that is presently lacking in practice. Rose is most concerned with business disruption, which can take extensive periods of time to correct. In order to make resource decisions that lower overall societal costs (economic, social, governmental and physical), Rose calls for the establishment of measurements that function as resource decision allocation guides. This has been done in part through risk transfer tools such as private insurance. However, it has not been well adopted by governments in deciding how to allocate mitigation resources. We need to ask why interest in resilience has grown. Manyena [11] argues that the concept of resilience has gained currency without obtaining clarity of understanding, definition, substance, philosophical dimensions, or applicability to disaster management and sustainable development theory and practice. It is evident that the “emergency management model” does not itself provide sufficient guidance for policymakers since it is too command-and-control-oriented and does not adequately address mitigation and recovery. Also, large disasters are increasingly viewed as major disruptions of the economic and social conditions of a country, state/province, or city. Lowering post-disaster costs (human life, property loss, economic advancement and government disruption) is being taken more seriously by government and civil society. The lessening of costs is not something the traditional “preparedness” stage of emergency management has concerned itself with; this is an existing void in meeting the expanding interests of government and civil society.
The concept of resilience helps further clarify the relationship between risk and vulnerability. If risk is defined as “the probability of an event or condition occurring” [14], then it can be reduced through physical, social, governmental, or economic means, thereby reducing the likelihood of damage and loss. Nothing can be done to stop an earthquake, volcanic eruption, cyclone, hurricane, or other natural event, but the probability of damage and loss from natural and technological hazards can be addressed through structural and non-structural strategies. Vulnerability is the absence of capacity to resist or absorb a disaster impact. Changes in vulnerability can then be achieved by changes in these capacities. In this regard, Franco and Siembieda describe in this issue how coastal cities in Chile had low resilience and high vulnerability to the tsunami generated by the February 2010 earthquake, whereas modern buildings had high resilience and, therefore, were much less vulnerable to the powerful earthquake. We also see how the framework for policy development can change through differing perspectives. Eisner discusses in this issue how local non-governmental social service agencies are building their resilience capabilities to serve target populations after a disaster occurs, becoming self-renewing social organizations and demonstrating what Leonard and Howett [6] term “social resilience.” All of the contributions to this issue illustrate the lowering of disaster impacts and strengthening of capacity (at the household, community or governmental level) for what Alesch [15] terms “post-event viability” – a term reflecting how well a person, business, community, or government functions after a disaster in addition to what they might do prior to a disaster to lessen its impact. Viability might become the definition of recovery if it can be measured or agreed upon. 3. Contents of This Issue The insights provided by the papers in this issue contribute greater clarity to an understanding of resilience, together with its applicability to disaster management. In these papers we find tools and methods, process strategies, and planning approaches. There are five papers focused on local experiences, three on state (prefecture) experiences, and two on national experiences. The papers in this issue reinforce the concept of resilience as a process, not a product, because it is the sum of many actions. The resiliency outcome is the result of multiple inputs from the level of the individual and, at times, continuing up to the national or international organizational level. Through this exploration we see that the “resiliency” concept accepts that people will come into conflict with natural or anthropogenic hazards. The policy question then becomes how to lower the impact(s) of the conflict through “hard or soft” measures (see the Special Issue Part 1 editorial for a discussion of “hard” vs. “soft” resilience). Local level: Go Urakawa and Haruo Hayashi illustrate how post-disaster operations for public utilities can be problematic because many practitioners have no direct experience in such operations, noting that the formats and methods normally used in recovery depend on personal skills and effort. They describe how these problems are addressed by creating manuals on measures for effectively implementing post-disaster operations. They develop a method to extract priority operations using business impact analysis (BIA) and project-management-based business flow diagrams (BFD).
Their article effectively illustrates the practical aspects of strengthening the resiliency of public organizations. Richard Eisner presents the framework used to initiate the development and implementation of a process to create disaster resilience in faith-based and community-based organizations that provide services to vulnerable populations in San Francisco, California. A major project outcome is the Disaster Resilience Standard for Community- and Faith-Based Service Providers. This “standard” has general applicability for use by social service agencies in the public and non-profit sectors. Alejandro Linayo addresses the growing issue of technological risk in cities. He argues for the need to understand an inherent conflict between how we occupy urban space and the technological risks created by hazardous chemicals, radiation, oil and gas, and other hazardous materials storage and movement. The paper points out that information and procedural gaps exist in terms of citizen knowledge (the right to know) and local administrative knowledge (missing expertise). Advances and experience accumulated by the Venezuela Disaster Risk Management Research Center in identifying and integrating technological risk treatment for the city of Merida, Venezuela, are highlighted as a way to move forward. L. Teresa Guevara-Perez presents the case that certain urban zoning requirements in contemporary cities encourage and, in some cases, enforce the use of building configurations that have long been recognized by earthquake engineering as seismically vulnerable. Using Western Europe and the Modernist architectural movement, she develops the historical case for understanding discrepancies between urban zoning regulations and seismic codes that have led to vulnerable modern building configurations, and traces the international dissemination of architectural and urban planning concepts that have generated vulnerability in contemporary cities around the world. Jung Eun Kang, Walter Gillis Peacock, and Rahmawati Husein discuss an assessment protocol for Hazard Mitigation Plans applied to 12 coastal hazard zone plans in the state of Texas in the U.S. The components of these plans are systematically examined in order to highlight their respective strengths and weaknesses. The authors describe an assessment tool, the plan quality score (PQS), composed of seven primary components (vision statement, planning process, fact basis, goals and objectives, inter-organizational coordination, policies & actions, and implementation), as well as a component quality score (CQS). State (Prefecture) level: Charles Real presents the Natural Hazard Zonation Policies for Land Use Planning and Development in California in the U.S. California has established state-level policies that utilize knowledge of where natural hazards are more likely to occur to enhance the effectiveness of land use planning as a tool for risk mitigation. Experience in California demonstrates that a combination of education, outreach, and mutually supporting policies that are linked to state-designated natural hazard zones can form an effective framework for enhancing the role of land use planning in reducing future losses from natural disasters. Norio Maki, Keiko Tamura, and Haruo Hayashi present a method for local government stakeholders involved in pre-disaster plan making to describe performance measures through the formulation of desired outcomes.
Through a case study approach, Nara and Kyoto Prefectures’ separate experiences demonstrate how to conduct Strategic Earthquake Disaster Reduction Plans and Action Plans that have deep stakeholder buy-in and outcome measurability. Nara’s plan was prepared from 2,015 stakeholder ideas and Kyoto’s plan was prepared from 1,613 stakeholder ideas. Having a quantitative target for individual objectives ensures the measurability of plan progress. Both jurisdictions have undertaken evaluations of plan outcomes. Sandy Meyer, Eugene Henry, Roy E. Wright and Cynthia A. Palmer present the State of Florida in the U.S. and its experience with pre-disaster planning for post-disaster redevelopment. Drawing upon the lessons learned from the impacts of the 2004 and 2005 hurricane seasons, local governments and state leaders in Florida sought to find a way to encourage behavior that would create greater community resiliency in 2006. The paper presents initial efforts to develop a post-disaster redevelopment plan (PDRP), including the experience of a pilot county. National level: Bo-Yao Lee provides a national perspective: New Zealand’s approach to emergency management, where all hazard risks are addressed through devolved accountability. This contemporary approach advocates collaboration and coordination, aiming to address all hazard risks through the “4Rs” – reduction, readiness, response, and recovery. Lee presents the impact of the Resource Management Act (1991), the Civil Defence Emergency Management Act (2002), and the Building Act (2004) that comprise the key legislation influencing and promoting integrated management for environment and hazard risk management. Guillermo Franco and William Siembieda provide a field assessment of the February 27, 2010, M8.8 earthquake and tsunami event in Chile. The paper presents an initial damage and life-loss review and an assessment of seismic building resiliency, along with the country’s rapid updating of building codes, which have undergone continuous improvement over the past 60 years. The country’s land use planning system and its emergency management system are also described. The review of insurance coverage reveals problems in seismic coverage for homeowners. The unique role of the Catholic Church in providing temporary shelter and the central government’s five-point housing recovery plan are presented. A weakness in the emergency management system’s early tsunami response is noted. Acknowledgements The Editorial Committee extends its sincere appreciation to both the contributors and the JDR staff for their patience and determination in making Part 2 of this special issue possible. Thanks also to the reviewers for their insightful analytic comments and suggestions. Finally, the Committee wishes to again thank Bayete Henderson for his keen and thorough editorial assistance and copy editing support.
APA, Harvard, Vancouver, ISO, and other styles
35

Cobanoglu, Cihan, Muhittin Cavusoglu, and Gozde Turktarhan. "A beginner’s guide and best practices for using crowdsourcing platforms for survey research: The Case of Amazon Mechanical Turk (MTurk)." Journal of Global Business Insights 6, no. 1 (March 2021): 92–97. http://dx.doi.org/10.5038/2640-6489.6.1.1177.

Full text
Abstract:
Introduction Researchers around the globe are utilizing crowdsourcing tools to reach respondents for quantitative and qualitative research (Chambers & Nimon, 2019). Many social science and business journals are receiving studies that utilize crowdsourcing tools such as Amazon Mechanical Turk (MTurk), Qualtrics, MicroWorkers, ShortTask, ClickWorker, and Crowdsource (e.g., Ahn & Back, 2019; Ali et al., 2021; Esfahani & Ozturk, 2019; Jeong & Lee, 2017; Zhang et al., 2017). Even though the use of these tools presents a great opportunity for gathering large quantities of data quickly, some challenges must also be addressed. The purpose of this guide is to present the basic ideas behind the use of crowdsourcing for survey research and provide a primer for best practices that will increase their validity and reliability. What is crowdsourcing research? Crowdsourcing describes the collection of information, opinions, or other types of input from a large number of people, typically via the internet, who may or may not receive (financial) compensation (Hargrave, 2019; Oxford Dictionary, n.d.). Within the behavioral science realm, crowdsourcing is defined as the use of internet services for hosting research activities and for creating opportunities for a large population of participants. Applications of crowdsourcing techniques have evolved over the decades, establishing the strong informational power of crowds. The advent of Web 2.0 has expanded the possibilities of crowdsourcing, with new online tools such as online reviews, forums, Wikipedia, Qualtrics, or MTurk, but also other platforms such as Crowdflower and Prolific Academic (Peer et al., 2017; Sheehan, 2018). Crowdsourcing platforms in the age of Web 2.0 use remote labor recruited via the internet to help employers complete tasks that cannot be left to machines. Key characteristics of crowdsourcing include payment for workers, their recruitment from any location, and the completion of tasks (Behrend et al., 2011). They also allow for a relatively quick collection of data compared to data collection in the field, and participants are rewarded with an incentive – often financial compensation. Crowdsourcing not only offers a large participation pool but also a streamlined process for study design, participant recruitment, and data collection, as well as an integrated participant compensation system (Buhrmester et al., 2011). Also, compared to traditional marketing firms, crowdsourcing makes it easier to detect possible sampling biases (Garrow et al., 2020). Due to advantages such as reduced costs, diversity of participants, and flexibility, crowdsourcing platforms have surged in popularity among researchers. Advantages MTurk is one of the most popular crowdsourcing platforms among researchers, allowing Requesters to submit tasks for Workers to complete (Cummings & Sibona, 2017). MTurk has been used as an online crowdsourcing platform for the recruitment of human subjects for research purposes (Paolacci & Chandler, 2014). Research has also shown MTurk to be a reliable and cost-effective tool, capable of providing representative data for research in the behavioral sciences (e.g., Crump et al., 2013; Goodman et al., 2013; Mason & Suri, 2012; Rand, 2012; Simcox & Fiez, 2014). In addition to its use in social science studies, the platform has been used in marketing, hospitality and tourism, psychology, political science, communication, and sociology contexts (Sheehan, 2018).
To illustrate, between 2012 and 2017, more than 40% of the studies published in the Journal of Consumer Research used crowdsourcing websites for their data collection (Goodman & Paolacci, 2017). Disadvantages Although researchers have assessed crowdsourcing platforms as reliable and cost-effective for data collection in the behavioral sciences, they are not exempt from flaws. One disadvantage is the possibility of unsatisfactory data quality. In fact, the virtual setting of the survey implies that the investigator is physically separated from the participant, and this lack of monitoring could lead to data quality issues (Sheehan, 2018). In addition, participants in survey research on crowdsourcing platforms are not always who they claim to be, creating issues of trust with the data provided and, ultimately, the quality of the research findings (McGonagle, 2015; Smith et al., 2016). A recurrent concern with MTurk workers, for instance, is their assessment as experienced survey takers (Chandler et al., 2015). This experience is mainly acquired through the completion of dozens of surveys per day, especially when workers are faced with similar items and scales. Smith et al. (2016) identified two types of problems in performing data collection using MTurk, namely cheaters and speeders. As compared to Qualtrics – which has strict screening and quality-control processes to ensure that participants are who they claim to be – MTurk appears to be less exigent regarding its workers. However, a downside of data collection with Qualtrics is its more expensive fees – about $5.00 per questionnaire on Qualtrics, against $0.50 to $1.50 on MTurk (Ford, 2017). Hence, few researchers have been able to conduct surveys and compare respondent pools with Qualtrics or other traditional marketing research firms (Garrow et al., 2020). Another challenge using MTurk arises when trying to collect a desired number of responses from a population targeted to a specific city or area (Ross et al., 2010). The issues inherent to the selection process of MTurk have been the subject of investigation in several studies (e.g., Berinsky et al., 2012; Chandler et al., 2014; 2015; Harms & DeSimone, 2015; Paolacci et al., 2010; Rand, 2012). Feitosa et al. (2015) pointed out that international respondents may still identify themselves as U.S. respondents with the use of fake addresses and accounts. They found that 5% to 10% of participants identifying themselves as U.S. respondents were actually from overseas locations. Moreover, Babin et al. (2016) showed that the use of trap questions allowed researchers to uncover that many respondents change their genders, ages, careers, or income within the course of a single survey. The issues of (a) experienced workers and the quality control of questions and (b) speeders, which for MTurk can be attributed to the platform being the main source of revenue for a given respondent, remain the inherent issues of crowdsourcing platforms used for research purposes. Best practices Some best practices can be recommended in the use of crowdsourcing platforms for data collection purposes. Worker IDs can be matched with IDs from previous studies, allowing researchers to exclude responses from workers who had answered previous similar studies (Goodman & Paolacci, 2017). Furthermore, researchers can manually assign qualifications on MTurk prior to data collection (Litman et al., 2015; Park & Park, 2020).
When dealing with experienced workers, it is also recommended both to use multiple attention checks and to optimize the survey so that participants are exposed to the stimuli for a sufficient length of time to properly address the questions (Sheehan, 2018). In this sense, shorter surveys are preferred to longer ones, which affect participants' concentration and may, in turn, adversely impact the quality of their answers. Most importantly, pretest the survey to make sure that all parts are working as expected. Researchers should also keep in mind that, in the context of MTurk, the primary method for measurement is the web interface. Thus, to avoid method biases, researchers should ponder whether or not method factors emerge in the latent measurement models (Podsakoff et al., 2012). As such, time-lagged research designs may be preferred, as predictor and criterion variables can be measured at different points in time or administered on different platforms, such as Qualtrics vs. MTurk (Cheung et al., 2017). In general, the use of crowdsourcing platforms including MTurk may be appropriate according to the research question, and the quality of the data is reliant on the quality-control strategies used by researchers to enhance data quality. Trade-offs between various validity types need to be prioritized according to the research objectives (Cheung et al., 2017). From our experience using crowdsourcing tools for our own research as editorial team members of several journals and chairs of several conferences, we provide the best practices outlined below: MTurk Worker (Respondent) Selection: Researchers should consider their study population before using MTurk for data collection. The MTurk platform should be used for the appropriate study population. For example, if the study targets restaurant owners or company CEOs, MTurk workers may not be suitable for the study. However, if the target population is diners, hotel guests, grocery shoppers, online shoppers, students, or hourly employees, utilizing a sample from MTurk would be suitable. Researchers should use the selection tool in the software. For example, if you target workers only from one country, exclude responses that came from an internet protocol (IP) address outside the targeted country and report the results in the method section. Researchers should consider the demographics of workers on MTurk, which must reflect the study's targeted population. For example, if the study focuses on baby boomers' use of technology, then the MTurk sample should include only baby boomers. Similarly, the gender balance, racial composition, and income of people on MTurk should mirror the targeted population. Researchers should use multiple screening tools that identify quality respondents and avoid problematic response patterns. For example, MTurk provides the approval rate for each respondent, which reflects how many times a respondent has been rejected for various reasons (e.g., a wrong code entered). We recommend using a 90% or higher approval rate. Researchers should include screening questions in different places with different types of questions to make sure that the respondents are appropriate for the study. One way is to use knowledge-based questions about the subject. For example, rather than asking "How experienced are you with accounting practices?", a supplemental question such as "Which of the following is a component of an income statement?" should be integrated into the study in a different section of the survey.
Survey Validity: Researchers should conduct a pilot survey with MTurk workers to identify and fix any potential data quality and programming problems before the entire data set is collected. Researchers can estimate the time required to complete the survey from the pilot study. This average time should be used in calculating the incentive payment for the workers, in such a way that the payment equals or exceeds the minimum wage in the targeted country. Researchers should build multiple validity-check tools into the survey. One of them is to ask attention-check questions such as "Please click on 'strongly agree' in this question" or "What is 2+2? Please choose 5" (Cobanoglu et al., 2016). Even though these attention questions are good and should be implemented, experienced survey takers or bots easily identify them and answer them correctly, but then give random answers to other questions. Instead, we recommend building in more involved validity-check questions. One of the best is asking the same question in different places and in different forms. For example, asking the age of the respondent in the beginning of the survey and then asking them the year of their birth at the end of the survey is an effective way to check that they are replying to the survey honestly. Exclude all those who answered the same question differently. Report the results of these validity checks in the methodology. Cavusoglu (2019) found that almost 20% of the surveys were eliminated due to the failure of the validity-check questions, which were embedded in different places and in different forms in his survey. Researchers should be aware of internet bots, software that runs automated tasks. Some respondents use a bot to reply to surveys. To avoid this, use Captcha verification, which forces respondents to perform random tasks such as moving a bar to a certain area, clicking on boxes that have cars, or checking boxes to verify that the person taking the survey is not a bot. Whenever appropriate, researchers should use the time-limit options offered by online survey tools such as Qualtrics to control the time that a survey taker must spend to advance to the next question. We found that this is a great tool, especially when you want the respondents to watch a video, read a scenario, or look at a picture before they respond to other questions. Researchers should collect data on different days and at different times during the week to collect a more diverse and representative sample. Data Cleaning: Researchers should be aware that some respondents do not read questions. They simply select random answers or type nonsense text. To exclude them from the study, manually inspect the data. Exclude anyone who filled out the survey too quickly. We recommend excluding all responses completed in less than 40% of the average time taken for the survey. For example, if it takes 10 minutes to fill out a survey, we exclude everyone who fills out the survey in 4 minutes or less. After we separated these two groups, we compared them and found that the speeders' (aka cheaters') data were significantly different from those of the regular group. Researchers should always collect more data than needed. Our rule of thumb is to collect 30% more data than needed. For example, if 500 clean responses are wanted, collect at least 650 responses. The targeted number of responses will still be available after the data are cleaned.
Report the process of cleaning data in the method section of your article, showing the editor and reviewers that you have taken steps to increase the validity and reliability of the survey responses. Calculating a response rate for samples using MTurk is not possible. However, it is possible to calculate an active response rate (Ali et al., 2021): the number of responses remaining after all screening and validity-check eliminations, divided by the raw response count. For example, if you have 1000 raw responses and you eliminated 100 responses for coming from IP addresses outside of the United States and another 100 surveys for failing the validity-check questions, then your active response rate would be 800/1000 = 80%.
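The cleaning rules and the active response rate arithmetic described in this abstract translate directly into a short script. The following Python sketch is illustrative only, not taken from the cited guide; the DataFrame column names ("worker_id", "ip_country", "duration_sec", "age", "birth_year"), the survey year, and the one-year slack on the age check are assumptions made for the example.

# A minimal sketch of the screening and cleaning rules described above.
# Not from the cited guide: column names and thresholds are assumptions.
import pandas as pd

def clean_mturk_responses(raw: pd.DataFrame,
                          target_country: str = "US",
                          survey_year: int = 2021) -> tuple:
    """Return the cleaned responses and the active response rate."""
    n_raw = len(raw)
    mean_duration = raw["duration_sec"].mean()

    df = raw.drop_duplicates(subset="worker_id")        # repeat survey takers
    df = df[df["ip_country"] == target_country]         # out-of-country IP addresses
    df = df[df["duration_sec"] >= 0.4 * mean_duration]  # speeders: under 40% of mean time

    # Validity check: the age stated early in the survey must be consistent
    # with the year of birth asked at the end (one year of slack allowed).
    implied_age = survey_year - df["birth_year"]
    df = df[(df["age"] - implied_age).abs() <= 1]

    active_response_rate = len(df) / n_raw              # e.g., 800/1000 = 80%
    return df, active_response_rate

Consistent with the authors' rule of thumb, one would collect roughly 30% more raw responses than the target clean sample size, so that these exclusions still leave enough usable data.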
APA, Harvard, Vancouver, ISO, and other styles
36

Raymond, Scott B., Feras Akbik, Joshua A. Hirsch, Christopher J. Stapleton, Ramon G. Gonzalez, Brijesh P. Mehta, James D. Rabinov, Aman B. Patel, Ronil V. Chandra, and Thabele Leslie-Mazwi. "Abstract WP330: Protocol Approaches Negate the “Weekend Effect” for Endovascular Stroke Treatment." Stroke 48, suppl_1 (February 2017). http://dx.doi.org/10.1161/str.48.suppl_1.wp330.

Full text
Abstract:
Background: Endovascular management of stroke from acute large vessel occlusion (LVO) requires complex, emergent diagnostic and therapeutic procedures. The “weekend effect” (worsened outcomes from stroke presenting on weekends or evenings) is a recognized phenomenon, attributed to non-uniform availability of services throughout the week. We assessed the impact of institutional protocols for stroke patients undergoing endovascular therapy during off-hours. Methods: We analyzed a prospective observational stroke database for consecutive patients with anterior circulation stroke undergoing endovascular therapy between 6/2012 and 10/2015. Patients were grouped and analyzed based on day of the week and time of presentation to the emergency department. Off-hours were defined as 1900 hrs to 0700 hrs on weekdays and from 1900 hrs on Friday to 0700 hrs on Monday for weekends. Functional outcome was assessed prospectively by the 3-month modified Rankin scale (mRS), dichotomized into good (mRS 0-2) versus poor (mRS 3-6). Results: In a cohort of 129 patients, 75 (58%) patients were treated off-hours. Patients treated off-hours demonstrated equivalent imaging-to-groin-puncture times (78 vs 72 min, p = 0.4) and procedure durations (75 vs 68 min, p = 0.3). Reperfusion rates (TICI 2b or 3) were 68% off-hours and 76% during working hours (p = 0.4). Complication rates were similar between the two groups. Outcome at 90 days was no different in the patients treated off-hours, with 35 of 75 treated off-hours achieving a good outcome (mRS 0-2) compared to 22 of 54 treated during working hours (p = 0.6). With protocol adherence, temporal improvement was noted in imaging-to-groin times. Discussion/Conclusions: Following recent evidence of benefit from endovascular therapy for LVOs, there is increased attention to care delivery. Our findings demonstrate that under the guidance of protocols, the “weekend effect” was negated. Evaluation and treatment times and 90-day outcomes were equivalent in patients treated off-hours vs business hours, with improving treatment times as familiarity with protocols increased. Our findings highlight the importance of establishing institutional and regional protocols in the optimized management of these patients.
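As a side note, the dichotomized 90-day outcome comparison reported above (35 of 75 off-hours vs 22 of 54 during business hours) can be re-computed from the published counts. The abstract does not state which statistical test the authors used, so the two-sided Fisher's exact test in this Python sketch is an assumption, offered only as a sanity check of the reported figures.

# Re-computing the dichotomized outcome comparison from the counts in the
# abstract; the test choice (two-sided Fisher's exact) is an assumption.
from scipy.stats import fisher_exact

off_good, off_total = 35, 75   # good outcome (mRS 0-2), off-hours group
biz_good, biz_total = 22, 54   # good outcome, business-hours group

table = [[off_good, off_total - off_good],   # off-hours: good vs poor (mRS 3-6)
         [biz_good, biz_total - biz_good]]   # business hours: good vs poor
odds_ratio, p_value = fisher_exact(table)

print(f"{off_good / off_total:.0%} vs {biz_good / biz_total:.0%}, p = {p_value:.2f}")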
APA, Harvard, Vancouver, ISO, and other styles
37

Nasution, Suswati, and Tito Irwanto. "ANALISIS KEPEDULIAN PELAKU USAHA TERHADAP PROTOKOL KESEHATAN CORONAVIRUS DISEASE ( COVID-19 ) PADA PUSAT PERBELANJAAN MODERN DI KOTA BENGKULU." Jurnal Ilmiah Akuntansi, Manajemen dan Ekonomi Islam (JAM-EKIS) 4, no. 1 (January 31, 2021). http://dx.doi.org/10.36085/jam-ekis.v4i1.1270.

Full text
Abstract:
The concern of business actors during the Covid-19 pandemic is important and has a strong influence on the economy. In the midst of the pandemic, people think several times before visiting shopping centers out of concern that they will be infected with coronavirus disease (Covid-19). This undoubtedly disrupts the running of the economy, indirectly reduces the turnover of business actors, and also affects the small traders located around shopping centers. Because the epidemic has lasted so long, the Government issued a new policy by opening the discourse of a "new normal". To keep the worsening economic situation, especially in the trade and business sectors, from deteriorating further, the new normal allows people to return to their normal activities provided that they comply with the Covid-19 health protocol. The community has welcomed this discourse: those who previously stayed at home and did all their activities there have begun making plans to carry out activities outside the home, including visiting shopping centers. For this reason, it is necessary to see the extent to which business actors observe health protocols at their places of business, so that visitors feel safe shopping without fear of contracting the virus. The perceptions of business actors in these shopping centers are needed to find out whether their care has an impact on increasing shopping center visits and can increase business income. The study is survey research using a descriptive, analytical, quantitative method. Samples were taken from the population of business actors in two modern shopping centers in the city of Bengkulu, Bencoolen Mall and Mega Mall, using a questionnaire as the data collection tool; 40 respondents were obtained. The respondents' data and statements were then processed using an average distribution with a 5-point Likert scale, while the characteristics of the respondents were determined using a frequency distribution, with SPSS as the data processing software. The results of data processing show that business actors in the two modern shopping centers in the city of Bengkulu are categorized as good in implementing the Covid-19 health protocol. This is, of course, not a final result; stakeholders are expected to raise the category to very good, so that shopping centers become comfortable places for visitors even though the Covid-19 epidemic has not ended. Keywords: Business Actor Concern, Covid-19 Prevention Protocol, Modern Shopping Center
APA, Harvard, Vancouver, ISO, and other styles
38

Kholil, Nafiah Ariani, and Dian Karsoma. "Trigona Honey Home Industry Development for Economic Recovery in the Time of COVID-19 Pandemic: A Case Study in North Lombok West Nusa Tenggara, Indonesia." Asian Journal of Research in Agriculture and Forestry, January 9, 2021, 1–9. http://dx.doi.org/10.9734/ajraf/2021/v7i130118.

Full text
Abstract:
The devastating earthquake on 28 July 2018 in Lombok and the COVID-19 pandemic have put high economic pressure on the community. It is not only the damage to economic infrastructure that has stopped business activities, but also very strict health protocols, especially social distancing and the avoidance of crowds, that make business activities impossible. Trigona bee farming has been one of the productive activities supporting the family economy for the people of North Lombok for decades. During the Covid-19 pandemic, which has been running for almost 10 months, Trigona beekeeping activities have continued; in fact, demand has actually increased. This research aims to develop the best strategy to increase the business scale of Trigona honey and community income, using the SAST (Strategic Assumption Surfacing and Testing) method and AHP (Analytical Hierarchy Process). The results show that demand for Trigona honey during the pandemic has actually increased, because this honey has a very complete nutritional content, which can be used to increase immunity against COVID-19. The Trigona bee farmers faced three main problems in developing their business: Trigona seeds, cultivation technology, and business management. The best strategy to increase their business and income is cultivation system technology and the provision of added value.
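For readers unfamiliar with the AHP method named in this abstract, the sketch below shows how AHP typically derives priority weights from a pairwise comparison matrix (Saaty's eigenvector method). The comparison values are hypothetical, invented only to illustrate the mechanics over the three reported problem areas; they are not the authors' data, and the authors' actual hierarchy may differ.

# Generic AHP priority derivation; the pairwise comparison values below are
# hypothetical, not taken from the cited study.
import numpy as np

criteria = ["Trigona seeds", "cultivation technology", "business management"]
A = np.array([
    [1.0, 1/3, 1/2],   # A[i][j] = how much more important criterion i is than j
    [3.0, 1.0, 2.0],
    [2.0, 1/2, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                      # principal eigenvalue index
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                         # normalized priority weights

ci = (eigvals.real[k] - len(A)) / (len(A) - 1)   # consistency index
cr = ci / 0.58                                   # random index for n = 3 is 0.58
for name, w in zip(criteria, weights):
    print(f"{name}: {w:.3f}")
print(f"consistency ratio = {cr:.3f}")           # CR < 0.1 is conventionally acceptable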
APA, Harvard, Vancouver, ISO, and other styles
39

Richter, Jessika Luth, and Lizzie Chambers. "Reflections and outlook for the New Zealand ETS: must uncertain times mean uncertain measures?" Policy Quarterly 10, no. 2 (May 1, 2014). http://dx.doi.org/10.26686/pq.v10i2.4485.

Full text
Abstract:
The New Zealand emissions trading scheme (ETS) was introduced by legislation in 2008. The legislated objectives as stated in section 3 of the Climate Change Response Act 2002 are to ‘support and encourage global efforts to reduce the emission of greenhouse gases by (i) assisting New Zealand to meet its international obligations under the [UNFCCC] Convention and the [Kyoto] Protocol; and (ii) reducing New Zealand’s net emissions of those gases to below business-as-usual levels’. Beyond this, the New Zealand government has confirmed three objectives for the ETS.
APA, Harvard, Vancouver, ISO, and other styles
40

Salami, Babatunde A., Saheed O. Ajayi, and Adekunle S. Oyegoke. "Coping with the Covid-19 pandemic: an exploration of the strategies adopted by construction firms." Journal of Engineering, Design and Technology ahead-of-print, ahead-of-print (August 3, 2021). http://dx.doi.org/10.1108/jedt-01-2021-0054.

Full text
Abstract:
Purpose The outbreak of the Covid-19 pandemic has tested the resilience of the construction industry, putting the safety of workers and overall businesses at risk. This study aims to explore the different strategies adopted by construction companies to protect the health and well-being of employees, secure construction sites and projects, and keep the overall business operational amid the Covid-19 pandemic. Design/methodology/approach A preliminary study involving a field study and survey research was used to collect data for the study. The results from the preliminary analysis served as inputs for constructing the questionnaire, which was analyzed using descriptive statistics, exploratory factor analysis and reliability analysis. Findings The results reveal that the key underlying measures put in place by construction businesses include restricted site access, support bubbling of office and site staff, enhanced hygiene and social distancing protocols, contract risk identification and mitigation, self-isolation measures and heightened construction site safety. Along with a further discussion of the underlying measures, the top-rated strategies adopted by construction firms are also discussed in the paper. Originality/value As many construction companies remained open handling essential projects amid the pandemic, the study presents the effective and efficient strategies used to plow through the trying times. This study provides the opportunity for construction companies that escaped the early impacts of Covid-19 due to site closures, and for policymakers, to learn from the strategies adopted by construction companies that were operational amid the pandemic.
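The abstract names reliability analysis among its methods without stating the coefficient used; Cronbach's alpha is the usual choice for questionnaire data, and the sketch below computes it from first principles on invented scores.

```python
# Cronbach's alpha over questionnaire items:
#   alpha = k/(k-1) * (1 - sum of item variances / variance of total score)
# The scores below are randomly generated placeholders, not study data.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of Likert scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(0)
scores = rng.integers(1, 6, size=(30, 8))       # 30 respondents, 8 items
print(f"alpha = {cronbach_alpha(scores):.2f}")  # >= 0.7 is commonly deemed acceptable
```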
APA, Harvard, Vancouver, ISO, and other styles
41

Kim, Eun-Hee, and Thomas Lyon. "When Does Institutional Investor Activism Increase Shareholder Value?: The Carbon Disclosure Project." B.E. Journal of Economic Analysis & Policy 11, no. 1 (August 15, 2011). http://dx.doi.org/10.2202/1935-1682.2676.

Full text
Abstract:
This paper presents the first empirical test of the financial impacts of institutional investor activism towards climate change. Specifically, we study the conditions under which share prices are increased for the Financial Times (FT) Global 500 companies due to participation in the Carbon Disclosure Project (CDP), a consortium of institutional investors with $57 trillion in assets. We find no systematic evidence that participation, in and of itself, increased shareholder value. However, by making use of Russia’s ratification of the Kyoto Protocol, which caused the Protocol to go into effect, we find that companies’ CDP participation increased shareholder value when the likelihood of climate change regulation rose. We estimate the total increase in shareholder value from CDP participation at $8.6 billion, about 86% of the size of the carbon market in 2005. Our findings suggest that institutional investor activism towards climate change can increase shareholder value when the external business environment becomes more climate conscious.
APA, Harvard, Vancouver, ISO, and other styles
42

Clunie, David A. "DICOM Format and Protocol Standardization—A Core Requirement for Digital Pathology Success." Toxicologic Pathology, October 16, 2020, 019262332096589. http://dx.doi.org/10.1177/0192623320965893.

Full text
Abstract:
As the use of digital techniques in toxicologic pathology expands, challenges of scalability and interoperability come to the fore. Proprietary formats and closed single-vendor platforms prevail but depend on the availability and maintenance of multiformat conversion libraries. Expedient for small deployments, this is not sustainable at an industrial scale. Primarily known as a standard for radiology, the Digital Imaging and Communications in Medicine (DICOM) standard has been evolving to support other specialties since its inception, to become the single ubiquitous standard throughout medical imaging. The adoption of DICOM for whole slide imaging (WSI) has been sluggish. Prospects for widespread commercially viable clinical use of digital pathology change the incentives. Connectathons using DICOM have demonstrated its feasibility for WSI and virtual microscopy. Adoption of DICOM for digital and computational pathology will allow the reuse of enterprise-wide infrastructure for storage, security, and business continuity. The DICOM embedded metadata allows detached files to remain useful. Bright-field and multichannel fluorescence, Z-stacks, cytology, and sparse and fully tiled encoding are supported. External terminologies and standard compression schemes are supported. Color consistency is defined using International Color Consortium profiles. The DICOM files can be dual personality Tagged Image File Format (TIFF) for legacy support. Annotations for computational pathology results can be encoded.
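A sense of the "embedded metadata" argument can be had from pydicom, an open-source reader for DICOM files. The file name below is a placeholder, and whether a given slide carries WSI-specific attributes depends on how it was encoded; the attribute keywords themselves are standard DICOM.

```python
# Hedged sketch: inspect the self-describing metadata of a DICOM
# whole-slide-image file. "slide.dcm" is a hypothetical path.
import pydicom

ds = pydicom.dcmread("slide.dcm")

print(ds.SOPClassUID)                 # identifies the object type, e.g. WSI storage
print(ds.Modality, ds.StudyDate)      # acquisition context travels with the file
print(ds.Rows, ds.Columns)            # tile (frame) dimensions for tiled WSI
print(ds.get("NumberOfFrames", 1))    # frame count for tiled or Z-stack data
```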
APA, Harvard, Vancouver, ISO, and other styles
43

Coote, Skye, Tanya Frost, Shaloo Singhal, Chris Bladin, and Amanda Gilligan. "Abstract W P177: Direct to CT: Overcoming Barriers to Reduce Door to Needle times in Acute Stroke Patients." Stroke 45, suppl_1 (February 2014). http://dx.doi.org/10.1161/str.45.suppl_1.wp177.

Full text
Abstract:
Background: Taking acute stroke patients direct from triage to the CT scanner can reduce thrombolysis treatment times, which can improve patient outcomes. In May 2013, Box Hill Hospital introduced a Direct to CT policy for acute stroke patients who are called through as a Code Stroke by the ambulance service within business hours (8am-4.30pm) Monday to Friday. Method: We performed a prospective study comparing door-to-CT times (DTCT) and door-to-needle (DTN) times pre- and post-implementation of Direct to CT, and examined patient characteristics, Emergency Department (ED) presentation time, adverse effects, protocol violations and patient outcomes. Delays in treatment, enablers and barriers to treatment were also examined. Results: There was no statistical difference in demographics or clinical factors in patients who presented pre- (January-April, n=21) or post- Direct to CT (May-July, n=29). However, a reduction in median DTCT times (27 mins vs. 16 mins, p=0.01) and DTN times (97 mins vs. 52 mins, p<0.001) was seen. There was no increase in thrombolysed mimics (4.8% vs. 3.4%, p=0.82), protocol violations (9.5% vs. 0%, p=0.17) or adverse outcomes (33% vs. 35%, p=0.93) in patients taken Direct to CT. There was no difference in patient outcomes, however the current study size is small. Numerous barriers to Direct to CT were identified within four categories: pre-hospital, ED, CT and the stroke team, and issues included: lack of paramedic intravenous cannulation, ED resources, and stroke team indecisions; some of which are ongoing and are taking considerable time and efforts to overcome. Conclusions: Taking patients Direct to CT has significantly reduced time to treatment and further improvements may be achieved through resolution of identified barriers.
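The abstract reports comparisons of median times with p-values but does not name the statistical test; a Mann-Whitney U test is a common choice for skewed treatment times, and the sketch below shows that kind of comparison on invented door-to-needle data.

```python
# Illustrative pre/post comparison of door-to-needle (DTN) times.
# The minute values are placeholders, not the study's data, and the
# paper does not state that this particular test was used.
from scipy.stats import mannwhitneyu

dtn_pre = [105, 97, 88, 120, 95, 101, 92, 110]   # minutes, before Direct to CT
dtn_post = [55, 52, 48, 60, 50, 58, 45, 54]      # minutes, after Direct to CT

stat, p = mannwhitneyu(dtn_pre, dtn_post, alternative="two-sided")
print(f"U = {stat}, p = {p:.4f}")
```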
APA, Harvard, Vancouver, ISO, and other styles
44

Leu, Severina, Florian Halbeisen, Luigi Mariani, and Jehuda Soleman. "Intraoperative ultrasound-guided compared to stereotactic navigated ventriculoperitoneal shunt placement: study protocol for a randomised controlled study." Trials 22, no. 1 (May 19, 2021). http://dx.doi.org/10.1186/s13063-021-05306-5.

Full text
Abstract:
Background Ventriculoperitoneal shunt (VPS) placement is one of the most frequent neurosurgical procedures and the operation is performed in a highly standardised manner under maintenance of the highest infection precautions. Short operation times are important since a longer duration of surgery can increase the risk of VPS complications, especially infections. The position of the proximal ventricular catheter influences shunt functioning and survival. With freehand placement, rates of malpositioned VPS are still high. Several navigation techniques for the improvement of shunt placement have been developed, but studies comparing these techniques are sparse. The aim of this study is to prospectively compare ultrasound (US)-guided to stereotactic navigated shunt placement using optical tracking, with a focus on operation time and efficiency. Methods In this prospective, randomised, single-centre, partially blinded study, we will include all patients undergoing VPS placement in our clinic. The patients will be randomised into two groups, one group undergoing US-guided (US-G) and the other stereotactic navigated VPS placement using optical tracking. The primary outcome will be the surgical intervention time. This time span consists of the surgical preparation time together with the operation time and is given in minutes. Secondary outcomes will be accuracy of catheter positioning, VPS dysfunction and need for revision surgery, total operation and anaesthesia times, and the number of intraoperative ventricular puncture attempts, as well as complications, morbidity and mortality. Discussion To date, no prospective data are available comparing these two navigation techniques. A randomised controlled study is urgently needed in order to provide class I evidence for the best possible surgical technique for this frequent surgery. Trial registration Business Administration System for Ethical Committees (BASEC) 2019-02157, registered on 21 November 2019, https://www.kofam.ch/de/studienportal/suche/88135/studie/49552; clinicalTrials.gov: NCT04450797, registered on 30 June 2020.
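The protocol text above does not specify the allocation mechanism beyond 1:1 randomisation into two groups; permuted-block randomisation is a standard way to keep the two arms balanced over time, sketched below under that assumption.

```python
# Assumed permuted-block randomisation into the two study arms.
# Block size and seed are illustrative choices, not protocol details.
import random

def permuted_block_allocation(n_patients: int, block_size: int = 4) -> list:
    arms = []
    while len(arms) < n_patients:
        block = (["US-guided"] * (block_size // 2)
                 + ["stereotactic navigated"] * (block_size // 2))
        random.shuffle(block)          # randomise order within each block
        arms.extend(block)
    return arms[:n_patients]

random.seed(42)                        # fixed seed only so the sketch is reproducible
print(permuted_block_allocation(10))
```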
APA, Harvard, Vancouver, ISO, and other styles
45

Amalia, Nanda Rizki. "Olresto: Digitalization and Integration of Restaurants as a Role and Contribution of the Business Sector in Responding to the COVID-19 Pandemic." Khazanah: Jurnal Mahasiswa 12, no. 2 (December 13, 2020). http://dx.doi.org/10.20885/khazanah.vol12.iss2.art31.

Full text
Abstract:
The Ministry of Health enacted Large-Scale Social Restrictions (PSBB) regulations to accelerate the handling of COVID-19. Under PSBB, the economy declined across the country as a result of these preventive measures. In Indonesia, the education, tourism, hospitality, culinary, and business sectors were forced to close under the enacted regulations, and small restaurant businesses were among those affected. After the implementation of PSBB, the government sought to change society's routines toward a "new normal" lifestyle. In the new normal, restaurants are allowed to operate provided they limit the number of customers and comply with health protocols: customers must wear masks and wash their hands before entering the restaurant area. Digitalization has now penetrated many sectors of life, because the digital era brings benefits ranging from cost efficiency to savings in time and energy. Application-based digital products are well suited to businesses struggling in the culinary sector, especially restaurants. To keep operations running in the new normal, "Olresto" is proposed as an application that integrates all restaurants across the island of Java. Users can find out which restaurants are available and make a booking, and the application also provides food menus so that customers can order through it. Transactions are facilitated by virtual accounts or e-money, minimizing the spread of the COVID-19 virus, which can be transmitted via cash. It is hoped that this application can help restaurants continue operating during the pandemic, reducing their deficits and helping to revive the Indonesian economy, which has fallen because of COVID-19.
APA, Harvard, Vancouver, ISO, and other styles
46

Khriji, Sabrine, Yahia Benbelgacem, Rym Chéour, Dhouha El Houssaini, and Olfa Kanoun. "Design and implementation of a cloud-based event-driven architecture for real-time data processing in wireless sensor networks." Journal of Supercomputing, July 26, 2021. http://dx.doi.org/10.1007/s11227-021-03955-6.

Full text
Abstract:
The growth of the Internet of Things (IoT) and the number of connected devices is driven by emerging applications and business models. One common aim is to provide systems able to synchronize these devices, handle the large amount of data generated daily, and meet business demands. This paper proposes a cost-effective cloud-based architecture using an event-driven backbone to process data from many applications in real time, called REDA. It supports the Amazon Web Services (AWS) IoT Core, and it opens the door to a free software-based implementation. Measured data from several wireless sensor nodes are transmitted to the cloud-running application through the lightweight publish/subscribe messaging transport protocol, MQTT. The real-time stream-processing platform Apache Kafka is used as a message broker to receive data from the producer and forward it to the corresponding consumer. Micro-service design patterns, as event consumers, are implemented with Java Spring and managed with Apache Maven to avoid the problem of monolithic applications. The Apache Kafka cluster, co-located with ZooKeeper, is deployed over three availability zones and optimized for high throughput and low latency. To guarantee no message loss and to simulate system performance, different load tests are carried out. The proposed architecture is reliable under stress and can handle up to 8,000 messages per second with low latency on an inexpensively hosted and configured architecture.
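The data path the abstract describes, sensor readings arriving over MQTT and being forwarded into Kafka for the stream consumers, can be sketched as below. Broker addresses and topic names are placeholders, and this does not reproduce the paper's actual AWS IoT Core or REDA configuration.

```python
# Minimal MQTT-to-Kafka bridge sketch (paho-mqtt 1.x callback style,
# kafka-python producer). All endpoints and topic names are assumed.
import json

import paho.mqtt.client as mqtt
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",              # placeholder Kafka broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

def on_message(client, userdata, msg):
    # Forward each sensor reading to the Kafka topic the consumers read.
    reading = json.loads(msg.payload)
    producer.send("sensor-readings", value=reading)

client = mqtt.Client()
client.on_message = on_message
client.connect("localhost", 1883)                    # placeholder MQTT broker
client.subscribe("sensors/#")                        # all sensor topics
client.loop_forever()                                # run the bridge until interrupted
```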
APA, Harvard, Vancouver, ISO, and other styles
47

Xie, Frank Tian, Naveen Donthu, and Wesley J. Johnston. "Beyond first or late mover advantages: timed mover advantage." Journal of Business & Industrial Marketing ahead-of-print, ahead-of-print (December 1, 2020). http://dx.doi.org/10.1108/jbim-11-2018-0334.

Full text
Abstract:
Purpose This paper aims to present a new framework that describes the relationship among market entry order and timing, the advantages accruing to first-movers and late-movers, entry timing premium (ETP), marketing strategy and enduring market performance of the firms. The framework, empirically tested using data from 241 business executives, expands extant research into new territory beyond first- and late-mover advantages in an attempt to reconcile a few streams of research in the area and provides an entry related, strategic assessment tool (ETP) for the managers. Contribution to marketing strategy theory and managerial implications are also presented. Design/methodology/approach Participants included informants in a firm’s strategic business unit who were the most familiar with a new product’s commercial launch, market condition at launch, competitor offerings, marketing activities and capabilities and eventual integration into or withdrawal from the product’s portfolio. Therefore, for the survey, the study targeted chief executive officers, vice presidents of marketing or sales, product or sales managers, general managers and regional managers. Both preference bias (Narus, 1984) and survivor biases among the respondents were addressed. Findings The research result of this study reveals two very significant aspects of marketing and marketing strategies. First, the importance of financial, pricing and cost strategies further attests to the fiercely competitive nature of the global market today and the tendency for firms to commoditize most products and services. An effective financial and pricing strategy, coupled with a higher level of ETP, is capable of leading a firm to initial market success in the product-market in which it competes. Both ETP (a positional advantage and resource of the firm) and financial and pricing strategies (a deliberate strategic decision of the management) are important to achieve this goal. Research limitations/implications This study is limited in several ways. The effects of entry order and timing on market performance could be dependent on the types of industries and types of product categories involved. However, as the hypotheses were well supported, the “industry specific” factors would provide “fine-tuning” in the future study. Second, the nature of the product (goods or services) may also present varying effects on the relationship studied (for differences between manufacturing and service firms in pioneering advantages, see Song et al., 1999). Services’ intangible nature, difficulty in protecting property rights, high involvement of boundary-spanning employees and customers, high reliance on delivery and quality, and ease of imitation may alter the proposed relationships in the model and the moderating effects. Third, although this study used a “retrospective” protocol approach in the data collection by encouraging respondents to recall market, product and business information, this study is not longitudinal. Lack of longitudinal data in any study involving strategic planning, strategy execution and the long-term effects is no doubt a weakness. In addition, due to peculiarity and complexity with regard to regulation and other aspects in pharmaceutical and other industries, the theory might be limited to a certain extent. Practical implications In all, the integrated framework contributes to the understanding of the intricate issues surrounding first-mover advantage, late-mover advantage, entry order and timing and the role of marketing strategy. 
The framework provides practitioners guidance as to when to enter a product-market to gain advantageous positions and how to maintain that advantage. Firms that use a deliberate late-mover strategy could also benefit from the research finding in mapping out their strategic courses of action. Originality/value This study believes that the halo effect surrounding first-mover advantage may have obscured the visions of some researchers and managers, and the pursuit of a silver bullet has led to frenzied interests in becoming a “first-mover” or a deliberate “late-mover”. The theoretical framework, which is substantiated by empirical testing, invalidates the long-held claim that entry of a particular kind (first-movers or late-movers) yields any unique competitive advantage. It is a firms’ careful selection of marketing strategies and careful execution of the strategies through effective operational tactics that would lead to enduring competitive advantage, under an adequate level of ETP.
APA, Harvard, Vancouver, ISO, and other styles
48

Hanna, A. B., A. M. Hanna, and R. W. Webby. "Farm monitoring to futures contracts: The Hannas' experience." Proceedings of the New Zealand Grassland Association, January 1, 2001, 29–32. http://dx.doi.org/10.33584/jnzg.2001.63.2424.

Full text
Abstract:
Being part of a group farm monitoring project and then a farm systems study has enabled the Hannas to develop their farm business to where they can supply livestock to meat processors under futures contract arrangements. The recognition and development of farm-monitoring protocols has enabled them to identify key weight and weight gain targets. A database of weights, pasture growth and pasture cover information has enabled them to predict numbers, weights and supply times. The result of this farm monitoring has been a 35% increase in productivity over the past 5 years. Lambing % has increased from 112 to 146% survival to sale. Bull beef cattle numbers have increased by 430% as the area suitable for running them has been fenced and developed. The Hannas have classified their land into capability units and farm these units accordingly. They recognise the strong link between farming profitably and sustainably and believe that the demonstration of sustainable farming practices will be a requirement to enter into high value markets in the future. Winning of the supreme Waikato Farm Environment award in 1999 was a reflection of this philosophy. Keywords: bull beef, futures contracts, monitor farm, sustainability
APA, Harvard, Vancouver, ISO, and other styles
49

Silverstone, Susan. "Using 21st Century Technology In Online Business Education." Journal of Business & Economics Research (JBER) 5, no. 12 (February 7, 2011). http://dx.doi.org/10.19030/jber.v5i12.2620.

Full text
Abstract:
The challenges for education in the 21st century are fundamentally the same as they were in each of the past centuries – holding on to what is of value while discovering and developing what adds value to both teaching and learning. While the future is difficult to predict, the seeds of the future can be seen in the behaviors of the present. Obviously technology will play an even greater role in future education no matter how much and how quickly technology changes. Of greater importance than technology is the thinking needed for knowing how to use technology for advancing education for both students and instructors. Identifying the shifts in behavior that people are experiencing today provides clues on the practices that will be common tomorrow. Basic changes in education include the following: (1) moving from an instructor-centered paradigm focused on teaching to a learner-centered model focused on learning; (2) shifting from an emphasis on textbooks as a preferred source of knowledge to the use of technology as the primary tool for acquiring information and ideas; (3) advancing from knowledge to know-how exemplified in the differences expected from the cognitive, behaviorist and constructivist approaches to learning; and (4) sharing responsibility for learning through increased interaction and continuous communication between and among all individuals engaged in becoming educated persons. Technology, though it may be the key tool for facilitating these changes, has its limitations as well as its advantages, as any instructor knows when comparing face-to-face classroom lecturing with virtual asynchronous online discussions. Today’s students are techno-savvy and may be considered the “Wi-Fi Generation.” In the School of Business at National University, the second largest not-for-profit university in California, a blended approach to learning has been adopted in the accelerated one-month format used for its online education program. This paper explores the effects of some new technological options which were recently provided to marketing students in order to make their online learning experience more exciting and meaningful. National University’s online classes are offered on the eCollege platform. Students interact with each other asynchronously through discussion boards and synchronously in weekly chat sessions. Chat sessions had been offered in a text-based format, but the School of Business has invested in iLinc software which provides Voice over Internet Protocol (VoIP) capability. In iLinc, students can see and hear each other as well as the instructor in real time. The system allows application sharing, group web-browsing, the display of PowerPoint® slideshows, voting, and independent group work. Using this technology, the instructor acts as both a discussion moderator and a live lecturer. The traditional text-based chats are no longer used due to the high student acceptance and delight with the iLinc system. Outside of the virtual classroom, the marketing students were tasked to analyze and comment on the content of selected television shows.
National University’s students are adult learners who grew up passively watching television from an early age. These assignments were designed to get them to think beyond the surface entertainment to the underlying marketing and business messages given in these shows. For example, a graduate advertising class was assigned to comment on the reality show, The Apprentice, while an undergraduate class critiqued the Super Bowl advertisements. In both classes the students were told to look at these programs critically and share their comments with the class. The use of these current mass media presentations (which afforded live action cases that demonstrated the immediate consequences of managerial actions) was shown to be very powerful. Overall, the students appear to thoroughly enjoy this addition of topical and “live” learning tools to their online learning experience. While not tested empirically as yet, these new classroom tools seem to increase student comprehension and retention of the course material.
APA, Harvard, Vancouver, ISO, and other styles
50

Niklas, Frank, Efsun Annac, and Astrid Wirth. "App-based learning for kindergarten children at home (Learning4Kids): study protocol for cohort 1 and the kindergarten assessments." BMC Pediatrics 20, no. 1 (December 2020). http://dx.doi.org/10.1186/s12887-020-02432-y.

Full text
Abstract:
Background Children’s literacy and mathematical competencies are a critical platform for their successful functioning as individuals in society. However, many children, in particular those with low socio-economic status (SES) backgrounds who may not receive the home support needed to develop to their full potential, are at risk of not reaching sufficient competence levels. The overall aim of this project is to develop innovative computer tablet applications (‘apps’) and test whether the apps support parents in the provision of high-quality home learning environments (HLEs) and impact positively on the short- and long-term development of children’s competencies. Altogether, “App-based learning for kindergarten children at home” (Learning4Kids) is a 5-year longitudinal study funded by the EU and designed to assess the potential impact of a tablet-based family intervention on children’s learning, development, social inclusion and well-being. Methods/design This study uses a multi-method intervention approach and draws on expertise from psychology, education, informatics, and didactics to evaluate the effectiveness of learning apps and the intervention approach. It also exploits new technological possibilities afforded by tablet computers that are very common nowadays in families. Learning4Kids sets out to measure the quality of the HLE, children’s early mathematical, literacy, and cognitive competencies and their behaviour. Here, data will be gathered via standardized tests, observations, and parental and educator surveys and checklists. Data collection also includes the assessment of app usage times via mobile sensing. In cohort 1, 190 families are assigned to one of four groups. One business-as-usual group will only participate in the child assessments, whereas the three remaining groups are provided with tablets for about 10 months. Two intervention groups will receive mathematical or literacy learning apps as well as parental information about these topics, and the tablet-control group will receive similar apps and information that focus on general child development, but not on mathematics or literacy. Discussion Whilst offering substantive advances for the scientific fields of psychology and education, the Learning4Kids study also has broad societal implications. Improving young children’s learning trajectories is both a social and economic imperative as it equips them to achieve greater individual success and to contribute to societal prosperity.
APA, Harvard, Vancouver, ISO, and other styles