Journal articles on the topic "Metric system, 1898"

Consult the 36 best journal articles on the topic "Metric system, 1898".

1

Moraes, Cicero, Francesco Maria Galassi, Luca Sineo, Jiří Šindelář, Elena Varotto, Joanna Mietlińska-Sauter, Nathalie Antunes-Ferreira, Michael E. Habicht, and Thiago Beaini. "The anatomical bases of the 3D digital facial approximation of the Zlatý kůň 1 woman (ca. 43,000 BP)". Anthropological Review 87, no. 2 (July 10, 2024): 85–97. http://dx.doi.org/10.18778/1898-6773.87.2.04.

Abstract:
In 1950, a system of caves was discovered on Mount Zlatý kůň ('Golden Horse') in the modern-day Czech Republic. Over many years of research in this area, human and animal osteological remains have been excavated, among which the most interesting were nine fragments of a female skull, now dated to ca. 43,000 yrs BP, representing one of the earliest known anatomically modern humans in Eurasia. The aim of this research was to use purely digital techniques to: (1) reconstruct the skull based on 3D data of the preserved fragments, (2) approximate the probable appearance of the female it belonged to, and (3) analyze the calculated shape of the reconstructed mandible and the volume of the neurocranium in the context of similarities and differences with other representatives of the genus Homo. The computer techniques used in this research constitute a new, original approach to 3D analyses and may be useful primarily in the bioarchaeological sciences, where metric analyses of the most valuable bone artifacts are often severely limited by the incompleteness of the material available for research. The digital techniques presented here may also contribute significantly to the field of surgery, with the possibility of being adapted for applications in cranial prosthetics and post-traumatic reconstructive surgery.
2

Martinez, Norberto, Alejandra Tabares, and John F. Franco. "Generation of Alternative Battery Allocation Proposals in Distribution Systems by the Optimization of Different Economic Metrics within a Mathematical Model". Energies 14, no. 6 (March 19, 2021): 1726. http://dx.doi.org/10.3390/en14061726.

Abstract:
Battery systems bring technical and economic advantages to electrical distribution systems (EDSs), as they store the surplus of cheap renewable generation for use at a more convenient time and contribute to peak shaving. Due to the high cost of batteries, technical and economic studies are needed to evaluate their correct allocation within the EDS. To contribute to this analysis, this paper proposes a stochastic mathematical model for the optimal battery allocation (OBA), which can be guided by the optimization of two different economic metrics: net present value (NPV) and internal rate of return (IRR). The effects of the OBA in the EDS are evaluated considering the stochastic variation of photovoltaic generation and load. Tests with the 33-node IEEE test system indicate that OBA results in voltage profile improvement (~1% at peak time), peak reduction (31.17%), increased photovoltaic hosting capacity (18.8%), and cost reduction (3.06%). Furthermore, it was found that the IRR metric leads to a different solution compared to the traditional NPV optimization due to its inherent consideration of the relation between cash flow and investment. Thus, both NPV and IRR-based allocation alternatives can be used by the decision maker to improve economic and technical operation of the EDS.
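For readers unfamiliar with the two economic metrics compared in this entry, here is a minimal sketch of how NPV and IRR are computed. The cash-flow values are hypothetical, not figures from the paper, and the IRR solver is a plain bisection rather than anything the authors describe:

```python
def npv(rate, cash_flows):
    # cash_flows[0] is the (negative) up-front investment at t = 0
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cash_flows))

def irr(cash_flows, lo=-0.99, hi=10.0, tol=1e-9):
    # bisection on the rate at which NPV crosses zero; assumes the usual
    # profile of one negative outflow followed by positive inflows, so
    # NPV is strictly decreasing in the rate
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid, cash_flows) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# hypothetical battery project: invest 1000, recover 400 per year for 3 years
flows = [-1000.0, 400.0, 400.0, 400.0]
project_npv = npv(0.05, flows)   # value at a 5% discount rate
project_irr = irr(flows)         # rate at which the project breaks even
```

IRR is the discount rate that drives NPV to zero, which is why the two metrics can rank allocation alternatives differently: NPV scales with the absolute cash surplus, while IRR normalizes it against the invested capital.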
3

Türel, Meryem, and Kazım Baycar. "Determinants of Bilateral Trade between Europeans and the Ottoman Empire: 1878-1913". World Journal of Applied Economics 8, no. 2 (December 17, 2022): 81–91. http://dx.doi.org/10.22440/wjae.8.2.3.

Abstract:
During the 19th century, the Ottoman Empire experienced an increased integration into the world economy, primarily through the development of bilateral trade with European markets. This study examines the determinants of bilateral trade of the Ottoman Empire with its trading partners between 1878 and 1913 using a panel regression framework. The results indicate that the GDP of trading partners, distance, common borders, and the adoption of the metric system significantly affected bilateral trade. In contrast, the GDP of the Ottoman Empire, trade agreements, railways, and commercial ports had no statistically significant effects on the mentioned trade relations.
4

Vorobeva, Alisa A. "Method for evaluating the industrial systems with built-in artificial intelligence robustness to adversarial attacks". Proceedings of Tomsk State University of Control Systems and Radioelectronics 26, no. 4 (2023): 44–52. http://dx.doi.org/10.21293/1818-0442-2023-26-4-44-52.

Abstract:
The paper presents a method for evaluating the robustness of industrial systems with built-in artificial intelligence (AI) to adversarial attacks. The influence of adversarial attacks on system performance has been studied. The scheme and the scenarios for implementing attacks on industrial systems with built-in AI are presented. A comprehensive set of metrics for studying the robustness of ML models is proposed, including test data set quality metrics (MDQ), ML model quality metrics (MMQ), and model robustness to adversarial attacks metrics (MSQ). The method is based on this metrics set and includes the following steps: generating a test data set containing clean samples; assessing the quality of the test data set using MDQ metrics; identifying relevant adversarial attack methods; generating adversarial examples and a test data set containing the adversarial samples to evaluate the robustness of the ML model; assessing the quality of the generated adversarial test data set using MDQ indicators; evaluating the quality of the ML model using MMQ indicators; and evaluating model robustness using MSQ scores.
5

Mikryukov, A. A., and A. V. Kuular. "Development of the Incident Management Model in an Enterprise Information System Based on a Three-Tier Architecture Using Key (Relevant) Metrics". Open Education 24, no. 3 (June 27, 2020): 78–86. http://dx.doi.org/10.21686/1818-4243-2020-3-78-86.

Abstract:
The aim of the study is to increase the efficiency of the incident management process in an enterprise information system. The article analyzes work on improving the incident management process. The expediency of applying a number of key metrics is substantiated, making it possible to assess the degree to which process indicators achieved their target values, that is, to assess the quality of incident management: the speed of resolving incidents, the degree of satisfaction of service users, and the availability of channels for processing user requests. A comparative analysis of the existing model of the incident management process and the proposed model is performed. The proposed model, which includes an additional support line, can significantly improve key indicators of the incident handling and resolution process. The scientific novelty of the developed proposals lies in the integrated use of a combination of process, technological, and service metrics, which supports the construction of a more effective incident management model.

Materials and methods. The theoretical basis of the study is the analysis of recommendations for the use of metrics in accordance with the COBIT information technology management methodology and the recommendations for building an incident management process based on the ITIL library of information technology infrastructure, as well as the results of scientific research by Russian and foreign scientists and publications of leading organizations in the field of incident management in enterprise information systems. An analysis of incident management process metrics is carried out. Mathematical methods of quantitative measurement of key metrics are used. Statistical data received by the technical support service for incident management processes were analyzed.

Results. The use of key metrics is justified, with the help of which the task of promptly responding to incidents and their subsequent processing and resolution is solved while ensuring guaranteed access to channels for processing calls. A three-tier incident management model was developed, which made it possible to solve the problem of managing incident processing more effectively based on the integrated use of key metrics.

Conclusion. The study revealed the shortcomings of the existing model of the incident management process. An analysis of the metrics used in existing models of the incident management process is carried out. The choice of a set of relevant metrics is substantiated, whose combined application allowed us to develop a more effective incident management model that meets both the requirements of service consumers and the requirements for the operation of an information system. The developed model provides improved quality of incident processing (speed, completeness, reliability). A distinctive feature of the developed model is the use of objective quantitative characteristics obtained on the basis of relevant metrics of the incident management process, which made it possible to substantiate proposals for improving the existing incident management model in the enterprise information system.
6

Lubkin, I. A. "Application security metrics when using defense system against vulnerabilities based on return-oriented programming". Proceedings of Tomsk State University of Control Systems and Radioelectronics 24, no. 4 (2021): 46–51. http://dx.doi.org/10.21293/1818-0442-2021-24-4-46-51.

Abstract:
Vulnerabilities exploited through return-oriented programming (ROP) pose threats to the functioning of information systems. Many protection systems exist to counteract them, based on various principles of operation. At the same time, there are no generally accepted approaches to assessing the security of the applied solutions. The paper proposes security metrics that allow obtaining objective data on the efficiency of protection against ROP vulnerabilities. The proposed metrics reflect an attacker's ability to mount an attack by gaining control over the control flow graph.
7

Phan, Hoang Anh, Van Tan Duong, Mai Nguyen Thi, Anh Nguyen Thi, Hang Khuat Thi Thu, Thang Luu Duc, Van Hieu Dang, Huu Quoc Dong Tran, Thi Thanh Van Nguyen, and Thanh Tung Bui. "Development of an Autonomous Component Testing System with Reliability Improvement Using Computer Vision and Machine Learning". ECTI Transactions on Computer and Information Technology (ECTI-CIT) 18, no. 1 (February 10, 2024): 64–75. http://dx.doi.org/10.37936/ecti-cit.2024181.253854.

Abstract:
This study evaluated computer vision-based models, including Histogram Analysis, Logistic Regression, Sift-SVM, and Deep learning models, in an autonomous testing system developed for smartphone camera modules. System performance was assessed in a practical factory setting with workers operating the system, and metrics such as processing time, sensitivity, specificity, accuracy, and defect rate were evaluated. Based on the results, the Sift-SVM model demonstrated the greatest potential for enhancing the reliability of the system with a processing time of 0.01578 seconds, a sensitivity of 99.811%, and a reduction in the failure rate to 1888 PPM. The study findings suggest that Sift-SVM has the potential to be practically applied in the industry, thus improving the speed and accuracy of automatic defect detection in manufacturing and reducing the defect rate.
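The evaluation metrics named above — sensitivity, specificity, accuracy, and the defect rate in PPM — are standard confusion-matrix quantities. A small illustrative sketch, using hypothetical counts rather than the paper's data:

```python
def classification_metrics(tp, fp, tn, fn):
    """Standard confusion-matrix metrics for a pass/fail inspector."""
    sensitivity = tp / (tp + fn)               # share of true defects caught
    specificity = tn / (tn + fp)               # share of good parts passed
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    return sensitivity, specificity, accuracy

def escaped_defect_ppm(missed_defects, inspected_units):
    """Defects that slip past inspection, in parts per million."""
    return missed_defects / inspected_units * 1_000_000

# hypothetical inspection run of 2000 camera modules
sens, spec, acc = classification_metrics(tp=998, fp=5, tn=995, fn=2)
ppm = escaped_defect_ppm(missed_defects=2, inspected_units=2000)
```

In this framing, a failure rate quoted in PPM (as in the abstract's 1888 PPM) corresponds to the defects the detector fails to flag, normalized per million inspected units.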
8

Wang, Jun, Heping Li, and Haiyuan Lu. "An estimation of the evapotranspiration of typical steppe areas using Landsat images and the METRIC model". Journal of Water and Climate Change 13, no. 2 (December 30, 2021): 926–42. http://dx.doi.org/10.2166/wcc.2021.432.

Abstract:
Remote sensing excels in estimating regional evapotranspiration (ET). However, most remote sensing energy balance models require researchers to subjectively extract the characteristic parameters of the dry and wet limits of the underlying surfaces, and regional ET accuracy suffers when the ideal pixels are determined incorrectly. This study used Landsat images and the METRIC model to evaluate the effects of different dry and wet pixel combinations on the ET in typical steppe areas. The spatiotemporal ET changes of the different land cover types were discussed. The results show that the surface temperature and leaf area index could determine the dry and wet limit recognition schemes in grassland areas. The water vapor flux data of an eddy covariance system verified that the relative error between the ETd,METRIC and ETd,GES of eight DOYs (days of the year) was 18.8% on average. The ETMETRIC values of the crop growth season and the ETIMS of eight silage maize irrigation monitoring stations were found to have a relative error of 11.1% on average. The spatial distribution of the ET of the different land cover types in the study area was as follows: ETwater > ETarable land > ETforest land > ETunutilized land > ETgrassland > ETurban land.
9

Damini J. Dhondge and R. S. Talikoti. "Dynamic Analysis of Elevated Water Tanks". Electronic Journal of Structural Engineering 19 (December 1, 2019): 33–38. http://dx.doi.org/10.56748/ejse.19233.

Abstract:
Water tanks are very important lifeline components. They are vital in municipal water systems, firefighting systems, and many industrial facilities for the storage of water. Water tanks are heavily damaged or collapse during earthquakes due to fluid-structure interactions; hence the seismic behavior of tanks has the characteristics of a complex phenomenon. Water storage tanks ought to remain functional in the post-earthquake period to ensure a potable water supply to earthquake-affected regions. The parametric study suggests that elevated circular tanks perform better than elevated rectangular tanks. In the present study, a dynamic analysis of elevated RCC water tanks designed for zone III and zone V as per Indian Standard 1893-2002 (Part 2) is carried out, both manually and using software, considering all the earthquake forces. The objective of this paper is to understand the dynamic behavior of elevated water tanks under earthquake loading.
10

Ochella, Sunday, and Mahmood Shafiee. "Performance Metrics for Artificial Intelligence (AI) Algorithms Adopted in Prognostics and Health Management (PHM) of Mechanical Systems". Journal of Physics: Conference Series 1828, no. 1 (February 1, 2021): 012005. http://dx.doi.org/10.1088/1742-6596/1828/1/012005.

11

Ekhlakov, Yuriy P., Sergei O. Saprunov, Alexey A. Poguda, and Nikita S. Cherkashin. "Comparison of fuzzy logic systems based on triangular and trapezoidal membership functions". Proceedings of Tomsk State University of Control Systems and Radioelectronics 27, no. 1 (2024): 49–62. http://dx.doi.org/10.21293/1818-0442-2024-27-1-49-62.

Abstract:
The article assesses the efficiency of using triangular and trapezoidal membership functions in fuzzy systems. Two metrics are proposed: the execution time of the fuzzy inference algorithm and the dispersion of the output value. The results of experiments testing three fuzzy systems of differing architectural complexity are presented: determining the size of a waiter's tip; forecasting changes in energy consumption in a production system; and driver assistance on the roads.
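As background, the two membership-function shapes compared in this entry can be written down directly. A minimal sketch — the parameter values below are illustrative only, not taken from the paper:

```python
def triangular(x, a, b, c):
    # membership rises linearly from a to a peak of 1 at b,
    # then falls linearly to zero at c
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def trapezoidal(x, a, b, c, d):
    # like the triangle, but with a plateau of full membership on [b, c]
    if x <= a or x >= d:
        return 0.0
    if x < b:
        return (x - a) / (b - a)
    if x <= c:
        return 1.0
    return (d - x) / (d - c)

# illustrative fuzzy set "comfortable temperature" on a 0-40 °C scale
tri = triangular(22.5, 15, 22.5, 30)    # full membership at the peak
trap = trapezoidal(25, 15, 20, 27, 32)  # full membership on the plateau
```

The trade-off the paper's two metrics probe follows from the shapes: a trapezoid's plateau makes the output less sensitive to small input shifts (lower output dispersion near the core), at the cost of slightly more arithmetic per evaluation.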
12

Borysov, Oleksii, and Olena Vasylieva. "Communicative Analysis of Dialogical Interaction". Central European Journal of Communication 15, no. 2(31) (September 22, 2022): 286–303. http://dx.doi.org/10.51480/1899-5101.15.2(31).6.

Abstract:
Dialogue studies suggest keys to understanding communicative behavior. The purpose of this article is to put forth a more complex and comprehensive approach to the analysis of interaction that incorporates quantitative metrics to reveal its entire communicative depth. The methods of discourse-analysis, initiative-response analysis, a theory of speech acts, conversational, cognitive, stylistic, statistical analyses as well as descriptive and interpretative methods have been united in one system to interpret the procedure and results of the cooperative and conflict dialogues chosen as an example. The integrated methodology produces a broader investigative view of communication, also because it allows measuring the level of dominance of interlocutors and explaining it in terms of power relations. In this way, it contributes to a better understanding of the multifaceted nature of dialogue without any characteristics to be underestimated. The methodology is an open system and is suggested as a sample of dialogical communication research.
13

Lin, Lixiong, Hongqin He, Zhiping Xu, and Dongjie Wu. "Realtime Vehicle Tracking Method Based on YOLOv5 + DeepSORT". Computational Intelligence and Neuroscience 2023 (June 15, 2023): 1–11. http://dx.doi.org/10.1155/2023/7974201.

Abstract:
In actual traffic scenarios, the environment is complex and constantly changing, with many vehicles that have substantial similarities, posing significant challenges to vehicle tracking research based on deep learning. To address these challenges, this article investigates the application of the DeepSORT (simple online and realtime tracking with a deep association metric) multitarget tracking algorithm in vehicle tracking. Due to the strong dependence of the DeepSORT algorithm on target detection, a YOLOv5s_DSC vehicle detection algorithm based on the YOLOv5s algorithm is proposed, which provides accurate and fast vehicle detection data to the DeepSORT algorithm. Compared to YOLOv5s, YOLOv5s_DSC has no more than a 1% difference in optimal mAP0.5 (mean average precision), precision rate, and recall rate, while reducing the number of parameters by 23.5%, the amount of computation by 32.3%, the size of the weight file by 20%, and increasing the average processing speed of each image by 18.8%. After integrating the DeepSORT algorithm, the processing speed of YOLOv5s_DSC + DeepSORT reaches up to 25 FPS, and the system exhibits better robustness to occlusion.
14

Miller, John S. "Evaluating New Transportation Technologies with Tiered Criteria: Rail Case Study Approach". Transportation Research Record: Journal of the Transportation Research Board 1838, no. 1 (January 2003): 64–72. http://dx.doi.org/10.3141/1838-09.

Abstract:
New transportation technologies, such as magnetic levitation (maglev), offer promise and risk as they move through the successive stages of research, development, demonstration, and deployment. The Virginia General Assembly’s initial financial contributions to a low-speed, 3,400-ft (1,037-m), maglev-based people mover being constructed by the private sector illustrate the opportunities for potential reward and failure. Requests for additional funding to refine and deploy the prototype motivated the Virginia Department of Transportation (VDOT) to seek a structured decision process. VDOT did not want to spend taxpayers’ money frivolously on a technology not yet in commercial service anywhere in the world, or to impede an affordable, new approach that could effectively reduce congestion. To help VDOT evaluate the prototype maglev system, the Virginia Transportation Research Council developed a set of broad-based performance measures applicable to different rail systems including maglev. This study links the standards of credibility, reliability, performance, safety, and cost to the types of detailed performance measures that should be obtained during an operational test. A strong link between technical data and broad-based metrics enables persons of disparate backgrounds to jointly evaluate the results of a new technology and establish reasonable expectations for what that technology should accomplish. This is a better approach than making decisions on the sole basis of preconceived notions of whether a technology is good or bad. These performance measures, therefore, offer a promising method for evaluating new transportation products. Although the concept of technology-blind criteria has received renewed emphasis over the past decade, this study highlights challenges and suggestions for using such criteria in a relatively political setting, which is less than ideal but common.
15

Luo, Y. Q., J. Randerson, G. Abramowitz, C. Bacour, E. Blyth, N. Carvalhais, P. Ciais, et al. "A framework of benchmarking land models". Biogeosciences Discussions 9, no. 2 (February 17, 2012): 1899–944. http://dx.doi.org/10.5194/bgd-9-1899-2012.

Abstract:
Abstract. Land models, which have been developed by the modeling community in the past two decades to predict future states of ecosystems and climate, have to be critically evaluated for their performance skills of simulating ecosystem responses and feedback to climate change. Benchmarking is an emerging procedure to measure and evaluate performance of models against a set of defined standards. This paper proposes a benchmarking framework for evaluation of land models. The framework includes (1) targeted aspects of model performance to be evaluated; (2) a set of benchmarks as defined references to test model performance; (3) metrics to measure and compare performance skills among models so as to identify model strengths and deficiencies; and (4) model improvement. Component 4 may or may not be involved in a benchmark analysis but is an ultimate goal of general modeling research. Land models are required to simulate exchange of water, energy, carbon and sometimes other trace gases between the atmosphere and the land-surface, and should be evaluated for their simulations of biophysical processes, biogeochemical cycles, and vegetation dynamics across timescales in response to both weather and climate change. Benchmarks that are used to evaluate models generally consist of direct observations, data-model products, and data-derived patterns and relationships. Metrics of measuring mismatches between models and benchmarks may include (1) a priori thresholds of acceptable model performance and (2) a scoring system to combine data-model mismatches for various processes at different temporal and spatial scales. The benchmark analyses should identify clues of weak model performance for future improvement. Iterations between model evaluation and improvement via benchmarking shall demonstrate progress of land modeling and help establish confidence in land models for their predictions of future states of ecosystems and climate.
16

Pepe, Massimiliano, Domenica Costantino, and Vincenzo Saverio Alfio. "A GIS Procedure to Assess Shoreline Changes over Time Using Multi-temporal Maps: An Analysis of a Sandy Shoreline in Southern Italy over the Last 100 Years". Geomatics and Environmental Engineering 17, no. 3 (February 16, 2023): 107–34. http://dx.doi.org/10.7494/geom.2023.17.3.107.

Abstract:
The aim of the paper is to identify a methodology capable of assessing shoreline changes through a geomatic approach based on the use of GIS (Geographic Information System) software. The paper describes a case study that reports the evolution of a coastline over a period of more than 100 years using medium- and large-scale metric maps available from different periods. The coastlines were obtained from the source maps of the Italian Cadastre (dated 1890), from numerical cartography of the coastline acquired in different periods at scales of 1:5000 and 1:2000 and, more recently, from the Google Earth Pro platform. To analyse the evolution of the coastline, a new procedure was developed based on the use of GIS software, in particular a plugin called DSAS, which allows the evaluation of changes in the coastline and also provides a statistical analysis of its evolution. The results showed the ease and applicability of the method in determining the evolution of the coastline; in the analysed case study, the strong erosion of a stretch of coastline, with important socio-economic consequences, was highlighted.
17

Sheneamer, Abdullah. "Vulnerable JavaScript functions detection using stacking of convolutional neural networks". PeerJ Computer Science 10 (February 29, 2024): e1838. http://dx.doi.org/10.7717/peerj-cs.1838.

Abstract:
System security for web-based applications is paramount, and to avoid possible cyberattacks it is important to detect vulnerable JavaScript functions. Developers and security analysts have long relied upon static analysis to investigate vulnerabilities and faults within programs. Static analysis tools analyze a program's source code and identify sections of code that need to be further examined by a human analyst. This article suggests a new approach for identifying vulnerable code in JavaScript programs using an ensemble of convolutional neural network (CNN) models. These models use vulnerability information and code features to detect related vulnerable code. To identify different vulnerabilities in JavaScript functions, an approach has been tested that involves stacking CNNs with the imbalanced data, with a random under-sampler, and with a random over-sampler. Our approach uses these CNNs to detect vulnerable code and improves upon the limitations of current techniques. Previous research has introduced several approaches to identify vulnerable code in JavaScript programs, but these often have limitations such as low accuracy rates and high false-positive or false-negative rates. Our approach addresses this by using the power of convolutional neural networks and proves highly effective in detecting vulnerable functions that could be exploited by cybercriminals. The stacked CNN approach has approximately 98% accuracy, proving its robustness and usability in real-world scenarios. To evaluate its efficacy, the proposed method is trained using publicly available JavaScript blocks, and the results are assessed using various performance metrics. The research offers valuable insight into better ways to protect web-based applications and systems from potential threats, leading to a safer online environment for all.
18

Carrasquel, Julio C., and Khalil Mecheraoui. "Object-Centric Replay-Based Conformance Checking: Unveiling Desire Lines and Local Deviations". Modeling and Analysis of Information Systems 28, no. 2 (June 11, 2021): 146–68. http://dx.doi.org/10.18255/1818-1015-2021-2-146-168.

Abstract:
Conformance checking methods diagnose to what extent a real system, whose behavior is recorded in an event log, complies with its specification model, e.g., a Petri net. Nonetheless, the majority of these methods focus on checking isolated process instances, neglecting interaction between instances in a system. Addressing this limitation, a series of object-centric approaches have been proposed in the field of process mining. These approaches are based on the holistic analysis of the multiple process instances interacting in a system, where each instance is centered on the handling of an object. Inspired by the object-centric paradigm, this paper presents a replay-based conformance checking method which uses a class of colored Petri nets (CPNs) -- a Petri net extension where tokens in the model carry values of some types (colors). Particularly, we consider conservative workflow CPNs, which allow describing the expected behavior of a system whose components are centered on the end-to-end processing of distinguishable objects. For describing a system's real behavior, we consider event logs whose events record the sets of objects involved in the execution of activities. For replay, we consider a jump strategy in which tokens that are absent from the input places of a transition to be fired move from their current place in the model to the requested places. Token jumps allow us to identify desire lines, i.e., object paths unforeseen in the specification. Also, we introduce local diagnostics based on the proportion of jumps in specific model components. These metrics report the severity of deviations in precise parts of the system. Finally, we report experiments supported by a prototype of our method. To show the practical value of our method, we employ a case study on trading systems, where orders from users are matched to trade.
19

Cherifi, Abdelhamid, Tarik Mohammed Chikouche, Abdullah S. Karar, Julien Moussa H. Barakat, Omar Arbouche, and Iyad Dayoub. "Capacity Improvement of 3D-OCDMA-PON Hybrid System Next Generation Using Weight Zero Cross Correlation Code". Applied Sciences 13, no. 10 (May 10, 2023): 5869. http://dx.doi.org/10.3390/app13105869.

Abstract:
This paper proposes a novel code for optical code division multiple access (OCDMA) systems, called the three-dimensional (3D) spectral/temporal/spatial single weight zero cross-correlation (3D-SWZCC) code. The proposed code could potentially be used in the next generation of passive optical networks (NG-PONs) to provide a 3D-SWZCC-OCDMA-NG-PON system. The developed code has a high capacity and a zero cross-correlation property that completely suppresses the multiple access interference (MAI) effects that are a main drawback of OCDMA systems. Previously, a two-dimensional (2D) SWZCC code was proposed for two-dimensional OCDMA (2D-OCDMA) systems; it devotes its first and second components to spectral and spatial encoding, respectively. The proposed code, in contrast, carries out encoding in the spectral, temporal, and spatial domains for the first, second, and third components, respectively. One-dimensional, 2D, and 3D systems can support up to 68, 157, and 454 active users with total code lengths equal to 68, 171, and 273, respectively. Numerical results reveal that the 3D-SWZCC code outperforms codes from previous studies, including 3D codes such as perfect difference (PD), PD/multi-diagonal (PD/MD), dynamic cyclic shift/MD (DCS/MD), and Pascal's triangle zero cross-correlation (PTZCC), according to various metrics. System operation is illustrated through the architecture of the transmitter and receiver in the PON context, where the proposed code demonstrates its effectiveness in meeting optical communication requirements based on 3D-OCDMA-PON by producing a high quality factor (Q) of 18.8 and a low bit error rate (BER) of 3.48 × 10−29 over a long distance that can reach 30 km at a data rate of 0.622 Gbps.
20

Kenney, Raymond J., Jeff Houck, Brian D. Giordano, Judith F. Baumhauer, Meghan Herbert i Michael D. Maloney. "Do Patient Reported Outcome Measurement Information System (PROMIS) Scales Demonstrate Responsiveness as Well as Disease-Specific Scales in Patients Undergoing Knee Arthroscopy?" American Journal of Sports Medicine 47, nr 6 (10.04.2019): 1396–403. http://dx.doi.org/10.1177/0363546519832546.

Abstract:
Background: The Patient Reported Outcomes Information System (PROMIS) is an efficient metric able to detect changes in global health. Purpose: To assess the responsiveness, convergent validity, and clinically important difference (CID) of PROMIS compared with disease-specific scales after knee arthroscopy. Study Design: Cohort study (Diagnosis); Level of evidence, 2. Methods: A prospective institutional review board–approved study collected PROMIS Physical Function (PF), PROMIS Pain Interference (PI), International Knee Documentation Committee (IKDC), and Knee injury and Osteoarthritis Outcome Score (KOOS) results in patients undergoing knee arthroscopy. The change from preoperative to longest follow-up was used in analyses performed to determine responsiveness, convergent validity, and minimal and moderate CID using the IKDC scale as the anchor. Results: Of the 100 patients enrolled, 76 were included. Values of the effect size index (ESI) ranged from near 0 to 1.69 across time points and were comparable across scales. Correlations of the change in KOOS and PROMIS with IKDC ranged from r values of 0.61 to 0.79. The minimal CID for KOOS varied from 12.5 to 17.5. PROMIS PF and PI minimal CID were 3.3 and −3.2. KOOS moderate CID varied from 14.3 to 18.8. PROMIS PF and PI moderate CID were 5.0 and −5.8. Conclusion: The PROMIS PF and PI showed similar responsiveness and CID compared with disease-specific scales in patients after knee arthroscopy. PROMIS PI, PROMIS PF, and KOOS correlations with IKDC demonstrate that these scales are measuring a similar construct. The ESIs of PROMIS PF and PI were similar to those of KOOS and IKDC, suggesting similar responsiveness at 6 months or longer (ESI >1.0). Minimum and moderate CID values calculated for PROMIS PF and PI using IKDC as an anchor were sufficiently low to suggest clinical usefulness. 
Clinical Relevance: PROMIS PF and PI can be accurately used to determine improvement or lack thereof with clinically important changes after knee arthroscopy.
21

Tsimaras, Dimitrios O., Stylianos Mystakidis, Athanasios Christopoulos, Emmanouil Zoulias i Ioannis Hatzilygeroudis. "E-Learning Courses Evaluation on the Basis of Trainees’ Feedback on Open Questions Text Analysis". Education Sciences 12, nr 9 (18.09.2022): 633. http://dx.doi.org/10.3390/educsci12090633.

Abstract:
Life-long learning is a necessity associated with the requirements of the fourth industrial revolution. Although distance online education played a major role in the evolution of the modern education system, this share grew dramatically because of the COVID-19 pandemic outbreak and the social distancing measures that were imposed. However, the quick and extensive adoption of online learning tools also highlighted the multidimensional weaknesses of online education and the needs that arise when considering such practices. To this end, the ease of collecting digital data, as well as the overall evolution of data analytics, enables researchers, and by extension educators, to systematically evaluate the pros and cons of such systems. For instance, advanced data mining methods can be used to find potential areas of concern or to confirm elements of excellence. In this work, we used text analysis methods on data that have emerged from participants’ feedback in online lifelong learning programmes for professional development. We analysed 1890 Greek text-based answers of participants to open evaluation questions using standard text analysis processes. We finally produced 7-gram tokens from the words in the texts, from which we constructed meaningful sentences and characterized them as positive or negative. We introduced a new metric, called acceptance grade, to quantitatively evaluate them as far as their positive or negative content for the online courses is concerned. We finally based our evaluation on the top 10 sentences of each category (positive, negative). Validation of the results via two external experts and data triangulation showed an accuracy of 80%.
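The 7-gram tokenisation step can be sketched in a few lines. The "acceptance grade" formula is not given in the abstract, so the scoring below is a hypothetical stand-in (a signed count over a toy sentiment lexicon), not the authors' metric, and the toy input is English rather than the Greek of the study.

```python
# Word-level n-gram extraction of the kind used to produce 7-gram tokens.
def word_ngrams(words, n=7):
    return [tuple(words[i:i + n]) for i in range(len(words) - n + 1)]

# Hypothetical stand-in for the acceptance grade: signed count of
# positive/negative lexicon hits inside a gram (NOT the authors' formula).
POSITIVE = {"clear", "easy", "helpful"}
NEGATIVE = {"confusing", "slow"}

def acceptance_grade(gram):
    return sum((w in POSITIVE) - (w in NEGATIVE) for w in gram)

answer = "the course material was clear and the platform was easy to use".split()
grams = word_ngrams(answer, 7)
best = max(grams, key=acceptance_grade)  # most positively scored 7-gram
```

Ranking grams by such a score and keeping the top 10 positive and top 10 negative ones mirrors the evaluation procedure described above.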
22

Phinzi, Kwanele, Dávid Abriha i Szilárd Szabó. "Classification Efficacy Using K-Fold Cross-Validation and Bootstrapping Resampling Techniques on the Example of Mapping Complex Gully Systems". Remote Sensing 13, nr 15 (28.07.2021): 2980. http://dx.doi.org/10.3390/rs13152980.

Abstract:
The availability of aerial and satellite imageries has greatly reduced the costs and time associated with gully mapping, especially in remote locations. Regardless, accurate identification of gullies from satellite images remains an open issue despite the amount of literature addressing this problem. The main objective of this work was to investigate the performance of support vector machines (SVM) and random forest (RF) algorithms in extracting gullies based on two resampling methods: bootstrapping and k-fold cross-validation (CV). In order to achieve this objective, we used PlanetScope data, acquired during the wet and dry seasons. Using the Normalized Difference Vegetation Index (NDVI) and multispectral bands, we also explored the potential of the PlanetScope image in discriminating gullies from the surrounding land cover. Results revealed that gullies had significantly different (p < 0.001) spectral profiles from any other land cover class regarding all bands of the PlanetScope image, both in the wet and dry seasons. However, NDVI was not efficient in gully discrimination. Based on the overall accuracies, RF’s performance was better with CV, particularly in the dry season, where its performance was up to 4% better than the SVM’s. Nevertheless, class level metrics (omission error: 11.8%; commission error: 19%) showed that SVM combined with CV was more successful in gully extraction in the wet season. On the contrary, RF combined with bootstrapping had relatively low omission (16.4%) and commission errors (10.4%), making it the most efficient algorithm in the dry season. The estimated gully area was 88 ± 14.4 ha in the dry season and 57.2 ± 18.8 ha in the wet season. Based on the standard error (8.2 ha), the wet season was more appropriate in gully identification than the dry season, which had a slightly higher standard error (8.6 ha). 
For the first time, this study sheds light on the influence of these resampling techniques on the accuracy of satellite-based gully mapping. More importantly, this study provides the basis for further investigations into the accuracy of such resampling techniques, especially when using different satellite images other than the PlanetScope data.
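The two resampling schemes compared in the study differ only in how training and test indices are drawn. A minimal stdlib sketch of each is below (index generation only; the authors' pipeline additionally trains SVM and RF classifiers on the resulting splits).

```python
import random

# k-fold cross-validation: shuffle once, partition into k disjoint folds;
# each fold serves as the test set exactly once.
def k_fold_indices(n, k, seed=0):
    idx = list(range(n))
    random.Random(seed).shuffle(idx)
    folds = [idx[i::k] for i in range(k)]
    for i in range(k):
        test = folds[i]
        train = [j for f in folds[:i] + folds[i + 1:] for j in f]
        yield train, test

# Bootstrapping: draw n samples with replacement for training; the
# out-of-bag (unsampled) observations form the test set.
def bootstrap_indices(n, n_reps, seed=0):
    rng = random.Random(seed)
    for _ in range(n_reps):
        train = [rng.randrange(n) for _ in range(n)]
        test = sorted(set(range(n)) - set(train))
        yield train, test

# Sanity check: in k-fold CV every sample is tested exactly once.
all_test = [j for _, test in k_fold_indices(20, 5) for j in test]
assert sorted(all_test) == list(range(20))
```

The contrast the paper exploits follows directly: k-fold test sets partition the data, whereas bootstrap test sets are random out-of-bag subsets whose size varies run to run.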
23

Long, Anna-May, Kathryn J. Bunch, Marian Knight, Jennifer J. Kurinczuk i Paul D. Losty. "Early population-based outcomes of infants born with congenital diaphragmatic hernia". Archives of Disease in Childhood - Fetal and Neonatal Edition 103, nr 6 (4.01.2018): F517—F522. http://dx.doi.org/10.1136/archdischild-2017-313933.

Abstract:
Purpose: This study aims to describe short-term outcomes of live-born infants with congenital diaphragmatic hernia (CDH) and to identify prognostic factors associated with early mortality. Design: A prospective population cohort study was undertaken between April 2009 and September 2010, collecting data on live-born infants with CDH from all 28 paediatric surgical centres in the UK and Ireland using an established surgical surveillance system. Management and outcomes are described. Prognostic factors associated with death before surgery are explored. Results: Two hundred and nineteen live-born infants with CDH were reported within the data collection period. There were 1.5 times more boys than girls (n=133, 61%). Thirty-five infants (16%) died without an operation. This adverse outcome was associated with female sex (adjusted OR (aOR) 3.96, 95% CI 1.66 to 9.47), prenatal diagnosis (aOR 4.99, 95% CI 1.31 to 18.98), and the need for physiological support in the form of inotropes (aOR 9.96, 95% CI 1.19 to 83.25) or pulmonary vasodilators (aOR 4.09, 95% CI 1.53 to 10.93). Significant variation in practice existed among centres, and some therapies potentially detrimental to infant outcomes were used, including pulmonary surfactant in 45 antenatally diagnosed infants (34%). Utilisation of extracorporeal membrane oxygenation was very low compared with published international studies (n=9/219, 4%). Postoperative 30-day survival was 98% for 182 infants with CDH who were adequately physiologically stabilised and underwent surgery. Conclusion: This is the first British Isles population-based study reporting outcome metrics for infants born with CDH. 16% of babies did not survive to undergo surgery. Factors associated with poor outcome included female sex and prenatal diagnosis. Early postoperative survival in those who underwent surgical repair was excellent.
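The odds ratios quoted above are adjusted estimates from a multivariable model; for orientation, the unadjusted version of the same quantity is computed from a 2×2 table as below (illustrative, hypothetical counts, not the study's data, and no adjustment for the other factors).

```python
# Unadjusted odds ratio from a 2x2 exposure/outcome table (sketch).
# The study's aORs come from multivariable modelling, which this
# cross-product calculation does not reproduce.
def odds_ratio(a, b, c, d):
    """a: exposed with outcome, b: exposed without outcome,
       c: unexposed with outcome, d: unexposed without outcome."""
    return (a * d) / (b * c)

# Hypothetical counts: 20/80 female infants vs 15/104 male infants
# dying before surgery (made-up numbers for illustration).
or_female = odds_ratio(20, 80, 15, 104)
print(round(or_female, 2))
```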
24

Vlasova, Ekaterina A., Elizaveta L. Karpova i M. Yu Olshevskaya. "Vocabulary: how many words are enough? Principles of minimizing learners’vocabulary". NSU Vestnik. Series: Linguistics and Intercultural Communication 17, nr 4 (2019): 63–77. http://dx.doi.org/10.25205/1818-7935-2019-17-4-63-77.

Abstract:
This article analyses the methodology of compiling Russian general wordlists and lexical minima for teaching Russian for specific purposes. The study systematizes three approaches: linguo-didactic, linguo-statistical, and corpus-based. The article also describes the process and results of applying all three methods to the development of a lexical minimum based on a political science corpus. The methodological analysis comprises general word lists for the Russian State Standard Exam (TRKI), the System of lexical minima by V. V. Morkovkin, and the Frequency dictionary of the Russian language for foreigners, created by S. A. Sharov as a part of the KELLY project, as well as special lexical minima for medicine, robotics, nuclear energy, and mathematics. It has been revealed that the core element in the development of a discipline-specific lexical minimum is minimization, which involves a set of principles determining the optimal length of the list and lexeme selection. For the Russian general word lists, the most common principles of minimization are methodical expediency ("relevance" of the word at each level) and quantitative metrics, including absolute and relative frequencies, the word rank, and a coverage index showing the percentage of text that every lexeme covers. The article reports the results of combining the quantitative methods, corpus-based analysis, and didactic principles in the development of the lexical minimum based on political science textbooks. The core index, defining the length of this list, was coverage, which revealed that the 8,237 most frequent lexemes cover 98% of the whole corpus. The linguo-didactic analysis showed that the 1,000 most frequent lexemes, excluding stop-words, cover 50% of this corpus, and therefore this wordlist allows foreign learners to understand about half of the corpus.
After reaching the point of 3,500 of the most frequent words, the coverage index grows insignificantly, and this number can be considered to be a target in teaching and learning discipline-specific vocabulary. It is notable that the recommended lexical minimum, comprising 1,000–3,500 of the most frequent words, is only a starting point for reading comprehension of texts for professionals also referred to as ‘special’ texts. Their deeper and effective understanding also involves competence in rhetoric strategies and text structure.
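The coverage index driving the minimization can be sketched as the cumulative share of corpus tokens accounted for by the N most frequent lexemes (toy token list below; lemmatisation and stop-word handling, which the study applies to real Russian text, are omitted).

```python
from collections import Counter

# Coverage index: fraction of all corpus tokens accounted for by the
# n_top most frequent lexemes (sketch on a toy tokenised corpus).
def coverage(tokens, n_top):
    freqs = Counter(tokens)
    total = sum(freqs.values())
    top = sum(count for _, count in freqs.most_common(n_top))
    return top / total

tokens = "a a a b b c d a b c".split()
print(coverage(tokens, 2))  # 'a' (4) + 'b' (3) out of 10 tokens -> 0.7
```

Plotting this value against N produces the saturating curve described above, where growth becomes insignificant past a few thousand lexemes.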
25

Mosalev, A. I. "Parameterization of Educational Activities in an Educational Institution of Higher Education". Open Education 27, nr 3 (20.06.2023): 36–42. http://dx.doi.org/10.21686/1818-4243-2023-3-36-42.

Abstract:
Purpose of the study. Designing a programme of educational work is one of a university's most important tasks in shaping a graduate's personality. However, the rules for forming such a programme are only a framework, based on the state strategy for educational activities and on the profile of the educational institution itself. This "open" approach to programme design gives freedom in choosing trajectories of educational work with students, but it also limits the ability to fully account for the interests of all participants in the educational process. It is important to establish which parameters reflect the views not only of the institution's leadership and teaching staff, but also of the students themselves and their parents. Materials and methods. The study draws on regulatory documents and theoretical articles in the field of educational activities. Emphasis was placed on a systematic analysis of the relationship between educational practices and the value orientations of today's youth. To develop proposals, an online questionnaire survey was conducted among students, parents and legal guardians, teaching staff, faculty management, and representatives of departments responsible for educational activities. Results. Assessment maps for the parameters of educational work were developed covering spiritual, moral, and socio-axiological dimensions. The analysis made it possible to identify both the strengths of the university's educational practice and the weaknesses that deserve attention.
As a result, the approach described in the article allows an educational institution, when forming its own programme for educating students and determining each student's development trajectory, to understand what problems and misunderstandings may arise at the system level, and to decide which individual parameters of educational contexts to focus on. Conclusion. The author systematizes the practices of the educational process through a systemic decomposition of the individual components of programmes of educational work with university students. Questions requiring further research include what the indicative indexes and measurement tools should be, and how often such assessment should be carried out. For example, attention is drawn to which metrics would allow assessing the degree to which students develop inner qualities such as will, justice, and faith in goodness.
26

Agarwal, Anish K., Emily Seeburger, Gerald O’Neill, Chidinma C. Nwakanma, Lillian E. Marsh, Kevin Alexander Soltany, Eugenia C. South i Ari B. Friedman. "Prevalence of Behavioral Flags in the Electronic Health Record Among Black and White Patients Visiting the Emergency Department". JAMA Network Open 6, nr 1 (19.01.2023): e2251734. http://dx.doi.org/10.1001/jamanetworkopen.2022.51734.

Abstract:
Importance: Behavioral flags in the electronic health record (EHR) are designed to alert clinicians of potentially unsafe or aggressive patients. These flags may introduce bias, and understanding how they are used is important to ensure equitable care. Objective: To investigate the incidence of behavioral flags and assess whether there were differences between Black and White patients and whether the flags were associated with differences in emergency department (ED) clinical care. Design, Setting, and Participants: This was a retrospective cohort study of EHR data of adult patients (aged ≥18 years) from 3 Philadelphia, Pennsylvania, EDs within a single health system between January 1, 2017, and December 31, 2019. Secondary analyses excluded patients with sickle cell disease and high ED care utilization. Data were analyzed from February 1 to April 4, 2022. Main Outcomes and Measures: The primary outcome of interest was the presence of an EHR behavioral flag. Secondary measures included variation of flags across sex, race, age, insurance status, triage status, ED clinical care metrics (eg, laboratory, medication, and radiology orders), ED disposition (discharge, admission, or observation), and length of key intervals during ED care. Results: Participating EDs had 195 601 eligible patients (110 890 [56.7%] female patients; 113 638 Black patients [58.1%]; 81 963 White patients [41.9%]; median [IQR] age, 42 [28-60] years), with 426 858 ED visits. Among these, 683 patients (0.3%) had a behavioral flag notification in the EHR (3.5 flags per 1000 patients), and it was present for 6851 ED visits (16 flagged visits per 1000 visits). Patient differences between those with a flag and those without included male sex (56.1% vs 43.3%), Black race (71.2% vs 56.7%), and insurance status, particularly Medicaid insurance (74.5% vs 36.3%). Flag use varied across sites.
Black patients received flags at a rate of 4.0 per 1000 patients, and White patients received flags at a rate of 2.4 per 1000 patients (P < .001). Among patients with a flag, Black patients, compared with White patients, had longer waiting times to be placed in a room (median [IQR] time, 28.0 [10.5-89.4] minutes vs 18.2 [7.2-75.1] minutes; P < .001), longer waiting times to see a clinician (median [IQR] time, 42.1 [18.8-105.5] minutes vs 33.3 [15.3-84.5] minutes; P < .001), and shorter lengths of stay (median [IQR] time, 274 [135-471] minutes vs 305 [154-491] minutes; P = .01). Black patients with a flag underwent fewer laboratory (eg, 2449 Black patients with 0 orders [43.4%] vs 441 White patients with 0 orders [36.7%]; P < .001) and imaging (eg, 3541 Black patients with no imaging [62.7%] vs 675 White patients with no imaging [56.2%]; P < .001) tests compared with White patients with a flag. Conclusions and Relevance: This cohort study found significant differences in ED clinical care metrics, including that flagged patients had longer wait times and were less likely to undergo laboratory testing and imaging, which was amplified in Black patients.
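The headline rates can be reproduced directly from the counts stated in the abstract; this is simple arithmetic on numbers given above, nothing added.

```python
# Flag rates recomputed from the abstract's own counts.
patients, flagged_patients = 195_601, 683
visits, flagged_visits = 426_858, 6_851

per_1000_patients = flagged_patients / patients * 1000  # ~3.5 per 1000 patients
per_1000_visits = flagged_visits / visits * 1000        # ~16 per 1000 visits
print(round(per_1000_patients, 1), round(per_1000_visits))
```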
27

Wood, J., J. Heverhagen, X. Yang, G. Jia, R. M. Koch, R. Jacko, S. Sammet i in. "Imaging Core Lab Comparison of Five Clinical Liver Iron MRI Methods and Biopsy for Hepatic Iron Concentration (HIC) in Sickle Cell Disease (SCD) Patients." Blood 108, nr 11 (16.11.2006): 1217. http://dx.doi.org/10.1182/blood.v108.11.1217.1217.

Abstract:
Introduction: HIC is a surrogate for total body iron in pts with transfusional iron overload. MRI techniques based on signal intensity ratios (SIR) or relaxation rates (R2 and R2*) are increasingly accepted as surrogates to biopsy in determining HIC. MRI techniques have been validated using data acquired on different MRI platforms, but image processing has been restricted to labs where the techniques were developed. We tested the ability of an imaging core lab to process data from 4 different MRI techniques: (1) gradient echo (GE) SIR (Gandon et al); (2) spin echo (SE) SIR (Jensen et al); (3) R2 (Wood et al); and (4) R2* (Wood et al). R2 St. Pierre processing is proprietary and was not implemented by the core lab. The goal was to collect pilot data to test the feasibility of independent core lab assessment as well as assess inter-technique agreement and interobserver variability. Methods: Seven CICL670A0109 MRI/liver biopsy substudy pts (all at CHLA; one withdrew after baseline [BL] exam) participated in an open-label comparison of deferasirox and DFO in SCD pts. Liver biopsy (analyzed centrally at Ctre Hosp Univ, Rennes, France) was performed at BL and 1 yr, and MRIs at BL, 6 mo and 1 yr on a 1.5 T GE CVi scanner, system 9.1. The exam consisted of localization, a total of 15 breathhold acquisitions for GE-SIR, R2* and R2 (Wood), and 30 min of free-breathing SE acquisitions for SE-SIR and R2 (St Pierre). A MnCl2 phantom (13 vials, 0–24 mM) was scanned 5 times over 14 mo. De-identified DICOM images for SIRs, R2* and R2 (Wood) were analyzed at an imaging core lab. R2 and R2* maps were reconstructed by pixelwise fitting to a monoexponential plus an offset using custom (IDL) software. ROI selections were validated with the investigators responsible for developing the techniques (YG, PJ, JW). Results: BL biopsy HIC was 18.8±5.7 [11.7–28.3]. High HIC precluded successful HIC calculation by GE-SIR in 17/19 examinations.
In contrast, SE-SIR was measurable in all studies but demonstrated a large bias compared with biopsy (see table). All three relaxation rate techniques yielded unbiased HIC assessment with acceptable agreement. Unlike previous reports, the MRI techniques agreed better among themselves than with biopsy (R=0.75–0.93), suggesting that, even with standardized collection, biopsy variability represented a major source of error in this study. Interobserver variability had means of −1.5% to 2.6% and SD of 1.9% to 2.4%. Errors introduced by different fitting algorithms had mean values of 0.5%–1.4% and SD of 1.3%–3.1%. MnCl2 phantom MR values were stable over the study duration (COV=3.5% [1.2–8.0]).

Bland-Altman agreement and correlation coefficients (significant bias marked in italics in the original):

Technique | vs liver biopsy: Mean %, SD %, r | vs R2 (St Pierre): Mean %, SD %, r
R2, St Pierre | −2.9, 24.7, 0.72 | -, -, -
R2, Wood | −5.0, 24.3, 0.75 | −3.7, 14.0, 0.93
R2*, Wood | −0.1, 25.0, 0.67 | −4.9, 21.0, 0.85
SE-SIR, Jensen | 27.0, 33.0, 0.30 | 34.6, 48.6, 0.75

Conclusion: With higher HIC, R2 and R2* metrics are more generalizable than SIR metrics. Computational and observer-based errors introduced by differences in image post-processing are small compared with the errors introduced by biopsy sampling errors and intersubject variability in MRI-iron calibration.
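The per-pixel relaxometry behind the R2 and R2* maps fits a monoexponential plus an offset, S(TE) = S0·exp(−R2·TE) + C. With the offset neglected, the idea reduces to a two-point estimate, shown below on synthetic data (a simplified sketch, not the core lab's IDL pixelwise fitting code).

```python
import math

# Two-echo R2 estimate, valid when the constant offset C is negligible:
# S(TE) = S0 * exp(-R2 * TE)  =>  R2 = ln(S1/S2) / (TE2 - TE1)
def r2_two_point(s1, s2, te1, te2):
    return math.log(s1 / s2) / (te2 - te1)

# Synthetic decay with R2 = 50 s^-1 (in vivo, R2 rises with HIC),
# sampled at echo times of 5 and 15 ms.
S0, R2 = 1000.0, 50.0
s = lambda te: S0 * math.exp(-R2 * te)
est = r2_two_point(s(0.005), s(0.015), 0.005, 0.015)
print(est)  # recovers the simulated R2
```

The full fit uses many echoes per pixel precisely because real signals include noise and the offset term that this two-point shortcut ignores.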
28

Harlaux, Matthieu, Kalin Kouzmanov, Stefano Gialli, Oscar Laurent, Andrea Rielli, Andrea Dini, Alain Chauvet, Andrew Menzies, Miroslav Kalinaj i Lluís Fontboté. "Tourmaline as a Tracer of Late-Magmatic to Hydrothermal Fluid Evolution: The World-Class San Rafael Tin (-Copper) Deposit, Peru". Economic Geology 115, nr 8 (18.08.2020): 1665–97. http://dx.doi.org/10.5382/econgeo.4762.

Abstract:
The world-class San Rafael tin (-copper) deposit (central Andean tin belt, southeast Peru) is an exceptionally large and rich (>1 million metric tons Sn; grades typically >2% Sn) cassiterite-bearing hydrothermal vein system hosted by a late Oligocene (ca. 24 Ma) peraluminous K-feldspar-megacrystic granitic complex and surrounding Ordovician shales affected by deformation and low-grade metamorphism. The mineralization consists of NW-trending, quartz-cassiterite-sulfide veins and fault-controlled breccia bodies (>1.4 km in vertical and horizontal extension). They show volumetrically important tourmaline alteration that principally formed prior to the main ore stage, similar to other granite-related Sn deposits worldwide. We present here a detailed textural and geochemical study of tourmaline, aiming to trace the fluid evolution of the San Rafael magmatic-hydrothermal system that led to the deposition of tin mineralization. Based on previous works and new petrographic observations, three main generations of tourmaline of both magmatic and hydrothermal origin were distinguished and analyzed in situ for their major, minor, and trace element composition by electron microprobe analyzer and laser ablation-inductively coupled plasma-mass spectrometry, as well as for their bulk Sr, Nd, and Pb isotope compositions by multicollector-inductively coupled plasma-mass spectrometry. A first late-magmatic tourmaline generation (Tur 1) occurs in peraluminous granitic rocks as nodules and disseminations, which do not show evidence of alteration. This early Tur 1 is texturally and compositionally homogeneous; it has a dravitic composition, with Fe/(Fe + Mg) = 0.36 to 0.52, close to the schorl-dravite limit, and relatively high contents (10s to 100s ppm) of Li, K, Mn, light rare earth elements, and Zn.
The second generation (Tur 2)—the most important volumetrically—is pre-ore, high-temperature (>500°C), hydrothermal tourmaline occurring as phenocryst replacement (Tur 2a) and open-space fillings in veins and breccias (Tur 2b) and microbreccias (Tur 2c) emplaced in the host granites and shales. Pre-ore Tur 2 typically shows oscillatory zoning, possibly reflecting rapid changes in the hydrothermal system, and has a large compositional range that spans the schorl to dravite fields, with Fe/(Fe + Mg) = 0.02 to 0.83. Trace element contents of Tur 2 are similar to those of Tur 1. Compositional variations within Tur 2 may be explained by the different degree of interaction of the magmatic-hydrothermal fluid with the host rocks (granites and shales), in part because of the effect of replacement versus open-space filling. The third generation is syn-ore hydrothermal tourmaline (Tur 3). It forms microscopic veinlets and overgrowths, partly cutting previous tourmaline generations, and is locally intergrown with cassiterite, chlorite, quartz, and minor pyrrhotite and arsenopyrite from the main ore assemblage. Syn-ore Tur 3 has schorl-foititic compositions, with Fe/(Fe + Mg) = 0.48 to 0.94, that partly differ from those of late-magmatic Tur 1 and pre-ore hydrothermal Tur 2. Relative to Tur 1 and Tur 2, syn-ore Tur 3 has higher contents of Sr and heavy rare earth elements (10s to 100s ppm) and unusually high contents of Sn (up to >1,000 ppm). Existence of these three main tourmaline generations, each having specific textural and compositional characteristics, reflects a boron-rich protracted magmatic-hydrothermal system with repeated episodes of hydrofracturing and fluid-assisted reopening, generating veins and breccias. Most trace elements in the San Rafael tourmaline do not correlate with Fe/(Fe + Mg) ratios, suggesting that their incorporation was likely controlled by the melt/fluid composition and local fluid-rock interactions.
The initial radiogenic Sr and Nd isotope compositions of the three aforementioned tourmaline generations (0.7160–0.7276 for 87Sr/86Sr(i) and 0.5119–0.5124 for 143Nd/144Nd(i)) mostly overlap those of the San Rafael granites (87Sr/86Sr(i) = 0.7131–0.7202 and 143Nd/144Nd(i) = 0.5121–0.5122) and support a dominantly magmatic origin of the hydrothermal fluids. These compositions also overlap the initial Nd isotope values of Bolivian tin porphyries. The initial Pb isotope compositions of tourmaline show larger variations, with 206Pb/204Pb(i), 207Pb/204Pb(i), and 208Pb/204Pb(i) ratios mostly falling in the range of 18.6 to 19.3, 15.6 to 16.0, and 38.6 to 39.7, respectively. These compositions partly overlap the initial Pb isotope values of the San Rafael granites (206Pb/204Pb(i) = 18.6–18.8, 207Pb/204Pb(i) = 15.6–15.7, and 208Pb/204Pb(i) = 38.9–39.0) and are also similar to those of other Oligocene to Miocene Sn-W ± Cu-Zn-Pb-Ag deposits in southeast Peru. Rare earth element patterns of tourmaline are characterized, from Tur 1 to Tur 3, by decreasing (Eu/Eu*)N ratios (from 20 to 2) that correlate with increasing Sn contents (from 10s to >1,000 ppm). These variations are interpreted to reflect evolution of the hydrothermal system from reducing toward relatively more oxidizing conditions, still in a low-sulfidation environment, as indicated by the pyrrhotite-arsenopyrite assemblage. The changing textural and compositional features of Tur 1 to Tur 3 reflect the evolution of the San Rafael magmatic-hydrothermal system and support the model of fluid mixing between reduced, Sn-rich magmatic fluids and cooler, oxidizing meteoric waters as the main process that caused cassiterite precipitation.
29

Lieber, Sarah B., Musarrat Nahid, Alexandra Legge, Mangala Rajan, Robyn A. Lipschultz, Myriam Lin, M. Carrington Reid i Lisa A. Mandl. "Comparison of two frailty definitions in women with systemic lupus erythematosus". Rheumatology, 9.08.2023. http://dx.doi.org/10.1093/rheumatology/kead393.

Abstract:
Objectives: Frailty is a risk factor for adverse health in systemic lupus erythematosus (SLE). The Fried phenotype (FP) and the Systemic Lupus International Collaborating Clinics Frailty Index (SLICC-FI) are common frailty metrics reflecting distinct approaches to frailty assessment. We aimed to (1) compare frailty prevalence according to both metrics in women with SLE and describe differences between frail and non-frail participants using each method and (2) evaluate for cross-sectional associations between each metric and self-report disability. Methods: Women aged 18–70 years with SLE were enrolled. FP and SLICC-FI were measured, and agreement calculated using a kappa statistic. Physician-reported disease activity and damage, Patient Reported Outcome Measurement Information System (PROMIS) computerized adaptive tests, and Valued Life Activities (VLA) self-report disability were assessed. Differences between frail and non-frail participants were evaluated cross-sectionally, and the association of frailty with disability was determined for both metrics. Results: Of 67 participants, 17.9% (FP) and 26.9% (SLICC-FI) were frail according to each metric (kappa = 0.41, p < 0.01). Compared with non-frail women, frail women had greater disease damage, worse PROMIS scores, and greater disability (all p < 0.01 for FP and SLICC-FI). After age adjustment, frailty remained associated with a greater odds of disability (FP: odds ratio [OR] 4.7, 95% confidence interval [CI] 1.2–18.8; SLICC-FI: OR 4.6, 95% CI 1.3–15.8). Conclusion: Frailty is present in 17.9–26.9% of women with SLE. These metrics identified a similar, but non-identical group of women as frail. Further studies are needed to explore which metric is most informative in this population.
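The kappa statistic used above measures agreement between the two frailty classifications beyond what chance would produce. A stdlib sketch of Cohen's kappa on paired binary frail/non-frail labels (toy labels, not the study's data):

```python
# Cohen's kappa for two binary raters/classifiers:
# kappa = (observed agreement - chance agreement) / (1 - chance agreement)
def cohen_kappa(a, b):
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n              # observed agreement
    p_frail = (sum(a) / n) * (sum(b) / n)                   # chance both say frail
    p_not = ((n - sum(a)) / n) * ((n - sum(b)) / n)         # chance both say non-frail
    pe = p_frail + p_not
    return (po - pe) / (1 - pe)

# Toy paired labels (1 = frail): illustrative only.
fp_labels =    [1, 1, 0, 0, 0, 0, 0, 1, 0, 0]
slicc_labels = [1, 0, 0, 1, 0, 0, 0, 1, 1, 0]
print(round(cohen_kappa(fp_labels, slicc_labels), 2))
```

A kappa near 0.41, as reported, is conventionally read as moderate agreement: overlapping but non-identical frail groups, exactly as the conclusion states.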
30

Zabarsky, Zachary, Alejandro Marquez‐Lara, T. David Luo, Mark Van Dyke i Thomas L. Smith. "The Effect of Keratin on Spinal Cord Injury Recovery". FASEB Journal 30, S1 (kwiecień 2016). http://dx.doi.org/10.1096/fasebj.30.1_supplement.993.10.

Abstract:
Objective: To evaluate the ability of keratin nanomaterials to create an optimal environment for nerve regeneration and function restoration after spinal cord injury (SCI). Introduction: Inflammation plays a vital role in the recovery after SCI. Recent studies have shown that macrophage polarization to the M2 anti-inflammatory phenotype attenuates the inflammatory response, preventing the secondary injury due to inflammation and resulting in improved outcomes following SCI. Keratin nanomaterials may allow for macrophage polarization; therefore, we tested the hypothesis that keratin treatments will improve recovery in a rat contusion model of spinal cord injury. Methods: All aspects of the described protocol were reviewed and approved by the Institution's Animal Care and Use Committee. Female Lewis rats were trained and baseline gait recordings were obtained on a DigiGait Imaging System treadmill for two weeks before receiving injury. Blunt SCI was induced with an IH-0400 (Precision Systems and Instrumentation LLC) spinal cord impactor following laminectomy at T9 with a consistent force of 150 kdyn, producing a moderate injury with partial recovery. Treatment groups included intrathecal administration of keratin and a saline control. Functional recovery data assessed by the DigiGait were observed at 3 and 6 weeks post-SCI and compared to pre-SCI recordings and data. Results: Gait metrics were analyzed at pre-SCI and post-operative (PO) timepoints. Table 1 includes the percent changes from pre-SCI recordings of gait metrics at 3 and 6 weeks PO for both hind limbs in saline control and keratin treatment groups. Brake duration was decreased in the control rat as compared to the keratin treatment rat. Overlap distance was increased in the control rat compared to the treatment rat.
Paw angle and swing duration, compared to pre-operative measures, showed altered ambulation following injury in both groups with varying levels of recovery. Conclusions: The quantitative data obtained from the DigiGait Imaging System elucidate the altered ambulation following SCI in rats and show improved scores for the keratin-treated group on several parameters. The DigiGait analyzes up to 40 different gait metrics, and analysis of additional parameters could show differences between treated groups and controls. Keratin treatment appears beneficial for some parameters such as brake duration, as longer brake duration may indicate more precise control and distribution of loading during the stance phase. More open paw angles of the hind paws are associated with ataxia, spinal cord injury, and demyelinating diseases. Future studies will add histological data to gait analysis to document recovery following SCI in rats and will include additional keratin treatment delivery methods. Support or Funding Information: Errett Fisher Foundation.

Table 1. DigiGait Imaging System gait analysis of SCI rats: percent changes of gait metrics compared to baseline recordings for a non-treated control rat and a keratin-treated rat.

Group    Timepoint   Brake (sec)  Paw angle (deg)  Swing (sec)  Overlap distance (cm)
Control  3 weeks L   -37%         +34.7%           -2.2%        +529.6%
Control  3 weeks R   -37.5%       +17.1%           +2.5%        +181.7%
Control  6 weeks L   -40.1%       +329.9%          +8.1%        +637.8%
Control  6 weeks R   -38.5%       +7.1%            +20.5%       +240.6%
Keratin  3 weeks L   -15%         +157.39%         +20.4%       +92.9%
Keratin  3 weeks R   +9.3%        +45.3%           +30.3%       +65.6%
Keratin  6 weeks L   -13.4%       +148.7%          +2.7%        +67.5%
Keratin  6 weeks R   +3.1%        +45.9%           +18.8%       +193.3%
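The "percent change compared to baseline" metric used throughout the gait analysis is the ordinary relative change; a minimal sketch (the sample values are hypothetical, not the study's raw recordings):

```python
def percent_change(baseline, post):
    """Percent change of a gait metric relative to the pre-SCI baseline."""
    return (post - baseline) / baseline * 100.0

# Hypothetical brake-duration values (sec): baseline 0.10, post-SCI 0.063
print(round(percent_change(0.10, 0.063), 1))  # -37.0
```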
APA, Harvard, Vancouver, ISO, etc. styles
31

Wutke, Andrew. "From Newton to universal Planck natural units – disentangling the constants of nature". Journal of Physics Communications, 5.10.2023. http://dx.doi.org/10.1088/2399-6528/ad0090.

Full text source
Abstract:
Abstract: This study exploits a historical gap in the evolution of metric systems that resulted from incomplete implementation of the "rationalization" concept published by Heaviside in 1893 and from ignoring Maxwell's 1873 suggestion to use the simplest form of Newton's gravitational law, with no proportionality constant. Bridging this gap required deriving an experimental Rationalized Metric System (RMS) and a corresponding Universal Planck Natural Unit System (UPNUS) in [LT] units. The described solution combines Heaviside's rationalization of Newton's law and gives the unit of mass the dimensions [L³T⁻²], as suggested by Maxwell. Consequently, the modified Coulomb's law changes the unit of electric charge to the same dimensions as those of mass. The elimination of the kilogram and the ampere has a disentangling effect on the dependencies among the constants of nature and opens new horizons. The new systems have the potential to become powerful exploratory tools in fundamental research and education because of the simplification of the relationships among physical quantities. Noteworthy highlights from the analyzed examples include the following: The well-known expression for the Stoney mass (mS), when converted to RMS units, reduces to the electron charge quantity, whereas traditional metric systems entangle the charge, the speed of light, and the gravitational constant, forming an entity in the dimension of mass, as first presented by Stoney in 1874. A well-substantiated conjecture is proposed, wherein the Stoney energy ES = mS c² is nothing but the long-sought, finite electric field energy of the electron, and the gravitational constant appears to be the limiting factor. In UPNUS, the most disentangled fundamental expression, apart from the Stoney mass, is the elementary charge ӗ as a function of the fine structure constant α and the Planck mass (m̆P): ӗ = m̆P √α ≈ 1.073 476, with ӗ, m̆P of [L³T⁻²] dimensions in Planck units, and m̆P = 4π
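For orientation, the quoted relation ӗ = m̆P √α mirrors a familiar SI-units identity: the Stoney mass equals the Planck mass times √α. A quick numerical check (CODATA-style constants hard-coded here; this is an illustration, not the paper's RMS/UPNUS computation):

```python
import math

# CODATA-style constants (SI units)
hbar = 1.054571817e-34   # reduced Planck constant, J*s
c = 2.99792458e8         # speed of light, m/s
G = 6.67430e-11          # gravitational constant, m^3 kg^-1 s^-2
alpha = 7.2973525693e-3  # fine-structure constant, dimensionless

m_planck = math.sqrt(hbar * c / G)      # ~2.176e-8 kg
m_stoney = m_planck * math.sqrt(alpha)  # ~1.86e-9 kg

print(f"Planck mass: {m_planck:.4e} kg")
print(f"Stoney mass: {m_stoney:.4e} kg")
```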
APA, Harvard, Vancouver, ISO, etc. styles
32

İPEK, Ömer. "TWO TEXTS ON THE REFLECTIONS OF OTTOMAN MODERNIZATION IDEAS IN ENGLAND". Arşiv Dünyası, 23.06.2022. http://dx.doi.org/10.53474/ad.1127415.

Full text source
Abstract:
Turkish modernization is seen as a development driven by innovations in the educational and military fields. The inability of the economic and political model of the Ottoman Empire to react to the new conditions imposed by the world system, as well as shortcomings in the military field due to the inadequacy of modern techniques, led to the emergence of a new model of society from the end of the 18th century. In addition to the emergence of the concept of the nation as a result of the change brought about by the Enlightenment spirit of the French Revolution, the industrial revolution that began in England and the Russian pressure that manifested itself on the military plane were the triggers for efforts to keep up with the times in different areas. In this context, a text published in English as early as 1892 by a Turkish writer (İbrahim Hakkı Bey), and the criticism of this text by an English writer (Hyde Clarke), are important in showing Turkey's efforts to prove and explain its modernization and progress. These two texts are very informative, as they show the British approach to topics such as education, administration, culture, finance and even the sultanate and caliphate. In this article, we discuss the British approach to Turkish modernization through the article "Is Turkey Progressing?", published in the British press on Ottoman and Turkish modernization efforts, and the debate it sparked.
APA, Harvard, Vancouver, ISO, etc. styles
33

Dosalwar, Sharayu, Ketki Kinkar, Rahul Sannat and Nitin Pise. "Analysis of Loan Availability using Machine Learning Techniques". International Journal of Advanced Research in Science, Communication and Technology, 4.09.2021, 15–20. http://dx.doi.org/10.48175/ijarsct-1895.

Full text source
Abstract:
In the banking system, banks have a variety of products to offer, but credit lines are their primary source of revenue, as they profit from the interest earned on the loans they make. Whether customers repay or default on their loans affects a bank's profit or loss, and forecasting loan defaulters reduces the bank's non-performing assets. As a result, further investigation into this phenomenon is essential. Because precise forecasts are essential for profit maximisation, it is crucial to analyse and compare the various methodologies. The logistic regression model is an important predictive analytics tool for detecting loan defaulters. Data from Kaggle was acquired for assessment and forecasting, and logistic regression models were used to calculate the various performance indicators. The models are compared using performance metrics such as sensitivity and specificity. The model is significantly better because, in addition to checking account details (which indicate a customer's wealth), it includes variables (customer personal attributes such as age, objective, credit score, credit amount, credit period, and so on) that should be considered when correctly calculating the probability of loan default. As a result, using a logistic regression approach, the appropriate clients to target for loan issuance can be easily identified by evaluating their probability of loan default. The model implies that, in addition to giving loans to wealthy borrowers, a bank should assess a creditor's other attributes, which play a critical role in credit decisions and in forecasting loan defaulters.
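The workflow the abstract describes — fit a logistic regression, then compare models on sensitivity and specificity — can be sketched on synthetic data (the features, coefficients, and threshold below are illustrative stand-ins, not the paper's Kaggle dataset):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for credit data: two borrower features
# (e.g. scaled credit amount and credit score) and a default label.
n = 1000
X = rng.normal(size=(n, 2))
y = (X[:, 0] + X[:, 1] + 0.3 * rng.normal(size=n) > 0).astype(float)

# Logistic regression fit by plain gradient descent (no library dependency)
Xb = np.hstack([np.ones((n, 1)), X])   # add intercept column
w = np.zeros(3)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-Xb @ w))  # predicted default probability
    w -= 0.1 * Xb.T @ (p - y) / n      # gradient step on the log-loss

pred = (1.0 / (1.0 + np.exp(-Xb @ w)) >= 0.5).astype(float)

# Performance metrics of the kind used to compare models in the paper
tp = np.sum((pred == 1) & (y == 1))
tn = np.sum((pred == 0) & (y == 0))
fp = np.sum((pred == 1) & (y == 0))
fn = np.sum((pred == 0) & (y == 1))
sensitivity = tp / (tp + fn)  # true-positive rate: defaulters caught
specificity = tn / (tn + fp)  # true-negative rate: good borrowers kept
print(f"sensitivity={sensitivity:.2f}, specificity={specificity:.2f}")
```

On real data the same confusion-matrix bookkeeping applies; only the feature matrix and labels change.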
APA, Harvard, Vancouver, ISO, etc. styles
34

Martinez, Antonio Lopo, Jorge Luiz de Santana Júnior and Thiago Rios Sena. "Tax aggressiveness as a determining factor of conditional conservatism in Brazil". Revista Contabilidade & Finanças 33, no. 90 (2022). http://dx.doi.org/10.1590/1808-057x20221484.en.

Full text source
Abstract:
ABSTRACT: This paper investigates whether the degree of tax aggressiveness is associated with conditional conservatism in Brazil. After a thorough literature review on accounting conservatism by Brazilian academia and a discussion of tax aggressiveness and accounting conservatism, a literature gap was found, because the relationship between a firm's degree of tax aggressiveness and its conditional conservatism had not been investigated previously. Taxable income is related to accounting income in the Brazilian corporate income tax system; hence, tax planning can affect the properties of financial information. This study offers a partial explanation of accounting conservatism based on tax issues, contributing to the conservatism and taxation literature. The results suggest that tax strategies aimed at avoiding the tax burden are related to conditionally conservative accounting. Hence, the practice of conditional conservatism in Brazil appears to be linked with tax-deductible alternatives for reducing earnings, which would explain tax planning's association with the degree of conditional conservatism in financial reporting. This finding is relevant to financial reporting users, who can consider our results in their analysis, and to managers seeking to better understand their tax planning decisions. For this research purpose, two Basu models (Basu, 1997) were adopted, adapted with tax-aggressiveness controls. The effective tax rate (ETR) was used as the tax-aggressiveness metric, controlling for firms with both high and low ETR. The study period was from 2010 to 2019 for Brazilian firms listed on B3 S.A. - Brasil, Bolsa, Balcão (B3). The findings demonstrate a significant relationship between tax avoidance and conditional conservatism; that is, more tax-aggressive firms tend to use more conservative accounting.
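The Basu (1997) specification regresses earnings on returns, a bad-news dummy, and their interaction, with the interaction coefficient capturing conditional conservatism (timelier recognition of bad news). A minimal sketch on synthetic firm-year data (all variables and coefficient values here are hypothetical, not the paper's estimates):

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic firm-year data: stock returns and a bad-news indicator
n = 500
ret = rng.normal(0.0, 0.2, size=n)  # annual stock return R
neg = (ret < 0).astype(float)       # bad-news dummy D

# Conservative earnings: bad news is recognized more timely, so the
# interaction D*R carries an extra positive loading (0.30 here).
earnings = 0.05 + 0.10 * ret + 0.30 * neg * ret + rng.normal(0.0, 0.01, size=n)

# OLS fit of the Basu model: E = a0 + a1*D + b0*R + b1*D*R
X = np.column_stack([np.ones(n), neg, ret, neg * ret])
coef, *_ = np.linalg.lstsq(X, earnings, rcond=None)
a0, a1, b0, b1 = coef
print(f"good-news timeliness b0={b0:.2f}, incremental bad-news timeliness b1={b1:.2f}")
```

A significantly positive b1 is the conditional-conservatism signature; the paper's adaptation adds ETR-based tax-aggressiveness controls to this baseline.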
APA, Harvard, Vancouver, ISO, etc. styles
35

Nordén, Bengt. "The Mole, Avogadro’s Number and Albert Einstein". Molecular Frontiers Journal, 24.03.2021, 1–13. http://dx.doi.org/10.1142/s2529732521400010.

Full text source
Abstract:
The mole concept and Avogadro's number are discussed as sought by Albert Einstein in his PhD thesis of 1905. Einstein would probably have regarded the metric system of units based on the centimetre-gram-second (cgs) as preferable to today's SI system, and specifically he would have rejected a recent SI suggestion to redefine Avogadro's constant based on a non-atomistic continuum description of matter. He would probably also have preferred keeping a dualistic definition of the mole capable of bookkeeping both mass and number of particles: we advocate that here and call it the 'Einstein Definition'. As Avogadro's number we shall adopt an integer, the cube of 84446888, as suggested by Fox and Hill, providing also a definition of the kilogram based on the atomic mass of the carbon-12 isotope. Einstein was the first to explain the microscopic movements of pollen grains reported by Robert Brown in 1828, and his explanation that the particles move as a result of an unequal number of water molecules bumping into them from opposite sides was what finally made the scientific world accept the atomic theory in its modern shape. In a cosmic diffusion analogy, pollen or bacterial spores moving randomly in outer space, driven by the solar winds between solar systems, can be envisaged. Applying Einstein's diffusion theory, one can argue that life might have emerged from far outside our planet, from billions of solar systems, though not from outside our Milky Way galaxy. As a curiosity we note that the number of solar systems (stars) in the Universe has been estimated to be of the order of Avogadro's number.
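As a quick arithmetic check of the Fox–Hill proposal mentioned in the abstract, the cube of 84446888 indeed lands within about 10⁻⁷ of the CODATA value of Avogadro's number:

```python
n_avogadro = 84446888 ** 3  # exact integer, ~6.0221414e23
codata = 6.02214076e23      # 2019 SI exact value of N_A

print(f"84446888^3 = {n_avogadro:e}")
print(f"relative difference from CODATA: {abs(n_avogadro - codata) / codata:.2e}")
```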
APA, Harvard, Vancouver, ISO, etc. styles
36

Rahimi, Sadeq. "Identities without a Reference". M/C Journal 3, no. 3 (1.06.2000). http://dx.doi.org/10.5204/mcj.1847.

Full text source
Abstract:
The process of modernisation can be understood to have contributed to a radical loss of collective and individual orientation, by depriving geography of identity, and replacing ‘place’ by ‘space’. "Space", writes Klapp, "robs identity. Place, on the other hand, nurtures it, tells you who you are" (28). If the replacement of place by space is an achievement of modernity, the replacement of space by time can be considered a postmodern hallmark. The fact is that cultures are now bounded as entities more in time than in space, and "time depth now prevails over field depth" (Virilio 24). It is, in other words, the unfolding of time that reflects change more immediately than does spatial distance. In fact space itself is now defined by time. The 200-year development course of the meter as the unit of length, for example, displays an interesting parallel to the space-time transition. In 1793 the French government decided the unit of length to be 10⁻⁷ of the earth's quadrant passing through Paris and to be called meter. It became clear in further examinations that the earth's quadrant had been miscalculated, but this discovery did not stop the use of the unit. Initially referred to as "meter of the archives", the unit was announced in 1799 to be based on a measurement of a meridian between Dunkirk and Barcelona, embodied by a rectangular platinum bar with polished parallel ends. This bar, which was supposed to equal one ten-millionth (10⁻⁷) part of the quadrant of the earth, went on to serve as the international standard of length throughout the 19th century. In 1872, the length was set as the official definition of meter by the International Commission of the Meter, even though it was admitted that "its relationship to a quadrant of the earth was tenuous and of little consequence anyway" [1].
The original bar was then replaced by another platinum-iridium line tool which was christened "the international prototype meter", and its 'copies' were distributed among member countries of the International Metric Convention in 1889. This definition was to serve as the reference of length until the mid-twentieth century. In 1960, following decades of deliberation, the meter was redefined at the Eleventh General Conference on Weights and Measures as "1,650,763.73 vacuum wavelengths of light resulting from unperturbed atomic energy level transition 2p10 − 5d5 of the krypton isotope having an atomic weight of 86". This is an interesting development, because now the concept of length is removed from a geographical reference like the distance between Dunkirk and Barcelona to a 'virtual', non-geographical space like the distance between the peaks of the sine waves of a certain type of light. Finally, in 1983, the meter was redefined once again. This time the definition refers directly to time as the unit for measuring space. The meter is currently defined as "the length of the path travelled by light in vacuum during a time interval of 1/299,792,458 of a second". A fast glance at this history reveals the absence of a 'real' reference for what we have come to accept as the 'standard' unit, and the unstable nature of this unit. More intriguingly, the course of development of this definition portrays the gradual progression of reference from geographic place to virtual space and from there to time. Shrinking Time: If the modern question of identity concerned locality and spatial reference, what informs the question of identity in the postmodern condition is primarily defined by temporal locatedness and virtual geography or even virtual space. This progression then causes speed to inevitably inform the issue of identification reference.
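The successive meter definitions traced above can be checked against each other numerically; a small sketch (the constants are the standard published values, not taken from the article):

```python
# 1960 definition: 1 m = 1,650,763.73 wavelengths of a krypton-86 line.
# Implied wavelength of that radiation:
wavelength_m = 1 / 1_650_763.73
print(f"krypton-86 line: {wavelength_m * 1e9:.2f} nm")  # orange light, ~605.78 nm

# 1983 definition: 1 m = distance light travels in 1/299,792,458 s.
c = 299_792_458           # speed of light, m/s (exact by definition)
print(c * (1 / c))        # recovers 1 metre, up to floating-point rounding
```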
The speed of environmental change is gradually approaching a point where identity could lack a reference, a precedence with which to identify oneself. The conflict is fundamental: if self-identification has traditionally always already implied a reference in time, then acceleration is inherently the enemy of identity, by continuously curtailing the ‘stuff’ identity is made of. It is not a coincidence perhaps that the concerns of social sciences have gradually moved from being able to predict the future to being content with simply explaining the present, as the high speed of change leaves little room for the luxury of prediction. If we describe the postmodern condition as a condition where ‘the critical referential distance’ of identity approaches zero (the contraction of time), then the increase in speed of change can, theoretically at least, lead to a reversal of the orders of reference (see above).
This may in fact be conceptualised as a reversal in the order of signification, so that the signifier precedes the signified. Though extremely important for a theory of posthuman identity, the possibility and implications of such a reversal are not within the scope of the present paper. Presently applicable, however, is the more-or-less current postmodern predicament, within which self-identification seems to be running short of reference. To imagine a system of meaning wherein the act of self-identification (as traditionally done by humans) is unfeasible is to imagine a constant state of flux, a seamless ocean of meaning, a state traditionally considered pathological and diagnosed schizoid: a "smooth space," which is "in principle infinite, open, and unlimited in every direction"; and which "has neither top nor bottom nor centre" (Deleuze & Guattari 476). It is not difficult to realise that the ‘self’ native to this environment cannot be the human self we are familiar with. In the words of Gergen, the postmodern self resides in "a continuous state of construction and reconstruction", a fluid landscape where "each reality of self gives way to reflexive questioning, irony, and ultimately the playful probing of yet another reality", a reality where "the centre fails to hold" (6). While such a conception of a posthuman to come may appear fantastic, the undeniable fact is that the postmodern condition is constantly expanding its reach, erasing boundaries, transforming nations, and dissolving temporal horizons. "Here as elsewhere, in our ordinary everyday life", writes Virilio, "we are passing from the extensive time of history to the intensive time of an instantaneity without history made possible by the technologies of the hour" (24-5).
Conclusion: As the progression of speed renders space and time as constituents of human reality less inflexible, it becomes imperative for any new theory of identity to accommodate a conception of ‘identity’ ultimately unconstrained by these grids. Such a theoretical argument, however, needs to be accompanied by serious political considerations. Despite the specific philosophical perspective endorsed through the language of this paper, and while accepting Bauman’s suggestion that "identity is a name given to the sought escape from uncertainty" (82), I would insist nonetheless that political and clinical concerns demand certain concepts-to-work-with, certain constructions meant to ‘translate’ Being into the human reality. True, such translation spells ‘violence’, but the fact is that in a final analysis violence appears as the ‘other’ name for being, and any semiotic construction of the world always already exists through a systemised (if partial) negation of Being. That is to say, a philosophical appreciation of the void behind the term "identity" does not necessarily render a conceptualisation of identity futile. The challenge, however, may lie in gradually freeing the concept, so as to move as far as possible from positivistic reification towards the least rigid conceptualisations permitted within the current discourse of a given era. Currently, for example, the notions of change and fluidity championed by postmodern thinkers may provide useful metaphors towards such liberation of the working concept. Footnotes: [1] The information on the history of the meter is from the National Institute of Standards and Technology of the United States Government, via the Manufacturing Engineering Lab website: http://www.mel.nist.gov. References: Bauman, Z. Life in Fragments: Essays in Postmodern Morality. Cambridge: Blackwell, 1995. Deleuze, G., and F. Guattari. Anti-Oedipus: Capitalism and Schizophrenia. Trans. Robert Hurley, Mark Seem, and Helen Lane.
New York: Viking, 1977. Gergen, K. J. The Saturated Self: Dilemmas of Identity in Contemporary Life. New York: Basic Books, 1991. Klapp, O. E. Collective Search for Identity. New York: Holt, Rinehart and Winston, 1969. Fraser, J. T. "An Embarrassment of Proper Times: A Foreword." Time: Modern and Postmodern Experience. By Helga Nowotny. Cambridge, UK: Polity Press, 1994. Virilio, P. Polar Inertia. Trans. Patrick Camiller. London: Sage Publications, 2000. Citation reference for this article (MLA style): Sadeq Rahimi. "Identities without a Reference: Towards a Theory of Posthuman Identity." M/C: A Journal of Media and Culture 3.3 (2000). <http://www.api-network.com/mc/0006/identity.php>.
APA, Harvard, Vancouver, ISO, etc. styles
