Academic literature on the topic 'Average Threshold Crossing'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Average Threshold Crossing.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate a bibliographic reference for the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Average Threshold Crossing"

1

Xue, Xiao, Maximilian Russ, Nodar Samkharadze, Brennan Undseth, Amir Sammak, Giordano Scappucci, and Lieven M. K. Vandersypen. "Quantum logic with spin qubits crossing the surface code threshold." Nature 601, no. 7893 (January 19, 2022): 343–47. http://dx.doi.org/10.1038/s41586-021-04273-w.

Abstract:
High-fidelity control of quantum bits is paramount for the reliable execution of quantum algorithms and for achieving fault tolerance: the ability to correct errors faster than they occur. The central requirement for fault tolerance is expressed in terms of an error threshold. Whereas the actual threshold depends on many details, a common target is the approximately 1% error threshold of the well-known surface code. Reaching two-qubit gate fidelities above 99% has been a long-standing major goal for semiconductor spin qubits. These qubits are promising for scaling, as they can leverage advanced semiconductor technology. Here we report a spin-based quantum processor in silicon with single-qubit and two-qubit gate fidelities, all of which are above 99.5%, extracted from gate-set tomography. The average single-qubit gate fidelities remain above 99% when including crosstalk and idling errors on the neighbouring qubit. Using this high-fidelity gate set, we execute the demanding task of calculating molecular ground-state energies using a variational quantum eigensolver algorithm. Having surpassed the 99% barrier for the two-qubit gate fidelity, semiconductor qubits are well positioned on the path to fault tolerance and to possible applications in the era of noisy intermediate-scale quantum devices.
2

Yanyo, L. C., and F. N. Kelley. "Effect of Chain Length Distribution on the Tearing Energy of Silicone Elastomers." Rubber Chemistry and Technology 60, no. 1 (March 1, 1987): 78–88. http://dx.doi.org/10.5254/1.3536123.

Abstract:
The tearing energies of two end-linked PDMS networks, one with a monomodal distribution of chain lengths and the other a bimodal mixture of very short and rather long chains, at the same average molecular weight between crosslinks, were compared. The bimodal network exhibited higher tearing strengths than the monomodal network under the same experimental conditions. At threshold conditions, the bimodal network tearing energy was 70% higher than the threshold strength of the monomodal network. A rederivation of the Lake and Thomas theory for the threshold tearing strength that includes a bimodal probability distribution of chain lengths is shown to predict the observed behavior. The strength increase of these bimodal networks is attributed to the long chains, which increase the energy required for fracture, while the large number of short chains maintains the same number of chains crossing the fracture plane as in a monomodal network of the same crosslink density.
3

Yacoub, M. D., C. R. C. M. da Silva, and J. E. Vargas B. "Level crossing rate and average fade duration for pure selection and threshold selection diversity-combining systems." International Journal of Communication Systems 14, no. 10 (2001): 897–907. http://dx.doi.org/10.1002/dac.514.

4

Di Mascio, Paola, and Laura Moretti. "Hourly Capacity of a Two Crossing Runway Airport." Infrastructures 5, no. 12 (December 4, 2020): 111. http://dx.doi.org/10.3390/infrastructures5120111.

Abstract:
At the international level, interest in airport capacity has grown in recent years because maximizing it ensures the best performance of the infrastructure. However, infrastructure, procedural, and human-factor constraints must be considered to ensure a safe and regular flow of flights. This paper analyzed the capacity of an airport with two crossing runways. Fast-time simulation allowed modeling of the baseline scenario (current traffic volume and composition) and six operational scenarios; for each scenario, traffic was increased up to double the current volume. The results, in terms of average delay and throughput, were analyzed to identify the best-performing operational layout and the one most suitable for managing increasing hourly movements within the threshold delay of 10 min. The results refer to the specific layout examined, and all input data were provided by the airport management body: the results are reliable, and the approach pursued could be applied to other airports.
5

Chesher, Andrew, and Adam M. Rosen. "What Do Instrumental Variable Models Deliver with Discrete Dependent Variables?" American Economic Review 103, no. 3 (May 1, 2013): 557–62. http://dx.doi.org/10.1257/aer.103.3.557.

Abstract:
We compare nonparametric instrumental variables (IV) models with linear models and 2SLS methods when dependent variables are discrete. A 2SLS method can deliver a consistent estimator of a Local Average Treatment Effect but is not informative about other treatment effect parameters. The IV models set identify a range of interesting structural and treatment effect parameters. We give set identification results for a counterfactual probability and an Average Treatment Effect in an IV binary threshold crossing model. We illustrate using data on female employment and family size (employed by Joshua Angrist and William Evans (1998)) and compare with their LATE estimates.
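For readers less familiar with the terminology, the generic textbook form of a binary threshold crossing model (a sketch of the standard setup, not necessarily the exact specification used by Chesher and Rosen) can be written in LaTeX as

Y = \mathbf{1}\{\, h(X) > U \,\}, \qquad U \mid Z \sim \mathrm{Unif}(0,1), \qquad U \perp Z,

so that P(Y = 1 \mid X = x) = h(x) would hold if X were exogenous; replacing exogeneity of X with the instrumental restriction U \perp Z is what leads to set identification, rather than point identification, of objects such as the Average Treatment Effect.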
6

Steffen, Will, Johan Rockström, Katherine Richardson, Timothy M. Lenton, Carl Folke, Diana Liverman, Colin P. Summerhayes, et al. "Trajectories of the Earth System in the Anthropocene." Proceedings of the National Academy of Sciences 115, no. 33 (August 6, 2018): 8252–59. http://dx.doi.org/10.1073/pnas.1810141115.

Abstract:
We explore the risk that self-reinforcing feedbacks could push the Earth System toward a planetary threshold that, if crossed, could prevent stabilization of the climate at intermediate temperature rises and cause continued warming on a “Hothouse Earth” pathway even as human emissions are reduced. Crossing the threshold would lead to a much higher global average temperature than any interglacial in the past 1.2 million years and to sea levels significantly higher than at any time in the Holocene. We examine the evidence that such a threshold might exist and where it might be. If the threshold is crossed, the resulting trajectory would likely cause serious disruptions to ecosystems, society, and economies. Collective human action is required to steer the Earth System away from a potential threshold and stabilize it in a habitable interglacial-like state. Such action entails stewardship of the entire Earth System—biosphere, climate, and societies—and could include decarbonization of the global economy, enhancement of biosphere carbon sinks, behavioral changes, technological innovations, new governance arrangements, and transformed social values.
7

Shi, Xue Chao, Zhen Zhong Sun, and Sheng Lin Lu. "A Novel Method of Sub-Pixel Linear Edge Detection Based on First Derivative Approach." Advanced Materials Research 139-141 (October 2010): 2107–11. http://dx.doi.org/10.4028/www.scientific.net/amr.139-141.2107.

Abstract:
In this paper, a novel algorithm is proposed to detect linear edges. The image gradient is acquired with Sobel or Prewitt filters. Logical addition is applied to enhance image contrast. A statistical method is employed to process the gradient data: the gradient is projected in the horizontal and vertical directions to compute the average gradient value. Multi-level B-spline interpolation is employed to smooth the gradient data. Finally, edge coordinates can be computed precisely from the number and spacing of the extreme points. Experimental results are presented to show the validity of the algorithm, whose precision and accuracy reach the sub-pixel level. The proposed approach combines the merits of the zero-crossing method and the threshold method and is robust, convenient, and efficient for detecting linear edges in an industrial environment.
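To make the gradient-projection step concrete, a minimal Python sketch is given below. It assumes a grayscale NumPy image and uses OpenCV's Sobel filter; the parabolic peak refinement is a simple stand-in for the multi-level B-spline smoothing used in the paper, so the function names and parameters are illustrative assumptions rather than the authors' implementation.

import cv2
import numpy as np

def column_gradient_profile(gray):
    # Sobel gradients in x and y, combined into a magnitude image.
    gx = cv2.Sobel(gray, cv2.CV_64F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_64F, 0, 1, ksize=3)
    magnitude = np.hypot(gx, gy)
    # Project onto the horizontal axis: one average gradient value per column.
    return magnitude.mean(axis=0)

def subpixel_peak(profile):
    # Three-point parabolic refinement of the strongest column, used here
    # as a simple substitute for B-spline interpolation of the profile.
    i = int(np.argmax(profile))
    if i == 0 or i == len(profile) - 1:
        return float(i)
    a, b, c = profile[i - 1], profile[i], profile[i + 1]
    denom = a - 2.0 * b + c
    return float(i) if denom == 0 else i + 0.5 * (a - c) / denom

A vertical edge then appears as a peak of the profile, and subpixel_peak(column_gradient_profile(gray)) returns its estimated column position with sub-pixel resolution.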
8

Jiang, Nan, and Ting Liu. "An Improved Speech Segmentation and Clustering Algorithm Based on SOM and K-Means." Mathematical Problems in Engineering 2020 (September 12, 2020): 1–19. http://dx.doi.org/10.1155/2020/3608286.

Abstract:
This paper studies the segmentation and clustering of speaker speech. To improve the accuracy of speech endpoint detection, the traditional double-threshold short-time average zero-crossing rate is replaced by a better spectral centroid feature, the local maxima of the histogram of the statistical feature sequence are used to select the threshold, and a new speech endpoint detection algorithm is proposed. Compared with the traditional double-threshold algorithm, it effectively improves detection accuracy and noise robustness at low SNR. The conventional k-means clustering algorithm requires the number of clusters to be given in advance and is greatly affected by the choice of initial cluster centers, while the self-organizing neural network algorithm converges slowly and cannot provide accurate clustering information. An improved k-means speaker clustering algorithm based on a self-organizing neural network is therefore proposed: the number of clusters is predicted from the winning pattern of the competitive neurons in the trained network, and the neuron weights are used as the initial cluster centers of the k-means algorithm. Experimental results on the segmentation of mixed speech from multiple speakers show that the proposed algorithm effectively improves the accuracy of speech clustering and compensates for the shortcomings of the k-means and self-organizing neural network algorithms.
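As background for the endpoint-detection discussion above, the sketch below shows the classical short-time average zero-crossing rate and a generic double-threshold segmentation rule of the kind the paper improves upon. The frame length, hop size, and threshold values are illustrative assumptions, not parameters from the paper.

import numpy as np

def short_time_zcr(signal, frame_len=256, hop=128):
    # Average zero-crossing rate of each analysis frame.
    signal = np.asarray(signal, dtype=float)
    rates = []
    for start in range(0, len(signal) - frame_len + 1, hop):
        frame = signal[start:start + frame_len]
        s = np.signbit(frame)
        crossings = np.count_nonzero(s[:-1] != s[1:])
        rates.append(crossings / frame_len)
    return np.array(rates)

def double_threshold_segments(feature, low, high):
    # A segment is seeded where the feature exceeds `high` and is grown
    # backwards and forwards while the feature stays above `low`.
    segments, in_seg, seg_start = [], False, 0
    for i, value in enumerate(feature):
        if not in_seg and value > high:
            seg_start = i
            while seg_start > 0 and feature[seg_start - 1] > low:
                seg_start -= 1
            in_seg = True
        elif in_seg and value <= low:
            segments.append((seg_start, i))
            in_seg = False
    if in_seg:
        segments.append((seg_start, len(feature)))
    return segments

The paper replaces the zero-crossing-rate feature with a spectral centroid and picks the thresholds from histogram maxima instead of fixing them by hand.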
9

Chakravarthy, Murali, Sharmila Sengupta, Sanjeev Singh, Neeta Munshi, Tency Jose, and Vatsal Chhaya. "Incidence Rates of Healthcare-associated Infections in Hospitals: A Multicenter, Pooled Patient Data Analysis in India." International Journal of Research Foundation of Hospital and Healthcare Administration 3, no. 2 (2015): 86–90. http://dx.doi.org/10.5005/jp-journals-10035-1042.

Abstract:
Aim: The aim of this study was to collect multicenter data on healthcare-associated infections (HAIs) to assess the infection control scenario in India in the context of the CDC/NHSN and INICC databases. Materials and methods: Four hospitals accredited by the National Accreditation Board for Hospitals and Healthcare Providers (NABH) were selected on a random basis, and raw data on healthcare-associated infections (number of days and number of infections in all intensive care patients) were obtained as per the CDC-NHSN definitions and formulas. Three major device-related infections were considered for analysis, based on the prevalence of HAIs and discussions with subject matter experts. Nodal champions from each hospital were trained, and a common data collection sheet for surveillance in accordance with CDC-NHSN was created. The pooled means of the HAI rates and the average of the pooled means were calculated using data from the four hospitals and compared with the CDC/NHSN and International Nosocomial Infection Control Consortium (INICC) percentiles of HAI rates. Results: The Indian pooled mean HAI rates for all infections were above the CDC/NHSN percentile threshold but below the INICC percentile. Ventilator-associated pneumonia (VAP) was considered a matter of prime concern, crossing the P90 line of the CDC/NHSN threshold; no HAI rate was within the P25 limit. Conclusion: Indian HAI rates were higher when mapped against the CDC threshold. This underlines the need for more standardized, evidence-based protocols to be adhered to so as to bring HAIs within the CDC/NHSN thresholds. However, the four hospitals had better HAI rates than the pooled INICC database. How to cite this article: Singh S, Chakravarthy M, Sengupta S, Munshi N, Jose T, Chhaya V. Incidence Rates of Healthcare-associated Infections in Hospitals: A Multicenter, Pooled Patient Data Analysis in India. Int J Res Foundation Hosp Healthc Adm 2015;3(2):86-90.
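The pooled device-associated infection rates referred to above follow the standard CDC/NHSN formula; the worked numbers below are illustrative only and are not taken from the study.

\text{HAI rate} = \frac{\text{number of device-associated infections}}{\text{device-days}} \times 1000

For example, 12 VAP episodes observed over 4,800 ventilator-days would give 12/4800 × 1000 = 2.5 VAP per 1,000 ventilator-days, a figure that would then be compared against the CDC/NHSN or INICC percentile distributions (P25, P50, P75, P90).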
10

Lu, Naiwei, Mohammad Noori, and Yang Liu. "First-passage probability of the deflection of a cable-stayed bridge under long-term site-specific traffic loading." Advances in Mechanical Engineering 9, no. 1 (January 2017): 168781401668727. http://dx.doi.org/10.1177/1687814016687271.

Abstract:
Long-span bridges suffer from higher traffic loads and the simultaneous presence of multiple vehicles, which, in conjunction with steady traffic growth, may pose a threat to bridge safety. This study presents a methodology for the first-passage probability evaluation of long-span bridges subject to stochastic heavy traffic loading. Initially, the stochastic heavy traffic loading was simulated based on long-term weigh-in-motion measurements of a highway bridge in China. A computational framework was presented integrating Rice's level-crossing theory and the first-passage criterion. The effectiveness of the computational framework was demonstrated through a case study of a cable-stayed bridge. Numerical results show that upper-tail fitting of the up-crossing rate is an appropriate description of the probability characteristics of the extreme traffic load effects of long-span bridges. Growth in average daily truck traffic increases the probability of exceedance due to an intensive heavy traffic flow and results in a higher first-passage probability, but this increasing trend weakens as the traffic volume continues to grow. Since the sustained growth of gross vehicle weight has a constant impact on the probability of failure, setting a reasonable threshold overload ratio is an effective traffic-management scheme to ensure the serviceability of the bridge.
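Both ingredients named in this abstract have standard textbook forms. For a stationary Gaussian load effect X(t) with mean \mu, standard deviation \sigma_X, and derivative standard deviation \sigma_{\dot X}, Rice's mean up-crossing rate of a barrier level b and the Poisson-based first-passage approximation over a reference period T read, as a generic sketch rather than the paper's calibrated expressions,

\nu^{+}(b) = \frac{\sigma_{\dot X}}{2\pi \sigma_X} \exp\!\left( -\frac{(b-\mu)^{2}}{2\sigma_X^{2}} \right), \qquad P_f(T) \approx 1 - \exp\!\left( -\nu^{+}(b)\, T \right).

The study fits the upper tail of the measured up-crossing rate directly rather than relying on the Gaussian form above.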

Book chapters on the topic "Average Threshold Crossing"

1

Jhunjhunwala, Rajat Rakesh, and Geethanjali P. "Effect of Delay in EOG Signals for Eye Movement Recognition." In Advances in Medical Technologies and Clinical Practice, 71–80. IGI Global, 2021. http://dx.doi.org/10.4018/978-1-7998-8018-9.ch005.

Abstract:
The use of electrooculogram (EOG) signals as part of a human-controlled interface (HCI) is proposed for detecting the relevant information in EOG with and without a delay in eye movement. The performance is studied in terms of the accuracy of identifying eye movements along with single and double blinks. The algorithm consists of a simple first-order derivative, a threshold windowing technique, and pattern recognition. EOG pattern recognition was studied with the time-domain feature mean value (MV) and an ensemble of MV and zero crossing (ZC). The highest average classification accuracies of 85% and 84.4% were obtained from continuous eye movements for three classes (L, R, DB and L, R, SB) with the two time-domain feature sets. Further, accuracies of 90% and 88% were obtained for the detection of two eye movements.
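The two time-domain features mentioned above are easy to state in code. The sketch below is a generic illustration of MV and ZC over one analysis window; the dead-band parameter eps is an assumption of this sketch (a common convention for ignoring noise-level fluctuations), not a value taken from the chapter.

import numpy as np

def mv_zc_features(window, eps=0.0):
    window = np.asarray(window, dtype=float)
    # Mean value (MV) of the window.
    mv = float(np.mean(window))
    # Zero crossing (ZC): count sign changes whose amplitude step exceeds eps.
    sign_change = np.signbit(window[:-1]) != np.signbit(window[1:])
    big_enough = np.abs(window[:-1] - window[1:]) > eps
    zc = int(np.count_nonzero(sign_change & big_enough))
    return mv, zc

A classifier for the left, right, and blink classes would then be trained on vectors of these features computed over sliding windows of the EOG channels.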

Conference papers on the topic "Average Threshold Crossing"

1

., Amirhossein, Masoud Shahshahani, Paolo Motto Ros, Alberto Bonanno, Marco Crepaldi, Maurizio Martina, Danilo Demarchi, and Guido Masera. "An All-Digital Spike-Based Ultra-Low-Power IR-UWB Dynamic Average Threshold Crossing Scheme for Muscle Force Wireless Transmission." In Design, Automation and Test in Europe. New Jersey: IEEE Conference Publications, 2015. http://dx.doi.org/10.7873/date.2015.1062.

2

Devonald, Martin, Mike Hill, Peter Song, Hamish Weatherly, and Lauren Vincent. "Real-Time Flood Monitoring and Management of a Mississippi River Pipeline Crossing." In 2016 11th International Pipeline Conference. American Society of Mechanical Engineers, 2016. http://dx.doi.org/10.1115/ipc2016-64219.

Abstract:
Enbridge Liquids Pipelines (Enbridge) operates over 26,000 km of liquid pipelines in Canada and the US, administers a system-wide geohazard management program to identify, investigate, and monitor geohazards, and performs remediation as required. An integral part of the geohazard management program is real-time flood monitoring, in which pipeline watercourse crossings affected by flooding are identified and flood levels are monitored. Watercourse crossings where the pipelines have a high potential to become exposed, to span, and potentially to fail during a flood event are studied in more detail. This flood monitoring program automatically monitors publicly available real-time stream gauge flow measurements and compares these measurements to estimated discharge thresholds for the crossing under evaluation. Thresholds are related to the current pipeline depth of cover (DOC) and the amount of scour that can occur over a range of flood magnitudes. The thresholds include: 1) the estimated peak flow required to expose the top of the pipe, the "exposure flow"; 2) the estimated peak discharge and associated flow velocities that could create enough free-spanning pipe for the onset of vortex-induced vibration (VIV) fatigue failure, the "flow of concern"; and 3) where an additional mechanical assessment taking account of specific pipe properties, data requirements, and circumstances has been carried out, the estimated average peak flow and duration that has the potential to result in product release due to VIV once a sufficient pipe span length has developed, the "critical flow". This paper is a case study of the assessment and flood monitoring of one of Enbridge's Mississippi River pipeline crossings, which has a history of flood-related pipeline exposure and subsequent mitigations. During real-time monitoring of a 2015 flood event, the "exposure flow" and "flow of concern" thresholds for this crossing were exceeded, resulting in a decision by Enbridge to shut down the pipeline. Subsequent surveys revealed that the pipe had become exposed and was spanning adjacent to the previously remediated area; the previous mitigation likely limited the length of pipe exposure and span. Added complexity was encountered during the post-shutdown DOC survey, which needed to be completed as quickly and safely as possible after flood levels declined to allow an assessment of the actual condition of the pipeline prior to restart. This paper presents a methodology that could allow pipeline operators to identify river crossings susceptible to pipe exposure, and to the development of free spans, due to flooding, by providing an understanding of what is likely happening to the cover over the pipe at a particular crossing during a flood event. This provides a tool to better manage pipeline river crossings experiencing flooding. As far as the authors are aware, this case study represents the first time a pipeline has been shut down based on real-time flows and thresholds in the United States.
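The monitoring logic described above reduces to comparing a real-time gauge reading against the crossing-specific thresholds. The sketch below is only an illustration of that comparison; the threshold values are site-specific outputs of the scour and VIV assessments described in the paper, and any numbers supplied by a caller are hypothetical.

def classify_flow(discharge, exposure_flow, flow_of_concern, critical_flow=None):
    # Compare a real-time discharge measurement against the crossing's
    # estimated thresholds, checking the most severe condition first.
    if critical_flow is not None and discharge >= critical_flow:
        return "critical flow exceeded: risk of VIV-driven failure, evaluate shutdown"
    if discharge >= flow_of_concern:
        return "flow of concern exceeded: free span and onset of VIV possible"
    if discharge >= exposure_flow:
        return "exposure flow exceeded: top of pipe may be exposed"
    return "below all thresholds"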
3

Hu, Jialiang, Pradeep Menon, Amna Al Yaqoubi, Mohamed Al Shehhi, Mahmoud Basioni, Fabio Roncarolo, and Natela Belova. "Fracture Characterization in Deep Gas Reservoirs to Identify Fracture Enhanced Flow Units, Offshore Abu Dhabi." In Abu Dhabi International Petroleum Exhibition & Conference. SPE, 2021. http://dx.doi.org/10.2118/207646-ms.

Abstract:
High gas flow rates in a deep-buried dolomitized reservoir from an offshore field in Abu Dhabi cannot be explained by the low matrix permeability. The previous permeability multiplier, based on distance to major faults, is not a solid geological solution because it over-simplifies reservoir geomechanics, overlooks folding-related fractures, and lacks detailed fault interpretation from poor seismic data. Alternatively, to characterize the heterogeneous flow related to natural fractures in this undeveloped reservoir, a fracture network is modelled based on core, borehole imager (BHI), conventional logs, seismic data, and test information. Limited by the scale of investigation, vertical wells record apparent BHI responses, and the raw fracture interpretation cannot represent the true 3D percolation reflected in PLT data. To overcome this shortfall, a correction based on geomechanics and mechanical layer (ML) analysis is performed. Young's modulus (E), Poisson's ratio (ν), and a brittleness index are calculated from logs, describing the reservoir's tendency to fracture. Besides defining MLs, bedding plane intensity from BHI is also used as an indicator of fracture occurrence, since stress tends to release at strata discontinuities and forms the bed-bounded fractures observed in cores. Subsequently, a new fracture intensity is generated from the combined geomechanical properties and the statistical average of BHI-derived fracture occurrence within the ML framework, which improves the match with PLT and consistently distinguishes fracture-enhanced flow intervals in all wells. Seismic discontinuity attributes are used as static fracture footprints to distribute fractures from wells to 3D. The final hybrid DFN comprises large-scale deterministic zone-crossing fractures and small-scale stochastic bed-bounded fractures. Sub-vertical open fractures are dominated by NE-SW wrenching fractures related to the Zagros compression and reactive upward salt movement. There is no angular rotation of fractures in different fault blocks. Open fractures with other strikes are supported by partial cements and mismatching fracture walls on computerized tomography (CT) images. ML correlation shows vertical consistency across the stratigraphic framework, and its intensity indicates the fracture potential of vertical zones reflected by tests. Fracture-enhanced flow units are further constrained by a threshold in both the combined geomechanical properties and the statistical average of the raw BHI fracture intensity within the ML framework. As a result, the final fracture network maps reservoir brittleness and flow potential both vertically and laterally, identifying fracture regions along the folding axis and not just along major faults, as evidenced by wells and seismic data. According to the upscaling results, the case study reveals a type-III fractured reservoir, where fractures contribute to flow but not to volume. The fracture network enhances bed-wise horizontal communication but also opens vertical feeding channels. Fracture permeability is mainly influenced by aperture and intensity, while aspect ratio, fracture length, and the proportion of strikes and dips mainly influence the permeability distribution rather than absolute values. This study provides a production-oriented characterization workflow for natural fracture heterogeneity based on the correction of raw BHI in undeveloped fields.
