Academic literature on the topic 'Two Wave Missing Data'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Two Wave Missing Data.'

Next to every source in the list of references there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Two Wave Missing Data"

1. Rioux, Charlie, Antoine Lewin, Omolola A. Odejimi, and Todd D. Little. "Reflection on modern methods: planned missing data designs for epidemiological research." International Journal of Epidemiology 49, no. 5 (May 1, 2020): 1702–11. http://dx.doi.org/10.1093/ije/dyaa042.

Abstract:
Taking advantage of the ability of modern missing data treatments in epidemiological research (e.g. multiple imputation) to recover power while avoiding bias in the presence of data that is missing completely at random, planned missing data designs allow researchers to deliberately incorporate missing data into a research design. A planned missing data design may be done by randomly assigning participants to have missing items in a questionnaire (multiform design) or missing occasions of measurement in a longitudinal study (wave-missing design), or by administering an expensive gold-standard measure to a random subset of participants while the whole sample is administered a cheaper measure (two-method design). Although not common in epidemiology, these designs have been recommended for decades by methodologists for their benefits—notably that data collection costs are minimized and participant burden is reduced, which can increase validity. This paper describes the multiform, wave-missing and two-method designs, including their benefits, their impact on bias and power, and other factors that must be taken into consideration when implementing them in an epidemiological study design.
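The multiform design summarized above lends itself to a short sketch. The following minimal example (not from the paper; the block layout and item names are illustrative) randomly assigns participants to the three forms of a three-form design, so each person answers the common block X plus two of the three rotating blocks, and every skipped item is missing completely at random by construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Item blocks: X is administered to everyone; each form omits exactly one of A, B, C.
blocks = {"X": ["x1", "x2"], "A": ["a1", "a2"], "B": ["b1", "b2"], "C": ["c1", "c2"]}
forms = [("X", "A", "B"), ("X", "A", "C"), ("X", "B", "C")]

def assign_forms(n):
    """Randomly assign n participants to forms; return the items each participant sees."""
    choice = rng.integers(0, len(forms), size=n)
    return [[item for b in forms[k] for item in blocks[b]] for k in choice]

administered = assign_forms(300)
```

Because the assignment ignores the data, the planned missingness is MCAR, which is what allows multiple imputation to recover power without introducing bias.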
2. Cavaleri, Luigi. "Wave Modeling—Missing the Peaks." Journal of Physical Oceanography 39, no. 11 (November 1, 2009): 2757–78. http://dx.doi.org/10.1175/2009jpo4067.1.

Abstract:
The paper analyzes the capability of the present wave models of properly reproducing the conditions during and at the peak of severe and extreme storms. After providing evidence that this is often not the case, the reasons for it are explored. First, the physics of waves considered in wave models is analyzed. Although much improved with respect to the past, the wind accuracy is still a relevant factor at the peak of the storms. Other factors such as wind variability and air density are considered. The classical theory of wave generation by J. W. Miles’s mechanism, with subsequent modifications, is deemed not sufficiently representative of extreme conditions. The presently used formulations for nonlinear energy transfer are found to lead to too wide distributions in frequency and direction, hence reducing the input by wind. Notwithstanding some recent improvements, the white-capping formulation still depends on parameters fitted to the bulk of the data. Hence, it is not obvious how they will perform in extreme conditions when the physics is likely to be different. Albeit at different levels in different models, the advection still implies the spreading of energy, hence a spatial smoothing of the peaks. The lack of proper knowledge of the ocean currents is found to substantially affect the identification of how much energy can—in some cases—be concentrated at a given time and location. The implementation of the available theories and know-how in the present wave models is often found inconsistent from model to model. It follows that in this case, it is not possible to exchange corresponding pieces of software between two models without substantially affecting the quality of the results. After analyzing various aspects of a wave model, the paper makes some general considerations. Because wave growth is the difference between processes (input and output) involving large amounts of energy, it is very sensitive to small modifications of one or more processes.
Together with the strong, but effective, tuning present in a wave model, this makes the introduction of new physics more complicated. It is suggested that for long-term improvements, operational and experimental applications need to proceed along parallel routes, with the latter looking more to the physics without the necessity of an immediately improved overall performance. In view of the forthcoming increase of computer power, a sensitivity study is suggested to identify the most critical areas in a wave model to determine where to invest for further improvements. The limits on the description of the physics of the processes when using the spectral approach, particularly in extreme conditions, are considered. For further insights and as a way to validate the present theories in these conditions, the use is suggested of numerical experiments simulating in great detail the physical interaction between the lower atmosphere and the single waves.
3. Rhemtulla, Mijke, Fan Jia, Wei Wu, and Todd D. Little. "Planned missing designs to optimize the efficiency of latent growth parameter estimates." International Journal of Behavioral Development 38, no. 5 (January 23, 2014): 423–34. http://dx.doi.org/10.1177/0165025413514324.

Abstract:
We examine the performance of planned missing (PM) designs for correlated latent growth curve models. Using simulated data from a model where latent growth curves are fitted to two constructs over five time points, we apply three kinds of planned missingness. The first is item-level planned missingness using a three-form design at each wave such that 25% of data are missing. The second is wave-level planned missingness such that each participant is missing up to two waves of data. The third combines both forms of missingness. We find that three-form missingness results in high convergence rates, little parameter estimate or standard error bias, and high efficiency relative to the complete data design for almost all parameter types. In contrast, wave missingness and the combined design result in dramatically lowered efficiency for parameters measuring individual variability in rates of change (e.g., latent slope variances and covariances), and bias in both estimates and standard errors for these same parameters. We conclude that wave missingness should not be used except with large effect sizes and very large samples.
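The wave-level scheme examined above (each participant missing up to two of five occasions) can be sketched as a planned-missingness mask; this is an illustrative reconstruction, not the authors' simulation code.

```python
import numpy as np

rng = np.random.default_rng(1)

def wave_missing_mask(n_participants, n_waves=5, max_missing=2):
    """Boolean observation mask (True = measured); each row skips up to max_missing waves."""
    mask = np.ones((n_participants, n_waves), dtype=bool)
    for i in range(n_participants):
        n_skip = rng.integers(0, max_missing + 1)  # skip 0, 1, or 2 waves at random
        mask[i, rng.choice(n_waves, size=n_skip, replace=False)] = False
    return mask

mask = wave_missing_mask(200)
```

Applying such a mask to complete simulated growth trajectories is the usual way results like the cited bias and efficiency comparisons are produced.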
4. Lu, Peiyi, and Mack Shelley. "Testing the Missing Mechanism of Demographic and Health Variables in the Health and Retirement Study." Innovation in Aging 4, Supplement_1 (December 1, 2020): 509. http://dx.doi.org/10.1093/geroni/igaa057.1644.

Abstract:
Studies using data from longitudinal health surveys of older adults usually assume the data are missing completely at random (MCAR) or missing at random (MAR), and subsequent analyses therefore use multiple imputation or likelihood-based methods to handle missing data. However, little existing research actually examines whether the data meet the MCAR/MAR assumptions before performing data analyses. This study first summarized the commonly used statistical methods to test the missing mechanism and discussed their application conditions. Then, using two-wave longitudinal data from the Health and Retirement Study (HRS; wave 2014-2015 and wave 2016-2017; N=18,747), this study applied different approaches to test the missing mechanism of several demographic and health variables. These approaches included Little's test, the logistic regression method, nonparametric tests, the false discovery rate, and others. Results indicated the data did not meet the MCAR assumption even though they had a very low rate of missing values. Demographic variables provided good auxiliary information for health variables. Health measures (e.g., self-reported health, activities of daily living, depressive symptoms) met the MAR assumptions. Older respondents could drop out or die during the longitudinal survey, but attrition did not significantly affect the MAR assumption. Our findings supported the MAR assumptions for the demographic and health variables in HRS, and therefore provided statistical justification to HRS researchers for using imputation or likelihood-based methods to deal with missing data. However, researchers are strongly encouraged to test the missing mechanism of the specific variables/data they choose when using a new dataset.
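One of the simpler diagnostics mentioned above, a nonparametric two-sample comparison, can be sketched on synthetic data (the variable names are illustrative, not HRS fields): if respondents with a missing item differ systematically on an observed covariate such as age, the MCAR assumption is suspect.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(2)
n = 1000
age = rng.normal(65, 8, size=n)

# Simulate MAR missingness: older respondents are more likely to skip the item.
p_miss = 1 / (1 + np.exp(-(age - 70) / 3))
missing = rng.random(n) < p_miss

# Compare the age distribution between missing and observed groups.
stat, pvalue = mannwhitneyu(age[missing], age[~missing], alternative="two-sided")
# A small p-value is evidence against MCAR: missingness depends on age.
```

The same indicator-variable logic underlies the logistic regression method: regress the missingness indicator on observed covariates and test whether any coefficient is nonzero.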
5. Frasier, C., and D. Winterstein. "Analysis of conventional and converted mode reflections at Putah sink, California using three‐component data." GEOPHYSICS 55, no. 6 (June 1990): 646–59. http://dx.doi.org/10.1190/1.1442877.

Abstract:
In 1980 Chevron recorded a three‐component seismic line using vertical (V) and transverse (T) motion vibrators over the Putah sink gas field near Davis, California. The purpose was to record the total vector motion of the various reflection types excited by the two sources, with emphasis on converted P‐S reflections. Analysis of the conventional reflection data agreed with results from the Conoco Shear Wave Group Shoot of 1977–1978. For example, the P‐P wave section had gas‐sand bright spots which were absent in the S‐S wave section. Shot profiles from the V vibrators showed strong P‐S converted wave events on the horizontal radial component (R) as expected. To our surprise, shot records from the T vibrators showed S‐P converted wave events on the V component, with low amplitudes but high signal‐to‐noise (S/N) ratios. These S‐P events were likely products of split S‐waves generated in anisotropic subsurface media. Components of these downgoing waves in the plane of incidence were converted to P‐waves on reflection and arrived at receivers in a low‐noise time window ahead of the S‐S waves. The two types of converted waves (P‐S and S‐P) were first stacked by common midpoint (CMP). The unexpected S‐P section was lower in true amplitude but much higher in S/N ratio than the P‐S section. The Winters gas‐sand bright spot was missing on the converted wave sections, mimicking the S‐S reflectivity as expected. CRP gathers were formed by rebinning data by a simple ray‐tracing formula based on the asymmetry of raypaths. CRP stacking improved P‐S and S‐P event resolution relative to CMP stacking and laterally aligned structural features with their counterparts on P and S sections. Thus, the unexpected S‐P data provided us with an extra check for our converted wave data processing.
6. Rebernik, Teja, Jidde Jacobi, Mark Tiede, and Martijn Wieling. "Accuracy Assessment of Two Electromagnetic Articulographs: Northern Digital Inc. WAVE and Northern Digital Inc. VOX." Journal of Speech, Language, and Hearing Research 64, no. 7 (July 16, 2021): 2637–67. http://dx.doi.org/10.1044/2021_jslhr-20-00394.

Abstract:
Purpose: This study compares two electromagnetic articulographs manufactured by Northern Digital, Inc.: the NDI Wave System (from 2008) and the NDI Vox-EMA System (from 2020).
Method: Four experiments were completed: (a) comparison of statically positioned sensors, (b) tracking dynamic movements of sensors manipulated using a motor-driven LEGO apparatus, (c) tracking small and large movements of sensors mounted in a rigid bar manipulated by hand, and (d) tracking movements of sensors rotated on a circular disc. We assessed spatial variability for statically positioned sensors, variability in the transduced Euclidean distances between sensor pairs, and missing data rates. For sensors tracking circular movements, we compared the fit between fitted ideal circles and actual trajectories.
Results: The average sensor pair tracking error (i.e., the standard deviation of the Euclidean distances) was 1.37 mm for the WAVE and 0.12 mm for the VOX during automated trials at the fastest speed, and 0.35 mm for the WAVE and 0.14 mm for the VOX during the tracking of large manual movements. The average standard deviation of the fitted circle radii charted by manual circular disc movements was 0.72 mm for the WAVE sensors and 0.14 mm for the VOX sensors. There was no significant difference between the WAVE and the VOX in the number of missing frames.
Conclusions: In general, the VOX system significantly outperformed the WAVE on measures of both static precision and dynamic accuracy (automated and manual). For both systems, positional precision and spatial variability were influenced by the sensors' position relative to the field generator unit (worse when further away).
Supplemental Material: https://doi.org/10.23641/asha.14787846
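The sensor-pair tracking error reported above (the standard deviation of the transduced Euclidean distances between two rigidly coupled sensors) is straightforward to compute; here is an illustrative sketch on synthetic trajectories, not the authors' analysis code.

```python
import numpy as np

rng = np.random.default_rng(3)

def pair_tracking_error(pos_a, pos_b):
    """SD of frame-by-frame Euclidean distances between two sensors (n_frames x 3 arrays).
    For rigidly coupled sensors the true distance is constant, so the SD is measurement error."""
    dists = np.linalg.norm(pos_a - pos_b, axis=1)
    return dists.std()

# Two sensors 10 mm apart moving together, with 0.1 mm tracking noise per axis.
n_frames = 5000
path = rng.normal(size=(n_frames, 3)).cumsum(axis=0)
sensor_a = path + rng.normal(scale=0.1, size=(n_frames, 3))
sensor_b = path + np.array([10.0, 0.0, 0.0]) + rng.normal(scale=0.1, size=(n_frames, 3))

err = pair_tracking_error(sensor_a, sensor_b)
```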
7. Peach, Leo, N. Cartwright, and D. Strauss. "Investigating Machine Learning for Virtual Wave Monitoring." Coastal Engineering Proceedings, no. 36v (December 31, 2020): 46. http://dx.doi.org/10.9753/icce.v36v.papers.46.

Abstract:
Wave monitoring is a time-consuming and costly endeavour which, despite best efforts, can be subject to occasional periods of missing data. This paper investigates the application of machine learning to create "virtual" wave height (Hs), period (Tz) and direction (Dp) parameters. Two supervised machine learning algorithms were applied using long-term wave parameter datasets sourced from four wave monitoring stations in relatively close geographic proximity. The machine learning algorithms demonstrated reasonable performance for some parameters through testing, with Hs performing best overall followed closely by Tz; Dp was the most challenging to predict and performed relatively the poorest. The creation of such "virtual" wave monitoring stations could be used to hindcast wave conditions, fill observation gaps or extend data beyond that collected by the physical instrument. Recorded presentation from the vICCE (YouTube link): https://youtu.be/GM3EG2_SQa0
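The "virtual station" idea can be illustrated with a least-squares stand-in for the paper's supervised learners: fit the target station's Hs on neighbouring stations while all instruments are running, then predict through an outage. All numbers below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 2000

# Hs at three neighbouring stations, plus a correlated target station.
neighbours = rng.gamma(shape=2.0, scale=0.8, size=(n, 3))
target = neighbours @ np.array([0.5, 0.3, 0.1]) + 0.2 + rng.normal(scale=0.05, size=n)

# Fit on the first half (while the target instrument was working)...
X = np.column_stack([np.ones(n), neighbours])
coef, *_ = np.linalg.lstsq(X[: n // 2], target[: n // 2], rcond=None)

# ...then fill the "missing" second half with virtual observations.
virtual = X[n // 2 :] @ coef
rmse = np.sqrt(np.mean((virtual - target[n // 2 :]) ** 2))
```

The paper's point that Dp is hardest to predict also follows from this framing: direction is circular, so a linear (or any real-valued) regression target needs special handling.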
8. Si, Yajuan, Jerome P. Reiter, and D. Sunshine Hillygus. "Semi-parametric Selection Models for Potentially Non-ignorable Attrition in Panel Studies with Refreshment Samples." Political Analysis 23, no. 1 (2015): 92–112. http://dx.doi.org/10.1093/pan/mpu009.

Abstract:
Panel studies typically suffer from attrition. Ignoring the attrition can result in biased inferences if the missing data are systematically related to outcomes of interest. Unfortunately, panel data alone cannot inform the extent of bias due to attrition. Many panel studies also include refreshment samples, which are data collected from a random sample of new individuals during the later waves of the panel. Refreshment samples offer information that can be utilized to correct for biases induced by non-ignorable attrition while reducing reliance on strong assumptions about the attrition process. We present a Bayesian approach to handle attrition in two-wave panels with one refreshment sample and many categorical survey variables. The approach includes (1) an additive non-ignorable selection model for the attrition process; and (2) a Dirichlet process mixture of multinomial distributions for the categorical survey variables. We present Markov chain Monte Carlo algorithms for sampling from the posterior distribution of model parameters and missing data. We apply the model to correct attrition bias in an analysis of data from the 2007–08 Associated Press/Yahoo News election panel study.
9. Aleksandrova, Ekaterina, Christopher J. Gerry, and Olga Verkhovskaya. "Missing entrepreneurs." Journal of Entrepreneurship in Emerging Economies 12, no. 1 (July 22, 2019): 1–33. http://dx.doi.org/10.1108/jeee-11-2018-0133.

Abstract:
Purpose: Compared with other emerging and former command economies, Russia has low levels of entrepreneurial activity and exceptionally low levels of reported entrepreneurial intentions. Drawing on the theory of planned behaviour (TPB), this paper aims to examine the determinants of entrepreneurial intentions in Russia.
Design/methodology/approach: Using individual-level data from two waves (2013 and 2018) of the Global Entrepreneurship Monitor (GEM) survey, the paper presents a range of semi-nonparametric logistic regressions estimating the determinants of reported entrepreneurial intention among the Russian adult population not already engaged in entrepreneurial activity. These data allow for the first empirical exploration of the TPB in the Russian context.
Findings: The results provide evidence in support of two of the three origins of the theory of planned behaviour ("attitudes" and "perceived behavioural control"). Firstly, positive attitudes towards entrepreneurship, in the form of employment seeking and direct (own experience) or indirect (experience through social networks) entrepreneurial knowledge, are both positively associated with intention. Secondly, individuals who consider their environment to be conducive to entrepreneurship and who believe they have the knowledge and skills required to be entrepreneurs are more likely to intend entrepreneurial action.
Originality/value: In view of the limited entrepreneurial activity and low levels of reported entrepreneurial intention in Russia, it is important to understand the drivers of these intentions if the appropriate policy responses are to be identified and adopted. This research represents the first substantive effort to comprehensively examine the determinants of entrepreneurial intentions for Russia and allows us to propose several policy-relevant conclusions.
10. Su, Yuanda, Xinding Fang, and Xiaoming Tang. "Measurement of the shear slowness of slow formations from monopole logging-while-drilling sonic logs." GEOPHYSICS 85, no. 1 (December 6, 2019): D45–D52. http://dx.doi.org/10.1190/geo2019-0236.1.

Abstract:
Acoustic logging-while-drilling (LWD) is used to measure formation velocity/slowness during drilling. In a fast formation, in which the S-wave velocity is higher than the borehole-fluid velocity, monopole logging can be used to obtain P- and S-wave velocities by measuring the corresponding refracted waves. In a slow formation, in which the S-wave velocity is less than the borehole-fluid velocity, because the fully refracted S-wave is missing, quadrupole logging has been developed and used for S-wave slowness measurement. A recent study based on numerical modeling implies that monopole LWD can generate a detectable transmitted S-wave in a slow formation. This nondispersive transmitted S-wave propagates at the formation S-wave velocity and thus can be used for measuring the S-wave slowness of a slow formation. We evaluate a field example to demonstrate the applicability of monopole LWD in determining the S-wave slowness of slow formations. We compare the S-wave slowness extracted from a monopole LWD data set acquired in a slow formation and the result derived from the quadrupole data recorded in the same logging run. The results indicated that the S-wave slowness can be reliably determined from monopole LWD sonic data in fairly slow formations. However, we found that the monopole approach is not applicable to very slow formations because the transmitted S-wave becomes too weak to detect when the formation S-wave slowness is much higher than the borehole-fluid slowness.

Dissertations / Theses on the topic "Two Wave Missing Data"

1. Belen, Rahime. "Detecting Disguised Missing Data." Master's thesis, METU, 2009. http://etd.lib.metu.edu.tr/upload/12610411/index.pdf.

Abstract:
In some applications, explicit codes such as NA (not available) are provided for missing data; however, many applications do not provide such explicit codes, and valid or invalid data codes are recorded as legitimate data values. Such missing values are known as disguised missing data. Disguised missing data may negatively affect the quality of data analysis; for example, the results of discovered association rules in the KDD-Cup-98 data sets have clearly shown the need to apply data quality management prior to analysis. In this thesis, to tackle the problem of disguised missing data, we analyzed the embedded unbiased sample heuristic (EUSH), demonstrated the method's drawbacks, and proposed a new methodology based on the Chi-Square Two-Sample Test. The proposed method does not require any domain background knowledge and compares favorably with EUSH.
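The chi-square two-sample logic can be sketched on toy data (this is an illustrative reconstruction, not the thesis implementation): if a suspicious code such as 0 is really a disguised missing value entered indiscriminately, the records carrying it should look like a random slice of the table, so we compare the distribution of another attribute between suspect and remaining records.

```python
import numpy as np
from scipy.stats import chi2_contingency

rng = np.random.default_rng(5)
n = 3000

# A categorical attribute and a field where "0" may be a disguised missing code.
city = rng.integers(0, 4, size=n)                 # 4 city codes
age = rng.integers(18, 80, size=n).astype(float)

# Disguise: a random 10% of records get age recorded as 0, regardless of city.
disguised = rng.random(n) < 0.10
age[disguised] = 0.0

suspect = age == 0.0
# Contingency table: city distribution among suspect vs. remaining records.
table = np.array([
    np.bincount(city[suspect], minlength=4),
    np.bincount(city[~suspect], minlength=4),
])
chi2, p, dof, _ = chi2_contingency(table)
# A large p-value means the suspect records are indistinguishable from a random
# sample, which is consistent with a disguised missing code; a small p-value
# suggests the value marks a genuine subgroup instead.
```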
2. Chen, Lihan. "Two-stage maximum likelihood approach for item-level missing data in regression." Thesis, University of British Columbia, 2017. http://hdl.handle.net/2429/62724.

Abstract:
Psychologists often use scales composed of multiple items to measure underlying constructs, such as well-being, depression, and personality traits. Missing data often occur at the item level; for example, participants may skip items on a questionnaire for various reasons. If variables in the dataset can account for the missingness, the data are missing at random (MAR). Modern missing data approaches can deal with MAR missing data effectively, but existing analytical approaches cannot accommodate item-level missing data. A very common practice in psychology is to average all available items to produce scale means when there is missing data. This approach, called available-case maximum likelihood (ACML), may produce biased results in addition to incorrect standard errors. Another approach is scale-level full information maximum likelihood (SL-FIML), which treats the whole scale as missing if even one item is missing. SL-FIML is inefficient and prone to bias. A new analytical approach, called the two-stage maximum likelihood approach (TSML), was recently developed as an alternative (Savalei & Rhemtulla, 2017b). The original work showed that the method outperformed ACML and SL-FIML in structural equation models with parcels. The current simulation study examined the performance of ACML, SL-FIML, and TSML in the context of bivariate regression. It was shown that when item loadings or item means are unequal within the composite, ACML and SL-FIML produced biased estimates of regression coefficients under MAR. Outside of convergence issues when the sample size is small and the number of variables is large, TSML performed well in all simulated conditions, showing little bias, high efficiency, and good coverage. Additionally, the current study investigated how changing the strength of the MAR mechanism may lead to drastically different conclusions in simulation studies. A preliminary definition of MAR strength is provided in order to demonstrate its impact. Recommendations are made for future simulation studies on missing data.
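The difference between the ACML and SL-FIML scoring strategies is easy to see in a toy example (illustrative only; the thesis embeds these strategies in full maximum-likelihood estimation rather than simple score computation):

```python
import numpy as np

# A 4-item scale for three participants; NaN marks a skipped item.
items = np.array([
    [3.0, 4.0, 2.0, 3.0],     # complete case
    [5.0, np.nan, 4.0, 4.0],  # one item skipped
    [2.0, 3.0, np.nan, 3.0],  # one item skipped
])

# ACML: average whatever items are present (biased when loadings or means differ).
acml_scores = np.nanmean(items, axis=1)

# SL-FIML: the whole scale score is missing if any item is missing (inefficient).
complete = ~np.isnan(items).any(axis=1)
sl_fiml_scores = np.where(complete, items.mean(axis=1), np.nan)
```

ACML keeps all three participants but averages over different item sets, while SL-FIML discards two-thirds of this tiny sample; TSML is designed to avoid both problems.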
3. Kosler, Joseph Stephen. "Multiple comparisons using multiple imputation under a two-way mixed effects interaction model." Columbus, Ohio: Ohio State University, 2006. http://rave.ohiolink.edu/etdc/view?acc%5Fnum=osu1150482904.

4. Bailey, Brittney E. "Data analysis and multiple imputation for two-level nested designs." The Ohio State University, 2018. http://rave.ohiolink.edu/etdc/view?acc_num=osu1531822703002162.

5. Kellermann, Anh Pham. "Missing Data in Complex Sample Surveys: Impact of Deletion and Imputation Treatments on Point and Interval Parameter Estimates." Scholar Commons, 2018. https://scholarcommons.usf.edu/etd/7633.

Abstract:
The purpose of this simulation study was to evaluate the relative performance of five missing data treatments (MDTs) for handling missing data in complex sample surveys. The five missing data methods included in this study were listwise deletion (LW), single hot-deck imputation (HS), single regression imputation (RS), hot-deck-based multiple imputation (HM), and regression-based multiple imputation (RM). These MDTs were assessed in the context of regression weight estimates in multiple regression analysis of complex sample data with two data levels. In this study, the multiple regression equation had six regressors without missing data and two regressors with missing data. The four performance measures used were statistical bias, RMSE, CI width, and coverage probability of the 95% confidence interval. The five MDTs were evaluated separately for three types of missingness: MCAR, MAR, and MNAR. For each type of missingness, the studied MDTs were evaluated at four levels of missingness (10%, 30%, 50%, and 70%), along with complete-sample conditions as a reference point for interpreting results. In addition, ICC levels (.0, .25, .50) and high- and low-density populations were also manipulated as studied factors. The study's findings revealed that the performance of each individual MDT varied across missing data types, but their relative performance was quite similar for all missing data types except for LW's performance under MNAR. RS produced the most inaccurate estimates in terms of bias, RMSE, and coverage of the confidence interval; RM and HM were the second-poorest performers. LW and HS outperformed the rest on measures of accuracy and precision under MCAR; however, LW's measures of precision decreased under MAR and MNAR, and LW's CI width was the widest in MNAR data. In addition, in all three missing data types, the poor performers were less accurate and less precise on variables with missing data than on variables without missing data, and their degree of accuracy and precision depended mostly on the level of data ICC. The proportion of missing data noticeably affected only the performance of HM, such that at higher levels of missing data HM yielded worse performance measures. The population density factor had negligible effects on most measures produced by the studied MDTs, except for the RMSE, CI width, and CI coverage produced by LW, which were modestly influenced by population density.
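The four performance measures named above (bias, RMSE, CI width, and coverage) can be computed from simulation replicates with a generic sketch like this (illustrative, not the dissertation's code):

```python
import numpy as np

def performance(estimates, ci_lower, ci_upper, truth):
    """Bias, RMSE, mean CI width, and CI coverage across simulation replicates."""
    estimates = np.asarray(estimates, dtype=float)
    lo, hi = np.asarray(ci_lower, dtype=float), np.asarray(ci_upper, dtype=float)
    bias = estimates.mean() - truth
    rmse = np.sqrt(np.mean((estimates - truth) ** 2))
    width = np.mean(hi - lo)
    coverage = np.mean((lo <= truth) & (truth <= hi))
    return bias, rmse, width, coverage

# Example: four replicates estimating a true regression weight of 0.5.
b, r, w, c = performance(
    estimates=[0.45, 0.55, 0.50, 0.60],
    ci_lower=[0.30, 0.40, 0.35, 0.45],
    ci_upper=[0.60, 0.70, 0.65, 0.75],
    truth=0.5,
)
```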
6. Dunu, Emeka Samuel. "Comparing the Powers of Several Proposed Tests for Testing the Equality of the Means of Two Populations When Some Data Are Missing." Thesis, University of North Texas, 1994. https://digital.library.unt.edu/ark:/67531/metadc278198/.

Abstract:
In comparing the means of two normally distributed populations with unknown variance, two tests very often used are the two independent sample and the paired sample t tests. There is a possible gain in the power of the significance test by using the paired sample design instead of the two independent samples design.
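The power gain comes from the within-pair correlation, which shrinks the variance of the paired differences; a small deterministic example (illustrative, not from the thesis) makes the contrast visible.

```python
from scipy.stats import ttest_ind, ttest_rel

# Two measurements on the same four subjects: large between-subject spread,
# but a consistent shift of about 1 unit within each pair.
x = [10.0, 20.0, 30.0, 40.0]
y = [11.0, 21.1, 30.9, 41.0]

# Independent-samples t test: the shift drowns in between-subject variance.
p_independent = ttest_ind(x, y).pvalue

# Paired t test: differencing removes the between-subject variance.
p_paired = ttest_rel(x, y).pvalue
```

Here the paired test detects the shift easily while the independent test does not, which is exactly the trade-off the thesis exploits when some pairs have a missing member.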
7. Bašić, Edin. "The problem of missing residential mobility information in the German microcensus: an evaluation of two statistical approaches with the socio-economic panel." Hamburg: Kovač, 2008. http://swbplus.bsz-bw.de/bsz286991586cov.htm.

8. Yang, Hanfang. "Jackknife Empirical Likelihood Method and its Applications." Digital Archive @ GSU, 2012. http://digitalarchive.gsu.edu/math_diss/9.

Abstract:
In this dissertation, we investigate jackknife empirical likelihood methods motivated by recent research in statistics and related fields. The computational intensity of empirical likelihood can be significantly reduced by using jackknife empirical likelihood methods without losing computational accuracy and stability. We demonstrate that the proposed jackknife empirical likelihood methods are able to handle several challenging and open problems, with elegant asymptotic properties and accurate simulation results in finite samples. These problems include ROC curves with missing data, the difference of two ROC curves in two-dimensional correlated data, a novel inference for the partial AUC, and the difference of two quantiles with one or two samples. In addition, empirical likelihood methodology can be successfully applied to the linear transformation model using adjusted estimating equations. Comprehensive simulation studies of coverage probabilities and average lengths for these topics demonstrate that the proposed jackknife empirical likelihood methods perform well in finite samples under various settings. Moreover, some related and attractive real problems are studied to support our conclusions. Finally, we provide an extensive discussion of some interesting and feasible ideas for future studies based on our jackknife EL procedures.
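The jackknife step behind these methods replaces the original statistic with pseudo-values, to which standard empirical likelihood for a mean can then be applied; a minimal sketch of the pseudo-value computation (illustrative, not the dissertation's code):

```python
import numpy as np

def jackknife_pseudovalues(x, stat):
    """Pseudo-values n*T(x) - (n-1)*T(x without i); their mean is the jackknife estimate."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    theta_full = stat(x)
    theta_loo = np.array([stat(np.delete(x, i)) for i in range(n)])
    return n * theta_full - (n - 1) * theta_loo

# Sanity check: for the sample mean, the pseudo-values are the observations themselves.
x = np.array([1.0, 4.0, 2.0, 8.0])
pv = jackknife_pseudovalues(x, np.mean)
```

Treating the pseudo-values as approximately independent observations is what reduces a non-linear statistic to a mean, for which the empirical likelihood ratio has a standard chi-square calibration.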
9. Martins, Francisco T. R. França. "Take-Two Interactive Software – risk of missing the growth wave." Master's thesis, 2017. http://hdl.handle.net/10362/25883.

10. Hung, Jui-Chung (洪瑞鍾). "Two-Stage Signal Reconstruction under Unknown Parameters and Missing Data." Thesis, 2001. http://ndltd.ncl.edu.tw/handle/43097974010581810123.


Books on the topic "Two Wave Missing Data"

1. Kemp, J. W. L. Two-dimensional identification and control in the presence of missing data. Manchester: UMIST, 1994.

2. Nussbaumer-Ochsner, Yvonne, and Konrad E. Bloch. Sleep at high altitude and during space travel. Edited by Sudhansu Chokroverty, Luigi Ferini-Strambi, and Christopher Kennard. Oxford University Press, 2017. http://dx.doi.org/10.1093/med/9780199682003.003.0054.

Abstract:
This chapter summarizes data on sleep–wake disturbances in humans at high altitude and in space. High altitude exposure is associated with periodic breathing and a trend toward reduced slow-wave sleep and sleep efficiency in healthy individuals. Some subjects are affected by altitude-related illness (eg, acute and chronic mountain sickness, high-altitude cerebral and pulmonary edema). Several drugs are available to prevent and treat these conditions. Data about the effects of microgravity on sleep are limited and do not allow the drawing of firm conclusions. Microgravity and physical and psychological factors are responsible for sleep–wake disturbances during space travel. Space missions are associated with sleep restriction and disruption and circadian rhythm disturbances encouraging use of sleep medication. An unexplained and unexpected finding is the improvement in upper airway obstructive breathing events and snoring during space flight.
3. Hollis-Brusky, Amanda, and Joshua C. Wilson. Separate but Faithful. Oxford University Press, 2020. http://dx.doi.org/10.1093/oso/9780190637262.001.0001.

Abstract:
While the Christian Right has long voiced grave concerns about the Supreme Court and cases such as Roe v. Wade, until recently its cultivation of the resources needed to effectively enter the courtroom had paled in comparison with its efforts in more traditional political arenas. A small constellation of high-profile leaders within the Christian Right began to address this imbalance in earnest in the pivot from the twentieth to the twenty-first century, investing in an array of institutions aimed at radically transforming American law and legal culture. Separate But Faithful is the first in-depth examination of these efforts—their causes, contours, and consequences. Drawing on an impressive amount of original data from a variety of sources, the book examines the conditions that gave rise to a set of distinctly “Christian Worldview” law schools and legal institutions. Further, the book analyzes their institutional missions and cultural makeup and evaluates their transformative impacts on law and legal culture to date. Separate But Faithful finds that this movement, while struggling to influence the legal and political mainstream, has succeeded in establishing a resilient Christian conservative beacon of resistance: a separate but faithful space from which to incrementally challenge the dominant legal culture by training and credentialing, in the words of Jerry Falwell, “a generation of Christian attorneys who could . . . infiltrate the legal profession with a strong commitment to the Judeo-Christian ethic.”
APA, Harvard, Vancouver, ISO, and other styles
4

Cai, Zongwu. Functional Coefficient Models for Economic and Financial Data. Edited by Frédéric Ferraty and Yves Romain. Oxford University Press, 2018. http://dx.doi.org/10.1093/oxfordhb/9780199568444.013.6.

Full text
Abstract:
This article discusses the use of functional coefficient models for economic and financial data analysis. It first provides an overview of recent developments in the nonparametric estimation and testing of functional coefficient models, with particular emphasis on the kernel local polynomial smoothing method, before considering misspecification testing as an important econometric question when fitting a functional (varying) coefficient model or a trending time-varying coefficient model. It then describes two major real-life applications of functional coefficient models in economics and finance: the first deals with the use of functional coefficient instrumental-variable models to investigate the empirical relation between wages and education in a random sample of young Australian female workers from the 1985 wave of the Australian Longitudinal Survey, and the second is concerned with the use of functional coefficient beta models to analyze the common stock price of Microsoft stock (MSFT) during the year 2000 using the daily closing prices.
APA, Harvard, Vancouver, ISO, and other styles
5

Deruelle, Nathalie, and Jean-Philippe Uzan. The two-body problem: an effective-one-body approach. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198786399.003.0056.

Full text
Abstract:
This chapter presents the basics of the ‘effective-one-body’ approach to the two-body problem in general relativity. It also shows that the 2PN equations of motion can be mapped, by means of an appropriate canonical transformation, to a geodesic motion in a static, spherically symmetric spacetime, thus considerably simplifying the dynamics. Then, including the 2.5PN radiation reaction force in the (resummed) equations of motion, this chapter provides the waveform during the inspiral, merger, and ringdown phases of the coalescence of two non-spinning black holes into a final Kerr black hole. The chapter also comments on the current developments of this approach, which is instrumental in building the libraries of waveform templates that are needed to analyze the data collected by the current gravitational wave detectors.
APA, Harvard, Vancouver, ISO, and other styles
6

Zeitlin, Vladimir. Geophysical Fluid Dynamics. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780198804338.001.0001.

Full text
Abstract:
The book explains the key notions and fundamental processes in the dynamics of the fluid envelopes of the Earth (transposable to other planets), and methods of their analysis, from the unifying viewpoint of the rotating shallow-water model (RSW). The model, in its one- or two-layer versions, plays a distinguished role in geophysical fluid dynamics, having been used for around a century for conceptual understanding of various phenomena, for elaboration of approaches and methods to be applied later in more complete models, for development and testing of numerical codes and schemes of data assimilation, and for many other purposes. Principles of modelling of large-scale atmospheric and oceanic flows, and corresponding approximations, are explained, and it is shown how single- and multi-layer versions of RSW arise from the primitive equations by vertical averaging, and how further time-averaging produces celebrated quasi-geostrophic reductions of the model. Key concepts of geophysical fluid dynamics are exposed and interpreted in RSW terms, and fundamentals of vortex and wave dynamics are explained in Part 1 of the book, which is supplied with exercises and can be used as a textbook. Solutions of the problems are available from the Editorial Office on request. In-depth treatment of dynamical processes, with special accent on the primordial process of geostrophic adjustment, on instabilities in geophysical flows, on vortex and wave turbulence, and on nonlinear wave interactions, follows in Part 2. Recently developed approaches in, and applications of, RSW, including moist-convective processes, constitute Part 3.
APA, Harvard, Vancouver, ISO, and other styles
7

Holmes, Amy Austin. Coups and Revolutions. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780190071455.001.0001.

Full text
Abstract:
This book offers the first analysis of both the revolution and counterrevolution in Egypt, from January 2011 until July 2018. The period of revolutionary upheaval played out in three uprisings against three distinct forms of authoritarian rule: the Mubarak regime and the police state that protected it, the unelected military junta known as the Supreme Council of Armed Forces, and the religious authoritarianism of the Muslim Brotherhood. The second part of the book analyzes the counterrevolution, which is divided into two periods: the first under Adly Mansour as interim president; and the second after Abdel Fattah El-Sisi was elected president. During the first wave, the regime imprisoned or killed the leadership of the Muslim Brotherhood and many secular activists, while during the second wave the regime turned against civil society at large: nongovernmental organizations, charities, the media, academia, and minority groups. In addition to providing new and unprecedented empirical data, the book makes two theoretical contributions. First, a new framework is presented for analyzing the state apparatus in Egypt, which is based on four pillars of regime support that can either prop up or press upon those in power: the Egyptian military, the business elite, the United States, and the multiheaded opposition. Second, the book brings together the literature on bottom-up revolutionary movements and top-down military coups, and it introduces the concept of a coup from below in contrast to the revolution from above that took place under Gamal Abdel Nasser.
APA, Harvard, Vancouver, ISO, and other styles
8

van der Hoeven, Frank, and Alexander Wandl. Hotterdam: How space is making Rotterdam warmer, how this affects the health of its inhabitants, and what can be done about it. TU Delft Open, 2015. http://dx.doi.org/10.47982/bookrxiv.1.

Full text
Abstract:
Heat waves will occur in Rotterdam with greater frequency in the future. Those affected most will be the elderly – a group that is growing in size. In the light of the Paris heat wave of August 2003 and the one in Rotterdam in July 2006, mortality rates among the elderly in particular are likely to rise in the summer. METHOD The aim of the Hotterdam research project was to gain a better understanding of urban heat. The heat was measured and the surface energy balance modelled from that perspective. We identified the social and physical features of the city in detail with the help of satellite images, GIS and 3D models. We determined the links between urban heat/surface energy balance and the social/physical features of Rotterdam by multivariable regression analysis. The crucial elements of the heat problem were then clustered and illustrated on a social and a physical heat map. RESULTS The research project produced two heat maps, an atlas of underlying data and a set of adaptation measures which, when combined, will make the city of Rotterdam and its inhabitants more aware of, and less vulnerable to, heat wave-related health effects. CONCLUSION In different ways, the pre-war districts of the city (North, South, and West) are warmer and more vulnerable to urban heat than are other areas of Rotterdam. The temperature readings that we carried out confirm these findings as far as outdoor temperatures are concerned. Indoor temperatures vary widely. Homes seem to have their particular dynamics, in which the house’s age plays a role. The above-average mortality of those aged 75 and over during the July 2006 heat wave in Rotterdam can be explained by a) the concentration of people in this age group, b) the age of the homes they live in, and c) the sum of sensible heat and ground heat flux. A diverse mix of impervious surfaces, surface water, foliage, building envelopes and shade make one area or district warmer than another.
Adaptation measures are in the hands of residents, homeowners and the local council alike, and relate to changing behaviour, physical measures for homes, and urban design respectively.
APA, Harvard, Vancouver, ISO, and other styles
9

Hellwig, Timothy, Yesola Kweon, and Jack Vowles. Democracy Under Siege? Oxford University Press, 2020. http://dx.doi.org/10.1093/oso/9780198846208.001.0001.

Full text
Abstract:
For the world's democracies, the Global Financial Crisis of 2008–9 was the catalyst for the most precipitous economic downturn in eight decades. This book examines how the GFC and ensuing Great Recession affected the workings of mass politics in the established democracies. The initial wave of research on the crisis concluded it did little to change the established relationships between voters, parties, and elections. Yet, nearly a decade since the initial shock, we are witnessing a wave of political changes, the extent of which has not been fully explained by existing studies. How did the economic malaise bear on the political preferences of citizens? This book pushes against the received wisdom by advancing a framework for understanding citizen attitudes, preferences, and behaviour. We make two main claims. First, while previous studies of the GFC tend to focus on the immediate impact of the crisis, we argue that economic malaise had a long-lasting impact. In addition to the economic shock, we emphasize that economic recovery has a significant impact on citizens' assessment of political elites. Second, we argue that unanticipated exogenous shocks like the GFC grant party elites an opening for political manoeuvre through public policy and rhetoric. As a result, political elites have a high degree of agency to shape public perceptions and behaviour. Political parties can strategically moderate citizens' economic uncertainty, mobilize/demobilize voters, and alter individuals' political preferences. By leveraging data from over 150,000 individuals across over 100 nationally representative post-election surveys from the 1990s to 2017, this book tests these research claims across a range of outcomes, including economic perceptions, policy demands, political participation, and the vote.
APA, Harvard, Vancouver, ISO, and other styles
10

Temperley, David. The Musical Language of Rock. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780190653774.001.0001.

Full text
Abstract:
A theory of the structure of rock music is presented, addressing aspects such as tonality/key, harmony, rhythm/meter, melody, phrase structure, timbre/instrumentation, form, and emotional expression. The book brings together ideas from the author’s previous articles but also contains substantial new material. Rock is defined broadly (as it often is) to include a wide range of late twentieth-century Anglo-American popular styles, including 1950s rock & roll, Motown, soul, “British invasion” rock, soft rock, heavy metal, disco, new wave, and alternative rock. The study largely employs the informal, intuitive methods of conventional music theory and analysis, but it is also informed by corpus data. An important component of the theory is a representation of pitches—the “line of fifths”—that sheds light on issues such as stylistic distinctions within rock, effects of surprise, and emotion. The theory also entails a model of expression with three dimensions, representing valence, energy, and tension; this proves to be a powerful tool for tracing shifts in expressive effect within songs. The theory features novel approaches to issues such as cadences, melodic-harmonic coordination, the handling of sectional boundaries, and the classification of formal types. The final two chapters present analyses of six songs and a broader consideration of rock in its historical and stylistic context.
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Two Wave Missing Data"

1

Graham, John W., and Allison E. Shevock. "Planned Missing Data Design 2: Two-Method Measurement." In Missing Data, 295–323. New York, NY: Springer New York, 2012. http://dx.doi.org/10.1007/978-1-4614-4018-5_13.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Griffith, Daniel A. "The Missing Data Problem for a Two-Dimensional Surface." In Advanced Studies in Theoretical and Applied Econometrics, 127–74. Dordrecht: Springer Netherlands, 1988. http://dx.doi.org/10.1007/978-94-009-2758-2_6.

Full text
APA, Harvard, Vancouver, ISO, and other styles
3

Mazzotti, A., R. G. Ferber, and R. Marschall. "Two-Component Recording with a P-Wave Source to Improve Seismic Resolution." In Aspects of Seismic Reflection Data Processing, 155–223. Dordrecht: Springer Netherlands, 1989. http://dx.doi.org/10.1007/978-94-009-2087-3_3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Ward, W. E., J. Oberheide, M. Riese, P. Preusse, and D. Offermann. "Planetary wave two signatures in CRISTA 2 ozone and temperature data." In Atmospheric Science Across the Stratopause, 319–25. Washington, D. C.: American Geophysical Union, 2000. http://dx.doi.org/10.1029/gm123p0319.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Rahman, Md Geaur, and Md Zahidul Islam. "kDMI: A Novel Method for Missing Values Imputation Using Two Levels of Horizontal Partitioning in a Data set." In Advanced Data Mining and Applications, 250–63. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-53917-6_23.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Diebold, John. "Tau-p Analysis in One, Two and Three Dimensions." In Tau-p: a plane wave approach to the analysis of seismic data, 71–117. Dordrecht: Springer Netherlands, 1989. http://dx.doi.org/10.1007/978-94-009-0881-9_3.

Full text
APA, Harvard, Vancouver, ISO, and other styles
7

Leutgeb, Lorenz, Georg Moser, and Florian Zuleger. "Automated Expected Amortised Cost Analysis of Probabilistic Data Structures." In Computer Aided Verification, 70–91. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-031-13188-2_4.

Full text
Abstract:
In this paper, we present the first fully-automated expected amortised cost analysis of self-adjusting data structures, that is, of randomised splay trees, randomised splay heaps and randomised meldable heaps, which so far have only (semi-)manually been analysed in the literature. Our analysis is stated as a type-and-effect system for a first-order functional programming language with support for sampling over discrete distributions, non-deterministic choice and a ticking operator. The latter allows for the specification of fine-grained cost models. We state two soundness theorems based on two different—but strongly related—typing rules of ticking, which account differently for the cost of non-terminating computations. Finally we provide a prototype implementation able to fully automatically analyse the aforementioned case studies.
APA, Harvard, Vancouver, ISO, and other styles
8

Alinhac, Serge. "Blowup of small data solutions for a class of quasilinear wave equations in two dimensions: an outline of the proof." In Geometrical Optics and Related Topics, 1–15. Boston, MA: Birkhäuser Boston, 1997. http://dx.doi.org/10.1007/978-1-4612-2014-5_1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
9

Hachay, Olga, and Andrey Khachay. "REFLECTION OF PROCESSES OF NONEQUILIBRIUM TWO-PHASE FILTRATION IN OIL-SATURATED HIERARCHIC MEDIUM BY DATA OF ACTIVE WAVE GEOPHYSICAL MONITORING." In Oil and Gas Exploration, 135–41. Hoboken, NJ, USA: John Wiley & Sons, Inc., 2017. http://dx.doi.org/10.1002/9781119227519.ch8.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Hachay, Olga, Andrey Khachay, and Oleg Khachay. "Reflection of Processes of Non-equilibrium and Two-Phase Filtration in Fluid Saturated Hierarchic Inclusion in a Block Layered Medium by Data of Active Wave Geophysical Monitoring." In Sustainable Civil Infrastructures, 31–38. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-030-02032-3_3.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Two Wave Missing Data"

1

Chen, Jiaxin, Ian G. C. Ashton, and Ajit C. Pillai. "Wave Record Gap-Filling Using a Low-Rank Tensor Completion Model." In ASME 2022 41st International Conference on Ocean, Offshore and Arctic Engineering. American Society of Mechanical Engineers, 2022. http://dx.doi.org/10.1115/omae2022-79897.

Full text
Abstract:
The offshore wind farm industry has identified further refinement of marine operations as necessary to realize the lower strike prices seen in recent subsidy auctions. This requires extending working times by taking advantage of weather windows, even when operating in more remote sites. Key to this will be increasing the accuracy of forecasts and live metocean data from site for effective and safe operation scheduling. In recent years, statistical models and deep learning-based models, by learning spatial and temporal patterns in observations, have shown their potential to support or even partly replace numerical weather modelling in providing accurate forecasts. The present paper implements a machine learning approach utilizing a nonconvex low-rank tensor completion model with a truncated nuclear norm algorithm (LRTC-TNN), which can characterize the spatial and temporal dependencies rooted in the data, to fill gaps in measurement data from wave buoys. The performance is assessed by manually masking valid data, thereby introducing three types of missing entries. The proposed method is shown to successfully fill datasets missing up to 20% of the data with R2 values exceeding 0.9. The results are also compared against other intuitive methods, including linear interpolation, cubic spline interpolation, and historical means. Compared to these, the present method is shown to produce more reasonable trends. Finally, sensitivity to the ratio of missing data, the type of missing data, and two hyper-parameters of the algorithm was examined to characterize their impacts on the results and advise potential improvements. This work demonstrates multivariate time series gap-filling with arbitrary missing ratios and sensor availability that can be extended and deployed to sensor networks for possible forecasting applications.
APA, Harvard, Vancouver, ISO, and other styles
2

Hageman, Remco, Pieter Aalberts, Didier L’Hostis, and Alain Ledoux. "Feasibility of Using Hindcast Data for Fatigue Assessment of Permanently Moored Offshore Units in West-Africa." In ASME 2020 39th International Conference on Ocean, Offshore and Arctic Engineering. American Society of Mechanical Engineers, 2020. http://dx.doi.org/10.1115/omae2020-18794.

Full text
Abstract:
To register the wave loads acting on offshore floating structures, wave buoys have generally been accepted as the most accurate method. However, over the last couple of years, hindcast data has become increasingly accurate. The availability of this data allows assessment of the response of offshore floating structures without the need for local wave buoys. The accuracy of using hindcast data to assess fatigue accumulation on FPSO hulls is evaluated in this paper. For this analysis, calculations have been executed for multiple spread-moored production units in the West-Africa region. The purposely deployed wave buoys provide a full wave spectrum, including multiple wave components and directional variation of the wave energy. This data is used to derive a statistical description of the sea state. In this paper, we consider the use of the WaveWatch III, ERA-5 and Copernicus hindcast models. These are all models which provide sea state descriptions on a global scale. The statistical data from the hindcast models is compared against the reference data provided by the wave buoys. Overall, the three models provide decent results in this mild environment. However, it is found that Copernicus provides more accurate estimates for the larger wave heights. Fatigue analyses have been executed. The full spectral data provided by the wave buoys is used to conduct a spectral fatigue assessment. The hindcast models generally provide more limited details of the sea state; information such as spectral shapes and directional spreading may be missing. Understanding the influence of these missing characteristics is vital for reliable long-term fatigue assessment. Multiple fatigue analyses have been executed to examine the sensitivity to these missing characteristics. It has been shown that the multi-modality of the sea state and the spectral shape are the most important parameters driving the deviation between the fatigue assessment based on hindcast data and that based on wave buoy data.
These differences can accumulate to a factor of 2 on lifetime consumption. The influence of the accuracy of the statistical parameters provided by the hindcast models and wave spreading is considerably less with a typical contribution of around 30% on lifetime consumption.
APA, Harvard, Vancouver, ISO, and other styles
3

Cappietti, Lorenzo, Irene Simonetti, Andrea Esposito, Maximilian Streicher, Andreas Kortenhaus, Babette Scheres, Holger Schuettrumpf, Matthias Hirt, Bas Hofland, and Xuexue Chen. "Large-Scale Experiments of Wave-Overtopping Loads on Walls: Layer Thicknesses and Velocities." In ASME 2018 37th International Conference on Ocean, Offshore and Arctic Engineering. American Society of Mechanical Engineers, 2018. http://dx.doi.org/10.1115/omae2018-78104.

Full text
Abstract:
Wave-overtopping loads on vertical walls, such as those located on top of a dike, have been investigated in several small-scale experiments in the past. A large-scale validation for a mild-foreshore situation is still missing. Hence the WALOWA (WAve LOads on WAlls) experimental campaign was carried out to address this topic. This paper first presents a description of the large-scale model, the measurement set-up and the experimental methodologies; it then focuses on the layer thicknesses and velocities of the flows created on the promenade by the wave overtopping. A set of resistive wave gauges, ultrasonic distance sensors and velocimeters has been used to conduct these measurements. Preliminary data analysis and results, from a test comprising 1000 irregular waves, are discussed. The momentum flux of these flows is studied, and its implications for the wave-overtopping loads acting on the vertical walls are highlighted.
APA, Harvard, Vancouver, ISO, and other styles
4

Arcipreste, Bruno, Delfim Soares, Luis Ribas, and José Carlos Teixeira. "Numerical Modeling of Wave Soldering in PCB." In ASME 2014 International Mechanical Engineering Congress and Exposition. American Society of Mechanical Engineers, 2014. http://dx.doi.org/10.1115/imece2014-39051.

Full text
Abstract:
Manufacturing of electronic boards (commonly referred as PCB) is a highly automated process that requires an accurate control of the various processing variables. Amongst the soldering processes, wave soldering is one of the most often used. In this, the various electronic components are provisionally inserted onto the PCB, and a low velocity jet of melted solder is directed to the moving board. Due to capillarity effects the solder adheres to the component/board interface and the process is completed. This methodology is most often used for small components. The adjusting of the operating parameters (solder nozzle orientation and velocity) is often carried out on a trial and error basis resulting in a time consuming process that is at odds with the increasing demand for smaller production series that the electronics industry is faced with. In addition the number of defects (mostly from missing components that are washed away by the impacting jet) is more likely to occur when thinner substrates are used in the PCB manufacturing. The present paper describes the application of a Computational Fluid Dynamics model to describe the interaction of the solder jet with the PCB and the integrated circuits. The model includes the conservation equations for mass, momentum and energy in a transient time frame. The jet and surrounding ambient atmosphere are modeled as two separate fluids and the interface is tracked by a VOF model. By adjusting the computational mesh refinement the interface is captured with accuracy. The drag forces occurring in the various components are computed from the pressure data field. The model allows the optimization of the wave operating parameters as a function of the component type of and its layout in the PCB.
APA, Harvard, Vancouver, ISO, and other styles
5

Birk, Lothar, and Gu¨nther F. Clauss. "Optimization of Offshore Structures Based on Linear Analysis of Wave-Body Interaction." In ASME 2008 27th International Conference on Offshore Mechanics and Arctic Engineering. ASMEDC, 2008. http://dx.doi.org/10.1115/omae2008-57631.

Full text
Abstract:
This paper discusses new developments in automated hull shape optimization spearheaded by the authors. The use of the linear diffraction-radiation panel code WAMIT® as a design tool is highlighted. Early optimization results yielded bodies with extreme shapes. A series of studies has been performed comparing model test data and numerical computations. The presented comparisons reinforce the numerical results. The authors connected the reliable hydrodynamic analysis provided by WAMIT® with a newly developed parametric hull design methodology. This allows the automated generation of hull shapes without requiring user interaction. Single- and multi-objective optimization algorithms are available to solve a wide range of nonlinear programming problems. The integrated system optimizes hulls by minimizing motions and forces in waves. This is especially important for innovative systems when prior design experience is missing. The results of an optimization run provide a wealth of information which can be utilized to support rational design decisions.
APA, Harvard, Vancouver, ISO, and other styles
6

Stefanakos, Christos N. "Nonstationary Extreme-Value Predictions in the Gulf of Mexico." In ASME 2007 26th International Conference on Offshore Mechanics and Arctic Engineering. ASMEDC, 2007. http://dx.doi.org/10.1115/omae2007-29383.

Full text
Abstract:
In the present work, return periods of various level values of significant wave height in the Gulf of Mexico are given. The predictions are based on a new method for nonstationary extreme-value calculations that has recently been published. This enhanced method efficiently exploits the nonstationary modeling of wind or wave time series and a new definition of return period using the MEan Number of Upcrossings of the level value x* (MENU method). The whole procedure is applied to long-term measurements of wave height in the Gulf of Mexico. Two kinds of data have been used: long-term time series of buoy measurements, and satellite altimeter data. Measured time series are incomplete, and a novel procedure for filling in missing values is applied before proceeding with the extreme-value calculations. Results are compared with several variants of traditional methods, giving more realistic estimates than the traditional predictions. This is in accordance with the results of other methods that also take into account the dependence structure of the examined time series.
APA, Harvard, Vancouver, ISO, and other styles
7

Qian, Yulei, and Daiyin Zhu. "Focusing of Two Dimensional Missing SAR Raw Data." In 2019 International Applied Computational Electromagnetics Society Symposium - China (ACES). IEEE, 2019. http://dx.doi.org/10.23919/aces48530.2019.9060698.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Suleymanov, Vagif, Ammar El-Husseiny, Guenther Glatz, and Jack Dvorkin. "Rock Physics and Machine Learning Analysis of a High-Porosity Gas Sand in the Gulf of Mexico." In SPE Annual Technical Conference and Exhibition. SPE, 2022. http://dx.doi.org/10.2118/210191-ms.

Full text
Abstract:
Rock physics transforms established on well data play an important role in predicting seismic rock properties. However, a data-driven approach, such as machine learning, can also estimate the targeted outputs from the well data. This study aims at comparing the accuracy of rock physics and machine learning analyses for the prediction of the P-wave velocity of porous rocks at the well log scale by employing well data from the Mississippi Canyon, Gulf of Mexico. Rock physics diagnostics (RPD) was used as a physics-driven methodology for predicting the P-wave velocity, while an artificial neural network (ANN) was used as a machine learning approach. To train the neural network, the well data were divided into two sections, where the ANN model was optimized on the upper well interval and tested on the lower interval. During the rock physics analysis, the lower interval was employed to compare the results obtained from the physics-driven and data-driven approaches in the same well interval. Based on the results from RPD, the constant cement model with a high coordination number describes the well data under examination. The established rock physics model is used for predicting elastic properties of rocks, including the P-wave velocity, from measured petrophysical properties, namely porosity, mineralogy, and the pore fluid. However, the mineralogy input, such as the clay content, was missing in the well data. Therefore, the clay content was calculated from the gamma ray log and used in the established rock physics model. On the other hand, the ANN model was developed and tested using well log inputs such as porosity, gamma ray, and resistivity logs. Results showed that the accuracy of the machine learning model outperforms that of the rock physics model in the prediction of the P-wave velocity.
In particular, a correlation coefficient (R) of 0.84 and an absolute average percentage error (AAPE) of 2.71 were obtained by the ANN model, while the constant cement model reached an R of 0.65 and an AAPE of 4.07. However, one should be aware that the clay content computed from the gamma ray log was a major factor in obtaining a low R compared to the ANN model, as it significantly introduced uncertainty into our computations.
APA, Harvard, Vancouver, ISO, and other styles
9

Mase, Hajime, and Toshikazu Kitano. "EFFECT OF MISSING LARGE WAVE DATA ON ESTIMATION ACCURACY OF RETURN WAVE HEIGHTS." In Proceedings of the 28th International Conference. World Scientific Publishing Company, 2003. http://dx.doi.org/10.1142/9789812791306_0009.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Yu, Jiayin, Yulin He, and Joshua Zhexue Huang. "A Two-Stage Missing Value Imputation Method Based on Autoencoder Neural Network." In 2021 IEEE International Conference on Big Data (Big Data). IEEE, 2021. http://dx.doi.org/10.1109/bigdata52589.2021.9671338.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Two Wave Missing Data"

1

Kline, Patrick, and Andres Santos. Sensitivity to Missing Data Assumptions: Theory and An Evaluation of the U.S. Wage Structure. Cambridge, MA: National Bureau of Economic Research, February 2010. http://dx.doi.org/10.3386/w15716.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Knibb, Rebecca, Lily Hawkins, and Dan Rigby. FoodSensitive Study: Wave One Survey. Food Standards Agency, October 2021. http://dx.doi.org/10.46756/sci.fsa.sov133.

Full text
Abstract:
We commissioned this survey to better understand how food allergies, intolerances and coeliac disease affect people across the UK, and the factors associated with higher or lower quality of life. It will also inform our ongoing work to monitor and evaluate the success of the FSA’s food hypersensitivity programme. The FSA will be running a second wave of the survey in autumn this year, and we will use this to observe any differences in the eating out and quality of life data collected across the two timepoints.
3

Collins, Clarence O., and Tyler J. Hesser. altWIZ : A System for Satellite Radar Altimeter Evaluation of Modeled Wave Heights. Engineer Research and Development Center (U.S.), February 2021. http://dx.doi.org/10.21079/11681/39699.

Abstract:
This Coastal and Hydraulics Engineering Technical Note (CHETN) describes the design and implementation of a wave model evaluation system, altWIZ, which uses wave height observations from operational satellite radar altimeters. The altWIZ system utilizes two recently released altimeter databases: Ribal and Young (2019) and European Space Agency Sea State Climate Change Initiative v.1.1 level 2 (Dodet et al. 2020). The system facilitates model evaluation against 1 Hz altimeter data or a product created by averaging altimeter data in space and time around model grid points. The system allows, for the first time, quantitative analysis of spatial model errors within the U.S. Army Corps of Engineers (USACE) Wave Information Study (WIS) 30+ year hindcast for coastal United States. The system is demonstrated on the WIS 2017 Atlantic hindcast, using a 1/2° basin scale grid and a 1/4° regional grid of the East Coast. Consistent spatial patterns of increased bias and root-mean-square-error are exposed. Seasonal strengthening and weakening of these spatial patterns are found, related to the seasonal variation of wave energy. Some model errors correspond to areas known for high currents, and thus wave-current interaction. In conjunction with the model comparison, additional functions for pairing altimeter measurements with buoy data and storm tracks have been built. Appendices give information on the code access (Appendix I), organization and files (Appendix II), example usage (Appendix III), and demonstrating options (Appendix IV).
4

Riederer, Bernhard, Nina-Sophie Fritsch, and Lena Seewann. Singles in the city: happily ever after? Verlag der Österreichischen Akademie der Wissenschaften, June 2021. http://dx.doi.org/10.1553/populationyearbook2021.res3.2.

Abstract:
More people than ever are living in cities, and in these cities, more and more people are living alone. Using the example of Vienna, this paper investigates the subjective well-being of single households in the city. Previous research has identified positive and negative aspects of living alone (e.g., increased freedom vs. missing social embeddedness). We compare single households with other household types using data from the Viennese Quality of Life Survey (1995–2018). In our analysis, we consider overall life satisfaction as well as selected dimensions of subjective wellbeing (i.e., housing, financial situation, main activity, family, social contacts, leisure time). Our findings show that the subjective well-being of single households in Vienna is high and quite stable over time. While single households are found to have lower life satisfaction than two-adult households, this result is mainly explained by singles reporting lower satisfaction with family life. Compared to households with children, singles are more satisfied with their financial situation, leisure time and housing, which helps to offset the negative consequences of missing family ties (in particular with regard to single parents).
5

Gust, Sarah. Global Universal Basic Skills: Current Deficits and Implications for World Development. Research on Improving Systems of Education (RISE), October 2022. http://dx.doi.org/10.35489/bsg-risewp_2022/114.

Abstract:
How far is the world away from ensuring that every child obtains the basic skills needed to be internationally competitive? And what would accomplishing this mean for world development? Based on the micro data of international and regional achievement tests, we map achievement onto a common (PISA) scale. We then estimate the share of children not achieving basic skills for 159 countries that cover 98.1 percent of world population and 99.4 percent of world GDP. We find that at least two-thirds of the world’s youth do not reach basic skill levels, ranging from 24 percent in North America to 89 percent in South Asia and 94 percent in Sub-Saharan Africa. Our economic analysis suggests that the present value of lost world economic output due to missing the goal of global universal basic skills amounts to over $700 trillion over the remaining century, or 11 percent of discounted GDP.
6

Brooks, G. R. Thickness record of varves from glacial Ojibway Lake recovered in sediment cores from Frederick House Lake, northeastern Ontario. Natural Resources Canada/CMSS/Information Management, 2022. http://dx.doi.org/10.4095/329275.

Abstract:
The thicknesses of 384 rhythmic couplets were measured along a composite sequence of glacial Lake Ojibway glaciolacustrine deposits recovered in two sediment cores from Frederick House Lake, Ontario. The visual comparison of distinctive couplets in the CT-scan radiographs of the Frederick House core samples to photographs of core samples from Reid Lake shows a match of ±1 varve number from v1656-v1902, and ±5 varve numbers between v1903-v2010, relative to the regional numbering of the Timiskaming varve series. There are two interpretations for the post-v2010 couplets that fall within the Connaught varve sequence of the regional series. In the first, the interpreted numbering spans from v2066-v2115, which produces a gap of 55 missing varves equivalent to v2011-v2065, and corresponds to the original interpretation of the Connaught varve numbering. The second spans v2011a-v2060a, and represents alternative (a) numbering for the same varves. Varve thickness data are listed in spreadsheet files (.xlsx and .csv formats), and CT-Scan radiograph images of core samples are laid out on a mosaic poster showing the interpreted varve numbering and between-core sample correlations of the varve couplets.
7

Raymond, Kara, Laura Palacios, Cheryl McIntyre, and Evan Gwilliam. Status of climate and water resources at Chiricahua National Monument, Coronado National Memorial, and Fort Bowie National Historic Site: Water year 2019. National Park Service, May 2022. http://dx.doi.org/10.36967/nrr-2293370.

Abstract:
Climate and hydrology are major drivers of ecosystems. They dramatically shape ecosystem structure and function, particularly in arid and semi-arid ecosystems. Understanding changes in climate, groundwater, and water quality and quantity is central to assessing the condition of park biota and key cultural resources. The Sonoran Desert Network collects data on climate, groundwater, and surface water at 11 National Park Service units in southern Arizona and New Mexico. This report provides an integrated look at climate, groundwater, and springs conditions at Chiricahua National Monument (NM), Coronado National Memorial (NMem), and Fort Bowie National Historic Site (NHS) during water year (WY) 2019 (October 2018–September 2019). Overall annual precipitation at Chiricahua NM and Coronado NMem in WY2019 was approximately the same as the normals for 1981–2010. (The weather station at Fort Bowie NHS had missing values on 275 days, so data were not presented for that park.) Fall and winter rains were greater than normal. The monsoon season was generally weaker than normal, but storm events related to Hurricane Lorena led to increased late-season rain in September. Mean monthly maximum temperatures were generally cooler than normal at Chiricahua, whereas mean monthly minimum temperatures were warmer than normal. Temperatures at Coronado were more variable relative to normal. The reconnaissance drought index (RDI) indicated that Chiricahua NM was slightly wetter than normal. (The WY2019 RDI could not be calculated for Coronado NMem due to missing data.) The five-year moving mean of annual precipitation showed both park units were experiencing a minor multi-year precipitation deficit relative to the 39-year average. Mean groundwater levels in WY2019 increased at Fort Bowie NHS, and at two of three wells monitored at Chiricahua NM, compared to WY2018. Levels in the third well at Chiricahua slightly decreased. 
By contrast, water levels declined in five of six wells at Coronado NMem over the same period, with the sixth well showing a slight increase over WY2018. Over the monitoring record (2007–present), groundwater levels at Chiricahua have been fairly stable, with seasonal variability likely caused by transpiration losses and recharge from runoff events in Bonita Creek. At Fort Bowie’s WSW-2, mean groundwater level was also relatively stable from 2004 to 2019, excluding temporary drops due to routine pumping. At Coronado, four of the six wells demonstrated increases (+0.30 to 11.65 ft) in water level compared to the earliest available measurements. Only WSW-2 and Baumkirchner #3 have shown net declines (-17.31 and -3.80 feet, respectively) at that park. Springs were monitored at nine sites in WY2019 (four sites at Chiricahua NM; three at Coronado NMem, and two at Fort Bowie NHS). Most springs had relatively few indications of anthropogenic or natural disturbance. Anthropogenic disturbance included modifications to flow, such as dams, berms, or spring boxes. Examples of natural disturbance included game trails, scat, or evidence of flooding. Crews observed 0–6 facultative/obligate wetland plant taxa and 0–3 invasive non-native species at each spring. Across the springs, crews observed six non-native plant species: common mullein (Verbascum thapsus), spiny sowthistle (Sonchus asper), common sowthistle (Sonchus oleraceus), Lehmann lovegrass (Eragrostis lehmanniana), rabbitsfoot grass (Polypogon monspeliensis), and red brome (Bromus rubens). Baseline data on water quality and water chemistry were collected at all nine sites. It is likely that all nine springs had surface water for at least some part of WY2019, though temperature sensors failed at two sites. The seven sites with continuous sensor data had water present for most of the year. Discharge was measured at eight sites and ranged from < 1 L/minute to 16.5 L/minute.
8

Ruosteenoja, Kimmo. Applicability of CMIP6 models for building climate projections for northern Europe. Finnish Meteorological Institute, September 2021. http://dx.doi.org/10.35614/isbn.9789523361416.

Abstract:
In this report, we have evaluated the performance of nearly 40 global climate models (GCMs) participating in Phase 6 of the Coupled Model Intercomparison Project (CMIP6). The focus is on the northern European area, but the ability to simulate southern European and global climate is discussed as well. Model evaluation was started with a technical control; completely unrealistic values in the GCM output files were identified by seeking the absolute minimum and maximum values. In this stage, one GCM was rejected entirely, as were individual output files from two other GCMs. In evaluating the remaining GCMs, the primary tool was the Model Climate Performance Index (MCPI) that combines RMS errors calculated for the different climate variables into one index. The index takes into account both the seasonal and spatial variations in climatological means. Here, MCPI was calculated for the period 1981–2010 by comparing GCM output with the ERA-Interim reanalyses. Climate variables explored in the evaluation were the surface air temperature, precipitation, sea level air pressure and incoming solar radiation at the surface. Besides MCPI, we studied RMS errors in the seasonal course of the spatial means by examining each climate variable separately. Furthermore, the evaluation procedure considered model performance in simulating past trends in the global-mean temperature, the compatibility of future responses to different greenhouse-gas scenarios and the number of available scenario runs. Daily minimum and maximum temperatures were likewise explored in a qualitative sense, but owing to the non-existence of data from multiple GCMs, these variables were not incorporated in the quantitative validation.
Four of the 37 GCMs that had passed the initial technical check were regarded as wholly unusable for scenario calculations: in two GCMs the responses to the different greenhouse gas scenarios were contradictory and in two other GCMs data were missing from one of the four key climate variables. Moreover, to reduce inter-GCM dependencies, no more than two variants of any individual GCM were included; this led to an abandonment of one GCM. The remaining 32 GCMs were divided into three quality classes according to the assessed performance. The users of model data can utilize this grading to select a subset of GCMs to be used in elaborating climate projections for Finland or adjacent areas. Annual-mean temperature and precipitation projections for Finland proved to be nearly identical regardless of whether they were derived from the entire ensemble or by ignoring models that had obtained the lowest scores. Solar radiation projections were somewhat more sensitive.
9

Bonvecchi, Alejandro, Ernesto Calvo, Susana Otálvaro-Ramírez, and Carlos Scartascini. The Effect of a Crisis on Trust and Willingness to Reform: Evidence from Survey Panels in Argentina and Uruguay. Inter-American Development Bank, July 2022. http://dx.doi.org/10.18235/0004396.

Abstract:
Does exposure to crises reduce citizens' trust in a country's president? Are individuals willing to accept fiscal reforms and make personal economic sacrifices if it would help the country to leave the crisis faster? We take advantage of two survey panels in Argentina and Uruguay, with a first wave fielded before COVID-19 (the crisis studied here) and a second wave a year later during the pandemic. Results provide no evidence of a decline in trust after the individual's health was compromised by COVID-19. We find mixed evidence of support for higher personal sacrifices. These results are relevant for understanding how voters' experience with COVID affects their trust in the government and whether crises could be prudent times for reforms. The results highlight the importance of having multi-country panel data for evaluating the impact of crises on trust.
10

Chapman, Ray, Phu Luong, Sung-Chan Kim, and Earl Hayter. Development of three-dimensional wetting and drying algorithm for the Geophysical Scale Transport Multi-Block Hydrodynamic Sediment and Water Quality Transport Modeling System (GSMB). Engineer Research and Development Center (U.S.), July 2021. http://dx.doi.org/10.21079/11681/41085.

Abstract:
The Environmental Laboratory (EL) and the Coastal and Hydraulics Laboratory (CHL) have jointly completed a number of large-scale hydrodynamic, sediment and water quality transport studies. EL and CHL have successfully executed these studies utilizing the Geophysical Scale Transport Modeling System (GSMB). The model framework of GSMB is composed of multiple process models as shown in Figure 1. Figure 1 shows that the United States Army Corps of Engineers (USACE) accepted wave, hydrodynamic, sediment and water quality transport models are directly and indirectly linked within the GSMB framework. The components of GSMB are the two-dimensional (2D) deep-water wave action model (WAM) (Komen et al. 1994, Jensen et al. 2012), data from meteorological model (MET) (e.g., Saha et al. 2010 - http://journals.ametsoc.org/doi/pdf/10.1175/2010BAMS3001.1), shallow water wave models (STWAVE) (Smith et al. 1999), Coastal Modeling System wave (CMS-WAVE) (Lin et al. 2008), the large-scale, unstructured two-dimensional Advanced Circulation (2D ADCIRC) hydrodynamic model (http://www.adcirc.org), and the regional scale models, Curvilinear Hydrodynamics in three dimensions-Multi-Block (CH3D-MB) (Luong and Chapman 2009), which is the multi-block (MB) version of Curvilinear Hydrodynamics in three-dimensions-Waterways Experiments Station (CH3D-WES) (Chapman et al. 1996, Chapman et al. 2009), MB CH3D-SEDZLJ sediment transport model (Hayter et al. 2012), and CE-QUAL Management - ICM water quality model (Bunch et al. 2003, Cerco and Cole 1994). Task 1 of the DOER project, “Modeling Transport in Wetting/Drying and Vegetated Regions,” is to implement and test three-dimensional (3D) wetting and drying (W/D) within GSMB. This technical note describes the methods and results of Task 1. The original W/D routines were restricted to a single vertical layer or depth-averaged simulations. 
In order to retain the required 3D or multi-layer capability of MB-CH3D, a multi-block version with variable block layers was developed (Chapman and Luong 2009). This approach requires a combination of grid decomposition, MB, and Message Passing Interface (MPI) communication (Snir et al. 1998). The MB single layer W/D has demonstrated itself as an effective tool in hyper-tidal environments, such as Cook Inlet, Alaska (Hayter et al. 2012). The code modifications, implementation, and testing of a fully 3D W/D are described in the following sections of this technical note.
