A selection of scholarly literature on the topic "Human Motion Data Analysis"
Format your source in APA, MLA, Chicago, Harvard, and other citation styles
Browse lists of current articles, books, dissertations, conference abstracts, and other scholarly sources on the topic "Human Motion Data Analysis".
Next to every entry in the list of references you will find an "Add to bibliography" button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style of your choice: APA, MLA, Harvard, Chicago, Vancouver, etc.
You can also download the full text of the publication as a .pdf file and read its abstract online, when these are available in the work's metadata.
Journal articles on the topic "Human Motion Data Analysis"
Dong, Ran, Dongsheng Cai, and Soichiro Ikuno. "Motion Capture Data Analysis in the Instantaneous Frequency-Domain Using Hilbert-Huang Transform." Sensors 20, no. 22 (November 16, 2020): 6534. http://dx.doi.org/10.3390/s20226534.
Huang, Zhenzhen, Qiang Niu, and Shuo Xiao. "Human Behavior Recognition Based on Motion Data Analysis." International Journal of Pattern Recognition and Artificial Intelligence 34, no. 09 (December 2, 2019): 2056005. http://dx.doi.org/10.1142/s0218001420560054.
Li, Chun-Peng, Zhao-Qi Wang, and Shi-Hong Xia. "Motion Synthesis for Virtual Human Using Functional Data Analysis." Journal of Software 20, no. 6 (July 14, 2009): 1664–72. http://dx.doi.org/10.3724/sp.j.1001.2009.03332.
Barker, T. M., and P. McCombe. "Discriminant analysis of human kinematic data: Application to human lumbar spinal motion." Proceedings of the Institution of Mechanical Engineers, Part H: Journal of Engineering in Medicine 213, no. 6 (June 1999): 447–53. http://dx.doi.org/10.1243/0954411991535059.
Yu, Jian, Jun Yi Cao, and Cheng Guang Li. "Dynamic Modeling and Complexity Analysis of Human Lower Limb under Various Speeds." Applied Mechanics and Materials 868 (July 2017): 212–17. http://dx.doi.org/10.4028/www.scientific.net/amm.868.212.
Gao, Chunming, Changhui Li, Guanghua Tan, Songrui Guo, and Ke Xiao. "Adaptive Segmentation Approach for Human Action Data." International Journal of Pattern Recognition and Artificial Intelligence 28, no. 08 (December 2014): 1455012. http://dx.doi.org/10.1142/s021800141455012x.
Zeng, Ming, Zai Xin Yang, Hong Lin Ren, and Qing Hao Meng. "Multichannel Human Motion Similarity Analysis Based on Information Entropy and Dynamic Time Warping." Applied Mechanics and Materials 687-691 (November 2014): 847–51. http://dx.doi.org/10.4028/www.scientific.net/amm.687-691.847.
Li, Wanyi, Feifei Zhang, Qiang Chen, and Qian Zhang. "Projection Analysis Optimization for Human Transition Motion Estimation." International Journal of Digital Multimedia Broadcasting 2019 (June 2, 2019): 1–9. http://dx.doi.org/10.1155/2019/6816453.
Perera, Asanka G., Yee Wei Law, Ali Al-Naji, and Javaan Chahl. "Human motion analysis from UAV video." International Journal of Intelligent Unmanned Systems 6, no. 2 (April 16, 2018): 69–92. http://dx.doi.org/10.1108/ijius-10-2017-0012.
Xiang, Jian. "Human motion data analysis and retrieval based on 3D feature extraction." Journal of Computer Applications 28, no. 5 (May 20, 2008): 1344–46. http://dx.doi.org/10.3724/sp.j.1087.2008.01344.
Повний текст джерелаДисертації з теми "Human Motion Data Analysis"
Shen, Yuping. "Geometric Invariance in the Analysis of Human Motion in Video Data." Doctoral diss., University of Central Florida, 2009. http://digital.library.ucf.edu/cdm/ref/collection/ETD/id/3157.
Dai, Wei. "FPCA Based Human-like Trajectory Generating." Scholar Commons, 2013. http://scholarcommons.usf.edu/etd/4811.
Liu, Kai. "Detecting stochastic motifs in network and sequence data for human behavior analysis." HKBU Institutional Repository, 2014. https://repository.hkbu.edu.hk/etd_oa/60.
French, Michael Lee. "A modular microprocessor-based data acquisition system for computerized 3-D motion analysis." The Ohio State University, 1985. http://rave.ohiolink.edu/etdc/view?acc_num=osu148726013535863.
Jin, Ning. "Human motion analysis." Thesis, University of Surrey, 2007. http://epubs.surrey.ac.uk/804406/.
Tanco, L. Molina. "Human motion synthesis from captured data." Thesis, University of Surrey, 2002. http://epubs.surrey.ac.uk/844411/.
Chan, Chee Seng. "Fuzzy qualitative human motion analysis." Thesis, University of Portsmouth, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.494009.
Westfeld, Patrick. "Geometrische und stochastische Modelle zur Verarbeitung von 3D-Kameradaten am Beispiel menschlicher Bewegungsanalysen." Doctoral thesis, Saechsische Landesbibliothek- Staats- und Universitaetsbibliothek Dresden, 2012. http://nbn-resolving.de/urn:nbn:de:bsz:14-qucosa-88592.
The three-dimensional documentation of the form and location of any type of object using flexible photogrammetric methods and procedures plays a key role in a wide range of technical-industrial and scientific areas of application. Potential applications include measurement tasks in the automotive, machine building and ship building sectors, the compilation of complex 3D models in the fields of architecture, archaeology and monumental preservation and motion analyses in the fields of flow measurement technology, ballistics and medicine. In the case of close-range photogrammetry a variety of optical 3D measurement systems are used. Area sensor cameras arranged in single or multi-image configurations are used besides active triangulation procedures for surface measurement (e.g. using structured light or laser scanner systems). The use of modulation techniques enables 3D cameras based on photomix detectors or similar principles to simultaneously produce both a grey value image and a range image. Functioning as single image sensors, they deliver spatially resolved surface data at video rate without the need for stereoscopic image matching. In the case of 3D motion analyses in particular, this leads to considerable reductions in complexity and computing time. 3D cameras combine the practicality of a digital camera with the 3D data acquisition potential of conventional surface measurement systems. Despite the relatively low spatial resolution currently achievable, as a monosensory real-time depth image acquisition system they represent an interesting alternative in the field of 3D motion analysis. The use of 3D cameras as measuring instruments requires the modelling of deviations from the ideal projection model, and indeed the processing of the 3D camera data generated requires the targeted adaptation, development and further development of procedures in the fields of computer graphics and photogrammetry. This Ph.D. 
thesis therefore focuses on the development of methods of sensor calibration and 3D motion analysis in the context of investigations into inter-human motion behaviour. As a result of its intrinsic design and measurement principle, a 3D camera simultaneously provides amplitude and range data reconstructed from a measurement signal. The simultaneous integration of all data obtained using a 3D camera into an integrated approach is a logical consequence and represents the focus of current procedural development. On the one hand, the complementary characteristics of the observations made support each other due to the creation of a functional context for the measurement channels, which is expected to lead to increases in accuracy and reliability. On the other, the expansion of the stochastic model to include variance component estimation ensures that the heterogeneous information pool is fully exploited. The integrated bundle adjustment developed facilitates the definition of precise 3D camera geometry and the estimation of range-measurement-specific correction parameters required for the modelling of the linear, cyclical and latency defects of a distance measurement made using a 3D camera. The integrated calibration routine jointly adjusts appropriate dimensions across both information channels, and also automatically estimates optimum observation weights. The method is based on the same flexible principle used in self-calibration, does not require spatial object data and therefore foregoes the time-consuming determination of reference distances with superior accuracy. The accuracy analyses carried out confirm the correctness of the proposed functional contexts, but nevertheless exhibit weaknesses in the form of non-parameterized range-measurement-specific errors. This notwithstanding, the future expansion of the mathematical model developed is guaranteed due to its adaptivity and modular implementation. 
Following calibration, the accuracy of a new 3D point coordinate can be stated as 5 mm. In the case of depth imaging technology – which is influenced by a range of usually simultaneously occurring noise sources – this level of accuracy is very promising, especially in terms of the development of evaluation algorithms based on corrected 3D camera data. 2.5D Least Squares Tracking (LST) is an integrated spatial and temporal matching method developed within the framework of this Ph.D. thesis for the purpose of evaluating 3D camera image sequences. The algorithm is based on the least squares image matching method already established in photogrammetry, and maps small surface segments of consecutive 3D camera data sets on top of one another. The mapping rule has been adapted to the data structure of a 3D camera on the basis of a 2D affine transformation. The closed parameterization combines both grey values and range values in an integrated model. In addition to the affine parameters used to include translation and rotation effects, the scale and inclination parameters model perspective-related deviations caused by distance changes in the line of sight. In a pre-processing phase, the calibration routine developed is used to correct optical and distance-related measurement-specific errors in the input data, and measured slope distances are reduced to horizontal distances. 2.5D LST is an integrated approach, and therefore delivers fully three-dimensional displacement vectors. In addition, the accuracy and reliability data generated by error calculation can be used as decision criteria for integration into an application-specific processing chain. Process validation showed that the integration of complementary data leads to a more accurate, reliable solution to the correspondence problem, especially in the case of difficult contrast ratios within a channel. The accuracy of scale and inclination parameters directly linked to distance correction terms improved dramatically. 
In addition, the expansion of the geometric model led to significant benefits, in particular for the matching of natural, not entirely planar surface segments. The area-based object matching and object tracking method developed functions on the basis of 3D camera data gathered without object contact. It is therefore particularly suited to 3D motion analysis tasks in which the extra effort involved in multi-ocular experimental settings and the necessity of object signalling using target marks are to be avoided. The potential of the 3D camera matching approach has been demonstrated in two application scenarios in the field of research into human behaviour. As in the case of the use of 2.5D LST to mark and then classify hand gestures accompanying verbal communication, the implementation of 2.5D LST in the proposed procedures for the determination of interpersonal distance and body orientation within the framework of pedagogical research into conflict regulation between pairs of child-age friends facilitates the automatic, effective, objective and high-resolution (from both a temporal and spatial perspective) acquisition and evaluation of data with relevance to behaviour. This Ph.D. thesis proposes the use of a novel 3D range imaging camera to gather data on human behaviour, and presents both a calibration tool developed for data processing purposes and a method for the contact-free determination of dense 3D motion vector fields. It therefore makes a contribution to current efforts in the field of the automated videographic documentation of bodily motion within the framework of dyadic interaction, and shows that photogrammetric methods can also deliver valuable results within the framework of motion evaluation tasks in the as-yet relatively untapped field of behavioural research.
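The least squares image matching that 2.5D LST builds on can be illustrated with a minimal sketch. The snippet below is a toy version written for this summary, not code from the thesis: it estimates only a translation plus a radiometric offset between two patches from the linearized brightness-constancy equations, whereas the thesis uses a fuller 2D affine parameterization that combines grey values and range values; the function name `estimate_shift` and the synthetic patch are illustrative assumptions.

```python
import numpy as np

def estimate_shift(f, g):
    """One Gauss-Newton step of least-squares patch matching: solve
    g - f ≈ f_y*dy + f_x*dx + c for the shift (dy, dx) and a radiometric
    offset c by linear least squares over all pixels."""
    gy, gx = np.gradient(f.astype(float))            # intensity gradients of f
    A = np.stack([gy.ravel(), gx.ravel(), np.ones(f.size)], axis=1)
    b = (g.astype(float) - f.astype(float)).ravel()  # per-pixel differences
    (dy, dx, c), *_ = np.linalg.lstsq(A, b, rcond=None)
    return dy, dx

# Synthetic patch f(x, y) = x*y: its gradients vary across the patch, so the
# linearized system is full rank and a single step recovers the shift.
y, x = np.mgrid[0:16, 0:16]
f = (x * y).astype(float)
g = (x + 0.4) * (y - 0.25)        # g(x, y) = f(x + 0.4, y - 0.25)
dy, dx = estimate_shift(f, g)
```

For this synthetic patch the recovered shift is (dy, dx) ≈ (−0.25, 0.4); real 3D camera patches require iteration with resampling and the additional affine, scale, and inclination parameters described above.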
López Méndez, Adolfo. "Articulated models for human motion analysis." Doctoral thesis, Universitat Politècnica de Catalunya, 2012. http://hdl.handle.net/10803/112124.
Human motion analysis is an area of computer vision that has attracted the interest of the scientific community over recent decades. Motion analysis covers topics such as human body tracking, the recognition of actions and behaviour patterns, and the segmentation of human motion. All these fields are challenging for different reasons, but especially because of the capture perspective of the scenes to be analysed and the absence of precise semantics associated with actions and human motion. The computer vision community has approached human motion analysis from several perspectives. The earliest approaches are based on articulated models of the human body, which represent the body as a three-dimensional skeletal structure. However, owing to the difficulty and computational cost of estimating this articulated structure from video, research has in recent years shifted towards human motion analysis based on low-level features. Despite achieving impressive results in several tasks, low-level features are usually conditioned by appearance and viewpoint, which makes them difficult to apply across different scenarios. Nowadays, the increase in computing power, the massive availability of data and the emergence of low-cost depth cameras have created a scenario in which human motion analysis through articulated models can be reconsidered. Analysing and understanding human motion through three-dimensional information remains a crucial approach to obtaining better dynamic models of human body motion. Articulated models of the human body, which offer a compact, viewpoint-invariant representation, are therefore a tool for strengthening motion analysis. This thesis presents several approaches to motion analysis. 
In particular, it addresses the problems of pose estimation, action recognition and temporal clustering of human motion. Articulated models are the leitmotiv in all the approaches presented. First, pose estimation is formulated as a hierarchical analysis-by-synthesis method in which the models are used to generate hypotheses that are matched against video. Using the same articulated representation of the human body, a formulation of human motion for action recognition is proposed. Our hypothesis is that actions form a set of underlying dynamical systems that generate observations in the form of time series, and that these time series are observed through the articulated model. This hypothesis is used to develop recognition methods based on time-delay embeddings, a time-series analysis tool that makes no assumptions about the form of the underlying dynamical system. Finally, a method is proposed for segmenting human motion sequences into different behaviours or actions without prior knowledge of the number of actions in the sequence. Our approach uses articulated models of the human body to learn a distance metric that aims to capture the implicit semantics of the annotations found in other databases of motion sequences. In order to measure our contributions objectively, the proposed methods are evaluated on public databases.
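The time-delay embedding mentioned in this abstract is a generic construction that can be sketched in a few lines. The snippet below is an illustrative assumption of how a single joint-angle series from an articulated model might be embedded; the function name and the synthetic sine trajectory are not from the thesis.

```python
import numpy as np

def time_delay_embedding(series, dim, tau):
    """Stack lagged copies of a scalar time series into delay vectors
    (x_t, x_{t+tau}, ..., x_{t+(dim-1)*tau}); no model of the underlying
    dynamical system is assumed."""
    series = np.asarray(series, dtype=float)
    n = len(series) - (dim - 1) * tau        # number of complete delay vectors
    if n <= 0:
        raise ValueError("series too short for this (dim, tau)")
    return np.stack([series[i * tau : i * tau + n] for i in range(dim)], axis=1)

# A synthetic joint-angle trajectory embedded with dimension 3 and lag 2;
# each row of X is (x_t, x_{t+2}, x_{t+4}).
angles = np.sin(np.linspace(0, 4 * np.pi, 20))
X = time_delay_embedding(angles, dim=3, tau=2)   # shape (16, 3)
```

Recognition methods can then compare actions by the geometry of these embedded trajectories, for example via distances between the resulting point clouds in delay space.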
Holmberg, Björn. "Towards markerless analysis of human motion /." Uppsala : Department of Information Technology, Uppsala University, 2005. http://www.it.uu.se/research/publications/lic/2005-011/.
Повний текст джерелаКниги з теми "Human Motion Data Analysis"
Haeberli, André, ed. Human protein data. Weinheim: Wiley-VCH, 1998.
Basham, Randall E., ed. Data analysis with spreadsheets. Boston: Pearson/Allyn & Bacon, 2006.
Matthews, M. H. Geographical data: Sources, presentation and analysis. Oxford: Oxford University Press, 1989.
Gertman, David I. Human reliability and safety analysis data handbook. New York: Wiley, 1994.
Introduction to Biomechanics for Human Motion Analysis. 2nd ed. Waterloo, Ontario, Canada: Waterloo Biomechanics, 2004.
Principles of biomechanics & motion analysis. Philadelphia: Lippincott Williams & Wilkins, 2006.
Gevers, Theo, and SpringerLink (Online service), eds. Computer Analysis of Human Behavior. London: Springer-Verlag London Limited, 2011.
Ward, Thomas E. Kinetic data extraction and analysis system for human gait. Dublin: University College Dublin, 1996.
Tice, Raymond R. User's guide: Micronucleus assay data management and analysis system. Las Vegas, NV: U.S. Environmental Protection Agency, Environmental Monitoring Systems Laboratory, 1990.
Знайти повний текст джерелаЧастини книг з теми "Human Motion Data Analysis"
Müller, Meinard, and Tido Röder. "A Relational Approach to Content-based Analysis of Motion Capture Data." In Human Motion, 477–506. Dordrecht: Springer Netherlands, 2008. http://dx.doi.org/10.1007/978-1-4020-6693-1_20.
Ye, Mao, Qing Zhang, Liang Wang, Jiejie Zhu, Ruigang Yang, and Juergen Gall. "A Survey on Human Motion Analysis from Depth Data." In Lecture Notes in Computer Science, 149–87. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-44964-2_8.
Cilla, Rodrigo, Miguel A. Patricio, Antonio Berlanga, and José M. Molina. "A Data Fusion Perspective on Human Motion Analysis Including Multiple Camera Applications." In Natural and Artificial Computation in Engineering and Medical Applications, 149–58. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013. http://dx.doi.org/10.1007/978-3-642-38622-0_16.
Josiński, Henryk, Agnieszka Michalczuk, Romualda Mucha, Adam Świtoński, Agnieszka Szczęsna, and Konrad Wojciechowski. "Analysis of Human Motion Data Using Recurrence Plots and Recurrence Quantification Measures." In Intelligent Information and Database Systems, 397–406. Berlin, Heidelberg: Springer Berlin Heidelberg, 2016. http://dx.doi.org/10.1007/978-3-662-49390-8_39.
Benter, Martin, and Peter Kuhlang. "Derivation of MTM-HWD® Analyses from Digital Human Motion Data." In Proceedings of the 21st Congress of the International Ergonomics Association (IEA 2021), 363–70. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-74608-7_46.
Xu, Jingjing, Brendan M. Duffy, and Vincent G. Duffy. "Data Mining in Systematic Reviews: A Bibliometric Analysis of Game-Based Learning and Distance Learning." In Digital Human Modeling and Applications in Health, Safety, Ergonomics and Risk Management. Human Body, Motion and Behavior, 343–54. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-77817-0_24.
Del Bimbo, Alberto, and Simone Santini. "Motion Analysis." In Human and Machine Vision, 199–221. Boston, MA: Springer US, 1994. http://dx.doi.org/10.1007/978-1-4899-1004-2_14.
De Mazière, P. A., and M. M. Van Hulle. "Towards a Spatio-Temporal Analysis Tool for fMRI Data: An Application to Depth-from-Motion Processing in Humans." In Perspectives in Neural Computing, 33–42. London: Springer London, 2001. http://dx.doi.org/10.1007/978-1-4471-0281-6_4.
Elgammal, Ahmed, and Chan-Su Lee. "The Role of Manifold Learning in Human Motion Analysis." In Human Motion, 25–56. Dordrecht: Springer Netherlands, 2008. http://dx.doi.org/10.1007/978-1-4020-6693-1_2.
Sminchisescu, Cristian. "3D Human Motion Analysis in Monocular Video: Techniques and Challenges." In Human Motion, 185–211. Dordrecht: Springer Netherlands, 2008. http://dx.doi.org/10.1007/978-1-4020-6693-1_8.
Повний текст джерелаТези доповідей конференцій з теми "Human Motion Data Analysis"
Kato, Kojiro, Kris M. Kitani, and Takuya Nojima. "Ego-motion analysis using average image data intensity." In the 2nd Augmented Human International Conference. New York, New York, USA: ACM Press, 2011. http://dx.doi.org/10.1145/1959826.1959835.
Josiński, Henryk, Agnieszka Michalczuk, Adam Świtoński, Agnieszka Szczęsna, and Konrad Wojciechowski. "Recurrence plots and recurrence quantification analysis of human motion data." In International Conference of Numerical Analysis and Applied Mathematics 2015 (ICNAAM 2015). Author(s), 2016. http://dx.doi.org/10.1063/1.4951961.
Schooley, Patrick, and Syed Ali Hamza. "Radar human motion classification using multi-antenna system." In Big Data III: Learning, Analytics, and Applications, edited by Fauzia Ahmad, Panos P. Markopoulos, and Bing Ouyang. SPIE, 2021. http://dx.doi.org/10.1117/12.2588700.
Hu, Ping, Qi Sun, Xiangxu Meng, and Jingliang Peng. "Data-driven human motion synthesis based on angular momentum analysis." In 2013 IEEE International Symposium on Circuits and Systems (ISCAS). IEEE, 2013. http://dx.doi.org/10.1109/iscas.2013.6572000.
Kadu, Harshad, Maychen Kuo, and C. C. Jay Kuo. "Human motion classification and management based on mocap data analysis." In the 2011 joint ACM workshop. New York, New York, USA: ACM Press, 2011. http://dx.doi.org/10.1145/2072572.2072594.
Chen, Enqing, and Longfei Zhang. "Example-based analysis and alignment system for human motion data." In Tenth International Conference on Digital Image Processing (ICDIP 2018), edited by Xudong Jiang and Jenq-Neng Hwang. SPIE, 2018. http://dx.doi.org/10.1117/12.2502886.
"A Comprehensive Analysis of Human Motion Capture Data for Action Recognition." In International Conference on Computer Vision Theory and Applications. SciTePress - Science and Technology Publications, 2012. http://dx.doi.org/10.5220/0003868806470652.
Prakash, Chandra, Uddeshya Mishra, Manas Jain, Rajesh Kumar, and Namita Mittal. "Automated Kinematic Analysis Using Holistic Based Human Gait Motion for Biomedical Applications." In 2018 8th International Conference on Cloud Computing, Data Science & Engineering (Confluence). IEEE, 2018. http://dx.doi.org/10.1109/confluence.2018.8442947.
Hu, Ningning, and Aihui Wang. "Kinematics and dynamics analysis of lower limbs based on human motion data." In 2020 Chinese Automation Congress (CAC). IEEE, 2020. http://dx.doi.org/10.1109/cac51589.2020.9327180.
Vox, Jan Paul, and Frank Wallhoff. "A Framework for the Analysis of Biomechanical Loading Using Human Motion Tracking." In 2019 IEEE 20th International Conference on Information Reuse and Integration for Data Science (IRI). IEEE, 2019. http://dx.doi.org/10.1109/iri.2019.00020.
Повний текст джерелаЗвіти організацій з теми "Human Motion Data Analysis"
Rooks, Drew, and Trelanah McCalla. Human Dipping and Inserting Manipulation Motion Analysis. RPAL, December 2018. http://dx.doi.org/10.32555/2018.ir.001.
Gamoneda, Astrid, and Subhrajyoti Pradhan. Human Beating, Dipping, and Mixing Manipulation Motion Analysis. RPAL, December 2018. http://dx.doi.org/10.32555/2018.ir.003.
Matsumoto, David, Hyisung C. Hwang, Adam M. Fullenkamp, and C. M. Laurent. Human Deception Detection from Whole Body Motion Analysis. Fort Belvoir, VA: Defense Technical Information Center, December 2015. http://dx.doi.org/10.21236/ada626755.
Ligocki, Aaron, and Nicholas Eales. Human Beating, Brushing, Screwing, Inserting, and Pouring Motion Analysis. RPAL, December 2018. http://dx.doi.org/10.32555/2018.ir.002.
Briggs, Michael J., Stephen T. Maynord, Charles R. Nickles, and Terry N. Waller. Charleston Harbor Ship Motion Data Collection and Squat Analysis. Fort Belvoir, VA: Defense Technical Information Center, March 2004. http://dx.doi.org/10.21236/ada457976.
Cassidy, J. F. On the analysis of "weak" strong motion data, southwestern British Columbia. Natural Resources Canada/ESS/Scientific and Technical Publishing Services, 2000. http://dx.doi.org/10.4095/211651.
Beam, Craig A., Emily F. Conant, Harold L. Kundel, Ji-Hyun Lee, Patricia A. Romily, and Edward A. Sickles. Time-Series Analysis of Human Interpretation Data in Mammography. Fort Belvoir, VA: Defense Technical Information Center, January 2005. http://dx.doi.org/10.21236/ada434583.
Zhang, Zheqing, Yingyao Wang, Xiaoguang Yang, Yiyong Chen, Hong Zhang, Xuebin Xu, Jin Zhou, et al. Human milk lipid profiles around the world: a pooled data analysis. INPLASY - International Platform of Registered Systematic Review and Meta-analysis Protocols, April 2022. http://dx.doi.org/10.37766/inplasy2022.4.0079.
Rodgers, A., H. Tkalcic, and D. McCallen. Understanding Ground Motion in Las Vegas: Insights from Data Analysis and Two-Dimensional Modeling. Office of Scientific and Technical Information (OSTI), February 2004. http://dx.doi.org/10.2172/15013918.
Rhodes, E. A., G. S. Stanford, and J. P. Regis. Fuel motion in TREAT tests M5F1, M5F2, M6 and M7: preliminary analysis of hodoscope data. Office of Scientific and Technical Information (OSTI), July 1989. http://dx.doi.org/10.2172/714612.