Academic literature on the topic 'Financial engineering Data processing'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Financial engineering Data processing.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Journal articles on the topic "Financial engineering Data processing"

1

Yang, Ning. "Financial Big Data Management and Control and Artificial Intelligence Analysis Method Based on Data Mining Technology." Wireless Communications and Mobile Computing 2022 (May 29, 2022): 1–13. http://dx.doi.org/10.1155/2022/7596094.

Full text
Abstract:
Driven by capital and Internet information technology (IT), the operating scale and capital scale of modern industrial and commercial enterprises and various organizations have increased exponentially. At present, the manual-based financial work model can no longer adapt to the pace of change in the modern business environment and the business rhythm of enterprises. All kinds of enterprises and organizations, especially large enterprises, urgently need to improve the operational efficiency of their financial systems. Enhancing the integrity, timeliness, and synergy of financial information improves the comprehensiveness of financial analysis and the ability to analyze complex problems, helps enterprises cope with such rapid changes, strengthens their financial management capabilities, provides more valuable decision-making guidance for business operations, and reduces business risks. In recent years, the vigorous development of artificial intelligence technology has provided a feasible solution to meet these urgent needs. Combining data mining, deep learning, image recognition, natural language processing, knowledge graphs, human-computer interaction, intelligent decision-making, and other artificial intelligence technologies with IT to transform financial processes can significantly reduce the processing time of repetitive basic financial processes, reduce dependence on manual accounting, and improve the work efficiency of the financial department. Through the autonomous analysis and decision-making of artificial intelligence, financial management becomes intelligent, and more accurate and effective financial decision-making support is provided for enterprises. This paper studies the company's intelligent financial reengineering process, so as to provide a reference for other enterprises upgrading similar financial systems.
The results of the analysis showed that, at the level α = 0.05, there was a significant difference in means between the two populations. The closer the r value is to -1 or 1, the more obvious the linear relationship between the x and y variables. This paper offers decision-making suggestions and risk-control early warnings to the group decision-making body, evaluates the financial impact of the group's decisions, and opens the road to financial intelligence.
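The abstract refers to the correlation coefficient r. As a quick illustration of how that statistic is computed (the data below is invented, not from the paper), here is a pure-Python Pearson r:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / math.sqrt(var_x * var_y)

# A perfectly linear relationship yields r = 1.0
print(pearson_r([1, 2, 3, 4], [2, 4, 6, 8]))  # → 1.0
```

Values of r near -1 or 1 indicate a strong linear relationship; values near 0 indicate little linear association.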
APA, Harvard, Vancouver, ISO, and other styles
2

Rodríguez-García, Miguel Ángel, Alejandro Rodríguez-González, Ricardo Colomo-Palacios, Rafael Valencia-García, Juan Miguel Gómez-Berbís, and Francisco García-Sánchez. "Using Data Crawlers and Semantic Web to Build Financial XBRL Data Generators: The SONAR Extension Approach." Scientific World Journal 2014 (2014): 1–18. http://dx.doi.org/10.1155/2014/506740.

Full text
Abstract:
Precise, reliable, and real-time financial information is critical for value-added financial services after the economic turmoil from which markets are still struggling to recover. Since the Web has become the most significant data source, intelligent crawlers based on Semantic Technologies have become trailblazers in the search for knowledge, combining natural language processing and ontology engineering techniques. In this paper, we present the SONAR extension approach, which leverages the potential of knowledge representation by extracting, managing, and turning scarce and disperse financial information into well-classified, structured, and widely used XBRL format-oriented knowledge, strongly supported by a proof-of-concept implementation and a thorough evaluation of the benefits of the approach.
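The paper's output format is XBRL, an XML dialect for financial facts. As a minimal, hypothetical sketch of generating an XBRL-flavoured fact with the standard library (the element and attribute names here are illustrative, not the real XBRL schema):

```python
import xml.etree.ElementTree as ET

def build_fact(concept, value, unit, context):
    """Build a minimal XBRL-style fact element (names are illustrative only)."""
    root = ET.Element("xbrl")
    fact = ET.SubElement(root, concept,
                         {"contextRef": context, "unitRef": unit, "decimals": "2"})
    fact.text = str(value)
    return ET.tostring(root, encoding="unicode")

doc = build_fact("Assets", 1250000.00, "USD", "FY2014")
print(doc)
```

A real generator would validate against an XBRL taxonomy; this only shows the structural idea of a tagged fact with context and unit references.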
APA, Harvard, Vancouver, ISO, and other styles
3

Pollak, Ilya. "Statistics and Data Analysis for Financial Engineering (Ruppert, D.; 2011) [Book Reviews]." IEEE Signal Processing Magazine 28, no. 5 (September 2011): 146–47. http://dx.doi.org/10.1109/msp.2011.941994.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Cont, Rama. "Statistical Modeling of High-Frequency Financial Data." IEEE Signal Processing Magazine 28, no. 5 (September 2011): 16–25. http://dx.doi.org/10.1109/msp.2011.941548.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

G, Rohith Urs, Nithin D, Akul G. Devali, and Rakshit Vastrad. "Analysis of Text Data For Stock Prediction." International Journal for Research in Applied Science and Engineering Technology 10, no. 5 (May 31, 2022): 3391–95. http://dx.doi.org/10.22214/ijraset.2022.43132.

Full text
Abstract:
Accounting for price fluctuations and understanding people's emotions can help to improve stock price forecasting. Only a few models can decipher financial jargon and have stock price change datasets that have been labelled. In this project, we used text mining techniques to extract high-quality data from news and tweets published by legitimate businesses on the internet, allowing us to analyse, decide, and update our database for future use. In this paper, we propose an information gathering and processing framework that combines a natural language processing tool with our algorithms. We use natural language processing and machine learning techniques to make predictions. The result demonstrates the algorithm's ability to foresee favorable outcomes.
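A common starting point for text mining of headlines, like the work described above, is lexicon-based scoring. The sketch below is a toy illustration (the lexicons and headlines are invented, and a real system would use a trained model or a financial sentiment lexicon):

```python
import re

# Toy lexicons; a real system would use a financial sentiment lexicon.
POSITIVE = {"gain", "growth", "profit", "beat", "surge", "record"}
NEGATIVE = {"loss", "drop", "decline", "miss", "plunge", "lawsuit"}

def headline_score(headline):
    """Score a headline: positive lexicon hits minus negative hits."""
    tokens = re.findall(r"[a-z]+", headline.lower())
    return sum(t in POSITIVE for t in tokens) - sum(t in NEGATIVE for t in tokens)

print(headline_score("Company X posts record profit, shares surge"))  # → 3
print(headline_score("Company Y shares plunge after earnings miss"))  # → -2
```

Scores like these can then be fed, alongside price history, into a downstream prediction model.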
APA, Harvard, Vancouver, ISO, and other styles
6

Wang, Qianyi. "Research on University Financial Accounting Management System Based on Big Data and Blockchain Data Fusion." Wireless Communications and Mobile Computing 2022 (September 16, 2022): 1–10. http://dx.doi.org/10.1155/2022/4118075.

Full text
Abstract:
The majority of traditional finance management in colleges and universities is manual. This backward management mode brings a lot of inconvenience to financial processing. An essential concern in the daily financial administration of colleges and universities is how to efficiently gather, handle, and evaluate this important financial information and apply this beneficial knowledge to the daily management of colleges and universities. College and university finance administration has become increasingly complex due to the rapid growth of these institutions. The standard financial accounting management system is far from adequate for the daily needs of colleges and universities as their size grows. As a result, a new financial accounting management system for universities is being developed based on the combination of big data and blockchain data. The embedded processor, DDR2 memory chip, network interface, and USB interface are all designed by the hardware section. The software element examines the needs of university financial accounting management before designing a model based on big data and blockchain integration for university financial accounting management. Finally, the financial accounting management function module and database are designed and tested. The experimental results show that the designed financial accounting management system can effectively carry out financial management, has good performance, and has certain application value.
APA, Harvard, Vancouver, ISO, and other styles
7

Drakakis, Konstantinos. "Application of signal processing to the analysis of financial data [In the Spotlight]." IEEE Signal Processing Magazine 26, no. 5 (September 2009): 160–58. http://dx.doi.org/10.1109/msp.2009.933377.

Full text
APA, Harvard, Vancouver, ISO, and other styles
8

Zhu, Jian Peng, Pei Pei Wang, Ding Wang, and Ying Wang. "The Research of Technical Architecture of Semi-Structured XBRL Data Based on Hadoop Cluster and Structured Data Exchange System." Applied Mechanics and Materials 678 (October 2014): 130–34. http://dx.doi.org/10.4028/www.scientific.net/amm.678.130.

Full text
Abstract:
With the fast propulsion of the informatization, a large base of semi-structured XBRL data and structured financial data have been accumulated in various business activities, including production, commercial management, transaction, government supervising and administrating, which is ever-increasing with a relatively large-scale. These huge amounts of business data surely will be distributed on the cloud in the future. Therefore, the purpose of this article is to solve this common challenging problem of how to share the huge number of structured financial data and semi-structured XBRL data. Combining the related cloud computing technology with the XBRL technology, a technical architecture and implementation of semi-structured XBRL data and structured data exchange system based on Hadoop cluster is proposed in this study. The technical architecture has certain engineering value and theoretic significance in the area of big data processing and sharing.
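The Hadoop architecture described above rests on the MapReduce pattern. A single-process sketch of the map, shuffle, and reduce phases (pure Python, no Hadoop involved; the records are invented):

```python
from collections import defaultdict

def map_phase(records):
    """Map: emit (tag, 1) for every tag found in each record."""
    for record in records:
        for tag in record.split():
            yield tag, 1

def shuffle_phase(pairs):
    """Shuffle: group values by key, as Hadoop does between map and reduce."""
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    """Reduce: sum the grouped counts per key."""
    return {key: sum(values) for key, values in groups.items()}

records = ["Assets Liabilities", "Assets Equity", "Assets Liabilities"]
counts = reduce_phase(shuffle_phase(map_phase(records)))
print(counts)  # → {'Assets': 3, 'Liabilities': 2, 'Equity': 1}
```

On a real cluster the map and reduce phases run in parallel across nodes, which is what makes the pattern suitable for large XBRL datasets.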
APA, Harvard, Vancouver, ISO, and other styles
9

Al-qerem, Ahmad, Ghazi Al-Naymat, Mays Alhasan, and Mutaz Al-Debei. "Default Prediction Model: The Significant Role of Data Engineering in the Quality of Outcomes." International Arab Journal of Information Technology 17, no. 4A (July 31, 2020): 635–44. http://dx.doi.org/10.34028/iajit/17/4a/8.

Full text
Abstract:
For financial institutions and the banking industry, it is crucial to have predictive models for their core financial activities, especially those that play major roles in risk management. Predicting loan default is one of the critical issues that banks and financial institutions focus on, as huge revenue loss can be prevented by predicting a customer's ability not only to pay back, but also to do so on time. Customer loan default prediction is the task of proactively identifying customers who are most likely to stop paying back their loans. This is usually done by dynamically analyzing customers' relevant information and behaviors, so that the bank or financial institution can estimate the borrower's risk. Many different machine learning classification models and algorithms have been used to predict customers' ability to pay back loans. In this paper, three classification methods (Naïve Bayes, Decision Tree, and Random Forest) are used for prediction; comprehensive pre-processing techniques are applied to the dataset to obtain better data by fixing some of the main data issues, such as missing values and imbalanced data; and three feature-selection algorithms are used to enhance accuracy and performance. Results of the competing models varied after applying the data preprocessing techniques and feature selections. The results were compared using the F1 accuracy measure. The best model achieved an improvement of about 40%, whilst the least performing model achieved an improvement of only 3%. This implies the significance and importance of data engineering (e.g., data preprocessing techniques and feature selection) in machine learning exercises.
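The models above are compared with the F1 measure. As a quick illustration of that metric (the labels below are invented, not the paper's data):

```python
def f1_score(y_true, y_pred, positive=1):
    """F1 = harmonic mean of precision and recall for the positive class."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    if tp == 0:
        return 0.0
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# 1 = default, 0 = repaid; a classifier that misses half of the defaults:
y_true = [1, 1, 1, 1, 0, 0, 0, 0]
y_pred = [1, 1, 0, 0, 0, 0, 0, 0]
print(f1_score(y_true, y_pred))  # → 0.6666666666666666
```

F1 is preferred over plain accuracy on imbalanced default data: a model that predicts "repaid" for everyone scores high accuracy but zero F1.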
APA, Harvard, Vancouver, ISO, and other styles
10

Zhang, Zhongmin. "Intelligent Optimization of the Financial Sharing Path Based on Accounting Big Data." Mathematical Problems in Engineering 2022 (October 11, 2022): 1–8. http://dx.doi.org/10.1155/2022/1310994.

Full text
Abstract:
In order to solve the problems of inconsistent accounting and untimely accounting information, this paper proposes a research method of intelligent optimization of financial sharing path based on big data of accounting. This paper analyzes the current working situation of the financial sharing service center of large- and medium-sized enterprises and expounds the reasons for the blockchain. The author embeds the blockchain technology in the financial sharing service center, with a view to optimizing the design of the current FSSC’s functions in terms of the scope of daily business work, the effectiveness of accounting information processing, and the utilization of financial analysis decisions, and finally analyzes the corporate effect of applying the financial sharing architecture based on blockchain technology. The results show that compared with the financial sharing score under the traditional mode, the financial sharing score under the blockchain technology is higher, and the former is 69.675 points, and the latter is 80.6340 points. From this, we can conclude that financial sharing under blockchain technology has significant advantages. This framework realizes the effective allocation of enterprise information resources through finance to achieve real industry finance integration.
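The paper embeds blockchain technology in a financial shared service center. A minimal hash-chain sketch (stdlib only, with invented record names) shows the tamper-evidence property such designs rely on:

```python
import hashlib
import json

def make_block(record, prev_hash):
    """Create a block whose hash covers its record and the previous hash."""
    body = json.dumps({"record": record, "prev": prev_hash}, sort_keys=True)
    return {"record": record, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

def chain_valid(chain):
    """Recompute every hash and check each block links to its predecessor."""
    prev = "0" * 64
    for block in chain:
        body = json.dumps({"record": block["record"], "prev": block["prev"]},
                          sort_keys=True)
        if block["prev"] != prev or \
           block["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
        prev = block["hash"]
    return True

chain = []
prev = "0" * 64
for entry in ["invoice:1001", "payment:1001", "audit:Q3"]:
    block = make_block(entry, prev)
    chain.append(block)
    prev = block["hash"]

print(chain_valid(chain))            # → True
chain[1]["record"] = "payment:9999"  # tamper with one accounting entry
print(chain_valid(chain))            # → False
```

Altering any entry breaks the hash of its block and therefore the whole chain, which is the property that makes blockchain attractive for accounting integrity.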
APA, Harvard, Vancouver, ISO, and other styles

Dissertations / Theses on the topic "Financial engineering Data processing"

1

Fernandez, Noemi. "Statistical information processing for data classification." FIU Digital Commons, 1996. http://digitalcommons.fiu.edu/etd/3297.

Full text
Abstract:
This thesis introduces new algorithms for analysis and classification of multivariate data. Statistical approaches are devised for the objectives of data clustering, data classification and object recognition. An initial investigation begins with the application of fundamental pattern recognition principles. Where such fundamental principles meet their limitations, statistical and neural algorithms are integrated to augment the overall approach for an enhanced solution. This thesis provides a new dimension to the problem of classification of data as a result of the following developments: (1) application of algorithms for object classification and recognition; (2) integration of a neural network algorithm which determines the decision functions associated with the task of classification; (3) determination and use of the eigensystem using newly developed methods with the objectives of achieving optimized data clustering and data classification, and dynamic monitoring of time-varying data; and (4) use of the principal component transform to exploit the eigensystem in order to perform the important tasks of orientation-independent object recognition, and dimensionality reduction of the data such as to optimize the processing time without compromising accuracy in the analysis of this data.
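The thesis leans on the eigensystem of the data for clustering and dimensionality reduction. As a minimal illustration of one way to extract a dominant eigenpair, here is power iteration on a small, invented covariance matrix (not the thesis' own method):

```python
import math

def power_iteration(matrix, iterations=100):
    """Dominant eigenvalue/eigenvector of a square matrix via power iteration."""
    n = len(matrix)
    v = [1.0] * n
    for _ in range(iterations):
        w = [sum(matrix[i][j] * v[j] for j in range(n)) for i in range(n)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    eigenvalue = sum(v[i] * sum(matrix[i][j] * v[j] for j in range(n))
                     for i in range(n))
    return eigenvalue, v

# Symmetric 2x2 "covariance" matrix with eigenvalues 3 and 1:
cov = [[2.0, 1.0], [1.0, 2.0]]
value, vector = power_iteration(cov)
print(round(value, 6))  # → 3.0
```

Projecting data onto the leading eigenvectors found this way is the core of the principal component transform mentioned in the abstract.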
APA, Harvard, Vancouver, ISO, and other styles
2

Chiu, Cheng-Jung. "Data processing in nanoscale profilometry." Thesis, Massachusetts Institute of Technology, 1995. http://hdl.handle.net/1721.1/36677.

Full text
Abstract:
Thesis (M.S.)--Massachusetts Institute of Technology, Dept. of Mechanical Engineering, 1995.
Includes bibliographical references (p. 176-177).
New developments on the nanoscale are taking place rapidly in many fields. Instrumentation used to measure and understand the geometry and properties of small-scale structures is therefore essential. One of the most promising devices to take measurement science into the nanoscale is the scanning probe microscope. A prototype of a nanoscale profilometer based on the scanning probe microscope has been built in the Laboratory for Manufacturing and Productivity at MIT. A sample is placed on a precision flip stage, and different sides of the sample are scanned under the SPM to acquire their separate surface topographies. To reconstruct the original three-dimensional profile, many techniques, such as digital filtering, edge identification, and image matching, are investigated and implemented in the computer programs that post-process the data, with greater emphasis placed on the nanoscale application. The important programming issues are addressed as well. Finally, this system's error sources are discussed and analyzed.
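Digital filtering of scan data, as mentioned above, can be as simple as a moving-average smoother over a 1-D height profile. A minimal sketch (the profile values are invented):

```python
def moving_average(profile, window=3):
    """Smooth a 1-D height profile with a centered moving average.

    Edges use a shrunken window so the output length equals the input length.
    """
    half = window // 2
    smoothed = []
    for i in range(len(profile)):
        lo = max(0, i - half)
        hi = min(len(profile), i + half + 1)
        smoothed.append(sum(profile[lo:hi]) / (hi - lo))
    return smoothed

# A noise spike at index 2 is spread out and attenuated:
print(moving_average([0.0, 0.0, 9.0, 0.0, 0.0]))  # → [0.0, 3.0, 3.0, 3.0, 0.0]
```

Real profilometry filters are more selective (e.g. Gaussian or median filters) so that genuine surface steps are not blurred away along with the noise.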
by Cheng-Jung Chiu.
M.S.
APA, Harvard, Vancouver, ISO, and other styles
3

Koriziz, Hariton. "Signal processing methods for the modelling and prediction of financial data." Thesis, Imperial College London, 2008. http://ethos.bl.uk/OrderDetails.do?uin=uk.bl.ethos.504921.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Laurila, M. (Mikko). "Big data in Finnish financial services." Bachelor's thesis, University of Oulu, 2017. http://urn.fi/URN:NBN:fi:oulu-201711243156.

Full text
Abstract:
This thesis aims to explore the concept of big data and create understanding of big data maturity in the Finnish financial services industry. The research questions of this thesis are "What kind of big data solutions are being implemented in the Finnish financial services sector?" and "Which factors impede faster implementation of big data solutions in the Finnish financial services sector?". Big data, being a concept usually linked with huge data sets and economies of scale, is an interesting topic for research in Finland, a market in which the size of data sets is somewhat limited by the size of the market. This thesis includes a literature review on the concept of big data and earlier literature on the Finnish big data landscape, and a qualitative content analysis of available public information on big data maturity in the context of the Finnish financial services market. The results of this research show that in Finland big data is utilized to some extent, at least by the larger organizations. Financial-services-specific big data solutions include the automation of application handling in insurance. The clearest factors slowing the development of big data maturity in the industry are the lack of a competent workforce and new regulatory compliance projects consuming development resources. These results can be used as an overview of the state of big data maturity in the Finnish financial services industry. This study also lays a solid foundation for further research in the form of conducting interviews, which would provide more in-depth data.
APA, Harvard, Vancouver, ISO, and other styles
5

Siu, Ka Wai. "Numerical algorithms for data analysis with imaging and financial applications." HKBU Institutional Repository, 2018. https://repository.hkbu.edu.hk/etd_oa/550.

Full text
Abstract:
In this thesis, we study modelling and numerical algorithms for data analysis, with applications to image processing and financial forecasting. The thesis is composed of two parts, namely tensor regression and data assimilation methods for image restoration.
We start by investigating the tensor regression problem in Chapter 2. It is a generalization of classical regression that adopts and analyzes much more information by using multi-dimensional arrays. Since the regression problem admits multiple solutions, we propose a regularized tensor regression model. By imposing a low-rank property on the solution and considering the structure of the tensor product, we develop an algorithm suitable for scalable implementations. The regularization method is used to select useful solutions depending on the application. The proposed model is solved by the alternating minimization method, and we prove the convergence of the objective function values and iterates by the majorization-minimization (MM) technique. We study different factors that affect the performance of the algorithm, including sample sizes, solution ranks, and noise levels. Applications include image compression and financial forecasting.
In Chapter 3, we apply filtering methods from data assimilation to image restoration problems. Traditionally, data assimilation methods optimally combine a predictive state from a dynamical system with partial real observations; the motivation is to improve the model forecast with real observations. We construct an artificial dynamics for the non-blind deblurring problem. By making use of the spatial information of a single image, a span of ensemble members is constructed, and a two-stage use of the ensemble transform Kalman filter (ETKF) is adopted to deblur corrupted images. The theoretical background of the ETKF and the use of artificial dynamics via the stage augmentation method are provided. Numerical experiments include image and video processing.
Concluding remarks and a discussion of future extensions are included in Chapter 4.
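The alternating minimization used for the tensor regression above is easiest to see in the matrix (order-2) case. A rank-1 alternating least-squares fit gives the flavour (this is an illustrative sketch, not the thesis' algorithm; the matrix is invented):

```python
def rank1_als(matrix, iterations=50):
    """Fit matrix ≈ outer(u, v) by alternating least squares.

    Fixing v, the least-squares optimal u is (M v) / (v·v);
    then symmetrically for v with u fixed.
    """
    rows, cols = len(matrix), len(matrix[0])
    v = [1.0] * cols
    u = [1.0] * rows
    for _ in range(iterations):
        vv = sum(x * x for x in v)
        u = [sum(matrix[i][j] * v[j] for j in range(cols)) / vv
             for i in range(rows)]
        uu = sum(x * x for x in u)
        v = [sum(matrix[i][j] * u[i] for i in range(rows)) / uu
             for j in range(cols)]
    return u, v

# An exactly rank-1 matrix: outer([1, 2], [3, 4, 5])
m = [[3.0, 4.0, 5.0], [6.0, 8.0, 10.0]]
u, v = rank1_als(m)
print(round(u[0] * v[0], 6), round(u[1] * v[2], 6))  # → 3.0 10.0
```

Each half-step solves a convex least-squares subproblem in closed form, which is why the objective value is non-increasing, the property the MM convergence argument builds on.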
APA, Harvard, Vancouver, ISO, and other styles
6

Pan, Howard W. (Howard Weihao) 1973. "Integrating financial data over the Internet." Thesis, Massachusetts Institute of Technology, 1999. http://hdl.handle.net/1721.1/37812.

Full text
Abstract:
Thesis (M.Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1999.
Includes bibliographical references (leaves 65-66).
This thesis examines the issues and value-added, from both the technical and economic perspective, of solving the information integration problem in the retail banking industry. In addition, we report on an implementation of a prototype for the Universal Banking Application using currently available technologies. We report on some of the issues we discovered and the suggested improvements for future work.
by Howard W. Pan.
M.Eng.
APA, Harvard, Vancouver, ISO, and other styles
7

Derksen, Timothy J. (Timothy John). "Processing of outliers and missing data in multivariate manufacturing data." Thesis, Massachusetts Institute of Technology, 1996. http://hdl.handle.net/1721.1/38800.

Full text
Abstract:
Thesis (M. Eng.)--Massachusetts Institute of Technology, Dept. of Electrical Engineering and Computer Science, 1996.
Includes bibliographical references (leaf 64).
by Timothy J. Derksen.
M.Eng.
APA, Harvard, Vancouver, ISO, and other styles
8

Nyström, Simon, and Joakim Lönnegren. "Processing data sources with big data frameworks." Thesis, KTH, Data- och elektroteknik, 2016. http://urn.kb.se/resolve?urn=urn:nbn:se:kth:diva-188204.

Full text
Abstract:
Big data is a concept that is expanding rapidly. As more and more data is generated and garnered, there is an increasing need for efficient solutions that can be utilized to process all this data in attempts to gain value from it. The purpose of this thesis is to find an efficient way to quickly process a large number of relatively small files. More specifically, the purpose is to test two frameworks that can be used for processing big data. The frameworks that are tested against each other are Apache NiFi and Apache Storm. A method is devised in order to, firstly, construct a data flow and, secondly, construct a method for testing the performance and scalability of the frameworks running this data flow. The results reveal that Apache Storm is faster than Apache NiFi at the sort of task that was tested. As the number of nodes included in the tests went up, the performance did not always follow. This indicates that adding more nodes to a big data processing pipeline does not always result in a better performing setup and that, sometimes, other measures must be taken to improve performance.
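The benchmarking idea above, timing the processing of many small files, can be sketched in a few lines of pure Python (no NiFi or Storm involved; the per-file step and payloads are invented):

```python
import time

def process(payload):
    """Stand-in for a per-file processing step (here: count its lines)."""
    return payload.count("\n") + 1

def benchmark(payloads):
    """Process every payload and report throughput in files per second."""
    start = time.perf_counter()
    results = [process(p) for p in payloads]
    elapsed = time.perf_counter() - start
    return results, len(payloads) / elapsed if elapsed > 0 else float("inf")

files = ["line1\nline2\nline3"] * 10_000  # many small in-memory "files"
results, rate = benchmark(files)
print(len(results), results[0])  # → 10000 3
```

A real comparison would run the same flow on clusters of varying node counts and plot throughput against cluster size, which is where the thesis found scaling was not always linear.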
APA, Harvard, Vancouver, ISO, and other styles
9

Chee, Sung-thong Andrew (徐順通). "Computerisation in Hong Kong professional engineering firms." Thesis, The University of Hong Kong (Pokfulam, Hong Kong), 1985. http://hub.hku.hk/bib/B31263124.

Full text
APA, Harvard, Vancouver, ISO, and other styles
10

Wang, Yi. "Data Management and Data Processing Support on Array-Based Scientific Data." The Ohio State University, 2015. http://rave.ohiolink.edu/etdc/view?acc_num=osu1436157356.

Full text
APA, Harvard, Vancouver, ISO, and other styles

Books on the topic "Financial engineering Data processing"

1

Schabeck, Timothy A. Managing microcomputer security: Networked/standalone : financial, engineering, management, marketing, executive. Edited by Buckland John A, Afsar Kim W, and Gessler Joan E. Carrollton, Tex: FTP Technical Library, 1988.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
2

Schabeck, Timothy A. Managing microcomputer security: Networked/standalone : financial, engineering, management, marketing, executive. Edited by Buckland John A and Gessler Joan E. Port Jefferson Station, N.Y. (492 Old Town Rd., Port Jefferson Station 11776): FTP Technical Library, 1986.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
3

International Conference on Information and Financial Engineering (2009 : Singapore). ICIFE 2009: 2009 International Conference on Information and Financial Engineering : proceedings, 17-20 April 2009, Singapore, Singapore. Los Alamitos, Calif.: IEEE Computer Society, 2009.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
4

International Conference on Financial Theory and Engineering (2010 Dubai, United Arab Emirates). 2010 International conference on financial theory and engineering: ICFTE 2010 : proceedings : 18-20 June 2010, Dubai, United Arab Emirates. Piscataway, NJ: IEEE, 2010.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
5

Duffy, Daniel J. Financial instrument pricing using C++. Chichester, England: John Wiley & Sons, 2004.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
6

Duffy, Daniel J. Financial Instrument Pricing Using C++. New York: John Wiley & Sons, Ltd., 2004.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
7

Michael, Doumpos, ed. Intelligent decision aiding systems based on multiple criteria for financial engineering. Dordrecht: Kluwer Academic Publishers, 2000.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
8

Apostolos-Paul, Refenes, ed. Neural networks in financial engineering: Proceedings of the Third International Conference on Neural Networks in the Capital Markets, London, England, 11-13 October 95. Singapore: World Scientific, 1996.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
9

Cherubini, Umberto. Structured finance: The object oriented approach. Chichester, England: John Wiley & Sons, 2007.

Find full text
APA, Harvard, Vancouver, ISO, and other styles
10

Gorgulho, Antonio. Intelligent Financial Portfolio Composition based on Evolutionary Computation Strategies. Berlin, Heidelberg: Springer Berlin Heidelberg, 2013.

Find full text
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Financial engineering Data processing"

1

Patil, Ankit, Ankush Chopra, Sohom Ghosh, and Vamshi Vadla. "Using Natural Language Processing to Understand Reasons and Motivators Behind Customer Calls in Financial Domain." In Computational Methods and Data Engineering, 247–61. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-3015-7_18.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

Di Orio, Giovanni, Guilherme Brito, and Pedro Maló. "Semantic Interoperability Framework for Digital Finance Applications." In Big Data and Artificial Intelligence in Digital Finance, 67–88. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-94590-9_4.

Full text
Abstract:
This chapter outlines the theoretical foundation for the design and implementation of the Semantic Interoperability Framework of the INFINITECH project. It details a methodology for semantic models and ontology engineering and prototyping that defines the overall strategy used to design and specify semantic models for digital finance applications. The semantic models are organized hierarchically according to the domain and the specific application and linked to reference ontologies such as the Financial Industry Business Ontology (FIBO), the Financial Instrument Global Identifier (FIGI), and the Legal Knowledge Interchange Format (LKIF). The provided models establish the cornerstone for semantic interoperability within INFINITECH applications while enabling the distributed processing of semantically linked streams.
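Semantic models like those linked to FIBO are typically stored as subject-predicate-object triples. A toy in-memory triple store with wildcard queries illustrates the basic mechanics (the class and property names below are invented, FIBO-flavoured examples, not the real ontology):

```python
def query(triples, s=None, p=None, o=None):
    """Return triples matching a (subject, predicate, object) pattern;
    None acts as a wildcard, as in SPARQL basic graph patterns."""
    return [(ts, tp, to) for ts, tp, to in triples
            if (s is None or ts == s)
            and (p is None or tp == p)
            and (o is None or to == o)]

# Invented, FIBO-flavoured facts (not the real ontology):
triples = [
    ("acme:Bond1", "rdf:type", "fibo:DebtInstrument"),
    ("acme:Bond1", "fibo:hasIssuer", "acme:AcmeCorp"),
    ("acme:Loan7", "rdf:type", "fibo:LoanContract"),
]

print(query(triples, p="rdf:type"))
```

Production frameworks use RDF stores and SPARQL rather than Python lists, but the pattern-matching idea is the same.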
APA, Harvard, Vancouver, ISO, and other styles
3

Hindmarch, Arthur, and Mary Simpson. "The Processing of Accounting Data." In Financial Accounting: An Introduction, 56–78. London: Macmillan Education UK, 1991. http://dx.doi.org/10.1007/978-1-349-21765-6_4.

Full text
APA, Harvard, Vancouver, ISO, and other styles
4

Ho, Jing-Mao, and Abdullah Shahid. "Natural Language Processing for Exploring Culture in Finance: Theory and Applications." In Financial Data Analytics, 269–91. Cham: Springer International Publishing, 2022. http://dx.doi.org/10.1007/978-3-030-83799-0_9.

Full text
APA, Harvard, Vancouver, ISO, and other styles
5

Peat, Maurice. "Data + Information Systems = Financial Innovation." In Lecture Notes in Business Information Processing, 1–10. Berlin, Heidelberg: Springer Berlin Heidelberg, 2009. http://dx.doi.org/10.1007/978-3-642-01197-9_1.

Full text
APA, Harvard, Vancouver, ISO, and other styles
6

Almashaqbeh, Ghada, Allison Bishop, and Justin Cappos. "MicroCash: Practical Concurrent Processing of Micropayments." In Financial Cryptography and Data Security, 227–44. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-51280-4_13.

7

Taniguchi, Masanobu, Tomoyuki Amano, Hiroaki Ogata, and Hiroyuki Taniai. "Features of Financial Data." In Statistical Inference for Financial Engineering, 1–39. Cham: Springer International Publishing, 2014. http://dx.doi.org/10.1007/978-3-319-03497-3_1.

8

Berns, Karsten, Alexander Köpper, and Bernd Schürmann. "Sensor Data Processing." In Lecture Notes in Electrical Engineering, 227–53. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-65157-2_8.

9

Godse, Jay. "Reverse-Engineering Complex Solutions." In Ruby Data Processing, 89–96. Berkeley, CA: Apress, 2018. http://dx.doi.org/10.1007/978-1-4842-3474-7_4.

10

Liu, Junxiu, Tiening Sun, Yuling Luo, Qiang Fu, Yi Cao, Jia Zhai, and Xuemei Ding. "Financial Data Forecasting Using Optimized Echo State Network." In Neural Information Processing, 138–49. Cham: Springer International Publishing, 2018. http://dx.doi.org/10.1007/978-3-030-04221-9_13.


Conference papers on the topic "Financial engineering Data processing"

1

Kato, Fumitake. "Developmental Research for Industrial Image Processing in the Engineering Education to Learn SDGs." In The 2nd International Conference on Technology for Sustainable Development. Switzerland: Trans Tech Publications Ltd, 2022. http://dx.doi.org/10.4028/p-i8oukj.

Abstract:
Industrial image processing technology is widely applied in fields such as industrial plants, the bio-medical industry, agro-technology, and environmental monitoring. In addition, image acquisition devices are becoming more compact and are built into mobile phones and handheld terminals. This means we carry such devices everywhere and can easily capture high-quality, high-resolution images (still images and moving pictures) anytime and anywhere. In the research field, 2D codes, known as QR codes or Data Matrix, have great potential for industrial applications. As is well known, QR codes have spread into varied uses such as ID recognition, URL display on stickers, and confirmation processing for financial transactions. Beyond QR codes, Data Matrix codes are also frequently printed as tags on materials such as metal, wood, or plastic parts, where they serve as product IDs or serial numbers to be recognized with a mobile terminal. 2D code technology can be a powerful tool for checking and tracing the marketing channel of each part of a product. Meanwhile, the Sustainable Development Goals (SDGs), proposed in 2015, have been researched and introduced to improve our environment and quality of life. This paper describes proposals for new application technologies related to the concept of the SDGs. The main contents are based on science and technology, but they were developed and implemented by young college students. The process and the details are described in the paper.
2

Xu, Siyun, and Qingshan Tong. "Research on Risk Pre-warning Evaluation System of Enterprise Financing Based on Big Data Processing." In 2015 International Conference on Automation, Mechanical Control and Computational Engineering. Paris, France: Atlantis Press, 2015. http://dx.doi.org/10.2991/amcce-15.2015.118.

3

Gray, Donald G. "A Process and Economic Model for the Real-Time Optimization of Operating Profit in a Citrus Feedmill." In ASME 2007 Citrus Engineering Conference. American Society of Mechanical Engineers, 2007. http://dx.doi.org/10.1115/cec2007-5303.

Abstract:
During the past 6 years, increases in energy costs have adversely impacted the profitability of feedmill operations. The market values of dried feed, molasses, and d-limonene have not sufficiently increased to offset this additional cost. Lacking alternatives to processing wet peel, many of the large citrus processors have operated feedmills at a financial loss. The optimal operation of a citrus feedmill requires that a combination of process, resource cost, and product market value data be analyzed and translated into actions that will maximize operating profit or minimize operational losses. In particular, variations in peel volume and moisture content, evaporation requirements, product market values, and resource costs require that a detailed process and economic analysis be routinely performed to achieve optimal financial performance. As a result, the complexity of achieving optimal performance on a day-to-day basis can be overwhelming to operators and managers. This paper discusses operational challenges that are common to many citrus feedmill operations, and proposed solutions. The basis for these solutions is a mathematical process and economic model that utilizes operational data to forecast production quantities and operational costs for a specified set of operating conditions. Equations are developed for optimizing energy usage, solids value, and operating profit in real-time. In addition to optimizing daily performance, the model can be used to determine optimal product yields, train operators and managers, determine the technical and financial merit of capital improvement projects, establish realistic performance targets, and devise accurate cost accounting drivers. Paper published with permission.
4

Taghavifar, Hadi, and Lokukaluge P. Perera. "Life Cycle Assessment of Different Marine Fuel Types and Powertrain Configurations for Financial and Environmental Impact Assessment in Shipping." In ASME 2022 41st International Conference on Ocean, Offshore and Arctic Engineering. American Society of Mechanical Engineers, 2022. http://dx.doi.org/10.1115/omae2022-78774.

Abstract:
The main motivation for the current investigation is the selection of fuel type according to the optimal pathway, from extraction of a raw material (feedstock), through processing and transportation, to final use in marine engines (well to wheel), based on cost and emission criteria. The procedure has been customized to the available data of the shipping industry (ship/bunker route and mileage, the ship powertrain system, etc.) under the SeaTech H2020 project (seatech2020.eu). The selected modeling platform is utilized for the life cycle assessment of three potential fuels: diesel, methanol, and liquefied natural gas (LNG). Different fuel production pathways and dual-fuel powertrain technologies are taken into account as the main variables, while subsidiary factors such as transportation parameters (fuel economy and average speed) are included in the calculations. The trade-off between economic aspects and emission reduction is evaluated for various scenarios to identify the optimal solution based on stakeholder interests in the shipping industry. The study also considers transport of the fuel to the respective ports for a selected vessel from diverse fuel export locations and travelled routes, according to datasets available for the same project. The results provide a guideline for the shipping industry on selecting possible conventional/renewable fuel resources for use in marine engines, with the emission content of each adopted pathway and the subsequent expenditure per 1 MJ of each fuel sample evaluated as the functional unit.
5

Hale, Britta, Douglas L. Van Bossuyt, Nikolaos Papakonstantinou, and Bryan O’Halloran. "A Zero-Trust Methodology for Security of Complex Systems With Machine Learning Components." In ASME 2021 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2021. http://dx.doi.org/10.1115/detc2021-70442.

Abstract:
Fuelled by recent technological advances, Machine Learning (ML) is being introduced to safety- and security-critical applications like defence systems, financial systems, and autonomous machines. ML components can be used either for processing input data and/or for decision making. The response time and success rate demands are very high, which means that the deployed training algorithms often produce complex models that are not readable and verifiable by humans (like multi-layer neural networks). Due to the complexity of these models, achieving complete testing coverage is in most cases not realistically possible. This raises security threats related to ML components presenting unpredictable behavior due to malicious manipulation (backdoor attacks). This paper proposes a methodology based on established security principles like Zero-Trust and defence-in-depth to help prevent and mitigate the consequences of security threats, including ones emerging from ML-based components. The methodology is demonstrated on a case study of an Unmanned Aerial Vehicle (UAV) with a sophisticated Intelligence, Surveillance, and Reconnaissance (ISR) module.
6

Muroyama, Alexander, Mahesh Mani, Kevin Lyons, and Bjorn Johansson. "Simulation and Analysis for Sustainability in Manufacturing Processes." In ASME 2011 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2011. http://dx.doi.org/10.1115/detc2011-47327.

Abstract:
“Sustainability” has become a ubiquitous term in almost every field, especially in engineering design and manufacturing. Recently, an increased awareness of environmental problems and resource depletion has led to an emphasis on environmentally friendly practices. This is especially true in the manufacturing industry, where energy consumption and the amount of waste generated can be high. Proactive tools must therefore be developed to carefully analyze the cause and effect of current manufacturing practices and to investigate alternative practices. One such approach to sustainable manufacturing is the combined use of Discrete Event Simulation (DES) and Life Cycle Assessment (LCA) to analyze the utilization and processing of manufacturing resources in a factory setting. From an economic standpoint, such a method can significantly reduce financial and environmental costs by evaluating system performance before construction or use. This project considers what-if scenarios in a simplified golf ball factory, using data as close to real-world values as possible, to demonstrate the ability of DES and LCA to facilitate decision-making and optimize the manufacturing process. Plastic injection molding, an energy-intensive step in golf ball manufacturing, is the focus of the DES model. AutoMod, a 3-D modeling software package, was used to build the DES model, and AutoStat was used to run the trials and analyze the data. By varying input parameters such as the type and number of injection molding machines and the material used, the simulation model can output data indicating the most productive and energy-efficient methods. At a more detailed level, the simulations can provide valuable information on bottlenecks or imbalances in the system. Correcting these can allow the factory to be both “greener” and more cost-effective.
7

Guo, Jian. "Financial Big Data Analysis Service System." In IPEC2022: 2022 3rd Asia-Pacific Conference on Image Processing, Electronics and Computers. New York, NY, USA: ACM, 2022. http://dx.doi.org/10.1145/3544109.3544142.

8

Kontaxakis, Antonios, Antonios Deligiannakis, Holger Arndt, Stefan Burkard, Claus-Peter Kettner, Elke Pelikan, and Kathleen Noack. "Real-time processing of geo-distributed financial data." In DEBS '21: The 15th ACM International Conference on Distributed and Event-based Systems. New York, NY, USA: ACM, 2021. http://dx.doi.org/10.1145/3465480.3467842.

9

Stepennov, D. B., A. P. Varnavin, A. A. Zakharchev, and L. Pillette-Cousin. "Methodological and Practical Bases of Providing Information Support to Activities on Environmental Remediation of the Spent Nuclear Fuel and Radioactive Waste Temporary Storage Facility in Gremikha." In ASME 2011 14th International Conference on Environmental Remediation and Radioactive Waste Management. ASMEDC, 2011. http://dx.doi.org/10.1115/icem2011-59375.

Abstract:
Remediation of a spent nuclear fuel (SNF) and radioactive waste (RW) temporary storage facility is a multifaceted process that includes a number of stages, such as development of a remediation programme, performance of comprehensive engineering and radiological survey, development of a remediation design, removal of SNF and RW up to the site cleanup. At any stage of the remediation, making of justified decisions is ensured by availability and completeness of associated information. Huge amount of information has to be managed. Therefore an information analysis system (IAS) was developed by the National Research Centre «Kurchatov Institute» within the framework of the project for environmental remediation of the SNF and RW temporary storage facility in Gremikha with financial and technical support provided by France (CEA) and the Russian Federation (Rosatom). The IAS accumulates all information about the project: technical and radiological characteristics of objects/facilities, cartographic information, documentation, data on the project participants, technologies and equipment involved. The IAS architecture includes the following functional subsystems: data management, data analytical processing, project management, geoinformation, 3D modeling, and public information. The IAS allows developers and performers of environmental remediation of the SNF and RW temporary storage facility in Gremikha to fulfill tasks arising at all stages of the work. The IAS operating experience can be transferred for use during surveys and remediation of any radiation hazardous facilities.
10

Tang, Qiu, Lei Jiang, Majing Su, and Qiong Dai. "A pipelined market data processing architecture to overcome financial data dependency." In 2016 IEEE 35th International Performance Computing and Communications Conference (IPCCC). IEEE, 2016. http://dx.doi.org/10.1109/pccc.2016.7820632.


Reports on the topic "Financial engineering Data processing"

1

Hall, Candice, and Robert Jensen. Utilizing data from the NOAA National Data Buoy Center. Engineer Research and Development Center (U.S.), March 2021. http://dx.doi.org/10.21079/11681/40059.

Abstract:
This Coastal and Hydraulics Engineering Technical Note (CHETN) guides users through the quality control (QC) and processing steps that are necessary when using archived U.S. National Oceanic and Atmospheric Administration (NOAA) National Data Buoy Center (NDBC) wave and meteorological data. This CHETN summarizes methodologies to geographically clean and QC NDBC measurement data for use by the U.S. Army Corps of Engineers (USACE) user community.
2

Shabelnyk, Tetiana V., Serhii V. Krivenko, Nataliia Yu Rotanova, Oksana F. Diachenko, Iryna B. Tymofieieva, and Arnold E. Kiv. Integration of chatbots into the system of professional training of Masters. [б. в.], June 2021. http://dx.doi.org/10.31812/123456789/4439.

Abstract:
The article presents and describes innovative technologies for the professional training of Masters. High-quality training of students in technical specialties requires rethinking the purpose, learning outcomes, and means of teaching professional disciplines in modern educational conditions. The experience of implementing a chatbot tool in teaching the discipline “Mathematical modeling of socio-economic systems” in the educational and professional program 124 System Analysis is described. The generalized structure of the chatbot information system for investment analysis is characterized: input information, an information processing system, and output information, which together form a closed cycle (system) of direct and feedback interaction. The information processing system is represented by accounting and analytical data management blocks. The investment analysis chatbot will help Masters in the System Analysis specialty manage the investment process efficiently by making sound decisions, understanding investment analysis within the broader structure of financial management, and optimizing risks in these systems using a working mobile application. The chatbot will also allow systematic assessment of the advantages and disadvantages of investment projects or of a system analyst's line of activity, while increasing interest in performing practical tasks. The chatbot integrated into training was developed with the following software: the Kotlin programming language, the Retrofit library for network interaction and for receiving and transmitting data, and an HTTP API for linking processes. Based on the results of the study, it is noted that integrating a chatbot into the training of Masters supports the development of their professional activities, enabling them to become competent specialists and contributing to the organization of high-quality training.
3

DeMarle, David, and Andrew Bauer. In situ visualization with temporal caching. Engineer Research and Development Center (U.S.), January 2022. http://dx.doi.org/10.21079/11681/43042.

Abstract:
In situ visualization is a technique in which plots and other visual analyses are performed in tandem with numerical simulation processes in order to better utilize HPC machine resources. Especially in unattended exploratory engineering simulation analyses, events may occur during the run that justify supplemental processing. Sometimes, though, when the events do occur, the phenomena of interest include the physics that precipitated the events, and this may be the key insight into understanding the phenomena being simulated. In situ temporal caching is the temporary storing of produced data in memory for possible later analysis, including time-varying visualization. The later analysis and visualization still occur during the simulation run, but not until after the significant events have been detected. In this article, we demonstrate how temporal caching can be used with in-line in situ visualization to reduce simulation run time while still capturing essential simulation results.
4

Suir, Glenn, Molly Reif, and Christina Saltus. Remote sensing capabilities to support EWN® projects : an R&D approach to improve project efficiencies and quantify performance. Engineer Research and Development Center (U.S.), August 2022. http://dx.doi.org/10.21079/11681/45241.

Abstract:
Engineering With Nature (EWN®) is a US Army Corps of Engineers (USACE) initiative and program that promotes more sustainable practices for delivering economic, environmental, and social benefits through collaborative processes. As the number and variety of EWN® projects continue to grow and evolve, there is an increasing opportunity to improve how their benefits are quantified and communicated to the public. Recent advancements in remote sensing technologies are significant for EWN® because they can provide project-relevant detail across a large areal extent where traditional survey methods may be impractical due to site access limitations. These technologies encompass a suite of spatial and temporal data collection and processing techniques used to characterize Earth's surface properties and conditions that would otherwise be difficult to assess. This document describes the general underpinnings and utility of remote sensing technologies and applications for use: (1) in specific phases of the EWN® project life cycle; (2) with specific EWN® project types; and (3) in the quantification and assessment of project implementation, performance, and benefits.
5

Avis, William. Drivers, Barriers and Opportunities of E-waste Management in Africa. Institute of Development Studies (IDS), December 2021. http://dx.doi.org/10.19088/k4d.2022.016.

Abstract:
Population growth, increasing prosperity, and changing consumer habits are increasing global demand for consumer electronics. Furthermore, rapid changes in technology, falling prices, and consumer appetite for better products have exacerbated e-waste management challenges and seen millions of tons of electronic devices become obsolete. This rapid literature review collates evidence from academic, policy-focussed, and grey literature on e-waste management in Africa. The report provides an overview of what constitutes e-waste, the environmental and health impacts of e-waste, the barriers to effective e-waste management, the opportunities associated with effective e-waste management, and the limited literature available that estimates future volumes of e-waste. Africa generated a total of 2.9 million Mt of e-waste, or 2.5 kg per capita, the lowest regional rate in the world. Africa's e-waste is the product of local and imported sources of used electronic and electrical equipment (UEEE). Challenges in e-waste management in Africa are exacerbated by a lack of awareness, weak environmental legislation, and limited financial resources. Proper disposal of e-waste requires training and investment in recycling and management technology, as improper processing can have severe environmental and health effects. In Africa, thirteen countries have been identified as having national e-waste legislation or policy. The main barriers to effective e-waste management include insufficient legislative frameworks and government agencies' lack of capacity to enforce regulations, infrastructure, operating standards and transparency, illegal imports, security, data gaps, trust, informality, and costs. Aspirations associated with the energy transition and net zero are laudable, but products associated with these goals can become major contributors to the e-waste challenge: the necessary wind turbines, solar panels, electric car batteries, and other "green" technologies require vast amounts of resources, and at the end of their lifetime they can pose environmental hazards. An example of e-waste associated with energy transitions can be gleaned from the solar power sector, where different types of solar power cells need to undergo different treatments (mechanical, thermal, chemical) to recover the valuable metals they contain. Similar issues apply to waste associated with other energy transition technologies. Although e-waste contains toxic and hazardous metals such as barium and mercury, it also contains non-ferrous metals such as copper and aluminium and precious metals such as gold, which if recycled could have a value exceeding 55 billion euros. There thus exists an opportunity to convert the existing e-waste challenge into an economic opportunity.
6

Modlo, Yevhenii O., Serhiy O. Semerikov, Stanislav L. Bondarevskyi, Stanislav T. Tolmachev, Oksana M. Markova, and Pavlo P. Nechypurenko. Methods of using mobile Internet devices in the formation of the general scientific component of bachelor in electromechanics competency in modeling of technical objects. [б. в.], February 2020. http://dx.doi.org/10.31812/123456789/3677.

Abstract:
An analysis of the experience of professionally training bachelors of electromechanics in Ukraine and abroad shows that one of the leading trends in its modernization is the synergistic integration of various engineering branches (mechanical, electrical, electronic engineering, and automation) into mechatronics for the design, manufacture, operation, and maintenance of electromechanical equipment. Teaching mechatronics provides for the meaningful integration of various disciplines of professional and practical training of bachelors of electromechanics based on the concept of modeling, and for the technological integration of various organizational forms and teaching methods based on the concept of mobility. Within this approach, the leading learning tools for bachelors of electromechanics are mobile Internet devices (MID): multimedia mobile devices that provide wireless access to information and communication Internet services for collecting, organizing, storing, processing, transmitting, and presenting all kinds of messages and data. The authors reveal the main possibilities of using MID in learning: ensuring equal access to education, personalized learning, instant feedback and evaluation of learning outcomes, mobile learning, productive use of time spent in classrooms, creating mobile learning communities, supporting situated learning, developing continuous seamless learning, bridging the gap between formal and informal learning, minimizing educational disruption in conflict and disaster areas, assisting learners with disabilities, improving the quality of communication and institutional management, and maximizing cost-efficiency.
Bachelor of electromechanics competency in the modeling of technical objects is a personal and vocational ability comprising a system of knowledge, skills, experience in learning and research activities on modeling mechatronic systems, and a positive value attitude towards it; a bachelor of electromechanics should be ready and able to use methods and software/hardware modeling tools for process analysis, system synthesis, and the evaluation of reliability and effectiveness when solving practical problems in the professional field. The competency structure of the bachelor of electromechanics in the modeling of technical objects is reflected in three groups of competencies: general scientific, general professional, and specialized professional. The technique of using MID in teaching bachelors of electromechanics the modeling of technical objects is implemented through a corresponding methodology, whose components are partial methods for using MID in forming the general scientific component of this competency; these are illustrated using the academic disciplines “Higher mathematics”, “Computers and programming”, “Engineering mechanics”, and “Electrical machines”. The leading tools for forming the general scientific component of this competency are mobile augmented reality tools (to visualize the structure of objects and modeling results), mobile computer mathematical systems (universal tools used at all stages of learning modeling), cloud-based spreadsheets (as modeling tools) and text editors (to write the program description of a model), mobile computer-aided design systems (to create and view the physical properties of models of technical objects), and mobile communication tools (to organize joint activity in modeling).
7

Key Indicators for Asia and the Pacific 2022. Asian Development Bank, August 2022. http://dx.doi.org/10.22617/fls220346-3.

Abstract:
This publication provides updated statistics on a comprehensive set of economic, financial, social, and environmental measures as well as select indicators for the Sustainable Development Goals (SDGs). The publication covers the 49 regional members of ADB. It discusses trends in development progress and the challenges to achieving inclusive and sustainable economic growth across Asia and the Pacific. This 53rd edition finds that the COVID-19 pandemic has set back the fight against poverty in the region by at least 2 years. Drawing on data simulations, it concludes that people with less social mobility may experience longer-lasting difficulties in escaping poverty. Key Indicators for Asia and the Pacific 2022 includes a special chapter on how data resilience can be achieved in the wake of pandemic disruptions to the operations of national statistical systems. The publication is complemented by a special supplement, Mapping the Public Voice for Development: Natural Language Processing of Social Media Text Data, which explores how social media text data can be harnessed to help policymakers understand the opinions, ideas, and expectations of the public.