Journal articles on the topic 'Stock exchanges Australia Econometric models'

To see the other types of publications on this topic, follow the link: Stock exchanges Australia Econometric models.

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles


Consult the top 15 journal articles for your research on the topic 'Stock exchanges Australia Econometric models.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Angelidis, Dimitrios, Athanasios Koulakiotis, and Apostolos Kiohos. "Feedback Trading Strategies: The Case of Greece and Cyprus." South East European Journal of Economics and Business 13, no. 1 (June 1, 2018): 93–99. http://dx.doi.org/10.2478/jeb-2018-0006.

Abstract:
This paper examines whether feedback trading strategies are present on the Athens (ASE) and Cyprus (CSE) stock exchanges. The analysis employs two econometric models: the feedback trading strategy model introduced by Sentana and Wadhwani (1992) and the exponential autoregressive model proposed by LeBaron (1992). Each of these theoretical frameworks was combined separately with a FIGARCH(1, d, 1) specification. Both models assume two different groups of traders: "rational" investors, who build their portfolios by following firms' fundamentals, and "noise" speculators, who ignore stock fundamentals and focus on a positive (negative) feedback trading strategy. The empirical results revealed that negative feedback trading strategies exist in the two underlying stock markets.
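The core of the Sentana–Wadhwani (1992) equation is that the serial correlation of returns varies with conditional variance. The sketch below simulates that mechanism and recovers the feedback coefficient by OLS on the interaction term. It is a hedged illustration in plain NumPy, not the authors' FIGARCH estimation, and all parameter values are assumptions for demonstration.

```python
import numpy as np

# Minimal sketch of the Sentana-Wadhwani (1992) feedback-trading equation:
#   r_t = alpha + (phi0 + phi1 * sigma2_t) * r_{t-1} + eps_t
# A nonzero phi1 means return autocorrelation depends on volatility, the
# signature of feedback traders. All parameter values are illustrative.
rng = np.random.default_rng(0)
T = 20_000
alpha, phi0, phi1 = 0.0, 0.05, -0.5
sigma2 = 0.5 + 0.4 * np.sin(np.arange(T) / 50) ** 2   # slowly varying variance
r = np.zeros(T)
for t in range(1, T):
    coef = phi0 + phi1 * sigma2[t]
    r[t] = alpha + coef * r[t - 1] + np.sqrt(sigma2[t]) * rng.standard_normal()

# OLS of r_t on [1, r_{t-1}, sigma2_t * r_{t-1}] recovers the feedback term phi1
X = np.column_stack([np.ones(T - 1), r[:-1], sigma2[1:] * r[:-1]])
beta, *_ = np.linalg.lstsq(X, r[1:], rcond=None)
```

In the paper's full specification the conditional variance is itself filtered from a FIGARCH(1, d, 1) model rather than observed, as here.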
2

Majewski, Sebastian, Waldemar Tarczynski, and Malgorzata Tarczynska-Luniewska. "Measuring investors’ emotions using econometric models of trading volume of stock exchange indexes." Investment Management and Financial Innovations 17, no. 3 (September 30, 2020): 281–91. http://dx.doi.org/10.21511/imfi.17(3).2020.21.

Abstract:
Traditional finance explains all human activity on the grounds of rationality and holds that all decisions are rational because all current information is reflected in the prices of goods. However, the development of information technology and growing demand for new, attractive investment opportunities have driven a search for new, unique signals to support investment decisions. Such a situation resembles risk-taking, so it must elicit emotional reactions in individual traders. The paper aims to verify whether market risk is a determinant of traders' emotions and whether volatility is a useful measure of traders' optimism during the investment process, following Majewski (2019). Various types of econometric models were used to estimate the risk parameter: classical linear models using OLS, general linear models using FGLS, and GARCH(p, q) models estimated by maximum likelihood. The hypotheses were verified using data collected from the most popular world stock exchanges: New York, Frankfurt, Tokyo, and London. The data covered stock exchange indexes such as the S&P 500, DAX, Nikkei, and UK100.
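Of the three model families the abstract lists, the GARCH(p, q) recursion is the least self-explanatory; a minimal GARCH(1,1) variance filter in NumPy is sketched below. The parameter values and the simulated return series are illustrative assumptions, not the paper's estimates.

```python
import numpy as np

def garch_variance(returns, omega=0.05, alpha=0.1, beta=0.85):
    """GARCH(1,1) conditional-variance recursion (illustrative parameters)."""
    sigma2 = np.empty(len(returns))
    sigma2[0] = np.var(returns)               # initialise at the sample variance
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

rng = np.random.default_rng(1)
r = rng.standard_normal(1000)                 # stand-in for daily index returns
s2 = garch_variance(r)
# implied unconditional variance: omega / (1 - alpha - beta) = 0.05 / 0.05 = 1.0
```

In practice the parameters would be fitted by maximum likelihood, as the abstract describes, rather than fixed as they are here.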
3

Chambi Condori, Pedro Pablo. "Financial contagion: The impact of the volatility of global stock exchanges on the Lima-Peru Stock Exchange." Economía & Negocios 1, no. 1 (June 24, 2020): 13–27. http://dx.doi.org/10.33326/27086062.2019.1.896.

Abstract:
Volatility in the international financial markets affects the results of local stock markets through the spread and transmission of volatility from larger exchanges to smaller markets such as the Peruvian one, an assertion consistent with the results obtained in this study. The statistical evaluation of the econometric models suggests that the model obtained can be used to forecast expected volatility in the very short term. Such estimates are very important for market participants, because these models can help align the stance to be adopted in circumstances of high volatility, for example entry, exit, refuge, or permanence in the markets, as well as the selection of the best moments and the structuring of equity investment portfolios; through the correlation, one can also see in which markets one can or cannot act and consequently obtain the best profitability in the equity markets. This work comprises four well-defined sections: a brief history of financial volatility over the last 15 years, a tight summary of the background, a dense summary of the methodology used, and the exposition of the results obtained together with the main conclusions. The evidence shows transmission and spread of volatility from the larger stock markets toward the Peruvian stock market, as in the case of the American market, with a dynamic correlation coefficient of 0.32, followed by the Spanish and Chinese markets. Additionally, the interrelation coefficient found by means of the DCC-MGARCH model is a very important indicator for structuring investment portfolios with instruments quoted on the global financial markets.
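The contagion measure in the paper comes from a DCC-MGARCH model. As a lightweight stand-in with the same flavour, an exponentially weighted (RiskMetrics-style) correlation between a large market's returns and the local market's captures the idea of a time-varying spillover measure. This is a hedged sketch on simulated data; the spillover coefficient 0.32 simply mirrors the dynamic correlation the study reports for the US-Peru pair.

```python
import numpy as np

def ewma_correlation(x, y, lam=0.94):
    """RiskMetrics-style exponentially weighted correlation between two series."""
    cov, vx, vy = 0.0, 1.0, 1.0          # neutral starting values
    corr = np.empty(len(x))
    for t in range(len(x)):
        cov = lam * cov + (1 - lam) * x[t] * y[t]
        vx = lam * vx + (1 - lam) * x[t] ** 2
        vy = lam * vy + (1 - lam) * y[t] ** 2
        corr[t] = cov / np.sqrt(vx * vy)
    return corr

rng = np.random.default_rng(2)
us = rng.standard_normal(2000)                # shocks from the larger market
peru = 0.32 * us + rng.standard_normal(2000)  # local market with a spillover term
rho = ewma_correlation(us, peru)              # time-varying correlation path
```

A full DCC-MGARCH estimation would replace the fixed smoothing constant with fitted dynamics, but the time-varying correlation path is the same object being tracked.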
4

Ampomah, Ernest Kwame, Zhiguang Qin, and Gabriel Nyame. "Evaluation of Tree-Based Ensemble Machine Learning Models in Predicting Stock Price Direction of Movement." Information 11, no. 6 (June 20, 2020): 332. http://dx.doi.org/10.3390/info11060332.

Abstract:
Forecasting the direction and trend of stock prices is an important task which helps investors make prudent financial decisions in the stock market. Investment in the stock market carries a large associated risk, and minimizing prediction error reduces investment risk. Machine learning (ML) models typically perform better than statistical and econometric models, and ensemble ML models have been shown in the literature to outperform single ML models. In this work, we compare the effectiveness of tree-based ensemble ML models (Random Forest (RF), XGBoost Classifier (XG), Bagging Classifier (BC), AdaBoost Classifier (Ada), Extra Trees Classifier (ET), and Voting Classifier (VC)) in forecasting the direction of stock price movement. Eight stock datasets from three stock exchanges (NYSE, NASDAQ, and NSE) are randomly collected and used for the study. Each dataset is split into a training set and a test set. Ten-fold cross-validation accuracy is used to evaluate the ML models on the training set. In addition, the ML models are evaluated on the test set using accuracy, precision, recall, F1-score, specificity, and area under the receiver operating characteristic curve (AUC-ROC). Kendall's W test of concordance is used to rank the performance of the tree-based ML algorithms. On the training set, the AdaBoost model performed better than the rest of the models. On the test set, the accuracy, precision, F1-score, and AUC metrics produced rankings significant enough to order the models, and the Extra Trees classifier outperformed the other models in all the rankings.
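To make the evaluation protocol concrete, the self-contained NumPy sketch below builds a toy "direction of movement" dataset, a depth-1 decision stump, and a bootstrap-aggregated (bagging-style) ensemble with majority voting, then scores both by ten-fold cross-validation accuracy. It is a hedged illustration of the protocol, not a reimplementation of the paper's models, and the data are synthetic.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic "direction of movement" data: label = 1 when a noisy linear
# combination of features is positive (a stand-in for real exchange data).
X = rng.standard_normal((500, 5))
y = (X @ np.array([1.0, 0.8, -0.5, 0.3, 0.0]) + 0.5 * rng.standard_normal(500) > 0).astype(int)

def stump_fit(X, y):
    """Best single-feature threshold classifier (a depth-1 tree)."""
    best = (0, 0.0, 1, -1.0)              # (feature, threshold, sign, accuracy)
    for j in range(X.shape[1]):
        for thr in np.quantile(X[:, j], np.linspace(0.1, 0.9, 9)):
            for sign in (1, -1):
                acc = np.mean(((sign * (X[:, j] - thr)) > 0).astype(int) == y)
                if acc > best[3]:
                    best = (j, thr, sign, acc)
    return best[:3]

def stump_predict(model, X):
    j, thr, sign = model
    return ((sign * (X[:, j] - thr)) > 0).astype(int)

def bagged_predict(X_tr, y_tr, X_te, n_estimators=25):
    """Bootstrap-aggregated stumps with majority voting (a toy bagging classifier)."""
    votes = np.zeros(len(X_te))
    for _ in range(n_estimators):
        idx = rng.integers(0, len(X_tr), len(X_tr))   # bootstrap resample
        votes += stump_predict(stump_fit(X_tr[idx], y_tr[idx]), X_te)
    return (votes >= n_estimators / 2).astype(int)

def cv_accuracy(predict_fn, X, y, k=10):
    """Mean accuracy over k cross-validation folds."""
    folds = np.array_split(np.arange(len(X)), k)
    accs = []
    for f in folds:
        tr = np.setdiff1d(np.arange(len(X)), f)
        accs.append(np.mean(predict_fn(X[tr], y[tr], X[f]) == y[f]))
    return float(np.mean(accs))

single = cv_accuracy(lambda Xtr, ytr, Xte: stump_predict(stump_fit(Xtr, ytr), Xte), X, y)
bagged = cv_accuracy(bagged_predict, X, y)
```

The paper's actual experiments use full tree ensembles (RF, XGBoost, Extra Trees, etc.); the stump ensemble here only illustrates the bagging-plus-voting mechanism and the ten-fold scoring loop.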
5

Ndayisaba, Gilbert, and Abdullahi D. Ahmed. "CEO remuneration, board composition and firm performance: empirical evidence from Australian listed companies." Corporate Ownership and Control 13, no. 1 (2015): 534–52. http://dx.doi.org/10.22495/cocv13i1c5p2.

Abstract:
Classical economic theories establishing a relationship between CEO remuneration and firm performance have paid particular attention to solving the conflict of interest between the managerial team and firm shareholders by designing an optimal CEO remuneration package that motivates executives to work in the best interest of shareholders. Many international, and fewer Australian, empirical studies suggest there is overwhelming evidence that firm performance is strongly linked with CEO remuneration. In this paper, we reassess the association between firm performance and CEO remuneration variables using dynamic econometric models and comprehensive data from the Australian Stock Exchange (ASX). We find a positive and strong association between the CEO pay of the top 200 Australian publicly listed companies and company performance, findings similar to results from US, UK and Canadian studies. We further test the effect of board and ownership features on CEO remuneration-performance sensitivity in the top 200 Australian public companies listed on the ASX. Specifically, for the period 2003-2007, our results highlight the importance of ownership structure in influencing the remuneration-performance relationship. Monitoring blockholders boost the responsiveness of long-term incentive (LTI) remuneration to performance, thus aligning shareholder and manager interests. However, based on a short-term investment horizon strategy, insider blockholders increase (decrease) the sensitivity of short-term incentive remuneration (long-term incentive pay). Surprisingly, for the period 2008-2013, our findings suggest that ownership and board features did not significantly influence CEO pay-performance sensitivities. Finally, we find that larger boards increase (decrease) the responsiveness of CEOs' known remuneration (long-term incentives) to performance.
6

Hami, Mustapha El, and Ahmed Hefnaoui. "Analysis of Herding Behavior in Moroccan Stock Market." Journal of Economics and Behavioral Studies 11, no. 1(J) (March 10, 2019): 181–90. http://dx.doi.org/10.22610/jebs.v11i1(j).2758.

Abstract:
Frontier markets, particularly the Moroccan financial market, are characterised by market narrowness, an inability to absorb erratic price fluctuations, and the low liquidity of securities, which encourage investors to herd and imitate those who have full information about the market. A quantitative research approach was used to analyse the existence of herding in the Moroccan stock market. The daily data used in this study cover the period from 04/01/2010 to 29/12/2017 and contain the daily returns of the MASI and a total of 43 traded stocks. Statistical and econometric methods such as multidimensional scaling and cross-sectional absolute deviation (CSAD) were used. After the regression models were examined, the findings indicated that the stocks with the highest similarity to the index return are BMCE, BCP, IAM, ATW and CMSR, and the stocks with the highest dissimilarity are PAP, IBC and SNP. This should allow investors to choose profitable alternatives and avoid those that present a possible risk. The results also showed the existence of herding in the Moroccan stock market both upward and downward. This finding was supported by the clear existence of a non-linearity between market performance and the CSAD measure, which confirms the prediction of a non-linear inverse relationship between CSAD and Rm. This could be due to the low level of transparency that prevails in frontier stock exchanges and reduces the quality of their information environment, leading investors not to react rationally and to draw information from the transactions of their peers.
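The CSAD herding test behind findings like these regresses the cross-sectional absolute deviation of stock returns on |Rm| and Rm²; a significantly negative coefficient on the squared term signals the non-linear herding effect. Below is a minimal NumPy sketch on simulated data with no herding built in (so the squared-term coefficient should sit near zero); the return magnitudes are illustrative assumptions.

```python
import numpy as np

def csad(returns, market):
    """Cross-sectional absolute deviation of N stock returns from the market return."""
    return np.mean(np.abs(returns - market[:, None]), axis=1)

rng = np.random.default_rng(4)
T, N = 1500, 43                                # 43 traded stocks, as in the MASI sample
rm = 0.01 * rng.standard_normal(T)             # market return
stocks = rm[:, None] + 0.02 * rng.standard_normal((T, N))   # no herding built in
y = csad(stocks, rm)

# CSAD_t = g0 + g1*|R_m,t| + g2*R_m,t^2 ; herding predicts g2 < 0
X = np.column_stack([np.ones(T), np.abs(rm), rm ** 2])
gamma, *_ = np.linalg.lstsq(X, y, rcond=None)
```

On real data, the test would additionally be run separately on up- and down-market days, matching the paper's finding of herding in both directions.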
7

Reichert, Bianca, and Adriano Mendonça Souza. "Can the Heston Model Forecast Energy Generation? A Systematic Literature Review." International Journal of Energy Economics and Policy 12, no. 1 (January 19, 2022): 289–95. http://dx.doi.org/10.32479/ijeep.11975.

Abstract:
The ability to predict the price of stock exchange assets has attracted the attention of economists and physicists around the world, as physical models are useful for predicting volatility behaviour. Knowing that volatility is crucial for energy sector planning, the research aim was to investigate whether the Heston pricing model is useful for predicting energy generation, through the steps established by the systematic review protocol. In a corpus of 25 documents, it was possible to identify many financial, energy and demography studies; a low level of interaction among universities; the largest number of publications coming from Australia and China; the most important journal; and the advantages of applying econophysics models to solve volatility problems. In conclusion, the Heston model can be applied to predict energy generation, since it is a closed-form model capable of modelling stochastic volatility that reverts to the predicted value of average energy generation.
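For reference, the Heston dynamics the review evaluates couple an asset-level process with a mean-reverting stochastic variance. The sketch below is a hedged Euler discretisation with illustrative parameters; the review's argument rests on the model's closed-form solution, which is not reproduced here.

```python
import numpy as np

def simulate_heston(s0=100.0, v0=0.04, mu=0.05, kappa=2.0, theta=0.04,
                    xi=0.3, rho=-0.7, T=1.0, steps=252, seed=5):
    """Euler discretisation of the Heston model:
    dS = mu*S dt + sqrt(v)*S dW1,  dv = kappa*(theta - v) dt + xi*sqrt(v) dW2,
    with corr(dW1, dW2) = rho. All parameter values are illustrative."""
    rng = np.random.default_rng(seed)
    dt = T / steps
    s, v = np.empty(steps + 1), np.empty(steps + 1)
    s[0], v[0] = s0, v0
    for t in range(steps):
        z1 = rng.standard_normal()
        z2 = rho * z1 + np.sqrt(1 - rho ** 2) * rng.standard_normal()
        v[t + 1] = abs(v[t] + kappa * (theta - v[t]) * dt
                       + xi * np.sqrt(v[t] * dt) * z2)   # reflection keeps v >= 0
        s[t + 1] = s[t] * np.exp((mu - 0.5 * v[t]) * dt + np.sqrt(v[t] * dt) * z1)
    return s, v

s, v = simulate_heston()
```

The mean reversion of the variance toward theta is the property the review connects to average energy generation.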
8

Ercegovac, Roberto, Mario Pečarić, and Ivica Klinac. "Bank Risk Profiles and Business Model Characteristics." Journal of Central Banking Theory and Practice 9, no. 3 (September 1, 2020): 107–21. http://dx.doi.org/10.2478/jcbtp-2020-0039.

Abstract:
Current research, especially after the financial crisis, highlights different key determinants of high-risk bank profiles. The main aim of this paper is to test, through an empirical model, the impact of various determinants of bank business models on bank risk, with the purpose of enabling early identification of risk signals and timely application of prudential measures. There are two basic business models for banks: the market-oriented wholesale bank business model and the client-oriented bank business model. In the wholesale model, a significant share of the assets is comprised of securities in the trading portfolio and the bank is strongly involved in the international financial markets, while on the income side a large part is related to non-interest income. In the client-related business model, classical banking is dominant, visible in the high share of loan-related assets, a larger share of self-financing and a larger share of interest income in the total income structure of the bank. In the panel analysis of the empirical data, the ratio of the stock market price to its volatility was used as the indicator of the bank risk profile, on the presumption that the market price and its volatility, for sufficiently liquid shares listed on public stock exchanges, are representative of bank risk. The analysis is conducted on a homogeneous sample of 20 European banks over the period 2002-2017. Following the econometric analysis, the conclusion is that banks whose business models are dominated by wholesale characteristics are more exposed to business risk in periods of market shocks and, as such, represent a danger to the long-term stability of the financial sector.
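The paper's risk proxy, the ratio of the stock market price to its price volatility, can be sketched directly. The rolling-window length and the simulated price path below are illustrative assumptions, not the study's data or exact construction.

```python
import numpy as np

def price_to_volatility(prices, window=60):
    """Price divided by its rolling price volatility (a simple risk-profile proxy)."""
    ratios = np.full(len(prices), np.nan)      # undefined until a full window exists
    for t in range(window, len(prices)):
        vol = np.std(prices[t - window:t])
        ratios[t] = prices[t] / vol if vol > 0 else np.nan
    return ratios

rng = np.random.default_rng(6)
prices = 50 * np.exp(np.cumsum(0.01 * rng.standard_normal(1000)))  # simulated bank stock
risk = price_to_volatility(prices)
```

A lower ratio means the price is small relative to how much it moves, which is the direction in which the paper reads higher bank risk.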
9

Kouki, Ahmed. "IFRS and value relevance." Journal of Applied Accounting Research 19, no. 1 (February 12, 2018): 60–80. http://dx.doi.org/10.1108/jaar-05-2015-0041.

Abstract:
Purpose: The purpose of this paper is to compare the value relevance of accounting information between IFRS firms and non-IFRS firms over five years before mandatory International Financial Reporting Standards (IFRS) adoption (2000-2004) and six years after adoption (2006-2011).
Design/methodology/approach: The sample includes 1,166 firm-year observations covering firms from three European countries. Different econometric tests, multivariate and panel regressions, have been used to verify the hypotheses.
Findings: In the pre-IFRS period, voluntary IFRS adoption did not improve the value relevance of accounting information. The results indicate that the information content of non-IFRS firms is of higher quality in the post-adoption period than in the pre-adoption period. The findings show a higher association between accounting information, stock prices and stock returns over both periods; however, the difference in results is not statistically significant.
Research limitations/implications: This study cannot be generalised to other stock exchanges that carry significant weight in the European Union, such as the FTSE 100 companies or the S&P/MIB.
Practical implications: This study has implications for standard setters, firms and practitioners. The transition to IFRS reduces the diversity of accounting systems and institutional conditions (capital market structure, taxation systems). In addition, mandatory IFRS adoption brought changes in firms' business and organisational models that led accountants to improve their educational and training programmes.
Originality/value: This paper contributes to the value relevance and IFRS literature by using a sample from code-law countries that switched from a debt-oriented to a shareholder-oriented system. It offers a comparative approach between IFRS firms and non-IFRS firms in the pre- and post-adoption periods, whereas prior studies focused on only one period. This empirical evidence should be of interest to investors and policymakers in other markets.
10

Adams, Carol A. "Conceptualising the contemporary corporate value creation process." Accounting, Auditing & Accountability Journal 30, no. 4 (May 15, 2017): 906–31. http://dx.doi.org/10.1108/aaaj-04-2016-2529.

Abstract:
Purpose: The purpose of this paper is to examine and explain the complex interrelationships which influence the ability of firms to create value for their providers of finance and other stakeholders (loosely referred to in practice as "integrated thinking"). In doing so it examines the interrelationships between: environmental, social and governance (ESG) risk; delivering on corporate strategy; non-financial corporate reporting; and board oversight.
Design/methodology/approach: Interviews were conducted with board chairs and non-executive directors of large listed companies on the Johannesburg Stock Exchange (where boards are required to have a social and ethics sub-committee and to approve integrated reports, which have been mandatory since 2010) and the Australian Stock Exchange (where directors' liability legislation makes boards reluctant to adopt integrated reporting, which is voluntary).
Findings: The research finds that contemporary reporting processes, in particular those set out in the King III Code and the International Integrated Reporting Framework, influence cognitive frames, enhancing board oversight and assisting organisations in managing complexity. This results in increased awareness of the impact of ESG issues together with a broader view of value creation despite investor disinterest.
Research limitations/implications: A number of avenues of research are suggested to further examine the interrelationships identified.
Practical implications: The research assists the development of practice and policy by articulating and enhancing the understanding of linkages which loosely fall under the vague practitioner term "integrated thinking".
Social implications: The conceptualisation can inform national and global discussions on the appropriateness of corporate reporting and governance models to achieve sustainable development and contribute to the Sustainable Development Goals.
Originality/value: The paper conceptualises emerging and complex interrelationships. The cross-country comparison allows an assessment of the extent to which different national social contexts, with differing governance and reporting frameworks, lead to different perspectives on, and approaches to, value creation.
11

AlKhouri, Ritab, and Houda Arouri. "The effect of diversification on risk and return in banking sector." International Journal of Managerial Finance 15, no. 1 (February 4, 2019): 100–128. http://dx.doi.org/10.1108/ijmf-01-2018-0024.

Abstract:
Purpose: The purpose of this paper is to investigate the effect of revenue diversification, non-interest income and asset diversification on the performance and stability of the Gulf Cooperation Council (GCC) conventional and Islamic banking systems.
Design/methodology/approach: The authors employ a panel of 69 conventional and Islamic banks listed in six GCC markets over the period 2003-2015, using the system generalized method of moments methodology.
Findings: Non-interest income diversification has a negative impact on GCC banks' performance, while asset-based diversification affects bank performance positively. However, investors tend to penalise the value of banks' assets that are highly diversified. Government intervention, lack of competition, legal protection and strong central bank control over GCC banks have a positive impact on performance. Contrary to the results on conventional banks, asset diversification adds value to Islamic banks. Overall, both revenue and non-interest diversification have a negative impact on GCC banks' stability, while asset diversification improves Islamic banks' stability.
Research limitations/implications: The analysis is limited to a sample of banks listed on the GCC stock exchanges. The lack of data on private and foreign banks operating in the region makes the results specific to shareholding companies. Also, the authors' measures of bank stability might not be appropriate for Islamic banks, given the banking models they implement.
Practical implications: The results provide important implications for regulators, bank managers and policy makers as to the expected ways to support economic diversification through bank diversification strategies.
Originality/value: Unlike related studies, the authors' sample of homogeneous banks has a market structure different from the samples in the literature, which cover either developed countries or heterogeneous samples from both developed and developing countries. Furthermore, using an efficient econometric methodology, the authors deal with two types of banks, conventional and Islamic, and determine which type is better able to benefit from different types of diversification. Unlike previous research, this study explores the sensitivity of the results both to the regulatory environment of the GCC market and to general market conditions.
12

Essefi, Elhoucine. "Homo Sapiens Sapiens Progressive Defaunation During The Great Acceleration: The Cli-Fi Apocalypse Hypothesis." International Journal of Toxicology and Toxicity Assessment 1, no. 1 (July 17, 2021): 18–23. http://dx.doi.org/10.55124/ijt.v1i1.114.

Abstract:
This paper studies the apocalyptic scenario from the perspective of the Great Acceleration. The apocalyptic scenario is not a pure invention of literary works; rather, scientific evidence points to dramatic change in climatic conditions related to the climax of human actions. Modelling of the future climate leads to horrible situations, including intolerable temperatures, dryness, tornadoes, and noticeable sea level rise invading coastal regions. Going beyond these scientific claims, Homo sapiens sapiens has extended his imagination through climate fiction (cli-fi) to propose a dramatic end. Climate fiction has developed into a recording machine containing every kind of fiction that depicts environmental events, and has consequently lost its true significance.
Introduction
The Great Acceleration may be considered the late Anthropocene, in which human actions reached their climax, leading to dramatic climatic changes and paving the way for a possible apocalyptic scenario threatening the existence of humanity. So the apocalyptic scenario is not a pure invention of literary works; many scientific arguments, especially those related to climate change, are in favour of the apocalypse1. As a matter of fact, modelling of the future climate leads to horrible situations, including intolerable temperatures (on 06/07/2021, Kuwait recorded the highest temperature of 53.2 °C), dryness, tornadoes, and noticeable sea level rise invading coastal regions. These conditions, taking place during the Great Acceleration, would have direct repercussions on the human species. Considering that apocalyptic extinctions really did cause the disappearance of many stronger species, including the dinosaurs, Homo sapiens sapiens has extended his imagination through climate fiction (cli-fi) to propose a dramatic end due to severe climate conditions intolerable to humankind.
The mass extinction of animal species has occurred several times over the geological ages. Researchers have a poor understanding of the causes and processes of these major crises1. Nonetheless, whatever the cause of extinction, the apocalyptic scenario has always been present in geological history. For example, the extinction of the dinosaurs, whether by asteroid impact or climate change, by no means denies its apocalyptic aspect2. At the same time, many animal and plant species became extinct, from marine or flying reptiles to marine plankton. This biological crisis of sixty-five million years ago is not the only one the biosphere has suffered. It was preceded and followed by other crises which caused the extinction or rarefaction of animal species. So it is undeniable that many animal groups have disappeared. It is even on these changes of fauna that the geologists of the last century based the scale of geological time, a scale which is still used. But it is no less certain that the extinction processes, extremely complex, are far from being understood. We must first agree on the meaning of the word "extinction", namely on the apocalyptic aspect of the concept. It is quite understood that, without disappearances, the evolution of species could not have followed its course. Aware that apocalyptic extinctions had massacred stronger species that had dominated the planet, Homo sapiens sapiens has recognised that the possibility of an apocalyptic end in the perspective of the Anthropocene (i.e., the Great Acceleration) cannot be excluded. This conviction is motivated by progressive defaunation in some regions3 and the appearance of alien species in others, related to changes in mineralogy and geochemistry4, leading to climate change during the Anthropocene. These scientific claims have fed a vast imagination about climate change, giving rise to the so-called cli-fi.
The Anthropocene is the new geological era which begins when human actions reached sufficient power to modify the geological processes and climatic cycles of the planet5. The Anthropocene by no means excludes the possibility of an apocalyptic horizon, namely in the perspective of the Great Acceleration. On the contrary, two scenarios do indeed seem to dispute the future of the Anthropocene. Stories of the end of the world are as old as the world itself, for the world is the origin of these stories. However, these stories of the apocalypse have evolved over time and, since the beginning of the 19th century, they have been nourished particularly by science and its advances. These fictions have sometimes tried to pass themselves off as science; this is the current vogue called collapsology6. This end is more than likely cli-fi driven7 and may cause the extinction of many species, including Homo sapiens sapiens. In this vein, Anthropocene defaunation has become an ultimate reality8. More than one in eight birds, more than one in five mammals, more than one in four coniferous species, and one in three amphibians are threatened. The hypothesis of a hierarchy within the living is induced by the error of believing that evolution goes from the simplest to the most sophisticated, from the inevitably stupid inferior to the superior endowed with an intelligence giving prerogative to all powers. Evolution goes in all directions and pursues no goal except the extension of life on Earth. Evolution certainly does not lead from bacteria to humans, preferably male and white. Our species is only a carrier of the DNA that precedes us and that will survive us. Until we show deep respect for the biosphere in particular, and our planet in general, we will not become much; we will remain a predator among other predators, the fiercest of predators, the almighty craftsman of the Anthropocene.
To be in the depths of our humanity, giving back to the biosphere what we have taken from it seems obvious. To stop the sixth extinction of species, we must condemn our anthropocentrism and the anthropisation of territories that goes with it. Other forms of life also need to keep their ecological niches. According to the first scenario, humanity seems at first to withdraw from the limits of the planet and ultimately succumb to them. According to the second, from collapse to collapse, it is perhaps another humanity, having overcome its demons, that could come. Climate fiction is a literary sub-genre dealing with the theme of climate change, including global warming. The term appears to have been first used in 2008 by blogger and writer Dan Bloom. In October 2013, Angela Evancie, in a review of the novel Odds Against Tomorrow by Nathaniel Rich, wondered whether climate change has created a new literary genre.
Scientific basis of the apocalyptic scenario in the perspective of the Anthropocene
Global warming
All temperature indices are in favour of global warming (Fig. 1). According to the different scenarios of the IPCC9, global temperatures could increase by 2 °C to 5 °C by 2100, and some scientists warn of a possible runaway warming reaching more than 3 °C. The average temperature at the surface of the globe has already increased by more than 1.1 °C since the pre-industrial era. The rise in average surface temperatures is the first expected and observed consequence of massive greenhouse gas emissions. Meteorological surveys record positive temperature anomalies which are confirmed from year to year compared with the temperatures recorded since the middle of the 19th century. Climatologists point out that the past 30 years have seen the highest temperatures in the Northern Hemisphere for over 1,400 years.
Several climatic centres around the world record, synthesise and follow the evolution of temperatures on Earth. Since the beginning of the 20th century (1906-2005), the average temperature at the surface of the globe has increased by 0.74 °C, but this progression has not been continuous; since 1976 the increase has clearly accelerated, reaching 0.19 °C per decade according to model predictions. Despite the decline in solar activity, the period 1997-2006 is marked by an average positive anomaly of 0.53 °C in the Northern Hemisphere and 0.27 °C in the Southern Hemisphere, still compared with the normal calculated for 1961-1990. The ten hottest years on record all come after 1997; worse, 14 of the 15 hottest years are in the 21st century, which has barely started. Thus 2016 is the hottest year, followed closely by 2015, 2014 and 2010. The temperature of tropical waters increased by 1.2 °C during the 20th century (compared with 0.5 °C on average for the oceans), causing coral reefs to bleach in 1997. In 1998, a period of strong El Niño, the prolonged warming of the water destroyed half of the coral reefs of the Indian Ocean. In addition, the temperature in the tropics of the five ocean basins where cyclones form increased by 0.5 °C from 1970 to 2004, and powerful cyclones appeared in the North Atlantic in 2005, while they were more numerous in other parts of the world. Recently, a great many studies have focused on possible climate change scenarios and their potential worldwide repercussions, including hellish temperatures and apocalyptic extreme events10, 11, 12.
Melting of continental glaciers
As a direct result of global warming, the melting of continental glaciers has recently been observed13. There are approximately 198,000 mountain glaciers in the world, covering an area of approximately 726,000 km2. If they all melted, the sea level would rise by about 40 cm. Since the late 1960s, global snow cover has declined by around 10 to 15%.
Winter cold spells in much of the northern half of the northern hemisphere are two weeks shorter than 100 years ago. Mountain glaciers have been retreating all over the world, by an average of 50 m per decade for 150 years, though they are also subject to strong multi-temporal variations which, according to some specialists, make forecasts on this point difficult. In the Alps, glaciers have been losing 1 metre per year for 30 years. Polar glaciers like those of Spitsbergen (about a hundred kilometres from the North Pole) have been retreating since 1880, releasing large quantities of water. The Arctic has lost about 10% of its permanent ice cover every ten years since 1980. In this region, average temperatures have increased at twice the rate of the rest of the world in recent decades. The melting of Arctic sea ice has resulted in a loss of 15% of its surface area and 40% of its thickness since 1979; the record for melting Arctic sea ice was set in 2017. All models predict the disappearance of Arctic sea ice in summer within a few decades, which will not be without consequences for the climate in Europe; indeed, the summer melting of Arctic sea ice has accelerated far beyond climate model predictions. In addition to its direct repercussions, such as the flooding of coastal regions, the melting of continental ice leads to radical climatic modifications in favour of the apocalyptic scenario.

Fig. 1. Evolution of the temperature anomaly from 1880 to 2020: the apocalyptic scenario

Sea level rise

As a direct result of the melting of continental glaciers, sea level rise has been recorded worldwide [14, 15]. The average level of the oceans has risen by 22 cm since 1880, and by 2 cm since the year 2000, because of the melting of glaciers but also because of the thermal expansion of the water. In the 20th century, the sea level rose by around 2 mm per year; from 1990 to 2017, it reached a relatively constant rate of just over 3 mm per year.
Several sources contributed to sea level increase, including the thermal expansion of water (42%), the melting of continental glaciers (21%), the melting of Greenland glaciers (15%) and the melting of Antarctic glaciers (8%). Since 2003, the rise has remained rapid (around 3.3 mm/year), but the contribution of thermal expansion has decreased (0.4 mm/year) while the melting of the polar caps and continental glaciers has accelerated. Since most of the world's population lives in coastal regions, sea level rise represents a real threat to humanity, not excluding the apocalyptic scenario.

Multiplication of extreme phenomena and climatic anomalies

On a human scale, an average of 200 million people are affected by natural disasters each year, and approximately 70,000 perish in them. As the annual reviews of disasters and climatic anomalies attest, we are witnessing significant warning signs. It is worth noting that these observations depend on meteorological survey systems that exist only in a limited number of countries, with statistics that rarely go back beyond a century or a century and a half. In addition, scientists are struggling to reconstruct the climatic variations of the last two thousand years, which could serve as a reference in the projections. The exceptional nature of this information must therefore be qualified somewhat, since it is still difficult to know the return periods of climatic disasters in each region. But over the last century, everything suggests that the climate system has gone wild: extreme events and disasters have become more frequent. For instance, fewer than 50 significant events were recorded per year over the period 1970-1985, while around 120 events per year have been recorded since 1995. Drought has long been one of the most worrying environmental issues.
But while African countries have so far been the worst affected, the whole world is now facing increasingly frequent and prolonged droughts. Chile, India, Australia, the United States, France and even Russia are all suffering from the acceleration of global drought. Droughts are slowly evolving natural hazards that can last from a few months to several decades and affect larger or smaller areas, from small watersheds to areas of hundreds of thousands of square kilometres. In addition to their direct effects on water resources, agriculture and ecosystems, droughts can cause fires or heat waves. They also promote the proliferation of invasive species, creating environments with multiple risks, worsening the consequences for ecosystems and societies, and increasing their vulnerability. Although these are natural phenomena, there is a growing understanding of how humans have amplified the severity and impacts of droughts, both on the environment and on people. We influence meteorological droughts through our action on climate change, and we influence hydrological droughts through our management of water circulation and water processes at the local scale, for example by diverting rivers or modifying land use. During the Anthropocene (the present period, when humans exert a dominant influence on climate and environment), droughts are closely linked to human activities, cultures, and responses. From this scientific overview, it may be concluded that the apocalyptic scenario is not merely a literary genre born of pure imagination; instead, many scientific arguments point towards this dramatic destiny of Homo sapiens.

Fig. 2. Sea level rise from 1880 to 2020: a possible apocalyptic scenario (www.globalchange.gov, 2021)

Apocalyptic genre in recent writing

As the original landmark of apocalyptic writing, we must place the destruction of the Temple of Jerusalem in 587 BC and the Exile in Babylon.
An occasion of religious and cultural crossing with indelible effects, the Exile brought about a true rebirth, characterized by the preservation of the ethical, and even cultural, essence of a national religion, that of Moses, kept as pure as possible on foreign soil, and by the reinterpretation of this fundamental heritage through an archaic return to what was very old, both national traditions and neighbouring cultures. More precisely, it was the place and time for the rehabilitation of cultures and the melting pot for recasting ancient myths. This vast infatuation with Antiquity, remarkable even in the vocabulary used, was not limited to Israel: it largely reflected a general trend. The long period that preceded it, throughout the 7th century BC and until 587, like that prior to the edict of Cyrus in 538 BC, was one of restorations and rebirths, of returns to distant sources and cultural crossings. In the biblical literature of this period, one is struck by the almost systematic link between, on the one hand, a very sustained mythical reinvestment, even in form, and, on the other, the frequent use of biblical archaisms. The example of Shadday, a word firmly rooted among the Semites of the Northwest and an epithet of El in the oldest layers of the books of Genesis and Exodus, is most eloquent: this term reappears precisely at the time of the Exile as a designation of the divinity of the Patriarchs and of the God of Israel. Nowadays, ecological catastrophes describe the normal state of societies exposed to "risks", in the sense that Ulrich Beck gives to this term: "the risk society is a society of catastrophe. The state of emergency threatens to become a normal state there" [1]. Now the "threat" has become clearer, and catastrophic "exceptions" are proliferating as quickly as species are disappearing and climate change is accelerating.
The relationship that we have with this worrying reality is, to say the least, twofold: on the one hand, we know very well what is happening to us; on the other hand, we fail to draw the appropriate theoretical and political consequences. This ecological duplicity is at the heart of what has come to be called the "Anthropocene", a term coined at the dawn of the 21st century by Eugene Stoermer (an environmentalist) and Paul Crutzen (a specialist in the chemistry of the atmosphere) to describe an age in which humanity has become a "major geological force" capable of disrupting the climate and changing the terrestrial landscape from top to bottom. If the term "Anthropocene" takes note of human responsibility for climate change, this responsibility is immediately attributed to overpowering strength: strong as we are, we have "involuntarily" changed the climate for at least two hundred and fifty years; therefore, let us deliberately change the face of the Earth and, if necessary, install a solar shield in space. Recognition and denial together fuel the signifying machine of the Anthropocene, and it is precisely this structure of eco-apocalyptic cinema that this article aims to study. By "eco-apocalyptic cinema", we first mean a cinematographic sub-genre: eco-apocalyptic and post-eco-apocalyptic films base the possibility (or reality) of the end of the world on environmental grounds and not, for example, on damage caused by the possible collision of planet Earth with a comet. Post-apocalyptic science fiction (sometimes abbreviated as "post-apo" or "post-nuke") is a sub-genre of science fiction that depicts life after a disaster that destroyed civilization: nuclear war, collision with a meteorite, epidemic, economic or energy crisis, pandemic, alien invasion.

Conclusion

Climate and politics have been linked since Aristotle, and with Montesquieu, Ibn Khaldûn or Watsuji, a certain climatic determinism was attributed to the character of a nation.
The break with modernity made the climate an object of scientific knowledge which, in the twentieth century, made it possible to document, despite the controversies, the climatic changes linked to industrialization. Both endanger the survival of human beings and ecosystems. Climate ethics is therefore looking for a new relationship with the biosphere, or Gaia. For some, given the absence of political agreements, this is the beginning of inevitable catastrophes; for others, the Anthropocene, which henceforth merges human history with natural history, opens onto technical action. The debate between climate determinism and human freedom is thus revived. The reference to the biblical Apocalypse was present in the thinking of philosophers like Günther Anders, Karl Jaspers or Hans Jonas: the era of the atomic bomb would mark an entry into the time of the end, a time marked by the unprecedented human possibility of total war and the annihilation of mankind. The Apocalypse remains highly relevant for describing the chaos to come if our societies continue their mad race, described as extractivist, productivist and consumerist. In dialogue with different theologians and philosophers (such as Jacques Ellul), it is possible to unveil some spiritual, ethical, and political resources that the Apocalypse offers for thinking about history and human engagement in the Anthropocene. What can a theology of collapse mean at a time when negative signs and dead ends in the human situation multiply? What is the place of man and of the cosmos in the Apocalypse according to Saint John? Could the end of history be a collapse? How can we live in the time we have left before the disaster? Answers to such questions remain unknown, and no scientist can predict the trajectory of the Great Acceleration taking place in the late Anthropocene. When science cannot give answers, humanity tries to infer its destiny from legend, religion and fiction.
Climate fiction has developed into a catch-all containing every kind of fiction that depicts environmental events, and has consequently lost its true significance. Aware of the prospect of ecological collapse, as well as of our apparent inability to avert it, we face geological changes of drastic proportions that severely challenge our ability to imagine the consequences. Climate fiction ought to be considered an important supplement to climate science, because it makes visible and conceivable future modes of existence within worlds not only deemed likely by science, but scientifically anticipated. Hence, this chapter, as part of the book itself, aims to contribute to studies of ecocriticism, the environmental humanities, and literary and culture studies.

References

David P.G. Bond and Stephen E. Grasby. "Late Ordovician mass extinction caused by volcanism, warming, and anoxia, not cooling and glaciation: REPLY." Geology 48, no. 8 (Geological Society of America 2020): 510.
Cyril Langlois. 'Vestiges de l'apocalypse: le site de Tanis, Dakota du Nord 2019'. Accessed June 6, 2021. https://planet-terre.ens-lyon.fr/pdf/Tanis-extinction-K-Pg.pdf
Najoua Gharsalli, Elhoucine Essefi, Rana Baydoun, and Chokri Yaich. 'The Anthropocene and Great Acceleration as controversial epoch of human-induced activities: case study of the Halk El Menjel wetland, eastern Tunisia'. Applied Ecology and Environmental Research 18(3) (Corvinus University of Budapest 2020): 4137-4166.
Elhoucine Essefi. 'On the Geochemistry and Mineralogy of the Anthropocene'. International Journal of Water and Wastewater Treatment 6(2) (Sci Forschen 2020): 1-14. doi.org/10.16966/2381-5299.168
Elhoucine Essefi. 'Record of the Anthropocene-Great Acceleration along a core from the coast of Sfax, southeastern Tunisia'. Turkish Journal of Earth Sciences (TÜBİTAK 2021): 1-16.
Chiara Xausa. 'Climate Fiction and the Crisis of Imagination: Alexis Wright's Carpentaria and The Swan Book'. Exchanges: The Interdisciplinary Research Journal 8(2) (Warwick 2021): 99-119.
Akyol, Özlem. "Climate Change: An Apocalypse for Urban Space? An Ecocritical Reading of "Venice Drowned" and "The Tamarisk Hunter"." Folklor/Edebiyat 26, no. 101 (Uluslararası Kıbrıs Üniversitesi 2020): 115-126.
Boswell, Suzanne F. "The Four Tourists of the Apocalypse: Figures of the Anthropocene in Caribbean Climate Fiction." Paradoxa 31 (Academia 2020): 359-378.
Ayt Ougougdal, Houssam, Mohamed Yacoubi Khebiza, Mohammed Messouli, and Asia Lachir. "Assessment of future water demand and supply under IPCC climate change and socio-economic scenarios, using a combination of models in Ourika Watershed, High Atlas, Morocco." Water 12, no. 6 (MDPI 2020): 1751. DOI:10.3390/w12061751
Wu, Jia, Zhenyu Han, Ying Xu, Botao Zhou, and Xuejie Gao. "Changes in extreme climate events in China under 1.5 °C–4 °C global warming targets: Projections using an ensemble of regional climate model simulations." Journal of Geophysical Research: Atmospheres 125, no. 2 (Wiley 2020): e2019JD031057. https://doi.org/10.1029/2019JD031057
Khan, Md Jamal Uddin, A. K. M. Islam, Sujit Kumar Bala, and G. M. Islam. "Changes in climate extremes over Bangladesh at 1.5 °C, 2 °C, and 4 °C of global warming with high-resolution regional climate modeling." Theoretical & Applied Climatology 140 (EBSCO 2020).
Gudoshava, Masilin, Herbert O. Misiani, Zewdu T. Segele, Suman Jain, Jully O. Ouma, George Otieno, Richard Anyah et al. "Projected effects of 1.5 °C and 2 °C global warming levels on the intra-seasonal rainfall characteristics over the Greater Horn of Africa." Environmental Research Letters 15, no. 3 (IOPscience 2020): 34-37.
Wang, Lawrence K., Mu-Hao Sung Wang, Nai-Yi Wang, and Josephine O. Wong. "Effect of Global Warming and Climate Change on Glaciers and Salmons." In Integrated Natural Resources Management, ed. Lawrence K. Wang, Mu-Hao Sung Wang, Yung-Tse Hung, Nazih K. Shammas (Springer 2021), 1-36.
Merschroth, Simon, Alessio Miatto, Steffi Weyand, Hiroki Tanikawa, and Liselotte Schebek. "Lost Material Stock in Buildings due to Sea Level Rise from Global Warming: The Case of Fiji Islands." Sustainability 12, no. 3 (MDPI 2020): 834. doi:10.3390/su12030834
Hofer, Stefan, Charlotte Lang, Charles Amory, Christoph Kittel, Alison Delhasse, Andrew Tedstone, and Xavier Fettweis. "Greater Greenland Ice Sheet contribution to global sea level rise in CMIP6." Nature Communications 11, no. 1 (Nature Publishing Group 2020): 1-11.
13

Grigoryev, Ruslan A. "Non-synchronous time series is the main reason of US stock exchanges leadership in classic econometric models." Actual Problems of Economics and Law 12, no. 2 (July 2, 2018). http://dx.doi.org/10.21202/1993-047x.12.2018.2.241-255.

14

Gregoriou, Andros, and Robert Hudson. "Market frictions and the geographical location of global stock exchanges. Evidence from the S&P Global Index." Journal of Economic Studies ahead-of-print, ahead-of-print (May 22, 2020). http://dx.doi.org/10.1108/jes-03-2020-0091.

Abstract:
Purpose: We examine the impact of market frictions in the form of trading costs on investor average holding periods for stocks in the S&P Global 1200 index, to examine constraints on international portfolio diversification.
Design/methodology/approach: We determine whether it is appropriate to pool stocks listed in the USA, Canada, Latin America, Europe, Japan, Asia and Australia into investigations using the same empirical specification. This is very important because the pooled effects may not provide consistent estimates of the average.
Findings: We report overwhelming econometric evidence that it is not valid to pool stocks in all the underlying regional equity indices for our investigation, indicating that the effect of frictions varies between markets.
Research limitations/implications: When we pool the stocks within markets, we discover that for companies listed in the USA, Europe, Canada and Australia, market frictions do not significantly influence holding periods and hence are not a barrier to portfolio rebalancing. However, companies listed in Latin America and Asia face market frictions which are significant in terms of increasing holding periods.
Practical implications: We ascertain that taking into account the properties of stock markets in different geographical locations is vital for understanding the limits on achieving international portfolio diversification.
Originality/value: Unlike prior research, we overcome the problems caused by contemporaneous correlation, endogeneity and joint determination of investor average holding periods and trading costs by employing the Generalized Method of Moments (GMM) system panel estimator. This makes our empirical estimates robust and more reliable than previous empirical research in this area.
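To make the abstract's setup concrete, the sketch below simulates the kind of regression it describes: a holding-period proxy regressed on a trading-cost measure. Everything here is invented for illustration (the sample size, the uniform distributions, the bid-ask spread as the friction measure), and the estimator is deliberately the naive pooled OLS benchmark, not the system GMM panel estimator the authors actually use to handle endogeneity.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500  # hypothetical firm-year observations, not the paper's data

# A common holding-period proxy: average holding period (in years) is the
# reciprocal of annual share turnover (annual volume / shares outstanding).
turnover = rng.uniform(0.5, 4.0, n)
holding_period = 1.0 / turnover

# Hypothetical friction measure: relative bid-ask spread.
spread = rng.uniform(0.001, 0.05, n)

# Naive pooled OLS of holding period on spread. The paper rejects pooling
# across regions and uses a system GMM panel estimator instead; this block
# only shows the pooled specification being tested.
X = np.column_stack([np.ones(n), spread])
beta, *_ = np.linalg.lstsq(X, holding_period, rcond=None)
print(f"intercept={beta[0]:.3f}, spread coefficient={beta[1]:.3f}")
```

With simulated data the spread coefficient is uninformative; the point is only the shape of the specification whose poolability the paper tests.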
15

Burns, Alex. "Oblique Strategies for Ambient Journalism." M/C Journal 13, no. 2 (April 15, 2010). http://dx.doi.org/10.5204/mcj.230.

Abstract:
Alfred Hermida recently posited ‘ambient journalism’ as a new framework for para- and professional journalists, who use social networks like Twitter for story sources, and as a news delivery platform. Beginning with this framework, this article explores the following questions: How does Hermida define ‘ambient journalism’ and what is its significance? Are there alternative definitions? What lessons do current platforms provide for the design of future, real-time platforms that ‘ambient journalists’ might use? What lessons does the work of Brian Eno provide–the musician and producer who coined the term ‘ambient music’ over three decades ago? My aim here is to formulate an alternative definition of ambient journalism that emphasises craft, skills acquisition, and the mental models of professional journalists, which are the foundations more generally for journalism practices. Rather than Hermida’s participatory media context I emphasise ‘institutional adaptiveness’: how journalists and newsrooms in media institutions rely on craft and skills, and how emerging platforms can augment these foundations, rather than replace them.

Hermida’s Ambient Journalism and the Role of Journalists

Hermida describes ambient journalism as: “broad, asynchronous, lightweight and always-on communication systems [that] are creating new kinds of interactions around the news, and are enabling citizens to maintain a mental model of news and events around them” (Hermida 2). His ideas appear to have two related aspects. He conceives ambient journalism as an “awareness system” between individuals that functions as a collective intelligence or kind of ‘distributed cognition’ at a group level (Hermida 2, 4-6). Facebook, Twitter and other online social networks are examples. Hermida also suggests that such networks enable non-professionals to engage in ‘communication’ and ‘conversation’ about news and media events (Hermida 2, 7).
In a helpful clarification, Hermida observes that ‘para-journalists’ are like the paralegals or non-lawyers who provide administrative support in the legal profession and, in academic debates about journalism, are more commonly known as ‘citizen journalists’. Thus, Hermida’s ambient journalism appears to be: (1) an information systems model of new platforms and networks, and (2) a normative argument that these tools empower ‘para-journalists’ to engage in journalism and real-time commentary. Hermida’s thesis is intriguing and worthy of further discussion and debate. As currently formulated however it risks sharing the blind-spots and contradictions of the academic literature that Hermida cites, which suffers from poor theory-building (Burns). A major reason is that the participatory media context on which Hermida often builds his work has different mental models and normative theories than the journalists or media institutions that are the target of critique. Ambient journalism would be a stronger and more convincing framework if these incorrect assumptions were jettisoned. Others may also potentially misunderstand what Hermida proposes, because the academic debate is often polarised between para-journalists and professional journalists, due to different views about institutions, the politics of knowledge, decision heuristics, journalist training, and normative theoretical traditions (Christians et al. 126; Cole and Harcup 166-176). In the academic debate, para-journalists or ‘citizen journalists’ may be said to have a communitarian ethic and desire more autonomous solutions to journalists who are framed as uncritical and reliant on official sources, and to media institutions who are portrayed as surveillance-like ‘monitors’ of society (Christians et al. 124-127). This is however only one of a range of possible relationships. Sole reliance on para-journalists could be a premature solution to a more complex media ecology. 
Journalism craft, which does not rely just on official sources, also has a range of practices that already provides the “more complex ways of understanding and reporting on the subtleties of public communication” sought (Hermida 2). Citizen- and para-journalist accounts may overlook micro-studies in how newsrooms adopt technological innovations and integrate them into newsgathering routines (Hemmingway 196). Thus, an examination of the realities of professional journalism will help to cast a better light on how ambient journalism can shape the mental models of para-journalists, and provide more rigorous analysis of news and similar events. Professional journalism has several core dimensions that para-journalists may overlook. Journalism’s foundation as an experiential craft includes guidance and norms that orient the journalist to information, and that includes practitioner ethics. This craft is experiential; the basis for journalism’s claim to “social expertise” as a discipline; and more like the original Linux and Open Source movements which evolved through creative conflict (Sennett 9, 25-27, 125-127, 249-251). There are learnable, transmissible skills to contextually evaluate, filter, select and distil the essential insights. This craft-based foundation and skills informs and structures the journalist’s cognitive witnessing of an event, either directly or via reconstructed, cultivated sources. The journalist publishes through a recognised media institution or online platform, which provides communal validation and verification. There is far more here than the academic portrayal of journalists as ‘gate-watchers’ for a ‘corporatist’ media elite. Craft and skills distinguish the professional journalist from Hermida’s para-journalist. Increasingly, media institutions hire journalists who are trained in other craft-based research methods (Burns and Saunders). 
Bethany McLean who ‘broke’ the Enron scandal was an investment banker; documentary filmmaker Errol Morris first interviewed serial killers for an early project; and Neil Chenoweth used ‘forensic accounting’ techniques to investigate Rupert Murdoch and Kerry Packer. Such expertise allows the journalist to filter information, and to mediate any influences in the external environment, in order to develop an individualised, ‘embodied’ perspective (Hofstadter 234; Thompson; Garfinkel and Rawls). Para-journalists and social network platforms cannot replace this expertise, which is often unique to individual journalists and their research teams.

Ambient Journalism and Twitter

Current academic debates about how citizen- and para-journalists may augment or even replace professional journalists can often turn into legitimation battles whether the ‘de facto’ solution is a social media network rather than a media institution. For example, Hermida discusses Twitter, a micro-blogging platform that allows users to post 140-character messages that are small, discrete information chunks, for short-term and episodic memory. Twitter enables users to monitor other users, to group other messages, and to search for terms specified by a hashtag. Twitter thus illustrates how social media platforms can make data more transparent and explicit to non-specialists like para-journalists. In fact, Twitter is suitable for five different categories of real-time information: news, pre-news, rumours, the formation of social media and subject-based networks, and “molecular search” using granular data-mining tools (Leinweber 204-205). In this model, the para-journalist acts as a navigator and “way-finder” to new information (Morville, Findability). Jaron Lanier, an early designer of ‘virtual reality’ systems, is perhaps the most vocal critic of relying on groups of non-experts and tools like Twitter, instead of individuals who have professional expertise.
For Lanier, what underlies debates about citizen- and para-journalists is a philosophy of “cybernetic totalism” and “digital Maoism” which exalts the Internet collective at the expense of truly individual views. He is deeply critical of Hermida’s chosen platform, Twitter: “A design that shares Twitter’s feature of providing ambient continuous contact between people could perhaps drop Twitter’s adoration of fragments. We don’t really know, because it is an unexplored design space” [emphasis added] (Lanier 24). In part, Lanier’s objection is traceable back to an unresolved debate on human factors and design in information science. Influenced by the post-war research into cybernetics, J.C.R. Licklider proposed a cyborg-like model of “man-machine symbiosis” between computers and humans (Licklider). In turn, Licklider’s framework influenced Douglas Engelbart, who shaped the growth of human-computer interaction, and the design of computer interfaces, the mouse, and other tools (Engelbart). In taking a system-level view of platforms Hermida builds on the strength of Licklider and Engelbart’s work. Yet because he focuses on para-journalists, and does not appear to include the craft and skills-based expertise of professional journalists, it is unclear how he would answer Lanier’s fears about how reliance on groups for news and other information is superior to individual expertise and judgment. Hermida’s two case studies point to this unresolved problem. Both cases appear to show how Twitter provides quicker and better forms of news and information, thereby increasing the effectiveness of para-journalists to engage in journalism and real-time commentary. However, alternative explanations may exist that raise questions about Twitter as a new platform, and thus these cases might actually reveal circumstances in which ambient journalism may fail. 
Hermida alludes to how para-journalists now fulfil the earlier role of ‘first responders’ and stringers, in providing the “immediate dissemination” of non-official information about disasters and emergencies (Hermida 1-2; Haddow and Haddow 117-118). Whilst important, this is really a specific role. In fact, disaster and emergency reporting occurs within well-established practices, professional ethics, and institutional routines that may involve journalists, government officials, and professional communication experts (Moeller). Officials and emergency management planners are concerned that citizen- or para-journalism is equated with the craft and skills of professional journalism. The experience of these officials and planners in 2005’s Hurricane Katrina in the United States, and in 2009’s Black Saturday bushfires in Australia, suggests that whilst para-journalists might be ‘first responders’ in a decentralised, complex crisis, they are perceived to spread rumours and potential social unrest when people need reliable information (Haddow and Haddow 39). These terms of engagement between officials, planners and para-journalists are still to be resolved. Hermida readily acknowledges that Twitter and other social network platforms are vulnerable to rumours (Hermida 3-4; Sunstein). However, his other case study, Iran’s 2009 election crisis, further complicates the vision of ambient journalism, and always-on communication systems in particular. Hermida discusses several events during the crisis: the US State Department request to halt a server upgrade, how the Basij’s shooting of bystander Neda Soltan was captured on a mobile phone camera, the spread across social network platforms, and the high-velocity number of ‘tweets’ or messages during the first two weeks of Iran’s electoral uncertainty (Hermida 1). The US State Department was interested in how Twitter could be used for non-official sources, and to inform people who were monitoring the election events. 
Twitter’s perceived ‘success’ during Iran’s 2009 election now looks rather different when other factors are considered such as: the dynamics and patterns of Tehran street protests; Iran’s clerics who used Soltan’s death as propaganda; claims that Iran’s intelligence services used Twitter to track down and to kill protestors; the ‘black box’ case of what the US State Department and others actually did during the crisis; the history of neo-conservative interest in a Twitter-like platform for strategic information operations; and the Iranian diaspora’s incitement of Tehran student protests via satellite broadcasts. Iran’s 2009 election crisis has important lessons for ambient journalism: always-on communication systems may create noise and spread rumours; ‘mirror-imaging’ of mental models may occur, when other participants have very different worldviews and ‘contexts of use’ for social network platforms; and the new kinds of interaction may not lead to effective intervention in crisis events. Hermida’s combination of news and non-news fragments is the perfect environment for psychological operations and strategic information warfare (Burns and Eltham).

Lessons of Current Platforms for Ambient Journalism

We have discussed some unresolved problems for ambient journalism as a framework for journalists, and as mental models for news and similar events. Hermida’s goal of an “awareness system” faces a further challenge: the phenomenological limitations of human consciousness to deal with information complexity and ambiguous situations, whether by becoming ‘entangled’ in abstract information or by developing new, unexpected uses for emergent technologies (Thackara; Thompson; Hofstadter 101-102, 186; Morville, Findability, 55, 57, 158). The recursive and reflective capacities of human consciousness impose their own epistemological frames.
It’s still unclear how Licklider’s human-computer interaction will shape consciousness, but Douglas Hofstadter’s art and video-based group experiments may be suggestive. Hofstadter observes: “the interpenetration of our worlds becomes so great that our worldviews start to fuse” (266). Current research into user experience and information design provides some validation of Hofstadter’s experience, such as how Google is now the ‘default’ search engine, and how its interface design shapes the user’s subjective experience of online search (Morville, Findability; Morville, Search Patterns). Several models of Hermida’s awareness system already exist that build on Hofstadter’s insight. Within the information systems field, ongoing research into artificial intelligence (‘expert systems’ that model expertise as algorithms and decision rules, genetic algorithms, and evolutionary computation) has attempted to achieve Hermida’s goal. What these systems share are mental models of cognition, learning and adaptiveness to new information, often with forecasting and prediction capabilities. Such systems work in journalism areas such as finance and sports that involve analytics, data-mining and statistics, and in related fields such as health informatics where there are clear, explicit guidelines on information and international standards. After a mid-1980s investment bubble (Leinweber 183-184), these systems now underpin the technology platforms of global finance and news intermediaries. Bloomberg LP’s ubiquitous dual-screen computers, proprietary network and data analytics (www.bloomberg.com), and its competitors such as Thomson Reuters (www.thomsonreuters.com and www.reuters.com), illustrate how financial analysts and traders rely on an “awareness system” to navigate global stock-markets (Clifford and Creswell).
For example, a Bloomberg subscriber can access real-time analytics from exchanges, markets, and from data vendors such as Dow Jones, NYSE Euronext and Thomson Reuters. They can use portfolio management tools to evaluate market information, to make allocation and trading decisions, to monitor ‘breaking’ news, and to integrate this information. Twitter is perhaps the para-journalist’s equivalent of the Bloomberg platform on which professional journalists and finance analysts rely for real-time market and business information. Already, hedge funds like PhaseCapital are data-mining Twitter’s ‘tweets’ or messages for rumours and shifts in stock-market sentiment, and analysing them for potential trading patterns (Pritchett and Palmer). The US-based Securities and Exchange Commission, and researchers like David Gelernter and Paul Tetlock, have also shown the benefits of applied data-mining for regulatory market supervision, in particular to uncover analysts who provide ‘whisper numbers’ to online message boards, and who have access to material, non-public information (Leinweber 60, 136, 144-145, 208, 219, 241-246). Hermida’s framework might be developed further for such regulatory supervision. Hermida’s awareness system may also benefit from the algorithms found in the high-frequency trading (HFT) systems that Citadel Group, Goldman Sachs, Renaissance Technologies, and other quantitative financial institutions use. Rather than human traders, HFT uses co-located servers and complex algorithms to make high-volume trades that take advantage of microsecond changes in stock-market prices (Duhigg). HFT capabilities are shrouded in secrecy, and became the focus of regulatory attention after several high-profile investigations of traders alleged to have stolen the software code (Bray and Bunge).
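At its simplest, the kind of sentiment data-mining attributed above to funds such as PhaseCapital can be sketched as lexicon-based scoring over a stream of messages. The sketch below is a toy illustration only, not any firm’s actual method: the word lists, window size, and shift threshold are invented for the example.

```python
# Toy lexicon-based sentiment scoring over a message stream.
# The word lists and threshold are illustrative inventions,
# not any trading firm's actual model.

POSITIVE = {"rally", "gain", "bullish", "beat", "surge"}
NEGATIVE = {"fear", "loss", "bearish", "miss", "crash"}

def score(message: str) -> int:
    """Score one message: +1 per positive word, -1 per negative word."""
    words = message.lower().split()
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def sentiment_shift(messages, window=3, threshold=2):
    """Compare the summed score of the latest window of messages
    against the preceding window; report a shift if the difference
    exceeds the threshold."""
    if len(messages) < 2 * window:
        return None  # not enough history to compare two windows
    recent = sum(score(m) for m in messages[-window:])
    prior = sum(score(m) for m in messages[-2 * window:-window])
    diff = recent - prior
    if diff >= threshold:
        return "bullish shift"
    if diff <= -threshold:
        return "bearish shift"
    return "no shift"
```

Real systems replace the word lists with trained statistical models and weight messages by source credibility, but the underlying move is the same: converting a high-velocity stream of ambient fragments into a single, tradable signal.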
One public example is Streambase (www.streambase.com), a ‘complex event processing’ (CEP) platform that can be used in HFT, commercialised from the Project Aurora research collaboration between Brandeis University, Brown University, and Massachusetts Institute of Technology. CEP and HFT may be the ‘killer apps’ of Hermida’s awareness system. Alternatively, they may confirm Jaron Lanier’s worst fears: your data-stream and user-generated content can be harvested by others, for their gain and your loss.

Conclusion: Brian Eno and Redefining ‘Ambient Journalism’

On the basis of the above discussion, I suggest a modified definition of Hermida’s thesis: ‘ambient journalism’ is an emerging analytical framework for journalists, informed by cognitive, cybernetic, and information systems research. It ‘sensitises’ the individual journalist, whether professional or ‘para-professional’, to observe and to evaluate their immediate context. In doing so, ‘ambient journalism’, like journalism generally, emphasises ‘novel’ information. It can also inform the design of real-time platforms for journalistic sources and news delivery. Individual ‘ambient journalists’ can learn much from the career of musician and producer Brian Eno. His personal definition of ‘ambient’ is “an atmosphere, or a surrounding influence: a tint,” which relies on the co-evolution of the musician, creative horizons, and studio technology as a tool, just as para-journalists use Twitter as a platform (Sheppard 278; Eno 293-297). Like para-journalists, Eno claims to be a “self-educated but largely untrained” musician and yet also a craft-based producer (McFadzean; Tamm 177, 44-50). Perhaps Eno would frame the distinction between para-journalist and professional journalist as “axis thinking” (Eno 298, 302), needlessly polarised by different normative theories, stances, and practices.
Furthermore, I would argue that Eno’s worldview was shaped by influences similar to those of Licklider and Engelbart, who appear to have informed Hermida’s assumptions. These influences include the mathematician and game theorist John von Neumann and biologist Richard Dawkins (Eno 162); musicians Erik Satie and John Cage, and Cage’s book Silence (Eno 19-22, 162; Sheppard 22, 36, 378-379); and the field of self-organising systems, in particular cyberneticist Stafford Beer (Eno 245; Tamm 86; Sheppard 224). Eno summed up the central lesson of this theoretical corpus during his collaborations with New York’s ‘No Wave’ scene in 1978, as “people experimenting with their lives” (Eno 253; Reynolds 146-147; Sheppard 290-295). Importantly, he developed a personal view of normative theories through practice-based research, on a range of projects, and with different creative and collaborative teams. Rather than a technological solution, Eno settled on a way to encode his craft and skills into a quasi-experimental, transmittable method, an aim of practitioner development in professional journalism. Even if only a “founding myth,” the story of Eno’s 1975 street accident with a taxi, and how he conceived ‘ambient music’ during his hospital stay, illustrates how ambient journalists might perceive something new in specific circumstances (Tamm 131; Sheppard 186-188). More tellingly, this background informed his collaboration with the late painter Peter Schmidt to co-create the Oblique Strategies deck of aphorisms: aleatory, oracular messages that appeared dependent on chance, luck, and randomness, but that in fact were based on Eno and Schmidt’s creative philosophy and work guidelines (Tamm 77-78; Sheppard 178-179; Reynolds 170). In short, Eno was engaging in the kind of reflective practice that underpins exemplary professional journalism.
Journalists and practitioners who adopt Hermida’s framework could learn much from the published accounts of Eno’s practice-based research, in the context of creative projects and collaborative teams. In particular, these detail the contexts and choices of Eno’s early ambient music recordings (Sheppard 199-200); Eno’s duels with David Bowie during ‘Sense of Doubt’ for the Heroes album (Tamm 158; Sheppard 254-255); troubled collaborations with Talking Heads and David Byrne (Reynolds 165-170; Sheppard 338-347, 353); a curatorial, mentor role on U2’s The Unforgettable Fire (Sheppard 368-369); the ‘grand, stadium scale’ experiments of U2’s 1991-93 ZooTV tour (Sheppard 404); the Zorn-like games of Bowie’s Outside album (Eno 382-389); and the ‘generative’ artwork 77 Million Paintings (Eno 330-332; Tamm 133-135; Sheppard 278-279; Eno 435). Eno is clearly a highly flexible maker and producer. Developing such flexibility would ensure ambient journalism remains open to novelty as an analytical framework that may enhance the practitioner development and work of professional journalists and para-journalists alike.

Acknowledgments

The author thanks editor Luke Jaaniste, Alfred Hermida, and the two blind peer reviewers for their constructive feedback and reflective insights.

References

Bray, Chad, and Jacob Bunge. “Ex-Goldman Programmer Indicted for Trade Secrets Theft.” The Wall Street Journal 12 Feb. 2010. 17 March 2010 ‹http://online.wsj.com/article/SB10001424052748703382904575059660427173510.html›.
Burns, Alex. “Select Issues with New Media Theories of Citizen Journalism.” M/C Journal 11.1 (2008). 17 March 2010 ‹http://journal.media-culture.org.au/index.php/mcjournal/article/view/30›.
———, and Barry Saunders. “Journalists as Investigators and ‘Quality Media’ Reputation.” Record of the Communications Policy and Research Forum 2009. Eds. Franco Papandrea and Mark Armstrong. Sydney: Network Insight Institute, 281-297. 17 March 2010 ‹http://eprints.vu.edu.au/15229/1/CPRF09BurnsSaunders.pdf›.
———, and Ben Eltham. “Twitter Free Iran: An Evaluation of Twitter’s Role in Public Diplomacy and Information Operations in Iran’s 2009 Election Crisis.” Record of the Communications Policy and Research Forum 2009. Eds. Franco Papandrea and Mark Armstrong. Sydney: Network Insight Institute, 298-310. 17 March 2010 ‹http://eprints.vu.edu.au/15230/1/CPRF09BurnsEltham.pdf›.
Christians, Clifford G., Theodore Glasser, Denis McQuail, Kaarle Nordenstreng, and Robert A. White. Normative Theories of the Media: Journalism in Democratic Societies. Champaign, IL: University of Illinois Press, 2009.
Clifford, Stephanie, and Julie Creswell. “At Bloomberg, Modest Strategy to Rule the World.” The New York Times 14 Nov. 2009. 17 March 2010 ‹http://www.nytimes.com/2009/11/15/business/media/15bloom.html?ref=businessandpagewanted=all›.
Cole, Peter, and Tony Harcup. Newspaper Journalism. Thousand Oaks, CA: Sage Publications, 2010.
Duhigg, Charles. “Stock Traders Find Speed Pays, in Milliseconds.” The New York Times 23 July 2009. 17 March 2010 ‹http://www.nytimes.com/2009/07/24/business/24trading.html?_r=2andref=business›.
Engelbart, Douglas. “Augmenting Human Intellect: A Conceptual Framework, 1962.” Cyber Reader: Critical Writings for the Digital Era. Ed. Neil Spiller. London: Phaidon Press, 2002. 60-67.
Eno, Brian. A Year with Swollen Appendices. London: Faber and Faber, 1996.
Garfinkel, Harold, and Anne Warfield Rawls. Toward a Sociological Theory of Information. Boulder, CO: Paradigm Publishers, 2008.
Haddow, George D., and Kim S. Haddow. Disaster Communications in a Changing Media World. Burlington, MA: Butterworth-Heinemann, 2009.
Hemmingway, Emma. Into the Newsroom: Exploring the Digital Production of Regional Television News. Milton Park: Routledge, 2008.
Hermida, Alfred. “Twittering the News: The Emergence of Ambient Journalism.” Journalism Practice 4.3 (2010): 1-12.
Hofstadter, Douglas. I Am a Strange Loop. New York: Perseus Books, 2007.
Lanier, Jaron. You Are Not a Gadget: A Manifesto. London: Allen Lane, 2010.
Leinweber, David. Nerds on Wall Street: Math, Machines and Wired Markets. Hoboken, NJ: John Wiley and Sons, 2009.
Licklider, J.C.R. “Man-Computer Symbiosis, 1960.” Cyber Reader: Critical Writings for the Digital Era. Ed. Neil Spiller. London: Phaidon Press, 2002. 52-59.
McFadzean, Elspeth. “What Can We Learn from Creative People? The Story of Brian Eno.” Management Decision 38.1 (2000): 51-56.
Moeller, Susan. Compassion Fatigue: How the Media Sell Disease, Famine, War and Death. New York: Routledge, 1998.
Morville, Peter. Ambient Findability. Sebastopol, CA: O’Reilly Press, 2005.
———. Search Patterns. Sebastopol, CA: O’Reilly Press, 2010.
Pritchett, Eric, and Mark Palmer. “Following the Tweet Trail.” CNBC 11 July 2009. 17 March 2010 ‹http://www.casttv.com/ext/ug0p08›.
Reynolds, Simon. Rip It Up and Start Again: Postpunk 1978-1984. London: Penguin Books, 2006.
Sennett, Richard. The Craftsman. London: Penguin Books, 2008.
Sheppard, David. On Some Faraway Beach: The Life and Times of Brian Eno. London: Orion Books, 2008.
Sunstein, Cass. On Rumours: How Falsehoods Spread, Why We Believe Them, What Can Be Done. New York: Farrar, Straus and Giroux, 2009.
Tamm, Eric. Brian Eno: His Music and the Vertical Colour of Sound. New York: Da Capo Press, 1995.
Thackara, John. In the Bubble: Designing in a Complex World. Cambridge, MA: The MIT Press, 2005.
Thompson, Evan. Mind in Life: Biology, Phenomenology, and the Science of Mind. Cambridge, MA: Belknap Press, 2007.