
Journal articles on the topic 'Temporary equilibrium theory'



Consult the top 37 journal articles for your research on the topic 'Temporary equilibrium theory.'




1

Rose, Hugh. "A policy rule for ‘Say's Law’ in a theory of temporary equilibrium." Journal of Macroeconomics 7, no. 1 (December 1985): 1–17. http://dx.doi.org/10.1016/0164-0704(85)90002-3.

2

Henrotte, Philippe. "Construction of a state space for interrelated securities with an application to temporary equilibrium theory." Economic Theory 8, no. 3 (October 1, 1996): 423–59. http://dx.doi.org/10.1007/s001990050100.

3

Henrotte, Philippe. "Construction of a state space for interrelated securities with an application to temporary equilibrium theory." Economic Theory 8, no. 3 (October 1996): 423–59. http://dx.doi.org/10.1007/bf01213504.

4

Hagemann, Harald. "The Cambridge–Cambridge controversy on the theory of capital: 50 years after." European Journal of Economics and Economic Policies: Intervention 17, no. 2 (September 18, 2020): 196–207. http://dx.doi.org/10.4337/ejeep.2020.02.09.

Abstract:
The paper points out that capital theory has always been a hotly debated subject, partly because the theoretical issues involved are very complex, and partly because rival ideologies and value systems directly affect the issues discussed. The focus is on the history, the main protagonists, and the relevant problems examined and argued about during the two-Cambridges controversy on the theory of capital, which was at its peak 50 years ago. Whereas one clear result of these debates is that neither Samuelson's surrogate production function nor Solow's rate-of-return concept could resurrect aggregate neoclassical theory, many other questions, such as the treatment of capital in temporary or intertemporal general equilibrium models or the empirical relevance of the reswitching phenomenon, are still controversially discussed.
5

Huang, Can, Yi Zhi Bu, and Qing Hua Zhang. "The Verification of Geometry Control Method for Cable-Stayed Bridge." Advanced Materials Research 1065-1069 (December 2014): 992–96. http://dx.doi.org/10.4028/www.scientific.net/amr.1065-1069.992.

Abstract:
Based on the energy method and beam-element theory, nonlinear strain is taken into account, and the unstressed length and unstressed curvature of each element in the geometry control method are introduced into the integration of the strain energy. The static equilibrium equation of the geometry control method is established. The impacts on the structural geometric profile of temporary loads and of the temperature field during the construction procedure are investigated, and the correctness of the geometry control method is verified by numerical simulation analysis.
6

Dai, Yunxian, Yiping Lin, Huitao Zhao, and Chaudry Masood Khalique. "Global stability and Hopf bifurcation of a delayed computer virus propagation model with saturation incidence rate and temporary immunity." International Journal of Modern Physics B 30, no. 28n29 (November 10, 2016): 1640009. http://dx.doi.org/10.1142/s0217979216400099.

Abstract:
In this paper, a delayed computer virus propagation model with a saturation incidence rate and a time delay describing temporary immune period is proposed and its dynamical behaviors are studied. The threshold value [Formula: see text] is given to determine whether the virus dies out completely. By comparison arguments and iteration technique, sufficient conditions are obtained for the global asymptotic stabilities of the virus-free equilibrium and the virus equilibrium. Taking the delay as a parameter, local Hopf bifurcations are demonstrated. Furthermore, the direction of Hopf bifurcation and the stabilities of the bifurcating periodic solutions are determined by the normal form theory and the center manifold theorem for functional differential equations (FDEs). Finally, numerical simulations are carried out to illustrate the main theoretical results.
7

Isar, Aurelian. "Entanglement Generation in Two-Mode Gaussian Systems in a Thermal Environment." Open Systems & Information Dynamics 23, no. 01 (March 2016): 1650007. http://dx.doi.org/10.1142/s1230161216500074.

Abstract:
We describe the evolution of the quantum entanglement in a system composed of two interacting bosonic modes immersed in a thermal reservoir, in the framework of the theory of open systems based on completely positive quantum dynamical semigroups. The evolution of entanglement is described in terms of the covariance matrix for Gaussian initial states. We calculate the logarithmic negativity and show that for separable initial squeezed thermal states entanglement generation may take place, for definite values of squeezing parameter, average photon numbers, temperature of the thermal bath, dissipation constant and the strength of interaction between the two modes. After its generation one can observe temporary suppressions and revivals of the entanglement. For entangled initial squeezed thermal states, entanglement suppression takes place, for all temperatures of the reservoir, and temporary revivals and suppressions of entanglement can be observed too. In the limit of infinite time the system evolves asymptotically to an equilibrium state which may be entangled or separable.
8

Li, Ya, Na Wang, Chunzhang Wang, Xin Wang, Jinglai Zhang, and Li Wang. "Theoretical study on the unimolecular decomposition of 2-chlorinated ethyl hydroperoxide." Journal of Theoretical and Computational Chemistry 15, no. 01 (February 2016): 1650008. http://dx.doi.org/10.1142/s0219633616500085.

Abstract:
Chlorine-containing organic compounds have been of major interest since such compounds can serve as temporary reservoirs for HOx, ROx and ClOx radicals. Moreover, they can transport chlorine species through the atmosphere to the stratosphere. However, limited studies have been performed on 2-chlorinated ethyl hydroperoxide. In this work, the mechanism of unimolecular dissociation of 2-chlorinated ethyl hydroperoxide is studied theoretically. The equilibrium structures are optimized at the Boese–Martin for kinetics (BMK) level, and the energies are further refined at the balanced multi-coefficient correlation coupled-cluster theory with single and double excitations (BMC-CCSD) level on the basis of the optimized geometries. Fifteen reaction channels are finally confirmed, including direct C–O, O–O, O–H, and C–C bond cleavage and the H2-, H2O-, H2O2-, and CH3Cl-elimination channels.
9

Gospodarek, Tadeusz. "THE BALANCE PARADOX OF MANAGEMENT." Central European Review of Economics and Management 2, no. 2 (June 23, 2018): 143. http://dx.doi.org/10.29015/cerem.526.

Abstract:
Aim: There exists a disequilibrium between the available quantity of goods and the level of consumption, resulting in local economic polarisations and asymmetric capital concentrations. Replacement of real money with derivative instruments causes strong perturbations on capital markets. Consumer preferences change towards the maximization of the utility of the capital used. The above observations are the basis for the hypothesis that managers, in general, prefer to maximize momentary profit regardless of the risk of losing the stability of macroeconomic systems.
Design/Research method: A heuristic about the objective function of an organization, based on the observation that there are two mutually exclusive tendencies in formulating goals: to maximize profit (using all possible opportunities) and simultaneously to achieve stability in the long run (keeping the micro-macro balance).
Conclusions/findings: Managers cause deviations from the micro-macro balance while at the same time trying to keep this balance. This leads to the following paradox of management (the balance dilemma of management): managers always try to maximize opportune profits, regardless of future benefits that may be derived from keeping the equilibrium. Conversely, rational long-term stability suggests postponing most opportunities and keeping external boundaries (e.g. realizing sustainable development). However, managers' temporary preferences lead to an increasing number of unbalanced interactions between organizations and their surroundings, up to the critical point when some catastrophic economic processes may take place.
Originality/value of the article: Original heuristics based on the observation of some micro-macro economic balance relations in business practice.
Implications of the research: One more paradox in the theory of management has been presented. It is important for the base statements of the theory of organizational behaviour, so that consistency and inference will be more accurate.
Key words: general economic equilibrium, rationally bounded decisions, paradox of management, micro/macro balance, management theory. JEL: L2, M21, D5, F41.
10

Hayes, Amy E., Matthew C. Davidson, Steven W. Keele, and Robert D. Rafal. "Toward a Functional Analysis of the Basal Ganglia." Journal of Cognitive Neuroscience 10, no. 2 (March 1998): 178–98. http://dx.doi.org/10.1162/089892998562645.

Abstract:
Parkinson patients were tested in two paradigms to test the hypothesis that the basal ganglia are involved in the shifting of attentional set. Set shifting means a respecification of the conditions that regulate responding, a process sometimes referred to as an executive process. In one paradigm, upon the appearance of each stimulus, subjects were instructed to respond either to its color or to its shape. In a second paradigm, subjects learned to produce short sequences of three keypresses in response to two arbitrary stimuli. Reaction times were compared for the cases where set either remained the same or changed for two successive stimuli. Parkinson patients were slow to change set compared to controls. Parkinson patients were also less able to filter the competing but irrelevant set than were control subjects. The switching deficit appears to be dopamine based; the magnitude of the shifting deficit was related to the degree to which l-dopa-based medication ameliorated patients' motor symptoms. Moreover, temporary withholding of medication, a so-called off manipulation, increased the time to switch. Using the framework of equilibrium point theory of movement, we discuss how a set switching deficit may also underlie clinical motor disturbances seen in Parkinson's disease.
11

Venuti, Lorenzo Campos, and Paolo Zanardi. "Theory of temporal fluctuations in isolated quantum systems." International Journal of Modern Physics B 29, no. 14 (May 22, 2015): 1530008. http://dx.doi.org/10.1142/s021797921530008x.

Abstract:
When an isolated quantum system is driven out of equilibrium, expectation values of general observables start oscillating in time. This paper reviews the general theory of such temporal fluctuations. We first survey some results on the strength of such temporal fluctuations: for example, temporal fluctuations are exponentially small in the system's volume for generic systems, whereas they fall off algebraically in integrable systems. We then concentrate on the so-called quench scenario, where the system is driven out of equilibrium by the application of a sudden perturbation. For sufficiently small perturbations, temporal fluctuations of physical observables can be characterized in full generality and can be used as an effective tool to probe quantum criticality of the underlying model. In the off-critical region the distribution becomes Gaussian. Close to criticality the distribution becomes a universal function uniquely characterized by a single critical exponent, which we compute explicitly. This contrasts with standard equilibrium quantum fluctuations, for which the critical distribution depends on a numerable set of critical coefficients and is known only for limited examples. The possibility of using temporal fluctuations to determine pseudo-critical boundaries in optical lattice experiments is also reviewed.
12

CABALAR, PEDRO, MARTÍN DIÉGUEZ, and CONCEPCIÓN VIDAL. "An infinitary encoding of temporal equilibrium logic." Theory and Practice of Logic Programming 15, no. 4-5 (July 2015): 666–80. http://dx.doi.org/10.1017/s1471068415000307.

Abstract:
This paper studies the relation between two recent extensions of propositional Equilibrium Logic, a well-known logical characterisation of Answer Set Programming. In particular, we show how Temporal Equilibrium Logic, which introduces modal operators as those typically handled in Linear-Time Temporal Logic (LTL), can be encoded into Infinitary Equilibrium Logic, a recent formalisation that allows the use of infinite conjunctions and disjunctions. We prove the correctness of this encoding and, as an application, we further use it to show that the semantics of the temporal logic programming formalism called TEMPLOG is subsumed by Temporal Equilibrium Logic.
13

Bakhtar, F., S. R. Otto, M. Y. Zamri, and J. M. Sarkies. "Instability in two-phase flows of steam." Proceedings of the Royal Society A: Mathematical, Physical and Engineering Sciences 464, no. 2091 (December 11, 2007): 537–54. http://dx.doi.org/10.1098/rspa.2007.0087.

Abstract:
In two-phase flows of steam, when the velocity is between the equilibrium and frozen speeds of sound, the system is fundamentally unstable. Because any disturbance of the system, e.g. imposition of a small supercooling on the fluid, will cause condensation, the resulting heat release will accelerate the flow and increase the supercooling and thus move the system further from thermodynamic equilibrium. But in high-speed flows of a two-phase mixture, dynamic changes affect the thermodynamic equilibrium within the fluid, leading to phase change, and the heat release resulting from condensation disturbs the flow further and can also cause the disturbances to be amplified at other Mach numbers. To investigate the existence of instabilities in such flows, the behaviour of small perturbations of the system has been examined using stability theory. It is found that, although the amplification rate is highest between the equilibrium and frozen speeds of sound, such flows are temporally unstable at all Mach numbers.
14

Bertilsson, Margareta. "Sociologiens problemer i en social verden uden grænser." Dansk Sociologi 10, no. 4 (February 5, 2007): 53–69. http://dx.doi.org/10.22439/dansoc.v10i4.738.

Abstract:
An unrestricted social world and the problem(s) of sociology. Sociology has been characterized as an embattled science all through its century-long history. Now, at the turn of the millennium, it is still considered a controversial science: contemporary discussions in the USA have made it clear that sociology risks implosion from the inside, either because it is evolving into a science of cultural differences mirroring the many forms of contemporary moral redress, or else into an esoteric symbol system out of touch with social reality. The article starts out by surveying some such contemporary arguments, raised by sociologists and directed against the sociological discipline in the 90s. Seen from within the system of the modern sciences, sociology is typically what has been called an “unrestricted” science. It harbours a wide variety of theoretical and methodological approaches without any real centre. Profound questions have recently been raised as to whether sociology as a discipline is falling apart into various specialties with regard to substances, theories, and methods. Network arrangements among cognitive specialties threaten the classic disciplinary model of science today in general, and sociology is especially threatened by such de-centring tendencies. But the thrust of the argument in this article is to view the alleged dissolution of sociology in light of the wider theme of an eventual dissolution of the notion of the “social”. Could it be, by the end of the 20th century, that our social world assumes different characteristics than those contained in the old framework of the nation-state? Many different notions of the term social are then listed, aiming at the question whether the term social itself is but a late historic construction, pertaining in particular to the glue that was to hold the territorial state together.
As a “resource” the social world is a precondition of human life, but as a “topic of discourse”, and a theme of sociology, employments of the term social need to consider a future social world without definite borders. Due to the expansion not least of mass technology, the social world has been transformed immensely. The expansion process can be captured as a threefold process: individualization, contractualization, and mediatization. Whereas the old container theory stipulated a more or less unified model of the social, a new vision of the social requires us to accommodate this overriding and threefold expansion process. However, these different processes may pull in different cognitive directions, and the question is raised in the end whether the future of sociology can find a “reflective equilibrium” and maintain its disciplinary boundaries, or whether the discipline will dissolve as a consequence of the dissolution of its own subject-matter.
15

Unno, Wasaburo. "Problems of Solar Convection." Symposium - International Astronomical Union 142 (1990): 39–44. http://dx.doi.org/10.1017/s0074180900087672.

Abstract:
Kinetic energy of convection is transported inwards in the main body of the convection zone. The temperature gradient becomes super-radiative at the top of the overshooting zone. These two effects make the solar equilibrium model much less sensitive to the assumed mixing length in Xiong's eddy diffusion theory. The observed brightening of downflow at high levels in the surface region over intergranular lanes seems to be consistent with the overshooting model. Momentum transport by convective motion is shown to be crucial in pushing down magnetic flux tubes against buoyancy. Also, subadiabatic layers are possibly formed temporarily in the middle of the convection zone, exciting oscillations and generating chaotic motions.
16

CABALAR, PEDRO, ROLAND KAMINSKI, TORSTEN SCHAUB, and ANNA SCHUHMANN. "Temporal Answer Set Programming on Finite Traces." Theory and Practice of Logic Programming 18, no. 3-4 (July 2018): 406–20. http://dx.doi.org/10.1017/s1471068418000297.

Abstract:
In this paper, we introduce an alternative approach to Temporal Answer Set Programming that relies on a variation of Temporal Equilibrium Logic (TEL) for finite traces. This approach allows us to even out the expressiveness of TEL over infinite traces with the computational capacity of (incremental) Answer Set Programming (ASP). Also, we argue that finite traces are more natural when reasoning about action and change. As a result, our approach is readily implementable via multi-shot ASP systems and benefits from an extension of ASP's full-fledged input language with temporal operators. This includes future as well as past operators whose combination offers a rich temporal modeling language. For computation, we identify the class of temporal logic programs and prove that it constitutes a normal form for our approach. Finally, we outline two implementations, a generic one and an extension of the ASP system clingo.
17

Kulić, I. M., M. Mani, H. Mohrbach, R. Thaokar, and L. Mahadevan. "Botanical ratchets." Proceedings of the Royal Society B: Biological Sciences 276, no. 1665 (March 11, 2009): 2243–47. http://dx.doi.org/10.1098/rspb.2008.1685.

Abstract:
Ratcheting surfaces are a common motif in nature and appear in plant awns and grasses. They are known to proffer selective advantages for seed dispersion and burial. In two simple model experiments, we show that these anisotropically toothed surfaces naturally serve as motion rectifiers and generically move in a unidirectional manner, when subjected to temporally and spatially symmetric excitations of various origins. Using a combination of theory and experiment, we show that a linear relationship between awn length and ratchet efficiency holds under biologically relevant conditions. Grass awns can thus efficiently transform non-equilibrium environmental stresses from such sources as humidity variations into useful work and directed motion using their length as a fluctuation amplifier, yielding a selective advantage to these organelles in many plant species.
18

CABALAR, PEDRO, MARTÍN DIÉGUEZ, TORSTEN SCHAUB, and ANNA SCHUHMANN. "Towards Metric Temporal Answer Set Programming." Theory and Practice of Logic Programming 20, no. 5 (September 2020): 783–98. http://dx.doi.org/10.1017/s1471068420000307.

Abstract:
We elaborate upon the theoretical foundations of a metric temporal extension of Answer Set Programming. In analogy to previous extensions of ASP with constructs from Linear Temporal and Dynamic Logic, we accomplish this in the setting of the logic of Here-and-There and its non-monotonic extension, called Equilibrium Logic. More precisely, we develop our logic on the same semantic underpinnings as its predecessors and thus use a simple time domain of bounded time steps. This allows us to compare all variants in a uniform framework and ultimately combine them in a common implementation.
19

Budge, Ian. "A New Spatial Theory of Party Competition: Uncertainty, Ideology and Policy Equilibria Viewed Comparatively and Temporally." British Journal of Political Science 24, no. 4 (October 1994): 443–67. http://dx.doi.org/10.1017/s0007123400006955.

Abstract:
This article considers how parties can decide on policy when there is no reliable information about the effect of these decisions on voting. Where this is the case they must base their stands on a priori assumptions about appropriate priorities, namely on political ideologies. These indicate the general policy area a party should occupy, but do not give detailed guidance on which position to take within it. Five different ways of deciding on this, within ideological constraints, are specified. The predictions derived from these models well anticipate the actual decisions made by post-war parties in twenty democracies, as summarized in the unique spatial maps of policy movements published by the Manifesto Research Group of the European Consortium for Political Research.
20

Bayraktar, Erhan, and Alexander Munk. "High-Roller Impact: A Large Generalized Game Model of Parimutuel Wagering." Market Microstructure and Liquidity 03, no. 01 (March 2017): 1750006. http://dx.doi.org/10.1142/s238262661750006x.

Abstract:
How do large-scale participants in parimutuel wagering events affect the house and ordinary bettors? A standard narrative suggests that they may temporarily benefit the former at the expense of the latter. To approach this problem, we begin by developing a model based on the theory of large generalized games. Constrained only by their budgets, a continuum of diffuse (ordinary) players and a single atomic (large-scale) player simultaneously wager to maximize their expected profits according to their individual beliefs. Our main theoretical result gives necessary and sufficient conditions for the existence and uniqueness of a pure-strategy Nash equilibrium. Using this framework, we analyze our question in concrete scenarios. First, we study a situation in which both predicted effects are observed. Neither is always observed in our remaining examples, suggesting the need for a more nuanced view of large-scale participants.
21

Woitke, Peter. "Dust-driven Winds Beyond Spherical Symmetry." Proceedings of the International Astronomical Union 4, S252 (April 2008): 229–34. http://dx.doi.org/10.1017/s1743921308022849.

Abstract:
New 2D dynamical models for the winds of AGB stars are presented which include hydrodynamics with radiation pressure on dust, equilibrium chemistry, time-dependent dust formation theory, and coupled frequency-dependent Monte Carlo radiative transfer. The simulations reveal a much more complicated picture of dust formation and wind acceleration than 1D spherical wind models do. Triggered by non-spherical pulsations or large-scale convective motions, dust forms event-like in the cooler regions above the stellar surface that are temporarily less illuminated, followed by the radial ejection of dust arcs and clumps. These simulations can possibly explain recent high-angular-resolution interferometric IR observations of red giants, which show an often non-symmetric and highly time-variable innermost dust formation and wind acceleration zone. The dependence of the mass-loss rates on stellar parameters is less threshold-like than assumed from 1D models, and therefore it seems quite possible that the phenomenon of dust-driven winds also occurs in less evolved red giants.
22

Friedman, Jonathan. "Ecological Consciousness and the Decline of 'Civilisations': The Ontology, Cosmology and Ideology of Non-equilibrium Living Systems." Worldviews: Global Religions, Culture, and Ecology 2, no. 3 (1998): 303–15. http://dx.doi.org/10.1163/156853598x00271.

Abstract:
This article is a discussion of the cosmological and ontological bases of ecological thinking in cross-cultural terms. It is argued that there are two different sources for much of modern ecological thinking. One has its origins in the various developments in systems theory and cybernetics and is rooted in a hard 'engineering' framework. The other, which is the basic focus of this discussion, is based on constructions of 'nature' (not necessarily an explicit category in all societies) as temporally variable, and on the transformation of 'nature' in conditions of crisis. Newer approaches in ecological anthropology have rightly emphasised an understanding of the way nature is socially constructed, as, for example, an autonomous entity, or as totally integrated within other social and personal relations. It is suggested here that ecological thinking is related to the inversion of cosmological relations that occurs in many societies in crisis. Examples of the transformation of expansionist chiefdoms into ecologically more stable yet crisis-ridden egalitarian societies are used to argue that ecological thinking is not a product of a particular kind of society but of a particular condition that develops within societies that enter into crises after an earlier phase of growth and expansion. Modern ecological consciousness is argued to be the same kind of phenomenon, even if it develops in a very different kind of social system - one that reconstructs the 'primitive' societies of the Western periphery as the ecologically moral and balanced world that we have lost rather than the backward world of a previous modernist evolutionism.
23

Lefebvre, Germain, Aurélien Nioche, Sacha Bourgeois-Gironde, and Stefano Palminteri. "Contrasting temporal difference and opportunity cost reinforcement learning in an empirical money-emergence paradigm." Proceedings of the National Academy of Sciences 115, no. 49 (November 15, 2018): E11446–E11454. http://dx.doi.org/10.1073/pnas.1813197115.

Abstract:
Money is a fundamental and ubiquitous institution in modern economies. However, the question of its emergence remains a central one for economists. The monetary search-theoretic approach studies the conditions under which commodity money emerges as a solution to override frictions inherent to interindividual exchanges in a decentralized economy. Although among these conditions, agents’ rationality is classically essential and a prerequisite to any theoretical monetary equilibrium, human subjects often fail to adopt optimal strategies in tasks implementing a search-theoretic paradigm when these strategies are speculative, i.e., involve the use of a costly medium of exchange to increase the probability of subsequent and successful trades. In the present work, we hypothesize that implementing such speculative behaviors relies on reinforcement learning instead of lifetime utility calculations, as supposed by classical economic theory. To test this hypothesis, we operationalized the Kiyotaki and Wright paradigm of money emergence in a multistep exchange task and fitted behavioral data from human subjects performing this task with two reinforcement learning models, each implementing a distinct cognitive hypothesis regarding the weight of future or counterfactual rewards in current decisions. We found that both models outperformed theoretical predictions about subjects’ behaviors regarding the implementation of speculative strategies, and that the latter depends on the degree to which opportunity costs are considered in the learning process. Speculating about the marketability advantage of money thus seems to depend on mental simulations of counterfactual events that agents perform in exchange situations.
24

Spohr, Klaus, Domenico Doria, and Bradley Meyer. "Theoretical Discourse on Producing High Temporal Yields of Nuclear Excitations in Cosmogenic 26Al with a PW Laser System: The Pathway to an Astrophysical Earthbound Laboratory." Galaxies 7, no. 1 (December 26, 2018): 4. http://dx.doi.org/10.3390/galaxies7010004.

Abstract:
The development of the 10 PW laser system at the Extreme Light Infrastructure is a crucial step towards the realization of an astrophysical Earthbound laboratory. The interaction of high-power laser pulses with matter results in ultrashort (fs–ps) pulses of tens of MeV ions and radiation that can create plasma and induce nuclear reactions therein. Due to the high fluxes of reaction-driving beam pulses, high yields of radioactive target nuclei in their ground and excited states can be provided in situ on short time scales. Cosmogenic 26Al, which is of pronounced astrophysical interest, is a prime candidate for evaluating these new experimental possibilities. We describe how, for a short duration of Δt ∼ 200 ps, laser-driven protons with energies above Ep ∼ 5 MeV can induce the compound nucleus reaction 26Mg(p, n)26Al, leading to high and comparable yields of the three lowest-lying states in 26Al, including the short-lived state at 417 keV with t1/2 = 1.20 ns. In the aftermath of the reaction, for a short duration of t ∼ ns, the yield ratios between the ground and the two lowest-lying excited states will resemble those present at thermodynamic equilibrium at high temperatures, thus mimicking high 26Al entropies in cold environments. This can be seen as a possible first step towards an investigation of the interplay between those states in plasma environments. Theory suggests an intricate coupling of the ground state 26Alg.s. and the first excited isomer 26mAl via higher-lying excitations such as the J = 3+ state at 417 keV, resulting in a dramatic reduction of the effective lifetime of 26Al, which will influence the isotope’s abundance in our Galaxy.
APA, Harvard, Vancouver, ISO, and other styles
25

Horwitz, R. "Cell biology as the centuries change - about as good as it gets." Journal of Cell Science 113, no. 6 (March 15, 2000): 906–8. http://dx.doi.org/10.1242/jcs.113.6.906.

Full text
Abstract:
Recently, the newspapers and journals were bubbling with articles and editions devoted to various kinds of millennium and Y2K perspective. Some were retrospective and others prospective; some simply comprised lists of ‘greatests’. Interpreting the past with accuracy and insight is challenging, as is predicting the future. Fortunately, many others have already done that. So, instead, I will look at our discipline, cell biology, defined very broadly and to include molecular biology, both prospectively and retrospectively in the context of some perhaps prosaic but pertinent questions about the discipline that are surfacing as the centuries change. Many greats: One approach to summarizing the past is through lists of the greatest participants or classic papers in a given area. These lists appear frequently in areas like physics and mathematics, where progress is, or at least was, heavily influenced by heroic individuals who opened or sustained a field. In these areas, most participants and observers would develop a very similar list of the ‘greatests’, and nearly everyone working in the discipline would know what their contribution was. Is this true in cell biology? Are there names that everyone would know, or a canon of papers that everyone has read? Did the cell biology of the last 50–100 years evolve because of heroic individuals? Or were there only some insightful pioneers, followed by a large number of important accomplishments that occurred in many different laboratories? Interestingly, none of the major journals has compiled a ‘greatest’ list or even a “classic papers” list in cell biology. This is revealing. Perhaps it tells us that there were no great cell biologists - i.e. that the recent, great progress that we have witnessed didn't require great individuals. More likely, however, there are too many - that is, the advances in cell biology tend to be incremental, with many more bright sparks and contained blazes than forest fires. 
Seminal observations are frequent and arise in unexpected places. Progress may be better measured as the integral of many important contributions and contributors. Thus, cell biology is the product of many many great scientists, who interact, synergize and stand on each other's shoulders. The attractiveness of cell biology lies in this open, frontier culture. And the result is that the pie of success is large and that many have been rewarded. An interesting consequence of our frontier culture is that it is too exciting and fast paced for anyone to take the time to develop a sense of history and accomplishment. Sidney Brenner makes this point in his review of a book entitled ‘The lac Operon: A short history of a genetic paradigm’, by Benno Muller-Hill (Nature 386, 235). Brenner writes: “This book opens with the lament that for young molecular biologists history does not exist, and that they have no interest in the long struggle that has made the subject what it is today. I hold the weaker view that history does exist for the young, but it is divided into two epochs: the past two years, and everything that went before. That these have equal weight is a reflection of the exponential growth of the subject, and the urgent need to possess the future and acquire it more rapidly than anybody else does not make for empathy with the past.” A few years ago I read a list of the names of biomedical Nobel Laureates to some colleagues. They knew only a handful of the names and what their contributions were. It seems that so much is being accomplished so quickly it is hard for individuals to stand out. And the consequent focus on the collective achievement is what makes our discipline so rewarding for so many. But how long will this frontier culture last? The emergence of big biology, through government and private-foundation initiative, is changing the landscape. The rate of progress continues to accelerate. Will one soon require a very big lab to survive? 
Will creative minds find cell biology fertile territory? There are answers to big science. Most important is to embrace what it produces and look ahead. Another is to develop multi-institutional collaborative networks in which the product can far exceed the contributions of single individuals. And, finally, there are always trails to blaze and syntheses to make. They require little more than hard work, organization, good sense, perseverance, and some luck. Is it almost over? Extrapolating the rapid progress that we are witnessing, can one realistically predict what our discipline will be like over the next few decades? Will the questions that we are investigating now be answered or passé, and, if so, how soon? How long will cell biology continue to be on the center stage? Will there be new, fundamental concepts or a paradigm shift? What ‘unexpecteds’ might we expect? At meetings over beer and at dinner tables with seminar speakers, the question “Is it almost over?” creeps in with increasing frequency. The concern is that the big picture will be in place soon - that is, the outlines of the fundamental cellular processes will be largely understood at a molecular level. This concern, of course, reflects the depth with which one wants to understand the cell. Clearly, we now know vastly more than we did even a decade ago. There is an emerging sense that a rudimentary understanding of the most basic cellular processes is in sight; one sees this even in the undergraduate cell biology textbooks. Of course, progress will continue. However, the questions about fundamental processes will become increasingly refined, and the answers more detailed - more likely to occupy space in specialty treatises than in undergraduate cell biology texts. The approaches and concepts will become more deeply linked to chemistry and physics, eventually focusing on subtleties of mechanism and structure.
Some of these details will change our basic concepts dramatically; but the frequency of such occurrences will dwindle. These details are also necessary for the applications of cell biology that are beginning to emerge and for a true marriage of cell biology with the molecular world. This level of inquiry and detail, or increasing reductionism, may not sustain the interest of or resonate well with many of our colleagues. However, for others, it's just the beginning and is opening doors for a cadre of new colleagues trained in physics and chemistry to enter with fresh ideas, insights and technologies. Will it ever end? But is it almost over? Do we really know how cells do what they do? How is the thicket of seemingly redundant pathways and networks, molecules, and supramolecular assemblies coordinated spatially and temporally? Which of the many pathways and redundant mechanisms revealed in culture are utilized in vivo? How are cellular phenomena, as revealed in the spatial and temporal coordination required for cell division or migration, for example, integrated? How do groups of cells integrate and coordinate to effect tissue function, embryonic development, and pathology, for example? As we begin to observe cellular phenomena in situ, they can appear very different from those observed in culture. The compensation and redundancy seen in knockout, transgenic or mutated organisms also reveal a diversity of possible mechanisms. It seems that the cell has different ways of doing the same thing. How does the cell do it normally, and when, if ever, are the other mechanisms used? We have tended to focus the majority of our efforts on a few cell types. What about the other cells? How do they do it? These questions are especially pertinent in developmental biology and pathobiology, where the cellular environments are changing; they also point to a class of challenging, important new avenues of investigation.
As the canon of cellular phenomena becomes understood at an increasingly refined level, it provides the basis for explaining integrative phenomena. It also becomes the source of interesting and important practical applications. In this way, cell biology can become the language for understanding complex integrative phenomena like learning and memory, behavior and personality - areas in which the genome project and genetics might merge to provide unique insights. In addition, cell biology is the source of endless practical applications and, in some sense, sits in the center of a booming biotechnology industry that includes novel therapeutic strategies, designer animals and plants, tissue replacements, biomaterials and biosensors. The possibilities here seem endless. What does genomics bode for cell biology? A great deal of opportunity. Do sequences, homologies, binding interactions, changes in expression, and even knockouts provide a satisfactory understanding of function? Isn't the genomic bottleneck the assignment of cellular functions to different genes? In its essence, gene function can be viewed as a cell biological issue and perhaps not fully amenable to high-throughput analysis. Thus, the genome project promises to keep cell biology on the center stage. And maybe, therefore, we will have too much to do. The devil is in the detail: A major product of the successes in cell biology is a mind-numbing number of facts, particulars, data and details. The volume of information and detail that we are generating in genome studies and cell signaling, for example, unsettles some. Will the molecular paradigm, which has been so successful and brought us here, lead us to the next level? In the reductionist paradigm, the cell can be viewed as a complex chemical system that obeys the laws of physics and the principles of chemistry. In this view, one needs to know the relevant chemical properties for all of the cellular components. 
Once this is known, the cellular dynamics and equilibria can be computed, and ultimately cell behavior modeled. For small systems and isolated processes, this has had an important predictive value and has been insightful and revealing. Most importantly, it uses the principles of chemistry, which is a common language that is known and understood by nearly all participants. Can this approach be usefully extrapolated to a highly complex system like an entire cell? It may take a while, as it poses some interesting challenges. How many complex differential equations, which must cover both temporal and spatial distributions, would be involved? How accurately will the concentrations and rate constants need to be measured? How does one deal with the non-ideal nature of the cell interior and exterior? The differential equations required to describe the systems of interacting pathways or networks found in a cell will necessarily be very complex and contain many terms. How does the error in measurements of the rate constants and concentrations, for example, propagate - that is, given any reasonable measurement error, can one derive anything that is meaningful and useful? The situation is complicated further by the nature of the cell. What are the effective concentrations (the activities) of the components? How does one address reactions that are occurring on surfaces or macromolecular assemblies that can be dynamic? These are formidable challenges. Chemistry faces them continually, as do other sciences that deal with complex phenomena. Natural phenomena have strong roots in the principles of physics and the concepts of chemistry. Yet the mathematics that backs them up does not readily yield to highly complex phenomena. Maybe different approaches - perhaps one based in the complex-systems theory that is so familiar to engineers - will provide an alternative. Where's the big picture? Are there other ways of dealing with our flood of details and particulars? 
There is a call for mathematicians, computer scientists, engineers and/or theorists to help bring order to this information flood. Can they make sense of this complexity? Are there overarching and unifying concepts that will allow us to think in generalities, rather than in particulars? There may already be some unifying concepts. One is the genetic paradigm, which views a cell's behavior as a consequence of its expressed genes. The geneticist's point of view has already provided an important, empirical and quantitative way of looking at cellular and organismal phenomena. This view of a cell or organism, or even a disease like cancer, differs greatly from that of a biochemist, which focuses on mechanisms and specifics. In some respects, it shifts attention away from the particulars and sticky mechanistic issues, and thus can be simplifying. Genetics has been a very powerful driver in many areas, not only as a tool to determine function but also as a way of looking at a process. The marriage of genetics with developmental biology is only one of many examples. A number of other examples derive from modeling. The Hodgkin-Huxley equation is one prominent and useful example. It models the axon as an electrical entity. For other purposes, the cell has been likewise treated as a mechanical entity and modeled in the jargon of mechanics. There are other ways of modeling the cell and its component processes - for example, through signal and systems theory, network and graph theory, Boolean algebra, and statistics. Each of these treatments can be meaningful and useful to those well versed in that particular discipline. But are these useful to those not versed in them? Is there a unifying theory or model that avoids a proliferation of models? How does one connect them to our chemical roots? In physics, the simplest is accepted as correct. Cell biology has a different reality. It is derived evolutionarily, and therefore, the simplest model may not be correct or even useful.
Perhaps, in the future, there will even be a synthesis - like the periodic table or quantum mechanics for physics and chemistry - that allows us to deal with the mega-detail that we are generating. Big surprises in small packages: To date, cell biology has progressed rapidly because of its qualitative nature. Differences in localization are often characterized by fluorescence intensities that are described qualitatively as brighter or dimmer or as more or less localized. Similarly, differences in expression are often characterized by the intensity of bands on western blots or SDS gels; these are often described as bigger or smaller. Many changes are, in fact, very large, and this level of characterization is likely to be adequate. But have we missed anything? Is there a need for more quantitative measurements? When differences in expression are analyzed by gene array, where does one draw the line? Is a tenfold change more significant than a 2–3-fold change? Many measurements would not detect changes that are only 2–3-fold, and in others we have tended to ignore them. We wouldn't see such a small change in fluorescence intensity by eye, for example, nor would we readily identify changes in concentration that arise from differences in localization rather than expression. Ignoring small changes assumes that biological readouts are not highly poised. But is this true? Systems that have interacting components, undergo conformational changes or enzymatic modifications, or are part of amplifying cascades, for example, can be highly poised. Thus 2–3-fold changes in expression or in substrate/ligand concentration can have effects that are very large. Of course, the converse follows as well. Large changes might have only modest consequences - for example, if one is well removed from the Kd. Examples of small changes having large effects and vice versa are common features of complex systems and are now beginning to appear in the cell biology literature. 
It seems likely there will be many more as our measurements become increasingly quantitative. Downstream signals: What can one make of all of this? (1) This is a very, very good time for cell biology. Questions that have loomed for decades and centuries are becoming understood in a meaningful way. The progress is breathtaking; it wasn't this easy only a couple of decades ago. (2) Many are participating in the success; they are all contributing to something useful and important. (3) The devil is in the detail but so are the opportunities. (4) Big science is here to stay - perhaps a consequence of our success. As investigators, we need to embrace it and look ahead. (5) The only constant in our research will be change. We will need to be flexible in our approaches and questions. (6) We must translate our progress to the public through education and the popular press in ways that sustain their interest and support and attract new minds to our discipline. (7) The surge in new technology will continue to drive our progress, which will come to nearly anyone who works hard, chooses a good problem, and takes a reasonable approach. (8) We need to develop strategies to deal with the information flood; it won't ebb soon. And the anticipated simplifications from the mathematicians, computer scientists and modelers may take quite a while. (9) Enjoy your successes. This might be about as good as it gets.
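The essay's "poised systems" point (that a 2–3-fold input change can produce a very large output change) can be made concrete with a toy cooperative-binding calculation. The Hill coefficient and concentrations below are illustrative assumptions, not data from the essay.

```python
def hill_response(x, K, n):
    """Fractional activation of a cooperative (Hill) system."""
    return x**n / (K**n + x**n)

# A 3-fold change in ligand concentration around the midpoint of a
# cooperative response (n = 4, illustrative) yields a much larger
# fold change in output -- one way a "small" 2-3-fold change can
# have big effects, as the passage argues.
low = hill_response(0.5, K=1.0, n=4)
high = hill_response(1.5, K=1.0, n=4)
fold = high / low
```

Here the 3-fold input change (0.5 → 1.5) drives roughly a 14-fold output change, while far from the midpoint the same fold change would barely register.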
APA, Harvard, Vancouver, ISO, and other styles
26

Nicola, PierCarlo. "ECONOMIA MATEMATICA E MECCANICA RAZIONALE." Istituto Lombardo - Accademia di Scienze e Lettere - Incontri di Studio, November 18, 2013, 57–72. http://dx.doi.org/10.4081/incontri.2008.49.

Full text
Abstract:
Riassunto. – Almeno a partire da Pareto le analogie meccaniche hanno giocato un ruolo rilevante nella teoria economica; ma sarebbe ingiusto passare sotto silenzio il fatto che Marshall, praticamente coetaneo di Pareto, propendeva piuttosto per l’impiego di analogie di natura biologica. Questo detto, l’equilibrio generale, attualmente il paradigma più diffuso fra i cultori di teoria economica, si presta mirabilmente a sottolineare le analogie meccaniche proponibili nella teoria economica. Partendo dalla formulazione canonica dell’equilibrio generale concorrenziale, come venne proposta da Arrow e Debreu nel 1954, l’intervento si propone di presentare succintamente alcune delle estensioni formulate nel corso degli ultimi 50 anni:– l’introduzione dei monopoli e della concorrenza monopolistica;– i prezzi rigidi e il razionamento;– le generazioni sovrapposte;– gli equilibri temporanei e le aspettative soggettive;– gli equilibri approssimati in presenza di non convessità; – l’infinità degli agenti e la concorrenza perfetta;– l’incertezza;– i mercati incompleti e le attività finanziarie.Vengono brevemente discussi anche i problemi che sorgono nell’applicazione di modelli di equilibrio generale allo studio di economie concrete, e si sottolinea il fatto che i dati economici sono sempre soggetti ad errori spesso rilevanti, mentre quasi mai é possibile condurre esperimenti controllabili, diversamente da quanto avviene nella fisica. Le applicazioni portano naturalmente a chiedersi quale utilità possono rivestire i modelli di equilibrio generale a fini previsionali.***Abstract. – Since Pareto times mechanical analogies played a relevant role in economic theory; but it is necessary to underline that Marshall, an economist contemporary to Pareto, preferred to employ biological analogies. General equilibrium, presently the mainstream paradigm in economic theory, lends itself very naturally to underline mechanical analogies permeating economic theory. 
Starting from the canonical formulation of general equilibrium, as proposed by Arrow and Debreu in 1954, our aim is to briefly recall some extensions made during the last 50 years:– the inclusion of monopolies and monopolistic competition;– sticky prices and rationing;– overlapping generations;– temporary equilibrium and subjective expectations;– approximate equilibria under non-convexities;– infinity of agents and perfect competition;– uncertainty;– incomplete markets and financial activities. We also discuss some problems that arise in applying general equilibrium models to present-day economies, and we underline the fact that economic data are always affected by errors, while it is very rarely possible to run controlled experiments, contrary to what happens in physics. It is therefore natural to ask how useful general equilibrium models are for forecasting purposes.
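The Arrow–Debreu framework recalled in the abstract can be illustrated in its simplest instance: a pure-exchange economy with Cobb-Douglas agents, where the market-clearing price has a closed form. The preferences and endowments below are illustrative assumptions, not taken from the article.

```python
def cobb_douglas_equilibrium(agents):
    """agents: list of (alpha, endow_x, endow_y), utility u = alpha*ln(x) + (1-alpha)*ln(y).
    Returns the competitive equilibrium price of good x (good y is the
    numeraire, p_y = 1); follows from market clearing for good x."""
    num = sum(a * ey for a, ex, ey in agents)
    den = sum((1 - a) * ex for a, ex, ey in agents)
    return num / den

def demand_x(alpha, ex, ey, p):
    # Cobb-Douglas demand: spend the fraction alpha of wealth p*ex + ey on x.
    return alpha * (p * ex + ey) / p

# Two illustrative agents with mirror-image endowments.
agents = [(0.5, 1.0, 0.0), (0.5, 0.0, 1.0)]
p = cobb_douglas_equilibrium(agents)
total_x_demand = sum(demand_x(a, ex, ey, p) for a, ex, ey in agents)
```

At the computed price, aggregate demand for good x equals its aggregate endowment, which is exactly the market-clearing condition of the general-equilibrium model.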
APA, Harvard, Vancouver, ISO, and other styles
27

ul Rehman, Attiq, Ram Singh, Thabet Abdeljawad, Eric Okyere, and Liliana Guran. "Modeling, analysis and numerical solution to malaria fractional model with temporary immunity and relapse." Advances in Difference Equations 2021, no. 1 (August 21, 2021). http://dx.doi.org/10.1186/s13662-021-03532-4.

Full text
Abstract:
The present paper deals with a fractional-order mathematical epidemic model of malaria transmission accompanied by temporary immunity and relapse. The model is revised by using the Caputo fractional operator for the index of memory. We also recommend the utilization of temporary immunity and the possibility of relapse. The theory of locally bounded and Lipschitz functions is employed to inspect the existence and uniqueness of the solution of the malaria model. It is shown that temporary immunity has a great effect on the dynamical transmission of host and vector populations. The stability analysis of these equilibrium points for the fractional-order derivative α and the basic reproduction number R0 is discussed. The model will exhibit a Hopf-type bifurcation. Two control variables are introduced in this model to decrease the infected populations. Mandatory conditions for the control problem are produced. Two numerical methods, Laplace Adomian decomposition and fourth-order Runge–Kutta, are presented for simulating the proposed model with the fractional-order derivative. To validate the mathematical results, numerical simulations, sensitivity analysis, convergence analysis, and other important studies are given. The paper finishes with some conclusions and discussion.
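The fourth-order Runge–Kutta simulation mentioned in the abstract can be sketched for the ordinary (integer-order, α = 1) limit of such a model. The SIRS system below, in which temporary immunity returns recovered hosts to the susceptible class, is an illustrative stand-in for the paper's actual host–vector model, and all rate constants are assumptions.

```python
def sirs_rhs(state, beta, gamma, delta):
    # SIRS dynamics: temporary immunity means recovered hosts return to
    # the susceptible class at rate delta (loss of immunity / relapse).
    S, I, R = state
    N = S + I + R
    dS = -beta * S * I / N + delta * R
    dI = beta * S * I / N - gamma * I
    dR = gamma * I - delta * R
    return (dS, dI, dR)

def rk4_step(f, state, h, *args):
    """One classical fourth-order Runge-Kutta step."""
    k1 = f(state, *args)
    k2 = f(tuple(s + 0.5 * h * k for s, k in zip(state, k1)), *args)
    k3 = f(tuple(s + 0.5 * h * k for s, k in zip(state, k2)), *args)
    k4 = f(tuple(s + h * k for s, k in zip(state, k3)), *args)
    return tuple(s + h / 6.0 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

beta, gamma, delta = 0.4, 0.1, 0.05   # illustrative rates
state = (990.0, 10.0, 0.0)            # S, I, R
h = 0.1
for _ in range(1000):                 # integrate to t = 100
    state = rk4_step(sirs_rhs, state, h, beta, gamma, delta)
R0 = beta / gamma                     # basic reproduction number of this sketch
```

The total population is conserved by construction (the right-hand sides sum to zero), which gives a quick sanity check on the integrator.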
APA, Harvard, Vancouver, ISO, and other styles
28

Zhang, Zizhen, Junchen Zou, and Soumen Kundu. "Bifurcation and optimal control analysis of a delayed drinking model." Advances in Difference Equations 2020, no. 1 (September 24, 2020). http://dx.doi.org/10.1186/s13662-020-02987-1.

Full text
Abstract:
Alcoholism is a social phenomenon that affects all social classes; it is a chronic disorder that causes the person to drink uncontrollably and can bring about a series of social problems. With this motivation, a delayed drinking model including five subclasses is proposed in this paper. By employing the method of characteristic eigenvalues and taking the temporary immunity delay for alcoholics under treatment as a bifurcation parameter, a threshold value of the time delay for the local stability of the drinking-present equilibrium and the existence of a Hopf bifurcation is found. The length of delay that preserves stability is then estimated using the Nyquist criterion. Moreover, optimal strategies to reduce the number of drinkers are proposed. Numerical simulations are presented to examine the correctness of the obtained results and the effects of some parameters on the dynamics of the drinking model.
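The delay-induced loss of stability described in the abstract can be illustrated on the simplest delayed-feedback equation, x'(t) = -a·x(t-τ), whose zero solution loses stability in a Hopf bifurcation once a·τ exceeds π/2. This scalar toy model and its parameters are illustrative stand-ins for the paper's five-class system.

```python
def simulate_dde(a, tau, h, t_end, history=1.0):
    """Euler integration (method of steps) of x'(t) = -a * x(t - tau),
    with x(t) = history for t <= 0. The delayed lookup is served from a
    sliding buffer covering the interval [t - tau, t]."""
    lag = int(round(tau / h))          # delay expressed in steps
    xs = [history] * (lag + 1)         # values on [t - tau, t]
    steps = int(round(t_end / h))
    for _ in range(steps):
        x_delayed = xs[-(lag + 1)]     # x(t - tau)
        xs.append(xs[-1] + h * (-a * x_delayed))
        if len(xs) > lag + 1:          # drop values older than t - tau
            xs.pop(0)
    return xs[-1]

# Well below the Hopf threshold (a*tau = 0.2 < pi/2) the solution
# decays monotonically toward zero.
x_final = simulate_dde(a=0.2, tau=1.0, h=0.01, t_end=20.0)
```

Raising a·τ past π/2 in the same code would instead produce growing oscillations, the behavior the bifurcation analysis in the paper delineates.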
APA, Harvard, Vancouver, ISO, and other styles
29

Wyrwich, Michael. "Migration restrictions and long-term regional development: evidence from large-scale expulsions of Germans after World War II." Journal of Economic Geography, August 29, 2019. http://dx.doi.org/10.1093/jeg/lbz024.

Full text
Abstract:
This article investigates the long-run impact of a migration barrier on regional development. The analysis is based on the large-scale expulsion of Germans from Central and Eastern Europe after World War II (WWII). Expellees were not allowed to resettle in the French occupation zone in the first years after the War, while there was no such legislation in the other occupation zones (USA, UK, Soviet Union). The temporary migration barrier had long-lasting consequences. In a nutshell, results of a Difference-in-Difference (DiD) analysis show that growth of population and population density were significantly lower even 60 years after the removal of the barrier if a region was part of the French occupation zone. There was a common trend in regional development before the migration barrier became effective. Further analyses suggest that this pattern is driven by different population dynamics in agglomerated areas. The article discusses implications for spatial theory, namely whether location fundamentals, agglomeration theories or both affect the spatial equilibrium under certain conditions.
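The Difference-in-Difference design used in the article reduces, in its canonical 2×2 form, to a one-line computation. The numbers below are invented purely to show the mechanics and are not the article's estimates.

```python
def did_estimate(treated_pre, treated_post, control_pre, control_post):
    """Canonical 2x2 difference-in-differences estimate:
    (change in the treated group) minus (change in the control group)."""
    return (treated_post - treated_pre) - (control_post - control_pre)

# Illustrative (made-up) mean log population levels: 'treated' stands in
# for the French occupation zone, 'control' for the other zones.
effect = did_estimate(treated_pre=4.0, treated_post=4.2,
                      control_pre=4.0, control_post=4.5)
# A negative estimate indicates slower growth in the treated zone,
# which is the direction of the article's finding.
```

The common-trend check reported in the abstract is what licenses attributing this double difference to the migration barrier rather than to pre-existing divergence.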
APA, Harvard, Vancouver, ISO, and other styles
30

Londoño, Jaime A. "A New Theory of Inter-Temporal Equilibrium for Security Markets." SSRN Electronic Journal, 2010. http://dx.doi.org/10.2139/ssrn.1552234.

Full text
APA, Harvard, Vancouver, ISO, and other styles
31

"Stock price fluctuation as a diffusion in a random environment." Philosophical Transactions of the Royal Society of London. Series A: Physical and Engineering Sciences 347, no. 1684 (June 15, 1994): 471–83. http://dx.doi.org/10.1098/rsta.1994.0057.

Full text
Abstract:
The fluctuation of stock prices is modelled as a sequence of temporary equilibria on a financial market with different types of agents. I summarize joint work with M. Schweizer on the class of Ornstein-Uhlenbeck processes in a random environment which appears in the diffusion limit. Moreover, it is shown how the random environment may be generated by the interaction of a large set of agents modelled by Markov chains as they appear in the theory of probabilistic cellular automata.
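The Ornstein–Uhlenbeck dynamics mentioned in the abstract can be simulated with a standard Euler–Maruyama scheme. Here the coefficients are fixed illustrative constants, whereas in the model summarized above they would themselves be generated by the random environment of interacting agents.

```python
import random

def simulate_ou(theta, mu, sigma, x0, h, n, seed=0):
    """Euler-Maruyama discretization of the Ornstein-Uhlenbeck SDE
    dX = theta * (mu - X) dt + sigma dW, over n steps of size h."""
    rng = random.Random(seed)
    x = x0
    for _ in range(n):
        x += theta * (mu - x) * h + sigma * (h ** 0.5) * rng.gauss(0.0, 1.0)
    return x

# Mean reversion toward mu = 0 from a displaced start (parameters illustrative).
x_end = simulate_ou(theta=2.0, mu=0.0, sigma=0.3, x0=5.0, h=0.01, n=2000)
```

With sigma set to 0 the same code reduces to plain exponential relaxation toward mu, which makes the mean-reverting drift term easy to verify in isolation.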
APA, Harvard, Vancouver, ISO, and other styles
32

Zamora, Alejandro J. "Preferencia temporal, múltiples tipos de interés y la teoría austriaca del ciclo económico." REVISTA PROCESOS DE MERCADO, March 8, 2021, 13–63. http://dx.doi.org/10.52195/pm.v13i1.124.

Full text
Abstract:
Traditional expositions of the Austrian Business Cycle Theory (ABCT) rest on the reference to only one – equilibrium – interest rate. This paper, on the one hand, explores the theoretical validity of such a reference and, on the other hand, analyses whether it is essential to the ABCT. To address both issues, we depart from a defense of the recently-disputed Pure Time Preference Theory (PTPT), which purports to explain the cause of the phenomenon of interest and serves as a basis for the ABCT. In the light of our study, in the first place, we reject some of the formulations of the PTPT that lead to erroneous interpretations of this theory and we put forward an enunciation that avoids common confusions; secondly, we deny the validity of the reference to only one interest rate as well as its essentiality to the ABCT; finally, we point out the necessity of updating the exposition of the ABCT in consonance with the previous conclusions. Key words: Austrian Business Cycle Theory, Business Cycles, Pure Time Preference Theory, Interest Rates, Term Structure of Interest Rates. JEL Classification: E32, E40, E43, E50, B53. Resumen: Las exposiciones tradicionales de la Teoría Austriaca del Ciclo Económico (TACE) se apoyan en la referencia a un único tipo de interés de equilibrio. Este artículo, por un lado, explora la cuestión de la validez teórica de dicha referencia y, por otro, estudia si ésta es realmente esencial a la TACE. Para analizar dichas cuestiones, partimos de una defensa de la recientemente cuestionada Teoría de la Preferencia Temporal Pura (TPTP), teoría que pretende explicar la causa del fenómeno del interés y que sirve de fundamento a la TACE.
A la luz de nuestro estudio, en primer lugar, rechazamos ciertas formulaciones de la TPTP que conducen a interpretaciones erróneas de la misma, proponiendo una enunciación de la teoría que evita confusiones comunes; segundo, negamos la validez de la referencia a un tipo de interés único y su carácter esencial a la TACE; por último, notamos la necesidad de actualizar la exposición de la TACE conforme a las conclusiones antedichas. Palabras clave: Teoría Austriaca del Ciclo Económico, Ciclos Económicos, Teoría de la Preferencia Temporal Pura, Tipos de Interés, Estructura de Tipos de Interés. Clasificación JEL: E32, E40, E43, E50, B53.
APA, Harvard, Vancouver, ISO, and other styles
33

Elmberger, Agnes, Erik Björck, Juha Nieminen, Matilda Liljedahl, and Klara Bolander Laksov. "Collaborative knotworking – transforming clinical teaching practice through faculty development." BMC Medical Education 20, no. 1 (December 2020). http://dx.doi.org/10.1186/s12909-020-02407-8.

Full text
Abstract:
Background Faculty development is important for advancing teaching practice in health professions education. However, little is known regarding how faculty development outcomes are achieved and how change in practice may happen through these activities. In this study, we explored how clinical educators integrated educational innovations, developed within a faculty development programme, into their clinical workplaces. Thus, the study seeks to widen the understanding of how change following faculty development unfolds in clinical systems. Methods The study was inspired by case study design and used a longitudinal faculty development programme as a case offering an opportunity to study how participants in faculty development work with change in practice. The study applied activity theory and its concept of activity systems in a thematic analysis of focus group interviews with 14 programme attendees. Participants represented two teaching hospitals, five clinical departments and five different health professions. Results We present the activity systems involved in the integration process and the contradiction that arose between them as the innovations were introduced in the workplace. The findings depict how the faculty development participants and the clinicians teaching in the workplace interacted to overcome this contradiction through iterative processes of negotiating a mandate for change, reconceptualising the innovation in response to workplace reactions, and reconciliation as temporary equilibria between the systems. Conclusion The study depicts the complexities of how educational change is brought about in the workplace after faculty development.
Based on our findings and the activity theoretical concept of knotworking, we suggest that these complex processes may be understood as collaborative knotworking between faculty development participants and workplace staff through which both the output from faculty development and the workplace practices are transformed. Increasing our awareness of these intricate processes is important for enhancing our ability to make faculty development reach its full potential in bringing educational change in practice.
APA, Harvard, Vancouver, ISO, and other styles
34

"The evolution of the universe in asymmetric cosmic time." Advances in Theoretical & Computational Physics 4, no. 3 (July 5, 2021). http://dx.doi.org/10.33140/atcp.04.03.01.

Full text
Abstract:
The Cosmic Time Hypothesis (CTH) presented in this paper is a purely axiomatic theory. In contrast to today's standard model of cosmology, the ΛCDM model, it does not contain empirical parameters such as the cosmological constant Λ, nor does it contain sub-theories such as the inflation theory. The CTH was developed solely on the basis of the general theory of relativity (GRT), aiming for the greatest possible simplicity. The simplest cosmological model permitted by GRT is the Einstein-de Sitter model. It is the basis for solving some of the fundamental problems of cosmology that concern us today. First of all, the most important results of the CTH: It solves one of the biggest problems of cosmology, the problem of the cosmological constant (Λ), by removing the relation between Λ and the vacuum energy density ɛv (Λ = 0, ɛv > 0). According to the CTH, the vacuum energy density ɛv is not negative and constant, as previously assumed, but positive and time-dependent (ɛv ∼ t^-2). ɛv is part of the total energy density (Ɛ) of the universe and is contained in the energy-momentum tensor of Einstein's field equations. Cosmology is thus freed from unnecessary ballast, i.e. a free parameter (= natural constant) is omitted (Λ = 0). Conclusion: There is no "dark energy"! According to the CTH, the numerical value of the vacuum energy density ɛv is smaller by a factor of ≈10^-122 than the value calculated from quantum field theory and is thus consistent with observation. The measurement data obtained from observations of SNIa supernovae, which suggest a currently accelerated expansion of the universe, result - if interpreted from the point of view of the CTH - in a decelerated expansion, as required by the Einstein-de Sitter universe. Dark matter could also possibly not exist, because the CTH demands that the "gravitational constant" is time-dependent and becomes larger the further the observed objects are spatially and thus also temporally distant from us.
Gravitationally bound local systems, e.g. Earth - Moon or Sun - Earth, expand according to the same law as the universe. This explains why Hubble's law also applies within very small groups of galaxies, as observations show. The CTH requires that the strongest force (strong nuclear force) and the weakest (gravitational force) at Planck time (tp ≈ 10^-43 seconds after the "big bang"), when all forces of nature are supposed to have been united in a single super force, were of equal magnitude and had the same range. According to the CTH, the product of the strength and range of the gravitational force is constant, i.e. independent of time, and is identical to the product of the strength and range of the strong nuclear force. At Planck time, the universe had the size of an elementary particle (Rp = rE ≈ 10^-15 m). This value also corresponds to the range of the strong nuclear force (Yukawa radius) and the Planck length at Planck time. The CTH provides a possible explanation for Mach's first and second principles. It solves some old problems of the big bang theory in a simple and natural way: the problem of the horizon, flatness, galaxy formation and the age of the world. The inflation theory thus becomes superfluous. • The CTH provides the theoretical basis for the theory of Earth expansion • In Cosmic Time, there was no Big Bang. The universe is infinitely old. • Unlike other cosmological models, the CTH does not require defined "initial conditions" because there was no beginning. • The CTH explains why the cosmic expansion is permanently in an unstable state of equilibrium, which is necessary for a long-term flat (Euclidean), evolutionarily developing universe.
APA, Harvard, Vancouver, ISO, and other styles
35

Harrison, Karey. "Building Resilient Communities." M/C Journal 16, no. 5 (August 24, 2013). http://dx.doi.org/10.5204/mcj.716.

Full text
Abstract:
This paper will compare the metaphoric structuring of the ecological concept of resilience, with its roots in Holling's 1973 paper, with psychological concepts of resilience which followed from research published in the early 1970s (such as Werner, Bierman, and French, and Garmezy and Streitman). This metaphoric analysis will expose the difference between the complex adaptive systems models of resilience found in ecology and in studies of resilience in relation to climate change, on the one hand, and the individualism of the linear equilibrium models of resilience which have dominated discussions of resilience in psychology and economics, on the other. By examining the ontological commitments of these competing metaphors, I will show that the individualistic concept of resilience which dominates psychological discussions of resilience is incompatible with the ontological commitments of ecological concepts of resilience. Because the ontological commitments of the concepts of ecological resilience on the one hand, and psychological resilience on the other, are so at odds with one another, it is important to be clear which concept of resilience is being evaluated for its adequacy as a concept. Having clearly distinguished these competing metaphors and their ontological commitments, this paper will show that it is the complex adaptive systems model of resilience from ecology, not the individualist concept of psychological resilience, that has been utilised by both the academic discussions of adaptation to climate change, and the operationalisation of the concept of resilience by social movements like the permaculture, ecovillage, and Transition Towns movements. Ontological Metaphors My analysis of ontological metaphors draws on insights from Kuhn's (114) account of gestalt perception in scientific paradigm shifts; the centrality of the role of concrete analogies in scientific reasoning (Masterman 77); and the theorisation of ontological metaphors in cognitive linguistics (Gärdenfors).
Figure 1: Object. Ontological commitments reflect the shared beliefs within a community about the sorts of things that exist. Our beliefs about what exists are shaped by our sensory and motor interactions with objects in the physical world. Physical objects have boundaries and surfaces that separate the object from not-the-object. Objects have insides and outsides, and can be described in terms of more-or-less fixed and stable “objective” properties. A prototypical example of an “object” is a “container”, like the example shown in Figure 1. Ontological metaphors allow us to conceive of “things” which are not objects as if they were objects by picking “out parts of our experience and treat them as [if they were] discrete entities or substances of a uniform kind” (Lakoff and Johnson 25). We use ontological metaphors when we imagine a boundary around a collection of things, such as the members of a team or trees in a forest, and conceive of them as being in a container (Langacker 191–97). We can then think of “things” like a team or forest as if they were a single entity. We can also understand processes and activities as if they were things with boundaries. Whether or not we characterise some aspect of our experience as a noun (a bounded entity) or as a verb (a process that occurs over time) is not determined by the nature of things in themselves, but by our understanding and interpretation of our experience (Langacker 233). In this paper I employ a technique that involves examining the details of “concrete images” from the source domains for metaphors employed in the social sciences to expose for analysis their ontological commitments (Harrison, “Politics” 215; Harrison, “Economics” 7).
By examining the ontological metaphors that structure the resilience literature I will show how different conceptions of resilience reflect different beliefs and commitments about the sorts of “things” there are in the world, and hence how we can study and understand these “things.” Engineering Metaphors In his discussion of engineering resilience, Holling (“Engineering Vs. Ecological” 33) argues that this conception is the “foundation for economic theory”, and defined in terms of “resistance to disturbance and the speed of return to the equilibrium” or steady state of the system. Whereas Holling takes his original example of the use of the engineering concept of resilience from economics, Pendall, Foster, & Cowell (72), and Martin-Breen and Anderies (6) identify it as the concept of resilience that dominates the field of psychology. They take the stress loading of bridges to be the engineering source for the metaphor. Figure 2: Pogo stick animation (Source: Blacklemon 67, CC http://en.wikipedia.org/wiki/File:Pogoanim.gif). In order to understand this metaphor, we need to examine the characteristics of the source domain for the metaphor. A bridge can be “under tension, compression or both forces at the same time [and] experiences what engineers define as stress” (Matthews 3). In order to resist these forces, bridges need to be constructed of material which “behave much like a spring” that “strains elastically (deforms temporarily and returns to its original shape after a load has been removed) under a given stress” (Gordon 52; cited in Matthews). The pogostick shown in Figure 2 illustrates how a spring returns to its original size and configuration once the load or stress is removed. 
WGBH Educational Foundation provides links to simple diagrams that illustrate the different stresses the three main designs of bridges are subject to, and if you compare Computers & Engineering's bridge animation with Gibbs and Bourne's harmonic spring animation, you can see how both a bridge under live load and the pogostick in Figure 2 oscillate just like a harmonic spring. Subject to the elastic limits of the material, the deformation of a spring is proportional to the stress or load applied. According to the “modern theory of elasticity [...] it [is] possible to deduce the relation between strain and stress for complex objects in terms of intrinsic properties of the materials it is made of” (“Hooke’s Law”). When psychological resilience is characterised in terms of “properties of individuals [that] are identified in isolation” (Martin-Breen and Anderies 12); and in terms of “behaviours and attributes [of individuals] that allow people to get along with one another and to succeed socially” (Pendall, Foster, and Cowell 72), they are reflecting this engineering focus on the properties of materials. Martin-Breen and Anderies (42) argue that “the Engineering Resilience framework” has been informed by ontological metaphors which treat “an ecosystem, person, city, government, bridge, [or] society” as if it were an object, “a unified whole”. Because this concept of resilience treats individuals as “objects,” it leads researchers to look for the properties or characteristics of the “materials” which individuals are “made of”, which are either elastic and allow them to “bounce” or “spring” back after stress; or are fragile and brittle and break under load. Similarly, the Designers Institute (DINZ), in its conference on “Our brittle society,” shows it is following the engineering resilience approach when it conceives of a city or society as an object which is made of materials which are either “strong and flexible” or “brittle and fragile”.
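The spring physics behind this "engineering resilience" metaphor can be made concrete in a short numerical sketch (mine, not from Harrison's article; the function names and parameter values are illustrative only). Within the elastic limit, Hooke's law makes deformation proportional to load, and a released spring oscillates harmonically:

```python
# Minimal sketch of the spring behaviour described above (Hooke's law and
# undamped harmonic oscillation). All names and values are illustrative.
import math

def spring_displacement(force_n, stiffness_n_per_m):
    """Elastic deformation under load: x = F / k (Hooke's law)."""
    return force_n / stiffness_n_per_m

def position_at(t_s, amplitude_m, stiffness_n_per_m, mass_kg):
    """Position after the load is released: x(t) = A * cos(omega * t),
    with omega = sqrt(k / m) -- the 'bounce back' of the metaphor."""
    omega = math.sqrt(stiffness_n_per_m / mass_kg)
    return amplitude_m * math.cos(omega * t_s)

# A stiffer spring (larger k) deforms less under the same load --
# the metaphor's more "resistant" individual.
print(spring_displacement(10.0, 100.0))   # 0.1 m
print(spring_displacement(10.0, 1000.0))  # 0.01 m
```

Note how the model locates resilience entirely in a property of the material (the stiffness k), exactly the individualism the paper criticises.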
While Holling characterises economic theory in terms of this engineering metaphor, it is in fact chemistry and the kinetic theory of gases that provide the source domain for the ontological metaphor which structures both static and dynamic equilibrium models within neo-classical economics (Smith and Foley; Mirowski). However, while springs are usually made out of metals, they can be made out of any “material [that] has the required combination of rigidity and elasticity,” such as plastic, and even wood (in a bow) (“Spring (device)”). Gas under pressure turns out to behave the same as other springs or elastic materials do under load. Because the economic metaphor based on the equilibrium theory of gases and the engineering analysis of bridges under load can both be subsumed under spring theory, we can treat both the economic (gas) metaphor and the engineering (bridge) metaphor as minor variations of a single overarching (spring) metaphor. Complex Systems Metaphors Holling (“Resilience & Stability” 13–15) critiques equilibrium models, arguing that non-deterministic, complex, non-equilibrium and multi-equilibrium ecological systems do not satisfy the conditions for application of equilibrium models. Holling argues that unlike the single equilibrium modelled by engineering resilience, complex adaptive systems (CAS) may have multiple or no equilibrium states, and be non-linear and non-deterministic. Walker and Salt follow Holling by calling for recognition of the “dynamic complexity of the real world” (8), and that “these [real world] systems are complex adaptive systems” (11). Martin-Breen and Anderies (7) identify the key difference between “systems” and “complex adaptive systems” resilience as adaptive capacity, which like Walker and Salt (xiii), they define as the capacity to maintain function, even if system structures change or fail.
The “engineering” concept of resilience focuses on the (elastic) properties of materials and uses language associated with elastic springs. This “spring” metaphor emphasises the properties of individual components. In contrast, ecological concepts of resilience examine interactions between elements, and the state of the system in a multi-dimensional phase space. This systems approach shows that the complex behaviour of a system depends at least as much on the relationships between elements as on the properties of the individual components. These relationships can lead to “emergent” properties which cannot be reduced to the properties of the parts of the system. To explain these relationships and connections, ecologists and climate scientists use language and images associated with landscapes such as 2-D cross-sections and 3-D topology (Holling, “Resilience & Stability” 20; Pendall, Foster, and Cowell 74). Figure 3 is based on an image used by Walker, Holling, Carpenter and Kinzig (fig. 1b) to represent possible states of ecological systems. The “basins” in the image rely on our understanding of gravitational forces operating in a 3-D space to model “equilibrium” states in which the system, like the “ball” in the “basin”, will tend to settle. Figure 3: Tipping Point Bifurcation (based on Langston; in Walker et al. fig. 1b). Wasdell (“Feedback” fig. 4) adapted this image to represent possible climate states and explain the concept of “tipping points” in complex systems. I have added the red balls (a, b, and c, to replace the one black ball (b) in the original which represented the state of the system), the red lines which indicate the path of the ball/system, and the black x-y axis, in order to discuss the image. Wasdell (“Feedback Dynamics” slide 22) takes the left basin to represent “the variable, near-equilibrium, but contained dynamics of the [current] glacial/interglacial period”. As a result of rising GHG levels, the climate system absorbs more energy (mostly as heat).
This energy can force the system into a different, hotter, state, less amenable to life as we know it. This is shown in Figure 3 by the system (represented as the red ball a) rising up the left basin (point b). From the perspective of the gravitational representation in Figure 3, the extra energy in the basin operates like the rotation in a Gravitron amusement ride, where centrifugal force pushes riders up the sides of the ride. If there is enough energy added to the climate system it could rise up and jump over the ridge/tipping point separating the current climate state from the “hot earth” basin shown on the right. Once the system falls into the right basin, it may be stuck near point c, and due to reinforcing feedbacks have difficulty escaping this new “equilibrium” state. Figure 4 represents a 2-D cross-section of the 3-D landscape shown in Figure 3. This cross-section shows how rising temperature and greenhouse gas (GHG) concentrations in a multi-equilibrium climate topology can lead to the climate crossing a tipping point and shifting from state a to state c. Figure 4: Topographic cross-section of possible climate states (derived from Wasdell, “Feedback” 26 CC). As Holling (“Resilience & Stability”) warns, a less “desirable” state, such as population collapse or extinction, may be more “resilient”, in the engineering sense, than a more desirable state. Wasdell (“Feedback Dynamics” slide 22) warns that the climate forcing as a result of human-induced GHG emissions is in fact pushing the system “far away from equilibrium, passed the tipping point, and into the hot-earth scenario”.
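The ball-in-basin landscape of Figures 3 and 4 can be sketched numerically (my illustration, not from the article or from Wasdell's model; the potential and parameter values are invented for clarity). A double-well "landscape" has two stable basins separated by an unstable ridge, and a ball relaxing on it settles in whichever basin it starts in:

```python
# Minimal sketch of a two-basin landscape with a tipping point (ridge).
# V(x) = x**4 - 2*x**2 has stable basins at x = -1 and x = +1 and an
# unstable equilibrium (the ridge) at x = 0. All values are illustrative.
def potential(x):
    """The 'landscape' height at position x."""
    return x**4 - 2 * x**2

def equilibrium_from(x0, dt=0.01, steps=10_000):
    """Overdamped relaxation dx/dt = -dV/dx: the ball slides downhill
    until it settles at the bottom of a basin."""
    x = x0
    for _ in range(steps):
        x += -(4 * x**3 - 4 * x) * dt   # -dV/dx
    return round(x, 6)

# Starting just left of the ridge settles in the left basin; starting
# just right of it settles in the right basin -- a small displacement
# across the tipping point leads to a qualitatively different state.
print(equilibrium_from(-0.2))  # -1.0 (left basin)
print(equilibrium_from(0.2))   # 1.0 (right basin)
```

The ridge at x = 0 plays the role of the tipping point: the system balanced exactly there stays put, but any perturbation sends it into one basin or the other.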
In previous episodes of extreme radiative forcing, this “disturbance has then been amplified by powerful feedback dynamics not active in the near-equilibrium state [… and] have typically resulted in the loss of about 90% of life on earth.” An essential element of system dynamics is the existence of (delayed) reinforcing and balancing causal feedback loops, such as the ones illustrated in Figure 5. Figure 5: Prey/Predator model (Bellinger, CC-BY-SA). In the case of Figure 5, the feedback loops illustrate how an increasing rabbit population leads to foxes feeding on the rabbits, keeping the rabbit population within the carrying capacity of the ecosystem. Fox predation prevents rabbit over-population and consequent starvation of rabbits. The reciprocal interaction of the elements of a system leads to unpredictable nonlinearity in “even seemingly simple systems” (“System Dynamics”). The climate system is subject to both positive and negative feedback loops. If the area of ice cover increases, more heat is reflected back into space, creating a positive feedback loop, reinforcing cooling. Conversely, as the arctic ice melts, as it is doing at present (Barber), heat previously reflected back into space is absorbed by now exposed water, increasing the rate of warming. Where negative feedback (system damping) dominates, the cup-shaped equilibrium is stable and system behaviour returns to base when subject to disturbance. […] The impact of extreme events, however, indicates limits to the stable equilibrium. At one point cooling feedback loops overwhelmed the homeostasis, precipitating the "snowball earth" effect. […] Massive release of CO2 as a result of major volcanic activity […] set off positive feedback loops, precipitating runaway global warming and eliminating most life forms at the end of the Permian period.
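The rabbit/fox balancing loop of Figure 5 is the classic Lotka-Volterra prey/predator model, which can be sketched in a few lines (my illustration, not from Bellinger's SystemsWiki model; the rate constants are invented for the example). More rabbits feed more foxes, and more foxes eat more rabbits, so the two populations cycle within bounds rather than exploding or collapsing:

```python
# Minimal sketch of the prey/predator feedback loop described above.
# Euler integration of the Lotka-Volterra equations:
#   dR/dt = birth*R - predation*R*F      (rabbits)
#   dF/dt = efficiency*R*F - death*F     (foxes)
# All rate constants are illustrative.
def lotka_volterra(rabbits, foxes, steps=10_000, dt=0.001,
                   birth=1.0, predation=0.1, efficiency=0.075, death=0.5):
    r, f = rabbits, foxes
    for _ in range(steps):
        dr = (birth * r - predation * r * f) * dt
        df = (efficiency * r * f - death * f) * dt
        r, f = r + dr, f + df
    return r, f

r, f = lotka_volterra(10.0, 5.0)
# The balancing feedback keeps both populations positive and bounded:
# neither unchecked rabbit growth nor total collapse.
print(r > 0 and f > 0)  # True
```

The regulatory (balancing) loop here is exactly the first of the four systems-resilience factors listed in the next paragraph: an increase in one element (rabbits) is kept in check by another (foxes).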
(Wasdell, “Topological”) Martin-Breen and Anderies (53–54), following Walker and Salt, identify four key factors for systems (ecological) resilience in nonlinear, non-deterministic (complex adaptive) systems: regulatory (balancing) feedback mechanisms, where increase in one element is kept in check by another element; modularity, where failure in one part of the system will not cascade into total systems failure; functional redundancy, where more than one element performs every essential function; and self-organising capacity rather than central control, which ensures the system continues without the need for “leadership”. Transition Towns as a Resilience Movement The Transition Town (TT) movement draws on systems modelling of both climate change and Limits to Growth (Meadows et al.). TT takes seriously Limits to Growth modelling that showed that without constraints in population and consumption the world faces systems collapse by the middle of this century. It recommends community action to build as much capacity as possible to “maintain existence of function” (Holling's definition of ecological resilience; “Engineering vs. Ecological” 33) in the face of failing economic, political and environmental systems. The Transition Network provides a template for communities to follow to “rebuild resilience and reduce CO2 emissions”. Rob Hopkins, the movement's founder, explicitly identifies ecological resilience as its central concept (Transition Handbook 6). The idea for the movement grew out of a project completed for Hopkins by 2nd year students at the Kinsale Further Education College. According to Hopkins (“Kinsale”), this project was inspired by Holmgren’s Permaculture principles and Heinberg's book on adapting to life after peak oil. Permaculture (permanent agriculture) is a design system for creating agricultural systems modelled on the diversity, stability, and resilience of natural ecosystems (Mollison ix; Holmgren xix).
Permaculture draws its scientific foundations from systems ecology (Holmgren xxv). Following CAS theory, Mollison (33) defines stability as “self-regulation”, rather than “climax” or a single equilibrium state, and recommends “diversity of beneficial functional connections” (32) rather than diversity of isolated elements. Permaculture understands resilience in the ecological, rather than the engineering, sense. The Transition Handbook (17) “explores the issues of peak oil and climate change, and how when looked at together, we need to be focusing on the rebuilding of resilience as well as cutting carbon emissions. It argues that the focus of our lives will become increasingly local and small scale as we come to terms with the real implications of the energy crisis we are heading into.” The Transition Towns movement incorporates each of the four systems resilience factors, listed at the end of the previous section, into its template for building resilient communities (Hopkins, Transition Handbook 55–6). Many of its recommendations build “modularity” and “self-organising”, such as encouraging communities to build “local food systems, [and] local investment models”. Hopkins argues that in a “more localised system” feedback loops are tighter, and the “results of our actions are more obvious”. TT training exercises include awareness raising for sensitivity to networks of (actual or potential) ecological, social and economic relationships (Hopkins, Transition Handbook 60–1). TT promotes diversity of local production and economic activities in order to increase “diversity of functions” and “diversity of responses to challenges.” Heinberg (8) wrote the foreword to the 2008 edition of the Transition Handbook, after speaking at a Totnes Transition Town meeting.
Heinberg is now a senior fellow at the Post Carbon Institute (PCI), which was established in 2003 to “provide […] the resources needed to understand and respond to the interrelated economic, energy, environmental, and equity crises that define the 21st century [… in] a world of resilient communities and re-localized economies that thrive within ecological bounds” (PCI, “About”), of the sort envisioned by the Limits to Growth model discussed in the previous section. Given the overlapping goals of PCI and Transition Towns, it is not surprising that Rob Hopkins is now a Fellow of PCI and a regular contributor to Resilience, and there are close ties between the two organisations. Resilience, which until 2012 was published as the Energy Bulletin, is run by PCI. Like Transition Towns, Resilience aims to build “community resilience in a world of multiple emerging challenges: the decline of cheap energy, the depletion of critical resources like water, complex environmental crises like climate change and biodiversity loss, and the social and economic issues which are linked to these. […] It has [its] roots in systems theory” (PCI, “About Resilience”). Resilience.org says it follows Resilience Alliance (RA) Program Director Brian Walker and science writer David Salt's (xiii) ecological definition of resilience as “the capacity of a system to absorb disturbance and still retain its basic function and structure.” Conclusion This paper has analysed the ontological metaphors structuring competing conceptions of resilience. The engineering resilience metaphor dominates in psychological resilience research, but is not adequate for understanding resilience in complex adaptive systems. Ecological resilience, on the other hand, dominates in environmental and climate change research, and is the model of resilience that has been incorporated into the global permaculture and Transition Towns movements. References 2nd year students.
Kinsale 2021: An Energy Descent Action Plan. Kinsale, Cork, Ireland: Kinsale Further Education College, 2005. 16 Aug. 2013 ‹http://transitionculture.org/wp-content/uploads/KinsaleEnergyDescentActionPlan.pdf>. Barber, Elizabeth. “Arctic Ice Continues to Thin, and Thin, European Satellite Reveals.” Christian Science Monitor 11 Sep. 2013. 25 Sep. 2013 ‹http://www.csmonitor.com/Environment/2013/0911/Arctic-ice-continues-to-thin-and-thin-European-satellite-reveals>. Bellinger, Gene. “Prey/Predator Model.” SystemsWiki 23 Nov. 2009. 16 Aug. 2013 ‹http://systemswiki.org/index.php?title=Prey/Predator_Model>. Blacklemon67. "Pogo Animation." Wikipedia 2007. 24 Sep. 2013 ‹http://en.wikipedia.org/wiki/File:Pogoanim.gif>. Computers & Engineering. Bridge Trucks Animated Stress Plot 1. 2003. GIF file. SAP2000 Bridge Design. ‹http://www.comp-engineering.com/announce/bridge/demo/truck_1.gif>. DINZ. “Resilience Engineering: 'Our Brittle Society' - The Sustainability Society - May 18th 2012.” The Designers Institute. 2013. 11 Aug. 2013 ‹http://www.dinz.org.nz/Events/2012/May/47965>. Gärdenfors, Peter. “Cognitive Semantics and Image Schemas with Embodied Forces.” Embodiment in Cognition and Culture. Ed. John Michael Krois et al. John Benjamins Publishing, 2007. 57–76. 8 Nov. 2012 ‹http://oddelki.ff.uni-mb.si/filozofija/files/Festschrift/Dunjas_festschrift/gardenfors.pdf>. Garmezy, N, and S Streitman. “Children at Risk: The Search for the Antecedents of Schizophrenia. Part I. Conceptual Models and Research Methods.” Schizophrenia Bulletin 8 (1974): 14–90. NCBI PubMed 14 Aug. 2013 ‹http://schizophreniabulletin.oxfordjournals.org/content/1/8/14.full.pdf>. Gibbs, Keith, and John Bourne. “The Helical Spring.” Schoolphysics 2013. 15 Aug. 2013 ‹http://www.schoolphysics.co.uk/animations/Helical_spring_shm/index.html>. Gordon, James Edward. Structures: Or, Why Things Don’t Fall Down. London: Plenum Press, 1978. Harrison, Karey. 
“Image Schemas and Political Ontology.” Communication, Cognition and Media: Political and Economic Discourse. Ed. Augusto Soares da Silva et al. Portugal: Aletheia, forthcoming. ———. “Ontological Commitments of Ethics and Economics.” Economic Thought 2.1 (2013): 1–19. 23 Apr. 2013 ‹http://et.worldeconomicsassociation.org/article/view/64>. Heinberg, Richard. Powerdown: Options and Actions for a Post-carbon World. New Society Publishers, 2004. Holling, Crawford Stanley. “Engineering Resilience versus Ecological Resilience.” Engineering within Ecological Constraints. Ed. Peter Schulze. Washington, DC: National Academy Press, 1996. 31–44. 11 Aug. 2013 ‹http://www.nap.edu/openbook.php?record_id=4919&page=31>. ———. “Resilience and Stability of Ecological Systems.” Annual Review of Ecology and Systematics 4.1 (1973): 1–23. 11 Aug. 2013 ‹http://webarchive.iiasa.ac.at/Admin/PUB/Documents/RP-73-003.pdf>. Holmgren, David. Permaculture: Principles & Pathways beyond Sustainability. Holmgren Design Services, 2002. Hopkins, Rob. “Kinsale Energy Descent Action Plan (2005).” Transition Culture: an Evolving Exploration into the Head, Heart and Hands of Energy Descent. n.d. 16 Aug. 2013 ‹http://transitionculture.org/essential-info/pdf-downloads/kinsale-energy-descent-action-plan-2005/>. ———. The Transition Handbook: From Oil Dependency to Local Resilience. Green Books, 2008. Print. ———. The Transition Handbook: From Oil Dependency to Local Resilience. Free edit version. ‹http://www.appropedia.org/Category:The_Transition_Handbook: Appropedia.org> 2010. 16 Aug. 2010 ‹http://www.cs.toronto.edu/~sme/CSC2600/transition-handbook.pdf>. Kuhn, Thomas. The Structure of Scientific Revolutions. 2nd ed. University of Chicago Press, 1962. Lakoff, George, and Mark Johnson. Metaphors We Live By. University of Chicago Press, 1980. Langacker, Ronald W. Foundations of Cognitive Grammar: Theoretical Prerequisites. Vol. 1. Stanford University Press, 1987. Langston, Art. 
“Tipping Point” or Bifurcation Between Two Attractor Basins. 2004. 25 Sep. 2013. ‹http://www.ecologyandsociety.org/vol9/iss2/art5/figure1.html>. Martin-Breen, Patrick, and J. Marty Anderies. Resilience: A Literature Review. Rockefeller Foundation, 2011. 8 Aug. 2013 ‹http://www.rockefellerfoundation.org/blog/resilience-literature-review>. Masterman, Margaret. “The Nature of a Paradigm.” Criticism and the Growth of Knowledge. Eds. Imre Lakatos & Alan Musgrave. Cambridge University Press, 1970. 59–89. Matthews, Theresa. “The Physics of Bridges.” Yale-New Haven Teachers Institute. 2013. 14 Aug. 2013 ‹http://www.yale.edu/ynhti/curriculum/units/2001/5/01.05.08.x.html>. Meadows, Donella H. et al. The Limits to Growth: A Report for the Club of Rome’s Project on the Predicament of Mankind. Universe Books, 1972. Mirowski, Philip. “From Mandelbrot to Chaos in Economic Theory.” Southern Economic Journal 57.2 (1990): 289–307. Mollison, Bill. Permaculture: A Designers’ Manual. Tagari Publications, 1988. PCI. “About.” Post Carbon Institute. 16 July 2012. 16 Aug. 2013 ‹http://www.postcarbon.org/about/>. ———. “About Resilience.org.” Resilience 16 July 2012. 16 Aug. 2013 ‹http://www.resilience.org/about>. Pendall, Rolf, Kathryn A. Foster, and Margaret Cowell. “Resilience and Regions: Building Understanding of the Metaphor.” Cambridge Journal of Regions, Economy and Society 3.1 (2010): 71–84. 4 Aug. 2013 ‹http://cjres.oxfordjournals.org/content/3/1/71>. RA. “About RA.” Resilience Alliance 2013. 16 Aug. 2013 ‹http://www.resalliance.org/index.php/about_ra>. Smith, Eric, and Duncan K. Foley. “Classical Thermodynamics and Economic General Equilibrium Theory.” Journal of Economic Dynamics and Control 32.1 (2008): 7–65. Transition Network. “About Transition Network.” Transition Network. 2012. 16 Aug. 2013 ‹http://www.transitionnetwork.org/about>. Walker, B. H., and David Salt. Resilience Thinking: Sustaining Ecosystems and People in a Changing World. Island Press, 2006. 
Walker, Brian et al. “Resilience, Adaptability and Transformability in Social–Ecological Systems.” Ecology and Society 9.2 (2004): 5. Wasdell, David. “A Topological Approach.” The Feedback Crisis in Climate Change: The Meridian Report. n.d. 16 Aug. 2013 ‹http://www.meridian.org.uk/Resources/Global%20Dynamics/Feedback%20Crisis/frameset1.htm?p=3>. ———. “Beyond the Tipping Point: Positive Feedback and the Acceleration of Climate Change.” The Foundation for the Future, Humanity 3000 Workshop. Seattle, 2006. ‹http://www.meridian.org.uk/_PDFs/BeyondTippingPoint.pdf>. ———. “Feedback Dynamics and the Acceleration of Climate Change.” Winterthur, 2008. 16 Aug. 2013 ‹http://www.crisis-forum.org.uk/events/Workshop1/Workshop1_presentations/wasdellpictures/wasdell_clubofrome.php>. Werner, Emmy E., Jessie M. Bierman, and Fern E. French. The Children of Kauai: A Longitudinal Study from the Prenatal Period to Age Ten. University of Hawaii Press, 1971. WGBH. “Bridge Basics.” Building Big. 2001. 14 Aug. 2013 ‹http://www.pbs.org/wgbh/buildingbig/bridge/basics.html>. Wikipedia contributors. “Gravitron.” Wikipedia, the Free Encyclopedia 20 Sep. 2013. 25 Sep. 2013 ‹http://en.wikipedia.org/wiki/Gravitron>. ———. “Hooke’s Law.” Wikipedia, the Free Encyclopedia 8 Aug. 2013. 15 Aug. 2013 ‹http://en.wikipedia.org/wiki/Hooke%27s_law>. ———. “Spring (device).” Wikipedia, the Free Encyclopedia 9 Aug. 2013. 24 Sep. 2013 ‹http://en.wikipedia.org/wiki/Spring_(device)>. ———. “System Dynamics.” Wikipedia, the Free Encyclopedia 9 Aug. 2013. 13 Aug. 2013 ‹http://en.wikipedia.org/wiki/System_dynamics>.
APA, Harvard, Vancouver, ISO, and other styles
36

Dieter, Michael. "Amazon Noir." M/C Journal 10, no. 5 (October 1, 2007). http://dx.doi.org/10.5204/mcj.2709.

Full text
Abstract:
There is no diagram that does not also include, besides the points it connects up, certain relatively free or unbounded points, points of creativity, change and resistance, and it is perhaps with these that we ought to begin in order to understand the whole picture. (Deleuze, “Foucault” 37) Monty Cantsin: Why do we use a pervert software robot to exploit our collective consensual mind? Letitia: Because we want the thief to be a digital entity. Monty Cantsin: But isn’t this really blasphemic? Letitia: Yes, but god – in our case a meta-cocktail of authorship and copyright – can not be trusted anymore. (Amazon Noir, “Dialogue”) In 2006, some 3,000 digital copies of books were silently “stolen” from online retailer Amazon.com by targeting vulnerabilities in the “Search inside the Book” feature from the company’s website. Over several weeks, between July and October, a specially designed software program bombarded the Search Inside!™ interface with multiple requests, assembling full versions of texts and distributing them across peer-to-peer networks (P2P). Rather than a purely malicious and anonymous hack, however, the “heist” was publicised as a tactical media performance, Amazon Noir, produced by self-proclaimed super-villains Paolo Cirio, Alessandro Ludovico, and Ubermorgen.com. While controversially directed at highlighting the infrastructures that materially enforce property rights and access to knowledge online, the exploit additionally interrogated its own interventionist status as theoretically and politically ambiguous. That the “thief” was represented as a digital entity or machinic process (operating on the very terrain where exchange is differentiated) and the emergent act of “piracy” was fictionalised through the genre of noir conveys something of the indeterminacy or immensurability of the event. 
In this short article, I discuss some political aspects of intellectual property in relation to the complexities of Amazon Noir, particularly in the context of control, technological action, and discourses of freedom. Software, Piracy As a force of distribution, the Internet is continually subject to controversies concerning flows and permutations of agency. While often directed by discourses cast in terms of either radical autonomy or control, the technical constitution of these digital systems is more regularly a case of establishing structures of operation, codified rules, or conditions of possibility; that is, of guiding social processes and relations (McKenzie, “Cutting Code” 1-19). Software, as a medium through which such communication unfolds and becomes organised, is difficult to conceptualise as a result of being so event-orientated. There lies a complicated logic of contingency and calculation at its centre, a dimension exacerbated by the global scale of informational networks, where the inability to comprehend an environment that exceeds the limits of individual experience is frequently expressed through desires, anxieties, paranoia. Unsurprisingly, cautionary accounts and moral panics on identity theft, email fraud, pornography, surveillance, hackers, and computer viruses are as commonplace as those narratives advocating user interactivity. When analysing digital systems, cultural theory often struggles to describe forces that dictate movement and relations between disparate entities composed by code, an aspect heightened by the intensive movement of informational networks where differences are worked out through the constant exposure to unpredictability and chance (Terranova, “Communication beyond Meaning”). 
Such volatility partially explains the recent turn to distribution in media theory, as once durable networks for constructing economic difference – organising information in space and time (“at a distance”), accelerating or delaying its delivery – appear contingent, unstable, or consistently irregular (Cubitt 194). Attributing actions to users, programmers, or the software itself is a difficult task when faced with these states of co-emergence, especially in the context of sharing knowledge and distributing media content. Exchanges between corporate entities, mainstream media, popular cultural producers, and legal institutions over P2P networks represent an ongoing controversy in this respect, with numerous stakeholders competing between investments in property, innovation, piracy, and publics. Beginning to understand this problematic landscape is an urgent task, especially in relation to the technological dynamics that organise and propel such antagonisms. In the influential fragment, “Postscript on the Societies of Control,” Gilles Deleuze describes the historical passage from modern forms of organised enclosure (the prison, clinic, factory) to the contemporary arrangement of relational apparatuses and open systems as being materially provoked by – but not limited to – the mass deployment of networked digital technologies. In his analysis, the disciplinary mode most famously described by Foucault is spatially extended to informational systems based on code and flexibility. According to Deleuze, these cybernetic machines are connected into apparatuses that aim for intrusive monitoring: “in a control-based system nothing’s left alone for long” (“Control and Becoming” 175). Such a constant networking of behaviour is described as a shift from “molds” to “modulation,” where controls become “a self-transmuting molding changing from one moment to the next, or like a sieve whose mesh varies from one point to another” (“Postscript” 179).
Accordingly, the crisis underpinning civil institutions is consistent with the generalisation of disciplinary logics across social space, forming an intensive modulation of everyday life, but one ambiguously associated with socio-technical ensembles. The precise dynamics of this epistemic shift are significant in terms of political agency: while control implies an arrangement capable of absorbing massive contingency, a series of complex instabilities actually mark its operation. Noise, viral contamination, and piracy are identified as key points of discontinuity; they appear as divisions or “errors” that force change by promoting indeterminacies in a system that would otherwise appear infinitely calculable, programmable, and predictable. The rendering of piracy as a tactic of resistance, a technique capable of levelling out the uneven economic field of global capitalism, has become a predictable catch-cry for political activists. In their analysis of multitude, for instance, Antonio Negri and Michael Hardt describe the contradictions of post-Fordist production as conjuring forth a tendency for labour to “become common.” That is, as productivity depends on flexibility, communication, and cognitive skills, directed by the cultivation of an ideal entrepreneurial or flexible subject, the greater the possibilities for self-organised forms of living that significantly challenge its operation. In this case, intellectual property exemplifies such a spiralling paradoxical logic, since “the infinite reproducibility central to these immaterial forms of property directly undermines any such construction of scarcity” (Hardt and Negri 180). The implications of the filesharing program Napster, accordingly, are read as not merely directed toward theft, but in relation to the private character of the property itself; a kind of social piracy is perpetuated that is viewed as radically recomposing social resources and relations. 
Ravi Sundaram, a co-founder of the Sarai new media initiative in Delhi, has meanwhile drawn attention to the existence of “pirate modernities” capable of being actualised when individuals or local groups gain illegitimate access to distributive media technologies; these are worlds of “innovation and non-legality,” of electronic survival strategies that partake in cultures of dispersal and escape simple classification (94). Meanwhile, pirate entrepreneurs Magnus Eriksson and Rasmus Fleische – associated with the notorious Piratbyrån – have promoted the bleeding away of Hollywood profits through fully deployed P2P networks, with the intention of pushing filesharing dynamics to an extreme in order to radicalise the potential for social change (“Copies and Context”). From an aesthetic perspective, such activist theories are complemented by the affective register of appropriation art, a movement broadly conceived in terms of antagonistically liberating knowledge from the confines of intellectual property: “those who pirate and hijack owned material, attempting to free information, art, film, and music – the rhetoric of our cultural life – from what they see as the prison of private ownership” (Harold 114). These “unruly” escape attempts are pursued through various modes of engagement, from experimental performances with legislative infrastructures (i.e. Kembrew McLeod’s patenting of the phrase “freedom of expression”) to musical remix projects, such as the work of Negativland, John Oswald, RTMark, Detritus, Illegal Art, and the Evolution Control Committee. Amazon Noir, while similarly engaging with questions of ownership, is distinguished by specifically targeting information communication systems and finding “niches” or gaps between overlapping networks of control and economic governance.
Hans Bernhard and Lizvlx from Ubermorgen.com (meaning ‘Day after Tomorrow,’ or ‘Super-Tomorrow’) actually describe their work as “research-based”: “we not are opportunistic, money-driven or success-driven, our central motivation is to gain as much information as possible as fast as possible as chaotic as possible and to redistribute this information via digital channels” (“Interview with Ubermorgen”). This has led to experiments like Google Will Eat Itself (2005) and the construction of the automated software thief against Amazon.com, as process-based explorations of technological action.

Agency, Distribution

Deleuze’s “postscript” on control has proven massively influential for new media art by introducing a series of key questions on power (or desire) and digital networks. As a social diagram, however, control should be understood as a partial rather than totalising map of relations, referring to the augmentation of disciplinary power in specific technological settings. While control is a conceptual regime that refers to open-ended terrains beyond the architectural locales of enclosure, implying a move toward informational networks, data solicitation, and cybernetic feedback, there remains a peculiar contingent dimension to its limits. For example, software code is typically designed to remain cycling until user input is provided. There is a specifically immanent and localised quality to its actions that might be taken as exemplary of control as a continuously modulating affective materialism. The outcome is a heightened sense of bounded emergencies that are either flattened out or absorbed through reconstitution; however, these are never linear gestures of containment.
As Tiziana Terranova observes, control operates through multilayered mechanisms of order and organisation: “messy local assemblages and compositions, subjective and machinic, characterised by different types of psychic investments, that cannot be the subject of normative, pre-made political judgments, but which need to be thought anew again and again, each time, in specific dynamic compositions” (“Of Sense and Sensibility” 34). This event-orientated vitality accounts for the political ambitions of tactical media as opening out communication channels through selective “transversal” targeting. Amazon Noir, for that reason, is pitched specifically against the material processes of communication. The system used to harvest the content from “Search inside the Book” is described as “robot-perversion-technology,” based on a network of four servers around the globe, each with a specific function: one located in the United States that retrieved (or “sucked”) the books from the site, one in Russia that injected the assembled documents onto P2P networks, and two in Europe that coordinated the action via intelligent automated programs (see “The Diagram”). According to the “villains,” the main goal was to steal all 150,000 books from Search Inside!™ and then to use the same technology to steal books from the “Google Print Service” (the exploit was limited only by the technological resources financially available, though there are apparent plans to improve the technique by reinvesting the money received through the settlement with Amazon.com in exchange for not publicising the hack). In terms of informational culture, this system resembles a machinic process directed at redistributing copyright content; “The Diagram” visualises key processes that define digital piracy as an emergent phenomenon within an open-ended and responsive milieu.
That is, the static image foregrounds something of the activity of copying being a technological action that complicates any analysis focusing purely on copyright as content. In this respect, intellectual property rights are revealed as being entangled within information architectures as communication management and cultural recombination – dissipated and enforced by a measured interplay between openness and obstruction, resonance and emergence (Terranova, “Communication beyond Meaning” 52). To understand data distribution requires an acknowledgement of these underlying nonhuman relations that allow for such informational exchanges. It requires an understanding of the permutations of agency carried along by digital entities. According to Lawrence Lessig’s influential argument, code is not merely an object of governance, but has an overt legislative function itself. Within the informational environments of software, “a law is defined, not through a statute, but through the code that governs the space” (20). These points of symmetry are understood as concretised social values: they are material standards that regulate flow. Similarly, Alexander Galloway describes computer protocols as non-institutional “etiquette for autonomous agents,” or “conventional rules that govern the set of possible behavior patterns within a heterogeneous system” (7). In his analysis, these agreed-upon standardised actions operate as a style of management fostered by contradiction: progressive though reactionary, encouraging diversity by striving for the universal, synonymous with possibility but completely predetermined, and so on (243-244). Needless to say, political uncertainties arise from a paradigm that generates internal material obscurities through a constant twinning of freedom and control.
For Wendy Hui Kyong Chun, these Cold War systems subvert the possibilities for any actual experience of autonomy by generalising paranoia through constant intrusion and reducing social problems to questions of technological optimisation (1-30). In confrontation with these seemingly ubiquitous regulatory structures, cultural theory requires a critical vocabulary differentiated from computer engineering to account for the sociality that permeates through and concatenates technological realities. In his recent work on “mundane” devices, software and code, Adrian McKenzie introduces a relevant analytic approach in the concept of technological action as something that both abstracts and concretises relations in a diffusion of collective-individual forces. Drawing on the thought of French philosopher Gilbert Simondon, he uses the term “transduction” to identify a key characteristic of technology in the relational process of becoming, or ontogenesis. This is described as bringing together disparate things into composites of relations that evolve and propagate a structure throughout a domain, or “overflow existing modalities of perception and movement on many scales” (“Impersonal and Personal Forces in Technological Action” 201). Most importantly, these innovative diffusions or contagions occur by bridging states of difference or incompatibilities. Technological action, therefore, arises from a particular type of disjunctive relation between an entity and something external to itself: “in making this relation, technical action changes not only the ensemble, but also the form of life of its agent. Abstraction comes into being and begins to subsume or reconfigure existing relations between the inside and outside” (203). Here, reciprocal interactions between two states or dimensions actualise disparate potentials through metastability: an equilibrium that proliferates, unfolds, and drives individuation. 
While drawing on cybernetics and dealing with specific technological platforms, McKenzie’s work can be extended to describe the significance of informational devices throughout control societies as a whole, particularly as a predictive and future-orientated force that thrives on staged conflicts. Moreover, being a non-deterministic technical theory, it additionally speaks to new tendencies in regimes of production that harness cognition and cooperation through specially designed infrastructures to enact persistent innovation without any end-point, final goal or natural target (Thrift 283-295). Here, the interface between intellectual property and reproduction can be seen as a site of variation that weaves together disparate objects and entities by imbrication in social life itself. These are specific acts of interference that propel relations toward unforeseen conclusions by drawing on memories, attention spans, material-technical traits, and so on. The focus lies on performance, context, and design “as a continual process of tuning arrived at by distributed aspiration” (Thrift 295). This latter point is demonstrated in recent scholarly treatments of filesharing networks as media ecologies. Kate Crawford, for instance, describes the movement of P2P as processual or adaptive, comparable to technological action, marked by key transitions from partially decentralised architectures such as Napster, to the fully distributed systems of Gnutella and seeded swarm-based networks like BitTorrent (30-39). Each of these technologies can be understood as a response to various legal incursions, producing radically dissimilar socio-technological dynamics and emergent trends for how agency is modulated by informational exchanges. Indeed, even these aberrant formations are characterised by modes of commodification that continually spill over and feed back on themselves, repositioning markets and commodities in doing so, from MP3s to iPods, P2P to broadband subscription rates.
However, one key limitation of this ontological approach is apparent when dealing with the sheer scale of activity involved, where mass participation elicits certain degrees of obscurity and relative safety in numbers. This represents an obvious problem for analysis, as dynamics can easily be identified in the broadest conceptual sense, without any understanding of the specific contexts of usage, political impacts, and economic effects for participants in their everyday consumptive habits. Large-scale distributed ensembles are “problematic” in their technological constitution, as a result. They are sites of expansive overflow that provoke an equivalent individuation of thought, as the Recording Industry Association of America observes on their educational website: “because of the nature of the theft, the damage is not always easy to calculate but not hard to envision” (“Piracy”). The politics of the filesharing debate, in this sense, depends on the command of imaginaries; that is, being able to conceptualise an overarching structural consistency to a persistent and adaptive ecology. As a mode of tactical intervention, Amazon Noir dramatises these ambiguities by framing technological action through the fictional sensibilities of narrative genre.

Ambiguity, Control

The extensive use of imagery and iconography from “noir” can be understood as an explicit reference to the increasing criminalisation of copyright violation through digital technologies. However, the term also refers to the indistinct or uncertain effects produced by this tactical intervention: who are the “bad guys” or the “good guys”? Are positions like ‘good’ and ‘evil’ (something like freedom or tyranny) so easily identified and distinguished? As Paolo Cirio explains, this political disposition is deliberately kept obscure in the project: “it’s a representation of the actual ambiguity about copyright issues, where every case seems to lack a moral or ethical basis” (“Amazon Noir Interview”).
While user communications made available on the site clearly identify culprits (describing the project as jeopardising arts funding, as both irresponsible and arrogant), the self-description of the artists as political “failures” highlights the uncertainty regarding the project’s qualities as a force of long-term social renewal: Lizvlx from Ubermorgen.com had daily shootouts with the global mass-media, Cirio continuously pushed the boundaries of copyright (books are just pixels on a screen or just ink on paper), Ludovico and Bernhard resisted kickback-bribes from powerful Amazon.com until they finally gave in and sold the technology for an undisclosed sum to Amazon. Betrayal, blasphemy and pessimism finally split the gang of bad guys. (“Press Release”) Here, the adaptive and flexible qualities of informatic commodities and computational systems of distribution are knowingly posited as critical limits; in a certain sense, the project fails technologically in order to succeed conceptually. From a cynical perspective, this might be interpreted as guaranteeing authenticity by insisting on the useless or non-instrumental quality of art. However, through this process, Amazon Noir illustrates how forces confined as exterior to control (virality, piracy, noncommunication) regularly operate as points of distinction to generate change and innovation. Just as hackers are legitimately employed to challenge the durability of network exchanges, malfunctions are relied upon as potential sources of future information. Indeed, the notion of demonstrating ‘autonomy’ by illustrating the shortcomings of software is entirely consistent with the logic of control as a modulating organisational diagram. These so-called “circuit breakers” are positioned as points of bifurcation that open up new systems and encompass a more general “abstract machine” or tendency governing contemporary capitalism (Parikka 300). 
As a consequence, the ambiguities of Amazon Noir emerge not just from the contrary articulation of intellectual property and digital technology, but additionally through the concept of thinking “resistance” simultaneously with regimes of control. This tension is apparent in Galloway’s analysis of the cybernetic machines that are synonymous with the operation of Deleuzian control societies – i.e. “computerised information management” – where tactical media are posited as potential modes of contestation against the tyranny of code, “able to exploit flaws in protocological and proprietary command and control, not to destroy technology, but to sculpt protocol and make it better suited to people’s real desires” (176). While pushing a system into a state of hypertrophy to reform digital architectures might represent a possible technique that produces a space through which to imagine something like “our” freedom, it still leaves unexamined the desire for reformation itself as nurtured by and produced through the coupling of cybernetics, information theory, and distributed networking. This draws into focus the significance of McKenzie’s Simondon-inspired cybernetic perspective on socio-technological ensembles as being always-already predetermined by and driven through asymmetries or difference. As Chun observes, consequently, there is no paradox between resistance and capture since “control and freedom are not opposites, but different sides of the same coin: just as discipline served as a grid on which liberty was established, control is the matrix that enables freedom as openness” (71). Why “openness” should be so readily equated with a state of being free represents a major unexamined presumption of digital culture, and leads to the associated predicament of attempting to think of how this freedom has become something one cannot not desire. 
If Amazon Noir has political currency in this context, however, it emerges from a capacity to recognise how informational networks channel desire, memories, and imaginative visions rather than just cultivated antagonisms and counterintuitive economics. As a final point, it is worth observing that the project was initiated without publicity until the settlement with Amazon.com. There is, as a consequence, nothing to suggest that this subversive “event” might have actually occurred, a feeling heightened by the abstractions of software entities. The extent to which we believe in “the big book heist,” that such an act is even possible, is a gauge through which the paranoia of control societies is illuminated as a longing or desire for autonomy. As Hakim Bey observes in his conceptualisation of “pirate utopias,” such fleeting encounters with the imaginaries of freedom flow back into the experience of the everyday as political instantiations of utopian hope. Amazon Noir, with all its underlying ethical ambiguities, presents us with a challenge to rethink these affective investments by considering our profound weakness in mastering the complexities and constant intrusions of control. It provides an opportunity to conceive of a future that begins with limits and limitations as immanently central, even foundational, to our deep interconnection with socio-technological ensembles.

References

“Amazon Noir – The Big Book Crime.” <http://www.amazon-noir.com/>.
Bey, Hakim. T.A.Z.: The Temporary Autonomous Zone, Ontological Anarchy, Poetic Terrorism. New York: Autonomedia, 1991.
Chun, Wendy Hui Kyong. Control and Freedom: Power and Paranoia in the Age of Fibre Optics. Cambridge, MA: MIT Press, 2006.
Crawford, Kate. “Adaptation: Tracking the Ecologies of Music and Peer-to-Peer Networks.” Media International Australia 114 (2005): 30-39.
Cubitt, Sean. “Distribution and Media Flows.” Cultural Politics 1.2 (2005): 193-214.
Deleuze, Gilles. Foucault. Trans. Seán Hand. Minneapolis: U of Minnesota P, 1986.
———. “Control and Becoming.” Negotiations 1972-1990. Trans. Martin Joughin. New York: Columbia UP, 1995. 169-176.
———. “Postscript on the Societies of Control.” Negotiations 1972-1990. Trans. Martin Joughin. New York: Columbia UP, 1995. 177-182.
Eriksson, Magnus, and Rasmus Fleische. “Copies and Context in the Age of Cultural Abundance.” Online posting. 5 June 2007. Nettime. 25 Aug. 2007.
Galloway, Alexander. Protocol: How Control Exists after Decentralization. Cambridge, MA: MIT Press, 2004.
Hardt, Michael, and Antonio Negri. Multitude: War and Democracy in the Age of Empire. New York: Penguin Press, 2004.
Harold, Christine. OurSpace: Resisting the Corporate Control of Culture. Minneapolis: U of Minnesota P, 2007.
Lessig, Lawrence. Code and Other Laws of Cyberspace. New York: Basic Books, 1999.
McKenzie, Adrian. Cutting Code: Software and Sociality. New York: Peter Lang, 2006.
———. “The Strange Meshing of Impersonal and Personal Forces in Technological Action.” Culture, Theory and Critique 47.2 (2006): 197-212.
Parikka, Jussi. “Contagion and Repetition: On the Viral Logic of Network Culture.” Ephemera: Theory & Politics in Organization 7.2 (2007): 287-308.
“Piracy Online.” Recording Industry Association of America. 28 Aug. 2007. <http://www.riaa.com/physicalpiracy.php>.
Sundaram, Ravi. “Recycling Modernity: Pirate Electronic Cultures in India.” Sarai Reader 2001: The Public Domain. Delhi: Sarai Media Lab, 2001. 93-99. <http://www.sarai.net>.
Terranova, Tiziana. “Communication beyond Meaning: On the Cultural Politics of Information.” Social Text 22.3 (2004): 51-73.
———. “Of Sense and Sensibility: Immaterial Labour in Open Systems.” DATA Browser 03 – Curating Immateriality: The Work of the Curator in the Age of Network Systems. Ed. Joasia Krysa. New York: Autonomedia, 2006. 27-38.
Thrift, Nigel. “Re-inventing Invention: New Tendencies in Capitalist Commodification.” Economy and Society 35.2 (2006): 279-306.
Citation reference for this article

MLA Style
Dieter, Michael. "Amazon Noir: Piracy, Distribution, Control." M/C Journal 10.5 (2007). <http://journal.media-culture.org.au/0710/07-dieter.php>.

APA Style
Dieter, M. (Oct. 2007) "Amazon Noir: Piracy, Distribution, Control," M/C Journal, 10(5). Retrieved from <http://journal.media-culture.org.au/0710/07-dieter.php>.
APA, Harvard, Vancouver, ISO, and other styles
37

Filinich, Renzo, and Tamara Jesus Chibey. "Becoming and Individuation on the Encounter between Technical Apparatus and Natural System." M/C Journal 23, no. 4 (August 12, 2020). http://dx.doi.org/10.5204/mcj.1651.

Full text
Abstract:
This essay sheds light on the framing process during research on the crossing between natural and artificial systems. To approach this, we must outline the machine-natural system relation. From this notion, technology is seen not as an external thing, nor in contrast to an imaginary of nature, but as an effect that emerges from our thinking and revealing being, one that, in many cases, may be reduced to an issue of knowledge and action. Here, we want to consider the concept of transduction from Gilbert Simondon as one possible framework for considering the socio-technological actions at stake. His thought offers a detailed conceptual vocabulary for the question of individuation as a “revelation process”, a concern with how things come into existence and proceed temporally as projective entities. Moreover, our approach to the work of the philosopher Simondon marked the starting point of our interest in and approach to the issue of technique and its politics. From this perspective, the reflection given by Simondon in his theses on Individuation and on the Mode of Existence of Technical Objects serves to trace certain reasons that are necessary for the development of this project and that help to explain it. In the first place, Simondon does not state a specific regime of “human individuation”. The possibility of a psychic and collective individuation is produced, as is manifested when addressing the structure of his main thesis, at the heart of biological individuation; Simondon strongly attacks the anthropocentric tendencies that attempt to establish a defining boundary between biological and psychic reality. We may presume, then, that the issue of language as a defining and differentiating element of the human does not interest him; it is at this point that our project begins to focus on employing the transduction of the téchnē as a metaphor of life (Espinoza Lolas et al.), regarding the limits that language may imply for the conformation and expression of psychic reality.
In the second place, this critique of the economy of attention, present across our research and in Simondon’s thinking, seeks to introduce a hypothesis raised in another direction: towards the issue of technique. In the introduction of his Mode of Existence of Technical Objects, Simondon shows some urgency in the need to approach the reality of technical objects as an autonomous reality and as a configuring reality of psychic and collective individuation. Facing the general importance granted to language as a key element of the historical and hermeneutical, even ontological, aspects of the human being, Simondon considers that technique is the reality that plays the fundamental role of mediating between the human being and the world. Following these observations, a possible question that will guide our research arises: how do the technologisation and informatisation of cultural techniques alter the very nature of the knowing of the affection of being with others (people, things, animals)? In the hypothesis of this investigation we claim that, insofar as we deliver an approach to and perspective on the technologisation of the world as a process of individuation (considering Simondon’s concept of this becoming, in which an artificial agent and its medium may get out of phase to resolve their tensions and give rise to physical or living individuals that constitute their system and pass through a series of metastable equilibria), it is possible to prove this capacity of invention as a clear example of a form of transindividual individuation (referring to the human being): thanks to the information that the artificial agent acquires and recovers by means of its “imagination”, which it integrates into its perception and affectivity, the creation of new norms or artifacts becomes possible and installs itself in its becoming, as is the case of bioeconomy and cognitive capitalism (Fumagalli 219).
It is imperative to observe and analyse the fact that the concept of nature must be integrated along with the concept of Cosmotecnia (Hui 3) to avoid the opposition between nature and technique in conceptual terms, and that is the reason why, in the following section, we will mention a third memory that is inscribed in this concept. There is no linear development in human history from nature to technique, from nature to politics.

The Extended Mind

The idea of memory as something transmissible is important when thinking of the present; there is no humanity outside the technical, nor prior to the technical, and it is important to safeguard this idea to highlight the phýsis/téchnē dichotomy presented by Simondon and Stiegler. It is erroneous to think that some entity may exceed the human, that it has any exteriority, when it is the materialisation of human forms, or, even more, that the human is crossed by it and is not separable from it. For the French philosopher Bernard Stiegler there is no human nature without technique, and vice versa (Stiegler 223). Here appears the issue of knowing the limits at which “the body of the human me might stop” (Hutinel 44). A first glimpse of externalised memory was the flint axe, which is made by using other tools, even when its use is unknown. Its mere existence preserves a knowledge that goes beyond whoever made it; this transmission, neither genetic nor epigenetic, is preserved beyond the organic. We raise the question of a phýsis coming from the téchnē, a central topic that dominates the discussion nowadays about technology and its ability to have a transforming effect on every area of contemporary life and on human beings themselves. It is being “revealed” that the true qualitative novelty of the technological improvements that happen in front of our eyes resides not only in the appearance of new practices related to any particular scientific research.
We must point out the evident tension between bíos and zôê during the process of this adaptation, which is an ontological one; but we also witness how recursivity becomes a modus operandi during this process, which is both social and technological. Just like the philosophy of nature, the philosophy of biology confronts its own limit in the light shed by the recursive algorithms implemented as a dominant mode of adaptation, which is what Deleuze called societies of control (Deleuze 165). At the same time, there is an artificial selection (instead of a natural selection) imposed by the politics of transhumanism (for example, human enhancement and genetic engineering). In this direction, a first aspect to consider resides in the fact that life, held as an object of power and politics, does not constitute a “natural life”, but the result of a technical production from which its “nature” develops, as well as the possibilities of its deployment. Now then, it is precisely through this gesture that Stiegler seeks to distinguish between what is originary in mankind and its artefactual or artificial becoming: “the prosthesis is not a simple extension of the human body, it is the constitution of said body insofar as ‘human’ (the quotes belong to the constitution). It is not a ‘medium’ for mankind, but its end, and it is known the essential mistakenness of the expression, ‘the end of mankind’” (Stiegler 9). Before such phenomena, it is appropriate to lay out a reflexive methodology centred on observing and analysing the aforementioned idea by Stiegler that there is no mankind without techniques, and there is no technique without mankind (Stiegler 223). This implies that this idea of téchnē comprises both the techniques needed to create things and the technical products resulting from these techniques.
The word “techniques” also becomes ambiguous between the modern technology of machines and the primitive “tools” and their techniques, whether they have become art or craft, things that we would not necessarily think of as “technology”. What Stiegler suggests here is to situate the scope of the term téchnē within an ontogenetic and phylogenetic process of the human being. His reflection on what we “possess as a fundamental thing” for our being as humans is also fundamental to how “we experience time” since the externalisation of our memory into our tools, which Stiegler understands as a “third kind” of memory, distinct from the internal memory individually acquired by our brain (epigenetic) and the biological evolutionary memory inherited from our ancestors (phylogenetic); Stiegler calls this kind of evolutionary process epiphylogenetic, or epiphylogenesis. We could therefore argue that we are defined by this process of epiphylogenesis, and that we are constituted by a past that we ourselves, as individuals, have not lived; this past is delivered to us through culture, which is the fusion of the “technical objects that embody the knowledge of our ancestors, tools that we adopt to transform our surroundings” (Stiegler 177). These supports of external memory (that is, exteriorisations of consciousness) provide a new collectivisation of consciousness that exists beyond the individual. The current trend of investigation into ontogeny and phylogeny is driven by the growing consensus, in both the sciences and the humanities, that the living world in every one of its aspects (biological, semiotic, economic, affective, social, and so on) escapes any finite scheme of description and representation.
It is for this reason that authors such as Matteo Pasquinelli refer, more modestly, to the idea of “augmented intelligence” (9), reminding us that there is a posthuman legacy between human and machine that is still problematic, “though the machines manifest different degrees of autonomous agency” (Pasquinelli 11). For Simondon, and this is his revolutionary contribution to philosophy, one should think individuation not from the perspective of the individual but from the point of view of the process that originated it. In other words, individuation must be thought of as a process that does not take the individual for granted but understands it as a result. In Simondon's words: “If, on the contrary, one supposes that individuation does not only produce the individual, one would not attempt to pass quickly through the stage of individuation in order to arrive at the final reality that is the individual; one would attempt to grasp the ontogenesis in the entire progression of its reality, and to know the individual through the individuation, rather than the individuation through the individual” (5). Therefore, the epistemological problem does not lie in how téchnē flees the human domain on its way to becoming technologies, but in how these processes of “exteriorization” (Stiegler 213) alter the very concepts of number, image, comparison, space, time, or city, to give a few examples. However, the anthropological category of “exteriorization” does not do full justice to these processes, as they work back retroactively and recursively on the original techniques. Along with the concepts of text and book, the practice of reading has changed in the course of the digitalisation and algorithmisation of the processing of knowledge; alongside the concept of comparison, the practice of comparison has changed, since comparison (for example, of images) has become an operation based on data extraction and machine learning.
Conversely, we must consider, in a media-archaeological fashion, the technological state of life as a starting point from which to ask which cultural techniques were employed in the first place. How does the informatisation of cultural techniques produce new forms of subjectivity? How does the concept of cultural techniques already imply the idea of “chains of operations” and, therefore, a permanent (retro)coupling between living and non-living agency? This reveals that classical cultural techniques such as indexing or labelling have acquired ontological powers in the Google era: only what is labelled exists; only what can be searched is absolute. At the same time, in the fantasies of the media corporations, the variety of objects that can be labelled (including people) tends to become coextensive with the world of phenomena itself (if not the real world), which will then always be only an augmented version of itself. Technology became important for contemporary knowledge only through mediation; the use of tools, therefore, could not be the consequence of an extremely well-developed brain. On the contrary, the development of increasingly sophisticated tools took place at the same pace as the development of the brain, as Leroi-Gourhan attempts to prove by studying the history of tools together with the history of the human skeleton and brain. What he managed to demonstrate is that the history of technique and the history of the human being run in parallel; they are, if not identical, at least inextricable. Even today, the progress of knowledge is not completely subordinated to technological investment (Lyotard 37). In short, human evolution is inseparable from the evolution of téchnē, the evolution of technology. One cannot simply think of the human being as a natural animal, isolated from the external material world.
What the human becomes and what it is are essentially bonded to techniques from the very beginning. Leroi-Gourhan puts it this way in Gesture and Speech: “the apparition of tools as a species ... feature that marks the boundary between animals and humans” (90). Understanding the behaviour of technological systems is essential for our ability to control their actions, to harvest their benefits, and to minimise their harms. Here it is argued that this requires a broad agenda of scientific investigation into the behaviour of the machine, one that incorporates and extends the biotechnological disciplines and draws on knowledge from across the sciences. In some way, Simondon sensed this encounter of knowledges when he proposed the concept of the Allagmatic, or theory of operations, “constituted by a systematized set of particular knowledges” (Simondon 469). We could begin by describing a set of questions that are fundamental for this emerging field, and then explore the technical, legal, and institutional limitations on the study of technological agency.

Information, Communication and Signification

To establish the relation between information and communication, we will speak from two perspectives: first with Norbert Wiener, then with Simondon. We will see how the concept of information is essential for beginning to understand communication in an artificial agent. On one side, we have Wiener's notion of information, framed within his project of cybernetics. Cybernetics is the study of communication and control through the inquiry into messages in animals, human beings, and machines. This idea of information arises from interrelation with the surroundings. Wiener defines it as the “content of what is an interchange object with the external world, while we adjust to it and make it adjust to us” (Wiener 17-18). In other words, we receive and use information as we interact with the world in which we live.
It is in this sense that information is connected to the idea of feedback, defined as the exchange and interaction of information within our systems or between systems. In Wiener's own words, feedback is “the property of adjusting the future behavior to facts of the past” (31). Information, for Wiener, is at the same time shaped by the mathematical and probabilistic notion of information theory. Wiener's account of the amount of information has its starting point in statistical mechanics, alongside the concept of entropy, insofar as information is opposed to entropy. Information, by supplying a set of messages, therefore indicates a measure of organisation. The Argentinian philosopher Pablo Rodríguez adds that “information [for Wiener] is a new physical category of the universe. [It is] the measure of organization of any entity, an organization without which the material and energetic systems wouldn't be able to survive” (2-3). In this way, information corresponds to the measure of organisation and self-regulation of a given system. Moreover, and almost in complete contrast, we have the concept given by Simondon, where information is applicable to the whole possible range: animals, machines, human beings, molecules, crystals, and so on. In this sense it is more versatile, as it exceeds the domain of technique. To grasp the scope of this concept we will approach it through two definitions. In the first place, Simondon, in his conference Amplification in the Process of Information, in the book Communication and Information, claims that information “is not a thing, but the operation of a thing that arrives at a system and produces a transformation there. The information can't be defined beyond this act of transformative incidence, and the operation of receiving” (Simondon 139).
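As an illustrative aside (not part of the essay itself), Wiener's opposition between information and entropy can be made concrete with Shannon's formula, on which the statistical notion of information rests: a source whose messages are almost fixed is highly “organised” and has low entropy, while a source whose messages are equiprobable is maximally disorganised. The probability values below are invented for the sketch.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping zero-probability terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A maximally disorganised source: four equiprobable messages.
uniform = [0.25, 0.25, 0.25, 0.25]

# A highly organised source: one message almost always occurs.
ordered = [0.97, 0.01, 0.01, 0.01]

print(shannon_entropy(uniform))  # 2.0 bits, the maximum for four messages
print(shannon_entropy(ordered))  # roughly 0.24 bits
```

The ordered source, in Wiener's terms, exhibits more organisation precisely because its entropy is lower: knowing its next message adds little that was not already expected.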
From this definition follows the idea of modulation: when Simondon refers to the “transformation” and the “act of transformative incidence”, modulation corresponds to the energy that flows, amplified, during the transformation that occurs within a system. There is a second definition of information that Simondon provides in his thesis Individuation in Light of Notions of Form and Information, in which he claims that “the information signal is not just what is to be transmitted … it is also that which must be received, this is, what must adopt a signification” (Simondon 281). In this definition Simondon clearly distances himself from Wiener's cybernetics, insofar as he deals with information as that which must be received, not that which is to be transmitted. Although Simondon refers to a link between information and signification, this aspect is not measured in linguistic terms; it rather expresses the decodification of a given code. That is, signification, and information as well, are the result of a disparity of energies, namely the superposition of two possible states (0 and 1, or on and off). This is a central point of divergence from Wiener, who speaks of information in terms of the transfer of messages, while Simondon does so in terms of the transformation of energies. In this way Simondon adds an energetic element to the traditional definition of information, which now works as an operation based on the transformation of energies resulting from a disparity, or the superposition of two possible elements, within a system (the recipient). It is thanks to this innovative element that modulation operates in a metastable system. And this is precisely the last concept we need to clarify: the idea of metastability and its relationship with the recipient-system. Metastability is an expression that finds its origins in thermodynamics.
Philosophy has traditionally operated around the idea of the stability of being, while Simondon's proposal states that being is its becoming. Metastability is thus the condition of possibility of individuation, insofar as the metastable medium leaves behind a remainder of energy for future processes of individuation. Metastability refers, then, to the temporary equilibrium of a system that persists in time because it maintains within itself potential energy, useful for other, future individuations. Returning to the conference Amplification in the Process of Information, Simondon points out that “the recipient metastability is the condition of efficiency of the incident information” (139). In this sense, we may claim that there is no information if the signal is not received; the recipient is a necessary condition for information to be given at all. Simondon understands the recipient as a mixed system (a quasi-system): on the one hand, it must be energetically isolated and must have a membrane that allows it not to spend all its energy at once; on the other hand, it must be heteronomous, as it depends on an external input of information to activate the system (the recipient). The metastable medium is the appropriate one for understanding the artificial agent, as it leaves open the possibility that potential energy will manifest without being spent all at once, leaving a remainder useful for future modulations, so that new transformations may occur. At the same time, Simondon's concept of information is the most suitable when referring to communication and the relationship with the medium, primarily because of its property of modulating potential energy. Nevertheless, it is also necessary to retrieve Wiener's idea of feedback, as it is in the relationship of the artificial agent with its surroundings (and the world) that information is given and may flow amplified through its system.
In this way, significations manage to decode the internal code of the artificial agent, which represents the first gesture towards the opening of communication.

Conclusion

The hypotheses on extended cognition are the subject of intense debate in artistic, philosophical, and cognitive-science circles today, but their implications extend well beyond metaphysics and the sciences of the mind. It is apparent that we have only begun to scratch the surface of their consequences for the social sphere more broadly: if our minds are partially poured into our smartphones and even into our homes, then this is not a transformation of human nature but the latest manifestation of an ancient human ontology of dynamically assembled organic cognitive and informational systems. It is to this condition that the critical digital humanities, and every form of critique, should respond: by digging out the delays and ruptures within the systems of mass media, against the relentless belief in real time as the future, so as to recall that systems always involve an encounter with a radical “strangeness” or “alienity”, an incommensurability between the future and desire that becomes the radical potential of many of our contemporary social movements and politics. Our challenge as critics is to dismantle the practice of representation and to reincorporate it into different forms of space and experience that are not reactionary but imaginary. What we attempt to bring to light here is the need for every spectator to notice the limits of machinic vision and to acknowledge the role of the image in the recruitment of liminal energies for capital.
The final objective of this essay has been to see that nature possesses the technique of an artist who renders contingency into necessity and inscribes the infinite within the finite. In art, it is not the figure of nature that corresponds to individuation but rather the artist, whose task is not only to render contingency necessary as its operation, but also to aim at an elevation of the audience as a form of revelation. The artist is the one who opens up, through his or her work, a process of transindividuation, that is, a psychic and collective individuation.

References

Deleuze, Gilles. “Post-Script on Control Societies.” Polis 13 (2006): 1-7. 14 Feb. 2020 <http://journals.openedition.org/polis/5509>.

Espinoza Lolas, Ricardo, et al. “On Technology and Life: Fundamental Concepts of Georges Canguilhem and Xavier Zubiri's Thought.” Ideas y Valores 67.167 (2018): 127-47. 14 Feb. 2020 <http://dx.doi.org/10.15446/ideasyvalores.v67n167.59430>.

Fumagalli, Andrea. Bioeconomía y Capitalismo Cognitivo: Hacia un Nuevo Paradigma de Acumulación. Madrid: Traficantes de Sueños, 2010.

Hui, Yuk. “On Cosmotechnics: For a Renewed Relation between Technology and Nature in the Anthropocene.” Techné: Research in Philosophy and Technology 21.2/3 (2017): 319-41. 14 Feb. 2020 <https://www.pdcnet.org/techne/content/techne_2017_0021_42769_0319_0341>.

Leroi-Gourhan, André. El Gesto y la Palabra. Venezuela: Universidad Central de Venezuela, 1971.

———. El Hombre y la Materia: Evolución y Técnica I. Madrid: Taurus, 1989.

———. El Medio y la Técnica: Evolución y Técnica II. Madrid: Taurus, 1989.

Lyotard, Jean-François. La Condición Postmoderna: Informe sobre el Saber. Madrid: Cátedra, 2006.

Pasquinelli, Matteo. “The Spike: On the Growth and Form of Pattern Police.” Nervous Systems 18.5 (2016): 213-20. 14 Feb. 2020 <http://matteopasquinelli.com/spike-pattern-police/>.
Rivera Hutinel, Marcela. “Techno-Genesis and Anthropo-Genesis in the Work of Bernard Stiegler: Or How the Hand Invents the Human.” Liminales, Escritos Sobre Psicología y Sociedad 2.3 (2013): 43-58. 15 Dec. 2019 <http://revistafacso.ucentral.cl/index.php/liminales/article/view/228>.

Rodríguez, Pablo. “El Signo de la ‘Sociedad de la Información’ de Cómo la Cibernética y el Estructuralismo Reinventaron la Comunicación.” Question 1.28 (2010): 1-17. 14 Feb. 2020 <https://perio.unlp.edu.ar/ojs/index.php/question/article/view/1064>.

Simondon, Gilbert. Comunicación e Información. Buenos Aires: Editorial Cactus, 2015.

———. La Individuación: A la Luz de las Nociones de Forma y de Información. Buenos Aires: La Cebra/Cactus, 2009/2015.

———. El Modo de Existencia de los Objetos Técnicos. Buenos Aires: Prometeo, 2007.

———. “The Position of the Problem of Ontogenesis.” Parrhesia 7 (2009): 4-16. 4 Nov. 2019 <http://parrhesiajournal.org/parrhesia07/parrhesia07_simondon1.pdf>.

Stiegler, Bernard. La Técnica y el Tiempo I. Guipúzcoa: Argitaletxe Hiru, 2002.

———. “Temporality and Technical, Psychic and Collective Individuation in the Work of Simondon.” Revista Trilogía Ciencia Tecnología Sociedad 4.6 (2012): 133-46.

Wiener, Norbert. Cibernética y Sociedad. Buenos Aires: Editorial Sudamericana, 1958.