Journal articles on the topic 'Virtual particle exchange'

Listed below are the top 23 journal articles for research on the topic 'Virtual particle exchange.'


1

Conrad, Michael. "Transient excitations of the Dirac vacuum as a mechanism of virtual particle exchange." Physics Letters A 152, no. 5-6 (January 1991): 245–50. http://dx.doi.org/10.1016/0375-9601(91)90099-t.

2

Hartin, Anthony. "Strong field QED in lepton colliders and electron/laser interactions." International Journal of Modern Physics A 33, no. 13 (May 9, 2018): 1830011. http://dx.doi.org/10.1142/s0217751x18300119.

Abstract:
The studies of strong field particle physics processes in electron/laser interactions and lepton collider interaction points (IPs) are reviewed. These processes are defined by the high intensity of the electromagnetic fields involved and the need to take them into account as fully as possible. Thus, the main theoretical framework considered is the Furry interaction picture within intense field quantum field theory. In this framework, the influence of a background electromagnetic field in the Lagrangian is calculated nonperturbatively, involving exact solutions for quantized charged particles in the background field. These “dressed” particles go on to interact perturbatively with other particles, enabling the background field to play both macroscopic and microscopic roles. Macroscopically, the background field starts to polarize the vacuum, in effect rendering it a dispersive medium. Particles encountering this dispersive vacuum obtain a lifetime, either radiating or decaying into pair particles at a rate dependent on the intensity of the background field. In fact, the intensity of the background field enters into the coupling constant of the strong field quantum electrodynamic Lagrangian, influencing all particle processes. A number of new phenomena occur. Particles gain an intensity-dependent rest mass shift that accounts for their presence in the dispersive vacuum. Multi-photon events involving more than one external field photon occur at each vertex. Higher order processes which exchange a virtual strong field particle resonate via the lifetimes of the unstable strong field states. Two main arenas of strong field physics are reviewed; those occurring in relativistic electron interactions with intense laser beams, and those occurring in the beam–beam physics at the interaction point of colliders. This review outlines the theory, describes its significant novel phenomenology and details the experimental schema required to detect strong field effects and the simulation programs required to model them.
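For orientation, the intensity-dependent mass shift mentioned in this abstract is commonly written in the strong-field QED literature as follows. This is a sketch in generic notation for a circularly polarized plane-wave background and is not reproduced from the paper itself; conventions differ by order-one factors for other polarizations.

```latex
% Dimensionless intensity (classical nonlinearity) parameter of the background field,
% with e the electron charge, E the field amplitude, m the electron mass and \omega the laser frequency:
\xi = \frac{eE}{m\omega c}
% Intensity-dependent effective ("dressed") mass of the electron in a circularly
% polarized plane wave:
m_*^2 = m^2\left(1 + \xi^2\right)
```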
3

Szmidt, Kazimierz. "On the SPH Approximations in Modeling Water Waves." Archives of Hydro-Engineering and Environmental Mechanics 60, no. 1-4 (October 1, 2014): 63–86. http://dx.doi.org/10.2478/heem-2013-0009.

Abstract:
This paper examines approximation aspects of Smoothed Particle Hydrodynamics (SPH) in modeling water waves. Close attention is paid to the consistency of the SPH formulation and its relation to a correction technique applied to improve the method's accuracy. The considerations are confined to flow fields within finite domains with a free surface and fixed solid boundaries under free-slip boundary conditions. Despite the wide application of the SPH method in fluid mechanics, appropriate modeling of the boundaries is still an open question. For straight solid boundaries, a natural approach is to place additional (virtual, ghost) particles outside the boundary and take into account the mirror reflection of the associated field variables. Such a method gives good results, except in the vicinity of solid horizontal bottoms where, because of the SPH approximation of the pressure, stratification of the fluid material particles may occur. Numerical tests have been made to illustrate this phenomenon. These experiments show that the solid bottom attracts the material particles; to prevent the particles from penetrating the bottom, a mutual exchange of positions between real and ghost particles is used in the computational procedure.
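A minimal sketch of the mirror-reflection ghost-particle construction for a flat horizontal bottom is given below. The array layout, kernel radius, and function name are illustrative assumptions, not the authors' code.

```python
import numpy as np

def make_ghost_particles(pos, vel, wall_y=0.0, kernel_radius=0.1):
    """Mirror fluid particles near a flat horizontal bottom (y = wall_y) to
    create ghost particles that enforce a free-slip solid boundary.

    pos, vel : (N, 2) arrays of particle positions and velocities.
    Returns ghost positions and velocities.
    """
    near = (pos[:, 1] - wall_y) < kernel_radius       # particles within one kernel radius of the bottom
    ghost_pos = pos[near].copy()
    ghost_vel = vel[near].copy()
    ghost_pos[:, 1] = 2.0 * wall_y - ghost_pos[:, 1]  # mirror the wall-normal coordinate across the wall
    ghost_vel[:, 1] *= -1.0                           # reflect the normal velocity; tangential component kept (free slip)
    return ghost_pos, ghost_vel
```

For a free-slip wall only the wall-normal velocity component is reversed; a no-slip wall would also reverse the tangential component.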
4

Salam, A. "The Unified Theory of Resonance Energy Transfer According to Molecular Quantum Electrodynamics." Atoms 6, no. 4 (October 11, 2018): 56. http://dx.doi.org/10.3390/atoms6040056.

Abstract:
An overview is given of the molecular quantum electrodynamical (QED) theory of resonance energy transfer (RET). In this quantized radiation field description, RET arises from the exchange of a single virtual photon between excited donor and unexcited acceptor species. Diagrammatic time-dependent perturbation theory is employed to calculate the transfer matrix element, from which the migration rate is obtained via the Fermi golden rule. Rate formulae for oriented and isotropic systems hold for all pair separation distances, R, beyond wave function overlap. The two well-known mechanisms associated with migration of energy, namely the R−6 radiationless transfer rate due to Förster and the R−2 radiative exchange, correspond to near- and far-zone asymptotes of the general result. Discriminatory pair transfer rates are also presented. The influence of an environment is accounted for by invoking the polariton, which mediates exchange and by introducing a complex refractive index to describe local field and screening effects. This macroscopic treatment is compared and contrasted with a microscopic analysis in which the role of a neutral, polarizable and passive third-particle in mediating transfer of energy is considered. Three possible coupling mechanisms arise, each requiring summation over 24 time-ordered diagrams at fourth-order of perturbation theory with the total rate being a sum of two- and various three-body terms.
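As a schematic reminder of the rate structure described above, in generic molecular-QED notation (not reproduced from the paper), the transfer rate follows from the Fermi golden rule and interpolates between the Förster and radiative regimes:

```latex
% Fermi golden rule for the transfer rate: M_{fi} is the second-order matrix element
% for single virtual-photon exchange and \rho_f the density of final states
\Gamma = \frac{2\pi}{\hbar}\,\lvert M_{fi}\rvert^{2}\,\rho_f
% Limiting behaviour of the isotropic pair rate with donor-acceptor separation R
% (k is the transition wavenumber):
\Gamma(R) \propto R^{-6} \quad (kR \ll 1,\ \text{near zone: radiationless F\"orster transfer})
\Gamma(R) \propto R^{-2} \quad (kR \gg 1,\ \text{far zone: radiative exchange})
```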
5

Li, Qiang, Qiang Yi, Rongxin Tang, Xin Qian, Kai Yuan, and Shiyun Liu. "A Hybrid Optimization from Two Virtual Physical Force Algorithms for Dynamic Node Deployment in WSN Applications." Sensors 19, no. 23 (November 22, 2019): 5108. http://dx.doi.org/10.3390/s19235108.

Abstract:
With the rapid development of unmanned aerial vehicles for space exploration and national defense, large-scale wireless sensor networks (WSNs) have become an important and effective technology, and some applications require highly accurate localization of the nodes. Dynamic node topology control for a large-scale WSN in an unmanned region has recently become an active research topic, since it helps improve system connectivity and coverage. In this paper, a hybrid optimization based on two different virtual force algorithms, inspired by the interactions among physical sensor nodes, is proposed to address self-consistent node deployment in a large-scale WSN. In the early stage, the deployment algorithm places the sensor nodes by leveraging the particle motions of dusty plasma to achieve the hexagonal topology of the so-called "Yukawa crystal". A second, virtual exchange force model is then combined with it to form the hybrid optimization, which yields a perfect hexagonal topology, better network uniformity, a higher coverage rate, and faster convergence. The influence of node position, velocity, and acceleration during the deployment stage on the final network topology is discussed in detail. The scheme can help engineers control the network topology for large numbers of wireless sensors at affordable system cost by choosing suitable parameters for the physical environment or application scenario.
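The following is a minimal sketch of a Yukawa-type (screened Coulomb) virtual-force update of the general kind alluded to above; the function name, parameters, and damping scheme are illustrative assumptions rather than the authors' algorithm.

```python
import numpy as np

def yukawa_force_step(pos, vel=None, q=1.0, screening=1.0, dt=0.05, damping=0.9):
    """One damped update of node positions under pairwise repulsive Yukawa
    (screened Coulomb) virtual forces derived from U(r) = q^2 exp(-r/lambda)/r.

    pos : (N, 2) node coordinates.  Returns updated (pos, vel).
    """
    n = len(pos)
    if vel is None:
        vel = np.zeros_like(pos)
    force = np.zeros_like(pos)
    for i in range(n):
        diff = pos[i] - pos                        # vectors from every node j to node i
        r = np.linalg.norm(diff, axis=1)
        r[i] = np.inf                              # ignore self-interaction
        # |F| = -dU/dr = q^2 exp(-r/lambda) * (1/r^2 + 1/(lambda*r)), repulsive
        mag = q**2 * np.exp(-r / screening) * (1.0 / r**2 + 1.0 / (screening * r))
        force[i] = np.sum(diff / r[:, None] * mag[:, None], axis=0)
    vel = damping * vel + dt * force               # damped ("frictional") velocity update
    return pos + dt * vel, vel
```

In a bounded deployment region, iterating such damped updates lets mutually repelling nodes relax toward the roughly hexagonal packing characteristic of a Yukawa crystal.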
6

Thompson, Andrew F., and Jean-Baptiste Sallée. "Jets and Topography: Jet Transitions and the Impact on Transport in the Antarctic Circumpolar Current." Journal of Physical Oceanography 42, no. 6 (June 1, 2012): 956–72. http://dx.doi.org/10.1175/jpo-d-11-0135.1.

Abstract:
Abstract The Southern Ocean’s Antarctic Circumpolar Current (ACC) naturally lends itself to interpretations using a zonally averaged framework. Yet, navigation around steep and complicated bathymetric obstacles suggests that local dynamics may be far removed from those described by zonally symmetric models. In this study, both observational and numerical results indicate that zonal asymmetries, in the form of topography, impact global flow structure and transport properties. The conclusions are based on a suite of more than 1.5 million virtual drifter trajectories advected using a satellite altimetry–derived surface velocity field spanning 17 years. The focus is on sites of “cross front” transport as defined by movement across selected sea surface height contours that correspond to jets along most of the ACC. Cross-front exchange is localized in the lee of bathymetric features with more than 75% of crossing events occurring in regions corresponding to only 20% of the ACC’s zonal extent. These observations motivate a series of numerical experiments using a two-layer quasigeostrophic model with simple, zonally asymmetric topography, which often produces transitions in the front structure along the channel. Significantly, regimes occur where the equilibrated number of coherent jets is a function of longitude and transport barriers are not periodic. Jet reorganization is carried out by eddy flux divergences acting to both accelerate and decelerate the mean flow of the jets. Eddy kinetic energy is amplified downstream of topography due to increased baroclinicity related to topographic steering. The combination of high eddy kinetic energy and recirculation features enhances particle exchange. These results stress the complications in developing consistent circumpolar definitions of the ACC fronts.
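For illustration, the kind of virtual-drifter diagnostic described above can be sketched as follows, using forward-Euler advection and a simple sign-change test; the function names and arguments are assumptions, not the study's actual processing.

```python
import numpy as np

def count_front_crossings(x0, y0, velocity, ssh, contour, dt=3600.0, n_steps=1000):
    """Advect virtual drifters through a surface velocity field and count
    crossings of a chosen sea-surface-height contour (a proxy for an ACC front).

    velocity(x, y) -> (u, v) in m/s;  ssh(x, y) -> sea surface height in m.
    x0, y0 : arrays of initial drifter positions (in metres, for simplicity).
    Returns the number of contour crossings per drifter.
    """
    x, y = np.asarray(x0, float).copy(), np.asarray(y0, float).copy()
    side = np.sign(ssh(x, y) - contour)            # which side of the front each drifter starts on
    crossings = np.zeros_like(x, dtype=int)
    for _ in range(n_steps):
        u, v = velocity(x, y)
        x, y = x + u * dt, y + v * dt              # forward-Euler advection step
        new_side = np.sign(ssh(x, y) - contour)
        crossings += (new_side != side) & (new_side != 0)
        side = new_side
    return crossings
```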
7

Zhou, Yuan, and Shoken M. Miyama. "The Energy Exchange between Different Masses in an Expanding Gravitating System." Symposium - International Astronomical Union 174 (1996): 387–88. http://dx.doi.org/10.1017/s0074180900001959.

Abstract:
We investigate whether energy exchange occurs between two species of particles in an expanding two-component gravitating system, and we derive the relaxation time scale for energy exchange in such a phase. This is accomplished by solving the dynamical equation coupled with Poisson's equation. We derive a characteristic time determined by the mass and velocity ratios. When the expansion time does not exceed this characteristic time, energy exchange between the two components is possible and depends on the mass ratio. Once the characteristic time is exceeded, there is virtually no relaxation at all in the system. When m2 ≫ m1, the transfer of energy becomes inefficient. Therefore, energy exchange between the two species in an expanding two-component gravitating system depends not only on the mass ratio but also on the expansion time.
8

Ding, B., and J. W. Darewych. "Variational formulation of relativistic four-body systems in quantum field theory: scalar quadronium." Canadian Journal of Physics 80, no. 5 (May 1, 2002): 605–12. http://dx.doi.org/10.1139/p02-061.

Abstract:
We discuss a variational method for describing relativistic four-body systems within the Hamiltonian formalism of quantum field theory. The scalar Yukawa (or Wick–Cutkosky) model, in which scalar particles and antiparticles interact via a massive or massless scalar field, is used to illustrate the method. A Fock-space variational trial state is used to describe the stationary states of scalar quadronium (two particles and two antiparticles) interacting via one-quantum exchange and virtual annihilation pairwise interactions. Numerical results for the ground-state mass and approximate wave functions of quadronium are presented for various strengths of the coupling, for the massive and massless quantum exchange cases. PACS Nos.: 11.10Ef, 11.10St, 03.70+k, 03.65Pm
9

Salam, A. "Bridge-Mediated RET between Two Chiral Molecules." Applied Sciences 11, no. 3 (January 23, 2021): 1012. http://dx.doi.org/10.3390/app11031012.

Abstract:
Molecular quantum electrodynamics (QED) theory is employed to calculate the rate of resonance energy transfer (RET) between a donor, D, described by an electric dipole and quadrupole, and magnetic dipole coupling, and an identical acceptor molecule, A, that is mediated by a third body, T, which is otherwise inert. A single virtual photon propagates between D and T, and between T and A. Time-dependent perturbation theory is used to compute the matrix element, from which the transfer rate is evaluated using the Fermi golden rule. This extends previous studies that were limited to the electric dipole approximation only and admits the possibility of the exchange of excitation between a chiral emitter and absorber. Rate terms are computed for specific pure and mixed multipole-dependent contributions of D and A for both an oriented arrangement of the three particles and for the freely tumbling situation. Mixed multipole moment contributions, such as those involving electric–magnetic dipole or electric dipole–quadrupole coupling at one center, do not survive random orientational averaging. Interestingly, the mixed electric–magnetic dipole D and A rate term is non-vanishing and discriminatory, exhibiting a dependence on the chirality of the emitter and absorber, and is entirely retarded. It vanishes, however, if D and A are oriented perpendicularly to one another. Near- and far-zone asymptotes of isotropic contributions to the rate are also evaluated, demonstrating radiationless short-range transfer and inverse-square radiative exchange at very large separations.
10

Schickinger, Matthias, Martin Zacharias, and Hendrik Dietz. "Tethered multifluorophore motion reveals equilibrium transition kinetics of single DNA double helices." Proceedings of the National Academy of Sciences 115, no. 32 (July 23, 2018): E7512—E7521. http://dx.doi.org/10.1073/pnas.1800585115.

Abstract:
We describe a tethered multifluorophore motion assay based on DNA origami for revealing bimolecular reaction kinetics at the single-molecule level. Molecular binding partners may be placed at user-defined positions and in user-defined stoichiometry, and binding states are read out by tracking the motion of quickly diffusing fluorescent reporter units. Multiple dyes per reporter unit enable single-particle observation for more than 1 hour. We applied the system to study, in equilibrium, the reversible hybridization and dissociation of complementary DNA single strands as a function of tether length, cation concentration, and sequence. We observed up to hundreds of hybridization and dissociation events per single reactant pair and could produce cumulative statistics with tens of thousands of binding and unbinding events. Because the binding partners per particle do not exchange, we could also detect subtle heterogeneity from molecule to molecule, which enabled separating data reflecting the actual target strand-pair binding kinetics from confounding influences stemming from chemically truncated oligonucleotides. Our data showed that mainly DNA strand hybridization, but not strand dissociation, is affected by cation concentration, in agreement with previous results from different assays. We studied 8-bp-long DNA duplexes with virtually identical thermodynamic stability but different sequences and observed strongly differing hybridization kinetics. Complementary full-atom molecular-dynamics simulations indicated two opposing sequence-dependent phenomena: helical templating in purine-rich single strands and secondary structures. These two effects can increase or decrease, respectively, the fraction of strand collisions leading to successful nucleation events for duplex formation.
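A minimal sketch of how first-order rates are typically extracted from such single-molecule dwell-time statistics is shown below, assuming exponentially distributed dwell times; the names and numbers are illustrative, not the authors' analysis pipeline.

```python
import numpy as np

def rates_from_dwell_times(bound_dwells, unbound_dwells):
    """Estimate first-order dissociation and hybridization rates from
    single-molecule dwell times, assuming exponentially distributed dwells
    (maximum-likelihood estimate: rate = 1 / mean dwell time).

    bound_dwells   : durations (s) spent in the hybridized (bound) state
    unbound_dwells : durations (s) spent in the dissociated (unbound) state
    Returns (k_off, k_on_apparent) in 1/s.
    """
    k_off = 1.0 / np.mean(bound_dwells)     # dissociation rate
    k_on = 1.0 / np.mean(unbound_dwells)    # apparent (pseudo-first-order) hybridization rate
    return k_off, k_on

# Example with made-up dwell times (seconds):
# k_off, k_on = rates_from_dwell_times([2.1, 3.4, 1.8], [0.7, 0.9, 1.2])
```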
11

Vishwakarma, Ram Gopal. "A scale-invariant, Machian theory of gravitation and electrodynamics unified." International Journal of Geometric Methods in Modern Physics 15, no. 10 (October 2018): 1850178. http://dx.doi.org/10.1142/s0219887818501785.

Abstract:
As gravitation and electromagnetism are closely analogous long-range interactions, and the current formulation of gravitation is given in terms of geometry, we expect the latter also to appear through the geometry. This unification has however, remained an unfulfilled goal. The goal is achieved here in a new theory, which results from the principles of equivalence and Mach supplemented with a novel insight that the field tensors in a geometric theory of gravitation and electromagnetism must be traceless, since these long-range interactions are mediated by virtual exchange of massless particles whose mass is expected to be related to the trace of the field tensors. Hence, the Riemann tensor, like the analogous electromagnetic field tensor, must be traceless. Thence emerges a scale-invariant, Machian theory of gravitation and electrodynamics unified, wherein the vanishing of the Ricci tensor appears as a boundary condition. While the field equations of the theory are given by the vanishing divergence of the respective field tensors and their duals, the matter and charge emerge from the spacetime. A quantitative formulation thereof, embodied in “energy-momentum super tensors”, follows from the respective Bianchi identities for the two fields. The resulting theory is valid at all scales and explains the observations without invoking the non-baryonic dark matter, dark energy or inflation. Moreover, it answers the questions that the general relativity-based standard paradigm could not address.
12

Prokof’ev, Valery Yu, Natalya E. Gordina, Oleg N. Zakharov, Elena V. Tsvetova, and Anastasia E. Kolobkova. "GRANULATED LOW-MODULUS ZEOLITES FOR EXTRACTION OF CO CATIONS." IZVESTIYA VYSSHIKH UCHEBNYKH ZAVEDENII KHIMIYA KHIMICHESKAYA TEKHNOLOGIYA 63, no. 6 (May 13, 2020): 44–49. http://dx.doi.org/10.6060/ivkkt.20206306.6144.

Abstract:
The paper characterizes LTA and SOD zeolite pellets synthesized with preliminary ultrasonic processing. X-ray diffraction and IR spectroscopy established that the investigated samples contained about 100% of the LTA or SOD phase. The LTA zeolite particles have a coherent scattering region of 780 nm and root-mean-square microdeformations of 0.05%, while for the SOD zeolite particles this parameter is about 460 nm and the defectiveness of the crystal lattice is 0.15%. The specific surface areas of the zeolite pellets were determined to be 148.8 and 115.6 m2/g for LTA and SOD, respectively. The SOD zeolite pellets have virtually no micropores. Equilibrium curves were obtained for the saturation of the zeolite pellets with cobalt cations as a function of the Co2+ concentration in an aqueous cobalt chloride solution. Over the entire investigated range of Co2+ concentrations, the capacity of the LTA zeolite is 30% higher than that of the SOD zeolite, which can be explained by two main factors: first, the higher specific surface area of the LTA zeolite; second, structural features of the zeolite frameworks, namely the α-cavities of LTA, which are more accessible to Co2+. The zeolite capacity for cobalt cations also grows with increasing temperature. After saturation, a new absorption band at a wavenumber of 1390 cm–1 appears in the IR spectra of both zeolites. In addition, cobalt-saturated zeolites show an increase in the unit-cell size and a growth in particle defectiveness. Taken together, these facts point to cation exchange. It was shown that 1 g of LTA zeolite provides almost complete purification of 50 ml of solution from 57Co cations in a dynamic mode.
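Equilibrium uptake curves of this kind are often summarised with a Langmuir-type isotherm; the generic form below is given only for orientation and is an assumption, not necessarily the model used in the paper.

```latex
% Langmuir-type exchange isotherm: q is the equilibrium uptake of Co^{2+} per gram of
% zeolite, q_max the exchange capacity, K the affinity constant, and C the equilibrium
% concentration of Co^{2+} in solution
q(C) = \frac{q_{\max}\,K C}{1 + K C}
```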
13

Gula, Jonathan, M. Jeroen Molemaker, and James C. McWilliams. "Submesoscale Dynamics of a Gulf Stream Frontal Eddy in the South Atlantic Bight." Journal of Physical Oceanography 46, no. 1 (January 2016): 305–25. http://dx.doi.org/10.1175/jpo-d-14-0258.1.

Abstract:
Frontal eddies are commonly observed and understood as the product of an instability of the Gulf Stream along the southeastern U.S. seaboard. Here, the authors study the dynamics of a simulated Gulf Stream frontal eddy in the South Atlantic Bight, including its structure, propagation, and emergent submesoscale interior and neighboring substructure, at very high resolution (dx = 150 m). A rich submesoscale structure is revealed inside the frontal eddy. Meander-induced frontogenesis sharpens the gradients and forms very sharp fronts between the eddy and the adjacent Gulf Stream. The strong straining increases the velocity shear and suppresses the development of barotropic instability on the upstream face of the meander trough. Barotropic instability of the sheared flow develops from small-amplitude perturbations when the straining weakens at the trough. Small-scale meandering perturbations evolve into rolled-up submesoscale vortices that are advected back into the interior of the frontal eddy. The deep fronts mix the tracer properties and enhance vertical exchanges of tracers between the mixed layer and the interior, as diagnosed by virtual Lagrangian particles. The frontal eddy also locally creates a strong southward flow against the shelf leading to topographic generation of submesoscale centrifugal instability and mixing. In eddy-resolving models that do not resolve these submesoscale processes, there is a significant weakening of the intensity of the upwelling in the core of the frontal eddies, and their decay is generally too fast.
14

Shibatani, Toru, Eric J. Carlson, Fredrick Larabee, Ashley L. McCormack, Klaus Früh, and William R. Skach. "Global Organization and Function of Mammalian Cytosolic Proteasome Pools: Implications for PA28 and 19S Regulatory Complexes." Molecular Biology of the Cell 17, no. 12 (December 2006): 4962–71. http://dx.doi.org/10.1091/mbc.e06-04-0311.

Abstract:
Proteolytic activity of the 20S proteasome is regulated by activators that govern substrate movement into and out of the catalytic chamber. However, the physiological relationship between activators, and hence the relative role of different proteasome species, remains poorly understood. To address this problem, we characterized the total pool of cytosolic proteasomes in intact and functional form using a single-step method that bypasses the need for antibodies, proteasome modification, or column purification. Two-dimensional Blue Native(BN)/SDS-PAGE and tandem mass spectrometry simultaneously identified six native proteasome populations in untreated cytosol: 20S, singly and doubly PA28-capped, singly 19S-capped, hybrid, and doubly 19S-capped proteasomes. All proteasome species were highly dynamic as evidenced by recruitment and exchange of regulatory caps. In particular, proteasome inhibition with MG132 markedly stimulated PA28 binding to exposed 20S α-subunits and generated doubly PA28-capped and hybrid proteasomes. PA28 recruitment virtually eliminated free 20S particles and was blocked by ATP depletion. Moreover, inhibited proteasomes remained stably associated with distinct cohorts of partially degraded fragments derived from cytosolic and ER substrates. These data establish a versatile platform for analyzing substrate-specific proteasome function and indicate that PA28 and 19S activators cooperatively regulate global protein turnover while functioning at different stages of the degradation cycle.
15

BRODSKY, STANLEY J. "HIGH ENERGY PHOTON–PHOTON COLLISIONS AT A LINEAR COLLIDER." International Journal of Modern Physics A 20, no. 31 (December 20, 2005): 7306–32. http://dx.doi.org/10.1142/s0217751x05031137.

Abstract:
High intensity back-scattered laser beams will allow the efficient conversion of a substantial fraction of the incident lepton energy into high energy photons, thus significantly extending the physics capabilities of an e-e± linear collider. The annihilation of two photons produces C = + final states in virtually all angular momentum states. An important physics measurement is the measurement of the Higgs coupling to two photons. The annihilation of polarized photons into the Higgs boson determines its fundamental H0γγ coupling as well as determining its parity. Other novel two-photon processes include the two-photon production of charged pairs τ+τ-, W+W-, [Formula: see text], and supersymmetric squark and slepton pairs. The one-loop box diagram leads to the production of pairs of neutral particles such as γγ → Z0Z0, γZ0, and γγ. At the next order one can study Higgstrahlung processes, such as γγ → W+W-W-H. Since each photon can be resolved into a W+W- pair, high energy photon-photon collisions can also provide a remarkably background-free laboratory for studying possibly anomalous WW collisions and annihilation. In the case of QCD, each photon can materialize as a quark anti-quark pair which interact via multiple gluon exchange. The diffractive channels in photon-photon collisions allow a novel look at the QCD pomeron and odderon. The C = - odderon exchange contribution can be identified by looking at the heavy quark asymmetry. In the case of eγ → e′ collisions, one can measure the photon structure functions and its various components. Exclusive hadron production processes in photon-photon collisions provide important tests of QCD at the amplitude level, particularly as measures of hadron distribution amplitudes which are also important for the analysis of exclusive semi-leptonic and two-body hadronic B-decays.
16

Messerschmidt, Marc, Leonard Chavas, Sunil Ananthaneni, Hamidreza Dadgostar, Heinz Graafsma, Mengning Liang, Adrian Mancuso, et al. "Serial Femtosecond Crystallography user's consortium apparatus at European XFEL." Acta Crystallographica Section A Foundations and Advances 70, a1 (August 5, 2014): C1748. http://dx.doi.org/10.1107/s2053273314082515.

Abstract:
The Serial Femtosecond Crystallography (SFX) user's consortium apparatus is to be installed within the Single Particles, Clusters and Biomolecules (SPB) instrument of the European X-ray Free-Electron Laser facility (XFEL.EU) [1, 2]. The XFEL.EU will provide ultra-short, highly intense, coherent X-ray pulses at an unprecedented repetition rate. The experimental setup and methodological approaches of many scientific areas will be transformed, including structural biology that could potentially overcome common problems and bottlenecks encountered in crystallography, such as creating large crystals, dealing with radiation damage, or understanding sub-picosecond time-resolved phenomena. The key concept of the SFX method is based on the kinetic insertion of protein crystal samples in solution via a gas dynamic virtual nozzle jet and recording diffraction signals of individual, randomly oriented crystals passing through the XFEL beam, as first demonstrated by Chapman et al. [3]. The SFX-apparatus will refocus the beam spent by the SPB instrument into a second interaction region, in some cases enabling two parallel experiments. The planned photon energy range at the SPB instrument is from 3 to 16 keV. The Adaptive Gain Integrating Pixel Detector (AGIPD) is to be implemented in the SPB instrument, including a 4 Megapixel version for the SFX-apparatus. The AGIPD is designed to store over 350 data frames from successive pulses, and aims to collect more than 3,000 images per second. Together with the implementation of automated procedures for sample exchange and injection, high-throughput nanocrystallography experiments can be integrated at the SFX-apparatus. In this work, we review the overall design of the SFX-apparatus and discuss the main parameters and challenges
17

Vance, Jean E., and Dennis E. Vance. "The role of phosphatidylcholine biosynthesis in the secretion of lipoproteins from hepatocytes." Canadian Journal of Biochemistry and Cell Biology 63, no. 8 (August 1, 1985): 870–81. http://dx.doi.org/10.1139/o85-108.

Abstract:
An investigation of the role of phospholipids in lipoprotein assembly and secretion is important since phospholipids, particularly phosphatidylcholine, are prominent components of all plasma lipoproteins. The fatty acid composition of phosphatidylcholine is virtually identical in human very low (VLDL), low, and high density lipoproteins, which supports the idea that phosphatidylcholine exchanges freely among plasma lipoproteins. However, the fatty acid composition of phosphatidylcholine from cultured rat hepatocytes is different from that in the secreted lipoproteins. In addition, the composition of molecular species of phosphatidylcholine is quite different in the rat liver, plasma, and red cells. Phosphatidylcholine is made in liver by two alternate pathways, by the CDP-choline pathway and by the methylation of phosphatidylethanolamine. Regulation of phosphatidylcholine biosynthesis by the CDP-choline pathway in rat liver is well established. In most instances, the rate of phosphatidylcholine synthesis is governed by the activity of CTP:phosphocholine cytidylyltransferase, which is present in the cytosol and also associated with microsomes. The cytosolic enzyme is inactive but can be reversibly translocated to the microsomes, where it is active. Translocation of this enzyme to the microsomes can be achieved either by a dephosphorylation reaction or by the presence of fatty acids in the cytosol. Once synthesized, how is phosphatidylcholine assembled into lipoprotein particles? The sequence of assembly of phospholipids into VLDL has been investigated in several studies. In a pulse–chase experiment, there was an initial labelling (within 15 min of the pulse) of phospholipids in secreted VLDL, which probably reflected the rapid movement of the phospholipids from their site of synthesis (the endoplasmic reticulum) to the Golgi. There appears to be a rapid exchange of phospholipid between the Golgi membranes and contents. There was also a delayed labelling (after 30 min) of the phospholipids and triacylglycerols (from [3H]glycerol) and the apoproteins (from [3H]leucine) in the secreted VLDL. This lag was attributed to the time taken for the nascent VLDL particles to move from the lumen of the endoplasmic reticulum to the Golgi and into the medium. Is phosphatidylcholine biosynthesis required for lipoprotein secretion? This question was investigated in rats maintained on a choline-deficient diet for 10 days. The total amount of plasma phosphatidylcholine decreased by approximately 40% and the rats developed fatty livers. The mechanisms for these effects of choline deficiency have not been fully explained, although one might anticipate that a deficiency of choline would inhibit the synthesis of phosphatidylcholine via the CDP-choline pathway. In addition, the possibility was considered that the alternative pathway for phosphatidylcholine biosynthesis (the methylation of phosphatidylethanolamine) might be necessary for lipoprotein secretion. Inhibition of the methylation pathway for phosphatidylcholine synthesis by greater than 90% in cultured rat hepatocytes did not inhibit the secretion of phosphatidylcholine, phosphatidylethanolamine, or apoproteins of the lipoproteins in the culture medium. Since phosphatidylcholine, triacylglycerol, and cholesterol are required for lipoprotein assembly and secretion, there might be some coordination among the synthesis and secretion of phosphatidylcholine, triacylglycerol, and cholesterol. Such coordinate regulation has been observed.
For example, fatty acids stimulate the synthesis and secretion of both triacylglycerol and phosphatidylcholine by cultured rat hepatocytes. In addition, glucagon and AMP inhibit fatty acid synthesis, phosphatidylcholine biosynthesis, and the secretion of both phosphatidylcholine and triacylglycerol. Surprisingly, insulin appears to inhibit lipoprotein secretion even though it promotes fatty acid and cholesterol biosynthesis. In other studies in which there was an increased supply of cholesterol and cholesterol esters in the diet of rats, the percent of cholesterol esters in the core of the secreted VLDL particles was increased. Moreover, cholesterol feeding increased the plasma concentrations of both phosphatidylcholine and cholesterol. Presumably, there was a coordination of the synthesis of phosphatidylcholine with the level of cholesterol in the cells and (or) plasma.We conclude that phosphatidylcholine biosynthesis is a critical component for the synthesis and secretion of lipoproteins from liver.
18

Kozlov, Oleg V., Rohan Singh, Bing Ai, Jihong Zhang, Chao Liu, and Victor I. Klimov. "Transient Spectroscopy of Glass-Embedded Perovskite Quantum Dots: Novel Structures in an Old Wrapping." Zeitschrift für Physikalische Chemie 232, no. 9-11 (August 28, 2018): 1495–511. http://dx.doi.org/10.1515/zpch-2018-1168.

Abstract:
Abstract Semiconductor doped glasses had been used by the research and engineering communities as color filters or saturable absorbers well before it was realized that their optical properties were defined by tiny specs of semiconductor matter known presently as quantum dots (QDs). Nowadays, the preferred type of QD samples are colloidal particles typically fabricated via organometallic chemical routines that allow for exquisite control of QD morphology, composition and surface properties. However, there is still a number of applications that would benefit from the availability of high-quality glass-based QD samples. These prospective applications include fiber optics, optically pumped lasers and amplifiers and luminescent solar concentrators (LSCs). In addition to being perfect optical materials, glass matrices could help enhance stability of QDs by isolating them from the environment and improving heat exchange with the outside medium. Here we conduct optical studies of a new type of all-inorganic CsPbBr3 perovskite QDs fabricated directly in glasses by high-temperature precipitation. These samples are virtually scattering free and exhibit excellent waveguiding properties which makes them well suited for applications in, for example, fiber optics and LSCs. However, the presently existing problem is their fairly low room-temperature emission quantum yields of only ca. 1%–2%. Here we investigate the reasons underlying the limited emissivity of these samples by conducting transient photoluminescence (PL) and absorption measurements across a range of temperatures from 20 to 300K. We observe that the low-temperature PL quantum yield of these samples can be as high as ~25%. However, it quickly drops (in a nearly linear fashion) with increasing temperature. Interestingly, contrary to traditional thermal quenching models, experimental observations cannot be explained in terms of a thermally activated nonradiative rate but rather suggest the existence of two distinct QD sub-ensembles of “emissive” and completely “nonemissive” particles. The temperature-induced variation in the PL efficiency is likely due to a structural transformation of the QD surfaces or interior leading to formation of extremely fast trapping sites or nonemissive phases resulting in conversion of emissive QDs into nonemissive. Thus, future efforts on improving emissivity of glass-based perovskite QD samples might focus on approaches for extending the range of stability of the low-temperature highly emissive structure/phase of the QDs up to room temperature.
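The contrast drawn above between the two interpretations can be summarised schematically as follows; this is generic notation illustrating the two pictures, not the authors' fitting functions.

```latex
% (i) Conventional thermally activated quenching: the quantum yield falls because a
% nonradiative channel with activation energy E_a competes with the radiative rate k_r
\mathrm{QY}(T) = \frac{k_r}{k_r + k_0\,e^{-E_a/k_B T}}
% (ii) Two-subensemble picture suggested by the data: only a temperature-dependent
% fraction f_em(T) of the dots remains emissive, falling roughly linearly from about
% 0.25 at 20 K to about 0.01-0.02 at 300 K
\mathrm{QY}(T) \approx f_{\mathrm{em}}(T)
```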
19

Albay, John A. C., Zhi-Yi Zhou, Cheng-Hung Chang, and Yonggun Jun. "Shift a laser beam back and forth to exchange heat and work in thermodynamics." Scientific Reports 11, no. 1 (February 23, 2021). http://dx.doi.org/10.1038/s41598-021-83824-7.

Abstract:
Although the equivalence of heat and work has been understood since Joule's ingenious experiment in 1845, the two rarely originate from the same source in experiments. In this study, we theoretically and experimentally demonstrate how a high-precision optical feedback trap can combine the generation of a virtual temperature and a virtual potential to manipulate the heat and work of a small system simultaneously. The idea is applied to a microscopic Stirling engine consisting of a Brownian particle under a time-varying confining potential and temperature. The experimental results verify the position and velocity equipartition theorems, confirm several theoretically predicted energetics, and reveal the engine efficiency as well as its trade-off relation with the output power. The small theory–experiment discrepancy and the flexibility of rapidly changing the particle's conditions highlight the advantages of this optical technique and show it to be an efficient way to explore heat- and work-related questions in the modern thermodynamics of small systems.
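For reference, the position and velocity equipartition relations tested in the experiment take the standard form for a Brownian particle of mass m in a harmonic trap of stiffness κ at temperature T (generic notation, not copied from the paper):

```latex
% Velocity (kinetic) and position (potential) equipartition for a Brownian particle
% of mass m in a harmonic trap of stiffness \kappa at temperature T
\tfrac{1}{2} m \langle v^2 \rangle = \tfrac{1}{2} k_B T,
\qquad
\tfrac{1}{2} \kappa \langle x^2 \rangle = \tfrac{1}{2} k_B T
```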
20

Barker, Timothy Scott. "Information and Atmospheres: Exploring the Relationship between the Natural Environment and Information Aesthetics." M/C Journal 15, no. 3 (May 3, 2012). http://dx.doi.org/10.5204/mcj.482.

Abstract:
Our culture abhors the world.Yet Quicksand is swallowing the duellists; the river is threatening the fighter: earth, waters and climate, the mute world, the voiceless things once placed as a decor surrounding the usual spectacles, all those things that never interested anyone, from now on thrust themselves brutally and without warning into our schemes and manoeuvres (Michel Serres, The Natural Contract, p 3). When Michel Serres describes culture's abhorrence of the world in the opening pages of The Natural Contract he draws our attention to the sidelining of nature in histories and theories that have sought to describe Western culture. As Serres argues, cultural histories are quite often built on the debates and struggles of humanity, which are largely held apart from their natural surroundings, as if on a stage, "purified of things" (3). But, as he is at pains to point out, human activity and conflict always take place within a natural milieu, a space of quicksand, swelling rivers, shifting earth, and atmospheric turbulence. Recently, via the potential for vast environmental change, what was once thought of as a staid “nature” has reasserted itself within culture. In this paper I explore how Serres’s positioning of nature can be understood amid new communication systems, which, via the apparent dematerialization of messages, seems to have further removed culture from nature. From here, I focus on a set of artworks that work against this division, reformulating the connection between information, a topic usually considered in relation to media and anthropic communication (and something about which Serres too has a great deal to say), and nature, an entity commonly considered beyond human contrivance. In particular, I explore how information visualisation and sonification has been used to give a new sense of materiality to the atmosphere, repotentialising the air as a natural and informational entity. The Natural Contract argues for the legal legitimacy of nature, a natural contract similar in standing to Rousseau’s social contract. Serres’ss book explores the history and notion of a “legal person”, arguing for a linking of the scientific view of the world and the legal visions of social life, where inert objects and living beings are considered within the same legal framework. As such The Natural Contract does not deal with ecology per-se, but instead focuses on an argument for the inclusion of nature within law (Serres, “A Return” 131). In a drastic reconfiguring of the subject/object relationship, Serres explains how the space that once existed as a backdrop for human endeavour now seems to thrust itself directly into history. "They (natural events) burst in on our culture, which had never formed anything but a local, vague, and cosmetic idea of them: nature" (Serres, The Natural Contract 3). In this movement, nature does not simply take on the role of a new object to be included within a world still dominated by human subjects. Instead, human beings are understood as intertwined with a global system of turbulence that is both manipulated by them and manipulates them. Taking my lead from Serres’s book, in this paper I begin to explore the disconnections and reconnections that have been established between information and the natural environment. 
While I acknowledge that there is nothing natural about the term “nature” (Harman 251), I use the term to designate an environment constituted by the systematic processes of the collection of entities that are neither human beings nor human crafted artefacts. As the formation of cultural systems becomes demarcated from these natural objects, the scene is set for the development of culturally mediated concepts such as “nature” and “wilderness,” as entities untouched and unspoilt by cultural process (Morton). On one side of the divide the complex of communication systems is situated, on the other is situated “nature”. The restructuring of information flows due to developments in electronic communication has ostensibly removed messages from the medium of nature. Media is now considered within its own ecology (see Fuller; Strate) quite separate from nature, except when it is developed as media content (see Cubitt; Murray; Heumann). A separation between the structures of media ecologies and the structures of natural ecologies has emerged over the history of electronic communication. For instance, since the synoptic media theory of McLuhan it has been generally acknowledged that the shift from script to print, from stone to parchment, and from the printing press to more recent developments such as the radio, telephone, television, and Web2.0, have fundamentally altered the structure and effects of human relationships. However, these developments – “the extensions of man” (McLuhan)— also changed the relationship between society and nature. Changes in communications technology have allowed people to remain dispersed, as ideas, in the form of electric currents or pulses of light travel vast distances and in diverse directions, with communication no longer requiring human movement across geographic space. Technologies such as the telegraph and the radio, with their ability to seemingly dematerialize the media of messages, reformulated the concept of communication into a “quasi-physical connection” across the obstacles of time and space (Clarke, “Communication” 132). Prior to this, the natural world itself was the medium through which information was passed. Rather than messages transmitted via wires, communication was associated with the transport of messages through the world via human movement, with the materiality of the medium measured in the time it took to cover geographic space. The flow of messages followed trade flows (Briggs and Burke 20). Messages moved along trails, on rail, over bridges, down canals, and along shipping channels, arriving at their destination as information. More recently however, information, due to its instantaneous distribution and multiplication across space, seems to have no need for nature as a medium. Nature has become merely a topic for information, as media content, rather than as something that takes part within the information system itself. The above example illustrates a separation between information exchange and the natural environment brought about by a set of technological developments. As Serres points out, the word “media” is etymologically related to the word “milieu”. Hence, a theory of media should be always related to an understanding of the environment (Crocker). But humans no longer need to physically move through the natural world to communicate, ideas can move freely from region to region, from air-conditioned room to air-conditioned room, relatively unimpeded by natural forces or geographic distance. 
For a long time now, information exchange has not necessitated human movement through the natural environment and this has consequences for how the formation of culture and its location in (or dislocation from) the natural world is viewed. A number of artists have begun questioning the separation between media and nature, particularly concerning the materiality of air, and using information to provide new points of contact between media and the atmosphere (for a discussion of the history of ecoart see Wallen). In Eclipse (2009) (fig. 1) for instance, an internet based work undertaken by the collective EcoArtTech, environmental sensing technology and online media is used experimentally to visualize air pollution. EcoArtTech is made up of the artist duo Cary Peppermint and Leila Nadir and since 2005 they have been inquiring into the relationship between digital technology and the natural environment, particularly regarding concepts such as “wilderness”. In Eclipse, EcoArtTech garner photographs of American national parks from social media and photo sharing sites. Air quality data gathered from the nearest capital city is then inputted into an algorithm that visibly distorts the image based on the levels of particle pollution detected in the atmosphere. The photographs that circulate on photo sharing sites such as Flickr—photographs that are usually rather banal in their adherence to a history of wilderness photography—are augmented by the environmental pollution circulating in nearby capital cities. Figure 1: EcoArtTech, Eclipse (detail of screenshot), 2009 (Internet-based work available at:http://turbulence.org/Works/eclipse/) The digital is often associated with the clean transmission of information, as packets of data move from a server, over fibre optic cables, to be unpacked and re-presented on a computer's screen. Likewise, the photographs displayed in Eclipse are quite often of an unspoilt nature, containing no errors in their exposure or focus (most probably because these wilderness photographs were taken with digital cameras). As the photographs are overlaid with information garnered from air quality levels, the “unspoilt” photograph is directly related to pollution in the natural environment. In Eclipse the background noise of “wilderness,” the pollution in the air, is reframed as foreground. “We breathe background noise…Background noise is the ground of our perception, absolutely uninterrupted, it is our perennial sustenance, the element of the software of all our logic” (Serres, Genesis 7). Noise is activated in Eclipse in a similar way to Serres’s description, as an indication of the wider milieu in which communication takes place (Crocker). Noise links the photograph and its transmission not only to the medium of the internet and the glitches that arise as information is circulated, but also to the air in the originally photographed location. In addition to noise, there are parallels between the original photographs of nature gleaned from photo sharing sites and Serres’s concept of a history that somehow stands itself apart from the effects of ongoing environmental processes. By compartmentalising the natural and cultural worlds, both the historiography that Serres argues against and the wilderness photograph produces a concept of nature that is somehow outside, behind, or above human activities and the associated matter of noise. Eclipse, by altering photographs using real-time data, puts the still image into contact with the processes and informational outputs of nature. 
Air quality sensors detect pollution in the atmosphere and code these atmospheric processes into computer readable information. The photograph is no longer static but is now open to continual recreation and degeneration, dependent on the coded value of the atmosphere in a given location. A similar materiality is given to air in a public work undertaken by Preemptive Media, titled Areas Immediate Reading (AIR) (fig. 2). In this project, Preemptive Media, made up of Beatriz da Costa, Jamie Schulte and Brooke Singer, equip participants with instruments for measuring air quality as they walked around New York City. The devices monitor the carbon monoxide (CO), nitrogen oxides (NOx) or ground level ozone (O3) levels that are being breathed in by the carrier. As Michael Dieter has pointed out in his reading of the work, the application of sensing technology by Preemptive Media is in distinct contrast to the conventional application of air quality monitoring, which usually takes the form of extremely high resolution located devices spread over great distances. These larger air monitoring networks tend to present the value garnered from a large expanse of the atmosphere that covers individual cities or states. The AIR project, in contrast, by using small mobile sensors, attempts to put people in informational contact with the air that they are breathing in their local and immediate time and place, and allows them to monitor the small parcels of atmosphere that surround other users in other locations (Dieter). It thus presents many small and mobile spheres of atmosphere, inhabited by individuals as they move through the city. In AIR we see the experimental application of an already developed technology in order to put people on the street in contact with the atmospheres that they are moving through. It gives a new informational form to the “vast but invisible ocean of air that surrounds us and permeates us” (Ihde 3), which in this case is given voice by a technological apparatus that converts the air into information. The atmosphere as information becomes less of a vague background and more of a measurable entity that ingresses into the lives and movements of human users. The air is conditioned by information; the turbulent and noisy atmosphere has been converted via technology into readable information (Connor 186-88). Figure 2: Preemptive Media, Areas Immediate Reading (AIR) (close up of device), 2011 Throughout his career Serres has developed a philosophy of information and communication that may help us to reframe the relationship between the natural and cultural worlds (see Brown). Conventionally, the natural world is understood as made up of energy and matter, with exchanges of energy and the flows of biomass through food webs binding ecosystems together (DeLanda 120-1). However, the tendencies and structures of natural systems, like cultural systems, are also dependent on the communication of information. It is here that Serres provides us with a way to view natural and cultural systems as connected by a flow of energy and information. He points out that in the wake of Claude Shannon’s famous Mathematical Theory of Communication it has been possible to consider the relationship between information and thermodynamics, at least in Shannon’s explanation of noise as entropy (Serres, Hermes74). 
For Serres, an ecosystem can be conceptualised as an informational and energetic system: “it receives, stores, exchanges, and gives off both energy and information in all forms, from the light of the sun to the flow of matter which passes through it (food, oxygen, heat, signals)” (Serres, Hermes 74). Just as we are related to the natural world based on flows of energy— as sunlight is converted into energy by plants, which we in turn convert into food— we are also bound together by flows of information. The task is to find new ways to sense this information, to actualise the information, and imagine nature as more than a welter of data and the air as more than background. If we think of information in broad ranging terms as “coded values of the output of a process” (Losee 254), then we see that information and the environment—as a setting that is produced by continual and energetic processes—are in constant contact. After all, humans sense information from the environment all the time; we constantly decode the coded values of environmental processes transmitted via the atmosphere. I smell a flower, I hear bird songs, and I see the red glow of a sunset. The process of the singing bird is coded as vibrations of air particles that knock against my ear drum. The flower is coded as molecules in the atmosphere enter my nose and bind to cilia. The red glow is coded as wavelengths from the sun are dispersed in the Earth’s atmosphere and arrive at my eye. Information, of course, does not actually exist as information until some observing system constructs it (Clarke, “Information” 157-159). This observing system as we see the sunset, hear the birds, or smell the flower involves the atmosphere as a medium, along with our sense organs and cognitive and non-cognitive processes. The molecules in the atmosphere exist independently of our sense of them, but they do not actualise as information until they are operationalised by the observational system. Prior to this, information can be thought of as noise circulating within the atmosphere. Heinz Von Foester, one of the key figures of cybernetics, states “The environment contains no information. The environment is as it is” (Von Foester in Clarke, “Information” 157). Information, in this model, actualises only when something in the world causes a change to the observational system, as a difference that makes a difference (Bateson 448-466). Air expelled from a bird’s lungs and out its beak causes air molecules to vibrate, introducing difference into the atmosphere, which is then picked up by my ear and registered as sound, informing me that a bird is nearby. One bird song is picked up as information amid the swirling noise of nature and a difference in the air makes a difference to the observational system. It may be useful to think of the purpose of information as to control action and that this is necessary “whenever the people concerned, controllers as well as controlled, belong to an organised social group whose collective purpose is to survive and prosper” (Scarrott 262). Information in this sense operates the organisation of groups. Using this definition rooted in cybernetics, we see that information allows groups, which are dependent on certain control structures based on the sending and receiving of messages through media, to thrive and defines the boundaries of these groups. We see this in a flock of birds, for instance, which forms based on the information that one bird garners from the movements of the other birds in proximity. 
Extrapolating from this, if we are to live included in an ecological system capable of survival, the transmission of information is vital. But the form of the information is also important. To communicate, for example, one entity first needs to recognise that the other is speaking and differentiate this information from the noise in the air. Following Clarke and Von Foester, an observing system needs to be operational. An art project that gives aesthetic form to environmental processes in this vein—and one that is particularly concerned with the co-agentive relation between humans and nature—is Reiko Goto and Tim Collin’s Plein Air (2010) (fig. 3), an element in their ongoing Eden 3 project. In this work a technological apparatus is wired to a tree. This apparatus, which references the box easels most famously used by the Impressionists to paint ‘en plein air’, uses sensing technology to detect the tree’s responses to the varying CO2 levels in the atmosphere. An algorithm then translates this into real time piano compositions. The tree’s biological processes are coded into the voice of a piano and sensed by listeners as aesthetic information. What is at stake in this work is a new understanding of atmospheres as a site for the exchange of information, and an attempt to resituate the interdependence of human and non-human entities within an experimental aesthetic system. As we breathe out carbon dioxide—both through our physiological process of breathing and our cultural processes of polluting—trees breath it in. By translating these biological processes into a musical form, Collins and Gotto’s work signals a movement from a process of atmospheric exchange to a digital process of sensing and coding, the output of which is then transmitted through the atmosphere as sound. It must be mentioned that within this movement from atmospheric gas to atmospheric music we are not listening to the tree alone. We are listening to a much more complex polyphony involving the components of the digital sensing technology, the tree, the gases in the atmosphere, and the biological (breathing) and cultural processes (cars, factories and coal fired power stations) that produce these gases. Figure 3: Reiko Goto and Tim Collins, Plein Air, 2010 As both Don Ihde and Steven Connor have pointed out, the air that we breathe is not neutral. It is, on the contrary, given its significance in technology, sound, and voice. Taking this further, we might understand sensing technology as conditioning the air with information. This type of air conditioning—as information alters the condition of air—occurs as technology picks up, detects, and makes sensible phenomena in the atmosphere. While communication media such as the telegraph and other electronic information distribution systems may have distanced information from nature, the sensing technology experimentally applied by EcoArtTech, Preeemptive Media, and Goto and Collins, may remind us of the materiality of air. These technologies allow us to connect to the atmosphere; they reformulate it, converting it to information, giving new form to the coded processes in nature.AcknowledgmentAll images reproduced with the kind permission of the artists. References Bateson, Gregory. Steps to an Ecology of Mind. Chicago: University of Chicago Press, 1972. Briggs, Asa, and Peter Burke. A Social History of the Media: From Gutenberg to the Internet. Maden: Polity Press, 2009. Brown, Steve. 
“Michel Serres: Science, Translation and the Logic of the Parasite.” Theory, Culture and Society 19.1 (2002): 1-27. Clarke, Bruce. “Communication.” Critical Terms for Media Studies. Eds. Mark B. N. Hansen and W. J. T. Mitchell. Chicago: University of Chicago Press, 2010. 131-45. -----. “Information.” Critical Terms for Media Studies. Eds. Mark B. N. Hansen and W. J. T. Mitchell. Chicago: University of Chicago Press, 2010. 157-71. Crocker, Stephen. “Noise and Exceptions: Pure Mediality in Serres and Agamben.” CTheory: 1000 Days of Theory. (2007). 7 June 2012 ‹http://www.ctheory.net/articles.aspx?id=574›. Connor, Steven. The Matter of Air: Science and the Art of the Ethereal. London: Reaktion, 2010. Cubitt, Sean. EcoMedia. Amsterdam and New York: Rodopi, 2005. Dieter, Michael. “Processes, Issues, AIR: Toward Reticular Politics.” Australian Humanities Review 46 (2009). 9 June 2012 ‹http://www.australianhumanitiesreview.org/archive/Issue-May-2009/dieter.htm›. DeLanda, Manuel. Intensive Science and Virtual Philosophy. London and New York: Continuum, 2002. Fuller, Matthew. Media Ecologies: Materialist Energies in Art and Technoculture. Cambridge, MA: MIT Press, 2005. Harman, Graham. Guerrilla Metaphysics. Illinois: Open Court, 2005. Ihde, Don. Listening and Voice: Phenomenologies of Sound. Albany: State University of New York, 2007. Innis, Harold. Empire and Communications. Toronto: Voyageur Classics, 1950/2007. Losee, Robert M. “A Discipline Independent Definition of Information.” Journal of the American Society for Information Science 48.3 (1997): 254–69. McLuhan, Marshall. Understanding Media: The Extensions of Man. London: Sphere Books, 1964/1967. Morton, Timothy. Ecology Without Nature: Rethinking Environmental Aesthetics. Cambridge: Harvard University Press, 2007. Murray, Robin, and Joseph Heumann. Ecology and Popular Film: Cinema on the Edge. Albany: State University of New York, 2009. Scarrott, G.C. “The Nature of Information.” The Computer Journal 32.3 (1989): 261-66. Serres, Michel. Hermes: Literature, Science, Philosophy. Baltimore: The Johns Hopkins Press, 1982. -----. The Natural Contract. Trans. Elizabeth MacArthur and William Paulson. Ann Arbor: The University of Michigan Press, 1992/1995. -----. Genesis. Trans. Genevieve James and James Nielson. Ann Arbor: The University of Michigan Press, 1982/1995. -----. “A Return to the Natural Contract.” Making Peace with the Earth. Ed. Jerome Binde. Oxford: UNESCO and Berghahn Books, 2007. Strate, Lance. Echoes and Reflections: On Media Ecology as a Field of Study. New York: Hampton Press, 2006. Wallen, Ruth. “Ecological Art: A Call for Intervention in a Time of Crisis.” Leonardo 45.3 (2012): 234-42.
APA, Harvard, Vancouver, ISO, and other styles
21

Taveira, Rodney. "Don DeLillo, 9/11 and the Remains of Fresh Kills." M/C Journal 13, no. 4 (August 19, 2010). http://dx.doi.org/10.5204/mcj.281.

Full text
Abstract:
It’s a portrait of grief, to be sure, but it puts grief in the air, as a cultural atmospheric, without giving us anything to mourn.
—— Tom Junod, “The Man Who Invented 9/11”
The nearly decade-long attempt by families of 9/11 victims to reclaim the remains of their relatives involves rhetorics of bodilessness, waste, and virtuality that offer startling illustrations of what might be termed “the poetics of grief.” After combining as the WTC Families for Proper Burial Inc. in 2002, the families sued the city of New York in 2005. They lost, and the case has been under appeal since 2008. WTC Families is asking for nearly one million tons of material to be moved from the Fresh Kills landfill on Staten Island in order to sift it for human remains. These remains will then be reclaimed and interred: Proper Burial. But the matter is far less definitive. When a judge hearing the appeal asked how one would prove someone’s identity, the city’s lawyer replied, “You have to be able to particularise and say it’s your body. All that’s left here is a bunch of undifferentiated dust.” The reply “elicited gasps and muttered ‘no’s’ from a crowd whose members wore laminated photos of deceased victims” (Hughes). These laminated displays are an attempt by WTC Families to counteract the notion of the victims as “undifferentiated dust”; the protected, hermetic images are testimony to painful uncertainty, an (always) outmoded relic of the evidentiary self. In the face of such uncertainty, it was not only court audiences who waited for a particular response to the terrorist attacks. Adam Hirsch, reviewer for the New York Sun, claimed that “the writer whose September 11 novel seemed most necessary was Don DeLillo. Mr. DeLillo, more than any other novelist, has always worked at the intersection of public terror and private fear.” DeLillo’s prescience regarding the centrality of terrorism in American culture was noted by many critics in the aftermath of the attack on the World Trade Center. The novelist even penned an essay for Harper’s in which he reflected on the role of the novelist in the new cultural landscape of the post-9/11 world. In an online book club exchange for Slate, Meghan O’Rourke says, “DeLillo seemed eerily primed to write a novel about the events of September 11. … Rereading some of his earlier books, including the terrorism-riddled Mao II, I wondered, half-seriously, if Mohamed Atta and crew had been studying DeLillo.” If there was any writer who might have been said to have seen it coming, it was DeLillo. The World Trade Center had figured in his novels before the 9/11 attacks. The twin towers are a primary landmark in Underworld, gracing the cover of the novel in ghostly black and white. In Players (1977), a Wall Street worker becomes involved in a terrorist plot to bomb the New York Stock Exchange and his wife works in the WTC for the “Grief Management Council”—“Where else would you stack all this grief?” (18).
Classifications
As the WTC Families for Proper Burial Inc. trial demonstrates, the reality of the terrorist attacks of September 11 offered an altogether more macabre and less poetic reality than DeLillo’s fiction had depicted. The Fresh Kills landfill serves in Underworld as a metaphor for the accumulated history of Cold War America in the last half-century.
Taking in the “man-made mountain,” waste management executive Brian Glassic thinks, “It was science fiction and prehistory”; seeing the World Trade Center in the distance, “he sensed a poetic balance between that idea and this one” (Underworld 184). But the poetic balance DeLillo explores in the 1997 novel has been sundered by the obliteration of the twin towers. Fresh Kills and the WTC are now united by a disquieting grief. The landfill, which closed in 2001, was forced to reopen when the towers collapsed to receive their waste. Fresh Kills bears molecular witness to this too-big collective trauma. “‘They commingled it, and then they dumped it,’ Mr. Siegel [lawyer for WTC Families] said of the remains being mixed with household trash, adding that a Fresh Kills worker had witnessed city employees use that mixture to fill potholes” (Hughes). The revelation is obscene: Are we walking and driving over our dead? The commingling of rubble and human remains becomes a collective (of) contamination too toxic, too overwhelming for conventional comprehension. “You can’t even consider the issue of closure until this issue has been resolved,” says the lawyer representing WTC Families (Hartocollis).Nick Shay, Underworld’s main character, is another waste executive who travels the world to observe ways of dealing with garbage. Of shopping with his wife, Nick says, “Marion and I saw products as garbage even when they sat gleaming on store shelves, yet unbought. We didn’t say, What kind of casserole will that make? We said, What kind of garbage will that make?” (121). This attests to the virtuality of waste, a potentiality of the products – commercial, temporal, biological – that comprise the stuff of contemporary American culture. Synecdoche and metonymy both, waste becomes the ground of hysteron proteron, the rhetorical figure that disorders time and makes the future always present. Like (its) Fresh Kills, waste is science fiction and prehistory.Repeating the apparent causal and temporal inversion of hysteron proteron, Nick’s son Jeff uses his home computer to access a simultaneous future and past that is the internal horizon of Underworld’s historical fiction. Jeff has previously been using his computer to search for something in the video footage of the “Texas Highway Killer,” a serial murderer who randomly shoots people on Texan highways. Jeff tries to resolve the image so that the pixels will yield more, exposing their past and future. “He was looking for lost information. He enhanced and super-slowed, trying to find some pixel in the data swarm that might provide a clue to the identity of the shooter” (118). Searching for something more, something buried, Jeff, like WTC Families, is attempting to redeem the artifactual and the overlooked by reconfiguring them as identity. DeLillo recognises this molecular episteme through the “dot theory of reality”: “Once you get inside a dot, you gain access to hidden information, you slide inside the smallest event. This is what technology does. It peels back the shadows and redeems the dazed and rambling past. It makes reality come true” (177). Like the gleaming supermarket products Nick and Marion see as garbage, the unredeemed opens onto complex temporal and rhetorical orders. Getting inside garbage is like getting “inside a dot.” This approach is not possible for the unplanned waste of 9/11. 
Having already lost its case, WTC Families will almost certainly lose its appeal because its categories and its means are unworkable and inapplicable: they cannot particularise.
Premonitions
In his 9/11 essay “In the Ruins of the Future,” published in Harper’s a few months after the attacks, DeLillo says “We are all breathing the fumes of lower Manhattan where traces of the dead are everywhere, in the soft breeze off the river, on rooftops and windows, in our hair and on our clothes” (39). DeLillo’s portrait of molecular waste adumbrates the need to create “counternarratives.” Until the events of 11 September 2001 the American narrative was that of the Cold War, and thus also the narrative of Underworld; one for which DeLillo claims the Bush administration was feeling nostalgic. “This is over now,” he says. “The narrative ends in the rubble and it is left to us to create the counternarrative” (34). DeLillo was already at work on a narrative of his own at the time of the terrorist attacks. As Joseph Conte notes, when the World Trade Center was attacked, “DeLillo, had nearly finished drafting his thirteenth novel, Cosmopolis [… and] shared in the collective seizure of the American mind” (179). And while it was released in 2003, DeLillo sets the novel in 2000 on “a day in April.” If the millennium, the year 2000, has been, as Boxall claims, the horizon of DeLillo’s writing, the tagging of this “day in April” at the beginning of the novel signals Cosmopolis as a limit-work (4). 9/11 functions as a felt absence in the novel, a binding thing floating in the air, like the shirt that DeLillo will use to begin and end Falling Man; a story that will ‘go beyond’ the millennial limit, a story that is, effectively, the counternarrative of which DeLillo speaks in his 9/11 essay. Given the timing of the terrorist attacks in New York, and DeLillo’s development of his novel, it is extraordinary to consider just how Cosmopolis reflects on its author’s position as a man who should have “seen it coming.” The billionaire protagonist Eric Packer traverses Manhattan by car, his journey a bifurcation between sophistication and banality. Along the way he has an onanistic sexual encounter whilst having his prostate examined, hacks into and deletes his wife’s old money European fortune, loses his own self-made wealth by irrationally betting against the rise of the yen, kills a man, and shoots himself in the hand in front of his assassin. Eric actively moves toward his own death. Throughout Eric’s journey the socially binding integrity of the present and the future is teased apart. He continually sees images of future events before they occur – putting his hand on his chin, a bomb explosion, and finally, his own murder – via video screens in his car and wristwatch. These are, as Conte rightly notes, repeated instances of hysteron proteron (186). His corpse does not herald obsolescence but begins the true life of waste: virtual information. Or, as Eric’s “Chief of Theory” asks, “Why die when you can live on a disk?” (106). There are shades here of Jeff’s pixelated excursion into the video footage of the Texas Highway Killer: “Once you get inside a dot, you gain access to hidden information.” Life at this level is not only virtual, it is particularised, a point (or a collection of points) Eric comes to grasp during the protracted scene in which he watches himself die: “The stuff he sneezes when he sneezes, this is him” (207).
In Falling Man, the work in which DeLillo engages directly with the 9/11 attack, the particularised body recurs in various forms. First there is the (now iconic) falling man: the otherwise unknown victim of the terrorist attack who leapt from the WTC and whose descent was captured in a photograph by Richard Drew. This figure was named (particularised) by Tom Junod (who provides the epigraph for this essay) as “The Falling Man.” In DeLillo’s novel another Falling Man, a performance artist, re-enacts the moment by jumping off buildings, reiterating the photograph (back) into a bodily performance. In these various incarnations the falling man is serially particularised: photographed, named, then emulated. The falling man is a single individual, and multiple copies. He lives on long after death and so does his trauma. He represents the poetic expression of collective grief. Particularised bodies also infect the terror narrative of Falling Man at a molecular level. Falling Man’s terrorist, Hammad, achieves a similar life-after-death by becoming “organic shrapnel.” The surviving victims of the suicide bomb attack, months later, begin to display signs of the suicide bombers in lumps and sores emerging from their bodies, too-small bits of the attacker forever incorporated. Hammad is thus paired with the victims of the crash in a kind of disseminative and absorptive (rhetorical) structure. “The world changes first in the mind of the man who wants to change it. The time is coming, our truth, our shame, and each man becomes the other, and the other still another, and then there is no separation” (80).
Revisions
The traces of American culture that were already contained in the landfill in Underworld have now become the resting place of the dust and the bodies of the trauma of 9/11. Rereading DeLillo’s magnum opus, one cannot help but be struck by the new resonance of Fresh Kills.
The landfill showed him smack-on how the waste stream ended, where all the appetites and hankerings, the sodden second thoughts came runneling out, the things you wanted ardently and then did not…. He knew the stench must ride the wind into every dining room for miles around. When people heard a noise at night, did they think the heap was coming down around them, sliding toward their homes, an omnivorous movie terror filling their doorways and windows? The wind carried the stink across the kill…. The biggest secrets are the ones spread before us. (184-5)
The landfill looms large on the landscape, a huge pile of evidence for the mass trauma of what remains, those that remain, and what may come—waste in all its virtuality. The “omnivorous movie terror filling their doorways and windows” is a picture of dust-blanketed Downtown NYC that everybody, everywhere, continually saw. The mediatory second sight of sifting the landfill, of combing the second site of the victims for its “sodden second thoughts,” is at once something “you wanted ardently and then did not.” The particles are wanted as a distillate, produced by the frameline of an intentional, processual practice that ‘edits’ 9/11 and its aftermath into a less unacceptable sequence that might allow the familiar mourning ritual of burying a corpse. WTC Families Inc. is seeking to throw the frame of human identity around the unincorporated particles of waste in the Fresh Kills landfill, an unbearably man-made, million-ton mountain.
This operation is an attempt to immure the victims and their families from the attacks and their afterlife as waste or recycled material, refusing the ever-present virtual life of waste that always accompanied them. Of course, even if WTC Families is granted its wish to sift Fresh Kills, how can it differentiate its remains from those of the 9/11 attackers? The latter have a molecular, virtual afterlife in the present and the living, lumpy reminders that surface as foreign bodies. Resisting the city’s drive to rebuild and move on, WTC Families for Proper Burial Inc. is absorbed with the classification of waste rather than its deployment. In spite of the group’s failed court action, the Fresh Kills site will still be dug over: a civil works project by the NYC Department of Parks & Recreation will reclaim the landfill and rename it “Freshkills Park,” a re-creational area to be twice the size of Central Park. As DeLillo foresaw, “The biggest secrets are the ones spread before us.”
References
Boxall, Peter. Don DeLillo: The Possibility of Fiction. London: Routledge, 2006. Conte, Joseph M. “Writing amid the Ruins: 9/11 and Cosmopolis”. The Cambridge Companion to Don DeLillo. Ed. John N. Duvall. Cambridge: Cambridge University Press, 2008. 179-192. Cowart, David. Don DeLillo: The Physics of Language. Athens: University of Georgia Press, 2003. DeLillo, Don. Players. London: Vintage, 1991. ———. Mao II. London: Vintage, 1992. ———. Underworld. London: Picador, 1997. ———. “In the Ruins of the Future”. Harper’s. Dec. 2001: 33-40. ———. Cosmopolis. London: Picador, 2003. ———. Falling Man. New York: Scribner, 2007. Hartocollis, Anemona. “Landfill Has 9/11 Remains, Medical Examiner Wrote”. 24 Mar. 2007. The New York Times. 7 Mar. 2009 ‹http://www.nytimes.com/2007/03/24%20/nyregion/24remains.html›. Hirsch, Adam. “DeLillo Confronts September 11”. 2 May 2007. The New York Sun. 10 May 2007 ‹http://www.nysun.com/arts/delillo-confronts-september-11/53594/›. Hughes, C. J. “9/11 Families Press Judges on Sifting at Landfill”. 16 Dec. 2009. The New York Times. 17 Dec. 2009 ‹http://www.nytimes.com/2009/12/17/nyregion/17sift.html›. Junod, Tom. “The Man Who Invented 9/11”. 7 May 2007. Rev. of Falling Man by Don DeLillo. Esquire. 28 May 2007 ‹http://www.esquire.com/fiction/book-review/delillo›. O’Rourke, Meghan. “DeLillo Seemed Almost Eerily Primed to Write a Novel about 9/11”. 23 May 2007. Slate.com. 28 May 2007 ‹http://www.slate.com/id/2166831/%20entry/2166848/›.
APA, Harvard, Vancouver, ISO, and other styles
22

Lemos Morais, Renata. "The Hybrid Breeding of Nanomedia." M/C Journal 17, no. 5 (October 25, 2014). http://dx.doi.org/10.5204/mcj.877.

Full text
Abstract:
Introduction
If human beings have become a geophysical force, capable of impacting the very crust and atmosphere of the planet, and if geophysical forces become objects of study, presences able to be charted over millions of years—one of our many problems is a 'naming' problem. - Bethany Nowviskie
The anthropocene "denotes the present time interval, in which many geologically significant conditions and processes are profoundly altered by human activities" (S.Q.S.). Although the narrative and terminology of the anthropocene have not been officially legitimized by the scientific community as a whole, they have been adopted worldwide by a plethora of social and cultural studies. The challenges of the anthropocene demand interdisciplinary efforts and actions. New contexts, situations and environments call for original naming propositions: new terminologies are always illegitimate at the moment of their first appearance in the world. Against the background of the naming challenges of the anthropocene, we will map the emergence and tell the story of a tiny world within the world of media studies: the world of the term 'nanomedia' and its hyphenated sister 'nano-media'. While we tell the story of the uses of this term, its various meanings and applications, we will provide yet another possible interpretation and application of the term, one that we believe might be helpful to interdisciplinary media studies in the context of the anthropocene. Contemporary media terminologies are usually born out of fortuitous exchanges between communication technologies and their various social appropriations: hypodermic media, interactive media, social media, and so on and so forth. These terminologies are either recognised as the offspring of legitimate scientific endeavours by the media theory community, or are widely discredited and therefore rendered illegitimate. Scientific legitimacy comes from the broad recognition and embrace of a certain term and its inclusion in the canon of an epistemology. Illegitimate processes of theoretical enquiry and the study of the kinds of deviations that might deem a theory unacceptable have been scarcely addressed (Delborne). Rejected terminologies and theories are marginalised and gain the status of bastard epistemologies of media, considered irrelevant and unworthy of mention and recognition. Within these margins, however, different streams of media theories which involve conceptual hybridizations can be found: creole encounters between high culture and low culture (James), McLuhan's hybrid that comes from the 'meeting of two media' (McLuhan 55), or even 'bastard spaces' of cultural production (Bourdieu). Once in a while a new media epistemology arises that is categorised as a bastard not because of plain rejection or criticism, but because of its alien origins, formations and shape. New theories are currently emerging out of interdisciplinary and transdisciplinary thinking which are, in many ways, bearers of strange features and characteristics that might render their meaning elusive and obscure to a monodisciplinary perspective. Radical transdisciplinary thinking is often alien and alienated. It results from unconventional excursions into uncharted territories of enquiry: bastard epistemologies arise from such exchanges. Being itself a product of a mestizo process of thinking, this article takes a look into the term nanomedia (or nano-media): a marginal terminology within media theory. This term is not to be confounded with the term biomedia, coined by Eugene Thacker (2004).
(The theory of biomedia has acquired a great level of scientific legitimacy; however, it refers to the moist realities of the human body, and is more concerned with cyborg and post-human epistemologies. The term nanomedia, on the contrary, is currently being used according to multiple interpretations which are mostly marginal, and we argue, in this paper, that such uses might be considered illegitimate). 'Nanomedia' was coined outside the communications area. It was first used by scientific researchers in the field of optics and physics (Rand et al.), in relation to flows of media via nanoparticles and optical properties of nanomaterials. This term would only be used in media studies a couple of years later, with a completely different meaning, without any acknowledgment of its scientific origins and context. The structure of this narrative is thus illegitimate, and as such does not fit into traditional modalities of written expression: there are bits and pieces of information and epistemologies glued together as a collage of nano fragments which combine philology, scientific literature, digital ethnography and technology reviews.
Transgressions
Illegitimate theories might be understood in terms of hybrid epistemologies that intertwine disciplines and perspectives, rendering their outcomes inter- or transdisciplinary, and therefore prone to being considered marginal by disciplinary communities. Such theories might also be considered illegitimate due to social and political power struggles which aim to maintain territory by reproducing specific epistemologies within a certain field. Scientific legitimacy is a social and political process, which has been widely addressed. Pierre Bourdieu, in particular, has dedicated most of his work to deciphering the intricacies of academic wars around the legitimacy or illegitimacy of theories and terminologies. Legitimacy also plays a role in determining the degree to which a certain theory will be regarded as relevant or irrelevant:
Researchers’ tendency to concentrate on those problems regarded as the most important ones (e.g. because they have been constituted as such by producers endowed with a high degree of legitimacy) is explained by the fact that a contribution or discovery relating to those questions will tend to yield greater symbolic profit (Bourdieu 22).
Exploring areas of enquiry which are outside the boundaries of mainstream scientific discourses is a dangerous affair. Mixing different epistemologies in the search for transversal grounds of knowledge might result in unrecognisable theories, which are born out of a combination of various processes of hybridisation: social, technological, cultural and material. Material mutations are happening that call for new epistemologies, due to the implications of current technological possibilities which might redefine our understanding of mediation, and expand it to include molecular forms of communication. A new terminology that takes into account the scientific and epistemological implications of nanotechnology applied to communication [and that also goes beyond cyborg metaphors of a marriage between biology and cybernetics] is necessary. Nanomedia and nanomediations are the terminologies proposed in this article as conceptual tools to allow these further explorations. Nanomedia is here understood as the combination of different nanotechnological mediums of communication that are able to create and disseminate meaning via molecular exchange and/or assembly.
Nanomediation is here defined as the process of active transmission and reception of signs and meaning using nanotechnologies. These terminologies might help us in conducting interdisciplinary research and observations that go deeper into matter itself and take into account its molecular spaces of mediation - moving from metaphor into pragmatics.
Nanomedia(s)
Within the humanities, the term 'nano-media' was first proposed by Mojca Pajnik and John Downing, referring to small media interventions that communicate social meaning in independent ways. Their use of the term 'nano-media' proposes to be a revised alternative to the plethora of terms that categorise such media actions, such as alternative media, community media, tactical media, participatory media, etc. The metaphor of smallness implied in the term nano-media is used to categorise the many fragments and complexities of political appropriations of independent media. Historical examples of the kind of 'nano' social interferences listed by Downing (2) include the flyers (Flugblätter) of the Protestant Reformation in Germany; the jokes, songs and ribaldry of François Rabelais’ marketplace ... the internet links of the global social justice (otromundialista) movement; the worldwide community radio movement; the political documentary movement in country after country. John Downing applies the meaning of the prefix nano (coming from the Greek word nanos - dwarf) to independent media interventions. His concept is rooted in an analysis of the social actions performed by local movements scattered around the world, politically engaged and tactically positioned. A similar, but still unique, proposition to the use of the term 'nano-media' appeared two years later in the work of Graham St John (442): If ‘mass media’ consists of regional and national print and television news, ‘niche media’ includes scene specific publications, and ‘micro media’ includes event flyers and album cover art (that which Eshun [1998] called ‘conceptechnics’), and ‘social media’ refers to virtual social networks, then the sampling of popular culture (e.g. cinema and documentary sources) using the medium of the programmed music itself might be considered nano-media. Nano-media, according to Graham St John, "involves the remediation of samples from popular sources (principally film) as part of the repertoire of electronic musicians in their efforts to create a distinct liminalized socio-aesthetic" (St John 445). While Downing proposes to use the term nano-media as a way to "shake people free of their obsession with the power of macro-media, once they consider the enormous impact of nano-technologies on our contemporary world" (Downing 1), Graham St John uses the term to categorise media practices specific to a subculture (psytrance). Since the use of the term 'nano-media' in relation to culture seems to be characterised by the study of marginalised social movements, portraying a hybrid remix of conceptual references that, if not completely illegitimate, would be located on the border of legitimacy within media theories, I am hereby proposing yet another bastard version of the concept of nanomedia (without a hyphen). Given that neither of the previous uses of the term 'nano-media' within the discipline of media studies takes into account the technological use of the prefix nano, it is time to redefine the term in direct relation to nanotechnologies and communication devices. Let us start by taking a look at nanoradios.
Nanoradios are carbon nanotubes connected in such a way that, when current flows through the nanotubes, various electrical signals recover the audio signals encoded by the radio wave being received (Service). Nanoradios are examples of the many ways in which nanotechnologies are converging with and transforming our present information and communication technologies. From molecular manufacturing (Drexler) to quantum computing (Deutsch), we now have a wide spectrum of emerging and converging technologies that can act as nanomedia - molecular structures built specifically to act as communication devices.
Nanomediations
Beyond literal attempts to replicate traditional media artifacts using nanotechnologies, we find deep processes of mediation which are being called nanocommunication (Hara et al.) - mediation that takes place through the exchange of signals between molecules: Nanocommunication networks (nanonetworks) can be used to coordinate tasks and realize them in a distributed manner, covering a greater area and reaching unprecedented locations. Molecular communication is a novel and promising way to achieve communication between nanodevices by encoding messages inside molecules. (Abadal & Akyildiz) Nature is nanotechnological. Living systems are precise mechanisms of physical engineering: our molecules obey our DNA and fall into place according to biological codes that are mysteriously written in our every cell. Bodies are perfectly mediated - biological systems of molecular communication and exchange. Humans have always tried to emulate or to replace natural processes by artificial ones. Nanotechnology is not an exception. Many nanotechnological applications try to replicate natural systems, for example: replicas of nanostructures found in lotus flowers are now being used in waterproof fabrics; nanocrystals, responsible for the resistance of cobwebs, are being artificially replicated for use in resistant materials; and various proteins are being artificially replicated as well (NNI 05). In recent decades, the methods of manipulation and engineering of nanoparticles have been perfected by scientists, and hundreds of nanotechnological products are now being marketed. Such nanomaterial levels are now accessible because our digital technologies were advanced enough to allow scientific visualization and manipulation at the atomic level. The Scanning Tunneling Microscopes (STMs), by Gerd Binnig and Heinrich Rohrer (1986), might be considered the first kind of nanomedia devices ever built. STMs use quantum-mechanical principles to capture information about the surface of atoms and molecules, allowing digital imaging and visualization of atomic surfaces. Digital visualization of atomic surfaces led to the discovery of buckyballs and nanotubes (buckytubes), structures that are celebrated today and received their names in honor of Buckminster Fuller. Nanotechnologies were developed as a direct consequence of the advancement of digital technologies in the fields of scientific visualisation and imaging. Nonetheless, a direct causal relationship between nano and digital technologies is not the only correlation between these two fields. Much in the same manner in which digital technologies allow infinite manipulation and replication of data, nanotechnologies would allow infinite manipulation and replication of molecules. Nanocommunication could be as revolutionary as digital communication with regard to its possible outcomes concerning new media.
Full implementation of the new possibilities of nanomedia would be equivalent to, or even more revolutionary than, today's digital networks. Nanotechnology operates at an intermediate scale at which the laws of classical physics are mixed with the laws of quantum physics (Holister). The relationship between digital technologies and nanotechnologies is not just instrumental; it is also conceptual. We might compare the possibilities of nanotechnology to hypertext: in the same way that a word processor allows the expression of any type of textual structure, so nanotechnology could allow, in principle, a sort of "3-D printing" of any material structure. Nanotechnologies are essentially media technologies. Nanomedia is now a reality because digital technologies made possible the visualization and computational simulation of the behavior of atomic particles at the nano level. Nanomachines that can build any type of molecular structure by atomic manufacturing could also build perfect replicas of themselves. Obviously, such a powerful technology poses medical and ecological dangers inherent in atomic manipulation. Although this type of concern has been present in the global debate about the social implications of nanotechnology, its full implications are not yet entirely understood. A general scientific consensus seems to exist, however, around the idea that molecules could become a new type of material alphabet, which, theoretically, would make possible the reconfiguration of the physical structures of any type of matter using molecular manufacturing. Matter becomes digital through molecular communication. Although the uses given to the term nano-media in the context of cultural and social studies are merely metaphorical - the prefix nano is used by humanists as an allegorical reference to a combination of 'small' and 'contemporary' - once the technological and scientific realities of nanomedia present themselves as a new realm of mediation, populated with its own kind of molecular devices, it will no longer be possible to ignore its full range of implications. A complexifying media ecosystem calls for a more nuanced and interdisciplinary approach to media studies.
Conclusion
This article narrates the different uses of the term nanomedia as an illustration of the way in which disciplinarity determines the level of legitimacy or illegitimacy of an emerging term. We then presented another possible use of the term in the field of media studies, one that is more closely aligned with its scientific origins. The importance and relevance of this narrative is connected to the present challenges we face in the anthropocene. The reality of the anthropocene makes painfully evident the full extent of the impact our technologies have had on the present condition of our planet's ecosystems. For as long as we refuse to engage directly with the technologies themselves, trying to speak the language of science and technology in order to fully understand its wider consequences and implications, our theories will be reduced to fancy metaphors and aesthetic explorations which circulate around the critical issues of our times without penetrating them. The level of interdisciplinarity required by the challenges of the anthropocene has to go beyond anthropocentrism. Traditional theories of media are anthropocentric: we seem to be willing to engage only with that which we are able to recognise and relate to.
Going beyond anthropocentrism requires that we become familiar with interdisciplinary discussions and perspectives around common terminologies so we might reach a consensus about the use of a shared term. For scientists, nanomedia is an information and communication technology which is simultaneously a tool for material engineering. For media artists and theorists, nano-media is a cultural practice of active social interference and artistic exploration. However, neither of the two approaches is able to fully grasp the magnitude of such an inter- and transdisciplinary encounter: when communication becomes molecular engineering, what are the legitimate boundaries of media theory? If matter becomes not only a medium, but also a language, what would be the conceptual tools needed to rethink our very understanding of mediation? Would this new media epistemology be considered legitimate or illegitimate? Be it legitimate or illegitimate, a new media theory must arise that challenges and overcomes the walls which separate science and culture, physics and semiotics, on the grounds that it is a transdisciplinary change in the inner workings of media itself which now becomes our vector of epistemological and empirical transformation. A new media theory which not only speaks the language of molecular technologies but might also be translated into material programming is the only media theory equipped to handle the challenges of the anthropocene.
References
Abadal, Sergi, and Ian F. Akyildiz. "Bio-Inspired Synchronization for Nanocommunication Networks." Global Telecommunications Conference (GLOBECOM), 2011. Borisenko, V. E., and S. Ossicini. What Is What in the Nanoworld: A Handbook on Nanoscience and Nanotechnology. Weinheim: Wiley-VCH, 2005. Bourdieu, Pierre. "The Specificity of the Scientific Field and the Social Conditions of the Progress of Reason." Social Science Information 14 (Dec. 1975): 19-47. ---. La Distinction: Critique Sociale du Jugement. Paris: Editions de Minuit, 1979. Delborne, Jason A. "Transgenes and Transgressions: Scientific Dissent as Heterogeneous Practice". Social Studies of Science 38 (2008): 509. Deutsch, David. The Beginning of Infinity. London: Penguin, 2011. Downing, John. "Nanomedia: ‘Community’ Media, ‘Network’ Media, ‘Social Movement’ Media: Why Do They Matter? And What’s in a Name? Mitjans Comunitaris, Moviments Socials i Xarxes." InCom-UAB. Barcelona: Cidob, 15 March 2010. Drexler, E.K. "Modular Molecular Composite Nanosystems." Metamodern 10 Nov. 2008. Epstein, Steven. Impure Science: AIDS, Activism, and the Politics of Knowledge. Vol. 7. U of California P, 1996. Hara, S., et al. "New Paradigms in Wireless Communication Systems." Wireless Personal Communications 37.3-4 (May 2006): 233-241. Holister, P. "Nanotech: The Tiny Revolution." CMP Cientifica July 2002. James, Daniel. Bastardising Technology as a Critical Mode of Cultural Practice. PhD Thesis. Wellington, New Zealand, Massey University, 2010. Jensen, K., J. Weldon, H. Garcia, and A. Zettl. "Nanotube Radio." Nano Letters 7.11 (2007): 3508–3511. Lee, C.H., S.W. Lee, and S.S. Lee. "A Nanoradio Utilizing the Mechanical Resonance of a Vertically Aligned Nanopillar Array." Nanoscale 6.4 (2014): 2087-93. Maasen. Governing Future Technologies: Nanotechnology and the Rise of an Assessment Regime. Berlin: Springer, 2010. 121–4. Milburn, Colin. "Digital Matters: Video Games and the Cultural Transcoding of Nanotechnology." In Governing Future Technologies: Nanotechnology and the Rise of an Assessment Regime, eds.
Mario Kaiser, Monika Kurath, Sabine Maasen, and Christoph Rehmann-Sutter. Berlin: Springer, 2009. Miller, T.R., T.D. Baird, C.M. Littlefield, G. Kofinas, F. Chapin III, and C.L. Redman. "Epistemological Pluralism: Reorganizing Interdisciplinary Research". Ecology and Society 13.2 (2008): 46. National Nanotechnology Initiative (NNI). Big Things from a Tiny World. 2008. Nowviskie, Bethany. "Digital Humanities in the Anthropocene". Nowviskie.org. 15 Sep. 2014. Pajnik, Mojca, and John Downing. "Introduction: The Challenges of 'Nano-Media'." In M. Pajnik and J. Downing, eds., Alternative Media and the Politics of Resistance: Perspectives and Challenges. Ljubljana, Slovenia: Peace Institute, 2008. 7-16. Qarehbaghi, Reza, Hao Jiang, and Bozena Kaminska. "Nano-Media: Multi-Channel Full Color Image with Embedded Covert Information Display." In ACM SIGGRAPH 2014 Posters. New York: ACM, 2014. Rand, Stephen C., Costa Soukolis, and Diederik Wiersma. "Localization, Multiple Scattering, and Lasing in Random Nanomedia." JOSA B 21.1 (2004): 98-98. Service, Robert F. "TF10: Nanoradio." MIT Technology Review April 2008. Shanken, Edward A. "Artists in Industry and the Academy: Collaborative Research, Interdisciplinary Scholarship and the Creation and Interpretation of Hybrid Forms." Leonardo 38.5 (Oct. 2005): 415-418. St John, Graham. "Freak Media: Vibe Tribes, Sampledelic Outlaws and Israeli Psytrance." Continuum: Journal of Media and Cultural Studies 26.3 (2012): 437–447. Subcommission on Quaternary Stratigraphy (S.Q.S.). "What Is the Anthropocene?" Quaternary.stratigraphy.org. Thacker, Eugene. Biomedia. Minneapolis: University of Minnesota Press, 2004. Toffoli, Tommaso, and Norman Margolus. "Programmable Matter: Concepts and Realization." Physica D 47 (1991): 263–272. Vanderbeeken, Robrecht, Christel Stalpaert, Boris Debackere, and David Depestel. Bastard or Playmate? On Adapting Theatre, Mutating Media and the Contemporary Performing Arts. Amsterdam: Amsterdam University, 2012. Wark, McKenzie. "Climate Science as Sensory Infrastructure." Extract from Molecular Red, forthcoming. The White Review 20 Sep. 2014. Wilson, Matthew W. "Cyborg Geographies: Towards Hybrid Epistemologies." Gender, Place and Culture 16.5 (2009): 499–515.
APA, Harvard, Vancouver, ISO, and other styles
23

Otsuki, Grant Jun. "Augmenting Japan’s Bodies and Futures: The Politics of Human-Technology Encounters in Japanese Idol Pop." M/C Journal 16, no. 6 (November 7, 2013). http://dx.doi.org/10.5204/mcj.738.

Full text
Abstract:
Perfume is a Japanese “techno-pop” idol trio formed in 2000 consisting of three women–Ayano Omoto, Yuka Kashino, and Ayaka Nishiwaki. Since 2007, when one of their songs was selected for a recycling awareness campaign by Japan's national public broadcaster, Perfume has been a consistent fixture in the Japanese pop music charts. They have been involved in the full gamut of typical idol activities, from television and radio shows to commercials for clothing brands, candy, and drinks. Their success reflects Japanese pop culture's long-standing obsession with pop idols, who once breaking into the mainstream, become ubiquitous cross-media presences. Perfume’s fame in Japan is due in large part to their masterful performance of traditional female idol roles, through which they assume the kaleidoscopic positions of daughter, sister, platonic friend, and heterosexual romantic partner depending on the standpoint of the beholder. In the lyrical content of their songs, they play the various parts of the cute but shy girl who loves from a distance, the strong compatriot that pushes the listener to keep striving for their dreams, and the kindred spirit with whom the listener can face life's ordinary challenges. Like other successful idols, their extensive lines of Perfume-branded merchandise and product endorsements make the exercise of consumer spending power by their fans a vehicle for them to approach the ideals and experiences that Perfume embodies. Yet, Perfume's videos, music, and stage performances are also replete with subversive images of machines, virtual cities and landscapes, and computer generated apparitions. In their works, the traditional idol as an object of consumer desire co-exists with images of the fragmentation of identity, distrust in the world and the senses, and the desire to escape from illusion, all presented in terms of encounters with technology. In what their fans call the "Near Future Trilogy", a set of three singles released soon after their major label debut (2005-06), lyrics refer to the artificiality and transience of virtual worlds ("Nothing I see or touch has any reality" from "Electro-World," or "I want to escape. I want to destroy this city created by immaculate computation" from "Computer City"). In their later work, explicit lyrical references to virtual worlds and machines largely disappear, but they are replaced with images and bodily performances of Perfume with robotic machinery and electronic information. Perfume is an idol group augmented by technology. In this paper, I explore the significance of these images of technological augmentation of the human body in the work of Perfume. I suggest that the ways these bodily encounters of the human body and technology are articulated in their work reflect broader social and economic anxieties and hopes in Japan. I focus in the first section of this paper on describing some of the recurring technological motifs in their works. Next, I show how their recent work is an experiment with the emergent possibilities of human-technology relationships for imagining Japan's future development. Not only in their visual and performance style, but in their modes of engagement with their fans through new media, I suggest that Perfume itself is attempting to seek out new forms of value creation, which hold the promise of pushing Japan out of the extended economic and social stagnation of its 1990s post-bubble "Lost Decade,” particularly by articulating how they connect with the world. 
The idol's technologically augmented body becomes both icon and experiment for rethinking Japan and staking out a new global position for it. Though I have referred above to Perfume as its three members, I also use the term to signify the broader group of managers and collaborating artists that surrounds them. Perfume is a creation of corporate media companies and the output of development institutions designed to train multi-talented entertainers from a young age. In addition to the three women who form the public face of Perfume, main figures include music producer Yasutaka Nakata, producer and choreographer MIKIKO, and more recently, the new media artist Daito Manabe and his company, Rhizomatiks. Though Perfume very rarely appear on stage or in their videos with any other identifiable human performers, every production is an effort involving dozens of professional staff. In this respect, Perfume is a very conventional pop idol unit. The attraction of these idols for their fans is not primarily their originality, creativity, or musicality, but their professionalism and image as striving servants (Yano 336). Idols are beloved because they "are well-polished, are trained to sing and act, maintain the mask of stardom, and are extremely skillful at entertaining the audience" (Iwabuchi 561). Moreover, their charisma is based on a relationship of omoiyari or mutual empathy and service. As Christine Yano has argued for Japanese Enka music, the singer must maintain the image of service to his or her fans and reach out to them as if engaged in a personal relationship with each (337). Fans reciprocate by caring for the singer, and making his or her needs their own, not the least of which are financial. The omoiyari relationship of mutual empathy and care is essential to the singer’s charismatic appeal (Yano 347). Thus it does not matter to their fans that Perfume do not play their own instruments or write their own songs. These are jobs for other professionals. However, mirroring the role of the employee in the Japanese company-as-family (see Kondo), their devotion to their jobs as entertainers, and their care and respect for their fans must be evident at all times. The tarnishing of this image, for instance through revelations of underage smoking or drinking, can be fatal, and has resulted in banishment from the media spotlight for some former stars. A large part of Japanese stars' conventional appeal is based on their appearance as devoted workers, consummate professionals, and partners in mutual empathy. As charismatic figures that exchange cultural ideals for fans’ disposable income, it is not surprising that many authors have tied the emergence of the pop idol to the height of Japan's economic prosperity in the 1970s and 1980s, when the social contract between labor and corporations that provided both lifelong employment and social identity had yet to be seriously threatened. Aoyagi suggests (82) that the idol system is tied to post-war consumerism and the increased importance of young adults, particularly women, as consumers. As this correlation between the health of idols and the economy might imply, there is a strong popular connection between concerns of social fission and discontent and economic stagnation. Koichi Iwabuchi writes that Japanese media accounts in the 1990s connected the health of the idol system to the "vigor of society" (555). 
As Iwabuchi describes, some Japanese fans have looked for their idols abroad in places such as Hong Kong, with a sense of nostalgia for a kind of stardom that has waned in Japan and because of "a deep sense of disillusionment and discontent with Japanese society" (Iwabuchi 561) following the collapse of Japan's bubble economy in the early 1990s. In reaction to the same conditions, some Japanese idols have attempted to exploit this nostalgia. During a brief period of fin-de-siècle optimism that coincided with neoliberal structural reforms under the government of Junichiro Koizumi, Morning Musume, the most popular female idol group at the time, had a hit single entitled "Love Machine" that ended the 1990s in Japan. The song's lyrics tie together dreams of life-long employment, romantic love, stable traditional families, and national resurgence, linking Japan's prosperity in the world at large to its internal social, emotional, and economic health. The song’s chorus declares, "The world will be envious of Japan's future!", although that future still has yet to materialize. In its place has appeared the "near-future" imaginary of Perfume. As mentioned above, the lyrics of some of their early songs referenced illusory virtual worlds that need to be destroyed or transcended. In their later works, these themes are continued in images of the bodies of the three performers augmented by technology in various ways, depicting the performers themselves as robots. Images of the three performers as robots are first introduced in the music video for their single "Secret Secret" (2007). At the outset of the video, three mannequins resembling Perfume are frozen on a futuristic TV soundstage being dressed by masked attendants who march off screen in lock step. The camera fades in and out, and the mannequins are replaced with the human members frozen in the same poses. Other attendants raise pieces of chocolate-covered ice cream (the music video also served as an advertisement for the ice cream) to the performers' mouths, which when consumed, activate them, launching them into a dance consisting of stilted, mechanical steps, and orthogonal arm positions. Later, one of the performers falls on stairs and appears to malfunction, becoming frozen in place until she receives another piece of ice cream. They are later more explicitly made into robots in the video for "Spring of Life" (2012), in which each of the three members are shown with sections of skin lifted back to reveal shiny, metallic parts inside. Throughout this video, their backs are connected to coiled cables hanging from the ceiling, which serve as a further visual sign of their robotic characters. In the same video, they are also shown in states of distress, each sitting on the floor with parts exposed, limbs rigid and performing repetitive motions, as though their control systems have failed. In their live shows, themes of augmentation are much more apparent. At a 2010 performance at the Tokyo Dome, which was awarded the jury selection prize in the 15th Japan Media Arts Festival by the Japanese Agency for Cultural Affairs, the centerpiece was a special performance entitled "Perfume no Okite" or "The Laws of Perfume." Like "Secret Secret," the performance begins with the emergence of three mannequins posed at the center of the stadium. During the introductory sequence, the members rise out of a different stage to the side. They begin to dance, synchronized to massively magnified, computer generated projections of themselves. 
The projections fluctuate between photorealistic representations of each member and ghostly CG figures consisting of oscillating lines and shimmering particles that perform the same movements. At the midpoint, the members each face their own images, and state their names and dates of birth before uttering a series of commands: "The right hand and right leg are together. The height of the hands must be precise. Check the motion of the fingers. The movement of the legs must be smooth. The palms of the hands must be here." With each command, the members move their own bodies mechanically, mirrored by the CG figures. After more dancing with their avatars, the performance ends with Perfume slowly lowered down on the platform at the center of the stage, frozen in the same poses and positions as the mannequins, which have now disappeared. These performances cleverly use images of robotic machinery in order to subvert Perfume's idol personas. The robotic augmentations are portrayed as vectors for control by some unseen external party, and each of the members must have their life injected into them through cables, ice cream, or external command, before they can begin to dance and sing as pop idols. Pop idols have always been manufactured products, but through such technological imagery Perfume make their own artificiality explicit, revealing to the audience that it is not the performers they love, but the emergent and contingently human forms of a social, technological, and commercial system that they desire. In this way, these images subvert the performers' charisma and idol fans' own feelings of adoration, revealing the premise of the idol system to have been manufactured to manipulate consumer affect and desire. If, as Iwabuchi suggests, some fans of idols are attracted to their stars by a sense of nostalgia for an age of economic prosperity, then Perfume's robotic augmentations offer a reflexive critique of this industrial form. In "The Laws of Perfume", the commands that comport their bodies may be stated in their own voices, yet they issue not from the members themselves, but their magnified and processed avatars. It is Perfume the commercial entity speaking. The malfunctioning bodies of Perfume depicted in "Secret Secret" and "Spring of Life" do not detract from their charisma as idols as an incident of public drunkenness might, because the represented breakdowns in their performances are linked not to the moral purity or professionalism of the humans, but to failures of the technological and economic systems that have supported them. If idols of a past age were defined by their seamless and idealized personas as entertainers and employees, then it is fitting that in an age of much greater economic and social uncertainty that they should acknowledge the cracks in the social and commercial mechanisms from which their carefully designed personas emerge. In these videos and performances, the visual trope of technological body augmentation serves as a means for representing both the dependence of the idol persona on consumer capitalism, and the fracturing of that system. However, they do not provide an answer to the question of what might lie beyond the fracturing. The only suggestions provided are the disappearance of that world, as in the end of "Computer City," or in the reproduction of the same structure, as when the members of Perfume become mannequins in "The Laws of Perfume" and "Secret Secret." 
Interestingly, it was with Perfume's management's decision to switch record labels and market Perfume to an international audience that Perfume became newly augmented, and a suggestion of an answer became visible. Perfume began their international push in 2012 with the release of a compilation album, "Love the World," and live shows and new media works in Asia and Europe. The album made their music available for purchase outside of Japan for the first time. Its cover depicts three posed figures computer rendered as clouds of colored dots produced from 3D scans of the members. The same scans were used to create 3D-printed plastic figures, whose fabrication process is shown in the Japanese television ad for the album. The robotic images of bodily augmentation have been replaced by a more powerful form of augmentation–digital information. The website which accompanied their international debut received the Grand Prix of the 17th Japan Media Arts Prize. Developed by Daito Manabe and Rhizomatiks, visitors to the Perfume Global website were greeted by a video of three figures composed of pulsating clouds of triangles, dancing to a heavy, glitch-laden electronic track produced by Nakata. Behind them, dozens of tweets about Perfume collected in real-time scroll across the background. Controls to the side let visitors change not only the volume of the music, but also the angle of their perspective, and the number and responsiveness of the pulsating polygons. The citation for the site's prize refers to the innovative participatory features of the website. Motion capture data from Perfume, music, and programming examples used to render the digital performance were made available for free to visitors, who were encouraged to create their own versions. This resulted in hundreds of fan-produced videos showing various figures, from animals and cartoon characters to swooshing multi-colored lines, dancing the same routine. Several of these were selected to be featured on the website, and were later integrated into the stage performance of the piece during Perfume's Asia tour. A later project extended this idea in a different direction, letting website visitors paint animations on computer representations of the members, and use a simple programming language to control the images. Many of these user creations were integrated into Perfume's 2013 performance at the Cannes Lions International Festival as advertising. Their Cannes performance begins with rapidly shifting computer graphics projected onto their costumes as they speak in unison, as though they are visitors from another realm: "We are Perfume. We have come. Japan is far to the east. To encounter the world, the three of us and everyone stand before you: to connect you with Japan, and to communicate with you, the world." The user-contributed designs were projected on to the members' costumes as they danced. This new mode of augmentation–through information rather than machinery–shows Perfume to be more than a representation of Japan's socio-economic transitions, but a live experiment in effecting these transitions. In their international performances, their bodies are synthesized in real-time from the performers' motions and the informatic layer generated from tweets and user-generated creations. This creates the conditions for fans to inscribe their own marks on to Perfume, transforming the emotional engagement between fan and idol into a technological linkage through which the idols’ bodies can be modified. 
Perfume’s augmented bodies are not just seen and desired, but made by their fans. The value added by this new mode of connection is imagined as the critical difference needed to transform Perfume from a local Japanese idol group into an entity capable of moving around the world, embodying the promise of a new global position for Japan enabled through information. In Perfume, augmentation suggests a possible answer to Japan’s economic stagnation and social fragmentation. It points past a longing for the past towards new values produced in encounters with the world beyond Japan. Augmentations newly connect Perfume and Japan with the world economically and culturally. At the same time, a vision of Japan emerges, more mobile, flexible, and connected perhaps, yet one that attempts to keep Japan a distinct entity in the world. Bodily augmentations, in media representations and as technological practices, do more than figuratively and materially link silicon and metal with flesh. They mark the interface of the body and technology as a site of transnational connection, where borders between the nation and what lies outside are made.
References
Aoyagi, Hiroshi. Islands of Eight Million Smiles: Idol Performance and Symbolic Production in Contemporary Japan. Cambridge, Mass.: Harvard University Press, 2005. Iwabuchi, Koichi. "Nostalgia for a (Different) Asian Modernity: Media Consumption of "Asia" in Japan." positions: east asia cultures critique 10.3 (2002): 547-573. Kondo, Dorinne K. Crafting Selves: Power, Gender and Discourses of Identity in a Japanese Workplace. Chicago and London: University of Chicago Press, 1990. Morning Musume. “Morning Musume ‘Love Machine’ (MV).” 15 Oct. 2010. 4 Dec. 2013 ‹http://www.youtube.com/watch?v=6A7j6eryPV4›. Perfume. “[HD] Perfume Performance Cannes Lions International Festival of Creativity.” 20 June 2013. 11 Nov. 2013 ‹http://www.youtube.com/watch?v=gI0x5vA7fLo›. ———. “[SPOT] Perfume Global Compilation “LOVE THE WORLD.”” 11 Sep. 2012. 11 Nov. 2013 ‹http://www.youtube.com/watch?v=28SUmWDztxI›. ———. “Computer City.” 18 June 2013. 10 Oct. 2013 ‹http://www.youtube.com/watch?v=jOXGKTrsRNg›. ———. “Electro World.” 18 June 2013. 10 Oct. 2013 ‹http://www.youtube.com/watch?v=8zh0ouiYIZc›. ———. “Perfume no Okite.” 8 May 2011. 10 Oct. 2013 ‹http://www.youtube.com/watch?v=2EjOistJABM›. ———. “Perfume Official Global Website.” 2012. 11 Nov. 2013 ‹http://perfume-global.com/project.html›. ———. “Secret Secret.” 18 Jan. 2012. 10 Oct. 2013 ‹http://www.youtube.com/watch?v=birLzegOHyU›. ———. “Spring of Life.” 18 June 2013. 10 Oct. 2013 ‹http://www.youtube.com/watch?v=7PtvnaEo9-0›. Yano, Christine. "Charisma's Realm: Fandom in Japan." Ethnology 36.4 (1997): 335-49.
APA, Harvard, Vancouver, ISO, and other styles
