To see the other types of publications on this topic, follow the link: Chart rendering.

Journal articles on the topic 'Chart rendering'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Select a source type:

Consult the top 50 journal articles for your research on the topic 'Chart rendering.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever these are available in the metadata.

Browse journal articles on a wide variety of disciplines and organise your bibliography correctly.

1

Liu, Tao, Depeng Zhao, and Mingyang Pan. "Generating 3D Depiction for a Future ECDIS Based on Digital Earth." Journal of Navigation 67, no. 6 (June 17, 2014): 1049–68. http://dx.doi.org/10.1017/s0373463314000381.

Full text
Abstract:
An Electronic Navigational Chart (ENC) is a two-dimensional abstraction and generalisation of the real world, and it limits users' ability to obtain richer, more realistic spatial information about the navigation environment. A three-dimensional (3D) chart, however, could dramatically reduce the number of human errors and improve the accuracy and efficiency of manoeuvring. It is therefore important to be able to visualize charts in 3D. This article proposes a new model for future Electronic Chart Display and Information Systems (ECDIS) and describes our approach to the construction of a web-based multi-resolution future ECDIS, implemented in our system Automotive Intelligent Chart (AIC) 3D ECDIS, including multi-resolution riverbed construction technology, multi-layer technology for data fusion, Mercator transformation of the model, rendering, and web publishing methods. AIC 3D ECDIS supports global spatial data and 3D visualization, merges the 2D vector electronic navigational chart with the three-dimensional navigation environment in a unified framework and interface, and is published on the web to provide application and data services through the network.
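The Mercator transformation of the model mentioned in this abstract can be sketched briefly. The snippet below is a minimal spherical (Web Mercator) projection of WGS84 coordinates in degrees; the function name `mercator` and the spherical approximation are illustrative assumptions, not the authors' implementation:

```python
import math

def mercator(lon_deg, lat_deg):
    # Spherical Web Mercator: project longitude/latitude (degrees)
    # onto a plane in metres. R is the WGS84 semi-major axis.
    R = 6378137.0
    x = R * math.radians(lon_deg)
    y = R * math.log(math.tan(math.pi / 4 + math.radians(lat_deg) / 2))
    return x, y

# Roughly the port of Shanghai; prints planar coordinates in metres.
x, y = mercator(121.5, 31.2)
print(round(x, 1), round(y, 1))
```

The projection is conformal, which is why chart symbology keeps its shape after the transform; latitudes near the poles diverge and are normally clamped to about ±85.05°.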
APA, Harvard, Vancouver, ISO, and other styles
2

Liu, Tao, De-Peng Zhao, Ming-Yang Pan, and Kun Bai. "Fusing Multiscale Charts into 3D ENC Systems Based on Underwater Topography and Remote Sensing Image." Mathematical Problems in Engineering 2015 (2015): 1–7. http://dx.doi.org/10.1155/2015/610750.

Full text
Abstract:
The purpose of this study is to propose an approach for fusing multiscale charts into three-dimensional (3D) electronic navigational chart (ENC) systems based on underwater topography and remote sensing images. This is the first time that the fusion of multiscale standard ENCs in a 3D ENC system has been studied. First, a view-dependent visualization technology is presented for determining the display condition of a chart. Second, a map sheet processing method is described for dealing with the map sheet splice problem: a process order called “3D order” is designed to suit the characteristics of the chart, a map sheet clipping process handles the overlap between adjacent map sheets, and our strategy for the map sheet splice is proposed. Third, the rendering method for ENC objects in the 3D ENC system is introduced. Fourth, our picking-up method for ENC objects is proposed. Finally, we implement the above methods in our system, Automotive Intelligent Chart (AIC) 3D Electronic Chart Display and Information Systems (ECDIS), which handles the fusion problem well.
3

Linge Johnsen, S. A., J. Bollmann, H. W. Lee, and Y. Zhou. "Accurate representation of interference colours (Michel-Lévy chart): from rendering to image colour correction." Journal of Microscopy 269, no. 3 (September 21, 2017): 321–37. http://dx.doi.org/10.1111/jmi.12641.

Full text
4

Field, Clive D. "Rendering unto Caesar?: The Politics of Church of England Clergy since 1980." Journal of Anglican Studies 5, no. 1 (June 2007): 89–108. http://dx.doi.org/10.1177/1740355307077935.

Full text
Abstract:
This is the first systematic attempt to chart the evolving political views of contemporary Church of England clergy. The article is based upon a comparative quantitative analysis and synthesis of eighteen national and four local surveys conducted between 1979 and 2004. Ministerial opinions on the state's influence on the Church and the Church's influence on the state are both considered. Ten specific conclusions are drawn. While the clergy generally cling to the concept of an Established Church, they are very critical of some of the traditional manifestations of that establishment. They also mostly think it highly appropriate for the Church to intervene in the world of party politics, and not simply on moral issues. In this they are positioned ahead of the thinking of many of the committed Anglican laity, for whom a degree of separation of religion and politics remains the ideal. The academic, ecclesiastical and political implications of these findings are briefly explored.
5

Zhao, Ding Xuan, Ying Zhao, and Ying Jie Li. "Design of Towing Operation Training Simulation System with Multi-Channel Display." Applied Mechanics and Materials 182-183 (June 2012): 1189–93. http://dx.doi.org/10.4028/www.scientific.net/amm.182-183.1189.

Full text
Abstract:
The construction of a towing operation training simulation system based on HLA (High Level Architecture) and the Vega Prime interface is analysed. A detailed object model for distributed simulation is designed, and the responsibilities of the federation and each federate are analysed. The flow chart of the simulation program, which runs HLA and Vega Prime simultaneously, is also designed. In the multithreaded program, the Vega Prime rendering cycle runs as the main thread, with the HLA time-cycle thread embedded in it. A multi-channel display system is also used to improve the visual effect of the training system.
6

Wintroub, Michael. "The Translations of a Humanist Ship Captain: Jean Parmentier’s 1529 Voyage to Sumatra*." Renaissance Quarterly 68, no. 1 (2015): 98–132. http://dx.doi.org/10.1086/681309.

Full text
Abstract:
This article uses the multifaceted meaning of the word 'translation' as an analytic key to understand and analyze social and epistemic change in early modern France. Translation is here taken to mean not simply the rendering of one language into another, but also a physical displacement from one location to another, and the movement of knowledge and expertise from one discipline (and social status) to another. The manifold ideas and practices associated with translation are used to chart the voyage of Jean Parmentier, a humanist, poet, and ship captain from Dieppe, as he guided his ship and crew to Sumatra in 1529.
7

Alavi, Seyyed Mohammad, and Ali Panahi Masjedlou. "Construct Under-representation and Construct Irrelevant Variances on IELTS Academic Writing Task 1: Is There Any Threat to Validity?" Theory and Practice in Language Studies 7, no. 11 (November 1, 2017): 1097. http://dx.doi.org/10.17507/tpls.0711.19.

Full text
Abstract:
The study reports on the validity of IELTS Academic Writing Task One (IAWTO) and compares and assesses the performance descriptors, i.e., coherence and cohesion, lexical resource, and grammatical range, employed on IAWTO and IELTS Academic Writing Task Two (IAWTT). To these ends, the data used were 53 participants' responses to graphic prompts driven by IELTS scoring rubrics, a descriptive prompt, and retrospective, rather than concurrent, think-aloud protocols for detecting the cognitive validity of responses. The results showed that the IAWTO input was degenerate and insufficient, rendering the construct under-represented, i.e., narrowing the construct. It was also found that IAWTO reflected the cognitive difficulty of diagram analysis and the intelligence-based design of the process chart, rather than the bar chart, and was thus associated with construct-irrelevant variances; this is argued to bias one group, leading to under-performance of one group in marked contrast to over-performance of another. In addition, qualitative results based on instructors' protocols suggested the dominance of the performance descriptors on IAWTT rather than on IAWTO. The pedagogical implications of this study are also discussed.
8

Liu, Wen Tao. "Applied Information Technology in Graphics Algorithm Implementation Based on the Web Canvas." Advanced Materials Research 908 (March 2014): 543–46. http://dx.doi.org/10.4028/www.scientific.net/amr.908.543.

Full text
Abstract:
The canvas element in HTML5 provides a very powerful means of realizing complex graphics and image operations. It can be used for developing graphics-oriented applications such as chart modules, scientific computing visualization, drawing software, animation systems, and information visualization. Because HTML5 is platform independent, the canvas will also play an important role in web games for PC and mobile platforms. In this paper, a system for presenting graphics algorithms is provided; it uses the prototype object and image data structure of the canvas 2D rendering context to define custom functions that implement the graphics algorithms. The system demonstrates the realization of the principles of graphics algorithms and shows that the canvas has great flexibility and scalability.
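As a rough sketch of the kind of pixel-level routine this paper implements over the canvas context's image data, the following function rasterizes a line with Bresenham's algorithm into a plain pixel set standing in for the canvas pixel buffer (the function name and buffer representation are assumptions for illustration, not the paper's code):

```python
def draw_line(width, height, x0, y0, x1, y1):
    """Rasterize a line with Bresenham's algorithm, analogous to writing
    into a canvas 2D context's ImageData array. Returns the lit pixels."""
    pixels = set()
    dx, dy = abs(x1 - x0), -abs(y1 - y0)
    sx = 1 if x0 < x1 else -1
    sy = 1 if y0 < y1 else -1
    err = dx + dy  # running error term of the integer DDA
    x, y = x0, y0
    while True:
        if 0 <= x < width and 0 <= y < height:  # clip to the buffer
            pixels.add((x, y))
        if x == x1 and y == y1:
            break
        e2 = 2 * err
        if e2 >= dy:
            err += dy
            x += sx
        if e2 <= dx:
            err += dx
            y += sy
    return pixels

# A shallow diagonal from (0, 0) to (6, 2):
print(sorted(draw_line(8, 8, 0, 0, 6, 2)))
```

The same integer-only loop transfers directly to a custom function over `ImageData`, which is the flexibility the abstract attributes to the canvas 2D model.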
9

Joswig, Holger, John P. Girvin, Warren T. Blume, Jorge G. Burneo, and David A. Steven. "Awake perimetry testing for occipital epilepsy surgery." Journal of Neurosurgery 129, no. 5 (November 2018): 1195–99. http://dx.doi.org/10.3171/2017.6.jns17846.

Full text
Abstract:
In the literature, there are few reports that provide a detailed account on the technique of visual electrocortical stimulation in the setting of resective surgery for occipital epilepsy. In this technical note, the authors describe how a 26-year-old male with long-standing occipital epilepsy underwent resective surgery under awake conditions, using electrocortical stimulation of the occipital lobe, with the aid of a laser pointer and a perimetry chart on a stand within his visual field. The eloquent primary visual cortex was found to overlap with the seizure onset zone that was previously determined with subdural electrodes. A maximum functionally safe resection was performed, rendering the patient seizure free as of his last follow-up at 20 months, with no visual field impairment.
10

Bhusal, Pramod, and Rajendra Dangol. "Performance of different metrics proposed to CIE TC 1-91." International Journal of Sustainable Lighting 19, no. 2 (December 29, 2017): 91. http://dx.doi.org/10.26607/ijsl.v19i2.36.

Full text
Abstract:
The main aim of this article is to assess the performance of the different metrics proposed to CIE TC 1-91. Currently, six indexes have been proposed to CIE TC 1-91: the Colour Quality Scale (CQS), the Feeling of Contrast Index (FCI), the memory colour rendering index (MCRI), preference of skin (PS), the relative gamut area index (RGAI), and the Illuminating Engineering Society method for evaluating light source colour rendition (IES TM-30). The evaluation and analysis are based on a previously conducted experiment in a lighting booth. The analysis showed that the area-based metric FCI was a good indicator of subjective preference. Subjective preference was measured in terms of the naturalness of objects, the colourfulness of a colour checker chart, and the visual appearance of the lit scene in the booth.
11

Namba, Hidetsugu, Hidetoshi Hiyama, Atsushi Kuroda, Masumi Arakawa, Kaori Tokunoh, and Koichi Ikeda. "Comparison between the appearance of test colour chart for specifying colour rendering properties in JIS and the metric quantities of uniform colour spaces." JOURNAL OF THE ILLUMINATING ENGINEERING INSTITUTE OF JAPAN 80, Appendix (1996): 235–36. http://dx.doi.org/10.2150/jieij1980.80.appendix_235.

Full text
12

Shpak, S., V. Martirosova, T. Sakhno, and G. Kozhushko. "DIRECTIONS FOR IMPROVEMENT OF STANDARDS ON LED TECHNIQUE AND LIGHTING WITH ITS USE." Municipal economy of cities 1, no. 154 (April 3, 2020): 57–66. http://dx.doi.org/10.33042/2522-1809-2020-1-154-57-66.

Full text
Abstract:
One of the main tasks of high-quality lighting is to provide comfortable visual work and adequate perception of illuminated objects by producing light with a wide range of correlated colour temperatures and high colour rendering quality. The paper analyzes the shortcomings of national regulatory documents on establishing tolerances for the colour of lamps and fixtures using LEDs and on evaluating their colour reproduction quality. Instead of using MacAdam ellipses on the CIE 1931 (x, y) chromaticity chart to establish the colour requirements, it is recommended to use circles on the CIE 1976 (u', v') diagram, and to evaluate colour rendering quality using the CRI method, additionally applying the CQS and TM-30-18 methods. Because the spatial colour of LED luminaires can be inhomogeneous, it is recommended to indicate both the average colour and the colour in a certain direction, as well as an indicator of colour heterogeneity. The necessity of developing standards for protection against unwanted non-visual biological effects, as well as other negative effects of artificial light, in particular excessive brightness, pulsation of the light flux, and photobiological hazards, is substantiated. Considering the importance of the influence of light pulsation on the quality of lighting, until CIE or IEC standards are developed it is recommended that national standards for LED lamps and luminaires include requirements for the description of pulsation parameters and measurement methods in accordance with the recommendations of the IEEE 1789-2015 standard, which is the most advanced today. For the design of ergonomic lighting, it is proposed to provide information on lamps and luminaires related to their maximum brightness.
When developing new criteria for lighting that take the visual impact into account, the knowledge accumulated to date can already be used, in particular for the creation of biologically active and biologically dimmed light by changing the spectral composition of radiation and illumination. Proposals concerning limits on correlated colour temperature, brightness, and pulsation level for the lighting of children's and educational institutions, residential premises, and the like are also considered. Recommendations on the development of new national standards of Ukraine harmonized with international ones are also provided.
Keywords: colour rendering index, percent flicker, correlated colour temperature, illumination, photobiological safety
13

Reeder, Allison, Ricardo Aulet, Mirabelle Sajisevi, and William Brundage. "Feasibility of In-office Fine-Needle Aspiration for Base of Tongue Tumors." Otolaryngology–Head and Neck Surgery 163, no. 4 (June 30, 2020): 849–51. http://dx.doi.org/10.1177/0194599820935454.

Full text
Abstract:
We aim to demonstrate the feasibility of in-office transcervical ultrasound (TCUS)–guided fine-needle aspiration (FNA) of base of tongue (BOT) tumors at a single institution. A retrospective chart review was performed, and 3 patients met criteria, with BOT tumors ≥3 cm. Two patients had no cervical adenopathy, while FNA of a cervical lymph node was inconclusive in patient 3. Two patients had multiple medical comorbidities rendering them high risk for general anesthesia, and 1 patient had a BOT tumor obscuring visualization of the glottis, which would have precluded intubation and potentially required tracheostomy to proceed. All patients underwent successful in-office TCUS-guided FNA, with results showing squamous cell carcinoma. There were no related complications. In-office TCUS-guided FNA can be used for diagnosis of BOT lesions that are evident on ultrasound. This is beneficial in cases where general anesthesia is considered high risk. Additionally, 1 patient safely continued anticoagulation, and another was able to avoid tracheostomy. This technique is cost-effective as it avoids the expenses associated with operative intervention.
14

Alkhawam, H., A. Al-khazraji, S. Ahmad, JJ Lieber, R. Madanieh, TJ Vittorio, and M. El-Hunjul. "ID: 3: MORBIDITY AND MORTALITY OF CONGESTIVE HEART FAILURE IN TRAUMA PATIENTS: A RETROSPECTIVE CHART ANALTSIS." Journal of Investigative Medicine 64, no. 4 (March 22, 2016): 920. http://dx.doi.org/10.1136/jim-2016-000120.19.

Full text
Abstract:
Background: Cardiovascular morbidity and mortality in heart failure (HF) patients comprise a major health and economic burden, especially when readmission rate and length of stay are considered. With increasing life expectancy, HF prevalence continues to increase. Diseases such as diabetes mellitus, hypertension, and ischemic heart disease continue to be the leading causes of HF. Current data suggest that HF is the most common cause of hospital admission in patients older than 65 years. Objective: In this study, we sought to compare the morbidity, mortality, 30-day readmission rate, and length of stay in trauma patients who have a pre-existing history of HF with those who do not. Additionally, we examine the effect of different cardiac variables in the HF group, such as the pathophysiology of HF (HF with preserved ejection fraction [HFpEF] vs. HF with reduced ejection fraction [HFrEF]) and the etiology of HFrEF (ischemic vs. nonischemic). Methods: A retrospective chart analysis of 8,137 patients admitted to our hospital between 2005 and 2013 secondary to trauma with an Injury Severity Score <30. Data were extracted using ICD-9 codes. Neurotrauma patients were excluded. Results: Of 8,137 trauma patients, 334 had pre-existing HF, of which 169 had HFpEF and 165 had HFrEF. Of the 165 HFrEF cases, 121 were ischemic in etiology vs. 44 nonischemic. Of the 334 patients, 81 (24%) were readmitted within 30 days vs. 1,068 (14%) of the non-HF patients (95% CI 1.52–2.25, RR 1.85, p<0.0001). Of the 81 readmitted HF patients, 64 had HFpEF while 35 had HFrEF. There was no statistically significant difference in any of the endpoints between the HFpEF and HFrEF groups (figure 1 and table 1). Mortality, 30-day readmission, and length of stay were all significantly higher in the ischemic vs. non-ischemic HFrEF group (figure 1 and table 2). Conclusions: In our trauma population, HF patients had significantly higher morbidity, mortality, and 30-day readmission rates than non-HF patients. The pathophysiology of HF (HFpEF vs. HFrEF) did not appear to play a role. However, after subgroup analysis of the HFrEF group by etiology, all endpoints including mortality, readmission, and length of stay were significantly higher in the ischemic HFrEF subgroup, lending this entity greater importance when treating trauma patients with pre-existing HF. Abstract ID: 3 Figure 1
15

Kapur, Ajay, David Barbee, Yijan Cao, Abolghassem Jamshidi, and Louis Potters. "Policy and procedure manuals: Does existence imply standardization in practice?" Journal of Clinical Oncology 32, no. 30_suppl (October 20, 2014): 275. http://dx.doi.org/10.1200/jco.2014.32.30_suppl.275.

Full text
Abstract:
275 Background: Patient safety organizations recommend the use of policy and procedure manuals (PPM) to effectuate standards and guidelines in practice. In this work we explored inter-physicist variability in the interpretation of five established physics policies amongst physicists in our department. Methods: The policies for treatment planning, 2nd physics checks, 1st day physics checks, weekly chart checks and final physics checks were reviewed by members of the quality management team in our department. Specific definitive statements were extracted into a spreadsheet and provided to 11 physicists and 6 dosimetrists. The intent was to obtain individual responses on adherence to the statements and thereby assess the level of standardization in perception and practice amongst the staff using free-margin kappa statistics. The responses were limited to affirmation, rejection or non-applicability. A total of 732 responses were assessed. Results: Based on the Landis-and-Koch criteria for interpretation of kappa values, the consistency amongst the respondents varied from moderate (0.40-0.60) to good (0.60-0.80). The kappa scores for the statements assessed were 0.56 for treatment planning, 0.64 for second physics check, 0.70 for the first day physics check, 0.54 for the weekly physics check and 0.73 for the final physics check. Conclusions: Validating the effectiveness of PPMs by measuring uniformity in staff interpretation is an important step in establishing their effectiveness. Mere existence of a PPM may not be sufficient. This work demonstrated reasonable uniformity of interpretation of existing policies, but underscored the need for further improvement. The review of specific policy statements with weaker consensus may lead to more effective revisions and in-servicing to enhance clarity and reduce ambiguity, re-testable using the same approach. Absence of validation will tend to retain ambiguity, thereby rendering the policy not as effective as it could be.
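The free-margin kappa statistic used in the abstract above can be sketched as follows. This is Randolph's free-marginal multirater kappa, assuming an equal number of raters per item; the function name and data layout are illustrative assumptions, and the abstract does not specify which exact free-marginal variant was computed:

```python
def free_marginal_kappa(ratings, n_categories):
    """Randolph's free-marginal multirater kappa.

    ratings: list of items, each a list of category labels assigned by
    the raters (same number of raters per item is assumed).
    """
    n_items = len(ratings)
    n_raters = len(ratings[0])
    # Observed agreement: per item, the proportion of agreeing rater pairs.
    p_o = 0.0
    for item in ratings:
        counts = {}
        for label in item:
            counts[label] = counts.get(label, 0) + 1
        p_o += sum(c * (c - 1) for c in counts.values()) / (n_raters * (n_raters - 1))
    p_o /= n_items
    # Expected agreement under free marginals is simply 1/k.
    p_e = 1.0 / n_categories
    return (p_o - p_e) / (1 - p_e)

# Two raters agree on both items, so kappa is 1.0.
print(free_marginal_kappa([["yes", "yes"], ["no", "no"]], n_categories=2))  # → 1.0
```

Values in the 0.40–0.60 and 0.60–0.80 bands then map to the "moderate" and "good" Landis-and-Koch labels cited in the abstract.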
16

Sundi, Debasish, Jason Cohen, Alexander Cole, Brian Neuman, Jack Cooper, Farzana Faisal, Marian Raben, et al. "Multidisciplinary clinic evaluation changes prostate cancer stage and risk stratification." Journal of Clinical Oncology 32, no. 4_suppl (February 1, 2014): 91. http://dx.doi.org/10.1200/jco.2014.32.4_suppl.91.

Full text
Abstract:
91 Background: The use of multidisciplinary clinics (MDCs) for outpatient cancer evaluation is increasing. Data on whether MDCs improve prostate cancer (PCa) care are limited. We studied the frequency of changes in PCa grade and stage upon MDC evaluation. Methods: Between May 2008 and December 2012, 887 consecutive patients underwent consultation for newly diagnosed prostate cancer at the Johns Hopkins Hospital (JHH) MDC, which features real-time collaboration among urologists, radiation oncologists, and medical oncologists. Retrospective chart review identified presenting tumor characteristics, based on outside assessment (medical records sent upon referral to MDC), as compared with disease stage and grade as determined at MDC evaluation. All outside biopsy slides were reviewed by JHH pathologists, and all outside imaging (CT, MRI, bone scan) was reviewed by JHH radiologists. Results: The three most chosen treatments after MDC evaluation were external beam radiotherapy +/- androgen deprivation (39.3%), radical prostatectomy (32.0%) and active surveillance/expectant management (12.9%). Using the NCCN guidelines as a benchmark, many men were found to have undergone non-indicated imaging (bone scan 23.9%, CT/MRI 47.4%). Overall, 186/647 (28.7%) had a change in their NCCN risk classification or N or M stage. For example, 2.9% of men were down-classified as very-low-risk, rendering them eligible for active surveillance. 5.7% of men thought to have localized cancer were up-classified as metastatic, thus prompting systemic management approaches. Conclusions: Comprehensive evaluation of prostate cancer patients in a MDC is associated with critical changes in presenting disease classification from baseline in over one in four men. While questions about the long term costs and benefits of MDCs remain, these results lend credence to the growing belief that MDCs may dramatically impact management for a large number of men with prostate cancer. [Table: see text]
17

Snider, C., W. Chernomas, K. Cook, D. Jiang, T. Klassen, S. Logsetty, J. Mahmood, E. Mordoch, and T. Strome. "MP010: Wraparound care for youth injured by violence: a randomized control trial." CJEM 18, S1 (May 2016): S69. http://dx.doi.org/10.1017/cem.2016.151.

Full text
Abstract:
Introduction: Youth injured by violence is a major public health concern in Canada. It is the fourth leading cause of death in youth and the foremost reason youth visit an emergency department (ED). In Winnipeg, 20% of youth who visit an ED with an injury due to violence will have an ED visit for a subsequent violent injury within one year. Youth injured by violence are in a reflective and receptive state of mind, rendering the ED setting appropriate for intervention. Methods: We completed a randomized control trial in November 2015 comparing wraparound care for youth age 14 - 24 who were injured by violence to standard ED care. Youth were excluded if their injury was due to child maltreatment, sexual assault or self-harm. An adapted pre-consent randomization methodology was used. The intervention was developed using a community based participatory research approach. Wraparound care was delivered by a support worker with lived experience with violence. Support workers were on call 24/7 in order to start the intervention in the ED and take advantage of the “teachable moment.” Care continued in the community for approximately one year. Results: A total of 133 youth were randomized (68 intervention, 65 control) in one year. There was no difference in age, gender, or severity of injury between the two groups. Patients randomized to the intervention spent a median of 30 minutes less in the ED than those receiving standard care (p=0.22). Youth are safely housed, have enrolled in education opportunities, and are engaged in addictions care. Results of a chart review examining repeat visits to the ED for violent injury, substance use and mental health will be completed in Spring 2016 and will be presented. Conclusion: There were no differences between standard care and intervention groups on baseline characteristics reflecting effective randomization. The introduction of an intervention at bedside in the ED did not have a negative impact on patient length of stay.
18

Abdelsattar, Zaid M., Sandra L. Wong, Nancy J. Birkmeyer, Robert K. Cleary, Melissa L. Times, Ryan E. Figg, Nanette Peters, Robert W. Krell, Darrell A. Campbell, and Samantha K. Hendren. "Multi-institutional assessment of sphincter preservation for rectal cancer." Journal of Clinical Oncology 32, no. 3_suppl (January 20, 2014): 566. http://dx.doi.org/10.1200/jco.2014.32.3_suppl.566.

Full text
Abstract:
566 Background: Rates of sphincter preserving surgery (SPS) have been proposed as a quality measure for rectal cancer (RC) surgery. However, administrative and registry-based SPS rates often lack critical patient and tumor characteristics, rendering it unclear if variations in SPS rates are due to unmeasured case-mix differences or selection criteria. The aim of this study was to determine whether hospitals’ SPS rates differ after accounting for clinical characteristics. Methods: As part of a RC quality project, 10 hospitals in the Michigan Surgical Quality Collaborative retrospectively collected RC-specific data from 2007-2012. We assessed for SPS predictors using multivariable regression. Patients were categorized as “definitely SPS eligible” a priori if they did not have any of the following: poor sphincter control, stoma preference, sphincter involvement, tumor <6 cm from the anal verge (an intentionally conservative cutoff) or metastatic disease. We compared hospital performance with and without these clinical data using Spearman’s correlations. Results: In total, 349 patients underwent surgery for RC in 10 hospitals (5/10 high volume and 6/10 major teaching). Of those, 74% had SPS (range by hospital 50%-91%). On multivariable analysis, only pre-op radiation, tumor location, hospital teaching status and hospital ID were independent predictors of SPS, but not age, sex, BMI, AJCC stage, ASA class, or hospital CRC surgery volume. Analyses of the “definitely eligible” patients revealed an overall SPS rate of 88% (65-100%). Hospital SPS rankings using crude versus clinically-adjusted SPS rates proved to be highly correlated (Spearman’s ρ= 0.9). Tumor locations suggest differing selection criteria for SPS in different hospitals (Table). Conclusions: Rates of SPS vary by hospital, even after correcting for clinical characteristics using detailed chart review. 
These data suggest missed opportunities for SPS, and refute the general hypothesis that hospital variation in SPS rates in previous studies is due to unmeasured case-mix differences. [Table: see text]
19

Sparber, Cornelia, Anna Sophie Berghoff, Margaretha Rudas, Peter Christian Dubsky, Catharina DeVries, Christoph Minichsdorfer, Claudia Sattlberger, et al. "Low HER2-expression to predict impaired activity of endocrine therapy in patients with estrogen-receptor (ER) positive metastatic breast cancer (MBC)." Journal of Clinical Oncology 31, no. 15_suppl (May 20, 2013): 573. http://dx.doi.org/10.1200/jco.2013.31.15_suppl.573.

Full text
Abstract:
573 Background: ER cross-activation by growth-factor signalling causes resistance to endocrine therapy (ET) in patients (pts) with Her2-positive MBC. Moreover, low levels of Her2-expression (Her2 1+; Her2 2+ without gene amplification), may result in reduced efficacy of ET in early breast cancer pts. In a recently published study, these tumours had a less favourable outcome as compared to tumours with a Her2-score of 0. Here, we investigated if low levels of Her2-expression could predict for shorter progression-free survival (PFS) in MBC pts on ET. Methods: PFS on first-line ET was chosen as primary endpoint and estimated with the Kaplan-Meier method. To test for differences between PFS curves, the log-rank test was used. Association of the following variables with PFS was investigated: low Her2-expression, grading, level of ER-expression, progesterone-receptor status, Ki67 (cut-off ≤20%), prior adjuvant ET, and presence of visceral metastases. For an estimated superiority of 40% in terms of PFS in favour of the Her2-negative group, a sample of 130 pts in two groups was needed in order to rule out the null-hypothesis with a 80% power and a two-sided α of 0.025. Results: A total number of 320 ER-positive MBC pts were identified from a breast cancer database; 170 pts were available for this analysis. Median PFS on first-line ET was 11 months (m) (8.56-13.44), corresponding numbers for second-line were 6 m (4.65-7.36), and third-line 4 m (1.52-6.48), respectively; median OS from diagnosis of MBC was 58 m (48.15-67.86). None of the variables investigated were significantly associated with first-line PFS. Second-line PFS, however, was significantly shorter in pts with grade 3 tumours and prior adjuvant ET; a trend towards shorter PFS was observed in high proliferating tumours. 
Conclusions: In this chart review, low levels of Her2 expression did not predict for shorter PFS in pts receiving ET; PFS in different treatment lines was well in line with data from clinical trials. High tumour grading and prior adjuvant ET were associated with accelerated onset of resistance, rendering those patients candidates for early combination of ET with targeted treatment approaches.
20

Trautner, T., and S. Bruckner. "Line Weaver: Importance‐Driven Order Enhanced Rendering of Dense Line Charts." Computer Graphics Forum 40, no. 3 (June 2021): 399–410. http://dx.doi.org/10.1111/cgf.14316.

21

Menon, Sudha, Sandesh O., Debish Anand, and Girish Menon. "Spheno-Orbital Meningiomas: Optimizing Visual Outcome." Journal of Neurosciences in Rural Practice 11, no. 03 (May 11, 2020): 385–94. http://dx.doi.org/10.1055/s-0040-1709270.

Abstract:
Background Spheno-orbital meningiomas (SOMs) constitute a rare cause of orbital proptosis and visual impairment. This study aims to share our outcome experience with regard to vision and exophthalmos following the surgical management of 17 patients with SOM. Methods Retrospective analysis of the case records of all surgically treated SOMs in the last 10 years. The exophthalmos index (EI) was calculated based on preoperative magnetic resonance imaging/computed tomography imaging. Vision was assessed using the Snellen chart and the Goldmann perimeter. Orbital volume was calculated using three-dimensional volume-rendering-assisted region-of-interest computation. Preoperative duration of symptoms and extent of surgery were the other predictors analyzed. Results Patients' ages ranged from 17 to 72 years (mean, 50.57 years; median, 50.0 years). Women represented 13 (76.4%) of the study group. Proptosis (14/17; 82.4%) and visual impairment (14/17; 82.3%) were the two most common presenting complaints, followed by headache (12/17; 70.1%). Gross total resection (GTR) was achieved in only 2 of the 17 patients (11.8%). The majority of the tumors were benign World Health Organization Grade I meningiomas (14/17; 84%). Mean follow-up time for the entire cohort was 56 months. Postoperatively, proptosis improved in nine patients (64.3%) and remained static in the remaining five (35.7%). Four patients (28.6%) improved in vision following surgery. Vision remained static in eight patients (57.1%) and deteriorated in two (14.3%) who had severe preoperative visual deficits. New-onset oculomotor palsy, trigeminal dysfunction, and mechanical ocular motility restriction were noticed in three (17.6%), two (11.2%), and six (35.3%) patients, respectively. The mean preoperative orbital volume was 21.68 ± 3.2 cm³ and the mean postoperative orbital volume was 23.72 ± 3.4 cm³. Orbital volume was inversely related to EI.
Optic canal (OC) deroofing and extensive orbital wall decompression facilitated visual improvement and proptosis reduction. None of the variables including orbital volume proved to be statistically significant in predicting outcome. Conclusion SOMs constitute a rare subgroup of skull base meningiomas that pose considerable surgical challenges. A surgical strategy aimed at safe maximal resection rather than aggressive GTR provides favorable outcome with less morbidity. Adequate bony decompression of the orbital walls and OC provides satisfactory improvements in proptosis and vision. Residual disease is common, but the risk of symptomatic recurrence is low especially when combined with adjuvant radiotherapy. Visual outcome is likely to be poor in patients presenting with severely compromised vision.
22

Lee, Raphael C., Gregory Kieska, and Mahesh H. Mankani. "A Three-Dimensional Computerized Burn Chart: Stage I: Development of Three-Dimensional Renderings." Journal of Burn Care & Rehabilitation 15, no. 1 (January 1994): 80–83. http://dx.doi.org/10.1097/00004630-199401000-00015.

23

Sintari, Made Nita. "Hospital’s Performance with Malcolm Baldrige Method." Journal of Public Health Research and Community Health Development 3, no. 2 (February 29, 2020): 108. http://dx.doi.org/10.20473/jphrecode.v3i2.13419.

Abstract:
Background: The Malcolm Baldrige method has been widely used to assess institutional performance. However, data obtained from the assessment are still visualized in the form of tables. This kind of visualization is less engaging and makes it difficult for readers to interpret the data. Aims: This study presented recommendations for visualizing the data in various forms, such as charts and diagrams, to facilitate interpretation. Methods: This study used Microsoft Excel 2007 to create the charts. The data used as an example for analysis were from the performance assessment using the Malcolm Baldrige method at Muhammadiyah Gresik Hospital in June 2012. Results: The results of this study showed that data from the Malcolm Baldrige method could be visualized in the form of bar charts, radar charts, and pie charts. Conclusion: Performance assessment can be visualized not only in tables but also in charts and diagrams, which offer a more visual presentation. Keywords: Charts, performance, Malcolm Baldrige, visualization
24

Gao, Yiming, and Jiangqin Wu. "GAN-Based Unpaired Chinese Character Image Translation via Skeleton Transformation and Stroke Rendering." Proceedings of the AAAI Conference on Artificial Intelligence 34, no. 01 (April 3, 2020): 646–53. http://dx.doi.org/10.1609/aaai.v34i01.5405.

Abstract:
The automatic style translation of Chinese characters (CH-Char) is a challenging problem. Unlike English or general artistic style transfer, Chinese characters contain a large number of glyphs with complicated content and characteristic styles. Early methods for CH-Char synthesis are inefficient and require manual intervention. Recently, some GAN-based methods have been proposed for font generation. Supervised GAN-based methods require numerous image pairs, which are difficult to obtain for many chirography styles, while unsupervised methods often produce blurred and incorrect strokes. Therefore, in this work, we propose a three-stage Generative Adversarial Network (GAN) architecture for multi-chirography image translation, divided into skeleton extraction, skeleton transformation, and stroke rendering, trained with unpaired data. Specifically, we first propose a fast skeleton extraction method (ENet). Secondly, we utilize the extracted skeleton and the original image to train a GAN model, RNet (a stroke rendering network), to learn how to render the skeleton with stroke details in the target style. Finally, the pre-trained RNet is employed to assist another GAN model, TNet (a skeleton transformation network), in learning to transform the skeleton structure on the unlabeled skeleton set. We demonstrate the validity of our method on two chirography datasets we established.
25

Borowski, Sylwester, Anna Mazurkiewicz, and Magda Czyżewska. "The method of assessment and formation the availability of the subsystem ensuring fitness for means of transport." MATEC Web of Conferences 302 (2019): 01003. http://dx.doi.org/10.1051/matecconf/201930201003.

Abstract:
In transport operation systems, the appropriate completion of assigned transportation tasks requires maintaining a sufficient number of means of transport in a state of availability (roadworthy and stocked). In general, the processes of rendering vehicles roadworthy involve supplying them with fuel and operational materials, carrying out services and repairs, and diagnosing their condition. In the analyzed system of transport means operation, these processes are carried out in the serviceability assurance subsystem (SAS). In complex operation systems, the processes of rendering technical objects roadworthy are carried out at specially designed technical infrastructure posts. The ability to carry out the assigned service and repair tasks depends on the availability and number of such posts. The article presents a method of determining the operational availability and the number of technical infrastructure posts required for the appropriate completion of assigned service and repair tasks. Typical calculation results are then presented in charts prepared on the basis of data obtained from tests of an existing transport means operation system.
26

Yang, Mina. "Yellow Skin, White Masks." Daedalus 142, no. 4 (October 2013): 24–37. http://dx.doi.org/10.1162/daed_a_00232.

Abstract:
Ethnic studies scholars have long bemoaned the near absence of Asians on the big and small screens and popular music charts in the United States, rendering them as outsiders vis-à-vis the American public sphere. In the last few years, however, Asians have sprung up on shows like “Glee” and “America's Best Dance Crew” in disproportionately large numbers, challenging entrenched stereotypes and creating new audiovisual associations with Asianness. This essay considers how emerging Asian American hiphop dancers and musicians negotiate their self-representation in different contexts and what their strategies reveal about the postmillennial Asian youth's relationship to American and transpacific culture and the outer limits of American music.
27

Migawa, Klaudiusz, Maciej Woropay, Maciej Gniot, and Monika Salamońska. "Method of Determining the Required Number of Technical Backup Area Posts." Journal of KONBiN 44, no. 1 (December 1, 2017): 81–96. http://dx.doi.org/10.1515/jok-2017-0062.

Abstract:
In complex operation systems, the processes of rendering technical objects roadworthy are carried out at specifically designed technical backup area posts. The article presents a method of determining the number of technical backup area posts required for the appropriate completion of assigned service and repair tasks. Typical calculation results are then presented in charts prepared on the basis of data obtained from tests of an existing transport means operation system. The presented method makes it possible to analogically determine the minimum required number of posts for carrying out the assigned service and repair tasks, whether for a subsystem comprising a group of units, a given group of posts, or an individual post in the traffic maintenance and intervention subsystems.
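As a rough illustration of the sizing question this abstract addresses (not the authors' actual model, which is built from operational test data), steady-state availability and a naive post-count bound might be sketched as:

```python
import math

def operational_availability(mtbf_h, mttr_h):
    """Steady-state availability: the long-run fraction of time an
    object is fit for tasking (textbook MTBF/MTTR form; the paper's
    own method is derived from test data and may differ)."""
    return mtbf_h / (mtbf_h + mttr_h)

def posts_required(arrivals_per_h, service_h, availability=1.0):
    """Naive lower bound on the number of service posts: offered
    workload divided by the effective capacity of one post.
    Illustrative only -- a queueing model would add waiting effects."""
    return math.ceil(arrivals_per_h * service_h / availability)

# Illustrative numbers: 90 h mean time between failures, 10 h mean repair,
# 3 vehicles/h arriving for a 1.5 h service
print(operational_availability(90, 10))   # fraction of time available
print(posts_required(3, 1.5))             # minimum posts, full availability
```

This kind of workload-over-capacity bound is only a starting point; the paper's contribution is determining the required post count from measured data of a real operation system.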
28

Ross, Stephen. "Beckett's Molloy, the Promise of Youth, and the Postwar." Modernist Cultures 16, no. 3 (August 2021): 385–407. http://dx.doi.org/10.3366/mod.2021.0340.

Abstract:
This paper argues that Samuel Beckett's Molloy charts the historical transition of the postwar moment in generational terms – terms that themselves had real historical significance in the years immediately following the Allied victory in World War II – ultimately, and uncharacteristically, advancing a youthful figure of promise in young Jacques Moran, Jr. Though Beckett is much more commonly read as an allegorist of existential ambivalence – if not despair – I contend that his first postwar novel must properly be understood as staging history in its confused, agonistic, and frustrating family dynamics. Beckett's rendering of authority, history, and hierarchy in terms of perversion, queerness, and sterility ultimately preserves a futurity centred not precisely on the child per se, but on the emergent figure of the adolescent – the teenager, even, I will venture to claim – avant la lettre: Jacques Moran, Jr.
29

Miraskari, Mohammad, Farzad Hemmati, MY Alqaradawi, and Mohamed S. Gadala. "Linear stability analysis of finite length journal bearings in laminar and turbulent regimes." Proceedings of the Institution of Mechanical Engineers, Part J: Journal of Engineering Tribology 231, no. 10 (February 6, 2017): 1254–67. http://dx.doi.org/10.1177/1350650117691697.

Abstract:
Dynamic coefficients of a finite-length journal bearing are numerically calculated under laminar and turbulent regimes based on the Ng–Pan–Elrod and Constantinescu models. Linear stability charts of a flexible rotor supported on laminar and turbulent journal bearings are found by calculating the threshold speed of instability associated with the onset of the unstable oil-whirl phenomenon. Local journal trajectories of the rotor-bearing system were found at different operating conditions solely on the basis of the calculated dynamic coefficients in laminar and turbulent flow. Results show no difference between laminar and turbulent models at low loading, while a significant change in the size of the stable region was observed with increasing Reynolds number in the turbulent models. Stability margins based on laminar flow at relatively low Sommerfeld numbers [Formula: see text] were shown to fall inside the unstable region, rendering the laminar stability curves obsolete at high Reynolds numbers. The Ng–Pan turbulent model was found to be generally more conservative and is therefore recommended for rotor-bearing design.
30

Miriovsky, Benjamin J., Lawrence N. Shulman, and Amy P. Abernethy. "Importance of Health Information Technology, Electronic Health Records, and Continuously Aggregating Data to Comparative Effectiveness Research and Learning Health Care." Journal of Clinical Oncology 30, no. 34 (December 1, 2012): 4243–48. http://dx.doi.org/10.1200/jco.2012.42.8011.

Abstract:
Rapidly accumulating clinical information can support cancer care and discovery. Future success depends on information management, access, use, and reuse. Electronic health records (EHRs) are highlighted as a critical component of evidence development and implementation, but to fully harness the potential of EHRs, they need to be more than electronic renderings of the traditional paper medical chart. Clinical informatics and structured accessible secure data captured through EHR systems provide mechanisms through which EHRs can facilitate comparative effectiveness research (CER). Use of large linked administrative databases to answer comparative questions is an early version of informatics-enabled CER familiar to oncologists. An updated version of informatics-enabled CER relies on EHR-derived structured data linked with supplemental information to provide patient-level information that can be aggregated and analyzed to support hypothesis generation, comparative assessment, and personalized care. As implementation of EHRs continues to expand, electronic databases containing information collected via EHRs will continuously aggregate; aggregating data enhanced with real-time analytics can provide point-of-care evidence to oncologists, tailored to patient-level characteristics. The system learns when clinical care informs research, and insights derived from research are reinvested in care. Challenges must be overcome, including interoperability, standardization, access, and development of real-time analytics.
31

North, J. Lionel. "‘Good Wordes and Faire Speeches’ (Rom 16.18 AV): More Materials and a Pauline Pun." New Testament Studies 42, no. 4 (October 1996): 600–614. http://dx.doi.org/10.1017/s0028688500021445.

Abstract:
In Rom 16.18 we have a unique word, alongside another word which is used by Paul in a unique way. That is challenge enough to any student of the language of the Greek NT. ‘Faire speeches’ is the AV's rendering of εὐλογία. Elsewhere in the NT, where it occurs 16 times, εὐλογία is always used in an approving sense, of the human praise of God or of the divine bounty for which praise is due. The word is found nine times in the Pauline corpus; a little earlier in this same letter, Paul had spoken about his certainty that he will visit Rome ‘in the fullness of the blessing (εὐλογία) of Christ’ (15.29 RSV). After that and the other Pauline and the non-Pauline usage, the use at 16.18 grates on the ear.1 Its context shows that here εὐλογία is being used disparagingly, of men who flatter to deceive (ἐξαπατῶσιν) and work to mislead and divide the community. Today, we might call such men ‘smoothies’ who ‘turn on the charm’, ‘chat up’ the gullible, ‘talk up’ their policies and ‘sweet talk’ their way to success for their own selfish purposes.
32

Mote, Kevin. "Fast Point-Feature Label Placement for Dynamic Visualizations." Information Visualization 6, no. 4 (December 2007): 249–60. http://dx.doi.org/10.1057/palgrave.ivs.9500163.

Abstract:
This paper describes a fast approach to automatic point label de-confliction on interactive maps. The general Map Labeling problem is NP-hard and has been the subject of much study for decades. Computerized maps have introduced interactive zooming and panning, which has intensified the problem. Providing dynamic labels for such maps typically requires a time-consuming pre-processing phase. In the realm of visual analytics, however, the labeling of interactive maps is further complicated by the use of massive datasets laid out in arbitrary configurations, thus rendering reliance on a pre-processing phase untenable. This paper offers a method for labeling point-features on dynamic maps in real time without pre-processing. The algorithm presented is efficient, scalable, and exceptionally fast; it can label interactive charts and diagrams at speeds of multiple frames per second on maps with tens of thousands of nodes. To accomplish this, the algorithm employs a novel geometric de-confliction approach, the ‘trellis strategy,’ along with a unique label candidate cost analysis to determine the “least expensive” label configuration. The speed and scalability of this approach make it well-suited for visual analytic applications.
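As a toy illustration of point-feature label de-confliction, a greedy rectangle-overlap filter can be written in a few lines; this is far simpler than the paper's trellis strategy and label-candidate cost analysis, and the priority ordering of the input points is an assumption:

```python
def greedy_labels(points, w, h):
    """Greedily keep labels whose rectangles do not overlap any
    already-placed label: a coarse stand-in for grid-based
    de-confliction (the paper's trellis strategy is more refined).

    points -- (x, y) anchors, assumed already sorted by priority
    w, h   -- label box width and height
    Returns indices of points that receive a label.
    """
    placed = []   # rectangles of labels kept so far
    kept = []
    for i, (x, y) in enumerate(points):
        box = (x, y, x + w, y + h)
        # Two axis-aligned rectangles are disjoint iff one lies fully
        # to the left of, right of, above, or below the other.
        if all(box[2] <= b[0] or b[2] <= box[0] or
               box[3] <= b[1] or b[3] <= box[1] for b in placed):
            placed.append(box)
            kept.append(i)
    return kept

# Three anchors in a row; the middle one collides with the first
print(greedy_labels([(0, 0), (1, 0), (5, 0)], w=3, h=1))
```

The quadratic all-pairs check here is exactly what a spatial grid (bucketing candidate boxes by cell) avoids, which is how real-time rates become possible on maps with tens of thousands of nodes.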
33

Akhtar, Javaid, Muhammad Imran, Arshid Mahmood Ali, Zeeshan Nawaz, Ayyaz Muhammad, Rehan Khalid Butt, Maria Shahid Jillani, and Hafiz Amir Naeem. "Torrefaction and Thermochemical Properties of Agriculture Residues." Energies 14, no. 14 (July 13, 2021): 4218. http://dx.doi.org/10.3390/en14144218.

Abstract:
In this study, the densification of three agricultural waste biomasses (corn cobs, cotton stalks, and sunflower) is investigated using the torrefaction technique. The samples were pyrolyzed under mild temperature conditions (200–320 °C) and at different residence times (10 min–60 min). The thermal properties of the obtained bio-char samples were analyzed via thermo-gravimetric analysis (TGA). Compositional analysis of the torrefied samples was also carried out to determine the hemicellulose, cellulose, and lignin contents. According to the results of this study, optimum temperature conditions were found to be 260 °C–300 °C along with a residence time of 20 min–30 min. Based on the composition analysis, it was found that biochar contains more lignin and cellulose and lower hemicellulose contents than do the original samples. The removal of volatile hemicelluloses broke the interlocking of the biomass building blocks, rendering the biochar brittle, grindable, and less reactive. The results of this study should help improve our understanding of the conversion of agricultural waste residues into valuable solid biofuels for use in energy recovery schemes. The optimum temperature, residence time, and GCV for torrefied corn cobs were found to be 290 °C, 20 min, and 5444 kcal/kg, respectively. The optimum temperature, residence time, and GCV for torrefied cotton stalks were found to be 270 °C, 30 min, and 4481 kcal/kg, respectively. In the case of the sunflower samples, the mass yield of the torrefied sample decreased from 85% to 71% as the residence time increased from 10 min to 60 min.
34

Thomson, Guy P. C. "Bulwarks of Patriotic Liberalism: the National Guard, Philharmonic Corps and Patriotic Juntas in Mexico, 1847–88." Journal of Latin American Studies 22, no. 1-2 (March 1990): 31–68. http://dx.doi.org/10.1017/s0022216x00015108.

Abstract:
In the archive of the now disbanded jefatura política of Tetela de Ocampo is an account of the funeral ceremony of the Puebla State deputy and school teacher, Ciudadano Miguel Méndez, only son of General Juan Nepomuceno Méndez, caudillo máximo of the State of Puebla between 1857 and 1884. The Velada Fúnebre was held in 1888 in the cabecera of Xochiapulco (alias ‘La Villa del Cinco de Mayo’), a municipio of nahuatl speakers on the southern edge of Mexico's Sierra Madre Oriental, adjoining the cereal producing plateaux of San Juan de los Llanos. The ceremony took place in the ‘Netzahualcoyotl’ municipal school room and was organised by the municipality's Society of Teachers. The description of the elaborately decorated room and baroque ceremony fills several pages.1 The teachers had decked the school room (normally adorned by ‘sixty-two great charts of natural history, twenty Industrial diagrams, large maps of Universal Geography, and diverse statistical charts and many engravings related to education’) with military banners and weapons, masonic trophies, candelabra, floral crowns and yards of white and black ribbon. In the centre of the room stood the coffin on an altar, itself raised upon a platform, guarded by four National Guard sentries and attended by the philharmonic corps of Xochiapulco and all the public officials of the cabecera and its dependent barrios. For nine days preceding the ceremony this band had played funeral marches, between six and eight in the evening, on the plaza, in front of the house of the deceased. The service was taken by Mr Byron Hyde, a Methodist minister from the United States. Accompanied by his wife at a piano, Hyde gave renderings (in English) of three Wesleyan hymns.2 There followed three eulogies of Miguel Méndez, extolling his services to the Liberal cause and on behalf of the ‘desgraciada nación azteca’. 
These speeches were infused with extreme anticlerical and anti-Conservative sentiments, a martial patriotic liberalism, a reverence for the principles of the French Revolution, an admiration for Garibaldi and Hidalgo (in that order), and an obsession with the importance of education as the only means for emancipating the indigenous population from clerical subjection.
35

Weiner, D. S., J. Guirguis, M. Makowski, S. Testa, L. Shauver, and D. Morgan. "Orthopaedic manifestations of pseudoachondroplasia." Journal of Children's Orthopaedics 13, no. 4 (August 2019): 409–16. http://dx.doi.org/10.1302/1863-2548.13.190066.

Abstract:
Purpose In 1959, Maroteaux and Lamy initially designated pseudoachondroplasia as a distinct dysplasia, different from achondroplasia, the most common form of skeletal dysplasia. Pseudoachondroplasia is caused by a mutation in the cartilage oligomeric matrix protein (COMP) gene on chromosome 19p13.1-p12, which encodes COMP. COMP gene mutations render the articular and growth plate cartilages incapable of withstanding routine biomechanical loads, with resultant deformity of the joints. The purpose of the study was to characterize the typical orthopaedic findings in pseudoachondroplasia. Methods The charts and radiographs of 141 patients with pseudoachondroplasia were analyzed. This cohort, to our knowledge, represents the largest group of patients describing the typical orthopaedic manifestations of pseudoachondroplasia. Results Patients with pseudoachondroplasia have a normal craniofacial appearance and normal intelligence. Short stature is not present at birth and generally appears by two to four years of age. The condition is a form of spondyloepiphyseal dysplasia, and the long bones are characterized by dysplastic changes in the epiphyses, metaphyses, and vertebral bodies. Radiographically, the long bones show altered appearance and structure of the epiphyses, with small, irregularly formed or fragmented epiphyses, or flattening. The metaphyseal regions of the long bones show flaring, widening, or ‘trumpeting’. The cervical (89%) and the thoracic and lumbar vertebrae show either platyspondyly, an ovoid or ‘cod-fish’ deformity, or anterior ‘beaking’. Kyphosis (28%), scoliosis (58%), and lumbar lordosis (100%) are commonly seen. The femoral head and acetabulum are severely dysplastic (100%). The knees show either genu valgum (22%), genu varum (56%), or ‘windswept’ deformity (22%).
Conclusion Most commonly, these distortions of the appendicular and axial skeleton lead to premature arthritis, particularly of the hips and often the knees, not uncommonly in the 20- to 30-year-old age group. Level of Evidence: III
36

Hautecloque-Raysz, Segolene, Marie Albert Thananayagam, Niels Martignene, Marie-Cecile Le Deley, Nicolas Penel, and Aurelien Carnot. "Overall survival (OS) and prognostic factors (PF) of patients (pts) with metastatic solid tumors admitted in intensive care unit (ICU)." Journal of Clinical Oncology 39, no. 15_suppl (May 20, 2021): e24074-e24074. http://dx.doi.org/10.1200/jco.2021.39.15_suppl.e24074.

Abstract:
e24074 Background: Admission of cancer pts to the ICU is a medical and ethical challenge, since there is no reliable prognostic tool for guiding decision making. Moreover, recent progress in cancer management has improved the OS of advanced cancer pts, rendering this issue more and more frequent. Methods: We retrospectively analyzed the medical charts of 129 consecutive pts treated in our institution and admitted to the ICU between 01/2014 and 04/2019. We identified PF using Cox models. We analyzed PF for OS and for the ability to restart systemic treatment. Results: At the time of ICU admission, the mean age was 58.9 (range, 25-81); 51% were men. PS was 0-1 in 61% and 2-3 in 39%. The most prevalent cancers were lung (20%), sarcoma (17%), breast (16%), and gynecological cancers (11%). The number of metastatic sites was 1 (17%), 2 (39%), 3 (26%), and more (19%). The malignancy was stable in 69% and progressive in 31%. Pts were currently receiving systemic treatment in 78% of cases, were free of treatment in 5%, and were not yet treated in 16%. Cancer itself (53%), toxicity of treatment (43%), and underlying comorbidities (37%) contributed to ICU admission. The number of organ failures was 0 in 12%, 1 in 40%, 2 in 26%, and 3 or more in 22%. The four most common symptoms were dyspnea (34%), severe infection (25%), cardiac events (23%), and bleeding (10%). Pt management required vasoactive amine administration (38%), ventilation (31%), and dialysis (4%). The median duration of stay in the ICU was 4 days (range, 0-71). 21 pts (16%) died in the ICU and 16 (12%) just after ICU discharge; 83 pts (64%) returned home. Systemic treatment was given to 61% of pts who left hospitalization alive after the ICU. 1-month, 6-month, and 12-month OS were 67%, 36%, and 21%, respectively. Multivariate analysis identified the following PF for OS: PS = 0 (HR = 0.57 [0.38-0.79]; p = .007), admission related to underlying comorbidity (HR = 0.63 [0.41-0.98], p = .0001), and admission related to treatment toxicity (HR = 0.31 [0.17-0.56], p = .0001).
Multivariate analysis identified one PF for the ability to restart systemic treatment: one or fewer organ failures (HR = 0.34 [0.14-0.83], p = 0.02). Conclusions: In the ICU, the OS of adult pts with solid tumors resembles that of the non-cancer population. One quarter of admitted pts died in the ICU or just after. The key PF were PS and the cause of the critical state. When ICU admission was related to the cancer itself, the prognosis was poor.
37

Tılfarlıoğlu, Filiz Yalçın, and Jivan Kamal Anwer. "Integration of Lean method in English Language Teaching and Learning: A New Perspective." Journal of Education and Training Studies 5, no. 9 (August 27, 2017): 230. http://dx.doi.org/10.11114/jets.v5i9.2625.

Abstract:
Lean is regarded as a systematic approach to maximizing value by minimizing waste and by flowing the product or service at the pull of customer demand. These key concepts of "value," "flow," and "pull" align with the ultimate lean goal: "perfection," or a continuous striving for improvement in the performance of the organization. By applying a lean methodology to teaching processes, teachers can eliminate activities that do not add value and are thus wasteful, and they can focus their efforts on the advancement of teaching and learning. By applying the lean principles and techniques developed in industry, educators can refine the content, pedagogy, organization, and assessment methods employed in their accounting courses to help ensure that students gain the knowledge and skills that will make them most desirable. Lean can be taught through several methods and tools, such as readings and class discussion, game- and simulation-based methods, and the open forum method. The readings-and-discussion method presents students with the opportunity, and even the obligation, to express their point of view on certain issues, requiring them to think critically on the subject and use logic to evaluate others' positions through open and active participation. When applying lean production to learning, one should first identify the process and then maintain focus on what adds value (i.e., student improvements), empower students to pursue continuous improvement (CI), eliminate what does not add value through Kaizen (brainstorm alternatives and identify a hypothesis to test), conduct PDCA (develop an experimental plan, carry it out, then check the results and adjust accordingly), and build teamwork so that students support and share with each other.
To apply lean thinking and create a lean culture in the classroom, the classroom should first be organized; thereafter, visual sheets should be managed, pre-planning must be done, takt time should be established, and work should be standardized by creating syllabi, schedules, and associated materials. Other classroom tools must be available as well, such as Pareto charts, root cause analysis, and weekly quality assessments. To fulfill this objective in this descriptive study, a 35-item questionnaire was administered to English preparatory school students in Duhok city, and an interview was conducted with twelve students of different levels at Sabis International School. The descriptive statistics indicated that the male learners employed language learning strategies more frequently (average = 3.3) than the female learners (average = 3.2); one can conclude that male students use lean learning strategies more than female students do. The value of F is 2.479, which reflects dependency at significant levels (> 0.01 at the 1% level). According to this model, duration of taking English (b = 1.534, p = .116, p > .01) is not a statistically significant predictor of learning the lean method. The t-value (27.87 > .01) and the p-value (.000) show the effect of gender on learning the lean method to be statistically significant. Moreover, all the values in the regression model support the view that gender is effective in the use of the lean learning method. The results also indicate that the duration of taking English does not significantly affect learning and using lean learning strategies, as revealed by the ANOVA analysis. In conclusion, the results showed that the older students get and the more English courses they take, the more lean learning methods they apply in language learning.
38

Kim, Sang Woo, YoungWook Go, Sang-Ook Kang, and Chang Kyu Lee. "Quantitative visual tests in primary open-angle glaucoma patients according to three different lights with different color-rendering index." BMC Ophthalmology 21, no. 1 (May 28, 2021). http://dx.doi.org/10.1186/s12886-021-02005-2.

Abstract:
Purpose To compare quantitative visual tests, such as visual acuity, contrast sensitivity, and color vision tests, in patients with primary open-angle glaucoma (POAG) under three different light systems with different color-rendering indices (CRI). Methods This was a cross-sectional study of 36 eyes of 36 patients with POAG. Three different light systems consisting of a 3-band fluorescent lamp (CRI 80), a white LED (CRI 75), and a quantum dot LED (CRI > 95) were used. All lights had the same illuminance of 230 lx to exclude illuminance effects. The visual testing included best-corrected visual acuity (BCVA) using an ETDRS chart, a CSV-1000E contrast test, and a color test performed with the Farnsworth Munsell 100-hue test. Results There was no significant difference in BCVA (p = 0.86). There were no significant differences in the detailed contrast tests under the three light systems (p = 0.95, p = 0.94, p = 0.94, and p = 0.64, respectively). There was a significant difference between the three light systems in the color test (p = 0.042). The color test scores with the quantum dot LED were significantly lower than those with the white LED and the 3-band fluorescent lamp (p = 0.03 and 0.047, respectively). Conclusions POAG patients did not show significant differences in visual acuity scores or contrast test scores, expressed as black and white symbols, under the different light systems. However, POAG patients tested under a quantum dot LED (CRI > 95) could distinguish color differences better than under the other light systems.
APA, Harvard, Vancouver, ISO, and other styles
39

Dias, Maria Helena, and Maria Fernanda Alegria. "Na transição para a moderna cartografia. As cartas náuticas da região de Lisboa de Tofiño e Franzini." Finisterra 29, no. 58 (December 13, 2012). http://dx.doi.org/10.18055/finis1833.

Full text
Abstract:
AT THE TRANSITION TO MODERN CARTOGRAPHY. THE NAUTICAL CHARTS OF THE LISBON AREA ACCORDING TO TOFIÑO AND FRANZINI - This study shows the representation of the Lisbon area as presented by two of the cartographers whose achievements were paramount to improving our knowledge of the coastline: the Spaniard D. Vicente Tofiño de San Miguel (1732-1795) and the Portuguese Mariano Miguel Franzini (1769-1861). The study follows the research embarked upon in 1991, which focuses on old maps of the Portuguese coastline giving a rather detailed depiction of the Lisbon area. The paper dwells briefly on the 16th to 18th century maps analysed in previous studies, then presents the cartographic production of Tofiño and Franzini that was amenable to inventory. Their production is set against a synthesis of each of their professional backgrounds as technical, military and scientific officials. The representations of the Lisbon coastline in Tofiño's 1788 chart and in Franzini's 1811 charts are analysed and compared to present-day hydrographic charts. The comparison procedure is enhanced through computerized manipulation of the information contained in the documents. Tofiño's Spherical Chart (Carta esférica) shows few innovations in the patterning of the Portuguese coastline compared with the oldest known representations, which is not the case with the remaining charts in his invaluable Maritime Atlas of Spain (Atlas maritimo de España) (1789). The modern outlines of the Portuguese coast are in fact owed to Franzini, who produced both a Reduced Chart (Carta reduzida) and detailed charts. The comparison between current hydrographic charts and Franzini's charts emphasizes his accuracy in representing the coastline, with a considerable number of soundings that indeed enabled diagrammatic renderings of depth contours in the simplified charts shown in this study.
APA, Harvard, Vancouver, ISO, and other styles
40

Nguyen, Julie My Van, Danielle Vicus, Sharon Nofech-Mozes, Lilian T. Gien, Marcus Q. Bernardini, Marjan Rouzbahman, and Liat Hogen. "Risk of second malignancy in patients with ovarian clear cell carcinoma." International Journal of Gynecologic Cancer, December 18, 2020, ijgc—2020–001946. http://dx.doi.org/10.1136/ijgc-2020-001946.

Full text
Abstract:
Objective Ovarian clear cell carcinoma has unique clinical and molecular features compared with other epithelial ovarian cancer histologies. Our objective was to describe the incidence of second primary malignancy in patients with ovarian clear cell carcinoma. Methods Retrospective cohort study of patients with ovarian clear cell carcinoma at two tertiary academic centers in Toronto, Canada between May 1995 and June 2017. Demographic, histopathologic, treatment, and survival details were obtained from chart review and a provincial cancer registry. We excluded patients with histologies other than pure ovarian clear cell carcinoma (such as mixed clear cell histology), and those who did not have their post-operative follow-up at these institutions. Results Of 209 patients with ovarian clear cell carcinoma, 54 patients developed a second primary malignancy (25.8%), of whom six developed two second primary malignancies. Second primary malignancies included: breast (13), skin (9), gastrointestinal tract (9), other gynecologic malignancies (8), thyroid (6), lymphoma (3), head and neck (4), urologic (4), and lung (4). Eighteen second primary malignancies occurred before the index ovarian clear cell carcinoma, 35 after ovarian clear cell carcinoma, and 7 were diagnosed concurrently. Two patients with second primary malignancies were diagnosed with Lynch syndrome. Smoking and radiation therapy were associated with an increased risk of second primary malignancy on multivariable analysis (OR 3.69, 95% CI 1.54 to 9.07, p=0.004; OR 4.39, 95% CI 1.88 to 10.6, p=0.0008, respectively). However, for patients developing second primary malignancies after ovarian clear cell carcinoma, radiation therapy was not found to be a significant risk factor (p=0.17).
There was no significant difference in progression-free survival (p=0.85) or overall survival (p=0.38) between those with second primary malignancy and those without. Conclusion Patients with ovarian clear cell carcinoma are at increased risk of second primary malignancies, most frequently non-Lynch related. A subset of patients with ovarian clear cell carcinoma may harbor mutations rendering them susceptible to second primary malignancies. Our results may have implications for counseling and consideration for second primary malignancy screening.
APA, Harvard, Vancouver, ISO, and other styles
41

Mohtat, Arash, and József Kövecses. "High-Fidelity Rendering of Contact With Virtual Objects." Journal of Dynamic Systems, Measurement, and Control 137, no. 7 (July 1, 2015). http://dx.doi.org/10.1115/1.4029465.

Full text
Abstract:
When interacting with a virtual object (VO) through a haptic device, it is crucial to feed back a contact force to the human operator (HO) that displays the VO's physical properties with high fidelity. The core challenge, here, is to expand the renderable range of these properties, including larger stiffness and smaller inertia, at the available sampling rate. To address this challenge, a framework entitled high-fidelity contact rendering (HFCR) has been developed in this paper. The framework consists of three main strategies: an energy-based rendering of the contact force, smooth transition (ST) between contact modes, and remaining leak dissipation (LD). The essence of these strategies is to make the VO emulate its continuous-time counterpart. This is achieved via physically meaningful modifications in the constitutive relations to suppress artificial energy leaks. The strategies are first developed for the one-dimensional (1D) canonical VO; then, generalization to the multivariable case is discussed. Renderability has been analyzed by exploring different stability criteria within a unified approach. This leads to stability charts and identification of the renderable range of properties in the presence and absence of the HO. The framework has been validated through simulation and various experiments. Results verify its promising aspects for various scenarios including sustained contact and sudden collision events with or without the HO.
APA, Harvard, Vancouver, ISO, and other styles
42

"Virtual Reality to Promote Real del Monte, Mexico." WSEAS TRANSACTIONS ON ENVIRONMENT AND DEVELOPMENT 16 (November 5, 2020). http://dx.doi.org/10.37394/232015.2020.16.78.

Full text
Abstract:
The development of a virtual reality app based on the old English Pantheon, located in the Magic Town of Real del Monte, Mexico, is addressed in this text. The purpose of the project is to let people take a virtual tour through the pantheon with a smartphone to experience the charm and mystery of its tombs, as well as to digitally preserve the place despite its continuous deterioration over time. All the stages of the creation of the virtual scenery are described in detail, such as: storyboard, 3D modeling, UV mapping, texturing, lighting, and rendering; as well as the stages to realize the application in virtual reality, such as: camera adaptation, plugins' installation, creation of scripts, relation of the camera to the character, and creation of the Android APK. This app offers new ways of advertising historical places of the region and, at the same time, promoting tourism.
APA, Harvard, Vancouver, ISO, and other styles
43

Budianto, Enrico, Hary Prabowo, Hafiz ., M. Nanda Kurniawan, Prayoga Dahirsa, and Tirmidzi Faisal. Jurnal Ilmu Komputer dan Informasi 4, no. 2 (May 30, 2012): 92. http://dx.doi.org/10.21609/jiki.v4i2.168.

Full text
Abstract:
GPU, short for Graphical Processing Unit, is a specialized microprocessor that accelerates the rendering of two- and three-dimensional graphics. GPUs have been used in several devices such as embedded systems, mobile phones, computers, workstations, and game consoles. Using a GPU greatly improves time efficiency in computation: its parallel structure makes a GPU more effective than a Central Processing Unit (CPU) for such workloads. Today the use of GPUs extends well beyond computer science; their capability for complex and repetitive calculations has led to their adoption in many other fields. In medicine, GPUs are used in diagnosing disease, while in accounting they are used for calculations over very large amounts of data. This research examines the effectiveness of using a GPU to run programs and applications, and how it differs from ordinary CPU usage.
APA, Harvard, Vancouver, ISO, and other styles
44

Chepwogen Soi, Beatrice. "A Study of Tea Pluckers’ Perception on the Introduction of Tea Plucking Machines in James Finlay’s Kenya." International Journal of Scientific Research and Management 6, no. 02 (February 26, 2018). http://dx.doi.org/10.18535/ijsrm/v6i2.em09.

Full text
Abstract:
This study sought to assess the perception of tea pluckers on the introduction of tea plucking machines in James Finlay's Kenya. The research was a case study of one of the major tea industry players, James Finlay's Kenya Ltd. The target population was all the tea estates in James Finlay's (K) Ltd in Kericho County. James Finlay's has a total of 13 tea estates, with a total of 10,262 tea pluckers in the company. A sample of five estates was considered, and a sample of 20% of tea pluckers in these estates was taken for the study. A representative group, according to Mugenda and Mugenda, is one that is at least 10% of the population of interest; the above sample was therefore representative. Data was collected using structured questionnaires and interview methods, supplemented with the available literature on the subject of study. The responses from the cross-functional sample group were analysed using descriptive statistical techniques in the form of frequency distribution tables, percentages, pie charts, and computer packages. The study found that the tea pluckers viewed the introduction of the machines negatively; they interpreted it as a way of rendering them obsolete and eventually retrenching them. They therefore did not fully support its implementation.
APA, Harvard, Vancouver, ISO, and other styles
45

Davidson, Guy. "The Gay Novel and the Gay World." Journal of New Zealand Studies, NS26 (July 2, 2018). http://dx.doi.org/10.26686/jnzs.v0ins26.4840.

Full text
Abstract:
In a recent review essay, J. Daniel Elam charts the emergence of “gay world literary fiction,” a subgenre of the category “world literature,” which over the last twenty years or so has become both a marketing strategy for publishers and a “disciplinary rallying point of literary criticism and the academic humanities.”[i] While Elam’s essay is implicitly underpinned by the usual disciplinary understanding of world literature (fiction from potentially anywhere in the globe, translated into English, and studied comparatively), its focus is narrowed to the “gay world” within the planetary world—a putatively homogenous, transnational gay subculture enabled by digital connectivity and the flows of global capital. This new gay world is, according to Elam, characterized by atomization: “From Sofia to Shanghai, authors of gay fiction describe a collection of scattered and isolated individuals, needy but incurious.” The situation has emerged from the “curious paradox” that “visibility and acceptance” have “made life better” for many gay men “at the cost of community and identity.” “Gay visibility, with its attendant politics of respectability” has occurred at the expense of older subcultural institutions like “the gay bar, the bathhouse, the piano bar, and cruising areas,” rendering the gay community “a banally knowable object rather than the product of a passionately forged experience of self-making. In place of the urgent longings of 20th-century queer literature, one encounters a peculiar form of worldly, muted yearning. So-called gay world literature emerges from a global community that isn’t a community at all.” [i] J. Daniel Elam, “The World of Gay Lit,” Public Books (16 October 2017). Web. Accessed 1 March, 2018. “Disciplinary rallying point”: Emily Apter, Against World Literature: On the Politics of Untranslatability (London: Verso, 2013), 1. 
For a discussion of the interrelations between “world literature” as the marketization of cultural differences and as a field of scholarly enquiry, see Simon During, Exit Capitalism: Literary Culture, Theory and Post-Secular Modernity (New York: Routledge, 2009), 57–58.
APA, Harvard, Vancouver, ISO, and other styles
46

Vella Bonavita, Helen, and Lelia Green. "Illegitimate." M/C Journal 17, no. 5 (October 29, 2014). http://dx.doi.org/10.5204/mcj.924.

Full text
Abstract:
Illegitimacy is a multifaceted concept, powerful because it has the ability to define both itself and its antithesis; what it is not. The first three definitions of the word “illegitimate” in the Oxford English Dictionary – to use an illegitimate academic source – begin with that negative: “illegitimate” is “not legitimate”, “not in accordance with or authorised by law”, “not born in lawful wedlock”. In fact, the OED offers eight different usages of the term “illegitimate”, all of which rely on the negation or absence of the legitimate counterpart to provide a definition. In other words, something can only be illegitimate in the sense of being outside the law, if a law exists. A child can only be considered illegitimate, “not born in lawful wedlock” if the concept of “lawful wedlock” exists. Not only individual but national identity can be constructed by defining what – or who – has a legitimate reason to be a part of that collective identity, and who does not. The extent to which the early years of Australian colonial history was defined by its punitive function can be mapped by an early usage of the term “illegitimate” as a means of defining the free settlers of Australia. In an odd reversal of conventional associations of “illegitimate”, the “illegitimates” of Australia were not convicts. They were people who had not been sent there for legitimate (legal) reasons and who therefore did not fit into the depiction of Australia as a penal colony. The definition invites us to consider the relationship between Australia and Britain in those early years, when Australia provided Britain with a means of constructing itself as a “legitimate” society by functioning as a location where undesirable elements could be identified and excluded. The “illegitimates” of Australia challenged Australia’s function of rendering Britain a “legitimate” society.
As a sense of what is “illegitimate” in a particular context is codified and disseminated, a corresponding sense of what is “legitimate” is also created, whether in the context of the family, the law, academia, or the nation. As individuals and groups label and marginalise what is considered unwanted, dangerous, superfluous or in other ways unsatisfactory in a society, the norms that are implicitly accepted become visible. Rather as the medical practice of diagnosis by exclusion enables a particular condition to be identified because other potential conditions have been ruled out, attempts to “rule out” forms of procreation, immigration, physical types, even forms of performance as illegitimate enable a legitimate counterpart to be formed and identified. Borrowing a thought from Tolstoy’s Anna Karenina, legitimates are all alike and formed within the rules; the illegitimates are illegitimate in a variety of ways. The OED lists “illegitimate” as a noun or adjective; the word’s primary function is to define a status or to describe something. Less commonly, it can be used as a verb; to “illegitimate” someone is to bastardise them, to render them no longer legitimate, to confer and confirm their illegitimate status. Although this has most commonly been used in terms of a change in parents’ marital status (for example Queen Elizabeth I of England was bastardised by having her parents’ marriage declared invalid; as had also been the case with her older half-sister, Mary) illegitimisation as a means of marginalising and excluding continues. In October 2014, Australian Immigration Minister Scott Morrison introduced legislation designed to retrospectively declare that children born in Australia to parents who have been designated “unlawful maritime arrivals” should inherit that marginalised status (Mosendz; Baskin).
The denial of “birthright citizenship”, as it is sometimes called, to these infants illegitimises them in terms of their nationality, cutting them away from the national “family”. Likewise the calls to remove Australian nationality from individuals engaging in prohibited terrorist activities use a strategy of illegitimisation to exclude them from the Australian community. No longer Australian, such people become “national bastards”. The punitive elements associated with illegitimacy are not the only part of the story, however. Rather than being simply a one-way process of identification and exclusion, the illegitimate can also be a vital source of generating new forms of cultural production. The bastard has a way of pushing back, resisting efforts at marginalisation. The papers in this issue of M/C consider the multifarious ways in which the illegitimate refuses to conform to its normative role of defining and obeying boundaries, fighting back from where it has been placed as being beyond the law. As previously mentioned, the OED lists eight possible usages of “illegitimate”. Serendipitously, the contributions to this issue of M/C address each one of them, in different ways. The feature article for this issue, by Katie Ellis, addresses the illegitimisation inherent in how we perceive disability. With a profusion of bastards to choose from in the Game of Thrones narratives, Ellis has chosen to focus on the elements of physical abnormality that confer illegitimate status. From the other characters’ treatment of the dwarf Tyrion Lannister, and other disabled figures within the story, Ellis is able to explore the marginalisation of disability, both as depicted by George R. R. Martin and experienced within the contemporary Australian community. Several contributions address the concept of the illegitimate from its meaning of outside the law, unauthorised or unwarranted.
Anne Aly’s paper “Illegitimate: When Moderate Muslims Speak Out” sensitively addresses the illegitimate position to which many Muslims in Australia feel themselves relegated. As she argues, attempting to avoid being regarded as “apologists for Islam” yet simultaneously expected to act as a unifying voice for what is in fact a highly fragmented cultural mix, places such individuals in an insupportable, “illegitimate” position. Anne Aly also joins Lelia Green in exploring the rhetorical strategies used by various Australian governments to illegitimate specific cohorts of would-be Australian migrants. “Bastard immigrants: asylum seekers who arrive by boat and the illegitimate fear of the other” discusses attempts to designate certain asylum seekers as illegitimate intruders into the national family of Australia in the context of the ending of the White Australia policy and the growth of multicultural Australia. Both papers highlight the punitive impact of illegitimisation on particular segments of society and invite recognition of the unlawfulness, or illegitimacy, of the processes themselves that have been used to create such illegitimacy. Illegitimate processes and incorrect inferences, and the illegitimisation of an organisation through media representation which ignores a range of legitimate perspectives are the subject of Ashley Donkin’s work on the National School Chaplaincy and Student Welfare Program (NSCSWP). As Donkin notes, this has been a highly controversial topic in Australia, and her research identifies the inadequacies and prejudices that, she argues, contributed to an illegitimate representation of the programme in the Australian media. Without arguing for or against the NSCSWP, Donkin’s research exposes the extent of prejudiced reporting in the Australian media and its capacity to illegitimise programmes (or, indeed, individuals).
Interesting here, and not entirely irrelevant (although not directly addressed in Donkin’s paper), is the notion of prejudice as being an opinion formed or promulgated prior to considering the equitable, just or judicial/judged position. Analogous to the way in which the illegitimate is outside the law, the prejudiced only falls within the law through luck, rather than judgement, since ill-advised opinion has guided its formation. Helen Vella Bonavita explores why illegitimacy is perceived as evil or threatening, looking to anthropologists Mary Douglas and Edmund Leach. Using Shakespeare’s Henry V as a case study, Vella Bonavita argues that illegitimacy is one of the preeminent metaphors used in literature and in current political discourses to articulate fears of loss of national as well as personal identity. As Vella Bonavita notes, as well as being a pollutant that the centre attempts to cast to the margins, the illegitimate can also be a potent threat, a powerful figure occupying an undeniable position, threatening the overturning of the established order. The OED’s definition of illegitimate as “one whose position is viewed in some way as illegitimate” is the perspective taken by Crystal Abidin and Herawaty Abbas. In her work “I also Melayu OK”, Abidin explores the difficult world of the bi-racial person in multi-ethnic Singapore. Through a series of interviews, Abbas describes the strategies by which individuals, particularly Malay-Chinese individuals, emphasise or de-emphasise particular linguistic or cultural behaviours in order to overcome their ambivalent cultural position and construct their own desired socially legitimate identity. Abidin’s positive perspective nonetheless evokes its shadow side, the spectre of the anti-miscegenation laws of a range of racist times and societies (but particularly Apartheid South Africa), and those societies’ attempts to outlaw any legitimisation of relationships, and children, that the law-makers wished to prohibit. 
The paper also resonates with the experience of relationships across sectarian divides and the parlous circumstances of Protestant –Catholic marriages and families during the 1970s in the north of Ireland, or of previously-acceptable Serbo-Croatian unions during the disintegration of the former Socialist Federal Republic of Yugoslavia in the 1990s. Herawaty Abbas and Brooke Collins-Gearing reflect on the process of academic self-determination and self-construction in “Dancing with an illegitimate feminism: a female Buginese scholar's voice in Australian Academia”. Abbas and Collins-Gearing address the research journey from the point of view of a female Buginese PhD candidate and an Indigenous Australian supervisor. With both candidate and supervisor coming from traditionally marginalised backgrounds in the context of Western academia, Abbas and Collins-Gearing chart a story of empowerment, of finding a new legitimacy in dialogue with conventional academic norms rather than conforming to them. Three contributions address the illegitimate in the context of the illegitimate child, moving from traditional associations of shame and unmarried pregnancy, to two creative pieces which, like Abidin, Abbas and Collins-Gearing, chart the transformative process that re-constructs the illegitimate space into an opportunity to form a new identity and the acceptance, and even embrace, of the previously de-legitimising authorities. Gardiner’s work, “It is almost as if there were a written script: child murder, concealment of birth and the unmarried mother in Western Australia” references two women whose stories, although situated almost two hundred years apart in time, follow a similarly-structured tale of pregnancy, shame and infant death. 
Kim Coull and Sue Bond in “Secret Fatalities and Liminalities” and “Heavy Baggage and the Adoptee” respectively, provide their own stories of illuminative engagement with an illegitimate position and the process of self-fashioning, while also revisiting the argument of the illegitimate as the liminal, a perspective previously advanced by Vella Bonavita’s piece. The creative potential of the illegitimate condition is the focus of the final three pieces of this issue. Bruno Starrs’s “Hyperlinking History and the Illegitimate Imagination” discusses forms of creative writing only made possible by the new media. Historic metafiction, the phrase coined by Linda Hutcheon to reflect the practice of inserting fictional characters into historical situations, is hardly a new phenomenon, but Starrs notes how the possibilities offered by e-publishing enable the creation of a new level of metafiction. Hyperlinks to external sources enable the author to engage the reader in viewing the book both as a work of fiction and as self-conscious commentary on its own fictionality. Renata Morais’ work on different media terminologies in “I say nanomedia, You say nano-media: il/legitimacy, interdisciplinarity and the anthropocene” also considers the creative possibilities engendered by interdisciplinary connections between science and culture. Her choice of the word “anthropocene,” denoting the geological period when humanity began to have a significant impact on the world’s ecosystems, itself reflects the process whereby an idea that began in the margins gains force and legitimacy. 
From an informal and descriptive term, the International Commission on Stratigraphy have recently formed a working group to investigate whether the “Anthropocene” should be formally adopted as the name for the new epoch (Sample). The final piece in this issue, Katie Lavers’ “Illegitimate Circus”, again traces the evolution of a theatrical form, satisfyingly returning in spirit if not in the written word to some of the experiences imagined by George R. R. Martin for his character Tyrion Lannister. “Illegitimate drama” was originally theatre which relied more on spectacle than on literary quality, according to the OED. Looking at the evolution of modern circus from Astley’s Amphitheatre through to the Cirque du Soleil spectaculars, Lavers’ article demonstrates that the relationship between legitimate and illegitimate is not one whereby the illegitimate conforms to the norms of the legitimate and thereby becomes legitimate itself, but rather where the initial space created by the designation of illegitimate offers the opportunity for a new form of art. Like Starrs’ hyperlinked fiction, or the illegitimate narrators of Coull or Bond’s work, the illegitimate art form does not need to reject those elements that originally constituted it as “illegitimate” in order to win approval or establish itself. The “illegitimate”, then, is not a fixed condition. Rather, it is a status defined according to a particular time and place, and which is frequently transitional and transformative; a condition in which concepts (and indeed, people) can evolve independently of established norms and practices. Whereas the term “illegitimate” has traditionally carried with it shameful, dark and indeed punitive overtones, the papers collected in this issue demonstrate that this need not be so, and that the illegitimate, possibly more than the legitimate, enlightens and has much to offer.
References
Mosendz, Polly. “When a Baby Born in Australia Isn’t Australian”. The Atlantic 16 Oct. 2014. 25 Oct. 2014 ‹http://www.theatlantic.com/international/archive/2014/10/when-a-baby-born-in-australia-isnt-australian/381549/›.
Baskin, Brooke. “Asylum Seeker Baby Ferouz Born in Australia Denied Refugee Status by Court”. The Courier Mail 15 Oct. 2014. 25 Oct. 2014 ‹http://www.couriermail.com.au/news/queensland/asylum-seeker-baby-ferouz-born-in-australia-denied-refugee-status-by-court/story-fnihsrf2-1227091626528›.
Sample, Ian. “Anthropocene: Is This the New Epoch of Humans?” The Guardian 16 Oct. 2014. 25 Oct. 2014 ‹http://www.theguardian.com/science/2014/oct/16/-sp-scientists-gather-talks-rename-human-age-anthropocene-holocene›.
APA, Harvard, Vancouver, ISO, and other styles
47

"Catalogus van schilderijen van Jan Claesz." Oud Holland - Quarterly for Dutch Art History 104, no. 3-4 (1990): 212–17. http://dx.doi.org/10.1163/187501790x00093.

Full text
Abstract:
Abstract In Enkhuizen, the fifth major town in the region of Holland at the time, dozens of portraits were painted in the last years of the sixteenth and first decades of the seventeenth century. In 1934 A. B. de Vries acknowledged a few paintings of 1594 and 1595 (cat. nos. 3, 4 and 5) as the work of an artist who was active in Enkhuizen and a follower of the Amsterdam painters Pieter and Aert Pietersz. It transpires that a large number of other portraits can be attributed to that same painter. Thanks to the fact that a print by Willem Delff after one of the works in this group, a portrait of Henricus Antonii Nerdenus of 1604 (fig. 5) bears the inscription Ioan.Nicol.Enchus.pinx., the anonymous Enkhuizen artist can be identified as one Jan Claesz. Archive research has yielded only a series of entries in notarial deeds of 1613-1616, but the painter's works facilitate the construction of a brief biography. Jan Claesz. was probably born around 1570 or a little earlier in or near Enkhuizen, and trained with Pieter or Aert Pietersz. in Amsterdam. The young artist painted a few portraits in that city in 1593. Shortly afterwards he moved to Enkhuizen, where, judging by his paintings, he was certainly active until 1618. He probably died that year or a little later. As far as can be established he confined himself to portraiture. The earliest known attributable works are his portraits of Bartholomeus van der Wicrc and his wife, painted in 1593 (figs. 7 and 8) and clearly showing the influence of Pieter and Aert Pietersz. The compositions and poses are characteristic of Jan Claesz.'s work; the background perspective does not quite come off. His portraits of two sisters of 1594 (figs. 9 and 10) are less ambitious, and are among the most attractive Netherlandish children's portraits of the late sixteenth century. Very similar is a portrait of Reynu Semeyns, painted a year later (fig. 11), which displays the same painstaking method.
This picture once had a companion piece, a portrait of the famous explorer Jan Huygen van Linschoten which is only known from a copper engraving with a partial copy in mirror image (fig. 12). This print suggests a close relationship between the portrait of Van Linschoten and a painting of 1598 in which Adriaen Teding van Berkhout is depicted (fig. 13). In 1598 Jan Claesz. also painted a full-length portrait of a child standing on a tiled floor, with two pilasters and an arch in the background (fig. 14), an arrangement he used on a number of subsequent occasions (figs. 23, 24, 26 and 27). A separate group in Jan Claesz.'s œuvre consists of three double portraits of 1601 and 1602, featuring an adult with a child (figs. 15, 16 and 17); the companion pieces of 1602 demonstrate that the painter not only worked for Enkhuizen patrons but also for the regents in the neighbouring town of Hoorn. A few portraits of older people painted between 1603 and 1608 (figs. 2, 3, 18, 19 and 20) clearly show the minute detail in the painting, sometimes resulting in a certain hardness in the rendering. A portrait of a boy of 1608 (fig. 21) suggests that the artist was familiar with the interest evinced in other towns for giving portraits trompe-l'œil frames. Another portrait of a boy painted a year later (fig. 22) is the earliest known example of a type of children's portrait that was especially popular in West Frisia in the seventeenth century; the subject is a boy with a miniature horse. A child's portrait previously attributed to Adriaen van der Linde, a painter active in Frisia, but consistent in every aspect with other paintings by Jan Claesz., dates from the same period (fig. 24). A similar portrait, probably depicting Claes Gerritsz. Slijper and painted in 1614, has suffered considerably from overpainting of the head (fig. 28). A few portraits of adults dating from 1616-1618 (figs. 33, 34 and 36) are the last known works of the painter and among the best he ever did.
Like other paintings by Jan Claesz. (figs. 15 and 35), they also give us an idea of the rich traditional costume of Enkhuizen. Jan Claesz. may be regarded as a representative of the generation of portraitists who in the waning sixteenth and dawning seventeenth century laid the foundations for the heyday of portraiture in the ensuing years of the seventeenth century. He is also a representative of the widespread influence of the painters Pieter and Aert Pietersz., an influence particularly noticeable in the northern region of the Netherlands. He added his own elements to their example. His fairly numerous portraits of children, with their somewhat naive charm, form an important contribution to our knowledge of the North Netherlandish children's portrait of around 1600.
APA, Harvard, Vancouver, ISO, and other styles
48

Brien, Donna Lee. "Imagining Mary Dean." M/C Journal 7, no. 1 (January 1, 2004). http://dx.doi.org/10.5204/mcj.2320.

Full text
Abstract:
“As the old technologies become automatic and invisible, we find ourselves more concerned with fighting or embracing what’s new”—Dennis Baron, From Pencils to Pixels: The Stage of Literacy Technologies In a world where nothing is certain… and even the objectivity of science is qualified by relativity and uncertainty, the single human voice, telling its own story, can seem the only authentic way of rendering consciousness. – David Lodge (“Sense and Sensibility”) Leon Edel expressed the central puzzle of writing biography as “every life takes its own form and a biographer must find the ideal and unique literary form that will express it” (qtd. in Novarr 165). My primary challenge in writing Poisoned: The Trials of Mary Dean – a biography in the form of a (fictionalised) first-person memoir purportedly written by the subject herself – was the location of a textual voice for Mary that, if not her own, could have credibly belonged to a woman of her time, place and circumstance. The ‘Dean case’ caused a sensation across Australia in the mid-1890s when George Dean was arrested for the attempted murder of his 20-year-old wife, Mary. George was a handsome Sydney ferry master who had played the romantic lead in a series of spectacular rescues, flinging himself into the harbour to save women passengers who had fallen overboard. When on trial for repeatedly poisoning his wife, his actions and motivations were not probed; instead, Mary’s character and behaviour and, by extrapolation, those of the entire female sex, were examined and analysed. This approach climaxed in defence counsel claims that Mary poisoned herself to frame her husband, but George was found guilty and sentenced to hang, the mandatory punishment for attempted murder at that time. Despite the persuasive prosecution evidence and the jury’s unanimous verdict, the Sydney press initiated a public outcry. 
After a series of inflamed community meetings and with a general election approaching, the Premier called for a Royal Commission into Dean’s conviction. This inquiry came to the extraordinary conclusion that: the facts, as shown, are quite as compatible with the hypothesis that Mrs. Dean ... administered the arsenic to herself – possibly at the prompting of her mother and without any intention of taking a fatal dose – as that the poison was administered to her by her husband with an intent to kill. (Regina v George Dean 16) George was freed with a Royal Pardon and Mary was publicly reviled as a pariah of the lowest order. This unhappy situation continued even after it was revealed that her husband had confessed his guilt to his solicitor, and charges of conspiracy and perjury were brought against George and his lawyers who were then members of the New South Wales Parliament. Although the lawyers both escaped relatively unscathed, George Dean was gaoled for 14 years. This was despite Mary’s story having obvious potential for a compelling biographical narrative. To begin with, she experiences the terror of suspecting her own husband is poisoning her as she convalesces after the birth of their child. She survives repeated doses of strychnine and arsenic, only to confront the humiliating certainty that her husband was desperate to be rid of her. Then, weak and ill, she has to endure the ordeal of police-court proceedings and a criminal trial when she is damned as a witch conspiring with her wicked mother to ruin her husband. Withstanding assertions that her childhood home was a brothel and she a prostitute, she spends long weeks in hospital knowing her husband is under sentence of execution, only to be released, destitute, with a sickly child she has poisoned with her own breast milk. 
Still physically debilitated, she is called before a Royal Commission where she is again violently cross-examined and, on the day of her twenty-first birthday, is confronted with the knowledge that not only was her mother a transported convict, but that she is, herself, of illegitimate birth. When the Commission finds in her husband’s favour, Mary has to watch her poisoner pardoned, freed and feted as a popular celebrity, while she faces an increasingly viperous press, and is jeered at and spat on in the streets. Next, she is forced to testify at yet another series of public trials and finally, even when her husband confesses his crime and is gaoled for perjury (his Royal Pardon saving him from again facing an attempted murder charge), she is ostracised as the penniless wife of a common criminal and illegitimate daughter of a transported convict. Despite this, and having little more than the shame of divorce to look forward to, Mary nevertheless regains her health and, four years after her final court appearance, marries a respectable shopkeeper. A year later, in 1902, she gives birth to her second child. together with examples written by women of her time, class and education, fabricating an extended letter (written by Mary, but based on historical evidence) seemed a viable textual solution. For centuries, domestic letters were a major means of autobiographical expression for ‘non-literary’, working-class women and, moreover, a textual format within which Mary (silenced for over a century) could finally relate her own version of events. These decisions aligned with what John Burnett has identified as the most common motivations for a working-class person to write an autobiographical narrative: “belief that he [sic] had some important … personal triumph over difficulties and misfortune … to leave for one’s children or grandchildren” (11). 
This relatively common human desire also dovetailed neatly with a central theme animating Mary’s life – that ignorance about the past can poison your future. To create a textual voice for Mary in her narrative, I utilised the literary process of ‘ventriloquising’ or providing a believable (fictional) voice for a historical character – the term ‘literary ventriloquism’ was coined by David Lodge in 1987 for how novelists create (and readers ‘hear’) the various voices in literary works (100). While biographies including Andrew Motion’s Wainewright the Poisoner (2000) and, as Richard Freadman has noted, Gertrude Stein’s The Autobiography of Alice B. Toklas (1933) have effectively employed varieties of biographical ventriloquism, this is a literary device more frequently used by fiction writers. It is also interesting to note that when skilled fiction writers employ ventriloquism, their resulting works are often perceived as much as biography-histories as imaginative pieces. Peter Carey’s The True History of the Kelly Gang (2000) is the invented document of which Kelly biographers dream, an autobiographical account supposedly written by Kelly so his infant daughter might “comprehend the injustice we poor Irish suffered” (5), but the voice Carey created was so credible that historians (including Ian Jones and Alex McDermott) debated its authenticity. This was despite Carey making no claims for the historical accuracy of his work. Of course, I primarily tested my text against such press interviews, Mary’s own letters and the articulations of her voice reported in the trial and other court records. Not that the latter group of texts can be taken as ‘verbatim’ transcriptions. Although court and other legal records provide, as Karen Dubinsky has noted, “a window into instances of personal life … we can hear people talking about love, emotional and sexual intimacy, power, betrayal and broken promises” (4), such texts are profoundly mediated documents.
The citations we now read in print passed through many hands – Mary’s testimonies would have been initially noted by the court stenographer, then transcribed, corrected, edited, typeset, corrected again, printed and bound – with each stage in the process incorporating inaccuracies, omissions and changes into the text. And, however accurate, such transcripts are never complete, neither indicating the tone in which answers were given, nor the speakers’ hesitations, pauses or accompanying gestures. The transcripts I used also record many examples of Lyndal Roper’s “forced discourse”, where Mary was directed to give only usually abbreviated responses to questions, questions which no doubt often directed the tone, content and even wording of her answers (54). Despite these limitations, it was following Mary in court through these texts, cringing at the humiliation and bullying she was subjected to, rallying when she showed spirit and almost cheering when she was finally vindicated, which allowed me to feel a real human connection with her as my subject. It was via these texts (and her own letters) that I also became aware that Mary Dean had been a person who, at the same time as she was living her life, was also (as are we all) remembering, forgetting and, probably, fabricating stories about that life – stories which, at times, challenged and contradicted each other. My aim was always to move beyond finding a persuasive textual voice for Mary, that is one which seemed authentic (and suitable for a novel), to one able to tell some of the contradictory stories of Mary’s life, as she no longer could. Ultimately, I wanted every utterance of my textual rendering of her speech to declare (as J. M. Coetzee has one of his characters say): “I live, I suffer, I am here. With cunning and treachery, if necessary, I fight against becoming one of the forgotten ones of history” (3). Works Cited Allen, Judith A. Sex and Secrets: Crimes Involving Australian Women Since 1880. 
Melbourne: Oxford UP, 1990. Burnett, John, ed. Useful Toil: Autobiographies of Working People from the 1820s to the 1920s. London: Allen Lane, 1976. Carey, Peter. The True History of the Kelly Gang. St. Lucia: U of Queensland P, 2000. Coetzee, J. M. In the Heart of the Country. London: Secker and Warburg, 1977. Daily Telegraph, 9 October 1895: 5. Dubinsky, Karen. Improper Advances: Rape and Heterosexual Conflict in Ontario, 1880-1929. Chicago and London: U of Chicago P, 1993. Freadman, Richard. “Prose and Cons of a Bizarre Life”. The Age 27 May 2000: Saturday Extra, 7. Holmes, Katie. Spaces in Her Day: Australian Women’s Diaries of the 1920s-1930s. St. Leonards: Allen and Unwin, 1995. “Interview with Mrs. Dean.” Truth 5 May 1895: 7. Jones, Ian. “Not in Ned’s Nature”. The Weekend Australian 18-9 Aug. 2001: R12-3. Lodge, David. “After Bakhtin.” The Linguistics of Writing. Ed. Nigel Fabb, Derek Attridge, Alan Durant and Colin MacCabe. New York: Methuen, 1987: 89-102. Lodge, David. “Sense and Sensibility”. The Guardian Unlimited 2 Nov. 2002. [accessed 12/11/02] <http://books.guardian.co..uk/review/story/0,12084,823955,00.php> McDermott, Alex. “Ned Kelly’s Yawp.” Australian Book Review Mar. 2002: 16-8. McDermott, Alex. “The Apocalyptic Chant of Edward Kelly”. The Jerilderie Letter. Melbourne: Text, 2001. v-xxxiv. Motion, Andrew. Wainewright the Poisoner. London: Faber, 2000. Novarr, David. The Lines of Life: Theories of Biography, 1880-1970. West Lafayette, IN: Purdue UP, 1986. Peers, Juliet. What No Man Had Ever Done Before. Malvern, Vic.: Dawn Revival, 1992. Regina v George Dean: Report of the Royal Commission, Appointed Seventh Day of May, 1895. Sydney: Government, 1895. Roper, Lyndal. Oedipus and the Devil: Witchcraft, Sexuality and Religion in Early Modern Europe. London and New York: Routledge, 1994. Seymour, Mary. Letter to George Clements, 12 October 1891. Letters to Frank Brereton, 22 October 1891, 30 December 1891, 25 April 1892.
Regina v George Dean: Report of the Royal Commission: 244-46. Spender, Dale. “Journal on a Journal.” Women’s Studies International Forum 10.1 (1987): 1-5. Sydney Morning Herald. 9 October 1895: 8, 10 October 1895: 5.
49

Shiloh, Ilana. "Adaptation, Intertextuality, and the Endless Deferral of Meaning." M/C Journal 10, no. 2 (May 1, 2007). http://dx.doi.org/10.5204/mcj.2636.

Full text
Abstract:
Film adaptation is an ambiguous term, both semantically and conceptually. Among its multiple connotations, the word “adaptation” may signify an artistic composition that has been recast in a new form, an alteration in the structure or function of an organism to make it better fitted for survival, or a modification in individual or social activity in adjustment to social surroundings. What all these definitions have in common is a tacitly implied hierarchy and valorisation: they presume the existence of an origin to which the recast work of art is indebted, or of biological or societal constraints to which the individual should conform in order to survive. The bias implied in the very connotations of the word has affected the theory and study of film adaptation. This bias is most noticeably reflected in the criterion of fidelity, which has been the major standard for evaluating film adaptations ever since George Bluestone’s pivotal 1957 Novels into Film. “Fidelity criticism,” observes McFarlane, “depends on a notion of the text as having and rendering up to the (intelligent) reader a single, correct ‘meaning’ which the film-maker has either adhered to or in some sense violated or tampered with” (7). But such an approach, Leitch argues, is rooted in several unacknowledged but entrenched misconceptions. It privileges literature over film, casts a false aura of originality on the precursor text, and ignores the fact that all texts, whether literary or cinematic, are essentially intertexts. As Kristeva, along with other poststructuralist theorists, has taught us, any text is an amalgam of others, a part of a larger fabric of cultural discourse (64-91).
“A text is a multidimensional space in which a variety of writings, none of them original, blend and clash”, writes Barthes in 1977 (146), and 15 years later film theoretician Robert Stam elaborates: “The text feeds on and is fed into an infinitely permutating intertext, which is seen through ever-shifting grids of interpretation” (57). The poststructuralists’ view of texts draws on the structuralists’ view of language, which is conceived as a system that pre-exists the individual speaker and determines subjectivity. These assumptions counter the Romantic ideology of individualism, with its associated concepts of authorial originality and a text’s single, unified meaning, based on the author’s intention. “It is language which speaks, not the author,” declares Barthes, “to write is to reach the point where only language acts, ‘performs’, and not me” (143). In consequence, the fidelity criterion of film adaptation may be regarded as an outdated vestige of the Romantic world-view. If all texts quote or embed fragments of earlier texts, the notion of an authoritative literary source, which the cinematic version should faithfully reproduce, is no longer valid. Film adaptation should rather be perceived as an intertextual practice, contributing to a dynamic interpretive exchange between the literary and cinematic texts, an exchange in which each text can be enriched, modified or subverted. The relationship between Jonathan Nolan’s short story “Memento Mori” and Christopher Nolan’s film Memento (2001) is a case in point. Here there was no source text, as the writing of the story did not precede the making of the film. The two processes were concurrent, and were triggered by the same basic idea, which Jonathan discussed with his brother during a road trip from Chicago to LA. Christopher developed the idea into a film and Jonathan turned it into a short story; he also collaborated in the film script.
Moreover, Jonathan designed otnemem (memento in reverse), the official Website, which contextualises the film’s fictional world, while increasing its ambiguity. What was adapted here was an idea, and each text explores in different ways the narrative, ontological and epistemological implications of that idea. The story, the film and the Website produce a multi-layered intertextual fabric, in which each thread potentially unravels the narrative possibilities suggested by the other threads. Intertextuality functions to increase ambiguity, and is therefore thematically relevant, for “Memento Mori”, Memento and otnemem are three fragmented texts in search of a coherent narrative. The concept of narrative may arguably be one of the most overused and under-defined terms in academic discourse. In the context of the present paper, the most productive approach is that of Wilkens, Hughes, Wildemuth and Marchionini, who define narrative as a chain of events related by cause and effect, occurring in time and space, and involving agency and intention. In fiction or in film, intention is usually associated with human agents, who can be either the characters or the narrator. It is these agents who move along the chain of causes and effects, so that cause-effect and agency work together to make the narrative. This narrative paradigm underpins mainstream Hollywood cinema in the years 1917-1960. In Narration in the Fiction Film, David Bordwell writes: The classical Hollywood film presents psychologically defined individuals who struggle to solve a clear-cut problem or to attain specific goals. … The story ends with a decisive victory or defeat, a resolution of the problem, and a clear achievement, or non achievement, of the goals. The principal causal agency is thus the character … . In classical fabula construction, causality is the prime unifying principle.
(157) The large body of films flourishing in America between 1941 and 1958, collectively dubbed film noir, subverts this narrative formula, but only partially. As accurately observed by Telotte, the devices of flashback and voice-over associated with the genre implicitly challenge conventionally linear narratives, while the use of the subjective camera shatters the illusion of objective truth and foregrounds the rift between reality and perception (3, 20). Yet in spite of the narrative experimentation that characterises the genre, the viewer of a classical film noir film can still figure out what happened in the fictional world and why, and can still reconstruct the story line according to sequential and causal schemata. This does not hold true for the intertextual composite consisting of Memento, “Memento Mori” and otnemem. The basic idea that generated the project was that of a self-appointed detective who obsessively investigates and seeks to revenge his wife’s rape and murder, while suffering from a total loss of short term memory. The loss of memory precludes learning and the acquisition of knowledge, so the protagonist uses scribbled notes, Polaroid photos and information tattooed onto his skin, in an effort to reconstruct his fragmented reality into a coherent and meaningful narrative. Narrativity is visually foregrounded: the protagonist reads his body to make sense of his predicament. To recap, the narrative paradigm relies on a triad of terms: connectedness (a chain of events), causality, and intentionality. The basic situation in Memento and “Memento Mori”, which involves a rupture in the protagonist’s/narrator’s psychological time, entails a breakdown of all three pre-requisites of narrativity. Since the protagonists of both story and film are condemned, by their memory deficiency, to living in an eternal present, they are unable to experience the continuity of time and the connectedness of events.
The disruption of temporality inevitably entails the breakdown of causality: the central character’s inability to determine the proper sequence of events prevents him from being able to distinguish between cause and effect. Finally, the notion of agency is also problematised, because agency implies the existence of a stable, identifiable subject, and identity is contingent on the subject’s uninterrupted continuity across time and change. The subversive potential of the basic narrative situation is heightened by the fact that both Memento and “Memento Mori” are focalised through the consciousness and perception of the main character. This means that the story, as well as the film, is conveyed from the point of view of a narrator who is constitutionally unable to experience his life as a narrative. This conundrum is addressed differently in the story and in the film, both thematically and formally. “Memento Mori” presents, in a way, the backdrop to Memento. It focuses on the figure of Earl, a brain damaged protagonist suffering from anterograde amnesia, who is staying in a blank, anonymous room, that we assume to be a part of a mental institution. We also assume that Earl’s brain damage results from a blow to the head that he received while witnessing the rape and murder of his wife. Earl is bent on avenging his wife’s death. To remind himself to do so, he writes messages to himself, which he affixes on the walls of his room. Leonard Shelby is Memento’s cinematic version of Earl. By Leonard’s own account, he has an inability to form memories. This, he claims, is the result of neurological damage sustained during an attack on him and his wife, an attack in which his wife was raped and killed. To be able to pursue his wife’s killers, he has recourse to various complex and bizarre devices—Polaroid photos, a quasi-police file, a wall chart, and inscriptions tattooed onto his skin—in order to replace his memory. 
Hampered by his affliction, Leonard trawls the motels and bars of Southern California in an effort to gather evidence against the killer he believes to be named “John G.” Leonard’s faulty memory is deviously manipulated by various people he encounters, of whom the most crucial are a bartender called Natalie and an undercover cop named Teddy, both involved in a lucrative drug deal. So much for a straightforward account of the short story and the film. But there is nothing straightforward about either Memento or “Memento Mori”. The basic narrative premise, consisting of a protagonist/narrator suffering from a severe memory deficit, is a condition entailing far-reaching psychological and philosophical implications. In the following discussion, I would like to focus on these two implications and to tie them in to the notions of narrativity, intertextuality, and eventually, adaptation. The first implication of memory loss is the dissolution of identity. Our sense of identity is contingent on our ability to construct an uninterrupted personal narrative, a narrative in which the present self is continuous with the past self. In Oneself as Another, his philosophical treatise on the concept of selfhood, Paul Ricoeur queries: “do we not consider human lives to be more readable when they have been interpreted in terms of the stories that people tell about them?” He concludes by observing that “interpretation of the self … finds in narrative, among others signs and symbols, a privileged form of mediation” (ft. 114). Ricoeur further suggests that the sense of selfhood is contingent on four attributes: numerical identity, qualitative identity, uninterrupted continuity across time and change, and finally, permanence in time that defines sameness. The loss of memory subverts the last two attributes of personal identity, the sense of continuity and permanence over time, and thereby also ruptures the first two.
In “Memento Mori” and Memento, the disintegration of identity is formally rendered through the fragmentation of the literary and cinematic narratives, respectively. In Jonathan Nolan’s short story, traditional linear narrative is disrupted by shifts in point of view and by graphic differences in the shape of the print on the page. “Memento Mori” is alternately narrated in the first and in the third person. The first person segments, which constitute the present of the story, are written by Earl to himself. As his memory span is ten-minute long, his existence consists of “just the same ten minutes, over and over again” (Nolan, 187). Fully aware of the impending fading away of memory, Earl’s present-version self leaves notes to his future-version self, in an effort to situate him in time and space and to motivate him to the final action of revenge. The literary device of alternating points of view formally subverts the notion of identity as a stable unity. Paradoxically, rather than setting him apart from the rest of us, Earl’s brain damage foregrounds his similarity. “Every man is broken into twenty-four-hour fractions,” observes Earl, comforting his future self by affirming his basic humanity, “Your problem is a little more acute, maybe, but fundamentally the same thing” (Nolan, 189). His observation echoes Beckett’s description of the individual as “the seat of a constant process of decantation … to the vessel containing the fluid of past time” (Beckett, 4-5). Identity, suggests Jonathan Nolan, following Beckett, among other things, is a theoretical construct. Human beings are works in progress, existing in a state of constant flux. We are all fragmented beings—the ten-minute man is only more so. A second strategy employed by Jonathan to convey the discontinuity of the self is the creation of visual graphic disunity. 
As noted by Yellowlees Douglas, among others, the static, fixed nature of the printed page and its austere linearity make it ideal for the representation of our mental construct of narrative. The text of “Memento Mori” appears on the page in three different font types: the first person segments, Earl’s admonitions to himself, appear in italics; the third person segments are written in regular type; and the notes and signs are capitalised. Christopher Nolan obviously has recourse to different strategies to reach the same ends. His principal technique, and the film’s most striking aspect, is its reversed time sequence. The film begins with a crude Polaroid flash photograph of a man’s body lying on a decaying wooden floor. The image in the photo gradually fades, until the camera sucks the picture up. The photograph presents the last chronological event; the film then skips backwards in ten-minute increments, mirroring the protagonist’s memory span. But the film’s time sequence is not simply a reversed linear structure. It is a triple-decker narrative, mirroring the three-part organisation of the story. In the opening scene, one comes to realise that the film-spool is running backwards. After several minutes the film suddenly reverses and runs forward for a few seconds. Then there is a sudden cut to a different scene, in black and white, where the protagonist (who we have just learned is called Leonard) begins to talk, out of the blue, about his confusion. Soon the film switches to a color scene, again unconnected, in which the “action” of the film begins. In the black and white scenes, which from then on are interspersed with the main action, Leonard attempts to understand what is happening to him and to explain (to an unseen listener) the nature of his condition. 
The “main action” of the film follows a double temporal structure: while each scene, as a unit of action, runs normally forward, each scene is triggered by the following, rather than by the preceding scene, so that we are witnessing a story whose main action goes back in time as the film progresses (Hutchinson and Read, 79). A third narrative thread, interspersed with the other two, is a story that functions as a foil to the film’s main action. It is the story of Sammy Jankis: one of the cases that Leonard worked on in his past career as an insurance investigator. Sammy was apparently suffering from anterograde amnesia, the same condition that Leonard is suffering from now. Sammy’s wife filed an insurance claim on his behalf, a claim that Leonard rejected on the grounds that Sammy’s condition was merely psychosomatic. Hoping to confirm Leonard’s diagnosis, Sammy’s diabetic wife puts her husband to the test. He fails the test as he tenderly administers multiple insulin injections to her, thereby causing her death. As Leonard’s beloved wife also suffered from diabetes, and as Teddy (the undercover cop) eventually tells Leonard that Sammy never had a wife, the Sammy Jankis parable functions as a mise en abyme, which can either corroborate or subvert the narrative that Leonard is attempting to construct of his own life. Sammy may be seen as Leonard’s symbolic double in that his form of amnesia foreshadows the condition with which Leonard will eventually be afflicted. This interpretation corroborates Leonard’s personal narrative of his memory loss, while tainting him with the blame for the death of Sammy’s wife. But the camera also suggests a more unsettling possibility—Leonard may ultimately be responsible for the death of his own wife. 
The scene in which Sammy, condemned by his amnesia, administers to his wife a repeated and fatal shot of insulin, is briefly followed by a scene of Leonard pinching his own wife’s thigh before her insulin shot, a scene recurring in the film like a leitmotif. The juxtaposition of the two scenes suggests that it is Leonard who, mistakenly or deliberately, has killed his wife, and that ever since he has been projecting his guilt onto others: the innocent victims of his trail of revenge. In this ironic interpretive twist, it is Leonard, rather than Sammy, who has been faking his amnesia. The parable of Sammy Jankis highlights another central concern of Memento and “Memento Mori”: the precarious nature of truth. This is the second psychological and philosophical implication of what Leonard persistently calls his “condition”, namely his loss of memory. The question explicitly raised in the film is whether memory records or creates, if it retains the lived life or reshapes it into a narrative that will confer on it unity and meaning. The answer is metaphorically suggested by the recurring shots of a mirror, which Leonard must use to read his body inscriptions. The mirror, as Lacan describes it, offers the infant his first recognition as a coherent, unique self. But this recognition is a mis-recognition, for the reflection has a coherence and unity that the subject both lacks and desires. The body inscriptions that Leonard can read only in the mirror do not necessarily testify to the truth. But they do enable him to create a narrative that makes his life worth living. A Lacanian reading of the mirror image has two profoundly unsettling implications. It establishes Leonard as a morally deficient, rather than neurologically deficient, human being, and it suggests that we are not fundamentally different from him. Leonard’s intricate system of notes and body inscriptions builds up an inventory of set representations to which he can refer in all his future experiences. 
Thus when he wakes up naked in bed with a woman lying beside him, he looks among his Polaroid photographs for a picture which he can match with her, which will tell him what the woman’s name is and what he can expect from her on the basis of past experience. But this, suggest Hutchinson and Read, is an external representation of operations that all of us perform mentally (89). We all respond to sensory input by constructing internal representations that form the foundations of our psyche. This view underpins current theories of language and of the mind. Semioticians tell us that the word, the signifier, refers to a mental representation of an object rather than to the object itself. Cognitivists assume that cognition consists in the operation of mental items which are symbols for real entities. Leonard’s apparently bizarre method of apprehending reality is thus essentially an externalisation of memory. But if, cognitively and epistemologically speaking, Lennie is less different from us than we would like to think, this implication may also extend to our moral nature. Our complicity with Leonard is mainly established through the film’s complex temporal structure, which makes us viscerally share the protagonist’s/narrator’s confusion and disorientation. We become as unable as he is to construct a single, coherent and meaningful narrative: the film’s obscurity is built in. Memento’s ambiguity is enhanced by the film’s Website, which presents a newspaper clipping about the attack on Leonard and his wife, torn pages from police and psychiatric reports, and a number of notes from Leonard to himself. While blurring the boundaries between story and film by suggesting that Leonard, like Earl, may have escaped from a mental institution, the Website otnemem also provides evidence that can either confirm or confound our interpretive efforts, such as a doctor’s report suggesting that “John G.” may be a figment of Leonard’s imagination. 
The precarious nature of truth is foregrounded by the fact that the narrative Leonard is trying to construct, as well as the narrative in which Christopher Nolan has embedded him, is a detective story. The traditional detective story proceeds from a two-fold assumption: truth exists, and it can be known. But Memento and “Memento Mori” undermine this epistemological confidence. They suggest that truth, like identity, is a fictional construct, derived from the tales we tell ourselves and recount to others. These tales do not coincide with objective reality; they are the prisms we create in order to understand reality, to make our lives bearable and worth living. Narratives are cognitive patterns that we construct to make sense of the world. They convey our yearning for coherence, closure, and a single unified meaning. The overlapping and conflicting threads interweaving Memento, “Memento Mori” and the Website otnemem simultaneously expose and resist our nostalgia for unity, by evoking a multiplicity of meanings and creating an intertextual web that is the essence of all adaptation. References Barthes, Roland. Image-Music-Text. London: Fontana, 1977. Beckett, Samuel. Proust. London: Chatto and Windus, 1931. Bluestone, George. Novels into Film. Berkeley and Los Angeles: California UP, 1957. Bordwell, David. Narration in the Fiction Film. Madison: Wisconsin UP, 1985. Hutchinson, Phil, and Rupert Read. “Memento: A Philosophical Investigation.” Film as Philosophy: Essays in Cinema after Wittgenstein and Cavell. Ed. Rupert Read and Jerry Goodenough. Hampshire: Palgrave, 2005. 72-93. Kristeva, Julia. “Word, Dialogue and Novel.” Desire in Language: A Semiotic Approach to Literature and Art. Ed. Leon S. Rudiez. Trans. Thomas Gora. New York: Columbia UP, 1980. 64-91. Lacan, Jacques. “The Mirror Stage as Formative of the Function of the I as Revealed in Psychoanalytic Experience.” Écrits: A Selection. New York: Norton, 1977. 1-7. Leitch, Thomas. 
“Twelve Fallacies in Contemporary Adaptation Theory.” Criticism 45.2 (2003): 149-71. McFarlane, Brian. Novel to Film: An Introduction to the Theory of Adaptation. Oxford: Clarendon Press, 1996. Nolan, Jonathan. “Memento Mori.” The Making of Memento. Ed. James Mottram. London: Faber and Faber, 2002. 183-95. Nolan, Jonathan. otnemem. 24 April 2007 <http://otnemem.com>. Ricoeur, Paul. Oneself as Another. Chicago: Chicago UP, 1992. Stam, Robert. “Beyond Fidelity: The Dialogics of Adaptation.” Film Adaptation. Ed. James Naremore. New Brunswick: Rutgers UP, 2000. 54-76. Telotte, J.P. Voices in the Dark: The Narrative Patterns of Film Noir. Urbana and Chicago: Illinois UP, 1989. Wilkens, T., A. Hughes, B.M. Wildemuth, and G. Marchionini. “The Role of Narrative in Understanding Digital Video.” 24 April 2007 <http://www.open-video.org/papers/Wilkens_Asist_2003.pdf>. Yellowlees Douglass, J. “Gaps, Maps and Perception: What Hypertext Readers (Don’t) Do.” 24 April 2007 <http://www.pd.org/topos/perforations/perf3/douglas_p3.html>. Citation reference for this article MLA Style Shiloh, Ilana. "Adaptation, Intertextuality, and the Endless Deferral of Meaning: Memento." M/C Journal 10.2 (2007). <http://journal.media-culture.org.au/0705/08-shiloh.php>. APA Style Shiloh, I. (May 2007) "Adaptation, Intertextuality, and the Endless Deferral of Meaning: Memento," M/C Journal, 10(2). Retrieved from <http://journal.media-culture.org.au/0705/08-shiloh.php>.
50

Holleran, Samuel. "Better in Pictures." M/C Journal 24, no. 4 (August 19, 2021). http://dx.doi.org/10.5204/mcj.2810.

Full text
Abstract:
While the term “visual literacy” has grown in popularity in the last 50 years, its meaning remains nebulous. It is described variously as: a vehicle for aesthetic appreciation, a means of defence against visual manipulation, a sorting mechanism for an increasingly data-saturated age, and a prerequisite to civic inclusion (Fransecky 23; Messaris 181; McTigue and Flowers 580). Scholars have written extensively about the first three subjects but there has been less research on how visual literacy frames civic life and how it might help the public as a tool to address disadvantage and assist in removing social and cultural barriers. This article examines a forerunner to visual literacy in the push to create an international symbol language born out of popular education movements, a project that fell short of its goals but still left a considerable impression on graphic media. This article, then, presents an analysis of visual literacy campaigns in the early postwar era. These campaigns did not attempt to invent a symbolic language but posited that images themselves served as a universal language in which students could receive training. Of particular interest is how the concept of visual literacy has been mobilised as a pedagogical tool in design, digital humanities and in broader civic education initiatives promoted by Third Space institutions. Behind the creation of new visual literacy curricula is the idea that images can help anchor a world community, supplementing textual communication. Figure 1: Visual Literacy Yearbook. Montebello Unified School District, USA, 1973. Shedding Light: Origins of the Visual Literacy Frame The term “visual literacy” came to the fore in the early 1970s on the heels of mass literacy campaigns. The educators, creatives and media theorists who first advocated for visual learning linked this aim to literacy, an unassailable goal, to promote a more radical curricular overhaul. 
They challenged a system that had hitherto only acknowledged a very limited pathway towards academic success; pushing “language and mathematics”, courses “referred to as solids (something substantial) as contrasted with liquids or gases (courses with little or no substance)” (Eisner 92). This was deemed “a parochial view of both human ability and the possibilities of education” that did not acknowledge multiple forms of intelligence (Gardner). This change not only integrated elements of mass culture that had been rejected in education, notably film and graphic arts, but also encouraged the critique of images as a form of good citizenship, assuming that visually literate arbiters could call out media misrepresentations and manipulative political advertising (Messaris, “Visual Test”). This movement was, in many ways, reactive to new forms of mass media that began to replace newspapers as key forms of civic participation. Unlike simple literacy (being able to decipher letters as a mnemonic system), visual literacy involves imputing meanings to images where meanings are less fixed, yet still with embedded cultural signifiers. Visual literacy promised to extend enlightenment metaphors of sight (as in the German Aufklärung) and illumination (as in the French Lumières) to help citizens understand an increasingly complex marketplace of images. The move towards visual literacy was not so much a shift towards images (and away from books and oration) but an affirmation of the need to critically investigate the visual sphere. It introduced doubt to previously upheld hierarchies of perception. Sight, to Kant the “noblest of the senses” (158), was no longer the sense “least affected” by the surrounding world but an input centre that was equally manipulable. In Kant’s view of societal development, the “cosmopolitan” held the key to pacifying bellicose states and ensuring global prosperity and tranquillity. 
The process of developing a cosmopolitan ideology rests, according to Kant, on the gradual elimination of war and “the education of young people in intellectual and moral culture” (188-89). Transforming disparate societies into “a universal cosmopolitan existence” that would “at last be realised as the matrix within which all the original capacities of the human race may develop” would take well-funded educational institutions and, potentially, a new framework for imparting knowledge (Kant 51). To some, the world of the visual presented a baseline for shared experience. Figure 2: Exhibition by the Gesellschafts- und Wirtschaftsmuseum in Vienna, photograph c. 1927. An International Picture Language The quest to find a mutually intelligible language that could “bridge worlds” and solder together all of humankind goes back to the late nineteenth century and the Esperanto movement of Ludwig Zamenhof (Schor 59). The expression of this ideal in the world of the visual picked up steam in the interwar years with designers and editors like Fritz Kahn, Gerd Arntz, and Otto and Marie Neurath. Their work transposing complex ideas into graphic form has been rediscovered as an antecedent to modern infographics, but the symbols they deployed were intended not merely to explain, but also to aid education and build international fellowship unbounded by spoken language. The Neuraths in particular are celebrated for their international picture language or Isotypes. These pictograms (sometimes viewed as proto-emojis) can be used to represent data without text. Taken together they are an “intemporal, hieroglyphic language” that Neurath hoped would unite working-class people the world over (Lee 159). The Neuraths’ work was done in the explicit service of visual education with a popular socialist agenda and incubated in the social sphere of Red Vienna at the Gesellschafts- und Wirtschaftsmuseum (Social and Economic Museum) where Otto served as Director. 
The Wirtschaftsmuseum was an experiment in popular education, with multiple branches and late opening hours to accommodate the “working man [who] has time to see a museum only at night” (Neurath 72-73). The Isotype contained universalist aspirations for the “making of a world language, or a helping picture language—[that] will give support to international developments generally” and “educate by the eye” (Neurath 13). Figure 3: Gerd Arntz Isotype Images. (Source: University of Reading.) The Isotype was widely adopted in the postwar era in pre-packaged sets of symbols used in graphic design and wayfinding systems for buildings and transportation networks, but with the socialism of the Neuraths peeled away, leaving only the system of logos that we are familiar with from airport washrooms, charts, and public transport maps. Much of the uptake in this symbol language could be traced to increased mobility and tourism, particularly in countries that did not make use of a Roman alphabet. The 1964 Olympics in Tokyo helped pave the way when organisers, fearful of jumbling too many scripts together, opted instead for black and white icons to represent the program of sports that summer. The new focus on the visual was both technologically mediated—cheaper printing and broadcast technologies made the diffusion of image increasingly possible—but also ideologically supported by a growing emphasis on projects that transcended linguistic, ethnic, and national borders. The Olympic symbols gradually morphed into Letraset icons, and, later, symbols in the Unicode Standard, which are the basis for today’s emojis. Wordless signs helped facilitate interconnectedness, but only in the most literal sense; their application was limited primarily to sports mega-events, highway maps, and “brand building”, and they never fulfilled their role as an educational language “to give the different nations a common outlook” (Neurath 18). 
Universally understood icons, particularly in the form of emojis, point to a rise in visual communication but they have fallen short as a cosmopolitan project, supporting neither the globalisation of Kantian ethics nor the transnational socialism of the Neuraths. Figure 4: Symbols in use. Women's bathroom. 1964 Tokyo Olympics. (Source: The official report of the Organizing Committee.) Counter Education By mid-century, the optimism of a universal symbol language seemed dated, and focus shifted from distillation to discernment. New educational programs presented ways to study images, increasingly reproducible with new technologies, as a language in and of themselves. These methods had their roots in the fin-de-siècle educational reforms of John Dewey, Helen Parkhurst, and Maria Montessori. As early as the 1920s, progressive educators were using highly visual magazines, like National Geographic, as the basis for lesson planning, with the hopes that they would “expose students to edifying and culturally enriching reading” and “develop a more catholic taste or sensibility, representing an important cosmopolitan value” (Hawkins 45). The rise in imagery from previously inaccessible regions helped pupils to see themselves in relation to the larger world (although this connection always came with the presumed superiority of the reader). “Pictorial education in public schools” taught readers—through images—to accept a broader world but, too often, they saw photographs as a “straightforward transcription of the real world” (Hawkins 57). The images of cultures and events presented in Life and National Geographic for the purposes of education and enrichment were now the subject of greater analysis in the classroom, not just as “windows into new worlds” but as cultural products in and of themselves. 
The emerging visual curriculum aimed to do more than just teach with previously excluded modes (photography, film and comics); it would investigate how images presented and mediated the world. This gained wider appeal with new analytical writing on film, like Raymond Spottiswoode's Grammar of the Film (1950) which sought to formulate the grammatical rules of visual communication (Messaris 181), influenced by semiotics and structural linguistics; the emphasis on grammar can also be seen in far earlier writings on design systems such as Owen Jones’s 1856 The Grammar of Ornament, which also advocated for new, universalising methods in design education (Sloboda 228). The inventorying impulse is on display in books like Donis A. Dondis’s A Primer of Visual Literacy (1973), a text that meditates on visual perception but also functions as an introduction to line and form in the applied arts, picking up where the Bauhaus left off. Dondis enumerates the “syntactical guidelines” of the applied arts with illustrations that are in keeping with 1920s books by Kandinsky and Klee and analyse pictorial elements. However, at the end of the book she shifts focus with two chapters that examine “messaging” and visual literacy explicitly. Dondis predicts that “an intellectual, trained ability to make and understand visual messages is becoming a vital necessity to involvement with communication. It is quite likely that visual literacy will be one of the fundamental measures of education in the last third of our century” (33) and she presses for more programs that incorporate the exploration and analysis of images in tertiary education. Figure 5: Ideal spatial environment for the Blueprint charts, 1970. (Image: Inventory Press.) Visual literacy in education arrived in earnest with a wave of publications in the mid-1970s. 
They offered ways for students to understand media processes and for teachers to use visual culture as an entry point into complex social and scientific subject matter, tapping into the “visual consciousness of the ‘television generation’” (Fransecky 5). Visual culture was often seen as inherently democratising, a break from stuffiness, the “artificialities of civilisation”, and the “archaic structures” that set sensorial perception apart from scholarship (Dworkin 131-132). Many radical university projects and community education initiatives of the 1960s made use of new media in novel ways: from Maurice Stein and Larry Miller’s fold-out posters accompanying Blueprint for Counter Education (1970) to Emory Douglas’s graphics for The Black Panther newspaper. Blueprint’s text- and image-dense wall charts were made via assemblage and they were imagined less as charts and more as a “matrix of resources” that could be used—and added to—by youth to undertake their own counter education (Cronin 53). These experiments in visual learning helped to break down old hierarchies in education, but their aim was influenced more by countercultural notions of disruption than the universal ideals of cosmopolitanism. From Image as Text to City as Text For a brief period in the 1970s, thinkers like Marshall McLuhan (McLuhan et al., Massage) and artists like Bruno Munari (Tanchis and Munari) collaborated fruitfully with graphic designers to create books that mixed text and image in novel ways. Using new compositional methods, they broke apart traditional printing lock-ups to superimpose photographs, twist text, and bend narrative frames. The most famous work from this era is, undoubtedly, The Medium Is the Massage (1967), McLuhan’s team-up with graphic designer Quentin Fiore, but it was followed by dozens of other books intended to communicate theory and scientific ideas with popularising graphics. 
Following in the footsteps of McLuhan, many of these texts sought not just to explain an issue but to self-consciously reference their own method of information delivery. These works set the precedent for visual aids (and, to a lesser extent, audio) that launched a diverse, non-hierarchical discourse that was nonetheless bound to tactile artefacts. In 1977, McLuhan helped develop a media textbook for secondary school students called City as Classroom: Understanding Language and Media. It is notable for its direct address style and its focus on investigating spaces outside of the classroom (provocatively, a section on the third page begins with “Should all schools be closed?”). The book follows with a fine-grained analysis of advertising forms in which students are asked to first bring advertisements into class for analysis and later to go out into the city to explore “a man-made environment, a huge warehouse of information, a vast resource to be mined free of charge” (McLuhan et al., City 149). As a document City as Classroom is critical of existing teaching methods, in line with the radical “in the streets” pedagogy of its day. McLuhan’s theories proved particularly salient for the counter education movement, in part because they tapped into a healthy scepticism of advertisers and other image-makers. They also dovetailed with growing discontent with the ad-strewn visual environment of cities in the 1970s. Budgets for advertising had mushroomed in the 1960s and outdoor advertising “cluttered” cities with billboards and neon, generating “fierce intensities and new hybrid energies” that threatened to throw off the visual equilibrium (McLuhan 74). Visual literacy curricula brought in experiential learning focussed on the legibility of the cities, mapping, and the visualisation of urban issues with social justice implications. 
The Detroit Geographical Expedition and Institute (DGEI), a “collective endeavour of community research and education” that arose in the aftermath of the 1967 uprisings, is the most storied of the groups that suffused the collection of spatial data with community engagement and organising (Warren et al. 61). The following decades would see a tamed approach to visual literacy that, while still pressing for critical reading, did not upend traditional methods of educational delivery. Figure 6: Beginning a College Program-Assisting Teachers to Develop Visual Literacy Approaches in Public School Classrooms. 1977. ERIC. Searching for Civic Education The visual literacy initiatives formed in the early 1970s both affirmed existing civil society institutions while also asserting the need to better inform the public. Most of the campaigns were sponsored by universities, major libraries, and international groups such as UNESCO, which published its “Declaration on Media Education” in 1982. They noted that “participation” was “essential to the working of a pluralistic and representative democracy” and the “public—users, citizens, individuals, groups ... were too systematically overlooked”. Here, the public is conceived as both “targets of the information and communication process” and users who “should have the last word”. To that end their “continuing education” should be ensured (Study 18). Programs consisted primarily of cognitive “see-scan-analyse” techniques (Little et al.) for younger students but some also sought to bring visual analysis to adult learners via continuing education (often through museums eager to engage more diverse audiences) and more radical popular education programs sponsored by community groups. By the mid-80s, scores of modules had been built around the comprehension of visual media and had become standard educational fare across North America, Australasia, and to a lesser extent, Europe. 
There was an increasing awareness of the role of data and image presentation in decision-making, as evidenced by the surprising commercial success of Edward Tufte’s 1983 book, The Visual Display of Quantitative Information. Visual literacy—or at least image analysis—was now enmeshed in teaching practice and needed little active advocacy. Scholarly interest in the subject went into a brief period of hibernation in the 1980s and early 1990s, only to be reborn with the arrival of new media distribution technologies (CD-ROMs and then the internet) in classrooms and the widespread availability of digital imaging technology starting in the late 1990s; companies like Adobe distributed free and reduced-fee licences to schools and launched extensive teacher training programs. Visual literacy was reanimated but primarily within a circumscribed academic field of education and data visualisation. Figure 7: Visual Literacy; What Research Says to the Teacher, 1975. National Education Association. USA. Part of the shifting frame of visual literacy has to do with institutional imperatives, particularly in places where austerity measures forced strange alliances between disciplines. What had been a project in alternative education morphed into an uncontested part of the curriculum and a dependable budget line. This shift was already forecasted in 1972 by Harun Farocki who, writing in Filmkritik, noted that funding for new film schools would be difficult to obtain but money might be found for “training in media education … a discipline that could persuade ministers of education, that would at the same time turn the budget restrictions into an advantage, and that would match the functions of art schools” (98). Nearly 50 years later educators are still using media education (rebranded as visual or media literacy) to make the case for fine arts and humanities education. 
While earlier iterations of visual literacy education were often too reliant on the idea of cracking the “code” of images, they did promote ways of learning that were a deep departure from the rote methods of previous generations. Next-gen curricula frame visual literacy as largely supplemental—a resource, but not a program. By the end of the 20th century, visual literacy had changed from a scholarly interest to a standard resource in the “teacher’s toolkit”, entering into school programs and influencing museum education, corporate training, and the development of public-oriented media (Literacy). An appreciation of image culture was seen as key to creating empathetic global citizens, but its scope was increasingly limited. With rising austerity in the education sector (a shift that preceded the 2008 recession by decades in some countries), art educators, museum enrichment staff, and design researchers have needed to make a case for why their disciplines are relevant in pedagogical models that are increasingly aimed at “skills-based” and “job ready” teaching. Arts educators worked hard to insert their fields into learning goals for secondary students as visual literacy, with the hope that “literacy” would carry the weight of an educational imperative and not a supplementary field of study. Conclusion For nearly a century, educational initiatives have sought to inculcate a cosmopolitan perspective with a variety of teaching materials and pedagogical reference points. Symbolic languages, like the Isotype, looked to unite disparate people with shared visual forms; while educational initiatives aimed to train the eyes of students to make them more discerning citizens. The term ‘visual literacy’ emerged in the 1960s and has since been deployed in programs with a wide variety of goals. 
Countercultural initiatives saw it as a prerequisite for popular education from the ground up, but, in the years since, it has been formalised and brought into more staid curricula, often as a sort of shorthand for learning from media and pictures. The grand cosmopolitan vision of a complete ‘visual language’ has been scaled back considerably, but still exists in trace amounts. Processes of globalisation require images to universalise experiences, commodities, and more for people without shared languages. Emoji alphabets and globalese (brands and consumer messaging that are “visual-linguistic” amalgams “increasingly detached from any specific ethnolinguistic group or locality”) are a testament to a mediatised banal cosmopolitanism (Jaworski 231). In this sense, becoming “fluent” in global design vernacular means familiarity with firms and products, an understanding that is aesthetic, not critical. It is very much the beneficiaries of globalisation—both state and commercial actors—who have been able to harness increasingly image-based technologies for their benefit. To take a humorous but nonetheless consequential example, Spanish culinary boosters were able to successfully lobby for a paella emoji (Miller) rather than having a food symbol from a less wealthy country such as a Senegalese jollof or a Moroccan tagine. This trend has gone even further as new forms of visual communication are increasingly streamlined and managed by for-profit media platforms. The ubiquity of these forms of communication and their global reach has made visual literacy more important than ever but it has also fundamentally shifted the endeavour from a graphic sorting practice to a critical piece of social infrastructure that has tremendous political ramifications. Visual literacy campaigns hold out the promise of educating students in an image-based system with the potential to transcend linguistic and cultural boundaries. 
This cosmopolitan political project has not yet been realised, as the visual literacy frame has drifted into specialised silos of art, design, and digital humanities education. It can help bridge the “incomplete connections” of an increasingly globalised world (Calhoun 112), but it does not have a program in and of itself. Rather, an evolving visual literacy curriculum might be seen as a litmus test for how we imagine the role of images in the world. References Brown, Neil. “The Myth of Visual Literacy.” Australian Art Education 13.2 (1989): 28-32. Calhoun, Craig. “Cosmopolitanism in the Modern Social Imaginary.” Daedalus 137.3 (2008): 105–114. Cronin, Paul. “Recovering and Rendering Vital Blueprint for Counter Education at the California Institute for the Arts.” Blueprint for Counter Education. Inventory Press, 2016. 36-58. Dondis, Donis A. A Primer of Visual Literacy. MIT P, 1973. Dworkin, M.S. “Toward an Image Curriculum: Some Questions and Cautions.” Journal of Aesthetic Education 4.2 (1970): 129–132. Eisner, Elliot. Cognition and Curriculum: A Basis for Deciding What to Teach. Longmans, 1982. Farocki, Harun. “Film Courses in Art Schools.” Trans. Ted Fendt. Grey Room 79 (Apr. 2020): 96–99. Fransecky, Roger B. Visual Literacy: A Way to Learn—A Way to Teach. Association for Educational Communications and Technology, 1972. Gardner, Howard. Frames Of Mind. Basic Books, 1983. Hawkins, Stephanie L. “Training the ‘I’ to See: Progressive Education, Visual Literacy, and National Geographic Membership.” American Iconographic. U of Virginia P, 2010. 28–61. Jaworski, Adam. “Globalese: A New Visual-Linguistic Register.” Social Semiotics 25.2 (2015): 217-35. Kant, Immanuel. Anthropology from a Pragmatic Point of View. Cambridge UP, 2006. Kant, Immanuel. “Perpetual Peace.” Political Writings. Ed. H. Reiss. Cambridge UP, 1991 [1795]. 116–130. Kress, G., and T. van Leeuwen. Reading images: The Grammar of Visual Design. Routledge, 1996. 
Literacy Teaching Toolkit: Visual Literacy. Department of Education and Training (DET), State of Victoria. 29 Aug. 2018. 30 Sep. 2020 <https://www.education.vic.gov.au:443/school/teachers/teachingresources/discipline/english/literacy/ readingviewing/Pages/litfocusvisual.aspx>. Lee, Jae Young. “Otto Neurath's Isotype and the Rhetoric of Neutrality.” Visible Language 42.2: 159-180. Little, D., et al. Looking and Learning: Visual Literacy across the Disciplines. Wiley, 2015. Messaris, Paul. “Visual Literacy vs. Visual Manipulation.” Critical Studies in Mass Communication 11.2: 181-203. DOI: 10.1080/15295039409366894 ———. “A Visual Test for Visual ‘Literacy.’” The Annual Meeting of the Speech Communication Association. 31 Oct. to 3 Nov. 1991. Atlanta, GA. <https://files.eric.ed.gov/fulltext/ED347604.pdf>. McLuhan, Marshall. Understanding Media: The Extensions of Man. McGraw-Hill, 1964. McLuhan, Marshall, Quentin Fiore, and Jerome Agel. The Medium Is the Massage, Bantam Books, 1967. McLuhan, Marshall, Kathryn Hutchon, and Eric McLuhan. City as Classroom: Understanding Language and Media. Agincourt, Ontario: Book Society of Canada, 1977. McTigue, Erin, and Amanda Flowers. “Science Visual Literacy: Learners' Perceptions and Knowledge of Diagrams.” Reading Teacher 64.8: 578-89. Miller, Sarah. “The Secret History of the Paella Emoji.” Food & Wine, 20 June 2017. <https://www.foodandwine.com/news/true-story-paella-emoji>. Munari, Bruno. Square, Circle, Triangle. Princeton Architectural Press, 2016. Newfield, Denise. “From Visual Literacy to Critical Visual Literacy: An Analysis of Educational Materials.” English Teaching-Practice and Critique 10 (2011): 81-94. Neurath, Otto. International Picture Language: The First Rules of Isotype. K. Paul, Trench, Trubner, 1936. Schor, Esther. Bridge of Words: Esperanto and the Dream of a Universal Language. Henry Holt and Company, 2016. Sloboda, Stacey. 
“‘The Grammar of Ornament’: Cosmopolitanism and Reform in British Design.” Journal of Design History 21.3 (2008): 223-36. Study of Communication Problems: Implementation of Resolutions 4/19 and 4/20 Adopted by the General Conference at Its Twenty-First Session; Report by the Director-General. UNESCO, 1983. Tanchis, Aldo, and Bruno Munari. Bruno Munari: Design as Art. MIT P, 1987. Warren, Gwendolyn, Cindi Katz, and Nik Heynen. “Myths, Cults, Memories, and Revisions in Radical Geographic History: Revisiting the Detroit Geographical Expedition and Institute.” Spatial Histories of Radical Geography: North America and Beyond. Wiley, 2019. 59-86.