Academic literature on the topic 'Inc Harvey Probber'

Create a spot-on reference in APA, MLA, Chicago, Harvard, and other styles

Consult the lists of relevant articles, books, theses, conference reports, and other scholarly sources on the topic 'Inc Harvey Probber.'

Next to every source in the list of references, there is an 'Add to bibliography' button. Click it, and we will automatically generate the bibliographic reference to the chosen work in the citation style you need: APA, MLA, Harvard, Chicago, Vancouver, etc.

You can also download the full text of the academic publication as a PDF and read its abstract online whenever it is available in the metadata.

Journal articles on the topic "Inc Harvey Probber"

1

Chien, K., R. L. Van de Velde, and R. C. Heusser. "Technical notes on routine TEM procedures." Proceedings, annual meeting, Electron Microscopy Society of America 45 (August 1987): 630–31. http://dx.doi.org/10.1017/s042482010012758x.

Abstract:
Sectioning quality of epoxy resins can be improved by the addition of 1% silicone 200 fluid (Dow Corning); however, this produces a softer block. To compensate, a harder plastic has been used for embedding various tissues encountered in our pathology laboratory. Exact amounts of the plastic mixture can be made up directly for embedding as shown: the chart reveals a Poly/Bed 812 (WPE 145) to anhydride ratio of 1:0.7 and an NMA to DDSA ratio of 7:3. 1% silicone fluid is added to the above mixtures. Due to impurities within the DDSA and NMA, the polymerized epoxy blocks vary in darkness, which appears to affect sectioning quality. After discussing this problem with Polysciences Inc., they agreed to purify their anhydrides in an effort to standardize the consistency of the plastic.
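The ratios reported in this abstract can be turned into per-batch component amounts. A minimal sketch of that arithmetic follows — the helper is hypothetical (not from the source), and it assumes the ratios are by weight and that "anhydride" denotes the combined NMA + DDSA fraction:

```python
def embedding_mixture(total_g: float) -> dict:
    """Split `total_g` grams of resin + anhydride into components,
    using the abstract's ratios: resin:anhydride = 1:0.7, NMA:DDSA = 7:3,
    plus 1% silicone 200 fluid added to the final mixture."""
    resin = total_g * (1 / 1.7)        # 1 part Poly/Bed 812
    anhydride = total_g * (0.7 / 1.7)  # 0.7 parts combined anhydride
    nma = anhydride * 0.7              # NMA : DDSA = 7 : 3
    ddsa = anhydride * 0.3
    silicone = total_g * 0.01          # 1% silicone 200 fluid
    return {
        "Poly/Bed 812": round(resin, 2),
        "NMA": round(nma, 2),
        "DDSA": round(ddsa, 2),
        "silicone 200": round(silicone, 2),
    }

# Example: a 100 g batch
print(embedding_mixture(100.0))
```

For a 100 g batch this yields roughly 58.8 g resin, 28.8 g NMA, 12.4 g DDSA, and 1 g silicone fluid.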
2

Undavalli, Vamsi Krishna, Satyanarayana Chowdary Ponnaganti, and Hanumanth Narni. "Prevalence of generalized and abdominal obesity: India’s big problem." International Journal Of Community Medicine And Public Health 5, no. 4 (March 23, 2018): 1311. http://dx.doi.org/10.18203/2394-6040.ijcmph20180984.

Abstract:
Background: The rising prevalence of overweight and obesity in India has a direct correlation with the increasing prevalence of obesity-related co-morbidities: hypertension, the metabolic syndrome, dyslipidemia, type 2 diabetes mellitus, and cardiovascular disease. The risk for these disorders appears to start from a body mass index (BMI) of about 21 kg/m2. The objective of the study was to determine the prevalence of generalized and abdominal obesity in the field practice area. Methods: A community-based cross-sectional study was conducted among 309 people in the rural field practice area of a medical college from January to March 2017. Results: In the present study, the prevalence of generalized, abdominal and combined obesity was 56%, 71.2% and 51.3% respectively. Conclusions: Prevention of obesity should begin in early childhood. Obesity is harder to treat in adults than in children. The control of obesity centers on weight reduction. Information Education and Communication (IEC) and Behaviour Change Communication (BCC) are used to encourage individuals in society to adopt healthy behaviours such as dietary modifications, increased physical activity, or a combination of both.
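The BMI figure cited in the abstract is computed as weight divided by height squared. A minimal sketch of the calculation (hypothetical helper; the ~21 kg/m2 cutoff is the risk-onset value quoted above, not a diagnostic category):

```python
def bmi(weight_kg: float, height_m: float) -> float:
    """Body mass index in kg/m^2: weight divided by height squared."""
    return weight_kg / height_m ** 2

# Example: a 70 kg person who is 1.75 m tall
print(round(bmi(70, 1.75), 1))  # 22.9, above the ~21 kg/m^2 risk onset cited above
```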
3

Dellana, Christopher J. "Higher Education: An Appropriate Realm to Impose False Claims Act Liability Under the Post-Formation Implied False Certification Theory." University of Pittsburgh Law Review 78, no. 2 (March 29, 2017). http://dx.doi.org/10.5195/lawreview.2016.453.

Abstract:
Confederate batteries opened up on Fort Sumter in April of 1861, inaugurating the bloodiest conflict in American history. President Abraham Lincoln’s war effort, nursing wounds from defeats at Fredericksburg in 1862 and Chancellorsville in 1863, sorely needed more men and supplies. Propaganda campaigns and conscription efforts filled gaps in the depleted ranks of Lincoln’s army, helping it swell into the largest mobilization of troops in the world. Reliable supplies were, however, harder to come by; while Union soldiers fell to Confederate bullets and bayonets on the battlefield, army commissaries and quartermasters fell victim to fraud. A lack of meaningful government oversight had created an environment rife with profiteering. During the first years of the war, the government unwittingly purchased 1,000 horses so sick with every known equine disease that they were entirely useless; in another instance, the government paid a contractor for 411 horses of which only 76 were found fit for service (with the remainder being either blind, undersized, ringboned, or dead upon arrival). The government also bought artillery shells filled with sawdust rather than gunpowder, flimsy shoes that lasted for only twenty days, “rotten” blankets, “worthless” overcoats, and “muskets not [even] worth shooting.” To stop these abuses, Congress appointed a special committee, called the Select Committee on Government Contracts, to investigate the extent of the fraudulent contracting; the committee solicited testimony from military personnel, experts, and others that highlighted the disturbing magnitude of the problem. In response, the Union government promulgated the False Claims Act (“FCA”) in March of 1863. Following the conclusion of the war, and the rapid decline of government contracting needs, the FCA was left to gather dust in a forgotten corner of federal law until the late twentieth century.
In the 1980s, the FCA surged back to prominence to address abuses in the defense contracting industry and, once again, it became the government’s weapon of choice to combat fraud. Since its Civil War origins, the FCA has undergone substantial changes. Congress, in recognition of the FCA’s increasing importance with the growth of the modern regulatory state, expanded the purview of the FCA in both 1986 and 2009, much to the chagrin of government contractors. The 2009 amendment, in particular, was a clear demonstration of congressional intent to expand the scope of the FCA by overriding federal judicial precedent that attempted to limit it. Congress’s goal in amending the FCA, thus, was not just to “enact a broad remedial statute” but also to “preserve the traditional boundaries of fraud.” The FCA operates as a powerful tool to combat fraud that, otherwise left unchecked, might imperil the federal government’s finances. The FCA allows either the Attorney General or a qui tam whistleblower (known in the FCA context as a relator) to bring an action on behalf of the United States against persons or entities committing certain types of fraud against the government. The FCA, codified at 31 U.S.C. § 3729, holds that any individual who “knowingly” presents or knowingly conspires to “present[], or cause[] to be presented, a false or fraudulent claim for payment or approval” or “makes, uses, or causes to be made or used, a false record or statement material to a false . . . claim” is liable under the FCA, which imposes damages up to $11,000 per violation in addition to treble the amount of the government’s damages. This can result in cases where the damages could total a staggering $2 billion. The FCA, as a tool of fraud deterrence and of compliance enforcement, has had the most significant effect on the healthcare industry.
By way of illustration, between 1986 and 2009, two-thirds of the $22 billion recovered by the federal government ($14.3 billion) came from recoveries in the healthcare industry. Since 2009, however, differing interpretations of the Fraud Enforcement and Recovery Act (“FERA”), the passage of the Patient Protection and Affordable Care Act (“ACA”), and the Supreme Court’s unanimous decision in Universal Health Services, Inc. v. United States ex rel. Escobar have all expanded the scope of the FCA, leading new industries to find themselves increasingly in the crosshairs of expanded procedural theories of liability. At an operative level, the FCA posits that both “factually false” and “legally false” claims are actionable; “factually false” claims include goods or services either incorrectly described or not provided at all, and “legally false” claims are false based on statements, promises, or other certifications of compliance. While various circuits have held that the FCA reaches factually false conduct, legal falsity (with the Supreme Court’s recent endorsement) could gain traction as an equally important theory for prosecuting fraud. This expanded theory of liability may continue to evolve as the industries that the FCA regulates continue to evolve, as well. One such industry falling under this broad purview is higher education. This Note will address whether educational institutions in the for-profit sector should be held liable under the FCA for entering into a Program Participant Agreement (“PPA”) with the government, in good faith, only to thereafter commit fraud. This Note contends that the modern higher education environment provides an appropriate context in which courts may permissibly disregard any distinction between conditions of participation and conditions of payment for purposes of imposing FCA liability.
It further posits that the Supreme Court’s Escobar decision, though an important landmark toward a broader enforcement tool, did not go far enough to deter fraud in higher education. Part I will describe the background of the FCA, the rationale for the development of the “legally false” theory of liability, and the differences between the express and implied types of certification. It will also discuss judicial interpretation of legal falsity, with emphasis on the Supreme Court’s decision in Escobar. Part II will address conditions of participation and conditions of payment and why the difference may remain significant in the fraud context. Part III will explain the structure of for-profit educational institutions, their role as government contractors, and the nature of the circuit split regarding the receipt of Higher Education Act (“HEA”) Title IV funds and FCA liability. Part IV will discuss policy implications of this “implied certification of post-formation performance” theory and why the educational setting is the appropriate venue in which to hold government contractors liable for fraud on an expansive sub-theory of implied false certification.
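The damages arithmetic described in this abstract (treble the government's actual damages, plus a per-violation penalty of up to $11,000) can be illustrated with a toy calculation. The helper below is hypothetical and not legal guidance; it simply applies the figures quoted in the abstract:

```python
def fca_exposure(actual_damages: float, violations: int,
                 per_violation_penalty: float = 11_000) -> float:
    """Toy FCA exposure estimate per the abstract: treble damages
    plus a statutory penalty for each false claim (up to $11,000)."""
    return 3 * actual_damages + violations * per_violation_penalty

# Example: $1M in actual damages across 10 false claims
# 3 * 1,000,000 + 10 * 11,000 = 3,110,000
print(fca_exposure(1_000_000, 10))
```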
4

Burns, Alex. "'This Machine Is Obsolete'." M/C Journal 2, no. 8 (December 1, 1999). http://dx.doi.org/10.5204/mcj.1805.

Abstract:
'He did what the cipher could not, he rescued himself.' -- Alfred Bester, The Stars My Destination (23) On many levels, the new Nine Inch Nails album The Fragile is a gritty meditation about different types of End: the eternal relationship cycle of 'fragility, tension, ordeal, fragmentation' (adapted, with apologies to Wilhelm Reich); fin-de-siècle anxiety; post-millennium foreboding; a spectre of the alien discontinuity that heralds an on-rushing future vastly different from the one envisaged by Enlightenment Project architects. In retrospect, it's easy for this perspective to be dismissed as jargon-filled cyber-crit hyperbole. Cyber-crit has always been at its best when it invents pre-histories and finds hidden connections between different phenomena (like the work of Greil Marcus and early Mark Dery), and not when it is closer to Chinese Water Torture, name-checking the canon's icons (the 'Deleuze/Guattari' tag-team), texts and key terms. "The organization of sound is interpreted historically, politically, socially ... . It subdues music's ambition, reins it in, restores it to its proper place, reconciles it to its naturally belated fate", comments imagineer Kodwo Eshun (4) on how cyber-crit destroys albums and the innocence of the listening experience. This is how official histories are constructed a priori and freeze-dried according to personal tastes and prior memes: sometimes the most interesting experiments are Darwinian dead-ends that fail to make the canon, or don't register on the radar. Anyone approaching The Fragile must also contend with the music industry's harsh realities. For every 10 000 Goth fans who moshed to the primal 'kill-fuck-dance' rhythms of the hit single "Closer" (heeding its siren-call to fulfil basic physiological needs and build niche-space), maybe 20 noted that the same riff returned with a darker edge in the title track to The Downward Spiral, undermining the glorification of Indulgent hedonism.
"The problem with such alternative audiences," notes Disinformation Creative Director Richard Metzger, "is that they are trying to be different -- just like everyone else." According to author Don Webb, "some mature Chaos and Black Magicians reject their earlier Nine Inch Nails-inspired Goth beginnings and are extremely critical towards new adopters because they are uncomfortable with the subculture's growing popularity, which threatens to taint their meticulously constructed 'mysterious' worlds. But by doing so, they are also rejecting their symbolic imprinting and some powerful Keys to unlocking their personal history." It is also difficult to separate Nine Inch Nails from the commercialisation and colossal money-making machine that inevitably ensued on the MTV tour circuit: do we blame Michael Trent Reznor because most of his audience are unlikely to be familiar with 'first-wave' industrial bands including Cabaret Voltaire and the experiments of Genesis P-Orridge in Throbbing Gristle? Do we accuse Reznor of being a plagiarist just because he wears some of his influences -- Dr. Dre, Daft Punk, Atari Teenage Riot, Pink Floyd's The Wall (1979), Tom Waits's Bone Machine (1992), David Bowie's Low (1977) -- on his sleeve? And do we accept no-brain rock critic album reviews that quote lines like 'All the pieces didn't fit/Though I really didn't give a shit' ("Where Is Everybody?") or 'And when I suck you off/Not a drop will go to waste' ("Starfuckers Inc") as representative of his true personality? Reznor evidently has his own thoughts on this subject, but we should let the music speak for itself. The album's epic production and technical complexity turned into a post-modern studio Vision Quest, assisted by producer Alan Moulder, eleventh-hour saviour Bob Ezrin (brought in by Reznor to 'block-out' conceptual and sonic continuity), and a group of assault-technicians.
The fruit of these collaborations is an album where Reznor is playing with our organism's time-binding sense, modulating strange emotions through deeply embedded tonal angularities. During his five-year absence, Trent Reznor fought diverse forms of repetitious trauma, from endogenous depression caused by endless touring to the death of his beloved grandmother (who raised him throughout childhood). An end signals a new beginning, a spiral is an open-ended and ever-shifting structure, and so Reznor sought to re-discover the Elder Gods within, a shamanic approach to renewal and secular salvation utilised most effectively by music PR luminary and scientist Howard Bloom. Concerned with healing the human animal through Ordeals that hard-wire the physiological baselines of Love, Hate and Fear, Reznor also focusses on what happens when 'meaning-making' collapses and hope for the future cannot easily be found. He accurately captures the confusion that such dissolution of meaning and decline of social institutions brings to the world -- Francis Fukuyama calls this bifurcation 'The Great Disruption'. For a generation who experienced their late childhood and early adolescence in Reagan's America, Reznor and his influences (Marilyn Manson and Filter) capture the Dark Side of recent history, unleashed at Altamont and mutating into the Apocalyptic style of American politics (evident in the 'Star Wars'/SDI fascination). The personal 'psychotic core' that was crystallised by the collapse of the nuclear family unit and supportive social institutions has returned to haunt us with dystopian fantasies that are played out across Internet streaming media and visceral MTV film-clips. That such cathartic releases are useful -- and even necessary (to those whose lives have been formed by socio-economic 'life conditions') is a point that escapes critics like Roger Scruton, some Christian Evangelists and the New Right. 
The 'escapist' quality of early 1980s 'Rapture' and 'Cosmocide' (Hal Lindsey) prophecies has yielded strange fruit for the Children of Ezekiel, whom Reznor and Marilyn Manson are unofficial spokes-persons for. From a macro perspective, Reznor's post-human evolutionary nexus lies, like J.G. Ballard's tales, in a mythical near-future built upon past memory-shards. It is the kind of worldview that fuses organic and morphogenetic structures with industrial machines run amok, thus The Fragile is an artefact that captures the subjective contents of the different mind produced by different times. Sonic events are in-synch but out of phase. Samples subtly trigger and then scramble kinaesthetic-visceral and kinaesthetic-tactile memories, suggestive of dissociated affective states or body memories that are incapable of being retrieved (van der Kolk 294). Perhaps this is why after a Century of Identity Confusion some fans find it impossible to listen to a 102-minute album in one sitting. No wonder then that the double album is divided into 'left' and 'right' discs (a reference to split-brain research?). The real-time track-by-track interpretation below is necessarily subjective, and is intended to serve as a provisional listener's guide to the aural ur-text of 1999. The Fragile is full of encrypted tones and garbled frequencies that capture a world where the future is always bleeding into a non-recoverable past. Turbulent wave-forms fight for the listener's attention with prolonged static lulls. This does not make for comfortable or even 'nice' listening. The music's mind is a snapshot, a critical indicator, of the deep structures brewing within the Weltanschauung that could erupt at any moment. "Somewhat Damaged" opens the album's 'Left' disc with an oscillating acoustic strum that anchors the listener's attention.
Offset by pulsing beats and mallet percussion, Reznor builds up sound layers that contrast with lyrical epitaphs like 'Everything that swore it wouldn't change is different now'. Icarus iconography is invoked, but perhaps a more fitting mythopoeic symbol of the journey that lies ahead would be Nietzsche's pursuit of his Ariadne through the labyrinth of life, during which the hero is steadily consumed by his numbing psychosis. Reznor fittingly comments: 'Didn't quite/Fell Apart/Where were you?' If we consider that Reznor has been repeating the same cycle with different variations throughout all of his music to date, retro-fitting each new album into a seamless tapestry, then this track signals that he has begun to finally climb out of self-imposed exile in the Underworld. "The Day the World Went Away" has a tremendously eerie opening, with plucked mandolin effects entering at 0:40. The main slashing guitar riff was interpreted by some critics as Reznor's attempt to parody himself. For some reason, the eerie backdrop and fragmented acoustic guitar strums recalls to my mind civil defence nuclear war films. Reznor, like William S. Burroughs, has some powerful obsessions. The track builds up in intensity, with a 'Chorus of the Damned' singing 'na na nah' over apocalyptic end-times imagery. At 4:22 the track ends with an echo that loops and repeats. "The Frail" signals a shift to mournful introspectiveness with piano: a soundtrack to faded 8 mm films and dying memories. The piano builds up slowly with background echo, holds and segues into ... "The Wretched", beginning with a savage downbeat that recalls earlier material from Pretty Hate Machine. 'The Far Aways/Forget It' intones Reznor -- it's becoming clear that despite some claims to the contrary, there is redemption in this album, but it is one borne out of a relentless move forward, a strive-drive. 'You're finally free/You could be' suggest Reznor studied Existentialism during his psychotherapy visits. 
This song contains perhaps the ultimate post-relationship line: 'It didn't turn out the way you wanted it to, did it?' It's over, just not the way you wanted; you can always leave the partner you're with, but the ones you have already left will always stain your memories. The lines 'Back at the beginning/Sinking/Spinning' recall the claustrophobic trapped world and 'eternal Now' dislocation of Post-Traumatic Stress Disorder victims. At 3:44 a plucked cello riff, filtered, segues into a sludge buzz-saw guitar solo. At 5:18 the cello riff loops and repeats. "We're in This Together Now" uses static as percussion, highlighting the influence of electricity flows instead of traditional rock instrument configurations. At 0:34 vocals enter, at 1:15 Reznor wails 'I'm impossible', showing he is the heir to Roger Waters's self-reflective rock-star angst. 'Until the very end of me, until the very end of you' inverts the traditional marriage vow, whilst 'You're the Queen and I'm the King' quotes David Bowie's "Heroes". Unlike earlier tracks like "Reptile", this track is far more positive about relationships, which have previously resembled toxic-dyads. Reznor signals a delta surge (breaking through barriers at any cost), despite a time-line morphing between present-past-future. At 5:30 synths and piano signal a shift, at 5:49 the outgoing piano riff begins. The film-clip is filled with redemptive water imagery. The soundtrack gradually gets more murky and at 7:05 a subterranean note signals closure. "The Fragile" is even more hopeful and life-affirming (some may even interpret it as devotional), but this love -- representative of the End-Times, alludes to the 'Glamour of Evil' (Nico) in the line 'Fragile/She doesn't see her beauty'. The fusion of synths and atonal guitars beginning at 2:13 summons forth film-clip imagery -- mazes, pageants, bald eagles, found sounds, cloaked figures, ruined statues, enveloping darkness.
"Just like You Imagined" opens with Soundscapes worthy of Robert Fripp, doubled by piano and guitar at 0:39. Drums and muffled voices enter at 0:54 -- are we seeing a pattern to Reznor's writing here? Sonic debris guitar enters at 1:08, bringing forth intensities from white noise. This track is full of subtle joys like the 1:23-1:36 solo by David Bowie pianist Mike Garson and guitarist Adrian Belew's outgoing guitar solo at 2:43, shifting back to the underlying soundscapes at 3:07. The sounds are always on the dissipative edge of chaos. "Pilgrimage" utilises a persistent ostinato and beat, with a driving guitar overlay at 0:18. This is perhaps the most familiar track, using Reznor motifs like the doubling of the riff with acoustic guitars between 1:12-1:20, march cries, and pitch-shift effects on a 3:18 drumbeat/cymbal. Or at least I could claim it was familiar, if it were not that legendary hip-hop producer and 'edge-of-panic' tactilist Dr. Dre helped assemble the final track mix. "No, You Don't" has been interpreted as an attack on Marilyn Manson and Hole's Courtney Love, particularly the 0:47 line 'Got to keep it all on the outside/Because everything is dead on the inside' and the 2:33 final verse 'Just so you know, I did not believe you could sink so low'.
The song's structure is familiar: a basic beat at 0:16, guitars building from 0:31 to sneering vocals, a 2:03 counter-riff that merges at 2:19 with vocals and ascending to the final verse and 3:26 final distortion... "La Mer" is the first major surprise, a beautiful and sweeping fusion of piano, keyboard and cello, reminiscent of Symbolist composer Debussy. At 1:07 Denise Milfort whispers, setting the stage for sometime Ministry drummer Bill Reiflin's jazz drumming at 1:22, and a funky 1:32 guitar/bass line. The pulsing synth guitar at 2:04 serves as anchoring percussion for a cinematic electronica mindscape, filtered through new layers of sonic chiaroscuro at 2:51. 3:06 phase shifting, 3:22 layer doubling, 3:37 outgoing solo, 3:50-3:54 more swirling vocal fragments, seguing into a fading cello quartet as shadows creep. David Carson's moody film-clip captures the end more ominously, depicting the beauty of drowning. This track contains the line 'Nothing can stop me now', which appears to be Reznor's personal mantra. This track rivals 'Hurt' and 'A Warm Place' from The Downward Spiral and 'Something I Can Never Have' from Pretty Hate Machine as perhaps the most emotionally revealing and delicate material that Reznor has written. "The Great Below" ends the first disc with more multi-layered textures fusing nostalgia and reverie: a twelve-second cello riff is counter-pointed by a plucked overlay, which builds to a 0:43 washed pulse effect, transformed by six second pulses between 1:04-1:19 and a further effects layer at 1:24. E-bow effects underscore lyrics like 'Currents have their say' (2:33) and 'Washes me away' (2:44), which a 3:33 sitar riff answers. These complexities are further transmuted by seemingly random events -- a 4:06 doubling of the sitar riff which 'glitches' and a 4:32 backbeat echo that drifts for four bars. 
While Reznor's lyrics suggest that he is unable to control subjective time-states (like The Joker in the Batman: Dark Knight series of Kali-yuga comic-books), the track constructions show that the Key to his hold over the listener is very carefully constructed songs whose spaces resemble Pythagorean mathematical formulas. Misdirecting the audience is the secret of many magicians. "The Way Out Is Through" opens the 'Right' disc with an industrial riff that builds at 0:19 to click-track and rhythm, the equivalent of a weaving spiral. Whispering 'All I've undergone/I will keep on' at 1:24, Reznor is backed at 1:38 by synths and drums coalescing into guitars, which take shape at 1:46 and turn into a torrential electrical current. The models are clearly natural morphogenetic structures. The track twists through inner storms and torments from 2:42 to 2:48, mirrored by vocal shards at 2:59 and soundscapes at 3:45, before piano fades in and out at 4:12. The title references peri-natal theories of development (particularly those of Stanislav Grof), which is the source of much of the album's imagery. "Into the Void" is not the Black Sabbath song of the same name, but a catchy track that uses the same unfolding formula (opening static, cello at 0:18, guitars at 0:31, drums and backbeat at 1:02, trademark industrial vocals and synth at 1:02, verse at 1:23), and would not appear out of place in a Survival Research Laboratories exhibition. At 3:42 Reznor plays with the edge of synth soundscapes, merging vocals at 4:02 and ending the track nicely at 4:44 alone. "Where Is Everybody?" emulates earlier structures, but relies from 2:01 on whirring effects and organic rhythms, including a flurry of eight beat pulses between 2:40-2:46 and a 3:33 spiralling guitar solo. The 4:26 guitar solo is pure Adrian Belew, and is suddenly ended by spluttering static and white noise at 5:13. 
"The Mark Has Been Made" signals another downshift into introspectiveness with 0:32 ghostly synth shimmers, echoed by cello at 1:04 which is then doubled at 1:55 by guitar. At 2:08 industrial riffs suddenly build up, weaving between 3:28 distorted guitars and the return of the repressed original layer at 4:16. The surprise is a mystery 32 second soundscape at the end with Reznor crooning 'I'm getting closer, all the time' like a zombie devil Elvis. "Please" highlights spacious noise at 0:48, and signals a central album motif at 1:04 with the line 'Time starts slowing down/Sink until I drown'. The psychic mood of the album shifts with the discovery of Imagination as a liberating force against oppression. The synth sound again is remarkably organic for an industrial album. "Starfuckers Inc" is the now infamous sneering attack on rock-stardom, perhaps at Marilyn Manson (at 3:08 Reznor quotes Carly Simon's 'You're So Vain'). Jungle beats and pulsing synths open the track, which features the sound-sculpting talent of Pop Will Eat Itself member Clint Mansell. Beginning at 0:26, Reznor's vocals appear to have been sampled, looped and cut up (apologies to Brion Gysin and William S. Burroughs). The lines 'I have arrived and this time you should believe the hype/I listened to everyone now I know everyone was right' are a very savage and funny exposure of Manson's constant references to Friedrich Nietzsche's Herd-mentality: the Herd needs a bogey-man to whip it into submission, and Manson comes dangerously close to fulfilling this potential, thus becoming trapped by a 'Stacked Deck' paradox. The 4:08 lyric line 'Now I belong I'm one of the Chosen Ones/Now I belong I'm one of the Beautiful Ones' highlights the problem of being Elect and becoming intertwined with institutionalised group-think. The album version ditches the closing sample of Gene Simmons screaming "Thankyou and goodnight!"
to an enraptured audience on the single from KISS Alive (1975), which was appropriately over-the-top (the alternate quiet version is worth hearing also). "The danger Marilyn Manson faces", notes Don Webb (current High Priest of the Temple of Set), "is that he may end up in twenty years' time on the 'Tonight Show' safely singing our favourite songs like a Goth Frank Sinatra, and will have gradually lost his antinomian power. It's much harder to maintain the enigmatic aura of an Evil villain than it is to play the clown with society". Reznor's superior musicianship and sense of irony should keep him from falling into the same trap. "Complication" juggernauts in at 0:57 with screaming vocals and a barrage of white noise at 1:56. It's clear by now that Reznor has read his psychological operations (PSYOP) manuals pertaining to blasting the hell out of his audiences' psyche by any means necessary. Computer blip noise and black light flotation tank memories. Dislocating pauses and time-bends. The aural equivalent of Klein bottles. "The Big Come Down" begins with a four-second synth/static intro that is smashed apart by a hard beat at 0:05 and kaleidoscope guitars at 0:16. Critics refer to the song's lyrics in an attempt to project a narcissistic Reznor personality, but don't comment on stylistic tweaks like the AM radio influenced backing vocals at 1:02 and 1:19, or the use of guitars as a percussion layer at 1:51. A further intriguing element is the return of the fly samples at 2:38, an effect heard on previous releases and a possible post-human sub-text.
The alien mythos will eventually reign over the banal and empty human. At 3:07 the synths return with static, a further overlay adds more synths at 3:45 as the track spirals to its peak, before dissipating at 3:1 in a mesh of percussion and guitars. "Underneath It All" opens with a riff that signals we have reached the album's climactic turning point, with the recurring theme of fragmenting body-memories returning at 0:23 with the line 'All I can do/I can still feel you', and being echoed by pulsing static at 0:42 as electric percussion. A 'Messiah Complex' appears at 1:34 with the line 'Crucify/After all I've died/After all I've tried/You are still inside', or at least it appears to be that on the surface. This is the kind of line that typical rock critics will quote, but a careful re-reading suggests that Reznor is pointing to the painful nature of remanifesting. Our past shapes us more than we would like to admit, particularly our first relationships. "Ripe (With Decay)" is the album's final statement, a complex weaving of passages over a repetitive mesh of guitars, pulsing echoes, back-beats, soundscapes, and a powerful Mike Garson piano solo (2:26). Earlier motifs including fly samples (3:00), mournful funeral violas (3:36) and slowing time effects (4:28) recur throughout the track. Having finally reached the psychotic core, Reznor is not content to let us rest, mixing funk bass riffs (4:46), vocal snatches (5:23) and oscillating guitars (5:39) that drag the listener forever onwards towards the edge of the abyss (5:58). The final sequence begins at 6:22, loses fidelity at 6:28, and ends abruptly at 6:35. At millennium's end there is a common-held perception that the world is in an irreversible state of decay, and that Culture is just a wafer-thin veneer over anarchy.
Music like The Fragile suggests that we are still trying to assimilate into popular culture the 'war-on-Self' worldviews unleashed by the nineteenth-century 'Masters of Suspicion' (Charles Darwin, Sigmund Freud, Friedrich Nietzsche). This 'assimilation gap' is evident in industrial music, which in the late 1970s was struggling to capture the mood of the Industrial Revolution and Charles Dickens, so the genre is ripe for further exploration of the scarred psyche. What the self-appointed moral guardians of the Herd fail to appreciate is that as the imprint baseline rises (reflective of socio-political realities), the kind of imagery prevalent throughout The Fragile and in films like Strange Days (1995), The Matrix (1999) and eXistenZ (1999) is going to get even darker. The solution is not censorship or repression in the name of pleasing an all-saving surrogate god-figure. No, these things have to be faced and embraced somehow. Such a process can only occur if there is space within for the Sadeian aesthetic that Nine Inch Nails embodies, and not a denial of Dark Eros. "We need a second Renaissance", notes Don Webb, "a rejuvenation of Culture on a significant scale". In other words, a global culture-shift of quantum (aeon or epoch-changing) proportions. The tools required will probably not come just from the over-wordy criticism of Cyber-culture and Cultural Studies or the logical-negative feeding frenzy of most Music Journalism. They will come from a dynamic synthesis of disciplines striving toward a unity of knowledge -- what socio-biologist Edward O. Wilson has described as 'Consilience'. Liberating tools and ideas will be conveyed to a wider public audience unfamiliar with such principles through predominantly science fiction visual imagery and industrial/electronica music. The Fragile serves as an invaluable model for how such artefacts could transmit their dreams and propagate their messages. 
For the hyper-alert listener, it will be the first step on a new journey. But sadly for the majority, it will be just another hysterical industrial album promoted as the selection of the month. References Bester, Alfred. The Stars My Destination. London: Millennium Books, 1999. Eshun, Kodwo. More Brilliant than the Sun: Adventures in Sonic Fiction. London: Quartet Books, 1998. Van der Kolk, Bessel A. "Trauma and Memory." Traumatic Stress: The Effects of Overwhelming Experience on Mind, Body, and Society. Eds. Bessel A. van der Kolk et al. New York: Guilford Press, 1996. Nine Inch Nails. The Downward Spiral. Nothing/Interscope, 1994. ---. The Fragile. Nothing, 1999. ---. Pretty Hate Machine. TVT, 1989. Citation reference for this article MLA style: Alex Burns. "'This Machine Is Obsolete': A Listeners' Guide to Nine Inch Nails' The Fragile." M/C: A Journal of Media and Culture 2.8 (1999). [your date of access] <http://www.uq.edu.au/mc/9912/nine.php>. Chicago style: Alex Burns, "'This Machine Is Obsolete': A Listeners' Guide to Nine Inch Nails' The Fragile," M/C: A Journal of Media and Culture 2, no. 8 (1999), <http://www.uq.edu.au/mc/9912/nine.php> ([your date of access]). APA style: Alex Burns. (1999) 'This machine is obsolete': a listeners' guide to Nine Inch Nails' The fragile. M/C: A Journal of Media and Culture 2(8). <http://www.uq.edu.au/mc/9912/nine.php> ([your date of access]).
APA, Harvard, Vancouver, ISO, and other styles
5

Ellis, Katie, and Mike Kent. "iTunes Is Pretty (Useless) When You’re Blind: Digital Design Is Triggering Disability When It Could Be a Solution." M/C Journal 11, no. 3 (July 2, 2008). http://dx.doi.org/10.5204/mcj.55.

Abstract:
Introduction This year, 2008, marks the tenth anniversary of the portable MP3 player. MPMan F10, the first such device to utilise the MP3-encoding format, was launched in March 1998 (Smith). However, it was not until April 2003, when Apple Inc launched the iPod, that the market began the massive growth that has made the devices almost ubiquitous in everyday life. In 2006 iPods were rated as more popular than beer amongst college students in the United States, according to Student Monitor. Beer had previously been surpassed in popularity only once before, in 1997, by the Internet (Zeff). This year will also see the launch in Australia of the latest offering in this line of products – the iPhone – which incorporates the popular MP3 player in an advanced mobile phone. The iPhone features a touch-sensitive flat screen that serves as the interface for its operating system. While the design is striking, it also generates accessibility problems. There are obvious implications for those with vision impairments when there are no physical markers to point towards the phone’s functions (Crichton). This article critically examines the promise of Internet-based digital technology to open up the world to people with disabilities, and the parallel danger that the social construction of disability in the digital environment will simply come to mirror pre-existing analogue discrimination. This paper explores how technologies and innovations designed to improve access by the disabled actually enhance access for all users. The first part of the paper focuses on ‘Web 2.0’ and digital access for people with disability, particularly those with vision impairment. The online software that drives the iPod and iPhone and exclusively delivers content to these devices is iTunes. While iTunes seems on the surface to provide enormous opportunity for the vision impaired to access a broad selection of audio content, its design actually works to inhibit access to the platform for this group. 
Apple promotes the use of iTunes in educational settings through the iTunes U channel, and this potentially excludes those who have difficulty with access to the technology. Critically, it is these excluded people who, potentially, could benefit the most from the new technology. We consider the difficulty experienced by users of screen readers and braille tablets in relation to iTunes and highlight the potential problems for universities who seek to utilise iTunes U. In the second part of the paper we reframe disability accessibility as a principle of universal access and design and outline how changes made to assist users with disability can enhance the learning experience of all students using the Lectopia lecture recording and distribution system as an example. The third section of the paper situates these digital developments within the continuum of disability theory deploying Finkelstein’s three stages of disability development. The focus then shifts to the potential of online virtual worlds such as Second Life to act as a place where the promise of technology to mediate for disability might be realised. Goggin and Newell suggest that the Internet will not be fully accessible until disability is considered a cultural identity in the same way that class, gender and sexuality are. This article argues that accessibility must be addressed through the context of design and shared open standards for digital platforms. Web 2.0 and Accessibility The World Wide Web based its successful development on a set of common standards that worked across different software and operating systems. This interoperability held out great opportunity for the implementation of enabling software for those with disability, particularly sight and hearing impairments. The increasing sophistication and diversification of online content has confounded this initial promise. 
Websites have become more complex, particularly with the rise of ‘Web 2.0’ and the associated trends in coding and website design. This has complicated attempts to mediate this content for a disabled audience through software (Zajicek). As Wood notes, ‘these days many computers are used principally to access the Internet – and there is no telling what a blind person will encounter there’. As the content requiring translation – either from text into audio or onto a braille tablet, or from audio into text captions – becomes less standardised and more complex, it becomes both harder for software to act as a translator, and harder to navigate this media once translated. This is particularly the case when links are generated ‘on the fly’ for each view of a website and where images replace words as hyperlinks. These problems can trace their origin to before the development of the World Wide Web. Reihing, addressing another Apple product in 1987 notes: The Apple Macintosh is particularly hard to use because it depends heavily on graphics. Some word processors ‘paint’ pictures of letters on the screen instead of using standard computer codes, and speech or braille devices can’t cope (in Goggin and Newell). Web 2.0 sites loaded with Ajax and other forms of Java scripting present a particular challenge for translation software (Zajicek). iTunes, an iconic Web 2.0 application, is a further step away from easily translated content as proprietary software that, while operating through the Internet, does not conform to Web standards. Many translation software packages are unable to read the iTunes software at all or are limited and only able to read part of the page, but not enough of it to use the program (Furendal). 
As websites utilising ‘Web 2.0’ technology increase in popularity, they become less attractive to users who are visually impaired, particularly because the dynamic elements cannot be accessed using screen readers provided with the operating system (Bigham, Prince and Ladner). While at one level this presents an inability for a user with a disability to engage with the popular software, it also meant that universities seeking to use iTunes U to deliver content were excluding these students. To Apple’s credit they have taken some of these access concerns on board with the recent release of both the Apple operating system and iTunes, to better enable Apple’s own access software to translate the iTunes screen for blind users. However this also illustrates the problems with this type of software operating outside of nominated standards as there are still serious problems with access to iTunes on Microsoft’s dominant Windows operating system (Furendal). While Windows provides its own integrated screen reading software, the company acknowledges that this is not sufficiently powerful for regular use by disabled users who will need to use more specialised programs (Wood). The recent upgrade of the standard Windows operating system from XP to Vista seems to have abandoned the previous stipulation that there was a keyboard shortcut for each operation the system performed – a key requirement for those unable to use a visual interface on the screen to ‘point and click’ with a mouse (Wood). Other factors, such as the push towards iTunes U, explored in the next section, explain the importance of digital accessibility for everyone, not just the disabled as this technology becomes ubiquitous. The use of Lectopia in higher education demonstrates the value of flexibility of delivery to the whole student population, inclusive of the disabled. iPods and Higher Education iTunes is the enabling software supporting the iPod and iPhone. 
As well as commercial content, iTunes also acts as a distribution medium for other content that is free to use. It allows individuals or organisations to record and publish audio and video files – podcasts and vodcasts – that can be automatically downloaded from the Internet and onto individual computers and iPods as they become available. Significantly this technology has provided opportunities for educational use. iTunes U has been developed by Apple to facilitate the delivery of content from universities through the service. While Apple has acknowledged that this is, in part, a deliberate effort to drive the uptake of iTunes (Udell), there are particular opportunities for the distribution of information through this channel afforded by the technology. Duke University in the United States was an early adopter, distributing iPods to each of its first-year students for educational use as early as 2004 (Dean). A recent study of students at The University of Western Australia (UWA) by Williams and Fardon found that students who listen to lectures through portable media players such as iPods (the ‘Pod’ in iPod stands for ‘portable on demand’) have a higher attendance rate at lectures than those who do not. In 1998, the same year that the first portable MP3 player was being launched, the Lectopia (or iLecture) lecture recording and distribution system was introduced in Australia at UWA to enable students with disabilities better access to lecture materials. While there have been significant criticisms of this platform (Brabazon), the broad uptake and popularity of this technology, both at UWA and at many universities across Australia, demonstrates how changes made to assist disability can potentially help the broader community. This underpins the concept of ‘universal design’ where consideration given to people with disability also improves the lives of people without disability. 
A report by the Australian Human Rights and Equal Opportunity Commission examined the accessibility of digital technology. Disability issues, such as access to digital content, were reframed as universal design issues: Disability accessibility issues are more accurately perceived in many cases as universal access issues, such that appropriate design for access by people with disabilities will improve accessibility and usability for … the community more generally. The idea of universal access was integral to Tim Berners-Lee’s original conception of the Web – however the platform has developed into a more complex and less ordered environment that can stray from agreed standards (Edwards, "Stop"). iTunes comes with its own accessibility issues. Furendal demonstrated that its design has added utility for some impairments, notably dyslexia and colour blindness. However, as noted above, iTunes is highly problematic for those with other vision impairment, particularly the blind. It is an example of the condition noted by Regan: There exists a false perception among designers that accessibility represents a restriction on creativity. There are few examples that exist in the world that can dissuade designers of this notion. While there are no technical reasons for this division between accessibility and design, the notion exists just the same. The invisibility of this issue confirms that while an awareness of differing abilities can assist all users, this blinkered approach to diverse visual acuities is not only blocking social justice imperatives but future marketing opportunities. The iPhone is notable for problems associated with use by people with disabilities, particularly people with hearing (Keizer) and vision impairments (Crichton). In colder climates the fact that the screen would not be activated by a gloved hand has also been a problem; its design reflects bias against more than just the physically impaired. 
Design decisions reflect the socially constructed nature of disability where disability is related to how humans have chosen to construct the world (Finkelstein, "To Deny"). Disability Theory and Technology Nora Groce conducted an anthropological study of Martha’s Vineyard in the United States. During the nineteenth century the island had an unusually high incidence of deafness. In response, everyone on the island was able to communicate in sign language as a standard mode of communication, regardless of hearing capability. As a result the impairment of deafness did not become a disability in relation to communication. Society on the island was constructed to be inclusive without regard to a person’s hearing ability. Finkelstein (Attitudes) identified three stages of disability ‘creation’ to suggest disability (as it is defined socially) can be eradicated through technology. He is confident that the third phase, which he argues has been occurring in conjunction with the information age, will offset many of the prejudicial attitudes established during the second phase that he characterised as the industrial era. Digital technologies are often presented as a way to eradicate disability as it is socially constructed. Discussions around the Web and the benefits for people with disability usually centre on accessibility and social interaction. Digital documents on the Internet enable people with disability greater access than physical spaces, such as libraries, especially for the visually impaired who are able to make use of screen readers. There are more than 38 million blind people who utilise screen reading technology to access the Web (Bigham, Prince and Ladner). A visually impaired person is able to access digital texts whereas traditional, analogue, books remain inaccessible. The Web also allows people with disability to interact with others in a way that is not usually possible in general society. 
In a similar fashion to arguments that the Web is both gender and race neutral, people with disability need not identify as disabled in online spaces and can instead be judged on their personality first. In this way disability is not always a factor in the social encounter. These arguments however fail to address several factors integral to the social construction of disability. While the idea that a visually impaired person can access books electronically, in conjunction with a screen reader, sounds like a disability-free utopia, this is not always the case as ‘digital’ does not always mean ‘accessible’. Often digital documents will be in an image format that cannot be read by the user’s screen reader and will need to be converted and corrected by a sighted person. Sapey found that people with disabilities are excluded from informational occupations. Computer programming positions were the fourth least likely of the 58 occupations examined to employ disabled people. As Reihing observed in 1987, it is a fantasy to think that accessibility for blind people simply means turning on a computer (Reihing in Goggin and Newell). Although it may sound empowering for people with disability to interact in an environment where they can live out an identity different from the rhythm of their daily patterns, the reality serves to decrease the visibility of disability in society. Further, the Internet may not be accessible for people with disability as a social environment in the first place. AbilityNet’s State of the eNation Web Accessibility Report: Social Networking Sites found a number of social networking sites including the popular MySpace and Facebook are inaccessible to users with a number of different disabilities, particularly those with a visual impairment such as blindness or a cognitive disability like dyslexia. 
This study noted the use of ‘Captcha’ – ‘Completely Automated Public Turing test to tell Computers and Humans Apart’ – technology designed to differentiate between a person signing up for an account and an automated computer process. This system presents an image of a word deliberately blurred and disfigured so that it cannot be readily identified by a computer and can only be translated by a human user. This presents an obstacle to people with a visual impairment, particularly those relying on transcription software that will, by design, not be able to read the image, as well as those with dyslexia who may also have trouble translating the image on the screen. Virtual Worlds and New Possibilities The development of complex online virtual worlds such as Second Life presents its own set of challenges for access, for example, the use of Captcha. However they also afford opportunity. With over a million residents, there is a diversity of creativity. People are using Second Life to try on different identities or campaign for causes relevant in the real world. For example, Simon Stevens (Simon Walsh in SL), runs the nightclub Wheelies in the virtual world and continues to use a wheelchair and helmet in SL – similar to his real-life self: I personally changed Second Life’s attitude toward disability when I set up ‘Wheelies’, its first disability nightclub. This was one of those daft ideas which grew and grew and… has remained a central point for disability issues within Second Life. Many new Disabled users make contact with me for advice and wheelies has helped some of them ‘come out’ and use a wheelchair (Carter). 
Able-bodied people are also becoming involved in raising disability awareness through Second Life, for example, Fez Richardson is developing applications for use in Second Life so that the non-disabled can experience the effects of impairment in this virtual realm (Cassidy). Tertiary institutions are embracing the potential of Second Life, utilising the world as a virtual classroom. Bates argues that Second Life provides a learning environment free of physical barriers that has the potential to provide an enriched learning experience for all students regardless of whether they have a disability. While Second Life might be a good environment for those with mobility impairment there are still potential access problems for the vision and hearing impaired. However, Second Life has recently become open source and is actively making changes to aid accessibility for the visually impaired including an audible system where leaves rustle to denote a tree is nearby, and text-to-speech software (Sierra). Conclusion Goggin and Newell observe that new technology is a prominent component of social, cultural and political changes with the potential to mitigate for disability. The uneven interface of the virtual and the analogue, as demonstrated by the implementation and operation of iTunes, indicates that this mitigation is far from an inevitable consequence of this development. However, James Edwards, author of the Brothercake blog, is optimistic that technology does have an important role in decreasing disability in wider society, in line with Finkelstein’s third phase: Technology is the last, best hope for accessibility. It's not like the physical world, where there are good, tangible reasons why some things can never be accessible. A person who's blind will never be able to drive a car manually; someone in a wheelchair will never be able to climb the steps of an ancient stone cathedral. Technology is not like the physical world – technology can take any shape. 
Technology is our slave, and we can make it do what we want. With technology there are no good reasons, only excuses (Edwards, "Technology"). Internet-based technologies have the potential to open up the world to people with disabilities, and are often presented as a way to eradicate disability as it is socially constructed. While Finkelstein believes new technologies characteristic of the information age will offset many of the prejudicial attitudes established during the industrial revolution, where technology was established around able-bodied norms, the examples of the iPhone and Captcha illustrate that digital technology is often constructed in the same social world that people with disability are routinely disabled by. The Lectopia system on the other hand enables students with disabilities to access lecture materials and highlights the concept of universal access, the original ideology underpinning design of the Web. Lectopia has been widely utilised by many different types of students, not just the disabled, who are seeking flexibility. While we should be optimistic, we must also be aware, as Goggin and Newell note, that the Internet cannot be fully accessible until disability is considered a cultural identity in the same way that class, gender and sexuality are. Accessibility is a universal design issue that potentially benefits both those with a disability and the wider community. References AbilityNet Web Accessibility Team. State of the eNation Web Accessibility Reports: Social Networking Sites. AbilityNet. January 2008. 12 Apr. 2008 ‹http://www.abilitynet.org.uk/docs/enation/2008SocialNetworkingSites.pdf›. Bates, Jacqueline. "Disability and Access in Virtual Worlds." Paper presented at Alternative Format Conference, LaTrobe University, Melbourne, 21–23 Jan. 2008. Bigham, Jeffrey P., Craig M. Prince, and Richard E. Ladner. "WebAnywhere: A Screen Reader On-the-Go." Paper presented at 17th International World Wide Web Conference, Beijing, 21–22 April 2008. 
29 Apr. 2008 ‹http://webinsight.cs.washington.edu/papers/webanywhere-html/›. Brabazon, Tara. "Socrates in Earpods: The iPodification of Education." Fast Capitalism 2.1, (July 2006). 8 June 2008 ‹http://www.uta.edu/huma/agger/fastcapitalism/2_1/brabazon.htm›. Carter, Paul. "Virtually the Same." Disability Now (May 2007). Cassidy, Margaret. "Flying with Disability in Second Life." Eureka Street 18.1 (10 Jan. 2008): 22-24. 15 June 2007 ‹http://www.eurekastreet.com.au/article.aspx?aeid=4849›. Crichton, Paul. "More on the iPhone…" Access 2.0. BBC.co.uk 22 Jan. 2007. 12 Apr. 2008 ‹http://www.bbc.co.uk/blogs/access20/2007/01/more_on_the_iphone.shtml›. Dean, Katie. "Duke Gives iPods to Freshmen." Wired Magazine (20 July 2004). 29 Apr. 2008 ‹http://www.wired.com/entertainment/music/news/2004/07/64282›. Edwards, James. "Stop Using Ajax!" Brothercake (24 April 2008). 1 May 2008 ‹http://dev.opera.com/articles/view/stop-using-ajax›. –––. "Technology Is the Last, Best Hope for Accessibility." Brothercake 13 Mar. 2007. 1 May 2008 ‹http://www.brothercake.com/site/resources/reference/hope›. Finkelstein, Victor. "To Deny or Not to Deny Disability." Magic Carpet 27.1 (1975): 31-38. 1 May 2008 ‹http://www.independentliving.org/docs1/finkelstein.html›. –––. Attitudes and Disabled People: Issues for Discussion. Geneva: World Rehabilitation Fund, 1980. 1 May 2008 ‹http://www.leeds.ac.uk/disability-studies/archiveuk/finkelstein/attitudes.pdf›. Furendal, David. "Downloading Music and Videos from the Internet: A Study of the Accessibility of The Pirate Bay and iTunes store." Presentation at Umeå University, 24 Jan. 2007. 13 Apr. 2008 ‹http://www.david.furendal.com/Accessibility.aspx›. Groce, Nora E. Everyone Here Spoke Sign Language: Hereditary Deafness on Martha's Vineyard. Cambridge, MA: Harvard University Press, 1985. Goggin, Gerard, and Christopher Newell. Digital Disability: The Social Construction of Disability in New Media. Oxford: Rowman & Littlefield, 2003. 
Human Rights and Equal Opportunities Commission. Accessibility of Electronic Commerce and New Service and Information Technologies for Older Australians and People with a Disability. 31 March 2000. 30 Apr. 2008 ‹http://www.hreoc.gov.au/disability_rights/inquiries/ecom/ecomrep.htm#BM2_1›. Keizer, Gregg. "Hearing Loss Group Complains to FCC about iPhone." Computerworld (20 Sep. 2007). 12 Apr. 2008 ‹http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9037999›. Regan, Bob. "Accessibility and Design: A Failure of the Imagination." ACM International Conference Proceedings Series 63: Proceedings of The 2004 International Cross-disciplinary Workshop on Web Accessibility (W4A). 29–37. Sapey, Bob. "Disablement in the Information Age." Disability and Society 15.4 (June 2000): 619–637. Sierra. "IBM Project: Second Life Accessible for Blind People." Techpin (24 Sep. 2007). 3 May 2008 ‹http://www.techpin.com/ibm-project-second-life-accessible-for-blind-people/›. Smith, Tony. "Ten Years Old: The World’s First MP3 Player." Register Hardware (10 Mar. 2008). 12 Apr. 2008 ‹http://www.reghardware.co.uk/2008/03/10/ft_first_mp3_player/›. Udell, Jon. "The iTunes U Agenda." InfoWorld (22 Feb. 2006). 13 Apr. 2008 ‹http://weblog.infoworld.com/udell/2006/02/22.html›. Williams, Jocasta, and Michael Fardon. "Perpetual Connectivity: Lecture Recordings and Portable Media Players." Proceedings from Ascilite, Singapore, 2–5 Dec. 2007. 1084–1092. Wood, Lamont. "Blind Users Still Struggle with 'Maddening' Computing Obstacles." Computerworld (16 Apr. 2008). 27 Apr. 2008 ‹http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9077118&source=NLT_AM&nlid=1›. Zajicek, Mary. "Web 2.0: Hype or Happiness?" Paper presented at International Cross-Disciplinary Conference on Web Accessibility, Banff, Canada, 2–9 May 2007. 12 Apr. 2008 ‹http://www.w4a.info/2007/prog/k2-zajicek.pdf›. Zeff, Robbin. "Universal Design across the Curriculum." 
New Directions for Higher Education 137 (Spring 2007): 27–44.

Dissertations / Theses on the topic "Inc Harvey Probber"

1

(9826889), Arthur Pinkney. "Real-time scheduling of sugar cane transport." Thesis, 2011. https://figshare.com/articles/thesis/Real-time_scheduling_of_sugar_cane_transport/13461767.

Abstract:
"The Cane Railway Scheduling Problem (CRSP) is to design a set of locomotive runs that supply the sugar factory with cane to crush, and the harvesting contractors with empty bins to fill with cane. Cane transport is the link between the harvesting and processing sectors of the sugar industry and its impact on both sectors must be considered. CRSP is significant problem for the sugar industry ... This research develops a real-time dynamic scheduling system (RTSS) that removes many of the assumptions and handles uncertainty, and produces schedules that are suitable for everyday, operational use"--Abstract.

Books on the topic "Inc Harvey Probber"

1

Wellman, Christopher Heath. The Problem of Relatedness. Oxford University Press, 2017. http://dx.doi.org/10.1093/acprof:oso/9780190274764.003.0006.

Abstract:
Chapter 6 grapples with the problem of relatedness, which requires one to confront foundational questions in moral philosophy and their implications for forfeiture theory. The core issue is whether a wrongdoer forfeits her right against being harmed for any reason whatsoever (the unlimited-reasons approach), or only for reasons appropriately related to her wrongdoing (the limited-reasons approach). After rebutting the initial impression that the unlimited-reasons approach is wholly implausible, the chapter offers some reasons in defense of the conclusion that wrongdoers do not forfeit their rights against being harmed in general; more specifically, they forfeit their right against being punished, which (by definition) involves being intentionally stigmatized for one’s putative wrongdoing.
2

Haines, Daniel. The Problem of Territory. Oxford University Press, 2017. http://dx.doi.org/10.1093/acprof:oso/9780190648664.003.0002.

Abstract:
This chapter examines the contested meanings of territoriality in decolonizing South Asia, building on recent scholarship on nationalist thought. It argues that when independence came, bringing with it the Partition of Punjab and Bengal, the spatial basis of the Indian and Pakistani nation-states was hardly stable. As the British colonial government prepared to withdraw, nationalists put forward competing visions of what independence could bring. Many of these visions had a difficult relationship with the idea of a national territory. The Indian National Congress sought a composite Indian national identity to hold together a vast and diverse region, while the Muslim League proposed a new entity called ‘Pakistan’, but with little clarity regarding the state’s location, extent or constitutional relationship to India. These territorial uncertainties provided the political context in which the Indus waters dispute became a matter of state sovereignty after independence.
3

Sharma, Arvind. Part of the Problem, Part of the Solution. Greenwood Publishing Group, Inc., 2008. http://dx.doi.org/10.5040/9798400695360.

Abstract:
Part of the Problem, Part of the Solution unleashes religion's true potential to do good by bridging the modern divide between religion and an ever pervasive secular society, a notion often loathed by individuals on both sides of the religious aisle. As noted scholars such as Huston Smith, Karen Armstrong, Rosemary Radford Reuther, Harvey Cox, and Seyyed Hossein Nasr explain throughout the conversations related in this text, people of varied and conflicting faiths can come together to engage in civil, useful dialogue, and members of quite varied religious traditions can work together for the benefit of all humankind and can help defuse the world's current epidemic of violence. By showing how religion is an instrument in human affairs that can be tuned for both good and evil, this book lays the groundwork for an important cooperative effort to blossom. Furthermore, today's trend of associating all religion with suspicion has spiraled into a dangerous situation: that in discarding all religion because some of it causes harm, one risks throwing the baby out with the bathwater. Books such as When Religion Becomes Evil by Charles Kimball, The God Delusion by Richard Dawkins, The End of Faith by Sam Harris, Breaking the Spell: Religion as a Natural Phenomenon by Daniel Dennett, and God is Not Great: How Religion Poisons Everything by Christopher Hitchens have created quite a sensation, leaving the impression that religion, at its root, brings more heartache than handshakes. This development has dismayed many scholars, students, and practitioners of religion, of all faiths, who believe that only half the story (the negative half) is being told. Although demonstrating that certain religious beliefs have surely contributed to the violence that has occurred in this century, this book also explores how other religious teachings can help solve the epidemic of violence.
APA, Harvard, Vancouver, ISO, and other styles
4

Buchanan, Ben. Information Distribution and the Status Quo. Oxford University Press, 2017. http://dx.doi.org/10.1093/acprof:oso/9780190665012.003.0007.

Full text
Abstract:
While the last chapter showed how past mitigations to the security dilemma do not work in cybersecurity, this one goes further, showing that in some ways the cybersecurity dilemma is harder to solve than the security dilemma. It shows how security dilemma thinking rests on two main assumptions about information distribution and about a baseline status quo, neither of which holds up in cybersecurity. Remove these two assumptions, and the problem gets harder still. This chapter outlines the details of those assumptions and their specific flaws in the context of cybersecurity.
APA, Harvard, Vancouver, ISO, and other styles
5

Collier Hillstrom, Laurie. The #MeToo Movement. ABC-CLIO, LLC, 2018. http://dx.doi.org/10.5040/9798400605062.

Full text
Abstract:
This volume provides a concise but authoritative overview of the #MeToo Movement and its enormous impact on American society, from the studios of Hollywood to factories, campuses, and offices across the country. The 21st Century Turning Points series is a one-stop resource for understanding the people and events changing America today. The #MeToo Movement is devoted to the issue that brought sexual harassment out of the shadows of American culture and into the spotlight. Sparked by revelations of decades of sexual harassment by powerful Hollywood executive Harvey Weinstein, the movement quickly uncovered similar abusive behavior by numerous other famous public figures. It also revealed the extent to which sexual harassment has been a persistent problem in many workplace settings across America and the ways in which girls and women are subjected to degrading and discriminatory treatment because of their gender. The book provides a broad perspective on these issues. It discusses late twentieth-century efforts to identify sexual harassment as a longstanding societal problem; explains how the 2016 presidential election brought new attention to this issue; introduces activists who helped to launch the #MeToo Movement; and surveys the impact of the movement on American politics, business, and entertainment.
APA, Harvard, Vancouver, ISO, and other styles
6

Daley, Dennis C., and Antoine Douaihy. A Family Guide to Coping with Substance Use Disorders. Oxford University Press, 2019. http://dx.doi.org/10.1093/med-psych/9780190926632.001.0001.

Full text
Abstract:
This guide was written for family members, significant others, and people concerned about their relatives or friends who have an alcohol or drug problem, which in this book is referred to as substance misuse or substance use disorder (SUD). Substance problems can take many shapes and forms and differ in their severity and impact. This family guide will discuss these problems and how to help the affected person and other family members (including children) who may have been harmed by a loved one’s substance problem. This guide can also help individuals with a substance use problem understand the impact of their SUDs on the family as well as what their family members can do to help themselves. Addressing family issues and making amends are key issues for people in recovery from SUDs.
APA, Harvard, Vancouver, ISO, and other styles
7

Kamm, F. M. Rights and Their Limits. Oxford University Press, New York, 2022. http://dx.doi.org/10.1093/oso/9780197567739.001.0001.

Full text
Abstract:
Abstract This book deals with how rights and their limits are dealt with in theories as well as in hypothetical and practical cases. It begins by considering moral status and its relation to having rights including whether animals have them and what rights future persons have. It considers whether rights are grounded in duties to oneself, which duties are correlative to rights, and whether neuroscientific and psychological studies can help determine what rights we have. The limits of the right not to be harmed are investigated by considering critiques of deontological distinctions, costs that must be undertaken to avoid harming, and a proposal for permissibly harming someone in the Trolley Problem. The possibility that the Trolley Problem can help determine what rights are involved in programming self-driving cars, providing medical treatments, and redistributive economic policy is considered. The book concludes by comparing the use of case-based judgments about extreme cases in moral versus aesthetic theory, and by exploring the significance of the right not to be harmed for morally correct policies in the extreme cases of torture and a pandemic. Where pertinent, the views on these issues of T. Regan, D. Parfit, C. Korsgaard, S. Kagan, R. Dworkin, A. Sen, A. Gibbard, J. Greene, A. Danto, and J. Thomson, among others, are considered.
APA, Harvard, Vancouver, ISO, and other styles
8

Rose, David C. The Rise of Flourishing Societies. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780199330720.003.0005.

Full text
Abstract:
This chapter explains how societies can climb a development ladder whereby each step leads to a larger set of transactions through which to increase the value of output per capita. Each step higher is harder because each step adds transactions that require higher levels of social trust. The problem is that many of the benefits of climbing the ladder are realized at the level of society as a whole, so individual adults and individual parents have much to gain by conserving on their own resources while allowing everyone else in society to invest into the inculcation of the required moral beliefs to produce a high-trust society. There is a public good problem associated with investing enough to best promote the common good. This problem is particularly daunting for the kind of moral beliefs required to produce trustworthy individuals and it worsens with societal success.
APA, Harvard, Vancouver, ISO, and other styles
9

McPherson, Lionel K. Legalism, Justice, and the War on Terrorism. Oxford University Press, 2017. http://dx.doi.org/10.1093/oso/9780190495657.003.0011.

Full text
Abstract:
Some standard norms of conduct in war are morally unsustainable. The “noncombatant immunity” principle that prohibits deliberate use of force against noncombatants represents one such norm. Standard noncombatant immunity is limited: its focus on intention allows, in effect, ordinary noncombatants to be harmed routinely through lawful attacks by combatants. These noncombatant casualties often are likely, foreseeable, and avoidable and thus not merely accidental. Apart from the moral problem of just war legalism, the practical problem is this: a military power cannot expect to win hearts and minds in foreign populations, as the war on terrorism requires, when its approach to fighting expresses relatively little concern for noncombatant lives. Greatly reducing noncombatant casualties is a pragmatic imperative that recommends fighting to a much higher standard—even when prevailing moral and legal norms allow collaterally harming noncombatants.
APA, Harvard, Vancouver, ISO, and other styles
10

Western, Bruce. Violence, Poverty, Values, and the Will to Punish. Oxford University Press, 2018. http://dx.doi.org/10.1093/oso/9780190888589.003.0007.

Full text
Abstract:
This chapter argues that Fassin’s analysis should be expanded in three ways. First, Fassin should take greater account of how the unlawful state violence he rightly deplores is nonetheless frequently produced in response to violent criminal acts. Losing sight of the underlying problem of criminal violence in poor and marginal communities can make it harder to see how reform might be possible, by reducing the problem to one of arbitrary labeling (and subsequent punishment) of certain kinds of conduct. Second, while Fassin notes the connections between vulnerability to state violence and poverty, it would be worth paying more attention to the way economic inequality dehumanizes certain subjects and makes them more vulnerable objects of state abuse. Social analysis should be humanizing, in response. Third, Fassin should make explicit the positive value commitments latent in his critique, as a guide to reform.
APA, Harvard, Vancouver, ISO, and other styles

Book chapters on the topic "Inc Harvey Probber"

1

Bao, Heng, Lirui Deng, Jiazhi Guan, Liang Zhang, and Xunxun Chen. "Improving Deepfake Video Detection with Comprehensive Self-consistency Learning." In Communications in Computer and Information Science, 151–61. Singapore: Springer Nature Singapore, 2022. http://dx.doi.org/10.1007/978-981-19-8285-9_11.

Full text
Abstract:
Deepfake videos created by generative models have recently become a serious societal problem, as they are hardly distinguishable by human eyes, which has attracted considerable academic attention. Previous research has made efforts to address this problem through various schemes that extract visual artifacts of non-pristine frames or discrepancies between real and fake videos, where patch-based approaches have shown promise but are mostly used for frame-level prediction. In this paper, we propose a method that leverages comprehensive consistency learning over both spatial and temporal relations with patch-based feature extraction. Extensive experiments on multiple datasets demonstrate the effectiveness and robustness of our approach, which combines all consistency cues together.
APA, Harvard, Vancouver, ISO, and other styles
2

Margaria, Tiziana, Hafiz Ahmad Awais Chaudhary, Ivan Guevara, Stephen Ryan, and Alexander Schieweck. "The Interoperability Challenge: Building a Model-Driven Digital Thread Platform for CPS." In Lecture Notes in Computer Science, 393–413. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-89159-6_25.

Full text
Abstract:
With the heterogeneity of the Industry 4.0 world, and more generally of the cyber-physical systems realm, the quest towards a platform approach to solve the interoperability problem is front and centre to any system and system-of-systems project. Traditional approaches cover individual aspects, like data exchange formats and published interfaces. They may adhere to some standard; however, they hardly cover the production of the integration layer, which is implemented as bespoke glue code that is hard to produce and even harder to maintain. Therefore, the traditional integration approach often leads to poor code quality, further increasing time and cost, reducing agility, and creating a high reliance on individual development skills. We instead tackle the interoperability challenge by building a model-driven/low-code Digital Thread platform that 1) systematizes the integration methodology, 2) provides methods and techniques for the individual integrations based on a layered Domain Specific Languages (DSL) approach, 3) through the DSLs covers the integration space domain by domain, technology by technology, and is thus highly generalizable and reusable, 4) showcases a first collection of examples from the domains of robotics, IoT, data analytics, AI/ML and web applications, 5) brings cohesiveness to the aforementioned heterogeneous platform, and 6) is easier to understand and maintain, even by non-specialized programmers. We showcase the power, versatility and potential of the Digital Thread platform on four interoperability case studies: the generic extension to REST services, to robotics through the UR family of robots, to the integration of various external databases (for data integration) and to the provision of data analytics capabilities in R.
APA, Harvard, Vancouver, ISO, and other styles
3

Kupferman, Orna, Ofer Leshkowitz, and Naama Shamash Halevy. "Synthesis with Privacy Against an Observer." In Lecture Notes in Computer Science, 256–77. Cham: Springer Nature Switzerland, 2024. http://dx.doi.org/10.1007/978-3-031-57228-9_13.

Full text
Abstract:
We study automatic synthesis of systems that interact with their environment and maintain privacy against an observer to the interaction. The system and the environment interact via sets $I$ and $O$ of input and output signals. The input to the synthesis problem contains, in addition to a specification, also a list of secrets, a function $\textsf{cost}: I \cup O \rightarrow \mathbb{N}$, which maps each signal to the cost of hiding it, and a bound $b \in \mathbb{N}$ on the budget that the system may use for hiding of signals. The desired output is an ($I/O$)-transducer $\mathcal{T}$ and a set $\mathcal{H} \subseteq I \cup O$ of signals that respects the bound on the budget, thus $\sum_{s \in \mathcal{H}} \textsf{cost}(s) \le b$, such that for every possible interaction of $\mathcal{T}$, the generated computation satisfies the specification, yet an observer from which the signals in $\mathcal{H}$ are hidden cannot evaluate the secrets. We first show that the complexity of the problem is 2EXPTIME-complete for specifications and secrets in LTL; thus it is not harder than synthesis with no privacy requirements. We then analyze the complexity of the problem more carefully, isolating the two aspects that do not exist in traditional synthesis, namely the need to hide the value of the secrets and the need to choose the set $\mathcal{H}$. We do this by studying settings in which traditional synthesis can be solved in polynomial time (when the specification formalism is deterministic automata and when the system is closed) and show that each of the two aspects involves an exponential blow-up in the complexity.
We continue and study bounded synthesis with privacy, where the input also includes a bound on the size of the synthesized transducer, as well as a variant of the problem in which the observer has knowledge about the specification, which can be helpful in evaluating the secrets. We study the effect of both variants on the different aspects of the problem and provide algorithms with tight complexity.
APA, Harvard, Vancouver, ISO, and other styles
4

Marathe, Nachiket P., and Michael S. Bank. "The Microplastic-Antibiotic Resistance Connection." In Microplastic in the Environment: Pattern and Process, 311–22. Cham: Springer International Publishing, 2021. http://dx.doi.org/10.1007/978-3-030-78627-4_9.

Full text
Abstract:
Microplastic pollution is a big and rapidly growing environmental problem. Although the direct effects of microplastic pollution are increasingly studied, the indirect effects are hardly investigated, especially in the context of spreading of disease and antibiotic resistance genes, posing an apparent hazard for human health. Microplastic particles provide a hydrophobic surface that provides substrate for attachment of microorganisms and readily supports formation of microbial biofilms. Pathogenic bacteria such as fish pathogens Aeromonas spp., Vibrio spp., and opportunistic human pathogens like Escherichia coli are present in these biofilms. Moreover, some of these pathogens are shown to be multidrug resistant. The presence of microplastics is known to enhance horizontal gene transfer in bacteria and thus, may contribute to dissemination of antibiotic resistance. Microplastics can also adsorb toxic chemicals like antibiotics and heavy metals, which are known to select for antibiotic resistance. Microplastics may, thus, serve as vectors for transport of pathogens and antibiotic resistance genes in the aquatic environment. In this book chapter, we provide background information on microplastic biofouling (“plastisphere concept”), discuss the relationship between microplastic and antibiotic resistance, and identify knowledge gaps and directions for future research.
APA, Harvard, Vancouver, ISO, and other styles
5

Workman, Paul. "Reflections and Outlook on Targeting HSP90, HSP70 and HSF1 in Cancer: A Personal Perspective." In Advances in Experimental Medicine and Biology, 163–79. Cham: Springer International Publishing, 2020. http://dx.doi.org/10.1007/978-3-030-40204-4_11.

Full text
Abstract:
This personal perspective focuses on small-molecule inhibitors of proteostasis networks in cancer—specifically the discovery and development of chemical probes and drugs acting on the molecular chaperones HSP90 and HSP70, and on the HSF1 stress pathway. Emphasis is on progress made and lessons learned, and a future outlook is provided. Highly potent, selective HSP90 inhibitors have proved invaluable in exploring the role of this molecular chaperone family in biology and disease pathology. Clinical activity was observed, especially in non-small-cell lung cancer and HER2-positive breast cancer. Optimal use of HSP90 inhibitors in oncology will likely require development of creative combination strategies. HSP70 family members have proved technically harder to drug. However, recent progress has been made towards useful chemical tool compounds, and these may signpost future clinical drug candidates. The HSF1 stress pathway is strongly validated as a target for cancer therapy. HSF1 itself is a ligandless transcription factor that is extremely challenging to drug directly. HSF1 pathway inhibitors have been identified mostly by phenotypic screening, including a series of bisamides from which a clinical candidate has been identified for treatment of ovarian cancer, multiple myeloma and potentially other cancers.
APA, Harvard, Vancouver, ISO, and other styles
6

Berck, Peter, and Christopher Costello. "Efficiency Controls and the Captured Fishery Regulator." In Sustainable Resource Development in the 21st Century, 125–41. Cham: Springer International Publishing, 2023. http://dx.doi.org/10.1007/978-3-031-24823-8_10.

Full text
Abstract:
Rent dissipation in open access fisheries is well studied (Gordon J Polit Econ 62:124–142, 1954; Homans and Wilen J Environ Econ Manag 32:1–21, 1997) and has been shown to induce efficiency losses of over $50 billion per year in global fisheries (Costello et al Proc Natl Acad Sci 113(18):5125–5129, 2016). While fisheries are increasingly managed with quota-based approaches, over half of the world’s fish catch is still largely unregulated. This lack of complete management stems, in part, from the reluctance of fishery regulators to limit entry or directly regulate harvest. This often leaves restrictions on efficiency—such as technology or season restrictions—as the only means to achieve management goals. We study the situation when a regulator is “captured” in the sense that he cannot directly control entry but acts in the representative fisher’s best interest. Incumbent fishers are faced with the problem that potential entrants appear just like incumbents, so current profits must be weighed against the incentive for entry. We find that when the regulator is captured by industry members, he unambiguously allows overfishing—reaching a lower stock and higher effort than is socially optimal. This steady state has zero rents, but, interestingly, a higher stock and effort than in the pure open access equilibrium.
APA, Harvard, Vancouver, ISO, and other styles
7

Juul Nielsen, Peter, and Lars Heltoft. "Indexicality across the boundaries of syntax, semantics and pragmatics." In Ditransitives in Germanic Languages, 150–94. Amsterdam: John Benjamins Publishing Company, 2023. http://dx.doi.org/10.1075/sigl.7.05nie.

Full text
Abstract:
In Danish, indirect object (IO) constructions fall into two main classes: (1) the three-argument valence-governed pattern and (2) the free indirect object construction. The free IO is a constructional extension to certain types of monotransitive constructions and verbs; by contrast, the valence-governed IO is a manifestation of the third argument of three-place verb stems in (prototypically) transfer constructions. The free indirect object (free IO) in Modern Danish presents an intricate problem, calling for concepts and solutions not normally connected with constructional syntax. Its frequency is extremely low, and intuitions about its acceptability vary according to basic speech act type. In assertive contexts, it comes across as old-fashioned and is hardly productive; in regulative contexts, by contrast, it retains full productivity. The few positive results yielded by a corpus search are almost exclusively examples of free IOs in regulative contexts. Indexicality, as used especially in morphology by Henning Andersen and Raimo Anttila, is the key concept of our analysis. An IO np must identify its argument by pointing indexically to some aspect of the predicate’s semantics, but since – in the case of free IOs – there is no third argument A3 in the verb’s valence schema, there is apparently nothing for the free IO to index. In special cases, however, most importantly in regulative contexts, the free IO finds an alternative indicatum by pointing to features of the performative situation. Our findings indicate the need for a grammatical theory that allows syntactic rules to be not only semantically, but also pragmatically sensitive.
APA, Harvard, Vancouver, ISO, and other styles
8

Morgan, Oliver. "Speaking When You’re Spoken To." In Turn-taking in Shakespeare, 21–42. Oxford University Press, 2019. http://dx.doi.org/10.1093/oso/9780198836353.003.0001.

Full text
Abstract:
This chapter is concerned with what Harvey Sacks has called ‘the speaker sequencing problem’—how a group of people (or fictional characters) is able to decide which of them should speak next. It examines the model of speaker-sequencing now standard in conversation analysis and asks what this model has to offer for the student of dialogue. For literary critical purposes, it concludes, we will need a radically simplified approach. Happily, this is not difficult to achieve. We need only make a single, very reasonable, assumption—that the basic rule which underpins all conversational sequencing is ‘speak when you’re spoken to’. The implications of this assumption for the analysis of dramatic dialogue are explored in Chapter 2.
APA, Harvard, Vancouver, ISO, and other styles
9

Pennington, M. W., W. R. Kem, and E. Karlsson. "Sea anemone potassium channel toxins." In Guidebook to Protein Toxins and Their Use in Cell Biology, 161–63. Oxford University Press, Oxford, 1997. http://dx.doi.org/10.1093/oso/9780198599555.003.0057.

Full text
Abstract:
During screening for dendrotoxin-like compounds in marine organisms, extracts of several sea anemones were found to inhibit the binding of 125I-dendrotoxin I, a probe for voltage-dependent potassium channels, to rat brain synaptosomal membranes (Harvey et al. 1991; Karlsson et al. 1991). Two toxins were later isolated from Caribbean sea anemones, ShK toxin from Stichodactyla helianthus (Karlsson et al. 1992; Aneiros et al. 1993; Castaneda et al. 1995) and BgK toxin from Bunodosoma granulifera (Karlsson et al. 1992). More recently, another toxin has been isolated from Anemonia sulcata, which we refer to here as AsK toxin (Schweitz et al. 1995). These toxins are known to block voltage-dependent potassium channels.
APA, Harvard, Vancouver, ISO, and other styles
10

Hedman, Shawn. "Computability and complexity." In A First Course in Logic. Oxford University Press, 2004. http://dx.doi.org/10.1093/oso/9780198529804.003.0011.

Full text
Abstract:
In this chapter we study two related areas of theoretical computer science: computability theory and computational complexity. Each of these subjects takes mathematical problems as objects of study. The aim is not to solve these problems, but rather to classify them by level of difficulty. Time complexity classifies a given problem according to the length of time required for a computer to solve the problem. The polynomial-time problems P and the nondeterministic polynomial-time problems NP are the two most prominent classes of time complexity. Some problems cannot be solved by the algorithmic process of a computer. We refer to problems as decidable or undecidable according to whether or not there exists an algorithm that solves the problem. Computability theory considers undecidable problems and the brink between the undecidable and the decidable. There are only countably many algorithms and uncountably many problems to solve. From this fact we deduce that most problems are not decidable. To proceed beyond this fact, we must state precisely what we mean by an “algorithm” and a “problem.” One of the aims of this chapter is to provide a formal definition for the notion of an algorithm. The types of problems we shall consider are represented by the following examples. • The even problem: Given an n ∈ ℕ, determine whether or not n is even. • The 10-clique problem: Given a finite graph, determine whether or not there exists a subgraph that is isomorphic to the 10-clique. • The satisfiability problem for first-order logic: Given a sentence of first-order logic, determine whether or not it is satisfiable. The first problem is quite easy. To determine whether a given number is even, we simply check whether the last digit of the number is 0, 2, 4, 6 or 8. The second problem is harder. If the given graph is large and does contain a 10-clique as a subgraph, then we may have to check many subsets of the graph before we find it.
Time complexity gives precise meaning to the ostensibly subjective idea of one problem being “harder” than another. The third problem is the most difficult of the three.
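The contrast the abstract draws between the easy even problem and the harder clique problem can be made concrete in a few lines of code. The sketch below is an illustration (not from the book): a constant-time parity check, and a brute-force k-clique search whose running time grows combinatorially with the number of vertices.

```python
from itertools import combinations

def is_even(n: int) -> bool:
    """The even problem: decidable in constant time (equivalent to
    checking whether the last digit is 0, 2, 4, 6 or 8)."""
    return n % 2 == 0

def has_k_clique(vertices, edges, k: int) -> bool:
    """Brute-force k-clique check: try every k-subset of vertices.

    The number of subsets examined is C(|V|, k), which grows
    combinatorially with |V| -- this is why the clique problem is
    "harder" than the even problem."""
    edge_set = {frozenset(e) for e in edges}
    for subset in combinations(vertices, k):
        # A clique requires every pair in the subset to be an edge.
        if all(frozenset(pair) in edge_set for pair in combinations(subset, 2)):
            return True
    return False

# The complete graph K4 contains a 3-clique; a 4-vertex path does not.
K4_edges = [(0, 1), (0, 2), (0, 3), (1, 2), (1, 3), (2, 3)]
path_edges = [(0, 1), (1, 2), (2, 3)]
```

The same brute-force search with k = 10 instantiates the 10-clique problem from the abstract.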
APA, Harvard, Vancouver, ISO, and other styles

Conference papers on the topic "Inc Harvey Probber"

1

Simon. "Soloway's Rainfall Problem Has Become Harder." In 2013 Learning and Teaching in Computing and Engineering (LaTiCE). IEEE, 2013. http://dx.doi.org/10.1109/latice.2013.44.

Full text
APA, Harvard, Vancouver, ISO, and other styles
2

de Colnet, Alexis, and Pierre Marquis. "On the Complexity of Enumerating Prime Implicants from Decision-DNNF Circuits." In Thirty-First International Joint Conference on Artificial Intelligence {IJCAI-22}. California: International Joint Conferences on Artificial Intelligence Organization, 2022. http://dx.doi.org/10.24963/ijcai.2022/358.

Full text
Abstract:
We consider the problem Enum·IP of enumerating prime implicants of Boolean functions represented by decision decomposable negation normal form (dec-DNNF) circuits. We study Enum·IP from dec-DNNF within the framework of enumeration complexity and prove that it is in OutputP, the class of output polynomial enumeration problems, and more precisely in IncP, the class of polynomial incremental time enumeration problems. We then focus on two closely related, but seemingly harder, enumeration problems where further restrictions are put on the prime implicants to be generated. In the first problem, one is only interested in prime implicants representing subset-minimal abductive explanations, a notion much investigated in AI for more than thirty years. In the second problem, the target is prime implicants representing sufficient reasons, a recent yet important notion in the emerging field of eXplainable AI, since they aim to explain predictions achieved by machine learning classifiers. We provide evidence showing that enumerating specific prime implicants corresponding to subset-minimal abductive explanations or to sufficient reasons is not in OutputP.
APA, Harvard, Vancouver, ISO, and other styles
3

Kashansky, V., R. Prodan, and G. Radchenko. "SOME ASPECTS OF THE WORKFLOW SCHEDULING IN THE COMPUTING CONTINUUM SYSTEMS." In 9th International Conference "Distributed Computing and Grid Technologies in Science and Education". Crossref, 2021. http://dx.doi.org/10.54546/mlit.2021.29.45.001.

Full text
Abstract:
Contemporary computing systems are commonly characterized in terms of data-intensive workflows that are managed by utilizing a large number of heterogeneous computing and storage elements interconnected through complex communication topologies. As the scale of the system grows and workloads become more heterogeneous in both inner structure and arrival patterns, the scheduling problem becomes exponentially harder, requiring problem-specific heuristics. Despite several decades of active research on it, one issue that still requires effort is enabling efficient workflow scheduling in such complex environments while preserving robustness of the results. Moreover, a recent research trend, coined under the term "computing continuum", prescribes convergence of multiscale computational systems with complex spatio-temporal dynamics and diverse sets of management policies. This paper contributes a set of recommendations and a brief analysis of existing scheduling algorithms.
APA, Harvard, Vancouver, ISO, and other styles
4

Sedlaczek, Kai, and Peter Eberhard. "Design Optimization of Rigid Body Mechanism Topology." In ASME 2007 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2007. http://dx.doi.org/10.1115/detc2007-34309.

Full text
Abstract:
Despite modern computer-based design tools, the development process of new mechanisms is still based on the engineer’s experience, intuition and ingenuity. The goal of this work is to find a combination of linkage topology and joint types that represents the most suitable mechanism layout for a particular task. Optimization techniques are hardly used for this design problem except for the task of dimensional synthesis of a given mechanism type. This study presents and compares two different approaches to topology or type optimization of planar rigid body mechanisms that can be used to improve the overall design process. The first approach is based on a truss-like ground structure that represents an over-determined system of rigid bars, from which the most appropriate topology can be extracted by means of gradient-based optimization algorithms. In the second approach, we use a genetic algorithm for the intrinsically combinatorial problem of topology synthesis. We explain both approaches and show their capabilities, their advantages and drawbacks.
APA, Harvard, Vancouver, ISO, and other styles
5

Albeanu, Grigore, Henrik Madsen, and Florin Popentiu-Vladicescu. "LEARNING FROM NATURE: NATURE-INSPIRED ALGORITHMS." In eLSE 2016. Carol I National Defence University Publishing House, 2016. http://dx.doi.org/10.12753/2066-026x-16-158.

Full text
Abstract:
During the last decade, nature has inspired researchers to develop new algorithms [1, 2, 3]. The largest collection of nature-inspired algorithms is biology-inspired: swarm intelligence (particle swarm optimization, ant colony optimization, cuckoo search, bees algorithm, bat algorithm, firefly algorithm etc.), genetic and evolutionary strategies, artificial immune systems etc. As well-known examples, the following have to be mentioned: aircraft wing design, wind turbine design, bionic car, bullet train, optimal decisions related to traffic, appropriate strategies to survive under a well-adapted immune system etc. Based on the collective social behavior of organisms, researchers have developed optimization strategies taking into account not only the individuals, but also groups and environment [1]. However, learning from nature, new classes of approaches can be identified, tested and compared against already available algorithms. After a short introduction, this work reviews the most effective, according to their performance, nature-inspired algorithms, in the second section. The third section is dedicated to learning strategies based on nature-oriented thinking. Examples and the benefits obtained from applying nature-inspired strategies in problem solving are given in the fourth section. Concluding remarks are given in the final section. References 1. G. Albeanu, B. Burtschy, Fl. Popentiu-Vladicescu, Soft Computing Strategies in Multiobjective Optimization, Ann. Spiru Haret Univ., Mat-Inf Ser., 2013, 2, http://anale-mi.spiruharet.ro/upload/full_2013_2_a4.pdf 2. H. Madsen, G. Albeanu, and Fl. Popentiu-Vladicescu, BIO Inspired Algorithms in Reliability, In H. Pham (ed.) Proceedings of the 20th ISSAT International Conference on Reliability and Quality in Design, Reliability and Quality in Design, August 7-9, 2014, Seattle, WA, U.S.A. 3. N. Shadbolt, Nature-Inspired Computing, http://www.agent.ai/doc/upload/200402/shad04_1.pdf
APA, Harvard, Vancouver, ISO, and other styles
6

Masana, Ravindra, and Mohammed F. Daqaq. "Exploiting Super-Harmonic Resonances of a Bi-Stable Axially-Loaded Beam for Energy Harvesting Under Low-Frequency Excitations." In ASME 2011 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2011. http://dx.doi.org/10.1115/detc2011-47723.

Full text
Abstract:
A research paradox currently lies in the design of miniaturized vibratory energy harvesters capable of harnessing energy efficiently from low-frequency excitations. To address this problem, this effort investigates the prospect of utilizing super-harmonic resonances of a bi-stable system to harvest energy from excitation sources with low-frequency components. Towards that objective, the paper considers the electromechanical response of an axially-loaded clamped-clamped piezoelectric beam harvester with bi-stable potential characteristics. By numerically constructing the voltage-frequency bifurcation maps of the response near the super-harmonic resonance of order two, it is shown that, for certain base excitation levels, the harvester can exhibit responses that are favorable for energy harvesting. These include a unique branch of large-orbit periodic inter-well oscillations, coexisting branches of large-orbit solutions, and a bandwidth of frequencies where a unique chaotic attractor exists. In these regions, the harvester can produce power levels that are comparable to those obtained near the primary resonance.
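The bi-stable beam described in this abstract is commonly modeled as a Duffing-type oscillator with a double-well potential. The sketch below is not the paper's electromechanical model; it is a generic damped bistable oscillator, included only to show the two stable equilibria that make inter-well oscillations possible (all parameters illustrative).

```python
def settle(x0, v0=0.0, delta=0.2, dt=1e-3, steps=100_000):
    """Damped, unforced bistable (Duffing-type) oscillator
    x'' = -delta*x' + x - x**3, whose potential has wells at x = +/-1.
    Returns the equilibrium the trajectory settles into."""
    x, v = x0, v0
    for _ in range(steps):
        a = -delta * v + x - x ** 3   # acceleration
        v += a * dt                   # semi-implicit Euler step
        x += v * dt
    return round(x, 3)

# Start on either side of the unstable equilibrium at x = 0.
right = settle(0.1)
left = settle(-0.1)
```

Starting from a small positive (negative) displacement the state settles into the right (left) well; large-orbit inter-well motion of the kind the paper exploits arises only when forcing pushes the state across the potential barrier.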
APA, Harvard, Vancouver, ISO, and other styles
7

Mukherjee, Arpan, Rahul Rai, Puneet Singla, Tarunraj Singh, and Abani Patra. "Non-Negative Matrix Factorization Based Uncertainty Quantification Method for Complex Networked Systems." In ASME 2015 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2015. http://dx.doi.org/10.1115/detc2015-46087.

Full text
Abstract:
The behavior of large networked systems with underlying complex nonlinear dynamics is hard to predict. With an increasing number of states, the problem becomes even harder. Quantifying uncertainty in such systems by conventional methods requires high computational time, and the accuracy obtained in estimating the state variables can also be low. This paper presents a novel computational Uncertainty Quantification (UQ) method for complex networked systems. Our approach is to represent the complex systems as networks (graphs) whose nodes represent the dynamical units, and whose links stand for the interactions between them. First, we apply a Non-negative Matrix Factorization (NMF) based decomposition method to partition the domain of the dynamical system into clusters, such that the inter-cluster interaction is minimized and the intra-cluster interaction is maximized. The decomposition method takes into account the dynamics of individual nodes to perform system decomposition. Initial validation has been performed on two well-known dynamical systems. The validation results show that the uncertainty propagation error, quantified by RMS errors obtained through our algorithms, is competitive with, and often better than, that of existing methods.
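As a hedged sketch of the general idea (not the authors' algorithm), a symmetric NMF of a network's adjacency matrix can induce the kind of clustering the abstract describes, with each node assigned to the factor that weights it most. The update rule, the toy graph, and all parameters below are illustrative assumptions.

```python
import numpy as np

def nmf_partition(A, k, iters=500, seed=0):
    """Cluster the nodes of adjacency matrix A by factorizing A ~ H @ H.T
    with H >= 0 (symmetric NMF); each node joins the cluster whose column
    of H gives it the largest weight."""
    rng = np.random.default_rng(seed)
    H = rng.random((A.shape[0], k)) + 0.1
    for _ in range(iters):
        # damped multiplicative update: keeps H non-negative
        H *= 0.5 + 0.5 * (A @ H) / np.maximum(H @ (H.T @ H), 1e-12)
    return H.argmax(axis=1)

# Two dense blocks joined by one weak link: the factorization should
# separate them into two clusters (low inter-, high intra-cluster weight).
A = np.zeros((6, 6))
A[:3, :3] = 1.0
A[3:, 3:] = 1.0
A[2, 3] = A[3, 2] = 0.1
labels = nmf_partition(A, k=2)
```

The sketch uses NumPy and a standard damped multiplicative update for symmetric NMF; the cited paper's decomposition additionally accounts for node dynamics, which this toy example does not.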
APA, Harvard, Vancouver, ISO, and other styles
8

Wallqvist, Viveca, Per M. Claesson, Agne Swerin, Catherine Östlund, Joachim Schoelkopf, and Patrick A. C. Gane. "Adhesive and Long-range Capillary Forces Between Hydrophobic Surfaces in Water: Effects of Surface Topography." In Advances in Pulp and Paper Research, Oxford 2009, edited by S. J. I’Anson. Fundamental Research Committee (FRC), Manchester, 2009. http://dx.doi.org/10.15376/frc.2009.2.1167.

Full text
Abstract:
Interactions between a hydrophobic probe particle and surfaces with nanoscopic surface features have been investigated. Such surfaces were prepared by polishing or by spin-coating of nanoparticles. The surface topography was characterized by AFM, using the methods of high-resolution imaging, low-resolution imaging using the probe particle, and the rolling-ball method. The polished surfaces display sharp nanoscopic peaks and hardly any crevices. In contrast, the spin-coated surfaces can be characterized as nanostructured, due to the high density of nanoparticles that, on a short length scale, provides a regular pattern of crevices and hills. On all surfaces a larger waviness is also distinguished. In all cases the dominant force at short separations was found to be a capillary attraction due to the formation of an air/vapour condensate. Our data show that the large-scale waviness of the surface does not significantly influence the range and magnitude of the capillary attraction, but large local variations in these quantities are found. The large variation in adhesion force corresponds to a small variation in the local contact angle of the capillary condensate at the surfaces. The report discusses how the nature of the surface topographical features influences the capillary attraction, by affecting the local contact angle and by pinning of the three-phase contact line. The effect clearly depends on whether the surface features exist in the form of crevices or as extending ridges.
APA, Harvard, Vancouver, ISO, and other styles
9

Kishita, Yusuke, Yuta Inoue, Shinichi Fukushige, Yasushi Umeda, and Hideki Kobayashi. "Estimation of Long-Term Copper Demand Based on Sustainability Scenarios: A Challenge to Sustainable Manufacturing Industry." In ASME 2012 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. American Society of Mechanical Engineers, 2012. http://dx.doi.org/10.1115/detc2012-70695.

Full text
Abstract:
A variety of sustainability scenarios (e.g., IPCC's Emissions Scenarios) have been described toward a sustainable society. While many of them aim at solving climate change problems and often assume various low-carbon technologies, the problem is that such scenarios hardly examine their feasibility from the viewpoint of resource depletion. In particular, copper is a critical base metal because introducing low-carbon technologies (e.g., electric vehicles and wind power generators) may increase copper consumption. To assess the feasibility of existing sustainability scenarios, this paper proposes a method for estimating long-term copper demand based on those scenarios. Our method proposes an integrated model that evaluates world copper demand from two principal aspects influencing copper consumption — (1) the building of social infrastructure and (2) new products that might disseminate in the future (e.g., electric vehicles and photovoltaic systems). A case analysis on a long-term energy scenario is carried out. Its results reveal that cumulative world copper consumption exceeds the copper reserves in the earth by 2040. The increase in copper consumption results mainly from world economic growth led by developing countries, while the dissemination of electric vehicles and photovoltaic systems has a minor impact on the increase.
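The core accounting in such scenario studies can be sketched as a cumulative-demand model: annual demand grows, cumulative demand is summed, and the exhaustion year is the first year the sum exceeds the reserve. The figures below are purely illustrative placeholders, not the paper's scenario data.

```python
def years_to_exhaustion(reserves, demand0, growth, start=2012):
    """Return the year in which cumulative demand, growing geometrically
    from demand0 per year, first exceeds the reserve."""
    cum, year, demand = 0.0, start, demand0
    while cum < reserves:
        cum += demand          # add this year's demand to the running total
        demand *= 1 + growth   # geometric demand growth
        year += 1
    return year

# Illustrative placeholders only (million tonnes, 3% annual growth).
year = years_to_exhaustion(reserves=690.0, demand0=20.0, growth=0.03)
```

The paper's actual model is richer (infrastructure build-out plus product dissemination terms), but the feasibility question it answers has this cumulative-sum shape.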
APA, Harvard, Vancouver, ISO, and other styles
10

Ghiasi, Hossein, Damiano Pasini, and Larry Lessard. "Layer Separation for Optimization of Composite Laminates." In ASME 2008 International Design Engineering Technical Conferences and Computers and Information in Engineering Conference. ASMEDC, 2008. http://dx.doi.org/10.1115/detc2008-50106.

Full text
Abstract:
The excellent mechanical properties of laminated composites cannot be exploited without a careful design of the stacking sequence of the layers. An important variable in the search for the optimum stacking sequence is the number of layers: the larger this number, the harder and longer the search for an optimal solution. To tackle such a variable-dimensional problem efficiently, we introduce here a multi-level optimization technique. The proposed method, called Layer Separation (LS), increases or decreases the number of layers by gradually separating a layer into two, or by merging two layers into one. LS uses different levels of laminate representation, ranging from a coarse-level parameterization, which corresponds to a small number of thick layers, to a fine-level parameterization, which corresponds to a large number of thin layers. A benefit of such differentiation is an increased probability of finding the global optimum. In this paper, LS is applied to the design of composite laminates under single and multiple loadings. The results show that the LS convergence rate is not inferior to that of other optimization techniques available in the literature. It is faster than an evolutionary algorithm, more efficient than a layerwise method, simple to perform, and it can be terminated at any point during the optimization process.
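The two basic LS moves, separating one layer into two and merging two layers into one, can be sketched as list operations on a stacking sequence of (angle, thickness) pairs. The representation is an illustrative assumption, not the paper's implementation.

```python
def separate(stack, i):
    """Split layer i into two layers of half thickness each, so that their
    angles can subsequently be optimized independently."""
    angle, t = stack[i]
    return stack[:i] + [(angle, t / 2), (angle, t / 2)] + stack[i + 1:]

def merge(stack, i):
    """Merge layers i and i+1, summing thicknesses and keeping layer i's angle."""
    (a1, t1), (_, t2) = stack[i], stack[i + 1]
    return stack[:i] + [(a1, t1 + t2)] + stack[i + 2:]

# A coarse representation: few thick layers (angle in degrees, thickness).
coarse = [(0, 4.0), (45, 4.0)]
finer = separate(coarse, 0)   # one layer split in two; total thickness unchanged
```

Both moves preserve total laminate thickness, which is what lets the optimizer travel between coarse and fine parameterizations of the same laminate.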
APA, Harvard, Vancouver, ISO, and other styles

Reports on the topic "Inc Harvey Probber"

1

Baader, Franz, Nguyen Than Binh, Stefan Borgwardt, and Barbara Morawska. Unification in the Description Logic EL Without Top Constructor. Technische Universität Dresden, 2011. http://dx.doi.org/10.25368/2022.179.

Full text
Abstract:
Unification in Description Logics has been proposed as a novel inference service that can, for example, be used to detect redundancies in ontologies. The inexpressive Description Logic EL is of particular interest in this context since, on the one hand, several large biomedical ontologies are defined using EL. On the other hand, unification in EL has recently been shown to be NP-complete, and thus of considerably lower complexity than unification in other DLs of similarly restricted expressive power. However, EL allows the use of the top concept (⊤), which represents the whole interpretation domain, whereas the large medical ontology SNOMED CT makes no use of this feature. Surprisingly, removing the top concept from EL makes the unification problem considerably harder. More precisely, we show that unification in EL without the top concept is PSpace-complete.
APA, Harvard, Vancouver, ISO, and other styles
2

Baader, Franz, Carsten Lutz, Eldar Karabaev, and Manfred Theißen. A New n-ary Existential Quantifier in Description Logics. Technische Universität Dresden, 2005. http://dx.doi.org/10.25368/2022.151.

Full text
Abstract:
Motivated by a chemical process engineering application, we introduce a new concept constructor in Description Logics (DLs), an n-ary variant of the existential restriction constructor, which generalizes both the usual existential restrictions and so-called qualified number restrictions. We show that the new constructor can be expressed in ALCQ, the extension of the basic DL ALC by qualified number restrictions. However, this representation results in an exponential blow-up. By giving direct algorithms for ALC extended with the new constructor, we show that reasoning in this new DL is actually not harder than reasoning in ALCQ. Moreover, in our chemical process engineering application, a restricted DL that provides only the new constructor together with conjunction, and satisfies an additional restriction on the occurrence of role names, is sufficient. For this DL, the subsumption problem is polynomial.
APA, Harvard, Vancouver, ISO, and other styles
3

Philosoph-Hadas, Sonia, Peter Kaufman, Shimon Meir, and Abraham Halevy. Signal Transduction Pathway of Hormonal Action in Control and Regulation of the Gravitropic Response of Cut Flowering Stems during Storage and Transport. United States Department of Agriculture, October 1999. http://dx.doi.org/10.32747/1999.7695838.bard.

Full text
Abstract:
Original objectives: The basic goal of the present project was to increase our understanding of the cellular mechanisms operating during the gravitropic response of cut flowers, in order to solve their bending problem without affecting flower quality. Thus, several elements operating at the three levels of the gravity-induced signal transduction pathway were proposed to be examined in snapdragon stems, according to the following research goals: 1) Signaling: characterize the signal transduction pathway leading to the gravitropic response, regarding the involvement of [Ca2+]cyt as a mediator of IAA movement and sensitivity to auxin. 2) Transduction by plant hormones: a) Examine the involvement of auxin in the gravitropic response of flower stems with regard to: possible participation of auxin binding protein (ABP), auxin redistribution, auxin mechanism of action (activation of H+-ATPase), mediation by changes in [Ca2+]cyt, and possible regulation of auxin-induced Ca2+ action by calmodulin-activated or Ca2+-activated protein kinases (PK). b) Examine the involvement of ethylene in the gravitropic response of flower stems with regard to auxin-induced ethylene production and sensitivity of the tissue to ethylene. 3) Response: examine the effect of gravistimulation on invertase (associated with growth and elongation) activity and invertase gene expression. 4) Commercial practice: develop practical and simple treatments to prevent bending of cut flowers grown for export. Revisions: 1) Model systems: in addition to snapdragon (Antirrhinum majus L.), three other model shoot systems, consisting of oat (Avena sativa) pulvini, Ornithogalum 'Nova' cut flowers and Arabidopsis thaliana inflorescence, were targeted to confirm a more general mechanism for shoot gravitropism. 2) Research topics: the involvement of ABP, auxin action, PK and invertase in the gravitropic response of snapdragon stems could not be demonstrated.
Alternatively, the involvement in the gravity signaling cascade of several other physiological mediators apart from [Ca2+]cyt, such as IP3, protein phosphorylation and the actin cytoskeleton, was shown. Additional topics introduced: starch statolith reorientation, differential expression of early auxin-responsive genes, and differential shoot growth. Background to the topic: The gravitropic bending response of flowering shoots occurring upon their horizontal placement during shipment constitutes a major horticultural problem. In spite of extensive studies in various aboveground organs, the gravitropic response has hardly been investigated in flowering shoots. Being a complex multistep process that requires the participation of various cellular components acting in succession or in parallel, analysis of the negative gravitropic response of shoots includes investigation of signal transduction elements and various regulatory physiological mediators. Major achievements: 1) A correlative role for starch statoliths as gravireceptors in flowering shoots was initially established. 2) Differentially phosphorylated proteins and IP3 levels across the oat shoot pulvini, as well as a differential appearance of two early auxin-responsive genes in snapdragon stems, were all detected within 5-30 minutes following gravistimulation. 3) Unlike in roots, involvement of the actin cytoskeleton in early events of the gravitropic response of snapdragon shoots was established. 4) An asymmetric IAA distribution, followed by asymmetric ethylene production across snapdragon stems, was found following gravistimulation. 5) The gravity-induced differential growth in shoots of snapdragon was derived from initial shrinkage of the upper stem side and a subsequent elongation of the lower stem side. 6) Shoot bending could be successfully inhibited by Ca2+ antagonists (that serve as a basis for practical treatments), kinase and phosphatase inhibitors, and actin-cytoskeleton modulators.
None of these agents affected vertical growth. The essential characterization of these key events and their sequence led us to the conclusion that blocking gravity perception may be the most powerful means to inhibit bending without hampering shoot and flower growth after harvest. Implications, scientific and agricultural: The innovative results of this project have provided new insight into the basic understanding of gravitropism in flower stalks, partially filling the gap in our knowledge, and established useful means for its control. Additionally, our analysis has advanced the understanding of the important and fundamental physiological processes involved, thereby leading to new ideas for agriculture. Gravitropism has an important impact on agriculture, particularly for controlling the bending of various important agricultural products with economic value. So far, no safe control of the undesired bending of flower stalks has been established. Our results show for the first time that shoot bending of cut flowers can be inhibited without adverse effects by controlling the gravity perception step with Ca2+ antagonists and cytoskeleton modulators. Such a practical benefit resulting from this project is of great economic value for the floriculture industry.
APA, Harvard, Vancouver, ISO, and other styles
4

Crisosto, Carlos, Susan Lurie, Haya Friedman, Ebenezer Ogundiwin, Cameron Peace, and George Manganaris. Biological Systems Approach to Developing Mealiness-free Peach and Nectarine Fruit. United States Department of Agriculture, 2007. http://dx.doi.org/10.32747/2007.7592650.bard.

Full text
Abstract:
Peach and nectarine production worldwide is increasing; however, consumption is flat or declining because of the inconsistent eating quality experienced by consumers. The main factor in this inconsistent quality is mealiness or woolliness, a form of chilling injury that develops following the shipping periods typical of the global fruit market today. Our research groups have devised various postharvest methods to prolong storage life, including controlled atmosphere and delayed storage; however, these treatments only delay mealiness. Mealy texture results from disruption of the normal ripening process involving disassembly of cell wall material, and creates a soft fruit texture that is dry and grainy instead of juicy and smooth. Solving this problem is a prerequisite for increasing the demand for fresh peach and nectarine. Two approaches were used to reveal genes and their associated biochemical processes that can confer resistance to mealiness or woolliness. At the Volcani Center, Israel, a nectarine cultivar and the peach cultivar from which it spontaneously arose (isogenic materials) were used, and at the Kearney Agricultural Center of UC Davis, USA, a peach population that segregates for quantitative resistance to mealiness was used for dissecting the genetic components of mealiness development. During our project we conducted research integrating information from phenotypic, biochemical and gene expression studies, proposed possible candidate genes, and performed SNP-QTL mapping related to reducing peach mealiness susceptibility. Numerous genes related to ethylene biosynthesis and its signal transduction, cell wall structure and metabolism, stress response, and different transcription factor families were detected as differentially accumulated in the cold-treated samples of the sensitive and less sensitive genotypes.
The ability to produce ethylene and keep active genes involved in ethylene signaling (a GTP-binding protein, an EIN-3 binding protein and an ethylene receptor), together with activation of ethylene-responsive fruit ripening genes during cold storage, provided greater resistance to chilling injury (CI). Interestingly, among the genes differentially expressed at harvest, the less chilling-sensitive cultivar had more genes in categories related to antioxidants and heat shock proteins/chaperones that may help fruit adapt to low-temperature stress. The specific objectives of the proposed research were to: characterize the phenotypes and cell wall components of the two resistant systems in response to mealiness-inducing conditions; identify commonalities and specific differences in cell wall proteins and the transcriptome that are associated with low mealiness incidence; integrate the information from phenotypic, biochemical, and gene expression studies to identify candidate genes that are involved in reducing mealiness susceptibility; locate these genes in the Prunus genome; and associate the genes with genomic regions conferring quantitative genetic variation for mealiness resistance. By doing this we will locate genetic markers for mealiness development, essential tools for the selection of mealiness-resistant peach lines with improved fruit storability and quality. In our research, QTLs have been located in our peach SNP map, and candidate genes obtained from the integrated results of phenotypic, biochemical and gene expression analyses are being identified within our QTLs, in the search for reliable selection markers for peach breeding programs.
APA, Harvard, Vancouver, ISO, and other styles
5

Lichter, Amnon, Joseph L. Smilanick, Dennis A. Margosan, and Susan Lurie. Ethanol for postharvest decay control of table grapes: application and mode of action. United States Department of Agriculture, July 2005. http://dx.doi.org/10.32747/2005.7587217.bard.

Full text
Abstract:
Original objectives: Dipping table grapes in ethanol was determined to be an effective measure to control postharvest gray mold infection caused by Botrytis cinerea. Our objectives were to study the effects of ethanol on B. cinerea and on table grapes, and to conduct research that would facilitate the implementation of this treatment. Background: Botrytis cinerea is known as the major pathogen of table grapes in cold storage. To date, the only commercial technology to control it has relied on sulfur dioxide (SO₂), implemented either by fumigation of storage facilities or by slow-release generator pads positioned directly over the fruit. This treatment is very effective, but it has several drawbacks, such as aftertaste, bleaching and hypersensitivity in humans, which removed it from the GRAS list of compounds and warranted a further search for alternatives. Prior to this research, ethanol had been shown to control several pathogens in different commodities, including B. cinerea on table grapes. Hence it seemed to be a simple and promising technology which could offer a true alternative for the storage of table grapes. Further research was, however, required to answer some practical and theoretical questions which remained open. Major conclusions, solutions, achievements: In this research project we have shown convincingly that 30% ethanol is sufficient to prevent germination of B. cinerea and kill the spores. In a comparative study it was shown that Alternaria alternata is also rather sensitive, but Rhizopus stolonifer and Aspergillus niger are less sensitive to ethanol. Consequently, ethanol protected the grapes from decay but did not have a significant effect on the occurrence of mycotoxigenic Aspergillus species which are present on the surface of the berry. B. cinerea responded to ethanol or heat treatments by inducing sporulation and transient expression of the heat shock protein HSP104. Similar responses were not detected in grape berries.
It was also shown that application of ethanol to berries did not induce subsequent resistance; in fact, the berries were slightly more susceptible to infection. The heat dose required to kill the spores was determined, and it was proven that a combination of heat and ethanol allowed a reduction of both the ethanol and the heat dose. Ethanol and heat did not reduce the amount or appearance of the wax layers, which are an essential component of the external protection of the berry. The ethanol and acetaldehyde content increased after treatment and during storage, but remained much lower than the natural ethanol content of other fruits. The efficacy of ethanol applied before harvest was similar to that of the biological control agent Metschnikowia fructicola. Finally, the performance of ethanol could be improved synergistically by packaging the bunches in modified-atmosphere films which prevent the accumulation of free water. Implications, both scientific and agricultural: It was shown that the major mode of action of ethanol is mediated by its lethal effect on fungal inoculum. Because ethanol acts mainly on cell membranes, it was possible to enhance its effect by lowering the concentration and elevating the temperature of the treatment. Another important development was the continuous protection of the treated bunches by modified atmosphere, which can solve the problem of secondary or internal infection. From the practical standpoint, a variety of means were offered to enhance the effect of the treatment and to offer a viable alternative to SO₂ which could be instantly adopted by the industry, with a special benefit to growers of organic grapes.
APA, Harvard, Vancouver, ISO, and other styles
6

Heitman, Joshua L., Alon Ben-Gal, Thomas J. Sauer, Nurit Agam, and John Havlin. Separating Components of Evapotranspiration to Improve Efficiency in Vineyard Water Management. United States Department of Agriculture, March 2014. http://dx.doi.org/10.32747/2014.7594386.bard.

Full text
Abstract:
Vineyards are found on six of seven continents, producing a crop of high economic value with much historic and cultural significance. Because of the wide range of conditions under which grapes are grown, management approaches are highly varied and must be adapted to local climatic constraints. Research has been conducted in the traditionally prominent grape-growing regions of Europe, Australia, and the western USA, but far less information is available to guide production under more extreme growing conditions. The overarching goal of this project was to improve understanding of vineyard water management related to the critical inter-row zone. Experiments were conducted in moist temperate (North Carolina, USA) and arid (Negev, Israel) regions in order to address inter-row water use under high and low water availability conditions. Specific objectives were to: i) calibrate and verify a modeling technique to identify components of evapotranspiration (ET) in temperate and semiarid vineyard systems, ii) evaluate and refine strategies for excess water removal in vineyards in moist temperate regions of the southeastern USA, and iii) evaluate and refine strategies for water conservation in vineyards in semi-arid regions of Israel. Several new measurement and modeling techniques were adapted and assessed in order to partition ET between favorable transpiration by the grapes and potentially detrimental water use within the vineyard inter-row. A micro Bowen ratio measurement system was developed to quantify ET from inter-rows. The approach was successful at the NC site, providing strong correlation with standard measurement approaches and adding the capability for continuous, non-destructive measurement within a relatively small footprint. The environmental conditions at the Negev site were found to limit the applicability of the technique; technical issues are yet to be solved to make it sufficiently robust.
The HYDRUS 2D/3D modeling package was also adapted, using data obtained in a series of intense field campaigns at the Negev site. The adapted model was able to account for spatial variation in surface boundary conditions, created by diurnal canopy shading, in order to accurately calculate the contribution of inter-row evaporation (E) as a component of system ET. Experiments evaluated common practices in the southeastern USA: inter-row cover crops purported to reduce water availability and thereby favorably reduce grapevine vegetative growth; and in southern Israel: drip irrigation applied to produce a high-value crop with maximum water use efficiency. Results from the NC site indicated that water use by the cover crop contributed a significant portion of vineyard ET (up to 93% in May), but that with the ample rainfall typical of the region, cover crop water use did little to limit water availability for the grapevines. A potential consequence, however, was elevated below-canopy humidity owing to the increased inter-row evapotranspiration associated with the cover crops. This creates increased potential for fungal disease occurrence, which is a common problem in the region. Analysis from the Negev site reveals that, on average, E accounts for about 10% of the total vineyard ET in an isolated drip-irrigated vineyard. The proportion of ET contributed by E increased from May until just before harvest in July, which could be explained primarily by changes in weather conditions. While non-productive water loss as E is relatively small, experiments indicate that further improvements in irrigation efficiency may be possible by considering diurnal shading effects on below-canopy potential ET. Overall, the research provided both scientific and practical outcomes, including new measurement and modeling techniques and new insights for humid and arid vineyard systems.
The research techniques developed through the project will be useful for other agricultural systems, and the successful synergistic cooperation among the research team offers opportunities for future collaboration.
APA, Harvard, Vancouver, ISO, and other styles
7

Soliciting opinions and solutions on the "Q Zhang's Problem". BDICE, March 2023. http://dx.doi.org/10.58911/bdic.2023.03.001.

Full text
Abstract:
"Q Zhang's problem" is a teaching problem proposed by Qian Zhang, a science teacher at Dongjiao Minxiang Primary School in Dongcheng District, Beijing. In 2022, she observed that: (1) when explaining frequency in the "Sound" unit, experiments on the vibration of objects such as rubber bands and steel rulers were used to assist students in learning, but the effect was not obvious, because it is difficult for the naked eye to distinguish how fast such objects vibrate and to relate that speed to high and low frequencies; (2) students seem to be confused about the difference between frequency and amplitude: when guided to make a rubber band vibrate faster, they tend to tug harder at it, but this actually changes the amplitude rather than the frequency (changing the frequency requires controlling the vibrating string length, similar to the way a stringed instrument is tuned). Therefore, demonstration experiments that use objects such as rubber bands to illustrate frequency do not seem suitable and cannot effectively help students establish the concept of frequency. We hope to solicit opinions and solutions (research ideas) on this problem, with a focus on two points: ① the mathematical/physical explanation of the problem, i.e., does simply changing the amplitude really not affect the original vibration frequency of the object (except when the amplitude is 0)? ② an explanation from a cognitive perspective: why do people confuse the two concepts, and what is the cognitive mechanism behind this confusion?
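Point ① can be probed numerically. For an ideal linear oscillator (a first approximation of a plucked rubber band), the period depends on stiffness and mass but not on the initial displacement, so changing only the amplitude leaves the frequency unchanged. A small simulation sketch, with all parameters illustrative:

```python
import math

def period(k, m, x0, dt=1e-5, t_max=10.0):
    """Estimate the oscillation period of x'' = -(k/m)*x released from rest
    at displacement x0, by timing successive zero crossings."""
    x, v, t = x0, 0.0, 0.0
    crossings = []
    while t < t_max and len(crossings) < 2:
        v -= (k / m) * x * dt        # semi-implicit Euler step
        x_new = x + v * dt
        if (x > 0 >= x_new) or (x < 0 <= x_new):
            crossings.append(t)      # record each zero crossing
        x = x_new
        t += dt
    return 2 * (crossings[1] - crossings[0])  # crossings are half a period apart

# Same "rubber band" (k, m), two different amplitudes.
T_small = period(k=1.0, m=1.0, x0=0.2)
T_large = period(k=1.0, m=1.0, x0=1.0)
```

Both runs give a period close to the analytic 2π√(m/k) despite a fivefold difference in amplitude. A real rubber band is only approximately linear, so small amplitude-dependent frequency shifts can occur at large stretches, which may be worth discussing with students separately.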
APA, Harvard, Vancouver, ISO, and other styles