tragos


Choice—preferably an exhaustive menu of it—pretty much defines our status as consumers, and has long been an unquestioned tenet of the capitalist feast, but in fact carte blanche is no way to run a cultural life (or any kind of life, for that matter), and one thing that has nourished the theatrical experience, from the Athens of Aeschylus to the multiplex, is the element of compulsion. Someone else decides when the show will start; we may decide whether to attend, but, once we take our seats, we join the ride and surrender our will. The same goes for the folks around us, whom we do not know, and whom we resemble only in our private desire to know more of what will unfold in public, on the stage or screen. We are strangers in communion, and, once that pact of the intimate and the populous is snapped, the charm is gone. Our revels now are ended.

“Tower Heist” and “Melancholia” Reviews: The New Yorker

Anthony Lane gives Video on Demand a dressing down, but disguises the effort as a review of Tower Heist and Melancholia.

Happy Halloween!

No question, one can use a smart phone as an aid to memory, and I do use one myself for that purpose. But I don’t find it a congenial repository for anything more complicated than reminding myself to pick up a pair of pants from the cleaners or make an appointment with the cat doctor. If one has the urge to write down a complete thought, a handsome notebook gives it more class. Even a scrap of paper and a stub of a pencil are preferable for philosophizing to typing the same words, since writing a word out, letter by letter, is a more self-conscious process, one more likely to inspire further revisions and elaborations of that thought.

Take Care of Your Little Notebook by Charles Simic | NYRblog | The New York Review of Books

squashed:

I just received a call from a friend in Cairo (I won’t say who it is now because he’s a prominent activist) telling me neither his DSL nor his USB internet service is working. I’ve just checked with two other friends in different parts of Cairo and their internet is not working either.

This just happened 10 minutes ago — and perhaps not coincidentally just after AP TV posted a video of a man being shot.

Will update with more info. The ISPs being used by my friends are TEDATA, Vodafone, and Egynet.

Whoa.

…those of us who turn in disgust from what we consider an overinflated liberal-bourgeois sense of self should be careful what we wish for: our denuded networked selves don’t look more free, they just look more owned.

From Zadie Smith’s “Generation Why?”, reviewing The Social Network, Facebook, and two Zuckerbergs.

Smith makes an interesting move here. She sets Sorkin’s Zuckerberg over and against the real Zuckerberg in order to understand not—as is typical of recent reviews—the social forces behind the technology, but instead the technological forces behind our social world.

In some ways, her review “repurposes” Marshall McLuhan’s decades-old critiques for the software and audience of the new millennium. The medium is no longer the message so much as it is a social template, a mold constricting the freedom with which each individual constructs a self.

Facebook, she claims, patterns us after its founder, who is not Sorkin’s machiavel, but rather the essence of banality. And thus Facebook’s software reduces us to this same essence.

The thesis is reductive, on one hand, but experientially accurate on the other. The more honest voices in my head vouched for the truth of Smith’s analysis. But oddly, these same voices hardly demurred when it came to Tumblr.

I suspect this has something to do with the relatively pure state of virtuality on Tumblr. Unlike Facebook friends (for the most part), Tumblr identities are engendered in the same medium in which they develop. There are far fewer will-o’-the-wisps of purported reality, fewer analogues to the actual, than the stuff of Facebook dreams.

All of this goes to say: Smith was incredibly savvy in making the gap between the fictional and real Zuckerbergs the springboard for her analysis of the latter’s creation.

Asked how technology is changing fiction, [Don DeLillo] speculated that novels would become ‘user-generated’, and wondered if the ‘human need for narrative’ would be reduced. ‘The world is becoming increasingly customised, altered to individual specifications. This shrinking context will necessarily change the language that people speak, write, and read,’ he said. ‘Here’s a stray question (or a metaphysical leap): Will language have the same depth and richness in electronic form that it can reach on the printed page? Does the beauty and variability of our language depend to an important degree on the medium that carries the words? Does poetry need paper?’

From DeLillo’s statement issued on receiving the PEN/Saul Bellow Prize yesterday, as reported by The Guardian.

Few days pass, it seems, without a new report or commentary on the state of print and on the future of the electronic word. Some eulogize print while railing against the bytes that bleed the plasma of our mind. Some declare imminent victory for the electronic book, and necessarily see the novel in print as the illuminated manuscript of our future. Most often, they strike a fair and balanced tone, noting that, of course, we will enjoy a farrago of media, meaning that we will need to come to a more acute understanding of the contexts, content and qualities unique to print and electronic texts.

The day before I left London to move to Ankara, I took a brief excursion to the Victoria and Albert Museum, where I found myself spellbound by the medieval manuscripts on display, by the vibrancy of color, the idiosyncratic richness of the script, and the uncanny collusion of form and content.

These manuscripts helped me understand what Benjamin meant when he proposed his notion of ‘aura.’ (Maybe I never understood this concept as well as I’d convinced myself I had.) Between the content of the gospels and the beauty of the text, both contained within the V&A manuscripts, there was a “third” meaning that superseded both, while remaining elusive.

And yet. Soon after peering into the glass boxes containing the manuscripts, I wandered into a side-hall lined with computers containing educational programs designed to flesh out and clarify the manuscripts. The digital representations allowed me to range over the intricacies of the text and color, to zoom in and out according to the vagaries of my curiosity.

And I loved it. But maybe I was craven, and needed the comfort provided by losing the physical object’s property of evasive ambiguity. Maybe the permanence hinted at but only falsely achieved by the physical page is what this elusiveness was all about.

By keeping lots of brain cells buzzing, Google seemed to be making people smarter. But as [UCLA professor of psychiatry Gary] Small was careful to point out, more brain activity is not necessarily better brain activity. The real revelation was how quickly and extensively Internet use reroutes people’s neural pathways. “The current explosion of digital technology not only is changing the way we live and communicate,” Small concluded, “but is rapidly and profoundly altering our brains.”

What kind of brain is the Web giving us? That question will no doubt be the subject of a great deal of research in the years ahead. Already, though, there is much we know or can surmise—and the news is quite disturbing. Dozens of studies by psychologists, neurobiologists, and educators point to the same conclusion: When we go online, we enter an environment that promotes cursory reading, hurried and distracted thinking, and superficial learning. Even as the Internet grants us easy access to vast amounts of information, it is turning us into shallower thinkers, literally changing the structure of our brain.

Nicholas Carr: The Web Shatters Focus, Rewires Brains | Wired Magazine

(So long, I’m off to read a book)

(via byronic)

If this nascent research on the web’s alteration of our neural pathways has merit, and our brains lose their capacity for extended concentration, and, ironically, we can no longer benefit from the prodigious increase in information at our behest, then I have some questions for those of you out there with an actual education in neuroscience.

First of all: what is the effect that long, extended reading has on our ability to process the more fragmented information we get through the web?

Many if not most of the people I know balance their web time with extended reading of long fictional and non-fictional narratives. In the morning, they will trawl their RSS feeds and Tumblr dashboards; and in the evening, they will sit down to read a novel or a feature piece in the NYRB.

But here is what I don’t know, not even “experientially”: are these two forms of engaging information complementary or incompatible practices?

Does our morning time spent scrolling through Tumblr undermine our ability to read Thomas Mann at night? Or does a careful, considered perusal of a long feature piece in the New Yorker help us assemble and consider the shards of discourse scattered across the web?

What the internet is creating is a class of literate, gifted amateur writers, in an old tradition. Like Trollope, who was a British Post official all his working life, they write for love and because they must. Like Rohinton Mistry, a banking executive, or Wallace Stevens, an insurance executive, or Edmund Wilson, who spent his most productive years sitting in his big stone house in upstate New York and writing about what he damned well pleased. Samuel Pepys, who wrote the greatest diary in the language, was a high official in the British Admiralty. Many people can write well and yearn to, but they are not content, like Pepys, for their work to go unread. A blog on the internet gives them a place to publish. Maybe they don’t get a lot of visits, but it’s out there. As a young woman in San Francisco, Pauline Kael wrote the notes for screenings of great films, and did a little free-lancing. If she’d had a blog, no telling what she might have written during those years.

Roger Ebert in The golden age of movie critics (via girlperson)