Weekend Reading

As we recover from the election’s end, let’s try to do some reading:

Yale-NUS is unfolding with the ethical and civic compromises that were so easy to foresee. But there is more to it than that. Yale-NUS is being conceived as the realization of a dream that has been harder, but not impossible, to implement in New Haven. That is the idea of a new, smooth and seamless, singular liberal-arts curriculum: centrally controlled, departmentless, and monolingual. (“Avoid[ing] the language barrier” was one thing that attracted Yale to Singapore, where the language of instruction is English.) Departments, in this view, are “silos,” presumed to “hobble” knowledge; they are supposedly stuck in a fractious condition of specificity.

What is proposed instead is something centrally conceived and regulated—more than a mere convenience in an authoritarian state. Yet on the intellectual front, this new model of the liberal arts disingenuously waves the flags of “difference” and “interdisciplinarity,” as if they were novel concepts. (“Many nations live by different traditions and norms,” the Yale-NUS Prospectus helpfully tells us.)

The surprise is that that model has already made inroads in New Haven, with the rise of homogenized, nondepartmental programs and majors with bland titles like “Humanities” or “Global Studies”—and excrescences like the Jackson Institute for Global Affairs. The partial erosion of departments in New Haven has led, logically, to their complete absence at Frankenyale in Singapore. If the promised “feedback loop” between Singapore and New Haven succeeds, the two institutions in tandem will produce a new generation of conformist, dissent-averse managers and executives, particularly well suited for the new global boardroom and tea at Davos.

In 2002, on a Friday, Larry Page began to end the book as we know it. Using the 20 percent of his time that Google then allotted to its engineers for personal projects, Page and Vice-President Marissa Mayer developed a machine for turning books into data. The original was a crude plywood affair with simple clamps, a metronome, a scanner, and a blade for cutting the books into sheets. The process took 40 minutes. The first refinement Page developed was a means of digitizing books without cutting off their spines — a gesture of tender-hearted sentimentality towards print. The great disbinding was to be metaphorical rather than literal. A team of Page-supervised engineers developed an infrared camera that took into account the curvature of pages around the spine. They resurrected a long dormant piece of Optical Character Recognition software from Hewlett-Packard and released it to the open-source community for improvements. They then crowd-sourced textual correction at a minimal cost through a brilliant program called reCAPTCHA, which employs an anti-bot service to get users to read and type in words the Optical Character Recognition software can’t recognize. (A miracle of cleverness: everyone who has entered a security identification has also, without knowing it, aided the perfection of the world’s texts.) Soon after, the world’s five largest libraries signed on as partners. And, more or less just like that, literature became data.

And because I couldn’t not put in some links about the election:
