Dark Pigeon Skies

We should all be grateful for the Endangered Species Act — passed in 1973 under Richard Nixon of all people, back before Congress had turned over the legislative function to lobbyists – and upheld by the Supreme Court before that august council had become an extension of the Republican National Committee.

In fact some of us would like to strengthen the act, for example by adding humans to the endangered list so that we can be saved like the Peregrine Falcons, Bald Eagles, and California Condors. Also, we’d like to require that large farms leave wide grass buffer zones untilled, both as wildlife habitat and to keep silt-laden runoff from clogging our rivers and sticking our barges in the resulting mud.

Finally, a death penalty should be added for frackers who report (as EQT Resources did in Pennsylvania) that they had sent 21 tons of fracking fluids and drill cuttings to landfill when the actual number was 95,000 tons.

To err is human, but stretching the truth by more than 450,000 percent is unacceptable.
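For sticklers, the arithmetic on those two tonnage figures can be checked in a couple of lines (a quick sketch; the variable names are ours, the numbers come from the report above):

```python
# Quick check of the understatement: 21 tons reported vs. 95,000 tons actual.
reported_tons = 21
actual_tons = 95_000

# The shortfall, expressed as a percentage of what was reported.
understatement_pct = (actual_tons - reported_tons) / reported_tons * 100
print(f"{understatement_pct:,.0f}%")  # prints 452,281%
```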

Yet despite all these virtuous impulses, there are other environmental reflexes that are simply absent from my genetic kit. One occasion for such dubious pontification popped up this month with the 100th anniversary of the day Martha died in a Cincinnati zoo. She was the last living Passenger Pigeon.

Failing to lament Martha’s demise makes me a pariah, I gather, at least judging from the dirge penned by Cornell ornithologist John Fitzpatrick for the New York Times this past Sunday. He thinks Martha, if she could, would ask: “Have you learned anything from my passing?”

Says Fitzpatrick: “It seems that whenever humans discover bounty, it is doomed to become a fleeting resource. The fate of the cod fisheries in the late 1900s mirrors that of the Passenger Pigeon a century before. Pacific Bluefin Tuna may be next in line.” He might have added that another characteristic shared by all three species is that they’re edible.

But the tuna and cod never flew over your house three billion at a time, blotting out the blue skies and sunshine and coating your picnic table, windshield, tricycle, clothesline, and grotto of the Virgin Mary with three to five inches of bird droppings.

When the pigeons flew over a northeastern U.S. city, they would darken the sky for days on end.

To me, that state of affairs – not the subsequent extinction – was the real environmental calamity of its time. It ended with hunters eradicating Passenger Pigeons by the millions in their northern breeding grounds and at huge southern roosts – and shipping them by the trainload to processing plants, and finally to dinner tables and restaurants as low-cost “city chicken.”

It’s good to be an environmentalist. But let’s pause to note the ingenuity and resourcefulness of us true-blue, red-blooded Americans.

What do we do when engulfed in a plague of biblical proportions? We eat it!

Would that mosquitos and locusts were as tasty as city chickens.

Mything Links

Since we publish a fair number of scientific reports here at The Horse (usually under the rubric, “Front Ears of Science”), we like to think we’re keeping up with the latest discoveries.

Hardly a week goes by, however, that some spoilsport doesn’t publish some new study that claims to turn previous findings inside out.

Nine years ago, in the Author’s Preface to my book of light verse, I urged readers to try writing some of their own:

“You may discover that writing light verse is a good way to shift gears in your brain – to let your neurons know (which is what neurons do) that you appreciate their versatility, and to let your left brain take a sabbatical from problem-solving to have some fun rummaging around in your right brain’s tool boxes and toy chests.”

So of course this week a professor of cognitive science at UC Irvine publishes an article calling the whole left brain/right brain model a myth. Gregory Hickok says the two hemispheres are pretty much alike and that they work together in most tasks that involve any complexity to speak of.

A myth? He can speak for himself. Most of us keep our myths in the right brain and our data in the left brain, at room temperature.

Prof. Hickok also pooh-poohs the idea of mirror neurons, which were supposed to explain how you know what I’m thinking of doing and vice-versa – the basis for learning by imitation and for empathy.

Even worse, he disputes the popular notion that we humans use only 10% of our brain power. (See the current film, Lucy. On second thought, don’t – it wouldn’t engage anything close to 10% of your brain.) Lucy starts using ever greater portions of her mental capacity and acquires a violent repertoire of computer animation and sound effects.

Next, the professor will no doubt attack one of my most cherished secrets of intellectual potential. I had learned from Stephen Hawking’s A Brief History of Time that the total amount of energy in the universe is zero. The positive energy visible in light, heat, and matter is cancelled out by the negative energy of gravity.

Zero! Can you imagine?

That’s how much energy you have in your own brain on a bad day. I realized then that you or I could start our own universes at will, using zero energy, if only we could find the right trigger mechanism.

For me, that turned out to be looking sharply to the left and yelling, “FRP!”

Of course I can’t prove that this works (nor can the Professor prove it doesn’t) because it’s impossible to communicate with other universes, but I know for a fact that my latest creation is a universe in which myths about the brain are legally protected as endangered speciousness. But idiotic puns are perfectly legal.

Obviously, you have to use your right brain to create a wild, messy thing like a universe. Your left brain would decline the honor, rolling your eyes without even bothering to raise your eyebrows.

Front Ears of Science

Our exclusive science feature usually reports from the far corners of human knowledge – new discoveries in research and new atrocities in pseudoscientific and antiscientific mandates from officials of church and state, talk radio, and Fox News.

This time we ignore all that and report on questions instead of answers.

*

Holly looked up the other day from an article in the Times Science Section and asked,

“What would a dinosaur need with feathers?”

What indeed! A splendid question to incite class discussion at any grade level from kindergarten to graduate school.

But there it was – paleontologists have unearthed yet another fossil of yet another feathered dinosaur species – this time in Russia – yet another proof that dinosaurs are the ancestors of modern birds. That would seem to indicate that feathers conferred some evolutionary advantage on the species – which is Holly’s question.

What advantage?

I could think of half an answer. For the feathered species that eventually took wing – the lineage leading to modern birds – feathers had the same advantage they have for birds.

But a number of feathered species turn out to be in the same group as T-Rex. In a rare burst of paleontological humor, lead researcher Pascal Godefroit of the Royal Belgian Institute of Natural Sciences mused:

“Maybe T-Rex was some kind of big chicken.”

*

Barb has long been troubled by a related question—Why aren’t we finding bird beaks all over our back yards? Their hard keratin is not readily compostable (as feathers usually are – which is why it took so long for science to discover them on dinosaurs).

As Barb concludes her doctoral dissertation on the subject,

“Why aren’t we up to our asses in bird beaks?”

*

The youngest inquirer on today’s panel, eleven-year-old Elena, has the rare talent of reducing a hundred questions to a two- or three-word essence.

Worried because she had heard her grandfather was sick, Elena asked him if his heart had gotten better. After getting an evasive answer, she looked at him closely, then asked:

“Can you run?”

The Truth About Truth

Don’t tell anyone, but in our vast test kitchens and cavernous research facilities under Yucca Mountain, we often face the dilemma of conflicting conclusions reached by different research teams — and then, in the interest of objectivity, we’re forced to flip a coin.

We use a tamper-proof 1899 Morgan silver dollar.

Since you can’t very well subject peerless researchers to peer review, we’ve never had to disclose our coinflip methodology to a peer-reviewed journal; but you can imagine our relief when we learned that everybody else’s research is comparably flawed, whether or not they admit it.

I’m not referring to the stock market recommendations of sell-side analysts on Wall Street or the studies sponsored by tobacco companies or drug companies, all of which are Easter egg hunts for new arguments to support foregone conclusions – as are the climate-change-denying studies covertly funded by coal and oil companies and the ubiquitous Koch brothers.

No, I mean real research by good scientists in pursuit of genuine discoveries – but who also have careers to build, tenure to seek, grants to obtain, and passionate beliefs in their own theories.

Get thee behind me, Satan, so you can push a little harder.

In Tuesday’s Science Times, George Johnson recalls a 2005 paper published by Dr. John Ioannidis – a scientist whose research subject is research itself. It was titled, “Why Most Published Research Findings Are False.”

Dr. Ioannidis noted the human tendency to see what we want to see, magnified by competition and a shrinking pool of grant money, along with some error-prone experimental designs – a context within which it’s easy to fool yourself, even with the best of conscious intentions.

Other scientists wondered if Dr. Ioannidis’ findings might have been skewed by his own biases.

But then he published a definitive report on the most highly regarded research papers of the previous decade (e.g. on the effects of a daily aspirin on cardiac patients, the risks of hormone replacement therapy for older women), finding that in most cases the reported results were later altered or contradicted.

Since then other researchers have come forward with kindred views.

One recalled that while working at Amgen he and his colleagues tried to replicate the findings of 53 landmark cancer papers. In 47 of the 53 cases they were unable to do so, even when they got the original researchers to collaborate.

Now that the alarm has been sounded, various journals and institutes are looking for ways to reform the process. But as Johnson points out in the Times, the scientific literature “has roughly doubled in size every 10 or 15 years since the days of Isaac Newton.”

He says the National Library of Medicine’s PubMed database alone contains 23 million citations.

How can we help? We’re thinking of sending them our 1899 silver dollar.

Go Coffee! Go Nuts!

Today’s assignment – forget everything you learned in yesterday’s assignment.

That’s the way things are going in the arcane pseudoscience of nutritional supplements and herbal remedies, which cost Americans 5 to 7 billion dollars a year.

Two research studies within the past three months (and several convincing experts) have concluded that virtually all of the many thousands of supplements we buy are worthless in terms of improving our health or prolonging our lives.

And even if they weren’t, the substance on the label may not be the one in the bottle, and the actual contents could be tainted or toxic. The supplement industry is largely unregulated (thanks to our vigilant Congress), so we may never know the full story.

What we do know (unless tomorrow’s assignment changes everything we know today) is that there are two important sources of dietary supplements that actually do help to ward off disease, and that neither comes in a bottle.

One is in the coffee house. The other is in the trees.

We covered coffee here a year or so ago – growing evidence that people who drink 4 to 6 or more cups a day (regular, not decaf) are less likely to suffer dementia, depression, prostate and several other cancers, stroke, type 2 diabetes, and a number of other maladies. Our conclusion at that time:

“If coffee had been developed by Pfizer or Merck, it would cost $1,600 a cup.”

Two weeks ago, Jane Brody in the Times made an equivalent case for nuts – both tree nuts and peanuts. Much of her data came from the New England Journal of Medicine’s report on two major studies covering 119,000 nurses and other professionals over a span of several decades.

Among the highlights:

> The more often people consumed nuts, the less likely they were to die of cancer, heart disease, or respiratory disease.

> Those who ate nuts 7 or more times a week were 20% less likely to die over the course of the study, from 1980 to 2010.

> But aren’t nuts fattening? You would think so, with 160 to 200 calories per ounce. Yet in study after study, the more often people ate nuts, the leaner they tended to be.

> Most nuts are good sources of vitamin E, folic acid, selenium, magnesium, and several phytochemicals – natural compounds with anti-oxidant, anti-inflammatory, or anti-cancer properties.

As Brody sums it up, “All nuts are powerhouses of biologically active substances, most of which are known to protect and promote health.”

In both cases – nuts and coffee – there’s no bottle, no label. Not even unregulated free-market capitalism can counterfeit a good cup of coffee or mislabel a pistachio.

GM Crops – Another View

NOTE: Our apologies for the infrequency of recent postings. Barb and Al have been on what the Times would call “book leave.” The resulting works will appear in coming months and, we’re confident, will be favorably reviewed – at least here on The Horse.

*

Two months ago, environmentalists in the Philippines vandalized a field of Golden Rice – a grain whose genes have been modified to carry beta carotene. It’s a public project. No corporation is behind the experiment, and the seeds will be given out free to farmers, with the aim of improving the health of children in poor countries.

At about the same time, a study purporting to show tumors in rats that had eaten a GM maize was thoroughly discredited, then withdrawn by the journal that had published it – even as Russia and Kenya banned GM crops and the French said they would press for a European ban, partly based on the flawed study.

Prompted by such developments, earlier this month The Economist cried foul. A few excerpts:

*

“One of the biggest challenges facing mankind is to feed the 9 to 10 billion people who will be alive and (hopefully) richer in 2050. This will require doubling food production on roughly the same area of land, using less water and fewer chemicals …

“Organic farming … cannot meet this challenge. If the green revolution had never happened, and yields had stayed at 1960 levels, the world could not produce its current food output even if it ploughed up every last acre of cultivatable land.”

*

“In contrast, GM crops boost yields, protecting wild habitat from the plough. They are more resistant to the vagaries of climate change, and to diseases and pests, reducing the need for agrochemicals.”

*

“Some developing countries – Kenya, India and others – have turned their backs on technologies that could literally save their peoples’ lives. And European governments spend taxpayers’ money financing groups encouraging them to do so. The group in the Philippines that trashed the rice trials gets money from the Swedish government.”

*

“In the field of climate change, environmentalists insist that the scientific consensus should frame policy. They should follow that principle with GM crops, and abandon a campaign that impoverishes people and the rest of the planet.”

Quickie Quotes

“Philosophy of Science is as useful to scientists as Ornithology is to birds.”

— Richard Feynman

*

“The Catechism of Cliche”

From a column by the late Flann O’Brien, written early in his 25 years with the Irish Times:

When things are few, what also are they?
Far between.
What are stocks of fuel doing when they are low?
Running.
How low are they running?
Dangerously.
What does one do with a suggestion?
One throws it out.
For what does one throw a suggestion out?
For what it may be worth.

Front Ears of Science

Never Trust a Wasp

Scientific method insists on the dispassionate observation of empirical fact. Usually.

In 1884, a correspondent for the science journal Nature reported observing some trapped wasps begin to gobble up a faltering companion. Mortally outraged, the investigator whacked the wasps with a book. Fucking cannibals – no starring role in the learned journals for you!

Science has come a long way since then. Cannibalism among various wasp species is now well known and morally sanctioned, as are the disgusting habits of birds who feed their young by regurgitation.

A little known criterion for Nobel prizes is the Ich! Factor. As in Ichthyology.

*

So this Shark Goes Out For a Walk …

Speaking of which – according to the International Journal of Ichthyology, some so-called scientists in Indonesia have discovered a new species of shark that walks – make that waddles – along the ocean floor by waggling its fins.

As Darwin said, it takes all kinds. One might think, though, that if you plan to make your living as a shark, the least you could do would be to learn to swim.

*

Fu Man Who?

A research team in Taiwan has developed a small computer chip that could be implanted in one of your teeth, tracking how often you cough, clench, chew, grind, or snore. The report we saw was vague about how this would enhance your health, but we thought you might like to know the name of the lead researcher: Dr. Chu.

*

Keynes on Flip-Flopping

When a gentleman at a conference accused him of fudging his previously stated position on an economic issue, John Maynard Keynes shot back:

“When the facts change, I change my mind. What do you do, sir?”

*

Still Mything

Barb continues to comb the ruins of the Greek economy, looking for what she believes to be the three missing myths of ancient Athens – the stories of Polemic, Libido, and Apologia.

*

We Love Our ET’s – But Do They Ever Call?

Futurist Ray Kurzweil has begun to doubt that our future will feature any messages from aliens.

He questions the basis of SETI – the Search for Extra-Terrestrial Intelligence – and the optimistic estimates by Carl Sagan and others that there are at least a million advanced civilizations in our galaxy alone, plus those in 100 billion other galaxies. Kurzweil echoes Enrico Fermi’s famous question on that subject:

“Where is everybody?”

Why haven’t we heard from them? Not that anyone has asked us, but we don’t charge a cent for our opinions, so this may be well worth the price:

We’re talking about SpaceBook, not FaceBook.

The advanced beings out there may not be looking for friends or followers and may have no use whatever for civilizations like ours, barely past the stage of inventing radio – which some of them may have supplanted ten million years ago. They could be communicating instantaneously with quantum-entangled neutrinos or by some means we are a million years shy of even imagining.

And here we sit like a termite colony whose denizens have been hoping that some higher intelligence like humans or dolphins or parakeets or border collies might soon get in touch via a shower of pheromones.

As Carl Sagan once pointed out, if a primitive culture whose communications consist of runners and drums tries to foresee getting signals from an advanced civilization, they would probably imagine very fast runners and extremely loud drums.

Breaking News! – the Long View

Martin Rees has a fresh perspective on how we humans think about time. A professor of cosmology and astrophysics at Cambridge, Rees is also president (as Isaac Newton once was) of the Royal Society.

His thoughts on how the times are changing appear in the 2009 anthology What Have You Changed Your Mind About? (a terrific book – see first comment, below) published by the Edge Foundation – with an introduction by Brian Eno, of all people.

*

To our medieval ancestors – say, in the 1300s or 1400s – the entire lifespan of civilization, the world, the universe, everything under creation, seemed vaguely to stretch out over a few thousand years. Except for a handful of learned monks, hardly anyone thought about it. People didn’t expect much to change in their lifetimes, and they were right.

A cathedral might take 200 years to build (Cologne took more than 600), but a worker cutting stones for its walls knew that eventually someone just like him would be attending services there and hearing the liturgy sung in Latin.

We now know that the universe is 13.7 billion years old – and that our sun has another six billion or so years to burn — but we are no longer capable of looking ahead 200 years, as a 14th century stonecutter could, or even 20 years. Everything is shifting radically within our own lifetimes. Civilization-changing innovations such as moon shots, CDs, PCs, or cellphones blaze like supernovae and are then eclipsed within a decade or two. Planners of a half-mile-high building in China expect to construct it in four months, with a coffee shop (not a tea house) at the top. But plans are suddenly suspended because more Chinese skyscrapers are rising within three years than all the European cathedrals built in any three centuries of the Middle Ages — and officials aren’t sure there will be enough entrepreneurs to worship there.

In the long view, the times will keep on changing, and so will we. As Rees puts it:

“Any creatures witnessing the sun’s demise six billion years hence won’t be human; they could be as different from us as we are from a slime mold.”

Not only is everything transforming at a frantic pace; for the first time it is we humans who are making the changes – for good or ill, by accident or by design — altering Earth’s climate and its sea levels, modifying the genetic blueprints of plant and animal life, enlisting millions of robots into the workforce, engulfing humanity with a tsunami of instant information, misinformation, and disinformation. Says Rees:

“What will happen depends on us.”

Along with the ominous implications of such a prospect, the professor sees some hopeful signs. Climate change is getting real attention, and governments now require that radioactive waste sites be secure for 10,000 years.

“We are custodians,” Rees concludes, “of a posthuman future – here on Earth and perhaps beyond – that cannot just be left to writers of science fiction.”

Dilettante Dynamos

We hear you, experts of the capitalist world – innovation is the mainspring of personal success and economic prosperity.

To imitate is human; to innovate is divine.

We at The Horse take that as a call to arms – in truth, a patriotic duty — though it does seem a bit imitative to say so.

Unfortunately, there are no Steve Jobses here (all right, how would you pluralize him?), so we have to work within our genetic and technical limits. Nevertheless, just the other day, Barb reinvented the Creamsicle.

Do you remember Creamsicles? (They’re still around – Unilever makes them.) And Fudgsicles? — which in Joan’s hometown were called Fudgicles, but those are no more sophisticated than a Popsicle. One monolithic bar of frozen, flavored stuff on a flat stick.

The Creamsicle is different.

It boasts two nested systems – like a golf ball or a jelly doughnut – an outer coating of fruited ice and an inner core of vanilla ice cream, again mounted on a stick. It used to be that if you saved up the sticks from a year or so of Creamsicles, you could do something with them, but I forget what. Maybe get a free one, and then save the stick.

Barb’s research began in a Berkeley ice cream parlor, assisted by two creative, entrepreneurial soda jerks. Barb wanted a cherry ice cream soda, but they didn’t have cherry soda to combine with the ice cream. So they brainstormed the solution of adding club soda and maraschino cherry syrup to vanilla ice cream. Not bad, but no breakthrough. Next time, Barb got a bottle of orange soda and poured it over vanilla ice cream in a glass.

Eureka! It tasted exactly like a Creamsicle.

Freezing it on a stick awaits development of new home freezer logistics and procurement of some tongue depressors, but innovation takes time, even if it’s for a reinvention rather than an invention.

Reinvention is American, too. If you look into the history of the Sunday Funnies, you find that 70 and 80 years ago (in the heyday of Prince Valiant, Fritzie Ritz, Buck Rogers, Nancy and Sluggo) there was a popular strip called The Gumps. Its protagonist was Andy Gump, which would have been a fairly funny name had it not been for the high-toned Gump’s designer store off Union Square.

In one episode, Andy’s uncle Wonderful Gump returned from 40 years as a recluse in a cave, where he had independently invented the radio. Uncle Wonderful couldn’t have known he was reinventing an existing technology, but he must have gotten a strong hint when he switched on his new apparatus and started to hear Fibber McGee, Jack Benny, Bill Stern, and Ma Perkins. Or possibly he, too, believed – if you build it, they will come.

Fudge Cycles

Mark, another of our editors, founded a company when he was a teenager. He named it J. Penniless, and its first product was the J. Penniless Build-a-Pencil kit. What it lacked in potential market breadth, Build-a-Pencil made up for in the stunning clarity of its user manual: “1. Find Part A.”

Later, he proved adept at restoring defunct bicycles, which suggested new grounds for innovation — e.g., converting a bicycle into two unicycles, which might have been named FudgeCycles. It was technically feasible, but the business plan wasn’t. A few scratches of the head, and Mark went on to re-imagine the business plan for a number of struggling enterprises.

I’m the dunce inventor of the editorial group. I tried to match Barb’s creation by making a root beer float, but it sank.

Still, we’re trying. Purists might dismiss reinvention as a form of imitation, but remember how they discovered penicillin. If you leave a Creamsicle out of the freezer long enough, it might go viral.