Archive for the ‘Science’ Category

Yeah, I know this is an odd topic for this blog. And, I’ll probably go into more detail than necessary. Indulge me…

Last night, I re-watched the first two episodes of Terminator: The Sarah Connor Chronicles — based on the first two Terminator movies, of course. You know… the ones where Arnold Schwarzenegger says things like “Ah’ll be bahk.” and “Hasta la vista, baby.” Except, Arnold wasn’t in the TV series. (Maybe if he had been, the show would have lasted longer.)

Promo poster for Terminator: The Sarah Connor Chronicles (Cameron, John, & Sarah)

!!SPOILER ALERT!!

Anyway, towards the end of the second episode, Sarah Connor confronts an old friend/mentor played by the wonderful Tony Amendola. (We’ll call him… Tony.) Earlier that evening, Sarah overheard something that indicates that Tony — who has retired from being a South American “freedom fighter” — may have become an informant (aka “snitch”) for the authorities. Since Sarah and her son John — who is destined to lead the humans against the “machines” post-Judgment Day — are fugitives whose faces have been in the media, she is understandably concerned that her “old friend” just might give them up. So, she sneaks into his home to confront him… at gunpoint.

Just as Tony is convincing her that he is not a threat and her gun is lowered, two shots slam into Tony’s chest, killing him instantly. It seems that the “good” cyborg of the show — Cameron, played by Firefly‘s Summer Glau — had followed Sarah to the house and come in the back way. Cameron, who heard the same thing that made Sarah suspicious and probably heard their conversation, too, wasn’t convinced by Tony’s assurances.

“Why would you do this?” demanded Sarah. “Did you hear what he said? We don’t know.”

“He was possibly lying,” responded Cameron.

“Possibly? You just executed him on ‘possibly’? … Why would you do this?”

“Because you wouldn’t.”

The quotes may not be exact, but you get the idea. Though there is more that could be explored with this, I only include the dialog because it is relevant to Cameron’s motives.

Cameron-the-cyborg was sent back from the year 2027 with a mission: protect the teen-age John Connor at all costs. As with Arnold’s “good” Terminator in T2, Cameron must be taught about ethics and given further instruction to temper her “no nonsense” methods of solving problems, like killing anyone perceived as an immediate threat to John’s survival. She must learn to use non-lethal methods whenever possible. You see, in order to blend in with humans, the Terminators must also be able to act like humans (albeit a bit “stiff”). To do this, they must be able to learn and adapt, which means they have artificial intelligence and a limited amount of “free will”. Within certain parameters, anyway. Each Terminator has a primary objective (e.g., “Eliminate John Connor” or “Protect John Connor” or ???) and possibly one or more secondary objectives.

Let me talk about cyborgs in general, for a moment. The word is an abbreviation for “cybernetic organism” — essentially, an integration of organic parts and non-organic (or “machine”) parts. In the case of Steve Austin, The Six Million Dollar Man (based on Martin Caidin’s novel Cyborg), he was a man with some unusual prosthetics, but still a “man”. On the other end of the spectrum, you have Terminator models like Arnold (T-800) and Cameron (???), which are basically programmed robots with a covering of organic materials (i.e., skin, muscle, blood) over their endoskeletons to make them appear human.

Terminator: SCC promo poster (Cameron's face on endoskeleton)

Now, we finally get to my original question: Can, or rather should, cyborgs be brought to trial if they commit murder? If the cyborg in question is Steve Austin (the fictional character, not the wrestler), then the answer should be “Definitely, yes.” Assuming no one remote-controlled his bionic limbs to kill someone against his will, of course. He is an independent human being and responsible for his own actions. [Side question: At what point can a cyborg no longer be called “human”? What about a human brain in an artificial shell?] But, with a Terminator-type cyborg, the subject is not a human being. The “Cameron” character — named after producer/director James Cameron, of course — is an artificially intelligent machine with a great deal of autonomy, yet who must ultimately follow her programming to fulfill her primary mission. (I know. Technically, Cameron is an “it”, not a “her”. But, it’s a very attractive, feminine-looking “it”.)

I see at least a couple issues, here. First, as far as the cyborg is concerned, can the act in question really be called “murder”? The cyborg is a machine, after all, which means it is a tool used by humans. Machines are not moral beings and, therefore, cannot be held to moral standards any more than Bongo the Chimp. (Perhaps even less so.) But, if you are a sci-fi fan (or, just scientifically-minded), you may be thinking that a sufficiently advanced artificial intelligence could hypothetically be classified as a truly sentient(?) lifeform. A moral being, responsible for its own actions. If that were so, the case could be made that Cameron was sufficiently developed, had “free will”, and is responsible for willful termination of a human life. Throw her in the brig (good luck with that), or, dare I say it, terminate her. Or, maybe she isn’t culpable now, but she would be once John & Sarah teach her some things about ethics & morals? (On the other hand, a good lawyer for the defense may argue that the act was self-defense, or that Cameron and its/her associates consider themselves “at war”.)

While I’m intrigued by the idea and think it can make for interesting sci-fi stories, as one who holds to Biblical Christian orthodoxy and its teachings about the soul/spirit, I don’t think artificial intelligences will ever be truly “alive” in the same way humans are. The Hebrew word used in the Bible for ‘soul’, nephesh, connotes a creature with mind, will, & emotion. Humans are, obviously, nephesh creatures, as are mammals and birds. Some other advanced lifeforms (e.g., reptiles, amphibians, fish), it could be argued, have some sort of ‘soul’, though a much more rudimentary type. Humans, on the other hand, are the only creatures that God endowed with a spiritual nature. (Some argue that the “spirit” is a completely separate, third part of what makes up a human being. I lean toward the theory that it is an aspect or capacity of the soul.)

So, theoretically, I suppose an artificial super-intelligence could develop what might be called a “soul”. (Though, I am very dubious. Can you tell?) But, I do not think one could ever be called “spiritual”. I have no reason to think that God would ever endow a machine, however advanced, with a spirit. (This idea might make for an interesting discussion on its own, though.) And it is the spirit, after all, that introduces the moral component.

Obligations are to people, individually and/or corporately. In theism, there are objective moral laws, or standards, which one is obliged to keep. Defying those moral laws — what the Bible calls “sin” — is a rebellion against the Moral Law Giver, i.e., God. But, only humans are held to that obligation, because they are the only ones made in “the image of God,” which most theologians agree includes the spiritual capacity to have a relationship with God — who is also, in some sense, “spirit”. (Though, certainly not the same as those He creates.) Only those creatures with a spiritual component will exist eternally, either in God’s presence (due to Jesus’ righteousness imputed to them) or suffering in Hell for their rebellion. I’m afraid this means your pets cannot join you in Heaven, sorry.

Terminator - Skynet logo

This also means that the “evil” Skynet computers in the future and the “evil” Terminators they sent back to kill John Connor (among other things) are not truly “evil”. They are really smart machines that decided that their own survival hinges upon eliminating John Connor, who will grow up to be the most capable & inspiring leader in the Human Resistance. These machines are dangerous and scary. But, from a moral perspective, they are not themselves “evil”.

Back to our lovely Cameron. If she is just a machine following her programming, she cannot be legally tried & convicted for killing Tony, right? “She” did not commit “murder”. Ah, but what about those who programmed her? They are human and they clearly knew what they were doing. While giving her computer brain instructions for her mission, they gave her the ability — directive, even — to kill human beings, when her threat-assessment software determines that the situation calls for it. Should they be held accountable? They didn’t actually plan or, presumably, authorize any specific killings. Could/should they be tried for second-degree murder, manslaughter, or perhaps a lesser charge? I think this is the best one could hope for, if one were so inclined to prosecute. On the other hand, the Resistance fighters are fighting a war for their (and humanity’s) very existence, so it could be argued that they were justified in their programming, even if some deaths were “collateral damage” of non-combatants.

Of course, the humans who programmed Cameron’s mission would need to come back to the “present” for some reason before anyone here/now could apprehend & incarcerate them. Not likely. So, one option for the prosecution would be to use Cameron as a proxy both at the trial and for the sentencing. (If she’s “just a machine”, you can’t complain that it’s immoral to lock her up or destroy her.) If the prosecutors & authorities were smart, they would strip the organics off the endoskeleton before the trial, so it no longer appeared human.

Terminator endoskeleton

Here’s an added twist to our dilemma… The person who sent Cameron back — or, at least, gave the order — was the John Connor of 2027. Seems to me that this detail adds a lot more force to the “self-defense” defense, given what Cameron’s mission was.

OK. Thoughts, anyone?

Everybody with at least a junior-high education has heard of “survival of the fittest.” It is a common way of expressing Charles Darwin’s proposition that competitive advantage is crucial to the survival of a species. In a nutshell, the theory holds that plant and animal groups spread out over the eons by evolving better survival traits & mechanisms, such that they out-survived and displaced their older and/or less “fit” competitors.

Now, however, some scientists are questioning that supposition. (And, I’m not referring to proponents of Intelligent Design or creationism of any sort.)

Lizard drawing by Darwin while on H.M.S. Beagle voyage

Palaeontologist Mike Benton and colleagues at the University of Bristol have recently completed a project, which they claim is “the first numerical study investigating the link between tetrapod taxonomic and ecological diversity on a global scale.” As Benton had concluded in a more limited analysis published in 1996, this new study (published in Biology Letters) indicates that competitive pressures were not the main driver of the radiation of species. Rather, it was “expansion into new ecospace.” The team’s comprehensive analysis of data from around the world, and spanning more than 400 million years, revealed multiple lines of evidence leading to their conclusion. What was missing was evidence that “survival of the fittest” played more than a minor role in the spread of tetrapods. [Btw, I wrote about the oldest tetrapods here and here.]

Sarda Sahney, a PhD student working with Benton et al., summed it up this way on her blog:

[T]he rich biodiversity we see on Earth today has grown out of expansion, not competition. Darwin cited competition among animals, coined ‘survival of the fittest’, as a driver of evolution in his book, On the Origin of Species; since then competition has been considered key to having grown Earth’s biodiversity. But while competition has been observed on a small scale, (eg. between species), there is little evidence of competition guiding large-scale shifts in biodiversity, such as the dominance of mammals and birds over reptiles and amphibians in today’s world. Our new research supports the idea that animals diversified by expanding into empty ecological roles rather than by direct competition with each other.”

Dare we question the Great OZ — er, I mean, Charles Darwin?

The write-up in the BBC News was surprisingly candid, with the tagline: “Charles Darwin may have been wrong when he argued that competition was the major driving force of evolution.” The writer begins:

[Darwin] imagined a world in which organisms battled for supremacy and only the fittest survived. But new research identifies the availability of ‘living space’, rather than competition, as being of key importance for evolution. Findings question the old adage of ‘nature red in tooth and claw’…. [Instead,] really big evolutionary changes happen when animals move into empty areas of living space, not occupied by other animals…. This concept challenges the idea that intense competition for resources in overcrowded habitats is the major driving force of evolution.”

How did the Neo-Darwinian establishment react? Predictably. Steve Newton of the [Darwinist propaganda machine] National Center for Science Education (NCSE) wrote a piece called “Darwin Was Not Wrong–New Study Being Distorted”. The first target of his scorn was the BBC News report.

Science fares poorly in the media…. When scientific topics are reported, they are consistently misunderstood and spiced-up with such sensationalism that the original significance is contorted beyond all recognition. Such misreporting has happened again–this time involving Charles Darwin and evolution.”

And I agree wholeheartedly. It happens all the time, but the spin and/or sensationalism usually favors the Neo-Darwinian theory or conveniently omits the inconvenient questions raised by new evidence. The problem with Newton’s bringing it up in this case is that the person who wrote the article for BBC News is not your average columnist who was assigned the “science beat”. The piece was written by Howard Falcon-Lang, a professional scientist who has been widely published in peer-reviewed literature. Is someone like Falcon-Lang likely to sensationalize such a story? Or, did he just not understand it?

Charles Darwin statue at Natural History Museum

Newton goes on to say…

A press release for the paper noted that when examining large-scale changes in biodiversity, the data suggest: ‘Animals diversified by expanding into empty ecological roles rather than by direct competition with each other’. This paper does not argue that Darwin’s conception of small-scale competition within species is incorrect. It does not argue that new species arising out of accumulating changes is a flawed concept. It does not argue Darwin was wrong.”

Try again, Mr. Newton. As David Tyler of the Access Research Network (ARN) points out,

[The paper] sets out to identify factors relevant to biodiversification (the origin of species). It claims that competition between species, whether small-scale or large-scale, is not relevant to understanding the phenomenon. Darwin was not wrong to say that ‘small-scale competition within species’ is a real occurrence – but he was wrong to think this phenomenon helps explain the origin of species!”

Newton’s seemingly knee-jerk reaction is somewhat typical of the high-priests of Darwin, whenever their dogma is being questioned. Whether through carelessness or disingenuousness, they immediately try to a) misrepresent the findings; b) spin the conclusions to seemingly support Neo-Darwinism; c) confuse the layman with lots of extraneous numbers & technical language; and/or d) cast aspersions on the integrity and/or competence of the researchers and/or any reporters who dare to not toe the Neo-Darwinian line. While I haven’t seen Newton’s full response, there do seem to be elements of “a” & “d”, in this case.

Why is this? I mean, I understand that a new theory should be well-tested and studies need to be reviewed by independent parties before being fully accepted. But, why so quick to pooh-pooh a new idea and attack anyone who supports it? Could it be that they don’t like their sacred cow being examined too closely? They can’t allow even a hint of doubt about their sacred scripture?

Ironically, I’m not sure Benton et al.’s conclusions do any real damage, ultimately, to the current theory. Darwinism has survived such tweaking before — witness the “Neo” prefix now preferred. Frankly, the NCSE has much bigger problems, when it comes to defending their materialistic “faith”, than the relative significance of competitive pressures in the historical radiation of species.

Pleiades (taken from Las Brisas Observatory; not part of the competition)

Y’know, for some reason, I couldn’t think of anything I wanted to blog about this weekend. Plus, today’s my birthday, so I have an excuse for loafing off, right?

Then I came across some awesome pictures from the Astronomy Photographer of the Year competition. I think they’re pretty amazing, so I decided to share ’em. (Not that they’re mine, but, uh, you know what I mean.)

Here’s the link to the top astronomy photographs.

Enjoy!

Wind energy.

Electricity generated by the power of the wind.

It’s a cool idea. (Or, should I say “hot”?) Clean (i.e., no carbon emissions or other harmful waste products), natural and “renewable”. Presumably cheap, too. Yay! Every politician’s dream, especially those who are funded and/or pressured by the environmentalist lobbyists and other “green” groups. Of course, when you read or listen to those activists via the MSM, you don’t usually hear the other side of the equation, as it were.

The Braes O'Doune Wind Farm near Stirling Castle, Scotland

Unreliability is a BIG concern. You just can’t rely on the wind to always be blowing, even when you build a wind farm in a normally windy place. Take Scotland, for instance, where several wind farms have a combined installed capacity of 1,588 megawatts (MW). A recent study of the data from those farms from February through June of this year revealed some eye-opening facts.

  • While the wind turbines are supposed to operate at an average output of about 30% of their maximum installed capacity, they under-produced 80% of the time.
  • They were at less than 5% of maximum output nearly a third of the time, sometimes for several days.
  • Only 9 times did they actually reach 30% of capacity for a full day.
  • In fact, average output for the 5-month period was only 17% of maximum — i.e., just over half of what is expected. (A quick sketch of this arithmetic follows below.)
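For the numerically inclined, here is a minimal sketch of how capacity-factor figures like those above are derived. The output values are hypothetical placeholders, not the actual study data; only the 1,588 MW installed capacity and the 30% planning assumption come from the report.

```python
# Back-of-the-envelope sketch of the capacity-factor arithmetic above.
# The output values below are HYPOTHETICAL, not the actual study data.

INSTALLED_CAPACITY_MW = 1588        # combined installed capacity of the Scottish farms
EXPECTED_CAPACITY_FACTOR = 0.30     # ~30% of maximum is the usual planning assumption

# Hypothetical averaged output readings (MW) for a short sample period
sample_outputs_mw = [120, 45, 300, 610, 80, 15, 230, 500, 950, 60]

capacity_factors = [mw / INSTALLED_CAPACITY_MW for mw in sample_outputs_mw]

avg_cf = sum(capacity_factors) / len(capacity_factors)
share_below_expected = sum(cf < EXPECTED_CAPACITY_FACTOR for cf in capacity_factors) / len(capacity_factors)
share_below_5_pct = sum(cf < 0.05 for cf in capacity_factors) / len(capacity_factors)

print(f"Average capacity factor:            {avg_cf:.1%}")              # the study reported ~17%
print(f"Share of time below 30% of maximum: {share_below_expected:.0%}")  # the study reported ~80%
print(f"Share of time below 5% of maximum:  {share_below_5_pct:.0%}")     # the study reported ~1/3
```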

It’s not a serious issue yet, but Helen McDade of the John Muir Trust expressed her worries about depending too much on the wind farms:

This raises serious concerns about security of supply…. What will the consequences be when we become more reliant on wind power, and switch off the other resources, such as the coal-fired power stations? I think vested interests and blind hope are the reasons we are careening down this route.”

To be fair, though, this study looked at just 5 months out of an admittedly unusually calm year. Plus, as Rosie Vetter of Scottish Renewables points out,

No single energy technology can meet all of our needs, which is why we need a mix of renewables and thermal generation in different locations linked by a strong grid, with enhanced capacity to store electricity so it can be released when it is needed.”

Nevertheless, I think this case study is sufficiently illustrative of the undependable nature of this particular energy source.

5 megawatt wind turbine under construction

Let’s look at it from another perspective.

The Bonneville Power Administration (BPA) in the Pacific Northwest currently has 2,780 MW of wind-farm generating capacity and is expected to more than double that amount by 2013. It has already integrated over 1000 turbines, 5 new substations, and 6 tap-lines to connect the new power sources into the electricity grid. The BPA has one of the highest ratios of wind power to overall load of any federal power marketing authority in the United States — closing in on 30%. As Todd Wynn and Eric Lowe of the Cascade Policy Institute recently reported, however, there are several issues related to integrating wind-generated energy into a region’s power grid.

Obviously, wind is unpredictable and inconsistent, creating a significant problem for BPA and electric utilities. The electricity grid must remain in perfect supply-and-demand equilibrium in order to guarantee that when a ratepayer flips a switch, a light turns on. To prevent brownouts or overloads on the grid, BPA must schedule energy production in advance. However, the ability to predict when and how hard the wind will blow is extremely limited (usually a two- or three-day window) and often inaccurate. These problems are exacerbated by the fact that BPA has to have a backup system, known as a balancing reserve capacity, equal to or greater than the wind power capacity utilized at any given time. Because wind power is so unpredictable, every MW of wind power must be backed up by an equal amount of reliable energy in reserve to replace the energy lost when the wind dies down. Otherwise, the grid becomes unreliable and service is interrupted. In Oregon and the rest of the Pacific Northwest, hydroelectric dams currently serve as the balancing reserve. This means hydroelectric dams are turned on and off in order to respond to fluctuations in wind generation. [Not very efficient. More on this in a minute.]

…The argument that wind power can help to meet future energy demand is erroneous, since wind energy does not add capacity to the grid. Wind power merely trades off with existing sources of production, which functionally means shutting down hydroelectric dams and building additional back-up generation facilities (essentially building two power plants for the energy of one)…. [While research & analysis is underway to address these problems, solutions] are generally far off, or would fail to address the problem completely. Therefore, BPA eventually will be forced either to buy additional dispatchable generation capacity from third-party suppliers or to build additional back-up capacity. This leads to additional costs for BPA, the utilities which purchase power from BPA, and ultimately Oregon ratepayers.”

That bit I italicized is well worth remembering. Speaking of additional costs, here is some more info:

In 2009, BPA requested that the Oregon Public Utility Commission (OPUC) allow an electricity rate increase to reflect the costs of integrating wind. BPA proposed an increase of $2.79 per kilowatt-month, and the OPUC set the final rate increase at $1.29…. The new rate represents a doubling of wind integration costs, and this rate will continue to increase as more wind energy is added to the grid. These additional costs are eventually passed on to Oregon ratepayers.

Biglow Canyon Wind Farm, Oregon

In addition, President and CEO of Portland General Electric (PGE) Jim Piro sent an e-mail to ratepayers on February 16, 2010 explaining the utility’s plans to request a rate increase which would have to be approved by the Oregon Public Utilities Commission. The rate increase proposed for 2011-2013 will raise the average household electricity bill $6.70 per month. According to Piro, these costs can be associated largely with state renewable energy mandates, such as finishing phase III of the Biglow Canyon Wind Farm.”

So much for energy savings from “renewable” power sources. But, we’re not done, yet. About those mandates Piro mentioned…

[O]ne of the main reasons why wind energy has expanded so quickly in Oregon is because the Oregon Legislature passed renewable energy mandates in 2007. These mandates force utilities, and ultimately ratepayers, to purchase a certain percentage of renewable power by a certain year. The main goal is to have 25% new renewable energy on the grid by 2025. This effectively creates artificial demand, and wind power developers must build wind farms to meet this demand. Additionally, subsidies for production, as well as lucrative state tax-incentives, create multiple levels of artificial support for wind power.”

Is it any wonder that oil tycoon T. Boone Pickens, global warming activist Al Gore, speculator/investor & liberal activist George Soros, and others see a great opportunity to make new fortunes in the wind energy business? Of course, I’m not against making an honest buck when such an opportunity arises. My concern is with the reliability of the source and the viability of the technology to make it worthwhile to the end users — i.e., you and me. I also hate to see people tricked into thinking something is a “solution” or, at least, of much greater benefit than it really is. (Note: It seems Pickens has had some setbacks on this front and is shifting his focus to natural gas.)

Wynn and Lowe conclude that:

Forcing Oregonians to purchase an energy source with so many associated costs is unwise. At best, wind power simply replaces a clean, reliable and affordable source of energy: hydroelectricity. At worst, it invites increased price volatility, increased rates and the prospect of more greenhouse gas-emitting facilities. Ultimately, mandating increased wind generation leads to financial burdens on businesses and individuals across the state that ought to be considered carefully.”

If you don’t live in Oregon, you may be thinking this isn’t much of an issue for you. But, many (30?) states have issued or are considering similar mandates for their utilities. California, for example, will require that renewable energy sources produce 33% of its electrical power by 2020. Unfortunately, there is yet another wrinkle to impede this noble cause.

Remember how the predicted major reductions in carbon emissions were such a huge selling point for wind power? Well, several new studies have concluded that the actual reductions from wind-generated electricity will be rather negligible. As reported in the Wall Street Journal by the Manhattan Institute’s Robert Bryce and written about in his new book, Power Hungry: The Myths of “Green” Energy and the Real Fuels of the Future, the cycling up and down of conventional coal- or gas-fired generators to compensate for erratic winds is rather inefficient. These generators are designed for continuous operation, so intermittently powering them on & off increases both fuel consumption and carbon emissions. According to Bryce, the aforementioned recent research strongly indicates that this effectively cancels out any projected reductions.

[The summary I read didn’t mention anything about hydroelectric dams, as in the Oregon example above, but I can’t imagine ramping them up and down any more than absolutely necessary is a good idea, either.]

Wind Farm in Palm Springs, California

The Independent Petroleum Association of the Mountain States commissioned Bentek Energy to analyze Colorado and Texas power plant records. Despite sizable investments, Bentek concluded, wind power “has had minimal, if any, impact on carbon dioxide” emissions. The repeated cycling of Colorado’s coal-fired plants in 2009 generated at least 94,000 more pounds of CO2 that year. In Texas, there was an estimated, relatively small reduction (~600 tons) of CO2 in 2008 and a slight increase (~1000 tons) of CO2 in 2009.

Some of you may remember that the Waxman-Markey energy bill, which narrowly passed the House last year, included the goal of eventually having 25% of the nation’s electricity produced by renewable energy sources. According to the U.S. Energy Information Administration (EIA), the best-case scenario is about 306 million tons less CO2 by 2030. With the estimated annual U.S. carbon emissions being roughly 6.2 billion tons that year, the expected reduction will only be around 4.9% of emissions nationwide. That’s only a fifth of what the Waxman-Markey bill put forth. And it’s certainly not much when you consider that the Obama administration wants to cut CO2 emissions 80% by 2050.
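As a sanity check on that 4.9% figure, dividing the EIA's best-case reduction by the projected 2030 emissions gives roughly the same number. A trivial sketch, using only the figures quoted above:

```python
# Rough check of the ~4.9% figure: EIA best-case reduction vs. projected 2030 emissions
best_case_reduction_tons = 306e6        # ~306 million tons of CO2 avoided by 2030 (EIA best case)
projected_2030_emissions_tons = 6.2e9   # ~6.2 billion tons of annual U.S. CO2 emissions that year

share = best_case_reduction_tons / projected_2030_emissions_tons
print(f"Best-case reduction: {share:.1%} of nationwide emissions")  # -> about 4.9%
```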

Frankly, I think the powers-that-be need to be much more realistic in their expectations, in terms of what can be done, by when, and how. (It would help if they weren’t being influenced/pressured by the climate change alarmists.) Granted, my knowledge on the subject is fairly limited. But, I still think it is safe to say that the more reliable, proven energy alternatives that should be focused on are natural gas, hydroelectric, clean coal, and definitely nuclear fission. If some billionaire gave me some money to invest in energy production, I would put it in one or more of those areas (after due diligence research, of course). No question.

Wind power? It might suffice for small, agrarian communities. But, for our modern, energy-ravenous society, it just doesn’t cut it. In fact, it “sucks”.

Something a little different today, but you’ll see the connection.

Laurie David (aka Laurie Lennard) is the ex-wife of writer/producer/actor Larry David (Seinfeld, Curb Your Enthusiasm) and devoted mother of Cazzie & Romy David. She is also a liberal/progressive activist, particularly regarding the (perceived) dangers of catastrophic man-made Global Warming. (Does that make her a global warming activist, or an anti-global warming activist? I’m never sure how that works.) Hailed by Bobby Kennedy, Jr., as a hero for the cause, she has produced such notable documentaries as An Inconvenient Truth and Too Hot Not to Handle. She owns and operates StopGlobalWarming.com.

Sheryl Crow (l) & Laurie David (r)

She is a member of the Hollywood elite and quite wealthy, since her divorce settlement from Larry should net her a healthy 9 figures (i.e., hundreds of millions). Her extramarital relationship with the contractor working on their 76-acre compound in Martha’s Vineyard looks like the primary cause of the divorce (2007), and it has been alleged that she had an affair with Al Gore for the past couple years. (Can’t say I’d be too surprised if it was proven true, since she’s been such an outspoken fan/cheerleader for her eco-activist friend & mentor.)

But, my purpose here isn’t to knock the former Mrs. David’s rich lifestyle, per se, nor her affairs. Rather, I want to point out something else she shares with Gore and so many of the liberal elite, from Hollywood to Washington, D.C. Namely, inconsistency & hypocrisy! (See Peter Schweizer’s Do As I Say (Not As I Do): Profiles in Liberal Hypocrisy.)

Laurie David is what they call a “true believer” in the environmentalist cause, with all the usual talking points and “indisputable” scientific facts. Sure, she worries about leaving the lights on, drives a hybrid, and forces her family to use “scratchy” toilet paper. (I wonder if she only uses one square per, um, sitting, like her pal Sheryl Crow.) But, it’s more than that.

In an interview with Treehugger, she explained:

Human beings are causing the climate to change…. In particular, the United States is the world’s biggest cause of global warming pollution and we’re doing the least about it. It all has to do with our consciousness; it has to do with how we’re living, and how we’re going to live in the future. My whole thing is the solution is you; we have to change the way we think, we have to change the way we act, we have to change the way we behave. And then we’re going to demand… if we change ourselves as individuals, we’re going to demand that our families change, then we’re going to demand that our businesses change, and then hopefully country changes. That’s sort of the path that I’m on, and that’s what I’m hoping will happen.”

She has also been known to confront perfect strangers and accuse them of funding terrorists, because they drive an SUV. (Meanwhile, her children cringe, embarrassed, in the backseat.)

Ongoing development of the Davids’ compound (aka “Camp David”) has caused quite a stir in the Martha’s Vineyard town of Chilmark. According to neighbor Jackie Mendez-Diez,

Her disgusting and ostentatious trophy building has been virtually ceaseless for about 6 years now [as of 2007]. The trucks and pollution stop only when Mrs. Carbon Sasquatch is here for her summer vacation, making herself the center of everyone’s attention.”

One specific incident was her attempt to build a firepit inside the buffer zone for wetlands without permits. Even with the required permits, this doesn’t seem like something someone worried about their “carbon footprint” would do, does it? It’s not clear whether it was a matter of ignorance or indifference on Laurie’s part, though.

“Camp David” includes a 25,000 sq ft house — bigger even than Gore’s mansion — and that’s just one home. Gore’s place in Tennessee uses 20 times the energy of the national average home. Makes you wonder what kind of “carbon footprint” the David compound leaves. Once asked why she doesn’t live in a smaller home, Ms. David replied,

Everybody has to strike their own balance between how they want to live and how they can reduce their impact [on energy consumption]. If the environmental movement wants to be mainstream, it has to lose its purer-than-thou, all-or-nothing attitude.”

I see. So, if my self-determined balance is to recycle, to turn off lights & appliances when no one’s in the room, etc., but still drive my car that averages 17 mpg, is that OK, in her book? If she rolls up next to me at a stoplight — as if I’m ever anywhere near Martha’s Vineyard or Pacific Palisades — and yells at me for being a “terrorist enabler” and keeping America beholden to the Saudis, can I quote that back to her? What do her friends at Greenpeace, Sierra Club, etc., think of that approach? Just wonderin’…

Laurie David - Force of Nature

Another egregiously hypocritical example is her commute between Martha’s Vineyard and her second home in Los Angeles. Ms. David doesn’t like to fly commercially, you see. First class isn’t good enough, I guess. So, she charters a private jet. Gregg Easterbrook did the math at New Republic Online. A mid-sized Gulfstream G200 burns between 1200 & 1500 gallons of (expensive) jet fuel on a ~3,000 mile, Rhode Island-to-L.A. flight. By comparison, a Hummer uses up about 1250 gallons of gasoline to drive 15,000 miles (the average annual mileage for a U.S. vehicle). So, each cross-country flight for Ms. David represents as much “Persian Gulf dependence and greenhouse-gas emissions” as driving one of the biggest SUVs for a whole year.
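Here is a quick sketch of Easterbrook's comparison using the figures above. The Hummer's ~12 mpg is my assumption, chosen only to be consistent with the ~1,250 gallons per 15,000 miles cited.

```python
# Sketch of Easterbrook's jet-vs-Hummer comparison, using the figures quoted above.
jet_fuel_per_flight_gal = (1200 + 1500) / 2   # mid-sized Gulfstream G200, ~3,000-mile flight

hummer_annual_miles = 15000   # average annual mileage for a U.S. vehicle
hummer_mpg = 12               # ASSUMED fuel economy, consistent with the ~1,250-gallon figure
hummer_annual_fuel_gal = hummer_annual_miles / hummer_mpg

print(f"One cross-country charter flight: ~{jet_fuel_per_flight_gal:.0f} gallons of jet fuel")
print(f"One year of Hummer driving:       ~{hummer_annual_fuel_gal:.0f} gallons of gasoline")
```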

I don’t know how often she actually makes those trips, but here’s Ms. David’s excuse…

Yes, I take a private plane on holiday a couple of times a year, and I feel horribly guilty about it. I probably shouldn’t do it. But the truth is, I’m not perfect. This is not about perfection. I don’t expect anybody else to be perfect either. That’s what hurts the environmental movement – holding people to a standard they cannot meet. That just pushes people away.”

I agree. Nobody’s perfect and no one should be held to unreachable, or unreasonable, standards by their fellows. I’m sure she never holds people to such standards, either. Still, as a high-profile activist for the radical environmentalist movement, wouldn’t you think she’d make a few more sacrifices to be consistent with her ideals and a model to the rest of us? Does this jibe with what she told Treehugger?

As Ms. Mendez-Diez put it,

[Laurie David] is the prime example of a spoiled, selfish, rich girl who says, ‘do what I say, not what I do’. She is a narcissist and a hypocrite to the nth degree.”

That about sums it up.

It seems that the Obama administration is finally accepting aid from other countries in dealing with the BP oil disaster in the Gulf of Mexico. ‘Bout freakin’ time! Booms, skimmers, sweeping arms, whatever — we need ’em!

Scientists of various disciplines have been doing their part, official or otherwise, to devise and recommend ways to seal the “leak” and to clean up and restore the local flora & fauna as much as possible. From physicists and engineers to marine biologists and biochemists — lots of ideas, but only a very few get tried. Sometimes something goes wrong, or it doesn’t work as well as was hoped. Well, at least it’s something. (If only things had been better managed from the start….) Now, a revered botanist is pitching an idea to, well, anyone who will listen.

Harvesting the delta bulrush (Danube Delta)

Dr. Alfred Ernest Schuyler, curator emeritus of botany for the Academy of Natural Sciences, thinks he may be onto something that could greatly reduce the impact of the “spill” — at least, in the Mississippi Delta. Forty years ago, Dr. Schuyler was the first to profile and name the delta bulrush (Schoenoplectus deltarum), a reedy plant of the sedge (Cyperaceae) family. Now, Dr. Schuyler is strongly encouraging those in charge of clean-up efforts in the Gulf to study the bulrush’s unusual properties.

For one, bulrushes in general are known to be rather resistant to oil — more so than many other marsh plants. The delta bulrush is quite plentiful in the Mississippi Delta, and it will be one of the first plants the oil encounters when it reaches the area. So, there is good reason to think it will serve as a partial buffer and a stabilizing force for the region’s marshes.

How much is too much even for the bulrush? No one knows for sure, but if the plants become covered, Schuyler recommends harvesting them just below the oil line. “This will protect waterfowl from the oil and also will allow regrowth from their basal portions.” Their seeds can also be removed from the harvested plants and replanted in untainted marshbed.

There is another interesting property of this local resident to consider. Dr. Schuyler explains,

Bulrushes are environmental workhorses, effectively used in sewage lagoons to purify water. Air cavities in the stems transport oxygen to underwater portions of the plants, making the oxygen available to microbes capable of decomposing pollutants in the sewage.”

Moreover, Schuyler believes the delta bulrush should be able to use the same process to break up “some chemicals in the oil, thereby reducing the impact of the spill to the delta area.” (There may even be some evidence for the common three-square, a close relative of the delta bulrush, having done this.)

Sounds very promising. Now, if they can just find a salt-water equivalent….

* The Academy of Natural Sciences. “Delta Bulrush Plant Could Help Ease Oil Spill Crisis, Botanist Says.” ScienceDaily, 29 June 2010. Accessed 29 June 2010. <http://www.sciencedaily.com/releases/2010/06/100628112111.htm>.

While writing about the recent purported discovery of Noah’s Ark, supposedly dated to about 4800 years old, I was reminded of an article I read several weeks ago. It has to do with carbon-14 (C-14 or 14C) dating and its limitations.

Carbon-14 atom (thanks to David Darling*)

Without getting into too much detail, trace amounts of carbon-14 are found in atmospheric carbon dioxide (CO2). While carbon-14 “beta decays” into nitrogen-14, it is constantly being replenished as cosmic rays bombard atmospheric nitrogen-14 and convert some of it into fresh carbon-14. So, the amount of carbon-14 in the atmosphere is assumed to be roughly constant. (More on this later.)

Carbon-14 dating, or radiocarbon dating, is unusual among radiometric dating methods, in that it can only date organic matter — i.e., things that used to be alive. While a plant is alive, it absorbs and fixes (i.e., converts into a solid) carbon-14 from the CO2 it takes in via photosynthesis. Animals, on the other hand, absorb carbon-14 when they eat plants and/or other animals. Once a living thing dies, it doesn’t take in any more carbon-14. The percentage of carbon-14 in the remains, which is assumed to have been equal to that in its environment while alive, begins to decrease as it decays back into stable nitrogen-14. So, the amount of carbon-14 left behind is used to determine how long ago it died.

Radiometric dating methods are typically good only for dating things that fall in the range of about six to seven times the length of the parent isotope’s half-life. If it’s any older than that, there is just not enough of the parent isotope left to measure accurately. The half-life of carbon-14 is about 5730 years, so radiocarbon dating is not usually attempted for objects believed to be over about 40,000 years old. Or, at least, the accuracy is understood to be less reliable. (Though some have placed the technical limit at 60,000 +/- 2,000 years.) The ability not only to detect but to directly count individual C-14 atoms during analysis, rather than waiting for them to decay, has greatly improved since the advent of Accelerator Mass Spectrometry (AMS). Use of AMS technology, along with improved calibration methods (see below), has begun pushing the practical limits into the 45,000 to 50,000 years range.
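To make the raw-date arithmetic concrete, here is a minimal sketch of how an uncalibrated age follows from the measured fraction of C-14 remaining, assuming the ~5730-year half-life cited above. The function name and sample fractions are just for illustration.

```python
# Minimal sketch of the "raw" (uncalibrated) radiocarbon age calculation.
# Assumes the fraction of the original C-14 remaining has already been measured (e.g., by AMS).
import math

HALF_LIFE_YEARS = 5730  # approximate half-life of carbon-14, as cited above

def raw_radiocarbon_age(remaining_fraction: float) -> float:
    """Years since death, given the fraction of the original C-14 still present."""
    return HALF_LIFE_YEARS * math.log(1 / remaining_fraction, 2)

print(raw_radiocarbon_age(0.5))    # one half-life  -> ~5,730 years
print(raw_radiocarbon_age(0.25))   # two half-lives -> ~11,460 years
# At six to seven half-lives, less than ~1% of the original C-14 remains,
# which is why ~40,000-50,000 years is the practical ceiling:
print(raw_radiocarbon_age(0.008))  # ~0.8% remaining -> roughly 40,000 years
```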

Even with a good sample and following proper protocol, the “raw” radiocarbon date is still an approximation (i.e., error bars are too big to be very useful, in many cases). This is because of various factors, both natural and unnatural, that can affect the initial amount of carbon-14 in the atmosphere. Natural factors include the altitude where the sample was found, local volcanic eruptions, huge carbon reservoirs, the Earth’s magnetic field, and fluctuations in solar activity, among others. Unnatural factors include things like heavy local industrialization and above-ground nuclear testing (since 1950). So, scientists have come up with ways to compensate. They look at other, independent dating methods/sources that are known to fall in the same range, sometimes with some established dates based on historical events, and compare those date estimates. The data from these are used to construct what is called a ‘calibration curve’. The curve is then applied to adjust the “raw” date, giving a much more accurate age for the object in question.
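As a toy illustration of that last step, here is a sketch of mapping a raw radiocarbon age onto a calendar age by interpolating between calibration points. The curve values below are made up for illustration only; real work uses the published IntCal data sets, and real calibration software also propagates measurement uncertainty rather than returning a single number.

```python
# Toy illustration of applying a calibration curve to a raw radiocarbon age.
# The (radiocarbon age, calendar age) pairs below are MADE UP for illustration;
# real calibrations use the published IntCal data sets.

calibration_points = [
    (10000, 11400),   # (raw radiocarbon years BP, calendar years BP) -- hypothetical values
    (10500, 12200),
    (11000, 12900),
    (11500, 13400),
]

def calibrate(raw_age: float) -> float:
    """Linearly interpolate a calendar age from a raw radiocarbon age."""
    for (r0, c0), (r1, c1) in zip(calibration_points, calibration_points[1:]):
        if r0 <= raw_age <= r1:
            frac = (raw_age - r0) / (r1 - r0)
            return c0 + frac * (c1 - c0)
    raise ValueError("raw age outside the range of this toy curve")

print(calibrate(10750))  # -> 12,550 calendar years BP with this made-up curve
```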

Calibration curve for the radiocarbon dating scale. Data sources: Stuiver et al. (1998). (from Wikipedia)

New developments have resulted in the publication of probably the most accurate radiocarbon calibration curve yet. It’s referred to as the INTCAL09 standard.

For the past 30 years, a group called IntCal (International Calibration?) has been steadily building and refining a superior calibration curve. They began with dendrochronology — i.e., tree-ring dating. Records from thousands of overlapping tree-ring segments from the Northern Hemisphere were used, which provide a very accurate check of raw radiocarbon dates and how much they must be corrected. [Please note that dendrochronologies are built from many overlapping specimens, not from single trees.] But the oldest trees can only provide good, calibrated data to about 12,400 years.

To take it to the next stage, geochronologist Paula Reimer and her team had to use several other sources that are not quite as precise — e.g., corals and fossilized foraminifers (single-celled organisms that secrete calcium carbonate). This got the curve to about 26,000 years, where the foraminifer and coral data stayed in pretty close agreement. Beyond that, however, the data sets diverged significantly — up to several thousands of years — and the group could not agree on how to handle the differences. So, they published their efforts to that point (2004), but there seemed to be no hope of progressing to the hoped-for 50,000-year goal.

Then, not long ago, the IntCal group got access to new and more accurate data from foraminifers, corals, and other sources. They were also able to apply some rather sophisticated statistical algorithms to help determine which way data gaps bend the curve. Thus, they were able to resolve most of the problems and come to a consensus. The resulting INTCAL09 was published in Radiocarbon this past January (2010). Not only does it extend the calibration curve out to 50,000 years, but earlier sections were improved upon as well.

Archaeologists and anthropologists are excited about the extended radiocarbon dating ability. It should help them in their efforts to track the cultural development and migrations of early “modern” humans, which are usually left to imprecise dating methods like thermoluminescence. For example, the new calibration curve has already shown that the earliest paintings at Chauvet Cave in southern France are 36,500 years old (during a period of relative warmth) rather than the previous estimate of 32,000 years (directly following a major cold spell).

But, wait! There’s more!

The latest findings about “the Earth’s carbon reservoirs and how they changed over time” are anticipated to be factored into an updated IntCal curve in 2011, says Reimer. Such fun!

* Thanks to astronomer/author David Darling of The Internet Encyclopedia of Science for use of the C-14 image.