Wednesday, April 30, 2008

Only 100 Days Until the Orympic Games

There are no birds. You don’t notice it at first, but then after a day or two you realize that you just never see a bird. In China, nobody has seen a bird for almost fifty years. Back around the time of the Great Leap Forward, Chairman Mao convinced his population that the birds were competing with hungry Chinese children for precious grain and seed morsels, and he declared that the people should kill the birds to save their food supply. The people obediently complied, as they have always done in China. Final score: Mao 1, Birds 0. End of story.

There are other problems in the air as well. While you are looking upward for birds in China, you can’t help but notice the sky, and you eventually come to realize that it’s never blue. The pollution is always there, even on windy days, and the color of the atmosphere fluctuates between a very light, almost-misty gray, and the kind of dark ominous gray that would precede a storm in most other parts of the world. This perpetual gray sky has become something of a PR problem as China looks forward to hosting the Orympics (phonetic spelling). At least one world-class marathon runner has announced his intention to boycott the race rather than breathe the polluted air.

To me, a discussion of the Chinese sky seems like the best way to highlight the environmental problems faced by the world’s most populous nation. The startling statistics are another way to define the problem, but you can’t actually see statistics. We read them so often that we can probably recite them from memory. One third of all the concrete poured on the planet is poured in China. One new coal-fired electric power plant comes on line every nine days. The largest telecom company on earth is China Mobile, and it clears a spot for a new cell tower every three minutes. Automobile congestion on Chinese roads is the worst on earth, and this at a time when only one out of every one hundred Chinese citizens owns a car. With the completion of the Three Gorges Dam, the upper Yangtze River is now the most polluted natural body of water on earth, with bacterial levels comparable to the holding ponds in sewage treatment plants. I saw this for myself recently, and I held my nose as I watched the prow of my boat slice through a foam of floating feces. Happily, the Orympic rowing events will not be held at this venue.

At the western end of this new cesspool stands Chungking, known in China as “The Furnace.” Temperatures there in summer hover around 130 degrees Fahrenheit, in spite of the fact that the sun seldom penetrates the perpetual cloud cover. I was in Chungking on a day in June when the temperature was only 115, and it bore no resemblance at all to a hot day in Phoenix or Las Vegas. In Chungking, the sun doesn’t cast a shadow. The reason, again, is that Chinese sky. In another 100 days, with the Orympic Games taking place in August, the whole world will know exactly what I’m talking about.

Sunday, April 27, 2008

Don't Blame Kindler

Jeff Kindler has been CEO at Pfizer since mid-2006, and I think of him whenever I think about the next President of the United States. Kindler, like the person who will follow George W. Bush, took the helm from a predecessor who spent eight years at the top before leaving a mess for someone else to deal with. Jeff Kindler has shouldered much of the Wall Street blame for Pfizer’s stagnant stock price, but to those on the inside, it’s been evident for two years that he never really had a fighting chance.

Blame the wars. For Bush, it was Iraq. For Kindler, it was an out-of-control enlargement of the sales force (the army, if you want to call it that) that came as a flawed legacy from the man who preceded him as Pfizer CEO. By the time Kindler took over, this pharma arms race, characterized by escalating numbers of so-called “detail people,” was destabilizing not just Pfizer but the entire pharma industry. It’s worth noting in hindsight that Pfizer started it.

There are 1.4 million men and women practicing some form of medicine in the U.S. who are lawfully allowed to prescribe human pharmaceuticals. Some of these legal prescribers are dentists, some are part-timers, and some are retired. A few are actually dead, and their family members, for whatever reason, keep their DEA numbers active. The actual number of physicians who write at least one prescription annually is somewhere in the neighborhood of 900,000. This is the pool, the reservoir, of medical people that the pharma industry seeks to court. By the end of 2006, the total industry-wide number of “detail people” was 92,500. Do the math. That computes to more than one pharma salesperson for every 10 prescribers. That may not make sense to you, but in the earliest years of the 21st century, it made sense to the head of Pfizer. And when it stopped making sense, Jeff Kindler was the man left to deal with the fallout.
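The back-of-the-envelope math can be checked in a few lines of Python. The figures are the ones cited in this post (900,000 active prescribers, 92,500 detail people), not independently verified:

```python
# Figures as cited in this post, not independently verified
prescribers = 900_000      # physicians writing at least one prescription per year
detail_people = 92_500     # industry-wide pharma salespeople, end of 2006

# How many salespeople per 10 prescribers?
reps_per_ten = detail_people / prescribers * 10
print(f"{reps_per_ten:.2f} salespeople per 10 prescribers")  # prints 1.03
```

Just over one rep for every ten prescribers, which is where the “more than one for every 10” claim comes from.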

I’ve talked with physicians who tell me that, at the high point of the madness, it was not uncommon to see three different Pfizer salespeople each week. Other pharma companies jumped on the bandwagon and beefed up their sales ranks as well. Pfizer’s total topped out at about 30,000. Merck and J&J were right behind, along with GSK and others.

Taking into account a nice salary, company car, expense account, healthcare benefits, and storage rental for space to hold the astronomical quantities of drug samples, it takes about $200,000 a year to keep a “detail person” in a territory. Add to that the cost of the management hierarchy layered over the sales reps, and the production and supply costs for all those drug samples, and soon you’re talking about some real money in the marketing budget. To be honest, at the beginning, the strategy made a certain amount of sense. The late 1990s had seen an unbroken and timely string of blockbuster drugs entering the Pfizer product roster from an R&D department that seemed charmed in its ability to deliver the goods. Lipitor, Celebrex, Viagra, and other whoppers drove the Pfizer stock price to stratospheric heights in 2000 before it fell to half its value, where it has languished for the past eight years. Added to the product cascade was a series of Pfizer takeovers of other drug companies like Warner-Lambert and Pharmacia. These acquisitions also meant new “acquired” products to feed the sales machine. On the top floor on 42nd Street, the new strategy was to field multiple sales forces detailing different therapy classes. Hence the tales from physicians about seeing three different Pfizer salespeople each week.
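A rough sketch of what “real money” means here, using only the figures quoted in this post ($200,000 per rep per year, roughly 30,000 Pfizer reps at the peak), and before adding management overhead or sample production costs:

```python
# Figures as cited in this post; management and sample costs are excluded
cost_per_rep = 200_000   # annual cost to keep one "detail person" in a territory
pfizer_reps = 30_000     # Pfizer's sales force at its peak

field_cost = cost_per_rep * pfizer_reps
print(f"${field_cost / 1e9:.1f} billion per year for the field force alone")  # prints $6.0 billion
```

Six billion dollars a year just to keep the reps in their territories, which puts the scale of the arms race in perspective.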

Shortly before Jeff Kindler took over the corner office, the once-prolific Pfizer R&D operation hit a number of dry holes, and the flow of new products slowed to a trickle. Behind this was a new and less benevolent attitude at the FDA. The real setback, however, was a new and less benevolent attitude in physicians’ offices. Quite simply, doctors got sick and tired of seeing so many salespeople. Today, a good detail person gets, on average, only five minutes of “face time” with the doctor. Most sales calls today are nothing but sample drops, and that activity hardly justifies the $200,000 annual investment in the sample dropper.

In 2006 and 2007, public opinion polls started showing that the pharmaceutical industry was held in the same low public esteem as the gun manufacturers and the tobacco companies. This is partly the result of constant battering from the U.S. Congress and the AARP. The 2005 movie The Constant Gardener didn’t help either. None of the blame for any of this rests with Jeff Kindler, but he is the one taking the heat. One of his first moves as CEO was to begin cutting the number of salespeople, and in the long term that may do some good. On the whole, however, Kindler faces a situation analogous to the one facing the next U.S. Commander-in-Chief. He has to clean up the mess. And as for Kindler’s predecessor, he jumped off the top floor on 42nd Street floating under a $200 million golden parachute.

Friday, April 25, 2008

If It Was Fair and Balanced, It Wouldn't Be Essential

Ask the average person to describe the New York Times with one word, and most people will say, “Liberal.” Maybe that’s not such a bad thing. That’s not the word I would use, however. For me, the New York Times is “essential.”

There’s a legend in the folklore of politics about Karl Rove in his early days in Texas. He wanted a direct mail list of Conservative Republicans in the state, and because such a thing did not exist at the time, Rove bought himself a list of the subscribers to Field and Stream Magazine. He knew that such a list would be overwhelmingly comprised of Conservative Republicans, and he was comfortable in the knowledge that Field and Stream would not begin opposing the NRA just to broaden its appeal to a wider audience. Rove knew that publications all have their own identifiable and predictable constituencies, and this is true of newspapers as well as magazines.

The constituency of the New York Times is primarily composed of East Coast urban intellectuals, and such a group tends to be politically liberal. News is a product, a newspaper is a business, and the reader is the customer. The first rule of business is to give customers what they want, and the New York Times does this well. To become more balanced and nonjudgmental, the newspaper could reach out to a broader audience of Conservatives by printing the daily diatribes of Rush Limbaugh or quoting the rants of Bill O’Reilly, but the decision makers at the Times are smart enough to know that red-meat Right-wingers like to get their venom straight from the source by tuning in to the EIB Network or the FOX News Channel. The Times appeals to Liberals because Liberals are the ones who read the paper.

For me, the important question is not whether the Times is liberal, but is the Times essential? I believe that it is, and here’s why. About five years ago, I had the privilege of being part of a group discussion with Daniel Ellsberg, and Ellsberg told us, personally, why he gave the Pentagon Papers to the New York Times. He said that the Times, in his opinion, was the only institution with the clout and the courage to withstand the predictable onslaught from the Nixon administration. Ellsberg confided that, prior to turning over the papers to the Times, he had experienced an “All-The-President’s-Men” moment when he genuinely feared for his life. The New York Times, for Ellsberg, was not just an outlet to expose the lies of the Pentagon about Vietnam, but it was also an ally to help guarantee his personal safety.

Here’s the essential thing about why such a liberal bastion is necessary. When power on the Left goes off the track, it tends toward the silly and the pathetic, and maybe even the tragic. I would cite LBJ, Ted Kennedy, and Bill Clinton as examples: annoying, but not really threatening. But when power on the Right gets out of control, it is downright destructive. Neither end of the political spectrum is above punishing its enemies, but whereas the Lefties do it with some timidity, those on the Right do it with a kind of evil enthusiasm. Joe McCarthy, the Nixon administration, and Bush-Cheney are examples of this. Even an institution as strong as the U.S. Congress doesn’t seem up to the task of speaking truth to that kind of power. I don’t mean to imply that the New York Times is the only thing standing between us and outright fascism, but it does have a good track record of helping keep the Right under control. If this offends some Conservatives, it seems like a fair trade-off.

Tuesday, April 22, 2008

WANTED- President. MENSA members need not apply

The message that “intellect is important” has become a tough sell in the modern United States. Maybe this was always the case. After all, ever since the earliest days of the American Revolution, the new nation saw itself, partly, as a thumb-in-the-eye response to the obnoxious and snooty European aristocracies. “America,” we were told, “celebrated the common man,” although it’s doubtful that anyone ever knew precisely what that phrase meant. What sustained us, however, and what allowed us to prosper beyond anything the world had ever seen, was the fact that we selected leaders who were just a little bit smarter than the common man. There was a time, not so long ago, when we actually wanted to be governed by the best and the brightest.

Sometime during the 1992 presidential campaign, Bill Clinton or one of his strategists had the bright idea to position candidate Clinton as something of a common man. It worked like magic. Voters were told by pundits that they seemed to rally around Clinton because he was “like them.” He didn’t talk like FDR or Eisenhower or JFK. In fact, he didn’t talk like any previous presidential contender or president. He was the kindly and folksy Sheriff of Mayberry and we were all little Opie Taylors. The ground had shifted, and future presidential aspirants would now be judged on how much they were like the voters—the common man.

It should have come as no surprise, then, that George W. Bush failed to measure up in intellect or performance when faced with 9/11, and Hurricane Katrina, and massive Mexican immigration, and skyrocketing oil prices. His election was based in part on the fact that he was like the voters, people who were easily blindsided by adjustable-rate mortgages, people who were unlikely to be attending MENSA meetings. In a nation where 25% of high school students drop out before graduation, we need to ask the question, “Do we really want our president to be just like us?”

We are now faced with the candidacy of a man who makes no pretense of being just like the voters. On top of that, he’s black. And because he speaks in complete and eloquent sentences, there is the suspicion that he might be an “uppity” black, with none of the comforting Uncle Remus qualities that made Bill Clinton our honorary first black president. This new black candidate has been labeled (dare I say it?) an “elitist.” For eight years we’ve seen the destruction that defined the presidency of a pseudo cowboy with a room-temperature I.Q. God forbid that we should now raise an “elitist” to the presidency. Is there any possible way that an unvarnished elitist could be worse than a “decider” shooting from the hip like a common man? I have a message for the American voters: the fact that we’ve never been a particularly cerebral populace should not condemn us to be forever governed by nitwits.

Saturday, April 19, 2008

Iraq- How It Will End

On March 19, 2003, the United States invaded Iraq, with 94% of the American public in full support of the mission. Three weeks after that, on April 11, 2003, I personally attended a lecture by 1972 presidential candidate George McGovern. His talk was titled “The Wrong War, in the Wrong Place, at the Wrong Time.” In two hours of speaking, McGovern predicted, with near-perfect accuracy, everything that would unfold in Iraq over the next few years: the insurgency, the sectarian violence that would escalate into a civil war, the ineffectiveness of the elected leaders in Iraq, the utter incompetence of the U.S. commander-in-chief and the Secretary of Defense, the loss of public image in the eyes of the world, the outrageous cost, and even the inadequacy of the poorly armored vehicles that would cost so many American lives. And this was in front of an audience that, like the greater American public, disagreed with his message by a margin of 94% to 6%. He was right that day, and we were wrong.

Based on that experience, I now listen when George McGovern speaks. Last month (March 14, 2008) I again heard McGovern talk, and this time he put forth his plan for leaving Iraq. He said simply, “Load the troops on trucks and drive toward the nearest border.” I believe that’s how it could be done, and I firmly believe that’s how it will be done.

As things stand right now, both Iraq and the United States of America are disintegrating, and the one is causing the other. Iraq has made a trillion dollars of American wealth evaporate as surely as if it were piled up in paper currency and set afire in an all-consuming blaze. The loss of that national wealth has been felt in ripples through the economy, accelerated by bad real estate loans and astronomical oil prices, but beneath the surface it’s all tied together. The falling dollar is symptomatic of both the military and the economic pathologies that currently define the United States. This isn’t my opinion. This comes from the 2001 Nobel Laureate and former chief economist of the World Bank, Joseph Stiglitz, and I assume that a Nobel Prize winner knows more about the “conomy” than George W. Bush.

I will make a prediction. The Iraq War will end sooner rather than later, even if McCain becomes the next president. When the choice becomes one of disintegration in Iraq or disintegration right here in America, we will choose to let Iraq disintegrate. The choice, as they say, is a “no-brainer.” When Americans realize that their economic pain and their agony over the failure of the war are both symptoms of the same disease they will demand an end to it. It will be like cutting off a limb infected with gangrene to save the rest of the body.

Thursday, April 17, 2008

Still Waiting for that Flying Car (part 1)

“I HAVE SEEN THE FUTURE,” proclaimed the small tin lapel pins given out to patrons as they exited the General Motors Futurama pavilion at the 1939 World’s Fair in New York City. Billing itself with the slogan “The World of Tomorrow,” the fair became a two-year celebration dedicated to the blessings of democracy and the wonders of technology, and it was the latter purpose, the apotheosis of technology, that captured the imagination of fairgoers in a way unlikely ever to be seen again. During those two incredible years, sandwiched between a decade of economic hopelessness and the coming horror of the Second World War, it seemed for a brief moment that anything was possible. The automotive titans, Ford and General Motors, competed for the attention, and even the affection, of fair attendees with two innovative exhibits that conveyed the idea of progress in very different ways. Ford sought to highlight the paradigm shift that had already taken place by giving visitors a look back in time in the form of a floor show entitled “A Thousand Times Neigh,” a horse’s-eye view of the automobile. This offering from the Ford Motor Company, while entertaining and original, was completely upstaged by the Futurama of General Motors, which had been designed by the futurist Norman Bel Geddes to show the American landscape as it was predicted to look in the year 1960. Bel Geddes’ vision of the future included 1,500-foot-high office buildings, 14-lane superhighways, and small individual vehicles capable of traveling by both roadway and air.

These components of future technology envisioned in Futurama by Norman Bel Geddes made their way onto a list of predictions that grew out of the 1939 World's Fair to tell the people of that time what they could expect to see in the next 25 years. The following is a collection of anticipated marvels prophesied in 1939 that were realistically expected to exist in the year 1964: buildings taller than the Empire State Building constructed with lavish use of aluminum and glass, a multi-lane highway system that would allow a driver to travel coast-to-coast without stopping for anything but food and gasoline, a personal vehicle capable of both air and ground travel, the cautious but feasible use of atomic energy for power production, ubiquitous plastics, television sets in every home supported by a broadcast infrastructure, nylon stockings for women, rockets capable of orbiting above earth's atmosphere, radio telephones for occasional use in automobiles, aircraft capable of carrying 200 passengers at 400 mph, antibiotics, warships an eighth of a mile long, prefabricated low-cost houses, and fresh fruits and vegetables available at any time of year.

All but one of these promised technological achievements were either fully realized by 1964 or in development with their final actuality clearly in sight. Only the dual-use aircar remained elusive, and if one considers the technological problems associated with atomic energy and orbital rockets, it’s hard to argue that air-ground functionality was simply insurmountable. It’s equally hard to make the case that the aircar concept failed to capture the public’s imagination sufficiently to spur development and production. The list was, after all, composed in 1939, and given the attention the aircar concept had received in the years leading up to 1939, one might have expected its development to be among the first of the expectations to come true.

Wednesday, April 16, 2008

Still Waiting for that Flying Car (part 2)

It would be difficult to overstate America’s obsession with science and technology during the Great Depression. Four magazines best reflected the obsession: Popular Science, Popular Mechanics, Science and Mechanics, and Popular Aviation, all with beautiful color artistry on the cover each month showing some fanciful invention, and an accompanying article inside promising that the wizardry on the cover was just around the corner. Streamlined efficient automobiles, giant airships, speedy single-wing aluminum monoplanes, gigantic steamships of futuristic proportions, buildings constructed of new materials that would allow them to rise a mile high: all this was predicted to be just on the horizon. The most imaginative cover art, however, featured smaller vehicles with dual-use capability: boats that could travel on land, cars that could cross water, and especially cars and boats that could take to the air. The extraordinary inventiveness and artistic quality of these magazine covers have made them highly collectible in recent years, and in looking over published anthologies of 1930s cover art, one finds that fully one in ten depicted some form of airworthy car or roadworthy airplane.

When citizens in 1939, primed by a decade of exposure to such images on the newsstand, were promised flying cars by 1964, the idea seemed far less fanciful than the concept of atomic energy or uninterrupted highway travel. The situation becomes even more remarkable when one finds that, although the designs shown in the cover art were universally impractical and unworkable, many were actually patented. And so the flying car followed a curious arc: it was first envisioned in the minds of magazine cover artists with no regard for practicality or functionality; it was then subjected to a certain amount of patentable developmental study; it was touted as a travel mode of the future in The World of Tomorrow showcase; and then, having failed to materialize, it diminished in importance before ultimately fading from public attention. The thing of it is, it never stopped being a great idea. The logical question is, “What happened?”

What got in the way was World War II. Everything on the 1939 prediction list seemed to have immediate military application: everything, that is, but the dual-function air-ground vehicle. The nylon research that would eventually find its way into the women’s hosiery market was first applied to parachutes. Even the technology used to supply fresh fruits and vegetables year-round had application for feeding troops in combat environments. The automobile industry, which might have been counted on to consider levitating models, simply got out of the car business for the duration of the war, and the aircraft industry was forced to focus exclusively on warplanes. The war changed everything.

After the war, the aircraft manufacturers rushed to apply the heavy load-carrying capacity developed for bombers to commercial passenger transports, thus guaranteeing fulfillment of the 400 mph, 200-passenger airplane prediction on the 1939 list. The car makers, upon returning to civilian production, were simply concerned with survival. Some, like Chrysler and Ford and GM, made the comeback successfully. Others, like Packard and Hudson and Nash and Studebaker and the upstart Kaiser company, fell by the wayside. In such a turbulent market shakeout, the research investment needed to make a car fly was unthinkable.

Moreover, the public no longer felt it needed the product, and this more than anything else explains why the airworthy automobile languished in various design stages in the files of the U.S. Patent Office. Postwar car buyers were content just to be able to buy new automobiles again, and war-developed powerplant innovations made the postwar cars so much faster than the prewar models that drivers felt no need to actually leave the ground; most felt like they were flying already.

Even the science and mechanics magazines that had fired the imagination of technophiles in the 1930s had changed after the war. Advances in photography had made color photographs, not illustrations, the standard for magazine covers, and since airworthy cars didn’t exist, they couldn’t be photographed. Wartime advancements had given cover editors things like radar and atomic bombs and V-2 rockets to sell their periodicals, leaving the fanciful, impractical imaginings of the onetime cover illustrators marginalized in the dreamlike, nostalgic cloudscape of that distant America before the war. Everything had changed.

Tuesday, April 15, 2008

Killing the Goose That Lays Golden Eggs

The pharmaceutical industry, handgun manufacturers, and tobacco companies. Question: What do these three institutions have in common? Answer: Public opinion of all three is equally low, according to several market research polls conducted in 2006 and 2007. When you try to understand the facts behind this amazing situation, you come to realize that there is more at work here than ordinary capitalism.

To the extent that a person can live without a handgun or cigarettes, these products can almost be considered luxuries, and as such they are exempt from price considerations. The complaint about guns or tobacco products is not that they are too expensive. But not so for pharmaceuticals. Since these products can often mean the difference between life and death, people tend to think that, somehow, unfettered access to them should be as easy as access to pure drinking water. It’s a ghastly misinterpretation of the Declaration of Independence’s “right to life.” The capitalistic idea of a supply-and-demand cost structure just doesn’t compute when it comes to Big Pharma, and this mind-set holds true even for pharmaceutical products that are not essential to maintaining life. Take the classic case of Viagra.

Pfizer Inc., like all big companies, has its own folklore, and this is the folk tale about Viagra. Back in 1997, when the product was still in clinical trials and known only by its chemical name, sildenafil, Pfizer did a price-point analysis to see what the market would bear in terms of product cost. But then a curious thing happened. According to the folklore, there was a meeting of the Viagra launch team about three weeks before the product’s introduction. Somebody at that meeting said something to the effect of, “Gee, wouldn’t it be nice if some late-night comic did a joke about Viagra and mentioned the name. It would be like free advertising.” As they say, the rest is history. In the six months following the launch, EVERY late-night comic did a Viagra joke virtually every night, and mentioned the product name every time. A market survey in late 1998 showed that, within a year of the launch, Viagra held the same worldwide name recognition as the Coca-Cola brand. But the curious thing is this: the original price-point target was now too high. When Viagra became world famous, and when demand exploded, potential customers began to think the product was worth LESS, not more. So much for supply and demand.

There is one last thing to consider when it comes to Big Pharma and capitalism. As a purveyor of products that literally keep people alive, Big Pharma is truly the goose that lays golden eggs, and we all know the fairy tale about that scenario. Pharmaceutical research and production is one of the easiest of all industries to move offshore. The absurdly low public opinion of Big Pharma is primarily the result of constant battering from the U.S. Congress and the AARP. When companies like Pfizer and Merck decide that “enough is enough,” it doesn’t take much imagination to visualize these fine companies relocating to foreign countries. Marketing teams would then be free to adopt the mind-set that, if Americans don’t like the pricing structure, they can simply save their money, avoid the products, and die. That’s capitalism in its purest form.

Saturday, April 5, 2008

The Last Reason for War

World War II was unique in many respects, not the least of which was the fact that the rationale for fighting, the essential reason for the war, remained unchanged throughout the nearly four years of American involvement. From start to finish, all Americans knew exactly why they were involved, and that reason for war stayed constant. I believe that this, more than anything else, explains why World War II is the only war in the last century from which the United States came away with a clear and lasting victory (patriots may wish to cite the 1989 Panama invasion as an exception, but that’s beyond pathetic).

In most cases, the stated reason for war changes to adapt to circumstances, and the rationale at the end is very different from the rationale at the beginning. Once you understand this process, the history of the war is already written long before the last shot is fired. Consider a couple of cases in point.

Immediately following the American Civil War, the Government, with a victorious army needing a reason to stay in uniform, sent the troops out West to “make the land safe for settlers.” The military orders were to convert the Indians into peaceful Christian farmers and to “exterminate” those Indians who resisted the process. The word “exterminate” was actually used. By 1876, the extermination had given way to pacification and resettlement. Then an egotistical fool named Custer led his men to their deaths at the Little Big Horn, and for the next 25 years the stated reason for the “conflict out West” was to avenge Custer and to honor the sacrifice of the men under his command. It took a hundred years for the truth to be told when bumper stickers finally appeared on Volkswagens in the 1970s reading, “Custer Had It Coming.”

Fast-forward to the 1960s. The U.S. was bogged down in yet another war with a fuzzy rationale and bleak future prospects. The stated reason for the Vietnam War, initially, had been to create a buffer against Communist expansion. The marching orders were to “neutralize” (“exterminate” was no longer fashionable jargon) those who supported the Communists. Then, in February of 1968, came the Communist Tet offensive, and all the rhetoric changed. Tet was the modern version of the Little Big Horn. For the next seven years, we languished in Vietnam for the stated reason that “we needed to honor the sacrifice of those who had already given their lives.”

Which brings us to Iraq. I won’t bother with the litany of phony reasons for the war that have come and gone. Suffice it to say that we’re now down to the last stage in war where the government says we need to honor the sacrifice of those who have given their lives. Whenever a government tells us that, we can know five things with absolute certainty. This applies to any government and any war.

  1. The initial reason for the war was fraudulent.
  2. The stated objectives of the war are unfulfilled.
  3. The government is losing the war.
  4. The current leader of the government doesn’t want a lost war on his resume.
  5. The government is NOT in the least concerned with those who are sacrificing their lives in battle.

There is a saying that “in war, the first casualty is truth.” It is possible, however, to cut through the lies when you know how to read the code.