Thursday, March 31, 2011

An Ounce of Prevention

Governor Jerry Brown just declared the three-year drought in California over. There have been record rains and snows throughout the state, with some of the worst storms I can recall. It wasn’t supposed to be this way. Meteorologists widely predicted a La Niña year (low to no moisture). Weather forecasting is the one job where it’s OK to be wrong most of the time.


Wanting to know what’s going to happen is natural. Popular entertainment has explored the future and its relationship to the past and present in myriad ways – in books, in movies and on TV, where the SyFy channel is loaded with shows on the subject. Some of us read our horoscope; others go to psychics and mediums.

As a business strategist, much of my career has been spent making and implementing recommendations based on forecasts. I take an array of data points, mix them in a proverbial bowl, and then decide which ingredient should be weighted more heavily than another. Each business is different and each circumstance unique. Similar situations yield different solutions. I make those decisions guided by the principles, goals and motives of the owners.


In an era of declining income the solutions are generally to increase revenue and decrease expenses. Which lever gets tweaked, and by how much, depends on the industry, market trends and many other factors. The strategic analysis is relatively simple – the strategic implementation is where the challenge and a dash of artistry come into play.

President Obama on Monday evening outlined his strategic analysis/vision as it relates to the U.S. action in Libya. The Obama Doctrine was clearly and effectively articulated. It appears to be well reasoned and considered. Ultimately, though, it’s not persuasive. It’s missing context and ignores history.



There are many areas of the world where people are in crisis. The question isn’t whether they need help – they do. The question is what type of help to provide. If American foreign policy is to provide humanitarian help worldwide, then let’s provide it in a non-militarized fashion. Right now, which country gets help feels random, largely because it depends on whether the area is battle-ready and Pentagon-approved. The President said that helping in one part of the world and not another is a “false choice.”


It’s true that America cannot use our military wherever repression occurs. And given the costs and risks of intervention, we must always measure our interests against the need for action. But that cannot be an argument for never acting on behalf of what’s right. … To brush aside America’s responsibility as a leader and – more profoundly – our responsibilities to our fellow human beings under such circumstances would have been a betrayal of who we are. Some nations may be able to turn a blind eye to atrocities in other countries. The United States of America is different. And as President, I refused to wait for the images of slaughter and mass graves before taking action.

This is a powerful, passionate argument for American morality to govern involvement in regions around the world. It’s just that this has not been our history, nor is it part of our guiding principles. Article I, Section 10 of the Constitution (the nation's fundamental laws and guiding principles) states unequivocally: “No State shall, without the Consent of Congress … engage in War, unless actually invaded, or in such imminent Danger …” Even if we ignore the requirement that Congress declare war, nothing in the Libya situation indicates the U.S. is in imminent danger or has been invaded.


 


The United States has a noble history of non-interventionism for the majority of our existence as a nation. President George Washington advised the country to steer clear of foreign entanglements and "permanent alliances." Thomas Jefferson favored "peace, commerce, and honest friendship with all nations; entangling alliances with none." John Quincy Adams wrote that the U.S. "goes not abroad in search of monsters to destroy."


The unblemished history of non-interventionism lasted until April 6, 1917, when Congress declared war on Germany after German U-boats sank seven U.S. merchant ships. After World War I the U.S. entered a long period of isolationism – trying to return to its Constitutional roots of not getting involved in foreign conflicts.
In the 1930’s Congress passed a series of Neutrality Acts designed to keep the country focused on internal matters, especially after the Depression. The Acts were largely repealed in 1941 in the face of German submarine attacks on U.S. vessels and the Japanese attack on Pearl Harbor, which led to U.S. involvement in World War II. (This was the last time that the U.S. Congress actually declared war.)



The stunning impact of the U.S. military victories that helped end World War II is now legendary and paved the way for a foreign policy based more on what the military can do than on what the military should do. In addition, military action is taken based on what might happen rather than on what has happened ... let alone whether the sovereignty of the U.S. is at risk.

President Obama said that the military action of the U.S. and her allies in Libya “prevented a massacre.” That’s impossible to prove. Other prevention efforts haven’t been effective. Preventing Saddam Hussein from having Weapons of Mass Destruction allowed the U.S., for the first time in her history, to attack another nation preemptively. Eight years later there have been thousands of deaths and more than $1 trillion spent on prevention efforts that yielded not one WMD.


Ben Franklin famously said “An ounce of prevention is worth a pound of cure.” Those who support the Doctrine of Interventionism (be it Obama’s, Bush’s, Clinton's, Kennedy's...) would argue that this quote supports their philosophy of preventative action. Instead, the ounce of prevention is preventing war itself – and the best way to do that is to return to America’s roots and tradition of non-interventionism.

Thursday, March 24, 2011

If this had been a real emergency…

At the start of every movie, concert and theatre performance (“ticketed performance”) in California, audiences are directed to “look around for the nearest emergency exit.” This admonition/warning was the result of a law signed by Gov. Schwarzenegger effective January 2007. Airline passengers are accustomed to extensive safety announcements thanks to federal law and the FAA. I’ve taken enough cruises that I actually know how to put on the lifejacket during the mandatory safety drills. Products carry warning labels as a matter of course.
Living in “Earthquake Country,” there is a certain amount of preparedness that is commonplace in my life. There’s extra bottled water on hand, first aid kits, a few hundred dollars in small bills for when the ATMs go out, etc. I don’t have a bunker, and food storage is, well, more likely to be consumed than saved. There are communications plans for friends and family, and I have a couple of power back-up units that would last less than an afternoon.

Back in the 1990’s I ran a technology company whose primary customers were radio stations around the country – and the service provided to those stations required stability. When we moved into a new location in Santa Monica I oversaw the installation of a custom back-up battery power system that would run the servers, computers and office for an entire week. The installation of all those batteries required structural reinforcement to the building and a monumentally large check from the parent company. Happily, it was never needed.

The devastation that the 8.9 earthquake, the tsunami and the nuclear crisis have wrought on Japan is sad, tragic and simply awful.  The Japanese people serve as a model of preparedness.  Schools and businesses have weekly earthquake drills.  Building codes are rigorous.  Even still, Mother Nature’s powerful one-two-three punch proved to be too much and part of the country is devastated.  The survivors are reestablishing their lives in a different world and modifying behavior as a result.
Americans do not appear to be as self-reliant. It seems that whenever there is a disruption of routine, an emergency is declared. In 2010 President Obama declared 81 disasters – the most ever. Should he serve two full terms, he is on pace to declare some 560. George W. Bush declared 516 during his eight years, Bill Clinton 379. Clearly the United States is a magnet for disasters and it gets worse each year. But does it?
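As a quick sanity check on those totals, here is the per-year pace they imply – a back-of-the-envelope sketch using only the figures above (the 560 number is this post's two-term projection, not an official count):

```python
# Rough per-year pace implied by the declaration totals cited above.
# (The 560 figure is the post's projection for two full Obama terms.)
totals = {
    "Obama (projected, 8 yrs)": 560,
    "Bush 43 (8 yrs)": 516,
    "Clinton (8 yrs)": 379,
}

for president, total in totals.items():
    print(f"{president}: about {total / 8:.0f} declarations per year")

# Obama (projected, 8 yrs): about 70 declarations per year
# Bush 43 (8 yrs): about 64 declarations per year
# Clinton (8 yrs): about 47 declarations per year
# For comparison, 2010 alone saw 81 -- the record year cited above.
```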
An emergency declaration allows for logistics support, funding and an assortment of other federal programs to kick in. An area under an emergency declaration is able to suspend rules and laws in order to manage the disaster. This structure exists for good reason – during a major cataclysmic event first responders must be able to stabilize the situation as quickly as possible without some of the luxuries that democracy usually demands. During a less serious event the powers are excessive, so determining what is and what isn't an emergency is important.
The definition of an emergency has evolved...or devolved. This past winter, one of the worst in many years, declarations were happening before the snow even started. It seems every storm can be categorized as an emergency – and, much like the in-flight and in-theatre announcements, it becomes white noise. Most people don't think too much about it and are not only glad for the extra help, they expect it.
Responding poorly to an emergency can be political suicide. President Bush (41) was roundly criticized for FEMA’s response to Hurricane Andrew. Thirteen years later his son (43) received even more criticism over the federal response to Hurricane Katrina, while President Clinton got strong praise for how his team dealt with the 6.7 Northridge Earthquake. Each of these was an important, major event.
The costs of these emergencies add up. Each state has its own emergency budget, and FEMA costs about $10.1 billion per year just for operations. Each declaration receives additional funding which is outside of the regular budgeting process.
More important than the dollars, the cycle of declare-every-event-a-disaster has reduced our ability to distinguish what is really an emergency from what is inconvenient, difficult and part of life. I choose to live in an area known for earthquakes, droughts and mudslides. I have family and friends who choose to live close to beaches that flood, or in mountains prone to avalanches, or in states where hurricanes and tornadoes regularly sweep through.
Life has risks. Looking to the government after something bad happens to buffer the pain or solve the problem is part of the deterioration of personal responsibility that damages America more than any storm ever could. For something cataclysmic – yes, the government has a role and it must play it well. Those events are few and far between and must be very thoughtfully and carefully designated. For something merely unfortunate, the government should not protect us against the consequences of just living.
Between 2000 and 2010 nearly every county in America had at one time been under an emergency declaration. Given the wide range of powers that government takes on during a declaration, a conspiracy theorist might conclude that there’s some evil plot to strip Americans of their rights. The truth is that to buffer ourselves from inconvenience we have voluntarily and eagerly given over those rights…which is actually much more insidious and leads us as a nation further away from individual responsibility and liberty.

Sunday, March 20, 2011

Spring: a young man’s fancy turns to war

Today, March 20, is the first day of Spring. It is the eighth anniversary of the launch of “Operation Iraqi Freedom,” in which a coalition of countries led by the U.S. invaded Iraq on a futile hunt for Weapons of Mass Destruction. Yesterday, the U.S. began its third conflict in the Middle East by bombing Libya. We also mourn the death of Warren Christopher, longtime diplomat and public servant, who was passionately opposed to war. Not only did the former Secretary of State die yesterday, so did his ideal of diplomacy.

President Barack Obama won the Nobel Peace Prize not based on what he had done in the nascent months of his administration, but on what the committee expected he would do. The President gave us hope: “I believe that we must develop alternatives to violence that are tough enough to actually change behavior -- for if we want a lasting peace, then the words of the international community must mean something. Those regimes that break the rules must be held accountable. Sanctions must exact a real price. Intransigence must be met with increased pressure -- and such pressure exists only when the world stands together as one. … The promotion of human rights cannot be about exhortation alone. At times, it must be coupled with painstaking diplomacy. I know that engagement with repressive regimes lacks the satisfying purity of indignation. But I also know that sanctions without outreach -- condemnation without discussion -- can carry forward only a crippling status quo. No repressive regime can move down a new path unless it has the choice of an open door.” Perhaps the Libyan door wasn't ever open, but did we even try?

There are justifications and nuances that make the U.S. forced entry into Libya more palatable. The President did utilize the international community, and U.S. engagement is part of what appears to be a truer coalition than in Iraq or Afghanistan. No matter the packaging, though, American forces are in conflict not in response to a challenge to U.S. sovereignty but as something else.

It is cold and rainy this first day of the season in Los Angeles – a more apt metaphor to start Spring than the usual sense of optimism and rebirth.  May this conflict end soon.

Thursday, March 17, 2011

March to Madness

It’s underway – the NCAA tournament that determines the best college basketball team, better known as March Madness. My alma mater (Syracuse) sends out near-daily email updates and keeps its Facebook page current with the latest information, as I’m sure the rest of the schools do. TV viewership is high and betting takes place in offices around the country.

The condensed schedule is part of the allure: we can watch teams “go all the way” in a few short weeks. The fact that the competitors are amateurs – college kids – is a huge part of the appeal. Big dollars are at stake for school programs and television networks – but the players are just college kids playing the game. Well, that’s the illusion anyway. Not like the NFL.
Last week the NFL Players Association (NFLPA) disbanded itself. The union had been in negotiations with the owners about the next contract…a process known as collective bargaining, where the workers (players) authorize the union to represent them to multiple owners (teams). By decertifying, the workers can negotiate individually or in small groups, which is usually considered less effective. In the case of the NFL, several high-profile players have filed lawsuits and antitrust actions against the owners, trying to gain leverage in negotiating for more money.
The NFL economy is not performance-based. All the money that comes in (from the TV licensing deal, ticket sales, etc.) goes into a pot and is split evenly between the clubs and then split nearly evenly between the owners and players. It doesn’t matter if a team gets lousy ratings. It doesn’t matter if a team loses every single game. Each team still gets its equal split.
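A toy sketch of that arrangement, just to make the mechanics concrete – the dollar figure and the exact 50/50 split below are hypothetical placeholders, not actual NFL numbers:

```python
# Hypothetical illustration of the revenue-sharing pool described above.
# (The total and the exact 50/50 split are placeholders, not NFL figures.)
total_revenue = 9_000_000_000   # all TV, ticket and licensing money, pooled
num_teams = 32

per_team = total_revenue / num_teams   # every club gets the same slice...
owners_share = per_team * 0.5          # ...then splits it roughly evenly
players_share = per_team * 0.5         #    with the players

print(f"Each team: ${per_team:,.0f} "
      f"(owners ~${owners_share:,.0f}, players ~${players_share:,.0f})")
# Wins, losses and ratings never enter the calculation -- that's the point.
```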
The team owners want to change the split so that their pool is bigger than the players’. I don’t know enough to say whether that’s fair or not, since I come from a more market-based philosophy. The fundamentals of the NFL economy guarantee more leverage to team owners, who all benefit equally. If each team earned its own income, paid its own expenses and stood on its own merits, I imagine that relations with the players would be substantially different, as it would be in each team’s interest to keep its players happy. For the players, walking away from the negotiating table and dissolving the union seems to be a drastic action – one that nearly guarantees the loss of the 2011-12 season. Should the public-sector unions do the same?
Wisconsin Gov. Scott Walker last week officially eliminated some of the collective bargaining rights of most of the state's public employees. There is a lot of emotion and excitement around this action. It was dramatic and exciting to watch…though we’ll see whether the action stands the test of time and whether the shenanigans used to pass the measure survive the judiciary. Since the unions had agreed to the financial give-backs, it would have made more sense to vote on that and separate out the other issues…but that would have been good policy and not good politics.
Saying that collective bargaining has been eliminated in Wisconsin is inaccurate. The unions that represent state workers still exist in Wisconsin. They can even negotiate with lawmakers – though there is now a statutory limit that caps wage increases at the rate of inflation. Many lawmakers receive donations from unions, which may influence the negotiation. There is also little incentive to negotiate hard, as there are few options available to lawmakers – it’s not like they can relocate, trim services or walk out like the NFL players. Because of these concerns, 30 states, Washington DC, Puerto Rico and the federal government do not permit collective bargaining.
Unions have played a vital role in protecting workers throughout history.  From medieval times guilds existed to protect and enhance their members' livelihoods by controlling the progression of members from apprentice to craftsman, journeyman, and eventually to master and grandmaster of their craft. 
In 2010, 14.7 million workers – 11.9% of the workforce – were represented by unions. In 1945 American unions reached their apex, with 36% of workers represented. Many reasons have been offered for the decline. One is that unions achieved the bulk of their initial goals, and those gains have had lasting impacts. NFL players now get clean socks (one of the early achievements of the NFLPA). More substantive health and safety gains have benefited all other industries. Wages rose, allowing for the establishment of the middle class in America, and standards for fair work and bargaining became the norm.
By the 1970s union actions had set a baseline for American workers. All that was left was for wages and benefits to continue to increase, regardless of the need. These increases, along with excessive regulation and global competition, contributed to the demise of the manufacturing base in the United States. The demands became unsustainable. The result is that it is no longer affordable to produce many products in the USA.
The 2007-08 financial crisis afforded unions a new opportunity: relevance. For businesses (and even the public sector) to survive, all of the stakeholders must be at the same table – employers and employees – finding common ground in an affordable, realizable way. Unions are not inherently bad, and people must be free to choose whether to negotiate individually or as a group with their bosses. Unions can actually be the force for change on issues of efficiency, goals and results. It would take a different philosophy on the part of everybody…starting with workers being cheerleaders. Which reminds me…Go Orange!!!

Thursday, March 10, 2011

Springing Forward to War?

Early Sunday morning (late Saturday night) most of us “spring forward” to Daylight Saving Time, “losing” an hour of sleep but gaining additional daylight. One genesis for the twice-annual process of adjusting sleep patterns and disrupting businesses is the popular (if inaccurate) notion that it harkens back to the days of farming, when having more daylight allowed the harvest to be reaped. In reality, commerce and transportation figured more prominently.

The idea of daylight saving was first conceived by Benjamin Franklin. Standard time in time zones was instituted in the U.S. and Canada by the railroads on November 18, 1883. Prior to that, time of day was a local matter, and most cities and towns used some form of local solar time, maintained by a well-known clock (on a church steeple, for example, or in a jeweler's window). Daylight Saving Time has been used in the U.S. and in many European countries since World War I. By 1966, some 100 million Americans were observing Daylight Saving Time based on their local laws and customs. Congress decided to step in and end the confusion, and to establish one pattern across the country. The Uniform Time Act of 1966 was signed on April 12, 1966, by President Lyndon Johnson, creating Daylight Saving Time.


Standardizing time zones may be one of the better government interventions! Futzing forward and backward (“saving”) every six months doesn’t make much sense. The time has long since passed to do away with this anachronism and let time stand still.

Where time is marching forward in a much more precarious way is in Libya. In the past several months the world has watched millions of Middle Eastern people challenge their situation and change their governments with relative ease and lack of bloodshed. It is still far too early in the “transformation” to see whether the changes in Egypt, Bahrain, Tunisia, etc. are cosmetic or will have any actual long-lasting impact.

In Libya the opposition/rebels/freedom-fighters have had a more difficult time than elsewhere in the region. Each country is unique and the reasons for the uprisings are not consistent. Forces loyal to Moammar Gadhafi have thus far thwarted the revolution and allowed the dictator to stay in power. (This could change at any time, however, and likely will).

Many politicians and pundits are promoting a number of proposals to intervene to push along the process: establish a no-fly zone, arm the rebels, etc. The Obama administration says it will not do anything unilaterally and will seek international support. Déjà vu.

The U.S. is currently engaged in two combat missions (Iraq and Afghanistan) where 5,924 Americans have died and 42,406 have been wounded so far. Hundreds of thousands of Iraqis and Afghans have also been killed or wounded. The conflicts have cost in excess of $1.2 trillion. There is no justification for these losses, and even less so if America or the international community intervenes in another Middle Eastern country, Libya.

Is Gadhafi a bad guy? Yes. For 42 years the world has dealt with this bad guy – and many others like him in the region and around the world. The question that gets too little attention isn’t how to intervene but rather why should the U.S. (or the world for that matter) take sides?

“It’s the right thing to do.” “Humanitarian needs.” “The rebels are asking for our help.” “Stabilize the oil market.” All are true to one degree or another, but is that enough of a reason?


History – and even current events – are rife with examples where help is desperately needed in some region of the world and isn't or wasn't provided by the U.S. and her allies. It is hard to say no – especially in the face of hardship, need and compelling television coverage. Sometimes it's just the right thing to do.


Certainly as a business person I recognize that the steady flow of oil is vital for the economy. Only 3% of the oil that the U.S. uses comes from Libya. Not that any amount would be, but 3% is really not a significant amount of oil to justify the moral hazard of intervention – let alone the financial impact.

In pure economic terms it doesn’t make sense either. Libya produces 1.5 million barrels of oil a day – or 547,500,000 barrels per year. At $104 per barrel (the 3/9/11 price), the oil exported to the U.S. amounts to roughly $1.7 billion a year. Given the amount America would spend on military machinery, it would actually be more cost-effective to pay the $1.7 billion – even double that – for oil from elsewhere than to invest in another conflict in the Middle East that would also be a rallying cry for terrorists.
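For readers who want to check the arithmetic, here is a rough sketch using only the figures above (how the $1.7 billion relates to a 3% share is my own inference, not something stated in the post):

```python
# Back-of-the-envelope check of the Libya oil figures cited above.
barrels_per_day = 1_500_000                 # stated Libyan production
barrels_per_year = barrels_per_day * 365    # = 547,500,000
price_per_barrel = 104.0                    # 3/9/11 price cited above

total_value = barrels_per_year * price_per_barrel
print(f"Libya's annual output: ~${total_value / 1e9:.1f} billion")  # ~$56.9 billion

# The $1.7 billion U.S. figure works out to about 3% of that output
# (note: a different ratio from "3% of the oil the U.S. uses" above).
us_purchases = 1.7e9
us_barrels = us_purchases / price_per_barrel
print(f"Implied U.S. purchases: ~{us_barrels / 1e6:.0f} million barrels "
      f"({us_barrels / barrels_per_year:.1%} of Libyan output)")
# ~16 million barrels (3.0% of Libyan output)
```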

There will always be more places and people who want and need American help than can be provided. A third conflict in the Middle East beckons. Will we turn back the clock and say “no, sorry, not this time?”  I fear we will spring forward and become further entrenched in the Middle East.  Now that's something worth losing sleep over.

Thursday, March 3, 2011

Freddie & Fannie – time to go

I enjoy watching the myriad house-hunting, decorating and renovation shows that are on TV. Seeing what money would buy in LA versus Kansas versus New York is illuminating and a great way to escape the drudgery of figuring out how to pay the mortgage. Several entire networks are dedicated to these shows, and variations appear across the cable box – and they are all rooted in perpetuating home ownership.

Americans have a tradition of home ownership – from the 1800s “land runs,” where previously restricted land was opened for homesteading, often on a first-come basis. By 1900, 46.5% of Americans owned their homes. During the Great Depression of the late 1920s and early 1930s many lost their homes.

Fannie Mae was established in 1938 as part of President Roosevelt's New Deal. Once a bank approved a loan to a customer, it could sell that loan to the government and use the funds to make a home loan to somebody else. Banks no longer had to wait to be paid back before having the capital to lend again, allowing more people to get loans. Home ownership increased to 62.9% by 1970 thanks to the additional availability of federal dollars.

Freddie Mac was created in 1970 to expand the secondary market for mortgages. Similar to Fannie, Freddie buys mortgages from banks. Freddie then packages the mortgages together and sells them off to other investors. This increases the pool of dollars for banks and investors to lend. By 2000 home ownership had increased to 66.2%, and by 2010 it was essentially stagnant at 66.9%.
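A highly simplified sketch of the secondary-market mechanism Fannie and Freddie provide, as described above – the Bank class and the dollar amounts are hypothetical illustrations, not a model of the actual agencies:

```python
# Hypothetical illustration of the secondary mortgage market described above:
# a bank with limited capital can keep lending because a Fannie/Freddie-style
# buyer purchases its loans and replenishes its cash.

class Bank:
    def __init__(self, capital):
        self.capital = capital
        self.loans_made = 0

    def originate(self, amount):
        """Lend to a homebuyer if the bank has the cash on hand."""
        if amount > self.capital:
            return False
        self.capital -= amount
        self.loans_made += 1
        return True

    def sell_loan(self, amount):
        """Sell the loan into the secondary market, recovering the capital."""
        self.capital += amount


bank = Bank(capital=200_000)

# Without a secondary market, one $200,000 loan exhausts the bank's capital.
# With one, selling each loan restores the capital, so the bank can lend again.
for _ in range(5):
    if bank.originate(200_000):
        bank.sell_loan(200_000)   # a Fannie/Freddie-style buyer takes the mortgage

print(bank.loans_made)  # 5 loans made from the same $200,000 of capital
```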

I am a homeowner, and it's likely that my mortgage is owned by the government, since Freddie and Fannie guarantee more than half of all mortgages in the U.S. Combined they have $9.5 trillion in assets, carry $3.4 trillion in mortgage-backed securities and have debt of $1.6 trillion. (The entire U.S. budget is approx. $3.7 trillion.) For years Freddie and Fannie were pseudo-government agencies run like private corporations with shareholders but collateralized by tax dollars. On September 7, 2008 they were put under the conservatorship of the Federal Housing Finance Agency (FHFA). It was the single largest government takeover of a private (or semi-private) institution in history, and it has taken $134 billion (so far) to bail out the companies.

President Obama has proposed three options for shrinking U.S. involvement in housing. Each of the Obama recommendations has merit and begins to reduce the dependence on taxpayers…but the options are incremental. The Obama administration must be lauded for actually taking on this issue: it’s a complex and dry policy area that could replace counting sheep as a solution for insomnia. It is a good entry point for determining what the appropriate role of Government is.


Government has been involved in shaping and guiding housing policy forever – at least since feudal times. Rather than incrementally tweaking, what would it look like if the Government got out of the mortgage business altogether? Imagine:
Banks would look at a property, evaluate the value of the land and the qualifications of the potential buyer, determine whether it was worth the risk to lend the money, and then earn interest as the money is paid back over time. If a bank made a bad decision, its recourse would be (as it is today) taking the property back from the customer and reselling it to somebody else…sometimes taking a loss, sometimes not. The bank would actually be accountable and suffer a consequence for its decision without having the Government backstop the loan.

Arguments for government financing of housing are powerful: the private-sector market on its own provides very few opportunities where risk is present (bad neighborhoods, nontraditional buyers, etc.). Lending costs would be higher because there would be fewer loans. Access to housing for minorities might be difficult, as it was prior to the 1970s.

These issues are legitimate and real, and there is historical precedent to be aware of. Note I am not suggesting a change to housing laws and protections – only to government financing of housing. A purely market-based solution allows for these concerns to be addressed. Not-for-profit corporations can act as banks and can provide supplemental funding for distressed neighborhoods or less-than-perfect applicants. And maybe we end up with 40% of people owning homes – but it’s 40% who can legitimately afford to. And maybe I won't be one of them. It won’t be perfect, but the Government-guaranteed, taxpayer-funded solution isn’t perfect now.

The issue isn’t whether Government has an interest in housing…it’s whether Government should be financing housing. Freddie and Fannie have proven that, despite the best intentions, they are ineffective. Banks have proven their own incompetence as well during this financial disaster – so by no means are banks the salvation, but they operate within rules and regulations. If the authorities don’t hold people and companies accountable when they break those rules, that’s an issue of enforcement. The problem is that during the financial disaster of 2007-09 banks faced no consequence for their decisions and actions. Government bailouts or government-forced mergers prevented banks from having to be liquidated.

The justification was that the entire financial system had to be saved or the average person would be hurt. The result was that the essential part of capitalism, the part that I believe makes capitalism work, was taken out of the equation: there was no penalty, no consequence. So the message is: screw up and be rewarded, not punished. Of course the consequence is still present – the U.S. taxpayer (thanks to our Chinese friends) will clean it all up over the next 50 years. Glad the average folks didn't get hurt! Until institutions are held responsible for their actions, it won’t much matter whether Fannie and Freddie continue to require billions in bailout dollars or are tweaked ... nobody but the taxpayer will be left holding the bag. Eliminating taxpayer funding of housing is a good place to start...right, “House Hunters”?