
James P. Freeman: Past time to break up America's mega-banks before they cause another crash

JPMorgan Chase & Co. headquarters, in Manhattan.


As Americans were lounging comfortably over the 4th of July holiday, some were surely feeling a sense of new-found serenity. Not because they were full of burgers, barkers and beer. On the contrary, they were digesting the news that 34 of the nation’s largest banks, for the first time since the financial crisis began a decade ago, all passed the Federal Reserve’s annual “stress tests,” which, according to The Wall Street Journal, “could bolster the industry’s case for cutting back regulation.” But hold the match before lighting leftover fireworks in celebration. Break up the banks first.

Remarkably, the five largest banks today — JPMorgan Chase & Co., Bank of America, Citibank, Wells Fargo Bank and US Bank — now control about 45 percent of the financial industry’s total assets, or roughly $7.3 trillion. To put that number in perspective, the size of the entire U.S. economy is roughly $18 trillion.

Twenty-five years ago, the five largest banks owned just 10 percent of all financial assets. The Federal Deposit Insurance Corporation’s statistics reveal that in 1992 there were 11,463 commercial banks and 2,390 savings and loans. By March of 2017, those numbers had dwindled to 5,060 commercial banks and just 796 savings institutions. And the $4.6 trillion in assets held by the five largest banks in 2007, when they controlled 35 percent of industry assets, has grown by more than 150 percent over the past decade.

The sharp rise in the concentration of these assets (a measure of size and wealth) has real economic, political and social ramifications. As oxfamamerica.org fears, “These massive banks use their wealth to wield significant political and economic power in the U.S., in the countries where they operate, and in the international arena.” Finance today, author Rana Foroohar reasons in Makers and Takers, “holds a disproportionate amount of power in sheer economic terms.” (It takes about 25 percent of all corporate profits while creating only 4 percent of all jobs.)

Moreover, writing in The Washington Post five years ago, just as the banking system was being recalibrated and reregulated, U.S. Sen. Sherrod Brown, D-Ohio, ranking member of the Senate Banking Committee, recognized then what remains true and what the federal government should recognize now: “Even at the best-managed firms, there are dangerous consequences of large, complex institutions undertaking large, complex activities. These companies are simply too big to manage, and they’re still too big to fail.” Brown is correct in citing “Too Big To Fail” (a warped public policy), but he is even more right to emphasize (as many more public officials should) that these firms are simply too big to manage and maintain.

JPMorgan Chase is a financial colossus. In January 2017, it reported assets of $2.5 trillion that generated $99 billion in revenue and earned $24.7 billion in profit. It is the largest U.S.-domiciled bank and the sixth largest bank in the world. Today, it alone holds over 12 percent of the industry’s total assets, a greater share than the five biggest banks put together held in 1992. Its global workforce of 240,000 operates in 60 countries. From 2009 to 2015, the bank paid $38 billion in fines and settlements (involving, among other matters, the Bernie Madoff and London Whale scandals), mere rounding errors in its ethics and financials.

Jamie Dimon is the company's chairman and chief executive officer, perhaps the second most difficult job in the country, after only the presidency of the United States. To say that he “manages” the firm would be an overstatement. More accurately, he “presides.” Dimon was named CEO in 2005 (when assets were only $1.2 trillion) and is widely given credit for adroitly navigating the financial turmoil of 2008-2009. In 2015, he made $27 million and said that banks were “under assault” by regulators. He keeps two lists in his breast pocket: what he owes people; what people owe him.

Now 61, he is the subject of much speculation about who his successor will be. In an interview with Bloomberg Television last September, Dimon said he would leave “when the right person is ready.” But who is ever “ready,” and able, to run a $2.5 trillion company? No one, competently.

Wells Fargo ($2 trillion in assets and 8,500 locations) was thought to be among the best-managed banks before, during, and after the crisis, until it was revealed last year that it had fired 5,300 employees and clawed back $180 million in compensation because of the unauthorized opening of 2 million customer accounts. A flawed “decentralized structure” and a perverse sales culture were to blame for illicit activities that occurred over a decade.

Fortune magazine in March 2007, with sterling irony, ranked Lehman Brothers first and Bear Stearns second among the most admired companies in the securities industry. By the end of 2008, Lehman Brothers had filed for bankruptcy (the largest in U.S. history, at $692 billion) and Bear Stearns had been sold by fiat, in a fire sale, to JPMorgan Chase.

Big banks are driven by avarice and algorithms (complicated code that directs computers to execute financial transactions by the millisecond), not altruism. Lending is secondary to speculating. Complexity has replaced familiarity. Vaults hold more data than gold. And traders can destroy banks faster than boards of directors. On any given business day, no executive or regulator can be certain of the health of these institutions.

This is the new Wall Street alchemy.

The first public signs of distress in the financial system before the Crash of 2008 appeared 10 years ago, when a July 2007 letter to investors disclosed that two obscure hedge funds managed by Bear Stearns had collapsed. The long fuse had been lit. The Great Recession was triggered. And the torch paper was provided in the form of legislation.

The Financial Services Modernization Act of 1999 (known as Gramm-Leach-Bliley) neutered the Banking Act of 1933 (known as Glass-Steagall), which separated the riskier elements of investment banking from the more conservative aspects of commercial banking. The 1999 law spurred a new model of financial supermarkets (banking, investments and insurance under one company). It also unwittingly fostered a new risk-taking model:  Losses could be socialized (depositors, shareholders, taxpayers) while profits could be privatized (executive compensation). Exotic financial instruments and ineffective regulatory oversight fueled the meltdown.

Today’s big banks largely resemble Zuzu’s petals in the film It’s a Wonderful Life, seemingly mended but not made better. During the Crash/Panic of 2008, the federal government engineered the financial equivalent of pasting damaged rose petals in a desperate attempt to prevent the total collapse of the banking sector. It effectively merged the unwieldy likes of Merrill Lynch with Bank of America, Bear Stearns with JPMorgan Chase and Wachovia with Wells Fargo.

In the wake of the crisis (which ultimately required $1.59 trillion in government bailouts and another $12 trillion in guarantees and loans), the Dodd-Frank Wall Street Reform and Consumer Protection Act of 2010 became law. It sought to make the financial system safer and fairer than it had been, to reduce the risk of financial crises, to protect the economy from the devastating costs of this kind of risky behavior, and to provide a process for the orderly disposition of failing firms. But at a staggering 848 pages, with 8,843 new rules and regulations, Dodd-Frank does nothing to reduce the size of, and hence the systemic risk posed by, the biggest banks.

“We may have gotten past the crisis of 2008,” Foroohar concludes in her book, but, disturbingly, “we have not fixed our financial system.” Bankers still “exert immense soft power” via the revolving door between Washington and Wall Street. And the ranks of today’s top government regulators are littered with former banking executives, who are “disinclined to police the industry.”

The idea of overhauling big banks, however, is attracting some surprising converts.

In May, President Trump acknowledged that he was “looking at” breaking up the big banks. And the principal architects and former co-heads of the first financial supermarket — Citigroup — have had an epiphany of sorts. In 2015, ex-Citicorp CEO John Reed wrote in the Financial Times that “the universal banking model is inherently unstable and unworkable. No amount of restructuring, management change or regulation is ever likely to change that.” And in 2012, Sandy Weill, the former Citigroup CEO, called for a return to Glass-Steagall.

Dodd-Frank mandates that the Federal Reserve conduct annual stress tests on financial institutions with assets over $50 billion. This year’s test on 34 banks relied upon enhanced computer modeling to assess how those banks would perform under “adverse and severely adverse” economic conditions. But such modeling is an unreliable predictor of reality. As JPMorgan Chase knows well.

In May 2012, The New York Times provided insights into massive trading losses at JPMorgan Chase. The bank had little idea that the losses were brewing. It relied on computer modeling, pioneered by its bankers in the 1990s, to identify and measure potential losses. But the bank tripped up in its measurements. It deployed a new model that underestimated losses; when it redeployed the old model, the estimated losses nearly doubled. As The Times chillingly recalled, such computer programs “proved useless during the financial crisis.”

James P. Freeman, a former banker, is a New England-based writer, former columnist with The Cape Cod Times and frequent contributor to New England Diary. This piece first ran in The New Boston Post.



David Warsh: Deconstructing the Great Panic of 2008

By DAVID WARSH

BOSTON

Lost decades, secular stagnation -- gloomy growth prospects are in the news. To understand the outlook, it is better first to be clear about the recent past. The nature of what happened in September five years ago is now widely understood within expert circles. There was a full-fledged systemic banking panic, the first since the bank runs of the early 1930s. But this account hasn’t yet gained widespread recognition among the public. There are several reasons.

For one thing, the main event came as a surprise even to those at the Federal Reserve and Treasury Departments who battled to end it. Others required more time to figure out how desperate had been the peril.

For another, the narrative of what had happened in financial markets was eclipsed by the presidential campaign and obscured by the rhetoric that came afterwards.

Finally, the agency that did the most to save the day, the Federal Reserve Board, had no natural constituency to tout its success except the press, which was itself pretty severely disrupted at the time.

The standard account of the financial crisis is that subprime lending did it. Originate-to-distribute, shadow banking, the repeal of Glass-Steagall, credit default swaps, Fannie and Freddie, savings glut, lax oversight, greedy bankers, blah blah blah. An enormous amount of premium journalistic shoe leather went into detailing each part of the story. And all of it was pieced together in considerable detail (though with little verve) in the final report of the Financial Crisis Inquiry Commission in 2011.

The 25-page dissent that Republican members Keith Hennessey, Douglas Holtz-Eakin and Bill Thomas appended provided a lucid and terse synopsis of the stages of the crisis that is the best reading in the book.

But even their account omitted the cardinal fact that the Bush administration was still hoping for a soft landing in the summer of 2008. Nearly everyone understood there had been a bubble in house prices, and that subprime lending was a particular problem. But the sum of all subprime mortgages outstanding in 2007 was $1 trillion, less than the market as a whole occasionally lost on a bad day, whereas the evaporation of more than $8 trillion of paper wealth in the dot-com crash a few years earlier had been followed by a relatively short and mild recession.

What made September 2008 so shocking was the unanticipated panic that followed the failure of the investment banking firm of Lehman Brothers. Ordinary bank runs – the kind of thing you used to see in Frank Capra films such as "American Madness" and “It’s a Wonderful Life” – had been eliminated altogether after 1933 by the creation of federal deposit insurance.

Instead, this was a stampede of money-market wholesalers, with credit intermediaries running on other credit intermediaries in a system that had become so complicated and little understood after 40 years of unbridled growth that a new name had to be coined for its unfamiliar regions: the shadow banking system – an analysis thoroughly laid out by Gary Gorton, of Yale University’s School of Management, in "Slapped by the Invisible Hand'' (Oxford, 2010).

Rather than relying on government deposit insurance, which was designed to protect individual depositors, big institutional depositors had evolved a system employing collateral – the contracts known as sale and repurchase agreements, or repo – to protect the money they had lent to other firms. And it was the run on repo that threatened to melt down the global financial system. Bernanke told the Financial Crisis Inquiry Commission:

As a scholar of the Great Depression, I honestly believe that September and October of 2008 was the worst financial crisis in global history, including the Great Depression. If you look at the firms that came under pressure in that period… only one… was not at serious risk of failure…. So out of the thirteen, thirteen of the most important financial institutions in the United States, twelve were at risk of failure within a week or two.

Had those firms begun to spiral into bankruptcy, we would have entered a decade substantially worse than the 1930s.

Instead, the emergency was understood immediately and staunched by the Fed in its traditional role of lender of last resort and by the Treasury Department under the authority Congress granted in the form of the Troubled Asset Relief Program (though the latter aid required some confusing sleight of hand to be put to work).

By the end of the first full week of October, when central bankers and finance ministers meeting in Washington issued a communique declaring that no systemically important institution would be allowed to fail, the rescue was more or less complete.

Only in November and December did the best economics departments begin to piece together what had happened.

When Barack Obama was elected, he had every reason to exaggerate the difficulty he faced – beginning with quickly glossing over his predecessor’s success in dealing with the crisis in favor of dwelling on his earlier miscalculations. It’s in the nature of politics, after all, to blame the guy who went before; that’s how you get elected. Political narrative divides the world into convenient four- and eight-year segments and assumes the world begins anew with each.

So when in September Obama hired Lawrence Summers, of Harvard University, to be his principal economic strategist, squeezing out the group that had counselled him during most of the campaign, principally Austan Goolsbee, of the University of Chicago, he implicitly embraced the political narrative and cast aside the economic chronicle. The Clinton administration, in which Summers had served for eight years, eventually as Treasury secretary, thereafter would be cast in the best possible light; the Bush administration in the worst; and key economic events, such as the financial deregulation that accelerated under Clinton and the effective response to the panic that took place under Bush, were subordinated to the crisis at hand, which had to do with restoring confidence.

The deep recession and the weakened banking system that Obama and his team inherited was serious business. At the beginning of 2008, Bush chief economist Edward Lazear had forecast that unemployment wouldn’t rise above 5 percent in a mild recession. It hit 6.6 percent on the eve of the election, its highest level in 14 years. By then panic had all but halted global order-taking for a hair-raising month or two, as industrial companies waited for assurance that the banking system would not collapse.

Thus, having spent most of 2008 in a mild recession, shedding around 200,000 jobs a month, the economy started hemorrhaging seriously in September, losing 700,000 jobs a month in the fourth quarter of 2008 and the first quarter of 2009. After Obama’s inauguration, attention turned to stimulus and the contentious debate over the American Recovery and Reinvestment Act. Summers’s team proposed an $800 billion stimulus and predicted that it would limit unemployment to 8 percent. Instead, joblessness topped out at 10.1 percent in October 2009. But at least the recovery began in June of that year.

What might have been different if Obama had chosen to tell a different story? To simply say what had happened in the months before he took office?

Had the administration settled on a narrative of the panic and its ill effects, and compared it to the panic of 1907, the subsequent story might have been very different. In 1907, a single man, J.P. Morgan, was able to organize his fellow financiers to take a series of steps, including limiting withdrawals, after the panic spread around the country, though not soon enough to avoid turning a mild recession into a major depression that lasted more than a year. The experience led, after five years of study and lobbying, to the creation of the Federal Reserve System.

If Obama had given the Fed credit for its performance in 2008, and stressed the bipartisan leadership that quickly emerged in the emergency, the emphasis on cooperation might have continued. If he had lobbied for “compensatory spending” (the term preferred in Chicago) instead of “stimulus,” the congressional debate might have been less acrimonious. And had he acknowledged the wholly unexpected nature of the threat that had been turned aside, instead of asserting a degree of mastery of the situation that his advisers did not possess, his administration might have gained more patience from the electorate in the congressional elections of 2010. Instead, the administration settled on the metaphor of the Great Depression and invited comparisons to the New Deal at every turn – except for one. Unlike Franklin Delano Roosevelt, Obama made no memorable speeches explaining events as he went along.

Not long after he left the White House, Summers explained his thinking in a conversation with Martin Wolf, of the Financial Times, before a meeting of the Institute for New Economic Thinking at Bretton Woods, N.H. He described the economic doctrines he had found useful in seeking to restore broad-based economic growth, in saving the auto companies from bankruptcy and in considering the possibility of restructuring the banks (the government owned substantial positions in several of them through TARP when Obama took over). But there was no discussion of the nature of the shock the economy had received the autumn before he took office, and though he prominently mentioned Walter Bagehot, Hyman Minsky and Charles P. Kindleberger, all classic scholars of bank runs, the word panic never came up.

On the other hand, the parallel to the Panic of 1907 surfaced last month in a pointed speech by Bernanke himself to a research conference of the International Monetary Fund. The two crises shared many aspects, Bernanke noted: a weakening economy, an identifiable trigger, recent changes in the banking system that were little understood and still less well regulated, and sharp declines in interbank lending as a cascade of asset “fire sales” began. And the same tools that the Fed employed to combat the crisis in 2008 were those that Morgan had wielded in some degree a hundred years before – generous lending to troubled banks (liquidity provision, in banker-speak), balance-sheet strengthening (TARP aid), and public disclosure of the condition of financial firms (stress tests). But Bernanke was once again eclipsed by Summers, who on the same program praised the Fed’s depression-prevention but announced that he had become concerned with “secular stagnation.”

The best what-the-profession-thinks post-mortem we have as yet is the result of a day-long conference last summer at the National Bureau of Economic Research. The conference observed the hundredth anniversary of the founding of the Fed. An all-star cast turned out, including former Fed chairman Paul Volcker and Bernanke (though neither historian of the Fed Allan Meltzer, of Carnegie Mellon University, nor Fed critic John Taylor, of Stanford University, was invited). Gorton, of Yale, with Andrew Metrick, also of Yale, wrote on the Fed as regulator and lender of last resort. Julio Rotemberg, of Harvard Business School, wrote on the goals of monetary policy. Ricardo Reis, of Columbia University, wrote on central bank independence. It is not clear who made the decision to close the meeting, but the press was excluded from this remarkable event. The papers appear in the current issue of the Journal of Economic Perspectives.

It won’t be easy to tone down the extreme political partisanship of the years between 1992 and 2009 in order to provide a more persuasive narrative of the crisis and its implications for the future – for instance, to get people to understand that George W. Bush was one of the heroes of the crisis. Despite the cavalier behavior of the first six years of his presidency, his last two years in office were pretty good – especially the appointments of Bernanke and Treasury Secretary Henry Paulson. Bush clearly shares credit with Obama for a splendid instance of cooperation in the autumn of 2008. (Bush, Obama and John McCain met in the White House on Sept. 25, at McCain’s insistence, in the interval before the House of Representatives relented and agreed to pass the TARP bill. Obama dominated the conversation, Bush was impressed, and, by most accounts, McCain made a fool of himself.)

The fifth-anniversary retrospectives that appeared in the press in September were disappointing. Only Bloomberg BusinessWeek made a start, with its documentary “Hank,” referring to Paulson. The better story, however, should be called “Ben.” Perhaps the next station on the way to a better understanding will be Timothy Geithner’s book, written with Michael Grunwald, of Time magazine, currently scheduled to appear in May. There is a long way to go before this story enters the history books and the economics texts.

David Warsh is proprietor of www.economicprincipals.com, an economic historian and a long-time financial journalist. He was also a long-ago colleague of Robert Whitcomb.
