The Century-Long History of Tapping Wall Street to Run the Government
Looking to the one-percent to lead the country goes back to the era of World War I
From our earliest days we Americans have embraced leaders from among the ranks of the nation’s moneyed elite. Voters set the tone when they chose George Washington, the wealthiest man on the continent at the time, as the first president.
But that choice was accompanied by a healthy skepticism of the role of money in the halls of government. As the years went by, recurrent scandals prompted rounds of reform, fostering an intricate system of rules to promote ethical conduct.
The result is a daunting interface between private and public life, the line marked by financial investigation, disclosure and divestiture. Still, from the early 20th century, U.S. presidents began to routinely call on leaders from business and industry to head key agencies of the government. And despite nagging public suspicion, the moguls drafted into service remained largely free of accusations—let alone outright findings—of corruption or misconduct.
Keep in mind, the sort of corruption threatened by the rich and powerful is quite distinct from the more garden-variety graft usually associated with public officials—bribery, principally, or undue allegiance to one political party or another. Such concerns were addressed in the late 19th century by the institution of the civil service, when federal employees were for the first time subjected to entrance exams and protected from political removal. It marked the advent of a new kind of entity: the career civil servant.
Reckoning with the threat posed by wealthy appointees—that they might place their private interests ahead of the public’s, using their positions to help their friends or augment their fortunes—came later, and required more elaborate safeguards.
It was America’s entry into the First World War, and the attendant task of retooling the nation’s industrial economy for wartime production, that brought a surge of business executives into the government. Drafted by President Woodrow Wilson, starting in 1917, they signed on for service in new government bureaus at the nominal salary of a dollar a year.
First among these wartime stalwarts was Bernard Baruch, a financier and speculator known in his day as “the lone wolf of Wall Street.” Appointed head of the new War Industries Board, Baruch recruited a bevy of his tycoon chums, and together they put the peacetime economy on a footing to produce uniforms, tanks and ammunition.
Another Wilson appointee was Herbert Hoover. A mining executive then based in London, Hoover emerged on the public stage by leading humanitarian war relief efforts for neutral Belgium. Calling Hoover back to the U.S., Wilson named him Food Administrator, and charged him with limiting domestic consumption and keeping the U.S. Army and its allies fed in the field.
Both of these men—and the dozens of other businessmen drafted to assist them—performed capably. Though they served at the height of the Progressive Era, with its wary view of wealth, the American public accepted their appointments as legitimate, without audible objection.
Skip forward a decade, to 1929, and wealthy office-holders had become a routine feature of the federal government. More than that, it was a non-partisan phenomenon. Bernard Baruch had become a leading figure and chief fundraiser of the Democratic Party, while Hoover, after a brief dalliance with the Democrats, won the presidency as a Republican. When Hoover became president, he decided to continue the dollar-a-year tradition, donating his salary to charity.
During Hoover’s tenure the crisis was not war but the Great Depression, and he again turned to men of wealth. One of Hoover’s principal innovations was to launch the Reconstruction Finance Corporation, which would channel bailout funds to foundering banks and railroads. Selected to lead the new agency was Charles Dawes, a Chicago banker with a history of moonlighting for the government: he had served as Comptroller of the Currency under President William McKinley and as the nation’s first director of the Bureau of the Budget, and was later elected vice president with Calvin Coolidge. In 1925 he was awarded the Nobel Peace Prize in recognition of his adroit management of postwar international debts.
Dawes immersed himself in launching the RFC until the bank owned by his family, the Central Republic Bank of Chicago, began to founder. Despite Hoover’s protest, in June 1932 Dawes resigned his post and rushed home to wrestle with panicked depositors. Soon after, over Dawes’ own private protest (he rightly feared political blowback), Central Republic was named recipient of the largest loan the RFC had yet issued. Though the bank ultimately closed, the bailout made for an orderly transition and the loans were repaid. But public resentment over what appeared to be an in-house deal damaged the reputation of Hoover and of the relief agency.
Here was just the sort of misconduct that critics had feared from the outset—men of wealth protecting their personal interests. But the election of Franklin Delano Roosevelt later that year seemed to clear the air.
Roosevelt was more sparing in his reliance on the men of industry and finance—and yes, all were men—but rely on them he did, especially when faced with a new world war. As the crisis loomed, Roosevelt, like President Wilson before him, called on the dollar-a-year crowd. Leading this troop of civilians was Bill Knudsen, then president of General Motors. An expert in mass production, Knudsen was appointed in 1940 to the National Defense Advisory Commission and soon after named to head the new Office of Production Management, at a salary of $1 a year.
As production ramped up, Knudsen brought with him executives from car companies, AT&T, and U.S. Steel. New Deal bureaucrats and labor activists denounced the appointments, but despite all the procurement contracts, all the millions spent, there was hardly a whiff of scandal.
By 1942, when Knudsen was awarded a formal commission as Lieutenant General in the Army, the worst his critics could say was that he had been too slow in converting from peacetime industrial production to a war footing. “We are beginning to pay a heavy price for leaving the mobilization of industry in the hands of business men,” The Nation warned in 1942. Steel makers, in particular, were fighting expanded production “as a menace to monopolistic practices and ‘stable prices,’” argued an editorial. It was “Dollar-a-Year Sabotage,” The New Republic headlined.
But those criticisms were drowned out by the din of factory production, the great outpouring of armament from what Knudsen called the “arsenal of democracy” that carried the Allies to victory. “We won because we smothered the enemy in an avalanche of production,” Knudsen remarked later. For all the fears of conflicts of interest, the businessmen had proved their worth.
The dollar-a-year appointment routine went out with World War II, but presidents continued to tap the moneyed elite for advice and expertise, a practice that became the source of a growing thicket of regulations designed to forestall malfeasance. Roosevelt broke the first ground here, in 1937, with an order barring the purchase or sale of stock by government employees “for speculative purpose.” Later, his War Production Board required its dollar-a-year men to disclose financial holdings and undergo background checks.
From there, safeguards advanced by stages. John F. Kennedy, during his aspirational 1960 campaign, called for a new standard, by which “no officer or employee of the executive branch shall use his official position for financial profit or personal gain.” Upon his election, he followed up with an executive order barring any “use of public office for private gain,” and then lobbied Congress for parallel laws. The result was a set of new criminal statutes covering bribery and conflicts of interest.
Lyndon Johnson was never an exemplar of disinterested politics, but an early scandal in his administration, involving influence peddling by Johnson intimate Bobby Baker, a businessman and Democratic Party organizer, prompted a new round of rulemaking. Each federal agency should have its own ethics code, Johnson ordered, and all presidential appointees were now required to file financial disclosure statements. In the 1970s, the fallout from the Watergate scandal, together with the troubles of presidential chum and advisor Bert Lance, spurred further reform from President Jimmy Carter.
As with so many things, the status of ethics in an administration tends to reflect the character of the chief executive, regardless of the rules in place at the time. Consider the following exchange, in 1934, between Franklin Roosevelt, Joe Kennedy, and presidential aide Ray Moley, prior to Kennedy’s appointment to head the new Securities and Exchange Commission (SEC).
As recounted by Joe Kennedy biographer David Nasaw, Kennedy warned Roosevelt that he had “done plenty of things that people could find fault with.” At that point, Moley interjected: “Joe, I know you want this job. But if there is anything in your business career that could injure the president, this is the time to spill it.”
Kennedy’s reaction was quick and sharp. “With a burst of profanity he defied anyone to question his devotion to public interest or to point to a single shady act in his whole life. The president did not need to worry about that, he said. What was more, he would give his critics—and here again the profanity flowed freely—an administration of the SEC that would be a credit to his country, the president, himself and his family.”
After an exchange like that, codes and rules might seem superfluous. To outsiders, the Kennedy appointment appeared rash; “setting a wolf to guard a flock of sheep,” one critic charged. But Roosevelt was unfazed. Asked why he’d named such a notorious crook as Kennedy, Roosevelt quipped, “Takes one to catch one.” In the event, while nobody ever proposed Joe Kennedy for sainthood, he was never accused of misconduct or self-dealing while presiding at the SEC.
Charles Rappleye is a former news editor at the LA Weekly and the author of four books. His latest, Herbert Hoover in the White House, was published by Simon & Schuster in 2016.