Before Margaret Thatcher came to power in 1979, Britain was in trouble and headed for worse. The story was told on radio news every morning. Along with the weather and the traffic reports, there was daily a list of trouble spots of a different sort: industrial action.
Industrial action was the euphemism of the time for strikes; most of them unofficial, all of them debilitating. The national mood was sour, the economy perilous, and Britain’s international competitiveness was slipping fast. Commentators around the world talked about “the English disease.”
Thatcher’s challenge was to curb the unions; but before she could do that, she had to convince a doubting nation that the unions could become, or be made, responsible. Over the years, the unions had amassed quite extraordinary power that reached into the lives of people who had never thought they were affected by unions.
Union excess was everywhere, but because the British believed in the importance of unions, their strengths and excesses were taken as the necessary price for the fundamental right of collective bargaining.
The Labor Party derived much of its support and financing from the union movement. They were structurally entwined: The unions represented the core, or the “base,” of the party. Unfortunately for Labor, the base was toxic and threatened the health of the economy and, as the election of 1979 showed, the electability of the party.
Thatcher, though hard to love, did three enormous things for Britain. She restored the primacy of the free market, curbed union excess and, ironically, saved the Labor Party. Thatcher’s changes made it possible for what was to be called New Labor to modify its relations with its trade union base. The politicians got back the politics, which had been progressively assumed by union bosses of the base.
The British experience is redolent with lessons for the Republican Party. The “base,” represented by aggressive broadcasters like Sean Hannity, Rush Limbaugh and Laura Ingraham, is goading the party in Congress to adopt positions that satisfy them, but not the electorate.
Building on the new reality created by Thatcher’s Conservatives, Tony Blair and his political brain, Peter Mandelson, were able to discipline or silence the trade unions in the Labor Party and present an alternative to the Conservatives that could plunder the best ideas of the right. When nobody was looking, Blair must have thanked God for Thatcher.
The agony of the Republicans is clearly on display with the nomination of Sonia Sotomayor to the Supreme Court: To oppose her blindly is to kiss off millions of Hispanic voters, maybe for generations. The party clearly had no strategy to deal with a candidate like Sotomayor. None.
The far right came out with, well, with an old argument: She is a liberal activist. Not much evidence of that, but the conservative talk-show hosts were ready for war. The last war. Or the one before that.
More damaging to serious Republicans has been the conversion, almost entirely on Fox, of respected Republican philosophers into political vaudevillians. Enter, center stage, Newt Gingrich, Mike Huckabee and Karl Rove. Their collective TV antics are damaging to the movement they once led.
A lot of good thinking about the future of the Republican Party is taking place in the think tanks, particularly the American Enterprise Institute and the Heritage Foundation. But the solid work of restructuring the party for the new realities at home and abroad is drowned out by the eponymous broadcast wing of the party.
It is hard to believe that Newt Gingrich, broadcaster, is the same Newt Gingrich who masterminded the 1994 Republican midterm sweep. Or that Karl Rove was the genius who saw that George W. Bush could be presented as a convincing presidential candidate.
Absent any possibility of reform of the Republican base from the outside, in the Thatcher way, it has to come from the inside. Several astute conservative writers, like David Frum and Mickey Edwards, have lighted a path. A first step down that path could be a more even-handed examination of President Obama’s Supreme Court picks. He could have as many as four of them in his first term. Clearly he has an eye to the electorate, as much as to jurisprudence, if Sotomayor is a harbinger.
Thatcher built herself an entirely new base. Blair dismantled an old one. The Republicans need to examine both.
Oh, to be a federal judge, a lifetime judge. Ah, to interpret the U.S. Constitution for lawmakers or to have the unbridled joy of deciding what Congress meant without being able to ask it.
It is delightful to be on the federal bench and divine to be on the Supreme Court, where you can play mind games and search for writers’ clues hidden in the 222-year-old Constitution. It is the treasure hunt that never ends. More: You can look at plain words–such as those in the Second Amendment–and, depending on your personal interests, opine on what they mean with two radically different interpretations.
You can also stir things up by interpreting what your predecessors had already interpreted. Stare decisis et non quieta movere (settled law)? That is just what they tell the kids in law school. The pranksters on the highest court in the land will have none of it. Hence, Roe v. Wade hangs in the balance all the time. No stare decisis there.
Can anything be as much fun as deciding what a group of, albeit exceptional and erudite, 18th-century white men thought about the Internet? Talk about trivial pursuit. But it is not trivial; it can reshape the country. As each term approaches, fancy contemplating how much fun it would be to rearrange history by persuading just four of your fellow justices.
But that is not all. Working conditions are pretty nice. You cannot be fired. The pay is good. There is no mandatory retirement. All heavy lifting, from your suitcase to a weighty opinion, can be delegated to those too-eager clerks. The little buggers plan to make millions on the strength of clerking for you. Make them work for it, whether it is picking up your laundry or redefining the rights of the press.
But that is still not all. As an added bonus, the evidence suggests you will live a long time. After all, there is no strain. You are treated with unctuous deference. Even if you are so gaga you cannot tell one colleague from another, a thousand law schools will hang on your ramblings. Clerks will write opinions for you based on what they think you said. Deferential colleagues will try to side with you, even if they think you’re full of it.
And do not forget the sheer exhilaration of writing a minority opinion. You can really let off steam in those. It is the next best thing to talk radio for venting, and it has a much greater impact. Just savor the shock on your colleagues’ faces when you turn against them and, quoting your smartest clerk, you tell them what the Founders meant.
If you have enjoyed the reality television show “Survivor,” that is what life on the court can be like–with the additional pleasure that you cannot be voted off, canceled or bitten by a venomous snake.
The greatest pleasure of all, though, is to go against the constituency that endorsed you. From Earl Warren to David Souter, this fun has been intense. Appointing a justice is a crapshoot: Like Henry II appointing Thomas Becket to be Archbishop of Canterbury, high perfidy is possible.
In recent years, things have been changing for the justices: more women and minorities have joined the all-white-male rumpus room. This change to a representative court brings up issues we should be informed about. Have the Great Ones had to clean up their language, or put down the seat on the highest legal throne in the land? Does Clarence Thomas speak without being spoken to? Does Antonin Scalia smirk in private as well as in public? And does John Paul Stevens remember when he was born?
However democratic we try to be, when presidents nominate someone for the highest court in the land, they create a demigod, beyond the reach of politicians and their jackals, journalists. They enter their own Pantheon, as sanctified and superior as the gods of ancient Rome who were given, like the Supreme Court, a marble temple, courtesy of the Emperor Hadrian.
With so much talk of infrastructure renewal, a case needs to be made for a few new toys for grownups of the kind that enliven London today, and once enlivened cities and nations.
Time was, when you wanted to get your city spruced up, you held a world’s fair. All through the 19th century and well into the 20th century, the legacy of world’s fairs was that they left permanent attractions for the public to enjoy long after the gates had closed.
London’s fair of 1851 left behind the glorious Crystal Palace, which sadly burned down in 1936. But the seed was sown for the two legacies that outlasted all the other world’s fairs: Gustave Eiffel’s tower for the Paris Exhibition of 1889 and George Ferris’s wheel for the 1893 World’s Columbian Exposition in Chicago. Both men were great bridge builders and enormously gifted engineers.
Eiffel, who originally wanted to build his tower for an exhibition in Spain but was rejected, faced limitless criticism. Architects, authors, journalists and poets formed a common front against the tower. They said it would destroy the beauty of Paris; it was ugly and dangerous; and, of course, it was too expensive.
Supposedly the writer Guy de Maupassant ate his lunch in the tower every day after it went up, so that he did not have to look at the “high and skinny pyramid of iron ladders.” Eiffel built himself an apartment at the top of the tower, and threw lavish parties there.
Today the Eiffel Tower is the symbol of Paris, and the most popular tourist attraction in the world.
When it became clear that the organizers of the Chicago exhibition were having trouble in coming up with a spectacular structure of their own, Eiffel, whose ego matched the height of the tower, offered to help them out. But the planners could not face the humiliation of bringing in a Frenchman to save the day.
Luckily for them Ferris, who was attending an engineering meeting where the lack of a project was lamented, sketched a passenger wheel on a napkin and the day was saved. Ferris’s original wheel did not survive, but countless Ferris wheels have enhanced public entertainment ever since.
The London Eye, which opened on the South Bank of the Thames River in 2000, as part of the millennium celebrations (it is also known as the Millennium Wheel), is the most popular tourist attraction in Britain. Take another bow, George Ferris.
The Eye, designed by David Marks and Julia Barfield, a husband and wife architect team, was briefly the largest passenger wheel in the world. But Singapore and the eastern Chinese city of Nanchang rushed to build bigger wheels. However, the Eye is the largest cantilevered wheel–which means, like a windmill, it is supported only on one side–and this is what makes it so elegant.
World’s fairs are a thing of the past in the age of television, and the fact is their legacy has not always been as great as the legacies from Chicago and Paris. The 1964 World’s Fair left behind nothing special in Flushing Meadows, N.Y. Its Unisphere still stands, but it is not a big attraction. Likewise, nothing spectacular remains from the 1967 world’s fair in Montreal. And the Space Needle in Seattle is a local rather than a national attraction.
The message is that people want beauty, but also participation; a wheel to ride on, a tower to ascend.
When it comes to toys for millions, London stands front and center–and is even a little egocentric. Those buses! Those taxis! Where else? Recent additions to the public amusements of London, besides the Eye, include the footbridge over the Thames River, dubbed the Wobbly Bridge; the Tate Modern art gallery in the old Bankside power station; new subways; and a revived St. Pancras railway station, which is even grander than it was in its Victorian heyday.
Not all of London’s attractions required public money. The Eye was largely funded by British Airways and is operated by the people who run Madame Tussauds.
Once, London ruled much of the world. Now, it beckons it. In America, we are losing the race for public fun–and profit. –For Hearst-New York Times Syndicate
Corruption in Kenya? Blame it on the British and the psychological damage of colonialism. The partition of Cyprus? Step forward the social engineers in London, who underestimated the depth of feeling in the Turkish minority when the British were finally forced out.
When it comes to the Middle East, one can really get exercised about “Perfidious Albion.” The British had their fingers in every territorial dispute: They created whole countries and, with the help of the French, imposed borders from Morocco to China.
Trouble with Iran? Even before the CIA started meddling there in 1953, it was Winston Churchill who, as First Lord of the Admiralty in 1913, decided the Royal Navy would move faster, cleaner and have greater range if it switched from coal to oil. So he partially nationalized the Anglo-Persian Oil Company, the forerunner of BP, to exploit the newly discovered oil fields in Iran. Later, this led to a surge in Iranian nationalism and the CIA plot to restore the Shah.
On to Pakistan and the British legacy in the autonomous tribal lands, now home to the Taliban and al-Qaeda. Put the British colonial administration of the 18th to 20th centuries in the dock. Yes, three centuries of British commission and omission.
The British interest in Afghanistan, which they failed to subdue in a series of wars, was largely as a buffer between British India and the growing territorial interests of the Russian Empire. It was here that The Great Game was played: the romanticized espionage that flourished in the region. The British divided the traditional Pashtun lands with the Durand Treaty of 1893, creating a northwestern border for British India, and later Pakistan. It amounted to a land grab. However, the British did recognize the separateness of the people in the northwest frontier territories and left them to their tribal and religious ways.
With independence and the partition of India in 1947, the incoming Pakistani government had enough problems without encouraging ethnic strife between the largely Punjabi Pakistanis and their difficult Pashtun brothers in the territories. So the government in Islamabad continued the British policy of benign indifference to the Pashtuns, with whom they were more closely linked by religion than ethnicity or politics.
Yet, the border dispute smoldered and periodically erupted. Kabul and Islamabad do not agree, both blaming the border drawn by the British.
What neither the British nor the Pakistanis wanted was a strong movement for a Pashtun state that would carve out territory from Afghanistan, as well as the tribal territories in Pakistan. There was a failed attempt to bring this about in 1949. Segments of the Pakistani army and the intelligentsia have feared this ever since. They are haunted by another stateless people living on both sides of a border: the Kurds, who straddle the borders of Iraq, a largely British creation, Turkey and Iran.
The message is that simply being Muslim does not wipe out tribal and ethnic identity any more than borders drawn by others create a new identity. If it were so, Cyprus would not be divided; Yugoslavia would have held together, as would have Czechoslovakia; and Britain would not be considering the possibility of an independent Scotland–after 300 years of union.
The current hostilities in the Pakistani tribal areas, U.S. drone strikes on suspected Taliban strongholds and renewed determination from the Pakistani army to crush extremists in the region could renew a sense of nationhood among the Pashtuns, and a movement toward the creation of Pashtunistan across the British-drawn border between Pakistan and Afghanistan.
In the long reaches of the night, President Obama’s special envoy to the region, Richard Holbrooke, may wish one of the following had happened in the days of the British Raj: 1. the British had stayed home; 2. the British had insisted the Pashtuns submit to central authority; 3. the British had created a new country, Pashtunistan; or 4. the British had never created that troublesome border.
One way or the other, he can blame the Brits.