Football Will Perish from the Earth

By 2050, the National Football League (NFL) will be like the Barnum and Bailey Circus of today: bankrupt, closed, irrelevant, morally passé.

In the early 20th century, the circus was all the rage. After a century of consumption by a culture increasingly sensitive to the abuse of the weak and helpless—in this case, circus animals—the “Greatest Show on Earth” has been relegated to an empty sideshow. It is simply too brutish for sophisticated moderns who wince at the crack of a whip on an elephant’s rump.

Football as Bloodsport

Football will soon follow the circus. Its massive, billion-dollar stadiums and marketing machines seem immortal for now. But these titanic playpens will soon crumble under the same cultural force that killed the circus: our culture’s growing concern for victims.

I am not judging football’s coming demise as a good or bad thing. I see it as simply a symptom of larger social forces that we should understand.

The parallels of football and Roman gladiatorial games have been noted before. In the Colosseum, the Roman emperor would make a grand procession into the arena to the standing ovation of the assembled masses. Today, our U.S. Defense Department-sponsored games begin with the procession of the American flag and anthem, often accompanied by dramatic aerial flyovers by jet fighters and fireworks, symbolizing the transcendent might and grandeur of America’s military conquests. So too, the Roman games often reenacted the empire’s greatest battles.

The latest controversy involves whether football players should stand united in honor of the flag. The sacredness of the flag rests in its long-standing ability to unify even enemies, like the rivals the opposing teams play at being. Like any symbol, the flag serves as a vessel into which people pour powerful emotions: memories of grandpa’s military service, apple pie, cookouts, and neighborly support for one another are all wrapped in its colors.

Above all, the flag represents the unifying power of sacrifice. We are united as one collective family in our reverence for the flag and anthem. The flag is sacred because it represents, as its loudest defenders proclaim, the blood shed by soldiers fighting for our freedoms.

Interestingly, gladiatorial games began as sacrificial offerings accompanying funerals. It was thought that the blood spilled by slaves and captives honored dead state leaders and bound the crowd in transcendent unity. With every pitiful animal howl and human cry, citizens felt swept up as one body in collective satisfaction and relief from mundane rivalries and resentments.

Football as Distraction

Today, governments like to take the suffering and courage of our sons and daughters who enlist and turn it into a marketing ploy for why we all need government coercion controlling our lives—who we hire, what we pay them, permission to cut hair, how big our sodas can be, how much we cook our milk, which drugs we can use to alter our minds, and so on. Governments also like to transmute the goosebumps we feel when the anthem plays into support for a trillion-dollar annual foreign policy paid for by debt created out of thin air and backed by the OPEC oil cartel’s energy markets.

At sporting events, our government captures the nostalgia we feel for neighborhood friendship and family pastimes, associates it with the anthem and flag, and then converts it into passive, numb surrender to perpetual warfare. Even while the nation divides over whether players should kneel or stand for the flag, our government continues to expand its military footprint overseas and drop more bombs, all in our name.

But the state, in collusion with powerful corporate allies, uses spectacles like football to distract and pacify the people. Instead of the violent slaughter of the Roman games, our Christianized culture sends players into simulated, padded warfare. We pick teams to unite under and forget about the state’s social and economic abuses just outside our doorsteps. Studies even suggest that violent crime drops during major televised sporting events.

But now, Trump and his liberal mirror rivals have pierced the veil by injecting the NFL with the profanity of politics: the realm where real factions use the real violence of the state to punish their rivals through regulations, mandates, and taxes. When Trump said “fire them” about the protesting players, he invoked the specter of both the penal and the paternal sides of government, forcing people to take sides not over the gridiron but across the water cooler and the dinner table. It did the game no favors.

Eventually, it took a monk named Telemachus challenging the violent sacrifice of the Roman gladiatorial games to end their carnage. He climbed into the arena and protested until he was summarily slaughtered. His self-sacrifice in defense of victims led to the public’s loss of appetite for the violence. The last known Roman gladiatorial event was held in 404 AD, less than two decades after Telemachus’s death.

Today, myriad scandals serve as a persistent Telemachus threatening to bring the NFL down. Mothers and fathers all around the country are pulling their sons out of football amid mounting revelations of concussions and the resulting brain damage caused by the sport. Whereas Roman citizens demanded their fighters stripped of armor to maximize carnage, ever-increasing padding will end up making players look like Michelin Men with bobblehead-sized helmets.

In Rome, no one cared how gladiators treated their lovers. Today, growing public disgust with widespread reports of spousal abuse is souring the NFL’s mystique.

In college, the NCAA’s state-protected profiteering off of unpaid players’ physical sacrifice is increasingly criticized as well.

Meanwhile, diehard fans once thrilled by simulated violence are losing interest amid ever-tightening penalty rules and concussion concerns. The suspension of disbelief required to enjoy the game is waning: talk of brain damage, a flag no longer able to unify people around soldiers’ sacrificial deaths, spousal abuse, and racial undertones are all exposing football as just a silly game to appease desires for tribalism and aggression—and to make fat cat owners fatter. Not worth all the drama.

We should be proud that we no longer send hungry lions into arenas with naked prisoners. We have made progress because Christianity leavened the collective’s age-old abuse of the misfit. Yet absent such gladiatorial games, our culture must confront its own sacrifices of the innocent and nonviolent to appease our love for aggression as the means of keeping peace.

Reprinted from The American Conservative.

This article was originally published on FEE.org.

Schooling Was for the Industrial Era, Unschooling Is for the Future

Our current compulsory schooling model was created at the dawn of the Industrial Age. As factories replaced farm work and production moved swiftly outside of homes and into the larger marketplace, 19th century American schooling mirrored the factories that most students would ultimately join.

The bells and buzzers signaling when students could come and go, the tedium of the work, the straight lines and emphasis on conformity and compliance, the rows of young people sitting passively at desks while obeying their teachers, the teachers obeying the principal, and so on—all of this was designed for factory-style efficiency and order.

The Imagination Age

The trouble is that we have left the Industrial Era for the Imagination Age, but our mass education system remains fully entrenched in factory-style schooling. By many accounts, mass schooling has become even more restrictive than it was a century ago, consuming more of childhood and adolescence than at any time in our history. The first compulsory schooling statute, passed in Massachusetts in 1852, required 8- to 14-year-olds to attend school a mere 12 weeks a year, six of which were to be consecutive. This seems almost laughable compared to the childhood behemoth that mass schooling has now become.

Enclosing children in increasingly restrictive schooling environments for most of their formative years and drilling them with a standardized, test-driven curriculum is woefully inadequate preparation for the Imagination Age. In her book Now You See It, Cathy Davidson says that 65 percent of children now entering elementary school will work at jobs in the future that have not yet been invented. She writes: “In this time of massive change, we’re giving our kids the tests and lesson plans designed for their great-great-grandparents.”

While the past belonged to assembly line workers, the future belongs to creative thinkers, experimental doers, and inventive makers. The past relied on passivity; the future will be built on passion. In a recent article on the future of work, author and strategist John Hagel III writes about the need to nurture passion to be successful and fulfilled in the jobs to come. He says:

One of my key messages to individuals in this changing world is to find your passion and integrate your passion with your work. One of the challenges today is that most people are products of the schools and society we’ve had, which encourage you to go to work to get a paycheck, and if it pays well, that’s a good job, versus encouraging you to find your passion and find a way to make a living from it.

Passion-Driven Learning

Cultivating passion is nearly impossible within a coercive schooling structure that values conformity over creativity and compliance over exuberance. This could help explain why the unschooling, or Self-Directed Education, movement is taking off, with more parents migrating from a schooling model of education for their children to a learning one. With Self-Directed Education, passion is at the center of all learning. Young people follow their interests and pursue their passions, while adults act as facilitators, connecting children and teens to the vast resources of both real and digital communities. In this model, learning is natural, non-coercive, and designed to be directed by the individual herself, rather than by someone else.

Self-Directed Education and unschooling often take place in homes and throughout communities, but increasingly individuals and organizations are launching self-directed learning centers geared toward homeschoolers with both full- and part-time options. These centers make Self-Directed Education more accessible to more families in more places, and each has a unique philosophy or focus. Some are geared toward teens and value real-world apprenticeships and immersion; others are makerspaces that emphasize tinkering and technology, and so on. In Boston, for instance, the JP Green School in the city’s Jamaica Plain neighborhood serves as a part-time self-directed learning space for homeschoolers and unschoolers with a focus on sustainability and nature connection. Co-founder Andrée Zaleska says:

People educated in coercive models will be damaged for life (most of us are). The lack of respect shown to their autonomous selves as children translates into a lifelong tendency to “get what they need” by any means necessary…We are part of a growing counterculture which finds traditional schooling damaging in ways that are intertwined with the general brokenness of our culture.

Instead of complaining about the education status quo, entrepreneurial individuals are building alternatives to school that challenge it. Centered around passion and an overarching belief in individual self-determination, these entrepreneurs — who are often parents, former school teachers, and others who have become disillusioned by coercive schooling — are freeing young people from an outdated and harmful mass schooling system. Enlightened parents and innovative entrepreneurs may be the key players in constructing a new education model focused on freedom and designed for the Imagination Age.

This article was originally published on FEE.org.

The US Rejected Obamacare in 1918

What a difference a mere hundred years makes! US voters rejected mandatory health insurance, an early version of Obamacare, at the turn of the last century. It took supporters almost another century, but they finally won.

For a quarter century before WWI, many of the nation’s young people went to Germany to complete their college education and returned determined to recreate the US in the image of socialist Germany. Richard Ely was one. He founded the American Economic Association for that sole purpose. He and economist Irving Fisher would lead the drive for universal, mandatory health insurance.

At the time, middle-class and wealthier Americans paid a fee each time they visited a doctor. But the fees were too high for the working poor, who instead organized into mutual aid societies to help each other with medical costs. Such mutual help societies, whether lodges such as the Elks, secret societies such as the International Order of Odd Fellows (IOOF) or the Freemasons, or simply fraternal organizations, had existed for centuries. They followed the ancient guild practice of mutual aid to craft members. David T. Beito beautifully recounts their history in his book From Mutual Aid to the Welfare State: Fraternal Societies and Social Services, 1890-1967, published by the University of North Carolina Press in 2000.

Socialists became wary of lodges, or fraternal societies, partly because of their secret passwords and handshakes. But the societies developed those for security purposes because they suffered from fraud by non-members wanting to cash in on the benefits. Two centuries ago an IOOF chapter in one state couldn’t easily contact another out-of-state chapter to confirm the membership of someone who wanted aid. The passwords and handshakes solved the problem.

In the earliest days, the lodges offered burial insurance because poor people were terrified of suffering the indignities of a pauper’s burial. Later, they added healthcare and life insurance, built orphanages and hospitals, and provided pensions. The Shriners branch of the Freemasons still maintains children’s hospitals. Without the lodges, most members could not have afforded fee-for-service doctors and would have gone without medical care. Readers who want to know how medical care should operate and what is wrong with today’s system should read Mr. Beito’s book.

Medical Establishment Attack on Mutual Aid

The medical establishment began attacking the lodges as early as the 1890s because the lodges would contract with doctors for a flat fee per member per year to provide medical care for lodge members. The practice, known as “capitation,” is making a comeback with the federal government as a means to restrain the explosive growth in the costs of medical care. Lodges usually contracted with doctors trained at private medical schools, which other doctors had set up to make up for the shortfall of new doctors from the state schools.

The American Medical Association (AMA) claimed that the lodges kept doctor pay too low, causing some to starve. So it launched public relations campaigns to stigmatize the lodge system and the doctors who served the working poor. It bribed politicians to shut down the medical schools it didn’t approve of, of course in the interest of “public health and safety,” in classic Baptists-and-bootleggers style, in order to create a shortage of doctors. It bribed hospitals to reject doctors who worked with lodges and convinced medical organizations to ostracize them. AMA doctors refused to work at lodge-owned hospitals, and the AMA worked tirelessly to shut those hospitals down. The AMA’s assault on “low pay” for its doctors finally worked.

Lodge practice was also a victim of an overall shrinkage in the supply of physicians due to a relentless campaign of professional “birth control” imposed by the medical societies. In 1910, for example, the United States had 164 doctors per 100,000 people, compared with only 125 in 1930. This shift occurred in great part because of increasingly tight state certification requirements. Fewer doctors not only translated into higher medical fees but also weaker bargaining power for lodges. Meanwhile, the number of medical schools plummeted from a high of 166 in 1904 to 81 in 1922. The hardest hit were the proprietary schools, a prime recruiting avenue for lodges.

When socialists and the AMA proposed mandatory health insurance for every citizen in the early 1900s, the lodges saw it as an attack on their system of self-reliance and mutual aid. Enough Americans shared the same values as the lodges that they defeated the proposals in two referenda. In 1918 the citizens of California voted three to one to reject mandatory health insurance. It failed again in New York in 1919.

Abandoning Traditional Values

But the times they were a-changing, and morality with them. Americans were rapidly abandoning traditional Christianity and its values of self-reliance and mutual aid. Of course, churches had always provided charity to the poorest, since the earliest days of Christianity recorded in the Book of Acts. But until the 1920s, Americans resisted accepting charity as much as they could out of a sense of honor. The lodges intended to help the working poor, not supplant charitable work. By the 1920s, Americans interpreted self-reliance as selfishness. As Beito wrote,

The traditional fraternal worldview was under attack. Age-old virtues such as mutual aid, character building, self-restraint, thrift, and self-help, once taken for granted, came under fire either as outmoded or as drastically in need of modification.

In 1918 Clarence W. Tabor used his textbook, Business of the Household, to warn that if savings “means stunted lives, that is, physical derelicts or mental incompetents…through enforced self-denial and the absence of bodily comforts, or the starving of mental cravings and the sacrifice of spiritual development – then the price of increased bank deposits is too high.” An earlier generation would have dismissed these statements. Now they were in the mainstream. Bruce Barton, the public relations pioneer and author of the best-selling life of Christ, The Man Nobody Knows, espoused the ideal of self-realization rather than self-reliance, declaring that “life is meant to live and enjoy as you go along…. If self-denial is necessary I’ll practice some of it when I’m old and not try to do all of it now. For who knows? I may never be old.”

J.M. Keynes echoed Barton in the 1930s with his famous line, “In the long run we are all dead,” and with his continual assault on the evils of the Protestant work ethic and savings. The ideal of “service” replaced that of self-reliance. By “service,” socialists meant that the wealthy should give to the poor. They helped remove the stigma of charity by convincing the poor that they shouldn’t be ashamed of receiving aid because the wealthy owed it to them.

The U.S. Became Increasingly Socialist

In addition to the efforts of the AMA to destroy the excellent system of healthcare insurance set up by the fraternal societies, the progress of socialism continued to erode the appeal of self-help. For example, the federal government gave favorable tax treatment to corporations that offered group insurance without extending it to individuals, while members of fraternal organizations received no tax deductions for their healthcare insurance.

Corporations then paid the premiums, so workers were fooled into thinking their insurance was free. Good economists understand that corporations merely deducted the premiums from future pay raises. The lodges argued that group insurance from an employer would enslave workers to a single company, because they would lose their insurance if they lost their job, whereas lodge insurance traveled with the individual. The lodges were right, as we have since found out.

The Great Depression weakened lodges, as the bulk of the 25% unemployment came from their ranks, the working poor. More assaults on mutual aid came with the passage of Social Security legislation, company pensions, and workers’ compensation insurance. Again, the government allowed corporations to deduct those expenses from their taxes without extending the privilege to individuals in fraternal organizations. Then came Medicare and Medicaid in the 1960s.

The book exposes the lie that socialists proposed their welfare measures because they saw a desperate need for them. Churches and charities had provided for the poor who couldn’t work since Biblical times, while the fraternal societies took care of the working poor very well. In 1924, 48% of working-class adult males were lodge members.

Socialists opposed the lodge system not because it failed; it hadn’t. They opposed it because they wanted the services provided by the state, as they were in Germany. They convinced the American people that socialism would not just help the poor, as the churches and fraternal organizations were doing, but would eliminate poverty. And as Helmut Schoeck warned us in Envy: A Theory of Social Behavior, the lust to destroy successful people served as fuel for the fire. Beito’s concluding paragraph is worth reprinting in full:

The shift from mutual aid and self-help to the welfare state has involved more than a simple bookkeeping transfer of service provision from one set of institutions to another. As many of the leaders of fraternal societies had feared, much was lost in an exchange that transcended monetary calculations. The old relationships of voluntary reciprocity and autonomy have slowly given way to paternalistic dependency. Instead of mutual aid, the dominant social welfare arrangements of Americans have increasingly become characterized by impersonal bureaucracies controlled by outsiders.

This article was originally published on FEE.org.

It’s Time to End the Paid Militarization of the NFL

Before 2009, Colin Kaepernick would have had to find some other way to protest racism against African Americans. That’s because until the height of the Iraq War, NFL football players weren’t even required to leave the locker room for the national anthem, much less stand for it.

That’s not to say that the national anthem wasn’t performed before every game. The singing of “The Star-Spangled Banner” dates to another war, World War II, when the NFL commissioner at the time mandated it for the league.

The players were told to stand for it about the same time that the Department of Defense was ramping up massive recruitment and media operations around the wars in Iraq and Afghanistan.

Paid Patriotism 

The Department of Defense began paying sports teams millions in U.S. tax dollars for what amounted to “paid patriotism,” or mega-military spectacles on the playing field before the games. It got so bad that there was a congressional investigation led by none other than Sen. John McCain (R-Ariz.), a veteran considered one of the most patriotic men in the Senate.

What McCain and Sen. Jeff Flake (R-Ariz.) found was that between 2012 and 2015, the DOD shelled out $53 million to professional sports—including $10 million to the NFL—on “marketing and advertising” for military recruitment. To be sure, some of that was bona fide advertising. But many of those heart-tugging ceremonies honoring heroes and recreating drills and marches and flyovers are what the report denounced as propaganda.

Of course, this being government, no one is really sure how much has been spent or where the money went. As their report revealed:

Over the course of the effort, we discovered the startling fact that DOD cannot accurately account for how many contracts it has awarded or how much has been spent; its official response to our request only accounted for 62 percent of its 122 contracts with the major league teams that we were able to uncover and 70 percent of the more than $10 million it actually spent on these contracts. And, although DOD has indicated the purpose of these contracts is to support recruiting, the Department doesn’t uniformly measure how and whether the activities under contract are actually contributing to recruiting.

Although the senators claimed to have ended such paid patriotism in the 2016 National Defense Authorization Act, the NFL’s willing role as a top cheerleader and recruiter for the warfare state is unlikely to end anytime soon (one need look no farther than Hyundai’s homage to the military in the 2017 Super Bowl for evidence).

In fact, the ban has not stopped the war liturgy from playing out on the gridiron at all. And yes, given when and how players were ordered to honor it, the national anthem has been used as a prop in this near-religious convocation.

In recent years, soldier parading, flag-waving, and jumbotron shout-outs to warriors have become de rigueur at NFL games. Consider the display put on at Super Bowl 50: a flyover by the Blue Angels, and 50 representatives of all military branches singing “America the Beautiful” against a backdrop of a giant flag. During the game, a Northrop Grumman advertisement proudly announced America’s conceptual sixth-generation fighter jet “of the future” to an unsuspecting audience, a year after it presented its new long-range bomber during Super Bowl XLIX.

How much that ad time cost the company is anyone’s guess, but it is no surprise that defense contractors are hawking their billion-dollar war wares between plays these days.

In truth, the post-WWII NFL has always been militaristic. As Colorado State University historian Robert Gudmestad explains:

Postwar affluence and the increase in white-collar jobs, when combined with concerns about the power of the Soviet Union, led many Americans to fear that men were too effeminate and weak. These anxieties created fertile soil for the growth of football, which became a way to affirm masculinity and fight the supposed “muscle gap.” If you didn’t embrace football—which seemed to embody Cold War ideas of containment—you might be suspected of deviant behavior like homosexuality or communism.

America’s Real Favorite Pastime

Little wonder that the NFL’s rise has tracked with the growth of the warfare state. During the 1950s, NFL football went from being just another sport to near dominance. It surpassed baseball as America’s most popular sport in 1968—the same year that Air Force jets put on a show for the Orange Bowl. (The fact that Nixon was elected on a supposedly antiwar platform in 1968, as one writer claims, hardly disproves the overall correlation.)

Feeling a need to defend football, Matthew Walther for The Week recently lavished unbridled praise on its all-encompassing superiority:

Football is a varied, engrossing, mentally and physically demanding pastime; it is tag, Risk, kickball, and the Commentarii de Bello Gallico all rolled into one. (For the non-Latinists, the Commentarii de Bello Gallico is Julius Caesar’s memoir of the Gallic Wars.)

But isn’t that the problem? “Football is a warlike game and we are now a warlike nation. Our love for football is a love, however self-aware, of ourselves as a fighting and (we hope) victorious people,” opines University of Virginia scholar Mark Edmundson. Slate has called NFL games “the American war game.” George Carlin compared quarterbacks to generals, their thrown footballs to bullets and bombs, and their teammates to advancing troops.

Such comic exaggeration points to a fundamental truth: football—with its obsessive territorialism, regimented hierarchy, and peculiar combination of strategic prowess with brute force—has always been at risk of militaristic co-option.

No, this isn’t an argument against football, nor the national anthem. But it is a plea that the NFL stop shilling for the warfare state and using Americans and their patriotism as unwilling—or even willing—participants. Kneeling or not kneeling, protesting or leaving one’s politics at home are valid points of debate. But the issue of militarized football appears too hot to touch, although it is clearly not going away.

Reprinted from The American Conservative. 

This article was originally published on FEE.org.

Criminalizing “Lying” Will Only Lead to Censorship

In a unanimous decision two years ago, the Massachusetts Supreme Judicial Court struck down a state law that had criminalized the making of “any false statement” in a political campaign. Such a law was plainly incompatible with fundamental free speech rights under the US Constitution and the Massachusetts Declaration of Rights, and the SJC said so. The proper response to a political falsehood isn’t to prosecute, but to counter the lie with the truth, the court held. In a democracy, it is up to voters to decide what is true and false in political rhetoric: “Citizenry, not government, should be the monitor of falseness in the political arena.”

That case was styled Commonwealth v. Melissa Lucas.

Colleen Garry ought to read it.

Exclamation Points

Garry, a Democrat, is a longtime state representative from Dracut who has come up with a brainstorm: She wants the Legislature to pass a law punishing anyone who spreads “any false information” in a political ad. How’s that for original thinking?

The bill Garry has introduced, H. 365, is as unenforceable as it is unconstitutional: “No political advertisement may contain any false information,” it decrees. “If a candidate or PAC is proven to have falsified or wrongly stated an opponent’s stand, vote, and/or background, the candidate or PAC shall forfeit all funds in their . . . accounts to the Commonwealth of Massachusetts General Fund.” The chief difference between Garry’s measure and the statute spiked by the Supreme Judicial Court is that hers is far more sloppily drafted. When the bill was taken up Wednesday by the Legislature’s Joint Committee on Election Laws, no one testified in its favor. Garry herself didn’t bother to show up.

What would prompt an experienced state rep to offer legislation so egregiously at odds with the First Amendment and the tradition of robust political argumentation? Pique, mostly. Garry has been stung by criticism from a conservative group, the Massachusetts Fiscal Alliance — criticism that she says has misrepresented her record. This bill is her way of lashing out, as she has made clear in posts on her Facebook page:

“This is not violating free speech!” Garry fumed in one post. “You can say whatever you want BUT if you LIE about your opponent, you have to pay for it!! The public deserves a clean, honest exchange of information!! . . . Mass Fiscal Alliance makes a business out of lying to further their political agenda!!”

Seven exclamation points. Hell hath no fury like a career politician scorned.

But punctuation is no substitute for reasoned argument, and that Garry lacks. A “truth in advertising” standard for political campaigns may sound appealing, but any attempt to impose such a rule would immediately degenerate into censorship and retaliation. Free political discourse would evaporate overnight, as challengers, gadflies, and dissenters found themselves facing excruciating legal proceedings and the threat — the explicit threat, in the case of Garry’s bill — of being wiped out financially.

Truth Is Important

Truth is important. Politicians or interest groups that play fast and loose with the facts should be denounced. As a matter of decency and morality, no one has a right to lie invidiously about others. The Ninth Commandment’s injunction — “Thou shalt not bear false witness against thy neighbor” — is as indispensable to a just society today as it ever was.

When it comes to politics, however, enforcing that injunction is no job for government regulators or courts of law. Free speech is not absolute: perjury, libel, and fraud are illegal and can be severely punished. But in the back-and-forth of political competition, where the truth lies is for the people to decide.

America’s commitment to free speech is often portrayed as a commitment to the marketplace of ideas — to the belief, as Supreme Court Justice Oliver Wendell Holmes famously said, that “the best test of truth is the power of the thought to get itself accepted in the competition of the market.”

But as Justice Louis Brandeis argued, free speech is more than an abstract good: It is essential to democratic society itself.

“Those who won our independence,” wrote Brandeis in one of his greatest opinions, “believed that . . . freedom to think as you will and to speak as you think are means indispensable to the discovery and spread of political truth; that, without free speech and assembly, discussion would be futile; that, with them, discussion affords ordinarily adequate protection against the dissemination of noxious doctrine; that the greatest menace to freedom is an inert people; that public discussion is a political duty, and that this should be a fundamental principle of the American government. . . . Believing in the power of reason as applied through public discussion, they eschewed silence coerced by law — the argument of force in its worst form.”

If Garry doesn’t like what others say about her, the remedy is to answer her critics. It is not to intimidate them with threats of prosecution or to introduce bills that would crush free speech.

Reprinted from jeffjacoby.com

This article was originally published on FEE.org.

Spanish PM Pulls a Lincoln on Catalan Secession

In the wake of Catalonia’s referendum on independence, Spanish Prime Minister Mariano Rajoy continued to argue, as he had in the weeks leading up to the vote, that any attempt by Catalans to become an independent state violates “the indissoluble unity of the Spanish nation, the common and indivisible homeland of all Spaniards.”

Americans watching with interest could hardly have missed the similarity to U.S. President Abraham Lincoln’s first inaugural speech, in which he declared, “It is safe to assert that no government proper ever had a provision in its organic law for its own termination.”

The difference is that Lincoln was doing just what he said he was doing: “asserting.” His novel theory had no basis in the words of the U.S. Constitution itself and contradicted both the Declaration of Independence and the ratification statements made by three states, including Virginia, which all reserved the right to secede from the union as a condition of ratification.

The Catalonia Conundrum

Prime Minister Rajoy’s statement, on the other hand, was not based in theory. He was quoting directly from Article 2 of the Spanish Constitution, which contains the provision Lincoln had to invent. But Rajoy wasn’t quoting the whole article, which reads:

The Constitution is based on the indissoluble unity of the Spanish nation, the common and indivisible country of all Spaniards; it recognises and guarantees the right to autonomy of the nationalities and regions of which it is composed, and the solidarity amongst them all.

Jumbled together in that one paragraph are the same conflicting pressures which exploded into civil war in 19th century America and continue to smolder under the surface today. On one hand is the recognition that diverse cultures within the union have a natural right to govern themselves as they see fit, without having their political decisions overridden by politicians in a distant capital who don’t share their values, have no local stake in the community and, in Catalonia’s case, don’t even speak the same language.

But at the same time, the Spanish Constitution expressly states what Lincoln argued was implied: whether that natural right is respected or not, secession won’t be tolerated. It is as if, upon the ratification of their respective constitutions, the governments the Americans and Spaniards created became the character Sonny in A Bronx Tale, who said just after locking the door of his tavern on the troublemaking bikers, “Now, youz can’t leave.”

There is also the related question, publicly debated by Edmund Burke and Thomas Paine during the French Revolution, of whether any one generation can bind future ones into a political arrangement in perpetuity. Burke, widely recognized as the father of British-American conservatism, said it could. Rejecting the idea that governments are formed to secure natural rights, Burke took the conservative position that only long-standing institutions can protect Man from his own barbarous nature.

Paine took the position established in the Declaration of Independence that the people have a right to alter or abolish their governments when they fail to secure, or become “destructive” of, their natural rights. Interestingly, this question is even more at the center of the Catalonia controversy than it was during the French Revolution or the American Civil War.

Unlike the French peasants or the Confederate states, Catalans themselves are divided on whether they want independence from Spain. Yes, the Confederate states had many of the same differences with Washington that Catalonia has with Madrid: They were net taxpayers, meaning they paid more in taxes to the general government than they collected in benefits. They were culturally different, not quite so much linguistically, but certainly so in every other way. And they had a history of self-governance, even while part of the British Empire, that by the time of the Civil War was hundreds of years old.

The chief difference between the two conflicts is the absence of a single, defining issue around which the forces for secession can rally. Sadly, that issue for the Confederacy was slavery, although all the other grievances were part of the fuel which burst into flame. For Catalonia, there are only those longstanding grievances, which continue to smolder. And so, unlike the Confederate states’ secession conventions, a Catalan vote on independence in which most eligible voters participated would likely be very close. It may even fail.

But even if all eligible voters in Catalonia participated in a referendum and those opposing independence won a narrow electoral victory, would that really resolve anything? What about the more than two million Catalans, roughly half the voting population, who had effectively withdrawn their consent to be governed by Madrid?

Democracy in the Digital Age

The Industrial Age was an age of consolidation, politically and economically. It mobilized people into factories to produce the economies of scale that raised the living standards of most of society. Similarly, the whole world consolidated into nation-states that brought together very large interest groups, who voted together largely in their perceived economic self-interest. If you were a factory worker in a union town, you voted with the union. If you were a farmer or a financier, you voted accordingly.

It was an age that naturally lent itself to democracy.

The Digital Age is leading in precisely the opposite direction. Economically and politically, it is a decentralizing force. Instead of driving to a big-box retailer to purchase an item of clothing, consumers can now order it from Amazon on their phones while sitting on their patios.

Similarly, the politics of one’s geographic region are beginning to lose their dominance over political sensibilities. While geography still matters more than anything else, there is an undeniable trend towards identifying politically with those in one’s social media networks, rather than merely in one’s city or state. It doesn’t take much imagination to look ahead a few decades and wonder whether geography will matter at all in a completely digital world, where even large-scale manufacturing has given way to the decentralizing influence of 3D printers or some new technology.

In such a brave new world, national or regional majorities based on geographical boundaries will seem far less legitimate to more autonomous individuals plugged into global networks, perhaps no longer needing even to travel to an office or factory to work. And resentment will continue to grow exponentially as geographically based governments override what those individuals perceive as their own natural rights to liberty and to keep the fruits of their labor, instead of having them redistributed at the whim of politicians whose rule rests on regional majorities who may seem as alien to those people of the future as Washington seems to Iowans today.

As exciting as individual secession sounds in theory, last weekend’s events in Spain should remind us that governments have only one response to noncompliance with their rule: force. And just as in 19th century America, there is still plenty of support for governments to stamp out secession movements with violence. One can only hope the technological advances of the next several decades are accompanied by at least some small advances in wisdom.

This article was originally published on FEE.org.

This Is How Shake Shack Will Pay for Higher Minimum Wages

CNBC reports that the burger chain Shake Shack is planning to trial a new restaurant in New York which will not have a traditional cashier’s counter. Instead, “guests will use digital kiosks or their mobile phones to place [and pay for] orders.” Their order will be processed immediately to the kitchen and the guest will receive a text message when their food is ready.

Great, you might think. Shake Shack is investing in innovations which could improve the productivity of remaining workers, increasing wages (indeed, it wants to pay the relatively smaller staff in this restaurant at least $15 an hour). Such investments might provide a more efficient and desirable service to customers too. This frees resources and excess labor for other, more productive pursuits in the economy.

But the kicker for why Shake Shack is undertaking such investments comes later in the article:

it’s likely that in the next 15 to 20 months that areas like New York, California and D.C., in which there are many Shake Shacks, will transition to a $15 minimum wage…Adopting this payment policy in Astor Place will give the company a chance to work out the kinks before it rolls out a $15 minimum wage in these locations.

Anyone who has been to a McDonald’s in France will know what’s going on here. Shake Shack suspects that the cost of labor will rise due to an increased minimum wage, and given that projection, it’s become economic to consider investments in labor-saving technologies. Higher minimum wages act in effect as a subsidy to automation.

But these productivity-improving investments don’t come for free. A recent paper by Grace Lordan and David Neumark finds empirical evidence that between 1980 and 2015, increasing the minimum wage by $1 decreased the share of low-skilled automatable jobs by 0.43 percentage points in general and by 0.99 percentage points in manufacturing. Other jobs might be created, of course, but they may well be more demanding or stressful, such as overseeing the running of multiple machines or needing the skills to deal with technical problems. “Regulating to innovate,” subsidizing the rapid introduction of some technologies before they are actually high quality and cost-effective, drives up prices for consumers too.
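
To see what those magnitudes mean in practice, here is a back-of-the-envelope sketch. The two effect sizes are the Lordan-Neumark point estimates quoted above; the baseline share and the size of the wage hike are made-up inputs chosen purely to illustrate the percentage-point arithmetic, not figures from the paper.

```python
# Back-of-the-envelope arithmetic using the Lordan-Neumark point estimates.
# The baseline share and hike size below are hypothetical illustration
# inputs, not figures from the paper.

EFFECT_PER_DOLLAR_OVERALL = 0.43        # percentage-point drop per $1 hike
EFFECT_PER_DOLLAR_MANUFACTURING = 0.99  # same effect, manufacturing only

def share_after_hike(baseline_share_pct: float, hike_dollars: float,
                     effect_per_dollar: float) -> float:
    """Share of low-skilled jobs that are automatable, after a wage hike."""
    return baseline_share_pct - hike_dollars * effect_per_dollar

baseline = 25.0  # assume automatable jobs are 25% of low-skilled employment
hike = 2.0       # a hypothetical $2 increase in the minimum wage

print(round(share_after_hike(baseline, hike, EFFECT_PER_DOLLAR_OVERALL), 2))
# 24.14 (overall)
print(round(share_after_hike(baseline, hike, EFFECT_PER_DOLLAR_MANUFACTURING), 2))
# 23.02 (manufacturing)
```

Small as each step looks, the manufacturing effect compounds quickly: on these estimates, a $7 hike (roughly the jump from the current federal minimum to $15) would cut the manufacturing share by nearly seven percentage points.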

Perhaps more pertinently, low-skilled workers younger than 25 and older than 40, especially women, tend to be particularly affected by the disemployment effects of automation and can find it very difficult to find replacement work given their productivity levels.

As I concluded in a recent Daily Telegraph article:

If we are moving into a period when technological innovations are speeding up, we could be hiking minimum wages dramatically at just the wrong time. It will prove enough of a policy challenge as it is, to equip people with new skills to adapt in a rapidly changing labor market. Making more low-skilled jobs uneconomic by artificially hiking the cost of labor substantially could exacerbate this change at a time before new investments would otherwise make economic sense.

Being worried about this consequence is not to be anti-technology or anti-innovation. We all recognize that mechanization and technological innovation are the only way to sustainably raise living standards. But encouraging new investments by raising business costs and driving out low-skilled jobs is another matter entirely.

Just because Luddite efforts to destroy machines were economically harmful does not mean that destroying low-skilled employment opportunities would be beneficial.

Reprinted from Cato Institute.

This article was originally published on FEE.org.

Gender Parity and Economic Freedom Are Closely Linked

The new Economic Freedom of the World Index ranks 159 countries based on five measures of economic freedom: size of government, legal system and property rights, sound money, freedom to trade internationally, and regulation.

Published by the Fraser Institute, the data show a strong correlation between economic freedom and human progress. People who live in countries in the top quartile of economic freedom not only have a greater GDP per capita than those in the bottom quartile, but also live longer, enjoy higher levels of political and civil liberties, and tend to be happier about their lives.

Gender Parity Matters

For the first time, the Index changed its methodology to include a Gender Disparity Index (GDI), which reflects legal and regulatory barriers to the economic activities of women. Previous reports assumed that women and men faced the same barriers. But that is not the case in many countries, like Saudi Arabia, where women still need a man’s permission to get a job or to open a bank account.

The GDI uses data from the World Bank’s Women, Business, and the Law and 50 Years of Women’s Rights, which track changes in gender equality over time. A country’s score on the GDI ranges from 0 to 1: a country scores closer to 0 if women there do not have the same economic rights as men, and closer to 1 if it treats men and women equally under the law and women face no additional barriers to economic activity.
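
To make the mechanics concrete, here is a minimal sketch of how a 0-to-1 disparity multiplier can pull down a country’s overall score. The function names, the five hypothetical area scores, and the assumption that the GDI scales the legal-system component are all illustrative; this is not the report’s published formula.

```python
# Illustrative sketch, not the EFW report's published methodology.
# Assumption: the 0-1 gender disparity index (GDI) scales one component
# score, so a GDI of 1.0 leaves it unchanged and lower values shrink it.

def adjust_component(component_score: float, gdi: float) -> float:
    """Scale a 0-10 component score by a 0-1 gender disparity index."""
    assert 0.0 <= gdi <= 1.0
    return component_score * gdi

def overall_score(areas: list[float]) -> float:
    """EFW-style summary score: the simple average of the area scores."""
    return sum(areas) / len(areas)

# Hypothetical country: strong scores on paper, but a GDI of 0.6 because
# women face legal barriers to work, banking, and property ownership.
areas = [7.5, 8.0, 9.0, 7.0, 7.5]  # five areas, each on a 0-10 scale
adjusted = list(areas)
adjusted[1] = adjust_component(areas[1], gdi=0.6)  # legal-system area

print(round(overall_score(areas), 2))     # 7.8  before the adjustment
print(round(overall_score(adjusted), 2))  # 7.16 after the adjustment
```

A drop of that size is easily enough to move a country many places in a ranking of 159, which helps explain why the adjustment hit the rankings of some Gulf states so hard.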

For the most recent year for which data are available, 2015, the GDI barely affected the overall scores for 122 out of 159 countries. But for a small set of nations, this new methodology meant a significant decline both in their scores and rankings. Some of the countries with the biggest changes were Saudi Arabia, United Arab Emirates, Kuwait, Jordan, Bahrain, Qatar, Oman, Iran, Egypt, Morocco, and Syria.

Contributor Rosemarie Fike points out that the GDI had the largest effect on the Middle East and North Africa nations. Many such countries would be relatively economically free without the restrictions on women. For example, the United Arab Emirates and Jordan, which would have been among the top 20 nations in the EFW index if the gender adjustment had not been included, declined to 37th and 39th respectively.

Equality for Women

Another of the most important findings is that the lowest scores in the GDI have increased significantly since 1970, when the lowest 15 scores ranged from 0.00 to 0.44. In 2015, they ranged from 0.41 to 0.65. Countries have been removing their barriers to the economic activities of women at the same time that women are becoming more equal to men in their choices and careers.

The graph below shows the average gender disparity scores by economic freedom quartile. Countries in the bottom quartile have a lower average GDI score, 0.76. As one moves to higher quartiles, the GDI scores increase; the most economically free quartile has an average GDI score of 0.95.

Source: Economic Freedom of the World: 2017 Annual Report; Rosemarie Fike.

Gender equality under the law improves as countries become more economically free. Countries that still restrict women’s economic rights will pay higher economic costs. Treating women and men equally before the law brings more economic freedom and increases society’s potential growth.

Reprinted from Economics21

This article was originally published on FEE.org.

No, I Won’t Disavow My “Racist” Friend

I have a friend who is racist. He wouldn’t use that word to describe himself, and it’s probably unfair of me to use it. But given what he’s done and what he’s said, I think most of the rest of the world (our part of the world, anyway) would say that he’s racist. So I’m not going to waste time arguing about the definition of the word, or explaining that my friend doesn’t actually hate people of other races. The fact is that my friend did something that most decent people with any awareness of our country’s history would find abhorrent: He donated money to the campaign of David Duke, one-time head of the Louisiana-based Knights of the Ku Klux Klan.

For the benefit of anyone who doesn’t fully understand what the KKK is or was: The Ku Klux Klan was founded in the 1860s, and used violence to pursue the agenda of white supremacy. This included the extra-judicial hunting down, torture, and murder of thousands of black Americans between the 1870s and 1950s. We refer to these acts as “lynchings”, but that word does little to convey the full horror of what was done. Nor does it fully convey the motivations behind the atrocities.

Lynching has come to be associated with vigilante justice, as if it were simply one method used by angry mobs to punish those who they (rightly or wrongly) believed to have committed violent crimes. But that’s not what lynching was all about. Yes, some lynching parties went after blacks and others who they believed to be criminals. But just as often, they sought retribution against blacks who were deemed too “uppity”, who competed too successfully with white business owners. This was not about protecting white people from black criminals, but about protecting them from social and economic competition. It was about – violently – keeping blacks “in their place.” And the accounts of how that was done are stomach-churning.

My Friend, the Racist

So I completely understand why anyone would be shocked and offended to find that the owner of their favorite Chinese restaurant was in any way associated with this organization.

I wasn’t especially shocked when I found out that my friend had been “outed” for having donated $500 to David Duke’s campaign. Roger and I have been friends for more than thirty years, but in recent years we have drifted apart as he became attracted to white racial-identity politics. I made a real effort to understand where he was coming from, and I know him well enough to know that he doesn’t just take on a new point of view or philosophy without doing some serious study of that view. In the end, though, I see identity-based politics as being antithetical to civil society and peaceful coexistence, and none of the writing or thoughts he shared with me made me think otherwise.

So when I saw that there was a campaign to boycott his restaurant, the O’Mei, in Santa Cruz, because of his donation, and that the O’Mei’s Yelp page and social media sites were full of comments referring to my friend as a “racist”, “white supremacist”, “Nazi”, and a few other choice epithets, I wasn’t entirely surprised. But when I learned, a few days later, that the O’Mei had closed, the news hit me like a stab in the heart.

Roger and I met in the mid-80s, at UC Santa Cruz. He was helping to start an alternative student newspaper (an alternative to the monolithically left-wing official student paper) and I became the editor of that paper. We disagreed about pretty much everything from the beginning. He was a conservative, with an appreciation for free markets, and I was an anarcho-libertarian. We both had an interest in China: He had lived in Taiwan many years ago, and had created what I was later to learn was one of the best Chinese restaurants in the Bay Area, and I was getting ready to head off for my junior year abroad in Hong Kong.

We spent a lot of time together over the years, both in Santa Cruz and in the various parts of Asia where I was living for most of the late 80s through 2000. I have fond memories of staying up all night to get the newspaper out only to find stacks of it dumped in trash bins on the UCSC campus; of heated arguments over the value (or lack thereof) of indifference curves; of listening to him discuss the finer points of recipes with cooks in China; of deep conversations about cultural differences over cans of Asahi on the streets of Tokyo. And of many, many, meals at the O’Mei.

When I said that the O’Mei was one of the best Chinese restaurants in the Bay Area, I was being modest on Roger’s behalf. I spent about a decade in Asia, living in three different countries and traveling quite a bit in China. I’ve eaten a lot of really, really good Chinese food. But I’ve never encountered anything quite like the O’Mei.

It was unique. There is something that Roger did, not “Westernizing” traditional dishes, as many Chinese restaurants do, but something else, almost re-creating them: pine-nut shrimp with crunchy spinach, pork (or tofu) roasted with dates and yams, a crispy Taiwanese snapper that was nothing less than heavenly. He would take an already delicious dish and push it up a level, turning it into something even more beautiful and memorable. I guess that’s called “cultural appropriation” now. I call it genius.

Arguing With Roger

Roger and I argued all the time. About everything. In hindsight, it seems odd that we were such good friends when we had so little to agree about. But I think it had to do with the fact that we both cared about the big questions, and we had enough respect for each other, and for the pursuit of the truth, that we could be brutally honest with each other.

When the story emerged that Roger had made a donation to Duke’s campaign, and there were calls to boycott his restaurant, people who knew me asked for my thoughts. I tried to find a way to express them. But I couldn’t.

How could I say anything in my friend’s defense and not simply be dismissed as a “racist” myself? How could I condense a 30-year friendship into a comment on a FaceBook post? And how could I even begin to explain the revulsion I feel – absolute revulsion – at those who have decided that the only crime for which there can be no forgiveness is the holding of racist beliefs, but who tolerate and even admire those who perpetrate institutionalized violence and authoritarianism?

I’m not going to defend Roger’s views, but I’m also not going to pretend that they can be dismissed as simple “racism” or “white supremacy.” In his own words, commenting on the campaign against his restaurant:

We are just a token in a much larger process of terrorizing White European Americans into silence in what has come to be known as the ‘war on Whites.’ My campaign contribution was to one of the men supporting European American Civil Rights. As a European American, it would be insane for me to not support said rights.

Again, I don’t support his position, but after decades of being told that all the problems in the world are the fault of white males and that there is “no such thing” as racism against white people, I can understand his desire to defend what he sees as the interests of his group.

Nor am I going to equate my friend’s support for David Duke with the actions of the KKK a hundred years ago. In fact, Duke has said that he left the KKK because he “…disliked its associations with violence and could not stop the members of other Klan chapters from doing ‘stupid or violent things’”. And in any case, the KKK today is hardly a force to be reckoned with. At its height, the Klan’s membership numbered in the millions. Today, its membership is estimated at between 5,000 and 8,000, and the group is held in contempt by most of society. The KKK is a symbol of great evils committed in the past, but hardly one of the most serious threats to life and liberty today. Even its extrajudicial killings of black Americans have been appropriated by state and local police forces – yet I don’t see widespread calls to boycott those.

I can’t help thinking that the people who shut down my friend’s restaurant are more concerned with symbolic gestures against evil than they are with combating actual evil in the world.

To Condemn One Is to Condemn All

Do I find my friend’s views on race to be repugnant and indefensible? Sure I do. But I find the views of most of the people around me to be repugnant and indefensible. If I were to “disavow” my friend who donated money to a man who was once a leader in the KKK, then I would have to disavow pretty much everyone I know.

One friend, for example, worked on the first presidential campaign of Barack Obama. You remember, the guy who went on to oversee the murder of thousands of civilians – more than six times as many as his predecessor; who could have ended the federal war on drugs but did not; who continued to imprison and torture people charged with no crimes in Guantanamo after promising to close it; who allowed unprecedented spying on the American people; and who gave himself the power to put any human being on earth on a “kill list”, as well as to wage war without Congressional approval. My friend knows how I feel about his support for this man, but I haven’t “disavowed” him. And oddly, no one has pressured me to.

I have friends who support the military; I have friends who have no problem with the police state, or with throwing people into prison cells for years of their lives when they have harmed no one; I have friends who eagerly support the idea of redistributing other people’s wealth at gunpoint; I have friends who openly support the ideology of communism, seemingly unconcerned about the hundreds of millions of corpses it has produced; I have friends who supported the warmongering and authoritarian Hillary Clinton; and I frequently eat at restaurants that display photos of murderous heads of state on their walls. I find those photos to be offensive.

Here’s what offends me: Violence offends me. War offends me. Slavery offends me. Imprisoning innocent people offends me. I know, I’m weird that way. I am told that the only things that should offend me are racism and sexism and “micro-aggressions.” But I am more bothered by the macro-aggressions.

I wrote about this a few years ago, when Helen Thomas was forced out of her 67-year career in journalism over “offensive” remarks about Israelis:

…there is nothing out of the ordinary about politicians calling for mass murder, torture, preemptive war and other acts of barbarism, while their careers remain intact. Meanwhile, a comment that can be construed as racist, or offensive to certain groups, can ruin a mere plebeian. We have elevated name-calling to a higher offense than advocating (state-sanctioned) mass murder and wars of aggression.

If I boycotted my friend’s restaurant, then I would be a hypocrite if I did not boycott a whole lot of other restaurants. I’d certainly have to stop going to the empanada shop down the street from me that proudly displays a photo of then-President Obama stopping in. If I boycotted every business owner, or denounced every friend or relative, who holds views I find repugnant, I’d be living a pretty isolated life.

Moral Crusade

This crusade that has been launched against my friend and his business is not about “standing up to evil” or “speaking out for what’s right”. If that’s what these people were up to, they’d be calling for a boycott of the military-industrial complex, of the TSA, the drug warriors, and a host of other genuine criminal enterprises. But they aren’t interested in confronting the real evil-doers in our society. They’re interested in easy targets. They are interested in looking good, and in getting to feel good about themselves without having to do the uncomfortable work of confronting the evil they may actually support and enable in their own lives.

So now a phenomenal restaurant no longer exists. Getting it closed hasn’t changed anyone’s views – certainly not those of my friend, who I’m sure now feels more persecuted than ever, and more justified in his belief that there is a campaign to stamp out those who speak on behalf of white males. But a lot of self-styled justice warriors get to feel like they’ve accomplished something.

I am sad that my friend has gone down the path of identity-based politics. Not only because I abhor the collectivist, racial-identity-based view he has adopted, but because it seems to me such a waste of what he has to offer the world. It is no exaggeration to say that Roger is a genius at working with Chinese cuisine, and he has the focus and discipline to create pretty much whatever he wants to in the world. For as long as I’ve known him, Roger has always been immensely creative, always coming up with new ideas for his restaurant, or for the many other projects he has taken on, whether writing a bike-tour guide to Japan or learning to play the accordion. He is a very thoughtful person and very well-read. He knows more about Marxism and left-wing ideology than do most of the people who advocate it, and more about genuinely engaging with other cultures than do most people who consider themselves tolerant and open-minded.

But none of that matters now. Roger has been tarred with the brush of “racism” and so nothing he says or thinks matters anymore. He has been effectively shut out of public discourse, by people who – I am absolutely certain – have no idea what he really thinks or believes about race or anything else.

And this is what is most insidious about this kind of political correctness. It has become little more than a bludgeon for shutting people up and shutting them out. I guess I wouldn’t mind so much if it were shutting up the right people, but it’s not. Madeleine Albright hasn’t been shut up. As of 2010, she was bringing in between $60,000 and $75,000 for speaking engagements, even after her offensive statement that the economic sanctions imposed on Iraq were “worth” the deaths of half a million Iraqi children. Yet I don’t see calls to boycott her speaking engagements, or to shut her out of public discourse.

There is a reason for this. As I wrote about Helen Thomas in 2010:

There need to be some forms of behavior at which we can all shake our fists and declare “shame!” Everyone wants to feel righteous, to feel that they stand on the side of the good and against evil, and when someone like Helen Thomas makes a remark that offends an entire group of people – particularly a group of people who have been persecuted in unthinkable ways – she provides an outlet for that need. Those in government pile on too, not so much to deflect attention from their own acts of actual violence, but to reinforce the idea that while state violence is legitimate, name-calling and insult are not.

When the people who want me to disavow my friend start to turn their attention to the serious evildoers in our world, then I may begin to take them seriously. In the meantime, for those who have asked me for my thoughts on the controversy, here they are:

Roger Grigsby is one of the best people I’ve ever known. He’s one of the hardest working, most creative, and most genuine people I’ve had the pleasure of knowing. I am proud to call him a friend, and I’ll be damned if I help to sacrifice him on the altar of feel-good politics.

Reprinted from www.bretigne.com

This article was originally published on FEE.org.

The Case Against Overcriminalization

Lavrenti Beria, the infamous head of the Soviet secret police under Joseph Stalin, supposedly once said, “Show me the man and I’ll show you the crime.” In the Soviet Union, the regime could always find some crime to pin on anyone it chose to target.

As a general rule, it would be silly to equate the modern United States with a mass-murdering totalitarian state. But in this one respect, the two regimes are more similar than we would like them to be.

Because of the vast scope of current law, in modern America the authorities can pin a crime on the overwhelming majority of people, if they really want to. Whether you get hauled into court or not depends more on the discretionary decisions of law enforcement officials than on any legal rule. And it is difficult or impossible for ordinary people to keep track of all the laws they are subject to and to live a normal life without running afoul of at least some of them.

Discretion and the Rule of Law

This sad state of affairs is deeply at odds with the rule of law. Whatever else that concept means, it surely requires that ordinary people be able to readily determine what laws they are required to obey, and that whether or not you get charged by the authorities depends more on objective legal rules than on the exercise of official discretion. Unfortunately, neither holds true in the United States today.

Several recent developments highlight these painful truths. President Trump’s controversial decision to end the Deferred Action for Childhood Arrivals program is one of them. Whether or not some 800,000 people would be subject to deportation ultimately depended on the whims of one man. Additional cases in point include conservative claims that President Barack Obama underenforced a variety of federal laws and liberal fears that Trump is “sabotaging” Obamacare by failing to fully enforce key provisions of that legislation.

Few serious political observers are naive enough to believe that presidential decisions on any of these issues were primarily dictated by the neutral application of objective legal principles, as opposed to the political agenda of the administration in power at the time.

There is much to criticize in both Obama’s and Trump’s approaches to legal issues. But the problem goes well beyond the flaws of any particular politician. The real threat to the rule of law is inherent in the enormous scope of discretion possessed by the executive in a system where there are so many legal rules that almost everyone has violated some of them, and it is not possible for law enforcement to target more than a small fraction of the offenders.

It’s Likely That You Too Have Committed a Crime

Scholars estimate that the vast majority of adult Americans have violated criminal law at some point in their lives. Indeed, a recent survey finds that some 52 percent admit to violating the federal law banning possession of marijuana, to say nothing of the myriad other federal criminal laws. If you also include civil laws (which, though theoretically less severe than criminal laws, often carry heavy fines and other substantial penalties), even more Americans are lawbreakers.

The federal government today regulates everything from light bulbs to toilet flows. There is even a federal regulation making it a crime to advertise wine in a way that suggests it “has intoxicating qualities.” The percentage of lawbreakers goes up even further if we include state and local laws and regulations as well as federal ones.

For most people, it is difficult to avoid violating at least some laws, or even to keep track of all the laws that apply to them. For example, it is almost impossible for small businesses to fully obey all the byzantine regulations that apply to them, for home and apartment owners to fully comply with every part of the complex building codes and zoning restrictions that apply in many jurisdictions, or for almost anyone to ensure perfect compliance with our hyper-complicated tax code.

Ignorance of the law may not be a legally valid excuse. But such ignorance is virtually inevitable when the law regulates almost every aspect of our lives and is so extensive and complicated that few can hope to keep track of it.

Most Americans, of course, never face punishment for their lawbreaking. But that is true only because the authorities lack the resources to pursue most violators and routinely exercise discretion in determining which ones are worth the effort. Unless you are very unlucky or enter the crosshairs of law enforcement for some other reason, you may well be able to get away with a good deal of low-level lawlessness.

In this way, the rule of law has largely been supplanted by the rule of chance and the rule of executive discretion. Inevitably, political ideology and partisanship have a major impact on the latter. For example, federal law enforcement priorities are very different under Trump than they were under Obama.

Interpretation and Enforcement

Even the law itself is often interpreted differently, depending on who is in power. Under the doctrine of “Chevron deference,” federal agencies have very broad discretion to interpret and reinterpret the laws they enforce, so long as the agency’s view is “reasonable.” The result is that the law can change substantially whenever a Republican administration replaces a Democratic one, or vice versa – even if Congress does not pass any new legislation.

As Supreme Court Justice Neil M. Gorsuch puts it, an agency can “reverse its current view 180 degrees anytime based merely on the shift of political winds and still prevail [in court].” The enormous scope of federal regulatory law enables agencies to exercise extensive discretionary authority over many aspects of the economy and society.

Some will argue that the answer to these problems is simply to enforce every law to the hilt, without any favoritism or discretion. But the enormous scope of current law – and the vast number of violators – make it impossible to do that. Apprehending and prosecuting more than a small fraction of lawbreakers would require a virtual police state and probably bankrupt the government, to boot.

Some conservatives argue that Obama’s systematic use of executive discretion in the case of his DACA and Deferred Action for Parents of Americans immigration policies is especially problematic, far worse than “case by case” discretion. I am skeptical of such claims for reasons outlined here and here.

The difference between systematic and “case by case” discretion is more a matter of degree than kind. But even if such distinctions have greater merit than I believe, eliminating policies such as DACA would still leave enormous executive discretion in place. Even in the absence of formal, systematic orders from above, officials necessarily make choices about which lawbreakers to target, and those decisions are likely to be influenced by ideological and political considerations.

Often, such discretion will systematically treat some types of offenders more leniently than others, even in the absence of a formal directive to do so. For example, federal authorities have long chosen to ignore nearly all illegal marijuana possession (and most other illegal drug use) on college campuses. Lots of prominent politicians – including several recent presidents – have benefited from that forbearance. The feds are often less forgiving in other settings.

Reduce the Scope and Complexity of Law

We might also be able to reduce executive law enforcement discretion if the Supreme Court were to abolish Chevron deference, as Gorsuch rightly advocates. But even if that happened, federal agencies would retain a great deal of discretionary authority to decide which lawbreakers to go after. That is unavoidable so long as the scope of federal regulation remains as enormous as is currently the case. And, in practice, judges would still often defer to agencies’ interpretations of complex regulatory laws on which bureaucrats seem to have greater expertise than the judges do. For these reasons, law enforcement priorities would continue to shift – sometimes drastically – whenever partisan control of the White House changed hands.

The only way to make major progress toward establishing the rule of law would be to greatly reduce the scope and complexity of legal rules. In a world where the scope of law is strictly limited, officials might have sufficient resources to go after a much larger percentage of lawbreakers. And if the law were limited to those areas where there was a broad consensus that the conduct in question should be illegal, there would be less incentive for officials to engage in selective enforcement based on the priorities of the party in power. If federal or state authorities engaged in such shenanigans with respect to laws that enjoyed widespread bipartisan support, they would risk provoking a major political backlash.

There is no way to completely eliminate executive discretion over law enforcement or to make the law completely transparent to laypeople. But cutting back on the amount and complexity of law can help us make progress toward those goals.

Of course, it may be that we do not value the rule of law enough to sacrifice any other objectives to strengthen it. The laws on the books are not there by accident. Most were enacted because they were supported by majority public opinion, influential interest groups, or some combination of both.

Perhaps we just do not care about the rule of law enough to eliminate any substantial number of current laws and regulations – especially those supported by our side of the political spectrum. The rule of law may be less important to us than the rule of men whose agenda we like. If so, we might have more in common with Lavrenti Beria than we like to think.

Reprinted from The Washington Post.

This article was originally published on FEE.org.