By Sarah Isgur, David French, and Jonah Goldberg[1]
In April 2022, Jonathan Haidt published an article in the Atlantic, arguing that “[p]art of America’s greatness in the twentieth century came from having developed the most capable, vibrant, and productive network of knowledge-producing institutions in all of human history, linking together the world’s best universities, private companies that turned scientific advances into life-changing consumer products, and government agencies that supported scientific research and led the collaboration that put people on the moon.”
These institutions are the guardrails of our democracy, and they are teetering on the brink of collapse under the weight of our own factionalism, driven in large part by the rise of social media networks that have “both magnified and weaponized the frivolous.” As Haidt concludes:
The norms, institutions, and forms of political participation that developed during the long era of mass communication are not going to work well now that technology has made everything so much faster and more multidirectional, and when bypassing professional gatekeepers is so easy. And yet American democracy is now operating outside the bounds of sustainability. If we do not make major changes soon, then our institutions, our political system, and our society may collapse during the next major war, pandemic, financial meltdown, or constitutional crisis.
To us, Haidt perfectly describes the disease; but here we have been asked to provide our thoughts on the cure. That is a harder task. How do we fix a Congress in crisis? How do we address a voting apparatus that no longer produces healthy results?
Congress in Crisis
First, we believe that the erosion of our constitutional structure has left many of these institutional guardrails vulnerable. Without a healthy legislative branch, it is not possible to maintain a system of self-government.
As great admirers of the American founding documents, it pains us to say that the Founders made a series of mistakes. We know the principal founding mistake— the deliberate exclusion of American slaves from the American promise. We also know that the American people could not, in fact, trust American states to protect the liberties guaranteed by the Bill of Rights. The initial restriction of their application to the federal government too often rendered the foundational liberties of the American republic a dead letter.
But the present crisis in congressional authority illustrates the extent to which the Founders counted on Congress to possess a core institutional identity that could or would ultimately trump the members’ factional interests. It turns out that members are more than happy to punt their responsibilities to a president and then turn their focus to achieving personal fame, mainly as ferocious partisans who support or oppose the person in the White House.
In other words, their identity as “members of Congress” is far less significant than their identities as Republicans or Democrats. Yet by subordinating their branch of government so thoroughly to the presidency, they’re frustrating the Founders’ virtuous intent of placing the most powerful branch of government closest to the people.
The result, as outlined below, is an enormous amount of public frustration and alienation. Very few Americans feel as if their vote truly counts, and every four years our upside-down system entrenches itself. Presidents rule, Congress serves the White House, and courts ratify the legislature’s abdication of its responsibilities and powers. It’s time to turn the system right-side up. The reforms below will not guarantee congressional wisdom, but they will codify congressional responsibility, and perhaps, just perhaps, our republic will come closer to working as intended.
Elections in Crisis
Second, we believe that self-government cannot endure if both sides of America’s ideological spectrum find short-term success in undermining the guardrails of our elections. As of earlier this year, only 20 percent of Americans were very confident in the integrity of our electoral system— a 17-point drop from just a year ago. The problem is that the two sides do not agree on why they are losing confidence, according to a Pew Research Center study:
Democrats think that there are hurdles to the voting process and election rules that make it more difficult for people to cast their ballots. Republicans think that expanding these rules and making it easier to vote would make elections less secure. So those things are naturally at tension with one another, and likely why we’re not gonna see the polarizing aspect of American elections go away anytime soon.
While we endorse something like the wholesale adoption of the 2005 Carter-Baker Commission report, which produced 87 recommendations to make our elections both more open and more secure, multiple academic studies of voting restrictions have shown that such changes “had only minor effects on turnout and no effect at all on the Democratic margin in the presidential election” in 2020, and, conversely, that the expansion of mail voting didn’t alter turnout either. And, of course, there has been no evidence of voter fraud— or even a valid theory of potential voter fraud— that could change the outcome of a statewide election.
And yet, we are hurtling toward a presidential election in which neither party’s voters may be willing to accept that their side could lose in a fair election. We believe that changes introduced into our political system in the last 20 years have eroded important guardrails that protected our elections, incentivizing individual candidates to build support with the most extreme factions of voters while leaving party leadership too weak and too marginalized to build effective coalitions, and changing the very nature of who is willing to stand for elected office in the first place.
Our recommendations below hope to give voters, parties, and candidates better tools to navigate the modern electoral landscape and return to the kind of political popularism that Democratic data analyst David Shor has described, in which parties “figure out which of their views are popular and which are not popular, and then [] talk about the popular stuff and shut up about the unpopular stuff.”
Article V of the Constitution requires 38 state legislatures to ratify an amendment after it is proposed either by 67 senators and 290 House members or by 34 states. Since only about 14 million people live in the smallest 13 states (the number needed to block ratification), legislatures representing a small fraction of the population can veto a change that the other 96 percent of Americans support. So it is no wonder that of the nearly 11,000 constitutional amendments that have been proposed over the past 233 years, only 27 have made it through.
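The arithmetic behind these thresholds is easy to verify. A minimal sketch (the chamber and state counts are today’s familiar 100 senators, 435 representatives, and 50 states; Article V rounds up to whole members and whole states):

```python
import math

SENATORS, REPRESENTATIVES, STATES = 100, 435, 50

# Proposal route 1: two-thirds of each chamber of Congress.
senate_proposal = math.ceil(SENATORS * 2 / 3)        # 67 senators
house_proposal = math.ceil(REPRESENTATIVES * 2 / 3)  # 290 representatives

# Proposal route 2: two-thirds of the states call a convention.
convention_call = math.ceil(STATES * 2 / 3)          # 34 states

# Ratification: three-fourths of the state legislatures.
ratification = math.ceil(STATES * 3 / 4)             # 38 states

# Smallest coalition of states that can block an amendment.
blocking_minority = STATES - ratification + 1        # 13 states

print(senate_proposal, house_proposal, convention_call,
      ratification, blocking_minority)  # 67 290 34 38 13
```

The last figure is the one Scalia had in mind: because only a legislative majority in each of those 13 states need object, a group representing well under half of 4 percent of the population can stop any amendment.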
As we get further from the original drafting of the Constitution, one would expect more changes to be needed— as technology, culture, and mores continue to evolve. But the pace of amendments has actually slowed down. The most recent amendment— the 27th, which prevents a congressional pay raise from going into effect until after the next congressional election— was actually proposed with the original Bill of Rights and took another 200 years to get ratified. Only seven others were ratified in the last century. Even proposed amendments have slowed to a trickle, with about half as many proposed in the last Congress (78) as were proposed in 1996.
The result is that our courts are left between two untenable options: strike down legislative or administrative acts that are wise or even necessary because they violate the Constitution, or “amend” the Constitution through judicial fiat with nine people substituting their will in place of Article V’s process.
The late Justice Antonin Scalia believed it was too hard for the people to “overrule” Supreme Court decisions, which left the courts holding the bag on thorny issues better left to the political process. As reported by the Legal Times, “[Scalia] once calculated what percentage of the population could prevent an amendment to the Constitution, and found it was less than 2 percent. ‘It ought to be hard, but not that hard,’ Scalia said.”
Here is our proposed text:
An amendment to this Constitution proposed by a majority of both houses of Congress or a majority of the states shall be valid when ratified by the legislatures of two-thirds of the several states; provided that no amendment shall abridge the privileges or immunities of citizens of the United States. No state shall be able to withdraw its ratification, and all deadlines for ratification must themselves be contained within the text to be ratified.
Our system of government is upside-down. The branch of government that is closest to the people, that was intended to be the most powerful— the legislature— is now the weakest. The power of the presidency has expanded to the point where, every four years, the American people are electing the most powerful peacetime president in American history.
This escalation of presidential power escalates the stakes of presidential elections and helps destabilize American democracy. Following decades of jurisprudential developments and legislative abdication, the president is no longer “merely” the nation’s chief executive; he is also the nation’s chief lawmaker and chief warmaker. Combine these new powers with the original power to appoint federal judges, and the president not only runs the most powerful branch of government, he also determines the composition of the second-most powerful branch.
The president’s lawmaking function rests with his administrative agencies. The nation’s chief executive presides over an alphabet soup of monstrously large, immensely powerful organizations with acronyms such as EPA, CDC, TSA, CFPB, EEOC, and HHS. Though they’re created by Congress, they’re now under presidential control and possess individual rule-making authority. Their enactments carry the force of law.
Indeed, administrative rulemaking is often the primary form of lawmaking in the United States. Presidents use rules to change immigration law, to change educational policy, to change environmental standards. When Congress does pass meaningful legislation— such as Obamacare— it often leaves the law intentionally incomplete, delegating to the executive branch the task of filling in the blanks.
Perhaps the most famous recent example of this phenomenon is the infamous Obamacare contraception mandate, a rule that spawned almost a decade of federal litigation. The mandate is nowhere in the Affordable Care Act itself. Instead, it’s a regulatory creation, enacted under the direction of the president.
What Congress gives, presidents have been happy to take. That applies to lawmaking, and it applies to warmaking. The Constitution gives Congress the exclusive power to declare war and makes the president the commander-in-chief of the armed forces. The meaning is clear. The president commands the forces that Congress chooses to deploy.
Yet time and again presidents have initiated substantial military actions without congressional approval. Presidents now have the power to take us to war all on their own. In recent years we’ve seen campaigns in the Balkans, in Libya, and in Syria commenced without so much as a single congressional vote. Previously-granted military authorities, such as the post-9/11 Authorization for the Use of Military Force (AUMF), have been stretched beyond all recognition. The AUMF is the purported legal foundation for military actions launched more than a decade after its enactment in theaters of conflict far beyond the original boundaries of the battle.
Congress, meanwhile, has neutered itself. The reasons for this are longstanding and complex, but the toxic combination of an increasing number of safe districts and metastasizing partisanship means that candidates run to become partisan warriors and not legislators. In Jonah Goldberg’s phrase, Congress has become a “parliament of pundits.”
The end result is deeply alienating for average Americans. If you, like most voters, live in a safely red or blue state, then you will not cast a meaningful vote for or against the most powerful person in the world. If you never move to a swing state, then you might never cast a meaningful vote for president in your life.
At the same time, as the president’s power escalates, you’ll very rightly feel that the stakes of presidential elections increase. Democracy will start to feel meaningless to you. A subset of voters exercises vastly disproportionate control over the course of American government.
The frustration is compounded by the sheer proliferation of safe legislative seats. Tens of millions of voters now live in congressional districts in which a single party’s primary election is the only truly relevant election in any given election cycle. If you’re a blue American living in a deep red district in a deep red state, it’s as if you don’t exist; and the same dynamic operates on the other side.
This reality does not reflect the Founders’ intent. Yes, one branch of government is supposed to be supreme, but it is not the presidency. It’s plainly Congress. Only Congress possesses the power of the purse. Only Congress possesses the power to declare war. Only Congress can fire the president or any member of the federal judiciary, including any member of the Supreme Court. Congress can override presidential vetoes.
It has such vast potential power that it’s plainly incorrect to say that the Constitution provides for “co-equal branches of government.” Each branch can check the other, but they were never intended to be equal.
The trend of accumulated presidential power has been going on for so long— with bipartisan legislative and judicial sanction— that there is no one, single reform that can restore congressional primacy. The process will take years, and it will require both judicial and congressional action. The list below isn’t exhaustive, but it represents a series of vital first steps towards righting the constitutional ship.
The Case Against the Legal Doctrines Shielding Congress From Its Own Inaction
Cornell Law School’s Legal Information Institute defines the nondelegation doctrine as “a principle in administrative law that Congress cannot delegate its legislative powers to other entities. This prohibition typically involves Congress delegating its powers to administrative agencies or to private organizations.” In essence, this doctrine holds that when Article I of the Constitution states that “all legislative powers” granted by the Constitution “shall be vested in Congress,” it means that Congress and Congress alone is empowered to legislate.
Or as the Supreme Court stated in a 1935 case called Schechter Poultry v. United States, “Congress is not permitted to abdicate or to transfer to others the essential legislative functions with which it is thus vested.”
Yet despite this constitutional history and judicial precedent, the nondelegation doctrine is mostly dead. It hasn’t been used in generations to strike down a congressional delegation of authority, and Schechter Poultry was long considered an artifact of New Deal-era judicial turmoil. As a practical matter, the Supreme Court has given Congress a free hand to delegate vast amounts of its legislative power to the executive branch.
But the more originalist Supreme Court majority is signaling that “mostly dead” does not mean “entirely dead.” Four justices (Roberts, Gorsuch, Thomas, and Alito) have now joined opinions in two separate cases, Gundy v. United States (2019) and National Federation of Independent Businesses v. Department of Labor (2022), that have argued for a revived nondelegation doctrine.
If these justices gain a fifth vote, they can start limiting Congress’s legal ability to punt its legislative functions to the executive. Congress won’t be able to pass bills that are deliberately incomplete. At the same time, a revived nondelegation doctrine will further inhibit the president’s ability to order executive agencies to act when he or she perceives that Congress has failed.
If the nondelegation doctrine is about whether Congress can delegate, the major questions doctrine is about whether Congress did delegate. If the Supreme Court wants to incentivize congressional action, then it should also beef up the presumption against delegation where there is no clear statement in legislation that Congress intended to delegate a major question pertaining to the economy or national governance to an administrative agency. Administrative agencies should not be able to perpetually expand their own power whenever they can find any language in legislation that can be pried open through sheer force of will.
There is also the issue of so-called Chevron deference. In 1984, the Supreme Court decided a case called Chevron v. Natural Resources Defense Council that contained a simple but far-reaching holding. As the Legal Information Institute describes the doctrine, judges should defer to executive agency interpretations of governing statutes when those interpretations are “not unreasonable,” at least “so long as the Congress had not spoken directly to the precise issue at question.”
When the Supreme Court 1) ignores nondelegation and 2) applies Chevron deference to agency actions, it grants the president incredibly broad lawmaking powers. Restoring nondelegation and reversing Chevron will mean that Congress will be dramatically limited in its ability to simply punt its constitutional responsibilities to the executive branch.
Lastly, though perhaps most controversially within the conservative legal academy, the dormant commerce clause should be cabined so that congressional inaction cannot be a basis for courts to solve a state-driven problem. The dormant commerce clause refers to “the prohibition, implicit in the Commerce Clause, against states passing legislation that discriminates against or excessively burdens interstate commerce.” Obviously, it is important for national harmony that states are prohibited from creating tariff-like barriers against their neighbors, but that role can and should fall to Congress. If Massachusetts is discriminating against Wisconsin dairy farmers, a simple act of Congress can override such a state law with a full understanding of the economic and political impact.
The Case for Structural Changes to Pave the Way
One of the constitutional cornerstones of congressional supremacy is the ability of the legislature to override presidential vetoes. In theory, a motivated Congress— which is more democratically accountable than the president— can pass laws over presidential objection. In reality, the combination of increased partisanship and the constitutional two-thirds majority requirement to override means that Congress rarely overrules the president. As a practical matter, the president exercises a nearly-unbreakable hold on the legislative process.
That needs to end. Amend the Constitution to reflect the modern political reality and restore congressional supremacy. Adopt the practice of states like Tennessee and permit Congress to override vetoes by simple majority vote. It’s difficult enough to pass legislation. Make the veto a pause— more of a speed bump than a brick wall.
And while we’re amending the Constitution, let’s grant Congress the ability to enact a legislative veto. A legislative veto allows Congress to pass a resolution through both houses that would void any executive rule or regulation. The Supreme Court struck down the legislative veto as fundamentally “executive” in nature in a 1983 case INS v. Chadha, and there’s no indication that the court is considering revisiting its reasoning.
Yet no branch of the government should be enacting legislation (even if disguised as “regulations”) without congressional consent. The nation’s legislators should have the final say over the existence of the nation’s legislation.
Next, one of Congress’s most important structural roles has become nonexistent. As commander-in-chief, the president enjoys a certain degree of inherent command authority over the armed forces of the United States, and that command authority includes the ability to authorize military strikes in limited circumstances. For example, the president can authorize immediate defensive military operations in response to actual or imminent attack. But outside of the immediate, emergency occurrence, only Congress should possess the ability to initiate American military action.
As a practical matter, it’s difficult to legislate a limit on the president’s actions. After all, when he issues a command, the armed forces’ general obligation is to obey, in the absence of obvious illegality. But there are still some necessary legislative steps that can rein in the president’s current operational authority and deter overreach.
During the Trump administration, a nonpartisan public interest law firm called Protect Democracy issued a set of proposals for instituting additional checks on presidential war powers. A number of them have merit.
Lastly, Congress’s own rules are getting in the way. Current congressional rules and precedents provide congressional leadership with extraordinary ability to block legislation from receiving a floor vote, even if bipartisan majorities support the legislation. This hammerlock should end. Majorities should be able to force floor votes on amendments or final bills without leadership consent.
In the House, this means the era of the informal “Hastert Rule” (where speakers don’t schedule votes unless a bill enjoys a majority of the majority party’s support) has to end. In the Senate, it means the majority leader shouldn’t enjoy the power to decide whether bills or amendments earn a vote.
It’s also time to think hard about filibuster reform. Presently the Senate parliamentarian enjoys immense power to decide which bills qualify for the filibuster, and the filibuster has been eliminated for judicial nominees. The distinctions and processes are byzantine to most Americans.
There is still logic to the filibuster as a means of encouraging bipartisan compromise. As a short history will show, the elimination of the judicial filibuster is creating perverse incentives in the judicial branch. In the past, the judicial filibuster served as its own guardrail; a nominee could neither be too conservative nor too liberal to make it through. Now there is no guardrail and the pool of judicial nominees has shifted, as well as the tone of those within the pool.
By the Bush Administration, it was well understood that the goal was to find the most conservative judge who could get the necessary number of Democratic votes to overcome the filibuster. In the filibuster era, even law students were aware of the tightrope they’d need to walk to make it to the federal bench someday.
This had some less-than-ideal consequences that must be acknowledged as well. Aspiring judges tried to avoid having a paper trail. For example, they might be less likely to write law review articles on provocative topics or would avoid using email that would be subject to Freedom of Information Act (FOIA) requests while serving in government jobs. Nominees would avoid answering even the most banal questions during their hearings to avoid taking a position on any topic.
And within Congress, Democrats used the filibuster strategically to hold up a higher percentage of nominees who were not white men (Miguel Estrada, Priscilla Owen, Janice Rogers Brown, Carolyn Kuhl) to narrow Republicans’ options for Supreme Court justices and bolster the narrative that the judges ruling against liberals were white and male. In the case of Carolyn Kuhl, they cited her work on a brief that advocated for the Court to overturn Roe v. Wade (1973). She eventually withdrew her name from consideration three and a half years after being nominated. The man whose name was on the brief, however, was never asked about it during his hearing and was confirmed by voice vote.
Nominees stalled during the Bush Administration, but eventually the “Gang of 14” made up of seven Republicans and seven Democrats broke the stalemate and pushed through two of the pending female nominees to circuit courts.
Senate Majority Leader Harry Reid ended the filibuster for executive branch and lower court judicial nominees in 2013. Nearly 10 years later, we can see the downstream effects of this change. It has eroded the guardrails.
Anyone aspiring to a Senate-confirmed job need not worry about the bipartisan tightrope. It is assumed that no nominees will go through when the opposition party controls the Senate. A potential nominee, therefore, need only worry about being outflanked or opposed by their party’s base for being insufficiently extreme. The effects should be obvious. Relationships across the aisle that were once both necessary and rewarded are now a liability. Instead of steering away from writing about partisan issues, would-be confirmees fight to outdo one another.
Current and prospective judges are much less circumspect about broadcasting their legal opinions, and there’s now a real danger that nominees will essentially “audition” for legal appointments by forecasting their court decisions— a trend that will discourage judicial independence, inhibit thoughtful deliberation, and sow further public doubts about judicial fairness.
And that is just our short experience without the judicial filibuster. Imagine what effect it would have to get rid of any need for bipartisanship for legislation as well. Even the possibility of getting rid of the filibuster has provided administrations and senators a talking point that prevents bipartisan outreach.
At the same time, the combination of a closely divided country and stark negative partisanship makes a filibuster-proof majority a nearly mythical “unicorn” moment in American politics. Aside from the first year of Barack Obama’s presidency, no party has enjoyed a 60-vote Senate majority since the 61-vote Democratic majority in 1977.
Our proposal is to keep the 60-vote filibuster threshold but to create an alternative route based on the percentage of support within each party. If a bill or nominee has the support of both the majority of senators and 10 percent of each party’s caucus, that would also overcome the filibuster. In today’s Senate, that would mean that legislation could pass with 55 senators—50 Democrats and 5 Republicans. But if a party wins a true landslide based on the need for specific reform, they could still pass legislation with only one party if they had the 60 votes necessary.
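The proposal above reduces to a simple rule, which can be sketched as follows (a hypothetical illustration: the `overcomes_filibuster` function, the 50-50 party split, and the 60-vote cloture threshold are our assumptions about today’s 100-seat Senate, not anyone’s draft rule text):

```python
def overcomes_filibuster(yes_by_party: dict[str, int],
                         caucus_sizes: dict[str, int],
                         total_seats: int = 100,
                         cloture: int = 60) -> bool:
    """A measure advances if it reaches the traditional 60-vote threshold,
    OR if it has both a majority of all senators and the support of at
    least 10 percent of each party's caucus."""
    total_yes = sum(yes_by_party.values())
    if total_yes >= cloture:
        return True  # traditional route still works
    majority = total_yes > total_seats // 2
    # 10 percent of each caucus, using integer math to avoid rounding issues
    bipartisan = all(yes_by_party.get(party, 0) * 10 >= size
                     for party, size in caucus_sizes.items())
    return majority and bipartisan

# In a 50-50 Senate: 50 Democrats plus 5 Republicans clears the new route.
print(overcomes_filibuster({"D": 50, "R": 5}, {"D": 50, "R": 50}))   # True
# 50 votes from one party alone would not: no cross-party support.
print(overcomes_filibuster({"D": 50, "R": 0}, {"D": 50, "R": 50}))   # False
```

Note that the two routes coexist: a party with a true 60-vote landslide can still legislate alone, while a narrow majority must recruit at least a sliver of the minority caucus.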
The Case Against Transparency
Three years before he was appointed to the Supreme Court, Louis Brandeis wrote an essay for Harper’s Weekly titled, “What Publicity Can Do.” Thanks to this essay, Brandeis is often credited with coining the popular saying, “Sunlight is the best disinfectant.” But what he actually said was a little different. “Publicity,” he wrote, “is justly commended as a remedy for social and industrial diseases. Sunlight is said to be the best of disinfectants; electric light the most efficient policeman.”
Brandeis was wise not to claim authorship of this idea, as it can be traced back to many sources throughout the nineteenth century, from public health journals that used the phrase literally, to numerous journalists who liked the figurative interpretation, and even to such intellectuals as the poet Ralph Waldo Emerson and British historian and statesman James Bryce. It was Bryce who wrote in his canonical The American Commonwealth:
The conscience and common sense of the nation as a whole keep down the evils which have crept into the working of the Constitution, and may in time extinguish them. Public opinion is a sort of atmosphere, fresh, keen, and full of sunlight, like that of the American cities, and this sunlight kills many of those noxious germs which are hatched where politicians congregate. That which, varying a once famous phrase, we may call the genius of universal publicity, has some disagreeable results, but the wholesome ones are greater and more numerous.
No one can seriously dispute that there is much truth and wisdom in this idea. What was once called “publicity,” but is now often called transparency, plays a vital role in holding politicians and other government officials accountable. A free and independent press, skeptical of those in power, is clearly a vital tool in preserving democracy. The Washington Post may be at times a bit selective in its application of its new motto, “Democracy Dies in Darkness,” but that does not mean the basic point is wrong.
But is it true that “sunlight is the best disinfectant”? At best, this is debatable. Let us explore the metaphor a bit. Sometimes germs develop immunity or resistance to some forms of disinfectant. The weakening of the parties and the rise of politics-as-entertainment, fueled by polarization, partisanship, and social media, have had the unintended and undesirable consequence of fostering sunlight resistance. Donald Trump often said and did things in plain public view— and with ample amplification by the media— without paying much of a price. (Though, ironically, when he suggested using ultraviolet light “internally” to cure COVID-19, the ensuing outcry and mockery did induce him to claim he was just kidding.)

Representatives Paul Gosar (R-AZ), Marjorie Taylor Greene (R-GA), Ilhan Omar (D-MN), Rashida Tlaib (D-MI), and many other politicians have said things that in another age would have ruined a career once the public was made aware of them; now such “trolling” is highly remunerative. Before President Bill Clinton, a president’s admission that he had an extramarital affair with an intern in the White House would have resulted in a swift resignation.

In a healthier democracy, strong parties and a self-confident Congress would serve as alternative forms of disinfectant when sunlight alone was insufficient. Richard Nixon resigned rather than face impeachment and trial only when party leaders made it clear they could no longer support him. Tragically, no similar effort materialized during Donald Trump’s two impeachments, but even if one had, it is debatable whether it would have resulted in a resignation.
Regardless, the relevant question for this discussion is whether too much sunlight can be a problem. As Yuval Levin argues, “a legislature’s basic function is negotiation.” One might quibble. After all, it is reasonable to think that the basic function of a legislature is to legislate. But a basic function is different from a highest function. The highest duty of doctors is to save lives. But the basic, or first, duty of doctors is to do no harm. And even prior to that, doctors must acquire the basic skills and knowledge to qualify as doctors. Similarly, legislators must be able to negotiate before they can even attempt to legislate responsibly.
Negotiation is a very difficult thing to do in broad daylight. As with our gelded political parties, Congress is weakened by too much democracy. Our modern sensibilities are hostile to “smoke filled rooms”— literally and figuratively— but the relative secrecy implied by the term is essential to hammering out disagreements and reaching compromise. We would not have our Constitution if the delegates in Philadelphia in 1787 did their work in front of an audience. The light of our Constitution itself was lit in darkness. By way of comparison, many of the meetings that drove the French Revolution were conducted in front of a live-studio-audience, as it were, which encouraged leaders of different factions to pander to the cheers and boos of the onlookers rather than craft responsible positions that took into consideration unpopular, but important, compromises with political reality.
This is fairly analogous to the problems we have today with an excess of sunlight. Many senators and representatives are legislators in name only. They see themselves as rebels, tribunes of “the people” (in reality a tiny slice of the people), or simply as pundits, who use Congress as a stage or studio to broadcast their “takes” to the wider world. In short, they are there entirely to talk, or shout, but rarely to listen.
In 1979, C-SPAN started shining a bright light on the once shadowy workings of Congress. The rationale at the time for televising Congress was sincere, persuasive, patriotic, and— in hindsight— largely wrong. Nearly a half-century later it seems clear that we can have too much of a good thing, because it was in those shadows that Congress did much of the hard work essential to democracy. As Levin writes, “when an institution becomes too thoroughly transparent, it becomes indistinguishable from the open public space around it, and so it is simply another arena for public speech rather than a structure for meaningful action.”
Fresh examples are available literally whenever Congress convenes. Oversight and confirmation hearings of any serious import routinely feature senators or representatives using all of their time to give speeches that can be cut up into video clips suitable for fundraising or viral tweets. The speeches often repeat what other committee members have already said, in the spirit of Mo Udall’s famous quip: “Everything has been said but not everyone has said it.”
“Most of what happens in committee hearings isn’t oversight, it’s showmanship,” writes Nebraska Senator Ben Sasse. “Senators make speeches that get chopped up, shipped to home-state TV stations, and blasted across social media. They aren’t trying to learn from witnesses, uncover details, or improve legislation. They’re competing for sound bites.” But he adds, “There’s one notable exception: The Senate Select Committee on Intelligence, the majority of whose work is done in secret. Without posturing for cameras, Republicans and Democrats cooperate on some of America’s most complicated and urgent problems. Other committees could follow their example, while keeping transparency by making transcripts and real-time audio available to the public.”
In 2019, then-Senator Joni Ernst ignited a firestorm of criticism when she suggested at a town hall meeting that Congress will eventually have to tackle Social Security reform and that, to do so, some of the negotiations will have to happen in private. “So it’s, you know, a broader discussion for another day,” she stated. “But I do think, as various parties and members of Congress, we need to sit down behind closed doors so we’re not being scrutinized by this group or the other, and just have an open and honest conversation about what are some of the ideas that we have for maintaining Social Security in the future.” Activist groups denounced her desire to cut benefits “in secret.”
But Ernst was right. It is impossible to float a compromise on virtually any important issue if special interests, donors, and the general public can listen in on the conversation essentially in real time. There’s a robust debate about putting cameras in the Supreme Court. Reasonable people can disagree about the advisability of that. But, to clarify our point, does anyone think it would be helpful if the Court’s private deliberations in conference— or Justices’ conversations with clerks— were televised? President Franklin Roosevelt often met with different leaders of his broad and diverse New Deal coalition in the privacy of the Oval Office. FDR was notorious for telling representatives of opposing factions that he was on their side, promising concessions to one group that he’d already given to another. Say what you will about such tactics, they’d be impossible to pull off in front of live cameras. That may sound desirable in theory. But, in practice, politics requires such things from time to time. Voters and special interests alike have every right— and duty— to judge elected leaders by what they accomplish, but some discretion in how they accomplish it is wise.
Again, all reasonable people agree that sunlight is a good thing. But too little consideration is given to the question of whether it is possible to have too much of a good thing.
Congress is supposed to be where politics happens. It is the one place in our constitutional order where representatives from different communities, with different interests and worldviews, meet to hammer out their disagreements. Forcing them to do all of that in public, under the bright lights of show business, banishes the business of democracy and leaves in place the show.
The Case for Refocusing K-12 Curriculum
Core to our electoral process and our system of self-government is the requirement of an informed citizenry.
It is no surprise that both sides have chosen to focus the culture war on what parts of American history we teach to our children. After all, “who controls the past controls the future: who controls the present controls the past.” But in many ways, this debate has missed the point.
Our priority should be teaching critical thinking and logic— not formulas, dates, and names. Students should be taught how to sift through opposing arguments— not taught to ignore the voices they disagree with.
And let schools compete. It makes no sense to limit what students can learn based on where they live. One STEM charter school for five counties in Virginia that is harder to get into than an Ivy League college should set off alarm bells. We should have schools focused on liberal arts, STEM, and great books. Just look at how many different types of pre-Kindergarten programs are springing up in cities offering universal pre-K for their residents. Why isn’t the same thing happening with the rest of the public school system? If we’ve learned anything in the CRT debates, perhaps it should be that a one-size-fits-all curriculum is no longer possible or desirable.
We mandate geometry and algebra but not statistics and coding; we mandate which sciences students take instead of offering them the wide array of STEM fields that could spark their imaginations. We don’t mandate a class in media literacy alongside government. This generation suddenly has access to every piece of knowledge ever attained by man, but we don’t show them how to wade through it all.
Even within the history curriculum, the more pressing fight should be about how we teach that history. For starters, history is getting longer. A half century has passed since some of us were born— a 25 percent increase in the amount of history— and yet the school year is the same length. Rather than skip or condense our history lessons, we should rethink how we approach them. Names and dates have their place in a curriculum, but that method of teaching reinforces a specific way of thinking about how that history happened. It encourages a sense of inevitability for the big events, a focus on wars rather than law, and the “great man” theory of history, which too often tells our story from the perspective of the wealthy, the educated, the white, and the male.
What if, instead of a survey course that touches on the Civil War era, students read the autobiography of Frederick Douglass and a biography of Kentucky-born Justice John Marshall Harlan, the sole dissenter in Plessy v. Ferguson (1896)? Perhaps reading the letters of John and Abigail Adams would give them a better sense of what it was like to live in revolutionary times, even if they have to skip learning about how many men died at Valley Forge. Teaching history is undoubtedly a tradeoff— but perhaps some of these fights would lose their venom if we let the history literally speak for itself.
In short, we’re bickering over the wrong things. Whether it’s competing with China on the global stage or how we run our elections at home, the guardrails of our democracy are only as strong as the next generation— and we are the ones responsible for teaching them not just our history but our values.
The Case for Stronger Parties
We live in an age where almost everyone involved in politics is a partisan who claims to hate partisanship. This should not surprise us given that partisanship is increasingly manifesting itself as a form of identity politics. “Partisanship, for a long period of time, wasn’t viewed as part of who we are,” according to political scientist Sean J. Westwood. “It wasn’t core to our identity. It was just an ancillary trait. But in the modern era we view party identity as something akin to gender, ethnicity or race— the core traits that we use to describe ourselves to others.” In such a climate, political compromise is no longer a question of prudential trade-offs, but a form of surrender and self-harm to one’s identity.
Obviously, in a country wracked by culture wars, this development has many causes, but one key driver is the dysfunctional state of the two-party system. Counter-intuitive as it may seem, weak parties invite strong partisanship. When the internal leaders of the parties abandon responsibility for policing and defining their “brand,” that task is outsourced to the public. Historically, strong parties decided for themselves what policies to prioritize, what compromises to make, and what choices were in the long-term best interests of the party— and the country. Weak parties allow such decisions to be determined by the loudest voices and the angriest voters.
And our political parties today have never been weaker.
Consider 2016, a year when Bernie Sanders— an independent self-declared socialist who had waged war on the Democratic Party from the left for his entire career— nearly won the nomination of his party; and Donald Trump— a former Democrat and reality-show celebrity— succeeded in claiming the Republican nomination, despite never having a majority of the party behind him. Both parties were like jets, fueled and prepped on the tarmac, just waiting to be hijacked (as New York Times columnist Ross Douthat told us).
The hollowing-out of the parties is a rich and complicated story, but it begins with the establishment of the modern primary system. As the political scientist Elaine Kamarck has observed, America is the only advanced industrialized democracy in the world whose major political parties have so comprehensively abdicated the responsibility of picking their own candidates. Instead, they have put it up for a vote. But the voters aren’t primarily party officials or even party stalwarts, as in political conventions of old; they are a shifting blob of activists and whichever voters they can entice to show up. Primaries have existed for over a century, but until 1972, they were largely toothless rituals. Consider that in the last election under the old system, in 1968, Democrat Eugene McCarthy received 38.73 percent of the primary vote, Robert Kennedy (before his tragic death) received 30.63 percent, Lyndon Johnson and his various surrogates received, combined, less than 20 percent, and the nominee was Hubert Humphrey with just over two percent of the vote.
None of this is to say that the old system was perfect. But it had many advantages, starting with the fact that party leaders had a sense of stewardship over the institution they controlled. They could filter out demagogues and opportunists, and they could extract commitments to principles and notions of responsible government— in part because party leaders had a sense of obligation to the long-term health of the party and, by extension, the country.
Thanks in large part to the dominance of primaries, parties have become, in effect, little more than brand marketers: great at TikTok videos, terrible at most everything else. Crucial party functions— educating and organizing voters, developing policies, building consensus— have been taken over by outside institutions that do not necessarily have the best interests of parties in mind. Many media organs serve these roles. Candidates care more about winning the “Fox primary” than they do about the views and interests of party stakeholders or even the voters, as traditionally understood. Organizations like Planned Parenthood and the NRA do the “party work” of organizing and mobilizing single-issue voters better than either the RNC or the DNC. The more such outside groups “own” such voters, the more difficult it is for politicians of either party to compromise or even negotiate on these issues.
And while the old system of “smoke-filled rooms” may not have been as democratic as the modern primary system, this trustee model of institutional leadership fortified commitment to the proper functioning of American democracy itself. As the political scientist E.E. Schattschneider famously argued in his canonical Party Government: American Government in Action, democracy is what happens between the parties, not within them.
The current system lowers the common denominator of decision-makers and empowers candidates to whip up sentiment among the most impassioned and, often, the least informed. Instead of candidates trying to lay out an agenda that represents a consensus among the party as a whole, all of the incentives militate toward cobbling together the largest faction possible to achieve a plurality. The move toward “open” or “semi-closed” primaries fuels this incentive structure and often invites passionate single-issue and low-information voters. As Jeffrey Anderson and Jay Cost write in National Affairs, “This has had a significant effect on the process, as the winning candidate often claims victory not so much because he has articulated the values and interests he shares with the whole party, but because his advertisements managed to sway the late-deciding, quasi-independent voters who have little stake in the outcome.” When the British Labour Party weakened standards for voting in its party conference, it didn’t get some wise tribune of the people. It got Jeremy Corbyn, a radical demagogue with a noted inability to stand up to anti-Semitism.
Politically, abolishing primaries outright would be as difficult as it would be desirable. But tightening the qualifications for voting in primaries would encourage and empower stakeholders to take a broader and more responsible view of party priorities. Even better would be implementing some hybrid model in which some states hold party conventions instead of, or in addition to, more selective primaries. Any measure that would give delegates to state and national conventions more responsibility to decide who does or doesn’t get the nomination would be a step in the right direction. As it is, the quadrennial national party conventions have become scripted infomercials and little more, and yet they are covered by the media as if they were a meaningful mechanism in our democracy. The reality is they are not meaningful precisely because the parties have become too democratic.
One is free to pound the table at the “undemocratic” nature of such reforms, but this is not the dispositive critique some might think. The first response to such objections is, “So what?” Countless vital institutions essential to democracy are not internally democratic. Newspapers are essential bulwarks of democracy and, to our knowledge, no major newsroom in the country puts its coverage or editorial stances up for a vote— not to staff and certainly not to readers. The military is essential to defending democracy, and yet nobody is arguing that the troops be allowed to vote on deployments or strategy.
There is nothing undemocratic about making political parties less democratic.
The Case Against Individual Contribution Limits to Federal Campaigns
Related to the goal of strengthening parties, the purpose of the 2002 Bipartisan Campaign Reform Act was to put an end to what Senator John McCain referred to as “an elaborate influence peddling scheme in which both parties conspire to stay in office by selling the country to the highest bidder.”
“Prior to the enactment of BCRA, federal law permitted corporations and unions, as well as individuals who had already made the maximum permissible contributions to federal candidates, to contribute ‘nonfederal money’— also known as ‘soft money’— to political parties for activities intended to influence state or local elections,” as Justice John Paul Stevens helpfully summarized in McConnell v. FEC (2003), the Supreme Court case which upheld the law.
That soft money, though, didn’t have to be used solely to help state and local campaigns. It could also be used to fund get-out-the-vote efforts that would also or even primarily benefit a federal candidate and could even be used toward “legislative advocacy media advertisements” that used the name of a federal candidate.
With individual donation limits still in place, the soft money loophole meant that much of the money in politics was actually being channeled through political parties, giving them enormous amounts of power even as the smoke-filled rooms otherwise dissipated. It was obviously a hard system to defend. But perhaps presciently, then-Governor George W. Bush warned during one of the primary debates that “the ultimate extension of some of these campaign funding reform plans out of Washington, D.C., will mean that the people who decide who the candidates are and who the victors are will be the press.”
In 2002, BCRA became law, and 20 years later its effects have eroded the guardrails of our democracy more than any other single factor. BCRA didn’t get money out of politics. It’s hard to imagine that anyone thought it could. Soft money itself was the result of post-Watergate election reforms. In 2020, total campaign spending hit $14 billion— more than double what was raised and spent just four years earlier. By comparison, the total soft money raised by the parties in 2000 for the entire two-year cycle was under $500 million— about 3.5 percent of the 2020 total.
Instead of getting money out of politics, BCRA simply transferred power away from the political parties, as Bush predicted. It gave rise to super PACs, small-dollar donations, and the end of gatekeepers, leaving America vulnerable to the current strains of anti-democratic populism on both sides of the aisle. Money, like water, finds a way. But the money streaming into our politics now rewards extremism and demagoguery.
In the wake of BCRA, campaigns still relied heavily on their large-donor programs— hosting fundraisers for people who could “max out” their federal donations to a candidate. But as the parties fell, the internet rose. Campaigns didn’t need to spend valuable candidate time flying to fundraising events across the country. Data and digital teams grew exponentially on campaigns, and with them came an increasing reliance on small-dollar donations. Using platforms like Facebook, Twitter, and YouTube, campaigns went far beyond the “microtargeting” of the Bush years.
The rental value of one’s email list has become the new D.C. commodities trading.
On its face, the crowdsourcing of candidates is a good thing. Large money interests weren’t representative of the average voter. But neither, it turns out, are small-dollar donors. Just under half of all Americans voted in the last presidential election, but fewer than three percent donated money to a candidate. And those donors are motivated by anger and outrage.
As of this year, 22 cents of every dollar raised on the Republican side is going to Donald Trump. But because of list rentals, campaigns are far more interconnected than they have ever been before. That means that Trump directly controls nearly one quarter of the pie, but he indirectly controls far more of it. Trump hits his email lists sometimes 14 times a day. The question for anyone considering running for office is how to break through with anyone left on that list who hasn’t unsubscribed or tuned it out. The answer to that question should speak for itself— and it affects not only how candidates run, but who chooses to run in the first place.
Repealing BCRA is unthinkable in the modern era for many. The idea of advocating for millionaires and billionaires to be able to give unlimited sums to their favorite candidates sounds like a nonstarter. But very few seem to know that 11 states already allow it, including Pennsylvania, Virginia and Texas. And we can see that state and federal candidates act very differently in those states when it comes to fundraising. The average donation to Texas Governor Greg Abbott’s campaign was just more than $119, while the average for Ted Cruz’s Senate campaign was only $37.
Virginia Governor Glenn Youngkin famously kept Donald Trump at arm’s length— because he could. Congressional candidates in Virginia aren’t quite so lucky. In the most competitive congressional race in the state, the more moderate Republican candidate is trying to distance herself as much as she can from Donald Trump but has nevertheless been forced to agree that the 2020 election in Virginia was “rigged” for Joe Biden. Why? Because a disproportionate number of that three percent of donors believe that.
Repealing BCRA in favor of a no-limits/immediate-disclosure system isn’t a silver bullet for fixing the state of our democracy. But it may be the single biggest thing that can be done short of amending the Constitution. Political parties once played a pivotal role as guardrails for candidates. Their role now is to serve as permanent data and digital infrastructure for incumbents and future nominees. Candidates, meanwhile, are forced to appeal to the very online three percent. And legislators are left without any incentive to legislate, knowing that what motivates that three percent isn’t the compromise necessary to drive legislative accomplishment, but the outrage created by allowing problems to fester and finding someone else to blame for them.
The Case for Election Day
There is more diversity to the case for democracy than many realize. Some support democracy for wholly idealistic reasons, imbuing it with profound moral and philosophical nobility. Others prefer democracy on prudential or utilitarian grounds; it’s flawed, but superior to all of the alternatives. Most advocates for democracy tend to make “both/and” arguments rather than “either/or” ones on its behalf. Most of us tend to think it’s both noble and good and procedurally preferable to unaccountable rule. In this, it is similar to the case for free markets. Some believe economic freedom is simply an indivisible subset of all other forms of freedom. As the philosopher Robert Nozick put it, “capitalist acts between consenting adults” are simply part of what it means to have a free society. But intellectual and political elites have a lot more room in our culture to argue for limiting economic freedom than they do for limiting democratic freedom. Senator Bernie Sanders famously declared, “You don’t necessarily need a choice of 23 underarm spray deodorants or of 18 different pairs of sneakers when children are hungry in this country.”
Let us leave the question of how reducing the number of deodorants and sneakers would feed hungry children for another time. The point is we consider questions of limiting or regulating economic choices to be well within the Overton window of legitimate political discourse. But even the most modest restrictions to ballot-casting are routinely described as “anti-democratic,” “racist,” “authoritarian,” and literally “Jim Crow on steroids,” in the words of President Biden.
To deal with the legitimate public health concerns amidst the Covid-19 pandemic, many states made the reasonable decision to expand options for voting. This entirely reasonable course of action was intended to minimize crowds and lines at polling places in order to keep democracy itself from becoming a super-spreader event. As the pandemic receded, some states and jurisdictions eliminated these explicitly temporary, one-time adjustments to an emergency. For instance, Texas rescinded Covid-19 voting policies like 24-hour drive-through voting. This move was pilloried as Jim Crow. Georgia tightened its voting procedures, though many of them were still more generous than they had been prior to the pandemic and remained more generous than laws in New York state, Delaware, and other reliably Democratic strongholds. President Biden also unhelpfully insisted that this was the return of Jim Crow. We are at a loss to understand how a return to the procedures that allowed Barack Obama, the first black president, to win two impressive victories can be described as “Jim Crow.”
Two historic wrongs cloud our minds in the contemporary discussion of democracy. The first is the aforementioned and execrable chapter of Jim Crow, beginning after the ratification of the 15th Amendment, when states visited a great number of evils on African-American citizens, including deliberate efforts to prevent black citizens from exercising their right to vote. This was an indefensible practice morally, constitutionally, and in every other meaningful regard. However, it should be noted that as heinous as Jim Crow-era restrictions on voting were, they were far from the most repugnant things about Jim Crow. State-sanctioned lynching, for instance, is a far greater evil than restricting access to the ballot box. The evil of Jim Crow-era election practices stemmed from the fact that the white majority was trying to prevent blacks from voting for changes to the far greater systemic evils. In other words, voting restrictions were a means, not an end. The end was apartheid. The means of sustaining it were poll taxes, literacy tests, and more. Thus, when people refer to things like voter ID requirements or reasonable limits to early voting as “Jim Crow,” they are confusing the epiphenomenon for the phenomenon. Nobody is trying to reimpose separate water fountains or “back of the bus” requirements for blacks. Interracial marriage is here to stay. Pretending otherwise is not merely a form of disinformation; it exacerbates racial tensions and ignores the massive racial progress this country has made.
The second historic wrong is more recent. President Donald Trump and his worst enablers tried to steal an election and, in the process, injected a flood of poisonous paranoia about elections into our politics.
Jim Crow was surely the more morally repugnant of the two, but the recency of the January 6 riot and the enduring presence of Trump and his supporters render the two somewhat equivalent in their ability to distort contemporary debates about voting. Like magnets next to a compass, they make it very difficult to find a “true north” in such discussions.
So, if readers can detach themselves from such real-world distractions, we would like to ponder the question solely in the abstract, as a matter of principle. Why should voting always be easier? This is the default position of many professional election reformers. One can agree with the common refrain, “everyone who wants to vote should be able to,” without agreeing to the corollary: “We should make voting as convenient as possible.”
For many, voting is the gateway to civic commitment. But why? Is it outlandish to think that voting should be closer to the culmination of civic commitment? Perhaps citizenship requires more than merely an opinion. Perhaps it requires an informed opinion. We all have an equal right to vote, but that doesn’t mean everyone puts equal thought or effort into voting.
This need not be an argument for citizenship tests or anything of the sort (though it would be nice if citizens could pass the same test that immigrants have to pass to gain citizenship). Part of being informed is simply paying attention. Yet excessively long early voting periods often lead to voters casting ballots long before all of the pertinent information is available.
In the final months of the 2016 election, there were a number of bombshell revelations about the Clinton Foundation and about the FBI’s investigation of Hillary Clinton’s handling of classified emails as Secretary of State. NBC News also revealed a tape of Donald Trump boasting about sexually assaulting women. In its wake, nearly a dozen women came forward describing treatment that closely tracked the reprehensible behavior Trump himself had described in an unguarded moment.
These, and other revelations, dropped well after voting had already begun in many states. Partisans can debate the significance of these disclosures and events. But for perhaps millions of voters this new information didn’t matter, because they had already cast their ballots. In 2020, by the time California’s primary took place, nearly half the presidential candidates on the ballot had dropped out, effectively nullifying the ballots of anyone who voted for them. But it’s even worse than mere vote nullification. Let’s say you’re a very progressive voter and on the first day of early voting you cast your ballot for the most progressive candidate—or simply the most progressive candidate you deem electable in the general election. But then your candidate drops out or no longer seems electable. If you still had your vote, you could support the next most progressive candidate. But now, the pool of progressive voters is smaller, effectively empowering the least progressive candidate.
Moreover, long periods of early voting have the effect of changing how candidates run. As early votes are banked, candidates look for ways to attract the votes of undecided or poorly informed voters. This is smart tactically, but we are at a loss as to how this is of much value for the country or its policies.
Every journalist and academic (presumably the bulk of people reading this report) understands that deadlines focus the mind. Universal deadlines guarantee that everyone is working from the same pool of available information. And an actual Election Day is an ideal way to guarantee that those who most care about the election will vote. Largely for political reasons— but also because of changes in our culture and economy— restricting elections solely to a single day is unfeasible and undesirable. But many of those legitimate concerns could be remedied by making Election Day a national holiday or creating an Election Weekend. Some absentee and early voting is entirely justified as well.
People with disabilities, servicemen and servicewomen and other citizens living abroad have legitimate needs.
Still, the case for meaningful voting deadlines, with tighter rules and timeframes for early voting, is not in principle discriminatory against anyone save those who don’t care very much about voting in the first place. Making voting a little more inconvenient rewards those who take their votes seriously enough to tolerate the inconvenience.
Proposals like online voting have enormous and fatal practical, political, and technological flaws, but they also have a philosophical one. Is it really true that our democracy desperately needs input from voters who wouldn’t vote if they couldn’t do it on their phone during a commercial break? Do we really want politicians pandering to such voters?
Elections are a vital component of democracy, and all reasonable and patriotic Americans should agree upon this basic proposition. But one of the ways we have enforced and promoted this fundamental ideal is through the civic ritual of in-person voting on Election Day. It would be a perversion to claim that maintaining the importance and relevance of Election Day is somehow undemocratic, just as it would be perverse to suggest that making voting so convenient it can be squeezed into one’s schedule while waiting for the barista to finish one’s order would be a glorious celebration of democracy.
Sarah Isgur is co-host of the Advisory Opinions podcast for The Dispatch, a contributor for Politico Magazine and ABC News, and teaches at George Washington University’s School of Media and Public Affairs. David French is co-host of the Advisory Opinions podcast for The Dispatch, a columnist for The Atlantic, and the author of Divided We Fall: America’s Secession Threat and How to Restore Our Nation. Jonah Goldberg is the editor-in-chief and co-founder of The Dispatch, host of the Remnant podcast, and the author of three New York Times bestsellers.