Rowan County, Kentucky, has a population of just over 23,000, according to the 2010 census. Therefore, according to Kentucky law, the county clerk of Rowan County, with credit for years served as deputy county clerk, earns a salary of $87,997.
Jean Bailey served as Rowan County clerk from 1978 to 2015, and for 26 years she employed her daughter, Kim Davis, in the clerk’s office. When Ms. Bailey decided to retire, her daughter ran to succeed her. Ms. Davis survived an exceptionally tight Democratic primary race, winning by a margin of just 23 votes. She had only a slightly easier time in the general election, beating her Republican rival by 6.3 percentage points.
It is at least an interesting side note that one of the duties of a county clerk in Kentucky is to chair the county board of elections. The county clerk is also responsible for receiving “applications” for voter registration and entering them into the state’s database. County clerks are explicitly permitted to exercise their electoral responsibilities for elections in which they are candidates, and therefore presumably also for elections in which their daughters are candidates.
So Ms. Bailey employed her daughter as a deputy county clerk, then presided over the board of elections while her daughter ran to replace her. In a government that places any importance on ethics, both would be impermissible conflicts of interest.
During her election campaign, Ms. Davis told the Morehead News, “The public is my boss. As a deputy clerk …, being a public servant is ingrained in me and I want [to] continue providing the high level of customer service we do while treating people with respect, kindness, and helping them with whatever situation they have.” Bookmark that phrase: “treating people with respect, kindness, and helping them with whatever situation they have.”
After Ms. Davis won the election, she told the News, “My words can never express the appreciation but I promise to each and every one that I will be the very best working clerk that I can be and will be a good steward of their tax dollars and follow the statutes of this office to the letter.” Bookmark that phrase, too: “follow the statutes of this office to the letter.”
Ms. Davis took office on Monday, January 5, 2015. On June 26, 2015, the United States Supreme Court held that same-sex couples are constitutionally entitled to marriage licenses issued by the states. One of the duties of a county clerk in Kentucky is to issue marriage licenses.
Rather than comply with the Supreme Court decision, Ms. Davis stopped issuing all marriage licenses – the only Kentucky county clerk to do so. A federal judge ordered her to resume issuing marriage licenses, and an appellate court declined to intervene. Yesterday, the Supreme Court also declined to intervene, but today Ms. Davis continued to refuse to issue any marriage licenses.
Initially this morning, Ms. Davis left it to her staff to turn away same-sex couples seeking marriage licenses – she sat in her office with the blinds drawn. Eventually she came out of her office and asked the couples to leave. Someone said that he would not leave until he got his marriage license. “Then you’re going to have a long day,” Ms. Davis retorted. That is evidently what passes for “treating people with respect, kindness, and helping them with whatever situation they have” in Ms. Davis’s world.
Someone asked Ms. Davis under whose authority she was denying them marriage licenses. She responded, “Under God’s authority.”
The couples suing Ms. Davis immediately asked the federal court to find Ms. Davis in contempt of court, and a hearing on the motion was scheduled for Thursday.
Ms. Davis issued a statement this morning, saying in part, “I was elected by the people to serve as the County Clerk. I intend to continue to serve the people of Rowan County, but I cannot violate my conscience.” Evidently that is what passes for “follow the statutes of this office to the letter” in Ms. Davis’s world.
According to Ms. Davis’s attorneys, the act her conscience cannot abide is the placement of her name on a license for the marriage of two members of the same sex. A county clerk is also responsible, for instance, for recording deeds to property – we don’t yet know whether Ms. Davis’s conscience can abide recording the deed for a house to be occupied by a same-sex couple. A county clerk is responsible for printing election ballots – we don’t yet know whether Ms. Davis’s conscience can abide printing a ballot with, say, the name of a candidate who is married to a person of the same sex.
For each of seven government jobs I have held, I was required to swear an oath to abide by the Constitution of the United States. The oath provides no exception for religion or conscience. Ms. Davis swore a similar oath, and her oath included no exception for constitutional provisions she dislikes.
The point, of course, is that public officials are required to comply with the laws of the United States, and if they think the laws of God prohibit them from complying with the laws of the United States, they may resign and find other employment. There is no religious exception for the execution of public duties by a public servant, and there never has been.
Ms. Davis has failed to find a judge to say otherwise, even though she has gone up to the Supreme Court and back in the effort.
Vice President Joe Biden is thinking about running for president. Speaking purely statistically, it’s a long shot.
We’ve had 47 vice presidents, and, although the truism is that the vice president sits only a heartbeat from the presidency, only 14 of 47 have become president. Nine of those succeeded upon the death or resignation of the president, and of those nine, five were tossed out at the next election.
Only five vice presidents have succeeded to the presidency by election. Two of them don’t really count – John Adams and Thomas Jefferson – because they were elected vice president under the original constitutional scheme that made the presidential runner-up the vice president. One, Richard Nixon, didn’t succeed directly to the presidency, but only won election eight years after leaving office.
In other words, only two vice presidents have ever succeeded directly to the presidency by popular election – Martin Van Buren and George H. W. Bush. Neither, by the way, won a second term. Only one vice president in the history of the Republic has ever served two full terms as president – Thomas Jefferson.
One of the reasons that vice presidents have more trouble winning the presidency than it seems they should is that it’s very hard for one party to win three consecutive terms in the White House, whether its candidate is the sitting vice president or not. Just ask Al Gore.
Another reason is that vice presidents are selected very differently than presidents. Presidents withstand rigorous primary campaigns, and are the focus of voters’ choice on election day. Vice presidents are hand-picked by presidential nominees, subject only to relatively loose scrutiny by party conventions and general election voters. A successful vice presidential candidate is a national figure, but doesn’t necessarily have wide national appeal – he may just fill a gap in the presidential candidate’s popular appeal.
When Barack Obama won the primaries, he relied heavily on African-American, Latino, Asian-American, and liberal white voters. Joe Biden gave Obama credibility in the white working class, and Biden has continued to serve that function throughout the Obama presidency. To win the nomination, let alone the presidency, Biden has to considerably broaden his appeal.
But I don’t agree with the pundits who say it’s too late for Biden to enter the race. This morning the talking heads on MSNBC were wringing their hands about how Biden wouldn’t be able to make the requisite “splash” if he entered the race. Au contraire. Biden’s entry into the Democratic primary campaign would be its own splash.
A Hillary Clinton-Bernie Sanders-Joe Biden three-way race would be a lot of fun to watch. And at an average age over 70, they would surely make the oddest competition in history for the youth vote that energized the Obama campaign in 2008.
Dracunculus medinensis, otherwise known as guinea worm, is a parasite that apparently infects only humans and dogs. The host ingests the guinea worm larva by drinking water infested with tiny crustaceans called water fleas. Nothing happens immediately, except that the larva grows inside the body.
If the larva is female, it grows to more than two feet long. About a year after the larva is ingested, the adult guinea worm begins to leave the body to release its larvae, which are in turn eaten by water fleas, perpetuating the cycle.
The female guinea worm leaves the human body through the skin – first a painful blister appears as the worm migrates to the skin, then an ulcer forms as the worm begins its slow exit. Dracunculiasis, or guinea worm disease, is rarely fatal, but it is excruciatingly painful, and the ulcer can become infected, and some people have allergic reactions.
With exquisite timing, the guinea worm typically exits the body during planting or harvest season. The victim is debilitated, unable to work for as long as three months. The loss of a single income can devastate a family, and the loss of several productive herders and farmers – say, from an infested well – can bring hunger to an entire community.
In 1986, former President Jimmy Carter decided that the guinea worm should be eradicated. That year, there were 3.5 million cases of guinea worm disease in 20 countries from Africa to Pakistan. Carter deployed his estimable diplomatic abilities, enlisting the vigorous assistance of heads of state and former heads of state.
By 1991, with cases down to 400,000, the World Health Organization decided that Carter might have a point, and joined the eradication effort. In 2008, Carter wangled a grant from the Bill and Melinda Gates Foundation.
Carter has been at this now for 29 years. Given conditions in the countries where the disease is endemic, the obstacles were considerable, but Carter rose to each of them. In 1995, for instance, he had to negotiate a cease-fire in the civil war in Sudan to allow health workers a brief window to work there. As of May 31, there have been only five cases this year, down from millions of cases when Carter took up the fight. Aside from the untold human suffering Carter avoided, his effort constituted an important contribution to economic growth in the world’s poorest countries.
In addition to Carter’s guinea worm eradication effort, he has taken on river blindness, trachoma, schistosomiasis, lymphatic filariasis, and malaria.
Carter has done a wide range of other work. He helped Habitat for Humanity build housing; he has served as a respected neutral monitor for 96 elections in 38 countries; and he has provided mediation and conflict resolution services in most of the world’s hot spots, including Haiti, Bosnia, Ethiopia, North Korea, and Sudan, work that won him, in 2002, the Nobel Peace Prize that had eluded him after the Camp David accords.
Carter’s presidency is remembered as feckless, and that is not entirely unfair. He won election as the Washington outsider, appealing to an electorate exhausted by the Vietnam War and a series of scandals culminating with Watergate and capped off with President Gerald Ford’s pardon of former President Richard Nixon. But as a Washington outsider, Carter was unfamiliar with its workings, and he never really mastered the job.
He was also dogged by some remarkably bad luck – from hostage-taking in Iran and the botched military effort to rescue the hostages, to the energy crisis, to the Soviet invasion of Afghanistan. This is how unlucky President Carter was: he is to this day the only president in American history to serve a full term but get to appoint no Supreme Court justices.
But President Carter also did some great and far-sighted things. He appointed unprecedented numbers of women and minorities to federal positions, including lifetime-tenured judges; he personally brokered the Camp David accords; he negotiated the return of the Panama Canal Zone to Panama; he made human rights a cornerstone of American foreign policy, which earned him the condescending scorn of the American realpolitik right. Also, Jimmy Carter nurtured relationships with African countries, probably contributing to the wave of democracy movements across that continent a decade after Carter left office, and certainly facilitating Carter’s disease-fighting efforts ever since.
At 90 years old, Jimmy Carter has been an ex-president for more than 34 years – making Carter’s ex-presidency the longest in American history. Carter beats out Herbert Hoover (31 years), Gerald Ford (29 years) and John Adams (25 years). Most ex-presidents make productive use of their time after serving in the country’s highest office, but some more than others.
I respectfully submit that Jimmy Carter is, hands down, America’s greatest ex-president.
As much as Americans are drawn to presidential candidates who run as “Washington outsiders,” we strongly prefer presidents with experience in government and public service. Of the 43 men who have served as president, all but five had prior electoral experience – two came to the presidency by way of appointive government positions (William Howard Taft and Herbert Hoover) and three came by way of military commands (Zachary Taylor, Ulysses Grant, and Dwight Eisenhower).
The nine men who came to the presidency by succession from the vice presidency obviously all brought prior electoral experience to the presidency. But vice presidents are even more likely to have prior electoral experience than presidents: 44 out of 47, excluding only Chester Arthur, Charles Dawes and Henry Wallace, all of whom held appointive federal positions before being elected vice president.
It may or may not be meaningful that four of the five presidents who had no prior electoral experience were Republicans – all but Taylor, who was a Whig. Today there are 17 Republicans running for president, counting former Virginia Governor Jim Gilmore, who has announced that he will announce his candidacy this week. Three of the candidates have no prior electoral, military or other government experience: Donald Trump, Ben Carson and Carly Fiorina.
Both Trump and Fiorina made their careers in business – he as a self-promoting entrepreneur, she as a corporate executive at AT&T, Lucent and Hewlett-Packard. Trump has made a side career as a political agitator, threatening to run for president in 1988, 2004 and 2012, and for governor of New York in 2006 and 2014. Trump did run for president briefly in 2000, as a candidate for the nomination of the Reform Party, and he actually won the California primary of the Reform Party. The nomination ultimately went to Pat Buchanan, whom Trump compared to Attila the Hun.
Like Trump, Fiorina has no governmental experience, but she does have a history of public service. She served briefly on the board of the World Economic Forum, an international advocate of public-private cooperation in problem-solving, and she launched the One Woman Initiative to foster public-private empowerment initiatives in Muslim-majority countries. Fiorina served for a few years on the board of visitors of James Madison University, in Virginia. She chairs Good360, a non-profit that organizes corporate donations of excess inventory to charities. She served as chair of the CIA’s unpaid External Advisory Board under President George W. Bush.
Carson’s career was in medicine, specifically in neurosurgery. He retired on July 1, 2013, without any real demonstration of interest in politics, public policy, or government. By his own account, he didn’t even belong to a political party until November 4, 2014. Somewhere along the way, the idea hit Carson that he should be the president of the United States, but it’s not at all clear where the idea came from.
None of these three is going to succeed Barack Obama in the White House. As of now, it looks like Fiorina won’t even make it into the debates – her polling is running below one percent, down at the bottom with former New York Governor George Pataki and Louisiana Governor Bobby Jindal. Carson will probably get into the debates, with poll numbers around six percent, in the league of former Arkansas Governor Mike Huckabee and Senator Rand Paul. But Carson’s debate performance is likely to be uninspiring, at best – his television interviews to date reveal the amateur that he is.
Nor will Trump win the Republican nomination, even though he’s leading – for two reasons. First, he doesn’t want it. His campaign is a business campaign, not a political one, and his goals are financial, not electoral. Trump hasn’t got a public-spirited bone in his body. And second, for all that the Republican far-right has come to dominate Republican debate in recent years, Republicans have yet to nominate a full-on loony to run for president – although they’ve had opportunities – and I don’t think they will this time, either.
Wisconsin Governor Scott Walker and former Florida Governor Jeb Bush are already standing out as the grown-ups in the room, and they are polling in second and third place behind Trump. Trump’s problem is that his unfavorable ratings are in historically high territory, which means he has a low ceiling. Trump is in first place with 17 candidates in the field, at 20 percent. As candidates drop out, the bulk of their support will go to candidates other than Trump, so his 20 percent will grow more slowly than Walker’s 14 percent or Bush’s 12 percent.
Still, Donald Trump gives us a note of suspense and intrigue. Trump doesn’t lose, he quits. So the question is, where is Trump’s off-ramp? Under what guise will Trump leave the presidential campaign?
When I started law school in 1978, a gay law graduate’s legal right to admission to the bar was recognized in just two states – New York, fortunately for me, being one of them. In 48 states and the District of Columbia, it was still at least theoretically possible to deny bar admission to lesbians and gay men on the ground that homosexuality demonstrated lack of “good moral character,” one of the criteria for admission.
Indeed at that time, only the District of Columbia had a law on the books prohibiting employment discrimination against gay people. (Pennsylvania prohibited anti-gay discrimination only in state employment.) So even once a gay law student graduated and was admitted to the bar, she generally had no legal right to actually get a job.
Furthermore, being gay in those days meant that a person was likely to be a criminal, because gay sex was then a crime in almost every state, including New York. For my first summer internship, in 1979, I worked for the Lambda Legal Defense & Education Fund, a gay rights organization then led by a lawyer named Margot Karle. One of the cases I worked on that summer was People v. Onofre.
Ronald Onofre had a relationship with another man. At some point, the other man decided that sex with men was wrong, but that his involvement with Mr. Onofre could be excused if he hadn’t consented. He submitted a rape complaint to the police, and Mr. Onofre was arrested.
As luck would have it, the couple had taken some photographs of themselves. I personally never saw the photos, but, as Ms. Karle put it, the photos left no doubt that the relationship had been consensual. (I’ve always imagined it was the smiles on their faces.) But instead of charging the complainant with perjury or with filing a false complaint, the police charged Mr. Onofre with consensual sodomy, which was then a crime under section 130.38 of New York state’s penal law.
Mr. Onofre challenged the constitutionality of that law. He lost in the Onondaga County trial court, and it was his appeal that I worked on in the summer of 1979. Mr. Onofre won his appeal, and he also won the prosecution’s further appeal to New York’s top court, the Court of Appeals, in 1980.
As it happens, my legal career took me to the New York state attorney general’s office in the mid-1980s. I was assigned to a division that defended state agencies and officials in litigation. But prestige in the office came from the progressive, reform-minded litigation the office did under then-Attorney General Robert Abrams. So lawyers in my division were encouraged to keep a lookout for litigation ideas that would shed glory on our division’s managers.
In 1985, the U.S. Supreme Court agreed to decide whether private, consensual sodomy could constitutionally be criminalized. I proposed that Attorney General Abrams file a friend of the court brief on behalf of Michael Hardwick, a Georgia man who had been arrested in his own bedroom on charges of consensual sodomy, an offense then punishable under Georgia law by imprisonment for up to 20 years.
We wrote and submitted our brief, but in a 5 – 4 decision issued 29 years ago next week, Justice Byron White rejected the assertion of a man’s constitutional right to engage in private, consensual sex with another man. Justice White called the claim “facetious.” To be “facetious” is to treat a serious subject with deliberately inappropriate humor – to be flippant. In other words, Justice White was saying that the claim of constitutional right was not even sincerely asserted; the whole case was a sick joke. That was in 1986.
I have noted that, especially with important social issues, the Supreme Court often gets it wrong before it gets it right. Not until 2003 did the Supreme Court get it right – in Lawrence v. Texas, when Justice Anthony Kennedy wrote a 6 – 3 decision finding the constitutional claim not only not “facetious,” but in fact correct. Chief Justice William Rehnquist and Justices Antonin Scalia and Clarence Thomas bitterly dissented. It was very important to them that states retain the legal ability to throw gay people in jail for having sex in private.
By 2003, the question of same-sex marriage had developed into a serious national issue. In 1993, the Hawaii Supreme Court ruled that the exclusion of same-sex couples from legal marriage could be justified only by a “compelling” state interest under the judicial standard of “strict scrutiny.” The court sent the case back to the trial court to determine whether the state’s justification could survive “strict scrutiny.”
The Hawaii Supreme Court did not then, and never did, find a legal right for same-sex couples to marry. But the mere possibility that it might sent anti-gay forces into national paroxysms, resulting in the infamous Defense of Marriage Act, passed by veto-proof majorities in both houses of Congress, and signed into law by President Bill Clinton in 1996. DOMA precluded the federal government from honoring any same-sex marriage, and authorized states to refuse to honor any same-sex marriage. Mini-DOMAs proliferated across the states in a panicked defense against the same-sex couples perceived to be facetiously massing at the gates.
When Justice Kennedy wrote the 6 – 3 majority opinion in Lawrence v. Texas, he had already written a pro-gay decision in Romer v. Evans. That case involved a provision in the Colorado constitution, adopted by popular referendum, that precluded municipalities from enacting any law that would give protection to gay people against discrimination. Justice Kennedy wrote for a 6 – 3 majority in that case, concluding that Colorado’s constitution uniquely burdened gay people’s ability to change the law through political activism, that the provision could have been motivated only by “animus” toward gay people, and that a “desire to harm a politically unpopular group cannot constitute a legitimate governmental interest.”
The Romer v. Evans case was something of an outlier; the Colorado constitutional provision involved was so far out there that Justice Kennedy’s decision was not taken as an indication of any greater interest in the constitutional rights of gay people. But when Justice Kennedy – a lifelong Republican and a Ronald Reagan appointee – wrote the majority decision in Lawrence v. Texas, attention was paid.
At that point the path to constitutional recognition of same-sex marriage started to become clear, and Justice Kennedy was the North Star by which the path was navigated. Challenges to DOMA were argued in terms intended to appeal to Justice Kennedy’s rationale in Romer: that uniquely burdening the interests of gay people to obtain legal protections through political activity is constitutionally impermissible.
And sure enough, in 2013, Justice Kennedy wrote the decision by which the Supreme Court, 5 – 4, struck down the DOMA provision that barred the federal government from recognizing same-sex marriages. Justice Scalia famously predicted that constitutional recognition of same-sex marriage would soon and surely follow, and, of course, he was right. Just two years later, Justice Kennedy has written another 5 – 4 majority decision concluding that the United States Constitution precludes the states from excluding same-sex couples from legal marriage.
There was a certain logic to the Court deciding that private, consensual gay sex is constitutionally protected before deciding that same-sex marriage is constitutionally protected: it would be a little odd to say that a couple has a constitutional right to get married but not a constitutional right, as they used to say, to consummate the marriage. But even today, the federal government and 18 states have no laws precluding employment discrimination against gay people. Five more states have laws prohibiting anti-gay employment discrimination only in state employment. Only 17 states prohibit housing discrimination against gay people.
In the short span of my legal career, in just the last 37 years, gay people have made remarkable gains, and I think the rapid growth in popular acceptance of same-sex marriage stands as an excellent proxy measure of those gains.
But there remains work to do. Although we now enjoy a nationally recognized legal right to get married and have sex, in much of the country we have no legally recognized right to obtain housing or employment. We can count on conservatives to push hard to wall off the rights we have won, as they are doing with laws to allow public servants to deny marriage licenses to those whose marriages offend their religious sensibilities – that is, same-sex couples.
Yesterday’s victory was enormously important, but the battle is not yet won.
Charleston, South Carolina, may never lose its association with slavery and rebellion. Antebellum South Carolina was the foremost advocate of nullification, and, with a majority of its population enslaved by a white minority, was perhaps the state most economically dependent on slavery. South Carolina was the first state to secede from the Union.
Charleston was one of the commercial and cultural capitals of the Old South. The Civil War began in Charleston, with the bombardment of Fort Sumter. Charleston, like South Carolina as a whole, remained at the forefront of segregation and racial oppression for nearly a century after Reconstruction.
But something has happened in Charleston in recent decades, something many of us Northerners haven’t fully appreciated: Charleston has dramatically parted ways with the rest of its state.
After emancipation, many freed slaves fled the countryside for the relative protection of Charleston. Charleston’s population became as much as three-quarters black, and remained about half African-American until just a few decades ago. These demographics profoundly influenced Charleston’s politics in both pre-Civil Rights and post-Civil Rights eras: first the white minority ruthlessly suppressed African-American votes; more recently African-Americans became an essential voting bloc.
South Carolina was governed by Democrats from Reconstruction until 1975, when the state elected the first of five Republican governors. Charleston has not had a Republican mayor since Reconstruction. In other words, as Southern conservatives moved from the Democratic to the Republican party, Charlestonians moved from being conservative to being modern Democrats.
The current mayor of Charleston, Joseph Riley, Jr., took office in 1975. In the characterization of the New York Times, Riley “has been a singular political phenomenon, a white Southern progressive whose sympathy for black causes early in his career prompted conservative whites to derisively call him L.B.J., for Little Black Joe.” In 2000, Riley helped lead the protests against the display of the Confederate flag above the state capitol in Columbia. He joined New York Mayor Michael Bloomberg’s Mayors Against Illegal Guns.
Mayor Riley is credited with reviving the city’s economy, and Charleston today is culturally and politically all but unrecognizable from its Civil War-era past. In 2006, while South Carolinians as a whole imposed a ban on same-sex marriage by a 56-point margin, a majority of Charlestonians opposed the ban. Today, although Charleston’s black population is down to about a quarter of the city, five of 12 City Council districts are represented by African-Americans. (By comparison, in New York City, which like Charleston is about 25 percent African-American, 11 of 51 City Council seats are held by African-Americans.)
After this week’s racist mass murder in Charleston’s Emanuel African Methodist Episcopal Church, Mayor Riley has been clear-eyed and resolute. While the right-wing has decided that the shooting was an attack on Christians and religious freedom, Riley has decried gun proliferation and worked for racial unity.
Jon Stewart was so disturbed by the attack that he devoted his customary introductory monologue to it, suspending comedy for the night. He referred to streets named for Confederate generals, and to the Confederate battle flag that still flies on the capitol grounds in Columbia, and he called them “racial wallpaper” – part of the American background that gains little explicit notice but subtly colors people’s moods.
We used to post signs that said things like “Whites only.” We got rid of those signs, but our national hallways are still wallpapered with Confederate flags and busts of Confederate generals, and the message remains the same: African-Americans need not apply; don’t even try; you do not belong. Daily life in America is rife with little messages that African-Americans do not belong. Those messages are part of American culture; they are our racial wallpaper.
Confederate nostalgia is a pretense. A Confederate flag bumper sticker certainly identifies the car’s owner as white; African-Americans do not frame portraits of Stonewall Jackson in their homes. Confederate nostalgia is a means to suppress African-Americans by denying them a sense of belonging and is therefore a means to preserve white supremacy – just as clearly as, if less violently than, the marauding terrorism of the Reconstruction era Ku Klux Klan.
This week’s mass murder of nine African-Americans by a white man who was given their courtly Southern hospitality is merely one more demonstration that America will not know peace until our walls are stripped clean of our racial wallpaper.