Showdown in Jamestown

The United States of America is one of the most ethnically diverse countries in the world. From the moment the rest of the world learned of it, this land drew people from every corner of the globe. The New World offered a new life for many, and so people set sail to stake out their piece of ground. Rulers sent their explorers westward to claim land and riches. The attention the entire world paid to this section of North America led to settlements by many countries, and, combined with the rights and freedoms granted by the United States many years later, created the melting pot we experience today.

As a result, the history of this land is marked by difficulties in race relations. This week, we mark the 395th anniversary of one of the most infamous events in colonial America—the Jamestown Massacre.

The Virginia Company of London was founded in 1606 with the purpose of creating settlements along the mid-Atlantic coast of what is now Virginia, Washington, D.C., Delaware, and New Jersey. A sister company, the Virginia Company of Plymouth, was founded with a similar objective but a more northern territory.

The early settlers founded Jamestown in 1607 in present-day Virginia. From the beginning, Jamestown leadership under Captain John Smith struggled to build positive relationships with the natives, while also struggling to survive. The elements, the lack of resources and cooperation, and native relationships put the settlement in a precarious situation.

About five years in, settler John Rolfe succeeded in cultivating a new strain of tobacco, allowing Jamestown to center its economy on agriculture. The success allowed the population to grow, which required more land. The downside to tobacco was the damage it caused to the soil, which called for still more land.

The borders of Jamestown began to expand, and all of this was watched with a suspicious and hostile eye by the neighboring Powhatans. The English sought to tear down forests to claim more farmland. The Powhatans wanted the woods preserved for hunting. Of additional concern were the colonists’ attempts to educate and “civilize” the natives.

For years, Chief Powhatan had attempted to keep peace with the settlers, as his daughter Pocahontas was married to Rolfe, but by the latter part of the decade, Powhatan had been replaced as leader by his brothers Opechancanough and Itoyatan. The pair was not as interested in peace.

On March 22, 1622, the Powhatan Indians, under the leadership of Opechancanough, attacked Jamestown, slaughtering 347 people, destroying crops, and stealing supplies. The motive may not have been the complete extermination of the English presence, but rather the sending of a message about the settlement’s ever-expanding borders. Notably, a Powhatan boy living in the settlement warned the townspeople of the impending attack. Some scholars believe that Opechancanough himself sent the messenger.

Many of the English settlers, rather than being deterred from expanding, felt justified in their actions. When King James learned of the attack a few months later, supplies and reinforcements were sent, and the settlers conducted attacks of their own on the Powhatans. They would feign peace during the growing season and, once the Powhatan corn was ready for harvest, set the fields ablaze.

In 1624, partly in response to the warring, The Virginia Company of London was dissolved and Jamestown was placed under royal control. The conflicts, combined with disease, saw the population of the Powhatans plummet from 25,000 in 1607 to a few thousand in just over 20 years.

A final attack was planned in 1644 by a now-elderly Opechancanough. It resulted in the deaths of over 400 English settlers but also spurred a two-year war that left Opechancanough dead in a Jamestown jail. The dominance of the English settlers in the region was cemented.

  1. Jamestown: Legacy of the Massacre of 1622. (n.d.). Retrieved March 17, 2017, from
  2. Virtual Jamestown. (n.d.). Retrieved March 17, 2017, from

Brown rice, white rice, wrong rice, right rice

For many people living in first-world countries today, and especially in America, health has become a major concern. Despite rising life expectancy, concerns such as obesity, cancer, heart disease, diabetes, and depression are bogging down our people and our healthcare system. One reason is that developments in medicine have been aimed at treating symptoms. Curing the root problems often comes down to a lifestyle change in the departments of diet and exercise, which no pill can deliver. We all know this. It’s simple, but difficult to do.

But it’s not only about the amount of food we eat, but the kind of food we eat. The ever-increasing reliance upon quick meals and processed food is a major culprit in the breakdown of our bodies. Most of what we eat lists ingredients we can’t pronounce, much less understand. If simplicity is key, eating foods that you can actually describe makes more sense (or at least that’s what all those Netflix documentaries tell me). I can explain where a tomato comes from, especially when it’s grown in my garden. I have no idea how a Twinkie is made or what’s in it (other than deliciousness with an aftertaste of guilt). I understand an apple, but sausage is a mystery (and should remain that way, as the old saying goes). The less we do to take the nature out of our food, the better it is for us.

As with most things, the idea that natural foods are healthier is not a new concept. We’ve known about this problem for a long time.

In the late 1800s, Dutch colonial officials in the East Indies saw worker after worker come down with a similar illness. The afflicted grew extraordinarily weak and lost severe amounts of weight. It became painful for them to move their limbs, and some even died of heart failure. The disease was called beriberi, meaning “I cannot,” in a native tongue of the region. As the problem persisted, officials grew concerned and asked Robert Koch, the physician famous, like Louis Pasteur, for his pioneering work on germs, to investigate and find the germ wiping out their workforce. Koch, unable to commit the length of time such a task could entail, declined but referred a former student of his, Dr. Christiaan Eijkman, himself Dutch.

In 1886, Eijkman began his research using the methods any physician of the day would have used. Pasteur’s and Koch’s research served as his guide, and so he began looking for germs. If the beriberi bacterium could be located, it could be eradicated. Armed with a microscope and syringes, Eijkman drew samples of blood and examined perspiration, water, and air. He found nothing.

Eijkman decided to use the chickens found at the worksites as a way to experiment and control variables. He injected them with the blood of infected workers, but to no avail. Suddenly and mysteriously, the chickens began displaying physical symptoms of beriberi. Their wings went limp and they struggled to survive. And soon, without explanation, their symptoms disappeared. Eijkman turned his attention to their diet. Their caretaker was questioned. Typically, the chickens were fed the less-expensive, less flavorful brown rice. But there was an exception. Some chickens had been given leftover rations from the workers which consisted of the polished, or white, rice. The stuff the people ate. Once a superior noticed what was happening, the caretaker was ordered to stop feeding the chickens the white rice, as it was reserved only for people.

Eijkman had found his root problem. Something about the white rice was leaving people sick. The white rice, processed to improve palatability and increase shelf life, was worse for them than the brown counterpart. Further research found that a deficiency of thiamine, which is removed in the creation of white rice from brown, was the cause of beriberi.

It would take years of study to fully understand the role vitamins play in our health. But it all started with this episode. Eijkman, along with Sir Frederick Hopkins, shared the 1929 Nobel Prize in Physiology or Medicine for their groundbreaking discoveries on vitamins.

So even in the 19th century, we knew that processing our food with a focus on shelf life and flavor was harmful. Sometimes it just takes a while for lessons to sink in. And some never do.

  1. Tiner, J. H. (2006). Exploring the history of medicine: from the ancient physicians of Pharaoh to genetic engineering. Green Forest, AR: Master Books.

Stroke of the pen, law of the land, kinda cool: Executive orders and the American Presidency

In his first couple of weeks in office, President Donald Trump has taken a lot of criticism for his executive orders, both for their content and for their number. He has used his pen to address the Affordable Care Act, immigration, the U.S.–Mexico border wall, and government regulation—all issues upon which he campaigned. Trump is certainly not acting outside the norm in his use of this executive function. Such orders have been around since George Washington, but like much else since the 18th century, this use of power has grown.

Nowhere in the United States Constitution is the president explicitly granted the authority to issue executive orders. But as the executive in charge of administering national laws, every president has written them to direct agencies in the “faithful execution” of those laws. An order must cite the law granting the president the authority to issue it.

The executive order is closely tied to the presidential memorandum and the proclamation. The distinction is typically that an executive order sets government-wide policy, a memorandum directs a specific department secretary to take a specific action, and a proclamation tends to be ceremonial (e.g., Red Ribbon Week), though Abraham Lincoln once used a proclamation to famous effect.

In 1907, the Department of State began numbering executive orders, starting retroactively with one issued by President Lincoln in 1862. In 1936, the Federal Register Act created a more stringent process for documenting and numbering the orders. Still, orders pop up that had gone undocumented or unnumbered, in which case they are given a letter appended to an existing order number.

The American Presidency Project at the University of California – Santa Barbara has posted a table showing the number of orders by president. George Washington did in fact use the executive order, but sparingly. In fact, no president averaged more than one executive order per year until our seventh president, Andrew Jackson, wrote two per year. Martin Van Buren followed him, issuing 12 in his lone term in office. John Tyler averaged four per year. And so the use of the executive order climbed. But compared to today, this looks like the presidential equivalent of baseball’s dead ball era before Babe Ruth burst onto the scene.

So who was the executive-order-issuing Babe Ruth? It appears to be someone who shared Ruth’s penchant for “carrying a big stick.”

The Civil War and Reconstruction period saw a major rise in the use of the order. Lincoln issued 48 in his four years in office from 1861-65, an average of 12 per year. Franklin Pierce, from 1853-1857, had averaged nine. Following Lincoln, Andrew Johnson wrote 79 in just under four years in office, dramatically increasing the use of the power.

These figures steadily rose until William McKinley was averaging 41 per year. In 1901, his assassination vaulted Theodore Roosevelt into the big chair. A man of action, Roosevelt used the executive order to push his agenda. Whereas no one had ever written more than 217 (Ulysses Grant), Roosevelt issued 1,081 in his 7.5 years in office. Babe Ruth was the man who built Yankee Stadium and Roosevelt was the man who built the West Wing. Fairly symbolic, I believe.

From that point through World War II, the executive order increased in frequency. Woodrow Wilson issued over 1,800. Even “do-nothing” President Calvin Coolidge issued about 1,200 in five and a half years in office.

Franklin Roosevelt was elected president in 1932, during the Great Depression. A tool he used frequently to attempt to curb the financial collapse was the executive order. He remained in office for 12 years and wrote 3,721 orders, an average of 307 per year. Since that time, no president has come close. Using the baseball analogy, FDR was the steroid era.

Truman issued over 100 per year, but no president since has averaged more than 80 (Jimmy Carter). Reagan averaged 48, Bush 42, Clinton 46, Bush 36, and Obama 35.

So what Trump is doing has clear historical precedent. Obama issued a number of orders and memoranda in his first few weeks in office, but then slowed down. The same could be true for Trump. I believe presidents use orders frequently right after their inauguration because they have been elected on campaign promises, and, given the gridlock in Congress and the focus on the first 100 days, they know they don’t have time to wait to take the action voters are looking for.

But in my mind, their desire to “get things done” and our similar demand that they act accordingly has had negative consequences. Amidst much fear and panic in the face of the Great Depression, President Franklin Roosevelt was able to dramatically and, in many ways, permanently increase the size of government and the power of the presidency. The precedent has been set for a president to make sweeping changes in the name of the common good with a signature. Are we sure that’s really in our collective best interest?

  1. Executive Orders. (n.d.). Retrieved February 03, 2017, from

Trump, Truman, and truth in media

Donald Trump did it. He won. He overcame the Clinton political machine and much of the media. He overcame almost every published poll. And now, in January, he will become the 45th President of the United States.

A couple of friends and I were discussing the election in the days leading up to the vote. We all assumed Clinton would win, but I made the comment that if anyone could topple widely accepted polling practices and algorithms, it was Donald Trump. This wasn’t a case of genius political prognostication—it was an easy observation that he was running his campaign in an unorthodox way. And I added that maybe this could be our generation’s “Dewey Defeats Truman” moment.

In 1948, after assuming the duties of the presidency following Franklin Delano Roosevelt’s death in 1945, Democrat Harry Truman faced an uncertain re-election campaign. Facing criticism from both his political opponents and his own party, Truman was urged to quit by the liberal magazine The New Republic. Southern Democrats were not pleased with his civil rights program. Some saw the former haberdasher with the folksy delivery as unqualified and underprepared. Others simply didn’t like him because he wasn’t FDR.

But Truman patched together enough support from diverse factions of the Democratic Party to run for re-election. Republicans nominated Thomas Dewey, governor of New York.

Truman’s primary mode of campaigning was a 31,000-mile whistle-stop train tour of the country, visiting towns large and small. The electricity of his rallies was undeniable, except to those who never saw them. After a supporter in Harrisburg, Ill., yelled “Give ‘em hell, Harry!” the phrase became a rallying cry amongst his supporters. Truman had a magnetism, but no one from the national media was taking notice.

A poll released just after Labor Day showed Dewey leading 44.3 percent to 31.4. Internally, some of Dewey’s team urged the candidate to come out “slugging,” but they were drowned out by those preferring a campaign above the fray. Why should Dewey change a strategy that was proving so effective?

Truman, despite the polls, was confident.

On the eve of the election, the Chicago Tribune published a story predicting Dewey would win in a landslide, tallying over 400 electoral votes. As early returns came in, the same newspaper, rushed to get its early edition to press, ran the now-infamous headline: DEWEY DEFEATS TRUMAN.

The problem, of course, was that he didn’t. It was Truman who claimed 303 electoral votes to Thomas Dewey’s 189. Truman also won the popular vote by more than two million. And he shocked a nation. Well, at least, the national media.

Leslie Biffle, a Democratic Party official, was perplexed by the polls showing the nation’s dislike for Truman. So he loaded up a pickup truck and, posing as a chicken buyer, drove from farm to farm discussing the election. He became convinced that Truman would win.

The rural vote, accessed by the whistle-stop tour, was the key to Truman’s win. It was also the vote that was ignored by polling methods. Those methods relied upon contacting voters using landline phones, silencing rural residents, who did not own phones at the same rate as urban dwellers. But the rural voice was certainly heard on Election Day.

Information is vital to a democracy, which is the primary reason for the protection of the press in our country. In a nation as large and complex as ours, we must rely on media, but examples like these two elections (in addition to a plethora of others) show the vulnerability of the institution. I highly doubt Donald Trump overcame a 12-point deficit in the final two weeks of the campaign; rather, the reported 12-point deficit never existed. These errors give me pause when I consider a whole host of other reported “facts,” from President Obama’s approval rating to Trump’s surprise at the scope of the president’s job to the contents of Clinton’s emails. Or what any of the candidates’ tax plans will really do to the economy. I never feel confident that I can, with great authority, discuss the certainty of any of these things.

We’re all humans and therefore prone to error, bias, deceit, and downright conspiracy. Even if we’re objective journalists. But we, as engaged citizens of a (somewhat) democratic process, must demand better. And if we are going to uphold and value the institution of the press, it must do the same. News has become a product that needs to be sold and therefore is presented as stories and narratives. Stories and narratives, to be done well, require themes, heroes, villains, and the like, which the facts don’t always provide.

I feel like Sgt. Joe Friday of Dragnet, imploring the media to report “Just the facts, ma’am.” But, as I found out recently, even that phrase isn’t quite the one actually uttered by Jack Webb; it’s “All we want are the facts, ma’am.” So maybe I’m more like the dispatcher from another classic police show, but instead of asking for Car 54, I’m calling out “Truth, where are you?”

  1. Grossman, R. (2016). It’s happened before: Truman’s defeat of Dewey had hints of Trump-Clinton. Retrieved November 18, 2016, from
  2. Truman defeats Dewey. (n.d.). Retrieved November 18, 2016, from
  3. Snopes. (2010). Dragnet ‘Just the Facts’ Retrieved November 18, 2016, from

Rock rivals: Robbins, Harding, and a battle for supremacy in Yosemite Valley


Rivalries in competition serve to draw the battle lines and drive the storylines. Especially the mano a mano type. Think Larry Bird vs. Magic Johnson. Think Muhammad Ali and Joe Frazier. Arnie (RIP) and Jack. Royal Robbins and Warren Harding.

Wait, who? Not the president. Yosemite rock climbers. Yes, even rock climbing has legendary rivalries.

The 1950s saw the rise of the middle class. World War II had served as a boon to the economy and corporations were paying well. People were earning higher incomes, expanding home and auto ownership in the country. Vacations, once something only the elite experienced, were now something the working class could enjoy, as well.

The National Park Service offered America’s great lands to this demographic. Yosemite catered to the middle-class white family looking for a taste of restrained, censored wildness. Restaurants and cozy accommodations offered families an escape from the suburbs without the hassle of leaving behind modern conveniences.

But there was culture clash emerging at this time, too, and Yosemite would serve as a battleground.

In contrast to the content, picket-fence-owning middle class, this period also saw the rise of the beat generation. The beatniks were disillusioned young people in search of their souls, focused more on travel, the arts, and experience than on securing stability. Some hung out in coffeehouses and some rode waves and caught rays on the beach. And, as told in the documentary Valley Uprising, some beatniks near San Francisco and Los Angeles were eschewing the rungs of the corporate ladder to climb the faces of giant rocks.

Eventually these bands of self-proclaimed “dirtbags” made their way to Yosemite, which became their Mecca. Yosemite is home to two famous granite formations that rise thousands of feet into the sky—El Capitan and Half Dome. Instead of being content marveling at the rocks from the valley below, these beatnik deadbeats began to climb them, ushering in a golden age of rock climbing from 1955-1970.

These climbers were hated by the park rangers for their loud parties at base camp. Unemployed and broke, the climbers scavenged for food, even looking to eat the leftovers from guests’ plates at the nearby restaurants. One climber recalled stumbling upon damaged cans of cat food that a grocery store had thrown out. He quickly collected them, providing food for a few weeks for his group.

At the heart of this period was the rivalry between Warren Harding and Royal Robbins, two men who served as stark contrasts to one another in appearance, style, and philosophy.

Robbins was an intense man, hyper-competitive and serious about the craft of climbing. He wore his hair short, was bespectacled, and could be found around camp reading classic literature. Even his name—Royal—gave him a regal air.

Harding, on the other hand, had been given a presidential name, but was hardly civilized. He was in his 30s and still living with his mother. His unshaven face matched his unkempt brown locks. He had a crude sense of humor and a deep penchant for women and booze.

Robbins and his climbing mates set forth principles of rock climbing that included rules minimizing the use of bolts, onto which climbers could hook, assisting them in their climb. He looked to elevate the sport of rock climbing to a noble pursuit, ensuring that climbs were done with integrity and a respect for the granite.

Harding was agitated by this. He mocked Robbins’ rules and snobbish ways, referring to Robbins and his clan as the “Valley Christians.” He established the Lower Sierra Eating, Drinking, and Farcing Society, which was dedicated to gluttony and sloth.

Robbins was the first to tackle the northwest face of Half Dome. It was a multi-day climb, and many wondered if he would even survive. He and a team of three climbers followed a trail of cracks up the 2,000-foot wall, tethering themselves to its side at night. After five days of intense climbing, Robbins made his way to the summit, becoming a legend and receiving recognition as the best climber in the Yosemite Valley.

This didn’t sit well with Warren Harding, who decided he’d head up the biggest wall in the valley—the 3,000-foot nose of El Capitan. In an effort that took his team two years to complete, Harding used a series of fixed ropes and bolts to make it to the top. The ropes allowed the climbers to come down at night for longer stretches of rest, and they were used to haul up gear, including alcohol and a Thanksgiving turkey baked by Harding’s mother.

Everything about this climb irked the high-minded Robbins. The length of time, the ropes, the bolts, the obvious disregard for Robbins’ rules. He saw it as a slap in the face and responded in kind with a climb of his own up the nose of El Cap. Robbins scaled the wall without coming down, and without the use of the rope system. Ropes were used, but not in the same manner that Harding had utilized them. The climb reaffirmed his status as king of the climbers.

The next few years saw Robbins establish several new routes along the faces of El Cap and Half Dome. But his rivalry with Harding would include one more legendary battle.

In 1970, Harding targeted the “Wall of the Early Morning Light,” or “The Dawn Wall,” of El Capitan. Robbins had declared this wall off limits, as its blank surface would require the placement of too many bolts. Harding didn’t care.

He decided not to use fixed ropes this time, but hammered plenty of bolts into the face of El Cap. Harding and another climber didn’t come down for rest, even when a storm threatened their pursuit of the top. They clung to the wall, suspended there with jugs of wine, waiting out the storm. A crowd began to gather, assuming the two were stuck. Park rangers were initiating a rescue mission when an empty can came tumbling down. A note was attached, written by Harding:

“A rescue is unwanted, unwarranted, and will not be accepted!”

The storm eventually passed and the men continued their ascent, reaching the top in 28 arduous days, placing 300 bolts along the way.

Harding became a national sensation, touring the country, appearing on television talk shows to discuss his feat.

Robbins, in defense of his philosophy, sensibilities, and ego, vowed to wipe away the blotches left by Harding along the Dawn Wall. With a chisel in tow, Robbins ascended the rock, chopping off the bolts as he encountered them. But something odd happened as he climbed. Rather than encountering a series of bolts mindlessly placed, Robbins found an incredibly difficult route, an inspired route, one that he could respect. Eventually he stopped removing the bolts and simply followed the path to the top.

The climb would be the last in the rivalry between golden age rock climbers Royal Robbins and Warren Harding. Shortly thereafter, both would move on to other pursuits. Well, actually only Robbins did. He founded and still runs a successful outdoor clothing company that bears his name, while Harding spent his days on the front porch with his mom, drinking his red wine. He died in 2002.

Rivalries often define styles of play, but they also define generations and divide friends and family. Which side you stand on says something deeper about you. Not just that you love the Celtics or that you like trash-talking, socially conscious boxers. These divides are about the fundamentals of who you are as a person. They’re fights over your soul.

But this dichotomous way of thinking is flawed. On the surface, rivals often represent something very different, but at their core, similarities emerge. Look at Larry Bird and Magic Johnson. One was white, the other black. One was reserved and the other possessed a megawatt smile. One represented blue-collar Boston and the other the flash and excess of 1980s Los Angeles. They may have played basketball differently and represented different cultures, but they both loved the sport and competition. Their fuel came from the same source. And so it was with Robbins and Harding. They approached the sport of rock climbing differently (and possibly life), but both loved it equally.

And so it is with us humans. What is often juxtaposed as a battle of extremes is often, in many ways, a battle of similarities.

“We’re insane. Can’t be any other reason.”

–Warren Harding on the motivation of rock climbers

(I do not own the photographs in use. Simply contact me if they need to be removed and I will graciously do so.)

A blast from the past: A look at the hunter-gatherer lifestyle

No matter our age, we’ve certainly heard from people older than ourselves how much better everything was in the past. I will tell my children how great the 90s were while my parents roll their eyes. Older readers probably think nothing compared to the hippie days or disco nights of the 60s or 70s. Or maybe nothing compared to young slender Elvis in the 50s.

But go too far back and we all hit the brakes. Certainly our lives are better now than they were 250 years ago. The comforts of the Middle Ages can’t beat iPads and air conditioning. We often view history, and especially the human race, as progressive: we are getting better year after year, and our advancements are thought of as improvements. Tractors drive themselves and soon cars will, too. The new iPhone 7 just came out, and it’s most definitely better than the iPhone 6. My house is 72 degrees all the time—no matter how hot or cold it is outside! But with all of the problems of the modern world we so often discuss, why are we so dismissive of the ways of the past?

A couple of books that I’ve read recently have touched on the subject of the lifestyle of the hunter-gatherer, which takes us back thousands of years, before agriculture. In Tribe, war journalist Sebastian Junger centers his focus on American soldiers’ experience with PTSD, and how social structures from hunter-gatherer tribes may help.

Junger’s thesis focuses on PTSD occurring as a result of soldiers’ disconnection from the society they serve. While overseas, soldiers live, work, and fight in small bands. They know each other, eat together, sleep in close proximity, and are accountable for one another. These living conditions are vastly different from those of modern American society, which has become increasingly centered on the self. Neighbors often do not know each other, as we all focus on our work, our kids, our lives. High school students are encouraged to be thinking about their college choice, their career choice, their this and that. This focus on self creates a difficult transition for people coming home from war. They lose the feeling of tribalism that is prevalent in the war zone, the feeling that was at the core of hunter-gatherer societies.

These groups were typically small and close-knit. In Sapiens (which aims to trace the entire arc of human history, a rather daunting task), Yuval Noah Harari writes that a person can only trust and know up to 150 people. Beyond that (and maybe below it) we need structures, such as government or a managerial hierarchy. Tribes often shared in the raising of children (“it takes a village…”). They hunted together and shared the food (and their women). They provided for the sick and elderly (sometimes—other times, the elderly were left tied to a tree to await their death). There was no need for government, as the tribe members held each other accountable. A slacking member or mooch would be punished. Someone who would not share was ostracized. Today, people complain about the poor getting handouts and the rich hoarding money. But Junger points out that these acts were rare in a small tribe. Today in America, people break tax laws or commit Medicaid fraud because the victim—a group of over 300 million people—seems so impersonal.

Both Harari, from an evolutionary viewpoint, and Junger, from a social one, believe that we all yearn to some extent for the connectedness found in these bands. An interesting anecdote that Junger uses to illustrate this natural attraction comes from one of our own founding fathers. Benjamin Franklin noted that European settlers were leaving “civilized” life to live with Indians, but rarely did Indians join the settlers’ society.

We often perceive the hunter-gatherer lifestyle as burdened with a heavy workload because its members could not depend upon machines or computers. But that was often not the case. In both books, the writers discuss the amount of leisure time afforded members of these tribes. While we work 40-plus hours per week to provide shelter and the “essentials” for a small family, hunter-gatherers often worked between 15 and 30 hours per week and spent the rest of the time playing and laughing. Even today, anthropologists studying the few hunter-gatherer tribes left on the planet remark on how much these people laugh. Their lives revolved (and still do) around relationships. Not success. Not possessions. But people.

Harari writes that contentedness is not the only benefit of such a lifestyle. While we are more intelligent as a collective society today, Harari claims that the breadth of knowledge, attention to detail, and fine motor skills possessed by individuals in hunter-gatherer bands were higher. Today, we are all so specialized, the entire labor force seemingly built like a Henry Ford factory.

While these bands suffered high rates of infant mortality and shorter life expectancies, they lived healthier lives. Studies of such cultures today show lower rates of obesity, heart disease, and cancer.

Not to gloss over the problems faced by people who have lived or still live this way, I must point out that the odds of being attacked by a lion and having your bones picked clean by a pack of hyenas (and that’s before the vultures find your body) were, and are, higher in that form of society. They may not have dealt with chronic health issues, but without the type of medical care we enjoy, minor infections or illnesses could send them to their graves. They often killed their young, either because they were a burden or as part of a religious ceremony. In addition, Harari points out that someone who was ostracized probably found it difficult to ever rejoin the band or join another group. Life was far from perfect.

So these uncivilized, backward humans do have something to offer us. Despite my instruction, my kids will probably still remember the current time as the best time to have been alive. They’ll tell their kids about Playstations and Fitbits, mp3s and iPads.

And I’ll tell my grandkids about the great days of the 1990s—the 1990s B.C.

  1. Junger, S. (n.d.). Tribe: On homecoming and belonging.
  2. Harari, Y. N. (n.d.). Sapiens: A brief history of humankind.
  3. Facts and Theories. Retrieved September 09, 2016, from

Opposing trenches: The political battle between Woodrow Wilson and Henry Cabot Lodge

Woodrow Wilson and Henry Cabot Lodge shared similarities. Both died from strokes in 1924. Both earned doctorates in political science from prestigious universities. Both became titans of early 20th century politics. Both were accomplished orators, with Henry Adams describing Lodge as “an excellent talker…an accomplished orator, with a clear mind and a powerful memory” and the historian H.W. Brands once describing Wilson as possibly the most eloquent speaker of all of our American presidents.

But despite their shared traits and achievements, these men are known to history for their differences. Lodge was born into a wealthy family with deep roots in Massachusetts. Wilson, on the other hand, was a son of the South, born in Virginia, but bounced around Georgia, South Carolina, and North Carolina as a child. Lodge was a Republican and Wilson a Democrat. Most famously, Wilson favored international cooperation and membership in the League of Nations. Lodge fiercely fought it.

When Wilson was elected president in 1912, he did so at the expense of Lodge’s dear friend, Theodore Roosevelt, becoming just the second Democrat to claim the office since 1856. Roosevelt, who had vowed not to run for re-election in 1908, became so frustrated with his handpicked successor, William Howard Taft, that he decided to oppose him in the Republican primary in 1912. After losing, he decided to run a third party campaign as nominee of the Bull Moose Party. This move served to split the Republican Party and hand Wilson the presidency.

At this time in Europe, tensions had reached a boiling point in the Balkans. The members of the Balkan League—Serbia, Bulgaria, Montenegro, and Greece—gained their independence from the Ottoman Empire through a series of wars and treaties, leading to a destabilized Europe. Combined with the fact that several new alliances had been formed in the latter part of the 19th century, this destabilization had a ripple effect when Serbian assassin Gavrilo Princip killed Austria-Hungary’s Archduke Franz Ferdinand. Countries, in the name of their alliances, began declaring war on one another, leading to World War I.

The United States, much as it would in the Second World War, remained out of the fray for as long as possible. In a foreshadowing of what was to come, Winston Churchill, serving Britain as First Lord of the Admiralty, desperately wanted the Americans involved in the war on the side of the Allied Powers (as opposed to the Central Powers). There is even some speculation that he purposefully offered up the Lusitania to a German U-boat in an effort to get his wish. Henry Cabot Lodge, at this point a U.S. Senator from Massachusetts, feared that a weak response to the conflict would undermine American sovereignty and patriotism. He lobbied for war, criticizing President Wilson’s peaceful and neutral approach. Wilson would not be persuaded until 1917, when the Germans reneged on their pledge to restrict submarine warfare. The Germans had declared any vessel sailing the waters a target for their U-boats, leading to the Lusitania sinking. After the global outcry, Germany backed off this stance, but reverted shortly thereafter. Wilson knew that the role he desired for the United States—neutral peacemaker—was no longer a possibility. The Americans must enter the war. On April 2, 1917, Wilson asked Congress for a declaration of war on Germany.

American involvement in the war was swift. The fighting ended on Nov. 11, 1918.

But the defining battle of Wilson’s Presidency was just beginning. He believed that the sacrifice paid by American and Allied soldiers would be in vain if the resolution simply ended the war. He envisioned a unified international community, based upon the Fourteen Points speech he gave to Congress in January 1918. He envisioned the League of Nations, a forum that would reduce war and bring order to international conflict. In fact, Germany’s decision to end the fighting was based upon its interest in Wilson’s plan.

Lodge wholeheartedly disagreed with Wilson. Once the Treaty of Versailles was concluded between Germany and the Allied Powers in 1919, it was brought before the United States Senate, where the Republican, as unofficial Senate majority leader (he would become the chamber’s first official majority leader in 1920) and chairman of the Senate Committee on Foreign Relations, led the fight against approval. Specifically, Lodge opposed Article X of the treaty, which entangled signatory countries in the affairs of others and, in part, created the League of Nations. It did not require declarations of war, but it could force the US to enforce embargoes or sever diplomatic relations based on its membership. Lodge felt the article was too restrictive of American interests. He believed the US should be afforded the authority to police the world and intervene in international affairs as it saw fit, but he did not like being tied to the problems and concerns of other nations. From a speech he gave on the treaty:

“The United States is the world’s best hope, but if you fetter her in the interests and quarrels of other nations, if you tangle her in the intrigues of Europe, you will destroy her power for good, and endanger her very existence. Leave her to march freely through the centuries to come, as in the years that have gone. Strong, generous, and confident, she has nobly served mankind. Beware how you trifle with your marvelous inheritance, this great land of ordered liberty. For if we stumble and fall, freedom and civilization everywhere will go down in ruin.”

Lodge campaigned for approval of a modified version of the treaty.

Wilson, leading a fractured Senate Democratic minority, stood opposed to modifications. His view was that the treaty would be approved in its current form or not at all. With battle lines drawn, the Treaty of Versailles never reached the two-thirds vote necessary for ratification in the Senate. The US went on to negotiate a peace treaty with Germany, but did so separately from the Allied Powers.

The League of Nations was formed, but America did not join. The League of Nations never achieved its designed goal, dissolving in 1946. The Treaty of Versailles, given its War Guilt clause and calls for German disarmament and reparations, is often cited as a causal factor in the rise of German Nazism.

Lodge had won, thanks in part to party-line stances and ethnic interests. Being a melting pot of primarily European immigrants, the US had citizens with strong ties to Germany, Ireland, and Italy, all with interests and criticisms of the Treaty of Versailles. Lodge was able to build a coalition of these different groups to achieve his desired result.

Though the battle came to an end in 1919, the philosophical conflict between Lodge’s American supremacy and strong foreign interventionism and Wilson’s idealistic, intergovernmental approach to foreign affairs continues to this day.

  1. Wilson – A Portrait. (n.d.). Retrieved July 22, 2016, from
  2. “Woodrow Wilson.” A&E Networks Television, n.d. Web. 22 July 2016.
  3. “Henry Cabot Lodge.” A&E Networks Television, n.d. Web. 22 July 2016
  4. “The Treaty of Versailles and the League of Nations.” Independence Hall Association, n.d. Web. 22 July 2016.
  5. Showalter, Dennis E. “World War I.” Encyclopedia Britannica Online. Encyclopedia Britannica, n.d. Web. 22 July 2016.
  6. “Treaty of Peace with Germany.” Google Books. N.p., n.d. Web. 22 July 2016.

John Tyler: Far more interesting than you ever knew

Referred to as “His Accidency,” our nation’s 10th president, John Tyler, is largely forgotten. He served nearly four years in office, from 1841 to 1845, but the man who preceded him, William Henry Harrison, is more famous despite (and probably because of) the fact that he served just one month in office.

A Virginian who practiced law, inherited a plantation, and was the son of a governor, Tyler was looked down upon in his time as chief executive of the nation because he was never elected to the office. He was an “accidental” president, hence the nickname. But Tyler’s life does hold some interest.

He helped set presidential succession precedent

In its original form, the United States Constitution was not clear on the subject of presidential succession. When Harrison died of pneumonia after 31 days as President, Tyler’s status was challenged. While the document read that the Vice President would fill the Office of President upon the President’s inability to carry out his office, a dispute arose concerning the Vice President’s title. Was Tyler actually the President or simply the Acting President? Daniel Webster and other cabinet members, along with political rivals, believed that he was simply a placeholder without all the rights and abilities of an elected President.

When approached by Webster with the idea that the cabinet vote on his decisions, Tyler responded:

“I beg your pardon, gentlemen. I am very glad to have in my cabinet such able statesmen as you have proved yourselves to be, and I shall be pleased to avail myself of your counsel and advice, but I can never consent to being dictated to as to what I shall or shall not do. I, as president, will be responsible for my administration. I hope to have your cooperation in carrying out its measures. So long as you see fit to do this I shall be glad to have you with me. When you think otherwise, your resignations will be accepted.”

Tyler stood his ground and set a precedent that would later be codified by constitutional amendment, clearly outlining the rules of presidential succession.

His party kicked him out… while he was president

Tyler, who was philosophically a Jeffersonian Republican, had instead become a member of the Whig Party. The Democrats, under Andrew Jackson and Martin Van Buren, were the closest adherents to the ideas of Jefferson, but because of his dislike and distrust of those two men, Tyler joined with the Whigs. The Whig Party avoided a platform, instead basing its affiliation on opposing whatever the Democrats did.

The party had been suspicious of Tyler for some time and these feelings of unease reached a fever pitch when Tyler vetoed a bill seeking to re-establish the Bank of the United States. Henry Clay, who had plotted to be the power behind William Henry Harrison’s throne, attempted to have Tyler impeached the next year when the President vetoed a tariff bill. The attempt was unsuccessful.

Today, so many party leaders fall in line behind a president from their own party. I’m not sure even Trump or Cruz would face the same kind of attacks from within the party as Tyler did, if elected.

He became the first president to marry in office

In 1839, Tyler’s first wife, Letitia, suffered a stroke that left her partially paralyzed. When he became president two years later, she was unable to perform the hostess duties of the First Lady. Tyler’s daughter-in-law, Priscilla, was chosen to assume the role.

In 1842, Letitia had a second stroke, this one fatal. Suddenly, Tyler was a bachelor, and the first president to have his wife die while in office. So what does a wealthy man who holds the highest office in the land do? Marry a woman 30 years his junior, of course. The new first lady of the United States—Julia Gardiner. Imagine if “The Bachelor” television show had been around back in 1842.

The marriage to Julia produced seven children. Combined with the eight Tyler had with his first wife, our 10th president fathered more children than any other chief executive.

He is the only president to commit an act of treason

Tyler retired to his 1,200-acre plantation in 1845. He called the estate “Sherwood Forest” because he viewed himself as a political outlaw, like Robin Hood. Tyler took his outlaw status a step further in his later years.

Almost two decades after Tyler left the nation’s capital, issues between the North and South reached a boiling point. Tyler served as chairman of a peace conference between the two sides in Washington, D.C., in 1861. The last-ditch attempt at a resolution was unfruitful. The Civil War soon broke out and Tyler voted in favor of Virginia seceding from the Union. He was elected to the Confederate House of Representatives but died in early 1862 before taking his seat.

Seen as a traitor, Tyler received no recognition of his life and death from President Lincoln and the government of the United States. The death of former President and New Yorker Martin Van Buren that same year was met with pomp and circumstance.

He has living grandchildren in 2016

As mentioned earlier, Tyler had 15 children, the last of which was born in 1860 when he was 70 years old. Lyon Gardiner Tyler, a son born to Tyler when he was 63 years old, took after his (literal) old man. He had his last child when he was 75 years old.

Lyon Gardiner Tyler, Jr., born in 1924, and Harrison Tyler, born in 1928, sons of the eldest Lyon Tyler, are still alive in 2016. In fact, Harrison is the caretaker of the family farm in Virginia. They are the grandsons of John Tyler, who was born in 1790. Let that soak in a little bit.

So there you have it, the life and times of the incomparable, and completely disregarded, 10th President of The United States, John Tyler.

  1. Editors. “John Tyler.”
  2. Staff. “John Tyler.” 2009.
  3. McNamara, Robert. “John Tyler: What You Should Know About the 10th President.” Education. November 22, 2014.
  4. Amira, Dan. “President John Tyler’s Grandson, Harrison Tyler, on Still Being Alive.” January 27, 2012.

Dance like nobody’s watching…because they’re not

On Good Friday this year, a friend of mine died after a two-year battle with ovarian cancer. She was 61 years old and had just recently begun her third round of chemotherapy.

I enjoyed our conversations about nature, healthy living (though I didn’t always follow it), the work of Thoreau, and parenting. We shared similar outlooks on the trappings of our world, and the meaning of it all.

On a fairly regular basis, she would offer quotes from her favorite poets and thinkers, whether I asked for them or not. Some of the time, the quotes were her originals. For example, “Life is simple, but it’s never easy” is something she told me all the time. She meant that being a good steward of the gifts one has been given—our loved ones, our bodies, our world—was a very basic tenet of a good life, but so terribly difficult to do on a continual basis.

Her “20/40/60” rule is also something that she shared often. As she was closing in on 60 at the time, she’d tell me that “When you’re 20, you worry about what everybody thinks about you. At 40, you stop worrying. And at 60, you realize they were never paying any attention to you in the first place.”

It’s a thought that brings a smile to my face. I’d love to think that people don’t pay any attention to me.

But is that really true?

In a 2000 research article published in the Journal of Personality and Social Psychology, a team of psychologists led by Drs. Tom Gilovich and Kenneth Savitsky (and including Cornell doctoral student Justin Kruger, whose lab I would later work in at the University of Illinois) put this idea to the test. How much do people really pay attention to us?

They examined this idea through a group of controlled experiments. The first involved a college student walking into a crowded lecture hall with an embarrassing T-shirt. The researchers determined that, among college students, fandom for singer Barry Manilow was perceived to be quite embarrassing. So, in the study, the T-shirt was emblazoned with a large picture of the crooner’s mug. The experiment was simple. The student was asked what percentage of the others would notice his or her embarrassing shirt. The guess? Approximately 50 percent. The researchers then interviewed the students in the lecture hall to identify what percentage actually observed the Manilow T-shirt. The answer: 25 percent.

The second study was set up in a very similar manner to the first one: A student wearing a T-shirt walking into a lecture hall. This time, however, the face on the shirt was of someone the wearers thought might earn them positive attention. Martin Luther King, Jr., Jerry Seinfeld, or Bob Marley was chosen. Again, wearers felt that roughly 50 percent of the other students in the lecture hall would identify the shirt. In reality, less than 10 percent did.

The third and final study focused on our behavior, rather than appearance. Participants worked on group projects and then were asked to rate how memorable their contributions—both positive and negative—had been. The result? Raters consistently overestimated how much their groupmates remembered either type of contribution.

This phenomenon is referred to as the spotlight effect. Put simply, our own egocentrism clouds our ability to correctly judge how much people pay attention to us. We fret over our faux pas and bask in the glory of a job well done, and we expect others to do the same for us. But they don’t.

So we have empirical proof that my friend was right. But is this knowledge common for people her age, or was she some type of sage?

As the T-shirt studies involved college students, I do wonder if an older subject pool would yield different results. If you sent a 70-year-old man into McDonald’s with a Kanye West T-shirt, would he overestimate the percentage of his coffee-drinking buddies that would notice his goofy shirt? I have reached out to Dr. Gilovich for his thoughts on this question and will report back an answer if I receive one.

For now, though, I will relish the fact that no one’s paying any attention to me. The next time I’m at a wedding, I’m definitely going to hit the dance floor with no reservations. Egocentrism won’t be holding me back. And besides, in addition to human nature, there are typically other factors that keep people from remembering anything from a wedding reception.

Even more reason to let it all hang out.

  1. Gilovich, T., Medvec, V. H., & Savitsky, K. (2000). The spotlight effect in social judgment: An egocentric bias in estimates of the salience of one’s own actions and appearance. Journal of Personality and Social Psychology, 78(2), 211–222.
  2. Denton-Mendoza, R. (2012, June 05). The spotlight effect.

Hair and the highest office in the land

Dr. Ben Carson was never going to be President. The American people wouldn’t allow it. No way. No how. Seventeen Republican candidates began campaigns for the highest office in the land. He was the only one with a distinguishable physical difference. Just look at him. You think we would elect someone who looks like him?

Wait, you thought I was talking about his skin color? Of course not—we have an African-American President, silly.

I’m talking about another group who has faced a reemergence in American society but still faces a glass ceiling (and not women, either). I’m talking about the bearded.

Ben Carson was the only 2016 Presidential contender who sported facial hair, albeit a razor-thin mustache and goatee.

And given the history of men elected to the Presidency, I’d say it had a great deal to do with his loss. We just don’t send people with facial hair to the White House. We rarely send men with facial hair to Washington, D.C. in any capacity and we’ve never elected such a woman.

The first Presidents—Washington, Adams, Jefferson, Madison, and Monroe—were as smooth-faced as my two-year-old. John Quincy Adams, the sixth President, maybe as a sign of a new generation of office holders, did let his sideburns grow a little long. Eight years after he left office, Martin Van Buren pushed the envelope a little farther with his unruly mutton chops.

But a real beard was something that was not seen. I imagine a few of the Presidents after Van Buren, the career Army general and frontier types (James Polk, Zachary Taylor, and William Henry Harrison come to mind) probably grew a mean beard at some point in their lives, but not even stubble was part of any Presidential portrait.

It’d take a man of real fortitude to sport a full beard in the White House. You know, the kind that would also oversee a civil war and attempt to eradicate slavery. Many of you know Abraham Lincoln for his Emancipation Proclamation, his address at Gettysburg, his assassination at Ford’s Theatre, his height, his honesty, or his stovepipe hat. But he was also the first President to sport a beard, in his case a chin curtain, while in office. And as leaders so often do, he created followers.

A strange series of events occurred following Lincoln’s assassination, events that can only be connected when viewed through the prism of facial hair.

Andrew Johnson, who ascended to the big seat upon Lincoln’s murder, did not follow the slain leader’s facial hair example. He was clean-shaven, but he was also impeached (though not convicted), so maybe he should have grown a beard.

The next seven presidents, if you include Grover Cleveland twice, all sported, at the very least, a mustache. Four—Grant, Hayes, Garfield, and Harrison—had full beards. Chester Arthur sported some aggressive sidewhiskers connected by a mustache, which could also be classified as an overgrown friendly muttonchops look. Cleveland simply had a mustache.

William McKinley moved into the White House in 1897 as the first clean-shaven President in 28 years. He was also shot and killed in Buffalo, N.Y., four years later.

So of the two smooth-faced chief executives to take the reins in the 36 years following Lincoln’s death, one was impeached and the other was killed.

It gets weirder.

With McKinley’s death, New York Republican Theodore Roosevelt took over the big job, wearing a thick mustache. His hand-chosen successor, William Howard Taft, famous for his girth, was also the proud owner of a nice handlebar.

Woodrow Wilson followed Taft. A Princeton academic and former Governor of New Jersey, the Democratic Wilson was without facial hair. Near the middle of his second term, he suffered a debilitating stroke that was hidden from the public and that many believe left his wife, Edith, in charge of the country.

Next up was Warren Harding, a clean-shaven Ohioan. He died of a cerebral hemorrhage about two-and-a-half years into his first term, in 1923.

When Calvin Coolidge, Harding’s successor, finished his second term in 1929, he became the first clean-shaven President since James Buchanan in 1861 to finish his tenure in office relatively unscathed. No assassination, impeachment, debilitating stroke, or deadly hemorrhage.

How does one make sense of something so random, and yet seemingly so calculated? Well, blame it on the cosmos. The cosmos were probably angry about the bearded Lincoln’s death and spent the next 58 years seeking retribution on the beardless. Now you may point out that it was the mustachioed John Wilkes Booth that killed Lincoln, so why would the cosmos be angry with the beardless? My response: never let details get in the way of an interesting theory.

Given the subject of last week’s column, you may remember that it wasn’t as if the bearded were without trouble. President James Garfield, who wore a full beard, was shot while in office. But it’s interesting to note that the man who was in charge of his care was Dr. Willard Bliss. While Dr. Bliss sported a pair of bushy mutton chops, he did not have a full beard. The unsanitary medical practices that Bliss employed in his search for the bullet have been credited as the true cause of Garfield’s death, not the injuries from the bullet itself. If he had simply left the fully bearded Garfield to his own devices, the President probably would have had a greater chance of living. Bliss could also have granted the very hairy Alexander Graham Bell greater access in his attempts to find the bullet with Bell’s newly developed metal detector. Either choice, both of which involved someone with more facial hair than Bliss, would have been better than what Bliss did.

Now back to the clean-shaven Presidents. One could argue that Woodrow Wilson effectively ended what I term the Bearded Age of the American Presidency. He defeated major contenders Theodore Roosevelt and William Howard Taft in 1912 and the Van Dyke-sporting Charles Evans Hughes in 1916. But I’d say the Bearded Age didn’t end until the cosmos were satisfied. And so, Calvin Coolidge, famous for his silence and conservatism, was the President that brought about the end of the Bearded Age.

Since that time, a major party has only nominated one Presidential candidate with any facial hair whatsoever. That man was Thomas Dewey, the wearer of a well-groomed mustache, and he was nominated in both 1944 and 1948. He first lost to Franklin Delano Roosevelt and was famously defeated by Harry Truman in the latter election.

So to recap: America voted only clean-shaven men into the Presidential office from 1789 to 1856. In 1860 Lincoln sported the first beard in office and set a trend that would last for approximately five decades. The only people to stray from that pattern were killed or impeached. William Howard Taft was the last American President to have facial hair and the first two after him suffered deadly or significantly impairing medical issues while in office. From 1912 to today, we have not had a President with facial hair.

I think that should change. It’s too late for 2016, but how about 2020? America is becoming more progressive and barriers are falling all over the place. Let’s give hope to all those young children who dream of becoming President of the United States of America, but whose achievement of that goal is so often derailed by their yearning to grow a beard.

This is America, where people of all races, genders, creeds, and yes, facial hair designs, should be free to chase their dreams.

You hear that sound? I think that’s the glass ceiling shattering.