The “Masculine Obsolete”–How the Traditional Man Became Redundant to the Modern Woman

The Masculine Obsolete

In 1963, Betty Friedan, in her groundbreaking book The Feminine Mystique, put her finger on a problem that had been the source of much bewilderment for college-educated, suburban, white American women of the 1950s and early ’60s: the inadequacies of a life limited to being wife, mother, and keeper of the household. The “problem,” which had theretofore been sensed but never fully identified, was like a festering, undiagnosed illness, or, as Friedan would refer to it, “the problem that has no name.” So Friedan, who had herself forfeited a promising career in psychology for the “higher calling of motherhood,” decided to give the problem a name: she called it “The Feminine Mystique.” Then she immediately went about the business of finding a solution to the problem. And she found it: Friedan encouraged women to seek self-fulfillment through careers outside the home. And in so doing, she ignited what would become the second wave of the Feminist Movement in America. (Of course, for women such as black women in the Western World, for whom working outside the home had long been a necessity or a presumption, the significance of the career component to the healing of the “Mystique” and to the defining of the Movement seemed hyperbolized, even if those same women embraced the overall aim of the Movement and sympathized with the overall symptoms of the “Mystique.”) What Friedan perhaps did not realize, however, was that once she had christened the phenomenon “The Feminine Mystique,” she had also inseminated its male counterpart, “The Masculine Obsolete,” which, like its female predecessor, would go undetected, undiagnosed, and unnamed for years—in its case, for half a century. Consequently, in the 21st century, men all over the Western World face their own dirty, little, no-name problem: the inadequacies of a life limited to being husband, progenitor, and breadwinner.

Males, having long enjoyed the self-proclaimed status of the “superior sex”—a distinction that until the Feminist Movement had been conceded by many a traditional female—are quietly, but increasingly, feeling inferior to modern women, who are not only equaling and surpassing men in the once-male-dominated hallowed halls of academia and in the mahogany-paneled boardrooms of the corporate world, but are also typically more skilled on the domestic front than their male counterparts—even if the typical 21st-century woman is inept, vis-à-vis women of the previous century, at the traditional domestic arts of sewing and cooking, for example. (The fact is that very few modern women can thread a sewing machine, let alone make or mend a garment; and perhaps even fewer could bake a cake from scratch or make homemade bread if their lives depended on it. But compared to modern men, women remain head and shoulders above in matters domestic.) For many men impacted by the Feminist Movement, the lyrics of “Anything You Can Do,” Irving Berlin’s spirited 1946 duet written for the Broadway musical Annie Get Your Gun, in which a male singer and a female singer each claim to be able to outdo the other in a series of increasingly complex tasks, ring true—for women—especially the song’s most famous line, “Anything you can do I can do better.” In effect, then, the post-Friedan woman is both “woman” and “man,” while the present-day man is merely “man,” much to the chagrin of men and the frustration of modern women (who claim they want men to be their equals, not their superiors or their inferiors). That delicate balance of the sexes, which for thousands of years tipped in favor of men, has now shifted in favor of modern women. The sixty-four-thousand-dollar question, then, is: What’s in it for women? Or, more pointedly, why are 21st-century women cohabiting with and marrying men if women can “go it alone”? 
What’s a boy to do—if he wants to keep a modern woman in his life? And with the increasing legalization and social acceptance of gay marriage, how can two traditionally raised gay men keep house together? Are such households destined to suffer from a double dosage of “The Masculine Obsolete”? Will such households be examples of the blind leading the blind? Or will same-sex couples—gay and lesbian—serve as an example of modern marriage, both different-sex and same-sex, where roles and responsibilities are determined not by gender and tradition, but by capacity and interest?

Traditional marriage and male-female cohabitation presuppose a symbiotic, interdependent relationship between the male breadwinner and the female homemaker. Each needs the other, and each wants the other. But while modern women, as a result of the Feminist Movement, have become proficient at being self-sufficient, many men, who, unwittingly, saw no need for a “Masculinist Movement,” have remained dependent upon women for many of their most basic needs—from cooking the food that nourishes them, to cleaning the bathrooms where they are supposed to cleanse themselves, to washing and ironing the clothes they wear to work, to making the beds in which they sleep and have sex. And, ironically, modern women are partly to blame for the domestic ineptitude of men, for women continue to play a pivotal role in the raising of antiquated sons: In general, men are not raised by their mothers and grandmothers to become the well-rounded men that modern women now desire and require. Instead, modern women continue to allow their sons to wallow in “The Masculine Obsolete,” while those same women raise their daughters to triumph over “The Feminine Mystique,” becoming independent women capable of thriving in professional and domestic arenas (even if pre- and anti-Feminism stalwarts insist that today’s women are a far cry from the ladies of yesteryear). When the chanteuse-protagonist of the 1970s television commercial for Enjoli perfume would proudly belt out during prime time, “I can bring home the bacon; fry it up in a pan… ’Cause I’m a woman…,” men should have taken notice that something in society was simmering—especially since, at the tail end of the ad, a man’s voice, presumably that of the heroine’s husband, could be heard in the background uttering, “Tonight I’m gonna cook for the kids.” But men did not take heed. Instead, they continued being “just men” rather than striving, like their feminist counterparts, to become self-contained, woman-man entities. 
So, arguably, men are even more responsible for the perpetuation of “The Masculine Obsolete”: They had ample warning and could have stopped it in its tracks back in the 1970s. Fathers knew then and know now, first-hand, the justified resentment expressed by professional women who return home from work each day only to commence another full-time job of caring for their children and their professional husbands.

Despite the tendency of “mystiques” to be mystifying, a cogent argument can be made that “The Masculine Obsolete” is a male problem and therefore should be solved by men—the way women like Friedan had to take charge in creating solutions to “The Feminine Mystique.” After all, had women left the unraveling of “the problem that has no name” to men, suburban American women would still be standing in front of their state-of-the-art kitchen appliances—albeit in Manolo Blahnik and Jimmy Choo shoes—while self-medicating with Valium-laced cocktails.

But why did women, after having solved “The Feminine Mystique” on their own, and knowing that men were not in possession of the skill sets to solve their own “no-name” problem, not intervene and at least raise their sons—even if not their husbands—to be modern men? Why did women leave men to have to reinvent that wheel? Why did mothers allow fathers to throw their sons “under the bus” (or, at best, abandon them “in front of” the oncoming bus) of social change? Was it payback for thousands of years of male dominance? Were Feminist-mothers longing for some of the vestiges of a bygone era where men were separately and distinctly “men” and women were separately and distinctly “women,” such that those mothers would persist in raising their sons to be traditionalists in the face of social upheaval? How much of male-female behavior is attributable to biology and how much to socialization? How could parents have raised their daughters to be feminist ladies and sons to be modern gentlemen without redefining—to the point of distortion—the terms “lady” and “gentleman”? Are the terms “feminist lady” and “modern gentleman” oxymora?

Concomitant with women’s triumph over “The Feminine Mystique” is men’s succumbing to “The Masculine Obsolete.” Over the decades, there have been various attempts at bringing balance to the two social phenomena, but only with marginal success. After two generations of two-income families and latch-key children, many people would agree that the most effective way to raise children and run a household is for at least one person to be a stay-at-home spouse—even if the overall disposable income of the family is compromised as a result. Children, until they leave the home, need supervision. And the quality of life of a professional is enhanced if he or she is able to return to a home where he or she can relax and unwind instead of having to deal with the additional stresses of maintaining a household. And since hiring housekeepers and live-in childcare is beyond the economic reach of most families, one of the spouses must typically fill those roles. So in an attempt to accommodate the modern woman’s desire for a career outside the home, there have been various experiments with role reversal in which men—especially in cases where their wives have greater income capacity—become stay-at-home husbands, or, as society refers to them when the couple is raising children, “Mr. Moms.” But while men, theoretically, are as capable as women at raising children and keeping house, the “Mr. Mom Model” overlooks certain social inconsistencies that are yet to be fully reconciled: Men, because of socialization, still believe that they should be the primary breadwinner in the household; and women—even bra-burning, Friedan-quoting modern ones—because of socialization, resent having to be the primary breadwinner, even if they take personal pride and feel a sense of accomplishment in the fact that they are. Consequently, many “Mr. Mom” men feel undervalued by society, themselves, and their wives (and later by their adolescent children); and many career women harbor resentment and sentiments of disrespect for their stay-at-home husbands—the way professional men have traditionally undervalued the contributions of the traditional housewife. And while some men believe that their wives should have careers and contribute towards the finances of the household, others firmly believe that the highest expression of manliness is the man who can support his family without the financial assistance of his wife—so much so that some such men feel justified, if not authorized, to avail themselves of certain “extramarital privileges” as a reward for being exceptional providers for their families. Meanwhile, many of even the staunchest professional feminists—lawyers, doctors, entrepreneurs—would have no qualms about giving up their hard-earned careers to be the wealthy wives of wealthy men, passing their days as “ladies who lunch,” volunteering with charitable organizations, serving on museum boards, being “the hostess with the mostest,” and living lives of leisure, travel, and shopping. Except for the women who have truly found their life’s calling and are therefore viscerally compelled to pursue their careers, many educated, modern women would gladly abandon their professions if they were guaranteed that their husbands could provide them with all their needs and wants. After all, for many women, the primary purpose of a career is to provide security for themselves before marriage, in the event they never marry, or upon divorce. Those same women, however, if they were the sole source of the family’s financial wherewithal, would harbor resentment and disrespect towards their “kept husbands.” And when both the man and woman of the house work, both tend to assume that the man should generate the higher income. 
So the stay-at-home-husband model seems to work best when the husband has a home-based business that generates at least as much income as that of his professional wife. In essence, then, modern women do not regard it as fundamentally wrong to be provided for by a man. But those same modern women regard it as fundamentally wrong to provide for a man. They regard the “kept man” as a corruption of nature, as an encroachment upon their femininity. Then to add fuel to the fire, some women—even professional ones—are “domestic territorialists,” oftentimes envying or resenting a man who is better at child care, cooking, and cleaning, for example, than they are—the way many men oftentimes resent and envy women who surpass them in the workplace. “Domestic territorialists” are infamous for bursting through the front doors of their homes after a long day at work in the corporate world and immediately beginning—like lionesses scent-marking their territory—“rearranging or tweaking or tidying up” the domestic accomplishments of their stay-at-home husbands.

So, in essence, men, women, and society have a lot to reconcile in the area of the parity of the sexes. As one social pundit puts it, “There won’t be true equality of the sexes until middle-aged, overweight women can walk up the beach, topless, and think they are God’s gift to men.”

Sexism, chauvinism, and a host of other “isms” convinced men and women that they were more dissimilar than alike. And it was not until the Feminist Movement that the exact opposite was proven true—that men and women are exactly alike, except in a few areas that are irrelevant under most circumstances. But despite the overall equality of the sexes, the fact remains that men are men for a reason, and women are women for a reason. Nature, in its infinite wisdom, made it so. So when Feminism, with its broad, sweeping broom, was “cleaning house,” discarding all possible distinctions between the sexes, some people—even some feminists—longed for some of the old distinctions to be preserved. For example, even the staunchest feminist regards it as infinitely charming when a gentleman rises as she enters a room or approaches his table. And even a socially unschooled man would refuse to allow a woman to hold open a door so that he may enter or exit a room before she does. (See chapter, “Out and About—Manners in Public Places.”) But overall, the Feminist Movement, with its demands for equal treatment of women, has also served to relieve men of many of their social obligations to women, the result being generations of men and women, neither of whom know how to extend or receive the time-honored social graces. But at the end of the day, despite the Movement, the onus of most manners still falls on men: It is men who must tip and remove their hats; men who must pull out chairs and open doors; men who must walk curbside…. Similarly, in a world populated with modern women, the onus is on men to become modern men.

Demystifying “The Masculine Obsolete”

Regardless of the shouldah-couldahs and the blame-game, the fact remains that as of the first decades of the 21st century, “The Masculine Obsolete” remains unsolved, wreaking domestic havoc on men, with collateral effects on women. So today, any book on men’s comportment must demystify the “Obsolete,” for at the end of the day, a gentleman of today must appeal to the lady of today. He must be like a peacock with a full tail; a bull with pointy horns; a rooster with a melodious crow. He must be a man with the domestic skills of a woman. Today, in order for a man to be regarded as “marriage material” by the modern woman, he must not only be educated and gainfully employed or employable, he also must be able to cook and clean and mend and child-care and launder and organize play-dates and sleep-overs and schedule pick-ups and drop-offs. He must be able to set a formal table and pack a picnic basket, arrange flowers in a vase, wrap Christmas presents, hand-wash and drip-dry his fine garments, and braid his preteen daughter’s hair. The modern man must be able to replace a missing button, bake and decorate a birthday cake, and make a Halloween costume from scratch. He must pre-wash his dishes before placing them into the dishwasher—in an orderly manner; he must replace the cap onto the tube of toothpaste after brushing his teeth; and, for the one billionth time, he must remember to raise the toilet seat before urinating (and to lower it after use!).

Over the years, there have been some attempts to identify the well-adjusted modern man, the term “metrosexual” perhaps being the most recognizable. But that label, on its face, neither identifies nor addresses the essence of the solution to “The Masculine Obsolete.” Instead, the term “metrosexual” tends to conjure up images of a 30-something, sophisticated, urban male with a fastidious appreciation for the finer things in life, rather than images of a man willing to stand shoulder to shoulder with his woman in all matters domestic—from the careers that finance the home to the day-to-day chores that keep the home functioning. (Besides, for a lot of men—and women—the term “metrosexual” is really a euphemism for “gay,” not “modern”).

The 21st-century gentleman must be acutely aware that, more and more, women are finding marriage to be an unnecessary—even if desired—institution. Long gone are the days when women had to marry for survival. Not even the bearing of children within the context of matrimony is of paramount importance to many modern women. Yes, some women (even some feminists)—especially young ones wishing to marry for the first time—still hope for “the perfect husband”: the princely, wealthy, handsome gentleman. But the majority of women, for whom such a reality exists only in fairy tales, simply want a husband who can carry his own weight outside and inside the home.

For many 21st-century men, however, the idea of marriage or cohabitation with the opposite sex is far more practical and much less romantic: Many men need a woman in order to maintain the level of civility and domesticity to which they became accustomed while living at home with their mothers. Left to their own devices, many men would revert to an almost feral state—within the context of society: bed sheets would go weeks unchanged; bathrooms would go years uncleaned; dirty dishes would remain piled up in the kitchen sink until a female friend offered to wash them; leftovers would grow mold in the refrigerator; worn underwear would get turned inside-out and worn again; last year’s pizza box—with half-eaten slices in it—would remain under the bed; and locating an ironing board and its accompanying iron would become the equivalent of looking for weapons of mass destruction.

So since parents, for the most part, continue to fail miserably at raising their sons to be domestically inclined—whether, in the case of mothers, it is done intentionally in order to inflict upon their daughters-in-law the pain they themselves had to endure, or, in the case of fathers, on account of some irrational fear that domesticating their sons will render them gay or effeminate—men must take it upon themselves to learn the ways of keeping house. Men must cure men of “The Masculine Obsolete” by making men as capable as women in the household, for the future of traditional marriage—the cornerstone of human society—depends upon domestically skilled men. Continued domestic ineptitude will make men irrelevant to the modern woman, or, at best, relegate their relevance to that of sexual objects for heterosexual and bisexual women. But marriage aside, while the average American male in 1950 was married by the age of 24, twenty-first-century men are increasingly marrying in their early 30s, thereby making it all the more practical for men to be able to take care of themselves.

Given the dismal failure of parents to educate their sons in the art of homemaking, perhaps the most efficient way to eliminate domestic impotence in men would be to mandate home economics courses for schoolboys and to offer and encourage such courses at the university level. “Home Ec. 101—for Jocks” could very well become a popular elective on college campuses across America. (Traditionally, as a supplement to basic “home training” by parents, young women, until the early 1970s, were expected to attend “finishing school” or “charm school” as well as take home economics classes as part of the junior high and high school curricula. And home economics was a popular college major amongst young women. At the junior high and high school levels, young men were trained in “wood shop” and/or “machine shop.” A similar approach could be instituted in the 21st century, except that young men and young women would both take home economics and wood shop/machine shop courses, the aim being to make men and women domestically competent and complementary. In addition to courses, men should also be encouraged—the way they are with sports, for example—to routinely view home-improvement programs so as to hone their prowess in the home.) Until such policies and practices are implemented, “The Masculine Obsolete” will likely persist; modern women will continue to regard men as incompatible for marriage or long-term cohabitation; and the institution of marriage, the cornerstone of human society, will decline even further.

The Rich History of Penny Loafers–One of the All-Time Classic Shoes

Loafers

Around 1930, Norwegian shoemaker Nils Gregoriusson Tveranger (1874-1953) introduced a new slip-on design, which he called the “Aurland moccasin.” (The shoe would later come to be called the “Aurland shoe”).

As a boy of 13, Tveranger traveled to North America, where he learned the craft of shoemaking. At age 20, he returned to Norway, apparently influenced by what he had experienced in the New World, for his “Aurland moccasin” resembled the moccasins typically worn by the Iroquois (as well as the moccasin-like shoe traditionally worn by the local people of Aurland). Shortly thereafter, Norwegians began exporting the shoe to the rest of Europe. Americans visiting Europe took a liking to the shoe—perhaps because of its stylistic similarity to the Native American moccasin—so much so that Esquire magazine featured an article on the by-then-popular shoe. The article was visually enhanced by photographs of Norwegian farmers wearing the shoe in cattle loafing sheds, and the rest, as it is said, is history…. In the 1930s, the Spaulding family of New Hampshire, inspired by the Norwegian shoe, began manufacturing a similar moccasin-like shoe, which they called “loafers.” The appellation would eventually become a generic term for any moccasin-like slip-on shoe.

In 1934, G. H. Bass of Wilton, Maine, began making his version of the “loafer,” which he called “Weejuns,” a corrupted truncation of “Norwegians.” One of the distinguishing features of Bass’ design was a strip of leather stitched across the saddle of the shoe, the strip featuring a stylized crescent-shaped cutout. By the 1950s, “Weejuns” had achieved ubiquity amongst students, who would oftentimes slip a coin—usually a penny, enough in those days to purchase one or two candies—into the crescent-shaped cutout for safekeeping. The shoes thus came to be known as “penny loafers,” a moniker that endures to this day.

From Native American moccasin to Northern European farmers’ shoe to Southern European summer shoe to classic collegiate footwear, the loafer—especially in America—has evolved into one of the all-time great fashion classics. Today, one would be hard-pressed to find a shoe manufacturer that does not have some version of the loafer in its collection. Besides being unisex, the shoe has acquired general acceptability: Men have been known to wear patent leather loafers with their tuxedos; lawyers wear calfskin loafers with their fine suits to court; university professors still regard them as essential to the academic wardrobe; and urban dwellers consider the loafer—in all its variations—the ultimate “city shoe.”

One of the greatest resurgences of the loafer occurred in the 1980s, when the 1950s-inspired secondary school and collegiate fashion called “preppy” (an affectionate derivative of “[college] preparatory”) became all the rage in the United States and then beyond. By the end of the ’80s, the loafer had come to symbolize a “nonchalance towards privilege.” It was not (and still is not) uncommon for a fashionable young man to wear a blazer with faded jeans and loafers—without socks, of course.

The History of Umbrellas

The Umbrella

The word “umbrella” derives from the Latin “umbra,” meaning “shade.” And “parasol” combines “para”—from the Latin “parare,” meaning “to shield” or “to ward off”—with “sol,” meaning “sun.” The archaeological record establishes the presence of the parasol and umbrella in many of the great cultures of the ancient world: in Egypt, Assyria, Ethiopia, India, Persia, Greece, and Rome; in the Mali, Ghana, and Songhai empires of West Africa; and among the Aztecs of Mexico, for example. But nowhere was the use of umbrellas and parasols more prominent than in ancient China, where the accessory appears in the record by the 11th century B.C.E.

Until the 18th century, parasols and umbrellas were used for protection from the sun, the difference being that parasols were carried over the person (by an attendant), while umbrellas were carried by the person (him/herself). For a brief time during the Roman era, people used umbrellas for protection from the rain, but the idea never garnered popular support. And in ancient Greece, it became popular for ladies to have parasols held over their heads at feasts in honor of Pallas Athena. As a result, the parasol came to be associated with women—especially those of the privileged classes.

The popularity of umbrellas in Europe dates from the 12th century, when Pope Alexander III presented the Doge of Venice with a parasol to be carried over his head (a custom that would endure until Napoleon Bonaparte brought an end to the Venetian Republic in 1797). By the 15th century, the umbrella had become a fashion accessory for shielding people from the sun.

According to the Oakthrift Corporation article titled “The History of Umbrellas—10 interesting facts you never knew,” umbrellas became popular amongst ladies in France in the 17th century, and by the 18th century, use of the accessory had spread across Europe. But umbrellas and parasols did not appear in England, Scotland, and Ireland until Portugal’s Catherine of Braganza married Charles II in 1662 and introduced the accessory to his realm. [It is believed that the superstition forbidding the opening of an umbrella indoors derives from the untimely death of Prince Rupert shortly after he had been presented with a gift of two umbrellas by the King of Bantam in 1682.]

Between 1685 and 1705, there were efforts, led by the English, perhaps on account of their notoriously rainy weather, to waterproof umbrellas. Unlike in the Roman era, the concept of using umbrellas for protection against the rain caught on in England. And the waterproofing of umbrellas gave rise in the mid-18th century to the distinction between the parasol as an accessory to shade one from the sun, and the umbrella for protection from the rain.

The earliest written record of a collapsible umbrella with bendable joints dates to 21 C.E. China, but the archaeological record suggests that such devices may have been in use as early as the 6th century B.C.E. in China. In 1786, John Beale registered the first umbrella patent; his design was of a circular, coned canopy supported by ribs connected to a central shaft. But it was Samuel Fox’s U-shaped steel-rib construction that revolutionized umbrella manufacture. Fox’s design is still used today.

Perhaps because of the precedent set by the Pallas Athena feasts of antiquity, the umbrella would remain an accessory used primarily by women—until Jonas Hanway popularized its use by English gentlemen in the middle of the 18th century, so much so that umbrellas would, for a time, come to be called “Hanways.” But even so, there was resistance to umbrellas: Hackney coachmen regarded the accessory as competition; and some members of the privileged classes regarded being seen in public with an umbrella as a tacit admission of one’s inability to afford a carriage. By the 19th century, perhaps in a reactionary, ostentatious display of wealth coupled with a bona fide need for portable protection from the rain, umbrellas had become fancy—with handles of precious metals studded with gemstones, for example. And by 1852 the umbrella had become such a necessity that whalebone was replaced by mass-produced steel ribs as the structural foundation of umbrellas.

Tanned skin became fashionable around the 1930s, and with it came the beginning of the end of the parasol. The umbrella, however, being a portable, practical shield from rain, remained popular.

In 1928, Hans Haupt’s pocket umbrellas became available on the market. In the 1950s, nylon replaced oiled cotton canvas as the fabric of choice for umbrella canopies. And in 1969, Totes, Inc., obtained a patent for the first functional folding umbrella. Today, in the United States alone, over 33 million umbrellas are sold per year.

The English have, by necessity, mastered the art of making umbrellas: English umbrellas are considered the best in the world. And of all English umbrella manufacturers, Swaine, Adeney, Brigg & Sons is considered the absolute best. The best “Brigg” umbrellas are those made of triple-woven silk, which expands slightly when wet, thereby rendering the fabric absolutely waterproof.

If a gentleman intends to invest in a good umbrella, he would be wisest to select one in the color black, for wherever umbrellas are appropriate, a black one is always most appropriate.

20/40/60 Marriage–Redefining the “Successful Marriage”

It is well documented that in free societies where nubile persons choose their marriage partners, roughly half of all marriages end in divorce. And it is fair to postulate that a significant percentage of the marriages that endure are not happy, satisfying unions. The institution of marriage in such societies, then—at least in its present expression—is significantly flawed, for a success rate of less than 50% would qualify most other things as in need of “improvement” or “major overhaul.”

Many people would agree that the concept of marriage—of two people officially and legally joining forces and resources to build a life together—is a good thing. After all, “life is hard,” so why “go it alone”? Besides, everyone needs someone to drop him off at the airport or pick him up off the bathroom floor if he falls and injures himself. But one of the fundamental flaws of marriage as presently defined is that it must endure for life in order to be regarded as “successful.” And it is that premise—codified in the “until death do us part” clause commonly found in religion-based marriage ceremonies—that generates much of the grief associated with the dissolution of marriages. Another fundamental flaw of marriage is the notion that spouses are self-contained, self-sufficient, autonomous units, capable of providing for all the needs and wants—emotional, sexual, financial, social—of each other for life. But that is simply too tall an order for many people.

While there is something sublime about two people meeting and falling in love in their 20s, getting married, building a life together, raising children and then grandchildren, then walking off, hand in hand, into the sunset of their lives, that is not always, or even typically, the case. And while such a scenario may be regarded as the ideal expression of marriage, it should not, in a free society, be regarded as the institution’s only valid expression. What reasonable person would insist that a Rolls-Royce is the only valid automobile, or that the Gucci loafer is the only legitimate loafer, or that filet mignon is the only cut of beef worth eating, or that the only ice cream worthy of that delicious appellation is Häagen-Dazs? If graduating summa cum laude were the only acknowledged way to graduate from college, very few people would possess acknowledged college degrees. In other words, in many facets of life, “good enough” is good enough. So why not apply that same standard when assessing the success or failure of a marriage?

In modern societies, where individual freedoms and pursuits are regarded as birthright, marriages that endure 10 years and beyond are increasingly being regarded as “successful” marriages. After all, in free, 21st-century societies, where people are expected to move in search of opportunity, where women have the means for independence, and where personal happiness is paramount, it is not unlikely that two people who are compatible today will become incompatible a decade later—at the fault of neither person. The fact is that people change—fundamentally—decade by decade. And a gentleman’s outlook on life in his 20s could be very different from that in his 30s, with both outlooks, even if diametrically opposed, being appropriate for their corresponding decade of life. Likewise, society also changes in fundamental ways: There was a time, for example, when the ideal career model was to secure employment upon leaving college or high school, work for the same enterprise for 40 or even 50 years, then retire (and hope to live long enough to enjoy retirement). In the 21st century, however, that outlook is almost inconceivable—for both employer and employee. Similarly, marriage-for-life may have been the only ideal model fifty years ago, when the genders were interdependent; when people were more religious and were married in religious ceremonies that incorporated the promise of marriage for life; and when, because of the preceding, there was a stigma attached to divorce. But by the 1970s, with the instituting of no-fault divorce, the rekindling of the Women’s Movement and the Sexual Revolution, the decline of religion (even if not spirituality), etc., society’s outlook on what constitutes a successful marriage began to be more broadly interpreted and defined.

Today, a significant percentage of marriageable persons in free societies marry more than once—and for good and different reasons. It is not uncommon for people to finally “get it right” only after their second or third attempt. So it would perhaps behoove society to own up to the fact of multiple marriages over the course of the typical lifetime and adopt an outlook on marriage that accommodates them.

Twenty-Forty-Sixty Marriage (or Stability/Sex/Compatibility Marriage)

Twenty-Forty-Sixty Marriage offers the social framework for the average person to marry three times over the course of his lifetime—having a “successful” marriage each time by accomplishing the realistic goals earmarked for each of the three tiers of marriage: marriage for stability in one’s 20s; marriage for sex in one’s 40s; and marriage for companionship in one’s 60s.

Twenties Marriage (Stability-Marriage)

One of the age-old problems with child-bearing is that marriage is the condition precedent to the “legitimacy” of children. But in a liberal, tolerant society where “family” is being redefined to include other models, and where same-sex marriage and the use of marijuana for medicinal purposes are now accepted, why should marriage remain the prerequisite for legitimacy? Why couldn’t legitimacy, for example, be accomplished via a contractual agreement between consenting parents who legally acknowledge paternity/maternity and agree to share equally in the responsibility of raising those children? Would women then feel less compelled to marry in their 20s simply to bear their children within the context of wedlock?

But 21st-century society, despite all its social advancements and inclinations towards tolerance, is not yet at that juncture. So for the time being, women continue to marry in their 20s—not necessarily because they want to be married, but because their 20s is the best decade for bearing children, even if not for raising them. Further complicating matters is the fact that their male contemporaries—young men in their 20s—typically do not have the financial or emotional wherewithal to support a child-bearing wife and the couple’s offspring. Consequently, many well-intentioned marriages between loving young people end in bad divorce. And it is bad divorce—not divorce in and of itself—that is the culprit. The days of “marriage come hell or high water” are practically over. Today, thankfully, people recognize that the right to marry is concomitant with the right to divorce, and that divorce, under certain circumstances, can be a very good thing.

Under the Twenty-Forty-Sixty Marriage model, a woman in her twenties would marry a man in his forties who is capable of providing a stable environment for the bearing and caring of children. The symbiotic trade-off is stability for the younger woman, sensuality for the older man. While for a woman in her 20s, marriage to a man twenty years her senior may not be as sexually rewarding as marriage to a man in his 20s, marriage to a financially and emotionally established older man is arguably the relationship that is in the best interest of child-rearing. (For many couples comprising two young adults in their twenties, the stresses of child-rearing, career, financial instability, compromised individuality, etc., are overwhelming).

The Twenty-Forty-Sixty Marriage model further postulates that men in their 20s should also marry for stability—to women in their 40s (some of whom will have obtained their stability whilst married to men 20 years older). A young gentleman’s marriage to a more mature, stable woman—without the burden of child-bearing/rearing (since, theoretically, his more mature wife would have already borne her children in her 20s with an older husband)—could serve as the perfect environment for a young man to obtain upper-level education, build his career, and mature emotionally. The benefit to the more mature wife is sex with a young, virile husband, thereby compensating for the sex she may have missed during her child-bearing twenties while married to a man in his forties.

Forties Marriage (Sex-Marriage)

Everyone has a right to experience sexual pleasure in his lifetime; it is his birthright. The sexual peak of the human animal—when his body, mind, and soul are most at equilibrium—occurs between the ages of 24 and 48. While there is a sublime beauty to young love-making, where two inexperienced lovers uncover the art of sex together through trial and error, there is wisdom in the notion that sex is best enjoyed when at least one partner is sexually experienced and can guide the other. In Forties Marriage, the older man, having learned in his 20s the art of pleasing a woman under the tutelage of his more mature wife, marries a woman in her 20s. He provides a stable environment for himself, her, and the children they will bear together, while his young wife invigorates his sex life. Similarly, a woman in her 40s, having borne her children with an older man and having learned the art of pleasing a man during her marriage to her first, older husband, will bestow that knowledge upon her young husband.

When Marriage for Stability and Marriage for Sex are combined, then, they allow for everyone—women and men—to experience the comforts of a stable environment in which to bear and rear children and to establish him/herself professionally, emotionally, and financially. Everyone gets a chance to do everything—well.

The Twenty-Forty-Sixty model gives rise to the question: How does divorce—even “good” divorce—impact children? Children are astoundingly resilient, malleable, and accommodating to change—much more so than adults. When parents uproot and move in pursuit of opportunity, for example, children move along with them and readjust. For children, “normal” is whatever has been presented to them as “normal”—even if, objectively, it is “abnormal.” Unlike children of the 1950s, for whom divorce was abnormal and oftentimes traumatic, 21st-century children regard divorce as “the new normal.” Rarely in the modern era is there a family without numerous examples of divorce. So children today understand and are comfortable with the concepts of shared custody, alternating holidays, multiple homes, etc. And children today feel that they and their parents are entitled to individual happiness. Amicable (“good”) divorce, where parents, in the best interest of the children, remain civil or friendly during and after divorce, tends to be far less traumatic for children and is even regarded by a growing number of them as the preferred alternative to a contentious or dysfunctional marriage.

Under the Twenty-Forty-Sixty model, “transitional divorce” is factored into the tiers of marriage from inception, thereby minimizing the occurrence of bad divorce while increasing the likelihood of divorce that is in the best interest of all parties involved. Under the Twenty-Forty-Sixty construct, transitional divorce is part and parcel of marriage. It is the norm. It gracefully (even if not seamlessly) allows for divorce to occur in Stability-Marriage and Sex-Marriage when the objectives of those marriages have been achieved. And it does not sever relationships on the emotional or spiritual level—only on the legal one. Transitional divorce is, of course, not mandatory: If both parties of a couple agree to transition together into the next tier of marriage, they are able to do so. Additionally, the Twenty-Forty-Sixty model in no way infringes upon the traditions or moral fabric of marriage for life. Proponents of marriage for life are free to pursue their traditional ideals.

Sixties Marriage (Compatibility-Marriage)

A person in his sixties is in a different place—physically, emotionally, spiritually, and socially—than a person in his forties: His priorities are fundamentally different (partly because people in their sixties are acutely aware that their lives are more than half-lived). Compatibility-Marriage allows for people in their sixties—after having raised their children, having had fulfilling sex lives, having attained their career goals, and having made their marks on the world—to align with each other for the sheer pleasure of companionship, with sex being, at best, incidental to or a perquisite of the union. It is not uncommon in modern, transient societies for people in their sixties to establish new homesteads for their retirement years, oftentimes geographically distancing themselves from their children and grandchildren in the process. Conversely, it is not uncommon for people in their sixties to be left behind by children seeking opportunities and establishing their own branches of family in other cities and countries. So many people in their sixties desire to be officially and legally assured of companionship that is likely to endure for the remainder of their lives: marriage. But unlike their previous marriages, the primary motivation for Compatibility-Marriage is friendship. Thus, in societies where same-sex marriage is legal, the pool of potential Compatibility-Marriage spouses is twice as large. Because of the nature of Compatibility-Marriage, the gender or sexual orientation of one’s spouse becomes less material. A platonic relationship is the foundation of Compatibility-Marriage. And its objective is to endure for the remainder of life.

But sex may still play a meaningful role in Compatibility-Marriage. Typically, by age 60, many people will have conquered or come to terms with their “hang-ups” about sex, so sex (to the extent that it exists) within the context of Compatibility-Marriage, can oftentimes be exciting and liberating. Because sex is not the primary motivation for Compatibility-Marriage, partners tend to be less sexually possessive or exclusive with their spouses. Compatibility-Marriage, therefore, is oftentimes open to non-committal extramarital sex, ménage à trois, hired sex (in jurisdictions where it is legal, of course), etc. After all, people in their 60s are grown people and should be mature enough to handle the intricacies, subtleties, and complexities of sex and sexual relationships. People in their sixties are also acutely aware that they are in the final phase of any meaningful sex life. So whatever sex they engage in during those years should be fulfilling, exciting, interesting, and engaged in for the purpose of strengthening their Compatibility-Marriage.

[  The Twenty-Forty-Sixty model also applies to same-sex marriage, except that the child-bearing component of Stability-Marriage (Twenties Marriage) does not apply to same-sex male marriage (though it does apply to same-sex female marriage). ]

The History of the Fork

The Fork

The fork was around long before it staked out its place on the dining table. The Egyptians used large forks for cooking; and the word “fork” derives from Latin “furca,” meaning “pitchfork.” As a dining utensil, however, the fork is believed to have originated in the Eastern Roman Empire, also known as the Byzantine Empire, where it was in common use by the 4th century. By the 10th century, the fork had become popular in Turkey and the Middle East, spreading thereafter to southern Europe by the second millennium.

The earliest forks had only two widely spaced tines, which were straight, not curved slightly upward as they are today. And their handles tended to be about four inches long and thin, with a circumference about half that of a modern-day drinking straw.

To a large extent, the popularity of forks in the West came literally and figuratively at the hands of two Byzantine princesses who married into Western aristocracy: Theophano, who in 972 C.E. married Otto II, King of Germany and Holy Roman Emperor (967-983); and Maria Argyropoulaina, who wed the son of the Doge of Venice in 1004. By the end of the 11th century, the table fork had become known in Italy amongst the wealthier classes. By the 14th century, the fork was clearly on its way towards being an accepted dining utensil in Italy, and its acceptance there grew steadily, the fork eventually becoming a typical household utensil by the 16th century, some 500 years after its introduction. In 1533, 14-year-old Catherine de’ Medici and her entourage introduced the fork to the French when she left Italy for France to marry the future King Henry II. During the Italian Renaissance, each guest would arrive with his own fork and spoon in a decorative box called a “cadena,” and Catherine and her court took that custom along with them to France. It was not uncommon for royals and nobles to have forks made of solid gold or silver, though iron and pewter, for example, were used for the forks of the less privileged.

By the 16th century, the fork had become a part of Italian etiquette, and Spain, Portugal, and France followed suit (though it is widely believed that the Infanta Beatrice of Portugal introduced the fork to her country in the middle of the 15th century). Thomas Coryate is credited with introducing forks to England in 1608 after seeing them in use in Italy during his travels; the initial English reaction was consistent with that of most of Europe—that forks were effeminate and pretentious. In much of northern Europe especially, where most eating was done with the hand or with the aid of a spoon when necessary, the fork was viewed as a decadent, Italian affectation. By the 18th century, however, most of Europe used the fork.

The fork design popular today, with its four slightly curved tines, was developed in France at the end of the 17th century and in Germany in the middle of the 18th century. It was not until the 19th century—almost 1500 years after it was first popularized in Byzantium—that the fork would become a household item in North America.

The History of Men’s Underwear–from the caveman’s loin cloth to the Calvin Klein boxer-brief

Underwear

The primary purpose of underwear is to protect principal garments from bodily soilure. (In a world without underwear, men who do not properly clean themselves would have “skid marks” on their Brooks Brothers suit-pants rather than on their Fruit of the Loom underpants! And thank God it is relatively inexpensive T-shirts—rather than relatively expensive dress shirts—that bear the brunt of those unsightly underarm deodorant and perspiration stains). Undergarments serve the additional purposes of supporting, contouring, and protecting certain body-parts. And in addition to providing warmth, underwear, if styled properly and selected appropriately, may enhance sex appeal.

History of Men’s Skivvies

Throughout history, the relationship between underwear and primary garments can best be characterized as “in-again, out-again.” For example, what would become the earliest manifestation of underwear—the loin cloth—actually began as outerwear. But as man’s ability to clothe himself evolved, the loin cloth lost its prominence as king of outerwear and was relegated to the humble position of underwear. When the naturally mummified body of Ötzi the “ice man,” who lived around 3300 B.C.E., was discovered in the Ötztal Alps between Italy and Austria in 1991, underneath his cloak of woven grass was a leather loin cloth, indicating that by 5,000 years ago, loin cloths had already become underwear rather than serving as the outerwear they once were for Ötzi’s cave-dwelling ancestors. And Egyptian tombs dating from as early as the second millennium B.C.E. contain extra supplies of linen loin cloths (which the Egyptians would wear under their linen skirts) to sustain them into the afterlife. The wrap-around loin cloth served as underwear for centuries until the Middle Ages—in the 13th century—when step-into, pull-up underwear was invented. Men would step into their drawers, called “braies” and typically made of linen, then secure them around the waist and at the mid-calf by tying or lacing. While the historical record suggests that men of all social classes wore braies during the Middle Ages, only men of the upper echelons wore “chausses”—tights that covered their feet and the lower portions of their legs.

During the Renaissance, chausses became form-fitting (like present-day hose) and covered the entire foot and leg, resulting in braies becoming shorter so as to allow for more of the chausses-covered legs to be exposed. During the Renaissance, then, long, skin-tight chausses and short braies—both regarded as underwear during the Middle Ages—became outerwear. (But see Piero della Francesca’s The Baptism of Christ [1450], where a half-dressed figure is wearing the mid-15th century equivalent of a Jockey-cut white brief).

Men’s underwear as it is primarily known today—the “tighty whitey” look—began taking form in the Victorian era. Until the early 19th century, underwear was made in the home, by hand, primarily of woven linen, cotton, wool, or silk fabrics. Underpants were loose-fitting, extended to the knees, and typically had a drawstring waist. Undershirts, also loose-fitting, resembled what is today referred to as the “painter’s shirt,” but without the collar. But it was the Industrial Revolution plus two pre-Victorian inventions—the knitting machine by William Lee in 1589, and Eli Whitney’s cotton gin in 1793—that led to the mass production of machine-knitted cotton underwear, beginning in the last decades of the 1800s. And for the first time, rather than treating “intimate apparel” intimately, making the garments by hand at home, men would purchase ready-made underwear from retail stores.

By the 1870s in the United States, the standard men’s underwear was the “union suit,” so called because it was an all-in-one, skin-tight undergarment typically made of machine-knitted cotton or wool. Union suits featured buttons down the center-front, from the crew neckline to the crotch, and covered men to their wrists and ankles (women and children also wore union suits). Some union suits were knee-length and sleeveless. The union suit would remain the gold standard in men’s underwear until the 1930s, when boxers and briefs became preferred. By the late 1800s, men’s knitted underwear was also being made in two separate parts: a long-sleeved top, and long-legged pants. Unlike union suits, the separate tops were buttoned only quarter-way down from the neck. And it was those undershirts that became the inspiration for what would become the Henley shirt (See Henley Shirt above, this chapter). The separate underpants were secured at the waist by buttons, snaps, tie-closures, or drawstring. (In World War II, long, skin-tight underpants that extended to the ankles—but were not connected to a top—were issued to American soldiers. And because they resembled the boxing gear prizefighter John L. Sullivan would wear during the height of his career between 1882 and 1892, they were dubbed “Long Johns,” a term still applied to fitted, long-legged underpants).

Elastic, invented by Thomas Hancock in 1820, revolutionized underwear over a century later in the 1930s by simplifying it: Underwear could be put on and taken off easily; buttons, snaps, and tie-closures became superfluous or irrelevant (During World War II, with a shortage of the rubber needed to manufacture elastic, buttons, “French backs,” and snaps were again used as fasteners and size-adjusters in men’s underwear); and since underpants could be secured at the waist (rather than supported from the shoulders), the union suit fell out of favor, boxers and briefs becoming all the rage. The 1930s also saw the introduction of functional design elements—such as the Y-vent, access-flap, and “kangaroo pouch”—to boxers and briefs. And leg, arm, and neck bands were added to knit briefs and undershirts for enhanced fit and aesthetics.

Color was introduced to underwear in the 1940s. And as with so many other fashion trends throughout history, the inspiration came from the military: During World War II, soldiers were issued olive-drab undergarments, which, when hung out to dry on battlefields, provided better camouflage than traditional white underwear. Another major advancement in underwear in the 1940s came as a result of nylon, which served as the foundation of a new outlook on the cut and fit of men’s underwear. The use of Sanforized (preshrunk) fabrics also became commonplace in the production of underwear in the 1940s—and for good cause: before that technology, men would have to buy underwear one size larger in order to accommodate shrinkage when laundered.

The in-again, out-again relationship between underwear and primary garments was perhaps epitomized in the 1950s when the T-shirt became a popular outer garment for young men (See The T-shirt above). It was also in that decade that patterned briefs became popular. For the first time, underwear was used to express personal style or to make fashion statements. Synthetic fabrics allowed for innovative designs, and color and patterns (some whimsical) became commonplace. The bikini brief, inspired by women’s swimwear of the late 1940s, made its menswear debut in the 1950s—in everything from salacious animal-skin prints to peekaboo mesh fabrics.

Led by Italy’s Peppino Gheduzzi, the 1960s was the decade of overall elasticity in men’s underwear. Rather than using elastic only in waistbands, manufacturers used elasticized fabrics such as Spandex to construct entire undergarments, thereby allowing for close-fitting silhouettes and support. The stretchability of such fabrics also enabled smaller, less cumbersome underwear. By the end of the 1960s, however, with the coming of the Hippie Movement and the attendant Sexual Revolution, some men completely eschewed underpants, even as undershirts became evermore popular, being elevated not only to shirt-status, but also to “talking-shirt” status as men used words, symbols, and other graphics on T-shirts to make social and political statements.

If the Sexual Revolution of the 1960s encouraged men to take off their conventional underpants, then the sexy underwear of the ’70s and the designer underwear of the ’80s encouraged them to put their “undies” back on. While the design and marketing of women’s underwear had for decades embraced a sexual component (à la Frederick’s of Hollywood, established in 1947 by Frederick Mellinger, inventor of the push-up bra), it was not until the 1970s that men’s underwear became intentionally sexual. And as the sexual element got bigger, the underwear itself got smaller—and tighter. By the mid-1980s, designers, led by Calvin Klein, were styling sexy underwear and marketing it in sexy packaging typically depicting muscular male models—like Antonio Sabato, Jr., and Mark “Marky Mark” Wahlberg—packing “six-pack” abdomens. In addition to fitted boxer-briefs (a hybrid featuring the length of traditional boxer shorts, but with the close fit of traditional knit briefs), men started wearing thong, jockstrap-inspired, and G-string underwear. And just as women’s underwear offered support and enhancement features, some men’s underwear was designed and constructed to support and enhance the male anatomy.

One of the most appreciated advancements in men’s underwear in the 21st century came from HanesBrands: In a campaign that used basketball legend Michael Jordan as its spokesperson, the company led the way in replacing abrasive, annoying, skin-irritating manufacturers’ labels with labels printed directly onto the inside fabric of underwear—much to the delight of customers. Other major underwear manufacturers immediately followed suit.

The internet has also significantly advanced men’s underwear. During the first half of the 20th century, only a handful of manufacturers were regarded as viable in the underwear industry: Fruit of the Loom, Jockey, Hanes, Zimmerli of Switzerland, and BVD, for example. Then in the 1980s and ’90s, with the rising popularity of “designer underwear,” the market expanded to include such labels as Calvin Klein, 2(x)ist, Pierre Cardin, and Perry Ellis. Still, though, without major financial backing and access to and a presence in major retail establishments, small underwear companies struggled for market share. A few companies such as International Male cultivated niche markets through direct-mail catalog sales. But with the popularity of the internet, beginning in the late 1990s, small, “boutique” underwear manufacturers have been able to cultivate wide-reaching customer bases with their innovative designs, resulting in the commercial success of companies like Andrew Christian, C-IN2, Diesel, Jack Adams, and Good Devil. In addition, hosting sites such as www.UnderGear.com and www.InternationalJock.com serve as umbrellas for a plethora of up-and-coming men’s underwear companies.

And the in-again, out-again relationship between underwear and primary garments appears to be alive and well going into the 21st century. The tank top is now treated as a shirt in the young, well-muscled, Western man’s wardrobe, and the hip-hop tradition of wearing loose-fitting pants low on the hips so as to deliberately display boxer shorts is, for the most part, regarded as a fact (albeit an unfortunate one) of 21st-century life. Even women have embraced the notion of underwear as primary garments: Since the “grunge look” of the 1990s, the daughters and granddaughters of the women who defiantly burned their bras in the 1960s have been deliberately showing their bra straps, wearing bras with garments designed to be worn without them.

Care

Undergarments are the first line of defense against bodily odor and soilure and should therefore be laundered with special care—above and beyond that stated on the manufacturers’ care labels. Because undergarments are worn directly against the skin, they are most effectively laundered inside-out after being pre-soaked for about 24 hours. The armholes of undershirts, for example, should be given special care, being pre-washed by hand. And men who do not use or have access to bidets or wet-wipes should pay special attention to their underpants to ensure that they emerge stain-free from the laundering process. No lover needs to remove a gentleman’s sexy underwear only to have a head-on collision with skidmarks! Conventional deodorants tend to cause armhole discoloration on white T-shirts, but mineral-salt rock deodorants are a good, stain- and residue-free alternative.

How Should Transgender People Conduct Themselves in Gender-Specific Public Restrooms?

Transgender Persons in Shared, Public, Gender-Designated Restrooms

A “transgender male” is a person with a female or partly female anatomy who self-identifies as male. A “transgender female” is a person with a male or partly male anatomy who self-identifies as female.

A transgender person whose outward appearance is consistent with the gender with which he or she identifies should use public facilities consistent with his/her self-identified gender.

When using public restroom facilities, a transgender person should, to the extent possible, conduct him/herself in a manner regarded as generally consistent with the gender with which he or she identifies. For example, a transgender female, when using a shared women’s restroom, should sit to urinate even though her anatomy would allow for urinating from a standing position—the reason being that urinating from an upright position could alert, alarm, and make uncomfortable other women in the facility. A transgender male, on the other hand, should not attempt to urinate at a stand-up urinal in a male restroom, but should, instead, urinate in a toilet stall from a sitting or stooping position.

A transgender male desiring access to a dispenser of feminine products within a female restroom facility should not enter the female facility, but should, instead, request the kind assistance of custodial personnel, or of a female entering or exiting the facility, to secure the desired items.

Inter-sex, inter-gender, bi-gender, gender-neutral, gender-ambiguous, gender-fluid, gender-curious, a-gender, etc., persons should utilize the gender-specific public restroom that is consistent or more consistent with their outward appearance, conducting themselves accordingly therein.

As with all other matters of etiquette, one’s personal needs and wants must be balanced as against those of others. The objective should be for everyone using public restrooms to be comfortable and at ease.

With the increasing acknowledgment and acceptance of transgender people, business establishments and municipalities are increasingly offering gender-neutral restroom facilities with individual stalls, or offering gender-neutral accommodations in addition to gender-specific ones. But in the meantime, the onus is on gentlemen to make the transition as comfortable and as seamless as possible.