Millennials to business: Social responsibility isn’t optional

By Michelle Nunn, Washington Post, December 20, 2011

Michelle Nunn is CEO of Points of Light Institute, a nonprofit, nonpartisan volunteer organization with more than 20 years of history. She is also the co-founder of the HandsOn Network, the volunteer-focused arm of the Points of Light Institute.

Excerpt

…As consumers, employees and entrepreneurs, Millennials are shifting the norms of corporate America’s conduct, ethical imperatives and purpose. In his book, “The Way We’ll Be,” pollster John Zogby documents how these “First Globals” are more conscientious consumers than their predecessors, demanding greater honesty and accountability from businesses.

Millennials are bringing their values into the career equation by placing a premium on employers’ reputation for social responsibility and the opportunities those companies and organizations provide their employees to make a positive impact on society…

Millennials, as consumers, are pushing companies to change their ways of doing business to align with the values of civic and global responsibility largely held by Millennials…

While Millennials are transforming established businesses, they are also starting a new breed of businesses with built-in social missions that are resonating with the marketplace and revolutionizing their sectors…The values behind Occupy Wall Street are manifesting themselves in the marketplace, and companies that are failing to take notice should start…

A new generation of employees, consumers and entrepreneurs is stepping forward with a better way of doing business — putting its bets on the goodness of people rather than loading the dice in its own favor.

Full text

The Occupy Wall Street movement is largely fueled by a relatively small set of young people who view the protests as a fight for their future. The vast majority, however, are getting up and going to work every day — or wishing they could. These individuals are part of a less dramatic but, perhaps, equally powerful movement of Millennials shaping the future of business. As consumers, employees and entrepreneurs, Millennials are shifting the norms of corporate America’s conduct, ethical imperatives and purpose. In his book, “The Way We’ll Be,” pollster John Zogby documents how these “First Globals” are more conscientious consumers than their predecessors, demanding greater honesty and accountability from businesses.

Millennials are bringing their values into the career equation by placing a premium on employers’ reputation for social responsibility and the opportunities those companies and organizations provide their employees to make a positive impact on society. Sixty-one percent of 18- to 26-year-olds polled in a 2011 Deloitte Volunteer IMPACT survey said they would prefer to work for a company that offers volunteer opportunities. Over the past decade, this generational shift has pushed these programs to be more sophisticated, generating billions of dollars of pro bono support for nonprofits and activating millions of skilled volunteers.

Even as the economy has slowed, companies are expanding volunteer programs because these programs attract, develop, motivate and retain the most dynamic and passionate employees. The most innovative of these companies also understand these programs as critical to their bottom line. IBM’s Corporate Service Corps, launched in 2008, has deployed 1,200 IBMers to more than 20 countries, in both a highly competitive leadership development program and a rigorous endeavor to bring the corporation’s skills to bear on complex problems in developing communities. IBM Chairman and CEO Sam Palmisano said at the program’s founding that “we fully expect [this] will make IBM a more competitive and successful business.”

Millennials, as consumers, are pushing companies to change their ways of doing business to align with the values of civic and global responsibility largely held by Millennials. Monitoring supply chains, safeguarding labor and environmental conditions for the creation of products and embracing environmental sustainability have become basic requirements to preserve relationships with customers and retain young employees. A recent market study by the public relations firm Edelman shows that consumers now expect brands to support causes. Many companies are responding to this market shift in ways that integrate causes fully into their business strategy and brand identities. Earlier this year, the Millennial founders of GOOD Magazine launched a subsidiary consulting business called GOOD/Corps, which is helping some of the world’s most recognizable brands navigate and profit from what they call the “Values Revolution” driven by this generation. Companies like Pepsi, Toyota and Starbucks are seeking their guidance on building the meaningful connections that these consumers demand.

While Millennials are transforming established businesses, they are also starting a new breed of businesses with built-in social missions that are resonating with the marketplace and revolutionizing their sectors. TOMS Shoes was founded in 2006 by 30-year-old Blake Mycoskie and has quickly become one of the fastest-growing apparel companies in the world. Well known for its groundbreaking “One-for-One” model that donates a pair of shoes in the developing world for every pair sold, it is also growing a fiercely loyal and active following through its anti-poverty advocacy efforts. It is hard to imagine a traditional shoe brand being able to mobilize a network of 1,200 campus chapters and 250,000 young people in a single day to promote its brand, but that is exactly what TOMS has accomplished with its “One Day Without Shoes” campaign.

Despite the economic downturn and the headlines, the nation’s private sector is still lively. The values behind Occupy Wall Street are manifesting themselves in the marketplace, and companies that are failing to take notice should start. These people-powered movements may not have stopped the markets in their tracks, but they are creating the demand for new forms of corporate behavior and ethical imperatives. The winning brands of the future will be ones that authentically respond.

This may result in an aligning of private-sector muscle to address the very inequities, lack of transparency and poverty that Occupy Wall Street has spotlighted. A new generation of employees, consumers and entrepreneurs is stepping forward with a better way of doing business — putting its bets on the goodness of people rather than loading the dice in its own favor.

http://www.washingtonpost.com/national/on-innovations/millennials-to-business-social-responsibility-isnt-optional/2011/12/16/gIQA178D7O_story.html?wpisrc=nl_headlines

The Young Are the Restless

By CHARLES M. BLOW, New York Times, April 5, 2013

Excerpt

The surge of generational change continues in this country, altering the cultural landscape with a speed and intensity that has rarely — if ever — been seen before…millennials (defined by Pew as people born in 1981 or later), Generation Xers (those born between 1965 and 1980) and baby boomers (those born between 1946 and 1964)…The millennial generation is the generation of change. Millennials’ views on a broad range of policy issues are so different from older Americans’ perspectives that they are likely to reshape the political dialogue faster than the political class can catch up…a generation bent on rapid change — even if that means standing alone…Young people also are the least religious (more than a quarter specify no religion when asked), and they are an increasingly diverse group of voters. Fifty-eight percent of voters under 30 were white non-Hispanic in 2012, down from 74 percent in 2000. Like it or not, younger Americans are thirsty for change that lines up with their more liberal cultural worldview. Advantage Democrats.

Full text

The surge of generational change continues in this country, altering the cultural landscape with a speed and intensity that has rarely — if ever — been seen before.

The latest remarkable change concerns the decriminalization of the use of marijuana. A poll released Thursday by the Pew Research Center found that for the first time more Americans support legalizing marijuana use than oppose it.

It was rather unsurprising that more young people would support the move, but it was striking how quickly they adopted a more liberal position. About seven years ago, millennials (defined by Pew as people born in 1981 or later), Generation Xers (those born between 1965 and 1980) and baby boomers (those born between 1946 and 1964) shared the same view on marijuana: Only about a third thought it should be legalized. Since then, the share of millennials supporting its legalization has risen by more than 90 percent. Meanwhile, the number of legalization supporters in Generation X and among the baby boomers has risen by no more than 60 percent.

The millennial generation is the generation of change. Millennials’ views on a broad range of policy issues are so different from older Americans’ perspectives that they are likely to reshape the political dialogue faster than the political class can catch up.

I surveyed the past six months of Pew and Gallup polls to better understand the portrait of a generation bent on rapid change — even if that means standing alone.

ON GAY MARRIAGE Much has been made of the growing acceptance of same-sex marriage in this country, but a Pew poll last month found that the change is driven mainly by millennials. Theirs was the only generation in which a majority (70 percent) supported same-sex marriage; theirs was also the only generation even more likely to be in favor of it in 2013 than in 2012, as support in the other generations ticked down. The longer-term picture is even more telling. Support for same-sex marriage among Generation X is the same in 2013 as it was in 2001 (49 percent). But among millennials, support is up 40 percent since 2003, the first year they were included in the survey.

Some of this no doubt is the result of younger adults’ having more exposure to people who openly identify as LGBT. According to an October Gallup poll, young adults between 18 and 30 were at least twice as likely to identify as LGBT as any other age group.

But this doesn’t necessarily mean that millennials overwhelmingly agree, on a moral level, with same-sex relationships. In fact, a survey released last year by the Berkley Center for Religion, Peace and World Affairs at Georgetown University in conjunction with the Public Religion Research Institute found that they “are nearly evenly divided over whether sex between two adults of the same gender is morally acceptable.”

ON GUN CONTROL According to a February Gallup report, Americans ages 18 to 29 are the least likely to own guns, with just 20 percent saying that they do. That is well under the national average of 30 percent of Americans who own guns.

And in a Pew poll taken shortly after the Newtown, Conn., shootings, younger Americans were the most likely to say that gun control was a bigger concern in this country than protecting the right to own a gun. (Younger respondents barely edged out seniors with this sentiment.)

In fact, a Gallup poll found that the percentage of those 18 to 34 years old saying they want the nation’s gun laws and policies to be stricter doubled from January 2012 to 2013. No other age group saw such a large increase.

It is remarkable that young people’s opinions shifted so dramatically, especially since a December Pew poll found that young adults under 30 were the least likely to believe that the shootings in Newtown reflect broader problems in American society. This age group was, in fact, the most likely to believe that such shootings are simply the isolated acts of troubled individuals.

Young people also are the least religious (more than a quarter specify no religion when asked), and they are an increasingly diverse group of voters. Fifty-eight percent of voters under 30 were white non-Hispanic in 2012, down from 74 percent in 2000. Like it or not, younger Americans are thirsty for change that lines up with their more liberal cultural worldview.

Advantage Democrats.

http://www.nytimes.com/2013/04/06/opinion/blow-the-young-are-the-restless.html?nl=todaysheadlines&emc=edit_th_20130406&_r=0

The Twinkie Manifesto

By PAUL KRUGMAN, New York Times, November 18, 2012

The Twinkie, it turns out, was introduced way back in 1930. In our memories, however, the iconic snack will forever be identified with the 1950s, when Hostess popularized the brand by sponsoring “The Howdy Doody Show.” And the demise of Hostess has unleashed a wave of baby boomer nostalgia for a seemingly more innocent time.

Needless to say, it wasn’t really innocent. But the ’50s — the Twinkie Era — do offer lessons that remain relevant in the 21st century. Above all, the success of the postwar American economy demonstrates that, contrary to today’s conservative orthodoxy, you can have prosperity without demeaning workers and coddling the rich.

Consider the question of tax rates on the wealthy. The modern American right, and much of the alleged center, is obsessed with the notion that low tax rates at the top are essential to growth. Remember that Erskine Bowles and Alan Simpson, charged with producing a plan to curb deficits, nonetheless somehow ended up listing “lower tax rates” as a “guiding principle.”

Yet in the 1950s incomes in the top bracket faced a marginal tax rate of 91, that’s right, 91 percent, while taxes on corporate profits were twice as large, relative to national income, as in recent years. The best estimates suggest that circa 1960 the top 0.01 percent of Americans paid an effective federal tax rate of more than 70 percent, twice what they pay today.

Nor were high taxes the only burden wealthy businessmen had to bear. They also faced a labor force with a degree of bargaining power hard to imagine today. In 1955 roughly a third of American workers were union members. In the biggest companies, management and labor bargained as equals, so much so that it was common to talk about corporations serving an array of “stakeholders” as opposed to merely serving stockholders.

Squeezed between high taxes and empowered workers, executives were relatively impoverished by the standards of either earlier or later generations. In 1955 Fortune magazine published an essay, “How top executives live,” which emphasized how modest their lifestyles had become compared with days of yore. The vast mansions, armies of servants, and huge yachts of the 1920s were no more; by 1955 the typical executive, Fortune claimed, lived in a smallish suburban house, relied on part-time help and skippered his own relatively small boat.

The data confirm Fortune’s impressions. Between the 1920s and the 1950s real incomes for the richest Americans fell sharply, not just compared with the middle class but in absolute terms. According to estimates by the economists Thomas Piketty and Emmanuel Saez, in 1955 the real incomes of the top 0.01 percent of Americans were less than half what they had been in the late 1920s, and their share of total income was down by three-quarters.

Today, of course, the mansions, armies of servants and yachts are back, bigger than ever — and any hint of policies that might crimp plutocrats’ style is met with cries of “socialism.” Indeed, the whole Romney campaign was based on the premise that President Obama’s threat to modestly raise taxes on top incomes, plus his temerity in suggesting that some bankers had behaved badly, were crippling the economy. Surely, then, the far less plutocrat-friendly environment of the 1950s must have been an economic disaster, right?

Actually, some people thought so at the time. Paul Ryan and many other modern conservatives are devotees of Ayn Rand. Well, the collapsing, moocher-infested nation she portrayed in “Atlas Shrugged,” published in 1957, was basically Dwight Eisenhower’s America.

Strange to say, however, the oppressed executives Fortune portrayed in 1955 didn’t go Galt and deprive the nation of their talents. On the contrary, if Fortune is to be believed, they were working harder than ever. And the high-tax, strong-union decades after World War II were in fact marked by spectacular, widely shared economic growth: nothing before or since has matched the doubling of median family income between 1947 and 1973.

Which brings us back to the nostalgia thing.

There are, let’s face it, some people in our political life who pine for the days when minorities and women knew their place, gays stayed firmly in the closet and congressmen asked, “Are you now or have you ever been?” The rest of us, however, are very glad those days are gone. We are, morally, a much better nation than we were. Oh, and the food has improved a lot, too.

Along the way, however, we’ve forgotten something important — namely, that economic justice and economic growth aren’t incompatible. America in the 1950s made the rich pay their fair share; it gave workers the power to bargain for decent wages and benefits; yet contrary to right-wing propaganda then and now, it prospered. And we can do that again.

http://www.nytimes.com/2012/11/19/opinion/krugman-the-twinkie-manifesto.html?nl=todaysheadlines&emc=edit_th_20121119&_r=0

How to Live Without Irony

By CHRISTY WAMPOLE, New York Times, November 17, 2012

If irony is the ethos of our age – and it is – then the hipster is our archetype of ironic living.

The hipster haunts every city street and university town. Manifesting a nostalgia for times he never lived himself, this contemporary urban harlequin appropriates outmoded fashions (the mustache, the tiny shorts), mechanisms (fixed-gear bicycles, portable record players) and hobbies (home brewing, playing trombone). He harvests awkwardness and self-consciousness. Before he makes any choice, he has proceeded through several stages of self-scrutiny. The hipster is a scholar of social forms, a student of cool. He studies relentlessly, foraging for what has yet to be found by the mainstream. He is a walking citation; his clothes refer to much more than themselves. He tries to negotiate the age-old problem of individuality, not with concepts, but with material things.

He is an easy target for mockery. However, scoffing at the hipster is only a diluted form of his own affliction. He is merely a symptom and the most extreme manifestation of ironic living. For many Americans born in the 1980s and 1990s – members of Generation Y, or Millennials – particularly middle-class Caucasians, irony is the primary mode with which daily life is dealt. One need only dwell in public space, virtual or concrete, to see how pervasive this phenomenon has become. Advertising, politics, fashion, television: almost every category of contemporary reality exhibits this will to irony.

Take, for example, an ad that calls itself an ad, makes fun of its own format, and attempts to lure its target market to laugh at and with it. It pre-emptively acknowledges its own failure to accomplish anything meaningful. No attack can be set against it, as it has already conquered itself. The ironic frame functions as a shield against criticism. The same goes for ironic living. Irony is the most self-defensive mode, as it allows a person to dodge responsibility for his or her choices, aesthetic and otherwise. To live ironically is to hide in public. It is flagrantly indirect, a form of subterfuge, which means etymologically to “secretly flee” (subter + fuge). Somehow, directness has become unbearable to us.

How did this happen? It stems in part from the belief that this generation has little to offer in terms of culture, that everything has already been done, or that serious commitment to any belief will eventually be subsumed by an opposing belief, rendering the first laughable at best and contemptible at worst. This kind of defensive living works as a pre-emptive surrender and takes the form of reaction rather than action.

Life in the Internet age has undoubtedly helped a certain ironic sensibility to flourish. An ethos can be disseminated quickly and widely through this medium. Our incapacity to deal with the things at hand is evident in our use of, and increasing reliance on, digital technology. Prioritizing what is remote over what is immediate, the virtual over the actual, we are absorbed in the public and private sphere by the little devices that take us elsewhere.

Furthermore, the nostalgia cycles have become so short that we even try to inject the present moment with sentimentality, for example, by using certain digital filters to “pre-wash” photos with an aura of historicity. Nostalgia needs time. One cannot accelerate meaningful remembrance.

While we have gained some skill sets (multitasking, technological savvy), other skills have suffered: the art of conversation, the art of looking at people, the art of being seen, the art of being present. Our conduct is no longer governed by subtlety, finesse, grace and attention, all qualities more esteemed in earlier decades. Inwardness and narcissism now hold sway.

Born in 1977, at the tail end of Generation X, I came of age in the 1990s, a decade that, bracketed neatly by two architectural crumblings – of the Berlin Wall in 1989 and the Twin Towers in 2001 – now seems relatively irony-free. The grunge movement was serious in its aesthetics and its attitude, with a combative stance against authority, which the punk movement had also embraced. In my perhaps over-nostalgic memory, feminism reached an unprecedented peak, environmentalist concerns gained widespread attention, questions of race were more openly addressed: all of these stirrings contained within them the same electricity and euphoria touching generations that witness a centennial or millennial changeover.

But Y2K came and went without disaster. We were hopeful throughout the ’90s, but hope is such a vulnerable emotion; we needed a self-defense mechanism, for every generation has one. For Gen Xers, it was a kind of diligent apathy. We actively did not care. Our archetype was the slacker who slouched through life in plaid flannel, alone in his room, misunderstood. And when we were bored with not caring, we were vaguely angry and melancholic, eating anti-depressants like they were candy.

FROM this vantage, the ironic clique appears simply too comfortable, too brainlessly compliant. Ironic living is a first-world problem. For the relatively well educated and financially secure, irony functions as a kind of credit card you never have to pay back. In other words, the hipster can frivolously invest in sham social capital without ever paying back one sincere dime. He doesn’t own anything he possesses.

Obviously, hipsters (male or female) produce a distinct irritation in me, one that until recently I could not explain. They provoke me, I realized, because they are, despite the distance from which I observe them, an amplified version of me.

I, too, exhibit ironic tendencies. For example, I find it difficult to give sincere gifts. Instead, I often give what in the past would have been accepted only at a White Elephant gift exchange: a kitschy painting from a thrift store, a coffee mug with flashy images of “Texas, the Lone Star State,” plastic Mexican wrestler figures. Good for a chuckle in the moment, but worth little in the long term. Something about the responsibility of choosing a personal, meaningful gift for a friend feels too intimate, too momentous. I somehow cannot bear the thought of a friend disliking a gift I’d chosen with sincerity. The simple act of noticing my self-defensive behavior has made me think deeply about how potentially toxic ironic posturing could be.

First, it signals a deep aversion to risk. As a function of fear and pre-emptive shame, ironic living bespeaks cultural numbness, resignation and defeat. If life has become merely a clutter of kitsch objects, an endless series of sarcastic jokes and pop references, a competition to see who can care the least (or, at minimum, a performance of such a competition), it seems we’ve made a collective misstep. Could this be the cause of our emptiness and existential malaise? Or a symptom?

Throughout history, irony has served useful purposes, like providing a rhetorical outlet for unspoken societal tensions. But our contemporary ironic mode is somehow deeper; it has leaked from the realm of rhetoric into life itself. This ironic ethos can lead to a vacuity and vapidity of the individual and collective psyche. Historically, vacuums eventually have been filled by something – more often than not, a hazardous something. Fundamentalists are never ironists; dictators are never ironists; people who move things in the political landscape, regardless of the sides they choose, are never ironists.

Where can we find other examples of nonironic living? What does it look like? Nonironic models include very young children, elderly people, deeply religious people, people with severe mental or physical disabilities, people who have suffered, and those from economically or politically challenged places where seriousness is the governing state of mind. My friend Robert Pogue Harrison put it this way in a recent conversation: “Wherever the real imposes itself, it tends to dissipate the fogs of irony.”

Observe a 4-year-old child going through her daily life. You will not find the slightest bit of irony in her behavior. She has not, so to speak, taken on the veil of irony. She likes what she likes and declares it without dissimulation. She is not particularly conscious of the scrutiny of others. She does not hide behind indirect language. The most pure nonironic models in life, however, are to be found in nature: animals and plants are exempt from irony, which exists only where the human dwells.

What would it take to overcome the cultural pull of irony? Moving away from the ironic involves saying what you mean, meaning what you say and considering seriousness and forthrightness as expressive possibilities, despite the inherent risks. It means undertaking the cultivation of sincerity, humility and self-effacement, and demoting the frivolous and the kitschy on our collective scale of values. It might also consist of an honest self-inventory.

Here is a start: Look around your living space. Do you surround yourself with things you really like or things you like only because they are absurd? Listen to your own speech. Ask yourself: Do I communicate primarily through inside jokes and pop culture references? What percentage of my speech is meaningful? How much hyperbolic language do I use? Do I feign indifference? Look at your clothes. What parts of your wardrobe could be described as costume-like, derivative or reminiscent of some specific style archetype (the secretary, the hobo, the flapper, yourself as a child)? In other words, do your clothes refer to something else or only to themselves? Do you attempt to look intentionally nerdy, awkward or ugly? In other words, is your style an anti-style? The most important question: How would it feel to change yourself quietly, offline, without public display, from within?

Attempts to banish irony have come and gone in past decades. The loosely defined New Sincerity movements in the arts that have sprouted since the 1980s positioned themselves as responses to postmodern cynicism, detachment and meta-referentiality. (New Sincerity has recently been associated with the writing of David Foster Wallace, the films of Wes Anderson and the music of Cat Power.) But these attempts failed to stick, as evidenced by the new age of Deep Irony.

What will future generations make of this rampant sarcasm and unapologetic cultivation of silliness? Will we be satisfied to leave an archive filled with video clips of people doing stupid things? Is an ironic legacy even a legacy at all?

The ironic life is certainly a provisional answer to the problems of too much comfort, too much history and too many choices, but it is my firm conviction that this mode of living is not viable and conceals within it many social and political risks. For such a large segment of the population to forfeit its civic voice through the pattern of negation I’ve described is to siphon energy from the cultural reserves of the community at large. People may choose to continue hiding behind the ironic mantle, but this choice equals a surrender to commercial and political entities more than happy to act as parents for a self-infantilizing citizenry. So rather than scoffing at the hipster – a favorite hobby, especially of hipsters – determine whether the ashes of irony have settled on you as well. It takes little effort to dust them away.

Christy Wampole is an assistant professor of French at Princeton University. Her research focuses primarily on 20th- and 21st-century French and Italian literature and thought.

http://opinionator.blogs.nytimes.com/2012/11/17/how-to-live-without-irony/?nl=todaysheadlines&emc=edit_th_20121118

The mythology of the 1980s still defines our thinking on everything from militarism, to greed, to race relations

by David Sirota, http://inthearena.blogs.cnn.com 

ONLY ON THE BLOG: Answering today’s six OFF-SET questions is David Sirota, author of the new book, “Back to Our Future: How the 1980s Explain the World We Live in Now—Our Culture, Our Politics, Our Everything.” 

Sirota is a journalist, nationally syndicated newspaper columnist, and host of a daily talk show on KKZN-AM in Denver. He is also a senior editor at “In These Times” magazine and a contributor to The Huffington Post.

Sirota will appear In the Arena in the near future. 

You begin your exploration by making the case that the political and cultural references from the 1980s have not only become cool again, but may be a way to explain our present-day issues and conflicts, and may even be influencing our thinking today. Please give us a few then-and-now examples.

 Consider, for instance, the Tea Party – a revival of what the New York Times called “modern Boston Tea Party” revolts against taxes on the eve of the 1980s. Notably, today’s iteration of this uprising regularly laces its rhetoric with revivalist paeans to the Eisenhower Era. Summarizing the sentiment, one Tea Partier said: “Things we had in the fifties were better.” 

This rhetoric has resonated because for many, it no longer stirs memories of the actual 1950s of Jim Crow laws, gender inequality and religious bigotry. Instead, it evokes the sanitized idea of “The Fifties” that was originally created in the 1980s through movies like Back to the Future, Stand By Me and Hoosiers, television shows like Happy Days and Laverne & Shirley, and rockabilly greaser bands like the Stray Cats. 

Same thing for the Tea Party’s use of red-baiting language that suggests the individual is more important than the common good. Though the Cold War ended years ago and though Ayn Rand is long dead, the bromides elicit Red Dawn fears and Michael Jordan dreams from a generation that grew up being taught to see ourselves as both Soviet-oppressed Wolverines and the next superstars singularly soaring to MVP awards – as long as we will ourselves to just do it. 

You write, “It is impossible to consider the enduring legacy of the 1980s without first returning to and prostrating ourselves at the altar of Michael J. Fox.” What is Fox’s enduring impact today? 

Michael J. Fox’s two most iconic characters in the 1980s were Marty McFly and Alex P. Keaton. Those two characters perfectly represent how the 1980s was revising and reimagining contemporary American history along ideological lines.

Think about it: Marty McFly was a suburban teen fleeing the cartoonized dangers of modern life (i.e., bazooka-wielding Libyan terrorists stalking the suburbs) into an idyllic Fifties of unity and safety. Alex P. Keaton, by contrast, spends his life lambasting his parents’ Sixties idealism.

This “Back to the Future”-versus-“Family Ties” war between the 1980s version of “The Fifties” (supposedly 100% unified, universally happy, optimistic, safe, etc.) and the 1980s version of “The Sixties” (supposedly 100% violent, chaotic, overly idealistic, etc.) defines our politics today.

We are, for instance, supposed to forget that America in the actual 1950s was basically an apartheid state, and also had a 90% top tax bracket. Likewise, we are supposed to forget that the 1960s saw great progress on civil rights and that liberals in the 1960s ultimately helped end the Vietnam War.

The dominant political narrative today – whether through the Tea Party or through criticisms of President Obama as a supposed “socialist” – tells us that if we only go back to “The Fifties” (i.e., the 1980s-revised memory of the 1950s) and shun “The Sixties” (i.e., the 1980s-revised memories of the 1960s) then our problems will be solved. It’s the replay of a bad 1980s movie – but it keeps playing.

You also make a case that the original “A-Team,” which reached new levels of prime time TV violence, may have something to do with how a generation views our government. How so? 

First, it’s important to remember just how influential the A-Team was among ‘80s kids – who are, of course, today’s world-shaping adults. Though it’s easy to retroactively trivialize that show, according to the New York Times in 1983, the program’s first season had a particularly “large following of teen-agers and children aged 6 to 11” and by its second season People magazine estimated that a whopping 7 million preteens were watching each week. So this was a show that was really shaping kids’ minds at precisely the moment that they are forming their storylines about the world.

And what is the storyline of the A-Team? It’s one of the single most anti-government parables of the modern age. From the beginning, we are told that the government wrongly accused and incarcerated these heroes; that the government is too inept to keep them incarcerated; that the A-Team is solving societal problems that the government refuses to solve; that the average person can find the A-Team but that the government can’t; and that the government is actually trying to stop the A-Team from its good Samaritan work.

Sounds familiar, right? Of course it does – this is the way government is framed in the 21st century. We’re constantly told the government is either inept, evil, or both – and that the only way to solve problems is to either “go rogue” or hire a private contractor to fix the problem. That was the theme of not only the A-Team, but the entire “vigilante” genre of similar ‘80s productions like The Dukes of Hazzard, Ghostbusters, Die Hard and all the cheesy private detective shows. Their message was simple: You can’t rely on government; you must instead rely on the private corporation.

Ronald Reagan, our 40th president, served from 1981 to 1989. In your analysis, did he reflect the 1980s or did he shape the 1980s? 

Reagan epitomized how the 1980s began mixing together politics and pop culture to the point where the distinction became blurred. He epitomized this mix both because he was originally known to the country as an actor, and because he regularly wove pop culture references into his speeches (two obvious examples: He made Rambo references when it came to international relations, and he made Star Wars references when it came to nuclear defense). 

So considering that, Reagan had a symbiotic relationship with the zeitgeist of the 1980s – he really did both shape it and reflect it at the same time. 

The key point of my book is to make the case that while Reagan was certainly a factor in creating what we now think of as “The Eighties,” the enduring ideas and narratives of that age were just as powerfully shaped and promoted through the decade’s popular culture. Psychological research tells us that children are deeply affected by fiction, entertainment and media – and that means that the children of the 1980s bring that era’s politicized popular culture with them today, whether on issues of militarism, race or economics. 

What is the main lesson Barack Obama should learn from what happened in the 1980s? 

There are two, in my opinion – one that he seems to really understand, the other that I think he doesn’t fully appreciate. 

The first – the one he gets – is that for better or worse, Americans since the 1980s have come to understand their world as much (if not more) through entertainment and popular culture as they do through conventional politics. By that I mean, the political messages embedded in things like sitcoms, movies, toys, video games, etc. can be just as culturally formative as messages that come from political television ads, politicians’ speeches and the Washington press corps.

I think that as someone who was culturalized in the 1980s, Obama understands this – his campaign seemed to appreciate that in its use of social media, and I think his expansion of the presidential bully pulpit to multiple platforms shows an appreciation of this truism. 

The second lesson, which I don’t think he appreciates, is the idea that in order for him to be the transformational president he says he wants to be, he’s going to need to introduce genuinely new narratives and storylines, rather than simply trying to tweak the current ones that endure from the 1980s.

This is a key point of my book: The mythology of the 1980s still defines our thinking on everything from militarism, to greed, to race relations. If he is going to really change the country in a way he himself said he aspires to, he cannot simply accommodate or play within those fundamentally 1980s narratives. He has to offer up whole new storylines that say, for instance, unquestioned militarism is problematic, that greed is not good and that non-whites do not have to “transcend” their race/ethnicity in order to be valuable people in our society. 

To date, Obama (like most politicians) has not done that – he has not offered up a fundamentally different analysis than the one that came out of the 1980s. 

And what is your most embarrassing 1980s guilty pleasure? Would you reveal to us the ickiest idea, object, event, TV show or movie that you still hold dear to you?

Probably that as much as I’ve realized the really pernicious messages of 1980s pop culture, I still nonetheless love a lot of it. For instance, I can see the ugliness of the anti-government message embedded in Ghostbusters, but it remains one of my favorite movies – a film I watch over and over again and enjoy on Saturday nights whenever it reruns on cable.

Same thing for video games – as hideously militaristic as Atari’s Combat and Missile Command were, I still love playing them on my old Atari, just like I now love playing Halo on my Xbox. In short, as much as I now see the problems of my propagandized youth, I still cling to that youth in a lot of ways. Maybe that’s the definition – and power – of that ethereal thing we commonly call “nostalgia.” 

Die, Hippie, Die! 

Every time one of these ex-hippies comes prancing in from yesteryear, we gotta get out the love beads and pretend we care about people. – Alex P. Keaton, 1986 

For the past several days I’ve been noticing a steep rise in the number of hippies coming to town. . . . I know hippies. I’ve hated them all my life. I’ve kept this town free of hippies on my own since I was five and a half. But I can’t contain them on my own anymore. We have to do something, fast! -Eric Cartman, 2005 

In 1975, a Democratic Party emboldened by civil rights, environmental, antiwar, and post-Watergate electoral successes was on the verge of seizing the presidency and a filibuster-proof congressional majority. That year, The Rocky Horror Picture Show and One Flew Over the Cuckoo’s Nest were two of the three top-grossing films – the former a parody using the late-sixties sexual revolution to laugh at the puritanical fifties, the latter based on the novel by beat writer Ken Kesey. Meanwhile, three of the top-rated seven television shows were liberal-themed programs produced by progressive icon Norman Lear, including All in the Family – a show built around a hippie, Mike Stivic, poking fun at the ignorance of his traditionalist father-in-law, Archie Bunker.

A mere ten years later, Republican Ronald Reagan had just been reelected by one of the largest electoral landslides in American history, and his party had also gained control of the U.S. Senate. Two of the top three grossing films were Back to the Future, which eulogized the fifties, and Rambo: First Blood Part II, which blamed sixties antiwar activism for losing the Vietnam conflict. Most telling, All in the Family’s formula of using sixties-motivated youth and progressivism to ridicule fifties-rooted parents and their traditionalism had been replaced atop the television charts by its antithesis: a Family Ties whose fifties-inspired youth ridicules his parents’ sixties spirit.

The political and cultural trends these changes typified were neither coincidental nor unrelated, and their intertwined backstories explain why we’re still scarred by the metamorphosis. 

The late 1970s and early 1980s marked the birth of an entire industry organized around idealized nostalgia, and particularly midcentury, pre-1965 schmaltz. You likely know this industry well – it survives in everything from roadside Cracker Barrel restaurants to the Jersey shore’s Old Time photo stands to Michael Chabon’s novels to Band of Brothers-style miniseries glorifying the valor of World War II vets – and it first found traction in the 1980s creation of The Fifties™.

Turning a time period into a distinct brand seems common today, what with the all-pervasive references to generational subgroups (Gen X, Gen Y, etc.). But it was a new marketing innovation back in the 1980s. As Temple University professor Carolyn Kitch found in her 2003 study of mass-circulation magazines, generational labeling is “primarily a phenomena of the last quarter of the 20th century,” and it began (as so many things have) as an early-1980s ad strategy aimed at selling products to Baby Boomers and their parents. 

Like all sales pitches, fifties hawking employed subjectivity, oversimplification, and stereotypes. For eighties journalists, advertisers, screenwriters, and political operatives seeking a compelling shorthand to break through the modern media miasma, that meant making The Fifties into much more than the ten-year period between 1950 and 1959. It meant using pop culture and politics to convert the style, language, and memories of that decade into a larger reference to the entire first half of the twentieth century, all the way through the early 1960s of the New Frontier – those optimistic years “before President Kennedy was shot, before the Beatles came, when I couldn’t wait to join the Peace Corps, and I thought I’d never find a guy as great as my dad,” as Baby from the classic eighties film Dirty Dancing reminisced.

Why The Fifties, and not the 1930s or ’40s, as the face of the entire pre-sixties epoch? Because that decade was fraught with far less (obvious) baggage (say, the Depression or global war) and hence was most easily marketed in the saccharine entertainment culture of the devil-may-care 1980s. 

Indeed, as the Carter presidency started to crumble in 1978 and Reagan began delivering fiery speeches in preparation for his upcoming presidential run, the crew-cut-and-greaser escapades of Happy Days and the poodle skirts of Laverne & Shirley overtook the sixties-referencing urbanity, ethnicity, and strife of Norman Lear’s grittier sitcoms. In movie theaters, Animal House and Grease hit classic status almost instantly. These successes encouraged the culture industry to make the eighties the launching point for a self-sustaining genre of wildly popular back-to-the-fifties productions.

There were retrospectives such as Diner, Stand By Me, and Peggy Sue Got Married and biopics of fifties icons such as The Right Stuff, La Bamba, and Great Balls of Fire! There was Hoosiers, with its bucolic small towns, its short shorts, and its nonbreakaway rims. There were Broadway plays such as Brighton Beach Memoirs and Biloxi Blues, commemorating the honor, frugality, and innocence of the World War II years. And there was a glut of new Eisenhower biographies. 

Even 1980s productions not overtly focused on decade nostalgia were decidedly recollective of fifties atmospherics. 

There was Witness, which used the story of a Philadelphia cop’s voyage into lily-white Amish country to juxtapose the simplicity of America’s pastoral heritage against the crime-ridden anarchy of the black inner city.

There was Superman and Superman II – films that reanimated a TV hero of the actual 1950s, idealized Clark Kent’s midcentury youth, and depicted his adulthood as the trials of a fedora-wearing anachronism trying to save modern Metropolis from postfifties peril. And there were the endless rip-offs – the Jets-versus-Sharks rivalry of West Side Story ripened into the socs-versus-greasers carnage of The Outsiders, while the hand-holding of Grease became the ass-grabbing of Dirty Dancing.

Through it all, pop culture was manufacturing a Total Recall of the 1950s for a 1980s audience – an artificial memory of The Fifties that even came with its own canned soundtrack.

Though we tend to think of the late 1970s and early 1980s as the glory days of punk rock and the primordial soup of what would become rap, Wurlitzer-ready rockabilly and doo-wop were the rage. This was the heyday of the Stray Cats and their standing bass, the moment when Adam Ant released the jukebox jam “Goody Two Shoes,” and Queen’s rockabilly hit “Crazy Little Thing Called Love” reached number one on the charts. As the Hard Rock Cafe and Johnny Rockets franchises created a mini-fad of fifties-flavored restaurants, the B-52s’ surf rock was catching a new wave; Meat Loaf was channeling his Elvis-impersonation act into the absurdist 1950s tribute “Paradise by the Dashboard Light”; and ZZ Top was starring in music videos featuring a muscle car that Danny Zuko might have driven at Thunder Road. Even Billy Joel, until then a folksinger, was going all in with a blatant teenybopper tribute, “Uptown Girl.”

This sonic trend wasn’t happening in a vacuum – it was thrumming in the shadow of the chief missionary of 1950s triumphalism, Ronald Reagan.

The Gipper’s connection to The Fifties wasn’t just rooted in his success as a midcentury B-movie actor nor in his American Graffiti pompadour. The Fifties had long defined his persona, career, and message. Here was “the candidate of nostalgia, a political performer whose be-bop instrument dates from an antediluvian choir,” as The Washington Post wrote in 1980. Here was a man campaigning for president in the late 1970s and early 1980s calling for the country to go back in time. And not just a few years back in time – way back in time to the dreamy days before what he called the “hard years” of the late 1960s.

“Not so long ago, we emerged from a world war,” Reagan said in a national address during his 1980 presidential campaign. “Turning homeward at last, we built a grand prosperity and hopes, from our own success and plenty, to help others less fortunate. Our peace was a tense and bitter one, but in those days, the center seemed to hold.” 

Writing to a campaign contributor, Reagan said he wanted to bring forth a “spiritual revival to feel once again as [we] felt years ago about this nation of ours.” And when he won the White House, his inauguration spelled out exactly what he meant by “years ago”: The lavish celebration dusted off and promoted fifties stars such as Frank Sinatra and Charlton Heston. 

This wasn’t a secret message or a wink-and-nod – it was the public theme of Reagan’s political formula. In a Doonesbury comic about the 1980 campaign, cartoonist Garry Trudeau sketched Reagan’s mind as “a storehouse of images of an idyllic America, with 5 cent Cokes, Burma Shave signs, and hard-working White People.” When naming him 1980 “Man of the Year,” Time said, “Intellectually, emotionally, Reagan lives in the past.” The article added that the new president specifically believes “the past” – i.e., The Fifties – “is his future.” And as both the magazine and America saw it, that was the highest form of praise – just as it is today.

This all might have gone the way of New Coke if the early-1980s celebration of The Fifties™ was happening in isolation. But those Bob Ross paintings of happy Levittown trees and Eisenhower-era blue skies only became salient because the eighties placed them in the American imagination right next to sensationalized images of Woodstock and the Kent State massacre.

Securing that prime psychological real estate meant simultaneously doing to the sixties what was being done to the fifties – only with one twist: Instead of an exercise in idealization, The Sixties™ brand that came out of the 1980s was fraught with value judgments downplaying the decade’s positives and emphasizing its chaos.

Through politics and mass media, a 1960s of unprecedented social and economic progress was re-remembered as a time of tie-dye, not thin ties; burning cities, not men on the moon; LBJ scowls, not JFK glamour; redistributionist War on Poverty “welfare,” not universalist Medicare benefits; facial-haired Beatles tripping out to “Lucy in the Sky with Diamonds,” not bowl-cut Beatles chirping out “I Want to Hold Your Hand.”

Some of the sixties bashing in the 1980s came from a media that earnestly sought to help Baby Boomers forgive themselves for becoming the buttoned-down adults they had once rebelled against. Some of it was the inadvertent side effect of an accelerating twenty-four-hour news cycle that historian Daniel Marcus notes almost always coupled references to the sixties with quick “shots from Woodstock of young people cavorting in the mud, perhaps discarding various parts of their clothing or stumbling through a drug-induced haze.” 

And some of it was just the uncontrived laziness of screenwriters and directors. 

“Getting a popular fix on the more elusive, more complicated, and far more common phenomena of the sixties is demanding because a lot of it isn’t photogenic,” says Columbia professor Todd Gitlin, the former leader of Students for a Democratic Society and author of The Sixties. “How easy it was to instead just make films about the wild people, because they are already an action movie, and their conception of themselves is already theatrical.” 

The revisionism and caricaturing revolved around three key themes, each of which denigrated the sixties as 100 percent awful. 

The first was the most political of all – patriotism. Love of country, loyalty to America, national unity – these were memes that Reagan had been using to berate the sixties since his original jump from Hollywood to politics.

During his first campaign for California governor, he ran on a platform pledging to crush the “small minority of beatniks, radicals, and filthy speech advocates” at Berkeley who were protesting the Vietnam War. As president, he railed on nuclear-freeze protesters (like Steven and Elyse Keaton in that first season of Family Ties) as traitors “who would place the United States in a position of military and moral inferiority.”

The media industry of the time followed with hypermilitarist films blaming antiwar activists for America’s loss in Vietnam (more on that in the chapter “Operation Red Dawn”), and magazine retrospectives basically implying that sixties social movements were anti-American. As just one example, a 1988 Newsweek article entitled “Decade Shock” cited the fact that “patriotism is back in vogue” as proof that the country had rejected the sixties – the idea being that the sixties was wholly unpatriotic.

But while flag-waving can win elections and modify the political debate, it alone could not mutate the less consciously political, more reptilian lobes of the American cortex. So the 1980s contest for historical memory was also being waged with more refined and demographically targeted methods. 

For teenagers, The Fifties™ were used to vandalize The Sixties™ through a competition between the Beatnik and the Greaser for the mantle of eighties cool. As historian Daniel Marcus recounts, the former became defined as “middle-class, left-wing, intellectual and centered in New York City and San Francisco” – that is, defined as the generic picture of weak, effete, snobbish coffeehouse liberalism first linked to names such as Hart and Dukakis, and now synonymous with Kerry, Streisand, and Soros. Meanwhile, the Greaser came to be known as an urbanized cowboy – a tough guy who “liked cars and girls and rock and roll, was working class, usually non-Jewish ‘white ethnic’ and decidedly unintellectual.”

This hero, whose spirit we still worship in the form of Joe the Plumber and “Bring it on” foreign policy, first stomped the Beatnik through the youth-oriented iconography of the 1980s – think idols such as the Fonz, Bruce Springsteen, and Patrick Swayze; movies like Staying Alive, Rocky, and The Lords of Flatbush; bands such as Bon Jovi, Guns N’ Roses, and Poison; and, not to be forgotten, the chintzy clothing fad of ripped jeans and tight white T-shirts.

For adults who experienced the real fifties and sixties, the propaganda had to be a bit less overt to be convincing. So their memories were more subtly shaped with the arrival of a life-form whose mission was to absolve the hippie generation for becoming the compromised and depoliticized elders they had once railed on and protested against. 

This seductive species became known as yuppies – short for young urban professionals.

The invasion of the yuppies and all of their requisite tastes, styles, and linguistic inflections officially commenced when Newsweek declared 1984 the Year of the Yuppie, following the publication of The Yuppie Handbook and the presidential campaign of Gary Hart – a New Agey candidate who looked as if he carried a dog-eared copy of the tome around in his breast pocket. A few months later, Adweek quoted executives from the major television networks saying their goal in coming years would be to “chase yuppies with a vengeance” – a prediction that came true, according to Rolling Stone’s 1987 report on a series of hit shows that the magazine called Yuppievision. By 1988, a suited Michael J. Fox eating sushi was on the cover of an Esquire magazine issue devoted entirely to “Yupper Classmen.” Fittingly, one of the articles noted a poll showing that 60 percent of Americans could identify the word yuppie – almost twice the number that could identify the nation’s secretary of state.

While yuppie certainly evoked supermodern feelings in the 1980s, the concept was etymologically rooted in a politicized past. The word made its public debut in a 1983 newspaper column about Jerry Rubin, the leader of the Youth International Party (yippies) who had abandoned his sixties radicalism for the 1980s world of business. His life story was a textbook yuppie parable of sixties rejection: He was a member of the “vanguard of the baby-boom generation,” which had “march[ed] through the ’60s” but was now “advancing on the 1980s in the back seat of a limousine,” as Newsweek put it. 

Excerpted from Back to Our Future by David Sirota. Copyright © 2011 by David Sirota.

http://inthearena.blogs.cnn.com/2011/03/14/david-sirota-the-mythology-of-the-1980s-still-defines-our-thinking-on-everything-from-militarism-to-greed-to-race-relations/

What Is Civilization?

by Will Durant

Excerpt

Civilization is social order promoting cultural creation. Four elements constitute it: economic provision, political organization, moral traditions and the pursuit of knowledge and the arts. It begins where chaos and insecurity end.

Physical and biological conditions are only prerequisites to civilization; they do not constitute or generate it. Subtle psychological factors must enter into play. There must be political order…some unity of language to serve as medium of mental exchange. Through church, or family, or school, or otherwise, there must be a unifying moral code, some rules of the game of life acknowledged even by those who violate them, and giving to conduct some order and regularity, some direction and stimulus. Whether through imitation, initiation or instruction, whether through father or mother, teacher or priest, the lore and heritage of the tribe — its language and knowledge, its morals and manners, its technology and arts — must be handed down to the young, as the very instrument through which they are turned from animals into men.

The disappearance of these conditions — sometimes of even one of them — may destroy a civilization. A geological cataclysm or a profound climatic change…the failure of natural resources, either of fuels or of raw materials…a pathological concentration of wealth, leading to class wars, disruptive revolutions, and financial exhaustion: these are some of the ways in which a civilization may die.

For civilization is not something inborn or imperishable; it must be acquired anew by every generation, and any serious interruption in its financing or its transmission may bring it to an end. Man differs from the beast only by education, which may be defined as the technique of transmitting civilization…


Full text

Civilization is social order promoting cultural creation. Four elements constitute it: economic provision, political organization, moral traditions and the pursuit of knowledge and the arts. It begins where chaos and insecurity end. For when fear is overcome, curiosity and constructiveness are free, and man passes by natural impulse towards the understanding and embellishment of life.

Physical and biological conditions are only prerequisites to civilization; they do not constitute or generate it. Subtle psychological factors must enter into play. There must be political order, even if it be so near to chaos as in Renaissance Florence or Rome; men must feel, by and large, that they need not look for death or taxes at every turn. There must be some unity of language to serve as medium of mental exchange. Through church, or family, or school, or otherwise, there must be a unifying moral code, some rules of the game of life acknowledged even by those who violate them, and giving to conduct some order and regularity, some direction and stimulus. Perhaps there must also be some unity of basic belief, some faith — supernatural or utopian — that lifts morality from calculation to devotion, and gives life nobility and significance despite our mortal brevity. And finally there must be education — some technique, however primitive, for the transmission of culture. Whether through imitation, initiation or instruction, whether through father or mother, teacher or priest, the lore and heritage of the tribe — its language and knowledge, its morals and manners, its technology and arts — must be handed down to the young, as the very instrument through which they are turned from animals into men.

The disappearance of these conditions — sometimes of even one of them — may destroy a civilization. A geological cataclysm or a profound climatic change; an uncontrolled epidemic like that which wiped out half the population of the Roman Empire under the Antonines, or the Black Death that helped to end the Feudal Age; the exhaustion of the land or the ruin of agriculture through the exploitation of the country by the town, resulting in a precarious dependence upon foreign food supplies; the failure of natural resources, either of fuels or of raw materials; a change in trade routes, leaving a nation off the main line of the world’s commerce; mental or moral decay from the strains, stimuli and contacts of urban life, from the breakdown of traditional sources of social discipline and the inability to replace them; the weakening of the stock by a disorderly sexual life, or by an epicurean, pessimist, or quietist philosophy; the decay of leadership through the infertility of the able, and the relative smallness of the families that might bequeath most fully the cultural inheritance of the race; a pathological concentration of wealth, leading to class wars, disruptive revolutions, and financial exhaustion: these are some of the ways in which a civilization may die.

For civilization is not something inborn or imperishable; it must be acquired anew by every generation, and any serious interruption in its financing or its transmission may bring it to an end. Man differs from the beast only by education, which may be defined as the technique of transmitting civilization.

Civilizations are the generations of the racial soul. As family-rearing, and then writing, bound the generations together, handing down the lore of the dying to the young, so print and commerce and a thousand ways of communication may bind the civilizations together, and preserve for future cultures all that is of value for them in our own.

Let us, before we die, gather up our heritage, and offer it to our children.

http://www.willdurant.com/civilization.htm

Corporate America, meet ‘Generation C’ by Brian Solis

Washington Post, June 28, 2012

Brian Solis is the author of The End of Business as Usual. He is also a principal analyst at Altimeter Group, a research-based advisory firm in San Francisco, where he studies the impact of new media on business and consumer behavior.

Excerpt

But, while these people may seem distracted, they are, in fact, very much a part of the occasion. Multitasking is a way of life for them, but there’s something more to it than just a love affair with smartphones and tablets. These “always on” audiences share real-world experiences as they happen with friends and acquaintances who, in turn, respond in real time.

This means word-of-mouth has evolved from one-to-one to one-to-many conversations. Shared experiences become a formidable currency in the networked economy where the influence of an individual is significantly augmented. And, it’s this influence that changes the game for how consumers and organizations connect in the future.

In the age of social media, we are witnessing a C-change (as in “C” for customer) in the balance of power between consumers and businesses. This increasingly empowered generation of connected customers, which I often refer to as Generation-C, is changing the face of engagement and is re-writing the book for how businesses market and serve them in the future.

Today, customers realize that social networks give them influence over how other consumers view a company and they are learning how to influence companies to listen, respond and resolve problems directly. At the center of this evolving customer landscape are shared experiences. People share just about everything and, whether we believe it or not, the activity around these shared experiences influences the impressions and behavior of other consumers in social networks to varying effects.

Full text

You’re at a concert and you notice nearly everyone in the audience is either looking down at their phone or holding it up in the air. A question slowly dawns on you: “What’s the point?”

Going to an event is about being in the moment and enjoying the experience to the fullest, right?

Yes.

But, while these people may seem distracted, they are, in fact, very much a part of the occasion. Multitasking is a way of life for them, but there’s something more to it than just a love affair with smartphones and tablets. These “always on” audiences share real-world experiences as they happen with friends and acquaintances who, in turn, respond in real time.

This means word-of-mouth has evolved from one-to-one to one-to-many conversations. Shared experiences become a formidable currency in the networked economy where the influence of an individual is significantly augmented. And, it’s this influence that changes the game for how consumers and organizations connect in the future.

In the age of social media, we are witnessing a C-change (as in “C” for customer) in the balance of power between consumers and businesses. This increasingly empowered generation of connected customers, which I often refer to as Generation-C, is changing the face of engagement and is re-writing the book for how businesses market and serve them in the future.

Think about this for a moment. Have you ever noticed that it’s almost always social media experts who have problems with companies or products on Twitter? Here’s why: they figured out that, by leaning on the reach and volume of their networks, they can make a difference. They can also jump ahead of traditional service queues to earn attention over one-to-one channels. Businesses are more inclined to respond quickly to these types of complaints in an effort to limit the extent of negative sentiment and improve perceptions.

Today, customers realize that social networks give them influence over how other consumers view a company and they are learning how to influence companies to listen, respond and resolve problems directly. At the center of this evolving customer landscape are shared experiences. People share just about everything and, whether we believe it or not, the activity around these shared experiences influences the impressions and behavior of other consumers in social networks to varying effects.

Services such as Klout, Kred, and PeerIndex now measure social media activity and translate it into an “influence” score. This, for better or for worse, introduces a social consumer hierarchy, creating a new standard for consumer marketing and service – and connected consumers know it. A report by my employer, Altimeter Group, titled “The Rise of Digital Influence” takes a look at precisely this phenomenon.

Dissatisfied customers are not the only ones getting attention. Many businesses also take a very important next step, which is to acknowledge happy customers. This form of positive reinforcement serves as a form of “unmarketing” where consumers feel appreciated and are encouraged to share what they love about the business, product and overall experience.

Individuals with the largest, most loyal, or actively engaged networks form a powerful and connected consumer landscape. What they share or don’t share contributes to a collective brand or service experience that, without engagement, is left for the connected audiences to define.

Suddenly, the audience with an audience becomes a formidable foe or ally for any organization. As such, the proactive investment in positive experiences now represents a modern and potentially influential form of consumer marketing and service. But engaging in the new realm of digital influence will take more than tweets or participating in social media conversations. Connected audiences demand that marketers and executives alike rethink the entire customer experience before, during, and after the transaction. But remember: No amount of responses can fix a broken product or service.

http://www.washingtonpost.com/national/on-innovations/corporate-america-meet-generation-c/2012/06/27/gJQAQlKG9V_story.html?wpisrc=nl_headlines