Donald Trump Is Becoming an Authoritarian Leader Before Our Very Eyes

By Jeet Heer, The New Republic, January 23, 2017

…The new administration’s bewildering boasts and outright lies are what make it so frightening, as they’re early signs of what many of us in the media have warned about for months: Authoritarianism…

The purpose of the Trump administration’s lies is not necessarily to deceive, but to separate the believers from the disbelievers—for the purpose of rewarding the former and punishing the latter. …what Trump did in his CIA speech, which was rife with deceptions and examples of a narcissistic will to reshape the truth…

Turning a speech at an intelligence agency into a political rally is a deep betrayal of political norms. But it is very much in keeping with Trump’s disturbing habit of claiming the armed wing of the state, including the military and law enforcement, as his political allies…

…John MacGaffin, a high-ranking veteran of the agency: “What self-centered, irrational decision process got him to this travesty?” MacGaffin told the magazine. “Most importantly, how will that process serve us when the issues he must address are dangerous and incredibly complex? This is scary stuff!”…

One of the defining tactics of his campaign was disinformation, coupled with accusations of the same against the media. That hasn’t changed now that Trump is president. The administration’s unified anti-press and anti-fact message over the weekend is part of a deliberate, long-term strategy that was hatched many months ago, and is only likely to intensify. The president will wage a rhetorical war against the media, with the intent of delegitimizing one of the few institutions that can hold him accountable, and he will wage it with his most effective weapon: Lies, damned lies, and false statistics.


 

The administration’s many lies this weekend should frighten all Americans.

The Central Intelligence Agency (CIA) is expert at estimating crowd sizes. When trying to figure out whether a protest in some foreign hotspot could turn into a revolution, the CIA uses satellite imagery to get a sense of how many people are protesting. So it was particularly brazen of Donald Trump, while addressing the agency for the first time as president, to lie about the size of Friday’s inauguration crowd.

“We had a massive field of people,” Trump told a crowd of about 400 CIA employees at the agency’s headquarters in Langley, Virginia, on Saturday. “You saw them. Packed. I get up this morning, I turn on one of the networks, and they show an empty field. I say, wait a minute, I made a speech. I looked out, the field was—it looked like a million, million and a half people. They showed a field where there were practically nobody standing there. And they said, Donald Trump did not draw well.” Crowd scientists estimate that there were around 160,000 people at Trump’s inauguration in the hour before his speech.

In a bizarre press briefing later on Saturday, Trump Press Secretary Sean Spicer ranted against the media and claimed, not just falsely but nonsensically, that Trump enjoyed “the largest audience to ever witness an inauguration, period—both in person and around the globe. These attempts to lessen the enthusiasm of the inauguration are shameful and wrong.” In fact, the record is still held by Barack Obama for his first inauguration in 2009, which drew an estimated 1.8 million.

And on Sunday’s Meet the Press, when asked to explain why Spicer “uttered a falsehood,” senior adviser Kellyanne Conway told Chuck Todd, “Don’t be so overly dramatic about it, Chuck. You’re saying it’s a falsehood…Sean Spicer, our press secretary, gave alternative facts to that.”

Some observers have warned journalists against an “alarmist” response to Trump’s early actions, lest the media too quickly exhaust our capacity for outrage and cause readers, especially those inclined to give the new president a chance, to tune out. “The danger for the established press,” New York Times columnist Ross Douthat wrote in a column over the weekend, “is the same danger facing other institutions in our republic: That while believing themselves to be nobly resisting Trump, they end up imitating him. Such imitation will inspire reader loyalty and passion—up to a point. But beyond that point, it’s more likely to polarize than to persuade, which means it often does a demagogue’s work for him. Fellow journalists, don’t do it.”

That column appears to have been completed before the weekend’s events, though; it makes no mention of Trump’s speech or Spicer’s briefing, which ought to change the calculus on the merits of press alarmism. The new administration’s bewildering boasts and outright lies are what make it so frightening, as they’re early signs of what many of us in the media have warned about for months: Authoritarianism.

The purpose of the Trump administration’s lies is not necessarily to deceive, but to separate the believers from the disbelievers—for the purpose of rewarding the former and punishing the latter. Chess champion Garry Kasparov, an expert in authoritarianism as an outspoken opponent of Russian President Vladimir Putin, made a similar point in a tweet on Saturday.

In an already hyper-partisan political landscape, the Trump administration can blatantly lie, knowing that Trump’s base trusts him more than the “dishonest media.” And that’s exactly what Trump did in his CIA speech, which was rife with deceptions and examples of a narcissistic will to reshape the truth. While telling a story about a Time magazine reporter who wrongly reported that Trump removed the Martin Luther King, Jr. bust from the Oval Office (a mistake that was quickly corrected, but which the Trump staff continues to harp on), the president went on a tangent about Time.

“I have been on their cover, like, 14 or 15 times,” he said. “I think we have the all-time record in the history of Time magazine. Like, if Tom Brady is on the cover, it’s one time, because he won the Super Bowl or something, right?  I’ve been on it for 15 times this year. I don’t think that’s a record…that can ever be broken. Do you agree with that? What do you think?” (The all-time record is held by Richard Nixon, who appeared on 55 Time covers.)

Aside from these lies and factual mistakes, Trump’s speech was genuinely weird on a number of counts. His intended purpose was to mend fences with the agency, with which he’s feuded over its conclusion that Russia interfered in the election to help him defeat Hillary Clinton. Yet he did very little to reassure CIA staff, only briefly acknowledging their sacrifice and service by alluding to a wall commemorating agents who died in the line of duty.

Rather, Trump was in full campaign mode, attacking the media (“among the most dishonest human beings on Earth”) and praising himself (“they say, ‘is Donald Trump an intellectual?’ Trust me, I’m like a smart person”). He also indicated the U.S. might reinvade Iraq for imperial plunder. “The old expression, ‘to the victor belong the spoils’—you remember,” he said. “I always used to say, keep the oil…So we should have kept the oil. But okay. Maybe you’ll have another chance.” The entire event was orchestrated like a campaign stop, so much so that Trump even brought along around 40 supporters, who could be heard cheering and clapping during his applause lines.

Turning a speech at an intelligence agency into a political rally is a deep betrayal of political norms. But it is very much in keeping with Trump’s disturbing habit of claiming the armed wing of the state, including the military and law enforcement, as his political allies. He said early in the CIA speech that “the military gave us tremendous percentages of votes. We were unbelievably successful in the election with getting the vote of the military. And probably almost everybody in this room voted for me, but I will not ask you to raise your hands if you did.” At the end of his speech, Trump sounded like a pathetic suitor making his final pitch: “I just wanted to really say that I love you, I respect you. There’s nobody I respect more.”

While Trump’s antics might have impressed his fans watching from home, they seem to have done little to assuage worries in the agency. The New Yorker interviewed a variety of intelligence experts, including John MacGaffin, a high-ranking veteran of the agency. “What self-centered, irrational decision process got him to this travesty?” MacGaffin told the magazine. “Most importantly, how will that process serve us when the issues he must address are dangerous and incredibly complex? This is scary stuff!”

Trump’s self-centered decision process is authoritarianism, and it’s anything but irrational. He campaigned in an authoritarian style, with rallies where he riled up large crowds to jeer at the press and protesters. One of the defining tactics of his campaign was disinformation, coupled with accusations of the same against the media. That hasn’t changed now that Trump is president. The administration’s unified anti-press and anti-fact message over the weekend is part of a deliberate, long-term strategy that was hatched many months ago, and is only likely to intensify. The president will wage a rhetorical war against the media, with the intent of delegitimizing one of the few institutions that can hold him accountable, and he will wage it with his most effective weapon: Lies, damned lies, and false statistics.

The Age of Post-Truth Politics

By William Davies, New York Times

Facts hold a sacred place in Western liberal democracies. Whenever democracy seems to be going awry, when voters are manipulated or politicians are ducking questions, we turn to facts for salvation.

But they seem to be losing their ability to support consensus. PolitiFact has found that about 70 percent of Donald Trump’s “factual” statements actually fall into the categories of “mostly false,” “false” and “pants on fire” untruth.

For the Brexit referendum, the Leave campaign argued that European Union membership costs Britain 350 million pounds a week, but failed to account for the money received in return.

The sense is widespread: We have entered an age of post-truth politics.

As politics becomes more adversarial and dominated by television performances, the status of facts in public debate rises too high. We place expectations on statistics and expert testimony that strain them to the breaking point. Rather than sit coolly outside the fray of political argument, facts are now one of the main rhetorical weapons within it.

How can we still be speaking of “facts” when they no longer provide us with a reality that we all agree on? The problem is that the experts and agencies involved in producing facts have multiplied, and many are now for hire. If you really want to find an expert willing to endorse a fact, and have sufficient money or political clout behind you, you probably can.

The combination of populist movements with social media is often held responsible for post-truth politics. Individuals have growing opportunities to shape their media consumption around their own opinions and prejudices, and populist leaders are ready to encourage them.

But to focus on recent, more egregious abuses of facts is to overlook the ways in which the authority of facts has been in decline for quite some time. Newspapers might provide resistance to the excesses of populist demagogy, but not to the broader crisis of facts.

The problem is the oversupply of facts in the 21st century: There are too many sources, too many methods, with varying levels of credibility, depending on who funded a given study and how the eye-catching number was selected.

According to the cultural historian Mary Poovey, the tendency to represent society in terms of facts first arose in late medieval times with the birth of accounting. What was new about merchant bookkeeping, Dr. Poovey argued, was that it presented a type of truth that could apparently stand alone, without requiring any interpretation or faith on the part of the person reading it.

In the centuries that followed, accounting was joined by statistics, economics, surveys and a range of other numerical methods. But even as these methods expanded, they tended to be the preserve of small, tight-knit institutions, academic societies and professional associations that could uphold standards. National statistical associations, for example, soon provided the know-how for official statistics offices, affiliated with and funded by governments.

In the 20th century, an industry for facts emerged. Market-research companies began to conduct surveys in the 1920s and extended into opinion polling in the 1930s. Think tanks like the American Enterprise Institute were established during and after World War II to apply statistics and economics to the design of new government policies, typically in the service of one political agenda or another. The idea of “evidence-based policy,” popular among liberal politicians in the late 1990s and early 2000s, saw economics being heavily leaned on to justify government programs, in an allegedly post-ideological age.

Of course the term “fact” isn’t reserved exclusively for numbers. But it does imply a type of knowledge that can be reliably parceled out in public, without constant need for verification or interpretation.

Yet there is one much more radical contributor to our post-truth politics that could ultimately be as transformative of our society as accounting proved to be 500 years ago.

We are in the middle of a transition from a society of facts to a society of data. During this interim, confusion abounds surrounding the exact status of knowledge and numbers in public life, exacerbating the sense that truth itself is being abandoned.

The place to start in understanding this transition is with the spread of “smart” technologies into everyday life, sometimes called the “internet of things.” Thanks to the presence of smartphones and smartcards in our pockets, the dramatic uptake of social media, the rise of e-commerce as a means of purchasing goods and services, and the spread of sensory devices across public spaces, we leave a vast quantity of data in our wake as we go about our daily activities.

Like statistics or other traditional facts, this data is quantitative in nature. What’s new is both its unprecedented volume (the “big” in big data) and also the fact that it is being constantly collected by default, rather than by deliberate expert design. Numbers are being generated much faster than we have any specific use for. But they can nevertheless be mined to get a sense of how people are behaving and what they are thinking.

The promise of facts is to settle arguments between warring perspectives and simplify the issues at stake. For instance, politicians might disagree over the right economic policy, but if they can agree that “the economy has grown by 2 percent” and “unemployment is 5 percent,” then there is at least a shared stable reality that they can argue over.

The promise of data, by contrast, is to sense shifts in public sentiment. By analyzing Twitter using algorithms, for example, it is possible to get virtually real-time updates on how a given politician is perceived. This is what’s known as “sentiment analysis.”
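
To make the idea concrete, here is a minimal sketch of lexicon-based sentiment scoring in Python. The word lists, sample posts, and scoring rule are invented for illustration; real sentiment-analysis systems rely on trained language models and live streams of data rather than anything this simple.

```python
import re

# A toy, lexicon-based sentiment scorer -- an illustration only.
# The word lists and sample posts below are invented; production systems
# use trained models and live data rather than anything this simple.

POSITIVE = {"great", "strong", "honest", "love", "win"}
NEGATIVE = {"weak", "dishonest", "lies", "disaster", "corrupt"}

def sentiment_score(text: str) -> int:
    """Return (# positive words) minus (# negative words) for one post."""
    words = re.findall(r"[a-z']+", text.lower())
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

sample_posts = [
    "A strong, honest speech today",
    "More lies and another disaster from this administration",
]

# Averaging scores over a rolling window of recent posts yields the kind of
# "virtually real-time" mood indicator described above -- data about
# sentiment, not a settled fact.
scores = [sentiment_score(post) for post in sample_posts]
print(sum(scores) / len(scores))
```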

There are precedents for this, such as the “worm” that monitors live audience reaction during a televised presidential debate, rising and falling in response to each moment of a candidate’s rhetoric. Financial markets represent the sentiments of traders as they fluctuate throughout the day. Stock markets never produce a fact as to what Cisco is worth in the way that an accountant can; they provide a window into how thousands of people around the world are feeling about Cisco, from one minute to the next.

Journalists and politicians can no more ignore a constant audit of collective mood than C.E.O.s can ignore the fluctuations in their companies’ share prices. If the British government had spent more time trying to track public sentiment toward the European Union and less time repeating the facts of how the British economy benefited from membership in the union, it might have fought the Brexit referendum campaign differently and more successfully.

Dominic Cummings, one of the leading pro-Brexit campaigners, mocked what he called outdated polling techniques. He also asked one pollster to add a question on “enthusiasm,” and he employed scientists to mine very large, up-to-the-minute data sets to gauge voter mood and react accordingly with ads and voter-turnout volunteers.

It is possible to live in a world of data but no facts. Think of how we employ weather forecasts: We understand that it is not a fact that it will be 75 degrees on Thursday, and that figure will fluctuate all the time. Weather forecasting works in a similar way to sentiment analysis, drawing data from a wide range of sensory devices and converting it into a constantly evolving narrative about the near future.

However, this produces some chilling possibilities for politics. Once numbers are viewed more as indicators of current sentiment, rather than as statements about reality, how are we to achieve any consensus on the nature of social, economic and environmental problems, never mind agree on the solutions?

Conspiracy theories prosper under such conditions. And while we will have far greater means of knowing how many people believe those theories, we will have far fewer means of persuading them to abandon them.

William Davies is an associate professor in political economy at Goldsmiths, University of London, and the author of “The Happiness Industry: How the Government and Big Business Sold Us Well-Being.”


 

An Informed and Educated Electorate

By Thom Hartmann, Truthout.org, December 6, 2010

If a nation expects to be ignorant and free, in a state of civilization, it expects what never was and never will be.…Whenever the people are well-informed, they can be trusted with their own government; that, whenever things get so far wrong as to attract their notice, they may be relied on to set them right. —Thomas Jefferson

Talk Radio News Service, based in Washington, D.C., is owned and run by my dear friend Ellen Ratner. Ellen is an experienced and accomplished journalist, and a large number of interns and young journalism school graduates get their feet wet in reporting by working for and with her.

In March 2010 I was in Washington for a meeting with a group of senators, and I needed a studio from which to do my radio and TV show. Ellen was gracious enough to offer me hers. I arrived as three of her interns were producing a panel-discussion type of TV show for Web distribution at www.talkradionews.com, in which they were discussing for their viewing audience their recent experiences on Capitol Hill.

One intern panelist related that a White House correspondent for one of the Big Three TV networks (ABC, CBS, and NBC) had told her that the network registered a huge amount of interest in the “hot story” that week of a congressman’s sexual indiscretions. Far less popular were stories about the debates on health care, the conflicts in the Middle East, and even the Americans who had died recently in Iraq or Afghanistan.

“So that’s the story they have to run with on the news,” the intern said, relating the substance of the network correspondent’s thoughts, “because that’s what the American people want to see. If the network doesn’t give people what they want to see, viewers will tune away and the network won’t have any viewers, ratings, or revenues.”

The two other interns commiserated with the first about what a shame it was that Americans wanted the titillating stories instead of the substantive ones, but they accepted without question that the network was therefore obliged to “give people what they want.”

When they finished their panel discussion, I asked these college students if they knew that there was a time in America when radio and TV stations and networks broadcast the actual news— instead of infotainment—because the law required them to do so. None of them had any idea what I was talking about. They were mystified: why would a station or network broadcast programs that were not popular or not what people wanted?

The Devolution of Broadcast News

But the reality is that from the 1920s, when radio really started to go big in the United States, until Reagan rolled it back in 1987, federal communications law required a certain amount of “public service” programming from radio and television stations as a condition of retaining their broadcast licenses.

The agreement was basic and simple: in exchange for the media owners’ being granted a license from the Federal Communications Commission (FCC) to use the airwaves—owned by the public—they had to serve the public interest first, and only then could they go about the business of making money. If they didn’t do so, when it came time to renew their license, public groups and individuals could show up at public hearings on the license renewal and argue for the license’s being denied.

One small way that stations lived up to their public-service mandate was by airing public-service announcements (PSAs) for local nonprofit groups, community calendars, and other charitable causes. They also had to abide by something called the Fairness Doctrine, which required them to air diverse viewpoints on controversial issues. Separately, during election campaigns, broadcasters had to abide by the Equal Time Rule, which required them to provide equal airtime to rival candidates in an election.

But the biggest way they proved they were providing a public service and meeting the requirements of the Fairness Doctrine was by broadcasting the news. Real news. Actual news. Local, national, and international news produced by professional, old-school journalists.

Because the news didn’t draw huge ratings like entertainment shows—although tens of millions of Americans did watch it every night on TV and listened to it at the top of every hour on radio from coast to coast—and because real news was expensive to produce, with bureaus and correspondents all over the world, news was a money-loser for all of the Big Three TV networks and for most local radio and TV stations.

But it was such a sacred thing—this was, after all, the keystone that held together the station’s license to broadcast and thus to do business—it didn’t matter if it lost money. It made all the other money-making things possible.

Through much of the early 1970s, I worked in the newsroom of a radio station in Lansing, Michigan. It had been started and was then run by three local guys: an engineer, a salesman, and a radio broadcaster. They split up the responsibilities like you’d expect, and all were around the building most days and would hang out from time to time with the on-air crew—all except the sales guy. I was forbidden from talking with him because I worked in news. There could be no hint—ever, anywhere—that our radio station had violated the FCC’s programming-in-the-public-interest mandate by, for example, my going easy on an advertiser in a news story or promoting another advertiser in a different story. News had to be news, separate from profits and revenue—and if it wasn’t, I’d be fired on the spot.

News, in other words, wasn’t part of the “free market.” It was part of our nation’s intellectual commons and thus the price of the station’s license.

After Reagan blew up the Fairness Doctrine in 1987, two very interesting things happened. The first was the rise of right-wing hate-speech talk radio, starting with Rush Limbaugh that very year. The second, which really stepped up fast after President Clinton signed the Telecommunications Act of 1996, which further deregulated the broadcast industry, was that the money-losing news divisions of the Big Three TV networks were taken under the wings of their entertainment divisions—and wrung dry. Foreign bureaus were closed. Reporters were fired. Stories that promoted the wonders of advertisers or other companies (like movie production houses) owned by the same mega-corporations that owned the networks began to appear. And investigative journalism that cast a bright light on corporate malfeasance vanished.

And because newscasts had ads, and those ads were sold based on viewership, the overall arc and content of the news began to be dictated by what the public wanted to know rather than by what they needed to know to function in a democratic society.

The interns were aghast. “Reagan did that?!” one said, incredulous. I said yes and that Bill Clinton then helped the process along to its current sorry state by signing the Telecommunications Act, leading to the creation of the Fox “News” Channel in October 1996 and its now-legal ability to call itself a news operation while baldly promoting what it knows to be falsehoods or distortions.

Now here we are in 2010, and the news media is an abject failure when it comes to reporting the real news—news that citizens in a democracy need to know. Even Ted Koppel, no flaming liberal by any means, said in an April 2010 interview with the British Broadcasting Corporation (BBC) that he thought the state of the news industry was “a disaster.”[1] He went on:

I think we are living through the final stages of what I would call the Age of Entitlement. We fight two wars without raising a single nickel to support them. We feel entitled to mortgages whether we have jobs or not. We feel entitled to make $10 million, $50 million, or $100 million even though the enterprise we headed up is a total failure. And we now feel entitled not to have the news that we need but the news that we want. We want to listen to news that comes from those who already sympathize with our particular point of view. We don’t want the facts anymore.

Koppel was also well aware of the influence of profit-making on the news organizations, which he believed was driving the degradation of news so that it appealed to our baser instincts:

I think it’s the producer [of the particular news show] who is at fault, who desperately needs the consumer…In the good old days, when you only had three networks—ABC, NBC, and CBS—there was competition, but the competition still permitted us to do what was in the public interest. These days all the networks have to fight with the dozens of cable outlets that are out there, the Internet that is out there, and they are all competing for the almighty dollar, and the way to get there is to head down to the lowest common denominator.

When we talk about news that people “need,” we are really talking about the intellectual and informational nutrition that is essential for the health and the well-being of our democracy. We need an educated and informed citizenry to participate in our democratic institutions and elections, and we’re not going to get that if we keep dumbing down the news and giving people what they want and not what they and society need.

Breaking Up the Media Monopolies

The Studio System

Back in the 1930s and 1940s, the eight biggest movie studios owned the majority of movie theaters in America. A Paramount theater, for example, would show only movies produced by Paramount’s movie studios, which featured only people under contract to Paramount. The result was that the studios could make (or break) any movie star and control what people could see in their local community. It was very profitable to the studios, but it was stifling to competition and creativity and therefore a disservice to the moviegoing audience.

So through that era, in a series of actions that lasted almost a decade and which were capped by the big studios’ signing a major consent decree with the feds, the federal government tried to force the big theaters to open up the business to competition. The big theaters said that they would, even agreeing to the 1940 Paramount Decree, but they continued with business as usual.

The issue came to a head when it was argued in an antitrust case before the U.S. Supreme Court in 1948. The Court, in a 7-to-1 decision, ruled against the movie giants, saying that they could no longer have total control of the vertically integrated system—from contracting with actors to making movies to showing them in their own theaters across the country. They had to choose: operate in either the movie making business or the movie showing business. They couldn’t do both.

The result was the beginning of the end of the “kingmaker” movie studio monopoly and a boon for independent filmmakers. It also led to a proliferation of new theaters, from ones in urban areas (many retrofitting old opera or burlesque houses) to the new fad of drive-in movie theaters. The industry today is infinitely more diverse and creative as a result of that breakup.

Television and the Prime Time Access Rule

In the late 1960s, television was going through a similar vertical integration, with the Big Three TV networks dominating the content of local television stations they either owned or had as affiliates. In response the FCC promulgated the Prime Time Access Rule in 1970, which dictated that at least one hour out of the four “prime time” hours on every local TV station in the nation would have to come from some source other than the network.

This opened the door to independent TV production companies, like MTM Enterprises, which produced several sitcoms derived from the work of Mary Tyler Moore, and competition from the new television divisions of old-line movie houses, such as Twentieth Century Fox’s producing a TV version of M*A*S*H and Paramount’s producing Happy Days.[2]

Although the rules against vertical theater integration are no longer enforced, and the Prime Time Access Rule was blown up in 1996, both the movie and TV industries are broadly more diverse in their programming than they would have been without these “market interventions” that increased competition and decreased monopoly. Which brings us to radio.

The Vicious Circle of Conservative Talk Radio

Many people wonder why the big 50,000-watt AM stations (and even many of the big 25,000- and 10,000-watt stations) across the country carry exclusively conservative programming, particularly programs featuring Rush Limbaugh, Sean Hannity, and Glenn Beck. In most cases, it’s a simple matter of the economics of monopoly.

One of the largest owners of the biggest (full-power) radio stations in the country is a mega-corporation that also owns the largest talk-radio syndication service in the nation. When the corporation’s stations carry shows that its syndication service owns, it makes money both from the local station ownership and from the ownership of the syndication service. When the stations carry shows from other syndicators or independent shows, the corporation loses the syndication revenue, and the local station (which it also owns) typically gives up about five minutes of advertising inventory per hour, bartered to the syndicated show in exchange for the right to air it.

Thus, so long as the radio industry is allowed to run like the movie studio system in the 1940s, the “studio”—in this case the giant corporation that owns radio stations as well as the nation’s largest talk-radio syndication service—will have an outsized influence on what shows up on the very biggest stations in the largest markets across the country. Because of the huge, booming voice of those stations, those shows will have a significant edge in “finding” listeners (and vice versa), making those shows “successful” and thus creating demand for them from the independent stations. It becomes a self-fulfilling prophecy.

Some progressives have suggested that radio needs a “fairness doctrine” where a government panel will determine how much “liberal” or “conservative” programming each station carries and then force the stations to “balance” out any disequilibrium. But who decides what is “liberal” or “conservative”? Is there a checklist of political positions that a government watchdog would have to go through—immigration, taxes, protecting the commons, gay rights, abortion, gun control, foreign policy? It would be a mess, particularly since many of those issues don’t lend themselves to easy pigeonholing.

A much easier way to balance the playing field is simply to bring into the marketplace real competition by separating syndication companies from local radio stations so that the stations will no longer have an incentive to carry programming because “it’s in the family” and instead will look for shows that can attract and hold an audience.

Programming in the Public Interest

We need to return to the notion of “programming in the public interest,” making news back into news. We also need to start enforcing the Sherman Antitrust Act and use it to break up the large media monopolies that have re-formed since the Reagan and Clinton eras, thus effectively rolling back media deregulation.

And this isn’t limited to radio and TV. Consumer-friendly regulation almost always has a similar effect in breaking up monopolies when it’s designed to help people get around the monopoly.

For example, the company that owns the copper wires, cable, 3G or 4G wireless, or fiber-optic cabling going into your house also owns the exclusive right to carry the content that goes over that infrastructure. If you have a cable company supplying your home, it’s probably competing only with the local phone company for your business. Because those two companies (and maybe a mobile provider) are the only ones “competing” for your business, they can easily keep prices—and profits—very high.

In most other developed countries, however, regardless of who owns and maintains the wires, cable, or fiber, anybody can offer content over it. The rationale for this is that the infrastructure of physical wires and the wireless frequencies constitutes a “natural monopoly” that heavily uses public spaces (cables and phone lines go through and along public streets and rights-of-way); and so while a company can make a small profit on that part of its business, the wires and the wireless frequencies are really a part of the commons that can be regulated.


On the other hand, these developed countries believe that the content delivery should be competitive. After all, this is where most of the innovation comes from: it’s not a matter of the newest, coolest copper wires; it’s the content that draws customers.

The result of this is that the average citizen in France, for example, pays about $33 per month for what the New York Times described as “Internet service twice as fast as what you get from Verizon or Comcast, bundled with digital high-definition television, unlimited long distance and international calling to 70 countries and wireless Internet connectivity for your laptop or smartphone throughout most of the country.”[3]

And that’s all from private companies, with no government subsidies. Why? Because small and new companies are allowed to compete by the government’s requiring whichever company carries the signal (wire, cable, fiber, wireless) to make that signal path available to any company that wants to offer content to consumers.

Competition—mandated by the French government—has driven the price down and innovation up. The average French citizen is not only paying one-fifth of what the average American pays for such services but is also getting better quality, more variety, and much faster Internet access.

Breaking up the media monopolies and fostering more competition, innovation, and creativity in the media world clearly has public benefits, especially in ensuring that people have access to information they need to participate in our democracy. An informed and educated electorate would be one major result of such government regulation.

The same result can also be helped by making higher education more accessible to the average American.

Access to Higher Education

Jefferson’s Tombstone

Thomas Jefferson’s tombstone contains an epitaph that he wrote before his death with a directive that not a single word be changed. He had been the president of the United States for two terms and the vice president for one, was a member of the Virginia legislature, and was a famous inventor and architect as well as the author of nearly a million words in various letters, diaries, notebooks, books, pamphlets, and rants. But he chose not to mention any of that on his gravestone.

Besides the dates of his birth and death, he chose to be remembered for three things that he did in his 83 years of life on earth:

Here Was Buried Thomas Jefferson

Author of the Declaration of American Independence

of the Statute of Virginia for Religious Freedom

and Father of the University of Virginia

Writing the Declaration of Independence was an obvious choice, and declaring forever his opposition to integrating church and state also made sense (although it got him demoted in 2010 in schoolbooks in the state of Texas). But “Father of the University of Virginia” being more important than “President of the United States of America”?

Jefferson, it turns out, had this wacky idea. He actually believed that young people should be able to go to college regardless of their ability to pay, their station in life, and how rich or poor their parents were. He thought that an educated populace was the best defense of liberty and democracy in the new nation he’d helped birth.

So the University of Virginia that he started was free.

Reagan’s Legacy

Ronald Reagan certainly thought that that was a wacky idea, and he was diametrically opposed to the Jeffersonian ideal. When he took office as governor of California in 1967, he quickly called for an end to free tuition at the University of California and an across-the-board 20 percent cut in state funding for higher education.[4] He then argued for a cut in spending on construction for higher education in the state and set up the firing of the popular president of the university, Clark Kerr, whom he deemed “too liberal.”

When asked why he was doing away with free college in California, Reagan said that the role of the state “should not be to subsidize intellectual curiosity.”

Reagan further referred to college students who nationwide were protesting the Vietnam War as “brats,” “cowardly fascists,” and “freaks,” adding that if the only way to “restore order” on the nation’s campuses was violence, that was fine with him. Just a few days before the Kent State shootings, he famously said, “If it takes a bloodbath, let’s get it over with. No more appeasement!”[5]

The trend that Reagan began with the UC system continues to this day. During Republican governor Arnold Schwarzenegger’s tenure, state funding for education saw drastic cuts and tuition for undergraduate students rose by more than 90 percent.[6]

Reagan set a tone as governor of California that metastasized across the nation through the 1970s and became federal policy when he was elected president in 1980. By the time he left office in 1989, federal funding for education in the United States had declined from 12 percent of total national educational spending in 1980 to just 6 percent.[7]

Interestingly, to find most of this information you have to dive into recent biographies of the former president or read old newspaper archives that are usually not available online. Not a word of Reagan’s role in slashing the UC funding exists, for example, on the Wikipedia pages for either the University of California or Reagan himself. Conservative foundations have poured millions of dollars into campaigns to scrub the Internet clean when it comes to Reagan’s past (and that of most other right-wingers).

Yet the reality is that before the Reagan presidency, it was possible for any American student with academic competence to attend college and graduate without debt.

Even in Michigan in the late 1960s, where education was not free but was highly subsidized by the state, my wife paid her way through college by working part-time as a waitress at a Howard Johnson’s. To the extent that I went to college (I completed less than a year altogether), I paid my own way by working as a DJ for $2.35 per hour, running my own TV repair business, pumping gas, and working as a cook at a Big Boy restaurant on weekends.

Such a scenario is unthinkable today. Instead public higher education has become a big business and is often totally corporate; costs are through the roof; and if you’re not from a very wealthy family, odds are you’ll graduate college with a debt that can take decades to repay. As a result, the United States is slipping in virtually every measurement of innovation, income, and competitiveness. A highly educated workforce is good for innovation and entrepreneurialism: every one of the top 20 innovative countries in the world—except the USA—offers free or very inexpensive college to qualified students.

Ireland took a cue from the pre-Reagan University of California and began offering free college tuition to all Irish citizens and a flat-rate registration fee of 900 euros per year for all European Union citizens. The result, decades later, is that Ireland has gone from having a backwater economy that was largely based on agriculture and tourism to becoming one of the high-tech and innovation capitals of the world.

Ironically, Ireland’s vision—and California’s pre-Reagan vision—of education was at the core of Thomas Jefferson’s hopes for the country he helped found.

Jefferson’s Vision

On June 14, 1898, more than 70 years after Jefferson’s death, a new building (then called the Academic Building, now called Cabell Hall) was inaugurated at the University of Virginia. One of the nation’s most prominent attorneys at the time, James C. Carter of New York City, gave the dedication speech.[8] Carter noted that when Jefferson retired from public office, he was only 66 years old and still energetic and enthusiastic to do something for his country. That something was founding the University of Virginia. Carter said:

He had cherished through life a passion for the acquisition of knowledge, and was one of the best educated men, if not the best educated man, of his country and time…

He had in early manhood formed a scheme of public education, which, from time to time, had pressed itself on his attention throughout even the busiest years of his public life. It was part of his political philosophy.

Lover of liberty as he was, firmly as he believed that popular government was the only form of public authority consistent with the highest happiness of men, he yet did not believe that any nation or community could permanently retain this blessing without the benefit of the lessons of truth, and the discipline of virtue to be derived only from the intellectual and moral education of the whole people.

Carter noted that Jefferson had laid out, in numerous letters and discussions throughout his life, a broad overview of how education should be conducted in the United States. Jefferson envisioned the division of states into districts and wards with primary schools and the establishment of colleges and universities where deserving students “might acquire, gratis, a further and higher education.”

Jefferson envisioned the goal of free public education—from childhood through university—to be straightforward. In a report he prepared for a state commission in Virginia, Jefferson laid out the six purposes of education:[9]

1. To give to every citizen the information he needs for the transaction of his own business.

2. To enable him to calculate for himself, and to express and preserve his ideas, his contracts and accounts in writing.

3. To improve, by reading, his morals and faculties.

4. To understand his duties to his neighbors and country, and to discharge with competence the functions confided to him by either.

5. To know his rights; to exercise with order and justice those he retains; to choose with discretion the fiduciary of those he delegates; and to notice their conduct with diligence, with candor and judgment.

6. And, in general, to observe with intelligence and faithfulness, all the social relations under which he shall be placed.

In other words, a well-educated citizenry can “choose with discretion” the elected representatives who serve as the fiduciaries of the government that protects our rights, and can hold those politicians accountable “with diligence, with candor and judgment.”

Ronald Reagan, on the other hand, promised during his election campaign of 1980 to “eliminate the Department of Education” from the federal government; and he appointed his friend William Bennett, who had campaigned and written extensively about destroying the federal Department of Education, as secretary of education—akin to asking the fox to guard the chicken coop. Between Reagan’s ax hacking at the roots of our educational systems and his tax cuts to “starve the beast” of government, we are now left with the highest illiteracy rate in the developed world and an electorate that is spectacularly vulnerable to demagoguery and cynical political manipulation.

The experiment of Reaganomics and Reagan’s anti-intellectual worldview are demonstrably disordered and dead; we must put them behind us and build anew our country on the solid Jeffersonian foundation of good and free education for all.

Combine that with breaking up the media monopolies in this country and fostering competition and its attendant innovation through intelligent regulation of the “natural monopolies” in our nation, and we would have a more informed citizenry with better and faster access to real news and information—including information about our body politic.

These “radical” concepts of free public education all the way up to graduate degrees, breaking up companies that vertically integrate entire markets (particularly in the media), and requiring infrastructure-owning companies to off er their infrastructure to a wide variety of competitors work quite well in dozens of countries around the world. They can here too.

NOTES

1. “Ted Koppel Assesses the Media Landscape,” BBC World News, April 12, 2010, http://news.bbc.co.uk/2/hi/programmes/world_news_america/8616838.stm.

2. “Studio,” Museum of Broadcast Communications, http://www.museum.tv/eotvsection.php?entrycode=studio.

3. Yochai Benkler, “Ending the Internet’s Trench Warfare,” New York Times, March 20, 2010, http://www.nytimes.com/2010/03/21/opinion/21Benkler.html.

4. Wallace Turner, “Gov. Reagan Proposes Cutback in U. of California Appropriation; Would Impose Tuition Charge on Students from State; Kerr Weighs New Post,” New York Times, January 7, 1967, cited in Gary K. Clabaugh, “The Educational Legacy of Ronald Reagan,” NewFoundations.com, January 24, 2009, http://www.newfoundations.com/Clabaugh/CuttingEdge/Reagan.html#_edn3.

5. Steven V. Roberts, “Ronald Reagan Is Giving ’Em Heck,” New York Times, October 25, 1970, cited in Clabaugh, “Educational Legacy.”

6. Richard C. Paddock, “Less to Bank on at State Universities,” Los Angeles Times, October 7, 2007, http://articles.latimes.com/2007/oct/07/local/me-newcompact7.

7. Gary K. Clabaugh, “The Educational Legacy of Ronald Reagan,” NewFoundations.com, January 24, 2009, http://www.newfoundations.com/Clabaugh/CuttingEdge/Reagan.html#_edn3.

8. James C. Carter, The University of Virginia: Jefferson Its Father, and His Political Philosophy: An Address Delivered upon the Occasion of the Dedication of the New Buildings of the University, June 14, 1898 (Ann Arbor: University of Michigan Library, 1898).

9. Albert Ellery Bergh, ed., The Writings of Thomas Jefferson, vol. 2 (Washington, DC: Thomas Jefferson Memorial Association, 1905), http://www.constitution.org/tj/jeff02.txt.

Thom Hartmann is a New York Times bestselling, Project Censored Award-winning author and host of a nationally syndicated progressive radio talk show. You can learn more about Thom Hartmann at his website and find out what stations broadcast his program. He also now has a daily television program on the RT network, and you can listen to Thom over the Internet.

Copyright Thom Hartmann and Mythical Research, Inc. Truthout has obtained exclusive rights to reprint this content. It may not be reproduced, and is not covered by our Creative Commons license.


Overview: Communications

2013 could mark the end of the era of Internet openness….a court ruling against the FCC would usher in an age of online payola, and fundamentally alter the character of the Internet. Allowing paid prioritization would shift power away from the upstarts and visionaries—those who have sparked one of history’s greatest periods of economic and technological growth—toward established companies like AT&T, Comcast and Verizon, which want to rein in any online innovation that threatens their plans to control the new media economy. This is bad news for anyone who thinks the Internet marketplace should remain open to all comers… The openness principle, often referred to as Net Neutrality, is why the Internet has become a network for the truest expression of the free market. Regardless of the court’s decision, the FCC has the responsibility to ensure that ISPs don’t use their power over Internet access to seize control of content and destroy this marketplace. Stopping Internet payola now is vital to saving the Internet over the long term. The End of Free Internet as We Know It? By Timothy Karr, Save the Internet, posted on Alternet.org, December 12, 2013 

… coverage in 2012 was a particularly calamitous failure, almost entirely missing the single biggest story of the race: Namely, the radical right-wing, off-the-rails lurch of the Republican Party, both in terms of its agenda and its relationship to the truth… Democrats were hardly innocent but…the Republican campaign was just far more over the top.”… fabulists and liars can exploit the elite media’s fear of being seen as taking sides… if the story that you’re telling repeatedly is that they’re all…equally to blame — then you’re really doing a disservice to voters, and not doing what journalism is supposed to do…How the Mainstream Press Bungled the Single Biggest Story of the 2012 Campaign by Dan Froomkin

Your False-Equivalence Guide to the Days Ahead by James Fallows, September 27, 2013

Fox’s Unbalanced Ethics Threatens Democracy by Wendell Potter

Republicans Lied To by ‘Conservative Entertainment Complex’

Social media

While the traditional media loves fights, the new and emerging social media loves connections. We can leverage the wisdom and creativity of crowds to find win-win solutions to our common problems. We can scale our efforts to tens of thousands of conversations, giving individuals the power to begin to reweave the social fabric of our communities… Reweaving the Fabric of our Society by Joan Blades, co-founder, MoveOn.org

How You Will Change the World with Social Networking by Deanna Zandt

The American Public’s Shocking Lack of Policy Knowledge is a Threat to Progress and Democracy

By Justin Doolittle, Truthout | Op-Ed, October 12, 2013

Excerpt

The genuinely shocking degree of public ignorance regarding the ACA that has been revealed by this slew of recent polls… ought to be viewed as a very serious political crisis and a grave threat to whatever semblance of health our badly disfigured democratic culture still maintains…

The public is not just uninformed, but also misinformed…having been helplessly subjected to four years of relentless and fantastically dishonest propaganda regarding the policy on which they are opining…

Widespread civic ignorance is intrinsically beneficial to reactionaries and anathema to progressive politics. The lack of basic, sensible policy knowledge among the general public is hardly limited to the arena of health care…

The Republican Party, as well as what David Frum refers to as the “conservative entertainment complex,” combine to operate an extremely powerful, 24/7 propaganda machine, specifically designed to misinform Americans and spread an inherent distrust of any progressive policy ideas…

The reality of a massively uninformed public, though, is simply incongruous with this vision of a progressive future. So long as colossal swaths of the population are in the dark about the major policy issues of our time, the political scene will be ripe for ultra-right-wing demagogues and faux-populists to thwart progress at every turn...The vibrancy and legitimacy of our democratic culture depend on it.

Full text

With implementation of the core provisions of the Affordable Care Act (ACA) finally upon us, pollsters have been busy soliciting current public opinion of the law, as the fanatical Republican Party continues its quixotic mission to destroy it. Ludicrous though it is to subject such a complex and multi-dimensional law to an up-or-down, binary referendum, the American people continue to “disapprove” of the ACA by a modest margin; this polling dynamic has been relatively consistent since 2010.

Far more interesting than the fatuous and oversimplified “approve or disapprove” question, though, is the more concrete polling data that reveals public perception of what is, and is not, actually contained in the reform package. Surely, before subjectivity even enters the equation, there must be a coherent, objective understanding of the law itself. Any negative – or positive – opinion about the ACA that is premised on a thorough lack of knowledge of the law’s actual substance may, of course, be dismissed as essentially meaningless.

The genuinely shocking degree of public ignorance regarding the ACA that has been revealed by this slew of recent polls, more than three years after the law was signed by President Obama, should not be something to which we respond by simply shaking our heads and lamenting that the American people are so “disengaged.” No, this ought to be viewed as a very serious political crisis and a grave threat to whatever semblance of health our badly disfigured democratic culture still maintains.

The public is dramatically uninformed, and misinformed, about the law. A recent Wall Street Journal/NBC News poll found that 76 percent of people currently lacking health insurance “didn’t understand the law and how it would affect them.” Just 51 percent of respondents were aware of the existence of the exchanges that launched; 49 percent were aware of the subsidies available for low-income people.

A Kaiser Family Foundation poll asked respondents about the exchanges and even provided several choices of launch dates: October 1; Other date in 2013; Date in 2014; Other; or Don’t know/Refused.

Given the structure of the question, the results are astonishing: 64 percent of the public refused to even venture a guess, with just 15 percent answering correctly that the exchanges were set to launch on October 1. Among the uninsured, the bloc of people the new marketplaces are designed to help, it was even more striking: 74 percent did not know, and just 12 percent answered October 1. The poll was taken less than two weeks before the exchanges were set to go live.

As the years pass, there is virtually no evidence that accurate information about the ACA is successfully penetrating the public consciousness: Kaiser concluded “the public’s level of awareness about exactly which provisions are – and are not – included in the health care law has generally not increased in the three and a half years since the law was passed.”

If anything, the trend is going in the opposite direction, with the public having actually become less informed with time: Levels of awareness of other key provisions have either remained stable or declined over time.

For example, the shares who are aware of the law’s subsidies, Medicaid expansion, and closing of the Medicare prescription drug “doughnut hole” have all decreased since right after the law’s passage in 2010 (by 12 percentage points, 6 percentage points, and 7 percentage points, respectively). This leaves substantial shares unaware that the law includes each of these key provisions.

The public is not just uninformed, but also misinformed, about the most consequential social reform signed into law in many decades. More than half of Americans reported to Kaiser that a public option was, in fact, included in the ACA. More than 40 percent believe that the law provides subsidies to undocumented immigrants to purchase health insurance, establishes something very closely resembling a death panel, and cuts benefits for seniors currently enrolled in Medicare (43 percent, 42 percent and 42 percent, respectively).

These statistics simply preclude any kind of serious discussion about the law’s “popularity” or lack thereof. It would be no more absurd to approach a group of 12-year-olds and inquire about their views of the conflict in Syria. And that analogy actually understates the case, because the 12-year-olds in question could at least respond honestly and instinctively, without having been helplessly subjected to four years of relentless and fantastically dishonest propaganda regarding the policy on which they are opining.

It’s worth remembering that these are the central provisions of the ACA. If so many citizens are unaware of the core tenets of the law, including those which are specifically geared to benefit them and have generated an enormous amount of media attention, it stands to reason that the dozens of smaller-scale programs and reforms contained within the law are unknown to virtually everyone except the most dedicated policy wonks.

What percentage of the public, for example, is aware of the provisions that

- Prohibit insurance companies from imposing annual dollar limits on coverage, the cause of so many bankruptcies among American families?

- Require that 85 percent of premiums be spent on actual care rather than administration (the Medical Loss Ratio rule), a provision that has already resulted in billions of dollars in rebates?

- Expand free preventive care?

- Develop Accountable Care Organizations, Electronic Health Records, and bundled payments to finally address the rampant structural dysfunction in health care?

- Establish the law’s myriad cost control mechanisms?

Our health care system has long been an unconscionable international scandal, with terribly high costs and tens of millions of citizens lacking access. Democratic control of the White House and both chambers of Congress presented a unique opportunity to finally address the problem. The long and often grotesque legislative process eventually produced a law that, while further postponing the country’s eventual and inevitable embrace of some sort of single-payer system, contained a number of exceptionally propitious ideas and long overdue reforms. It’s a clear and undeniable step forward for health care in this country. Indeed, it’s virtually impossible to coherently argue – even for those of us who support single payer – that the ACA, in its totality, is not exceedingly preferable to what had been the status quo, at least from a utilitarian perspective.

How then, can the public be so consistently antipathetic about the law, given that it merely removes some of the most excessive brutality from the American system of health care, while making a multifaceted effort to both expand access and control costs?

It’s been theorized that the broad aversion stems from the fact that around 85 percent of American adults already have health insurance, are reasonably satisfied with it, and are therefore anxious about any sweeping overhauls of a system that is, as they see it, working tolerably well for them.

To be sure, there is some truth to this. For all the systemic problems with health care in this country, most American families do have health insurance, and, within this group, many families are facing other, often quite dire economic challenges. Understandably, they might have wanted federal policymaking to be focused elsewhere in 2009 and 2010 (on the foreclosure crisis, for example).

Less Than Half Know About Closing “Doughnut Hole”

Of course, this explanation fails to account for the lack of support for the numerous cost-control mechanisms in the ACA, which, if effective, will benefit everyone, not just the currently uninsured; or, for example, the closing of the dreaded Medicare “doughnut hole,” something that benefits all seniors – and something that, to this day, less than half of Americans know is included in the law.

Nevertheless, due to various social, political and economic realities, not everyone will support sweeping social legislation, and this is unavoidable. The one strain of opposition – or indifference – to the ACA that is not unavoidable, though, and on which the progressive movement should focus, is that which is based on pure lack of information.

Thomas Jefferson once said that “though the people may acquiesce, they cannot approve what they do not understand.” This is especially true when what they are being asked to approve is something as complicated and far-reaching as the ACA.

Widespread civic ignorance is intrinsically beneficial to reactionaries and anathema to progressive politics. The lack of basic, sensible policy knowledge among the general public is hardly limited to the arena of health care.

It’s no great secret which political actors are responsible for this democratic crisis. Our national political media is led by people like Chuck Todd, who, in a moment of breathtaking honesty, openly confessed that he does not view the dissemination of accurate and factual information as part of his job description as a journalist.

The Republican Party and what David Frum refers to as the “conservative entertainment complex” combine to operate an extremely powerful, 24/7 propaganda machine, specifically designed to misinform Americans and spread an inherent distrust of any progressive policy ideas.

Indeed, going down the line of policy realms on which the public is ignorant, in virtually every instance, it benefits the ultra-right-wing. Indifference to climate change is rampant. The public judges levels of income inequality in the United States to be much less dramatic than they are in reality. Support for the extraordinarily idiotic “balanced budget amendment” is overwhelming. In all of these cases, wildly uninformed public opinion serves to provide tactical support to Republicans and aid them in their vicious ideological crusades.

There is cause for optimism about the future of progressive politics in the US. Demographic realities, seismic shifts in public attitudes on social issues, and a lasting feeling of abomination at the unhinged lunacy of the Bush years are just some of the reasons to feel hopeful about our political direction over the long term.

The reality of a massively uninformed public, though, is simply incongruous with this vision of a progressive future. So long as colossal swaths of the population are in the dark about the major policy issues of our time, the political scene will be ripe for ultra-right-wing demagogues and faux-populists to thwart progress at every turn. Progressives have to confront the fact that, technically, Republicans are right when they say that the ACA is “unpopular” or that “the American people” want a balanced budget amendment. This has to change. The vibrancy and legitimacy of our democratic culture depend on it.

http://truth-out.org/opinion/item/19279-the-american-publics-shocking-lack-of-policy-knowledge-is-a-threat-to-progress-and-democracy

The Rise of the New New Left

by Peter Beinart, The Daily Beast, Sep 12, 2013

Bill de Blasio’s win in New York’s Democratic primary isn’t a local story. It’s part of a vast shift that could upend three decades of American political thinking.

Maybe Bill de Blasio got lucky. Maybe he only won because he cut a sweet ad featuring his biracial son. Or because his rivals were either spectacularly boring, spectacularly pathological, or running for Michael Bloomberg’s fourth term. But I don’t think so. The deeper you look, the stronger the evidence that de Blasio’s victory is an omen of what may become the defining story of America’s next political era: the challenge, to both parties, from the left. It’s a challenge Hillary Clinton should start worrying about now.

To understand why that challenge may prove so destabilizing, start with this core truth: For the past two decades, American politics has been largely a contest between Reaganism and Clintonism. In 1981, Ronald Reagan shattered decades of New Deal consensus by seeking to radically scale back government’s role in the economy. In 1993, Bill Clinton brought the Democrats back to power by accepting that they must live in the world Reagan had made. Located somewhere between Reagan’s anti-government conservatism and the pro-government liberalism that preceded it, Clinton articulated an ideological “third way”: Inclined toward market solutions, not government bureaucracy, focused on economic growth, not economic redistribution, and dedicated to equality of opportunity, not equality of outcome. By the end of Clinton’s presidency, government spending as a percentage of Gross Domestic Product was lower than it had been when Reagan left office.

For a time, small flocks of pre-Reagan Republicans and pre-Clinton Democrats endured, unaware that their species were marked for extinction. Hard as they tried, George H.W. Bush and Bob Dole could never muster much rage against the welfare state. Ted Kennedy never understood why Democrats should declare the era of big government over. But over time, the older generation in both parties passed from the scene and the younger politicians who took their place could scarcely conceive of a Republican Party that did not bear Reagan’s stamp or a Democratic Party that did not bear Clinton’s. These Republican children of Reagan and Democratic children of Clinton comprise America’s reigning political generation.

By “political generation,” I mean something particular. Pollsters slice Americans into generations at roughly 20-year intervals: Baby Boomers (born mid-1940s to mid-1960s); Generation X (mid-1960s to early 1980s); Millennials (early 1980s to 2000). But politically, these distinctions are arbitrary. To understand what constitutes a political generation, it makes more sense to follow the definition laid out by the early-20th-century sociologist Karl Mannheim. For Mannheim, generations were born from historical disruption. As he argued—and later scholars have confirmed—people are disproportionately influenced by events that occur between their late teens and mid-twenties. During that period—between the time they leave their parents’ home and the time they create a stable home of their own—individuals are most prone to change cities, religions, political parties, brands of toothpaste. After that, lifestyles and attitudes calcify. For Mannheim, what defined a generation was the particular slice of history people experienced during those plastic years. A generation had no set length. A new one could emerge “every year, every thirty, every hundred.” What mattered was whether the events people experienced while at their most malleable were sufficiently different from those experienced by people older or younger than themselves.

Mannheim didn’t believe that everyone who experienced the same formative events would interpret them the same way. Germans who came of age in the early 1800s, he argued, were shaped by the Napoleonic wars. Some responded by becoming romantic-conservatives, others by becoming liberal-rationalists. What they shared was a distinct generational experience, which became the basis for a distinct intra-generational argument.

If Mannheim’s Germans constituted a political generation because in their plastic years they experienced the Napoleonic Wars, the men and women who today dominate American politics constitute a political generation because during their plastic years they experienced some part of the Reagan-Clinton era. That era lasted a long time. If you are in your late 50s, you are probably too young to remember the high tide of Kennedy-Johnson big government liberalism. You came of age during its collapse, a collapse that culminated with the defeat of Jimmy Carter. Then you watched Reagan rewrite America’s political rules. If you are in your early ‘40s, you may have caught the tail end of Reagan. But even if you didn’t, you were shaped by Clinton, who maneuvered within the constraints Reagan had built. To pollsters, a late 50-something is a Baby Boomer and an early 40-something is a Gen-Xer. But in Mannheim’s terms, they constitute a single generation because no great disruption in American politics divides them. They came of age as Reagan defined a new political era and Clinton ratified it. And as a rule, they play out their political struggles between the ideological poles that Reagan and Clinton set out.

To understand how this plays out in practice, look at the rising, younger politicians in both parties. Start with the GOP. If you look at the political biographies of nationally prominent 40-something Republicans—Bobby Jindal, Scott Walker, Paul Ryan, Marco Rubio, Ted Cruz—what they all have in common is Reagan. Jindal has said about growing up in Louisiana, “I grew up in a time when there weren’t a whole lot of Republicans in this state. But I identified with President Reagan.” At age 17, Scott Walker was chosen to represent his home state of Colorado in a Boys Nation trip to Washington. There he met “his hero, Ronald Reagan,” who “played a big role in inspiring me.” At age 21, Paul Ryan interned for Robert Kasten, who had ridden into the Senate in 1980 on Reagan’s coattails. Two years later he took a job with Jack Kemp, whose 1981 Kemp-Roth tax cut had helped usher in Reaganomics. Growing up in a fiercely anti-communist Cuban exile family in Miami, Marco Rubio writes in his autobiography that “Reagan’s election and my grandfather’s allegiance to him were defining influences on me politically.” Ted Cruz is most explicit of all. “I was 10 when Reagan became president,” he told a conservative group earlier this year. “I was 18 when he left the White House … I’ll go to my grave with Ronald Wilson Reagan defining what it means to be president … and when I look at this new generation of [Republican] leaders I see leaders that are all echoing Reagan.”

Younger Democratic politicians are less worshipful of Clinton. Yet his influence on their worldview is no less profound. Start with the most famous, still-youngish Democrat, a man who, although a decade older than Rubio, Jindal, and Cruz, hails from the same Reagan-Clinton generation: Barack Obama. Because he opposed the Iraq War, and sometimes critiqued the Clintons as too cautious when running against Hillary in 2008, some commentators depicted Obama’s victory as a rejection of Clintonism. But to read The Audacity of Hope—Obama’s most detailed exposition of his political outlook—is to be reminded how much of a Clintonian Obama actually is. At Clintonism’s core was the conviction that to revive their party, Democrats must first acknowledge what Reagan got right.

Obama, in describing his own political evolution, does that again and again: “as disturbed as I might have been by Ronald Reagan’s election … I understood his appeal” (page 31). “Reagan’s central insight … contained a good deal of truth” (page 157). “In arguments with some of my friends on the left, I would find myself in the curious position of defending aspects of Reagan’s worldview” (page 289). Having given Reagan his due, Obama then sketches out a worldview in between the Reaganite right and unreconstructed, pre-Reagan left. “The explanations of both the right and the left have become mirror images of each other” (page 24), he declares in a chapter in which he derides “either/or thinking” (page 40). “It was Bill Clinton’s singular contribution that he tried to transcend this ideological deadlock” (page 34). Had the term not already been taken, Obama might well have called his intermediary path the “third way.”

The nationally visible Democrats rising behind Obama generally share his pro-capitalist, anti-bureaucratic, Reaganized liberalism. The most prominent is 43-year-old Cory Booker, who is famously close to Wall Street and supports introducing market competition into education via government-funded vouchers for private schools. In the words of New York magazine, “Booker is essentially a Clinton Democrat.” Gavin Newsom, the 45-year-old lieutenant governor of California, has embraced Silicon Valley in the same way Booker has embraced Wall Street. His book, Citizenville, calls for Americans to “reinvent government,” a phrase cribbed from Al Gore’s effort to strip away government bureaucracy in the 1990s. “In the private sector,” he told Time, “leaders are willing to take risks and find innovative solutions. In the public sector, politicians are risk-averse.” Julian Castro, the 39-year-old mayor of San Antonio and 2012 Democratic convention keynote speaker, is a fiscal conservative who supports NAFTA.

The argument between the children of Reagan and the children of Clinton is fierce, but ideologically, it tilts toward the right. Even after the financial crisis, the Clinton Democrats who lead their party don’t want to nationalize the banks, institute a single-payer health-care system, raise the top tax rate back to its pre-Reagan high, stop negotiating free-trade deals, launch a war on poverty, or appoint labor leaders rather than Wall Streeters to top economic posts. They want to regulate capitalism modestly. Their Reaganite Republican adversaries, by contrast, want to deregulate it radically. By pre-Reagan standards, the economic debate is taking place on the conservative side of the field. But—and this is the key point—there’s reason to believe that America’s next political generation will challenge those limits in ways that cause the leaders of both parties fits.

America’s youngest adults are called “Millennials” because the 21st century was dawning as they entered their plastic years. Coming of age in the 21st century is of no inherent political significance. But this calendric shift has coincided with a genuine historical disruption. Compared to their Reagan-Clinton generation elders, Millennials are entering adulthood in an America where government provides much less economic security. And their economic experience in this newly deregulated America has been horrendous. This experience has not produced a common generational outlook. No such thing ever exists. But it is producing a distinct intragenerational argument, one that does not respect the ideological boundaries to which Americans have become accustomed. The Millennials are unlikely to play out their political conflicts between the yard lines Reagan and Clinton set out.

In 2001, just as the first Millennials were entering the workforce, the United States fell into recession. By 2007 the unemployment rate had still not returned to its pre-recession level. Then the financial crisis hit. By 2012, data showed how economically bleak the Millennials’ first decade of adulthood had been. Between 1989 and 2000, when younger members of the Reagan-Clinton generation were entering the job market, inflation-adjusted wages for recent college graduates rose almost 11 percent, and wages for recent high school graduates rose 12 percent. Between 2000 and 2012, it was the reverse. Inflation-adjusted wages dropped 13 percent among recent high school graduates and 8 percent among recent graduates of college.

But it was worse than that. If Millennials were victims of a 21st-century downward slide in wages, they were also victims of a longer-term downward slide in benefits. The percentage of recent college graduates with employer-provided health care, for instance, dropped by half between 1989 and 2011.

The Great Recession hurt older Americans, too. But because they were more likely to already have secured some foothold in the job market, they were more cushioned from the blow. By 2009, the net worth of households headed by someone over 65 was 47 times the net worth of households headed by someone under 35, almost five times the margin that existed in 1984.

One reason is that in addition to coming of age in a terrible economy, Millennials have come of age at a time when the government safety net is far more threadbare for the young than for the middle-aged and old. As the Economic Policy Institute has pointed out, younger Americans are less likely than their elders to qualify for unemployment insurance, food stamps, Temporary Assistance for Needy Families, or the Earned Income Tax Credit. (Not to mention Medicare and Social Security.)

Millennials have also borne the brunt of declines in government spending on higher education. In 2012, according to The New York Times, state and local spending per college student hit a 25-year low. As government has cut back, universities have passed on the (ever-increasing) costs of college to students. Nationally, the share of households owing student debt doubled between 1989 and 2010, and the average amount of debt per household tripled, to $26,000.

Economic hardship has not always pushed Americans to the left. In the Reagan-Clinton era, for instance, the right often used culture and foreign policy to convince economically struggling Americans to vote against bigger government. But a mountain of survey data—plus the heavily Democratic tilt of Millennials in every national election in which they have voted—suggests that they are less susceptible to these right-wing populist appeals. For one thing, right-wing populism generally requires rousing white, Christian, straight, native-born Americans against Americans who are not all those things. But among Millennials, there are fewer white, Christian non-immigrants to rouse. Forty percent of Millennials are racial or ethnic minorities. Less than half say religion is “very important” to their lives.

And even those Millennials who are white, Christian, straight, and native-born are less resentful of people who are not. According to a 2010 Pew survey, whites under the age of 30 were more than 50 points more likely than whites over 65 to say they were comfortable with someone in their family marrying someone of another ethnicity or race. A 2011 poll by the Public Religion Research Institute found that almost 50 percent of evangelicals under the age of 30 back gay marriage.

Of course, new racial, ethnic, and sexual fault lines could emerge. But today, a Republican seeking to divert Millennial frustrations in a conservative cultural direction must reckon with the fact that Millennials are dramatically more liberal than the elderly and substantially more liberal than the Reagan-Clinton generation on every major culture war issue except abortion (where there is no significant generational divide).

They are also more dovish on foreign policy. According to the Pew Research Center, Millennials are close to half as likely as the Reagan-Clinton generation to accept sacrificing civil liberties in the fight against terrorism  and much less likely to say the best way to fight terrorism is through military force.

It is these two factors—their economic hardship in an age of limited government protection and their resistance to right-wing cultural populism—that best explain why on economic issues, Millennials lean so far left. In 2010, Pew found that two-thirds of Millennials favored a bigger government with more services over a cheaper one with fewer services, a margin 25 points above the rest of the population. While large majorities of older and middle-aged Americans favored repealing Obamacare in late 2012, Millennials favored expanding it, by 17 points. Millennials are substantially more pro–labor union than the population at large.

The only economic issue on which Millennials show much libertarian instinct is the privatization of Social Security, which they disproportionately favor. But this may be less significant than it first appears. Historically, younger voters have long been more pro–Social Security privatization than older ones, with support dropping as they near retirement age. In fact, when asked if the government should spend more money on Social Security, Millennials are significantly more likely than past cohorts of young people to say yes.

Most striking of all, Millennials are more willing than their elders to challenge cherished American myths about capitalism and class. According to a 2011 Pew study, Americans under 30 are the only segment of the population to describe themselves as “have nots” rather than “haves.” They are far more likely than older Americans to say that business enjoys more control over their lives than government.  And unlike older Americans, who favor capitalism over socialism by roughly 25 points, Millennials, narrowly, favor socialism.

There is more reason to believe these attitudes will persist as Millennials age than to believe they will change. For starters, the liberalism of Millennials cannot be explained merely by the fact that they are young, because young Americans have not always been liberal. In recent years, polls have shown young Americans to be the segment of the population most supportive of government-run health care. But in 1978, they were the least supportive. In the last two elections, young Americans voted heavily for Obama. But in 1984 and 1988, Americans under 30 voted Republican for president.

Nor is it true that Americans necessarily grow more conservative as they age. Sometimes they do. But academic studies suggest that party identification, once forged in young adulthood, is more likely to persist than to change. There’s also strong evidence from a 2009 National Bureau of Economic Research paper that people who experience a recession in their plastic years support a larger state role in the economy throughout their lives.

The economic circumstances that have pushed Millennials left are also unlikely to change dramatically anytime soon. A 2010 study by Yale economist Lisa Kahn found that even 17 years later, people who had entered the workforce during a recession still earned 10 percent less than those who entered when the economy was strong.  In other words, even if the economy booms tomorrow, Millennials will still be suffering the Great Recession’s aftershocks for decades.

And the economy is not likely to boom. Federal Reserve Chairman Ben Bernanke doesn’t believe the unemployment rate will reach 6 percent until 2016, and even that will be higher than the 1990s average. Nor are the government protections Millennials crave likely to appear anytime soon. To the contrary, as a result of the spending cuts signed into law in 2010 and the sequester that began this year, non-defense discretionary spending is set to decline by decade’s end to its lowest level in 50 years.

If Millennials remain on the left, the consequences for American politics over the next two decades could be profound. In the 2008 presidential election, Millennials constituted one-fifth of America’s voters. In 2012, they were one-quarter. In 2016, according to predictions by political demographer Ruy Teixeira, they will be one-third. And they will go on constituting between one-third and two-fifths of America’s voters through at least 2028.

This rise will challenge each party, but in different ways. In the run-up to 2016, the media will likely feature stories about how 40-something Republicans like Marco Rubio, who blasts Snoop Dogg from his car, or Paul Ryan, who enjoys Rage Against the Machine, may appeal to Millennials in ways that geezers like McCain and Romney did not. Don’t believe it. According to a 2012 Harvard survey, young Americans were more than twice as likely to say Mitt Romney’s selection of Ryan made them feel more negative about the ticket than more positive. In his 2010 Senate race, Rubio fared worse among young voters than any other age group. The same goes for Rand Paul in his Senate race that year in Kentucky, and Scott Walker in his 2010 race for governor of Wisconsin and his recall battle in 2012.

Pre-election polls in Ted Cruz’s 2012 Senate race in Texas (there were no exit polls) also showed him faring worst among the young.

The likeliest explanation for this is that while younger Republican candidates may have a greater cultural connection to young voters, the ideological gulf is vast. Even if they are only a decade older than Millennials, politicians like Cruz, Rubio, and Walker hail from a different political generation both because they came of age at a time of relative prosperity and because they were shaped by Reagan, whom Millennials don’t remember. In fact, the militantly anti-government vision espoused by ultra-Reaganites like Cruz, Rubio, and Walker isn’t even that popular among Millennial Republicans. As a July Pew survey notes, Republicans under 30 are more hostile to the Tea Party than any other Republican age group. By double digits, they’re also more likely than other Republicans to support increasing the minimum wage.

Republicans may modestly increase their standing among young voters by becoming more tolerant on cultural issues and less hawkish on foreign policy, but it’s unlikely they will become truly competitive unless they follow the counsel of conservative commentators Ross Douthat and Reihan Salam and “adapt to a new reality—namely, that today, Americans are increasingly worried about their economic security.” If there’s hope for the GOP, it’s that Millennials, while hungry for government to provide them that economic security, are also distrustful of its capacity to do so. As a result of growing up in what Chris Hayes has called the “fail decade”—the decade of the Iraq War, Hurricane Katrina and the financial crisis—Millennials are even more cynical about government than the past generations of young Americans who wanted less from it. If a Republican presidential candidate could match his Democratic opponent as a champion of economic security and yet do so in a way that required less faith in Washington’s competence and benevolence, he might boost the GOP with young voters in a way no number of pop-culture references ever could.

If the Millennials challenge Reaganite orthodoxy, they will likely challenge Clintonian orthodoxy, too. Over the past three decades, Democratic politicians have grown accustomed to campaigning and governing in the absence of a mobilized left. This absence has weakened them: Unlike Franklin Roosevelt or Lyndon Johnson, Bill Clinton and Barack Obama could never credibly threaten American conservatives that if they didn’t pass liberal reforms, left-wing radicals might disrupt social order. But Democrats of the Reagan-Clinton generation have also grown comfortable with that absence. From Tony Coelho, who during the Reagan years taught House Democrats to raise money from corporate lobbyists, to Bill Clinton, who made Goldman Sachs co-chairman Robert Rubin his chief economic adviser, to Barack Obama, who gave the job to Rubin’s former deputy and alter ego, Larry Summers, Democrats have found it easier to forge relationships with the conservative worlds of big business and high finance because they have not faced much countervailing pressure from an independent movement of the left.

But that may be changing. Look at the forces that created Occupy Wall Street. The men and women who assembled in September 2011 in Zuccotti Park bore three key characteristics. First, they were young. According to a survey published by City University of New York’s Murphy Institute for Worker Education and Labor, 40 percent of the core activists involved in taking over the park were under 30 years old. Second, they were highly educated. Eighty percent possessed at least a bachelor’s degree, more than twice the percentage of New Yorkers overall. Third, they were frustrated economically. According to the CUNY study, more than half the Occupy activists under 30 owed at least $1,000 in student debt. More than one-third had lost a job or been laid off in the previous five years. In the words of David Graeber, the man widely credited with coining the slogan “We are the 99 percent,” the Occupy activists were “forward-looking people who had been stopped dead in their tracks” by bad economic times.

For a moment, Occupy shook the country. At one point in December 2011, Todd Gitlin points out in Occupy Nation, the movement had branches in one-third of the cities and towns in California. Then it collapsed. But as the political scientist Frances Fox Piven has argued, “The great protest movements of history … did not expand in the shape of a simple rising arc of popular defiance. Rather, they began in a particular place, sputtered and subsided, only to re-emerge elsewhere in perhaps a different form, influenced by local particularities of circumstance and culture.”

It’s impossible to know whether the protest against inequality will be such a movement. But the forces that drove it are unlikely to subside. Many young Americans feel that economic unfairness is costing them a shot at a decent life. Such sentiments have long been widespread among the poor. What’s new is their prevalence among people who saw their parents achieve—and expected for themselves—some measure of prosperity, the people Chris Hayes calls the “newly radicalized upper-middle class.”

If history is any guide, the sentiments behind Occupy will find their way into the political process, just as the anti-Vietnam movement helped create Eugene McCarthy’s presidential bid in 1968, and the civil-rights movement bred politicians like Andrew Young, Tom Bradley, and Jesse Jackson. That’s especially likely because Occupy’s message enjoys significant support among the young. A November 2011 Public Policy Polling survey found that while Americans over 30 opposed Occupy’s goals by close to 20 points, Millennials supported them by 12.

Bill de Blasio’s mayoral campaign offers a glimpse into what an Occupy-inspired challenge to Clintonism might look like. In important ways, New York politics has mirrored national politics in the Reagan-Clinton era. Since 1978, the mayoralty has been dominated by three men—Ed Koch, Rudy Giuliani, and Michael Bloomberg—who, although liberal on many cultural issues, have closely identified Wall Street’s interests with the city’s. During their time in office, New York has become far safer, cleaner, more expensive, and more unequal. In Bloomberg’s words, New York is now a “high-end product.”

City Council Speaker Christine Quinn, despite her roots on the left as a housing and LGBT activist, became Bloomberg’s heir apparent by stymieing bills that would have required businesses to give their employees paid sick leave and mandated a higher minimum wage for companies that receive government subsidies. Early in the campaign, many commentators considered this a wise strategy and anticipated that as New York’s first lesbian mayor, Quinn would symbolize the city’s unprecedented cultural tolerance while continuing its Clintonian economic policies.

Then strange things happened. First, Anthony Weiner entered the race and snatched support from Quinn before exploding in a blaze of late-night comedy. But when Weiner crashed, his support went not back to Quinn but to de Blasio, the candidate who most bluntly challenged Bloomberg’s economic philosophy. Calling it “an act of equalization in a city that is desperately falling into the habit of disparity,” de Blasio made his central proposal a tax on people making over $500,000 to fund universal childcare. He also called for requiring developers to build more affordable housing and ending the New York Police Department’s “stop and frisk” policies that had angered many African-Americans and Latinos. Bloomberg’s deputy mayor Howard Wolfson tweeted that de Blasio’s “agenda is clear: higher taxes, bigger govt, more biz mandates. A u-turn back to the 70s.”

But in truth, it was Wolfson who was out of date: Fewer and fewer New Yorkers remember the 1970s, when economic stagnation, rising crime, and bloated government helped elect both Ed Koch and Ronald Reagan. What concerns them more today is that, as The New Yorker recently noted, “If the borough of Manhattan were a country, the income gap between the richest twenty per cent and the poorest twenty per cent would be on par with countries like Sierra Leone, Namibia, and Lesotho.”  In Tuesday’s Democratic primary, Quinn defeated de Blasio in those parts of New York where average income tops $175,000 per year.  But he beat her by 25 points overall.

Democrats in New York are more liberal than Democrats nationally. Still, the right presidential candidate, following de Blasio’s model, could seriously challenge Hillary Clinton. If that sounds far-fetched, consider the last two Democratic presidential primary campaigns. In October 2002, Howard Dean was so obscure that at a Jefferson-Jackson Day dinner, Iowa Sen. Tom Harkin repeatedly referred to him as “John.” But in the summer of 2003, running against the Iraq War amidst a field of Washington Democrats who had voted to authorize it, Dean caught fire. In the first quarter of the year he raised just under $3 million, less than one-third of John Kerry’s total. In the second quarter, he shocked insiders by beating Kerry and raising over $7 million. In the third quarter, he raised almost $15 million, far more than any Democrat ever had. By November, Harkin, Al Gore, and the nation’s two most powerful labor unions had endorsed Dean and he was well ahead in the Iowa polls.

At the last minute, Dean flamed out, undone by harsh attacks from his rivals and his campaign’s lack of discipline. Still, he established a template for toppling a Democratic frontrunner: inspire young voters, raise vast funds via small donations over the Web, and attack those elements of Clintonian orthodoxy that are accepted by Democratic elites but loathed by liberal activists on the ground.

In 2008, that became the template for Barack Obama. As late as October 2007, Hillary enjoyed a 33-point lead in national polls. But Obama made her support for the Iraq War a symbol of her alleged timidity in challenging the right-leaning consensus in Washington. As liberals began to see him as embodying the historic change they sought, Obama started raising ungodly amounts via small donors over the Internet, which in turn won him credibility with insiders in Washington. He overwhelmed Hillary Clinton in caucus states, where liberal activists wield greater power. And he overwhelmed her among younger voters. In the 2008 Iowa caucuses, youth turnout rose 30 percent, and among voters under the age of 30, Obama beat Hillary by 46 points.

Hillary starts the 2016 race with formidable strengths. After a widely applauded term as secretary of state, her approval rating is 10 points higher than it was when she began running in 2008. Her vote to authorize Iraq will be less of a liability this time. Her campaign cannot possibly be as poorly managed. And she won’t have to run against Barack Obama.

Still, Hillary is vulnerable to a candidate who can inspire passion and embody fundamental change, especially on the subject of economic inequality and corporate power, a subject with deep resonance among Millennial Democrats. And the candidate who best fits that description is Elizabeth Warren.

First, as a woman, Warren would drain the deepest reservoir of pro-Hillary passion: the prospect of a female president. While Hillary would raise vast sums, Dean and Obama have both shown that in the digital age, an insurgent can compete financially by inspiring huge numbers of small donations. Elizabeth Warren can do that. She’s already shown a knack for going viral. A video of her first Senate banking committee hearing, where she scolded regulators that “too-big-to-fail has become too-big-for-trial,”  garnered 1 million hits on YouTube. In her 2012 Senate race, despite never before having sought elected office, she raised $42 million, more than twice as much as the second-highest-raising Democrat. After Bill Clinton and the Obamas, no other speaker at last summer’s Democratic convention so electrified the crowd.

Warren has done it by challenging corporate power with an intensity Clinton Democrats rarely muster. At the convention, she attacked the “Wall Street CEOs—the same ones who wrecked our economy and destroyed millions of jobs—[who] still strut around Congress, no shame, demanding favors, and acting like we should thank them.”

And in one of the biggest applause lines of the entire convention, taken straight from Occupy, she thundered that “we don’t run this country for corporations, we run it for people.”

Don’t be fooled by Warren’s advanced age. If she runs, Millennials will be her base. No candidate is as well positioned to appeal to the young and economically insecure. Warren won her Senate race by eight points overall, but by 30 points among the young. The first bill she introduced in the Senate was a proposal to charge college students the same interest rates for their loans that the Federal Reserve offers big banks. It soon garnered 100,000 hits on YouTube.

A big reason Warren’s speech went viral was its promotion by Upworthy, a website dedicated to publicizing progressive narratives. And that speaks to another, underappreciated, advantage Warren would enjoy. Clinton Democrats once boasted a potent intellectual and media infrastructure. In the late 1980s and 1990s, the Democratic Leadership Council and its think tank, the Progressive Policy Institute, were the Democratic Party’s hottest ideas shops, and they dedicated themselves to restoring the party’s reputation as business-friendly. Influential New Democratic–aligned magazines like The New Republic and Washington Monthly also championed the cause.

Today, that New Democratic infrastructure barely exists. The DLC has closed down. The New Republic and Washington Monthly have moved left. And all the new powerhouses of the liberal media—from Paul Krugman (who was radicalized during the Bush years) to Jon Stewart (who took over The Daily Show in 1999) to MSNBC (which as late as 2008 still carried a show hosted by Tucker Carlson)—believe the Democrats are too soft on Wall Street.

You can see that shift in the race to chair the Federal Reserve, where the liberal media has rallied behind Janet Yellen and against the more Wall Street–identified Larry Summers. In the age of MSNBC, populist Democrats enjoy a media echo chamber that gives them an advantage over pro-business Democrats that did not exist a decade ago. And if Clinton, who liberal pundits respect, runs against Warren, who liberal pundits revere, that echo chamber will benefit Warren.

Of course, Warren might not run. Or she might prove unready for the national stage. (She has no foreign-policy experience.) But the youthful, anti-corporate passion that could propel her candidacy will be there either way. If Hillary Clinton is shrewd, she will embrace it, and thus narrow the path for a populist challenger. Just as New York, by electing Ed Koch in 1978, foreshadowed a national shift to the right, New York in 2013 is foreshadowing a national shift to the left. The door is closing on the Reagan-Clinton era. It would be ironic if it were a Clinton herself who sealed it shut.

http://www.thedailybeast.com/articles/2013/09/12/the-rise-of-the-new-new-left.html

The Collapse of Journalism, and the Journalism of Collapse

By Robert Jensen, May 2013, The Rag Blog | News Analysis, Truthout

For those who believe that a robust public-affairs journalism is essential for a society striving to be democratic, the 21st century has been characterized by bad news that keeps getting worse.

Whatever one’s evaluation of traditional advertising-supported news media (and I have been among its critics; more on that later), the unraveling of that business model has left us with fewer professional journalists who are being paid a living wage to do original reporting. It’s unrealistic to imagine that journalism can flourish without journalists who have the time and resources to do journalism.

For those who care about a robust human presence on the planet, the 21st century has been characterized by really bad news that keeps getting really, really worse.

Whatever one’s evaluation of high-energy/high-technology civilization (and I have been among its critics; more on that later), it’s now clear that we are hitting physical limits; we cannot expect to maintain contemporary levels of consumption that draw down the ecological capital of the planet at rates dramatically beyond replacement levels. It’s unrealistic to imagine that we can go on treating the planet as nothing more than a mine from which we extract and a landfill into which we dump.

We have no choice but to deal with the collapse of journalism, but we also should recognize the need for a journalism of collapse. Everyone understands that economic changes are forcing a refashioning of the journalism profession. It’s long past time for everyone to pay attention to how multiple, cascading ecological crises should be changing professional journalism’s mission in even more dramatic fashion.

It’s time for an apocalyptic journalism (that takes some explaining; a lot more on that later).

The basics of journalism: Ideals and limitations

With the rapid expansion of journalistic-like material on the Internet, it’s especially crucial to define “real” journalism. In a democratic system, ideally journalism is a critical, independent source of information, analysis, and the varied opinions needed by citizens who want to play a meaningful role in the formation of public policy.

The key terms are “critical” and “independent” — to fulfill the promise of a free press, journalists must be willing to critique not only specific people and policies, but the systems out of which they emerge, and they must be as free as possible from constraining influences, both overt and subtle.

Also included in that definition of journalism is an understanding of democracy — “a meaningful role in the formation of public policy” — as more than just lining up to vote in elections that offer competing sets of elites who represent roughly similar programs. Meaningful democracy involves meaningful participation.

This discussion will focus on what is typically called mainstream journalism, the corporate-commercial news media. These are the journalists who work for daily newspapers, broadcast and cable television, and the corporately owned platforms on the internet and other digital devices.

Although there are many types of independent and alternative journalism of varying quality, the vast majority of Americans continue to receive the vast majority of their news from these mainstream sources, which are almost always organized as large corporations and funded primarily by advertising.

Right-wing politicians and commentators sometimes refer to the mainstream media as the “lamestream,” implying that journalists are comically incompetent and incapable of providing an accurate account of the world, likely due to a lack of understanding of conservative people and their ideas. While many elite journalists may be dismissive of the cultural values of conservatives, this critique ignores the key questions about journalism’s relationship to power.

Focusing on the cultural politics of individual reporters and editors — pointing out that they tend to be less religious and more supportive of gay and women’s rights than the general public, for example — diverts attention from more crucial questions about how the institutional politics of corporate owners and managers shapes the news and keeps mainstream journalism within a centrist/right conventional wisdom.

The managers of commercial news organizations in the United States typically reject that claim by citing the unbreachable “firewall” between the journalistic and the business sides of the operation, which is supposed to allow journalists to pursue any story without interference from the corporate front office.

This exchange I had with a newspaper editor captures the ideology: After listening to my summary of this critique of the U.S. commercial news media system, this editor (let’s call him Joe) told me proudly: “No one from corporate headquarters has ever called me to tell me what to run in my paper.” I asked Joe if it were possible that he simply had internalized the value system of the folks who run the corporation (and, by extension, the folks who run most of the world), and therefore they never needed to give him direct instructions.

He rejected that, reasserting his independence from any force outside his newsroom.

I countered: “Let’s say, for the purposes of discussion, that you and I were equally capable journalists in terms of professional skills, that we were both reasonable candidates for the job of editor-in-chief that you hold. If we had both applied for the job, do you think your corporate bosses would have ever considered me for the position, given my politics? Would I, for even a second, have been seen by them to be a viable candidate for the job?”

Joe’s politics are pretty conventional, well within the range of mainstream Republicans and Democrats — he supports big business and U.S. supremacy in global politics and economics. I’m a critic of capitalism and U.S. foreign policy. On some political issues, Joe and I would agree, but we diverge sharply on these core questions of the nature of the economy and the state.

Joe pondered my question and conceded that I was right, that his bosses would never hire someone with my politics, no matter how qualified, to run one of their newspapers. The conversation trailed off, and we parted without resolving our differences.

I would like to think my critique at least got Joe to question his platitudes, but I never saw any evidence of that. In his subsequent writing and public comments that I read and heard, Joe continued to assert that a news media system dominated by for-profit corporations was the best way to produce the critical, independent journalism that citizens in a democracy needed.

Because he was in a position of some privilege and status, nothing compelled Joe to respond to my challenge.

Partly as a result of many such unproductive conversations, I continue to search for new ways to present a critique of mainstream journalism that might break through that ideological wall. In addition to thinking about alternatives to this traditional business model, we should confront the limitations of the corresponding professional model, with its status-quo-supportive ideology of neutrality, balance, and objectivity.

Can we create conditions under which journalism — deeply critical and truly independent — can flourish in these trying times?

In this essay I want to try out theological concepts of the royal, prophetic, and apocalyptic traditions. Though journalism is a secular institution, religion can provide a helpful vocabulary. The use of these terms is not meant to imply support for any particular religious tradition, or for religion more generally, but only recognizes that the fundamental struggles of human history play out in religious and secular settings, and we can learn from all of that history.

With a focus on the United States, I’ll draw on the concepts as they are understood in the dominant U.S. tradition of Judaism and Christianity.

Royal journalism

Most of today’s mainstream corporate-commercial journalism — the work done by people such as Joe — is royal journalism, using the term “royal” not to describe a specific form of executive power but to describe a system that centralizes authority and marginalizes the needs of ordinary people.

The royal tradition describes ancient Israel, the Roman empire, European monarchs, or contemporary America — societies in which those with concentrated wealth and power can ignore the needs of the bulk of the population, societies where the wealthy and powerful offer platitudes about their beneficence as they pursue policies to enrich themselves.

In his books The Prophetic Imagination and The Practice of Prophetic Imagination, theologian Walter Brueggemann points out that this royal consciousness took hold after ancient Israel sank into disarray, when Solomon overturned Moses — affluence, oppressive social policy, and static religion replaced a God of liberation with one used to serve an empire.

This consciousness develops not only in top leaders but throughout the privileged sectors, often filtering down to a wider public that accepts royal power. Brueggemann labels this a false consciousness: “The royal consciousness leads people to numbness, especially to numbness about death.”

The inclusion of the United States in a list of royalist societies may seem odd, given the democratic traditions of the country, but consider a nation that has been at war for more than a decade, in which economic inequality and the resulting suffering have dramatically deepened for the past four decades, in which climate change denial has increased as the evidence of the threat becomes undeniable. Brueggemann describes such a culture as one that is “competent to implement almost anything and to imagine almost nothing.”

Almost all mainstream corporate-commercial journalism is, in this sense, royal journalism. It is journalism without the imagination needed to move outside the framework created by the dominant systems of power. CNN, MSNBC, and FOX News all practice royal journalism. The New York Times is ground zero for royal journalism.

Marking these institutions as royalist doesn’t mean that no good journalism ever emerges from them, or that they employ no journalists who are capable of challenging royal arrangements. Instead, the term recognizes that these institutions lack the imagination necessary to step outside of the royal consciousness on a regular basis. Over time, they add to the numbness rather than jolt people out of it.

The royal consciousness of our day is defined by unchallengeable commitments to a high-energy/high-technology worldview, within a hierarchical economy, run by an imperial nation-state. These technological, economic, and national fundamentalisms produce a certain kind of story about ourselves, which encourages the belief that we can have anything we want without obligations to other peoples or other living things, and that we deserve this.

Brueggemann argues that this bolsters notions of “U.S. exceptionalism that gives warrant to the usurpatious pursuit of commodities in the name of freedom, at the expense of the neighbor.”

If one believes royal arrangements are just and sustainable, then royal journalism could be defended. If the royal tradition is illegitimate, then a different journalism is necessary.

Prophetic journalism 

Given the multiple crises that existing political, economic, and social systems have generated, the ideals of journalism call for a prophetic journalism. The first step in defending that claim is to remember what real prophets are not: They are not people who predict the future or demand that others follow them in lockstep.

In the Hebrew Bible and Christian New Testament, prophets are the figures who remind the people of the best of the tradition and point out how the people have strayed. In those traditions, using our prophetic imagination and speaking in a prophetic voice requires no special status in society, and no sense of being special. Claiming the prophetic tradition requires only honesty and courage.

When we strip away supernatural claims and delusions of grandeur, we can understand the prophetic as the calling out of injustice, the willingness not only to confront the abuses of the powerful but to acknowledge our own complicity. To speak prophetically requires us first to see honestly — both how our world is structured by systems that create unjust and unsustainable conditions, and how we who live in the privileged parts of the world are implicated in those systems.

To speak prophetically is to refuse to shrink from what we discover or from our own place in these systems. We must confront the powers that be, and ourselves.

The Hebrew Bible offers us many models. Amos and Hosea, Jeremiah and Isaiah — all rejected the pursuit of wealth or power and argued for the centrality of kindness and justice. The prophets condemned corrupt leaders but also called out all those privileged people in society who had turned from the demands of justice, which the faith makes central to human life.

In his analysis of these prophets, the scholar and activist Rabbi Abraham Joshua Heschel concluded:

Above all, the prophets remind us of the moral state of a people: Few are guilty, but all are responsible. If we admit that the individual is in some measure conditioned or affected by the spirit of society, an individual’s crime discloses society’s corruption.

Critical of royal consciousness, Brueggemann argues that the task of those speaking prophetically is to “penetrate the numbness in order to face the body of death in which we are caught” and “penetrate despair so that new futures can be believed in and embraced by us.” He encourages preachers to think of themselves as “handler[s] of the prophetic tradition,” a job description that also applies to other intellectual professions, including journalism.

Brueggemann argues that this isn’t about intellectuals imposing their views and values on others, but about being willing to “connect the dots”:

Prophetic preaching does not put people in crisis. Rather it names and makes palpable the crisis already pulsing among us. When the dots are connected, it will require naming the defining sins among us of environmental abuse, neighborly disregard, long-term racism, self-indulgent consumerism, all the staples from those ancient truthtellers translated into our time and place.

None of this requires journalists to advocate for specific politicians, parties, or political programs; we don’t need journalists to become propagandists. Journalists should strive for real independence but not confuse that with an illusory neutrality that traps mainstream journalists within ideological boundaries defined by the powerful.

Again, real independence means the ability to critique not just the worst abuses by the powerful within the systems, but to critique the systems themselves.

This prophetic calling is consistent with the aphorism many journalists claim as a shorthand mission statement: The purpose of journalism is to comfort the afflicted and afflict the comfortable. That phrase focuses on injustice within human societies, but what of the relationship of human beings to the larger living world? How should journalists understand their mission in that arena?

Ecological realities

Let’s put analysis of journalism on hold and think about the larger world in which journalism operates. Journalistic ideals and norms should change as historical conditions change, and today that means facing tough questions about ecological sustainability.

There is considerable evidence to help us evaluate the health of the ecosphere on which our own lives depend, and an honest evaluation of that evidence leads to a disturbing conclusion: Life as we know it is almost over. That is, the high-energy/high-technology life that we in the affluent societies live is a dead-end.

There is a growing realization that we have disrupted planetary forces in ways we cannot control and do not fully understand. We cannot predict the specific times and places where dramatic breakdowns will occur, but we can know that the living system on which we depend is breaking down.

Does that seem histrionic? Excessively alarmist? Look at any crucial measure of the health of the ecosphere in which we live — groundwater depletion, topsoil loss, chemical contamination, increased toxicity in our own bodies, the number and size of “dead zones” in the oceans, accelerating extinction of species and reduction of bio-diversity — and the news is bad.

Add to that the mother of all ecological crises — global warming, climate change, climate disruption — and it’s clear that we are creating a planet that cannot indefinitely support a large-scale human presence living this culture’s idea of the good life.

We also live in an oil-based world that is rapidly depleting the cheap and easily accessible oil, which means we face a huge reconfiguration of the infrastructure that undergirds our lives. Meanwhile, the desperation to avoid that reconfiguration has brought us to the era of “extreme energy” using even more dangerous and destructive technologies (hydrofracturing, deep-water drilling, mountain-top removal, tar sands extraction) to get at the remaining hydrocarbons.

Where are we heading? Off the rails? Into the wall? Over the cliff? Pick your favorite metaphor. Scientists these days are talking about tipping points and planetary boundaries, about how human activity is pushing the planet beyond its limits.

Recently, 22 top scientists warned in the prestigious journal Nature that humans likely are forcing a planetary-scale critical transition “with the potential to transform Earth rapidly and irreversibly into a state unknown in human experience.” That means that “the biological resources we take for granted at present may be subject to rapid and unpredictable transformations within a few human generations.”

That means that we’re in trouble, not in some imaginary science-fiction future, but in our present reality. We can’t pretend all that’s needed is tinkering with existing systems to fix a few environmental problems; significant changes in how we live are required. No matter where any one of us sits in the social and economic hierarchies, there is no escape from the dislocations that will come with such changes.

Money and power might insulate some from the most wrenching consequences of these shifts, but there is no permanent escape. We do not live in stable societies and no longer live on a stable planet. We may feel safe and secure in specific places at specific times, but it’s hard to believe in any safety and security in a collective sense.

In short, we live in apocalyptic times.

Apocalypse

To be clear: Speaking apocalyptically need not be limited to claims that the world will end on a guru’s timetable or according to some allegedly divine plan. Lots of apocalyptic visions — religious and secular — offer such certainty, imagining the replacement of a corrupt society by one structured on principles that will redeem humanity (or at least redeem those who sign onto the principles). But this need not be our only understanding of the term.

Most discussions of revelation and apocalypse in contemporary America focus on the Book of Revelation, also known as The Apocalypse of John, the final book of the Christian New Testament. The two terms are synonymous in their original meaning; “revelation” from Latin and “apocalypse” from Greek both mean a lifting of the veil, a disclosure of something hidden from most people, a coming to clarity.

Many scholars interpret the Book of Revelation not as a set of predictions about the future but as a critique of the oppression of the empire of that day, Rome.

To speak apocalyptically, in this tradition, is first and foremost about deepening our understanding of the world, seeing through the obfuscations of people in power. In our propaganda-saturated world (think about the amount of advertising, public relations, and marketing that we are bombarded with daily), coming to that kind of clarity about the nature of the empires of our day is always a struggle, and that notion of revelation is more crucial than ever.

Thinking apocalyptically, coming to this clarity, will force us to confront crises that concentrated wealth and power create, and reflect on our role in these systems. Given the severity of the human assault on the ecosphere, compounded by the suffering and strife within the human family, honest apocalyptic thinking that is firmly grounded in a systematic evaluation of the state of the world is not only sensible but a moral obligation.

Rather than thinking of revelation as divine delivery of a clear message about some fantastic future above, we can engage in an ongoing process of revelation that results from an honest struggle to understand, a process that requires a lot of effort.

Things are bad, systems are failing, and the status quo won’t last forever. Thinking apocalyptically in this fashion demands of us considerable courage and commitment. This process will not produce definitive answers but rather help us identify new directions.

Again, to be very clear: “Apocalypse” in this context does not mean lakes of fire, rivers of blood, or bodies lifted up to heaven. The shift from the prophetic to the apocalyptic can instead mark the point when hope in the viability of existing systems is no longer possible and we must think in dramatically new ways.

Invoking the apocalyptic recognizes the end of something. It’s not about rapture but a rupture severe enough to change the nature of the whole game.

Apocalyptic journalism

The prophetic imagination helps us analyze the historical moment we’re in, but it’s based on an implicit faith that the systems in which we live can be reshaped to stop the worst consequences of the royal consciousness, to shake off that numbness of death in time.

What if that is no longer possible? Then it is time to think about what’s on the other side. “The arc of the moral universe is long, but it bends toward justice,” said Martin Luther King, Jr., one of the more well-known voices in the prophetic tradition. But if the arc is now bending toward a quite different future, a different approach is needed.

Because no one can predict the future, these two approaches are not mutually exclusive; people should not be afraid to think prophetically and apocalyptically at the same time. We can simultaneously explore immediate changes in the existing systems and think about new systems.

Invoking the prophetic in the face of royal consciousness does not promise quick change and a carefree future, but it implies that a disastrous course can be corrected. But what if the justification for such hope evaporates? When prophetic warnings have not been heeded, what comes next? This is the time when an apocalyptic sensibility is needed.

Fred Guterl, the executive editor of Scientific American, models that spirit in his book The Fate of the Species. Though he describes himself as being on the “techno-optimistic side of the spectrum,” he does not shy away from a blunt discussion of the challenges humans face:

There’s no going back on our reliance on computers and high-tech medicine, agriculture, power generation, and so forth without causing vast human suffering — unless you want to contemplate reducing the world population by many billions of people. We have climbed out on a technological limb, and turning back is a disturbing option. We are dependent on our technology, yet our technology now presents the seeds of our own destruction. It’s a dilemma. I don’t pretend to have a way out. We should start by being aware of the problem.

I don’t share Guterl’s techno-optimism, but it strikes me as different from a technological fundamentalism (the quasi-religious belief that the use of advanced technology is always a good thing and that any problems caused by the unintended consequences of such technology can be remedied by more technology) that assumes that humans can invent themselves out of any problem. Guterl doesn’t deny the magnitude of the problems and recognizes the real possibility, perhaps even the inevitability, of massive social dislocation:

[W]e’re going to need the spirit with which these ideas were hatched to solve the problems we have created. Tossing aside technological optimism is not a realistic option. This doesn’t mean technology is going to save us. We may still be doomed. But without it, we are surely doomed.

Closer to my own assessment is James Lovelock, a Fellow of the Royal Society, whose work led to the detection of the widespread presence of CFCs in the atmosphere. Most famous for his “Gaia hypothesis” that understands both the living and non-living parts of the earth as a complex system that can be thought of as a single organism, he suggests that we face these stark realities immediately:

The great party of the twentieth century is coming to an end, and unless we now start preparing our survival kit we will soon be just another species eking out an existence in the few remaining habitable regions. … We should be the heart and mind of the Earth, not its malady. So let us be brave and cease thinking of human needs and rights alone and see that we have harmed the living Earth and need to make our peace with Gaia.

Anything that blocks us from looking honestly at reality, no matter how harsh the reality, must be rejected. It’s a lot to ask, of people and of journalists, to not only think about this, but put it at the center of our lives. What choice do we have? To borrow from one of 20th century America’s most honest writers, James Baldwin, “Not everything that is faced can be changed; but nothing can be changed until it is faced.”

That line is from an essay titled “As Much Truth as One Can Bear,” about the struggles of artists to help a society, such as white-supremacist America, face the depth of its pathology. Baldwin suggested that a great writer attempts “to tell as much of the truth as one can bear, and then a little more.” If we think of Baldwin as sounding a prophetic call, an apocalyptic invocation would be “to tell as much of the truth as one can bear, and then all the rest of the truth, whether we can bear it or not.”

That task is difficult enough when people are relatively free to pursue inquiry without external constraints. Are the dominant corporate-commercial/advertising-supported media outlets likely to encourage journalists to pursue the projects that might lead to such questions? If not, the apocalyptic journalism we need is more likely to emerge from the margins, where people are not trapped by illusions of neutrality or concerned about professional status.

[INSERT HOPEFUL ENDING HERE]

That subhead is not an editing oversight. I wish there were an easy solution, an upbeat conclusion. I don’t have one. I’ve never heard anyone else articulate one. To face the world honestly at this moment in human history likely means giving up on easy and upbeat.

The apocalyptic tradition reminds us that the absence of hope does not have to leave us completely hopeless, that life is always at the same time about death, and then rejuvenation. If we don’t have easy, upbeat solutions and conclusions, we have the ability to keep telling stories of struggle. Our stories do not change the physical world, but they have the potential to change us. In that sense, the poet Muriel Rukeyser was right when she said, “The universe is made of stories, not of atoms.”

To think apocalyptically is not to give up on ourselves, but only to give up on the arrogant stories that we modern humans have been telling about ourselves. The royal must give way to the prophetic and the apocalyptic. The central story that power likes to tell — that the domination/subordination dynamic that structures so much of modern life is natural and inevitable — must give way to stories of dignity, solidarity, equality. We must resist not only the cruelty of repression but the seduction of comfort.

The best journalists in our tradition have seen themselves as responsible for telling stories about the struggle for social justice. Today, we can add stories about the struggle for ecological sustainability to that mission. Our hope for a decent future — indeed, any hope for even the idea of a future — depends on our ability to tell stories not of how humans have ruled the world but how we can live in the world.

Whether or not we like it, we are all apocalyptic now.

This article was also published at AlterNet.

http://truth-out.org/news/item/16475-the-collapse-of-journalism-and-the-journalism-of-collapse

Critical thinking or ignorance

37 Percent of People Don’t Have a Clue About What’s Going on By Mark Morford, San Francisco Chronicle, March 20, 2013...about 37 percent of Americans just are not very bright. Or rather, quite shockingly dumb…reading anything even remotely complex or analytical is something only 42 percent of the population enjoy doing on a regular basis, which is why most TV shows, all reality shows, many major media blogs and all of Fox News is scripted for a 5th-grade education/attention span…The smarter you are, the less rigid/more liberal you become…

The Decline of Critical Thinking, The Problem of Ignorance by Lawrence Davidson, CounterPunch Weekend Edition April 5–7, 2013 — the habit of asking critical questions can be taught. However, if you do not have a knowledge base from which to consider a situation, it is hard to think critically about it. So ignorance often precludes effective critical thinking even if the technique is acquired… loyalty comes from myth-making and emotional bonds. In both cases, really effective critical thinking might well be incompatible with the desired end…The truth is that people who are consistently active as critical thinkers are not going to be popular, either with the government or their neighbors.…

Disrespect, Race and Obama By CHARLES M. BLOW, New York Times, November 15, 2013 …No one has ever accused [Rush] Limbaugh of being a complex thinker, but the intellectual deficiency required to achieve that level of arrogance and ignorance is staggering…simpletons have simple understandings of complex concepts…

Why We Need New Ways of Thinking by Barry Boyce, Shambhala Sun, September 2008 – The same old thing doesn’t work… because when it comes to complex, tough problems—global warming, food crises, civil war, terror, drugs, urban decay, persistent poverty—we have to go beyond the approaches that got us there in the first place…a loose but growing collection of thinkers, activists, academics, and social entrepreneurs who are searching for the “unthinkable”—the new ways that we can’t see because of our old ways of looking…they all firmly believe that the good old world we’ve come to know and love is coming apart at the seams. Systems of all kinds are breaking down and will continue to do so. In response, they champion ways of seeing and acting that acknowledge that the world is a chaotic, deeply interdependent place, a place that won’t yield to attempts to overpower it. We must come to understand, they argue, the nature of complexity, chaos, and interconnectedness—and to train ourselves in ways of acting that embrace this unmistakable reality.

Dark Ages Redux: American Politics and the End of the Enlightenment by John Atcheson, Common Dreams, June 18, 2012 … Much of what has made the modern world in general, and the United States in particular, a free and prosperous society comes directly from insights that arose during the Enlightenment. Too bad we’re chucking it all out and returning to the Dark Ages. …Now, we seek to operate by revealed truths, not reality. Decrees from on high – often issued by an unholy alliance of religious fundamentalists, self-interested corporations, and greedy fat cats – are offered up as reality by rightwing politicians… Second, the Enlightenment laid the groundwork for our form of government. The Social Contract is the intellectual basis of all modern democratic republics, including ours… our founders used reason, empiricism and academic scholarship to cobble together one of the most enduring and influential documents in human history. For all its flaws, it has steered us steadily toward a more perfect union. Until recently…We are, indeed, at an epochal threshold. We can continue to discard the Enlightenment values which enabled both an untold increase in material wealth and a system of government which turned serfs into citizens. A system which – for all its flaws – often managed to protect the rights of the many, against the predatory power of the few. Or we can continue our abject surrender to myths, magical thinking, and self-delusion and the Medieval nation-state those forces are resurrecting. Republicans and Tea Partiers may be leading this retreat from reason, but they are unopposed by Democrats or the Press. And in the end, there is a special place in Hell for those who allow evil to prosper by doing nothing.

The Willful Ignorance That Has Dragged the US to the Brink by Sarah Churchwell, The Independent/UK, August 2, 2011 - The Tea Party version of the American Revolution is not just fundamentalist. It is also Disneyfied, sentimentalized, and whitewashed… In one sense, it is difficult to know what to say in response to the utter irrationality of the Tea Party’s self-destructive decision to sabotage the American political process – and thus its own country’s economy, and the global economy…the Tea Party has never let facts get in the way of its belief system, and now that belief system is genuinely threatening the well-being of the nation they claim to love…

The Politics of Lying and the Culture of Deceit in Obama’s America: The Rule of Damaged Politics by Henry A. Giroux - September 21, 2009 — In the current American political landscape, truth is not merely misrepresented or falsified; it is overtly mocked… it becomes difficult to distinguish between an opinion and an argument …At a time when education is reduced to training workers and is stripped of any civic ideals and critical practices, it becomes unfashionable for the public to think critically. Rather than intelligence uniting us, a collective ignorance of politics, culture, the arts, history and important social issues, as Mark Slouka points out, “gives us a sense of community, it confers citizenship.”…matters of judgment, thoughtfulness, morality and compassion seem to disappear from public view. What is the social cost of such flight from reality, if not the death of democratic politics, critical thought and civic agency? …Obama’s presence on the national political scene gave literacy, language and critical thought a newfound sense of dignity, interlaced as they were with a vision of hope, justice and possibility — and reasonable arguments about the varied crises America faced and civilized… The politics of lying and the culture of deceit are wrapped in the logic of absolute certainty… Democracy is fragile, and its fate is always uncertain… We now find ourselves living in a society in which right-wing extremists not only wage a war against the truth, but also seek to render human beings less than fully human by taking away their desire for justice, spiritual meaning, freedom and individuality…

The Virus of GOP Ignorance: Why Don’t Media Protect Us From the Lies Spewed in the Republican Primary? November 23, 2011 …Gingrich…Cain and Bachmann…Pretending that these people might be president, and hence deserve to be treated as if what they say is true, is not merely unjustified…but akin to playing accessory to a kind of ongoing intellectually criminal activity….people like alleged “historian” David Barton… and psychologist James Dobson…proffer phony-baloney history lessons that distort almost everything professional historians know to be true about America’s founders. Reporters representing reliable media outlets are supposed to defend the discourse from the virus of this ignorance. But for a variety of reasons they no longer do so. Part of the explanation can be found in the foolish willingness of so many reporters to treat Fox News, Drudge and various talk-radio hosts as respectable voices in the debate without regard to their motives or qualifications. A second, no less significant problem is the tendency of even the most sophisticated political reporters to treat the entire process as a contest between rival teams and ignore the substance of their arguments and policies, as if politics were simply a spectator sport… reformed right-winger David Frum writes that a “political movement that never took governing seriously was exploited by a succession of political entrepreneurs uninterested in governing—but all too interested in merchandising.…these tea-party champions provide a ghoulish type of news entertainment each time they reveal that they know nothing about public affairs and have never attempted to learn.”

Shameless GOP Lies: Is There Any Limit to What Republicans Will Say — And What People Will Believe? By Ernest Partridge, The Crisis Papers, Alternet.org, April 20, 2011 — Is there any limit to the outrageousness of the GOP lies? Is there any limit to the capacity of a large number of our fellow citizens to accept these lies?…a long string of Republican lies thrown at the public by right-wing politicians and pundits and largely unchallenged by a compliant corporate media. Among them:…Barack Obama was born in Kenya and is a secret Muslim. Saddam Hussein possessed weapons of mass destruction and was involved in the attacks of September 11, 2001…Global warming is a gigantic hoax, perpetrated by thousands of deceitful scientists. Obama has raised taxes…These are not “matters of opinion,” they are flatly and demonstrably false. Clear and decisive refutations of all these claims are available to anyone who cares to examine the evidence. As Daniel Patrick Moynihan famously remarked, while we are entitled to our own opinions, we are not entitled to our own facts…finally, there is the “dogma” — “Government is not the solution to our problem, government is the problem.” (Ronald Reagan). Market fundamentalism: “A free market [co-ordinates] the activity of millions of people, each seeking his own interest, in such a way as to make everyone better off.” (Milton Friedman) Privatization: “Whenever we find an approach to the extension of private property rights [in the natural environment,] we find superior results.” (Robert J. Smith). “There is no such thing as society.” (Margaret Thatcher) “There is no such entity as ‘the public.’” (Ayn Rand) These last two dogmas bear significant implications. For if there is no such thing as “society,” it follows that there are no social problems or “social injustice.” Poverty is the fault of individuals who are sinful and lazy. And if there is no “public,” then there is no “public interest,” and thus no need for government to promote same. A large portion of the American public believes these lies, accepts these contradictions, and embraces these dogmas, not because of supporting evidence (there is none) or cogent arguments (there are none), but out of sheer unquestioned repetition in the corporate media. …as long as …millions accept uncritically the lies, myths and dogmas fed to them by the mega-corporations that own our government, there appears to be little hope of a return to economic justice and democratic government that we once enjoyed in the United States of America. But all is not lost…liars tend through time to lose their credibility. We should strive to accelerate this process as it applies to the corporate media by exposing the lies and boycotting the sponsors of those who tell the lies…The restoration of sanity in our public discourse is essential to the restoration of our democracy.

In Ignorance We Trust by Timothy Egan, New York Times, December 13, 2012 — history, the formal teaching and telling of it, has never been more troubled …we are raising a generation of Americans who are historically illiterate.…And today, in part by design, there’s a lot of know-nothingness throughout the land… in the great void between readable histories and snooze-fest treatises have stepped demagogues with agendas, from Glenn Beck and his paranoid writings on the perils of progressivism, to Oliver Stone and his highly selective retelling of the 20th century.…

The New Know Nothing Party and the High Price of Willful Ignorance by John Atcheson, Common Dreams, February 19, 2013 — Ignorance: The condition of being uneducated, unaware, or uninformed. Here in the 21st Century the Republicans have become the new Know Nothing Party… Just as the original Know Nothings employed fear, bigotry, ignorance and hate to motivate their base, so too does the Republican Party… one area of willful ignorance eclipses all others in terms of its denial of fact and the consequences of that denial: Climate change. The scientific consensus is clear at this point, and it’s backed up by empirical evidence…We are trading away children’s future.…So we have a clear and present danger, a strong scientific consensus, and empirical evidence that we are on the verge – or well into – irrevocable global disasters of epic proportion. How does the Party of Willful Ignorance respond? With intentional ignorance, of course. The question is why. And the answer is simple. They sell ignorance because it is in the interests of their true constituency – the uber wealthy and the corporate special interests. While tackling climate change would avoid catastrophic costs and create jobs, it will hurt the coal, oil and gas interests. Austerity preserves the status quo on who has and who doesn’t. Them that has would continue to get, them that doesn’t would lose even more. Hawking government as the problem lets them turn over trillions in retirement and health care profits to the private sector while increasing your individual debt. It converts education from a public right to a private profit center. It allows them to justify cutting back on regulations so banks can once again screw you with impunity. It trashes your air, pollutes your water, and destroys your climate… We the people must seize control of the political process with our votes. Progressive principles won the recent election, and a majority of Americans support progressive positions on a case-by-case basis. If we fail to translate these individual beliefs into broader political practices and votes, it’s due largely to the mainstream media, which regularly presents the “debate” about issues as opposed to the reality of the issues. Thus, Rubio can still trot out the entire Republican playbook of lies and willful ignorance…As a result, our democracy is diminished; our children’s world is compromised; and our economy remains in service only to the uber rich.

The Right’s Stupidity Spreads, Enabled by a Too-Polite Left by George Monbiot, The Guardian/UK, February 7, 2012 – …we have been too polite to mention the Canadian study published last month in the journal Psychological Science, which revealed that people with conservative beliefs are likely to be of low intelligence.…There is plenty of research showing that low general intelligence in childhood predicts greater prejudice towards people of different ethnicity or sexuality in adulthood. Open-mindedness, flexibility, trust in other people: all these require certain cognitive abilities. Understanding and accepting others – particularly “different” others – requires an enhanced capacity for abstract thinking.… tends not to arise directly from low intelligence but from the conservative ideologies to which people of low intelligence are drawn. Conservative ideology is the “critical pathway” from low intelligence to racism.…This is not to suggest that all conservatives are stupid. There are some very clever people in government, advising politicians, running think tanks and writing for newspapers, who have acquired power and influence by promoting rightwing ideologies…they now appeal to the basest, stupidest impulses… former Republican ideologue David Frum warns that “conservatives have built a whole alternative knowledge system, with its own facts, its own history, its own laws of economics”. The result is a “shift to ever more extreme, ever more fantasy-based ideology” which has “ominous real-world consequences for American society”… Confronted with mass discontent, the once-progressive major parties… triangulate and accommodate, hesitate and prevaricate… They fail to produce a coherent analysis of what has gone wrong and why, or to make an uncluttered case for social justice, redistribution and regulation. The conceptual stupidities of conservatism are matched by the strategic stupidities of liberalism. Yes, conservatism thrives on low intelligence and poor information. But the liberals in politics on both sides of the Atlantic continue to back off, yielding to the supremacy of the stupid…


Kierkegaard at 200

by Gordon Marino, New York Times, May 3, 2013

Northfield, Minnesota

THE intellectual immortal Soren Kierkegaard turns 200 on Sunday. The lyrical Danish philosopher is widely regarded as the father of existentialism, a philosophical and literary movement that emphasizes the category of the individual and meditates on such gauzy questions as, Is there a meaning to life?

Not surprisingly, existentialism hit its zenith after humanity got a good look at itself in the mirror of the Holocaust, but then memories faded and economies boomed and existentialism began to seem a little overwrought.

Still, throughout the ups and downs of the scholarly market, the intellectual world has remained bullish on Kierkegaard, in part because the Dane, unlike other members of the Socrates guild, always addressed what human beings are really up against in themselves, namely, anxiety, depression, despair and the flow of time.

There are at least a dozen scholarly fests going on around the globe to celebrate Kierkegaard’s bicentennial, and here at the Hong Kierkegaard Library at St. Olaf College we are expecting over 100 international participants at our birthday bash.

In his youth, Kierkegaard earned the nickname “gaflen,” or “the fork,” for his ability to discern the weaknesses in other people and to stick it to them. All his writing life, Kierkegaard wielded his red-hot stylus to stick it to bourgeois Christendom. His life was a meditation on what it means to have faith.

Although Kierkegaard never used the exact phrase, “the leap of faith,” those words have become his shibboleth. A Lutheran raised in a pietistic environment, Kierkegaard insisted that there was no being born into the fold; no easy passage, no clattering up a series of syllogisms to faith. For Kierkegaard, faith involved a collision with the understanding and a radical choice, or to use the terms of his singular best seller, life and faith demand an “Either/Or.” Believe or don’t believe, but don’t imagine you can have it both ways. As the mostly empty pews attest, much of Europe has taken Kierkegaard up on his challenge.

But Kierkegaard was more than a Luther of his Lutheran tradition; his writings bristle with insights about culture and humanity that can be redeemed in the currency of secularism.

For instance, Kierkegaard flourished at the inception of mass media. Daily and weekly journals and newspapers were just beginning to circulate widely. As though he could feel Facebook and Twitter coming down the line, he anticipated a time when communication would become instantaneous, but no one would have anything to say; or again, a time when everyone was obsessed with finding their voice but without much substance or “inwardness” behind their eruptions and blogposts.

In one of his books Kierkegaard moans, “The present age is an age of publicity, the age of miscellaneous announcements: Nothing happens but still there is instant publicity.” In the end, Kierkegaard was concerned about the power of the press to foment and form public opinion and in the process relieve us of the need to think matters through on our own.

Over a 17-year span, Kierkegaard published a score of books and compiled thousands of pages of journal entries. Like Nietzsche and other geniuses who were more or less immolated by the fiery force of their own ideas, Kierkegaard sacrificed his body to dance out the riches of his thoughts. Self-conscious of his own preternatural powers, he wrote, “Geniuses are like thunderstorms. They go against the wind, terrify people, cleanse the air.”

Of course, we have all known hours of inspiration, but to live with the Muse on one’s shoulders year after year is to be able to abide close to the borders of what must sometimes feel like madness. Not an easy task.

Just to pluck a couple of plums from the sprawling tree of Kierkegaard’s extraordinary oeuvre: He was not what we would term an ethicist. He did not devote his energies to trying to find a rational basis for ethics, nor was he occupied with teasing out moral puzzles. And yet, he has something very significant to say about ethics.

According to Kierkegaard, it is not more knowledge or skills of analysis that is required to lead dutiful lives. If knowledge were the issue, then ethics would be a kind of talent. Some people would be born moral geniuses and others moral dolts. And yet so far as he and Kant were concerned, when it comes to good and evil we are all on an equal footing.

Contrary to the ethics industry that is thrumming today, Kierkegaard believed that the prime task is to hold on to what we know, to refrain from pulling the wool over our own eyes because we aren’t keen on the sacrifices that doing the right thing is likely to require. As Kierkegaard put it, when we find ourselves in a moral pinch we don’t just take the easy way out. Instead, he writes, “Willing allows some time to elapse, an interim called: We shall look at it tomorrow.”

And by the day after tomorrow, we usually decide that the right way was, after all, the easy way. And so the gadfly of Copenhagen concludes, “And this is how perhaps the great majority of men live: They work gradually at eclipsing their ethical and ethical-religious comprehension, which would lead them out into decisions and conclusions that their lower nature does not much care for.”

There are epiphanies in every nook and cranny of Kierkegaard’s authorship, but paradoxically enough, from the beginning to the end, the man who seemed to know just about everything, was gently emphatic: “The majority of men … live and die under the impression that life is simply a matter of understanding more and more, and that if it were granted to them to live longer, that life would continue to be one long continuous growth in understanding. How many of them ever experience the maturity of discovering that there comes a critical moment where everything is reversed, after which the point becomes to understand more and more that there is something which cannot be understood.”

And what might it mean to understand that there is something of vital importance in life that cannot be understood? With the indirection of a Zen master, our Birthday Boy helps us unlock this and other koans that strike to the marrow of the question of what it means to be a human being.

Gordon Marino is a professor of philosophy and the director of the Hong Kierkegaard Library at St. Olaf College. His most recent book, “The Quotable Kierkegaard,” will be published in the fall.

http://www.nytimes.com/2013/05/04/opinion/global/Kierkegaard-at-200.html?pagewanted=all&_r=0

Welcome to the Data Driven World

By Joseph Marks, nextgov.com, April 5, 2013

University of Wisconsin geologist Shanan Peters was frustrated by how much he didn’t know.

Most geological discoveries were locked away in troves of research journals so voluminous that he and his colleagues could read only a fraction of them. The sheer magnitude of existing research forced most geologists to limit the scope of their work so they could reasonably grasp what had already been done in the field. Research that received little notice when it was published too often was consigned to oblivion, wasting away in dusty journals, even if it could benefit contemporary scientists.

A decade ago, Peters would have had to accept his field’s human limitations. That’s no longer the case. In the summer of 2012, he teamed up with two University of Wisconsin computer scientists on a project they call GeoDeepDive.

The computer system built by professors Miron Livny and Christopher Re will pore over scanned pages from pre-Internet science journals, generations of websites, archived spreadsheets and video clips to create a database comprising, as nearly as possible, the entire universe of trusted geological data. Ultimately, the system will use contextual clues and technology similar to IBM’s Watson to turn those massive piles of unstructured and often forgotten information—what Livny and Re call “dark data”—into a database that Peters and his colleagues could query with questions such as: How porous is Earth’s crust? How much carbon does it contain? How has that changed over the millennia?
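
The pattern described above — pull simple facts out of running text, store them as rows, then answer questions with ordinary queries — can be sketched in a few lines of Python. The snippet below is a toy illustration only; the sentences, the regular expression, and the one-table schema are invented for the example and are not GeoDeepDive’s actual code or data model.

# A minimal sketch of the "dark data" pattern: mine a numeric fact out of
# free text, load it into a database, then answer a question with a query.
# All sentences, patterns, and table names here are illustrative only.
import re
import sqlite3

sentences = [
    "Samples from the Mount Simon sandstone showed a porosity of 22 percent.",
    "The St. Peter sandstone has a porosity of 18 percent in southern Wisconsin.",
    "A porosity of 7 percent was reported for the Wolfcamp shale.",
    "This paragraph mentions no measurement at all.",
]

pattern = re.compile(r"porosity of (\d+(?:\.\d+)?) percent", re.IGNORECASE)

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE porosity (source TEXT, percent REAL)")
for sentence in sentences:
    match = pattern.search(sentence)
    if match:  # keep only sentences that actually state a measurement
        db.execute("INSERT INTO porosity VALUES (?, ?)", (sentence, float(match.group(1))))

# Once the scattered facts are rows, "how porous is this rock?" becomes plain SQL.
count, avg = db.execute("SELECT COUNT(*), AVG(percent) FROM porosity").fetchone()
print(f"{count} measurements recovered, average porosity {avg:.1f} percent")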

The benefits of GeoDeepDive will be twofold, Peters says. First, it will give researchers a larger collection of data than ever before with which to attack problems in the geosciences. Second, it will allow scientists to broaden their research because they will be able to pose questions to the system that they lack the expertise to answer on their own. 

“Some problems were kind of off limits,” Peters says. “You couldn’t really think about reasonably addressing them in a meaningful way in one lifetime. These new tools have that promise—to change the types of questions we’re able to ask and the nature of answers we get.”

Order From Chaos

GeoDeepDive is one of dozens of projects that received funding from a $200 million White House initiative launched in March 2012 to help government agencies, businesses and researchers make better use of what’s called “big data.”

Here’s what that means: Data exist all over the world, in proliferating amounts. Satellites beam back images covering every square mile of Earth multiple times each day; publishers crank out book after book; and 4.5 million new URLs appear on the Web each month. Electronic sensors record vehicle speeds on the Interstate Highway System, weather conditions in New York’s Central Park and water activity at the bottom of the Indian Ocean. Until recently, scientists, sociologists, journalists and marketers had no way to make sense of all this data. They were like U.S. intelligence agencies before the Sept. 11 terrorist attacks. All the information was there, but no one was able to put it together.

Three things have brought order to that cacophony in recent years. The first is the growth of massive computer clouds that virtually bring together tens or hundreds of thousands of servers and trillions of bytes of processing capacity. The second is a new brand of software that can link hundreds of those computers together so they effectively act like one massive computer with a nearly unlimited hunger for raw data to crunch.

The third element is a vastly improved capacity to sort through unstructured data. That includes information from videos, books, environmental sensors and basically anything else that can’t be neatly organized into a spreadsheet. Then computers can act more like humans, pulling meaning from complex information such as Peters’ geosciences journals without, on the surface at least, reducing it to a series of simple binary questions.
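
Of those three ingredients, the second — software that makes many machines behave like one — follows a split/apply/combine pattern that is easy to sketch. The fragment below imitates it on a single machine, with Python worker processes standing in for a cluster; the sample documents and the word-count task are placeholders chosen only for illustration.

# A toy split/apply/combine job: each worker process counts words in its
# own shard of documents, and the partial results are merged at the end.
# Real cluster software applies the same pattern across many machines.
from collections import Counter
from multiprocessing import Pool

documents = [
    "satellites beam back images of earth every day",
    "publishers crank out book after book after book",
    "sensors record vehicle speeds weather and water activity",
]

def count_words(doc):
    # "Map" step: runs independently on each shard of the data.
    return Counter(doc.split())

if __name__ == "__main__":
    with Pool(processes=3) as pool:
        partial = pool.map(count_words, documents)   # shards processed in parallel
    merged = sum(partial, Counter())                 # "Reduce" step: combine results
    print(merged.most_common(3))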

“For a number of years we’ve worked really hard at transforming the information we were collecting into something that computers could understand,” says Sky Bristol, chief of Science Information Services at the U.S. Geological Survey. “We created all these convoluted data structures that sort of made sense to humans but made more sense to computers. What’s happened over the last number of years is that we not only have more powerful computers and better software and algorithms but we’re also able to create data structures that are much more human understandable, that are much more natural to our way of looking at the world.

“The next revolution that’s starting to come,” he says, “is instead of spending a lot of energy turning data into something computers can understand, we can train computers to understand the data and information we humans understand.”

Big Promises 

Big data has hit the digital world in a big way. The claims for its power can seem hyperbolic. A recent advertisement for a launch event for the book Big Data: A Revolution That Will Transform How We Live, Work, and Think (Eamon Dolan/Houghton Mifflin Harcourt, 2013) promised the authors would explain why the “revolution” wrought by big data is “on par with the Internet (or perhaps even the printing press).”

Big data’s promise to transform society is real, though. To see its effect one need not look to Gutenberg but to Zuckerberg, Page and Brin. Each day Facebook and Google chew through millions of pages of unstructured text embedded in searches, emails and Facebook feeds to deliver targeted ads that have changed how sellers reach consumers online.

Retailers are mining satellite data to determine what sort of customers are parking in their competitors’ parking lots, when they’re arriving and how long they’re staying. An official with Cisco’s consulting arm recently suggested big box retailers could crunch through security camera recordings of customers’ walking pace, facial expressions and eye movements to determine the optimal placement of impulse purchases or what store temperature is most conducive to selling men’s shoes.

Big data is making an appearance in international aid projects, in historical research and even in literary analysis.

Re, the University of Wisconsin computer scientist, recently teamed with English professor Robin Valenza to build a system similar to GeoDeepDive that crawls through 140,000 books published in the United Kingdom during the 18th century. Valenza is using the tool to investigate how concepts such as romantic love entered the English canon. Ben Schmidt, a Princeton University graduate student in history, has used a similar database built on the Google Books collection to spot linguistic anachronisms in the period TV shows Downton Abbey and Mad Men. His assessment: The Sterling Cooper advertising execs of Mad Men may look dapper in their period suits but they talk about “keeping a low profile” and “focus grouping”—concepts that didn’t enter the language until much later.
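
Schmidt’s pipeline isn’t detailed here, but the basic move — compare when a phrase becomes common in a dated corpus with the era a script is set in — is simple to illustrate. In the sketch below, the decade counts are made-up placeholders rather than real Google Books figures, and the usage threshold is arbitrary.

# Toy anachronism check: a phrase is suspect if it only becomes common in
# the corpus after the decade the show is set in. Counts are invented.
corpus_counts = {
    "keep a low profile": {1920: 0, 1930: 0, 1940: 0, 1950: 1, 1960: 4, 1970: 120},
    "long distance call": {1920: 35, 1930: 60, 1940: 80, 1950: 95, 1960: 110, 1970: 120},
}

def first_common_decade(counts, threshold=10):
    # Earliest decade in which usage clears the (arbitrary) threshold.
    for decade in sorted(counts):
        if counts[decade] >= threshold:
            return decade
    return None

def flag_anachronisms(script_phrases, script_decade):
    for phrase in script_phrases:
        first = first_common_decade(corpus_counts.get(phrase, {}))
        if first is None or first > script_decade:
            print(f"possible anachronism for a {script_decade}s setting: '{phrase}'")

flag_anachronisms(["keep a low profile", "long distance call"], script_decade=1960)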

The ‘Holy Grail’

The White House’s big data investment was spawned by a 2011 report from the President’s Council of Advisors on Science and Technology, a group of academics and representatives of corporations including Google and Microsoft. The report found private sector and academic researchers were increasingly relying on big data but weren’t doing the sort of basic research and development that could help the field realize its full potential.

The council wasn’t alone. The McKinsey Global Institute, the research arm of McKinsey & Company, predicted in May 2011 that by 2018 the United States will face a 50 percent to 60 percent gap between demand for big data analysis and the supply of people capable of performing it. The research firm Gartner predicted in December 2011 that 85 percent of Fortune 500 firms will be unprepared to leverage big data for a competitive advantage by 2015.

The White House investment was funneled through the National Science Foundation, the National Institutes of Health, and the Defense and Energy departments, among other agencies. The grants are aimed partly at developing tools for unstructured data analysis in the private, academic and nonprofit worlds but also at improving the way data is gathered, stored and shared in government, says Suzi Iacono, deputy assistant director of the NSF’s Directorate for Computer and Information Science and Engineering.

As an example, Iacono cites the field of emergency management. New data storage and analysis tools are improving the abilities of the National Weather Service, FEMA and other agencies to predict when and how major storms such as Hurricane Sandy are likely to hit the United States. New Web and mobile data tools are making it easier for agencies to share that information during a crisis.

“If we could bring together heterogeneous data about weather models from the past, current weather predictions, data about where people are on the ground, where responders are located— if we could bring all this disparate data together and analyze them to make predictions about evacuation routes, we could actually get people out of harm’s way,” she says. “We could save lives. That’s the Holy Grail.”

One of the largest impacts big data is likely to have on government programs in the near term is by cutting down on waste and fraud, according to a report from the industry group TechAmerica released in May 2012.

The Centers for Medicare and Medicaid Services launched a system in 2011 that crunches through the more than 4 million claims it pays daily to determine the patterns most typical of fraud and possibly deny claims matching those patterns before they’re paid out. The government must pay all Medicare claims within 30 days. Because it lacks the resources to investigate all claims within that window, CMS typically has paid claims and then investigated later, an inefficient practice known as “pay and chase.”
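
How the CMS system actually scores claims isn’t described, so the sketch below shows only the general shape of pre-payment screening: score each incoming claim against a few patterns and hold the suspicious ones for review instead of paying and chasing. The rules, field names, and thresholds are all hypothetical.

# Illustrative pre-payment screen: score claims against simple fraud
# patterns and hold high-scoring ones before money goes out. Every rule,
# field, and threshold here is hypothetical, not CMS's actual model.
WATCHLIST_PROVIDERS = {"P-9934"}        # providers already under review (made up)

def risk_score(claim):
    score = 0
    if claim["procedures_per_day"] > 40:                  # implausible daily volume
        score += 2
    if claim["provider_id"] in WATCHLIST_PROVIDERS:
        score += 3
    if claim["amount"] > 10 * claim["regional_average"]:  # far above typical billing
        score += 2
    return score

claims = [
    {"id": 1, "provider_id": "P-1001", "procedures_per_day": 12, "amount": 900, "regional_average": 850},
    {"id": 2, "provider_id": "P-9934", "procedures_per_day": 55, "amount": 14000, "regional_average": 900},
]

for claim in claims:
    decision = "hold for review" if risk_score(claim) >= 3 else "pay within the 30-day window"
    print(f"claim {claim['id']}: {decision}")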

The board that tracks spending on President Obama’s 2009 stimulus package used a similar system to weed out nefarious contractors.

Big data is having an impact across government, though, in areas far afield from fraud detection. The data analysis company Modus Operandi received a $1 million Army contract in late 2012 to build a system called Clear Heart, which would dig through hundreds of hours of video—including footage from heavily populated areas—and pick out body movements that suggest what officials call “adversarial intent.” That could mean the posture or hand gestures associated with drawing a gun or planting a roadside bomb or the gait of someone wearing a suicide bombing vest.

The contract covers only the development of the system, not its implementation. But Clear Heart holds clear promise for drone surveillance, Modus Operandi President Richard McNeight says. It could be used to alert analysts to possible dangers or to automatically shed video that doesn’t show adversarial intent, so analysts can better focus their efforts.

The technology also could have domestic applications, McNeight says.

He cites the situation in Newtown, Conn., where a gunman killed 20 elementary school students and six adults. “If you’d had a video camera connected with this system it could have given an early warning that someone was roaming the halls with a gun,” McNeight says.

Big data’s greatest long-term effects are likely to be in the hard sciences, where it has the capacity to change hypothesis-driven research fields into data driven ones. During a panel discussion following the announcement of the White House big data initiative, Johns Hopkins University physics professor Alex Szalay described new computer tools that he and his colleagues are using to run models for testing the big-bang theory.

“There’s just a deluge of data,” the NSF’s Iacono says. “And rather than starting by developing your own hypothesis, now you can do the data analysis first and develop your hypotheses when you’re deeper in.”

Coupled with this shift in how some scientific research is being done is an equally consequential change in who’s doing that research, Iacono says.

“In the old days if you wanted to know what was going on in the Indian Ocean,” she says, “you had to get a boat and get a crew, figure out the right time to go and then you’d come back and analyze your data. For a lot of reasons it was easier for men to do that. But big data democratizes things. Now we’ve got sensors on the whole floor of the Indian Ocean, and you can look at that data every morning, afternoon and night.”

Big data also has democratized the economics of conducting research.

One of NIH’s flagship big data initiatives involves putting information from more than 1,000 individual human genomes inside Amazon’s Elastic Compute Cloud, which stores masses of nonsensitive government information. Amazon is storing the genomes dataset for free. The information consumes about 2,000 terabytes—that’s roughly the capacity required to continuously play MP3 audio files for 380 years—far more storage than most universities or research facilities can afford. The company then charges researchers to analyze the dataset inside its cloud, based on the amount of computing required.

This storage model has opened up research to huge numbers of health and drug researchers, academics and even graduate students who could never have afforded to enter the field before, says Matt Wood, principal data scientist at Amazon Web Services. It has the potential to drastically speed up the development of treatments for diseases such as breast cancer and diabetes.

Over time, Wood says, the project also will broaden the scope of questions those researchers can afford to ask.

“If you rewind seven years, the questions that scientists could ask were constrained by the resources available to them, because they didn’t have half a million dollars to spend on a supercomputer,” he says. “Now we don’t have to worry about arbitrary constraints, so research is significantly accelerated. They don’t have to live with the repercussions of making incorrect assumptions or of running an experiment that didn’t play out.”

http://www.nextgov.com/big-data/2013/04/welcome-data-driven-world/62319/