
Are Americans Too Broken by Corporate Power to Resist?

Tuesday, March 23rd, 2010
We need to take a look at what forces in American society are preventing people from being able to resist tyranny and dehumanization.
Editor's Note: The following is the transcript of a recent interview with Bruce E. Levine by OpEd News' Joan Brunwasser. Levine is a clinical psychologist and author of Surviving America’s Depression Epidemic: How to Find Morale, Energy, and Community in a World Gone Crazy (Chelsea Green Publishing, 2007).

Joan Brunwasser: Back in December, you wrote 'Are Americans a Broken People? Why We've Stopped Fighting Back Against the Forces of Oppression.' Could you tell our readers about your theory?

Bruce E. Levine: There are times when human beings can become so broken that truths of how they are being victimized do not set them free. This is certainly the case for many victims of parental and spousal abuse. They are not helped by people explaining to them that they are being battered, exploited, uncared about and disrespected. They know it already and somebody pointing it out is not helpful.

So, it seems to me that it is also possible that human beings can become so broken by the abuse of the corporate elite that they also are no longer set free by truth.

While certainly the corporate-controlled mainstream media does not report many important truths, the majority of the American people do know enough to oppose the war in Afghanistan, but they do almost nothing in response to recent troop surges.

Polls show that the majority of Americans actually support a single-payer, Medicare-for-all plan, and an even larger majority support a public option, yet there are relatively few people on the streets protesting the Democratic Party's betrayal of them.

And look at the 2000 U.S. "banana-republic" presidential election, in which Gore beat Bush by 500,000 votes and the Supreme Court stopped the Florida recount, and 51 million Gore voters were disenfranchised. Yes, there were small protest demonstrations against this election farce, but the numbers of protesters were so small that they empowered rather than concerned the future Bush administration, which went on to almost vaunt its regime of anti-democracy and piss on basic human rights. How humiliating for an entire nation. The shame many Americans feel, at some level, for allowing torture and other abuses is similar to the shame that spousal abuse victims feel — and this routinely makes people feel even weaker. So, while not all Americans are broken, demoralized and feeling powerless, many are.

I wish the answer to restoring democracy was simply one of people getting more journalistic truths through a non-corporate media — and certainly I am all for that — but I think that much more is required. We need to take a look at what forces in American society are breaking the American people's ability to resist tyranny and dehumanization, and we must start considering the antidotes. At least that's what any psychologist or social scientist who gives a damn about genuine democracy should be doing.

JB: So, our feelings of powerlessness are rooted in modern life, exacerbated by present political realities. I'd like to point out another factor, which is what Paul Rogat Loeb refers to as our 'historic amnesia.' Historian and social activist Howard Zinn spent decades trying to offset that amnesia by providing an alternate history of our country, emphasizing various movements that have spanned decades (or generations) and eventually brought about change. He told stories of ordinary people doing extraordinary things, and his book, A People's History of the United States, has sold two million copies. So, it's obviously struck a chord. What do you think about the power of stories as an antidote to the ennui you describe?

BL: Stories of resistance to tyranny are great for the morale, so Howard Zinn did a great service by popularizing historical examples. These can be inspirational. A broken person and a broken people need morale. Inspirational models whom people can identify with can be energizing, and energy is exactly what demoralized people need.

It is important for people to know that, yes, there are historical examples of people rebelling against the elite. It is important, for example, for us to know that there once was something called the People's Party in the U.S. and a huge populist revolt that scared the hell out of the elite in the 1880s and 1890s.

But historical truths are not enough because sometimes people say, "That's just history, now is different, rebellion isn't possible." That's why not only historians need to report rebellions but journalists must report current resistance to the ruling elite corporations and their political lackeys, current resistance to this "corpocracy."

Bill Moyers has done a good job reporting on current resisters. I have seen a couple of examples on his recent shows. One is Steve Meachum and his group City Life, which has successfully kept people from being thrown out of their homes in foreclosure. Another example is pediatrician Margaret Flowers, a member of Physicians for a National Health Program, jailed for the cause of single-payer/Medicare for all.

JB: Good examples. I interviewed Dr. Flowers last May, shortly after she was released from jail.

BL: Historical examples and current examples of resistance against the corpocracy can be inspiring, energizing and morale-boosting.

The elite know that to win the class war, just like winning any war, the goal is to crush the spirit of resistance of your opponent. So if you want to win the class war, you must care about the morale of your class.

Remember the "Tank Man" in China? While it is important for the people in China to know all the ways that they are being victimized, the problem is if they are completely terrified of their authoritarian government and too broken to resist, what's the good of knowing more and more about how they are being victimized? So, that one image of the guy getting out in front of the tank — "the Tank Man" — is hugely important.

I can tell you for sure that what I need is more models and fewer lectures. My sense is that is what many of us need.

JB: Your comment points to one of the big problems we Americans face. The corporate media is often part of the problem, rather than performing its historic 'watchdog' function. That's difficult to overcome, especially when so many exclusively read and listen to that right-wing echo chamber. Wasn't it Hitler's propaganda minister who said that all you have to do is repeat a lie 1,000 times and it becomes true? Those of us trying to practice responsible journalism online are fighting an uphill battle. Any recommendations?

BL: It's only going to make genuine journalists feel more powerless and broken if they focus on the ability of the corporate media to pound the airwaves with bullshit. The good news is that, despite all the money and power behind it, not all that many people take the corporate media seriously.

Of course, people don't get how impotent the corporate media is if they just watch the corporate media. But the polls show that, despite all their propaganda, the American people know that big business, the Democrats, the Republicans and the corporate media are all special-interest groups that work together for their own interest and not for the people.

I'm not going to worry about people like NBC's Brian Williams who spends a good part of his life appearing on every program possible to get his face and name out there. Williams makes it as clear as possible to anybody with half of a brain that what he's desperate for is publicity — not truth.

So the corporate media now even recognizes how bored people are by its boring bullshit. However, instead of trying to excite people with truths, it is trying to ape Jon Stewart. But its imitations are not witty or funny, and they do not report any truths, even the obvious ones that Stewart points out. … So what are my recommendations to real journalists who actually give a damn about getting the truth out there and about having an impact?

Two things come immediately to mind. First, when you are preaching to the choir, when you are writing for a publication that is read by an audience that has already been radicalized, one must think, "Is my piece going to simply depress them with one more truth of oppression and injustice? Or is my piece going to stimulate some action in at least one reader, and hopefully more?"

I have written for publications such as Z Magazine, AlterNet, CounterPunch, Adbusters and The Ecologist, for readers who are already radicalized. I used to feel satisfied with informing readers about yet another industrial complex that I knew well, specifically, the psycho-pharmaceutical industrial complex. But now I think that's not enough. When one has an opportunity to write to people who are already aware of how they are being screwed by an oligarchy of industrial complexes, I believe it is one's responsibility to write in a way that galvanizes them to get off their asses and do something constructive.

Much of schooling teaches people that it is good enough to simply know the truth and care about injustices. But it's not enough to know and care if that concern is passive. Jonathan Kozol, the school critic, used the phrase "inert concern" to characterize what he was taught in his elitist schooling at a fancy prep school and later at Harvard. Kozol mocks "inert concern," and so do I.

Good journalism is going to energize people to take action. One way is, as we've already talked about, giving people inspiring models.

A second thing that journalists must do is to get creative in figuring out ways of expanding their audience rather than simply preaching to the choir. People who feel defeated, demoralized and broken want to be energized. This means it is not enough to report the truth — one needs to write in a way that is fun to read. Molly Ivins got it. Jon Stewart and Stephen Colbert get it. Gore Vidal has always gotten it.

Michael Pollan is an interesting example of somebody who has been able to expand the audience of people who get it about the food industry. I remember reading Pollan when he was a relative unknown writing for Harper's about drug hypocrisy issues — he was right on the money and damn near anarchistic. But Pollan is an entertaining guy who is fun to read and doesn't sound like some ideologue pushing counterpropaganda.

He's now going after the food-industrial complex. Pollan has been effective in making it quite mainstream to talk about some pretty radical stuff. I hear he is responsible for influencing Michelle Obama to have a vegetable garden. Now, having a vegetable garden and cooking your own food does not sound radical to people who get turned off by radicals, but there is nothing more radicalizing than learning to become more self-reliant and independent of the food-industrial complex.

So, two solutions to your question involve expanding your audience and energizing people who already get it. If all journalists started to think about this and get creative, there would be a bunch more specific answers.

The real question for me is what can each of us do, at least each of us who gives a damn about genuine democracy and getting rid of the plutocracy we now have. What can journalists do? Psychologists? Teachers? Parents? Students? We need to try to think about this question strategically. Think about it creatively. We need to think about what can be energizing and fun and is thus sustainable.

JB: You're talking about advocacy journalism, aren't you?

BL: Let's take a look at this phrase "advocacy journalism." In reality, Brian Williams is advocating for the career of Brian Williams, and the New York Times is advocating for the New York Times. Neither is advocating all that much for the truth.

The Times would like us to believe that it is not advocating any political ideology, but in reality, it's advocating for readers to take the entire institutional establishment seriously. Times writer Judith Miller took establishment sources seriously about WMD in Iraq, and this greased the wheels of the U.S. invasion of Iraq. The Times would have us believe that Miller and WMD were an anomaly. Not true.

When the Times reports that the Food and Drug Administration has approved a new drug, the Times almost never reports that the FDA did not do independent tests but trusted drug company data — this is normal procedure. And the Times does not report in any drug approval story that there is a revolving door of employment between the FDA and drug companies — this is the reality.

Advocating for the truth would mean reporting facts that question the credibility of institutions, especially ones such as the FDA with its history of getting it wrong so much of the time. The FDA example is only one of many. The New York Times is a major institution that benefits from the status quo being taken seriously. The Times keeps itself from being attacked by other major institutions by what the Times omits about these other major institutions.

Pretend neutrality and lies of omission insult the public. Genuine democracy needs people, including journalists, mixing it up honestly. So, journalists need to report the facts because they will not be taken seriously if they get the facts wrong. And journalists need to report facts that may be troubling for their position because that will gain a journalist even more credibility and power. But readers know that journalists are people who have a point of view, so journalists shouldn't pretend that they don't have one and then slant a story.

When New York Times apologists accuse Amy Goodman and "Democracy Now!" of advocacy journalism, I have to laugh. The Times is advocating taking the status quo and major institutions seriously, and "Democracy Now!" is advocating against that. The Times puts a lot of effort into not being transparent about its kind of advocacy, while "Democracy Now!" doesn't waste its time on such pretend efforts.

JB: Before we close, let's shift gears for a moment. Have you found that your clinical practice has changed over the last number of years, with patients feeling more overwhelmed and powerless than before?

BL: I see more powerlessness with teenagers and young adults now than I saw 20 years ago. Many extremely smart but nonacademic high school students who hate school have been told that they must go to college or they will never be able to make a living, and at the same time they know that increases in college tuition result in outrageous debt, and with increasingly crappy jobs out there, this debt will be difficult to pay off. And of course debt breaks people.

There remain young people who have not had their spirit of resistance against the corpocracy crushed out of them, and I ask them, "How many of your peers are aware of and rebelling against the reality that they are being turned into indentured servants and slaves?" They tell me practically none of their peers are resisting, at least not constructively; they feel too powerless to do anything but use lots of alcohol and illegal and psychiatric prescription drugs to kill the pain of their hopelessness. I don't see a hell of a lot of kids protesting about how they are getting screwed, and that tells me something.

This article originally appeared on Alternet.

Revolutionary Road, A Beautiful Mind, and Truthfulness

Thursday, March 26th, 2009

The films Revolutionary Road and A Beautiful Mind both portray mathematicians turned mental patients who create havoc for their families. But the similarity ends there.

In director Ron Howard's A Beautiful Mind (2001), the facts of the real-life recovery of Nobel prize winner John Nash are fabricated to create a politically-correct version of mental illness — and Howard's film was rewarded with four Oscars, including best picture and best director. In contrast, director Sam Mendes's recent Revolutionary Road stays true to the facts of Richard Yates's 1961 novel, including Yates's now politically-incorrect, more psychological perspective of mental illness — and Mendes's film was not rewarded with any Oscars last February.

While the mental illness of John Nash (Russell Crowe) is the focus of Howard's A Beautiful Mind, mental illness is not at the center of either Mendes's film or Yates's novel, but the contribution of mental patient John Givings (Michael Shannon) is vital in both the film and the novel.

What then is the center of Yates's novel? In 1972 Yates told Ploughshares, "I thought I was writing a novel about abortion. . . a series of abortions, of all kinds — an aborted play, several aborted careers, any number of aborted ambitions and aborted plans and aborted dreams — all leading up to a real, physical abortion." Mendes's film certainly conveys that, and it is also true to Yates's character John Givings, who serves as the outspoken truth teller in both film and novel. When Michael Shannon was asked by the Los Angeles Times to describe the movie and his character John Givings's significance, he said:

He really is important to the story. It's a hard movie to describe to people when they ask you what it's about. But if you boil it down to its essence, it's about these two people - Frank [Leonardo DiCaprio] and April [Kate Winslet] — trying to make a decision: Do we stay here and suffer silently or do we try to liberate ourselves and escape to a better life? The first time we see John they are celebrating their decision and kind of reveling in this newfound sense of freedom. John is there to kind of celebrate and validate their decision. Then John comes back, and this freedom has crumbled, and he's there to castigate and punish them for their loss of faith. I think that's the really beautiful thing about the character.

John Givings is pained, and he can be obnoxious, hurtful, disruptive, and tension producing - what mental health professionals label as "inappropriate." But there are reasons for his anger, reasons that Yates did not reduce to defective biochemistry, and this gets the novel-faithful Mendes an F in political correctness.

The current PC explanation of serious mental illness brought to us by Big Pharma — follow the money trail — is that it is caused by this or that neurotransmitter or brain structure and has nothing to do with oppressive families and dehumanizing environments. It is also now PC to mock the notion that mentally-ill diagnosed people may sometimes be like canaries in the mine, more sensitive and reactive to insidious toxins.

John Givings — though psychiatrically hospitalized and a recipient of multiple electroshock treatments which have damaged his mathematical abilities — is clearly not delusional about oppressive family relationships, not wrong about meaningless jobs, not incorrect about gutless frauds, and not mistaken about a dehumanizing society. He, like many people I have known diagnosed with mental illness, feels alienated and powerless. And he is no diplomat. Truth serves as his only source of potency, and he uses it as both a constructive tool to celebrate and validate courage and as a hurtful weapon to castigate and punish gutlessness.

I am not alone in recognizing John Givings in real people diagnosed with severe mental illness. So did Richard Yates. Ploughshares asked Yates, "When you first planned the book, did you have John Givings in there?" Yates responded:

No, I didn't. He occurred to me as a character about midway through the writing of the book. I felt I needed somebody in there to point up or spell out the story at crucial moments, and I did know a young man very much like that at the time, a long-term patient in a mental hospital who had an uncannily keen and very articulate insight into other people's weaknesses, so I worked a fictionalized version of him into the book.

Today, the PC handling of Mr. Untreated Paranoid Schizophrenic is to depict him as having only meaningless craziness to utter until he begins taking his medication regularly, at which time he can function in the world and bring smiles to his long-suffering family. In this regard, Mendes gets another F on his PC report card by accurately depicting Yates's John Givings, who may be a son of a bitch but one who does not voice meaningless craziness. John Givings has valuable observations about familial relationships and society, and the powerful Shannon is so compelling that you want him on the screen.

In contrast to Revolutionary Road, A Beautiful Mind is, ultimately, a feel-good movie about mental illness that steps on no powerful institution's toes, and perhaps that is part of why it won all those Oscars.

First, let me be clear that Howard's film did some good things, including his emphasis on the therapeutic value of supportive relationships and his hopeful message about the possibility of recovery. I know many people who have been diagnosed with schizophrenia, paranoid schizophrenia, and other severe mental illnesses who have gone on to have satisfying and meaningful lives — with or without medication, with or without doctors, but always with respect and support.

The shame is that Howard, perhaps afraid of upsetting the mental health establishment, gave Russell Crowe's Nash a line which the real John Nash never said, a line which was untrue, a line which was unnecessary to move the story along, but a line which was completely necessary for the pharmaceutical industry and the institutions it financially supports — including the American Psychiatric Association, the National Alliance for the Mentally Ill, and the drug-advertisement addicted media.

The line? In Howard's A Beautiful Mind, John Nash, when informed that he was being considered for the 1994 Nobel Prize, mentions, "I take the newer medications." However, as the documentary A Brilliant Madness (broadcast on PBS's "American Experience" in 2002) reported, "Nash had stopped taking medication in 1970."

Howard's "newer medications" line served, in effect, as a product placement not for a single company but for an entire drug-dependent mental health industry that would show its appreciation. Former Boston Globe science journalist Robert Whitaker, author of Mad in America, reported in 2002, "The National Alliance for the Mentally Ill has praised the film's director, Ron Howard, for showing the 'vital role of medication' in Nash's recovery." However, notes Whitaker, Sylvia Nasar in her biography of Nash (also called A Beautiful Mind), reports something quite different about Nash's recovery. Specifically, Nasar writes:

Nash's refusal to take the antipsychotic drugs after 1970, and indeed during most of the periods when he wasn't in the hospital in the 1960s, may have been fortuitous. Taken regularly, such drugs, in a high percentage of cases, produce horrible, persistent symptoms like tardive dyskinesia. . . and a mental fog, all of which would have made his gentle reentry into the world of mathematics a near impossibility.

Nash's recovery without psychiatric drugs is no anomaly. A third of so-called "chronic schizophrenic" patients released from Vermont State Hospital in the late 1950s completely recovered, reported psychologist Courtenay Harding in 1987; and she found that patients in the "best-outcomes" group shared one common factor: all had stopped taking antipsychotic drugs. So Howard actually could have made a more honest feel-good movie about the real life John Nash, but it would have been one that upset powerful institutions who are dependent on drug money.

The World Health Organization, in two different studies (1979 and 1992), reported that the United States and other "developed" countries are inferior to "developing" countries such as India, Nigeria, and Colombia in helping people diagnosed as psychotic to recover. One likely reason for this inferiority in the "developed world" is its almost complete reliance on drugs, and another likely reason is our relative absence of genuine community and supportive groups.

It is convenient for many people — and lucrative for drug companies and the institutions that they support — if all disruptive, crazy-sounding, tension-producing people can simply be handed off to doctors to be labeled and drugged. If we can neatly compartmentalize and medicalize the John Givings of the world, then families and society don't have to halt the assembly line and ask questions such as: "What exactly is happening in this person's life that has made him or her so angry or frightened? Why does he or she feel so alienated? Is society oppressive for many people, and is this person simply more unbridled in their reaction to that fact? Is there something suffocating about nuclear families in which temperamentally mismatched people are forced to have relationships? Should we be satisfied with a paycheck and a full belly — or is that not enough?"

I have met many angry, rude, tension-producing people labeled with severe mental illness. Some of them are completely dominated by their own victimization and seek only to inflict payback pain on those around them. Others though, when feeling safe, state truths which, if taken seriously, would create a more loving family, a more caring community, and a more stimulating world.

When April and Frank take John seriously, he relaxes, stops being hurtful, and shares with them, among other insights, that "maybe it does take a certain amount of guts to see the emptiness, but it takes a whole hell of a lot more to see the hopelessness. And I guess when you do see the hopelessness, that's where there's nothing to do but take off. If you can."

Richard Yates, Sam Mendes, and Michael Shannon remind us that people who are diagnosed with serious mental illness can — when feeling respected — say profound things that are worth taking seriously. But in today's PC mental health professional world, that kind of reminder is mocked as a romanticization of a disease. Amidst the emptiness of such a world, where only disturbing symptoms and biochemistry are taken seriously, it is no accident that many hurting people become hopeless and conclude that there's nothing to do but take off — any way that they can.

Bruce E. Levine, Ph.D., is a clinical psychologist and author of Surviving America's Depression Epidemic: How to Find Morale, Energy, and Community in a World Gone Crazy (Chelsea Green Publishing, 2007).

This article was originally published on The Huffington Post.

Eli Lilly and the Case for a Corporate Death Penalty

Thursday, March 5th, 2009

Eli Lilly & Company's rap sheet as a public menace is so long that for Lilly watchers to overcome the "banality-of-Lilly-sleaziness" phenomenon, the drug company must break some type of record measuring egregiousness. Lilly obliged earlier this year, receiving the largest criminal fine ever imposed on a corporation.

If Americans are ever going to revoke the publicly granted charters of reckless, giant corporations — well within our rights — we might want to get the ball rolling with Lilly, whose recent actions appalled even the mainstream media. And with Lilly's chums, the Bush family, out of power, now might be the right time.

On January 15, 2009, Lilly pled guilty to charges that it had illegally marketed its blockbuster drug Zyprexa for unapproved uses to children and the elderly, two populations especially vulnerable to its dangerous side effects. Lilly pled guilty to a misdemeanor charge and agreed to pay $1.42 billion, which included $615 million to end the criminal investigation and approximately $800 million to settle the civil case.

One of the eight whistle-blowers in this case, former Lilly sales representative Robert Rudolph, says the settlement will not completely change Lilly's business practices, and he wants jail time for executives. "You have to remember, with Zyprexa," said Rudolph, "people lost their lives."

Rudolph is not exaggerating. Zyprexa, marketed as an "atypical" antipsychotic drug, has been promoted as having less dangerous adverse effects than "typical" antipsychotic drugs such as Thorazine and Haldol. However, on February 25, 2009, the Journal of the American Medical Association reported that the rate of sudden cardiac death in patients taking either typical or atypical antipsychotic drugs is double the death rate of a control group of patients not taking these drugs.

Zyprexa — though not nearly as well known as Lilly's previous blockbuster Prozac — is today one of the biggest-selling drugs in the world. Zyprexa has grossed more than $39 billion since its approval in 1996, with $4.8 billion of that in 2007 (and it was projected to equal or surpass that gross in 2008).

Lilly has had other Zyprexa scandals, but in this current one, Lilly executives matched Charles Dickens's scoundrels. Zyprexa is approved by the Food and Drug Administration (FDA) for schizophrenia and bipolar disorder, but Lilly illegally marketed it for sleep difficulties, aggression, and other unapproved uses. Lilly sales reps aggressively pushed Zyprexa as a wonderful drug to chill out disruptive children and the elderly who were not schizophrenic or bipolar. The lawsuit against Lilly stated, "In truth, this was Lilly's thinly veiled marketing of Zyprexa as an effective chemical restraint for demanding, vulnerable and needy patients."

Doctors can prescribe drugs for unapproved uses (called "off-label prescribing"), but drug companies are not allowed to market drugs for unapproved uses. Many drug companies break this rule, but Lilly broke it with gusto. "The company made hundreds of millions of dollars by trying to convince health care providers that Zyprexa was safe for unapproved uses," said Laurie Magid, acting U.S. Attorney for the Eastern District of Pennsylvania, where the case was prosecuted. Magid said that Lilly was responsible for "putting thousands and thousands of patients at risk."

One marketing effort consisted of the Lilly sales force urging geriatricians to use Zyprexa to sedate unruly nursing home and assisted-living facility patients. Lilly sales reps distributed a study claiming that elderly patients taking Zyprexa required fewer skilled nursing staff hours than were necessary for patients taking competing medications. Magid stated that Lilly sales reps were "trained to use the slogan 'five at five,' meaning five milligrams at 5 o'clock at night will keep these elderly patients quiet." Illegally marketing Zyprexa for elderly patients was especially troubling for prosecutors because Zyprexa increases the risks of heart failure and life-threatening infections such as pneumonia in older patients.

In addition to targeting the misbehaving elderly, Lilly also targeted annoying kids. New York Times reporters Gardiner Harris and Alex Berenson, who have been covering Eli Lilly and Zyprexa for several years, reported on January 14, 2009, "The company also pressed doctors to treat disruptive children with Zyprexa, court documents show, even though the medicine's tendency to cause severe weight gain and metabolic disorders is particularly pronounced in children … The children receiving Zyprexa gained so much weight during the study that a safety monitoring panel ordered that they be taken off the drug."

Mainstream reporters were so appalled by Lilly's recent actions that some voiced caustic commentaries about the relatively small price Lilly paid for its transgressions. CBS reporter Sharyl Attkisson (January 15, 2009) noted, "Eli Lilly has pled guilty to marketing the sometimes dangerous drug Zyprexa in ways never proven safe or effective … Lilly has agreed to pay $1.4 billion, including the largest criminal fine ever imposed on a corporation. Ironically, that's about as much as the company's Zyprexa sales in the first quarter last year." However, the mainstream media failed to provide the context of Lilly's horrendous history which goes back decades.

The New York Times' 2009 article did at least go back as far as 2006, reminding readers of the Times' exclusive on another Zyprexa scandal. In December 2006, a whistleblower handed over to the Times hundreds of internal Lilly documents and e-mail messages among top company managers that showed how Lilly had downplayed Zyprexa's association with weight gain and metabolic disorders such as diabetes.

A Rolling Stone piece earlier this year ("Marketing Lilly's Zyprexa, a Phony ‘Miracle' Drug") details how Lilly minimized Zyprexa's relationship with dramatic weight gain. In 1995, prior to FDA approval of Zyprexa, Lilly's own panel of experts concluded that Zyprexa produced an average weight gain of 24 pounds in a single year (one in six patients gained more than 66 pounds); that kind of weight gain can elevate blood-sugar levels and cause diabetes. This data, however, was not submitted by Lilly to the FDA.

Lilly-Zyprexa scandals didn't just start in 2006. A 2003 Lilly-Zyprexa scandal involved Medicaid and the National Alliance for the Mentally Ill (NAMI), ostensibly a consumer organization. That year, Zyprexa grossed $2.63 billion in the United States, 70 percent of that attributable to government agencies, mostly Medicaid. Zyprexa cost approximately twice as much as similar drugs, and state Medicaid programs, going into the red in part because of Zyprexa, were attempting to exclude it in favor of similar, less expensive drugs. When Kentucky's Medicaid program attempted to exclude Zyprexa — its single largest drug expense — from its list of preferred medications, NAMI bused protesters to hearings, placed full-page ads in newspapers, and sent faxes to state officials. What NAMI did not say at the time was that the buses, ads, and faxes were paid for by Lilly.

The Lilly-NAMI financial connection had already been exposed by Ken Silverstein in Mother Jones in 1999. Silverstein reported that NAMI took $11.7 million from drug companies over a three-and-a-half-year period from 1996 through 1999, with the largest donor being Lilly, which provided $2.87 million. Lilly's funding also included loaning NAMI a Lilly executive, who worked at NAMI headquarters but whose salary was paid for by Lilly.

Beyond Zyprexa, in 2002 fingers were pointed at Lilly for tampering with the Homeland Security Act. On November 25, 2002, soon after George W. Bush signed the Act, New York Times columnist Bob Herbert discovered what had been slipped into it at the last minute, "Buried in this massive bill, snuck into it in the dark of night by persons unknown . . . was a provision that – incredibly – will protect Eli Lilly and a few other big pharmaceutical outfits from lawsuits by parents who believe their children were harmed by thimerosal."

While it was recently revealed that research published in 1998 linking vaccine use to autism was fraudulent, in 2002 the harmfulness of thimerosal (a mercury-containing preservative used by Lilly and other drug companies in vaccines) was not yet clear. In 1999 the American Academy of Pediatrics and the Public Health Service had urged vaccine makers to stop using thimerosal, and in 2001 the Institute of Medicine concluded that a link between autism and thimerosal was "biologically plausible." So in 2002, drug companies such as Lilly that had used thimerosal in vaccines were nervous about what scientists and the courts would ultimately determine.

How then did a drug-company protection provision get inserted into the Homeland Security Act? Here's my bet for one of Herbert's "persons unknown." In June 2002, then-President George W. Bush had appointed Lilly's CEO, Sidney Taurel, to a seat on his Homeland Security Advisory Council. Ultimately even some Republican senators became embarrassed by the drug-company protection provision, and by early 2003, moderate Republicans and Democrats agreed to repeal that particular provision from the Act.

The year 2002 was a banner one for "Lillygates," with "60 Minutes II" ultimately airing another juicy Lilly scandal. Lilly's patent for Prozac had run out, and the drug company began marketing a new drug, Prozac Weekly. Lilly sales representatives in Florida gained access to patient information records and mailed out unsolicited free samples of Prozac Weekly. Though they primarily targeted patients diagnosed with depression who were receiving competitor antidepressants, at least one such Prozac Weekly sample was mailed to a sixteen-year-old boy with no history of depression or antidepressant use. Lawsuits followed.

The most cinematic of all Lilly scandals began in 1989 and culminated in 1997. One month after Joseph Wesbecker began taking Lilly's antidepressant Prozac, he opened fire with his AK-47 at his former place of employment in Louisville, Kentucky, killing eight people and wounding twelve before taking his own life. British journalist John Cornwell covered the trial for the London Sunday Times Magazine and ultimately wrote a book about it. Cornwell's The Power to Harm is not simply about a disgruntled employee becoming violent after taking Prozac; the book is about Lilly's power to corrupt a judicial system.

Victims of Joseph Wesbecker sued Lilly, claiming that Prozac had pushed Wesbecker over the edge. The trial took place in 1994 but received little attention as America was obsessed at the time by the O.J. Simpson spectacle. While Lilly had been quietly settling many Prozac violence suits, the drug company was looking for a showcase trial that it could actually win. Although a 1991 FDA "Blue Ribbon Panel" investigating the association between Prozac and violence had voted not to require Prozac to have a violence warning label, by 1994 word was getting around that five of the nine FDA panel doctors had ties to drug companies — two of them serving as lead investigators for Lilly-funded Prozac studies. Thus with the FDA panel now known to be tainted, Lilly wanted a Prozac trial it could win, and it believed that Wesbecker's history was such that Prozac would not be seen as the cause of his mayhem.

A crucial component of the victims' attorneys' strategy was for the jury to hear about Lilly's history of reckless disregard. Victims' attorneys especially wanted the jury to hear about Lilly's anti-inflammatory drug Oraflex, introduced in 1982 but taken off the market three months later. A U.S. Justice Department investigation linked Oraflex to the deaths of more than one hundred patients and concluded that Lilly had misled the FDA. Lilly was charged with 25 counts related to mislabeling side effects and pleaded guilty.

In the Wesbecker trial, Lilly attorneys argued that Oraflex information would be prejudicial, and Judge John Potter initially agreed that the jury shouldn't hear it. However, when Lilly attorneys used witnesses to make a case for Lilly's superb system of collecting and analyzing side effects, Judge Potter said that Lilly itself had opened the door to evidence to the contrary, and he ruled that Oraflex information would now be permitted. To Judge Potter's amazement, victims' attorneys never presented the Oraflex evidence, and Eli Lilly won the case.

Later it was discovered why victims' attorneys remained silent about Oraflex. In a manipulation Cornwell described as "unprecedented in any Western court," Lilly cut a secret deal with victims' attorneys to pay them and their clients not to introduce the Oraflex evidence. However, Judge Potter smelled a rat and fought for an investigation, and in 1997 Lilly quietly agreed to the verdict being changed from a Lilly victory to "dismissed as settled."

If Americans want to take on Lilly, they might want to do it during a time when the Bush family is out of power. Sidney Taurel, former Lilly CEO and George W. Bush appointee to the Homeland Security Advisory Council, is not the only Bush family-Lilly connection. George Herbert Walker Bush once sat on the Eli Lilly board of directors, as did Bush family crony Ken Lay, the Enron chief convicted of fraud before his death. Mitch Daniels, George W. Bush's first-term Director of Management and Budget, had actually been a Lilly vice president, and in 1991 he had co-chaired a Bush-Quayle fundraiser that collected $600,000. This is the same Mitch Daniels who is now governor of Indiana, Lilly's home state.

Currently, the public's right to revoke corporate charters is still recognized by the courts, but attorneys general today rarely exercise this option, and then only against small corporations. Loyola Law School Professor Robert Benson, who in 1998 petitioned California's attorney general to revoke the corporate charter of Union Oil of California (Unocal), notes that state attorneys general "don't hesitate to draw this particular arrow from their quivers when the target is some small, unpopular or socially marginal enterprise." But when it comes to egregious large multinationals, Benson concludes, "They don't even want you to know about it because they don't want to appear to be soft on corporate crime."

In his book When Corporations Rule the World, former Harvard Business School professor David Korten writes, "In the young American republic, there was little sense that corporations were either inevitable or always appropriate." Early in American history, Americans were very much concerned about any entity achieving too much power, and so corporate charters placed clear limits on: years permitted to exist, borrowing, land ownership, extent of enterprise, and sometimes even profits. Korten notes that in the first half of the nineteenth century, "Action by state legislators to amend, revoke, or simply fail to renew corporate charters was fairly common."

The Program on Corporations, Law & Democracy (POCLAD) was created in 1994, in part to inform Americans that they can in fact revoke corporate charters. In 1890, POCLAD explains, the highest court in New York State revoked the charter of the North River Sugar Refining Corporation in this unanimous decision: "The judgment sought against the defendant is one of corporate death … the defendant corporation has violated its charter, and failed in the performance of its corporate duties, and that in respects so material and important as to justify a judgment of dissolution."

Giant drug corporations — especially ones that make a killing selling dangerous drugs by hyper-pathologizing people who can't defend themselves — get my adrenaline going; and so my candidate to get the ball rolling is Lilly, which has now made itself vulnerable by getting into so much damn trouble. But with Lilly's man Mitch Daniels currently governor of Lilly's home state, Lilly still has pull; and so I won't be upset if some other giant sleazebag corporation receives the death penalty before Lilly.

Given the fact that Americans already have a history of revoking corporate charters, why shouldn't this practice be continued? Yes we did, yes we still can, and so yes let's do it.

Fundamentalist Consumerism and an Insane Society

Monday, February 2nd, 2009

This article was originally published in Z Magazine.

At a giant Ikea store in Saudi Arabia in 2004, three people were killed by a stampede of shoppers fighting for one of a limited number of $150 credit vouchers. Similarly, in November 2008, a worker at a New York Wal-Mart was trampled to death by shoppers intent on buying one of a limited number of 50-inch plasma HDTVs.

Jdimytai Damour, a temporary maintenance worker, was killed on "Black Friday." In the predawn darkness, approximately 2,000 shoppers waited impatiently outside Wal-Mart, chanting, "Push the doors in." According to Damour's fellow worker Jimmy Overby, "He was bum-rushed by 200 people. They took the doors off the hinges. He was trampled and killed in front of me." Witnesses reported that Damour, 34 years old, gasped for air as shoppers continued to surge over him. When police instructed shoppers to leave the store after Damour's death, many refused, some yelling, "I've been in line since yesterday morning."

The mainstream press covering Damour's death focused on the mob of crazed shoppers and, to a lesser extent, irresponsible Wal-Mart executives who failed to provide security. However, absent in the corporate press was anything about a consumer culture and an insane society in which marketers, advertisers, and media promote the worship of cheap stuff.

Along with journalists, my fellow mental health professionals have also covered up societal insanity. An exception is the democratic-socialist psychoanalyst Erich Fromm (1900-1980). Fromm, in The Sane Society (1955), wrote: "Yet many psychiatrists and psychologists refuse to entertain the idea that society as a whole may be lacking in sanity. They hold that the problem of mental health in a society is only that of the number of 'unadjusted' individuals, and not of a possible unadjustment of the culture itself."

While people can resist the cheap-stuff propaganda and not worship at Wal-Mart, Ikea, and other big-box cathedrals—and stay out of the path of a mob of fundamentalist consumers—it is difficult to protect oneself from the slow death caused by consumer culture. Human beings are every day and in numerous ways psychologically, socially, and spiritually assaulted by a culture which:

  • creates increasing material expectations
  • devalues human connectedness
  • socializes people to be self-absorbed
  • obliterates self-reliance
  • alienates people from normal human emotional reactions
  • sells false hope that creates more pain

Increasing material expectations. These expectations often go unmet and create pain, which fuels emotional difficulties and destructive behaviors. In a now classic 1998 study examining changes in the mental health of Mexican immigrants who came to the United States, public policy researcher William Vega found that assimilation to U.S. society meant three times the rate of depressive episodes for these immigrants. Vega also found major increases in substance abuse and other harmful behaviors. Many of these immigrants found themselves with the pain of increased material expectations that went unmet, and they also reported the pain of diminished social support.

Devaluing of human connectedness. A 2006 study in the American Sociological Review noted that the percentage of Americans who reported being without a single close friend to confide in rose in the last 20 years from 10 percent to almost 25 percent. Social isolation is highly associated with depression and other emotional problems. Increasing loneliness, however, is good news for a consumer economy that thrives on increasing numbers of "buying units"—more lonely people means selling more televisions, DVDs, psychiatric drugs, etc.

Promotes selfishness. Self-absorption is one of many reasons for skyrocketing U.S. rates of depression and other emotional difficulties — and self-absorption is exactly what a consumer culture demands. The Buddha, 2,500 years ago, recognized the relationship between selfish craving and emotional difficulties, and many observers of human beings, from Spinoza to Erich Fromm, have come to similar conclusions.

Obliterates self-reliance. The loss of self-reliance can create painful anxiety, which fuels depression and other problematic behaviors. In modern society, an increasing number of people—women as well as men—cannot cook a simple meal. They will never know the anti-anxiety effects of being secure in their ability to prepare their own food, grow their own vegetables, hunt, fish, or gather food for survival. In a consumer culture, such self-reliance makes no sense. At some level, people know that should they lose their incomes—not an impossibility these days—they have no ability to survive.

Alienation from humanity. The priests of consumer culture—advertisers and marketers—know that fundamentalist consumers will buy more if they are alienated from such normal reactions as boredom, frustration, sadness, and anxiety. If these priests can convince us that a given emotional state is shameful or evidence of a disease, then we will be more likely to buy not only psychiatric drugs, but also all kinds of products to make ourselves feel better. When we become frightened and alienated from a natural human reaction, this "pain over pain" creates more fuel for depression and other self-destructive behaviors and harmful actions.

Pain of false hope. The false hope of fundamentalist consumerism is that we will one day discover a product that can predictably manipulate moods without any downsides. Modern psychiatry is a full member of consumer culture. Its "Holy Grail" is a search for the antidepressant that can take away the pain of despair, but not destroy life. In the late 19th century, Freud thought he had found it with cocaine. In the middle of the 20th century, psychiatrists thought they had found it with amphetamines, and later with tricyclic antidepressants like Tofranil and Elavil. At the end of the 20th century, there were the SSRIs, such as Prozac, Paxil, and Zoloft, which were ultimately found to create dependency and painful withdrawal and to be no more effective than placebos. Whatever the antidepressant drug, it is introduced as taking away depression without destroying life. Time after time, it is then discovered that when one tinkers with neurotransmitters, there is—as there is with electroshock and psycho-surgery—damage to life.

Fundamentalists reject both reason and experience. Fundamentalists are attached to dogma; when their dogma fails, they don't give it up, but instead resolve to deepen their faith and double down on it.

Erich Fromm, 54 years ago, concluded: "Man [sic] today is confronted with the most fundamental choice; not that between Capitalism or Communism, but that between robotism (of both the capitalist and the communist variety), or Humanistic Communitarian Socialism. Most facts seem to indicate that he is choosing robotism and that means, in the long run, insanity and destruction. But all these facts are not strong enough to destroy faith in man's reason, good will, and sanity. As long as we can think of other alternatives, we are not lost."

Breaking free of fundamentalist consumerism means thinking of alternatives and it also means an active defiance: choosing to experience the various dimensions of life that have been excluded by the dogma.

Are We Really Okay with Electroshocking Toddlers?

Friday, January 30th, 2009

"Of all tyrannies a tyranny sincerely exercised for the good of its victims may be the most oppressive." — C.S. Lewis

Psychiatry's "shock doctrine" is quite literally electroshock, and its latest victims are - I'm not kidding - young children.

On January 25, 2009, the Herald Sun, based in Melbourne, Australia, reported, "Children younger than four who are considered mentally disturbed are being treated with controversial electric shock treatment." In Australia, the use of electroconvulsive therapy (ECT) is increasing, and the Herald Sun's report on "Child Shock Therapy" stated that last year "statistics record 203 ECT treatments on children younger than 14 — including 55 aged four and younger."

Many Americans think that ECT has gone the way of bloodletting, but it continues to be regarded by American psychiatry as a respected treatment, especially for patients who are "treatment resistant" to drugs. Though ECT for young children is nowhere near as common as for adults, most states in the U.S. do not prohibit ECT for kids. California does prohibit ECT for children under the age of 12 but allows children between 12 and 15 to receive ECT if three psychiatrists are in favor of it.

You might think that before any child receives a series of 70 to 170 volts of brain zappings and is thrown into epilepsy-like seizures, every other nontraumatic therapy would have been attempted. You might think that before using ECT, in addition to trying every type of psychotherapy, there would also be an exhaustive effort to find a therapist with whom a kid might genuinely connect. You might think all this, but you would be wrong. It is not unusual for psychiatrists to simply prescribe one drug, then another drug, then several drug combinations (called "cocktails"), and if those fail, recommend ECT.

The disproportionate use of ECT on women, especially older women, once made it a feminist issue, but I heard no feminist opposition when Kitty Dukakis recently came out positively about her own ECT. Psychiatry is well aware of its historical bad press about ECT, including Sylvia Plath's nightmarish ordeal, so today ECT is far more pleasant to observe. Patients are administered an anesthetic and a muscle relaxant prior to ECT so they don't writhe in agony as seizures are induced. However, the effects on the brain have not changed.

There are various modern ECT techniques. However, the scientific reality is that for all of these techniques, without evidence of any brain malignancy, the brain is damaged. Neurologist Sidney Sament describes the process:

"After a few sessions of ECT the symptoms are those of moderate cerebral contusions . . . Electroconvulsive therapy in effect may be defined as a controlled type of brain damage produced by electrical means . . . In all cases the ECT 'response' is due to the concussion-type, or more serious, effect of ECT. The patient 'forgets' his symptoms because the brain damage destroys memory traces in the brain, and the patient has to pay for this by a reduction in mental capacity of varying degree."

In January 2007, the journal Neuropsychopharmacology published an article about a large-scale study on the cognitive effects (immediately and six months later) of currently used ECT techniques. The researchers found that modern ECT techniques produce "pronounced slowing of reaction time" and "persisting retrograde amnesia" (the inability to recall events that occurred before the traumatic event) that continue six months after treatment.

While ECT proponents admit to collateral damage, especially memory loss, they claim that it is an effective treatment. However, a Kitty Dukakis testimonial is not exactly science. With respect to preventing suicide, the Journal of Affective Disorders in 1999 ("Retrospective Controlled Study of Inpatient ECT: Does it Prevent Suicide?") reported, "We failed to demonstrate that ECT had prevented suicide in hospitalized patients." Longtime ECT critic, psychiatrist Peter Breggin, in the International Journal of Risk & Safety in Medicine in 1998 ("Electroshock: Scientific, Ethical, and Political Issues"), reported that at establishment psychiatry's "Consensus Conference on ECT" in 1985, ECT advocates were unable to come forth with one controlled study showing that ECT had any positive effect beyond four weeks, and that many other ECT studies showed that it had no positive effect at all. The heretical Breggin added, "That ECT had no positive effect after four weeks confirms the brain-disabling principle, since four weeks is the approximate time for significant recovery from the most obvious mind-numbing or euphoric effects of the ECT-induced acute organic brain syndrome." Breggin's "brain-disabling principle" is that even when ECT does "work," it works only temporarily — the same way that a blow by a sledgehammer or an acid trip might temporarily disconnect one from the reality of one's life and the sources of one's emotional pain.

Psychiatry will always find celebrities such as Kitty Dukakis who swear by ECT, but the American public rarely hears about those celebrities who have cursed their ECT. In Papa Hemingway, A. E. Hotchner recounts the sad end to Ernest Hemingway's life. Hemingway became extremely depressed, was medicated and ultimately given ECT; but he became even more depressed and complained about the effects of the electroshock, "Well, what is the sense of ruining my head and erasing my memory, which is my capital, and putting me out of business?" In 1961, after a second series of ECT, Hemingway used his shotgun to commit suicide.

If you feel sorry for Hemingway, then what kind of emotional reaction do you have upon discovering that last year 203 Australian children — including 55 aged four and younger — received ECT?

This article was originally published on The Huffington Post.

Just How Corrupted Has American Medicine Become?

Wednesday, January 14th, 2009

"Laws are like cobwebs, which may catch small flies, but let wasps and hornets break through." - Jonathan Swift

After reading "The Neurontin Legacy — Marketing through Misinformation and Manipulation" in the January 8, 2009 issue of the New England Journal of Medicine, one may conclude that (1) America's prisons would be put to better use incarcerating drug company executives instead of pot smokers, and (2) society may need a return of public scorn via the pillory for those doctors who are essentially drug-company shills.

Drug-company corruption of American medicine is of course not news. What is news is that such corruption has become so egregious, so transparent, and so embarrassing that the New England Journal of Medicine, perhaps the most influential American medical journal, is now stating that "drastic action is essential to preserve the integrity of medical science and practice and to justify public trust."

Neurontin was approved by the Food and Drug Administration (FDA) in 1993 in doses of up to 1800 mg per day as adjunctive therapy for partial complex seizures. How did U.S. annual sales of Neurontin increase from $98 million in 1995 to nearly $3 billion in 2004? The answer is "off-label" marketing, in which Neurontin manufacturer Parke-Davis (a division of Warner-Lambert purchased by Pfizer in 2000) marketed Neurontin to doctors for uses not approved by the FDA (because doctors can legally prescribe drugs for uses not approved by the FDA).

While aggressive off-label marketing to doctors is standard among drug companies, it is routinely kept quiet. But thanks to a Parke-Davis whistleblower, we have first-hand evidence of off-label marketing — and how the Neurontin financial bonanza was created.

In 1996, David Franklin, a young biologist, took a sales representative position for Parke-Davis. But shortly after beginning the job, Franklin grew concerned that he was participating in the illegal marketing of Neurontin. Franklin reports that a Parke-Davis executive informed him and his fellow sales reps:

"I want you out there every day selling Neurontin. . . .We all know Neurontin's not growing for adjunctive therapy, besides that's not where the money is. Pain management, now that's money. Monotherapy [for epilepsy], that's money. . . . We can't wait for [physicians] to ask, we need [to] get out there and tell them up front. Dinner programs, CME [continuing medical education] programs, consultantships all work great but don't forget the one-on-one. That's where we need to be, holding their hand and whispering in their ear, Neurontin for pain, Neurontin for monotherapy, Neurontin for bipolar, Neurontin for everything. I don't want to see a single patient coming off Neurontin before they've been up to at least 4800 mg/day. I don't want to hear that safety crap either, have you tried Neurontin, every one of you should take one just to see there is nothing, it's a great drug."

Franklin left Parke-Davis and filed suit (ultimately, United States of America ex rel. David Franklin vs. Pfizer, Inc., and Parke-Davis Division of Warner-Lambert Company) alleging that off-label marketing of Neurontin constituted false claims designed to elicit payments from the federal government. In 2004, Warner-Lambert resolved criminal charges and civil liabilities by agreeing to plead guilty and pay $430 million — less than 15 percent of the $3 billion the drug company had grossed on Neurontin in 2004.

The current New England Journal of Medicine article concluded that the marketing of Neurontin involved "the systematic use of deception and misinformation to create a biased evidence base and manipulate physicians' beliefs and prescribing behaviors." This is one of many examples:

"In a recently unsealed 318-page analysis of research sponsored by Parke-Davis, epidemiologist Kay Dickersin concluded that available documents demonstrate 'a remarkable assemblage of evidence of reporting biases that amount to outright deception of the biomedical community, and suppression of scientific truth concerning the effectiveness of Neurontin for migraine, bipolar disorders, and pain.' For example, publication was delayed for a report on a multi-center, placebo-controlled study that found no effect of Neurontin on the primary outcome measure for neuropathic pain because 'we [Parke-Davis employees] should take care not to publish anything that damages neurontin's marketing success.'"

Exactly what does it take for drug executives to do jail time?

And let's not kid ourselves about the innocence of doctors. The tactics used by Parke-Davis and other drug companies to manipulate doctors make it clear that too many doctors have been willing participants in the corruption of their profession.

The New England Journal of Medicine discusses some of the practices used by Parke-Davis (and commonly used by other drug companies): recruit local physicians who are then trained and paid to serve as speakers in "peer-to-peer selling" programs; financially cultivate renowned professionals, so-called "thought leaders"; financially influence academics with educational grants, research grants, and speaking opportunities worth hundreds of thousands of dollars; create drug "advisory boards" to launder payoffs to "friendly" physicians; provide doctors employed by medical-education companies with "unrestricted educational grants" to produce programs that promote off-label (unapproved) uses of drugs; fund doctors' "research" that is in fact designed and commissioned to promote a specific drug; and credit doctors as authors of ghost-written research articles that downplay a drug's ineffectiveness or lack of safety.

The New England Journal of Medicine is now warning physicians that medicine's corruption by drug companies has threatened public confidence in their profession. If those physicians who are not drug-company shills want to save their profession, they might want to start taking aggressive actions against their colleagues who are on the take. Perhaps it will help motivate clean physicians to be reminded that history shows that any institution — no matter how large and powerful — can arrogantly cross those lines leading to its demise.

Originally posted on The Huffington Post. Reprinted with permission.