Monday, February 25, 2013

Not Everyone is Optimistic about BAM


Neuroethics looks to be an open field for bioethicists after President Obama's State of the Union address, in which he optimistically cast neuroscience as a payback to the American economy. He said: "Every dollar we invested to map the human genome returned $140 to our economy -- every dollar. Today our scientists are mapping the human brain to unlock the answers to Alzheimer's. They're developing drugs to regenerate damaged organs, devising new materials to make batteries 10 times more powerful. Now is not the time to gut these job-creating investments in science and innovation."

President Obama appears to have been referring to the Brain Activity Map project (BAM) which is estimated to cost US$3 billion over 10 years. Such a narrow focus could mean that funding would dry up for other neuroscience projects.

Neuroscientist Christopher Chabris, of Union College, New York, expresses skepticism on his blog. He believes that 10 years would not be long enough to map every neuron even in the Drosophila brain. He also questions why $3 billion should be spent on a single project: would the money be better spent on hundreds of smaller projects of, say, $5 million each? And the return on investment for the Human Genome Project -- 14,000%, according to President Obama -- may have been exaggerated. The figure turns out to have been plucked from a report by a company that makes equipment used in life science research. "I find this figure hard to believe, not to say preposterous," writes Dr Chabris.
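The figures quoted above are easy to sanity-check. The following sketch assumes "ROI" here simply means dollars returned per dollar invested, expressed as a percentage; the budget and project sizes are the ones named in the post.

```python
# Sanity-check the numbers quoted in the post (assumption: "ROI" means
# dollars returned per dollar invested, expressed as a percentage).
bam_budget = 3_000_000_000     # estimated BAM cost over 10 years
small_project = 5_000_000      # size of one alternative small project

# How many $5 million projects would the BAM budget fund instead?
n_projects = bam_budget // small_project
print(n_projects)              # 600

# The "14,000%" figure follows from "$140 returned per $1 invested".
dollars_returned_per_dollar = 140
roi_percent = dollars_returned_per_dollar * 100
print(roi_percent)             # 14000
```

So the $3 billion would cover 600 projects at $5 million apiece, and the 14,000% figure is just the $140-per-dollar claim restated as a percentage; the arithmetic is sound even if, as Chabris argues, the underlying $140 estimate is not.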

The BAM would yield fascinating results, he says. "But the sheer size of a full BAM project might focus our attention and hopes on the BAM as the be-all and end-all of neuroscience, and distract the field from devoting energy to those other levels."
Scientists hope that the BAM project will lead to discoveries about diseases like Parkinson's, autism and Alzheimer's. It will certainly underscore the need for ongoing ethical analysis.

The Center for Cognitive Liberty and Ethics notes, "Emerging technologies that map the brain, reveal 'guilty knowledge,' and expose patterns associated with disfavored behavior raise thorny questions of law and ethics."

The CCLE article continues: "Three University of Pennsylvania professors grapple with these questions in a lucid article that appears in the June issue of the IEEE Spectrum, a monthly journal of the Institute of Electrical and Electronics Engineers. "Bioethics and the Brain," although written for specialists, is refreshingly free of the jargon and bad writing that mar so many academic publications.

Kenneth R. Foster, professor of Bioengineering; Paul Root Wolpe, in the department of Psychiatry in the university's Center for Bioethics; and Arthur L. Caplan, chairman of the Department of Medical Ethics, explain that microelectronics and medical imaging are bringing us closer to a world where mind reading is possible and some blindness is overcome with visual prostheses -- but one in which we may not want to live.

The authors write: "Researchers may one day find brain activity that correlates with behavior patterns such as tendencies toward alcoholism, aggression, pedophilia, or racism."

Racism? United Press International asked Caplan about that in a phone interview.

"Let's say I show you a series of photographs of people from around the world," he replied. "And every time a black face appears, you get a different brain pattern. (That is, different from a baseline pattern.) I start to suspect that you are different in your reactions to blacks than you are to others -- or whatever the group is. ...

"Does it mean that this bubbles to the surface and you act in racist ways? No. But subtle differences might silently shape your behavior.

"It may make sense sometimes to have a different reaction (to various racial types). ... But this does give you a chance to peek in and see what's going on. ... What Freud used to talk about as the unconscious, which I think is in pretty solid disrepute, may get a revival from this kind of brain examination."

The article presents the case of Nancy, a hypothetical airline pilot of the future, who arrives promptly for her routine physical. She is asked to place her head in a large metallic device while a video screen flashes a series of images before her eyes: the inside of a 747 cockpit, a view of a target seen through a rifle's scope, a chemical formula for polyester, a photo of Bill Clinton.

Later, her supervisor and a Federal Aviation Administration official inform Nancy that her brain images show that she might develop schizophrenia and that she also has a surprising familiarity with assault rifles. The FAA revokes her pilot's license, and the airline fires her.

This fictitious scenario alludes to technologies that already exist in their basic form. Electrical activity in the brain can reveal the contents of a person's memory, and the same electrical stimulation technologies that enable some deaf people to hear can be engineered to control behavior.

These technologies have obvious and immediate application in criminal investigation. For example, a guilty suspect's brain might show recognition of a crime scene that an innocent suspect would have no knowledge of.

The polygraph (lie detector) measures physiologic responses such as heart rate, sweating, and respiration that are only indirectly related to brain function. The microvolt signals of the new technologies, by contrast, come directly from the brain. Even so, the new tests do not measure truthfulness but seek to determine whether the subject has a particular memory.

Why, then, have the CIA, the FBI, and the Secret Service found such techniques to be of little use in screening for potential spies, terrorists, or other security risks?

"It's just not ready yet," Caplan replied. "The information that correlates complicated behavior with brain states is not developed, but it's coming. It's a little like genomics, where people look for correlation between a gene and, say, a predisposition to lung cancer. ... The false positive rates and the false negative rates are both too high right now."

The article also describes the capabilities of functional magnetic resonance imaging, or fMRI, which shows which parts of the brain are active by tracking changes in blood oxygen levels. Using fMRI, psychiatry professor Daniel D. Langleben of the University of Pennsylvania found highly significant differences between lying and truth-telling in metabolic activity in a region of the brain important for paying attention and monitoring errors.

MRIs and positron emission tomography, which uses radioactive tracers to image brain activity, can disclose subtle changes in brain structure and function that correlate with disease -- including mental illness. But who should receive pre-symptomatic testing or prophylactic treatment for such diseases? Relatives of those with symptoms? Those like Nancy, with particular jobs?

And what are the legal implications for employment screening? Caplan said an employer's attitude might be: "I don't care if it's accurate. I took a look at your brain, and you're not working here."

Caplan said that to forestall predatory peddlers of "truth machines," it's not too soon to start pushing for five things that "need to be happening" in bioethics.

The first is setting standards for what is ready to be used in the marketplace and what is not. Because this involves issues of accuracy and error rates, scientists rather than politicians should set the standards, he said.

Second, Caplan said, "We need to have some agreement about the admissibility of this evidence in court." Judges, prosecutors and defense attorneys should be introduced to the technology and confer. A defense attorney might ask the circumstances under which he could say: "My client may have done it, but he has a bad amygdala (a part of the brain that controls fear and arousal), so don't punish him." The new technologies also could be used in considering the suitability of an inmate for parole.

A third ethical issue is that of consent. "Must consent be given to this sort of brain testing?" Caplan asked. The professor presumes consent, but said compulsory testing might be necessary for national security reasons or for certain types of job screening. The conditions for these exceptions must be explicit, however.

Caplan was asked if withholding consent would be used against a defendant in a criminal case. He agreed that such a refusal would affect a jury regardless of a judge's instructions. The professor said he doesn't care how society establishes its standards, "but let's set the moral framework up right now." The courtroom is not the place to argue, he told UPI.

He said this became clear in determining what data are sufficient to establish paternity. "We had to hack it out in court, case by case." Experts could have set standards and guidelines, Caplan said.

A fourth ethical issue is access. "If you're going to build pictures of the brain, you can also build databases just like you do genetic ones," Caplan said. "We could have pictures of everybody's head on file. Is that a good idea? Who would run it? How would you get access to such a thing? Somebody may say, 'I want to take a picture of my head to show you that I'm innocent, but it may cost something.' Will it be just a gimmick for the rich? Should we insist that everybody have fair access if it comes up for legal matters?"

Fifth, and most controversial, are the ethical considerations surrounding the testing of children.

"Parents might say: 'I want to find out if little Johnny is good at the violin. I'm not going to waste lessons on him if he's got no natural aptitude,'" Caplan told UPI.

"Just as there's an educational testing juggernaut, there could easily be a brain testing juggernaut tomorrow." Rules should be established about what can and can't be done with children, he said.

"Worried parents -- the worried well -- are going to be falling all over themselves to get this stuff done with their kids. This is the gift to neurotic parents everywhere, especially wealthy neurotic parents. Why waste your time getting your kid ready for that fancy Manhattan nursery school if he's not going to get into Princeton anyway?

"Can (brain) tests really predict all that? Probably not. But you can become very reductionistic about it and think that your fate is your brain. In some ways, of course, it is. But that doesn't mean you can't change things with learning and environment. Yes, it is predictive, but we don't want to sell it as 100 percent deterministic.

"So doctors and psychologists will have important questions about when they will test, why they will test, and how they will counsel."

Caplan was asked if he could envision a coffee-table book that showed the brain patterns of future Tolstoys, Mozarts, and Einsteins. Parents would try to match up little Johnny to see where he fits.

"Bet on it," Caplan said.

