Felicity Grisham is a recent Stanford graduate who focused on neuropsychology and will soon head off to law school. She has some thoughts on law and neuroscience that overlap with mine in certain respects. Here is her post, entitled "The Problem of the Continuum," after the jump.
The Problem of the Continuum
No matter how far neuroscientific technologies advance, there will always be a continuum along which the data fall, and it’s this continuum that will challenge the yes-or-no determinations of the court. For instance, what if neuroscientific advances result in the ability to determine the likelihood that an eyewitness indeed saw the defendant’s face? How much weight should that determination receive?
Though this postulate projects into the future, neuroscientific technologies hold real promise as aids to criminal investigation. But with that promise comes a need to reconcile the continuum with the threshold: bar graphs don’t exist in the determination of someone’s guilt. Rather, courts tend toward binary, yes-or-no determinations that include and extend beyond judgments of guilt. Is she guilty or innocent? Should the minor be tried as an adult? Is he not guilty by reason of insanity? These questions tend to have yes-or-no answers, but what happens when that need for a yes or no meets the continuum? The possibility of investigating memory can help elucidate this point.
Jesse Rissman and Anthony Wagner at Stanford University are investigating an interesting angle on memory recognition. Is it possible to use multivariate pattern analysis and a well-trained classifier to make determinations of facial recognition? For now, it seems that at least some percentage of hits and correct rejections can be achieved with this classifier, with some degree of certainty, always with some degree of certainty. And that’s the point. When looking at memory, whether single-item familiarity (Osama bin Laden) or conjunctive memory (Osama bin Laden, in Afghanistan, with a dialysis machine), assessments will come in the form of a percentage, be it p < .0005 or an eighty percent likelihood that this witness saw the defendant with the knife in the victim’s attic.
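To make that concrete, here is a minimal sketch in Python of the general idea behind such a pattern classifier. The data, numbers, and model below are invented for illustration and are not the actual Rissman and Wagner analysis; the point is only that what comes out is a graded probability, and a hit or correct rejection appears only once a cutoff is imposed.

    # A minimal sketch, not the Rissman/Wagner pipeline: a simple linear
    # classifier trained on synthetic "voxel pattern" data, showing that
    # the output is a graded probability of prior exposure, not a verdict.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    n_trials, n_voxels = 200, 50                 # hypothetical trial and voxel counts
    seen = rng.integers(0, 2, n_trials)          # 1 = face previously studied, 0 = novel
    signal = rng.normal(0.5, 0.1, n_voxels)      # small added signal on "seen" trials
    X = rng.normal(0.0, 1.0, (n_trials, n_voxels)) + np.outer(seen, signal)

    X_train, X_test, y_train, y_test = train_test_split(
        X, seen, test_size=0.25, random_state=0)

    clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    # The classifier assigns each test trial a probability of prior exposure,
    # which falls somewhere along a continuum between 0 and 1.
    probs = clf.predict_proba(X_test)[:, 1]
    for p in probs[:5]:
        print(f"estimated probability this face was seen before: {p:.2f}")

    # Hits and correct rejections only appear once a cutoff is imposed.
    threshold = 0.5
    accuracy = np.mean((probs > threshold) == y_test)
    print(f"accuracy at a {threshold:.0%} cutoff: {accuracy:.0%}")

Whatever the particular model, the output is a number between zero and one; the yes or no enters only when someone decides where to draw the line.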
Though this is perhaps not so different from the way things operate now (see the discussion of quantifying reasonable doubt), much of the way that courts work is in thresholds. The United States addresses crimes of minors separately from crimes of adults, though the maturation of the brain is a continuum: there is nothing unique about the day before you turn eighteen as opposed to the day of. But we establish a threshold. In addition, not guilty by reason of insanity defines a threshold whereby an individual who was “unable to appreciate the nature and quality or the wrongfulness of his acts” at the time of the crime cannot be held responsible, a take on the M’Naghten question of whether the defendant knew right from wrong. The phrasing essentially establishes a threshold for cognitive ability above which people are held responsible for their crimes.

It’s these thresholds, then, that will have to be reconciled with data points falling along a continuum. And how can they be reconciled? The reality of evidence falling on a continuum will likely reach beyond assessments of memory to include assessments of volition, of lying, of bias, and even of responsibility. It will be exciting to see where these technologies advance, and where the continuum of data, and courtroom procedures, follow.