DIGging The Rehnquist Legacy

We’ve dodged another bullet.  SCOTUS did take up the case of City of Hays, Kansas v. Vogt but then thought better of it, dismissing the case after oral argument because certiorari had been improvidently granted, known in SCOTUS vernacular as a “DIG”.

What was the issue?  Vogt applied to be a police officer and was (perhaps) compelled to give information about himself that caused the state to prosecute him, but the case was ultimately dropped – that is, it never went to trial.

Was his 5th amendment right against self-incrimination violated?

Well, that depends.  Some, including the USG, believe that the 5th amendment isn’t violated unless and until the inculpatory statements are used against you at a trial.  Succinctly put, on this view the 5th amendment right against self-incrimination is a “trial right”.

Others believe the right against self-incrimination kicks in earlier in the “criminal case”, and would encompass the use of involuntary inculpatory statements at, say, a probable cause hearing.

Who’s right?

Neither, really.  But of the two, the second is very clearly, very plainly, abundantly, without question, definitively, absolutely, beyond any doubt more correct.

Why do we say this?

One reason is that we have read the relevant two amendments.  Relevant, that is, to determining which rights can fairly be characterized as “trial rights” and which are not so limited, and for that purpose you have to look at two amendments, not just one.

So here’s the 5th amendment:

No person shall be held to answer for a capital, or otherwise infamous crime, unless on a presentment or indictment of a Grand Jury, except in cases arising in the land or naval forces, or in the Militia, when in actual service in time of War or public danger; nor shall any person be subject for the same offence to be twice put in jeopardy of life or limb; nor shall be compelled in any criminal case to be a witness against himself, nor be deprived of life, liberty, or property, without due process of law; nor shall private property be taken for public use, without just compensation.

And here, dear readers, is the 6th amendment – that is, and please note, it’s the very next one, which is to say we don’t have to read a lot of the constitution here:

In all criminal prosecutions, the accused shall enjoy the right to a speedy and public trial, by an impartial jury of the State and district wherein the crime shall have been committed, which district shall have been previously ascertained by law, and to be informed of the nature and cause of the accusation; to be confronted with the witnesses against him; to have compulsory process for obtaining witnesses in his favor, and to have the Assistance of Counsel for his defence.

So, a couple of elementary observations.  The 5th amendment is not restricted to what happens at a criminal trial, nor indeed even to criminal matters at all, including as it does the “just compensation” clause.  By contrast, the 6th amendment begins specifically by referring to “criminal prosecutions” and requiring “a speedy and public trial”.  That is the 6th amendment’s focus, not the 5th’s.

Thus, and without treating the subject exhaustively, if one is looking to characterize any constitutional right as a “trial right” in a criminal case it should be obvious that the search is confined to those rights mentioned in the 6th amendment, not the 5th.

And this obvious reality should easily dispose of the USG’s and City of Hays’ arguments that the 5th amendment right against self-incrimination is a “trial right”.  That contention is plainly and fundamentally at odds with the language and structure of the constitution.

But if that’s true, we hear our readers muttering, then why the big fuss over this simple issue at the SCOTUS?

Good question.  Enter the late Justice Rehnquist.

In Bracy v. United States (1978), he tried to sneak in through the back door, so to speak, the idea that the 5th amendment’s (and the 14th’s, too) guarantee of due process of law, which had been held to prohibit the deliberate “framing” of individuals by government actors through the use of false evidence, was a “trial right”; that is, that no violation occurred unless and until the government deliberately used false evidence at a criminal trial.  This was essentially an interpretation of a line of cases beginning with Mooney v. Holohan (1935).  And he kept at it, putting the same idea into a footnote to the plurality opinion in Albright v. Oliver in 1994.

But as the constitutional text shows (as we just demonstrated) – not to mention SCOTUS cases Waley v. Johnston (1942), Walker v. Johnston (1942) and New York ex rel. Whitman v. Wilson (1943) – this is a flatly, not to say egregiously, incorrect interpretation.  (All of those cases involved guilty pleas, not trials, and all cite Mooney.  We’ve been over this before, of course.  A lot.)

Yet there is a sliver of hope in all this, because the SCOTUS has never actually adopted Justice Rehnquist’s erroneous interpretation in a five-vote majority opinion, although it came close in Albright.  And it might have come close in Pottawattamie County v. McGhee in 2009, too, but that case was settled and dismissed before any decision could be made.

And then yesterday they DIG’d Vogt.  Good.  Because even though Vogt is not a due process case, had the SCOTUS decided that the 5th amendment right against self-incrimination is a “trial right”, that ruling could have spilled over into the due process and Mooney area, since the due process clause, like the self-incrimination clause, is also in the 5th amendment.

We have tried to help the SCOTUS with this embarrassingly elementary problem numerous times ourselves, but they seem determined to wallow in it.  To be fair, it’s not an easy thing to conclude that your previous Chief Justice misled you, and indeed by a fair – though not conclusive – reckoning did so deliberately.

But we all have to look at evidence for what it is, SCOTUS Justices no less than we here at LoS.  If a conclusion is unavoidable, that is, we are not permitted to avoid it.

Ugh.


Final Nail In The Coffin…

of the Alien Tort Statute.  Opinion of the SCOTUS out this morning.

At least there were four dissents.  We have no comment at the moment, but may visit the subject another time, though we also may have exhausted our vitriol on the subject years ago.


Testilying And The Search For Truth

Apropos a few of our most recent posts, an opinion we should have run across years ago but for some reason did not.

Epistemology is not a joke, and apparently amateurs should stay away from the subject.  And amateurs include police officers, lawyers and judges without the proper training.  And “proper training” is, well, epistemology, metaphysics and the like.  Certainly not “criminal justice” courses or “political science”.

The rabbit hole Mr. Van Brocklin descends into in the linked article – completely ill-equipped, intellectually speaking – has no bottom.  Can we discuss and debate theories of knowledge and truth?  Surely.  But there are more than two thousand years of material to draw upon.  The questions Mr. Van Brocklin raises do not spring fully grown out of his own head.

Or Australian high court judge JJ Spigelman either, for that matter.

As a society we are suffering from profound deficiencies in education, and especially – and somewhat depressingly – among those considered highly educated.  Reading Judge Spigelman’s account of how he came to embrace the view that court proceedings are about ascertaining the truth, subject to some limitations, is not so much enlightening or satisfying as it is painful and embarrassing:

We seem now to have passed through the convulsion in the humanities and social sciences academy of that conglomeration of doctrines often referred to as ‘postmodernism’. The only thing that was ever interesting about ‘postmodernism’ was what it was ‘pre’. The ‘post modernist’ form of relativism that drew on the difficulties of proving truth and the distortions that can arise in the truth finding process to conclude that the search for truth should be abandoned would, in the end, have destroyed the cloistered academy which generated this perversion.

Perversion indeed.  How unfortunate for us that in 2011 this even needed to be said.

Post-modernism let loose upon the college campus is a corruption of thought; let loose in a nation’s courts it is a corruption of society itself so thorough and profound that it will inevitably result in officially perpetrated rape and murder, because raw power fills the gap left by the abandonment of reason.

We have trained police and prosecutors, judges and lawyers, to be glib about their obligation to seek the truth by denying them the intellectual formation to appreciate the importance of that obligation.

In other words, we’re doomed.

Epistemology 101 (Updated)

We’re tired.  And this is tedious for us, but then again the occasional review of well worn territory has some value.  If only for nostalgia.  So on to it.

We refer the reader to the last post, not our best effort but it puts a name to the face, so to speak.  The real issue – as it was in the beginning, is now, and ever shall be – is this:

How do we “know” anything at all?

Parmenides says we know what our “reason” tells us and nothing else, certainly not what we see.  Heraclitus says we know only what we see, and what we see is unintelligible chaos, change, motion and “flux”, ultimately indecipherable to reason.  Or “the Reason”, as we hereinafter refer to it.

Adhering strictly – logically – to either of these bookends of the epistemological shelf is incompatible with life as we “know” it (forgive the loose use of the term here).  With Parmenides you couldn’t get out of bed in the morning and light a cigarette.  With Heraclitus you lurch from experience to experience but none of it means anything.

But as between the two alternatives, is one better?  Yes.  Parmenides.  Because at least it makes some sense.  Heraclitus would deny even the value of that, but of course he’s on a loop he can’t escape, because his denial would stand only to the extent that it makes more sense.  “The Reason” – or, as we like to call it over here at LoS, “natural reason” – is the one absolute, inescapable given of our existence.

So centuries later Descartes embodies this thought in his famous “Cogito ergo sum”; but then this is subject to ridicule both by Kierkegaard (our hero) (“If I am thinking, what wonder, then, that I am.”) and Nietzsche (our antagonist).  And why is that?

Reason is how we understand whatever it is we understand, but we can never understand reason itself.  It just is, and we just submit to it every waking moment because practically speaking, or epistemologically speaking for that matter, there is no alternative.

Except to go insane.  Which is what Nietzsche did, and it’s not a coincidence.

So why does anyone go down that particular road?

Because the Reason, since we must submit to it even though it can’t explain itself, is just like the idea of the God, and in just that same way:  the Reason is accepted as binding not as a matter of the Reason itself (as the Reason can’t explain or justify itself) but as a matter of faith.  We submit to the Reason because we have faith in it, and that is the only possible basis for our submission to it.  And once we acknowledge that no matter how much we pretend otherwise we are governed ultimately by faith (for that’s what all this means) and not by the Reason – and certainly not by what we empirically observe – then faith in the God becomes reasonable (apologies!), so reasonable in fact that the opposite – atheism – becomes unreasonable, unsustainable and unjustifiable.  Arguably, on the level of the Reason faith in the God becomes mandatory, or at least as mandatory as faith in the Reason is.

But can one still be an atheist?  Of course.  But only on this condition:  I am an atheist not on account of the Reason, which I submit to in all other matters of every waking moment of my existence (because I must) and which rejects atheism; rather, I am an atheist because that is my will, the God and the Reason both be damned.

One danger in this (and it is a profound and very real – that is, a practical and present – danger), one that has affected us and our life, and our clients and their lives, is that naturally enough, once one has rejected faith in the God, one is liable to reject faith in the Reason as well, since the Reason leads inexorably to faith, and faith leads inexorably to the God.  And vice-versa.

Thus we see that “Reason is, and ought only to be, the slave of the passions and can never pretend to any other office than to serve and obey them” is where any atheist is sure to end up, not just David Hume.  And the American legal profession became, as the 20th century wore on, rigidly and dogmatically David Hume’s intellectual heir.  Often by way of Jeremy Bentham and Oliver Wendell Holmes.  And of course Friedrich Nietzsche.  And the term “intellectual” is of course advanced advisedly.  Anti-intellectual would be more accurate.

Re-read if you don’t understand.

And so here is one “real world” consequence of this otherwise arcane subject matter.  And another.  We could go on.

But here’s another important point.  We have described Nietzsche as our antagonist, which indeed he is, but we should also acknowledge our indebtedness to him:  he shows us the horror of atheism.  As we said earlier, it’s not a coincidence that Nietzsche went insane.  It is also not a coincidence that he became an icon of murderous 20th century ideologies like Nazism.  The absence of the God leads to the absence of the Reason and all that’s left is who or what is to be master, and that’s all.  Thy will or my will.  Reason, truth or justice have nothing to do with it.  Power is all there is.

It is a prescription for hell on earth, of course.  But perhaps worse than that, or maybe more part and parcel of it, is that the idea is (as we implied at the beginning of this morning’s discourse)…tedious.  A colossal bore.  Milton had exhausted the idea two centuries before Nietzsche.

And so what happens to the schools of western thought that embrace Nietzsche, which at this point is most of them?  As you might imagine, they become less and less interesting.  Read Wittgenstein and stay awake, if you can.

And so we come back to the beginning, and again Kierkegaard says it so much better than we can:

Starting from a principle is affirmed by people of experience to be a very reasonable procedure; I am willing to humor them, and so begin with the principle that all men are bores. Surely no one will prove himself to be so great a bore as to contradict me in this….Boredom is the root of all evil.

We can only add this:  the reverse is also true.  Evil is the root of all boredom.  Intellectually, that is.  In practice, of course, evil can become very interesting indeed, at least short term.

Although we do not mean “interesting” in a good way, because it is most assuredly not a Good Thing.

Update:  A little ironic that we published this post, which we had composed over some days, on the same day that Stephen Hawking died, for his views on things are quite topical, notably these.

We swear, we had no notice of the event.  Hawking dying, that is.

Of course Hawking saying that “science” has superseded – indeed buried – philosophy is another way of saying that the truths of the Reason are subject to empirical confirmation, in the absence of which they are not truths at all, and accordingly philosophy is indeed dead.  As is God, of course, but then Hawking was hardly a trail-blazer on that score since Nietzsche said as much a century and a half ago.

A dubious proposition does not acquire more weight just because Hawking said it.  And we would note, empirically, that while the evidence of Hawking’s oft attributed “brilliance” is surprisingly sparse – he apparently had trouble learning to read, for example – the evidence of his overt politicization and ideological inclination is abundant:  he was a reliable apologist for every mainstream-liberal – and often flamboyantly “scientific” – bugaboo.

But perhaps most importantly, and like almost all other media-anointed-scientists such as Carl Sagan, Arthur C. Clarke and Bill Nye, he was frankly and candidly atheist if not openly hostile to “religion”.

Almost as if that is a prerequisite for being a media-anointed-scientist in the first place.


Nietzsche Redux And The Tyranny Of The Incoherent

“God is dead.” – Nietzsche

“Nietzsche is dead.” – God

Suitable for men’s room graffiti, we know.

Still.

We made quite a study of philosophy in our youth.  We cannot improve on Kierkegaard’s assessment of such an undertaking:

It is now about four years ago that I got the notion of wanting to try my luck as an author. I remember it quite clearly; it was on a Sunday, yes, that’s it, a Sunday afternoon. I was seated as usual, out-of-doors at the cafe in the Fredricksberg Garden. I had been a student for half a score of years. Although never lazy, all my activity nevertheless was like a glittering inactivity, a kind of occupation for which I still have a great partiality, and for which perhaps I even have a little genius. I read much, spent the remainder of the day idling and thinking, but that was all it came to.

So there I sat and smoked my cigar until I lapsed into thought. Among other thoughts I remember these: “You are going on,” I said to myself, “to become an old man, without being anything, and without really undertaking to do anything. On the other hand, wherever you look about you, in literature and in life, you see the celebrated names and figures, the precious and much heralded men who are coming into prominence and are much talked about, the many benefactors of the age who know how to benefit mankind by making life easier and easier, some by railways, others by omnibuses and steamboats, others by the telegraph, others by easily apprehended compendiums and short recitals of everything worth knowing, and finally the true benefactors of the age who make spiritual existence in virtue of thought easier and easier, yet more and more significant. And what are you doing?” Here my soliloquy was interrupted, for my cigar was smoked out and a new one had to be lit. So I smoked again, and then suddenly this thought flashed through my mind, “You must do something, but inasmuch as with your limited capacities it will be impossible to make anything easier than it has become, you must, with the same humanitarian enthusiasm as the others, undertake to make something harder.” This notion pleased me immensely, and at the same time it flattered me to think that I, like the rest of them, would be loved and esteemed by the whole community. For when all combine in every way to make everything easier, there remains only one possible danger, namely, that the ease becomes altogether too great; then there is only one want left, though it is not yet a felt want, when people will want difficulty. Out of love for mankind, and out of despair at my embarrassing situation, seeing that I had accomplished nothing and was unable to make anything easier than it had already been made, and moved by a genuine interest in those who make everything easy, I conceived it as my task to create difficulties everywhere.

In the 20th century at almost any university, even majoring in philosophy, you could escape any acquaintance at all with Kierkegaard (not that any sane person would want that); but it was impossible to escape at least some familiarity with Nietzsche.

So Thus Spake Zarathustra and Beyond Good and Evil present themselves at some point, and the economy of expression afforded by aphorisms becomes a Thing.

Nietzsche had to be rehabilitated about the time we were engaged in our period of “glittering inactivity” (late middle-to-late 20th century) because he had been so popular with Nazis.  Apparently this rehabilitation has accelerated and deepened in the time since, for as mainstream bellwether Wikipedia notes, later “scholars” have maintained that his apparent Nazi-simpatico ideas were all a posthumous distortion by his demented, anti-Semitic sister.

This is, to put it mildly, not plausible.  The “Ubermensch” is a central idea in Nietzsche’s thought.  Same for the “will to power”.  These are, you know, obviously Nazi-friendly ideas.

And what else can we say about those central ideas?  The ubermensch flows directly from an uncritical adoption of macro-evolution, a relatively new concept in Nietzsche’s time, and of course an intellectual fad that lingers into the 21st century.  The argument is that the ubermensch is “…a goal humanity can set for itself…” and “…the creator of new values…”, which is otherwise just a tad problematic in a post-modernist new age where everything that came before has been rejected and trashed.  You have to replace it with something, right?  Otherwise there’s nothing but nihilism.

But then that’s repeating ourselves.  Nothing but nihilism.  Get it?

So in order to avoid the natural collapse into nihilism from this (frankly) silly musing about a “new [uber] man” – a bizarre, schoolboy fantasy better consigned to comic books than regarded as a serious contribution to western thought – we sanitize and over-complicate the thought and – again – blame Nietzsche’s horrible sister for the later affinity with Nazism.

But if you’re going to take the idea seriously – we don’t, but others do – it’s a natural fit both to nihilism and to Nazism.  No way around that.  Nietzsche himself was said to have greatly feared the descent into the former as a consequence of his “theory”.  Post-modernist Nietzsche fans should contemplate that for a change.

The “will to power”?  Let’s stipulate:  it can be, lamentably, an accurate description and predictor of human behavior, and to some extent of the way the world works in general.  Absent the phrase itself, the idea hardly originated with Nietzsche.  His contribution, rather, was to elevate the will to power into the highest principle of morality:  to be embraced, not resisted:

To speak of just or unjust in itself is quite senseless; in itself, of course, no injury, assault, exploitation, destruction can be ‘unjust,’ since life operates essentially, that is in its basic functions, through injury, assault, exploitation, destruction and simply cannot be thought of at all without this character. One must indeed grant something even more unpalatable: that, from the highest biological standpoint, legal conditions can never be other than exceptional conditions, since they constitute a partial restriction of the will of life, which is bent upon power, and are subordinate to its total goal as a single means: namely, as a means of creating greater units of power. A legal order thought of as sovereign and universal, not as a means in the struggle between power complexes but as a means of preventing all struggle in general—perhaps after the communistic cliché of Dühring, that every will must consider every other will its equal—would be a principle hostile to life, an agent of the dissolution and destruction of man, an attempt to assassinate the future of man, a sign of weariness, a secret path to nothingness.

Intellectually, this is tediously familiar.  On the practical level, however, it is horribly fascinating:  we are enjoined to conduct ourselves and to order our lives in a manner so intellectually repugnant that in over two thousand years before Nietzsche not a single philosophical thinker of note had even seriously considered it.  We are to reject reason itself as mere emotional self-justification, the will to power dressed up as rational argument.  2+2=4 not because it does, but because in our insatiable desire for power we want it to.

Nietzsche was revolutionary indeed, but we don’t mean that as a compliment.

The decades-long rehabilitation of Nietzsche has apparently included the revelation – at least it’s a revelation to us – that although a scholar of ancient Greek and Latin he explicitly denigrated Parmenides and extolled Heraclitus:

Nietzsche’s philosophy, while innovative and revolutionary, was indebted to many predecessors. While at Basel, Nietzsche offered lecture courses on pre-Platonic philosophers for several years, and the text of this lecture series has been characterized as a “lost link” in the development of his thought. “In it concepts such as the will to power, the eternal return of the same, the overman, gay science, self-overcoming and so on receive rough, unnamed formulations and are linked to specific pre-Platonics, especially Heraclitus, who emerges as a pre-Platonic Nietzsche.” The pre-Socratic thinker Heraclitus was known for the rejection of the concept of being as a constant and eternal principle of universe, and his embrace of “flux” and incessant change. His symbolism of the world as “child play” marked by amoral spontaneity and lack of definite rules was appreciated by Nietzsche.  From his Heraclitean sympathy, Nietzsche was also a vociferous detractor of Parmenides, who opposed Heraclitus and believed all world is a single Being with no change at all.

How telling.  Where to begin?

Let’s leave the details out for a moment and bring out the broad brush to paint with.  Parmenides leads to Socrates and Plato, then of course to Aristotle, and together these can rightly be termed the foundation of western thought and even the foundation of western civilization, which later became Christian but always preserved the connection to these pre-Christian figures.

Heraclitus, on the other hand, although certainly known, was also a curiosity and a reject in western thought.

Ideas have consequences.  Ugh.

The difference is not complicated.  Heraclitus was exclusively empirical in approach, and Parmenides was exclusively rational.  Put another way, Heraclitus accepted sensory input as the only reality, and Parmenides rejected sensory input as unreal entirely.

Heraclitus and Parmenides could not have been further apart.

For more than two thousand years, though, western thought more or less starts with Parmenides and rejects Heraclitus.  Then around 1870 Nietzsche does the opposite.  What does it mean to do this?

As we have alluded to before, in posts and in comments and without purporting to take sides (although ultimately we do take sides, but that’s not relevant right now), the belief in God is rational, but in order to be rational it requires some level of rejection of the empirical, duh, because God is unseen.  Not as much rejection as Parmenides would have it, we note, but then it would be fair to describe the historical progression from Parmenides to Socrates to Plato and then to Aristotle as a gradual accommodation between the rational and the empirical, an accommodation that is possible if you start with Parmenides and reject Heraclitus but is impossible if you do the opposite.

Which is to say that if you embrace Heraclitus and reject Parmenides you will necessarily conclude that God is dead, just as Nietzsche did, because the belief in God’s existence is grounded in reason (rationality) but refuted by empirical observation.

But an astute reader will surely see the irony here:  this is, quite obviously, an entirely rational process.  It is a simple syllogistic formulation:

Everything real is empirically observable.

God is not empirically observable.

Therefore God is not real.

The effort to elevate the empirical over reason by way of a syllogism, in other words, promptly self-destructs.
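For the record, the syllogism is perfectly valid as a matter of form.  Here is a minimal sketch of it in the Lean proof language – our own illustration, with made-up names (Thing, IsReal, Observable), not anything drawn from the philosophers – and note that checking its validity is itself an exercise of the Reason, not of observation.  Which is precisely the point:

-- Purely illustrative sketch (Lean 4): the syllogism above, stated and checked formally.
example (Thing : Type) (IsReal Observable : Thing → Prop) (god : Thing)
    (h1 : ∀ x, IsReal x → Observable x)  -- premise 1: everything real is empirically observable
    (h2 : ¬ Observable god) :            -- premise 2: God is not empirically observable
    ¬ IsReal god :=                      -- conclusion: God is not real
  fun hreal => h2 (h1 god hreal)         -- the inference is pure logic; no observation enters into it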

If you go with Parmenides, on the other hand, the existence of a God becomes possible – and can even be seen as mandatory – even though it is not empirically justified.  Because natural reason.*

So our ancestors in thought put Heraclitus behind them about the 3rd century B.C. and that’s where he stayed and that’s where he belonged, an historical and intellectual curiosity but ultimately unserious.

But then Nietzsche comes along and Heraclitus is reborn and becomes the new progenitor not of western thought and civilization, but rather of post-western thought and civilization.  Which is a good way of putting it, because this new worldview has no properties of its own; it exists solely as a negation of what came before it.

Kant no doubt thought it would be helpful to set forth a critique of pure reason, but in post-modernism this is nothing but a fool’s errand.  To the post-modernist, reason is not qualified to critique reason; that is the job of psychology:

Among his critique of the traditional philosophy of Kant, Descartes and Plato in Beyond Good and Evil, Nietzsche attacked the thing in itself and cogito ergo sum (“I think, therefore I am”) as unfalsifiable beliefs based on naive acceptance of previous notions and fallacies.  Philosopher Alasdair MacIntyre puts Nietzsche in a high place in the history of philosophy.  While criticizing nihilism and Nietzsche together as a sign of general decay, he still commends him for recognizing psychological motives behind Kant and Hume’s moral philosophy:

For it was Nietzsche’s historic achievement to understand more clearly than any other philosopher…not only that what purported to be appeals of objectivity were in fact expressions of subjective will, but also the nature of the problems that this posed for philosophy.

With apologies to Professor MacIntyre, it won’t do to call this an “historic achievement”.  Ascribing baser motives to what purport to be rational arguments has always been a common rhetorical device, and we mean “common” in the most derogatory sense:  intellectually low, a childish effort to one-up one’s interlocutor without engaging what is actually being asserted.  Reason maintains that propositions stand or fall on their own, and that whatever motives may be in play – and notably, it is not necessary to deny that that might be the case – are irrelevant.  Indeed the claim of reason is precisely that it inoculates against the infection of baser motives so as to better ascertain the truth of the matter.

It is therefore not surprising that those who deny this claim of reason would revert to the posture of Thrasymachus, but it is a bit startling to see this regression characterized as an historic achievement by those who should, and probably do, know better.  That is to say, Thrasymachus was a sophist.  In the modern sense.  Nietzsche’s revisionist intellectual somersaults are not historic achievements.

So at this point the question must be asked:  why do we pay any heed to Nietzsche at all?

Well, he was bereft of any genuine achievements, intellectual or otherwise, but that doesn’t prevent him from being post-modernism’s progenitor.  Indeed that is largely the point:  the meaninglessness of post-modernism begins with, and ought to begin with, a similarly meaningless “thinker”.  That doesn’t matter, because nothing matters.  That is the sole post-modernist principle, if you will, and it’s at work here.  Nietzsche matters precisely because he doesn’t.

Now.  Does all of this blather have any significance for the usual subject matter of this little blog-project of ours?

Oh, yes.

Nietzsche’s intellectual heirs came to dominate what we now call academia in the United States even before his unsavory (if posthumous) association with Nazism in the middle of the last century.  By the latter part of the 20th century this dominance had solidified into a monolith, particularly at the more prestigious universities, so much so that any hint of dissent from the foundational premises – atheism, disdain for religion, contempt for tradition, or at least any tradition pre-dating the degeneration into post-modernism – became highly problematic.

It was impossible for his heirs, then, to discard Nietzsche – he was post-modernism’s intellectual father – so he had to be rehabilitated.  And that explains the revisionist work of Nietzsche “scholars”.

Law schools were especially vulnerable to intellectual fads and trendiness because, in the first place, in Nietzsche’s time they were brand new.  Law was one of the traditional “learned” professions, certainly, but “tradition”?  Meh.  We were busily forging the post-modern intellectual landscape out of the wreckage we ourselves had made of our intellectual past, and law schools were a trend.  A Thing.

Holmes and Nietzsche were contemporaries, but separated by language, culture and the Atlantic Ocean.  Yet how similar in outlook they were.  Talk about your weltgeist.

So Holmes gravitates towards eugenics (“…three generations of imbeciles are enough…”) during his much longer life and Nietzsche posthumously becomes a Nazi poster boy, and neither is at all surprising, given their common intellectual pedigree, which is to say they didn’t have one, or maybe more properly speaking they had an anti-intellectual pedigree.

Anti-intellectualism has always been, and remains, an aberration in Europe; but in the United States it is part of our heritage.  European intellectuals could never embrace Nietzsche as fully as American intellectuals have.  Europe has a vestigial loyalty to natural reason, even when it departs from it; America doesn’t.

Speaking of natural reason, the principle of non-contradiction is basic natural reasoning.  Unsurprisingly, Heraclitus rejected it and Parmenides surrendered to it, as any sane person will, at least to some degree.
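For the curious, non-contradiction can be stated, and proved, in a single line.  Again a minimal sketch in the Lean proof language, ours and purely illustrative:

-- The principle of non-contradiction: no proposition is both true and false.
example (p : Prop) : ¬ (p ∧ ¬ p) :=
  fun h => h.2 h.1  -- a claim together with its denial yields absurdity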

So here’s a Europe-America contrast to illustrate the point.

Both Europe and America have a contentious political debate about abortion.  (We don’t want to run down that specific political road at the moment.  Just bear with us.)  Both have wound up “liberalizing” their legal treatment of abortion since the middle of the 20th century.  But in the US, some of the debate has involved the question of whether a human fetus is a “person”, because our SCOTUS in constitutionalizing the abortion issue in 1973 held that it wasn’t.  Which under the circumstances was, you know, a staggering intellectual error.  Europeans won’t truck with errors like that but Americans will because Nietzsche and Holmes and reason doesn’t matter and it’s all about what we want and who wins the struggle.  Will to power, doncha know.

So a few years later we are confronted with our error because the principle of non-contradiction will do that to you – and never mind that even at the time the SCOTUS made its ruling no less an abortion-favorable state than New York still dealt with abortion in its penal code under the heading “Abortion, Homicide and Related Offenses” – when some asshole shoots his pregnant girlfriend in the belly intending to kill the fetus and he succeeds and he’s charged with “murder”, but murder can only be of a “person” and so his lawyer says “what gives?” but the asshole is convicted and everyone is fine with that, including the SCOTUS, because non-contradiction is just an argument and arguments are cheap.

Just as with the definition of “person”, the most elementary natural reasoning is similarly dispatched in the courts all the time.  The examples are numberless.

So this is post-modernism applied. Quite simply, it is the tyranny of the incoherent.  It is madness, which is fitting because Nietzsche himself went mad at 44 and never recovered.  He lived out his days in the care of his mother and sister.  At one point he claimed to be the creator of the world.

If “the world” were to be redefined as the American legal profession and court system the claim would hardly be extravagant, though.

_____________________________________________________________________________________

* Nietzsche and post-modernism reject “natural reason”, of course, because they have to, but there’s no way to characterize this other than bizarre and perverse.  It would be a considerable understatement to say that we humans (and indeed animals) are utterly dependent upon natural reason every day; it’s more like every waking moment of every day.  Literally every single movement and every single thought, however trivial, is the product of it.


Petition of the Day…

on SCOTUSblog is…Long v. Pfister.

Ugh.  SCOTUS keeps revisiting this issue without clearing it up.  Do prosecutors get to lie and cheat to “obtain a conviction” without violating the defendant’s right to due process of law?  Such a hard, hard question.

Of course, merely being a SCOTUSblog petition of the day doesn’t mean there will be a grant.  But with an en banc opinion by Judge Easterbrook of the 7th circuit being appealed and Kirkland & Ellis representing the poor schmuck, it’s a pretty good bet.


Long v. Pfister And Agendas And Footnotes

When Judge Easterbrook asks this red-herring question in particular:

Must the prosecutor correct false testimony when defense counsel already knows the truth?

or when he refers to “Napue and its successors” in another red-herring question, or when he refers to the “Napue-Giglio rule”, he is committing the error of conflating Mooney cases with Brady cases.  And we say “error” because it’s not an arguable point.  Chronology, not capable of dispute and entirely independent of the matters actually under dispute, demonstrates this absolutely.

Napue was 1959.  Brady was 1963.  Napue cannot possibly be a Brady case.  Not to mention (again) that the whole Mooney line – that is, Mooney, Pyle, Alcorta and Napue – is cited in Miller v. Pate in 1967, making Miller the last Mooney case.

And Miller doesn’t cite Brady.

That is, Miller proves, beyond all rational questioning, that the Brady line of cases and the Mooney line of cases are distinct, even if related, because it post-dates Brady and doesn’t cite it even though it cites all the previous Mooney cases.

What about Giglio, then?

Giglio was 1972.  Giglio cites Napue due to the factual similarity:  both involved the withholding of impeachment evidence, the impeachment evidence being that a deal had been made with a prosecution witness.  But that doesn’t make Giglio one of Napue’s “successors”.  In fact, Napue was one of Mooney’s successors, and has no “progeny” of its own.

The Giglio opinion arguably conflates Brady and Napue, true enough:

We granted certiorari to determine whether the evidence not disclosed was such as to require a new trial under the due process criteria of Napue v. Illinois, 360 U. S. 264 (1959), and Brady v. Maryland, 373 U. S. 83 (1963).

But this gets cleared up a few pages later:

As long ago as Mooney v. Holohan, 294 U. S. 103, 112 (1935), this Court made clear that deliberate deception of a court and jurors by the presentation of known false evidence is incompatible with “rudimentary demands of justice.” This was reaffirmed in Pyle v. Kansas, 317 U. S. 213 (1942). In Napue v. Illinois, 360 U. S. 264 (1959), we said, “[t]he same result obtains when the State, although not soliciting false evidence, allows it to go uncorrected when it appears.” Id., at 269. Thereafter Brady v. Maryland, 373 U. S., at 87, held that suppression of material evidence justifies a new trial “irrespective of the good faith or bad faith of the prosecution.”

Emphasis, as we say, supplied – on the “good faith or bad faith” language.  You see, the proper distinction between the Mooney line of cases and the Brady line is that good or bad faith is irrelevant in the latter, but the very essence of the former.  On that particular point the two lines of cases could not be further apart.  That is, that particular point is the very thing that distinguishes them.  And you don’t have to take our word for it (see pp. 47-49).

So, it’s not as if the Giglio court was really confused about the difference between Mooney and Brady; they just expressed themselves poorly in the first paragraph of the opinion.  After reading the rest of the opinion, no person of reasonable intelligence could maintain in good faith that Giglio was anything other than…a Brady case.

But if you graft Brady onto Napue – which is a Mooney case – then you graft Brady’s limitations onto Napue as well, and of course by extension onto Mooney also.  Then you have limited Mooney by stealth.  And that’s what Judge Easterbrook is trying to do in Long v. Pfister, and what Justice Rehnquist tried to do in Bracy and Albright, and what the nation’s prosecutors (as a group, not every single one of them, of course) have been trying to do for decades.  This effort has produced such lamentable results as Albright v. Oliver, a plurality opinion from a fractured SCOTUS where Justice Rehnquist basically sneaks his Mooney-limiting agenda into a footnote.

And here’s what limiting Mooney means:  the government can lie and cheat to get a criminal conviction and it doesn’t violate due process.

We do not believe such a result is tolerable in a free society.  And we don’t know how any sane person could disagree.  But even if some miscreant prosecutors, police and judges (repeat ourselves?) do disagree – believing perhaps that a little bit of lying and cheating is acceptable if it doesn’t affect the outcome, or some such – they should argue the point honestly and straightforwardly, taking the position that they think Mooney and its progeny were wrongly decided.

But then their honesty is the whole point in issue, isn’t it?

Ugh.
