Wednesday, August 8, 2018

Putting Clothes on the Ghost: Why formulating and promulgating a "school of thought" is an admission of intellectual defeat



To Exaggerate is Human, to Speak Plainly, Divine


It is an unfortunate likelihood of human existence that you know both more and less than you realize you do. The proof is both statistical and anecdotal, perhaps one of the symmetries most difficult to mistake in the entire scope of our existence. Many of us may never completely integrate the realization that the face we passed in the road, the forgotten punchline, the exact location of our cell phone when we needed it, were all stored electrically and/or biochemically on our physical or virtual person, and thus even when we couldn’t remember that frustrating something, we still in some comforting sense “knew” it. However, in a moment of deflated ego we might ponder: is it an intentional quirk of definition that we can still “know” things we can’t exactly remember?


--- Academic/Scientific Crime or Misdemeanor?



The author speaks not as an initiate, merely as an observer - but one may notice that across the various disciplines of the Humanities the various "schools" share a certain dedication to scholarship and textual accuracy, yet each goes right on to extol its own particular set of assumptions about its field of study while denigrating or ignoring the assumptions of other schools (a habit that seems mainly the result of semantic difficulties arising from inevitable differences in experience, and in the subsequent classification of that experience). Within the sciences there exists a similar unifying commitment to the scientific process, and a similar (though perhaps more limited) distaste for others' theory and practice. One might surmise that this entirely human state of affairs results from competitive monetary pressures, and of course ego, interacting with the pragmatic realities of research and public events (and other related misfortunes) - and perhaps not go far wrong.


It is rarer to encounter the scientist or academic who both privately and publicly admits to the basic competence of those dissenting from their own august opinion, but in times of plentiful funding a core conviviality may rise to the top, and might just tamp down the rougher exposed surfaces of professional interaction, mixing rather freely with a pragmatic professional courtesy, and perhaps even a healthy dollop of personal respect between colleagues. Competitive pressures across a host of domains are assuredly known to wax and wane. It could happen.


The human mind seeming such a provincial entity, however, its individual owner appears most likely to opt for willful disagreement rather than the more effortful work of understanding the nuances of a genuinely differing point of view. We assume this to be ingrained adult behavior, indicative of having a distinct (if not reasoned) opinion. The author will, time permitting, strive to show it to be neither.
---

Ah, did you suspect you'd had an original thought somewhere back there? No? Hardly possible, is it, given the immense wash of existence that came before now? We're not even sure this is our first universe - have you checked? So there's a definish chance that nothing'll ever be new under any sun. But the word "unique" has had its singular quality droned out of it by relentless profligate adverbiage, so no real harm done if your "unique" conception is impossibly more or less "unique" than what a silicon-based spider-crab a billion years and ten billion parsecs from here already came up with. It's all rather relative in the end, and we should entertain the happenstance that "unique" as a concept can and probably should be written blithely off as axiomatically unworkable. Left unchecked it may even insult the very notion of intellectual perspective.

With that idea left sufficiently entertained in a quiet corner of the room, we can then move on to the question of who gets to choose what's acceptably interesting if nothing's "unique" per se. Surely in every era there exists a core system of rational beliefs that transmits a critical ontology, an essence of being, between civilizations, yes? How can we even pretend to know, for instance, that the ancient Greeks were referring to eternal, unchanging forms rather than some faddish, choking notion of perfection that we infinitely more modern minds recognize as laughably invalid? Need we extend every other culture - okay, let's be honest, every other mind - the generous benefit of doubt implied when we acknowledge that a net of words has successfully entrapped some unfortunate intellectual actuality that's truly common between us? In the words of a clear-thinking writer just a few miles down the road from here, we are forced by honest appraisal to ask:

how do we know that X is A, B, C? If we answer this by saying that we know what A, B, and C are, and if we have to explain our understanding of A, B, and C in a similar way, there is no way out. 1

Faced with "no way out," who wouldn't want to throw their lot in with Plato's eternal forms? To many of us their appeal is undeniable:


They are independently existing entities whose existence and nature are graspable only by the mind, even though they do not depend on being so grasped in order to exist.


Ahhhh. A safe place to anchor our boat in hard epistemological seas. We're guaranteed there's something that predates our fallible knowing - you could almost argue that grasping an eternal form represents a kind of a priori knowledge that's independent of bumbling experience, and therefore trustworthy, eh?

Rather further to the west someone argues:


A priori knowledge is the condition of the possibility of knowledge in general. 2



Yes, these are related statements. We're going back a couple of millennia to argue that here you can trust something in particular [sic] to be specifically true [sic] because it was around before we got here and should therefore outlast us. A priori knowledge may have had all manner of nonsense attached to it, in essentially the same fashion as the practice of war has been obliged to tow the precepts of the Geneva Convention around wherever it goes - but the fact remains that it's a type of knowledge that might actually earn that elusive badge of honor known as "unique." Simply put, with a priori knowledge you either got it or you ain't.

Let's be a little less silly about this assertion for a moment. A philosopher is basically saying this - the definition of philosopher being "that class of cynical idealist who believes that if he makes his words shriek in epistemological indignation consistently and mercilessly enough, he has arrived at an iteration of Truth which is not shameful in the eyes of The Infinite and his fellow verity-seekers" - basically, if you're a philosopher and you lean on a priori knowledge even a little bit, then if you can't see things the way he does, you're wrong. Or (as Aristotle delicately put it), you've got the brains of a plant. Either way, game over. You lose, Alfie.


That sounds familiar. I recall hearing that in several churches I only entered a single time (my parents were pretty reasonable about such things). Those bodies of worship seemed incapable of brooking real inquiry, of sheltering the freethinking monk of David Drake and Eric Flint's An Oblique Approach (link).

I hear hackles going up in geographically diverse places at that one - which is good.

You get my point now? What kind of brain are you running up there, if you don't let it form reliable, high-quality conclusions? Sure, you can jump to them instead - and there are plenty of charismatic, vocal cheering squads who will clap your back and personally escort your faithful carcass up to the Gates of Heaven … except they cannot give you the answer you need the most: that yes indeed, it's all real. Of course that's where you go next. Life's fair after all, and you loved, suffered and were snuffed out of existence for a good reason.

I'm sorry - it's important to me that I believe what I can. Not what I should, or what I want, but what I'm CAPABLE of believing. It doesn't usually follow that believing what's comforting means I'm believing something that's true.


(editing note: Does it require systematic attachment damage to recruit an Islamist army of young boys? That's a dandy tangent to hang on this framework, eh?)




Sources:

1 - http://faculty.washington.edu/smcohen/320/thforms.htm

Downloaded 9/11/2015

2 - http://www.csudh.edu/phenom_studies/study/glossary.htm

Downloaded 9/11/2015


Tarantino / Hip Deep in Hyperbole

Why we may love Tarantino's "Inglourious Basterds" at our peril

One - Having just read "When Jews Attack," a very clever Newsweek review of Quentin Tarantino's recent "Inglourious Basterds," I thought about whether I would go and see this latest frothy tankard of Hollywood soft-core porn. I want to, because friends have said it's a lot of fun. But having read that review, I wonder if it might be a "Fun With Scapegoats!" kind of fun rather than a relaxing, uplifting, ennobling, enlightening, perhaps even bringing-the-human-race-together kind of fun. I wonder if it's the kind of fun I used to have when I stuck caterpillars down ant holes and watched them flail for their lives as the ants dragged them inexorably into their horrible tunnels. I couldn't exactly call those caterpillars Nazis (much less the ants), but I'm sure I could come up with something in order to justify the fun I was having - they're evil bugs, they destroy the plants we love and enjoy, er, something like that. There needs to be some reasonable rationalization in place in order to shut up that "hey, wait, isn't this wrong?" voice. In like fashion scapegoating truly works wonders, and Nazis have been making easy ones since before V-E day. A textbook case: they're easy to hate without compunction, dehumanize without consequence, and eliminate on film without guilt. They did it to themselves, you might say. Hm - am I just imagining I heard something like that in a dodgy translation of a scratchy old Hitler speech?

Hitler sure got people going. You could say he went from mixed-media to mass media in one grand, scheming leap. If you know German, aren't put off by his ludicrous posturing, and overlook his historical reputation as one of Satan's tools on Earth (if you'll pardon the incongruous-for-me Christian-flavoring), he gives the outward appearance of an inspiring figure, fighting for the rights of a beleaguered, trod-upon and indignant people. Factoring for the lies and maneuvering, he tailored his message for his audience in a time of great opportunity and got all the political power with which that audience felt like rewarding him. Of course, it wasn't just his audience who paid dearly for that zealous caprice.

Two - So Tarantino has learned his craft to at least the same degree of technical proficiency. Similarly, he has always used anger and vengeance as both vehicle and subtext in his work, and his final solutions are also similar, though small in scale, and incomparably more personal. But he's a fantasist, not a demagogue. Due simply to his tools, his methods, and the effect of his work, could you ever say Tarantino's a highly successful imp of Satan as well? Such an argument would surely imply that it's lucky his political ambitions are limited - and that he has a clear idea (up to now) of the dividing line between entertainment and incitement. And of course I say that as a card-carrying agnostic. So perhaps if I believed in Satan, and if Quentin Tarantino's campaign played rousing clips of his films to cheering supporters during a successful run for president, I'd have to conclude the Prince of Darkness was proud of his brown-eyed boy. But not yet, bub.

Three - Why would I so malign an American success story - a man who was once a video store clerk and who now commands such cultural power? I'd have to go with an ends-justify-means argument there. Do you really have to see such glorification of violence, or could you get by with a slightly less potent and clinically devastating brew for your cinematic grog? I say clinically devastating here because if the research hasn't yet been done to establish the likely damage to psyches and societies, it probably should be. So let me go out on a limb and assert that justifying violence (even as satisfying entertainment) is probably not a reliable way of addressing society's ills. It's got a bit of a reputation, hasn't it?

Four - So am I advocating censorship of this (or any) film? Hell no - but at the other extreme I don't advocate cigarettes for babies, either. I'm getting to be a big fan of balance and moderation the longer I inhabit the planet; maybe I'm actually learning something, or maybe I'm just getting more deeply invested in the process, I don't know. But I think people ought to have a clue what they're gorging on if it's going to change their brain chemistry (as the research does in fact show for intensely violent images).

I think you should know as much about what people are doing to you as the people who are doing it. That way you can use any part of your brain you like (not just your amygdala) as you're considering whether to vote the Pulp Fiction Party straight ticket.

Ok, I know, we're hip deep in hyperbole. But to twist the phrase "better the devil you know" - I would observe, better you know the devil. Speaking, as I mentioned, agnostically of course.

Hm. He's got a new one called "The Hateful Eight." Want to bet it caters to the Vengeance Instinct?