Wednesday, August 8, 2018

Putting Clothes on the Ghost: Why formulating and promulgating a "school of thought" is an admission of intellectual defeat



To Exaggerate is Human, to Speak Plainly, Divine


It is an unfortunate likelihood of human existence that you know both more and less than you realize you do. The proof is both statistical and anecdotal, perhaps one of the symmetries most difficult to mistake in the entire scope of our existence. Many of us may never completely integrate the realization that the face we passed on the road, the forgotten punchline, the exact location of our cell phone when we needed it, were all stored electrically and/or biochemically on our physical or virtual person, and thus even when we couldn’t remember that frustrating something, we still in some comforting sense “knew” it. However, in a moment of deflated ego we might ponder: is it an intentional quirk of definition that we can still “know” things we can’t exactly remember?


--- Academic/Scientific Crime or Misdemeanor?



The author speaks not as an initiate, merely as an observer - but one may also notice that in the various disciplines of the Humanities there can be argued to exist a certain level of dedication to scholarship and textual accuracy among the various “schools”, which then go right on to extol a particular set of assumptions about their field of study and to denigrate or ignore the assumptions of other schools (a divergence that seems mainly the result of semantic difficulties arising from inevitable differences in experience and the subsequent classification of that experience). Within the sciences there exists a similar unifying commitment to the scientific process, and a similar (though perhaps more limited) distaste for others’ theory and practice. One might surmise that this entirely human state of affairs results from competitive monetary pressures, and of course ego, interacting with the pragmatic realities of research and public events (and other related misfortunes) - and perhaps not go far wrong.


It is rarer to encounter the scientist or academic who both privately and publicly admits to the basic competence of those dissenting from their own august opinion, but in times of plentiful funding a core conviviality may rise to the top, and might just tamp down the rougher exposed surfaces of professional interaction, mixing rather freely with a pragmatic professional courtesy, and perhaps even a healthy dollop of personal respect between colleagues. Competitive pressures across a host of domains are assuredly known to wax and wane. It could happen.


The human mind seeming such a provincial entity, however, each one's individual owner appears most likely to opt for willful disagreement instead of a more effortful attempt at understanding the nuances of a genuinely differing point of view. We assume this to be ingrained adult behavior, indicative of holding a distinct (if not reasoned) opinion. The author will, time permitting, strive to show clearly that it is neither.
---

Ah, did you suspect you'd had an original thought somewhere back there? No? Hardly possible, is it, given the sweeping immense wash of existence that came before now? We're not even sure this is our first universe - have you checked? So there's a definish chance that nothing'll ever be new under any sun. But the word "unique" has had its singular quality droned out of it by relentless, profligate adverbiage, so no real harm done if your "unique" conception is impossibly more or less "unique" than what a silicon-based spider-crab a billion years and ten billion parsecs from here already came up with. It's all rather relative in the end, and we should entertain the happenstance that "unique" as a concept can and probably should be written blithely off as axiomatically unworkable. Left unchecked it may even insult the very notion of intellectual perspective.

With that idea left sufficiently entertained in a quiet corner of the room, we can then move on to the question of who gets to choose what's acceptably interesting if nothing's "unique" per se. Surely in every era there exists a core system of rational beliefs that transmits a critical ontology, an essence of being, between civilizations, yes? How can we even pretend to know, for instance, that the ancient Greeks were referring to eternal, unchanging forms rather than some faddish, choking notion of perfection that we infinitely more modern minds recognize as laughably invalid? Need we extend every other culture - okay, let's be honest, every other mind - the generous benefit of the doubt implied when we acknowledge that a net of words has successfully entrapped some unfortunate intellectual actuality that's truly common between us? In the words of a clear-thinking writer just a few miles down the road from here, we are forced by honest appraisal to ask:

how do we know that X is A, B, and C? If we answer this by saying that we know what A, B, and C are, and if we have to explain our understanding of A, B, and C in a similar way, there is no way out. 1

Faced with "no way out," who wouldn't want to throw their lot in with Plato's eternal forms? To many of us their appeal is undeniable:


They are independently existing entities whose existence and nature are graspable only by the mind, even though they do not depend on being so grasped in order to exist.


Ahhhh. A safe place to anchor our boat in hard epistemological seas. We're guaranteed there's something that predates our fallible knowing - you could almost argue that grasping an eternal form represents a kind of a priori knowledge that's independent of bumbling experience, and therefore trustworthy, eh?

Rather further to the west someone argues:


A priori knowledge is the condition of the possibility of knowledge in general. 2



Yes, these are related statements. We're going back a couple of millennia to argue that here you can trust something in particular [sic] to be specifically true [sic] because it was around before we got here and should therefore outlast us. A priori knowledge may have had all manner of nonsense attached to it, in essentially the same fashion as the practice of war has been obliged to tow the precepts of the Geneva Convention around wherever it goes - but the fact remains that it's a type of knowledge that might actually earn that elusive badge of honor known as "unique." Simply put, with a priori knowledge you either got it or you ain't.

Let's be a little less silly about this assertion for a moment. A philosopher - the definition of philosopher being "that class of cynical idealist who believes that if he makes his words shriek in epistemological indignation consistently and mercilessly enough, he has arrived at an iteration of Truth which is not shameful in the eyes of The Infinite and his fellow verity-seekers" - is basically saying this: if he leans on a priori knowledge even a little bit, and you can't see things the way he does, then you're wrong. Or (as Aristotle delicately put it), you've got the brains of a plant. Either way, game over. You lose, Alfie.


That sounds familiar. I recall hearing that in several churches I only entered a single time (my parents were pretty reasonable about such things). Those bodies of worship seemed incapable of brooking real inquiry, of sheltering the freethinking monk of David Drake and Eric Flint's An Oblique Approach (link).

I hear hackles going up in geographically diverse places at that one - which is good.

You get my point now? What kind of brain are you running up there, if you don't let it form reliable, high-quality conclusions? Sure, you can jump to them instead - and there are plenty of charismatic, vocal cheering squads who will clap you on the back and personally escort your faithful carcass up to the Gates of Heaven … except they cannot give you the answer you need the most: that yes indeed, it's all real. Of course that's where you go next. Life's fair after all, and you loved, suffered, and were snuffed out of existence for a good reason.

I'm sorry - it's important to me that I believe what I can. Not what I should, or what I want, but what I'm CAPABLE of believing. It doesn't usually follow that believing what's comforting means I'm believing something that's true.


(editing note: Does it require systematic attachment damage to recruit an Islamist army of young boys? That's a dandy tangent to hang on this framework, eh?)




Sources:

1 - http://faculty.washington.edu/smcohen/320/thforms.htm

Downloaded 9/11/2015

2 - http://www.csudh.edu/phenom_studies/study/glossary.htm

Downloaded 9/11/2015

