Hallowe'en 2007 Graphic #3
This graphic may be reposted with attribution.
The Dark Wraith thanks readers for not running away screaming like a bunch of Democratic politicians.
Modernity and a Teacher's Answer from the Cave of Antiquity and Irrelevance
My own efforts to convey to these specialists a few of the issues surrounding the use of e-mail as an integral part of courses have fallen for the most part on deaf ears. For example, not one of these computer "experts" (such as what passes for an expert on the budget of a public institution of higher learning) has even the slightest clue that there might be an issue with handing a suspect behemoth like Google wholesale access to faculty-student communications. For another example, the folks in ITS seem put out by suggestions that professors are genuinely and legitimately concerned with the possibility that ITS technical people are encroaching on the absolutely sacred ground that is the academic freedom a professor has in his or her own classroom.
In the continuing effort to harp on the theme of teachers at the college getting with the program, the chief of ITS late last week sent out a mass e-mail to every faculty member; the subject line of the message was, "Worth a Look," and the body of the message was nothing but a link to a YouTube video, which is herewith presented below.
Readers are encouraged to watch the video in the entirety of its four minutes and forty-four seconds before proceeding to the remainder of this article, which resumes with the e-mail message I sent out this afternoon as a "Reply All," meaning that everyone who had been sent the original e-mail message with the link to the YouTube video has now received my response to it.
Enjoy the show.
Now, this is the message I sent to every faculty member. It also went to several deans and the director of education, those being people to whom I did not realize I was broadcasting until about two seconds after I had hit the "Send" button.
I am deeply unimpressed by the latest in centuries of calls for a "new" way for a "new" age. Until such time as we can effectively teach students the fundamentals of coherent, rational thought processes conveyable through constructively coherent writing, the digital age can find its acolytes and promoters in another teacher's classroom. Google is not part of the solution; neither is Wikipedia; neither are any of the legion of online and other innovative ways to "connect" and "collaborate." To the extent that they are treated as other than convenient tools of modernity, they are all part of the problem, and that problem has persisted from age to age. (Here's a hint: the problem, for lack of a more diplomatic wording, is called ignorance with a side of cultural sloth.)
To represent that those signs the students in that act were displaying were shocking messages from the post-modern world of the young is to deny that students have been bemoaning their teachers, their assignments, and their very lots in life since the beginning of time... or, at the very least, since the first professor held up a stone tablet and called it "Neolithic PowerPoint."
Nothing is new. The only differences from one generation to the next are the particulars of the "solutions" that avoid the hard-core duty of teachers to teach well, test rigorously, and show compassion while awaiting their students' slow, uncertain decision about whether to find an individuating reason to succeed or a tired excuse to fail.
Forgive me my bluntness. I'm a teacher. (I'm also a blogger.)
Having sent the above e-mail message to everyone, I awaited what I expected to be the brutal backlash against my Luddite-oriented lifestyle. I was quite surprised at first to find that the messages to me were uniformly favorable, some even glowing. Words such as "erudite" were used. One commenter wrote to me, "I was beginning to think I was alone."
Another faculty member wrote, "Hurrah! I have been meaning to reply to the video but wasn't sure how to put my thoughts into words without sounding like something out of the stone age. I found your response on the money. I'm sorry to think the next generation cares more about the internet and text-messaging than reading a book or actually researching a paper without the benefit of the internet."
I was most happy that my thoughts had resonated with so many of my peers. How uniformly favorable were the comments I was receiving!
It then occurred to me that I had become the academic equivalent of toast.
In higher education, praise is readily at hand, generally conveyed in unabashedly kind words, friendly banter, and the occasional, stale doughnuts left over from meetings among important people. Praise comes quickly and generously.
Retribution, on the other hand, comes slowly, in its own time, in its own way. Almost always, it comes from behind, and its effect is as a blade of unforgiving certitude. Revenge in academia has both patience and stealth.
How do I know I am going to face the wrath of a few who matter? That's easy: I received compliments from many who do not.
I shan't concern myself with that eventuality, though. The worst that can happen to me is the punishment which has already occurred, perhaps the greatest curse and the highest reward a teacher could anticipate. For the sin of speaking my mind (indeed, for the far worse error of doing so in such a manner and tone that I was for a moment actually heard) I shall remain in obscurity.
Economic poverty and bouts of self-condemnation are just the gravy on the banquet meal of hierarchical intolerance at once so vigorously enforced and so roundly denied by the practitioners of group-think in higher education.
Here at The Dark Wraith Forums, of course, I can speak my mind without much concern; and that is the most delicious of ironies for one who cannot countenance this "Information Age" that is inexorably separating me from relevance in my own profession. It is, in fact, irony of the highest order, irony worthy of a good belly laugh.
The Dark Wraith will try to muster that laugh once the smell of toast has dissipated.
The Victim and His Victory
In the comment thread at BlondeSense, I offered this:
Without so much as a bat of the eye, the Federal Reserve took such action as was necessary (including multiple infusions into the top end of the banking system of tens of billions of dollars) to save the credit markets from the wrath of the free market.
By allowing banks to increase their exposure to 30% of assets to buoy their irresponsible credit operations, the Federal Reserve has also laid to rest the last vestiges of enforced prudence embodied by the old Glass-Steagall Act.
By pretending to exercise tight monetary policy while continuing to print money hand over fist to finance the extremist Bush Administration's tax cuts, wars, and large-business welfare programs, the Federal Reserve has savaged any remaining credibility it might have had, such as it was under the increasingly capricious and partisan leadership of former Fed Chairman Alan Greenspan.
Now, let's be honest for a second. Who here thinks (I mean, really, really believes) that any rich person, spitefully, blithely hateful to the working class, or pandering and blubbering to it, fundamentally has even a clue?
Let's see a show of hands. Does Bush care? That's not the question.
How about Hillary?
How about Barack?
How about Johnnie Boy?
No, none of them get it. The virtuous center has passed from the American political landscape, and all that remains is a wasteland of hateful Right-wingers and clueless Democrats who can't even so much as slow down, much less stop, a monstrosity of a useless, debilitating war.
That fellow down there in Houston? He finally figured it out. When hope is gone, all that remains is the one thing almost no authority can prevent a person from having:
The right not to die on his knees.
The Dark Wraith has spoken.
Hallowe'en 2007 Graphic #2
Hallowe'en 2007 Graphic #1 was certainly decent, but this one, which may be reposted with attribution, should be a good reminder to people everywhere that no grave can hold the brain-dead as long as there is still misery, fear, and wretchedly incompetent policy to be visited upon the world of the living.
Remember to lock your doors tonight.
The Dark Wraith will deliver yet one more graphic for this Hallowe'en season.
Theory of the Firm, Industry Structure, and Regulation, Part One
The follow-up question posed was thus: "[W]ould big business agree with him? Or, would they prefer the economy as it is now, with less regulation, and less oversight?" This edition of Pulp Economics is the first in a three-part series that begins by addressing the matter of what 'big business' seems to want in terms of a lax regulatory environment versus what is not only preferable for both workers and the economy, but also perhaps more desirable, at least in some cases, for businesses themselves in large-scale industries dominated by only a few companies. A widely applicable answer to what business wants versus what it needs must necessarily address different industry structures, even when discussing 'big business,' since at least several broad and fundamentally different kinds of firms populate the large-scale enterprise landscape. To address the different types, they must be categorized and distinguished, which will be done in the second part of this series. The third part will then use the classification system so motivated to demonstrate a somewhat surprising result for a certain type of industry, a result which renders the very concept of a "free," private market moot in the instance.
For this first part of the series, however, the focus will be on introducing and demonstrating the use of a few key economics concepts in industries dominated by a relatively small number of very large firms. The emphasis will be on how they naturally tend to operate in a relatively free, unregulated environment (one they might seem to desire) and the consequences to them of that freedom from a government exercising control over the competitive conditions they face.
What big business wants and what it needs are two different things; and what big business thinks it wants and what it really wants are two different things, also. Right now, large-scale business enterprises are simply delighted with the regulatory environment, although parts of the old system are still functioning, much to the chagrin of those who are still under the thumbs of those regulatory bodies. That general sense of joy notwithstanding, business has always had a love-hate relationship with regulation because, while there was plenty to dislike about administrative agencies courts were reticent to control, establishing strongly enforced, consistent, well-thought-out ground rules that every business had to obey was a godsend to business as a whole because it created a less risky, less speculative environment in which day-to-day and year-to-year operations could be managed.
Even big companies are beginning to grasp that being on their own (left to their own devices to deal with certain forces to a greater or lesser extent beyond their control) has created a fairly brutal environment, especially in the wide swath of corporations that operate in what economists call "monopolistic competition," where short-term economic profits can be garnered through product differentiation, while long-term competitive entry absorbs those short-term gains. In this sector, a set of clear, consistent, well-enforced rules applied by overseeing regulators at the federal level might be glad news; for one thing, the federal government exercising the full extent of its authority under the so-called "commerce clause" of the U.S. Constitution keeps states from getting into the act and passing crazy-quilt patchworks of legislation enforced by sometimes less professional, sometimes more parochial state officials. That "commerce clause" (the clause that gives the federal government broad power to regulate commerce within the U.S.), when properly utilized by the government and applied by the federal courts, allows businesses across the country to understand that there is a uniformity of procedural and operational standards from one coast to the other and from one state to the next. No business need concern itself with the several states having too-lax or too-aggressive enforcement within their own borders, and no business need worry about some competitor using a loose-regulation state as a platform from which to project unlawfully uncompetitive practices upon interstate competitors.
On the other hand, if the discussion is about industries where market concentration reaches what economists describe as "oligopoly" or possibly even near-"monopoly," there is going to be little cheer in board rooms for any level of regulation since the industries can self-regulate by one or another mechanism broadly called "collusion" (perhaps less harshly described as "congenial competition" or maybe even "touch football"). Unfortunately, wide exemptions in anti-trust laws have always allowed for collusive behaviors under the general banner of "self-policing" or some other such term. This is the case in industries such as major league sports, city newspapers, HMOs, and others. In some cases, this kind of internal regulation might work, but only when there remains a federal regulatory framework that can and will come down as needed on the self-policing bodies and those they are supposed to regulate.
The problem becomes deeper with larger and larger scales of enterprise. One of the most fascinating places to look is in industries where economies of scale extend to such large scales of production that the industries become dominated by "natural monopoly" types of companies, which will be surveyed in Part Three of this series.
Just short of pure natural monopolies is what can be an ugly world of brutish competition that quite frequently has undesirable effects both upon the companies therein and upon the wider, national economy. Such industries were arguably far better off when a deeply committed, federal regulatory environment allocated market share and oversaw pricing while maintaining an iron fist of regulatory control over the corporate beneficiaries. The balance of the present article will survey the characterizing cost structure of firms that are "nearly," but not quite, natural monopolies and how this cost structure can lead to dire long-run consequences for typical firms within such industries left to their own rational incentives and the harsh realities of more-or-less "free market" action.
The term "nearly" qualifies the description of certain firms because of the extent to which long-run economies of scale are realized. A firm is said to be realizing economies of scale if, as the company produces more output, the average cost (that is, per-unit cost) of the output declines. For example, if the per-unit cost for a firm's first 10,000 units was $8.25, but the per-unit cost for that firm's first 12,000 units was $7.95, then the company was realizing economies of scale between 10,000 and 12,000 units. Economies of scale are quite frequently the result of huge, up-front costs, the kind that exist even before output begins: as more and more units of output are produced, those massive start-up costs get spread, on average, over more and more units. Economies of scale are best considered a long-run kind of concept; most companies for a while realize falling average costs, but it's what happens over a longer period of time (the forces that shape the long-run structure of an industry) that should be the focus of policy at the government level, even though no firm, by itself, can or should think about the "long run" so long as the short run, attended as it is by everything from marketing products to meeting payroll, is nipping at the heels of executive decision-making.
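The numbers in that example can be checked directly. Here is a minimal sketch in Python; the 10,000- and 12,000-unit average costs are the article's own figures, and the implied marginal cost of the extra 2,000 units is derived from them:

```python
# Per-unit (average) costs from the example in the text.
avg_cost_10k = 8.25   # average cost over the first 10,000 units
avg_cost_12k = 7.95   # average cost over the first 12,000 units

# Recover total costs, then the implied cost of the extra 2,000 units.
total_10k = avg_cost_10k * 10_000            # $82,500
total_12k = avg_cost_12k * 12_000            # $95,400
incremental = total_12k - total_10k          # $12,900 for units 10,001 through 12,000
marginal_per_unit = incremental / 2_000      # about $6.45 per additional unit

# Because the implied marginal cost (about $6.45) is below both averages,
# each additional unit pulls the average down: economies of scale.
assert marginal_per_unit < avg_cost_12k < avg_cost_10k
```

The point of the exercise is that the falling average is not an assumption; it is forced by the marginal cost sitting below it.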
Although Part Two of this series will reinforce this point, average costs for a company are driven by what economists call marginal cost: the cost of the very last (or very next) unit produced. If the cost of the last unit produced is less than the average cost of all units produced so far, then the cost of that last unit will pull down the average cost. On the other hand, if the cost of the last unit produced is greater than the average cost of all units produced, then the cost of making that last unit will drive average cost up.
If that explanation of the relationship between average cost and marginal cost doesn't mean much, think about it this way. Suppose a bunch of people take a test, and the average comes out to be, let's say, 78 percent. In other words, the overall, per-person test score was 78. Now, suppose one more person takes the test, and this person (being, for example, a neo-conservative Republican) gets a 62 percent. His score, which is the last (or "marginal") score, is going to pull down the overall average when it gets entered into the calculation. Thus, when the last is below the average, it causes the average to fall. On the other hand, if the overall average was 78 percent to start with, and someone then took the test and scored, say, a 90 percent, that score would pull the average up. Thus, when the last is above the average, it causes the average to rise. This is why economists draw average and marginal cost curves as depicted at left, below.
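The test-score illustration is easy to verify with a few lines of Python. (The class size of 20 is an assumption for concreteness; the article specifies only the 78-percent average.)

```python
# Suppose 20 test-takers average exactly 78 percent.
scores = [78.0] * 20
baseline = sum(scores) / len(scores)              # 78.0

# One more person scores 62: the marginal score is below the average,
# so it drags the overall average down...
avg_with_low = sum(scores + [62.0]) / 21          # about 77.24

# ...while a 90 (above the average) would have pulled it up instead.
avg_with_high = sum(scores + [90.0]) / 21         # about 78.57

assert avg_with_low < baseline < avg_with_high
```

Swap "scores" for "costs" and the same arithmetic explains why a marginal cost curve below the average cost curve means the average is falling.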
Notice how the graphs show a mathematically necessary relationship between marginal and average: when the marginal cost of the last unit produced is under the per-unit cost, average cost is being pulled downward; but when the marginal cost of the last unit produced is higher than per-unit cost, average cost is being pulled upward. This is not some peculiarly economic phenomenon; after all, the example above was with test scores, not costs. The relationship between marginals and averages is purely an arithmetic result; it's just that this mathematical phenomenon plays a powerful role in how costs shape not just the decision-making within the individual firm, but also the overall structures of industries.
In the bestiary of industries prowling the planet, there are more than a few where extraordinarily steep fixed costs make per-unit costs very high at low output levels. For the most part, the cost of making the goods, themselves, isn't all that bad; it's just those up-front ("overhead") costs that are the killers, and it is these costs that create barriers to entry by young, start-up firms. That means the typical company in such an industry would, at low output levels, have its average cost curve way up and a marginal cost curve far below it. Recall from above that, in such a situationwhere marginal cost is below average costthe result would be a falling average cost curve. Not only that, the firm would experience falling average costs for a long while as it produced more and more and more, just because it would take a whole lot of output for per-unit cost to get down to the level of the cost of the last unit produced. That's the same thing as saying that the typical firm in such an industry would realize economies of scale (falling average cost per unit) over an enormous range of production levels, as depicted in the graph below, which shows a classical, "U"-shaped average cost curve.
The graph above exemplifies the usual situation, one where the marginal cost curve, hidden as it is in this picture, finally begins to close in from below on the average cost curve, slowly coming up to finally meet it, and then pulling above it, thereby ultimately drawing average cost per unit upward. The relatively flat part of the long-run average cost curve above might go on over quite a range of output levels (a range of output levels where constant returns to scale are being experienced), or the relatively stable per-unit cost might not exist over more than a quite narrow range; but it is depicted above to show all the possible long-run situations a firm might encounter. Sooner or later, though, the cost of the last unit, in many cases, will be greater than average cost per unit, and that is when a company is said to be experiencing diseconomies of scale. At what point this happens, if ever, is entirely dependent upon forces only partially within the control of the executive management of the company, itself.
If an industry is such that companies can configure themselves in a way so that their long-run average cost curves keep declining clear up to very large levels of output, the market will probably be dominated by these big behemoths since, the bigger they get, the cheaper the next unit is for them to produce. As long as such companies are realizing economies of scale, they will tend to keep expanding output, and this can go on in some cases to enormous output levels.
Nevertheless, unless the companies are what economists call the pure "natural monopolies" mentioned above, the economies of scale will eventually end: at some point, perhaps at a truly huge level of production, the cost of the last unit will be more than the cost of the average unit, and the average cost curve will have bottomed out and begun the inexorable rise into the scale of output where diseconomies of scale are being experienced. The output level where long-run average cost bottoms out is called the minimum efficient scale of production, and it is a relatively logical long-term output level at which a firm would settle.
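For a concrete (and entirely hypothetical) cost structure, the minimum efficient scale can be located numerically. With total cost C(q) = F + cq + dq², average cost is F/q + c + dq, which bottoms out at q* = √(F/d); and at exactly that output, marginal cost equals average cost, which is the crossing the graphs above depict:

```python
import math

# Hypothetical parameters: a huge fixed cost, a modest per-unit variable
# cost, and a small term that eventually produces diseconomies of scale.
F, c, d = 1_000_000.0, 2.0, 0.00001

def average_cost(q):
    return F / q + c + d * q

def marginal_cost(q):
    return c + 2 * d * q

# Minimum efficient scale: the output where average cost bottoms out.
mes = math.sqrt(F / d)   # about 316,228 units

# Average cost is lower at MES than at half or at double that output...
assert average_cost(mes) < average_cost(mes / 2)
assert average_cost(mes) < average_cost(mes * 2)

# ...and marginal cost crosses average cost exactly at the bottom.
assert abs(marginal_cost(mes) - average_cost(mes)) < 1e-6
```

Note how large the fixed cost has to be, relative to the diseconomies term, for the bottom of the average cost curve to land at a few hundred thousand units; that is the "nearly natural monopoly" cost structure in miniature.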
But what would happen in an industry where this long-term output level, even though huge, of the typical firm was well short of that necessary to satisfy the demand of the entire market? It is not difficult to imagine how such a situation would cause an industry to have awful cycles of entries and exits of giant competitor companies, with each downstroke attended by large-scale employee layoffs and substantial market disruptions.
Consider this example: an industry exists where the long-term minimum efficient scale of operation for the typical firm is at an output that would meet, say, 40% of the market demand at the prevailing price level. In other words, the cost structure of the industry is such that a single company would be at its long-term most efficient by providing the product to a very significant, but not overwhelming, part of the market. The graph below shows the scenario.
This is an industry where two very large companies could co-exist quite comfortably, pretending to compete against each other while enjoying a joint market share of 80%. It is possible in this industry for lots of small competitors to handle the other 20%, but that's not guaranteed since it could be the case that the reason economies of scale extend so far is that the fixed start-up costs are very high, which would create a more-or-less natural barrier to entry for small firms. Still, it is most definitely possible that, at least in some industries, we could see two-tiered competition; in others, however, those steep costs of entry would pretty much ensure that only big companies could even start, which means that their joint market share, 80 percent, would be insufficiently supplemented by smaller firms to satisfy the entire consumer universe of demand. That would mean the most logical route for the 20 percent deficit of output to be met would be through the entry of a third, very financially well-off company that could handle the high fixed start-up costs along with the attendant, low profits while ramping up to get unit costs down to minimum efficient scale. The big inducement to entry would most likely be prices in the industry. With only two competitors and 20 percent of a huge pool of demand not being met, the bait is almost too tempting to resist, especially for a firm that has known only success in its traditional arenas of activity.
But wait a minute. In this industry, minimum efficient scale for a typical company is at a production level satisfying about forty percent of the market, which means this new entrant is going to have to try to get big enough so that it and the other two companies are jointly producing one hundred twenty percent of what the consumer side of the market needs! That means the third entrant is going to kick the industry supply curve outward so hard that the three competing firms will all have to lower their prices and engage in the kind of competition that big companies have a really hard time handling because of the scales of their operations. First will come the price competition, which consumers will love and conservative pundits will declare as proof positive that a free market works; then, however, will come the company losses because of those massive fixed costs that made the long-run average cost curves so steeply downward-sloping in the first place. The lower prices they're being forced by competition to levy on consumers will force the companies to cut costs the only way possible: since, in the short run, the companies can't very well do anything about their huge fixed costs of operations, they'll have to go for the only type of costs they can affect: variable costs, the most vulnerable of which is labor. That's right: industry-wide, mass layoffs.
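The arithmetic of that three-firm squeeze can be sketched with a simple linear demand curve. All of the specific numbers below are hypothetical; only the 40-percent minimum efficient scale comes from the scenario in the text:

```python
# Hypothetical linear demand: price = a - b * total_quantity.
a, b = 100.0, 0.0005

# Quantity demanded at the prevailing price, and each firm's MES output.
market_q = 100_000
mes_q = 0.40 * market_q          # each firm is most efficient at 40% of demand

# Two incumbents supply 80% of demand; a third entrant pushes joint
# output to 120% of what the market wanted at the old price.
price_two_firms = a - b * (2 * mes_q)     # about 60
price_three_firms = a - b * (3 * mes_q)   # about 40

# The extra supply forces the market-clearing price down sharply,
# setting up the losses and layoffs described above.
assert price_three_firms < price_two_firms
```

With these toy numbers the entrant knocks a third off the price, which is exactly the kind of shock that firms carrying enormous fixed costs cannot absorb for long.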
This won't fix the problem because, even though the three firms are, indeed, lowering total costs, they're also scaling back output, which means they're pulling back from that golden, minimum efficient scale of output. Long-term average costs are rising as the firms "downsize," and the competitive environment is still putting downward pressure on prices they can charge consumers, so sooner or later something is going to break; and it will probably be one of the three firms, which will finally go bankrupt, simply leave the industry, or, more likely, become so weakened that it will be receptive to a merger/acquisition arrangement with one of the other two companies. That, or it will become so debilitated that it will be vulnerable to a hostile acquisition by some other company, perhaps one that thinks it can make a buck in the industry that's already shown how it can wreck companies that step up to the plate without thinking deeply about long-run average cost curves, minimum efficient scales of operations, and market shares.
A market free of government interference will, of course, take care of the problems: over a long period of time, large firms will come and go, mass layoffs will happen, and industry consolidations will occur; then all will be well for a while, until the next time the incentive to enter a market with high profits and unmet demand draws in a large firm looking for a place to flex its competitive, well-financed muscles; and then the cycle will start all over again.
As an alternative (the one vigorously pursued in the Keynesian era of American economic policy-making), the federal government could place a firm, steady hand of regulatory control over such industries. Policy could be to the end of ensuring that a sufficient number of companies was permitted to operate in an industry to satisfy most, if not all, potential demand, and the firms allowed to operate were guaranteed that, in exchange for abiding by tight regulation, they would be allowed to charge prices sufficiently high to earn a decent return on investment for their shareholders without breaking the wallets of consumers or making their industries so attractive that competitors would beat on the doors to get their shot at market penetration and eventual, almost inevitable, market disruption.
But that was the way things used to be done, for a while, anyway, during the era when the country was doing pretty darned well and even most of the conservatives bawling for free markets didn't believe their own drool. Fortunately for the United States of that era, neither did the politicians.
The Dark Wraith will continue this series in the weeks to come.
Shadows at Eight Degrees
The story about Cheney and Obama being blood relatives comes on the heels of a September report by the Chicago Sun-Times that Sen. Obama and President George W. Bush are 11th cousins.
The Dark Wraith herewith vows never again to ask, "Could things get any weirder?"
News Framing at CNN.com
Note carefully the manner in which the headline is presented:
The reader who clicks on the headline finds that the associated article recaps the highlights of what is on an official Webpage of Oral Roberts University, wherein Mrs. Roberts is given space to express her dismay about the allegations being leveled against her.
That front-page CNN.com headline might, however, give a somewhat different impression, possibly even that the allegations have actually made her physically ill. Once a reader clicks on the headline and goes to the article, the word "sicken" is in quotes because it is how she, herself, on the official Oral Roberts University Website, is characterizing how she feels about being accused of improprieties.
Now, let us pull back to exactly one week before, on October 8, 2007, when Blackwater USA was still headline news at CNN.com and many other media outlets. The Iraqi government had just ordered Blackwater out of the country. An official investigation by the Iraqi government concluded that 17 Iraqi civilians killed when Blackwater employees opened fire in a town square were murdered by the mercenaries. Gruesome photos and moment-by-moment accounts of the massacre had been delivered by news media outlets for more than a week, with at least one account by the Washington Post providing multiple, eyewitness accounts and on-the-scene interviews with those who saw and survived the relentless onslaught of firepower by the private American security firm's personnel. So, on October 8, recounting the conclusions of the Iraqi government report, which held Blackwater criminally and civilly responsible for the deaths and injuries of Iraqi citizens from the September 16 shower of gunfire in a Baghdad square, CNN.com had the headline, again highlighted here for emphasis, in the screen capture graphic below:
Note, again carefully, but this time also comparatively, the manner in which the headline is presented:
Note the single quotes around the word victim. Such quotes have the effect of removing a word or term from the realm of the factual and placing it squarely in the wide-open field of the claimed, the alleged, the scrutinizable, the questionable. That's the intended effect of so-called "scare quotes": they diminish the representation made by the words within them; and that's exactly what those single quotation marks did with respect to the use of the word victim to describe a person killed or injured in the September 16 Blackwater shooting incident at a square in Baghdad.
According to CNN.com, then, allegations sicken Mrs. Roberts, but someone killed or wounded by Blackwater mercenaries on September 16 is a 'victim' of the attack. A civil lawsuit backed by extensive photographic and textual, documentary evidence can sicken a wealthy, religious, American defendant in a sexually charged lawsuit; but when extracted from a foreign government's official report that included dozens of appallingly gruesome photographs and testimony from many eyewitnesses, the term 'victim' must have quotes around it.
CNN.com has used and avoided using scare quotes before, and it will undoubtedly do so in the future as a way of subtly shaping the perception its readers have of information it reports. Although journalism students are taught about this and other means of "framing" early in their training, many people, even those well-educated, are not consciously aware of it in actual use, even when it is deployed repeatedly, over an extended period of time, by a single news source deliberately attempting to shape perceptions and opinions. While it might have been hoped that, by now, framing would be relegated to local media, CNN.com is demonstrating that what is considered at best disingenuous news presentation and at worst dishonest journalism is alive and quite well in the ranks of 'professional' 'news media' outlets.
The Dark Wraith encourages readers to keep a close eye on CNN.com for more 'journalistic' hijinks.
A Hill People Story for Sunday Night
In one passage, he notes one of his own fuel saving habits: "Put in more personal terms, I'm probably keeping an extra $100 in my pocket each year by coasting when possible, and accelerating only when necessary."
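That figure is easy to sanity-check with back-of-the-envelope arithmetic. The mileage, fuel economy, gasoline price, and percentage gain below are my own illustrative assumptions, not his, but they show how a habit of coasting plausibly adds up to about a hundred dollars a year:

```python
# Back-of-the-envelope estimate of annual fuel savings from coasting.
# All inputs are illustrative assumptions, not figures from the quoted post.

def coasting_savings(miles_per_year, base_mpg, price_per_gallon, mpg_gain_fraction):
    """Dollars saved per year if coasting improves fuel economy by the given fraction."""
    base_cost = (miles_per_year / base_mpg) * price_per_gallon
    improved_cost = (miles_per_year / (base_mpg * (1 + mpg_gain_fraction))) * price_per_gallon
    return base_cost - improved_cost

# 12,000 miles a year, 25 mpg, $2.80 a gallon (roughly 2007 prices), 8% economy gain
savings = coasting_savings(12_000, 25.0, 2.80, 0.08)
print(f"${savings:.0f} per year")  # roughly $100
```

Swap in your own mileage and pump price and the estimate scales accordingly; the point is only that the claimed order of magnitude is entirely believable.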
That brought back to my own mind some fine, fine memories, along with some extensions into seemingly unrelated topics of current interest. To me, it all has a unifying theme and character; but I don't think it will to very many others. Nevertheless, I'll tell you a little story from my own time and the place whence comes my sometimes unpopular perspective.
I grew up in an old family. My father was born before the turn of the 20th Century (or just after, depending upon whose account is to be believed), and my mother was born at the end of the second decade of that magnificently complicated hundred-year stretch.
Both came from the hill country, and neither ever trusted the wealthy people. Didn't trust fancy technology all that much, either.
They hated FDR, too. Just the flip side of Hoover, selling snake oil to the desperately poor instead of the wickedly rich. The older I get, the smarter those two old folks are. That's a little strange, considering they've both been cold in their graves for many years.
My father's people came from Pennsylvania: Pennsylvania Dutch, in fact, but my grandfather broke away to make some actual money; he settled in Kentucky in the early 1800s. The census records show that, unlike just about everyone else in the county, he had no Negroes in his possession. He did, however, have no fewer than five unrelated girls between the ages of 13 and 20 in his household. We Germans are always supposed to be on the sociable side, or so I've been told.
My mother's folks came just about from the very soil of Kentucky: French and English trappers in the lower Ohio Valley. Just don't mention the Indian women some of them hitched up with, though; that was almost a worse secret than the European Jews in my father's bloodline.
When I was growing up, I did more than my share of time in hill country, and every last time we'd be at the top of a big rise, the manual transmission would get popped into neutral, and we'd coast down the road. Most of the time, my mother or father would say something like, "Time t' cheat ol' John D. for a while." This was, of course, a reference to John D. Rockefeller and the monopoly power he'd once had over the gasoline markets. My father even recounted watching, as a youth, while some towns fell to John D.'s gasoline-station spatial monopolies created through predatory pricing.
Understand that, down in the hill country of Kentucky, West Virginia, and southeastern Ohio, it's not just that some of the hills go on for a long way down, and it's not just that they can get steep. The excitement comes with the surprisingly abrupt turns those roads can take. That, and the perilously icy conditions that prevail in the later Autumn, all through the Winter, and into the early Spring.
Although the seemingly uncontrolled descents into the valleys never bothered me at all, there was the occasional passenger who would get fussy when we were cheating ol' John D. and my dad would suddenly, in a small voice, say, "Shit."
It always worked out for the best, though.
Until I finally got an automatic transmission car, I would cheat ol' John D. whenever I could, except for when I had old junkers with bad (or, in one case, pretty much no) brakes.
To this very day, I still have my routines, but they're not so much with automobiles; instead, I have a Linux (SUSE) partition and a DOS (Novell 7.0) partition on my hard drive. Whenever I boot to one of those systems instead of Windows, I tell my cat, "We're gonna cheat ol' Bill G. for a while."
My cat understands. Sometimes, he even asks me to tell him again about the magnificent DLC/neo-Keynesian President and his Internet-inventin' Vice President who should have stopped ol' Bill G. in his tracks and thrown him in prison back in the mid-'90s but didn't, either because they really were clueless or, more likely, because they actually thought the Information Age needed its own round of Industrial Policy, complete with monopolies and oligopolies for a new American century.
Darned, but Mom and Dad were right when they said, "His mug's no better'n t'other one's butt." (They might have been right when they said Eisenhower was the last great President, too.)
I should go, now. People are suddenly standing around outside saying someone told them the sky is falling. I don't think they're really going to believe me when I tell them the inconvenient truth of this new American century.
The sky already fell.
It seems that last hairpin turn, the one into post-modernity, was a bit much.
The Dark Wraith should probably just stay inside and write code tonight.
Hallowe'en 2007 Graphic #1
Pleasant dreams, readers.
The Dark Wraith will deliver further troubling images in the days to come, so be forewarned: it's not just your imagination; there really are things that go "BUMP!" in the twilight of Empire.
The 21st Century, Epilogue
In 1999, the PBS series NOVA presented a documentary on the manned space race to the moon. Visually fascinating and quite informative (even to those who were thoroughly engrossed in all things NASA during the 1960s), the documentary reflected no small effort by its producers to convince viewers of the enormity of the technological, logistical, and theoretical challenges that faced the National Aeronautics and Space Administration as it prosecuted fulfillment of President Kennedy's 1961 vow that the United States would put a man on the moon by the end of that decade. So much had to be done to make this happen. Technological achievements were abundant, and no less so were large leaps in understanding of how everything from massive ships to human beings can move in space. Engines more powerful than anything ever before built by humankind had to be designed, constructed, and then used to heave thousands of tons up into the sky, into orbit around the Earth, and then out of that gravity well into the black, barren ether from our planet to a child world we had seen since forever but had touched only in our dreams and literature.
What was done (what they who accomplished this did, what we as a very species did) was genuinely epic. The science fiction writer Ray Bradbury summarized the 20th Century processual change in our thinking with the titles of his two books, R Is for Rocket and S Is for Space.
Of the U.S. space program of the 1960s, the narrator of that PBS television program said that it was as if a slice of the 21st Century had been carved out and placed right in the middle of the century before it.
"Disruptive technology" is the name sometimes given to a feat of engineering because it is so extraordinarily different: so much more efficient, so lacking in use of existing ways of doing things, so innovative. Frequently, however, the disruption is only apparent: the technology that seems so radical is built upon ideas, work, and understandings that have been in development or even use for quite some time, perhaps even decades or centuries. Yet, still, the disruptive aspect is there because almost no one knows about all the work that came to be embodied in the amazing technology that suddenly arrives, seemingly in the blink of a single generation's eye, into the world of the known.
As the great achievement of putting a person on the moon illustrated, however, sometimes a disruptive technology can seem to vanish as quickly as it had appeared. After a few years and a few more manned missions to the moon, we stopped going; and we never went back, not for the long remainder of the 20th Century, not even in this first decade of the following century. It is as if the manned moon missions were, indeed, an out-of-place, out-of-context slice of another century set into the middle of a far less technologically advanced, far less prepared world. And so, even as a living thing that could manage to be born and grow to stun the 20th Century peoples of the Earth, its maturation required far more nourishment than that time, place, and people could give it; and so it died, leaving its seed for a more inviting, better prepared world of some tomorrow in the century from which its slice of life had been carved.
Although the mission to put a person on the moon might seem unique when described as a 'slice of the 21st Century' placed in the century before, it certainly was not: the 20th Century was a veritable grafting tree for apparently disruptive technologies that would come into the world of perception only to go away. Nuclear weapons are a striking example: used twice in short succession, these bombs, so phenomenally more powerful and destructive than any war weapon ever before built, then simply vanished from use for their intended purpose. And just as the technologies and work on space missions survived the end of the successful Apollo program, so too did the technologies and work on nuclear weapons continue apace after the successful attacks on Hiroshima and Nagasaki. Certainly, people talked endlessly about nuclear weapons after World War II, just as people went on, sometimes breathlessly, about missions to the moon and beyond after Apollo; but neither moon landings nor nuclear attacks ever again occurred. Historical records have the last say: both technologies vanished from operational use.
Yet another technology came and went in the middle part of the 20th Century; but to note this one, the term "technology" must first be elucidated in a way somewhat unusual to common understanding. In typical usage these days, "technology" carries a connotation of something improved, probably electronic or otherwise employing very modern devices, but this usage of the word is only exemplary of technology as a much broader thing. "Technology," in its general sense, is just the combination of the five production factors used to produce a good or service: land, labor (brute muscle), human capital (knowledge and learned skill), physical capital, and, in capitalist types of economic systems, entrepreneurial skill. To make or create something, various combinations of the input factors might be employed, each with some cost associated with it. Some combinations work better than others to make specific things and provide specific services, and as a general rule, the most efficient combination to a given purpose tends eventually to dominate, although some societies might gauge efficiency by standards other than mere cost per unit of output.
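For readers who think better in code than in economics prose, the definition above can be sketched as a toy model. Every factor cost, quantity, and output figure below is an invented illustration, not data; the point is only the structure: a "technology" is a particular mix of the five factors, and efficiency is the cost of that mix per unit of what it produces.

```python
# A "technology" modeled as a specific combination of the five production
# factors, each with a unit cost; efficiency is cost per unit of output.
# All factor names, quantities, and costs here are illustrative assumptions.

FACTOR_COSTS = {
    "land": 10.0,
    "labor": 15.0,                 # brute muscle
    "human_capital": 40.0,         # knowledge and learned skill
    "physical_capital": 25.0,
    "entrepreneurial_skill": 60.0, # the willingness to combine the other four
}

def cost_per_unit(technology, output_units):
    """Total cost of a factor combination divided by the output it yields."""
    total = sum(FACTOR_COSTS[factor] * qty for factor, qty in technology.items())
    return total / output_units

# Two different technologies for producing the same 100 units of some good:
labor_heavy = {"land": 1, "labor": 8, "human_capital": 1,
               "entrepreneurial_skill": 0.5}
capital_heavy = {"land": 1, "labor": 2, "human_capital": 2,
                 "physical_capital": 3, "entrepreneurial_skill": 0.5}

for name, tech in [("labor-heavy", labor_heavy), ("capital-heavy", capital_heavy)]:
    print(f"{name}: ${cost_per_unit(tech, 100):.2f} per unit")
```

Under these invented prices, the labor-heavy mix wins; change the relative factor costs and the other mix dominates instead, which is exactly the sense in which the cheapest combination "tends eventually to dominate."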
In terms of what is produced, technologies are always, by definition, being deployed. Some goods and services do not exist because no combination of inputs has been found to render the needed result. Before any "thing" can be made, of course, the very idea of it must first come into existence, and this is where the essence of human imagination has its great strength; but the idea must be brought by some means from the mind to the world of reality, and this requires imagining a combination of the factors of production that might accomplish the realization of the idea. In capitalist societies, the factor of production called entrepreneurial skill takes on this task: some person or group bears risk in combining the other four factors to the conscious or unconscious purpose of garnering a reward at least as great as he, she, or they would earn from doing the next best thing with their time, minds, resources, and energy. In what are called "command economies," this task of bringing together those four factors of production is assumed by a central authority acting perhaps for the good of the society or for those in control of the machineries of state. But whether by the initiative of those pursuing free choice in trying to make money or by the authority of sovereign controllers pursuing large ambition, ideas are brought to reality by combining factors of production in such a way that output is created. The number of ways of combining the production factors is so vast that there may simply be no limit to what can be made once the idea of a thing comes to someone and one or more ways of combining the factors can be worked out to get an actual product or service into the world.
Technology, then, is a possibly infinitely large set of variations on land, labor, human capital, physical capital, and the willingness and ability to combine those first four. For example, one technology for making fire might involve flint, stone, and some dry grass, along with a bit of brute muscle (labor) and a learned or developed understanding (human capital) of how to use the muscles and other stuff to get the fire going; yet, that same flint, stone, dry grass, labor, and human capital might be used to make something else, instead. Furthermore, other technologies exist for making fire, technologies that employ quite different levels of some factors and perhaps not a bit of others that are used to make fire the old-fashioned way. Even more astonishing upon careful thought is that some of the very same factors of production (labor and human capital, for instance), combined with other resources, can be used to make something completely and fundamentally different from fire, something like, say, an axle for a truck, a masterpiece of sculpture, relief from a hard day at work with a back massage, or the pleasurable sensation of good music.
Technology is capable of rendering anything, and all that separates the 'what might be' from the 'what really is' boils down, first, to what has been imagined as possible and, second, to what combination of productive factors can make it real.
In this light, "disruptive" technology is nothing other than a combination of productive factors far outside the typically marginal steps that are taken from one means of production to another. Flying to the moon and back, certainly imagined for ages, required an extraordinary effort to develop the full gamut of inputs and combinations thereof that would be capable of turning the idea into reality. Annihilating cities with single bombs, once again, required extraordinary effort to develop a previously uncontemplated inventory of inputs and combinations thereof that could turn what was also an age-old dream (in this case, utterly staggering, city-wide, virtually instantaneous destruction of property and lives) into reality.
Briefly stepping back from the truly awesome aspects of lunar landings and nuclear annihilations, though, the disruptiveness of the end results of such technologies is only relative. Over and over again, across the world, adventurers had throughout history flung themselves on voyages of unimaginable duration and peril; and by the same token, so too had the bellicose wrought upon others unimaginably horrific destruction in what to the times was the blink of an eye. The great "disruptive" technologies of our own ages are but the most recent in a continuing history of technologies disrupting the world and leaving its people in awe and wreckage.
Nevertheless, traveling to another celestial body and wiping out entire cities with single bombs were both extraordinarily impressive to the peoples of the mid-20th Century, and so each rightly stands as disruptive to its time: they were, indeed, slices of the 21st Century arriving in, then departing from, a world not braced for the impact of the profound novelty of the end results.
Taking "technology" in its broadest sense, the 20th Century holds in its fading inventory another disruption worthy of note. This other disruptive technology was one of governance, a means by which society could be organized to the desirednot necessarily achieved, but nevertheless desiredend of maintaining internal stability and external security. In the celerity of its rise from the ashes of a defeated state, the scope and scale of its outward, militaristic push, and the sheer phenomenality of its willingness and ability to kill people by the millions, European fascism rightly stands as a disruptive technology of the 20th Century. Other states in historical proximity might have carried one or more of its striking featurescertainly, Stalinist Russia and Maoist China qualify as superficially comparable experimentsbut fascism stands alone in the technology it employed to govern by means of power consolidation within the states it infected, the speed and power with which it projected outward from its ideological and political bases of birth, and its sweeping, rapid, systematic, conscious use of human slaughter to achieve its perception of what constituted both internal stability and external security. And just like the other disruptive technologies described above, fascism was disruptive only in a relative sense: history offers many examples of quickly rising, unbelievably violent, hegemonic states; yet, just like the lunar landing program and nuclear weapons program, fascism stood in its time as a technology that legitimately merits the term "disruptive."
And there it was, right smack in the 20th Century. It arrived, it shocked an unprepared world, and then it seemed to vanish. As if by the magic of the words "never again," fascism evaporated in the overweening hubris of the Allied victors hanging the men who had served their purpose as the faces of something far more evil than human souls, even in their collective madness, could construct from mere ideas brought to the world of the real by some remarkable combination of productive factors.
Only those completely bereft of futuristic hopes would claim people will not return to the moon and, in fact, go far, far beyond that small staging ground to the planets and then, someday, to the stars. That slice of the 21st Century will find its way back to where it belonged in the first place, not because it will again be disruptive, but rather because the process never ended, even as the outward product of the process was no longer seen by the common people. By that same motive force defying the terms of a single age and the limits of imagination of the people who populate it, nuclear weapons will again be used, not as disruptions to the time of their reintroduction in 21st Century war theatres, but as the inevitable operational outcome of processes that never ended even though use of the final products stopped so abruptly in 1945.
So, too, will fascism return, motivated as it was in its first instance by forces featured in awful ideas and violently sweeping solutions never globally addressed when its European version was crushed and hauled away in body bags from the gallows at Nuremberg.
The slices taken from the 21st Century will find their own ways home to their own time, where they will no longer be disruptions, but instead, anticipated arrivals to the world that will surely believe it needs them.
Anticipate too soon the return of previously disruptive technologies and thereby find frustration; believe that the undesirable of those old technologies can be stopped and thereby find even greater frustration. We will return to the moon and then go on to the planets, finally settling some of our kind there, not because we want to do that, but because it is the way we will escape, as people always have, the pressures of states and their weapons of repression that always attend too many people and too many ways to deny them the exercise of free will, mythical as that might or might not be.
We will use nuclear weapons on one another again, and all the angry demands to forestall that time will be for naught because sovereignties and peoples cannot be stopped in the long run from manifestly fulfilling their belief that what they will do, however horrific it is, simply must be done because beliefs must endure, even if millions must die for the immortality of the imagined.
We will return to repressive states (to fascist and quasi-fascist means of governance), and the frustrations now evident will only deepen as those who apparently could stop it, like the Democratic majority in the United States Congress, simply cannot. Despite their representations to the contrary, Senators and Representatives just keep allowing further and further inroads into what at one time were unassailable and precious liberties of the "We the People" of this nation.
The re-emergence of the fascist technology of governance is the Hegelian "historical inevitability," only Georg Hegel perhaps did not appreciate the processual imperative driving what is easily mistakable for mere similarities of precedents leading to similarities of outcomes. The world is, in fact, far more consistent: rending the weed from the soil does nothing to its roots; tearing the roots from the ground does nothing about its seeds; burning the very ground on which the seeds have scattered does nothing to prevent nature from bringing back, in another time, the very same intruder which, when all have wearied of the fight against the menace, will finally be accepted as part of the landscape. Stories about the weed when it first arrived will become fables about the flowering vine that was always there, even when no one could see it.
The 21st Century will proceed through its course, no less and no more than that for which it was born. Its peoples might have been prepared for its grand and fearsome inevitabilities had they not been blinded by awe to forethoughts of inevitable, unstoppable triumph and grief in the generations and times to come. Even as it proceeds, then, the 21st Century stands in epilogue.
The Dark Wraith has spoken.
French Cream Pies
To lessen the joylessness of the Saturday meetings, attendees are encouraged to bring in food for everyone to eat. Most people bring store-bought stuff: chips and dip; trays of sad, raw vegetables; boxes of cookies; and other barely edible treats. A few of the female faculty members prepare hot food: crock-pot fare, like those little wieners stewed in barbecue sauce, or Swedish meatballs floating in gravy.
I, on the other hand, do desserts. Sometimes, I put together one of my very large (and appallingly heavy) white chocolate raspberry cheesecakes or a chocolate Texas sheet cake.
This time, I'm going to offer a couple of my French cream pies. I wouldn't have been able to do this save for a couple of glad events. First, I was able to borrow a mixer once again from Sunrock, whom I thank for the loan of equipment. I also have to (grudgingly, mind you) thank someone who somehow managed to notice a PayPal donation button buried in a place so obscure that I hadn't even noticed it had slipped in when I redid the advertising frame to include a new ad for Harry Potter merchandise at Hot Topic. The donation allowed me to purchase the ingredients and get my good digital camera out of hock.
The recipe as readers here are going to see it made includes some cheating on my part. The topping on one pie will be made from scratch, but the topping on the other pie is store-bought. Also, the pie crusts are ready-made, although I do a little bit to them to make them work better for the recipe. Even with those cheats, the pies are pretty much homemade, and they are absolutely delicious. There isn't even any point in talking about calories for a slice of one of these pies, and that's because calories aren't supposed to be stated in scientific notation. As far as nutritional value goes, readers can see for themselves that a dessert like this is made from wholesome, healthful ingredients that are sure to extend your life, make you more attractive, and enhance your willingness and ability to endure this last year of the Bush Administration.
And if that doesn't make you want to read the recipe below and follow it carefully, there's a little something at the end that might be worth the while of at least a few regulars around here if they will only read and enjoy the exposition that now follows.
The ingredients are simple. This will be for two French cream pies, one apple, the other cherry.
• One eight-ounce package of cream cheese, warmed enough to be fairly soft.
• One cup heavy whipping cream.
• One cup confectioner's (10X) powdered sugar.
• One-half cup brown sugar.
• One-quarter cup granulated sugar.
• One-quarter cup lemon juice.
• Three small to medium Jonathan or Granny Smith apples, peeled and sliced thin.
• One-quarter teaspoon cinnamon.
• One can, at least 20 ounces, of cherry pie filling (the more cherries in the filling, the better).
• Two pie shells, preferably shortbread.
Get your ingredients together and admire your culinary organizational skills.
2008 GOP Convention Logo (and Suggested Revision)
The Dark Wraith invites republication with attribute of the logo on the right, above, but herewith concedes that the Republican National Committee will undoubtedly stick with the convention logo on the left, unfortunate as that choice may be.
The Outrage This Time
The corporation of mercenaries called Blackwater now stands in line for its turn as the whipping boy. Tuesday evening, October 2, 2007, two (count 'em, two!) headline stories at CNN.com: "Mom: Blackwater should never forget my boy" and "Blackwater pilot before crash: This is fun." The New York Times, that bastion of doggéd investigative journalism that served as the Bush Administration's top propaganda tool in the run-up to the American-Iraqi War, pounds it out: "Chief of Blackwater Defends His Employees," a headline that belies the framing in the story intended to deepen readers' sentiment against mercenaries, as brave and controversial an approach to journalistic subject matter as declaring that Mom and apple pie are good.
First, mercenaries are a dislikable breed through and through. They have been in wars, on battlefields, in state-sanctioned military activities since time immemorial. They are thoroughly hated in many situations. Even the putative godfather of irascibility in realpolitik, Niccolo Machiavelli, found them wanting. That said, they can be quite effective, which is why Alexander the Great was so merciless in butchering fellow Greeks fighting for the Persian Empire: those Greek mercenaries were among the best soldiers Persia was fielding, and Alexander wanted to send a message to fellow Greeks everywhere that they might want to consider an occupation other than fighting for the enemy.
The history of mercenaries and their venality aside, Blackwater mercs are doing exactly the same things that are being done by special ops, black ops, and other actual soldiers of the United States armed forces. Special units of real, uniformed, American soldiers are prosecuting a global war of staggering brutality. President George W. Bush gave an order, at one time referred to as a "GADA" (go anywhere, do anything), similar to, but far more extensive than, the GADA issued by President Ronald Reagan years ago: assassinations; field executions of unarmed people stopped for questioning; violent, on-the-spot interrogations of detained individuals; and other activities abominable and oh-so-necessary in the passions of the age and the minds of those with no better solutions to a world in which others hate us and want to hurt us.
As explained in Robert Parry's recent article, "Bush's Global 'Dirty War'," at Consortium News, violence in contravention of international law, American law, and plain decency is happening all over the world, and it is our soldiers committing these unspeakable cruelties. Parry relates the following incident from Afghanistan, one for which two soldiers faced disciplinary action:
"Two U.S. Special Forces soldiers took part in the execution of an Afghani who was suspected of leading an insurgent group.
"Though the Afghani, identified as Nawab Buntangyar, responded to questions and offered no resistance when encountered on Oct. 13, 2006, he was shot dead by Master Sgt. Troy Anderson on orders from his superior officer, Capt. Dave Staffel.
"According to evidence at the Fort Bragg proceedings, an earlier Army investigation had cleared the two soldiers because they had been operating under 'rules of engagement' that empowered them to kill individuals who have been designated 'enemy combatants,' even if the targets were unarmed and presented no visible threat."
So now the mainstream media have the private contractor Blackwater for their whipping boy du jour, a boy, it must be stipulated, wholly worthy of being whipped mercilessly and sent to bed without any supper; but this is nothing more than the latest episode in the continuing gambit of pointing to the minor actors, as those same dedicated media hounds did with the hillbilly trash at Abu Ghraib. The ruse should by now be transparently obvious: its intention is to keep the klieg lights from turning on the central, spreading, ungodly cancer that has come to be "the American way" of trying to control a world that is leaving America behind. A few heads rolled and got prosecuted for Reagan's dirty war in Central America, a war that ultimately, by some accounts, left more than 45,000 people dead. In that instance, the major players in the Reagan Administration who were actually prosecuted and convicted were ultimately granted clemency by President George H.W. Bush, and some of them, along with those like John Negroponte, who was never prosecuted, are now working for the Administration of President George W. Bush. A few heads rolled in the Abu Ghraib scandal, but those poor suckers (being nothing other than deliciously vilifiable, common, poor trash) are in prison, while their superiors, all the way up to President George W. Bush himself, proceed onward, pursuing the very same, if not even more systematic and aggressive, policies of torture, renditions, extra-judicial executions, and other mayhem.
Readers are, of course, encouraged to enjoy what will come to be the mainstream media's continuing, exciting coverage of all that is evil about Blackwater Corporation and its venal mercenaries and their terrible, terrible ways. It is good theater, and it will sell newspapers for a while. Official outrage will attend this media circus: the Democrats in Congress are already jumping on the bandwagon, declaring their disgust, anger, and outright shock at what Blackwater mercs have been up to, these being the same Democrats who just slipped the Bush Administration a quick, cool $70 billion in stop-gap funding to continue the American-Iraqi War while they finish work on a final Defense Department budget for the fiscal year. Eventually, the President, himself, will visit the podium and intone gravely about the matter. Good theater, all around; the American people will get to feel the pulsating power of moral rectitude, and the politicians will pop a feather in their cap and call it righteousness.
But only a very few will wallow in exquisite frustration, especially at the news media, groveling once again as its august and worthy legion always do to this Administration as it leans forward with wholesale, systemic, unlawful extremities of violence that have become the vast tapestry of cruel and pale substitutes for comprehensive, intelligent, forward-looking foreign policy. The mainstream media grapple mightily with miserable tidbits they can use to show their diligence in reporting the "truth"; and all the while these toothless, cowering watchdogs of public interest deliberately, with calculation and prejudice, ignore the larger, far more comprehensive and ugly truth that stands before them, the great truth of our time hiding in plain sight, lit so brightly that it would veritably blind the American people, even as it enraged them to self-righteous denials. For the record, here is that larger truth:
We are Empire, and we commit evil neither as error nor incidence, but as policy reflecting our very nature.
If you disagree with that, you are probably already looking forward to the next war scandal at which you can either shrug your shoulders or shake your fist and demand yet more flesh of the low-lives who merely carry out the orders of Empire as it backs further and further into the corner of history closing in on its final chapter.
The Dark Wraith has spoken.